Do your bank’s processes allow you to scale?

You have lending and credit processes that have worked well for years. Growth has been gradual, so you have not noticed, nor been able to measure, the cracks that have appeared in the foundation of those processes. Now you’ve started an integration effort for an acquired bank, or the Board has tasked you with increasing loan volume by 50% over the next 18 months while maintaining your current cost structure. Both events give you pause. Will your existing processes scale seamlessly? Or will they break down under the pressure of increased workload, causing significant processing delays that disrupt customer service and increase your risk exposure? To potentially make matters worse, do you have a handle on how you will measure any performance degradation? Staff performance, monitoring and management will no longer be as simple as it was when the team was smaller.

What do you do? Immediately hire more staff, right? Or possibly shift responsibilities so your higher-paid, client-facing staff take on more steps in the process, since they know the clients and opportunities best? We’ve seen this happen, and it’s probably one of the most expensive paths you can choose. While these steps might seem to be the easiest way to respond, they are not your best course of action for the long term.

Whether you are experiencing rapid growth, facing an efficiency crisis or operating in a steady state, we believe now is the time to evaluate and tune your credit processes from the top down. Our experience tells us that starting with well-defined goals and performance objectives is key. Processes built around well-understood milestones, potential chokepoints, risk factors and key pressure points give you and your team a firm, predictable set of metrics from which to manage your lending and credit operations. This creates clarity, so you can effectively communicate the processes and metrics both to the management team for buy-in and support, and to the teams involved so they can execute against a well-defined set of performance objectives.

Once solid metrics are in place, you can apply growth, contraction or other resource models with confidence to predict how the overall process will perform, measuring your responsiveness to clients and accurately forecasting your costs and profitability.
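As a sketch of what such a resource model might look like, here is a minimal capacity calculation in Python. All stage names, service times and staffing levels are invented for illustration; a real model would plug in your own measured metrics.

```python
# Hypothetical capacity model: given per-stage service times and staffing,
# estimate weekly throughput and flag the chokepoint stage under a growth
# scenario. Every number below is illustrative, not drawn from any real bank.

STAGES = {
    # stage: (hours of work per loan, full-time staff assigned)
    "intake":       (2.0, 3),
    "underwriting": (6.0, 5),
    "doc_prep":     (3.0, 2),
    "closing":      (1.5, 2),
}

HOURS_PER_WEEK = 40

def weekly_capacity(stages):
    """Loans per week each stage can process; overall throughput is the minimum."""
    return {name: staff * HOURS_PER_WEEK / hours
            for name, (hours, staff) in stages.items()}

cap = weekly_capacity(STAGES)
bottleneck = min(cap, key=cap.get)
print(f"Throughput limited to {cap[bottleneck]:.1f} loans/week by '{bottleneck}'")

# Model a 50% volume increase: which stages fall short of the new target?
target = cap[bottleneck] * 1.5
shortfalls = [s for s, c in cap.items() if c < target]
print("Stages needing added capacity for +50% volume:", shortfalls)
```

Even a toy model like this makes the conversation with the Board concrete: the bottleneck stage, not headcount in aggregate, determines how far current processes can stretch.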

Capstone Client Visit

Global Wave Gives Back

Global Wave Group is active in our local community, Orange County, California. Here is a great article from the University of California, Irvine about Global Wave’s sponsorship of a 2018 capstone project in the Department of Informatics. Happy Holidays!

Mobile Banking Prototype Exemplifies Value of Capstone Classes for Students and Businesses






Banking Software Containers

Containerization for Private Cloud / On-Premises: The Future of Banking Software

After a new value proposition takes the market by storm, as with Salesforce’s SaaS model introduced in 1999, the value proposition will eventually be digested, taken apart and reconstituted.

Grant Miller, CEO of Southern California-based Replicated, rebuilds the SaaS value proposition in a surprising way, as detailed in a TechCrunch opinion piece.

Today, Miller argues, two of the original SaaS selling points are shopworn and no longer compelling: “Go multi-tenant to save costs” and “Centralize services to ease deployment, maintenance and upgrades.”

  • With respect to multi-tenancy, the cost of computing is now low whatever setup you choose. Security and control are paramount; cost savings on computing resources are less important. Hence the rise of the private cloud and the hybrid cloud.
  • With respect to centralized services, software buyers have never been comfortable having customer data on a SaaS provider’s servers. Containerization is a new technology that makes centralized services optional.

Containerization is the future of on-prem. “Enterprise software packed as a set of Docker containers orchestrated by Kubernetes or Docker Swarm, for example, can be installed pretty much anywhere and be live in minutes,” Miller writes.
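To make the idea concrete, here is a minimal sketch of what a container-packaged deployment might look like as a Docker Compose file. The image names, ports and service names are placeholders, not any vendor’s actual distribution.

```yaml
# docker-compose.yml (illustrative; image names, ports and volumes are
# placeholders, not any vendor's actual distribution)
version: "3.8"
services:
  webapp:
    image: registry.example.com/banking-app:4.0
    ports:
      - "443:8443"      # users reach the app in a browser, behind the firewall
    environment:
      DB_HOST: db
    depends_on:
      - db
  db:
    image: postgres:15
    volumes:
      - appdata:/var/lib/postgresql/data  # customer data never leaves the premises
volumes:
  appdata:
```

A single `docker compose up -d` brings a stack like this live on on-premises hardware or a private cloud; the same container images can instead be handed to a Kubernetes cluster, which is the point Miller is making about installation “pretty much anywhere.”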

Bankers, consider that private cloud / on-premises software can be served to users in a browser application (just like SaaS), with deployment, updates and upgrades provided in containers to your IT team. Private cloud / on-premises gains the ease-of-use of SaaS while retaining behind-the-firewall security and control.

The next time a SaaS vendor tells you that your ultra-sensitive bank customer data is safe with them, you could give them your best Office Space impression:

“Yeah, I’m gonna need you to containerize your SaaS, okay? Ah, I almost forgot, I’m also gonna need you to go ahead and deploy, maintain and upgrade behind our firewall. So, if you could do that, that would be great…”






First Foundation Bank Implementation

Awesome Implementation of Credit Track at First Foundation Bank


As a fast-growing commercial and private bank with $4.5 billion in assets, First Foundation Bank was delighted with its procurement and implementation of Credit Track, Global Wave Group’s straight-through commercial loan origination system.






Guest Post – CECL Pointers: What to Do Now, What You May Have Missed For Later

AuditOne, LLC Co-CEO Jeremy Taylor has prepared a summary of the proactive measures financial institutions need to consider now to better prepare for the new Current Expected Credit Loss (CECL) standard. We’re delighted to host this guest blog post.


A lot is being written these days about the new Current Expected Credit Loss (CECL) standard for the ALLL and what it’s going to do to bankers’ lives. There are plenty of summaries available out there. We’re going to stick here to two angles.

  1. What you need to do now to prepare.  For many institutions (the non-public ones), there are still about three years before you need to report your loan loss reserving in accordance with CECL, which creates a temptation to postpone. But there are a couple of things all institutions should be doing right now to lay the groundwork, even if other decisions (such as evaluating alternative calculation methodologies and available vendor models) can wait. That’s because the data collection in point 2 below will require a lot of planning to ensure those needs are fully anticipated and ready to go, and that planning will in turn become the top agenda item for the committee in point 1.
    1. Form a CECL Committee.  At a smaller institution, the obvious participants are the CCO, CFO and COO/CIO (or their designees), all of them having direct interests in the process.  At this earlier stage, the Committee will have an education role for the bank, and will need to be gathering information for future decisions on models, methodologies, et al.  But its key near-term responsibility will be to:
    2. Identify and arrange for collection of all required data.  This applies both in terms of time series (i.e., as far back as can reasonably be gathered) and cross-sectionally (i.e., a broader range of data series than currently required). It applies both to internal data (i.e., loss and other performance characteristics for the institution’s loan portfolio, down to the borrower and loan level) and external data (e.g., macroeconomic conditions in relevant markets, peer bank loan performance metrics).  It should be noted that identification of data needs will require at least some sense of how reserve requirements will be calculated (modeled).
  2. What may not have registered.  The 2016 guidance on CECL was deliberately vague as to how to go about setting up a CECL-compliant approach.  This was appropriate simply because of the vast differences across the US financial system in size, sophistication, data availability, MIS capabilities, in-house expertise/understanding, etc., etc.  But there are some key features or characteristics of CECL whose significance and implications may not have fully registered, that we thought might be helpful to highlight.
    1. The general vs. specific reserving distinction (i.e., FAS 5 vs. 114) is going away. That’s because the general CECL approach applies the same logic as today’s impairment analysis (estimating potential loss over the remaining life of the loan) to all loans, whatever their quality.  So the carve-out of impaired loans, with its own manual of requirements, will no longer be needed.
    2. But there will still be pooling.  CECL envisages estimation of potential loss on the basis of pooling assets with similar (risk of loss) characteristics, similar to today’s approach.  That could apply to impaired assets, such as mortgages or consumer loans with common borrower and structural features and common drivers of credit impairment.  But it is likely that larger commercial loans that are adversely graded will continue to be handled and reported individually.
    3. CECL will apply not just to loans but also to securities.  But not to a trading portfolio.  For HTM securities, you’ll need to estimate a lifetime credit loss, just like for loans.  For AFS, rather than the current requirement of (irreversible) OTTI assessment, there will be a valuation adjustment to reflect the difference between fair value and amortized cost.  Estimation of lifetime expected loss can be done on a pooled basis for securities with similar risk characteristics.
    4. When you book a new loan or security, you book the expected credit loss as an expense right away.  It’s no longer the incurred loss approach of booking when a loss is deemed probable.  Rather, it’s an up-front estimation as to how much might be lost actuarially, given the mortality (i.e., default and recovery) characteristics of that type of borrower and loan.  On average you’re going to lose a little making a given type of loan; recognizing this with a day one loss provision is entirely appropriate.  Doing so will help remind us that our credit spread is intended to cover that expected loss amount (with capital there to protect against outlier (“unexpected”) losses).
    5. CECL’s impact on reserve levels may be material – but shouldn’t be excessive.  Intuitively, moving from losses already incurred (which in practice is typically calculated based on a one-year loss horizon) to a life of loan should boost the required reserves; it means a longer period over which losses might occur. True, but there are offsetting effects.  Most importantly, smaller financial institutions today are typically carrying booked reserves in excess of required (i.e., calculated) levels – and that’s after using Q-factors to push up the required levels.  The move to CECL will push up required loss reserves, but for many institutions that may still lie below the current actual reserve level.
    6. Regulators recognize that CECL implementation will vary widely.  For large institutions, splitting probability of default (PD) from loss given default (LGD) will be expected, along with more powerful migration or vintage analysis approaches.  Smaller institutions, on the other hand, should be able to build on their current ALLL methodology in order to satisfy regulators – e.g., still starting with historic loss rates, but looking back over a longer time horizon; still adding on Q-factor adjustments, but looking out over a longer (remaining life) horizon.  However:
    7. More institutions will find vendor software worth considering, as much for managing the more onerous data expectations as for the increased complexity of the required calculations.
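To make the day-one expected loss in point 4 concrete, here is a stylized calculation in Python. The PD, LGD and balance figures are invented for illustration only; an actual CECL model would derive them from the institution’s historic loss data with forward-looking (Q-factor-style) adjustments.

```python
# Stylized day-one CECL provision for a pool of loans with similar risk
# characteristics. All rates are invented; they are not regulatory guidance.

def lifetime_expected_loss(balance, annual_pd, lgd, remaining_years):
    """Expected credit loss over the remaining life of the pool.

    Survival-weighted: each year's loss is PD * LGD applied to the
    exposure that has not already defaulted in earlier years.
    """
    el, surviving = 0.0, balance
    for _ in range(remaining_years):
        el += surviving * annual_pd * lgd
        surviving *= (1 - annual_pd)
    return el

# A hypothetical pool of 5-year loans, booked today
pool_balance = 10_000_000
el = lifetime_expected_loss(pool_balance, annual_pd=0.01, lgd=0.40, remaining_years=5)
print(f"Day-one provision: ${el:,.0f}")   # booked as an expense at origination

# Contrast with a one-year incurred-loss-style horizon
one_year = lifetime_expected_loss(pool_balance, 0.01, 0.40, 1)
print(f"One-year horizon:  ${one_year:,.0f}")
```

The gap between the two figures illustrates why moving from a roughly one-year loss horizon to a life-of-loan horizon tends to raise required reserves, and also why (as point 5 notes) the increase may still land below the excess reserves many smaller institutions already carry.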







Vanity Project: “Let’s disrupt our day-to-day business and aggravate our people”

We want to highlight a banking blog post from last year that remains applicable: Scott Hodgins’ warning about Salesforce.com vanity projects. This topic is sure to become evergreen in the banking industry, as financial services is a heavily targeted vertical for Salesforce CRM (customer relationship management).

Complexity bias, a cognitive bias in favor of complex solutions, is a possible cause for vanity projects. Design guru Don Norman explains one motivation behind our complexity bias: “We seek rich, satisfying lives, and richness goes along with complexity.”

Making things worse, greater knowledge increases the preference for complexity. When knowledgeable bankers attempt a vanity project or ‘big bang’ software implementation affecting hundreds or thousands of users, complexity bias may be at work.

Choosing a Salesforce platform for banking can be rational, but a rational decision must factor in the following cost: the friction and effort required for users across the bank or credit union to think in the Salesforce platform’s visual language and workflow. Users have long experience with the Microsoft platform’s visual language; Dynamics and Power BI deserve a look for that reason alone. Salesforce’s analytics and artificial intelligence efforts are exciting, but Microsoft has recently launched competing products and services.

Here at Global Wave Group, we strive to use a design language already familiar to Microsoft users. That familiar design language is infused into the new user interface of version 4 of our flagship product, Credit Track. Credit Track integrates easily with Salesforce or Dynamics; Salesforce has an open API that is painless for us to work with. Further, Credit Track includes a CRM designed specifically for commercial lenders, an alternative to general-purpose CRMs such as Salesforce and Dynamics.
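As a flavor of what that kind of REST integration looks like, here is a small Python sketch that builds (but does not send) a request to create a record via Salesforce’s documented sObject REST resource. The instance URL, OAuth token and field values are placeholders, and this is not Credit Track’s actual integration code.

```python
# Sketch of pushing a record to Salesforce's REST API. Endpoint shape follows
# Salesforce's sObject REST resource; instance URL, token and field values
# below are placeholders, not real credentials or real integration code.
import json
from urllib import request

def build_create_request(instance_url, token, sobject, fields):
    """Build (but do not send) an HTTP request to create an sObject record."""
    url = f"{instance_url}/services/data/v52.0/sobjects/{sobject}/"
    return request.Request(
        url,
        data=json.dumps(fields).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_create_request(
    "https://example.my.salesforce.com",  # placeholder instance URL
    "PLACEHOLDER_OAUTH_TOKEN",            # placeholder token
    "Account",
    {"Name": "Sample Commercial Borrower"},
)
print(req.full_url)
```

Sending `req` with `urllib.request.urlopen` (or the equivalent in any HTTP library) against a real instance and token is all the transport layer amounts to, which is why an open, well-documented API keeps this kind of integration painless.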

What’s the alternative to a vanity project or a big bang? As Hodgins says, smaller or phased projects hold vendors and executive sponsors accountable. We would add that users should be held accountable too, but only if they are given a fair deal: hold users accountable for adopting new software, in exchange for (a) disrupting their lives as little as possible and (b) procuring software that uses a familiar visual language.