Gluent

How Banks Are Reducing the Regulatory Burden with Gluent and Google Cloud

In the decade following the global financial crisis, governments significantly increased regulatory requirements on financial institutions. This added burden has forced hundreds of century-old community banks to merge and driven larger banks to spend billions of dollars to maintain compliance. According to the Thomson Reuters 2020 Cost of Compliance Report, financial institutions list keeping up with regulatory change, budget and resource allocation, and data protection as their top three compliance challenges. Let’s look at how the combination of Gluent Data Platform and Google Cloud can help your bank simplify these challenges.

One Version of the Truth

One of the main complexities with the current regulatory environment is the continual evolution of regulatory requirements. Whether it’s a governing body passing new regulations or a new auditor examining the bank every few months, compliance teams are in a constant state of adaptation, with each update and audit requiring different information than the last. For many financial institutions, the burden is further intensified as their data is either residing in operational silos such as retail, treasury, and corporate banking, or copies of the data are being stored in multiple locations. In both scenarios the issue that needs to be addressed is the same – ensuring that compliance officers have access to accurate and complete information.

Gluent’s transparent data virtualization enables your bank to automatically offload data from the operational silos into Google BigQuery and drop the data from the original Oracle database(s). Then Gluent Data Platform’s transparent query engine can present the data back to any Oracle database or application, without needing to write ETL. This cloud warehouse creates a single version of the data, ensuring your compliance officers have access to complete and accurate information.

View Gluent’s Process for Migrating to BigQuery

Additionally, once the data is centralized in BigQuery, machine learning and AI open the door to numerous efficiency gains. Banks can use enhanced analytics to correlate previously unrelated data points, improving fraud and anti-money laundering detection and reducing financial loss. Customer data can be analyzed to provide more accurate financial and investment recommendations, increasing income and customer satisfaction. And with access to larger datasets, the bank’s underwriting teams can improve their decision-making to ensure the soundness of loans.
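
To make this concrete, here is a minimal sketch of training a fraud-detection classifier with BigQuery ML from Python. The dataset, table, and column names (bank_dw.card_transactions, is_fraud, and so on) are hypothetical placeholders, not a real schema.

```python
# Hypothetical sketch: train a logistic regression fraud classifier with
# BigQuery ML. All dataset/table/column names are illustrative placeholders.
from google.cloud import bigquery

client = bigquery.Client()

create_model_sql = """
CREATE OR REPLACE MODEL `bank_dw.fraud_model`
OPTIONS (model_type = 'logistic_reg', input_label_cols = ['is_fraud']) AS
SELECT amount, merchant_category, hour_of_day, country_mismatch, is_fraud
FROM `bank_dw.card_transactions`
WHERE txn_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 1 YEAR)
"""
client.query(create_model_sql).result()  # blocks until training completes
```

Once trained, the model can score new transactions with ML.PREDICT directly in SQL, with no separate model-serving infrastructure to build or maintain.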

Decreasing Expense

Financial institutions are in a unique position to gather unparalleled amounts of information on their customers. While this creates opportunities for efficiency, it also generates the expense of storing and maintaining rapidly growing datasets. This issue is illustrated by the fact that BNY Mellon and Santander spend 29% and 24%, respectively, of their annual operating costs on IT. Given exponential data growth and FDIC requirements to maintain customer records for a minimum of five years, scaling on-prem databases to hold this data is no longer cost-efficient.

Rather than requiring massive infrastructure commitments years in advance and ever-increasing Oracle costs, Google BigQuery provides a serverless, scalable, and cost-effective cloud warehouse.

BigQuery automatically replicates storage in redundant locations at no additional cost, eliminating the need for expensive on-prem data replication for disaster recovery.
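
As an illustration, a single multi-region dataset is all that redundancy requires; a minimal sketch, with placeholder project and dataset names:

```python
# Minimal sketch: a dataset in the "US" multi-region is automatically stored
# redundantly across sites. Project and dataset names are placeholders.
from google.cloud import bigquery

client = bigquery.Client()
dataset = bigquery.Dataset("my-project.compliance_archive")
dataset.location = "US"  # multi-region location; replication is automatic
client.create_dataset(dataset, exists_ok=True)
```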

BigQuery separates storage and compute. Decoupling these resources lets your team scale query performance or storage capacity independently while controlling expense.

Whether you want on-demand pricing that scales with your usage, flat-rate pricing for simplified accounting, or a multi-year commitment for deeper discounts, Google Cloud and Gluent let customers drive their own pricing model.
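
For teams on on-demand pricing, a dry run shows what a query will scan, and therefore roughly what it will cost, before any compute is paid for. A minimal sketch, assuming a placeholder bank_dw.transactions table:

```python
# Minimal sketch: estimate bytes scanned (the on-demand cost driver) with a
# dry run before actually executing the query. Table name is a placeholder.
from google.cloud import bigquery

client = bigquery.Client()
job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
job = client.query(
    "SELECT account_id, SUM(amount) AS total "
    "FROM `bank_dw.transactions` "
    "WHERE txn_date >= '2020-01-01' GROUP BY account_id",
    job_config=job_config,
)
print(f"Query would scan {job.total_bytes_processed / 1e9:.2f} GB")
```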

Gluent Data Platform’s automated offload and transparent query functionality provide further cost savings, enabling customers to redeploy that capital to offset the costs of hiring, training, and retaining compliance officers.

1. Avoiding Data Duplication

Traditional migration and virtualization often result in data being duplicated both on-prem and in the cloud. Gluent migrates data to BigQuery, drops it from the on-prem environment, and then transparently virtualizes only the required data back to Oracle as needed. The result is near-immediate ROI.

2. Zero Application Changes

Instead of spending 18 months planning and rewriting your application to use Google BigQuery syntax, our Transparent Query Engine ensures your legacy applications work as they always have, with no code changes required. This significantly reduces costs by eliminating thousands of person-hours of development time that would otherwise go into rewriting application code.

3. Zero ETL Creation

Migrating data is the easy part, right? Even if you intend to leave your legacy application as is and extract data into Google BigQuery, you will need to create a data pipeline for each table you want to move, adding effort, complexity, and maintenance to your environment. With Gluent Data Platform, we virtualize your data model, handle the data type mapping, and automatically create the rules for data movement.
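
For a sense of the effort being automated away, here is a rough sketch of just one hand-built pipeline, with hypothetical connection details and table names; a real pipeline would also need Oracle-to-BigQuery type mapping, incremental loads, and error handling, repeated for every table.

```python
# Illustrative sketch of one hand-built, per-table pipeline: extract an Oracle
# table and load it into BigQuery. Credentials, DSN, and table names are
# placeholders; production pipelines also need type mapping and incrementals.
import oracledb
import pandas as pd
from google.cloud import bigquery

with oracledb.connect(user="etl_user", password="...", dsn="oraprod/BANKDB") as conn:
    df = pd.read_sql("SELECT * FROM retail.transactions", conn)

client = bigquery.Client()
job = client.load_table_from_dataframe(df, "bank_dw.transactions")
job.result()  # one table done; now repeat for every other table
```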

4. Increased Performance

Cloud-native analytic engines are very different from legacy databases, and a typical migration project can require a significant performance tuning effort. Gluent has a built-in optimization engine that pushes most processing to Google BigQuery via our Smart Connector, allowing you to reduce your need for Oracle cores, which decreases costs and increases performance.

Securing Your Data

Unfortunately, ransomware has become a highly relevant concern within the financial industry over the past several years. John Chambers, the former CEO of Cisco, said he believes there will be 65,000 ransomware attacks on U.S. companies in 2021 alone. In one recent attack, Valley National Bank, a $41B-asset bank in New Jersey, was targeted by the Avaddon ransomware group. After gaining access to the bank’s network, the criminals selected databases, exfiltrated sensitive data and confidential documents, and then sent the bank a message stating they would release the information if their demands were not met.

While maintaining an on-prem network may provide the perception of hardened security, banks must honestly evaluate their resources and their ability to combat cybercriminals. An alternative is to leverage the industry-leading security controls and infrastructure Google Cloud already has in place.

Secure by Design

A common concern regarding any cloud is, “How secure is the data while not physically located on-prem?” Google Cloud’s infrastructure uses a multi-layered approach that provides security through the entire information processing lifecycle, with layers spanning hardware infrastructure, service deployment, user identity, storage, internet communication, and operational security.

Additionally, by default, Google Cloud provides data encryption in transit and at rest and maintains numerous compliance certifications including ISO/IEC 27001, 27017, 27018, 27701, SOC 1/2/3, PCI-DSS, and FedRAMP.
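
Encryption at rest requires no configuration, but banks that must control their own keys can additionally layer customer-managed encryption keys (CMEK) onto a dataset. A minimal sketch, with placeholder project, dataset, and key names:

```python
# Minimal sketch: set a customer-managed Cloud KMS key as the default
# encryption for a dataset. All resource names below are placeholders.
from google.cloud import bigquery

client = bigquery.Client()
dataset = client.get_dataset("bank_dw")
dataset.default_encryption_configuration = bigquery.EncryptionConfiguration(
    kms_key_name="projects/my-project/locations/us/keyRings/bank/cryptoKeys/bq"
)
client.update_dataset(dataset, ["default_encryption_configuration"])
```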

IAM Access Controls

Google Cloud Identity and Access Management (IAM) provides enterprise-grade access controls while simplifying compliance with an automated audit trail. Google’s Designing and Deploying Data Security White Paper further describes IAM as, “The master control center for authorizing who can take action on any particular resource within the GCP environment you manage…IAM policies propagate down the hierarchy structure of your GCP environment:

  • Organization level: This resource represents your company. IAM roles granted at this level are inherited down to all resources within the organization.
  • Project level: Projects are a way of creating a boundary of services and resources.
  • Resource level: Individual services and objects managed within a project.”
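
As a concrete example of a resource-level grant, the sketch below gives one user read access to a single BigQuery dataset; the names are placeholders, and organization- or project-level roles would typically be granted through the console, gcloud, or infrastructure-as-code instead.

```python
# Minimal sketch: grant a compliance officer read-only access to one dataset.
# Dataset name and email address are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()
dataset = client.get_dataset("bank_dw")
entries = list(dataset.access_entries)
entries.append(
    bigquery.AccessEntry(
        role="READER",
        entity_type="userByEmail",
        entity_id="compliance-officer@example.com",
    )
)
dataset.access_entries = entries
client.update_dataset(dataset, ["access_entries"])
```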

Mitigate Data Exfiltration

As explained in Mukesh Khattar’s Medium article, Mitigate Data Exfiltration Risks in GCP using VPC Service Controls, “VPC Service Controls enables security administrators to establish security perimeter (referred to as “the perimeter” henceforth) around sensitive data stored in Google Cloud Platform resources such as Google Cloud Storage (GCS), BigTable instances, and BigQuery datasets. When we enforce the perimeter, resources can freely exchange data within the perimeter boundary as long as IAM policies and VPC Firewall policies allow it. However, any access from outside is blocked by default.”

Peer Success Stories

Global Payments Company

To meet regulatory and company policy requirements, this company needed to archive ten years of transaction history. That archiving put immense storage pressure on their primary transactional database, which exceeds 1 PB. Using Gluent’s archiving capabilities, they can offload this data to the more cost-efficient Google BigQuery in months instead of years, enabling faster reporting and virtually limitless scaling even as the database grows by more than 1 TB per day.

Global Investments Bank

Historically, the bank archived data from Exadata to ZFS storage. When users required access to archived data, a manual process with a 24-hour turnaround loaded the data back from ZFS to Exadata. Gluent Data Platform gives the bank the ability to offload data and present the archived data back automatically, without rewriting any code, eliminating the manual 24-hour process.

Regulatory requirements are not going away, but Gluent and Google Cloud can help mitigate their impact by unifying your data, reducing your expenses, and increasing your security.

Let’s discuss your cloud readiness.