CIO Influence

Database Modernization: More Than Choosing A Shiny New Database Target


Former CTO and IT veteran David Walker thinks CIOs looking to make the most of new cloud databases might still need a bit of help

Database modernization means many things to many people, but it should mostly mean going cloud native at the database layer. (I could talk about the real reasons for doing this, like outrageous licensing costs and the folly of relying on a database designed in the 1980s to do cloud in 2023, but let’s draw a veil over all that for now!)

Fundamentally, the motivation for database modernization should be to achieve the benefit of cloud we were all sold on: an end to over-speccing and redundancy. China’s Singles Day is the busiest day for card transactions in the world, requiring massive compute. Back in the day, if you were a global payments provider, you bought a huge on-prem server to handle that peak. But for most of the year that server sat idle, because almost all your transactions were in the UK and you only needed 10% of the machine.


In other words, old-style IT meant buying infrastructure, services, and DBA support to meet one or two annual peaks. With a true cloud native database you are immediately, and at no extra cost, elastically scalable: you can add more nodes and then take them away again, avoiding all of that asset wastage. And because a cloud native database doesn’t live on a single server, you don’t lose any business if one machine goes down.
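The over-speccing argument is easy to sanity-check with some back-of-the-envelope arithmetic. The node counts and prices below are illustrative assumptions (not real cloud pricing), using the "10% of the machine for most of the year" picture from above:

```python
# Back-of-the-envelope comparison: fixed peak provisioning vs. elastic scaling.
# All figures here are illustrative assumptions, not real pricing.

PEAK_NODES = 10        # capacity needed for the one busy day (e.g. Singles Day)
BASELINE_NODES = 1     # ~10% of peak covers normal daily load
NODE_DAY_COST = 100.0  # hypothetical cost of running one node for one day
PEAK_DAYS = 1
YEAR_DAYS = 365

# On-prem: the big server is sized for the peak and runs all year.
fixed_cost = PEAK_NODES * YEAR_DAYS * NODE_DAY_COST

# Cloud native: baseline nodes all year, extra nodes only on peak days.
elastic_cost = (BASELINE_NODES * YEAR_DAYS
                + (PEAK_NODES - BASELINE_NODES) * PEAK_DAYS) * NODE_DAY_COST

print(f"Fixed provisioning:   {fixed_cost:,.0f}")
print(f"Elastic provisioning: {elastic_cost:,.0f}")
print(f"Saving:               {1 - elastic_cost / fixed_cost:.0%}")
```

Under these toy assumptions, provisioning for the peak all year round costs roughly ten times what elastic scaling does; the exact ratio depends entirely on how spiky your workload is.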

For very good technical and basic enterprise IT architecture reasons that we can’t wish away, trying to make monolithic databases work in the cloud just doesn’t work, and neither do kludges like splitting your databases into tiny chunks or going NoSQL. You know this, and you also probably realize that you need a full database modernization.

That’s fine. Going cloud native gives you greater cost savings, greater resilience, and greater scalability. QED, cloud database is the way to go. Ahead of you lies the nirvana of finally realizing the full potential of cloud.

And this is where, potentially, your troubles could begin, if you barge into this move without some planning (and, probably, a little expert advice).


The Steps to Cloud Database Heaven

Say you have a new cloud native, distributed SQL database target/replacement you want to standardize on. The problem is that if your new database partner cannot hand-hold you through some potentially tricky steps, this could get complicated very quickly.

That could cost you time and money, and if you have to do it on your own, the route you’re planning to travel might not be feasible at all. Why do I say that? Re-writing your application to work with a replacement and getting your data into a new home is a non-trivial task. It must be done properly: your data is, after all, without question the most important asset your business has. If it’s spread across multiple databases, possibly of archaic provenance, this could be tricky; there’s likely to be a lot of it, and it’s probably not well understood, either.

You almost certainly also have more than one database type to deal with (e.g., Oracle AND SQL Server AND Snowflake, and so on), plus many instances of each. In fact, enterprises generally have between two and five database types; many have more than ten!

Aiming for the Optimal Combination of Database Types and Instances

I’m not saying you can’t have different database types for different use cases—some databases excel at analytics, and some at handling commercial or banking transactions. Your organization will probably have some Snowflake, some Cassandra, some PostgreSQL, some DB2 – maybe even our very own YugabyteDB (thank you!), and so on. But, if possible, you want to reduce the number of database types and instances, because you are almost certainly duplicating data needlessly, with all the associated problems of managing it, integrating it, and keeping it tidy.

If you can, you should use the excuse of your database modernization program to slim down your database stable and reduce your operational complexity. Using too many databases adds to the number (hence cost) of DBAs that you have, and the different skills you require in-house.


Ideally you’d want two or three types of database, and then an instance of a database for each business service you want to run. There’s an optimal combination that reduces costs and maximises resilience and all the other things that go with the database, which is, after all, the point of the whole exercise of database modernisation and moving to a cloud native database.
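As a rough sketch of that guideline, you could keep a simple inventory mapping each business service to its database type and check that the number of types stays small. The service names and database choices below are made up for illustration:

```python
# Hypothetical inventory: one database instance per business service.
# Service names and database choices are invented for illustration only.
inventory = {
    "payments":  "PostgreSQL",
    "analytics": "Snowflake",
    "reporting": "Snowflake",
    "catalogue": "PostgreSQL",
}

types_in_use = sorted(set(inventory.values()))
TARGET_MAX_TYPES = 3  # the "two or three types" rule of thumb from the text

print(f"{len(inventory)} services, {len(types_in_use)} database types: {types_in_use}")
if len(types_in_use) > TARGET_MAX_TYPES:
    print("Consolidation candidate: too many database types in play")
```

Even a spreadsheet-level check like this makes duplicated data and sprawling database estates visible before the migration planning starts.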


Moving these databases off, say, Oracle (nothing personal, Larry!) means redesigning the way that data is laid out in the system to improve access. It’s not a straight copy of the data, and it’s not a simple lift and shift. A lot of work is required.


Don’t be disheartened: there is some complexity here, but there is also modern tooling emerging from the cloud native world (backed by good professional services help) that can automate a lot of it. The good news is that this is a one-time effort worth making, and with the right help you can genuinely reduce your ongoing operational costs. Once there, you get to exploit all the cool cloud native features. But these benefits won’t come for free: there is an upfront cost and an intellectual effort in doing this work.


NB: moving to a new database is an exercise you have to repeat for each of your applications. If a database modernisation project is moving one application, the manual cost might be acceptable; but if you need to do it for 200 applications (hardly out of the question for, say, a financial services organisation), you need to industrialise and optimise it. If your cost is £50,000 for one migration, an automated way of doing this at scale could save you millions. Again, doing this properly and comprehensively will pay off in the long run.
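The scaling argument is simple arithmetic. Using the £50,000-per-migration figure from the text, and a hypothetical assumption that automated tooling cuts the per-application cost to a fifth:

```python
# Rough scaling arithmetic for migration costs. The per-app figure comes from
# the text; the automation discount is a hypothetical assumption.

APPS = 200                    # applications to migrate
MANUAL_COST_PER_APP = 50_000  # £50,000 per manual migration
AUTOMATION_FACTOR = 0.2       # assume tooling cuts per-app cost to 20%

manual_total = APPS * MANUAL_COST_PER_APP
automated_total = APPS * MANUAL_COST_PER_APP * AUTOMATION_FACTOR

print(f"Manual:    £{manual_total:,}")                       # £10,000,000
print(f"Automated: £{automated_total:,.0f}")                 # £2,000,000
print(f"Saved:     £{manual_total - automated_total:,.0f}")  # £8,000,000
```

Whatever discount automation actually delivers in your estate, the point stands: at 200 applications, even modest per-migration savings compound into millions.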


A Way to Do This That’s as Pain-Free as Possible

So, perhaps for the first time, there’s now a real, actionable business case for running transactional business apps in the cloud using a standard like PostgreSQL.

But be aware that this is not a simple task: it requires thought and planning. So perhaps it’s time to go to market to find the new cloud database you want to move to, plus a way to get there that’s as pain-free as possible.

In other words, to transition to the database modernisation future you want, don’t try and skip the necessary steps of planning, assessing, migrating, finalising and cutover.
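Those phases can be sketched as an ordered pipeline that refuses to skip ahead. The phase names come from the text above; the handler mechanics are a hypothetical illustration:

```python
# Minimal sketch: run the migration phases in order, never skipping one.
# Phase names follow the article; everything else is illustrative.

PHASES = ["plan", "assess", "migrate", "finalise", "cutover"]

def run_migration(app, handlers):
    """Execute every phase in order; fail loudly if any phase is missing."""
    completed = []
    for phase in PHASES:
        if phase not in handlers:
            raise RuntimeError(f"{app}: no handler for required phase '{phase}'")
        handlers[phase](app)   # do the real work for this phase
        completed.append(phase)
    return completed

# Usage with dummy handlers that just log each step:
handlers = {p: (lambda app, p=p: print(f"{app}: {p} done")) for p in PHASES}
done = run_migration("billing-app", handlers)
```

The point of the sketch is the shape, not the code: every application goes through every phase, in order, and a missing step stops the migration rather than silently skipping ahead to cutover.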

More importantly, don’t even start your database modernisation journey unless your new cloud native, transaction-friendly database partner has the right tools to help you through all of those critical steps.

