Blogger: Marcus Collins
In my recent post I detailed key points organizations should consider when embarking on a database consolidation initiative. In this post we’ll look at the different approaches and describe a structured approach that will lead to more predictable and repeatable results, be more efficient to execute, and provide the transparency required by IT governance processes.
There are two different approaches to database consolidation, namely:
- Hardware consolidation
- Software consolidation
With Hardware Consolidation, multiple database servers are consolidated onto a single physical server using server virtualization.
A benefit of the hardware consolidation approach:
- Risk reduction: Hardware consolidation is inherently a simpler process than software consolidation because no changes are being made to the physical database. All database objects and associated access privileges remain intact, so the testing required after the consolidation activity is minimal.
A drawback of the hardware consolidation approach:
- All or nothing: Hardware consolidation involves moving the server and all its constituent databases as a unit. As business processes evolve, the databases on a server may come to contend for system resources. Of course, this may be resolved by moving to a more powerful server.
With Software Consolidation, multiple schemas, derived from one or more distinct databases, are consolidated into a single database.
The benefits of the software consolidation approach are:
- Finer granularity: The software consolidation approach offers a finer degree of granularity than does hardware consolidation. Because the “unit of movement” is a database schema (i.e., one or more applications), this approach can be deployed when the databases on a server are contending for resources, have version incompatibility, and so on.
- Database redesign: The software consolidation approach allows the database to be physically redesigned. The redesign can involve index rebuilds, data file placement to eliminate disk performance hot spots, data sorting, and archiving to improve performance.
- Database version upgrade: The software consolidation approach can be combined with a database version upgrade. The rigorous testing required by such an upgrade can be used to validate the consolidation activity.
The drawbacks of the software consolidation approach are:
- High risk: Software consolidation is inherently more complex and therefore carries higher risk than hardware consolidation. The structure, privileges, and so on, of each source database, along with all its instance data, are extracted and then inserted into the target database. Testing the applications after this extract/insert will be extensive.
- Change control: Once deployed, databases are often accessed by users for reporting, data mining, and so on, in ways that are unknown to the IT department. Major schema changes may impact these user applications and/or processes.
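The extract/insert step at the heart of software consolidation can be sketched in a few lines. Below is a minimal illustration using SQLite; the schema names and the simple row-count validation are my own assumptions for illustration — a real consolidation also migrates privileges, indexes, and other objects, and requires far more extensive testing.

```python
import sqlite3

def consolidate(sources, target):
    """Copy every table from each source connection into the target
    database, prefixing table names with the source schema name, and
    validate the move with row-count checks."""
    counts = {}
    for schema, src in sources.items():
        tables = [r[0] for r in src.execute(
            "SELECT name FROM sqlite_master WHERE type = 'table'")]
        for table in tables:
            new_name = f"{schema}_{table}"
            # Recreate the table structure in the target (naive rename of
            # the first occurrence of the table name in the original DDL)
            ddl = src.execute("SELECT sql FROM sqlite_master WHERE name = ?",
                              (table,)).fetchone()[0]
            target.execute(ddl.replace(table, new_name, 1))
            # Extract all instance data and insert it into the target
            rows = src.execute(f"SELECT * FROM {table}").fetchall()
            ncols = len(src.execute(f"PRAGMA table_info({table})").fetchall())
            target.executemany(
                f"INSERT INTO {new_name} VALUES ({','.join('?' * ncols)})",
                rows)
            # Post-move validation: row counts must match
            src_count = src.execute(
                f"SELECT COUNT(*) FROM {table}").fetchone()[0]
            tgt_count = target.execute(
                f"SELECT COUNT(*) FROM {new_name}").fetchone()[0]
            assert src_count == tgt_count, f"row-count mismatch for {table}"
            counts[new_name] = tgt_count
    target.commit()
    return counts
```

Even this toy version shows why the risk is higher: every table, its DDL, and its data must survive the move intact, and each check here is only one of many an application test cycle would need to cover.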
Organizations operate in a highly competitive and regulated environment, so efficient and repeatable business processes are integral to an organization’s wellbeing. This applies equally to the database consolidation process. The recommended process is shown schematically below:
The Burton Group Structured Approach to Database Consolidation (SADC) takes a series of inputs and, through a complex tradeoff analysis, outputs a plan for a detailed analysis phase followed by the execution of a series of phased consolidations. The tradeoff analyses take the set of possible consolidation scenarios (i.e., schema-to-database and database-to-server combinations) and use the business objectives and constraints (i.e., prioritized tradeoff criteria) to determine the optimal consolidation scenario.
Organizations should treat database consolidation as a continuous process. At any given time, organizations will be either preparing for or executing a set of consolidation activities or phases. The reason for this is that one of the key drivers for the SADC process is the hardware refresh schedule. In large organizations, hardware is refreshed on a cyclic schedule (e.g., three to five years), so on a five-year cycle, approximately 20% of the hardware will be refreshed in any given year.
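The refresh arithmetic is straightforward: on a cyclic schedule, the fraction of hardware refreshed each year is simply the reciprocal of the cycle length. A quick sketch:

```python
def annual_refresh_fraction(cycle_years):
    """Fraction of the hardware estate refreshed per year on a
    cyclic refresh schedule of the given length."""
    return 1 / cycle_years

# A five-year cycle refreshes roughly 20% of the estate each year;
# a three-year cycle, roughly a third.
print(f"{annual_refresh_fraction(5):.0%}")  # prints 20%
```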
The inputs to the process are:
- Application portfolio
- Hardware refresh cycle
- Consolidation approaches
- Consolidation considerations
The tradeoff is made using:
- Prioritized tradeoff criteria
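The tradeoff analysis can be thought of as scoring each candidate consolidation scenario against the prioritized criteria and picking the best. The sketch below is my own simplification, assuming criteria can be expressed as numeric weights and scenarios scored against them — the actual SADC analysis is considerably richer.

```python
def best_scenario(scenarios, weights):
    """Return the candidate scenario with the highest weighted score.

    scenarios: mapping of scenario name -> {criterion: score}
    weights:   mapping of criterion -> priority weight (higher = more important)
    """
    def score(criteria):
        return sum(weights[c] * v for c, v in criteria.items())
    return max(scenarios, key=lambda name: score(scenarios[name]))

# Illustrative prioritized tradeoff criteria, expressed as weights
weights = {"risk_reduction": 0.5, "cost_savings": 0.3, "performance": 0.2}

# Illustrative candidate scenarios, scored 0-10 against each criterion
scenarios = {
    "hardware_consolidation": {"risk_reduction": 9, "cost_savings": 5, "performance": 4},
    "software_consolidation": {"risk_reduction": 3, "cost_savings": 8, "performance": 7},
}

print(best_scenario(scenarios, weights))  # prints hardware_consolidation
```

With these (made-up) numbers, the heavy weight on risk reduction favors the hardware approach; a cost-driven organization would weight the criteria differently and could reach the opposite conclusion, which is exactly the point of making the tradeoff explicit.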
In a recently published document, Structured Approach to Database Consolidation, Burton Group Senior Analyst Marcus Collins explores the topics described in this post in more detail. Each approach has its benefits and drawbacks, and the document details these. Adopting the approaches outlined in the document will allow organizations to develop a roadmap for successfully reducing operational complexity.