Last time we discussed Impact Assessment; the next logical step after that is to perform the actual upgrade or migration of the code.

Moving the Code: Performing Upgrade or Migration of the Objects

When we talk about product upgrades, the product vendor almost always provides tools by which the objects in the earlier version can be upgraded to the latest version. Yes, we would see some objects failing while going through such tools; these are the ones that would need rework after the upgrade process.

When we talk about product migration, like moving from Cognos to Business Objects or Business Objects to Cognos, there is good scope to automate the code migration. Earlier discussions covered how to leverage the metadata for understanding the environment; now we are looking at how to manipulate or transform the metadata so that an object in platform ‘A’ becomes compliant with platform ‘B’.

Steps involved in building an automated product migration process

Perform metadata-level object mapping between the two platforms and determine the gaps. This would actually be a ‘by-product’ of ‘Step 2’ in Impact Assessment.

Build individual components that would

  • Read the metadata from the source platform and prepare a repository
  • Hold the knowledge of the matches & gaps between the platforms, typically as reference tables
  • Transform the ‘source’ metadata and write it out in a form understood by the ‘target’ platform, using the reference tables
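The three components above can be sketched roughly as follows. This is a minimal illustration only; all object names, types, and mappings are hypothetical, and a real implementation would read and write metadata through the source and target products' SDKs rather than plain dictionaries.

```python
# Component 1: a repository prepared from the source platform's metadata
# (here, plain dicts stand in for objects read via the source SDK).
source_repository = [
    {"name": "Sales Report", "type": "list_report", "expression": "ToNumber(revenue)"},
]

# Component 2: reference tables capturing the match & gap between platforms.
OBJECT_TYPE_MAP = {"list_report": "table_block"}   # source type -> target type
FUNCTION_MAP = {"ToNumber": "ToDecimal"}           # source function -> target function

# Component 3: transform source metadata into the target platform's form.
def transform(obj):
    target = dict(obj)
    target["type"] = OBJECT_TYPE_MAP[obj["type"]]
    for src_fn, tgt_fn in FUNCTION_MAP.items():
        target["expression"] = target["expression"].replace(src_fn, tgt_fn)
    return target

migrated = [transform(obj) for obj in source_repository]
```

In practice the reference tables would be maintained as actual database tables or spreadsheets, populated from the gap analysis done during Impact Assessment.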

Benefits of Automated Migration

  • Helps avoid creating objects from scratch
  • Frees up time for testing (the core task) rather than code development
  • Enables the team to have a flexible skillset
  • A faster way of delivering when a ‘one to one’ migration from the source platform is seen as a must

Automated Migration Challenges

Transforming the source metadata to the target platform can be a challenge, especially with data manipulation functions. A good understanding of the gaps helps; a reference table mapping the functions between the platforms is useful. In scenarios where a function cannot be converted to the target platform, a comment can be written into a log file, flagging it for quicker attention.
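The fallback described above, logging functions that have no target equivalent, might look like this. The function names and log file name are illustrative assumptions, not from any specific BI product.

```python
import logging

# Gaps are written to a log file so they get quicker attention after migration.
logging.basicConfig(filename="migration_gaps.log", level=logging.WARNING)

# Reference table: source-platform function -> target-platform function.
FUNCTION_MAP = {"ToNumber": "ToDecimal", "Trim": "Trim"}

def convert_function(name, report):
    if name in FUNCTION_MAP:
        return FUNCTION_MAP[name]
    # No equivalent on the target platform: record the gap for manual rework.
    logging.warning("No target equivalent for %s in report %s", name, report)
    return name  # leave as-is so the gap stays visible in the migrated object
```

Leaving the unconverted function in place, rather than dropping it, makes the failure obvious when the migrated object is validated.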

I have seen good success in writing such automated migration components, though not 100%. With almost every product providing good SDKs for reading as well as writing metadata, along with support for XML structures, writing such bridges for object migration is getting easier.

Whether the objects in a product are migrated/upgraded in an automated way or not, the activity that follows, ‘Validation’, plays a key role in ensuring the final quality. Next time let us discuss some of the means for effective validation….

Posted by Muneeswara C Pandian
November 24th, 2008

Comments (2)

Muneeswara C Pandian - December 2nd, 2008

Definitely, User & Object security definition is one key piece of metadata that would need to be moved from the current platform to the target platform. Many BI products provide an SDK to add/update/delete user info as well as change the security definition of objects through an automated process. In 2007, all major BI product releases saw significant changes in their security architecture from earlier versions, which has made security definition a special task even in product upgrades. Generally the security definitions do not directly match between platforms, and many a time we would like to do it better in the new platform, so the steps involved would be:

1. Export the user definition and object security from the current platform to an excel
2. Make the required re-arrangement and definition changes in the excel as applicable to the target platform
3. Import the updated definitions from the excel to the target platform

Export/Import would be components built using the SDK provided by the BI product, which can read and write the metadata.

Regarding Object Scheduling, one option that I have seen working is not the actual recreation of the source ‘schedule’ metadata in the target platform but rather the automation of schedule script creation. All BI products support scheduling of reports via SDK APIs. The following steps should enable automated schedule creation or script generation:

1. A list of reports in an excel sheet along with the scheduling info
2. A component that can read the excel and invoke the SDK API for the reports, or create a specific schedule script for each report

Thanks for reading the blog.
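[Editor's sketch of the schedule-script generation described in the comment above. CSV stands in for the Excel sheet for simplicity, and `bi_sched` is a made-up command name; a real component would read the actual spreadsheet and invoke the BI product's scheduling SDK API instead.]

```python
import csv
import io

# A list of reports along with their scheduling info (the "excel sheet").
rows = io.StringIO(
    "report,frequency,time\n"
    "Sales Summary,daily,06:00\n"
    "Inventory,weekly,07:30\n"
)

# Generate one schedule command per report.
scripts = []
for row in csv.DictReader(rows):
    scripts.append(
        'bi_sched add --report "{report}" --freq {frequency} --at {time}'.format(**row)
    )
```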

Ravikiran Dharmavaram - November 26th, 2008

Hi Pandian, how do we deal with security, and architecture-related details like scheduling servers & folder-level rights and so on, while doing migration? I understand that for upgrades there are tools to do that; how about migration from one tool to another? Ravikiran.

Comments are closed.