
Hero Profiles

Ongoing Q&As with IT and business leaders focused on best practices, thinking, and tips for data integration success


Vinod Kumar is the Assistant Vice President of Software Development and Information Technology at Hannover Life Reassurance Company of America, the North American life and health reinsurance subsidiary of the Hannover Re Group. He is also the Chairman of the Advisory Board for Mphasis Wyde, where he is focused on providing insights on program management and IT and infrastructure planning. With over 16 years of experience in information technology, Vinod has a proven track record for driving business innovation and process improvements. He has played an integral part in HLR America’s integration project and has been instrumental in orchestrating the success of the project on a global scale for the company. In the following discussion, Vinod provides key insights he has gained from the data integration project and points out ways for companies to get a good head start on integration.

Q: What were the goals with your integration project?
A: We wanted our business and IT departments to combine responsibilities and make decisions about adopting new technologies or solutions that would increase business productivity, efficiency and revenue while reducing cost. This is critical for us, and for most companies; Gartner predicted that by 2018, 15% of workers would rely on priority services to discover, organize and contextualize information. Insurance companies are usually slow to adopt new technologies and methods, and in the reinsurance segment, we are one step behind the broader market. We saw a need to give all stakeholders in the organization more organized and trustworthy data so that they could make effective decisions quickly and efficiently. We set forth with a goal that our integration effort would provide a single, consistent source of data for any downstream processes.

Vinod Kumar
Assistant Vice President of Software Development
and Information Technology
Hannover Life Reassurance Company of America

To do this, we ultimately had to integrate information from a number of differing data sources. Because we dealt with data from disparate insurance clients, it was diversified and the data size ranged from small and manageable to complex and large. With this set of diversified sources, we needed to pay special attention to the integration process because the more sources, the more complicated the process. Our data model needed to be comprehensive enough to hold all possible data values from these various sources and provide a consistent data structure for any downstream process, now and in the future.
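The comprehensive data model described above can be illustrated with a minimal sketch. This is not HLR America's actual system; the field names, source names, and mappings are all hypothetical, chosen only to show the idea of translating records from disparate sources into one consistent structure for downstream processes:

```python
# Hypothetical common schema that every downstream process consumes.
COMMON_FIELDS = ["policy_id", "insured_name", "effective_date"]

# Each source supplies its own field names; a per-source mapping
# translates them into the common model. Both sources and field
# names here are invented for illustration.
SOURCE_MAPPINGS = {
    "client_a": {"PolNo": "policy_id", "Name": "insured_name", "EffDt": "effective_date"},
    "client_b": {"policy_number": "policy_id", "insured": "insured_name", "start_date": "effective_date"},
}

def to_common_model(source: str, record: dict) -> dict:
    """Translate one source record into the common schema, filling
    missing fields with None so the structure is always consistent."""
    mapping = SOURCE_MAPPINGS[source]
    translated = {mapping[k]: v for k, v in record.items() if k in mapping}
    return {field: translated.get(field) for field in COMMON_FIELDS}

row = to_common_model("client_a", {"PolNo": "P-100", "Name": "J. Doe"})
# row always has all three common fields, regardless of source.
```

The point of the sketch is that adding a new source only means adding a new mapping; every consumer keeps seeing the same structure.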

“We saw a need to give all stakeholders in the organization more organized and trustworthy data so that they could make effective decisions quickly and efficiently.”

Q: What were the key things your organization had to understand and change to get integration right?
A: First, we looked only at our US office and didn't expect to implement the initiative across all of our offices, but our other offices saw the benefits and wanted to be a part of it. This ended up quadrupling the complexity of the project in both scope and management. With this change, we saw the need to implement a common system and get consensus from all the stakeholders involved in order to ensure success. It also meant we had to factor in geographical and local regulations. Although the concept of reinsurance is common globally, many variables come into play when administering the business from different locations. Because of regulatory factors, data considerations and client-specific projects, we had to rethink our plans once we broadened the scope beyond the US to make it work for all of our local offices.

Another change was that we wanted IT to be heavily involved in developing the solution and then to hand it over to business users, who are the experts in the data. However, the expanded scope and the additional offices required us to shift from a solution to a framework, because we didn't think having IT manage the data for the users was the most effective approach. We developed a framework that would be common to all, with the idea that each local office would add specifics for administering its own business. During the course of development, we made a lot of changes as we learned, discussed the project, and brought people and teams on board.

Q: What was the biggest hurdle your teams had to overcome?
A: Even if people know a new process is likely to achieve better results, change in general is difficult across teams. This comes not only with new opportunities and challenges, but also with uncertainty, skepticism and fear, which we had to carefully deal with up front. We’ve tried to mitigate these concerns by involving the local offices and the necessary stakeholders in every decision-making process. This way we are able to get consensus as we move along rather than waiting until something is done and checking back.

Q: What results have you seen from the integration project so far, and how have you seen attitudes change since starting to implement an integration project?
A: The results to date have been really great. Because all of our offices have an equal share in the decision-making process, they feel that their opinions are valued and are very supportive. Although the effort is centralized in the US, we are including people from other offices, which has helped us understand their requirements and adapt the framework as necessary for better scalability and robustness going forward. We also made a point of including their input during development, making them feel a part of what was being built and proud of the results.


Q: Have there been any pleasant surprises that data integration has brought to you or your customers?
A: One of the surprises was discovering how many processes were already in place and, somehow, more or less working. Everyone had a system comprised of old technologies and manual steps, yet everything functioned after a fashion. A major goal for this project was to deliver a solution that greatly improved efficiency and automated the process to remove errors. We started the integration initiative by identifying the efficiencies we wanted to keep from the current system and then layering on the improvements we needed. When we were done, we had a single source of data that can be readily used by all the stakeholders downstream. This is a definite improvement in that it directly improves results and indirectly reduces costs.

Q: Can you give us an example of a difficult integration and what made it so tough?
A: For manufacturers, a prime integration opportunity is an ERP integration; specifically, ERP integration to Salesforce. Because ERP is the backbone of many manufacturers’ businesses, it’s natural to connect it to a CRM platform so that everyone has a clear view of every customer’s history and activity. But you have to understand the nuances between platforms. Organizations that jump in without understanding this usually run into the natural learning curve that comes with integrating ERP platforms to Salesforce and other business-critical platforms such as e-commerce.

In one example, we had a customer whose ERP system was hosted and managed by a third party. Our team had performed dozens of integrations with this same ERP system without an issue. In this case, however, the customer’s system configuration and security setup were much different, and there was little flexibility to make changes. We were not able to use the standard mechanisms to track data changes or easily access the system. On top of that, this customer had a very high volume of data transactions, which required us to get creative and come up with a solution that worked for their environment. For example, to ensure there is no interference with the customer’s production system, a nightly system backup is performed to refresh the integrated data. This introduced some challenges with tracking deletions, so we had to add processes that compare the current backup data against the previous backup. As you might imagine, this requires processing large amounts of data during off-hours, so special attention is paid to ensuring efficient processing.
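The backup-comparison approach described above can be sketched in a few lines. This is a simplified illustration, not the actual implementation: with no change-tracking available in the hosted ERP, deletions are inferred by comparing the primary keys in the current nightly backup snapshot against those in the previous one. The record shapes and key names here are hypothetical:

```python
def find_deletions(previous: dict, current: dict) -> set:
    """Keys present in the previous backup but absent from the current
    one were deleted in the source system since the last refresh."""
    return set(previous) - set(current)

def find_upserts(previous: dict, current: dict) -> dict:
    """Keys that are new, or whose payload changed, between backups."""
    return {k: v for k, v in current.items() if previous.get(k) != v}

# Toy snapshots keyed by a hypothetical record ID.
prev_backup = {"1001": {"qty": 5}, "1002": {"qty": 3}, "1003": {"qty": 9}}
curr_backup = {"1001": {"qty": 5}, "1003": {"qty": 7}, "1004": {"qty": 1}}

deleted = find_deletions(prev_backup, curr_backup)  # {"1002"}
changed = find_upserts(prev_backup, curr_backup)    # 1003 updated, 1004 new
```

At real volumes this comparison would run against key indexes rather than full in-memory payloads, which is why the off-hours efficiency concern mentioned above matters.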

“…it is essential that as soon as you start a data integration project, you make sure that the goals are clear from both an IT and business standpoint.”

Q: What are the biggest things everyone should think about when it comes to data integration?
A: The end goal of any integration project is for stakeholders to have complete trust in the data. This can only happen when the data takes center stage during the integration. Therefore, it is essential to pay close attention to the four magical V’s of data: Variety, Volume, Velocity, and Veracity.

  • Variety relates to the format and structure of the data. Integration projects need to be designed to handle various formats which can increase with the number of data sources considered for the project.
  • Volume is the size of the data, which can be viewed along two dimensions: the width and the length. The overall size of the data, or the space required to store and manage it, is directly proportional to both its width and its length.
  • Velocity is the frequency at which the data is received. The project design depends on whether data gets consumed in real- or near real-time, offline or batch processing.
  • Veracity refers to the quality of the data and if it is curated, mastered, cleansed and reliable.
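The veracity dimension above can be made concrete with simple, automated quality rules. This is an illustrative sketch only; the field names and rules are assumptions, not anything from the actual project:

```python
from datetime import date

def veracity_issues(record: dict) -> list:
    """Return a list of data-quality problems found in one record.
    An empty list means the record passes these basic checks."""
    issues = []
    if not record.get("policy_id"):
        issues.append("missing policy_id")
    if record.get("premium", 0) < 0:
        issues.append("negative premium")
    eff = record.get("effective_date")
    if eff is not None and eff > date.today():
        issues.append("effective_date in the future")
    return issues

clean = veracity_issues({"policy_id": "P-1", "premium": 120.0})  # []
bad = veracity_issues({"premium": -5})  # two issues flagged
```

Running checks like these as data lands, rather than after stakeholders complain, is what lets downstream users trust the data.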

Other key factors to pay close attention to are the roles of governance and compliance, understanding how the data will be used and making sure the goals are aligned across teams, users and management. For us, that meant handling data with respect to Personal Information (PI), which can be different across location with different standards and policies. When it comes to data usage, the key is to identify the value that the data is going to provide. It’s not just about providing a terabyte or petabyte of data to be managed, but about the value the solution is going to provide to its stakeholders.

The final point is that it is essential that as soon as you start a data integration project, you make sure that the goals are clear from both an IT and business standpoint. Data integration projects are heavily collaborative in comparison to other projects between IT and other sectors of the business, and both groups have equal responsibility in making sure each project is successful. IT needs to understand what value the department is expecting and the department needs to provide information so IT can put together a valuable system. IT has to be upfront with the users in terms of how the data fits the requirements and what will be delivered at the end of the day. This includes using the right set of tools to do the job based on the requirements and the data considerations. When it comes to any system development for a set of users, performance is the chief impediment to the solution. Even if the system meets all the requirements, if it does not perform well, users will reject it. In fact, we give performance equal priority to any other business set requirement. We remind each other constantly that data integration, data transformation and data analysis have to be quick in order to be effective.

Q: So far, what has been a pleasant and surprising outcome from your data integration project?
A: One thing I’m really impressed with is how closely IT and business have worked together. In traditional application or system development, users provide the basic requirements and the application development team uses them as a reference in developing the system and providing deliverables to the users. However, the data integration project has brought IT and business close together and has created strong bonds between the two. Before the project, we always preached the theory that IT and business should collaborate, but this project has actually given them the chance through every aspect of project execution. I’ve realized that when IT and business work together, the results more than double. There is something very impressive about two strong teams coming together to produce an outcome that affects the organization’s future, and I’m really moved by that.

Q: What are the key questions people should ask before jumping into any data integration project?
A: There are really four questions essential to getting data integration right:
1. What is the purpose of this data integration project? Clarifying the goals up front lets you determine the right technology and resources, and make sure you have the right candidates from both IT and business to support the development of the project.
2. What business resources do we need to get the right answers? IT needs to put its questions to the appropriate business teams who use the data, so it has the right answers and the right decisions can be made in developing the system.
3. How should we tackle getting all users involved at the right time, place and level? A strategy needs to be developed up front to make sure no one important is left out of the decision-making process.
4. How do we create prototypes of the system so we identify issues before full launch? Having a useful prototype or proof of concept for a data integration project makes the development process much smoother. By creating a prototype and a proof of concept, we reached the point where we could see clearly how we were going to succeed.

For a data integration project to be truly effective, it helps to have a set of objectives based on the initial requirements. I would strongly recommend that people spend time doing the initial discovery and then doing a proof of concept or prototyping the initial assumptions to make sure that the project is going along the right path. This ensures that corrections can be made way ahead of the game.
