Knowing who is who in the zoo is more important than what systems are out there to connect to

By Greg Craven
October 12, 2016

Back when I started in the industry, some twenty-something years ago (I do pretend I am still in my 20s, so that number has a nice ring to it), most large organizations had a single IT department with one manager. Now there are multiple managers within different departments, some aligned to different parts of the organization. Some pieces are outsourced, some in-sourced, and some are worked on by contractors.

When it comes to connecting systems together, the industry is focused on “having a connector to this or that,” while the really hard part is connecting to that particular implementation of that system. As the technologies evolved over the years, the pillars (or silos) of teams evolved with them. So providing an integration solution to connect multiple systems together is more of a project-management (herding cats) nightmare than a connector nightmare.

Let’s take a typical mid-sized company that wants to connect its cloud-based applications (CRM, HR, etc.) to its on-premises applications (SAP, Oracle Finance, Dynamics, databases, etc.). This is a pretty simple task, as we have all the connector options, and in the worst case we can always fall back on a web-service-based JSON/XML connector and database connectors. The problem of “do we have a connector to each system?” is solved within minutes. The real problem, and the time killer, is how to connect and to whom we will give access.

If we consider the layers of technology involved (taking the OSI model as a method of stepping through access):

Physical Layer – how is the server connected, and what speed limits could this be restricted by? (Is the server connected at all?)
Data Link Layer – what level of QoS do we have, are there any restrictions, which VLAN are we on, and what does that VLAN have (or not have) access to?
Network Layer – can we perform a network test to each system we need to connect to?
Transport Layer – can we retain a connection, and what is the performance of that connection?
Session Layer – what are the authentication mechanisms for each system? Can we authenticate?
Presentation Layer – can we gain access to the metadata behind each system? Do we have sufficient rights?
Application Layer – can we see a sample of the data we are connecting to? Does the data look like what we expected? Can we perform updates, inserts, upserts, deletes and reads? Has the application been customized, and can we access those customizations?

Achieving all of this requires working with different IT teams, both internal and external. It may also require working with vendors or other developers outside the organization. Consider the following roles (not an exhaustive list) whose trust, knowledge and assistance you would need to gain:

Server/Hardware manager – virtual server, capacity, server install
Operating System specialists – Windows / Linux / AIX / etc. Can they run your integration software? Who handles installation, patching and maintenance? Is there remote access to the server?
Network manager – in which zone was the server installed? Does it have connectivity to each system? Is there remote access to the server?
Security/Firewall – which ports are locked down and need opening for this new service? Is the anti-virus software causing issues? Is there remote access to the server? Browser access to the server?
Cloud application specialist – method of access, security, ability to access? Can we log in?
Database Administrators – database access, rights, simple database read tests
Specialist applications (e.g. SAP BAPI developers) – are there custom BAPIs that need to be used? Which of the standard BAPIs should not be used? Can we use the fat client/web application to view and query the system? Can we use a test/development system?
Application developers – is there a standard method for requirements gathering, development methodology, peer reviews, user acceptance testing, system testing and load testing?

When we are required to prove we can connect to a system, we spend 90% of our time working with the people above and only 10% on making the actual connection. Knowing who to work with, and gaining their trust and buy-in, is where the real hard yards are.

About Greg Craven

Greg Craven is a Technology Consultant and Director of Technology at Cumulus Technologies, an Actian partner in Australia. He has been an integral part of Cumulus’s consultancy since joining in 2008, including a stint as interim CIO for Arrow Energy, where he oversaw the restructuring of all of their IT operations, including recruitment. Greg has held many titles over the span of his career, including General Manager, CTO and CIO. He studied at the University of Central Queensland, where he earned a degree in mathematics and computer science.
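A postscript for the technically inclined: the OSI-style walkthrough earlier in the article lends itself to a quick automated smoke test of the lower layers. The sketch below, in Python, checks network reachability and transport-level latency with plain TCP connections. The hostnames and ports are placeholders rather than real systems, and checks for the session layer and above (authentication, metadata access, sample reads) would depend on each system's own API.

```python
import socket
import time


def check_tcp(host: str, port: int, timeout: float = 3.0):
    """Network/Transport layers: can we open a TCP connection, and how quickly?

    Returns (True, seconds_to_connect) on success or (False, exception) on failure.
    """
    start = time.monotonic()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True, time.monotonic() - start
    except OSError as exc:
        return False, exc


def run_checklist(targets, timeout: float = 3.0):
    """Step through each (name, host, port) target, recording the result."""
    return {name: check_tcp(host, port, timeout) for name, host, port in targets}


if __name__ == "__main__":
    # Placeholder endpoints only -- substitute the systems you actually need to reach.
    targets = [
        ("CRM (cloud)", "crm.example.com", 443),
        ("SAP (on-prem)", "sap.internal.example", 3300),
        ("Finance DB", "db.internal.example", 1521),
    ]
    for name, (ok, detail) in run_checklist(targets, timeout=1.0).items():
        status = f"reachable in {detail:.3f}s" if ok else f"unreachable ({detail})"
        print(f"{name}: {status}")
```

Even a script this small surfaces the organizational questions in the article: before it can report anything useful, the server it runs on must sit in the right network zone and the relevant ports must be open, which means talking to the network and firewall teams first.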