Month: October 2018

The roadmap of many oil and gas companies today stands on the cusp of digitization. While other similarly sized industries such as aeronautics, renewable energy and manufacturing have successfully adopted technology to boost business and processes, oil and gas still loses money due to a lack of efficiency in operations. According to Deloitte, this loss stood at a whopping $35 billion reported by listed upstream, oilfield services, and integrated companies worldwide in 2016.

The same report also states that even an increase of 1% in capital productivity is enough to write off this cumulative net loss. The empirical data seems to clearly indicate that the oil and gas industry is primed for a digital transformation.

The easy availability of connected devices, combined with smart apps that can streamline all aspects of the production process by utilizing the latest cutting-edge technology, makes the choice to go digital a no-brainer.

We at Incture Technologies are actively working towards aiding this digital transformation. Our apps and solutions have helped transform operations and processes across multiple industries, and quite a few of our marquee clients are working in the oil and gas sector.

One of the prime examples is an ongoing project with a leading multi-national oil and gas company. The company works with thousands of resources taking care of day-to-day operations, repair, maintenance and many other such tasks.

The staff provided data on pen and paper, which resulted in very low visibility for the company’s admin team, as well as process inefficiency and an inability to accurately predict costs of operations. They wanted to improve operational efficiency, reduce manual paperwork and digitize the entire process.

This is where Incture stepped in to provide two solutions in the form of the Digital Field Ticket (DFT) and Integrated Operations Platform (IOP) apps.

DFT is used for tasks where third-party vendors are involved. Earlier, the whole process of invoice creation and verification would take weeks due to geographical difficulties. But with DFT, vendor staff can upload data from any location, even while offline!

The company now requires less than 8 hours to create a digital ticket, seek approval and record it in the service center sheet.

DFT has been adopted for 40 vendors so far and plans are in place to implement the solution for the company’s operations with more than 400 vendors.

The IOP app, on the other hand, is a one-stop solution to monitor all aspects of the production process. Earlier, there was no system in place to effectively monitor requirements, task assignments and task completion. The data about operations was also scattered across multiple systems, making data analytics a difficult task.

IOP helped the company communicate more effectively, assign tasks to the best suited teams, achieve real-time visibility and capture data across the board. This helped them reduce turnaround time by as much as 40% in some cases.

The apps were also integrated with multiple process points, thus adding transparency, accuracy and predictability to all invoicing functions undertaken by all stakeholders.

This is just one example of how our technology is helping industries adapt to a brave new world and boost profitability while they’re at it. If you want to see what kind of value we can create for your organization, reach out to us for a free demo!

I would like to begin this article by introducing the CIO Guide published by SAP, which documents SAP’s vision for integrating SAP applications in both cloud and hybrid landscapes. I personally think the guide is a great starting point for dealing with one of the evolving architecture decision points around integration, especially when organisations move towards adopting cloud applications.

While SAP treats the enterprise in a ‘SAPcentric’ way (this is SAP’s own disillusioned ‘Geocentric’ model of the enterprise world), most enterprises are heterogeneous (no surprises there) and the EA needs to be considerate of that fact. In this article, I would like to bring together all the components into a single consolidated view of this portfolio, and share an independent view of the SAP integration technology portfolio and the positioning of these different platforms/technologies in the enterprise.

Not many may realize it, but SAP has a massive portfolio around integration (inclusive of the wider context of integration, i.e. process optimization, DQ, Edge integration etc.). A quick snapshot of this portfolio is below:

Note: The ones in blue are the on-premise solutions and the others are cloud offerings, part of the SAP Cloud Platform.

There is already a good blog that introduces the cloud solutions that should provide you with a decent understanding of the capabilities and help with a choice of what to use when.

Below are extracts on the on-premise solutions:

Process Orchestration – SAP’s strategic on-premise integration solution for enterprise A2A and B2B integration, along with Business Process Management and Business Rules Management.

Fitment: Enterprise landscape integration, business process optimization and automation, rules management.

Operational Process Intelligence (OpInt) – This is, in my view, SAP’s advanced BAM solution. OpInt promises process transparency, limitless scenario transparency and insight to action around business processes, with the key USP being the real-time data context.

Fitment: Real time insight to action for operations (linked to critical business processes)

Process Mining – A data based process discovery tool that promises to help enterprises gain complete transparency into how processes are executed, increase process efficiency by identifying process deviations and weaknesses, improve compliance by detecting non-compliant processes and drive profitability.

Fitment: Process discovery and a foundation for business transformation

SAP Data Services (BODS) – This is SAP’s technology encapsulating data integrator and data quality. BODS is one of the most popular solutions when it comes to data provisioning and ETL, used widely in data migration and DQ led initiatives.

Fitment: Data integration and Data quality initiatives

Gateway – SAP Gateway is an open standards-based framework that developers can use to more easily connect non-SAP applications to SAP applications. It is also used to connect to and access SAP applications from mobile devices. As a result, it is a chief enabler of some of SAP’s newest and most prominent technologies, including the SAP Mobile Platform and SAP Fiori.

Fitment: UI/UX transformation for the SAP landscape

You would notice that most of these technologies overlap in terms of capabilities (functional and technical), which tends to create natural confusion for any enterprise architecture team trying to provide guidance on tool fitment.

I would have assumed that a company like SAP would have consolidated its portfolio around these technologies, but apparently either it has been a slow process at their end or their internal strategy is once again so SAPcentric that they seem to spawn parallel products with overlapping capabilities.

A closer look might reveal that at times these products are very scenario specific, catering to distinct use cases. Take Gateway, for example: I have often wondered whether it would have been better for the capability of exposing SAP business data as OData services to be part of the Process Orchestration (SAP PO) suite, driving consolidation. But for two reasons Gateway now has a strong fitment in the landscape: one, not all ERP customers have a SAP PO licence; and two, the specific use case Gateway targets is enabling business data as lightweight APIs (OData) for consumption primarily by UIs.
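As an illustration of that lightweight-API use case, here is a minimal sketch of how a UI client might assemble an OData query URL. The service path, entity set and field names are hypothetical assumptions for illustration, not a real Gateway endpoint:

```python
from urllib.parse import quote, urlencode

def build_odata_query(base_url, entity_set, filters=None, select=None, top=None):
    """Assemble an OData query URL of the kind a Gateway service exposes.

    All names used with this helper below are illustrative assumptions.
    """
    params = {"$format": "json"}  # ask the service for a JSON payload
    if filters:
        params["$filter"] = " and ".join(filters)
    if select:
        params["$select"] = ",".join(select)
    if top is not None:
        params["$top"] = str(top)
    # keep OData's $, quotes and parentheses readable in the query string
    query = urlencode(params, quote_via=quote, safe="$'()")
    return f"{base_url}/{entity_set}?{query}"

url = build_odata_query(
    "https://sap-host:8000/sap/opu/odata/sap/ZSALESORDER_SRV",  # hypothetical
    "SalesOrderSet",
    filters=["CustomerID eq '1001'"],
    select=["SalesOrderID", "NetAmount"],
    top=5,
)
```

The point of the sketch is how little a consuming UI needs to know: an entity set name plus a handful of `$`-prefixed system query options, rather than a full integration flow.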

Again, from a consolidation perspective, I would have wanted to see the consolidation of distinct platforms like SAP Process Integration, SAP BRM and SAP BPM into SAP PO happen sooner. While SAP PO is now a very stable platform and one of the leaders in this space, further consolidation could position SAP more strongly at the enterprise level. An example would be the API management solution: as of today, API management is a separate technology component both on the cloud and on-premise. Most technology vendors have already consolidated this capability into their ESB or integration-PaaS solutions, making their platforms much more comprehensive for the new enterprise’s (read: the advent of cloud, mobility and UX) integration needs.

But having said the above, SAP comes out very strongly with their portfolio and the set of technology tools provides one with an exhaustive capability list spanning most needs of enterprise integration.

On the cloud, SAP seems to be investing significantly, and innovations are being rolled out for general availability very frequently. One such example is SCP-Integration. SCP-I started out as a cloud integration solution mainly for SAP-to-SAP applications (on-premise to cloud or cloud to cloud). Over the last 15 months, it has positioned itself in the iPaaS category with major capability additions around connectivity to varied systems, support for multiple protocols and the recent addition of B2B/EDI capabilities. SCP-I is a challenger in many ways, but it still covers a limited set of scenarios. In comparison to its on-premise counterpart, SAP PO, SCP-I has its shortcomings. While customers should look at taking advantage of the packaged integration content that is one of the key USPs of SCP-I, two specific capability areas that deserve attention from a roadmap perspective are connectivity (adapters) and B2B integration.

Note: This blog documents the existing gaps around EDI/B2B capability on the iPaaS solution.

The recent addition of SCP-Workflow and Rules packs a lot of power into the cloud offering, opening up a lot of possibilities around innovation and application development. Other individual components like the IoT services, SDI, SDS, API-M etc., put together, make it one of the most competitive iPaaS offerings that customers can currently adopt.

So it comes down to one question: how does one choose which technology to adopt? While the above can guide you to possible fitments, at an EA capability level many decision makers are forced to concern themselves with the choice between the on-premise and cloud solution stacks.

In my many discussions with CIOs and IT stakeholders, there is an urge to move onto the cloud. I sense that though on most occasions this is driven by a long-term strategy of cloud adoption, at times it is also misplaced in many ways.

I would want to choose my words carefully here, but what is a point of view if not for its genuineness? Most IT decision makers are being swayed onto the technology bandwagon that is cloud, especially around integration. Keeping aside any immediate benefit, say via the attractive licence models and a possible reduction in your OPEX, let’s not forget that each tool has its own fitment. So when it comes to investing in an on-premise solution vs an iPaaS to manage enterprise integration needs, I believe the decision has to be driven predominantly by the answers to the questions below:

  • Is the enterprise landscape predominantly on-premise or cloud?
  • Does your immediate roadmap include multiple cloud applications?
  • What are the various integration points and protocols in your landscape?
  • How B2B/EDI-heavy are you as an organisation?
  • Are connected devices part of your IT strategy?
  • Do you manage data on the cloud (read: analytics, reporting etc.)?
  • Do you have a microservices architecture planned?
  • Do you have an imminent need to govern and monetize your APIs?

Asking these questions will help strengthen your decision making. A simple rule of thumb I believe works well: if more than 80% of enterprise applications are on-premise, then investing in an on-premise solution to manage integration is ideal; if 80% of your applications are on the cloud, iPaaS is the way to go. But you will find that most companies are in a process of transition, and hence the importance of a hybrid architecture where both the on-premise solution and the iPaaS become complementary. Even in this case, one is forced to answer the question of what to use where. Here we need to scrutinize the capability of the platforms (keeping a close track on their roadmaps).
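The rule of thumb can be sketched as a tiny decision helper. The 80/20 thresholds are this article’s heuristic, not a formal SAP guideline, and the platform names in the return values are examples only:

```python
def integration_platform_hint(onprem_apps, cloud_apps):
    """Rough heuristic: >=80% on-premise apps -> on-premise integration
    solution; >=80% cloud apps -> iPaaS; anything in between -> hybrid.
    """
    total = onprem_apps + cloud_apps
    if total == 0:
        raise ValueError("no applications in the landscape")
    onprem_share = onprem_apps / total
    if onprem_share >= 0.8:
        return "on-premise integration solution (e.g. SAP PO)"
    if onprem_share <= 0.2:
        return "iPaaS (e.g. SCP Integration)"
    return "hybrid: on-premise solution plus iPaaS, split by use case"
```

In practice the hint is only a starting point; the answers to the B2B, IoT and API questions above can easily override a raw application count.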

A good example would be EDI integration. Many large organisations depend on their EDI solution to run most of their business; hence EDI becomes mission critical. Imagine you are a customer with your core ERP solution on-premise, using a legacy EDI solution running a large set of interfaces with a large set of trading partners, and you are looking to modernize. If you are faced with a decision between SAP PO and SCP-I, at this point in time I would be inclined to use SAP PO due to the stable core and the rich capability the platform provides around EDI/B2B. On the other hand, if you had a relatively smaller EDI landscape, one would choose to onboard trading partners onto SCP-I because the trade-off can be managed. Do note that these decisions are time-frame driven: if, say, six months from today SAP closes all the gaps around capabilities in this space, one will lean towards using an iPaaS solution for EDI.

There is also the matter of having a consolidated landscape. If you are an enterprise heavily invested in on-premise applications, would it make sense to use an iPaaS solution for your B2B and also your A2A? Why would you want to integrate two on-premise applications via an iPaaS solution? The answer always lies in your enterprise strategy around cloud adoption. This is also where the hybrid landscape gets reinforced. If you are heavily on-premise in the application space, with that estate to be retained on-premise in the long term, I would place my bets on an on-premise solution for integration, and vice versa on an iPaaS solution. But in most cases, when you are embarking on a cloud journey, hybrid integration will find its fitment. The idea is to be aware of the capabilities of the platforms and base your decision on them along with your business needs.

I personally feel that for the next 5 years, the relevance of the hybrid landscape will become more and more prominent. IT decision makers and enterprise architects will have to define decision points and enforce the right usage of platforms, primarily driven through use-case fitments.

We look forward to meeting you at the SAP Sapphire and ASUG Annual Conference 2018, 05-07 Jun, Orlando, FL. We are at Booth-1089C.
To see Session Catalog: Click here

Electronic Data Interchange (EDI) is the computer-to-computer exchange of business documents in a standard electronic format between business partners. There are several EDI standards in use today, including ANSI, EDIFACT, TRADACOMS, ODETTE, VDA etc. And, for each standard there are many different versions.

When two businesses decide to exchange EDI documents, they must agree on the specific EDI standard and version. Businesses typically use an EDI translator, either as in-house software or via an EDI service provider, to translate the EDI format so the data can be used by their business applications, thus enabling the processing of transaction documents.
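To make the translation step concrete, here is a minimal sketch of separator-based parsing for a single EDIFACT segment. A real translator also handles the release character, repetition separators and envelope (UNB/UNH) validation; this only illustrates the basic idea, and the sample segment values are illustrative:

```python
def parse_edifact_segment(raw, element_sep="+", component_sep=":"):
    """Split one EDIFACT segment into its tag and its data elements.

    Simplified sketch: assumes the default separators and ignores the
    release character (?), which a real translator must honour.
    """
    elements = raw.rstrip("'").split(element_sep)  # ' terminates a segment
    tag, data = elements[0], elements[1:]
    # each data element may itself be composite, split by ':'
    return tag, [element.split(component_sep) for element in data]

# NAD = Name and Address segment; BY = buyer qualifier
tag, data = parse_edifact_segment("NAD+BY+5412345000013::9'")
```

The parsed components would then be mapped onto the fields of the receiving business application, which is where the agreed standard and version matter.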

Many large enterprises have invested heavily into EDI solutions, typically in an internal EDI translation system or middleware. Most such enterprises today are looking at modernizing this landscape due to the typical IT needs around consolidation, licence expiry, end of support and reduction of TCO.

Embarking on an EDI modernization journey is not an easy path to traverse. There are many challenges that customers must factor in as part of their migration planning. This article discusses them, along with possible solutions that can be embedded into the planning process.

Challenge 1 : Lack of legacy knowledge

Usually the most common and most crucial of all challenges is when there is little or no expertise around the legacy EDI estate. While the customer has an operations team (in most cases a very lean one keeping the lights on), it has over the course of time lost most of the knowledge of the finer technical details. A lack of documentation or updated specifications (functional or technical) adds more fuel to the fire. This is typically the result of a ‘stabilized’ environment over a long period of operation. The challenge is that during the modernization exercise, the migration team will have little support in de-compiling mapping specifications, which are the crux of any EDI migration exercise.

While some legacy platforms provide the configuration and mapping implementation as exportable files (mostly CSV), there are aspects that have to be documented, for example the use of lookup tables or cross references, routing logic based on rules etc., that will be at risk of being missed.

Solution: Any EDI modernization program should factor in sufficient time in the form of an analysis phase. The legacy team should be deeply embedded into the program along with the migration team. This can not only facilitate a meaningful analysis phase but also help identify technical challenges and other risks ahead in the project life cycle.

This partnership between the legacy team and the migration team should continue throughout the project. Where possible, the migration team can also carry forward the knowledge gained through this engagement to help de-compile the legacy EDI solution. This is a possibility over the course of the project and helps accelerate the migration.

A pilot or early release with a contained, low-impact scope should be planned before the larger scope of the project is fully initiated. This will help the project team fine-tune the approach and allow the wider stakeholders to appreciate the challenges of migration upfront.

And remember, document everything from day one!


Challenge 2 : Heavy Customization

This is typical of almost all legacy landscapes, where over the years customized code has engulfed the EDI landscape. Custom maps, custom EDI types, flat structures and in-house developed structures will have been constantly creeping into the EDI estate. The more the customization, the more complex the migration.

Solution: The modernization project should be wary of this and not look at doing a one-to-one migration. Most times, simplification is the answer to such situations. As part of the analysis and design phases, the project should look for opportunities to:

  • Rationalize and consolidate
  • Adopt standard EDI messages
  • Design with re-usability as the core principle

Challenge 3 : Poor Governance

Due to poor governance structures and processes, legacy landscapes are usually plagued with multiple versions of code, duplicated efforts, inefficient exception handling and longer RCA time.

Solution: Like any integration landscape, establishing a governance model is important. The modernization initiative should not only look at migrating off the legacy platform to the latest technology platform but also put in place a process for governing the landscape as part of ongoing operations. SOA governance principles can be tweaked to help establish best practices in the modernized landscape.


Challenge 4 : Long cycle times around trading partner on-boarding

A spill-over of the challenges around points 2 and 3: on-boarding a trading partner is usually seen as a cumbersome activity. This is usually due to the multiple rounds of conversations needed to reach mutual agreement around message exchanges, transfer protocols, transformation logic and business rules.

Solution: Enforce standards as much as possible. While appreciating that there will be customization, it is important that, where possible, standard message sets are leveraged. Where customizations are needed, extend the standard set rather than creating a completely new custom structure.

Trading partner management (TPM) functionalities that come as part of EDI platforms are also highly recommended. TPM provides a flexible, configuration-driven framework to quickly provision an EDI flow for trading partners, and helps create template-based implementations, which are an important accelerator.
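The template idea behind TPM-style provisioning can be sketched as a shared flow definition merged with per-partner overrides. The field names, values and partner IDs below are illustrative assumptions, not taken from any specific TPM product:

```python
# A reusable flow template for one message type; partner on-boarding then
# becomes configuration rather than development.
ORDERS_TEMPLATE = {
    "message": "ORDERS",
    "standard": "EDIFACT",
    "version": "D96A",
    "protocol": "AS2",
    "ack_required": True,
}

def onboard_partner(partner_id, overrides=None):
    """Create a partner-specific flow config from the shared template."""
    config = dict(ORDERS_TEMPLATE)   # start from the template defaults
    config.update(overrides or {})   # apply the partner's deviations only
    config["partner_id"] = partner_id
    return config

# hypothetical partner that deviates from the template only in transport
acme = onboard_partner("ACME", {"protocol": "SFTP"})
```

The accelerator effect comes from the small size of the per-partner delta: most partners inherit the template untouched, and only genuine deviations need discussion.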

Once again, designing with re-usability and agility (standards adoption, re-usable libraries, configuration framework, BRMS etc) becomes the key mantra.


Challenge 5 : Disruptive Migration

Most stakeholders are concerned about disruption to the business during the cutover to the new EDI platform. This is a major challenge, as some EDI platforms are responsible for almost 70-80% of the business transactions of an enterprise. A long cutover or prolonged downtime can prove fatal despite a strong business continuity plan. Post go-live, multiple defects or bugs can hamper platform adoption and the confidence of IT and the business.

Solution: During the planning of the project, there are multiple aspects that need to be considered. Involving trading partners in UAT, strong SIT/quality testing prior to UAT using production files, setting benchmarks and metrics around testing, and automating test suites where possible are the basics of a successful migration.

Most projects fail, or migrations run into long timelines, due to crunched testing or, rather, the poor quality of testing. Make the availability of production files/data for testing mandatory. The more test data you have, the better your chances of catching defects early on.
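One way to operationalize testing with production files is a regression comparison between the legacy translator’s output and the new platform’s output for the same input files. A rough sketch, where the file names and payload strings are hypothetical stand-ins for real translated documents:

```python
def regression_report(legacy_outputs, new_outputs):
    """Compare per-file outputs of the legacy and new EDI maps.

    Inputs are {file_name: translated_payload} dicts. Returns the files
    that differ, plus files present on only one side.
    """
    common = legacy_outputs.keys() & new_outputs.keys()
    return {
        "mismatched": sorted(f for f in common
                             if legacy_outputs[f] != new_outputs[f]),
        "missing_in_new": sorted(legacy_outputs.keys() - new_outputs.keys()),
        "unexpected_in_new": sorted(new_outputs.keys() - legacy_outputs.keys()),
    }

report = regression_report(
    {"ord_001.edi": "IDOC-A", "ord_002.edi": "IDOC-B"},   # legacy run
    {"ord_001.edi": "IDOC-A", "ord_002.edi": "IDOC-X",    # new platform run
     "ord_003.edi": "IDOC-C"},
)
```

Running such a comparison over a large batch of production files gives an objective benchmark for the testing phase, rather than relying on spot checks of individual interfaces.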

Equally important is the execution of the cutover plan. Keeping all stakeholders, such as the legacy, infra and network/security teams, along with the trading partners and the internal business contacts, constantly informed is crucial. Do cutover plan walkthroughs and identify clear owners before D-Day.

It is also recommended that deployments are done in a staggered manner. This helps manage risks and any post go-live defects much more efficiently.

To make an appointment with us: Click here