CIO.com Virtual Roundtable: Establishing A Strategic Foundation For Modern Data Fabric Across Hybrid Multi-cloud Infrastructures

  • A majority of IT organizations in mid- to large-sized companies are now using a complex combination of on-premises hardware, software, and cloud-based enterprise services.

  • These services are constantly evolving in response to the introduction of new technologies and changing business requirements.

  • Optimizing data management across this hybrid, multi-cloud infrastructure environment is increasingly recognized as a critical area of competency that will influence the ability of IT to effectively contribute to business transformation goals and objectives.


According to IDC, the majority of IT organizations in mid- to large-sized companies are now using a complex combination of on-premises hardware and software and cloud-based enterprise services that are constantly evolving in response to the introduction of new technologies and changing business requirements. Optimizing data management across this hybrid, multi-cloud infrastructure environment is increasingly recognized as a critical area of competency that will influence the ability of IT to effectively contribute to the achievement of modernization and business transformation goals and objectives.


Leading executives from around the country gathered for a CIO.com virtual roundtable discussion co-hosted with Atulya Jyoti and Shekhar Jadhav of IBM.

Here is what they had to say:

  • One of the major success factors for moving forward aggressively with business transformation -- and associated technology modernization -- initiatives in large organizations will revolve around strategic prioritization. As the conversation got started, there was a major area of consensus around the reality that most enterprises will be building more -- not less -- complex infrastructures. This means critical data will be scattered across a mix of old and new on-premises resources that must be connected to a plethora of cloud-based infrastructure, platform, and application technologies.

  • From a data management perspective, the challenge appears to be less about simplification and more about the integrated and intuitive orchestration of growing complexity -- in other words, developing a strategic approach to managing data across multiple generations and locations. The objective is to deploy technology in a manner that supports data-driven decision making by laying the foundation for cross-infrastructure data analytics and inter-platform automation.

  • It is a daunting task. But here -- the roundtable participants agreed -- is where the “carpenter’s credo” of measuring twice and cutting once will come in handy. Long-term planning that has enterprise-wide buy-in is a critical prerequisite to the actual implementation of new data management practices.

  • As one executive observed, not everything has to be transformed at the same time. That said, the entire team should share a common vision of what a desired “future state” looks like from a data management perspective across the enterprise and its infrastructure elements. “First, you need to make sure that you really understand what needs to be transformed. And then you have to understand the implications that these specific actions will have across multiple different environments.”

  • This is important from both a data analytics and automation perspective. All long-term planning, stated one executive, should emphasize putting strategic thought into replacing manual integration processes. “We have had a significant amount of what I lovingly call manual integration activity. While this may work when you're a smaller company, it's not going to serve organizations well as they go into big growth mode. Throwing people at the problem is not what you want to do.”

  • With a clear long-term vision in place -- that focuses on analytics and automation -- organizations can start prioritizing what parts of the business to start working on. The objective should be to leverage the “common vision” to eventually tie these different elements together. “Over the last couple of years, we have transitioned certain systems on to other systems. Now it's the time for us to start connecting things up.”

  • This point of view was supported by another executive who noted: “Now that we have some integrations and automation in place, we're going to do a big master integration.” Getting through this, he added, requires the engaged participation of functional groups in the organization that are currently running their own processes and their own tools. Investing time early in tool harmonization is important. “Organizations want to avoid situations in which there are 25 to 30 different -- but redundant -- technologies in place.”

  • One of the very interesting issues that emerged as a result of this line of discourse revolved around road-map development. Specifically, we discussed moving beyond a lift-and-shift mentality to explore ways to optimize data management in complex environments. As one executive put it: “I'd like to remove that one step of moving data to a staging area and see how we can get to more data directly. We want to figure out how to push the organization to waste fewer steps.”

  • One way to streamline this process and access data in its native environment is to harness the power of metadata. “It is important to explore how to consolidate IP data and IP metrics by tapping into events and logs. We are living in a world which is constantly popping up containers and popping up new environments and then blowing them up,” explained one participant. The key is to figure out how to access data in a production environment without harming performance.

  • Several executives suggested that this problem can -- at least in part -- be addressed with automated “data replication”: the process of making multiple copies of data and storing them at different locations to improve overall accessibility across a network. [NOTE: According to IDC, a significant amount of enterprise spend will shift from conventional disaster recovery and protection to cloud-based data protection and endpoint protection, in part because it also addresses the enterprise data management issues that advance the AI analytics and automation agenda.]
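The replication idea described above can be sketched in a few lines of code. This is a minimal, illustrative model -- the `ReplicatedStore` class and its dict-backed "locations" are invented for this sketch and do not represent any specific product's API -- but it shows the core mechanic: every write is fanned out to multiple locations so the data stays accessible even if one copy is lost.

```python
# Minimal sketch of automated data replication: each write is copied to
# several storage locations; reads can be served from any surviving copy.
# Plain dicts stand in for real storage backends (on-prem, cloud A, cloud B).

class ReplicatedStore:
    def __init__(self, num_replicas=3):
        # One dict per replica location.
        self.replicas = [dict() for _ in range(num_replicas)]

    def put(self, key, value):
        # Fan the write out to every location.
        for replica in self.replicas:
            replica[key] = value

    def get(self, key):
        # Serve the read from the first location that still holds the key.
        for replica in self.replicas:
            if key in replica:
                return replica[key]
        raise KeyError(key)

store = ReplicatedStore(num_replicas=3)
store.put("order-42", {"status": "shipped"})

# Simulate losing one copy; the record remains readable from the others.
del store.replicas[0]["order-42"]
print(store.get("order-42"))
```

Real replication systems add consistency protocols, failure detection, and placement policies on top of this basic fan-out, but the accessibility benefit the executives described comes from exactly this redundancy.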

  • According to Atulya from IBM: “Effective modernization initiatives should enable IT operations to exit traditional data lake paradigms to take advantage of ‘lake house’ strategies.” These are open, scalable platform frameworks that optimize “data wrangling” (the process of cleaning and unifying messy, complex data sets for easy access and analysis), machine learning, and data science in general. The objective should be to establish a data conditioning environment whose data management and optimization techniques result in the intelligent routing, optimization, and protection of data as it is stored or moved.
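To make the "data wrangling" definition above concrete, here is a small sketch of unifying messy records from two sources into one consistent schema. The source systems, field names, and sample records are invented for illustration; the point is the pattern of renaming, cleaning, and dropping empty values.

```python
# Sketch of basic data wrangling: two systems describe the same entity
# with different field names and inconsistent formatting; we map both
# into one unified schema, trimming whitespace and dropping empty fields.

def unify(records, field_map):
    """Rename fields per field_map, strip whitespace, drop empty values."""
    cleaned = []
    for rec in records:
        out = {}
        for src_key, dst_key in field_map.items():
            value = rec.get(src_key)
            if isinstance(value, str):
                value = value.strip()
            if value not in (None, ""):
                out[dst_key] = value
        cleaned.append(out)
    return cleaned

# Hypothetical CRM and ERP extracts with mismatched field names.
crm = [{"cust_name": " Acme Corp ", "cust_id": "A1"}]
erp = [{"name": "Acme Corp", "account": "A1", "region": ""}]

unified = (unify(crm, {"cust_name": "name", "cust_id": "id"})
           + unify(erp, {"name": "name", "account": "id", "region": "region"}))
print(unified)
```

In a lakehouse setting this kind of normalization is typically expressed with distributed query engines rather than hand-written loops, but the schema-unification step is the same.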

  • All of this, IBM’s Shekhar pointed out, requires a holistic, multidisciplinary approach to creating enterprise-wide data fabric strategies. “We do not recommend that our clients construct platforms in a vacuum. They should think about how all operations will access data and operate on those platforms. That means it's important that whoever designs anything for today's hybrid infrastructures keep agility, automation, repeatability, and reliability top of mind so that they don't waste time.” This should result in a force-multiplier effect that frees up time and resources to create new business by driving innovative ideas.

To read more or for information on roundtable participation, please visit www.CIO.com.