Data Fabric architecture is the key to modernizing data management and integration


Data management agility has become a top priority for organizations operating in an increasingly diverse, distributed, and complex environment. To reduce human error and overall costs, data and analytics (D&A) leaders need to go beyond traditional data management practices and move towards modern solutions such as AI-based data integration.

The emerging design concept called ‘data fabric’ can be a robust solution to the ever-present challenges of data management, such as high-cost and low-value data integration cycles, frequent maintenance of past integrations, growing demand for real-time data and event-driven data sharing, and more.

What is the Data Fabric?

Gartner defines data fabric as a design concept that serves as an integrated layer (fabric) of data and connecting processes. A data fabric uses continuous analytics over existing, discoverable, and inferred metadata assets to support the design, deployment, and use of integrated and reusable data across all environments, including hybrid and multicloud platforms.

Data fabric leverages both human and machine capabilities to access data in place or support its consolidation, as needed. It continuously identifies and connects data from disparate applications to discover unique, business-relevant relationships between the available data points. The resulting insight supports faster, better-informed decision making, delivering more value through rapid access and understanding than traditional data management practices.

For example, a supply chain leader using a data fabric can more quickly incorporate newly encountered data assets into known relationships between supplier delays and production delays, and improve decisions with the new data (or for new suppliers or new customers).

Think of the Data Fabric as a Self-Driving Car

Consider two scenarios. In the first, the driver is alert and gives full attention to the route, so the autonomous element of the car intervenes minimally or not at all. In the second, the driver gets distracted and loses concentration, and the car immediately shifts into semi-autonomous mode and makes the necessary course corrections.

The two scenarios summarize how the data fabric works. It first monitors data pipelines as a passive observer and begins to suggest more productive alternatives. Once the data 'pilot' and the machine learning models are comfortable with repeated scenarios, they complement each other by automating repetitive tasks (which consume too many manual hours), leaving leadership free to focus on innovation.
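The "passive observer" behavior described above can be sketched in a few lines of Python. This is an illustrative toy, not a real data fabric component: the class name, threshold, and step names are all assumptions made for the example.

```python
from collections import Counter

class PipelineObserver:
    """Hypothetical sketch of a data fabric's 'passive observer' mode:
    it watches pipeline runs, counts recurring manual steps, and flags
    frequently repeated steps as candidates for automation."""

    def __init__(self, automation_threshold=3):
        self.automation_threshold = automation_threshold
        self.step_counts = Counter()

    def observe(self, pipeline_steps):
        # Record every manual step seen in one pipeline run.
        self.step_counts.update(pipeline_steps)

    def suggestions(self):
        # Steps repeated often enough become automation candidates.
        return [step for step, n in self.step_counts.items()
                if n >= self.automation_threshold]

observer = PipelineObserver()
for run in range(3):
    observer.observe(["profile_dataset", "map_schema", "load_warehouse"])
observer.observe(["ad_hoc_fix"])  # a one-off step is not suggested
print(observer.suggestions())
```

The observer never interrupts the pipeline; it only accumulates evidence, mirroring how the fabric starts in watch-and-suggest mode before any automation takes over.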

What D&A Executives Need to Know About Data Fabric

  • Data Fabric is not just a combination of traditional and contemporary technologies, but a design concept that changes the direction of human and machine workloads.
  • New and upcoming technologies such as semantic knowledge graphs, active metadata management, and embedded machine learning (ML) are required to realize the data fabric design.
  • The design optimizes data management by automating repetitive tasks such as profiling datasets, discovering and aligning schemas to new data sources, and, in its most advanced form, self-healing failed data integration jobs.
  • No existing stand-alone solution can facilitate a full-fledged data fabric architecture. D&A leaders can assemble a formidable data fabric architecture using a mix of built and bought solutions. For example, they can opt for a promising data management platform that provides 65-70% of the capabilities needed for a data fabric, and close the remaining gap with a home-grown solution.
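One of the repetitive tasks listed above, dataset profiling, is easy to illustrate. The sketch below is a minimal, assumed implementation (function and column names are invented for the example); real fabric platforms would profile far more than type and null counts.

```python
def profile_dataset(rows):
    """Minimal sketch of automated dataset profiling: infer a column's
    type from its first non-null value and count its nulls. Purely
    illustrative; production profilers capture far richer statistics."""
    profile = {}
    columns = rows[0].keys() if rows else []
    for col in columns:
        values = [r.get(col) for r in rows]
        non_null = [v for v in values if v is not None]
        inferred = type(non_null[0]).__name__ if non_null else "unknown"
        profile[col] = {
            "inferred_type": inferred,
            "null_count": len(values) - len(non_null),
        }
    return profile

# Toy supply-chain rows, echoing the earlier supplier-delay example.
rows = [
    {"supplier": "Acme", "delay_days": 3},
    {"supplier": "Globex", "delay_days": None},
]
print(profile_dataset(rows))
```

Automating even this small amount of per-dataset drudgery is where the manual-hour savings the bullet points describe come from.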

How can D&A leaders ensure a Data Fabric architecture that delivers business value?

To deliver business value through a data fabric design, D&A leaders need to ensure a solid technology foundation, identify the core capabilities required, and evaluate their existing data management tools.

Here are the main pillars of a data fabric architecture that D&A leaders should know.

No. 1: The data fabric must collect and analyze all forms of metadata

Contextual information forms the basis of a dynamic data fabric design. There should be a mechanism (such as a well-connected metadata pool) that allows the data fabric to identify, connect, and analyze all kinds of metadata: technical, business, operational, and social.
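A connected metadata pool can be pictured as a small graph: assets are nodes tagged with a metadata kind, and edges record inferred relationships between them. The sketch below is an assumed, simplified model (class, asset, and relation names are invented); real implementations would use a knowledge graph or catalog product.

```python
class MetadataPool:
    """Illustrative sketch of a connected metadata pool: each asset is
    tagged with one of the four metadata kinds, and edges capture
    relationships discovered across those kinds."""

    KINDS = {"technical", "business", "operational", "social"}

    def __init__(self):
        self.assets = {}   # asset name -> metadata kind
        self.edges = []    # (asset_a, relation, asset_b)

    def register(self, name, kind):
        if kind not in self.KINDS:
            raise ValueError(f"unknown metadata kind: {kind}")
        self.assets[name] = kind

    def connect(self, a, relation, b):
        self.edges.append((a, relation, b))

    def related(self, name):
        # Everything connected to `name`, in either direction.
        return ([(r, b) for a, r, b in self.edges if a == name] +
                [(r, a) for a, r, b in self.edges if b == name])

pool = MetadataPool()
pool.register("orders_table", "technical")
pool.register("quarterly_revenue", "business")
pool.register("etl_run_log", "operational")
pool.connect("orders_table", "feeds", "quarterly_revenue")
pool.connect("etl_run_log", "loads", "orders_table")
print(pool.related("orders_table"))
```

The point of the graph shape is that a query about one asset surfaces context from every metadata kind at once, which is exactly the "identify, connect, and analyze" behavior the pillar calls for.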
