Dynatrace announced the launch of OpenPipeline®, a new core technology that provides customers with a single pipeline to manage petabyte-scale data ingestion into the Dynatrace® platform to fuel secure and cost-effective analytics, AI, and automation.
Dynatrace OpenPipeline empowers business, development, security, and operations teams with full visibility into and control of the data they ingest into the Dynatrace platform, while preserving the context of that data and of the cloud ecosystems where it originates.
Additionally, it evaluates data streams five to ten times faster than legacy technologies. As a result, organizations can better manage the ever-increasing volume and variety of data emanating from their hybrid and multicloud environments and empower more teams to access the Dynatrace platform’s AI-powered answers and automations without requiring additional tools.
Dynatrace OpenPipeline works with other core Dynatrace platform technologies, including the Grail™ data lakehouse, Smartscape® topology, and Davis® hypermodal AI, to deliver the following benefits:
- Petabyte-scale data analytics: Leverages patent-pending stream-processing algorithms to achieve significantly increased data throughput at petabyte scale.
- Unified data ingest: Enables teams to ingest and route observability, security, and business events data, including dedicated Quality of Service (QoS) for business events, from any source, such as Dynatrace® OneAgent, Dynatrace APIs, and OpenTelemetry, and in any format, with customizable retention times for individual use cases.
- Real-time data analytics on ingest: Allows teams to convert unstructured data into structured and usable formats at the point of ingest—for example, transforming raw data into time series or metrics data and creating business events from log lines (see the first sketch after this list).
- Full data context: Enriches and retains the context of heterogeneous data points—including metrics, traces, logs, user behavior, business events, vulnerabilities, threats, lifecycle events, and many others—reflecting the diverse parts of the cloud ecosystem where they originated.
- Controls for data privacy and security: Gives users control over which data they analyze, store, or exclude from analytics and includes fully customizable security and privacy controls, such as automatic and role-based PII masking, to help meet customers' specific needs and regulatory requirements.
- Cost-effective data management: Helps teams avoid ingesting duplicate data and reduces storage needs by transforming data into usable formats—for example, from XML to JSON—and enabling teams to remove unnecessary fields without losing any insights, context, or analytics flexibility (the second sketch after this list illustrates this transformation together with the PII masking described above).
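To make the ingest-time transformation concrete, here is a minimal, vendor-neutral sketch in Python of the kind of processing described above: parsing an unstructured log line into a structured business event and deriving a time-series data point from it. The log format, field names, and the `parse_order_log` and `to_metric` helpers are illustrative assumptions for this sketch, not Dynatrace APIs; in the actual product, such rules are configured within the platform.

```python
import re

# Illustrative only: a raw application log line of the kind a pipeline
# might receive at ingest. The format is an assumption for this sketch.
RAW_LOG = '2024-01-31T12:05:09Z INFO checkout order_id=84213 amount=129.90 currency=EUR'

LOG_PATTERN = re.compile(
    r'(?P<ts>\S+)\s+(?P<level>\w+)\s+(?P<service>\w+)\s+'
    r'order_id=(?P<order_id>\d+)\s+amount=(?P<amount>[\d.]+)\s+currency=(?P<currency>\w+)'
)

def parse_order_log(line: str) -> dict:
    """Turn an unstructured log line into a structured business event (hypothetical helper)."""
    m = LOG_PATTERN.match(line)
    if m is None:
        raise ValueError(f'unparseable log line: {line!r}')
    f = m.groupdict()
    return {
        'event.type': 'order.completed',   # assumed event taxonomy
        'timestamp': f['ts'],
        'service': f['service'],
        'order_id': f['order_id'],
        'amount': float(f['amount']),
        'currency': f['currency'],
    }

def to_metric(event: dict) -> dict:
    """Derive a time-series data point (metric) from the structured event."""
    return {
        'metric': 'orders.revenue',        # assumed metric key
        'value': event['amount'],
        'dimensions': {'service': event['service'], 'currency': event['currency']},
        'timestamp': event['timestamp'],
    }

event = parse_order_log(RAW_LOG)
print(event)
print(to_metric(event))
```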
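Similarly, a minimal sketch of the privacy and cost controls described in the last two bullets: converting an XML payload to JSON, masking a PII field, and dropping a field that carries no analytic value. The payload shape, the masked field, and the drop list are assumptions for illustration.

```python
import json
import re
import xml.etree.ElementTree as ET

# Illustrative only: an XML record of the kind a team might ingest.
XML_PAYLOAD = """
<order>
  <id>84213</id>
  <email>jane.doe@example.com</email>
  <amount>129.90</amount>
  <internal_trace_blob>0xDEADBEEF</internal_trace_blob>
</order>
"""

# Fields assumed to carry no analytic value and safe to drop at ingest.
DROP_FIELDS = {'internal_trace_blob'}
# Fields assumed to contain PII and masked before storage.
PII_FIELDS = {'email'}

def mask(value: str) -> str:
    """Replace every character except '@' and '.' with '*' (a simple masking rule)."""
    return re.sub(r'[^@.]', '*', value)

def xml_to_json(xml_text: str) -> str:
    """Convert a flat XML record to JSON, masking PII and pruning unneeded fields."""
    root = ET.fromstring(xml_text)
    record = {}
    for child in root:
        if child.tag in DROP_FIELDS:
            continue                      # prune: smaller storage footprint
        text = (child.text or '').strip()
        record[child.tag] = mask(text) if child.tag in PII_FIELDS else text
    return json.dumps(record)

print(xml_to_json(XML_PAYLOAD))
# {"id": "84213", "email": "****.***@*******.***", "amount": "129.90"}
```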
“OpenPipeline is a powerful addition to the Dynatrace platform,” said Bernd Greifeneder, CTO at Dynatrace. “It enriches, converges, and contextualizes heterogeneous observability, security, and business data, providing unified analytics for these data and the services they represent. As with the Grail data lakehouse, we architected OpenPipeline for petabyte-scale analytics. It works with Dynatrace’s Davis hypermodal AI to extract meaningful insights from data, fueling robust analytics and trustworthy automation. Based on our internal testing, we believe OpenPipeline powered by Davis AI will allow our customers to evaluate data streams five to ten times faster than legacy technologies. We also believe that converging and contextualizing data within Dynatrace makes regulatory compliance and audits easier while empowering more teams within organizations to gain immediate visibility into the performance and security of their digital services.”
Dynatrace OpenPipeline is expected to be generally available for all Dynatrace SaaS customers within 90 days of this announcement, starting with support for logs, metrics, and business events.