Multicasting in this context refers to the process of directing data streams to two or more destinations. This might look like sending the same telemetry data to both an on-premises storage system and a cloud-based observability platform concurrently. The two principal benefits of this strategy are cost savings and service redundancy.
■ Cost Savings: Depending on the use case, storing or processing data in one location might be cheaper than in another. By multicasting the data, businesses can choose the most cost-effective destination for each specific need without being locked into a single vendor.
■ Service Redundancy: No system is foolproof. By sending data to multiple locations, you create a built-in backup. If one service goes down, data isn't lost and can still be accessed and analyzed from another source.
The following are 10 things to consider before multicasting your observability data:
1. Consistency of User Expectations
It's crucial that both destinations receive data reliably and consistently. If it is unclear to users which data resides in which platform, adoption will suffer and the strategy becomes less effective. A common heuristic is to keep all of your data in the cheaper observability platform and send only the most essential data to the more feature-rich, expensive platform. Likewise, if one platform develops data integrity issues because no one uses it outside of break-glass scenarios, the effectiveness of this strategy will be reduced.
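As a minimal sketch of that heuristic, the routine below sends every metric to the lower-cost platform and multicasts only metrics tagged as essential to the premium platform. The endpoint URLs and the "essential" tag are hypothetical placeholders, not any specific vendor's API.

```python
import json
import urllib.request

CHEAP_ENDPOINT = "https://cheap-platform.example.com/ingest"      # assumed URL
PREMIUM_ENDPOINT = "https://premium-platform.example.com/ingest"  # assumed URL

def ship(endpoint: str, payload: dict) -> None:
    """POST a JSON payload to an ingest endpoint."""
    req = urllib.request.Request(
        endpoint,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req, timeout=5)

def route(metric: dict) -> None:
    """All data goes to the cheap platform; essential data is multicast."""
    ship(CHEAP_ENDPOINT, metric)
    if metric.get("tags", {}).get("essential"):
        ship(PREMIUM_ENDPOINT, metric)
```

The important part is that the routing rule is explicit and documented, so users always know which platform holds which data.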
2. Data Consistency
It's good to have a process for evaluating the correctness of your data, but when you write data to two systems, not everything will always line up. Discrepancies can stem from ingestion latency, differences in how each platform rolls up long-term data, or even the graphing libraries used to render it. Set the right expectations with teams: small differences are normal when both platforms are in active use.
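A lightweight spot check with a tolerance can help distinguish expected drift from genuine data loss. The sketch below compares the same reading from both platforms; the 5% tolerance and the fetch mechanism are assumptions you would tune for your own data.

```python
def values_agree(a: float, b: float, tolerance: float = 0.05) -> bool:
    """Treat readings as consistent if they differ by less than `tolerance` (relative)."""
    if a == b:
        return True
    return abs(a - b) / max(abs(a), abs(b)) <= tolerance

def spot_check(series_name: str, cheap_value: float, premium_value: float) -> None:
    """Compare the same series as reported by each platform."""
    if values_agree(cheap_value, premium_value):
        print(f"{series_name}: consistent within tolerance")
    else:
        print(f"{series_name}: mismatch, investigate ({cheap_value} vs {premium_value})")

spot_check("http.requests.per_second", 1042.0, 1038.5)
```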
3. Bandwidth and Network Load
Transmitting the same piece of data multiple times can place additional load on your network. This is a bigger concern when sending data out of a cloud environment, where you pay egress costs.
Additionally, some telemetry components are aggregation points that can push the limits of vertical scaling (for example, carbon relay servers). Multicasting directly at that point in the architecture may not be possible because of limits on how much data can traverse the NIC. It's essential to understand the impact on bandwidth and provision appropriately.
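A back-of-the-envelope estimate is often enough to spot the problem early. The figures below are illustrative assumptions, not measurements from any real deployment.

```python
def multicast_throughput_gbps(ingest_gbps: float, destinations: int) -> float:
    """Outbound throughput when every incoming byte is resent to N destinations."""
    return ingest_gbps * destinations

ingest = 3.0         # assumed inbound telemetry at the aggregation point, in Gbps
nic_capacity = 10.0  # assumed NIC capacity, in Gbps

outbound = multicast_throughput_gbps(ingest, destinations=2)
headroom = nic_capacity - (ingest + outbound)
print(f"Outbound: {outbound:.1f} Gbps, NIC headroom: {headroom:.1f} Gbps")
```

If headroom goes negative, multicasting needs to happen earlier in the pipeline or across more aggregation nodes.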
4. Cost Analysis
While multicasting can lead to savings, it's crucial to do a detailed cost analysis. Transmitting and storing data in multiple places might increase costs in certain scenarios.
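A simple cost model makes the comparison concrete: keeping everything in one platform versus multicasting, where the premium platform retains only the essential subset. All prices and volumes below are assumptions to replace with your own vendor quotes.

```python
def monthly_cost(volume_gb: float, price_per_gb: float, egress_per_gb: float = 0.0) -> float:
    """Rough monthly cost for ingesting and storing a given telemetry volume."""
    return volume_gb * (price_per_gb + egress_per_gb)

total_gb = 50_000      # assumed monthly telemetry volume
essential_gb = 5_000   # assumed subset sent to the premium platform

single_platform = monthly_cost(total_gb, price_per_gb=0.30)
multicast = (
    monthly_cost(total_gb, price_per_gb=0.05, egress_per_gb=0.02)        # cheap platform
    + monthly_cost(essential_gb, price_per_gb=0.30, egress_per_gb=0.02)  # premium subset
)
print(f"Single platform: ${single_platform:,.0f}   Multicast: ${multicast:,.0f}")
```

Note that the multicast path pays egress twice; whether it still comes out ahead depends entirely on your volumes and pricing tiers.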
5. Security and Compliance
Different storage destinations might have different security features and compliance certifications. Ensure that all destinations align with your company's security and regulatory needs.
6. Tool Integration
Not all observability tools natively support multicasting data. Some vendors' agents can only send data to their own product; in those cases, you may need to explore a multi-agent strategy.
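An alternative to running multiple agents is to collect once and translate the payload per destination. The sketch below is hypothetical: the field names and endpoints are placeholders, since each real platform defines its own schema and API.

```python
from typing import Callable

def to_vendor_a(metric: dict) -> dict:
    """Reshape a generic metric into vendor A's assumed payload format."""
    return {"metric": metric["name"], "value": metric["value"], "ts": metric["timestamp"]}

def to_vendor_b(metric: dict) -> dict:
    """Reshape a generic metric into vendor B's assumed payload format."""
    return {"series": metric["name"], "points": [[metric["timestamp"], metric["value"]]]}

DESTINATIONS: list[tuple[str, Callable[[dict], dict]]] = [
    ("https://vendor-a.example.com/v1/metrics", to_vendor_a),  # assumed endpoint
    ("https://vendor-b.example.com/api/series", to_vendor_b),  # assumed endpoint
]

def fan_out(metric: dict, send: Callable[[str, dict], None]) -> None:
    """Send one collected metric to every destination in its expected shape."""
    for url, translate in DESTINATIONS:
        send(url, translate(metric))

if __name__ == "__main__":
    sample = {"name": "cpu.utilization", "value": 0.42, "timestamp": 1_700_000_000}
    fan_out(sample, send=lambda url, body: print(url, body))
```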
7. Data Retrieval and Analysis
With data residing in multiple locations, the way your teams engage with the data may differ. If you're using a popular open source dashboarding tool, there will be at least some consistency in how teams interact with the data, even if each platform supports a different query syntax. This becomes more challenging if your teams are working directly in the UI of the higher-cost observability platform.
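One way to soften the syntax gap is to maintain a small catalog that maps a logical query name to each platform's equivalent expression. The query strings below are purely illustrative placeholders.

```python
# Hypothetical catalog mapping one logical question to each platform's syntax.
LOGICAL_QUERIES: dict[str, dict[str, str]] = {
    "checkout_error_rate": {
        "cheap_platform": 'rate(http_requests_total{service="checkout",status=~"5.."}[5m])',
        "premium_platform": "SELECT rate(errors) FROM http_requests "
                            "WHERE service = 'checkout' AND time > now() - 5m",
    },
}

def query_for(name: str, platform: str) -> str:
    """Look up the platform-specific query text for a logical query name."""
    return LOGICAL_QUERIES[name][platform]

print(query_for("checkout_error_rate", "cheap_platform"))
```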
8. Data Lifecycle Management
Consider how long you need the data stored in each location. You might choose to have short-term data in one location and long-term archival in another.
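As a minimal sketch of that split, the snippet below keeps recent data in the query-friendly platform and relies on the archival tier for older data. The retention windows are assumptions to adapt to your own policies.

```python
from dataclasses import dataclass

@dataclass
class Destination:
    name: str
    retention_days: int

DESTINATIONS = [
    Destination("hot-observability-platform", retention_days=30),   # assumed window
    Destination("cold-archive", retention_days=395),                # assumed window
]

def destinations_for(age_days: int) -> list[str]:
    """Return which destinations should still hold a datapoint of this age."""
    return [d.name for d in DESTINATIONS if age_days <= d.retention_days]

print(destinations_for(7))     # both destinations
print(destinations_for(120))   # archive only
```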
9. Maintenance and Monitoring
With more destinations come more points of potential failure. Implement robust monitoring to ensure all destinations are consistently available and performing as expected. This is a good opportunity to introduce cross monitoring, where each observability stack monitors the other.
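A simple form of cross monitoring is for each stack to probe the other's health endpoint and raise an alert through its own pipeline. The URLs and alerting hook below are hypothetical placeholders.

```python
import urllib.request
from typing import Callable

def probe(url: str, timeout: float = 5.0) -> bool:
    """Return True if the peer stack's health endpoint answers with HTTP 200."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except OSError:
        return False

def check_peer(peer_health_url: str, alert: Callable[[str], None]) -> None:
    """Alert via this stack's own notifier if the peer stack is unreachable."""
    if not probe(peer_health_url):
        alert(f"Peer observability stack unreachable: {peer_health_url}")

# Example: stack A checking stack B, alerting via its own notifier.
check_peer("https://stack-b.example.com/healthz", alert=print)
```

Run the mirror-image check from the other stack so that a failure in either one is caught by its peer.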
10. Migration and Scalability
As your business grows, you might need to migrate or scale your lower cost observability platform. Ensure the chosen destinations support such migrations without significant overhead.
Conclusion
Multicasting data that is collected by your observability tools offers an innovative approach to maximize both cost efficiency and system resilience. However, like all strategies, it comes with its set of considerations. By understanding and preparing for these considerations, businesses can harness the power of this approach to create observability solutions that are both robust and cost-effective.