Why applying chaos engineering to data-intensive applications matters

Dynatrace

The jobs executing such workloads are usually required to operate indefinitely on unbounded streams of continuous data, and they exhibit heterogeneous modes of failure as they run over long periods; such failures can significantly increase event latency. Performance is usually a primary concern when using stream processing frameworks.

Scalable Annotation Service — Marken

The Netflix TechBlog

Scalable Annotation Service — Marken, by Varun Sekhri and Meenakshi Jindal. At Netflix, we have hundreds of microservices, each with its own data models or entities. But there are more interesting cases where users want to store temporal (time-based) data or spatial data: for example, the Movie entity with id 1234 is annotated as containing violence.
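The excerpt does not include code; as a rough illustration of the kind of time-bounded annotation such a service might store (the class, field names, and millisecond offsets below are assumptions for illustration, not Marken's actual schema), consider:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TemporalAnnotation:
    """Hypothetical shape of a time-bounded annotation; not Marken's actual schema."""
    entity_type: str   # e.g. "Movie"
    entity_id: int     # e.g. 1234
    label: str         # e.g. "violence"
    start_ms: int      # annotation applies from this offset into the title...
    end_ms: int        # ...up to this offset

# The excerpt's example ("Movie entity with id 1234 has violence") as temporal data:
scene = TemporalAnnotation("Movie", 1234, "violence", start_ms=605_000, end_ms=642_000)
print(scene.label, scene.start_ms, scene.end_ms)
```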

Trending Sources

The Power of Caching: Boosting API Performance and Scalability

DZone

Caching is the process of storing frequently accessed data or resources in a temporary storage location, such as memory or disk, to improve retrieval speed and reduce the need for repetitive processing. It also optimizes bandwidth: caching reduces the amount of data transferred over the network, minimizing bandwidth usage and improving efficiency.
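The excerpt does not show code; a minimal sketch of the idea is a small in-memory cache with a time-to-live (the cache key, TTL value, and fetch function below are illustrative assumptions, not from the article):

```python
import time

_cache: dict[str, tuple[float, object]] = {}  # key -> (stored_at, value)
TTL_SECONDS = 60  # assumed freshness window; tune per workload

def cached_fetch(key: str, fetch_fn):
    """Return a fresh cached value if available, otherwise fetch, store, and return it."""
    now = time.monotonic()
    entry = _cache.get(key)
    if entry is not None and now - entry[0] < TTL_SECONDS:
        return entry[1]            # cache hit: no recomputation, no network transfer
    value = fetch_fn()             # cache miss: do the expensive work once
    _cache[key] = (now, value)
    return value

# Repeated calls within the TTL skip the expensive backend call entirely.
profile = cached_fetch("user:42:profile", lambda: {"id": 42, "name": "Ada"})
```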

Presentation: Azure Cosmos DB: Low Latency and High Availability at Planet Scale

InfoQ

Mei-Chin Tsai and Vinod Sridharan discuss the internal architecture of Azure Cosmos DB and how it achieves high availability, low latency, and scalability.
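The talk covers the service's internal architecture rather than client code; as a small, assumed illustration of the access pattern that its partitioned design favors, here is a point read by id and partition key with the azure-cosmos Python SDK (the endpoint, key, database/container names, and partition key path are placeholders, not from the talk):

```python
from azure.cosmos import CosmosClient

# Placeholder endpoint, key, and database/container names.
client = CosmosClient("https://<account>.documents.azure.com:443/", credential="<key>")
container = client.get_database_client("appdb").get_container_client("orders")

# Assumes the container was created with partition key path /customerId.
# Write an item, then read it back by id plus partition key. A point read like this
# is routed to a single partition, which is central to keeping latency low at scale.
container.upsert_item({"id": "order-1", "customerId": "c-42", "total": 19.99})
item = container.read_item(item="order-1", partition_key="c-42")
print(item["total"])
```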

Data Reprocessing Pipeline in Asset Management Platform @Netflix

The Netflix TechBlog

This platform has evolved from supporting studio applications to also serving data science and machine-learning applications that discover asset metadata and build various data facts. Hence we built a data pipeline that can extract the existing assets' metadata and reprocess it specifically for each new use case.
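The excerpt describes the pipeline only at a high level; as a loose sketch of the pattern it names (the processor registry, use-case names, and asset fields below are assumptions for illustration, not Netflix's implementation):

```python
from typing import Callable, Iterable

# Hypothetical per-use-case processors; names and record shapes are illustrative only.
PROCESSORS: dict[str, Callable[[dict], dict]] = {
    "ml-feature-extraction": lambda asset: {"asset_id": asset["id"], "tags": asset.get("tags", [])},
    "encoding-facts": lambda asset: {"asset_id": asset["id"], "codec": asset.get("codec")},
}

def reprocess(existing_assets: Iterable[dict], use_case: str) -> list[dict]:
    """Replay existing asset metadata through the processor for one new use case."""
    process = PROCESSORS[use_case]
    return [process(asset) for asset in existing_assets]

# A new use case reuses the already-extracted metadata instead of re-ingesting assets.
assets = [{"id": "a1", "codec": "h264", "tags": ["trailer"]}]
print(reprocess(assets, "ml-feature-extraction"))
```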

Nine ways technology executives can get significant business value with the right observability platform

Dynatrace

Data with context can improve your ability to deliver on your goals, modernize your organization, and accelerate business transformation. These outcomes depend on the platform's ability to turn data into answers and action in contextual, real-time, and cost-effective ways.

Bulldozer: Batch Data Moving from Data Warehouse to Online Key-Value Stores

The Netflix TechBlog

By Tianlong Chen and Ioannis Papapanagiotou. Netflix has more than 195 million subscribers that generate petabytes of data every day. Data scientists and engineers collect this data from our subscribers and videos, and implement data analytics models to discover customer behaviour, with the goal of maximizing user joy.
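The excerpt stops before the technical part of the post; as a loose sketch of the general pattern the title describes, batch-loading rows from a warehouse extract into an online key-value store (the key scheme, column names, and in-memory dict below are stand-ins, not Bulldozer itself):

```python
# Stand-in for an online key-value store; a real pipeline writes to an actual
# store behind a service API, not a Python dict.
kv_store: dict[str, dict] = {}

def load_batch(warehouse_rows: list[dict], key_column: str) -> None:
    """Move one batch of warehouse rows into the key-value store, keyed by one column."""
    for row in warehouse_rows:
        kv_store[f"profile:{row[key_column]}"] = row

# A scheduled batch job reads precomputed rows from the warehouse; online services
# then serve them with cheap single-key lookups instead of warehouse queries.
load_batch(
    [{"member_id": "m-1", "top_genres": ["drama", "sci-fi"]},
     {"member_id": "m-2", "top_genres": ["comedy"]}],
    key_column="member_id",
)
print(kv_store["profile:m-1"]["top_genres"])
```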
