
Designing Instagram

High Scalability

Design a photo-sharing platform similar to Instagram where users can upload photos and share them with their followers. The article covers the problem statement, high-level design, component design, API design (including the API for posting an image), architecture, and fetching the user feed.
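Since the entry highlights the API for posting an image, here is a minimal sketch of what such an endpoint could look like, assuming a Python/Flask service; the route, field names, and in-memory store are illustrative assumptions, not the article's actual design.

```python
# Hypothetical "create post" endpoint sketch; path, fields, and storage are assumptions.
import uuid
from flask import Flask, request, jsonify

app = Flask(__name__)
POSTS = {}  # stand-in for blob storage + metadata database

@app.route("/v1/posts", methods=["POST"])
def create_post():
    image = request.files.get("image")          # multipart image upload
    caption = request.form.get("caption", "")
    if image is None:
        return jsonify(error="image is required"), 400
    post_id = str(uuid.uuid4())
    # A real design would put the bytes in object storage (e.g. S3)
    # and the metadata row in a database; here we keep it in memory.
    POSTS[post_id] = {"caption": caption, "size": len(image.read())}
    return jsonify(post_id=post_id), 201

if __name__ == "__main__":
    app.run()
```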


Supporting Diverse ML Systems at Netflix

The Netflix TechBlog

Since its inception, Metaflow has been designed to provide a human-friendly API for building data and ML (and today AI) applications and deploying them in our production infrastructure frictionlessly. To produce business value, all our Metaflow projects are deployed to work with other production systems.
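For readers unfamiliar with Metaflow, a minimal flow written against its public FlowSpec/step API looks roughly like the sketch below; the step names and the toy "training" are illustrative, not Netflix's production code.

```python
# Minimal Metaflow flow sketch; run with `python toy_flow.py run`.
from metaflow import FlowSpec, step

class ToyTrainingFlow(FlowSpec):

    @step
    def start(self):
        # Load toy data; real flows would read from a warehouse or feature store.
        self.data = [1, 2, 3, 4]
        self.next(self.train)

    @step
    def train(self):
        # Stand-in for model training; artifacts assigned to self are
        # versioned and persisted by Metaflow automatically.
        self.model = sum(self.data) / len(self.data)
        self.next(self.end)

    @step
    def end(self):
        print("trained model:", self.model)

if __name__ == "__main__":
    ToyTrainingFlow()
```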


How RevenueCat Manages Caching for Handling over 1.2 Billion Daily API Requests

InfoQ

RevenueCat extensively uses caching to improve the availability and performance of its product API while ensuring consistency. The company shared its techniques to deliver the platform, which can handle over 1.2 billion daily API requests. The team at RevenueCat created an open-source memcache client that provides several advanced features.
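The excerpt does not show the client itself, so as a generic illustration of the cache-aside pattern behind such a setup, here is a sketch using the widely available pymemcache library (not RevenueCat's open-source client); the host, TTL, and fetch_from_db helper are assumptions.

```python
# Generic cache-aside sketch with pymemcache; not RevenueCat's client.
from pymemcache.client.base import Client

cache = Client(("localhost", 11211))

def fetch_from_db(key: str) -> bytes:
    # Hypothetical stand-in for the authoritative data store.
    return f"value-for-{key}".encode()

def get_with_cache(key: str, ttl: int = 300) -> bytes:
    value = cache.get(key)
    if value is None:                       # cache miss: fall back to the database
        value = fetch_from_db(key)
        cache.set(key, value, expire=ttl)   # populate for subsequent requests
    return value

print(get_with_cache("subscriber:42"))
```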


Benchmark (YCSB) numbers for Redis, MongoDB, Couchbase2, Yugabyte and BangDB

High Scalability

We note that MongoDB's update latency is very low compared to the other databases (lower is better), while its read latency is on the higher side. The latency table shows that the 99th-percentile latency for Yugabyte is quite high compared to the others; again, Yugabyte's latency stands out.
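As a reminder of what the percentile figures mean, here is a small worked example of deriving a 99th-percentile latency from raw per-operation samples; the numbers are invented, not the YCSB results discussed in the post.

```python
# Nearest-rank percentile over made-up latency samples (microseconds).
import math

def percentile(samples, pct):
    # Smallest value such that pct% of samples are at or below it.
    ordered = sorted(samples)
    rank = math.ceil(pct / 100.0 * len(ordered))
    return ordered[rank - 1]

latencies_us = [120, 135, 110, 4000, 150, 140, 125, 9000, 130, 145]
print("p99 latency (us):", percentile(latencies_us, 99))  # -> 9000
```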


Self-Host Your Static Assets

CSS Wizardry

Users might already have the file cached. If website-a.com links to [link], and a user goes from there to website-b.com, which also links to [link], then the user will already have that file in their cache. On a slower, higher-latency connection, the story is much, much worse. Penalty: Caching. All completely avoidable.
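To make the self-hosting advice concrete, here is a minimal sketch of serving your own static assets with long-lived caching headers, using only Python's standard library; the port and max-age are assumptions, and a production site would sit behind a real web server or CDN edge.

```python
# Serve local static files with aggressive caching headers (sketch only).
from http.server import HTTPServer, SimpleHTTPRequestHandler

class CachingStaticHandler(SimpleHTTPRequestHandler):
    def end_headers(self):
        # Long max-age + immutable suits fingerprinted filenames
        # (e.g. app.3f9c2b.css) that never change once published.
        self.send_header("Cache-Control", "public, max-age=31536000, immutable")
        super().end_headers()

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8000), CachingStaticHandler).serve_forever()
```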


Optimizing CDN Architecture: Enhancing Performance and User Experience

IO River

CDNs cache content on edge servers distributed globally, reducing the distance between users and the content they want. CDNs use load-balancing techniques to distribute incoming traffic across multiple servers, called Points of Presence (PoPs), which bring content closer to end-users and improve overall performance.
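As a toy illustration of the routing idea (not any specific CDN's algorithm), the sketch below simply picks the PoP with the lowest measured round-trip time; the PoP names and latency figures are made up.

```python
# Pick the "closest" PoP by measured RTT; real CDNs also weigh
# geography/anycast, load, and health checks.
POP_RTT_MS = {"fra": 18.0, "iad": 95.0, "sin": 210.0}

def pick_pop(rtt_by_pop):
    return min(rtt_by_pop, key=rtt_by_pop.get)

print(pick_pop(POP_RTT_MS))  # -> "fra"
```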


Optimizing CDN Architecture: Enhancing Performance and User Experience

IO River

CDN architecture also focuses on caching, load balancing, routing, and optimizing content delivery, which can be measured by cache offloading and round-trip time (RTT).
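The two metrics named here are straightforward to compute; below is a small worked example with invented numbers showing cache offload (the share of requests served from the edge cache) and an average RTT.

```python
# Worked example of cache offload and average RTT; all figures are invented.
edge_hits, origin_fetches = 9_200, 800
cache_offload = edge_hits / (edge_hits + origin_fetches)

rtt_samples_ms = [22.0, 25.5, 19.8, 31.2]
avg_rtt_ms = sum(rtt_samples_ms) / len(rtt_samples_ms)

print(f"cache offload: {cache_offload:.1%}, avg RTT: {avg_rtt_ms:.1f} ms")
```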