
Seamlessly Swapping the API backend of the Netflix Android app

The Netflix TechBlog

We went from an essentially serverless model in a monolithic service, to deploying and maintaining a new microservice that hosted our app backend endpoints. This allows the app to query a list of “paths” in each HTTP request, and get specially formatted JSON (jsonGraph) that we use to cache the data and hydrate the UI.
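The path/jsonGraph exchange is easy to picture with a small sketch. The TypeScript below is illustrative only, not Netflix's actual API: the endpoint URL, the path shapes, and the naive cache merge are all assumptions. It simply shows a client batching a list of paths into one HTTP request and merging the jsonGraph-shaped response into a local cache before rendering.

```typescript
// Illustrative sketch of a paths -> jsonGraph request/response cycle.
// Endpoint, path shapes, and cache handling are assumptions, not Netflix's real API.

type PathSegment = string | number;
type Path = PathSegment[];

interface JsonGraphResponse {
  // Nested object keyed by the same segments the paths use.
  jsonGraph: Record<string, unknown>;
}

async function fetchPaths(paths: Path[]): Promise<JsonGraphResponse> {
  const res = await fetch("https://example.com/app-backend/api", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ paths }),
  });
  if (!res.ok) throw new Error(`Backend request failed: ${res.status}`);
  return res.json() as Promise<JsonGraphResponse>;
}

// Very naive client-side cache: merge each jsonGraph fragment, then hydrate the UI.
const cache: Record<string, unknown> = {};

async function hydrate(): Promise<void> {
  const { jsonGraph } = await fetchPaths([
    ["videos", 12345, "title"],
    ["videos", 12345, "synopsis"],
  ]);
  Object.assign(cache, jsonGraph); // a real cache would deep-merge per path
  // ...render UI components from `cache`
}

hydrate().catch(console.error);
```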


Revisiting “Serverless Architectures”

The Symphonia

I started writing “Serverless Architectures” in May 2016. Fast forward two years and the article has had more than half a million visits, regularly appears in the top five Google search results for “Serverless”, and helped launch Symphonia, my serverless consultancy. Serverless is a highly dynamic area and two years is a lifetime in this world.


Trending Sources


How We Optimized Performance To Serve A Global Audience

Smashing Magazine

It increases our visibility and enables us to draw a steady stream of organic (or “free”) traffic to our site. While paid marketing strategies like Google Ads play a part in our approach as well, enhancing our organic traffic remains a major priority. The higher our organic traffic, the more profitable we become as a company.


A one size fits all database doesn't fit anyone

All Things Distributed

Use cases such as gaming, ad tech, and IoT lend themselves particularly well to the key-value data model where the access patterns require low-latency Gets/Puts for known key values. The purpose of DynamoDB is to provide consistent single-digit millisecond latency for any scale of workloads.
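That access pattern is easy to show in a quick sketch using the AWS SDK for JavaScript v3 document client; the table name and key attribute below are made up for illustration. The point is that reads and writes address a known key directly, which is what lets DynamoDB keep latency in the single-digit-millisecond range.

```typescript
// Key-value access pattern against DynamoDB (table and attribute names are illustrative).
import { DynamoDBClient } from "@aws-sdk/client-dynamodb";
import { DynamoDBDocumentClient, GetCommand, PutCommand } from "@aws-sdk/lib-dynamodb";

const ddb = DynamoDBDocumentClient.from(new DynamoDBClient({}));

async function main(): Promise<void> {
  // Put: write an item under a known key.
  await ddb.send(
    new PutCommand({
      TableName: "GameSessions",
      Item: { playerId: "player#42", level: 7, score: 1180 },
    })
  );

  // Get: read it back by the same key; no scans, no joins.
  const { Item } = await ddb.send(
    new GetCommand({
      TableName: "GameSessions",
      Key: { playerId: "player#42" },
    })
  );
  console.log(Item); // { playerId: 'player#42', level: 7, score: 1180 }
}

main().catch(console.error);
```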


Content Management Systems of the Future: Headless, JAMstack, ADN and Functions at the Edge

Abhishek Tiwari

Recently I was asked about content management systems (CMS) of the future: more specifically, how they are evolving in the era of microservices, APIs, and serverless computing. Having a CDN in front of the origin (static site or APIs) reduces global and regional latency. Eventually, we decided to move them to Jekyll.
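One concrete way "CDN in front of origin" plays out is in the cache headers the origin sends. The sketch below is a hypothetical plain Node HTTP origin with made-up paths and TTLs; it shows long-lived edge caching for static assets and short, shared caching with revalidation for API responses.

```typescript
// Hypothetical origin behind a CDN: cache headers decide what the edge may keep.
import { createServer } from "node:http";

const server = createServer((req, res) => {
  if (req.url?.startsWith("/assets/")) {
    // Fingerprinted static files: the CDN can cache them for a year.
    res.setHeader("Cache-Control", "public, max-age=31536000, immutable");
    res.end("/* static asset bytes */");
  } else if (req.url?.startsWith("/api/")) {
    // API responses: shared (CDN) caching for 60s, stale allowed while revalidating.
    res.setHeader("Cache-Control", "public, s-maxage=60, stale-while-revalidate=300");
    res.setHeader("Content-Type", "application/json");
    res.end(JSON.stringify({ ok: true, servedAt: new Date().toISOString() }));
  } else {
    res.statusCode = 404;
    res.end();
  }
});

server.listen(3000, () => console.log("origin listening on :3000"));
```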


Lessons Learned Rebuilding A Large E-Commerce Website With Next.js (Case Study)

Smashing Magazine

That was until we went to production with our highest-traffic customer. To mitigate the performance issues, we had to add a lot of (unbudgeted) extra servers and aggressively cache pages on a reverse proxy. A statically generated site can be hosted on a CDN like Vercel or Netlify, which results in lower latency. As a result, they found that a 0.1s…
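In Next.js, the CDN point is usually realized through static generation with incremental revalidation, so pages are served from the edge instead of being rendered per request. The page below is only a sketch; the route, data URL, and 60-second revalidation window are assumptions, not the case study's actual setup.

```tsx
// pages/products/[slug].tsx (illustrative route and data source)
import type { GetStaticPaths, GetStaticProps } from "next";

type Product = { slug: string; name: string; price: number };

export default function ProductPage({ product }: { product: Product }) {
  return <h1>{product.name}</h1>;
}

export const getStaticPaths: GetStaticPaths = async () => ({
  paths: [],            // generate pages lazily, on first request
  fallback: "blocking", // render once on the server, then reuse the static result
});

export const getStaticProps: GetStaticProps<{ product: Product }> = async ({ params }) => {
  const res = await fetch(`https://example.com/api/products/${params?.slug}`);
  if (!res.ok) return { notFound: true };
  const product: Product = await res.json();
  // The generated page can sit on the CDN and is regenerated at most once per minute.
  return { props: { product }, revalidate: 60 };
};
```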
