
What is Greenplum Database? Intro to the Big Data Database

Scalegrid

Greenplum Database is a massively parallel processing (MPP) SQL database built on PostgreSQL. It scales to multi-petabyte data workloads by spreading data across a cluster of powerful servers that work together behind a single SQL interface, so you can query all of the data in one place.
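Because Greenplum speaks PostgreSQL, ordinary PostgreSQL drivers and SQL work against the cluster. Below is a minimal sketch, assuming a hypothetical coordinator host, database, and credentials and the psycopg2 driver; the DISTRIBUTED BY clause is Greenplum's way of spreading a table's rows across its segment servers.

```python
# Minimal sketch: talking to a Greenplum cluster with ordinary PostgreSQL tooling.
# Host, database, and credentials below are hypothetical placeholders.
import psycopg2

conn = psycopg2.connect(
    host="gp-master.example.com",  # Greenplum coordinator (master) node
    dbname="analytics",
    user="gpadmin",
    password="secret",
)

with conn, conn.cursor() as cur:
    # DISTRIBUTED BY tells Greenplum which column to hash rows on,
    # so they are spread evenly across the cluster's segment servers.
    cur.execute("""
        CREATE TABLE IF NOT EXISTS page_views (
            view_id   bigint,
            user_id   bigint,
            viewed_at timestamptz
        ) DISTRIBUTED BY (user_id);
    """)
    # Queries look like plain PostgreSQL; the MPP engine parallelizes them
    # across segments behind the single SQL interface.
    cur.execute("SELECT count(*) FROM page_views;")
    print(cur.fetchone()[0])

conn.close()
```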


Even more amazing papers at VLDB 2019 (that I didn’t have space to cover yet)

The Morning Paper

MongoDB is an important database, and this paper explains the tunable (per-operation) consistency models that MongoDB provides and how they are implemented under the covers. Microsoft have a paper describing their new recovery mechanism in Azure SQL Database, the key feature being that it can recover in constant time. Yes please!
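As a rough illustration of what tunable, per-operation consistency looks like in practice, here is a sketch using PyMongo's read and write concern options; the connection string, database, and collection names are hypothetical.

```python
# Sketch of MongoDB's per-operation (tunable) consistency knobs via PyMongo.
# Connection string, database, and collection names are hypothetical.
from pymongo import MongoClient
from pymongo.read_concern import ReadConcern
from pymongo.write_concern import WriteConcern

client = MongoClient("mongodb://localhost:27017/?replicaSet=rs0")
orders = client.shop.orders

# Strong settings for this operation only: acknowledge once a majority of
# replica set members have journaled the write, so it survives failover.
strict = orders.with_options(write_concern=WriteConcern(w="majority", j=True))
strict.insert_one({"order_id": 1, "total": 42.0})

# Relaxed, low-latency read of the same collection: "local" may return data
# that has not yet been majority-committed.
fast = orders.with_options(read_concern=ReadConcern("local"))
print(fast.find_one({"order_id": 1}))

# A different operation can instead ask for majority-committed reads.
safe = orders.with_options(read_concern=ReadConcern("majority"))
print(safe.find_one({"order_id": 1}))
```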


Trending Sources


Expanding the Cloud - Cluster Compute Instances for Amazon EC2.

All Things Distributed

During my academic career, I spent many years working on HPC technologies such as user-level networking interfaces, large-scale high-speed interconnects, and HPC software stacks. Given the specialized nature of these platforms, they require dedicated resources to maintain and operate, and they place a big burden on the IT organization.


Expanding the Cloud: Introducing Amazon QuickSight

All Things Distributed

However, the data infrastructure to collect, store, and process data is geared toward developers. In AWS' quest to enable the best data storage options for engineers, we have built several innovative database solutions such as Amazon RDS, Amazon RDS for Aurora, Amazon DynamoDB, and Amazon Redshift. Big data challenges.


5 data integration trends that will define the future of ETL in 2018

Abhishek Tiwari

ETL stands for extract, transform, load, and it is generally used for data warehousing and data integration. ETL is a product of the relational database era, and it has not evolved much in the last decade. Several emerging data trends will define the future of ETL in 2018. Machine learning meets data integration.
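For readers new to the term, a toy sketch of the three phases follows; the CSV file, column names, and SQLite target are hypothetical stand-ins for a real source system and data warehouse.

```python
# Minimal extract-transform-load sketch. The CSV path and schema are
# hypothetical; real pipelines typically load into a data warehouse rather
# than SQLite, but the three phases are the same.
import csv
import sqlite3

def extract(path):
    # Extract: read raw rows from a source system (here, a CSV export).
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def transform(rows):
    # Transform: clean and reshape records before loading.
    for row in rows:
        yield (row["order_id"], row["customer"].strip().lower(), float(row["amount"]))

def load(records, db_path="warehouse.db"):
    # Load: write the transformed records into the target store.
    conn = sqlite3.connect(db_path)
    with conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS orders (order_id TEXT, customer TEXT, amount REAL)"
        )
        conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", records)
    conn.close()

if __name__ == "__main__":
    load(transform(extract("orders.csv")))
```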


Spice up your Analytics: Amazon QuickSight Now Generally Available in N. Virginia, Oregon, and Ireland.

All Things Distributed

They require companies to provision and maintain complex hardware infrastructure and to pay for expensive software licenses, maintenance fees, and support fees that can run to thousands of dollars per user per year. Customers can visualize their data by picking a table and then getting to a visualization in just a few clicks.
