
Spark: Cluster Computing with Working Sets

To use Spark, developers write a driver program that implements the high-level control flow of their application and launches various operations in parallel. Spark provides two main abstractions for parallel programming: resilient distributed datasets and parallel operations on these datasets (invoked by passing a function to apply on a dataset).

Cited by: 6020
Publish year: 2010
Authors: Matei Zaharia, Mosharaf Chowdhury, Michael J. Fr
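The driver-program model above can be sketched with a toy, single-process stand-in for Spark's abstractions. This is a hedged illustration, not Spark itself: the `MiniRDD` class and its `map`/`filter`/`reduce` methods are hypothetical names that merely mirror the paper's idea of a dataset plus parallel operations invoked by passing a function; real Spark would execute these operations across a cluster.

```python
# A minimal local sketch of the driver-program model described above.
# NOT Spark itself: MiniRDD is a hypothetical stand-in that mimics the
# paper's two abstractions -- a resilient distributed dataset and
# operations invoked by passing a function to apply on the dataset.
from functools import reduce as _reduce


class MiniRDD:
    """Toy single-process stand-in for a resilient distributed dataset."""

    def __init__(self, data):
        self.data = list(data)

    def map(self, f):
        # In Spark, f would be applied to each partition in parallel.
        return MiniRDD(f(x) for x in self.data)

    def filter(self, pred):
        return MiniRDD(x for x in self.data if pred(x))

    def reduce(self, f):
        # Aggregates the dataset with f; Spark does this cluster-wide.
        return _reduce(f, self.data)


# "Driver program": high-level control flow launching operations.
lines = MiniRDD(["error: disk", "ok", "error: net"])
errors = lines.filter(lambda s: s.startswith("error"))
count = errors.map(lambda s: 1).reduce(lambda a, b: a + b)
print(count)  # 2
```

The driver owns the control flow (building `lines`, chaining `filter`, `map`, `reduce`), while the dataset abstraction hides where and how each function is actually applied.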