Running Apache Spark with Docker Swarm
Updated Feb 25, 2021 - Dockerfile
Apache Spark is an open-source, distributed, general-purpose cluster-computing framework. It provides an interface for programming entire clusters with implicit data parallelism and fault tolerance.
PySpark in Docker containers
Docker setup for Apache Spark and the R sparklyr package
Docker image of Morpheus, the openCypher project previously known as Cypher for Apache Spark
This repository holds examples and documentation for the most-used tools in the data engineering ecosystem.
Small development-environment setup for Apache Spark with Docker
Container configurations saved from other work or personal projects
Apache Spark cluster connected to a Jupyter Notebook instance
Collection of Apache Spark Docker images for OKDP
A simple Apache Spark environment for development work
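For the Docker Swarm setup named in the title, a stack file is the usual starting point. The sketch below is a hypothetical example: the `bitnami/spark` image and its `SPARK_MODE` / `SPARK_MASTER_URL` environment variables are assumptions drawn from a common community image, not from any of the repositories listed above.

```yaml
# Hypothetical Swarm stack sketch; image name and env vars are assumptions.
version: "3.8"
services:
  spark-master:
    image: bitnami/spark:3.1.1
    environment:
      - SPARK_MODE=master              # assumed image convention
    ports:
      - "8080:8080"                    # master web UI
      - "7077:7077"                    # master RPC endpoint
  spark-worker:
    image: bitnami/spark:3.1.1
    environment:
      - SPARK_MODE=worker
      - SPARK_MASTER_URL=spark://spark-master:7077
    deploy:
      replicas: 2                      # Swarm spreads workers across nodes
```

With Swarm initialized, `docker stack deploy -c docker-compose.yml spark` brings the cluster up, and the worker replica count can be scaled without touching the file.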
Created by Matei Zaharia
Released May 26, 2014