sparkR.init {SparkR}R Documentation

Initialize a new Spark Context.

Description

This function initializes a new SparkContext.

Usage

sparkR.init(master = "", appName = "SparkR",
  sparkHome = Sys.getenv("SPARK_HOME"), sparkEnvir = list(),
  sparkExecutorEnv = list(), sparkJars = "", sparkRLibDir = "",
  sparkPackages = "")

Arguments

master

The Spark master URL.

appName

Application name to register with the cluster manager.

sparkHome

Path to the Spark home directory.

sparkEnvir

Named list of environment variables to set on worker nodes.

sparkExecutorEnv

Named list of environment variables to be used when launching executors.

sparkJars

Character vector of jar files to pass to the worker nodes.

sparkRLibDir

The path where R is installed on the worker nodes.

sparkPackages

Character vector of packages from spark-packages.org.
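
Details

A minimal sketch of how these arguments might be combined, assuming a local Spark installation; the master URL, memory setting, and package coordinate below are illustrative values, not prescribed ones:

```r
library(SparkR)

# Connect to a local master with two worker threads. The executor memory
# and the spark-packages coordinate are example values only.
sc <- sparkR.init(
  master        = "local[2]",
  appName       = "SparkR-example",
  sparkEnvir    = list(spark.executor.memory = "512m"),
  sparkPackages = "com.databricks:spark-csv_2.10:1.0.3"
)

# Stop the context when finished.
sparkR.stop()
```

Note that `sparkEnvir` takes a named list, while `sparkJars` and `sparkPackages` take character vectors, so multiple entries are supplied as, e.g., `c("a.jar", "b.jar")`.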

Examples

## Not run: 
##D sc <- sparkR.init("local[2]", "SparkR", "/home/spark")
##D sc <- sparkR.init("local[2]", "SparkR", "/home/spark",
##D                  list(spark.executor.memory="1g"))
##D sc <- sparkR.init("yarn-client", "SparkR", "/home/spark",
##D                  list(spark.executor.memory="1g"),
##D                  list(LD_LIBRARY_PATH="/directory of JVM libraries (libjvm.so) on workers/"),
##D                  c("jarfile1.jar","jarfile2.jar"))
## End(Not run)

[Package SparkR version 1.4.1 Index]