Programming and Deploying Apache Spark Applications
Course

Skillsoft
Updated Mar 20, 2018
Apache Spark is a cluster computing framework for fast processing of Hadoop data. In this course, you will learn how to develop Spark applications in Scala, Java, or Python, test and deploy them to a cluster, monitor clusters and applications, and schedule resources for both clusters and individual applications.
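
To give a rough sense of the kind of application the course covers, below is a minimal word-count sketch in Scala. The object name WordCount, the input and output paths, and the jar name in the submit command are placeholders, not part of the course material.

import org.apache.spark.sql.SparkSession

object WordCount {
  def main(args: Array[String]): Unit = {
    // SparkSession is the entry point for a Spark application.
    val spark = SparkSession.builder()
      .appName("WordCount")
      .getOrCreate()

    // Read a text file into an RDD of lines (placeholder path).
    val lines = spark.sparkContext.textFile("hdfs:///data/input.txt")

    // Split lines into words, pair each word with a count of 1,
    // then sum the counts per word across the cluster.
    val counts = lines
      .flatMap(_.split("\\s+"))
      .map(word => (word, 1))
      .reduceByKey(_ + _)

    // Write the results to a placeholder output directory.
    counts.saveAsTextFile("hdfs:///data/word-counts")

    spark.stop()
  }
}

Once packaged as a jar, an application like this would typically be deployed to a cluster with spark-submit, for example: spark-submit --class WordCount --master yarn wordcount.jar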