Going deep with Spark Streaming

Scale
06/02/2015 - 14:30 to 15:10
Stage 2
long talk (40 min)
Intermediate

Session abstract: 

If a byte of data were a gallon of water, today it would take only 10 seconds to generate enough data to fill an average home; by 2020 it will take only 2 seconds. The Internet of Things is driving a tremendous amount of this growth, providing more data at a higher rate than we have ever seen. With this explosive growth comes the demand from consumers and businesses to leverage and act on what is happening right now. Without stream processing these demands will never be met, and there will be no big data and no Internet of Things. Apache Spark, and Spark Streaming in particular, can be used to fulfill this stream processing need now and in the future. In this talk I will peel back the covers and we will take a deep dive into the inner workings of Spark Streaming, discussing topics such as DStreams, input and output operations, transformations, and fault tolerance. After this talk you will be ready to take on the world of stream processing using Apache Spark.
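To give a flavor of the concepts the abstract names (DStreams, input and output operations, transformations), here is a minimal sketch using Spark Streaming's Scala DStream API; it is not material from the talk itself, and the host, port, and batch interval are placeholder assumptions.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object StreamingSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("StreamingSketch").setMaster("local[2]")
    // Batch interval of 5 seconds: a DStream is a sequence of RDDs, one per batch
    val ssc = new StreamingContext(conf, Seconds(5))

    // Input operation: a DStream fed by a TCP text source (host/port are placeholders)
    val lines = ssc.socketTextStream("localhost", 9999)

    // Transformations: applied to the RDD of each batch
    val counts = lines.flatMap(_.split(" "))
                      .map(word => (word, 1))
                      .reduceByKey(_ + _)

    // Output operation: print the first elements of each batch to stdout
    counts.print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```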

Video: 

Slide: 
