Course: Big Data analysis with Hadoop and RHadoop

Europe/Ljubljana
Zoom Meeting

Description

This training course will focus on the foundations of “Big Data” processing by introducing the Hadoop distributed computing architecture and providing an introductory-level tutorial on Big Data analysis with Hadoop, RHadoop, and the R libraries parallel, doParallel, foreach and Rmpi. Although held online, the course will be hands-on, allowing participants to work interactively on real data in the High Performance Computing environment of the University of Ljubljana and on the Vienna Scientific Cluster.


Organizers

This course is a EuroCC event jointly organised by EuroCC Slovenia, EuroCC Slovakia and EuroCC Austria.

 


Lecturers

  • Prof. Janez Povh, University of Ljubljana, Slovenia (applied mathematics, high performance computing, big data analysis)
  • Lucia Absalon Bautista, University of Ljubljana, Slovenia (big data analysis)
  • Dr. Giovanna Roda, EuroCC Austria, BOKU, and TU Wien, Austria (high performance computing, big data analysis)
  • Liana Akobian, TU Wien, Austria (high performance computing, big data analysis)

 

Agenda
  • Wednesday, 19 October
    • Introduction
    • Introduction to Hadoop

      Introduction to Big Data
      The Hadoop Distributed Computing Architecture
      First hands-on exercise on the cluster

    • Break
    • HDFS

      The Hadoop Distributed File System: blocks, partitions, load balancing, replication/erasure coding, fault tolerance, data locality
      Hands-on example: managing data on HDFS (see the rhdfs sketch after the agenda)

    • Break
    • MapReduce (MR)

      Explaining the MR computing model
      Split / map / sort & shuffle / combine / reduce
      Hands-on demos (see the rmr2 sketch after the agenda)

    • Break
    • Hands-on exercise with MR
  • Thursday, 20 October
    • Introduction to Day 2
    • Introduction to R

      Connecting to the RStudio web server at HPC@UL
      Creating and running your own R scripts
      Creating, retrieving and saving data files
      Standard data management operations on data frames
      Data management with dplyr and magrittr (see the dplyr sketch after the agenda)

    • Break
    • Advanced and Big Data management with R

      Data manipulation with the apply family of functions: apply, lapply, sapply, vapply, tapply and mapply
      Big Data management and analysis using one computing node with functions for efficient parallel loops: parLapply, parSapply, mclapply and foreach with %dopar% (see the parallel-loop sketch after the agenda)

    • Break
    • Big Data management and analysis with Rmpi and RHadoop

      Big Data management and analysis using many computing nodes and the Rmpi library (see the Rmpi sketch after the agenda)
      Preparing and storing Big Data on HDFS using the rhdfs library
      Retrieving and managing Big Data in HDFS with the plyrmr and rhdfs libraries

    • Break
    • Big Data analysis with RHadoop

      Preparing map-reduce scripts for basic data analysis tasks (extreme values, counts, mean values, dispersions, visualisations) using the rmr2 library (see the summary-statistics sketch after the agenda)

    • Wrap-up
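
Illustrative code sketches

The short R sketches below illustrate the tools referenced in the agenda. They are minimal examples under stated assumptions, not course material: paths, host names, core counts and data are placeholders to be adapted to the HPC@UL or Vienna Scientific Cluster environment.

Managing data on HDFS with rhdfs. A minimal sketch, assuming Hadoop is installed and HADOOP_CMD points to the hadoop binary; all paths and file names are hypothetical.

Sys.setenv(HADOOP_CMD = "/usr/local/hadoop/bin/hadoop")  # adjust to the cluster

library(rhdfs)
hdfs.init()                                    # connect to the cluster's HDFS

hdfs.mkdir("/user/me/course-data")             # create a directory on HDFS
hdfs.put("local-file.csv",                     # copy a local file ...
         "/user/me/course-data/file.csv")      # ... into HDFS
print(hdfs.ls("/user/me/course-data"))         # list the directory contents
hdfs.get("/user/me/course-data/file.csv",      # copy the file back
         "copy-of-file.csv")                   # to the local disk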
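
The MapReduce computing model with rmr2. A minimal sketch of a map / shuffle / reduce job written with the rmr2 package of RHadoop; it tabulates integers by their last digit. The "local" backend is an assumption that lets the same code run without a Hadoop cluster for testing.

library(rmr2)

rmr.options(backend = "local")        # for testing; use "hadoop" on the cluster

ints <- to.dfs(1:1000)                # write the input to the distributed store

out <- mapreduce(
  input  = ints,
  map    = function(k, v) keyval(v %% 10, 1),     # emit (last digit, 1) pairs
  reduce = function(k, vv) keyval(k, sum(vv)))    # sum the counts per digit

from.dfs(out)                         # read the result back into R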
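
Data-frame operations with dplyr and magrittr. A minimal sketch on a hypothetical toy data frame; the column names are placeholders for the course data set.

library(dplyr)     # data manipulation verbs
library(magrittr)  # the %>% pipe operator

flights <- data.frame(carrier = c("A", "A", "B", "B", "B"),
                      delay   = c(5, 12, NA, 3, 20))

flights %>%
  filter(!is.na(delay)) %>%               # drop missing values
  group_by(carrier) %>%                   # split by carrier
  summarise(mean_delay = mean(delay),     # aggregate per group
            n = n()) %>%
  arrange(desc(mean_delay))               # order the summary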
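
Parallel loops on one computing node. A minimal sketch comparing a sequential sapply() with mclapply(), parLapply() and foreach with %dopar%; the toy workload and the number of cores are placeholders. Note that mclapply() relies on forking and therefore parallelises only on Unix-like systems.

library(parallel)
library(doParallel)
library(foreach)

slow_task <- function(i) { Sys.sleep(0.01); i^2 }    # stand-in for real work

res_seq <- sapply(1:100, slow_task)                  # sequential baseline

res_mc  <- mclapply(1:100, slow_task, mc.cores = 4)  # fork-based parallelism

cl <- makeCluster(4)                                 # socket cluster on one node
res_par <- parLapply(cl, 1:100, slow_task)

registerDoParallel(cl)                               # reuse the cluster for foreach
res_fe  <- foreach(i = 1:100, .combine = c) %dopar% slow_task(i)
stopCluster(cl)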
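
Distributing work over many nodes with Rmpi. A minimal sketch, assuming a working MPI installation and that the script is launched through the batch scheduler (e.g. mpirun or srun); the worker count and workload are placeholders.

library(Rmpi)

mpi.spawn.Rslaves(nslaves = 8)              # start R workers across the allocation

slow_task <- function(i) sum(rnorm(1e6)^2)  # stand-in workload
res <- mpi.parSapply(1:64, slow_task)       # scatter the iterations to the workers

mpi.close.Rslaves()                         # shut the workers down cleanly
mpi.quit()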
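
Basic summary statistics with RHadoop. A minimal sketch computing per-group counts, means and maxima with an rmr2 map-reduce job; the toy (group, value) pairs are placeholders for a data set already stored on HDFS.

library(rmr2)

# rmr.options(backend = "local")  # uncomment to test without a Hadoop cluster

groups <- sample(letters[1:3], 1000, replace = TRUE)   # toy group labels
values <- rnorm(1000)                                  # toy measurements
input  <- to.dfs(keyval(groups, values))               # write pairs to the store

stats <- mapreduce(
  input  = input,
  map    = function(k, v) keyval(k, v),                # pass the pairs through
  reduce = function(k, vv) keyval(k, data.frame(n    = length(vv),
                                                mean = mean(vv),
                                                max  = max(vv))))

from.dfs(stats)                                        # collect the summaries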