From January 4th to mid-March 2016, I taught an introductory course on big data technologies, platforms, and tools at Collège de Bois de Boulogne in Montréal, Canada.
For the most part, it was a hands-on course.
Students were expected to know at least one programming language among C, C++, Java, and Python, and to have some familiarity with basic statistics and SQL.
Given the introductory nature of the course, I covered the fundamental platforms, such as Hadoop, MapReduce v1 and v2, YARN, and HDFS, along with tools such as Pig and Hive. The course then introduced Cloudera Manager as an administration tool, and students were required to set up small clusters using Cloudera Manager, Ambari, or a vanilla (manual) installation.
Most students were able to carry out the cluster installation successfully. Some sample installation reports are listed below, followed by a short sketch of the MapReduce programming model covered in class:
TP - Mise en Place d'un cluster_hortonworks_CBH
TP1Cluster_reza_naidji_conteneur_LXC
Tp1_cloudera_mario_nadon
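To give a flavour of the MapReduce programming model discussed in the first part of the course, here is the canonical word-count job written against the Hadoop MapReduce Java API. This is a minimal illustrative sketch only, not taken from the course materials or the students' reports; the class name and input/output paths are placeholders.

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// The classic word-count job: mappers emit (word, 1) pairs,
// and reducers sum the counts for each word.
public class WordCount {

  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {

    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      // Split each input line into tokens and emit (token, 1).
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {

    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      // Sum all counts emitted for this word.
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class); // local pre-aggregation
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    // Input and output HDFS paths are passed on the command line.
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

On a cluster like those the students installed, such a job would be packaged into a jar and submitted with something like `hadoop jar wordcount.jar WordCount /user/student/input /user/student/output` (the jar name and paths here are hypothetical).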
Students were also required to choose their own topic for a final project, in an application domain based on their own interests. Overall, most students saw the final project as an opportunity to apply what they had learned in class to their own needs, whether for future work or for upcoming courses in the big data specialization at Collège de Bois de Boulogne. Some of the projects are listed below:
Hadoop ProjetSession Charles Brisson, Mario Nadon et Yadong Wang
BD3ProjetSession_Yvon Cadieux_Angelo Fernandes
Présentation Éric TREMBLAY et Raoul_kouanda
PrésentationYacine BELHOUL, Abdellilah NAFIA et Cadrick NOUTCHA
Projet de Session Khedidja Seridi et salim Rahali
TP2_session Reda Louahala et Albert Zhu
presentation_Hadoop_nasser_amami_deneus
Overall, it was a challenging yet very satisfying course for most students.