Guaranteed Job Placement Ad

We provide training with guaranteed job placement as a full-time big data engineer at a Big 4 consulting company, which is our direct client. This client is looking to hire multiple entry-level big data engineers nationwide.

To take this training, you must meet the following criteria:

Minimum 1 year of experience building and deploying Java applications

Minimum 1 year of experience with traditional ETL tools and RDBMS

US citizens and green card holders only

Able to travel 75% of the time, as required by the job

If you do not meet the prerequisites, we can provide prerequisite training for an extra cost; however, you must be a US citizen or green card holder.

Disclaimer: We will train you and prepare you for this job, acting as your coach. However, you must pass the interview with our Big 4 client, which we will prepare you for. You will have to work very hard; if you are not willing to put in the required effort, we cannot guarantee your success. We will provide regular feedback to each student during the class so you will know where you stand in terms of your readiness for this open position.

Why Big Data Training from Omni212?

1. Omni212 is a Hortonworks Hadoop Consulting Partner.

2. Two of our most experienced, certified Hadoop instructors teach this class.

3. Employees of Microsoft, Cap Gemini, McKesson, and FMR Boston have taken our classes.

4. Upon registration and payment, you get immediate access to previous class recordings and course material on the cloud.

5. Recordings of this class will be made available.

6. Post-class support.

7. Course material provided.

8. A cloud account on Microsoft Azure with hands-on lab exercises under the guidance of two experienced big data/Hadoop instructors.

9. Career advancement and job placement assistance.

Next class starting

May 22, 2017

Video Conference link

Will be sent upon your registration and payment

Pre-requisite

Minimum 1 year of experience building and deploying Java applications

Minimum 1 year of experience with traditional ETL tools and RDBMS

US citizens and green card holders only

Able to travel 75% of the time, as required by the job

If you do not meet the prerequisites, we can provide prerequisite training for an extra cost.

Software access

A Microsoft Azure cloud account will be provided to every student, who will install Hortonworks Hadoop on cloud virtual machines. Students will carry out the hands-on lab exercises with instructor guidance.

Course Outline

Session 1: Big Data Basics

• An introduction to Big Data

• Why Big Data? Why now?

• The Three Dimensions of Big Data (Three Vs)

• Evolution of Big Data

• Big Data versus Traditional RDBMS Databases

• Big Data versus Traditional BI and Analytics

• Big Data versus Traditional Storage

• Key Challenges in Big Data adoption

• Benefits of adoption of Big Data

• Introduction to Big Data Technology Stack

• Apache Hadoop Framework

• Introduction to Microsoft HDInsight – Microsoft’s Big Data Service

Hands-On Lab:

• Creating Azure Storage Account

• Creating HDInsight Cluster

• Using services on HDInsight Cluster

Session 2: The Big Data Technology Stack

• Basics of Hadoop Distributed File System (HDFS)

• Basics of Hadoop Distributed Processing (MapReduce jobs)

Hands-On Lab:

• Loading files to Azure storage account

• Moving files across HDInsight Cluster

• Remote Access to Azure Storage Account and HDInsight Cluster

Session 3: Deep dive into Hadoop Storage System (HDFS) (1 Hour)

• HDFS

• Reading files with HDFS

• Writing files with HDFS

• Error Handling

Hands-On Lab:

• Accessing Hadoop configuration files using HDInsight Cluster

Session 4: Processing Big Data – MapReduce and YARN

• How MapReduce works

• Handling Common Errors

• Bottlenecks with MapReduce

• How YARN (MapReduce v2) works

• Difference between MR1 and MR2

• Error Handling

Hands-On Lab:

• Running a simple MapReduce application (word count)

• Running a custom MapReduce application (census data)

• Running a MapReduce application using PowerShell

• Monitoring application status
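The word-count lab above follows the classic MapReduce pattern: a map phase emits (word, 1) pairs for each input line, a shuffle groups pairs by key, and a reduce phase sums the counts per word. As a rough illustration of that data flow only (plain, single-process Java, not the actual Hadoop Mapper/Reducer API used in class), it can be sketched like this:

```java
import java.util.AbstractMap.SimpleEntry;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

// A simplified, single-process sketch of the MapReduce word-count pattern.
// Real Hadoop jobs extend org.apache.hadoop.mapreduce.Mapper/Reducer and run
// distributed; this only illustrates the map -> shuffle -> reduce idea.
public class WordCountSketch {

    // Map phase: emit a (word, 1) pair for every word in one input line.
    static List<Map.Entry<String, Integer>> map(String line) {
        List<Map.Entry<String, Integer>> pairs = new ArrayList<>();
        for (String word : line.toLowerCase().split("\\W+")) {
            if (!word.isEmpty()) {
                pairs.add(new SimpleEntry<>(word, 1));
            }
        }
        return pairs;
    }

    // Shuffle + reduce phase: group the pairs by word and sum the counts.
    static Map<String, Integer> reduce(List<Map.Entry<String, Integer>> pairs) {
        Map<String, Integer> counts = new TreeMap<>();
        for (Map.Entry<String, Integer> p : pairs) {
            counts.merge(p.getKey(), p.getValue(), Integer::sum);
        }
        return counts;
    }

    public static void main(String[] args) {
        // Two "input splits"; in Hadoop, each would be mapped on a separate node.
        String[] input = { "big data big ideas", "data drives ideas" };
        List<Map.Entry<String, Integer>> pairs = new ArrayList<>();
        for (String line : input) {
            pairs.addAll(map(line));
        }
        System.out.println(reduce(pairs)); // prints {big=2, data=2, drives=1, ideas=2}
    }
}
```

Because map works on each line independently and reduce only needs the grouped pairs, the same logic parallelizes naturally across a cluster, which is the point the lab demonstrates on HDInsight.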

Session 5: Big Data Development Framework

• Introduction to Hive

• Introduction to Pig

• HBase

Hands-On Lab:

• Loading data into Hive

• Submitting Pig jobs using HDInsight

• Submitting Pig jobs via PowerShell

Session 6: Big Data Integration and Management

• Big Data Integration using Polybase

• Big Data Management using Ambari

Hands-On Lab:

• Fetching HDInsight data into SQL

• Using Ambari for managing HDInsight cluster

Session 7: Big Data – BI and Reporting using Power BI

• Introduction to Power BI

• Usual workflow of Power BI

• Power BI Ecosystem

• Getting Data into Power BI

• Reports vs Dashboards

• Additional elements of Power BI Reports

Hands-On Lab:

• Fetching HDInsight Data into Power BI desktop

• Data Modelling using Power BI desktop

• Creating reports using Power BI desktop

Session 8: PowerBI.com services – Deep dive

• Power BI Dashboards

• Natural Language Query

• Power BI Workspaces – Personal and Group Workspaces

• Sharing using OneDrive for Business

Hands-On Lab:

• Publishing reports to PowerBI.com

• Sharing reports using OneDrive for Business

End-to-End Use Case Implementation – Lab Exercise

• Use case – Healthcare Analytics using the Hadoop framework through Microsoft HDInsight and Power BI

Class Size: Maximum 22

Price: $595

PLEASE NOTE:

1. Each student gets access to a login account on the cloud (Microsoft Azure), where they install Hadoop on a cloud virtual machine and perform hands-on lab exercises with instructor guidance.

2. There will be 2 experienced big data/Hadoop instructors supporting the students throughout the class.