Meta Edversity

ETL


Ab Initio

The Ab Initio training course explains how Ab Initio and the client-server model work together. During this training, candidates learn data analysis and manipulation with Ab Initio, a GUI-based parallel-processing product used primarily to fetch, transform, and load data in batch.

Prerequisites
  • Basics of data warehousing
  • Knowledge of Unix commands
  • Ability to write SQL queries

Alteryx

This training will get candidates up and running with the fundamentals of importing, transforming, and exporting data with Alteryx. The objective of the training is to learn about Alteryx product features, getting data in and out of Alteryx, and analysing the data.

Prerequisites
  • There are no prerequisites for this training.

Apache NiFi

This training provides the fundamental concepts and experience necessary to automate the ingress, flow, transformation, and egress of data using Apache NiFi. Candidates will create and run NiFi dataflows for a variety of scenarios. They will gain expertise using processors, connections, and process groups, and will use NiFi Expression Language to control the flow of data from various sources to multiple destinations. Participants will monitor dataflows, examine progress of data through a dataflow, and connect dataflows to external systems such as Kafka, HDFS, and HBase. 

Prerequisites
  • Basic experience with Linux and exposure to big data concepts and applications is helpful.
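As a taste of the NiFi Expression Language mentioned above, a RouteOnAttribute processor property might hold expressions like the following (`filename` and `fileSize` are standard FlowFile attributes; the `large_csv` property name is illustrative):

```
# Route CSV files larger than 1 MiB to a "large_csv" relationship
large_csv : ${filename:endsWith('.csv'):and(${fileSize:gt(1048576)})}

# Uppercase a filename attribute when updating it
${filename:toUpper()}
```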

AWS Glue

If you have data that needs to be subjected to analytics, then you will likely need to put that data through an extract, transform, and load (ETL) process. AWS Glue is a fully managed service designed to do just this. Through a series of simple configurable options, candidates can select their source data to be processed by AWS Glue, turning it into catalogued, searchable, and queryable data. This training will take them through the fundamentals of AWS Glue to get them started with the service.
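Glue jobs themselves are typically written as PySpark scripts, but the ETL pattern the service automates can be sketched in plain Python. This is only an illustration of the three stages, not Glue's API, and all names here are made up:

```python
import csv
import io

# Extract: read raw records from a source (here, an in-memory CSV
# standing in for a file on S3 or a database table).
RAW = "id,amount\n1,10.5\n2,3.25\n3,7.0\n"

def extract(source: str) -> list[dict]:
    return list(csv.DictReader(io.StringIO(source)))

# Transform: clean and reshape each record into typed fields.
def transform(rows: list[dict]) -> list[dict]:
    return [{"id": int(r["id"]), "amount": float(r["amount"])} for r in rows]

# Load: write the records to a destination (here, a list standing
# in for a data-warehouse table).
def load(rows: list[dict], table: list) -> None:
    table.extend(rows)

table: list = []
load(transform(extract(RAW)), table)
print(table[0])  # {'id': 1, 'amount': 10.5}
```

In Glue, the catalogue of sources and destinations replaces the hard-coded strings above, but the extract-transform-load shape of the job is the same.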

Azure Data Factory

In this training, we review the features, concepts, and requirements necessary for designing data flows, and how to implement them in Microsoft Azure. We also cover the basics of data flows, common data flow scenarios, and what is involved in designing a typical data flow.

Prerequisites
  • To get the most from this training, candidates should have at least a basic understanding of data flows and what they are used for.

Google Cloud Dataflow

Cloud Dataflow is a serverless data processing service that runs jobs written using the Apache Beam libraries. When you run a job on Cloud Dataflow, it spins up a cluster of virtual machines, distributes the tasks in your job to the VMs, and dynamically scales the cluster based on how the job is performing. It may even change the order of operations in your processing pipeline to optimize your job. In this course, candidates will learn how to write data processing programs using Apache Beam and then run them using Cloud Dataflow. They will also learn how to run both batch and streaming jobs. 
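The scatter behaviour described above, where a job's tasks are distributed across a pool of workers, can be sketched in miniature with Python's standard library. A thread pool stands in for Dataflow's cluster of VMs; this is only an analogy, not Beam or Dataflow code:

```python
from concurrent.futures import ThreadPoolExecutor

records = list(range(10))

def process(record: int) -> int:
    # One unit of work in the "pipeline": square the record.
    return record * record

# The executor plays the role of Dataflow's worker pool, distributing
# the tasks in the job across workers and collecting the results in order.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(process, records))

print(results)  # [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
```

The real service adds what the sketch omits: dynamic scaling of the worker pool, fusion and reordering of pipeline stages, and support for unbounded (streaming) inputs.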

IBM Infosphere Datastage

This training enables project administrators and ETL developers to acquire the skills necessary to develop parallel jobs in DataStage. The emphasis is on developers, although administrative functions relevant to DataStage developers are also fully discussed. Candidates will learn to create parallel jobs that access sequential and relational data, and combine and transform that data using functions and other job components.

Prerequisites
  • Basic knowledge of Windows operating system
  • Familiarity with database access techniques

Informatica PowerCenter

This training teaches candidates advanced mapping techniques and performance tuning and provides a thorough review of session partitioning features found in PowerCenter version 9.5.1.

Prerequisites
  • Some Informatica PowerCenter experience

Microsoft SQL Server Integration Services (SSIS)

This SSIS training will equip candidates with the skills needed to work with SQL Server Integration Services (SSIS) for Business Intelligence. It involves working on the extraction, integration, and transformation of data to align it with business needs. The training will make candidates proficient in gathering data from flat files, XML, and relational data sources. Candidates will learn various aspects of SSIS such as data flows, ODBC setup and connection managers, flat file connections, import/export, and the Split and Join, Merge, and Union All transformations.

Pentaho

Pentaho training teaches candidates how to develop Business Intelligence (BI) dashboards using the Pentaho BI tool from scratch. The training explores the fundamentals of Pentaho Data Integration, creating an OLAP cube, integrating the Pentaho BI suite with Hadoop, and much more, through best practices. It also provides real-time projects to enhance candidates’ skills and help them successfully clear the Pentaho Data Integration certification exam.

SAP Data Services

This training will give candidates deep knowledge of SAP Data Services, including installation, scheduling, security, and configuration.

Prerequisites
  • Understanding of data handling and administration in SAP solutions
  • Basic knowledge of SQL
  • Data warehousing concepts

SSAS

Developing SQL Data Models training imparts core skills to implement multidimensional databases using SSAS (SQL Server Analysis Services), and create tabular semantic data models for analysis. This SSAS training is meant for database professionals who create enterprise BI solutions in order to fulfil the role of a BI Developer. This SQL Data Model Development training is also suitable for power users, information workers and data analysts.

Prerequisites
  • Working knowledge of Transact-SQL.
  • Working knowledge of relational databases.

Talend

The Talend training course builds your team’s ability to get the most value from your Talend software. Designed around practical exercises, the training maximises skill development with the product. Talend provides a development environment that lets candidates connect to many data sources and Big Data stores without having to learn and write complicated code.

Prerequisites
  • A basic understanding of SQL is helpful.
