About the 'IBM Data Warehouse Engineer' Professional Certificate course
This professional certificate is designed to help you develop the skills and portfolio needed for entry-level work in business intelligence (BI) or data warehouse engineering. The online courses in this program will immerse you in the in-demand role of a data warehouse engineer and provide you with the core skills needed to work with a variety of tools and databases to design, deploy, operate, and manage enterprise data warehouses (EDWs).
By the end of this Professional Certificate training, you will be able to perform the key tasks required of a Data Warehouse Engineer. You will work with relational database management systems (RDBMS) and query data using SQL statements.
You'll use Linux/UNIX shell scripts to automate repetitive tasks and build data pipelines using tools like Apache Airflow and Kafka to extract, transform, and load (ETL) data. You will gain experience managing databases and data warehouses.
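As a small illustration of the kind of repetitive task a shell script can automate, here is a hedged sketch that counts the data rows in every CSV file in a staging directory; the directory and file names are assumptions for illustration only, not course material.

```shell
#!/usr/bin/env bash
# Sketch: automate a repetitive check over CSV files in a (hypothetical)
# "staging" directory, as one might before loading them into a warehouse.
set -euo pipefail

# Create sample input so the script is self-contained
mkdir -p staging
printf 'id,name\n1,a\n2,b\n' > staging/customers.csv
printf 'id,total\n1,9.99\n'  > staging/orders.csv

for f in staging/*.csv; do
    # Subtract 1 to exclude the header row from the count
    rows=$(( $(wc -l < "$f") - 1 ))
    echo "$f: $rows data rows"
done
```

Wrapping checks like this in a loop is exactly the pattern that scales from two files to two thousand without extra effort.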
Finally, you will design and implement data storage systems and use business intelligence tools to analyze and extract information using reports and dashboards.
This program is suitable for anyone with a passion for learning, whether or not you have a college degree, and requires no prior data science or programming experience.
Applied Learning Project
Each course includes numerous hands-on assignments and a project to sharpen and apply the concepts and skills learned. By the end of the program, you will design, implement, configure, query and maintain multiple databases and create data pipelines using real-world tools and data warehouses, allowing you to build a portfolio of job-ready skills.
You'll start by initializing a database instance in the cloud. Next, you will design databases using Entity Relationship Diagrams (ERDs) and create database objects such as tables and keys using MySQL, PostgreSQL, and IBM Db2.
You will then learn to query databases using SQL SELECT, INSERT, UPDATE, and DELETE statements, and learn how to filter, sort, and aggregate result sets. Next, you'll become familiar with common Linux/Unix shell commands and use them to create Bash scripts.
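To make the SQL portion concrete, here is a hedged sketch of that same SELECT/INSERT/UPDATE/DELETE pattern, shown with Python's built-in sqlite3 module so it is self-contained; the course itself uses MySQL, PostgreSQL, and IBM Db2, and the `orders` table and its columns are illustrative assumptions.

```python
import sqlite3

# In-memory database so the example needs no server
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# A hypothetical table for illustration
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, total REAL)")

# INSERT: add rows
cur.executemany(
    "INSERT INTO orders (region, total) VALUES (?, ?)",
    [("EMEA", 120.0), ("EMEA", 80.0), ("APAC", 200.0)],
)

# UPDATE: modify matching rows
cur.execute("UPDATE orders SET total = total * 1.1 WHERE region = 'APAC'")

# DELETE: remove matching rows
cur.execute("DELETE FROM orders WHERE total < 100")

# SELECT with aggregation (SUM, GROUP BY) and sorting (ORDER BY)
cur.execute("SELECT region, SUM(total) FROM orders GROUP BY region ORDER BY region")
result = cur.fetchall()
print(result)
conn.close()
```

The same statements, with minor dialect differences, carry over directly to the RDBMSs used in the program.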
You will create data pipelines for batch and streaming ETL jobs using Apache Airflow and Kafka. Finally, you will implement data warehouses and create BI dashboards.
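The batch ETL jobs mentioned above follow an extract-transform-load pattern that an Apache Airflow DAG would orchestrate as separate tasks. Below is a minimal plain-Python sketch of that pattern; the record fields, function names, and in-memory "warehouse" are illustrative assumptions, not course code or the Airflow API.

```python
def extract():
    # Extract: pull raw records from a source system
    # (hard-coded here so the sketch is self-contained)
    return [
        {"id": 1, "amount": "19.99", "country": "us"},
        {"id": 2, "amount": "5.00", "country": "de"},
    ]

def transform(rows):
    # Transform: normalize types and values before loading
    return [
        {"id": r["id"], "amount": float(r["amount"]), "country": r["country"].upper()}
        for r in rows
    ]

def load(rows, warehouse):
    # Load: append cleaned rows to the target store
    # (a list standing in for a warehouse table)
    warehouse.extend(rows)

warehouse = []
load(transform(extract()), warehouse)
print(warehouse)
```

In Airflow, each of these functions would typically become its own task, with the DAG declaring that extract runs before transform, and transform before load.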