Title: Cloud Data Engineer

Primary Skill: Data Engineering

Secondary Skill: ETL, SQL, Java or Python

Seniority: Intermediate/Senior

Years: 3+

Emp Type: FTE

Location: Canada-wide Remote

Client position: 237

Description:

Our Client is a recognized and certified Managed Service Provider (MSP) for AWS, Azure, and GCP, and was positioned as a Leader in the 2020 Gartner Magic Quadrant for Public Cloud Infrastructure Professional and Managed Services. They serve more than 1,000 customers worldwide, helping them gain a competitive edge through the cloud.

Job Description:
* Design, build, and implement complex data solutions in the cloud (AWS, GCP, and/or Azure).
* Understanding of on-premises and cloud data platforms: databases, data marts, data hubs, data warehouses, and data lakes.
* Ability to consult on data governance, data management, data lifecycle, and ETL performance improvements.
* Translate business requirements into technical specifications; establish and define the details, definitions, and requirements of applications, components, and enhancements.
* Use an analytical, data-driven approach to develop a deep understanding of a fast-changing business.
* Document data ingestion and ETL program designs, present findings, and conduct peer code reviews.
* Develop and execute test plans to validate code.
* Build CI/CD pipelines and processes for automation.

Requirements:
* Bachelor's degree (B.A./B.S.) in Computer Science, Technology, Computer Information Systems, Computer Applications, Engineering, or a related field is required.
* Google Cloud Data Engineer Certification preferred
* 3+ years' experience building complex ETL programs with Informatica, DataStage, Spark, Dataflow, Snowflake, etc.
* Experience and/or exposure to one or more of the following: Apache Beam, GCP Cloud Dataflow, GCP Dataprep, Trifacta, GCP Data Fusion, CDAP, GCP Dataproc, AWS Data Pipeline, AWS Glue.
* 3+ years' experience in Python and/or Java, developing complex SQL queries, and working with relational database technologies.
* Experience configuring big data solutions in a cloud environment (AWS, Azure or GCP).
* Experience using cloud storage and computing technologies such as BigQuery, Snowflake, Redshift, or Synapse.
* Experience developing complex technical and ETL programs within a Hadoop ecosystem.
* Strong understanding of cloud technology (especially GCP and AWS), with broad knowledge of infrastructure and the GCP and AWS ecosystem technologies.
* Understanding of, and some experience with, Professional Services and Consulting engagements.
