Azure Data Factory Engineer


Industry: Technology Consulting | Remote

2 years ago

Primary Skills Required
Design and implement Data Engineering Solutions and ETL Processes with the Azure stack

Resource Type
Direct Hire

$135,000.00 to $175,000.00 salary + bonus


Job Description

This client is a fast-growing Microsoft managed partner that specializes in providing complex data analytics solutions to clients globally. Great culture, very low turnover, and the majority of their clients are in the Fortune 1000. They use a remote model, so this position can be located anywhere in the U.S.

Principal Duties and Responsibilities:

  • Design and implement Data Engineering Solutions and ETL Processes with the Azure stack including Azure Data Factory, Azure Data Lake, Azure SQL Server, Databricks, etc.
  • Independently solve complex technical problems.
  • Work under pressure and maintain composure and professionalism in a fast-moving environment with multiple changing priorities and tasks.
  • Participate in the definition and revision of development best practices and standards.

Technical Skills:

  • Competently perform advanced technical tasks with minimal supervision, including the design and implementation of Data Engineering solution components (data ingestion, curation, process orchestration, etc.).
  • Strong understanding of enterprise ETL tools, Data Engineering technology stacks, and solutions.
  • Strong understanding of Software Engineering principles and best practices.
  • Strong understanding of ETL design patterns and architectures (ETL vs. ELT).
  • Strong understanding of cloud data topologies (e.g., data lake).
  • Strong SQL querying and programming skills.
  • Good understanding of data analytics architectural approaches and data models (e.g., Kimball, Data Vault).
  • Understanding of all aspects of development including, but not limited to, gathering requirements, developing technical components within the process scope, and supporting testing and post-implementation activities.
  • Ability to work and partner with users and stakeholders to gather solution requirements.
  • Ability to adapt to new technical innovations and business processes.

Education and Experience Requirements:

  • 5 or more years of relevant experience.
  • Bachelor’s degree or certification required.
  • Experience with one or more enterprise ETL tools (e.g., Azure Data Factory, Informatica).
  • Experience working with RDBMS platforms (e.g., SQL Server) and MPP platforms (e.g., Azure Synapse, Snowflake, Redshift).
  • Experience developing and implementing complex, high-volume Data Engineering solutions.
  • Experience ingesting data from a variety of sources and mediums, including relational systems, APIs, webhooks, event streams, data lakes, etc.
  • Experience working with cloud technology stacks (Azure, AWS, Google Cloud).
  • Experience applying Software Development best practices and following SDLC processes.