In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.
- Design, develop, and manage our data infrastructure on AWS, with a focus on data warehousing solutions.
- Write efficient, complex SQL queries for data extraction, transformation, and loading.
- Utilize DBT for data modelling and transformation.
- Apply strong, hands-on Python experience to data engineering tasks.
- Implement scheduling tools like Airflow, Control M, or shell scripting to automate data processes and workflows.
- Participate in an Agile environment, adapting quickly to changing priorities and requirements.
- Proven expertise in AWS technologies, with a strong understanding of AWS services; experience with Redshift is optional.
- Experience in data warehousing, with a solid grasp of SQL, including the ability to write complex queries.
- Proficiency in Python, with solid hands-on experience in data engineering tasks.
- Familiarity with scheduling tools like Airflow, Control M, or shell scripting.
- Excellent communication skills and a willingness to learn.
- Knowledge of DBT for data modelling and transformation is a plus.
- Experience with PySpark or Spark is highly desirable.
- Familiarity with DevOps, CI/CD, and Airflow is beneficial.
- Experience in Agile environments is a nice-to-have.