In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology. A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat. Curiosity and a constant quest for knowledge serve as the foundation to success in IBM Consulting. In your role, you'll be encouraged to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in groundbreaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and development opportunities in an environment that embraces your unique skills and experience.
- Design, develop, and manage our data infrastructure on AWS, with a focus on data warehousing solutions.
- Write efficient, complex SQL queries for data extraction, transformation, and loading.
- Utilize DBT for data modelling and transformation (see the dbt invocation sketch after this list).
- Use Python for data engineering tasks, drawing on strong hands-on experience in this area.
- Implement scheduling tools such as Airflow, Control-M, or shell scripting to automate data processes and workflows (see the Airflow sketch after this list).
- Participate in an Agile environment, adapting quickly to changing priorities and requirements.
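The scheduling and automation work described above typically takes a shape like the following. This is a minimal sketch using Airflow's TaskFlow API (Airflow 2.x); the DAG id, schedule, and task bodies are illustrative assumptions, not anything specified in this posting.

```python
# Minimal Airflow 2.x TaskFlow sketch of a daily extract/transform/load
# pipeline. All names and task bodies are illustrative placeholders.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def example_warehouse_load():
    @task
    def extract() -> list[dict]:
        # Placeholder: pull rows from a source system.
        return [{"id": 1, "amount": 42.0}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Placeholder: apply business rules before loading.
        return [r for r in rows if r["amount"] > 0]

    @task
    def load(rows: list[dict]) -> None:
        # Placeholder: write to the warehouse (e.g. Redshift).
        print(f"Loading {len(rows)} rows")

    load(transform(extract()))


# Calling the decorated function at module level registers the DAG.
example_warehouse_load()
```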
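For the DBT responsibility, the models themselves are SQL files, but dbt-core (1.5 and later) can also be driven from Python, which is handy when orchestrating runs alongside other data engineering code. A minimal sketch, assuming a dbt project in the working directory and a hypothetical model named stg_orders:

```python
# Minimal sketch of invoking dbt programmatically (dbt-core >= 1.5).
# The model name "stg_orders" is a hypothetical example.
from dbt.cli.main import dbtRunner, dbtRunnerResult

runner = dbtRunner()
result: dbtRunnerResult = runner.invoke(["run", "--select", "stg_orders"])

if not result.success:
    raise SystemExit("dbt run failed")
```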
- Mandatory skills: hands-on experience with traditional data warehousing on any major database (Oracle, DB2, or SQL Server); Redshift experience is optional.
- Strong SQL skills and the ability to write complex queries using analytical functions.
- Prior working experience on the AWS platform.
- Python programming experience for data engineering.
- Experience in PySpark/Spark (see the PySpark sketch after this list).
- Working knowledge of the data pipeline tool Airflow.

The below skills are nice to have:

- Experience with DBT for data modelling and transformation.
- Exposure to working in an Agile environment.
- Proven ability to troubleshoot and resolve production issues under a DevOps model.
- A track record of continuously identifying opportunities to improve the performance and quality of your ecosystem, and experience monitoring performance.
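To illustrate the combination of strong SQL (analytical functions) and PySpark called for above, here is a minimal sketch that runs a window-function query through Spark SQL. The orders data, table, and column names are illustrative assumptions.

```python
# Minimal PySpark sketch: an analytical (window) SQL function run via
# Spark SQL. All data and names below are illustrative placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("window-demo").getOrCreate()

orders = spark.createDataFrame(
    [("alice", "2024-01-01", 10.0),
     ("alice", "2024-01-02", 20.0),
     ("bob", "2024-01-01", 5.0)],
    ["customer", "order_date", "amount"],
)
orders.createOrReplaceTempView("orders")

# Running total per customer via a window function.
result = spark.sql("""
    SELECT customer,
           order_date,
           SUM(amount) OVER (
               PARTITION BY customer
               ORDER BY order_date
           ) AS running_total
    FROM orders
    ORDER BY customer, order_date
""")
result.show()

spark.stop()
```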