A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe.
You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat.
Curiosity and a constant quest for knowledge serve as the foundation of success in IBM Consulting. In your role, you'll be encouraged to challenge the norm, investigate ideas outside of your role, and come up with creative solutions that result in groundbreaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and development opportunities in an environment that embraces your unique skills and experience.
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.
We are seeking a skilled Snowflake & ETL/ELT Developer to design, develop, and optimize data solutions for electric utility data initiatives. In this role, you will build and maintain scalable, secure, and high-performance data pipelines and data models leveraging Snowflake's cloud data platform. You will support the integration and analysis of complex data sets from advanced metering infrastructure (AMI), distributed energy resources (DER), grid sensors, and other operational systems to enable data-driven decision-making across the electric utility landscape.
- Design and develop scalable data pipelines and data models using Snowflake for electric utility data use cases.
- Develop, optimize, and maintain ETL/ELT workflows to ingest and process large volumes of structured and semi-structured data.
- Implement and maintain data security, privacy, and governance standards within Snowflake environments.
- Collaborate with data architects, data engineers, and business analysts to design solutions that meet business requirements and performance standards.
- Write complex SQL queries for data extraction, transformation, and loading (ETL/ELT), as well as for analytics and reporting purposes (see the illustrative sketch after this list).
- Monitor and troubleshoot performance and data quality issues to ensure reliability and accuracy of data assets.
- Automate and orchestrate data workflows using tools such as dbt, Airflow, or native Snowflake capabilities.
- Support advanced analytics and data science teams with optimized and well-structured datasets.
- Stay up-to-date on Snowflake features and best practices to continuously improve platform capabilities.
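As a rough illustration of the kind of ELT work described above, the sketch below loads staged AMI readings into Snowflake and merges them into a reporting table. It assumes the snowflake-connector-python package and hypothetical object names (RAW_AMI_STAGE, a landing table STG_AMI_READINGS with a single VARIANT column named PAYLOAD, and DIM_METER_READINGS); it is an example of the pattern, not a prescribed implementation.

```python
# Minimal sketch of an ELT step for AMI meter readings.
# Assumes snowflake-connector-python and the hypothetical objects named above.
import snowflake.connector


def load_ami_readings(conn_params: dict) -> None:
    """Copy staged AMI files into a landing table, then merge them into a reporting table."""
    conn = snowflake.connector.connect(**conn_params)
    try:
        cur = conn.cursor()
        # Load first (ELT): ingest semi-structured JSON from an external stage into a
        # landing table assumed to have a single VARIANT column named PAYLOAD.
        cur.execute("""
            COPY INTO STG_AMI_READINGS
            FROM @RAW_AMI_STAGE
            FILE_FORMAT = (TYPE = 'JSON')
            ON_ERROR = 'CONTINUE'
        """)
        # Transform inside Snowflake: upsert one reading per meter and interval.
        cur.execute("""
            MERGE INTO DIM_METER_READINGS AS tgt
            USING (
                SELECT payload:meter_id::STRING   AS meter_id,
                       payload:read_ts::TIMESTAMP AS read_ts,
                       payload:kwh::FLOAT         AS kwh
                FROM STG_AMI_READINGS
            ) AS src
            ON tgt.meter_id = src.meter_id AND tgt.read_ts = src.read_ts
            WHEN MATCHED THEN UPDATE SET tgt.kwh = src.kwh
            WHEN NOT MATCHED THEN INSERT (meter_id, read_ts, kwh)
                VALUES (src.meter_id, src.read_ts, src.kwh)
        """)
    finally:
        conn.close()


if __name__ == "__main__":
    # Placeholder connection details; in practice these would come from a secrets manager.
    load_ami_readings({
        "account": "<account>", "user": "<user>", "password": "<password>",
        "warehouse": "<warehouse>", "database": "<database>", "schema": "<schema>",
    })
```

In production, a step like this would typically be scheduled and orchestrated with dbt, Airflow, or native Snowflake tasks and streams rather than run ad hoc, in line with the orchestration responsibilities above.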
Snowflake and ETL (multiple technologies: Informatica, Databricks, ADF, etc.), Azure, Python