At IBM, work is more than a job - it's a calling: To build. To design. To code. To consult. To think along with clients and sell. To make markets. To invent. To collaborate. Not just to do something better, but to attempt things you've never thought possible. Are you ready to lead in this new era of technology and solve some of the world's most challenging problems? If so, let's talk.
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.
What We Are Looking For:
We are in search of a skilled Consultant Data Engineer to join our expanding team of experts. This role will be pivotal in the design and development of Snowflake Data Cloud solutions, encompassing responsibilities such as constructing data ingestion pipelines, establishing sound data architecture, and implementing stringent data governance and security protocols.
The ideal candidate brings experience as a proficient data pipeline builder and adept data wrangler, deriving satisfaction from optimizing data systems from their foundational stages. Collaborating closely with database architects, data analysts, and data scientists, the Data Engineer will play a crucial role in ensuring a consistent and optimal data delivery architecture across ongoing customer projects.
This position demands a self-directed individual comfortable navigating the diverse data needs of multiple teams, systems, and products. If you are enthusiastic about the prospect of contributing to a startup environment and supporting our customers in their next generation of data initiatives, we invite you to explore this opportunity.
As of April 2025, Hakkoda has been acquired by IBM and will be integrated into the IBM organization. Your recruitment process will be managed by IBM, and IBM will be the hiring entity.
- Bachelor’s degree in engineering, computer science, or an equivalent field.
- 3+ years in related technical roles, with experience in data management, database development, ETL, and/or data preparation.
- Experience developing data warehouses.
- Experience building ETL/ELT ingestion pipelines.
- Proficiency in using cloud platform services for data engineering tasks, including managed database services (e.g., Snowflake and its trade-offs versus Redshift, BigQuery, etc.) and data processing services (AWS Glue, Azure Data Factory, Google Dataflow).
- Skills in designing and implementing scalable and cost-effective solutions using cloud services, with an understanding of best practices for security and compliance.
- Knowledge of how to manipulate, process, and extract value from large, disconnected datasets.
- SQL and Python scripting experience required; Scala and JavaScript are a plus.
- Cloud experience (AWS, Azure, or GCP) is a plus.
- Knowledge of any of the following tools is also a plus: Snowflake, Matillion, Fivetran, or dbt.
- Strong interpersonal skills, including assertiveness and the ability to build strong client relationships.
- Strong project management and organizational skills.
- Ability to support and work with cross-functional and agile teams in a dynamic environment.
- Advanced English required.