At IBM, work is more than a job - it's a calling: To build. To design. To code. To consult. To think along with clients and sell. To make markets. To invent. To collaborate. Not just to do something better, but to attempt things you've never thought possible. Are you ready to lead in this new era of technology and solve some of the world's most challenging problems? If so, let's talk.
At Hakkoda, a Data Engineer Consultant works on the design and development of Snowflake Data Cloud solutions. This work includes data ingestion pipelines, data architecture, data governance, and security.
What we are looking for:
We are looking for a savvy Data Engineer with consulting skills to join our growing team of experts. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer will support our database architects, data analysts, and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing customer projects. This role requires you to be self-directed and comfortable supporting the data needs of multiple teams, systems, and products. The right candidate will be excited by the prospect of working for a start-up company to support our customers' next generation of data initiatives.
As of April 2025, Hakkoda has been acquired by IBM and will be integrated into the IBM organization. Your recruitment process will be managed by IBM, and IBM will be the hiring entity.
- Bachelor's degree in engineering, computer science, or an equivalent field.
- 3+ years in related technical roles, with experience in data management, database development, ETL, and/or data prep domains.
- Experience developing data warehouses.
- Experience building ETL/ELT ingestion pipelines.
- Knowledge of how to manipulate, process, and extract value from large, disconnected datasets.
- SQL and Python scripting experience required; Scala and JavaScript are a plus.
- Cloud experience (AWS, Azure, or GCP) is a plus.
- Knowledge of any of the following tools is also a plus: Snowflake, Matillion, Fivetran, or dbt.
- Strong interpersonal skills, including assertiveness and the ability to build strong client relationships.
- Strong project management and organizational skills.
- Ability to support and work with cross-functional and agile teams in a dynamic environment.
- Advanced English required.