At IBM, work is more than a job - it's a calling: To build. To design. To code. To consult. To think along with clients and sell. To make markets. To invent. To collaborate. Not just to do something better, but to attempt things you've never thought possible. Are you ready to lead in this new era of technology and solve some of the world's most challenging problems? If so, let's talk.
We are looking for motivated people who are experienced with building data warehouses and analytics systems in the cloud (AWS, Azure, GCP, Snowflake).
What we are looking for:
We are looking for a Sr. Consultant, Data Engineer to join our growing team of experts. This position will work on the design and development of Snowflake Data Cloud solutions. The work includes data ingestion pipelines, data architecture, data governance and security. The ideal candidate is an experienced builder of data pipelines and migrations who enjoys optimizing data systems and building them from the ground up.
The Sr. Consultant, Data Engineer will develop database architectures and data warehouses, and will ensure optimal data delivery architecture is consistent throughout ongoing customer projects. In this role you will lead technical teams. The right candidate will be excited by the prospect of working for a start-up company to support our customers' next generation of data initiatives.
As of April 2025, Hakkoda has been acquired by IBM and will be integrated into the IBM organization. Your recruitment process will be managed by IBM. IBM will be the hiring entity.
- Bachelor’s degree in engineering, computer science or equivalent area.
- Expertise in evaluating, selecting, and integrating ingestion technologies to solve complex data challenges.
- Leadership in architectural decisions for high-throughput data ingestion frameworks, including real-time data processing and analytics.
- Mentorship of junior engineers in best practices for data ingestion, performance tuning, and troubleshooting.
- 5+ years in related technical roles: data management, database development, ETL, data warehouses and pipelines.
- Experience designing and developing data warehouses (Teradata, Oracle Exadata, Netezza, SQL Server, Spark)
- Experience building ETL / ELT ingestion pipelines with tools like DataStage, Informatica, Matillion
- SQL scripting
- Cloud experience on AWS (Azure, GCP are nice to have as well)
- Python scripting and Scala are required.
- Ability to prepare reports and present to internal and customer stakeholders
- Track record of sound problem-solving skills and an action-oriented mindset
- Strong interpersonal skills, including assertiveness and the ability to build strong client relationships
- Ability to work in Agile teams
- Experience hiring, developing and managing a technical team
- Advanced English proficiency