You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat.
Curiosity and a constant quest for knowledge serve as the foundation for success in IBM Consulting. You will be encouraged to challenge the norm, investigate ideas outside of your role, and come up with creative solutions that result in groundbreaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and development opportunities in an environment that embraces your unique skills and experience.
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.
A Data Engineer specializing in Data Platforms is responsible for building applications using Big Data technologies, with a strong emphasis on API development. The role demands expertise in traditional application development, analytics libraries, open-source NLP tools, and statistical computing.
Key Responsibilities:
- Lead and mentor development teams, ensuring quality and resolving technical challenges.
- Design and implement integration solutions using Confluent Kafka, SOAP, and REST APIs.
- Collaborate across teams for seamless platform integration.
- Develop and test components throughout the software lifecycle.
- Work with both on-premises and cloud environments (e.g., OpenShift).
- Optimize performance through logging and monitoring.
- Actively participate in Agile processes and continuous improvement efforts.
Required Professional and Technical Expertise:
- 2–4 years of experience leading technical teams and reviewing work products.
- 2–5 years of hands-on development experience using Confluent Kafka.
- Strong experience with integration frameworks and API development (REST & SOAP).
- Proficiency in Python, OpenShift, and database technologies.
- Solid understanding of asynchronous messaging, logging, and middleware.
- Experience working in Agile/Scrum environments.
- Ability to support testing teams and troubleshoot integration issues.
- Experience with CI/CD workflows, DevOps practices, and cloud-native development.
- Familiarity with performance tuning and scalable architecture design.
Nice-to-have:
- 5+ years of experience in the energy and utilities industry.