At IBM, work is more than a job - it's a calling: To build. To design. To code. To consult. To think along with clients and sell. To make markets. To invent. To collaborate. Not just to do something better, but to attempt things you've never thought possible. Are you ready to lead in this new era of technology and solve some of the world's most challenging problems? If so, let's talk.
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.
As an AWS Managed Services Architect, you will play a pivotal role in architecting and optimizing the infrastructure and operations of a complex Data Lake environment for Hakkoda’s clients. You’ll leverage your strong expertise with AWS services to design, implement, and maintain scalable and secure data solutions while driving best practices.
You will work collaboratively with delivery teams across the U.S., Costa Rica, Portugal, and other regions, ensuring a robust and seamless Data Lake architecture. In addition, you’ll proactively engage with clients to support their evolving needs, oversee critical AWS infrastructure, and guide teams toward innovative and efficient solutions.
This role demands a hands-on approach, including designing solutions, troubleshooting, optimizing performance, and maintaining operational excellence.
Responsibilities
- AWS Data Lake Architecture: Design, build, and support scalable, high-performance architectures for complex AWS Data Lake solutions.
- Service Expertise: Deploy and manage solutions using AWS services, including but not limited to:
  - EMR (Elastic MapReduce): Optimize and maintain EMR clusters for big data processing.
  - AWS Batch: Design workflows to execute batch processing workloads effectively.
  - SageMaker: Support data science teams with scalable model training and deployment.
  - Glue: Implement Glue jobs for ETL/ELT processes to ensure efficient data ingestion and transformation.
  - Lambda: Develop serverless solutions to automate processes and manage events.
  - IAM Policies: Define and enforce security policies to control resource access and maintain governance.
  - Amazon Timestream: Design solutions to handle time-series data, such as IoT telemetry, at scale.
  - DynamoDB: Build and optimize scalable NoSQL database solutions.
- Data Governance & Security: Enforce compliance, governance, and security best practices, ensuring data protection and privacy throughout the architecture.
- Performance Optimization: Monitor and fine-tune performance across AWS resources to ensure cost-effective and efficient operations.
- Automation: Develop Infrastructure as Code (IaC) solutions using tools such as AWS CloudFormation or Terraform.
- Client Collaboration: Work closely with clients to understand their business goals and ensure the architecture aligns with their needs.
- Team Leadership: Act as a technical mentor for delivery teams, supporting troubleshooting, design reviews, and strategy discussions.
- Innovation: Stay current on AWS advancements, best practices, and emerging tools, and incorporate them into solutions.
- Documentation: Develop and maintain architecture diagrams, SOPs, and knowledge-sharing materials for internal and client-facing purposes.
As of April 2025, Hakkoda has been acquired by IBM and will be integrated into the IBM organization. Your recruitment process will be managed by IBM; IBM will be the hiring entity.
Required Technical and Professional Expertise
● Experience:
- 7+ years of hands-on experience in cloud architecture and infrastructure (preferably AWS).
- 3+ years of experience architecting and managing Data Lake or big data solutions on AWS.
● Technical Skills:
- Expertise in AWS services including EMR, Batch, SageMaker, Glue, Lambda, IAM, Amazon Timestream, and DynamoDB.
- Strong programming skills in Python for scripting and automation.
- Proficiency in SQL and in performance tuning for data pipelines and queries.
- Experience with IaC tools such as Terraform or CloudFormation.
- Knowledge of big data frameworks such as Apache Spark or Hadoop.
● Data Governance & Security:
- Proven ability to design and implement secure solutions, with strong knowledge of IAM policies and compliance standards.
● Problem-Solving:
- An analytical, problem-solving mindset for resolving complex technical challenges.
● Collaboration:
- Exceptional communication skills to engage with technical and non-technical stakeholders.
- Ability to lead cross-functional teams and provide mentorship.
● Education:
- Bachelor’s Degree (BA/BS) in Computer Science, Information Systems, or a related field.
Preferred Technical and Professional Expertise
- AWS certifications such as Solutions Architect Professional or Big Data Specialty.
- Experience with Snowflake, Matillion, or Fivetran in hybrid cloud environments.
- Familiarity with Azure or GCP cloud platforms.
- Understanding of machine learning pipelines and workflows.