DevOps Engineering Lead
Open
Global Consulting Company

7 - 9 years

Salary: Not Disclosed

Chennai (India)

Job ID: 20250109R0001

Job Description:

Job Summary
• We are seeking a skilled and motivated DevOps Engineer with a strong background in AWS big-data solutions and expertise in implementing CI/CD pipelines using Harness.io.
• As a key member of our dynamic team, you will play a crucial role in designing, developing, and maintaining robust, scalable, and secure data pipelines and infrastructure for our big-data applications.
Accountabilities
• AWS Big-Data Solutions: Collaborate with cross-functional teams to design, deploy, and manage AWS-based big-data solutions, including data storage, processing, and analytics services.
• Leverage AWS services such as Amazon S3, Amazon EMR, Amazon Redshift, and AWS Glue to architect efficient and scalable data workflows.
• Harness.io Implementation: Lead the adoption and utilization of Harness.io for continuous integration and continuous deployment (CI/CD) pipelines.
• Design, configure, and automate CI/CD workflows to streamline the development, testing, and deployment processes of big-data applications.
• Security Validation: Integrate robust security practices into the CI/CD pipelines and build/release processes. Implement security checks, vulnerability scanning, and compliance validation to ensure data privacy, integrity, and protection at every stage of the pipeline.
• Infrastructure as Code (IaC): Champion the IaC approach for managing infrastructure resources. Use tools like AWS CloudFormation or Terraform to provision and manage AWS resources, ensuring consistency and reproducibility.
• Automation and Orchestration: Drive automation initiatives to increase operational efficiency. Automate repetitive tasks, infrastructure provisioning, and configuration management using scripting languages and tools like Ansible.
• Collaboration and Knowledge Sharing: Foster a culture of collaboration, knowledge sharing, and continuous improvement within the DevOps and broader engineering teams.
• Mentor junior team members and participate in peer code reviews.
• Best Practices and Innovation: Stay up-to-date with the latest trends, tools, and technologies in the AWS and big-data domain. Introduce innovative solutions and best practices to enhance the performance, reliability, and security of our data infrastructure.

Must Have:

• Basic Qualifications: Bachelor's degree in Computer Science, Engineering, or related field (or equivalent experience).
• Proven experience in designing and deploying AWS big-data solutions, leveraging services like S3 and Glue.
• Hands-on expertise in implementing CI/CD pipelines using Harness.io or similar tools for big-data applications.
• Strong knowledge of security principles and experience integrating security checks into CI/CD pipelines (e.g., SonarQube, Checkmarx).
• Proficiency in infrastructure automation using tools like AWS CloudFormation, Terraform, or similar.
• Solid scripting skills in languages such as Python, Bash, or PowerShell. Experience with IaC, configuration management tools (e.g., Ansible), and version control systems (e.g., Bitbucket).
• Strong problem-solving skills, ability to troubleshoot complex issues, and an eye for detail. Excellent communication and teamwork abilities, with a focus on fostering a collaborative and inclusive work environment.

Position(s) Open: 1

Technical Skills: Terraform; Harness/Jenkins; Repository Management; DevSecOps (Basics) - SonarQube, Checkmarx, Wiz; AWS Infrastructure Architecture / Networking Architecture; Data Services - SNS, SQS, Kinesis; Basics - Shell Scripting / Python Scripting

Soft Skills: Communication, Problem-solving, Teamwork, Analytical Thinking, Critical Thinking, Decision Making

Relevant Experience: 5 Yrs

Notice Period Expected (Maximum): Immediate Joiner

Domain Knowledge: Any

Education: Undergraduate

Course: Any

Specialization: Any

Work Mode: Hybrid

Engagement Type: Full Time

Job Type: Employment, Contract

Published On: 09/01/2025