SDLC Engineer Sr Consultant || Visa: No H1B || Location: Dallas, TX (onsite work - must be local to the DFW area)
Email: [email protected]
From: Mohit Joshi, Exarca Inc ([email protected])
Reply to: [email protected]

Role: SDLC Engineer Sr Consultant
Visa: No H1B
Location: Dallas, TX; onsite work - must be local to the DFW area
Duration: 10+ months

Pre-Qualifying Questions:
- Explain your experience designing, maintaining, and supporting complex scripted multistage pipelines.
- Explain your experience with Groovy/Java/Python or other high-level languages generally used to build scripted pipelines (YAML-based dynamic pipeline definitions do not count).
- Explain your experience with Terraform Enterprise.
- Describe your experience/knowledge building Kubernetes clusters and their internals (networking, exposing a service from a container).
- Do you have any OpenShift experience?
- Explain your understanding of AWS core infrastructure services.
- Are you local to Dallas and willing to work onsite?
- Are you willing to consider a permanent offer after 5 months of contract work?

Skills:
- CI/CD pipelines (Jenkins or similar) - especially scripted pipelines using Groovy or another high-level language (not a YAML definition alone); a minimal illustrative sketch follows at the end of this posting.
- Groovy/Java/Python - language skills at an intermediate level or above.
- Kubernetes - experience building clusters, developing/deploying Helm charts, and troubleshooting.
- Terraform Enterprise - development of Terraform code and a solid understanding of how Terraform works.

Requirements:
Scripted CI/CD pipelines:
- Experience designing, maintaining, and supporting complex scripted multistage pipelines.
- Experience running/maintaining a Jenkins cluster with multiple such pipelines in Kubernetes or another container orchestration platform.
- Experience working with artifact repositories such as Artifactory.
Programming skills (intermediate and above):
- Languages: Groovy/Java/Python or other high-level languages generally used to build scripted pipelines (YAML-based dynamic pipeline definitions do not count).
- Experience developing libraries/software/programs for automating tasks, which may have been used in the dynamic pipelines mentioned above.
Kubernetes:
- Experience developing Helm charts / YAML definitions and deploying them into K8s.
- Experience/knowledge building Kubernetes clusters and their internals (networking, exposing a service from a container).
- Good troubleshooting skills.
Terraform Enterprise:
- Experience developing Terraform code.
- Good understanding of how Terraform works.
- Experience developing and maintaining pipelines to deploy AWS resources using Terraform (open source or Enterprise).
- Experience using HashiCorp Vault or similar tools.
AWS:
- Good understanding of AWS core infrastructure services (VPC, EC2, S3, Lambda, etc.).
- Good understanding of how IAM permissions work (expert-level knowledge not required).
- Able to work autonomously and deliver code from a set of requirements with minimal supervision.

Additional:
- Excellent communication skills; works effectively as a member of a team.
- Able to participate effectively in architecture and design discussions that produce requirements.
- Able to interpret requirements and write code in a manner that encourages reuse and maintainability.
- Knowledgeable of underlying technologies in order to build optimized, efficient code.
- Able to function in an agile environment and provide good daily feedback on team stand-up calls.
- Participates in code releases and production deployments, takes an aggressive approach to fixing bugs and defects, and is good at troubleshooting.
- Knowledge of OpenShift is an added advantage.
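For clarity on what "scripted" (as opposed to YAML/declarative) pipelines means in this posting, here is a minimal illustrative sketch of a Jenkins scripted multistage pipeline written in Groovy. The build command, Artifactory URL, chart path, and namespace are hypothetical placeholders, not details from the role.

    // Illustrative sketch only: a Jenkins scripted (Groovy) multistage pipeline.
    // All names below (app.jar, artifactory.example.com, ./chart, staging) are hypothetical.
    node {
        stage('Checkout') {
            checkout scm            // assumes the job is configured with an SCM source
        }
        stage('Build') {
            sh 'mvn -B clean package'   // assumes a Maven build; substitute your build tool
        }
        stage('Publish') {
            // push the built artifact to an artifact repository such as Artifactory
            sh 'curl -u "$ARTIFACTORY_CREDS" -T target/app.jar https://artifactory.example.com/libs-release-local/app.jar'
        }
        stage('Deploy') {
            // deploy a Helm chart into a Kubernetes namespace
            sh 'helm upgrade --install app ./chart --namespace staging'
        }
    }

Because the stages are ordinary Groovy code inside a node block rather than a fixed YAML/declarative structure, they can be generated dynamically (loops, shared libraries, conditionals), which is the kind of scripted pipeline work this role emphasizes.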
Mohit Joshi, Exarca Inc | [email protected] | +1 469 983 0493 Ext: 493