Nikhil - Sr. DevOps Engineer / Sr. SRE
[email protected]
Location: Atlanta, Georgia, USA
Relocation: Yes
Visa: H1B
PROFILE

A dedicated DevOps and Cloud Engineer with over 10 years of IT experience, specializing in Azure DevOps and multi-cloud environments. Skilled in automating complex processes, managing CI/CD pipelines, and implementing Infrastructure as Code (IaC) across Azure, AWS, and GCP. Proficient in Python, Shell, Groovy, and PowerShell, with expertise in cloud-native technologies, containerization (Docker, Kubernetes), and code quality tools. Experienced in building and optimizing data pipelines with tools such as Apache Spark, Kafka, and AWS services including S3, Lambda, Fargate, and Athena.

COMPETENCIES

- Strong knowledge of and experience with Amazon Web Services (EC2, S3, VPC, Route 53, EBS, ELB, IAM, AMI, RDS, Security Groups, CloudWatch, Auto Scaling, data ingestion) and Google Cloud Platform services such as GKE, MIG, GCR, VPC, GCE, Pub/Sub, IAM, KMS, Cloud Run, Cloud Functions, and Cloud Monitoring.
- Expert in implementing Azure cloud services, including ARM templates, Azure Virtual Networks, AKS, Virtual Machines, Cloud Services, Resource Groups, ExpressRoute, Traffic Manager, VPN, Load Balancing, Application Gateways, and autoscaling.
- Experience deploying, managing, and scaling containerized applications using Azure Kubernetes Service (AKS) and Elastic Kubernetes Service (EKS).
- Experience with container orchestration tools such as Kubernetes and Docker Swarm for containerizing applications and services.
- Deep and broad understanding of cloud and hybrid-cloud platforms (IDaaS/IaaS/SaaS/PaaS) and their associated security tools and processes.
- Conducted comprehensive security assessments of cloud service offerings (AWS, Azure, GCP) to identify potential threats, risks, and vulnerabilities.
- Implemented Conditional Access policies within Entra ID to enforce granular access controls based on various conditions.
- Configured and managed Entra ID self-service password reset.
- Implemented Entra ID Privileged Identity Management (PIM) to manage and control access to privileged roles.
- Evaluated cloud provider security controls and features (IAM, network security, data encryption, logging/monitoring) to ensure alignment with organizational security policies.
- Contributed to the development of cloud security policies, standards, and procedures.
- Experience with cloud security tools and technologies such as AWS Security Hub, Azure Security Center, and Google Cloud Security Command Center.
- Secured sensitive information with automated secrets retrieval and secret management in Azure Key Vault.
- Hands-on experience installing and administering CI tools such as Jenkins, Bamboo, and TeamCity, along with SonarQube, Nexus, GitHub, JIRA, and the Atlassian stack (FishEye, Stash).
- Proficient in setting up and maintaining JFrog Artifactory repositories for storing build artifacts, Docker images, and binaries, ensuring efficient artifact lifecycle management.
- Experience writing templates for infrastructure automation as code using CloudFormation and Terraform on AWS and Azure.
- Implemented compliance and governance standards, ensuring adherence to regulatory requirements and organizational policies.
- Developed serverless applications to handle events using Cloud Functions.
- Managed and analyzed large datasets with BigQuery and data analytics services.
- Managed infrastructure deployments using Terraform, ensuring consistent and reliable environment setups.
- Strong understanding of IAM, firewalls, encryption, and cloud security best practices.
- Extensive experience with scripting languages such as Python, Perl, and Bash.
- Experience implementing and administering monitoring tools: Splunk, Kibana, and Nagios.
- Experienced in administering Hitachi, NetApp, and HP storage arrays.

CERTIFICATIONS

- AWS Certified DevOps Engineer - Professional
- Google Cloud Certified Professional Cloud Architect
- Microsoft Certified: DevOps Engineer Expert (Azure)

PROFESSIONAL EXPERIENCE

Sr. DevOps Engineer | Hiscox, Atlanta, GA | May 2023 - Present

- Led and managed the migration of CI/CD pipelines from Atlassian Bamboo to Azure DevOps for multiple applications.
- Managed cloud infrastructure as code (IaC) with Terraform.
- Automated infrastructure deployments by integrating ARM templates with Azure DevOps pipelines and implemented role-based access control (RBAC) policies.
- Created and managed Kubernetes clusters on GCP, Azure, and on-premises environments, ensuring high availability, scalability, and optimal resource utilization.
- Assessed and implemented Azure security features including Azure Active Directory, Network Security Groups, Azure Key Vault, and Azure Security Center.
- Designed and optimized resilient ETL pipelines with Apache Spark on AWS EMR, enabling seamless ingestion, transformation, and processing of structured and unstructured data at scale.
- Managed access to Azure resources using role-based access control (RBAC) within Entra ID, applying least-privilege principles.
- Automated IAM tasks using PowerShell, streamlining user management within Entra ID and reducing administrative overhead.
- Used Kubernetes pod lifecycle management to handle pod creation, scaling, and termination, ensuring efficient resource utilization and minimizing downtime.
- Used the Horizontal Pod Autoscaler (HPA) to dynamically scale pods based on CPU utilization, optimizing resource usage and cost while maintaining high application performance.
- Developed high-performance, distributed data processing applications in Scala, optimizing transformations and enrichment for petabyte-scale datasets.
- Integrated autoscaling with monitoring tools such as Prometheus and Grafana, providing real-time insights into cluster performance and enabling data-driven decision-making.
- Automated provisioning and configuration of Windows and Linux nodes using Puppet, ensuring consistent system configurations.
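The CPU-based HPA scaling described above follows Kubernetes' proportional scaling rule. A minimal sketch of that calculation (the replica bounds and utilization figures below are hypothetical values for illustration, not taken from this role):

```python
import math

def desired_replicas(current: int, observed_cpu: float, target_cpu: float,
                     min_replicas: int = 2, max_replicas: int = 10) -> int:
    """Kubernetes HPA rule: scale replica count in proportion to the ratio
    of observed CPU utilization to the target, clamped to configured bounds."""
    desired = math.ceil(current * observed_cpu / target_cpu)
    return max(min_replicas, min(max_replicas, desired))

# Pods running hot at 90% CPU against a 60% target scale from 4 to 6 replicas.
print(desired_replicas(4, 90.0, 60.0))  # -> 6
```

In a cluster this rule is evaluated by the controller manager against metrics-server data; the sketch only shows the arithmetic the autoscaler applies.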
- Integrated NoSQL (MongoDB, DynamoDB) and SQL (PostgreSQL, Athena) databases into streaming architectures, optimizing query execution and data storage for maximum performance.
- Collaborated with cross-functional teams to refine and optimize CI/CD workflows, aligning DevOps practices with organizational goals.
- Developed and deployed serverless data workflows using AWS Lambda, API Gateway, and Fargate.
- Monitored pipeline health, performance, and SLAs using Datadog, AWS CloudWatch, and Prometheus.

TECHNICAL SKILLS

Skill Area | Technical Skills
Cloud Services | Azure, AWS, GCP
Applications | OneShield, webMethods
Scripting/Programming Languages | Python, Scala, Bash, PowerShell, Go
Continuous Integration Tools | Bamboo, GitHub Actions, Azure DevOps (ADO)
Virtualization Technologies | Docker, Kubernetes, ActiveMQ
Configuration Management | Terraform, Puppet
Monitoring Tools | Prometheus, Grafana, Splunk, Dynatrace
Source Code Management Tools | Git, GitHub, Bitbucket, GitLab

Data Engineer (DevOps) | Daimler Trucks, Portland, OR | June 2018 - March 2023

- Set up Google Cloud infrastructure using Terraform.
- Hands-on experience with configuration management tools such as Chef and Puppet.
- Developed and maintained CI/CD pipelines using Git, Jenkins, and GitLab CI, with infrastructure as code in Terraform to automate build, test, and deployment processes.
- Automated IAM user, role, and group management using Terraform, the AWS CLI, the gcloud CLI, and Python scripts.
- Spearheaded the adoption of Terraform for IaC, developing and maintaining infrastructure across multiple environments.
- Designed and implemented modular Terraform configurations to ensure reusable and maintainable infrastructure code.
- Implemented custom IAM roles for GCP services such as Cloud Functions, Cloud Run, GKE, and BigQuery, minimizing over-permissioning.
- Set up GCP firewall rules to allow or deny traffic to and from VM instances based on specified configurations.
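The scripted IAM management mentioned above can be sketched with boto3. The user name, bucket, and policy here are hypothetical illustrations of the least-privilege pattern, not details from the actual role; only the guarded main block requires AWS credentials:

```python
import json

def read_only_s3_policy(bucket: str) -> dict:
    """Build a least-privilege, read-only S3 policy document for one bucket."""
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [f"arn:aws:s3:::{bucket}", f"arn:aws:s3:::{bucket}/*"],
        }],
    }

def provision_reader(iam, user: str, bucket: str) -> None:
    """Create an IAM user and attach the inline read-only policy."""
    iam.create_user(UserName=user)
    iam.put_user_policy(
        UserName=user,
        PolicyName=f"{bucket}-read-only",
        PolicyDocument=json.dumps(read_only_s3_policy(bucket)),
    )

if __name__ == "__main__":
    import boto3  # third-party; only needed when actually provisioning
    provision_reader(boto3.client("iam"), "pipeline-reader", "example-data-bucket")
```

Keeping the policy document as a pure builder function makes the grants reviewable and testable independently of any live AWS account.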
- Designed and maintained high-performance, scalable batch and real-time data processing pipelines using Scala, Apache Spark, and Kafka, ensuring seamless operation and reliability.
- Engineered and optimized robust ETL pipelines for structured (SQL-based) and unstructured data (JSON, Parquet, ORC), driving significant improvements in data transformation efficiency.
- Leveraged Kafka to implement a real-time, event-driven architecture, enabling seamless, high-throughput data ingestion and stream processing at scale.
- Managed and optimized MongoDB and PostgreSQL clusters, refining query performance, indexing strategies, and overall system efficiency.
- Architected AWS-native data solutions using S3, Athena, Fargate, and Lambda, streamlining and automating data processing workflows.
- Automated end-to-end deployment of data pipelines with CI/CD tools such as GitLab CI and Jenkins, enabling faster and more reliable updates.
- Monitored and swiftly resolved pipeline failures, significantly improving system reliability, reducing downtime, and boosting operational efficiency.
- Implemented audit logging and monitoring with AWS CloudTrail, GCP Cloud Audit Logs, and SIEM integrations (Splunk, Datadog).
- Worked with PD teams on the lift-and-shift of applications from on-premises infrastructure to GCP.
- Expert in various Azure services, including Compute (Web Roles, Worker Roles), Caching, Azure SQL, NoSQL, Storage, and Network services, Azure Active Directory (AD), API Management, Scheduling, Azure Autoscaling, and PowerShell automation.
- Performed high-level operational maintenance, upgrades, and support for Kafka.
- Handled all environment builds; designed and implemented cluster setup and capacity planning for Kafka and Apache Spark.

TECHNICAL SKILLS

Skill Area | Technical Skills
Cloud Services | Azure, AWS, GCP
Scripting Languages | Shell, Bash, PowerShell, Java, Groovy, Python
Continuous Integration Tools | Jenkins, Bamboo, GitHub Actions, SonarQube, Azure DevOps (ADO)
Virtualization Technologies | Docker, Kubernetes, VMware, VirtualBox, ActiveMQ
Configuration Management | Ansible, Chef, Terraform, Puppet
Monitoring Tools | Splunk, CloudWatch, ELK Stack, Grafana
Source Code Management Tools | Git, GitHub, Subversion (SVN), Bitbucket

Storage Migration Engineer | Datalink, Atlanta, GA | July 2015 - April 2018

- Performed storage provisioning on NetApp 7-Mode and clustered Data ONTAP (CDOT).
- Managed CIFS and NFS shares on eNAS and NetApp CDOT.
- Created CIFS shares on CDOT and mounted them to various servers.
- Performed host-level migrations for different operating systems.
- Configured and managed Hitachi Universal Replicator (HUR) for DR replication; worked extensively on creating replication pairs from a VSP replicating to a USP.
- Migrated data from 7-Mode to CDOT using various techniques and tools.
- Worked with HORCM for local replication and HUR for remote replication.
- Configured an FCIP tunnel between two Brocade DCX switches from the primary data center to the DR site.
- Mounted NFS shares using CLI commands and created junction paths for shares.
- Performed storage replication using Storage Navigator 2 and Replication Manager.
- Performed storage provisioning on AMS and HUS 150 arrays using Storage Navigator Modular 2.
- Installed and configured the 7MTT tool on various jump servers for NetApp storage migrations.
- Administered and monitored EMC VMAX and VNX storage arrays for SAN.

EDUCATION

Master's Degree: MS in Technology Management, Southeast Missouri State University, Cape Girardeau, MO (2013-2015)