c2c Multiple Requirements at Remote, USA
Email: [email protected]
Listing: http://bit.ly/4ey8w48 | https://jobs.nvoids.com/job_details.jsp?id=2070799&uid=

Requirement ID: 9818885
Posted On: Jan-10-2025
Role name: Analyst
Role Description: Basic Qualifications: 5+ years of experience with object-oriented/object-functional scripting languages (Python, Java, etc.). 3+ years leading development of large-scale cloud-based services on platforms such as AWS, GCP, or Azure, and developing and operating cloud-based distributed systems. Experience building and optimizing data pipelines, architectures, and data sets, and building processes supporting data transformation, data structures, metadata, dependency, and workload management. Strong computer science fundamentals in data structures, algorithm design, problem solving, and complexity. Working knowledge of message queuing, stream processing, and highly scalable big data stores. Software development experience in big data technologies: Databricks, Hadoop, Hive, Spark (PySpark). Familiarity with distributed systems and computing at scale. Advanced working experience with SQL and NoSQL databases is required. Proficiency in data processing using technologies such as Spark Streaming and Spark SQL, and expertise in developing big data pipelines using technologies such as Kafka and Storm (a minimal PySpark/Kafka sketch follows this listing). Experience with large-scale data warehousing, mining, or analytic systems. Ability to work with analysts to gather requirements and translate them into data engineering tasks. Aptitude to independently learn new technologies. Experience automating deployments with continuous integration and continuous delivery systems. Experience with DevOps and automation using Terraform or similar products is preferred.
Competencies: Oracle Business Intelligence (OBIEE), Oracle OBIEE Application Development
Experience (Years): 6-8
Essential Skills: Summary of Key Responsibilities (responsibilities and essential job functions include but are not limited to the following): Leads large-scale, complex, cross-functional projects and builds the technical roadmap for the WFM Data Services platform. Leads and reviews design artifacts. Builds and owns the automation and monitoring frameworks that present reliable, accurate, easy-to-understand metrics and operational KPIs to stakeholders for data pipeline quality. Executes proofs of concept on new technologies and tools to pick the best tools and solutions. Supports business objectives by collaborating with business partners to identify opportunities and drive resolution; communicates status and issues to senior Starbucks leadership and stakeholders; directs the project team and cross-functional teams on all technical aspects of the projects. Leads the engineering team to build and support real-time, highly available data, data pipelines, and technology capabilities. Translates strategic requirements into business requirements to ensure solutions meet business needs. Defines and implements data retention policies and procedures. Defines and implements data governance policies and procedures. Identifies, designs, and implements internal process improvements: automating manual processes, optimizing data delivery, and re-designing infrastructure for greater scalability. Enables the team to pursue insights and applied breakthroughs while driving solutions to Starbucks scale. Builds the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of structured and unstructured data sources using big data technologies. Builds analytics tools that use the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics. Works with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs. Performs root cause analysis to identify permanent resolutions to software or business process issues.
Desirable Skills: Same Basic Qualifications as listed under Role Description.
Country: United States
Branch / City / Location: TCS - Seattle, WA / SEATTLE / Seattle, WA
BA Recruiter Name: CHAITANYA TANGIRALA
Start Date: Jan-31-2025
Duration (Months): 5
Status: Open
Keywords: Oracle BI, Python, Java, AWS, GCP or Azure, DevOps, REST / SOAP services, Data Analytics, Tableau, Data Science
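The qualifications above repeatedly name Spark Streaming, Spark SQL, and Kafka as the streaming stack. As a minimal, hypothetical PySpark sketch of that combination (the broker address, topic name, and event schema are invented for illustration and are not taken from the posting; running it also requires the spark-sql-kafka connector package):

from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

# Requires the spark-sql-kafka-0-10 connector on the classpath (e.g. via --packages).
spark = SparkSession.builder.appName("kafka-orders-stream").getOrCreate()

# Hypothetical event schema; the posting does not define actual topics or fields.
schema = StructType([
    StructField("store_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_time", TimestampType()),
])

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
       .option("subscribe", "orders")                      # placeholder topic
       .load())

# Parse the Kafka value payload as JSON into typed columns.
events = (raw
          .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
          .select("e.*"))

# Spark SQL-style aggregation: per-store totals over 5-minute windows with a watermark.
totals = (events
          .withWatermark("event_time", "10 minutes")
          .groupBy(F.window("event_time", "5 minutes"), "store_id")
          .agg(F.sum("amount").alias("total_amount")))

query = (totals.writeStream
         .outputMode("update")
         .format("console")
         .start())
query.awaitTermination()

In practice the console sink would be swapped for a Delta/Parquet or Kafka sink; the structure (read, parse, watermark, windowed aggregate, write) stays the same.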
Requirement ID: 9834183
Posted On: Jan-10-2025
Role name: Developer
Role Description: DevOps Engineer with lead experience working on AWS. Networking: AWS Network Load Balancer, Route 53, API Gateway. Container orchestration/management: 70+ services running in ECS. Security: AWS Certificate Manager, CloudFront. Databases/storage: MySQL, Postgres, Redis, S3. AWS Lambdas. Infrastructure as code with Terraform. Deployment pipeline with Jenkins.
Competencies: Digital : Cloud DevOps
Experience (Years): 6-8
Essential Skills: Same as Role Description.
Desirable Skills: Same as Role Description.
Country: United States
Branch / City / Location: TCS - Los Angeles / LOS ANGELES / Torrance, CA
BA Recruiter Name: Balaji B
Start Date: Jan-29-2025
Duration (Months): 12
Status: Open
Keywords: AWS, DevOps

Requirement ID: 9830887
Posted On: Jan-10-2025
Role name: Lead
Role Description: Analysis and transformation using AWS Glue / Athena. Development using Lambda functions (a minimal Lambda-plus-Athena sketch follows this listing). Full understanding of AWS components such as S3 buckets, EventBridge, SNS, etc. Understanding of infrastructure components such as Direct Connect and Transit Gateway. Any hands-on experience migrating databases and/or Informatica from on-prem to cloud would be beneficial.
Competencies: Digital : Amazon Web Service (AWS) Cloud Computing, Digital : Informatica BDM
Experience (Years): 8-10
Essential Skills: Same as Role Description.
Desirable Skills: Same as Role Description.
Country: United States
Branch / City / Location: TCS - Dallas, TX / Plano / Plano, TX
BA Recruiter Name: Balaji B
Start Date: Mar-31-2025
Duration (Months): 9
Status: Open
Keywords: AWS Glue
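For the AWS Glue/Athena/Lambda requirement just above, a minimal sketch of the Lambda-plus-Athena pattern it describes, assuming an EventBridge (or S3 event) trigger; the database, S3 output location, SNS topic ARN, and query are placeholders, not details from the posting:

import json
import boto3

athena = boto3.client("athena")
sns = boto3.client("sns")

# Placeholder names for illustration only; the posting does not specify them.
DATABASE = "analytics_db"
OUTPUT_LOCATION = "s3://example-athena-results/"
TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:pipeline-notifications"

def handler(event, context):
    """Start an Athena query when an EventBridge rule (or S3 event) invokes this Lambda."""
    query = "SELECT order_date, SUM(amount) AS total FROM orders GROUP BY order_date"
    response = athena.start_query_execution(
        QueryString=query,
        QueryExecutionContext={"Database": DATABASE},
        ResultConfiguration={"OutputLocation": OUTPUT_LOCATION},
    )
    execution_id = response["QueryExecutionId"]

    # Notify downstream consumers that the query was started.
    sns.publish(TopicArn=TOPIC_ARN, Message=json.dumps({"query_execution_id": execution_id}))
    return {"statusCode": 200, "body": execution_id}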
Requirement ID: 9833748
Posted On: Jan-10-2025
Role name: Technical Lead
Role Description: Lead the design, coding, debugging, and documentation of applications in Java technologies such as J2EE, Spring Boot, Spring Cloud, Spring MVC, Streams, and APIs. Lead the proof-of-concept work to move the existing real-time ingestion stack to a more modern tech stack. Build microservices that connect to relational, NoSQL, and graph databases. Build software components that integrate with a workflow engine and/or ESB to execute asynchronous business processes. Collaborate with stakeholders to define specifications and delivery project timelines. Perform code reviews to ensure high quality and consistent coding practices. Support and troubleshoot issues (process and system), identify root cause, and proactively recommend sustainable corrective actions. Ensure that IT and business standards and procedures are maintained in accordance with company policies and all audit, security, and regulatory requirements. Partner with other IT managers and architects to ensure alignment, drive efficiencies, and ensure continued accountability for product and services delivery within assigned business units and across IT. Other duties as assigned.
Competencies: Advanced Java Concepts
Experience (Years): 10 & Above
Essential Skills: Ten (10) years of experience with Java technologies including Spring, Spring Boot, and Spring Cloud. Two (2) years of experience in Kafka and Spark. Excellent understanding of multi-threading and concurrency in Java is required. Experience leading all tasks in the development cycle such as business process definition, configuration and unit testing, assistance and follow-up with user testing, coordination of transports, production, training, post-live support and fixes, etc. Experience with RESTful, JSON, XML, and SOAP web services. Familiarity with backend architectures, microservices patterns, object-oriented design patterns, and messaging architecture. Available for moderate overnight local travel (35%). Able to work flexible hours and be available for emergency response on short notice.
Desirable Skills: Experience with RESTful, JSON, XML, SOAP web services; J2EE, Spring Boot, Spring Cloud, Spring MVC, Streams, and APIs; experience in Kafka and Spark.
Country: United States
Branch / City / Location: TCS - NEW YORK 2, NY / NEW YORK / New York-2, NY
BA Recruiter Name: SUNDARRAJAN MURALI
Start Date: Mar-03-2025
Duration (Months): 7
Status: Open
Keywords: (not specified)
Requirement ID: 9833683
Posted On: Jan-10-2025
Role name: Engineer
Role Description: Position: Data Engineer with extensive experience in building data integration pipelines in a CI/CD model. Experience: Lead 12+ years, sub-leads 7+ years. Ability to design and develop a high-performance data pipeline framework from scratch:
- Data ingestion across systems
- Data quality and curation
- Data transformation and efficient data storage
- Data reconciliation, monitoring, and controls
- Support for the reporting model and other downstream application needs
- Skill in technical design documentation, data modeling, and performance tuning of applications
- Lead and manage a team of data engineers, contribute to code reviews, and guide the team in designing and developing complex data pipelines adhering to the defined standards
- Be hands-on, perform POCs on open-source/licensed tools in the market, and share recommendations
- Provide technical leadership and contribute to the definition, development, integration, testing, documentation, and support across multiple platforms (GCP, Python, HANA)
- Establish a consistent project management framework and develop processes to deliver high-quality software, in rapid iterations, for business partners in multiple geographies
- Participate in a team that designs, develops, troubleshoots, and debugs software programs for databases, applications, tools, etc.
- Experience balancing production platform stability, feature delivery, and reduction of technical debt across a broad landscape of technologies
- Skill in the following platforms, tools, and technologies: GCP cloud platform (GCS, BigQuery, streaming with Pub/Sub, Dataproc, and Dataflow); Python, PySpark, Kafka, SQL, scripting, and stored procedures; data warehouses, distributed data platforms, and data lakes; database definition, schema design, Looker Views and Models; CI/CD pipelines (a minimal BigQuery load sketch follows this listing)
- Proven track record in scripting code in Python, PySpark, and SQL
- Excellent structured-thinking skills, with the ability to break down multi-dimensional problems
- Ability to navigate ambiguity and work in a fast-moving environment with multiple stakeholders
- Good communication skills and ability to coordinate and work with cross-functional teams
Competencies: Digital : Python, Digital : Google Cloud
Experience (Years): 10 & Above
Essential Skills: Same as Role Description.
Desirable Skills: Same as Role Description.
Country: United States
Branch / City / Location: TCS - Sunnyvale / SAN JOSE / San Jose, CA
BA Recruiter Name: RAGESH K
Start Date: Feb-05-2025
Duration (Months): 6
Status: Open
Keywords: Data Analytics, GCP, Python, PySpark, SQL
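For the GCP data-pipeline requirement above, a minimal sketch of one common step it names: loading curated files from GCS into BigQuery with the google-cloud-bigquery client. Project, dataset, table, and bucket names are placeholders, not taken from the posting:

from google.cloud import bigquery

# Placeholder project, dataset, table, and bucket names -- not from the posting.
client = bigquery.Client(project="example-project")
table_id = "example-project.sales.daily_orders"
source_uri = "gs://example-landing-bucket/orders/dt=2025-01-10/*.parquet"

# Load curated Parquet files from GCS into BigQuery, appending to the target table.
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)
load_job = client.load_table_from_uri(source_uri, table_id, job_config=job_config)
load_job.result()  # block until the load job finishes

# A quick reconciliation check: row count of the table after the load.
rows = client.query(f"SELECT COUNT(*) AS n FROM `{table_id}`").result()
print(next(iter(rows)).n)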
Requirement ID: 9836184
Posted On: Jan-10-2025
Role name: Developer
Role Description: Good at development using the Java language. Hands-on experience in application development. Understanding of the software development lifecycle. Outstanding troubleshooting skills, attention to detail, and written and spoken communication skills. Knowledge of Oracle PL/SQL. Use of Linux commands. Ability to collaborate with others to solve complex problems. Good communication and customer-facing experience. Application testing.
Competencies: Core Java
Experience (Years): 8-10
Essential Skills: Same as Role Description.
Desirable Skills: Same as Role Description.
Country: United States
Branch / City / Location: TCS - Seattle, WA / SEATTLE / Seattle, WA
BA Recruiter Name: CHAITANYA TANGIRALA
Start Date: Feb-28-2025
Duration (Months): 4
Status: Open
Keywords: Java Full Stack - Core Java, Spring, Microservices, DB, JavaScript, HTML, CSS, React JS, Node JS

Requirement ID: 9832946
Posted On: Jan-10-2025
Role name: Technical Lead
Role Description: Core Java, Spring, Spring Boot
Competencies: Digital : Amazon Web Service (AWS) Cloud Computing, Digital : Spring Boot, Core Java
Experience (Years): 8-10
Essential Skills: Core Java, Spring, Spring Boot
Desirable Skills: Core Java, Spring, Spring Boot
Country: United States
Branch / City / Location: TCS - Houston, TX / HOUSTON / Houston, TX
BA Recruiter Name: RAGESH K
Start Date: Jan-29-2025
Duration (Months): 0
Status: Open
Keywords: Core Java, Spring, Spring Boot
Requirement ID: 9835998
Posted On: Jan-10-2025
Role name: Developer
Role Description: 10+ years of experience in ETL processes and tools such as Apache Spark, Java, Spring Batch, and J2EE (a minimal Spark-on-Hive ETL sketch follows this listing). 5+ years of experience in Hadoop, Hive, SQL, and NoSQL databases. Excellent debugging skills. Good understanding of third-party dependency management and transitive dependency issues. Strong knowledge of the IMS database. Understanding of the software development life cycle. Experience with implementation and release management activities. Good understanding of unit/system and functional testing methodologies. Experience working in large transaction-based systems. Experience writing technical designs and documenting technical functions. Strong communication and good leadership skills. Design, code, test, debug, document, maintain, and modify computer programs of high complexity, significance, and risk. Participate in application architecture functions, including estimating and defining timetables, costs, and project tasks. Recommend solutions to improve the business with a focus on core architecture, technology strategies, and standards. Guide others through change impact analysis. Establish, refine, and integrate development and test environment tools and software as needed. Review, analyze, refine, and integrate development and test environment tools and software as needed. Create and recommend improvements to unit and test plans and the testing process based on an assessment of organizational needs. Collaborate closely with teams in all stages of the software development lifecycle, including design, development, and testing of the system. Design basic and detailed program specifications while ensuring that expected application performance levels are achieved by managing interfaces, service levels, standards, and configurations. Guide technical staff and business partners to investigate, review, and solve complex, multidisciplinary business problems. Monitor the operating efficiency and organizational needs of existing application systems, identify opportunities to fine-tune and optimize applications of developed projects, and recommend technical solutions. Demonstrate solid understanding of the business needs driving the projects.
Competencies: Digital : Apache Spark, Core Java
Experience (Years): 8-10
Essential Skills: Same as Role Description.
Desirable Skills: Same as Role Description.
Country: United States
Branch / City / Location: TCS - Minneapolis Downtown, MN / MINNEAPOLIS / Minneapolis, MN
BA Recruiter Name: RAGESH K
Start Date: Feb-17-2025
Duration (Months): 3
Status: Open
Keywords: 10+ years of experience in ETL processes and tools such as Apache Spark, Java, Spring Batch, J2EE; 5+ years of experience in Hadoop, Hive, SQL, and NoSQL databases
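For the Spark/Hive ETL requirement above, a minimal batch-ETL sketch. It is written in PySpark purely for illustration (the role itself calls for Java), and the table and column names are invented, not taken from the posting:

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hive support lets Spark read warehouse tables directly; table and column
# names below are placeholders.
spark = (SparkSession.builder
         .appName("transactions-etl")
         .enableHiveSupport()
         .getOrCreate())

raw = spark.table("staging.transactions")

# Basic cleansing: drop null amounts, derive a date column, deduplicate on the key.
cleaned = (raw
           .filter(F.col("amount").isNotNull())
           .withColumn("txn_date", F.to_date("txn_ts"))
           .dropDuplicates(["txn_id"]))

# Aggregate to daily totals per account.
daily = (cleaned
         .groupBy("txn_date", "account_id")
         .agg(F.sum("amount").alias("daily_total")))

# Write back to a curated Hive table, partitioned by date.
(daily.write
 .mode("overwrite")
 .partitionBy("txn_date")
 .saveAsTable("curated.daily_account_totals"))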
Requirement ID: 9835902
Posted On: Jan-10-2025
Role name: Developer
Role Description: Full Stack Developer
Competencies: Advanced Java Concepts, Digital : Spring Boot
Experience (Years): 6-8
Essential Skills: Full Stack Developer
Desirable Skills: Full Stack Developer
Country: United States
Branch / City / Location: TCS - Cincinnati, OH / MILFORD / Milford, OH
BA Recruiter Name: Shanmugapriya R
Start Date: Feb-06-2025
Duration (Months): 2
Status: Open
Keywords: Full Stack Developer

Requirement ID: 9835908
Posted On: Jan-10-2025
Role name: Developer
Role Description: Full Stack Developer
Competencies: Advanced Java Concepts, Digital : Spring Boot
Experience (Years): 6-8
Essential Skills: Full Stack Developer
Desirable Skills: Full Stack Developer
Country: United States
Branch / City / Location: TCS - Cincinnati, OH / MILFORD / Milford, OH
BA Recruiter Name: Shanmugapriya R
Start Date: Feb-07-2025
Duration (Months): 2
Status: Open
Keywords: Full Stack Developer
Requirement ID: 9837599
Posted On: Jan-10-2025
Role name: Developer
Role Description: Drive innovation by infusing key technologies such as Azure Kubernetes Service (AKS), Azure DevOps, Azure Functions, Key Vault, Logic Apps, Power Platform, and AI services (a minimal Key Vault access sketch follows this listing). Manage and administer Azure resources, ensuring optimal performance, security, and cost-efficiency. Implement DevSecOps practices to integrate security into the DevOps process, ensuring secure and compliant software delivery. Create and maintain CI/CD pipelines to automate the software delivery process. Monitor and analyze system performance using tools like Datadog and Splunk. Design and architect Azure DevOps pipelines from the ground up. Implement FinOps practices to optimize cloud financial management and cost-efficiency.
Competencies: Digital : Microsoft Azure, Digital : DevOps
Experience (Years): 10 & Above
Essential Skills: Same as Role Description.
Desirable Skills: Same as Role Description.
Country: United States
Branch / City / Location: TCS - San Diego, CA / SAN DIEGO / San Diego, CA
BA Recruiter Name: RAGESH K
Start Date: Feb-08-2025
Duration (Months): 5
Status: Open
Keywords: Azure DevOps Engineer
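For the Azure requirement above, a minimal sketch of reading a secret from Key Vault with the Python SDK, as an Azure Function or AKS workload might do at startup. The vault URL and secret name are placeholders, not details from the posting; DefaultAzureCredential resolves a managed identity in Azure or a developer login locally:

from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

# Placeholder vault URL and secret name, for illustration only.
vault_url = "https://example-vault.vault.azure.net"
credential = DefaultAzureCredential()
client = SecretClient(vault_url=vault_url, credential=credential)

# Retrieve a secret (e.g. an API key for a monitoring integration) at startup.
secret = client.get_secret("monitoring-api-key")
print(secret.name, "retrieved; value length:", len(secret.value))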
[email protected] View All |