Requirement for Big Data Engineer with GCP in Phoenix, AZ. Only Locals at Phoenix, Arizona, USA
Email: [email protected] |
http://bit.ly/4ey8w48
https://jobs.nvoids.com/job_details.jsp?id=2127361&uid=

Hi,

Trust you are doing great and staying safe. Please find attached the resume of my consultant. We can also share all the required documents.

Visa: H1B

Thanks & Regards,
Narasimha
Select Jarvis LLC
Sales Recruiter
M: +1 501-205-(2774)
E: [email protected]
Gmail: [email protected]
202 North Walton Blvd., Suite 32, Bentonville, AR 72712

From: [email protected] [mailto:[email protected]] On Behalf Of Avnish Naagar
Sent: 29 January 2025 16:57
To: Avnish Naagar <[email protected]>
Subject: Requirement for Big Data Engineer with GCP in Phoenix, AZ. Only Locals

Role: Big Data Engineer with GCP
Location: Phoenix, AZ. Only locals
Duration: 12+ months
Work Authorization: H4 EAD, GC, H1B, USC, and OPT EAD will work

Job Description:
We are looking for a Big Data Engineer with expertise in Google Cloud Platform (GCP) to design, develop, and optimize large-scale data processing systems. The ideal candidate will have experience with GCP data services, big data frameworks, and data pipeline orchestration to deliver scalable and efficient data solutions.

Key Responsibilities:
- Design, develop, and maintain end-to-end data pipelines on GCP.
- Work with BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage, and other GCP services for data processing.
- Optimize data storage, retrieval, and transformation processes for scalability and performance.
- Develop and maintain ETL/ELT pipelines using Apache Spark, Apache Beam, or Cloud Data Fusion (a minimal Beam sketch appears after the posting).
- Ensure data quality, governance, and security within the cloud environment.
- Collaborate with data scientists, analysts, and application teams to deliver data-driven solutions.
- Automate data workflows and orchestration using Cloud Composer (Apache Airflow); a sample DAG sketch also appears after the posting.
- Implement real-time data streaming solutions using Pub/Sub, Kafka, or similar tools.
- Monitor and troubleshoot data pipelines to ensure reliability and performance.
- Use Terraform, CloudFormation, or other Infrastructure as Code (IaC) tooling for environment setup and automation.

Required Skills & Qualifications:
- X+ years of experience in Big Data Engineering with a focus on GCP.
- Hands-on experience with Google Cloud BigQuery, Dataflow, Dataproc, Cloud Composer (Airflow), and Pub/Sub.
- Strong programming skills in Python, Java, or Scala.
- Experience with SQL, NoSQL databases, and data warehousing concepts.
- Expertise in Apache Spark, Apache Beam, or the Hadoop ecosystem.
- Familiarity with real-time data processing and streaming technologies.
- Knowledge of CI/CD, DevOps practices, and Infrastructure as Code (IaC).
- Strong understanding of data governance, security, and compliance best practices.
- Experience with Terraform, Kubernetes, or Docker is a plus.
- GCP certification (e.g., Professional Data Engineer) is a plus.

Preferred Qualifications:
- Experience working with multi-cloud or hybrid cloud environments.
- Familiarity with machine learning workflows and MLOps.
- Experience integrating GCP services with third-party tools and APIs.

Thanks & Regards,
Avnish Naagar
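For reference on the ETL/ELT responsibility above, a minimal sketch of the kind of pipeline described: an Apache Beam streaming job reading from Pub/Sub and writing to BigQuery, runnable on Dataflow. This is illustrative only; the project, region, bucket, topic, and table names are placeholders, not details from this requisition.

# Minimal Apache Beam streaming sketch: Pub/Sub -> BigQuery on Dataflow.
# All resource names below are placeholders, not from this posting.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(
    streaming=True,
    runner="DataflowRunner",          # use "DirectRunner" to test locally
    project="example-project",        # placeholder GCP project
    region="us-central1",
    temp_location="gs://example-bucket/tmp",
)

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
            topic="projects/example-project/topics/events")
        | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "example-project:analytics.events",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
        )
    )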
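Likewise, for the Cloud Composer responsibility, a minimal Airflow DAG sketch showing daily BigQuery orchestration. The DAG id, SQL, and table names are assumed placeholders, not part of the role's actual workloads.

# Minimal Cloud Composer (Airflow) DAG sketch: a daily BigQuery rollup job.
# DAG id, SQL, and table names are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_events_rollup",
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    rollup = BigQueryInsertJobOperator(
        task_id="rollup_events",
        configuration={
            "query": {
                "query": (
                    "SELECT DATE(ts) AS day, COUNT(*) AS n "
                    "FROM `example-project.analytics.events` "
                    "GROUP BY day"
                ),
                "useLegacySql": False,
                "destinationTable": {
                    "projectId": "example-project",
                    "datasetId": "analytics",
                    "tableId": "events_daily",
                },
                "writeDisposition": "WRITE_TRUNCATE",
            }
        },
    )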
[email protected] View All |
Posted: 03:38 AM, 30-Jan-25
Attached File(s): Venkatasai_Resume_1__1738188526201.docx