
Big Data Engineer with GCP in Phoenix, AZ (Locals Only)
Email: [email protected]
http://bit.ly/4ey8w48
https://jobs.nvoids.com/job_details.jsp?id=2135008&uid=

Role: Big Data Engineer with GCP

Location: Phoenix, AZ (Locals Only)

Duration: 12+ Months

Work Authorization: H4 EAD, H1B, USC, and OPT EAD will work

Must Have:

10+ years of experience

LinkedIn profile created before 2017

Local Phoenix, AZ driver's license

Job Description:

We are looking for a Big Data Engineer with expertise in Google Cloud Platform (GCP) to design, develop, and optimize large-scale data processing systems. The ideal candidate will have experience working with GCP data services, big data frameworks, and data pipeline orchestration to drive scalable and efficient data solutions.

Key Responsibilities:

Design, develop, and maintain end-to-end data pipelines on GCP.

Work with BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage, and other GCP services for data processing.

Optimize data storage, retrieval, and transformation processes for scalability and performance.

Develop and maintain ETL/ELT pipelines using Apache Spark, Apache Beam, or Cloud Data Fusion.

Ensure data quality, governance, and security within the cloud environment.

Collaborate with data scientists, analysts, and application teams to deliver data-driven solutions.

Automate data workflows and orchestration using Cloud Composer (Apache Airflow); see the sketch after this list.

Implement real-time data streaming solutions using Pub/Sub, Kafka, or similar tools.

Monitor and troubleshoot data pipelines to ensure reliability and performance.

Use Infrastructure as Code (IaC) tools such as Terraform or CloudFormation for environment setup and automation.
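
For illustration only, a minimal sketch of the kind of Cloud Composer (Apache Airflow) pipeline this role describes: load raw files from Cloud Storage into a BigQuery staging table, then run a SQL transform. Every project, bucket, dataset, and table name below is a hypothetical placeholder, not something from this posting.

# Minimal illustrative Cloud Composer (Airflow 2.x) DAG.
# All project/bucket/dataset/table names are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_events_pipeline",   # hypothetical pipeline name
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Load the day's raw JSON files from Cloud Storage into a staging table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_events",
        bucket="example-raw-bucket",                      # hypothetical bucket
        source_objects=["events/{{ ds }}/*.json"],        # templated by run date
        destination_project_dataset_table="example_ds.events_staging",
        source_format="NEWLINE_DELIMITED_JSON",
        write_disposition="WRITE_TRUNCATE",
    )

    # Transform the staged rows into a reporting table with a BigQuery SQL job.
    transform = BigQueryInsertJobOperator(
        task_id="transform_events",
        configuration={
            "query": {
                "query": "SELECT * FROM example_ds.events_staging",  # placeholder SQL
                "destinationTable": {
                    "projectId": "example-project",
                    "datasetId": "example_ds",
                    "tableId": "events_reporting",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )

    load_raw >> transform

The two-task shape (load, then transform) mirrors the ELT pattern named in the responsibilities; Dataflow or Dataproc operators would slot into the same DAG the same way.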

Required Skills & Qualifications:

10+ years of experience in Big Data Engineering with a focus on GCP.

Hands-on experience with Google Cloud BigQuery, Dataflow, Dataproc, Cloud Composer (Airflow), and Pub/Sub.

Strong programming skills in Python, Java, or Scala.

Experience with SQL, NoSQL databases, and data warehousing concepts.

Expertise in Apache Spark, Apache Beam, or Hadoop ecosystems.

Familiarity with real-time data processing and streaming technologies.

Knowledge of CI/CD, DevOps practices, and Infrastructure as Code (IaC).

Strong understanding of data governance, security, and compliance best practices.

Experience with Terraform, Kubernetes, or Docker is a plus.

GCP certification (e.g., Professional Data Engineer) is a plus.

Preferred Qualifications:

Experience working with multi-cloud or hybrid cloud environments.

Familiarity with machine learning workflows and MLOps.

Experience integrating GCP services with third-party tools and APIs.

--

Thanks & Regards
Avnish Naagar
