Hiring for GCP Data Engineer (Java) - Phoenix, AZ - Long Term
Email: [email protected]
http://bit.ly/4ey8w48
https://jobs.nvoids.com/job_details.jsp?id=2216793&uid=

From: Kevin, DCTS ([email protected])

Reply to: [email protected]

Position:              GCP Data Engineer (Java)

Location:             Phoenix, AZ (on-site from day one)
Duration:             Long term

Experience:           9+ years; open to USC and H1B candidates

Role Description:

Design & Develop: Build and maintain scalable data platform frameworks leveraging Big Data technologies (Spark, Hadoop, Kafka, Hive, etc.) and GCP services (BigQuery, Dataflow, Pub/Sub, etc.).
Data Pipeline Development: Develop, optimize, and manage batch and real-time data pipelines to support business intelligence, analytics, and AI/ML workloads (a pipeline sketch follows this list).
Java Development: Utilize Java to build efficient, high-performance data processing applications and frameworks.
Cloud Architecture: Design and implement cloud-native data solutions on GCP, ensuring reliability, security, and cost efficiency.
ETL & Data Integration: Work with structured and unstructured data sources, integrating data from multiple systems into a unified platform.
Performance Tuning: Optimize data processing performance by fine-tuning Spark jobs, SQL queries, and distributed computing environments (a Spark tuning sketch follows the Required Skills list below).
Collaboration: Work closely with data scientists, analysts, and software engineers to deliver high-quality data solutions.
Automation & Monitoring: Implement CI/CD pipelines for data workflows and set up monitoring solutions to track system health and performance.
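
By way of illustration, here is a minimal sketch of the kind of streaming pipeline described above: an Apache Beam (Dataflow) job in Java that reads messages from Pub/Sub and appends them to BigQuery. The project, subscription, and table names are hypothetical placeholders, and a production pipeline would add parsing, validation, and error handling.

import com.google.api.services.bigquery.model.TableRow;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.CreateDisposition;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.WriteDisposition;
import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.TypeDescriptor;

public class PubSubToBigQuerySketch {
  public static void main(String[] args) {
    Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

    pipeline
        // Read raw string payloads from a hypothetical Pub/Sub subscription.
        .apply("ReadEvents", PubsubIO.readStrings()
            .fromSubscription("projects/my-project/subscriptions/events-sub"))
        // Wrap each payload in a BigQuery TableRow; a real job would parse and validate here.
        .apply("ToTableRow", MapElements.into(TypeDescriptor.of(TableRow.class))
            .via(payload -> new TableRow().set("payload", payload)))
        // Append rows to an existing (hypothetical) BigQuery table.
        .apply("WriteToBQ", BigQueryIO.writeTableRows()
            .to("my-project:analytics.events")
            .withCreateDisposition(CreateDisposition.CREATE_NEVER)
            .withWriteDisposition(WriteDisposition.WRITE_APPEND));

    pipeline.run();
  }
}

Such a job would typically be launched on Dataflow by passing --runner=DataflowRunner along with the project and region options.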

Required Skills & Qualifications:

Strong proficiency in Java for data engineering and backend development.
Hands-on experience with Big Data technologies (Hadoop, Spark, Kafka, Hive, HBase, etc.).
Expertise in GCP services: BigQuery, Dataflow, Pub/Sub, Cloud Storage, Composer (Airflow), Dataproc, etc.
Experience in developing data platform frameworks to support scalable and reusable data solutions.
SQL & NoSQL database experience (e.g., BigQuery, PostgreSQL, Cassandra, MongoDB).
Knowledge of ETL/ELT processes and data modeling concepts.
Experience with CI/CD tooling (Git, Jenkins) and infrastructure as code (IaC, e.g., Terraform).
Understanding of distributed computing principles and high-performance data processing.
Strong problem-solving skills and ability to work in a fast-paced, agile environment.
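
As a companion illustration of the Spark tuning mentioned in the role description, here is a minimal Java Spark sketch showing two common levers: sizing shuffle parallelism and caching a dataset that is reused across aggregations. The bucket path, column names, and partition count are hypothetical.

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class SparkTuningSketch {
  public static void main(String[] args) {
    SparkSession spark = SparkSession.builder()
        .appName("tuning-sketch")
        // Size shuffle parallelism to the cluster instead of the 200-partition default.
        .config("spark.sql.shuffle.partitions", "64")
        .getOrCreate();

    // Hypothetical Parquet dataset on Cloud Storage.
    Dataset<Row> events = spark.read().parquet("gs://my-bucket/events/");

    // Cache once because several downstream aggregations reuse the same data.
    events.cache();

    events.groupBy("event_type").count().show();
    events.groupBy("user_id").count().show();

    spark.stop();
  }
}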

Keywords: continuous integration, continuous deployment, artificial intelligence, machine learning, Arizona
Posted: 09:21 PM, 28-Feb-25

