
12+ years of experience in Data Engineering (Remote, USA)
Email: [email protected]
http://bit.ly/4ey8w48
https://jobs.nvoids.com/job_details.jsp?id=2037704&uid=

From: Shivani Sharma, Abidi Solutions
Email: [email protected]
Reply to: [email protected]

Hi,

Only GC and USC candidates.

Job Title: GCP Data Engineer. Please read the JD carefully before sharing the profile, and also share the candidate's LinkedIn profile.

Location: Cincinnati, Ohio (Remote)

Rate: $60 - $65/hour (C2C)

Client: Kroger

Type: Contract

Position Overview
As a GCP Data Engineer, you will play a crucial role in developing and maintaining advanced data solutions that enable effective decision-making across the organization. This position focuses on designing and implementing scalable, efficient, and secure data pipelines and platforms while leveraging cutting-edge technologies. Your expertise in GCP services, Vertex AI, and feature engineering will be instrumental in driving data initiatives and enhancing business capabilities.
You will collaborate with cross-functional teams to solve complex data engineering challenges, ensuring alignment with organizational goals. This role demands a mix of technical expertise, leadership, and a passion for innovation.

Key Responsibilities
Technical Leadership: Lead projects, facilitate collaboration across teams, and address complex data engineering challenges.
Data Pipeline Development: Design, build, and maintain scalable data pipelines for ingestion, transformation, and integration using tools like Kafka and Databricks.
Feature Engineering: Develop and manage feature engineering pipelines for machine learning workflows using Vertex AI, BigQuery ML, and Python (a brief illustrative sketch follows this list).
Digital Innovation: Modernize and extend core data assets across SQL, NoSQL, cloud-based, and real-time streaming platforms.
Automated Testing: Design and implement automated unit, integration, and performance testing frameworks to ensure data quality and reliability.
Workflow Optimization: Optimize data workflows for performance, cost-efficiency, and scalability in large datasets.
Mentorship: Train and mentor team members on best practices in data engineering and Agile principles.
Documentation: Draft and review architectural diagrams, interface specifications, and design documents for clear communication of technical solutions.
Cost/Benefit Analysis: Evaluate solutions and present cost/benefit analyses to leadership for informed decision-making.
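For context on the feature engineering responsibility above, here is a minimal, illustrative Python sketch of a BigQuery-based feature computation step; the project, dataset, table, and column names are hypothetical assumptions and are not taken from this posting:

```python
# Minimal illustrative sketch; the project, dataset, and column names below are
# hypothetical assumptions, not details taken from this job posting.
from google.cloud import bigquery


def build_customer_features(project_id: str = "example-project") -> None:
    """Materialize a simple per-customer feature table in BigQuery for ML training."""
    client = bigquery.Client(project=project_id)

    query = f"""
        CREATE OR REPLACE TABLE `{project_id}.features.customer_features` AS
        SELECT
            customer_id,
            COUNT(*)             AS order_count_90d,
            AVG(order_total)     AS avg_order_value_90d,
            MAX(order_timestamp) AS last_order_ts
        FROM `{project_id}.raw.orders`
        WHERE order_timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 90 DAY)
        GROUP BY customer_id
    """
    client.query(query).result()  # Block until the BigQuery job completes.


if __name__ == "__main__":
    build_customer_features()
```

A table like this could then be registered with a feature store or read directly as training data in Vertex AI; that wiring is omitted here.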
Required Qualifications

Experience:
4+ years of professional data development experience.
5+ years of Java development and 2+ years of Python development.
3+ years of experience with data pipelines, workflows, and Kafka.
2+ years of feature engineering for machine learning pipelines.
Technical Expertise:
SQL and NoSQL technologies.
GCP services such as BigQuery, Vertex AI Platform, Cloud Storage, AutoMLOps, and Dataflow.
CI/CD pipelines, version control (e.g., Git), and automated testing frameworks.
ETL and Data Warehousing concepts.
Agile Methodologies: Strong understanding of Agile principles, particularly Scrum.

Preferred Qualifications
Knowledge of structured streaming tools such as Spark, Kafka, or EventHub.
Experience with GitHub SaaS/GitHub Actions.
Familiarity with Databricks, PySpark, and Spark development (see the illustrative sketch below).
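As an illustration of the structured streaming item above, a minimal PySpark sketch that reads a Kafka topic and lands it in cloud storage might look like the following; the broker address, topic, and bucket paths are hypothetical assumptions:

```python
# Illustrative Spark Structured Streaming sketch reading from Kafka;
# broker address, topic, and storage paths are hypothetical assumptions.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = (
    SparkSession.builder
    .appName("orders-stream-ingest")
    .getOrCreate()
)

# Read raw events from a Kafka topic as a streaming DataFrame.
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")
    .option("subscribe", "orders")
    .load()
)

# Kafka delivers key/value as binary; cast the payload to string for downstream parsing.
parsed = events.select(col("value").cast("string").alias("payload"))

# Append the stream to cloud storage as Parquet, with checkpointing for recovery.
query = (
    parsed.writeStream
    .format("parquet")
    .option("path", "gs://example-bucket/raw/orders/")
    .option("checkpointLocation", "gs://example-bucket/checkpoints/orders/")
    .outputMode("append")
    .start()
)

query.awaitTermination()
```

Running this requires the Spark Kafka connector on the classpath; on Databricks that dependency is typically available in the runtime.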

Keywords: continuous integration, continuous deployment, artificial intelligence, machine learning, green card

