
Sr. GCP Data Engineer - 12 years of experience || Client expectations to work during EST time zone || CON at Remote, Remote, USA
Email: [email protected]
http://bit.ly/4ey8w48
https://jobs.nvoids.com/job_details.jsp?id=834526&uid=

From: Rex, Smart IT Frame

[email protected]

Reply to: [email protected]

Role: Sr. GCP Data Engineer - 12 years of experience

Location: Remote (USA); the client expects work during the EST time zone.

Duration: Long Term

Interview Mode: Video call

Note: Visa documents and a driver's license are mandatory; genuine profiles only. No H-1B transfers, please.

Job Description:

Minimum 8 Years of experience in Data Engineering Projects.

Minimum 5 Years of experience in GCP.

Minimum 8 Years of experience in SQL/PL-SQL scripting.

Minimum 5 Years of experience in ETL and Data warehousing.

Ability to build batch processing solutions.

Exposure to project management and version control tools such as Jira, Confluence, and Git.

Strong problem-solving and analytical skills

Good communication skills

Strong exposure to and experience with BigQuery, Composer, Python, and CI/CD pipelines.

Good Understanding of Distributed Data Platforms.

Should have worked as a Sr. Data engineer in a medium/large scale Data Warehouse solution.

Experience in Migrating Legacy Data Warehousing Solutions to GCP Cloud.

Deep exposure to and hands-on experience with GCP cloud-native ETL/ELT services, with a deep understanding of BigQuery and Looker (or another reporting platform).

Possess in-depth knowledge and hands-on development experience operationalizing large-scale ingestion, processing, and consumption using Dataproc, Dataflow, or Cloud Data Fusion.

Strong understanding and experience with Storage infrastructure, event-based architecture using Cloud Functions, Monitoring, Logging, and Auditing services of GCP.

Strong experience with one or more MPP data warehouse platforms; BigQuery, Cloud SQL, Cloud Spanner, Firestore, or similar preferred.

Strong development experience on at least one event-driven streaming platform; Pub/Sub or Kafka preferred.

Strong Data Orchestration experience using tools such as Cloud Functions, Dataflow, Cloud Composer, Apache Airflow, or related.

Should be familiar with GCP data migration programs that enable the identification of potential risks in time for technical intervention.

Value-add skills:

GCP Professional Data Engineer certification is an added advantage.

Understanding of Terraform scripts

Understanding of DevOps Pipelines

Keywords: information technology
10:41 PM 07-Nov-23

