
Data Engineer (GCP, Big Query, Snowflake, Azure, Delta) :: Seattle, WA :: Contract
Email: [email protected]
http://bit.ly/4ey8w48
https://jobs.nvoids.com/job_details.jsp?id=2241155&uid=

Hello,

Hope you are doing well. This is Jyoti Chauhan from VLink. We are currently seeking a talented and motivated consultant for the position of Data Engineer (GCP, Big Query, Snowflake, Azure, Delta) at VLink. Based on your background and experience, I believe you could be an excellent fit for this role. Please find the job description for the position below, and if you are comfortable, please share your updated resume.

Job Title: Data Engineer (GCP, Big Query, Snowflake, Azure, Delta)

Location: Seattle, WA - hybrid (local only)

Employment Type: Contract

Duration: 6+ Months

Visa: H1B

Experience: 10+ years

About VLink: Started in 2006 and headquartered in Connecticut, VLink is one of the fastest-growing digital technology services and consulting companies. Since its inception, our innovative team members have been solving the most complex business and IT challenges of our global clients.

Job Description:

***MUST HAVE INDUSTRY EXPERIENCE in ECOMMERCE/RETAIL***

**Need Snowflake - to migrate data into Snowflake**

**Experience with massive amounts of data -- this company has over 13 million users**

**GCP, Google Analytics, Azure, Pipelines, Delta**

As an Engineer II, you will bring a high level of technical knowledge, but also an ability to spread knowledge to your co-workers. You will help form the core of our engineering practice at the company by contributing to all areas of development and operations (pre-production to production). You will be an example of what good engineering looks like and help others around you refine their skills. You will be part of a day-to-day production release team and may perform on-call support functions as needed. Having a DevOps mindset is the key to success in this role, as Engineers are commonly part of full DevOps teams that "own all parts of software development, release pipelines, production monitoring, security, and support."

Data Engineering Projects

Data pipeline creation and maintenance. Stack: Google Cloud Platform (GCP), Azure Cloud, Azure Databricks, Snowflake

Includes engineering documentation, knowledge transfer to other engineers, and future enhancements and maintenance

Create secure data views and publish them to the Enterprise Data Exchange via Snowflake for other teams to consume

Data pipeline modernization and migration via Databricks Delta Live Tables (DLT) and Unity Catalog

Leverage the existing CI/CD process for pipeline deployment

Adhere to PII encryption and masking standards

Data Engineering Tools/Techniques

Orchestration tools - ADF, Airflow, Fivetran

Languages - SQL, Python

Data Modeling - star and snowflake schemas

Streaming - Kafka, Azure Event Hubs, Spark, Snowflake streaming

DevOps Support

Support improvements to the current CI/CD process

Production monitoring and failure support

Provide an escalation point and participate in on-call support rotations

Participate in discussions on how to improve DevOps

Be aware of product releases and how they impact our business

Take part in Agile ceremonies

Perform engineering assignments using existing procedures and best practices

Conduct research to aid in product troubleshooting and optimization efforts

Participate in and contribute to our Engineering Community of Practice

Qualifications:

Completed bachelor's degree or diploma (or equivalent experience) in Computer Science, Software Engineering, or Software Architecture preferred; candidates with substantial and relevant industry experience are also eligible

5+ years of relevant engineering experience

Google Professional Data Engineer Certification is preferred

Experience with Bigtable, clickstream data migration, and semi-structured and unstructured data management

Experience with Google Cloud Platform (GCP) and BigQuery

Experience with developing complex SQL queries

Experience with CI/CD principles and best practices

Experience with Azure Data Factory, Azure Databricks, Snowflake, and Storage Accounts

Experience working with a Data Engineering team and understanding of Data Engineering practices.

Ability to learn, understand, and work quickly with new emerging technologies, methodologies, and solutions in the Cloud/IT technology space

Experience with bug tracking and task management software such as JIRA, etc.

Experience managing outages, customer escalations, crisis management, and similar circumstances

Employment Practices:

EEO, ADA, FMLA Compliant

VLink is an equal opportunity employer. At VLink, we are committed to embracing diversity, multiculturalism, and inclusion. VLink does not discriminate on the basis of race, color, religion, sex, national origin, disability status, protected veteran status, or any other characteristic protected by law. All aspects of employment, including the decision to hire, promote, or discharge, will be decided on the basis of qualifications, merit, performance, and business needs.

Thanks and Regards,

Jyoti Chauhan

US: +1 (860) 751-1176

E-mail: [email protected]

United States | Canada | India | Indonesia
08:14 PM 10-Mar-25






