
Looking for Data Engineer - REMOTE: No H1B, CPT, or OPT at Remote, Remote, USA
Email: [email protected]
http://bit.ly/4ey8w48
https://jobs.nvoids.com/job_details.jsp?id=540861&uid=

From: Aakanksha Singh, RCI ([email protected])

Reply to: [email protected]

Job Title: Data Engineer

Location: Remote

Duration: 6+ Month Contract  

VISA: No OPT, CPT or H1B

Tech stack used (must-have skills):

- Search and autocomplete: Vespa search engine (vectorized documents)

- Argo workflow-based data ingestion: listens to Kafka, APIs, and S3 and accepts data in any form (a hypothetical ingestion sketch follows this list)

- Data processing and Vespa schema design across AWS and Azure (multi-cloud strategy)

- AKS containers, Jenkins flows, 50+ repos, ML modeling (~50K training cost)

- Grafana-based logs

- Log data aggregation

- Some Big Data  
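
As context for the ingestion item above, here is a minimal sketch of what such a pipeline step could look like, assuming a kafka-python consumer feeding Vespa's document/v1 HTTP API. The broker, topic, endpoint, namespace, document type, and field names are placeholders for illustration only; they are not details from this posting, and in practice this step would presumably run inside an Argo workflow rather than as a standalone script.

```python
# Hypothetical sketch: consume JSON documents from a Kafka topic and feed
# them to Vespa via the document/v1 HTTP API. All names are placeholders.
import json

import requests
from kafka import KafkaConsumer  # pip install kafka-python

VESPA_ENDPOINT = "http://vespa.internal:8080"   # placeholder endpoint
NAMESPACE, DOCTYPE = "search", "document"       # placeholder schema names

consumer = KafkaConsumer(
    "ingest.documents",                          # placeholder topic
    bootstrap_servers="kafka.internal:9092",     # placeholder broker
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

for message in consumer:
    doc = message.value
    # document/v1 put: POST /document/v1/<namespace>/<doctype>/docid/<id>
    url = f"{VESPA_ENDPOINT}/document/v1/{NAMESPACE}/{DOCTYPE}/docid/{doc['id']}"
    response = requests.post(url, json={"fields": doc})
    response.raise_for_status()
```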

Your Impact:

Combine your technical expertise and problem-solving passion to work closely
with clients, turning complex ideas into end-to-end solutions that transform our
clients' business
Translate client requirements into system designs and develop solutions that
deliver business value

Lead the design, development, and delivery of large-scale data systems, data
processing and data transformation projects

Automate data platform operations and manage the post-production system and
processes

Conduct technical feasibility assessments and provide project estimates for the
design and development of the solution

Mentor, support, and grow junior team members

Your Skills & Experience:

Demonstrable experience with data platforms, including implementation of
end-to-end data pipelines

Good communication skills and willingness to work as part of a team

Hands-on experience with at least one of the leading public cloud data platforms
(Amazon Web Services, Azure or Google Cloud)

Implementation experience with column-oriented database technologies (e.g.,
BigQuery, Redshift, Vertica), NoSQL database technologies (e.g., DynamoDB,
Bigtable, Cosmos DB), and traditional database systems (e.g., SQL Server,
Oracle, MySQL)

Experience implementing data pipelines for both streaming and batch
integrations using tools/frameworks such as AWS Glue ETL, Lambda, Google Cloud
Dataflow, Azure Data Factory, Spark, and Spark Streaming (a hypothetical
streaming sketch follows this section)

Ability to handle module- or track-level responsibilities and contribute to tasks
hands-on

Experience in data modeling, warehouse design and fact/dimension
implementations

Experience working with code repositories and continuous integration
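
To make the streaming/batch item above concrete, here is a minimal PySpark Structured Streaming sketch that reads events from Kafka and writes partitioned Parquet to S3 with checkpointing. The broker, topic, bucket, and schema are illustrative assumptions, not details from this role.

```python
# Hypothetical sketch: Kafka -> S3 streaming pipeline with Spark Structured
# Streaming. Broker, topic, bucket, and schema are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("events-pipeline").getOrCreate()

schema = StructType([
    StructField("event_id", StringType()),
    StructField("payload", StringType()),
    StructField("event_time", TimestampType()),
])

# Streaming source: parse the Kafka message value as JSON
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "kafka.internal:9092")  # placeholder broker
    .option("subscribe", "events.raw")                          # placeholder topic
    .load()
    .select(from_json(col("value").cast("string"), schema).alias("event"))
    .select("event.*")
)

# Sink: Parquet on S3, with a checkpoint location for restartable progress
query = (
    events.writeStream.format("parquet")
    .option("path", "s3a://example-bucket/events/")                    # placeholder bucket
    .option("checkpointLocation", "s3a://example-bucket/_checkpoints/events/")
    .trigger(processingTime="1 minute")
    .start()
)
query.awaitTermination()
```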

Set Yourself Apart With:

Developer certifications for any of the cloud services like AWS, Google Cloud or Azure

Understanding of development and project methodologies

Willingness to travel






