Azure Data Engineer with Databricks, Delta Lake, Hive, HDFS : Cary, NC/Onsite at Cary, North Carolina, USA
Email: [email protected]
http://bit.ly/4ey8w48
https://jobs.nvoids.com/job_details.jsp?id=2010067&uid=

Job Title: Azure Data Engineer with Databricks, Delta Lake, Hive, HDFS

Location: Cary, NC/Onsite

Role Description:          

Collect, store, process, and analyze large datasets to build and implement extract, transform, load (ETL) processes.

Develop reusable frameworks to reduce development effort, thereby ensuring cost savings for projects.

Develop quality code with well-thought-out performance optimizations in place right at the development stage.

Appetite to learn new technologies and readiness to work on cutting-edge cloud technologies.

Work with a team spread across the globe to drive project delivery, and recommend development and performance improvements.

Build and implement data ingestion and curation processes using big data tools such as Spark (Scala/Python/Java), Databricks, Delta Lake, Hive, HDFS, Sqoop, Kafka, Kerberos, Impala, etc.

Hands-on expertise in implementing analytical data stores on the Azure platform using ADLS, ADF, Databricks, Cosmos DB, and CDP 7.x.

Ingest huge volumes of data from various platforms for analytics needs, writing high-performance, reliable, and maintainable ETL code.

Strong SQL knowledge and data analysis skills for data anomaly detection and data quality assurance.

Proficiency and extensive experience with Spark, Scala, and Python, including performance tuning, is a MUST.

Hive database management and performance tuning (partitioning/bucketing) is a MUST.

Strong analytical skills for working with unstructured datasets.

Performance tuning and problem-solving skills are a must.

Code versioning experience using Bitbucket/Azure DevOps (AzDo); working knowledge of AzDo pipelines would be a big plus.

Monitor performance and advise on any necessary infrastructure changes.

Strong experience in designing and building data warehouses and data stores for analytics consumption (real-time as well as batch use cases).

Eagerness to learn new technologies on the fly and ship to production

Expert in technical program delivery across cross-functional / LOB teams

Expert in driving delivery through collaboration in highly complex, matrixed environment

Possesses strong leadership and negotiation skills

Excellent communication skills, both written and verbal

Ability to interact with senior leadership teams in IT and business.

Preferred:

Expertise in Python and experience writing Azure Functions using Python/Node.js.

Experience using Event Hubs for data integrations.

Hands-on expertise in implementing analytical data stores on the Azure platform using ADLS, Azure Data Factory, Databricks, and Cosmos DB (Mongo/Graph API). Experience ingesting data using Azure Data Factory and building complex ETL with Databricks.

Competencies: Digital: Microsoft Azure, Oracle SQL

Experience (Years):       6-8

Essential Skills: Databricks, Delta Lake, Spark (Scala/Python/Java), Hive, HDFS, Sqoop, Kafka, Kerberos, Impala, etc., and CDP 7.x

Desirable Skills: Databricks, Delta Lake, Spark (Scala/Python/Java), Hive, HDFS, Sqoop, Kafka, Kerberos, Impala, etc., and CDP 7.x

Keywords:         Azure Data Engineer

Thanks & Regards,

Sachin Chaudhary

(Sr. IT Recruiter)

E: [email protected]

K&K Global Talent Solutions Inc.

7901 4th St N, St Petersburg, Florida 33702, US. 

US | Canada | Germany | India

*This email and any attachments are confidential and may be privileged. If you are not the intended recipient, please notify the sender and delete this email from your system.

We will not save your details in any of our emails, computers, databases, intranet portals, cloud databases, etc. Your details will only be used for the purpose of recruitment for the job that we discussed. We will delete your details as soon as the recruitment process is finished. If you do not wish to receive emails, please reply with REMOVE in the subject line.

01:14 AM 13-Dec-24

