Placement requirement for DevOps Technical lead(Databricks, Azure Data Lake, Python) in Greenfield, IN (100% on-site from day 1) for Long term contract at Greenfield, Indiana, USA
Email: [email protected]
https://jobs.nvoids.com/job_details.jsp?id=2136494&uid=
From:

Mohit Jaiswal,

Intelligenz IT

[email protected]

Reply to:   [email protected]

Hi,
I hope you and your family are doing well.
I have a good position with my client that may interest you. At this point I don't know whether you are looking for a new job, but I thought I would share the details and confirm your interest level in the opportunity. If you are interested and available, please send me your most updated resume in Word format along with your contact details.

Job Title:              DevOps Technical lead  (Databricks, Azure Data Lake, Python)
Location:              Greenfield, IN (100% on-site from day 1)
Duration:              Long term contract

Start date: ASAP

Only H1B resumes accepted

Educational Qualification*
Bachelor's degree, preferably in Computer Science or Information Technology

Job Description 
We are seeking a highly skilled Data Engineering Specialist to join our team. The ideal candidate will have extensive experience in cloud technologies, DevOps development practices, and data engineering to support and enhance our RDAP initiatives.

Key Responsibilities:
Design, develop, and maintain Databricks Lakehouse solutions sourcing from cloud platforms such as Azure Synapse and GCP.
Implement and manage DevOps and CI/CD workflows using tools like GitHub.
Apply best practices in test-driven development, code review, branching strategies, and deployment processes.
Build, manage, and optimize Python packages using tools like setuptools, poetry, wheels, and artifact registries.
Develop and optimize data pipelines and workflows in Databricks, utilizing PySpark and Databricks Asset Bundles.
Manage and query SQL databases (Unity Catalog, SQL Server, Hive, Postgres).
Implement orchestration solutions using Databricks Workflows, Airflow, and Dagster.
Work with event-driven architectures using Kafka, Azure Event Hub, and Google Cloud Pub/Sub.
Develop and maintain Change Data Capture (CDC) solutions using tools like Debezium.
Design and implement data migration projects, specifically those involving Azure Synapse and Databricks Lakehouse.
Manage cloud storage solutions, including Azure Data Lake Storage and Google Cloud Storage.
Configure and manage identity and access solutions using Azure Active Directory, including AD Groups, Service Principals, and Managed Identities.

Primary (Must have skills)*

Python package builds (setuptools, poetry, wheels, artifact registries)

Specific technologies

Databricks (PySpark, Databricks Asset Bundles)

Open File Formats (Delta/Parquet/Iceberg/etc)

SQL Databases (Unity Catalog, SQL Server, Hive, Postgres)

Orchestration Tools (Databricks Workflows, Airflow, Dagster)

Azure Data Lake Storage

Azure Active Directory (AD Groups, Service Principals, Managed Identities)

Experience Range
15 years

Secondary Skills (Good To have)*
Kafka, Azure Event Hub, Cloud Pub/Sub
Change Data Capture (Debezium)
Google Cloud Storage
Soft skills/other skills (If any)

Communication Skills:
Ability to convey complex technical concepts in a clear and concise manner to both technical and non-technical stakeholders.
Strong documentation skills for creating process guidelines, technical workflows, and reports.
Problem-Solving and Analytical Thinking:
Capability to troubleshoot and resolve issues efficiently.
Analytical mindset for optimizing workflows and improving system performance.
Lead Responsibilities
Effective in customer interactions: understanding requirements, participating in design discussions, and translating requirements into deliverables in collaboration with the offshore development team.
Effective in collaborating with cross-functional teams across development, operations, and business units.
Strong interpersonal skills to build and maintain productive relationships with team members.

Regards,

Mohit Jaiswal
Intelligenz IT
Work : 646-502-7441

Keywords: active directory information technology
07:21 PM 03-Feb-25

