
Databricks | Python Engineer || Remote PST Hours at Remote, Remote, USA
Email: [email protected]
http://bit.ly/4ey8w48
https://jobs.nvoids.com/job_details.jsp?id=1862218&uid=

From:

Ishavdeep Singh,

Cloud Think Technologies

[email protected]

Reply to: [email protected]

Databricks / Python Engineer Responsibilities

To perform this job successfully, an individual must be able to perform the following essential duties satisfactorily. Other duties may be assigned to address business needs and changing business practices.

Participate as a member of an Agile team developing Data Engineering solutions.

Engage in requirements gathering and technical design discussions to meet business needs.

Design and develop generic, scalable data pipelines in Azure Data Factory and Databricks with Python for on-prem and cloud data sources.

Assemble large, complex sets of data that meet non-functional and functional business requirements.

Leverage your curiosity for solving unstructured data problems and ability to manipulate and optimize large data sets to advance business problem-solving.

Contribute to documentation, testing and cross-training of other team members.

Work closely with others to assist and resolve production issues.
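As a rough illustration of the "generic, scalable pipeline" pattern the responsibilities describe, here is a plain-Python sketch; in a real Databricks role these stages would be PySpark transformations orchestrated by Azure Data Factory, and every function and field name below is hypothetical:

```python
# Minimal sketch of a source-agnostic pipeline: each stage is a plain
# function, composed by a small runner, so the same code path can serve
# on-prem and cloud sources alike.

def extract(rows):
    """Stand-in for reading from a source (JDBC, ADLS, etc.)."""
    return list(rows)

def transform(rows):
    """Example transformation: drop incomplete records, normalize keys."""
    return [
        {k.lower(): v for k, v in r.items()}
        for r in rows
        if all(v is not None for v in r.values())
    ]

def load(rows, sink):
    """Stand-in for writing to a Delta table or warehouse."""
    sink.extend(rows)
    return len(rows)

def run_pipeline(source_rows, sink):
    """Compose the stages; returns the number of rows loaded."""
    return load(transform(extract(source_rows)), sink)

if __name__ == "__main__":
    raw = [{"ID": 1, "Name": "a"}, {"ID": 2, "Name": None}]
    out = []
    print(run_pipeline(raw, out), out)  # 1 [{'id': 1, 'name': 'a'}]
```

The design choice here mirrors the posting's emphasis on reusable pipelines: because each stage is an independent function, the transform logic is unit-testable in isolation, which is what TDD and CI/CD practice (see the qualifications below) depend on.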

Databricks / Python Engineer Qualifications

Bachelor's degree in computer science, computer engineering, a related field, or equivalent experience.

5+ years of data engineering or equivalent experience.

5+ years of hands-on experience in developing and deploying data architecture strategies or engineering practices.

5+ years of experience with complex SQL queries and knowledge of database technologies.

Expert-level coding experience with PySpark and Python.

Expert-level technical experience with Apache Spark / Azure Databricks.

Proficient in using and designing solutions on Azure Cloud infrastructure (particularly Azure Data Factory) and Azure DevOps.

Proficient with core business intelligence and data warehousing technology.

Proficient in designing and developing data integration solutions using ETL tools such as Azure Data Factory and/or SSIS.

Proficient with software development practices such as Agile, TDD, and CI/CD.

Ability to collaborate and communicate professionally, both verbally and in writing, at all levels of the organization, particularly bridging conversations between data and business stakeholders.

Preferred Qualifications

Experience with Snowflake.

Experience with graph databases or graph libraries.

Experience with Kafka or other streaming technologies.

Experience with Elasticsearch.

Experience in the rail or other commodities-driven industries.

Keywords: continuous integration continuous deployment
01:09 AM 22-Oct-24

