
Sr Data Engineer (Somerset, NJ) - Onsite at Berkeley Heights, New Jersey, USA
Email: [email protected]
http://bit.ly/4ey8w48
https://jobs.nvoids.com/job_details.jsp?id=747262&uid=

From:

James Smith,

Rivago Infotech

[email protected]

Reply to:   [email protected]

Position: Senior Data Engineer

Experience: 8+ yrs

# of positions: 4

Location: Berkeley Heights, NJ or Somerset, NJ (Onsite, USA)

We are seeking a Senior Data Engineer who is a bright, driven, customer-focused professional.

Data Engineers are responsible for delivering a variety of engineering services to customers worldwide. Assignments will vary based on the candidate's skills and experience. Typical assignments may involve cluster installation, ETL, solution design and application development, and platform services. Our growing customer base includes most of the Fortune 50 companies, which makes the work assignments technically challenging yet rewarding. This role provides a significant opportunity to learn and apply big data technologies and solve related complex problems (e.g., Data Fabric and Unified Analytics software), working closely with Product Engineering to become an SME who collaborates closely with the customer.

Role Prerequisites and Restrictions:
We are looking for a high achiever who is eligible to work in the USA.
Up to twenty percent business travel as required.

How You'll Make Your Mark:
Master the Ezmeral Data Fabric and Container Platform, including MapR-FS, MapR-DB Binary and JSON Tables, MapR-Streams, Kubernetes, and ecosystem products, and maintain proficiency and currency as the technology evolves and advances.
Achieve and maintain proficiency with cluster and framework sizing, installation, debugging, performance optimization, migration, security, and automation.
Achieve proficiency with MapR-DB Binary and JSON Tables sizing, performance tuning, and multi-master replication.
Achieve proficiency with MapR Event Stream sizing, performance tuning, and multi-master replication (a minimal producer/consumer sketch follows this list).
Ensure the on-time delivery and quality of Professional Service engagements.
Be the technical interface between the customer (data scientists/data analysts) and the delivery team, and the point of escalation between the customer and Product Engineering.
Provide best practices for applying the software to meet customer use cases.
Provide technical thought leadership and advisory on technologies and processes at the core of the data domain, as well as data-domain-adjacent technologies.
Engage and collaborate with both internal and external teams, as a confident participant as well as a leader.
Assist with solution improvement.
Be a technical voice to customers and the community via blogs, User Groups (UGs), and participation at leading industry conferences.
Stay current in best practices, tools, and applications used in the delivery of professional service engagements.
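
The stream-related items above refer to MapR Event Store (MapR-Streams), which exposes a Kafka-compatible API. Below is a minimal producer/consumer sketch using the confluent_kafka client; the stream path /demo/stream, the topic name events, the consumer group, and the broker address are illustrative assumptions, not details from this posting.

```python
# Minimal producer/consumer sketch against a Kafka-compatible stream.
# Assumptions (not from the posting): a stream at "/demo/stream" with a
# topic named "events", a consumer group "demo-group", and a broker at
# localhost:9092. With MapR Event Store the topic is typically addressed
# as "<stream-path>:<topic>" and the bundled client resolves it from that
# path; with plain Apache Kafka, use an ordinary topic name and broker.
from confluent_kafka import Producer, Consumer

TOPIC = "/demo/stream:events"                    # MapR-style "<stream>:<topic>" (assumed)
CONF = {"bootstrap.servers": "localhost:9092"}   # placeholder broker address


def produce_messages(n=5):
    producer = Producer(CONF)
    for i in range(n):
        # Messages are queued asynchronously; flush() waits for delivery.
        producer.produce(TOPIC, key=str(i), value=f"event-{i}".encode())
    producer.flush()


def consume_messages(n=5):
    consumer = Consumer({**CONF,
                         "group.id": "demo-group",
                         "auto.offset.reset": "earliest"})
    consumer.subscribe([TOPIC])
    try:
        for _ in range(n):
            msg = consumer.poll(timeout=5.0)
            if msg is None:          # no message within the timeout
                break
            if msg.error():          # log and skip non-fatal errors
                print("consumer error:", msg.error())
                continue
            print(msg.key(), msg.value())
    finally:
        consumer.close()


if __name__ == "__main__":
    produce_messages()
    consume_messages()
```

The same producer/consumer pattern carries over to multi-master replication testing, since replicated streams are still consumed through the ordinary client API.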

About You:
5+ years of experience administering any flavor of Linux.
3+ years of experience architecting, administering, configuring, installing, and maintaining open-source big data applications, with focused experience on the MapR distribution.
3+ years of hands-on experience supporting Hadoop ecosystem technologies and open-source big data applications, with focused experience on the MapR or CDP distribution.
Expertise in administration of MapR DB / Hive / HBase / Spark / Oozie / Kafka
Strong scripting skills (Bash or Python preferred)
Familiarity with commercial IT infrastructures including storage, networking, security, virtualization, and systems management.
Good understanding of high availability
Able to implement Hadoop Data Security using Groups/Roles
Implement and manage Cluster security.
Ability to troubleshoot problems and quickly resolve issues.
Cluster maintenance as well as creation and removal of nodes
Performance tuning of Hadoop clusters and Hadoop MapReduce routines
Screen cluster job performance and perform capacity planning.
Monitor cluster connectivity and security (a minimal health-check sketch follows this list).
Collaborate with application teams to install operating system and MapR updates, patches, and version upgrades when required.
Integration with other Hadoop platforms
Familiarity with either Ansible, Puppet or Chef
Familiarity with Kubernetes
Proficiency in basic Java or Scala programming preferred but not required.
Bachelor's degree in CS or equivalent experience
Strong verbal and written communication skills are required.
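
Several items above (scripting in Bash or Python, monitoring cluster connectivity) lend themselves to a small example. The sketch below is a minimal health check, assuming a stock Apache Hadoop NameNode that exposes JMX metrics over HTTP; the host, port, bean name, and field names are assumptions based on Apache Hadoop defaults, and a MapR-FS cluster would report node health through its own tooling (for example maprcli) rather than this endpoint.

```python
# Minimal cluster health-check sketch, assuming a stock Apache Hadoop
# NameNode whose web UI exposes JMX metrics over HTTP. The host, port,
# bean name, and field names below are assumptions based on Apache
# Hadoop defaults, not on anything specified in this posting; a MapR-FS
# cluster reports node health through its own tooling (e.g., maprcli).
import sys
import requests

NAMENODE_JMX = "http://namenode.example.com:9870/jmx"   # placeholder host/port
QUERY = {"qry": "Hadoop:service=NameNode,name=FSNamesystemState"}


def check_datanodes():
    resp = requests.get(NAMENODE_JMX, params=QUERY, timeout=10)
    resp.raise_for_status()
    beans = resp.json().get("beans", [])
    if not beans:
        print("no FSNamesystemState bean returned")
        return 1
    state = beans[0]
    live = state.get("NumLiveDataNodes")
    dead = state.get("NumDeadDataNodes")
    print(f"live datanodes: {live}, dead datanodes: {dead}")
    # Any dead datanode is treated as a failure so the exit status can
    # drive cron- or monitoring-agent-based alerting.
    return 0 if not dead else 1


if __name__ == "__main__":
    sys.exit(check_datanodes())
```

Wired into cron or a monitoring agent, the nonzero exit status can be used to trigger alerting on lost nodes.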

The ideal candidate will have any or all of the following:

RHCE certification, Bash or Python scripting, Automation using Ansible, Kubernetes administration, basic knowledge of Spark.
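
For the "basic knowledge of Spark" item, a minimal PySpark sketch is shown below; the input path and the status column are illustrative assumptions rather than anything specified in this posting.

```python
# Minimal PySpark sketch: read a CSV of job events and count rows per status.
# The input path "/data/job_events.csv" and the "status" column are
# illustrative assumptions, not taken from this posting.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("status-counts").getOrCreate()

# Assumes a comma-separated file with a header row.
events = (spark.read
          .option("header", "true")
          .option("inferSchema", "true")
          .csv("/data/job_events.csv"))

counts = (events.groupBy("status")
          .agg(F.count("*").alias("n"))
          .orderBy(F.desc("n")))
counts.show()

spark.stop()
```

On a cluster this would typically be submitted with spark-submit; locally, a plain Python interpreter with PySpark installed is enough.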
