
Senior Big Data Architect (USC/GC, local to Washington, DC) at Washington, DC, USA
Email: [email protected]
http://bit.ly/4ey8w48
https://jobs.nvoids.com/job_details.jsp?id=1216302&uid=

From: Roshini, S3 Infinity ([email protected])

Reply to: [email protected]

Role: Senior Big Data Architect

Location: Remote 

Duration: Contract

Client: octa

Only candidates local to Washington, DC, with US citizenship or a green card.

They need someone with Databricks experience.

About NFF

Networking for Future, Inc. (NFF) is a Washington, DC-based company offering a performance-focused approach to delivering transformational IT business solutions. We take pride in keeping users productive and engaged by providing business and IT teams with the solutions they need to improve their performance in a dynamic, connected world.

NFF is the only Cisco Gold Partner headquartered in the District of Columbia with Advanced Specializations in all major IT disciplines. In addition to Cisco, NFF holds key strategic partnerships with VMware, NetApp, Microsoft, Riverbed, Splunk and many System Integrators. NFF is an ISO 9001:2015 certified company and has been ranked in Inc. Magazine's 500/5000 Fastest Growing Companies list since 2007.

We offer expert solutions relevant to: Network Infrastructures, Data Center & Cloud, Network & Endpoint Security, Application Assurance, Collaboration & Mobility and Staff Augmentation.

About this Position / Responsibilities

NFF, Inc. seeks an experienced IT Consultant to support the design, development, implementation and maintenance of an enterprise Big Data solution as part of our Data Modernization Effort.

This role will provide expertise to support the development of a Big Data / Data Lake system architecture that supports enterprise data operations, including Internet of Things (IoT) / Smart City projects, the enterprise data warehouse, the open data portal, and data science applications. This is an exciting opportunity to work as part of a collaborative senior data team. The architecture includes Databricks, Microsoft Azure platform tools (including Data Lake and Synapse), Apache platform tools (including Hadoop, Hive, Impala, Spark, Sedona, Airflow) and data pipeline/ETL development tools (including StreamSets, Apache NiFi, Azure Data Factory); a minimal sketch of the kind of pipeline this architecture implies follows below.
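As a purely illustrative sketch of the kind of batch pipeline this architecture implies (a Spark job curating IoT readings in an Azure Data Lake), the following Python/PySpark snippet is an assumption-laden example, not part of the job description: all storage paths, container names, and column names are hypothetical, and Azure credentials and cluster configuration are omitted.

# Minimal PySpark sketch (hypothetical paths and columns): read raw IoT JSON
# from the data lake, cleanse it, and write a curated, date-partitioned copy.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("iot-curation").getOrCreate()

# Raw zone: JSON sensor readings landed by an ingestion tool such as NiFi or ADF.
raw = spark.read.json("abfss://[email protected]/iot/readings/")

# Cleanse: drop rows missing key fields, stamp ingestion time, derive a partition date.
clean = (raw
         .dropna(subset=["sensor_id", "reading"])
         .withColumn("ingested_at", F.current_timestamp())
         .withColumn("reading_date", F.to_date("ingested_at")))

# Curated zone: Parquet here; on Databricks the same write would typically target a Delta table.
(clean.write
      .mode("append")
      .partitionBy("reading_date")
      .parquet("abfss://[email protected]/iot/readings/"))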

Requirements
Coordinates IT project management, engineering, maintenance, QA, and risk management.
Plans, coordinates, and monitors project activities.
Develops technical applications to support users.
Develops, implements, maintains and enforces documented standards and procedures for the design, development, installation, modification, and documentation of assigned systems.
Provides training for system products and procedures.
Performs application upgrades.
Performs monitoring, maintenance, and reporting on real-time databases, real-time network and serial data communications, and real-time graphics and logic applications.
Troubleshoots problems.

Qualifications

Skills
Required 5+ years experience implementing Big Data storage and analytics platforms such as Databricks and Data Lakes
Required 5+ years knowledge of Big Data and Data Architecture and Implementation best practices
Required 5+ years knowledge of architecture and implementation of networking, security and storage on cloud platforms such as Microsoft Azure
Required 5+ years experience with deployment of data tools and storage on cloud platforms such as Microsoft Azure
Required 5+ years knowledge of Data-centric systems for the analysis and visualization of data, such as Tableau, MicroStrategy, ArcGIS, Kibana, Oracle
Required 10+ years experience querying structured and unstructured data sources including SQL and NoSQL databases
Required 5+ years experience modeling and ingesting data into and between various data systems through the use of Data Pipelines
Required 5+ years experience with implementing Apache data products such as Spark, Sedona, Airflow, Atlas, NiFi, Hive, Impala
Required 5+ years experience with API / Web Services (REST/SOAP)
Required 3+ years experience with complex event processing and real-time streaming data
Required 3+ years experience with deployment and management of data science tools and modules such as JupyterHub
Required 3+ years experience with ETL, data processing, analytics using languages such as Python, Java or R (see the sketch after this list)
Required 16+ years planning, coordinating, and monitoring project activities
Required 16+ years leading projects, ensuring they are in compliance with established standards/procedures
Preferred 3+ years experience with Cloudera
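To make the API and Python ETL requirements above concrete, here is a small, hedged sketch: the endpoint, response shape, and pagination scheme are all hypothetical, and a production client would add authentication, retries, and backoff.

# Hypothetical REST-to-staging ETL: page through an API and write newline-
# delimited JSON that a Spark, NiFi, or ADF pipeline can pick up downstream.
import json
import requests  # third-party library; assumed available

ENDPOINT = "https://api.example.gov/v1/records"  # hypothetical endpoint

def fetch_page(page):
    """Fetch one page of records; real code would add auth and retries."""
    resp = requests.get(ENDPOINT, params={"page": page, "per_page": 100}, timeout=30)
    resp.raise_for_status()
    return resp.json().get("results", [])  # hypothetical response shape

def stage_all(path):
    """Write every page as NDJSON, one record per line."""
    with open(path, "w", encoding="utf-8") as out:
        page = 1
        while True:
            batch = fetch_page(page)
            if not batch:
                break
            for record in batch:
                out.write(json.dumps(record) + "\n")
            page += 1

if __name__ == "__main__":
    stage_all("records.ndjson")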




