Enterprise Data Architect (onsite) | Florida candidates only (USC/GC) | Tallahassee, Florida, USA
Email: [email protected]
http://bit.ly/4ey8w48
https://jobs.nvoids.com/job_details.jsp?id=1207799&uid=

From: roshini, s3 infinity [email protected]
Reply to: [email protected]
Title: Enterprise Data Architect (onsite)
Work authorization: any (candidate must be authorized to work in the US)

Must-have skills:
- Data Architecture and Modeling: Data Warehouse, Data Lake, Data Modeling (8+ years)
- Data Pipeline and Processing: Data Pipeline, ETL, Kafka, Data Mesh, Data Fabric (10+ years)
- Programming Languages: SQL, Python
- Databases: Relational and NoSQL (Document, Graph, Key-Value)
- Cloud Platforms: Azure, AWS
- Enterprise architecting experience

Preferred skills:
- Data Governance and Management: Data Governance, Master Data Management, Data Quality, Data Catalog, Metadata Management
- BI and Analytics: Business Intelligence, Advanced Analytics, Machine Learning
- Artificial Intelligence: Artificial Intelligence, AI-Augmented Solutions
- Development and Operations: Azure DevOps, DataOps
- Testing and QA: Data Testing/QA Best Practices, Performance Testing
- Source Control: Source Control Systems

Location: Tallahassee, FL (PLEASE SEND LOCAL CANDIDATES ONLY)
Client: Florida Department of Health
Seniority on the skills required: Senior
Earliest start date: ASAP
Type: Temporary project
Estimated duration: 12 months with possible extension(s)

Additional information: The candidate should be able to provide an ID if an interview is requested. The candidate interviewing must be the same individual who will be assigned to work with our client.

Requirements:
- Availability to work 100% at the Client's site in Tallahassee, FL (required)
- Experience interfacing directly with various lines of business; must demonstrate an understanding of general business operations, preferably in healthcare-related fields (5+ years)
- Experience in Data Warehouse and Data Lake architecture, design, and development (8+ years)
- Experience in data architecture and data modeling (including Entity Relationship, Logical, Conceptual, and Physical models) and data profiling/reverse engineering in both schema-on-read and schema-on-write environments, including proficiency with standard modeling tools such as Erwin (10+ years)
- Experience with data pipeline tools, including design, performance optimization, and development/engineering, for both batch (e.g., ETL) and streaming (e.g., Kafka) capabilities (10+ years)
- Experience with Data Mesh and/or Data Fabric architecture design and/or implementation (1+ years)
- Experience with SQL programming, inclusive of stored procedures, functions, and triggers (10+ years)
- Experience with Python or a similar object-oriented high-level programming language (5+ years)
- Experience with relational and NoSQL (Document, Graph, Key-Value) databases (10+ years)
- Experience implementing cloud architecture patterns on cloud platforms such as Azure or AWS, including ingress/egress, security considerations, and storage/processing cost optimization (10+ years)
- Experience designing technology solutions using composable architecture and microservices (5+ years)
- Experience designing data hubs via a canonical data model, as well as governing, incorporating the use of, and optimizing APIs for data exchanges (5+ years)
- Experience with source control systems such as Azure DevOps (5+ years)
- Experience utilizing DevOps or DataOps processes (2+ years)
- Experience in data testing/QA best practices (including performance), tools, and automation (5+ years)
- Experience architecting and implementing Data Governance-related solutions such as Master Data Management, Data Quality, Data Catalog, and Metadata Management (5+ years)
- Experience architecting and implementing Business Intelligence tools (10+ years)
- Experience architecting and implementing Advanced Analytics platforms, including model life-cycle management and machine learning (5+ years)
- Experience designing solutions augmented with artificial intelligence (2+ years)
- Experience working with large volumes of data in the petabyte range (5+ years)

Responsibilities include but are not limited to the following:
- Assist the Department with reviewing current- and future-state artifacts and recommendations from the vendor engaged to supply the technical framework supporting the data and analytics modernization initiative, as directed by the Department.
- Actively contribute to building, documenting, and visualizing the Data & Analytics Strategy and Roadmap, and ensure alignment with business requirements and the Department's 5-year Strategic Plan, as directed by the Department.
- Assist the Department in designing the enterprise data modernization architecture framework that will support data governance and quality efforts as well as the data ecosystem, including all data pipelines, data structures and storage, access management/security, sensitive data protection, processing, and integration.
- Assist the Department in designing and implementing technologies that support efficient enterprise Data Governance operations, such as Master Data Management, Data Quality, and Data Catalog, and that encompass features of an enterprise Master Person Index (MPI) domain solution, a key component of the overall data modernization strategic effort for successful systems integration and interoperability of Department systems, as directed by the Department.
The Contractor, its employees, subcontractors, and agents must comply at all times with all Department data security procedures and policies in the performance of the scope of work, as specified in the Application and Data Security and Confidentiality document attached herein.
Posted: 08:25 PM 12-Mar-24