Data Architect - Remote, USA
Email: [email protected]
http://bit.ly/4ey8w48
https://jobs.nvoids.com/job_details.jsp?id=1438868&uid=

From: Vikrama Rao, ValiantIQ INC ([email protected])
Reply to: [email protected]

Job Title: Data Architect
C2C Rate: $60/hr. on C2C
Position Type: Contract
Location: Boston, MA (Hybrid)
Client: Brown Brothers Harriman
Primary Skills: Data Warehouse, Oracle

Job Description
Seeking a Sr. Data Architect with experience working on modern data platforms capable of supporting big data, relational and non-relational databases, data warehousing, analytics, machine learning, and a data lake. Key responsibilities include developing and migrating off legacy Oracle data warehouses to a new data platform that will serve as the foundation for a key set of offerings running on Oracle Exadata and Cloudera's distribution technology. The Sr. Data Architect will continue to support, develop, and drive the data roadmap supporting our systems and business lines.

Responsibilities:
- Participate in strategic planning and contribute to the organization's data strategy and roadmap.
- Fully understand the current DW systems and the user communities' data needs and requirements.
- Define the legacy data warehouse migration strategy.
- Understand the existing target platform and data management environment.
- Build and facilitate the establishment of a secure data platform on on-prem Cloudera infrastructure.
- Document and develop ETL logic and data flows to facilitate easy usage of data assets, both batch and real-time streaming.
- Migrate, operationalize, and support the platform.
- Manage and provide technical guidance and support to the development team, ensuring best practices and standards are followed.

Qualifications:
- Bachelor's degree
- 10+ years of experience in IT, primarily in hands-on development
- Strong knowledge of architectural principles, frameworks, design patterns, and industry best practices for design and development
- 6+ years of real data warehouse project experience
- Strong hands-on experience with Snowflake
- Strong hands-on experience with Spark
- Strong hands-on experience with Kafka
- Experience with performance tuning of SQL queries and Spark
- Experience designing efficient and robust ETL/ELT workflows and schedulers
- Experience working with Git, Jira, and Agile methodologies
- End-to-end development life-cycle support and SDLC processes
- Strong written and verbal communication skills
- Strong analytical and problem-solving skills
- Self-driven; able to work in teams and independently if required
- Working experience with Snowflake and AWS/Azure/GCP
- Working experience in the financial industry is a plus

Thanks & Regards,
Vikrama Rao
Recruitment Executive - ValiantIQ Inc.
"Searching Best Minds"
Email: [email protected]
P. 704-249-2259
F. (302) 482-3672

Disclaimer: If you are not interested in receiving our e-mails, please reply with "REMOVE" in the subject line for automatic removal, and list all e-mail addresses to be removed, along with any e-mail addresses that might be forwarding the e-mails to you. We are sorry for the inconvenience.
Posted: 09:16 PM, 30-May-24