Azure Data Architect with Snowflake at Scottsdale, Arizona, USA
Email: [email protected]
http://bit.ly/4ey8w48
https://jobs.nvoids.com/job_details.jsp?id=632456&uid=

From: Rex, Smart IT Frame [email protected]
Reply to: [email protected]

Hi All, PFB.

Role: Azure Data Architect with Snowflake + DBT + Python + Data Modelling
Location: Scottsdale, AZ (Day 1 onsite)
Duration: Long Term Contract
Interview Mode: Video
Start Date: ASAP
Interview Process: Video call

Job Description:
- At least 5 years of experience working with various Big Data technologies
- Proactively manage risks, respond to client escalations and troubleshoot engagement issues to provide effective governance
- Build roadmaps and program plans with track/project leads
- Conduct regular reviews of projects and key deliverables to ensure quality and predictability
- Manage projects with varied engagement, pricing and execution models across ADM, support, Data Analytics and transformation initiatives
- Enhance customer experience on all existing engagements
- Work closely with Account Manager(s) to manage client relationships across business, technology and operations
- Provide timely and accurate progress updates on business opportunities to senior management
- Experience in handling end-to-end data projects
- Expert in day-to-day PMO activities with full ownership
- Good planner and Agile specialist, able to plan and execute development cycles in sprints
- Status reporting as per cadence; should be able to defend reports with a logical viewpoint
- Good exposure to GCP data services such as BigQuery, CloudSQL, Cloud Functions, Data Flow, Cloud Composer, Apache Airflow
- Deployment, release and handover experience specific to data projects

Java Spark (technical):
- 4+ years of experience in Java Spark
- At least 5 years of experience in data warehousing
- Strong understanding of and hands-on experience with the Big Data stack (HDFS, Sqoop, Hive, Java, etc.)
- Big Data solution design and architecture
- Design, sizing and implementation of Big Data platforms
- Deep understanding of the Cloudera and/or Hortonworks stack (Spark, installation and configuration, Navigator, Oozie, Ranger, etc.)
- Agile/Scrum methodology experience is required
- Experience with SCMs such as Git and tools such as JIRA
- Experience with RDBMS and NoSQL databases
- Knowledge of data structures, algorithms, etc.
- Well versed in the SDLC, with exposure to its various phases
- Good to have: data warehouse exposure
06:40 PM 13-Sep-23