Remote But Local: Remote GCP Data Engineer / USC or GC at Remote, USA
Email: [email protected]
http://bit.ly/4ey8w48
https://jobs.nvoids.com/job_details.jsp?id=776817&uid=

Hi,

Currently, I am recruiting candidates for one of my requirements, as mentioned below. If you have a matching profile, please send me the updated resume along with contact details at the earliest.

Job Title: Remote GCP Data Engineer
Project Location: Remote
Duration: 06 Months
Only USC or GC
Please do not share a profile without a local DL.
Must have local ID (Georgia) proof or DL.
Rate: $50/hr max

Job Description

What You'll Do
- Build automated ML/AI modules, jobs, and data preparation pipelines by gathering data from multiple sources and systems; integrating, consolidating, and cleansing data; and structuring data and analytical procedures for use by our clients in our solutions.
- Perform design, creation, and interpretation of large and highly complex datasets.
- Consult with internal and external clients to understand the business requirements in order to successfully build datasets and implement complex big data solutions (under a senior lead's supervision).
- Work with Technology and D&A teams to review, understand, and interpret the business requirements to design and build missing functionality supporting identity and fraud analytics needs (under a senior lead's supervision).
- Work on the end-to-end interpretation, design, creation, and build of large and highly complex analytics-related capabilities (under a senior lead's supervision).
- Strong oral and written communication skills, and the ability to collaborate with cross-functional partners.

Qualifications:
3+ years of professional data engineering or data wrangling experience, including:
- Working with a Hadoop-based or cloud-based big data management environment
- Bash scripting or similar experience for data movement and ETL
- Big data queries in Hive/Impala/Pig/BigQuery (proficiency with the BigQuery API libraries for data prep automation is a plus)
- Advanced Python programming (Scala is a plus) with strong coding experience, plus working experience with Data Studio, Bigtable, and GitHub (Cloud Composer and Dataflow are a plus)
- Basic GCP certification is a plus
- Knowledge of Kubernetes (or other GCP-native container-orchestration tools for automating application deployment, scaling, and management) is a plus
- Basic knowledge of machine learning (ensemble and unsupervised machine learning models), with experience using TensorFlow and PyTorch, is a plus
- Basic knowledge of graph mining and graph data models is a plus

Two rounds of interviews: the first with two team members, the second with the Hiring Manager.

- 3+ years of professional experience as a data engineer
- 3+ years working with Python and SQL
- Experience with state-of-the-art machine learning algorithms such as deep neural networks, support vector machines, boosting algorithms, random forests, etc. is preferred
- Experience conducting advanced feature engineering and data dimension reduction in a big data environment is preferred
- Strong SQL skills in a big data environment (Hive/Impala, etc.) are a plus

Things that would stand out on a resume:
1. Master's degree in Computer Science or Data Science
2. Previous company: any bank or e-commerce

Kulwinder Singh // Sr. Technical Recruiter
Office: 413-240-2192
Email: [email protected]
Posted: 10:52 PM, 20-Oct-23