| Urgent Need || GCP Data Engineer || Bolingbrook, IL || USC, GC, H4-EAD |
| Email: [email protected] |
|
http://bit.ly/4ey8w48
https://jobs.nvoids.com/job_details.jsp?id=1700920&uid=

Client: KFORCE
Title: GCP Data Engineer
Duration: 12+ months
Location: Bolingbrook, IL (Hybrid: 3 days every other week; 6 days a month)

NEED GENUINE CANDIDATES ONLY

Technical Requirements:
- Google Cloud Platform (GCP): Dataflow, Dataproc, Pub/Sub, BigQuery, Cloud Storage
- REST APIs and microservices
- Spark on Dataproc (Google's managed Spark engine)
- Exposure to Kafka and other real-time ingestion frameworks
- Building data pipelines

Job Summary:
The GCP Data Engineer will be responsible for designing, developing, and deploying data pipelines and processes within Google Cloud Platform (GCP). This role is pivotal in upgrading GCP infrastructure and ensuring the smooth operation of both batch and real-time data ingestion pipelines. The engineer will work across different teams, collaborate with offshore resources, and be self-sufficient in driving projects forward. The position requires strong technical skills in GCP and related technologies, with a focus on data pipeline development, cloud storage, and Python scripting.

Key Responsibilities:

Development (70%):
- Design and implement batch and real-time ingestion pipelines using GCP services such as Dataflow, Dataproc, and BigQuery (illustrative batch and streaming sketches follow the Requirements list below).
- Write and optimize Python scripts in Spark, leveraging GCP's managed Spark engine (Dataproc).
- Utilize Cloud Functions for data ingestion and processing tasks (see the Cloud Function sketch below).
- Work with Cloud Security to build security frameworks and perform vulnerability assessments to detect and resolve platform security issues.
- Develop data engineering pipelines for data ingestion into the GCP platform/BigQuery.
- Build and maintain scalable data pipelines that handle high-volume data (e.g., 150 million rows).
- Create Looker dashboards for BigQuery performance monitoring, slot management, and user query analysis.
- Work with platform architects to build new capabilities/frameworks and set policies/guidelines for development teams.
- Develop and manage APIs on the Apigee platform and utilize Cloud Functions for API management.
- Build microservices and integrate with REST APIs to support data ingestion and processing needs.

Support (30%):
- Provide technical support for existing data pipelines and systems.
- Collaborate with offshore engineers and various teams to gather requirements and ensure smooth deployment and integration.

Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 6+ years of experience in data engineering, with a focus on GCP technologies.
- Proficiency in GCP services, including Dataflow, Dataproc, Pub/Sub, BigQuery, and Cloud Storage.
- Experience building and maintaining data pipelines using Python and Spark.
- Strong experience with BigQuery, including SQL and stored procedures.
- Hands-on experience with real-time ingestion frameworks such as Kafka.
- Experience with Spark on GCP (Dataproc) and strong Python scripting skills.
- Familiarity with REST APIs and microservices architecture.
- Experience with high-volume data processing (GB-scale).
- Previous experience in a similar role with GCP stack technologies.
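For context on the batch side of the role, here is a minimal sketch of the kind of Dataproc job the posting describes: a PySpark script that reads files from Cloud Storage, transforms them, and loads the result into BigQuery through the spark-bigquery connector. All bucket, dataset, table, and column names here are hypothetical placeholders, not details from the posting.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Batch job submitted to a Dataproc cluster, e.g. via
# `gcloud dataproc jobs submit pyspark`.
spark = SparkSession.builder.appName("daily-sales-batch").getOrCreate()

# Read raw files landed in Cloud Storage; bucket path and schema are assumptions.
raw = (
    spark.read
    .option("header", "true")
    .csv("gs://example-landing-bucket/sales/2024-08-28/*.csv")
)

# Example transformation: cast, deduplicate, and aggregate before loading.
daily = (
    raw.withColumn("amount", F.col("amount").cast("double"))
    .dropDuplicates(["order_id"])
    .groupBy("store_id")
    .agg(F.sum("amount").alias("daily_total"))
)

# Write to BigQuery via the spark-bigquery connector (preinstalled on recent
# Dataproc images, or supplied with --jars on older ones).
(
    daily.write.format("bigquery")
    .option("table", "example_dataset.daily_sales")          # hypothetical table
    .option("temporaryGcsBucket", "example-staging-bucket")  # staging bucket for the load
    .mode("append")
    .save()
)
```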
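For the real-time side, a common pattern on GCP, and one consistent with the Dataflow and Pub/Sub stack listed above, is a streaming Apache Beam pipeline that reads from a Pub/Sub subscription and writes rows into BigQuery. This is a sketch under assumed project, subscription, and schema names; the client's actual pipelines may look quite different.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

# Streaming pipeline intended for the Dataflow runner; project, region,
# bucket, subscription, and table names below are all assumptions.
options = PipelineOptions(
    runner="DataflowRunner",
    project="example-project",
    region="us-central1",
    temp_location="gs://example-staging-bucket/tmp",
)
options.view_as(StandardOptions).streaming = True

with beam.Pipeline(options=options) as p:
    (
        p
        # Pub/Sub delivers raw bytes; decode and parse each message as JSON.
        | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
            subscription="projects/example-project/subscriptions/orders-sub")
        | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        # Stream the parsed records into BigQuery, creating the table if needed.
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "example-project:example_dataset.orders_stream",
            schema="order_id:STRING,store_id:STRING,amount:FLOAT,ts:TIMESTAMP",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        )
    )
```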
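The Cloud Functions responsibility could look like the following sketch: a background function (1st gen) triggered by a Cloud Storage finalize event that publishes a notification to Pub/Sub for downstream ingestion. The project ID, topic name, and function name are assumptions for illustration only.

```python
import json

from google.cloud import pubsub_v1

# Reuse one publisher client across invocations; project and topic are placeholders.
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("example-project", "file-landed")

def ingest(event, context):
    """Publish a notification for each file that lands in the landing bucket.

    `event` is the Cloud Storage object metadata dict supplied by the trigger.
    """
    message = {
        "bucket": event["bucket"],
        "name": event["name"],
        "size": event.get("size"),
    }
    # publish() returns a future; result() blocks until the message is accepted.
    future = publisher.publish(topic_path, json.dumps(message).encode("utf-8"))
    future.result()
```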
Preferred Qualifications:
- Experience with API development on the Apigee platform or similar.

Thanks and Regards,
Abhishek Tripathi
Senior US IT Recruiter | Pransu Tech Solutions
Office: 1010 Continental Ave, Canton, MI 48188
Email: [email protected]
Hangouts: [email protected] |
| 07:06 PM 28-Aug-24 |