
Need A : GCP Software Engineer - 100% Remote work at Dearborn, Michigan, USA
Email: [email protected]
http://bit.ly/4ey8w48
https://jobs.nvoids.com/job_details.jsp?id=156001&uid=

From:
Yash,
SSPEARHEAD INC
[email protected]
Reply to:   [email protected]

Hi,

Hope you are doing well!

I am reaching out to you about an exciting job opportunity with one of our clients.

Direct Client: to be disclosed later

Job Category: Google Cloud Platform (GCP) Software Engineer

Location: Dearborn, MI (100% remote; may require relocation down the road)

W2 Pay: $57.00/hr + 20 days PTO

Visa: US Citizen, TN, EAD, GC, OPT, H1 transfers only, please

Software Engineer Senior #897193

TOP SKILLS

In-depth understanding of Google's product technology (or another cloud platform) and its underlying architectures

5+ years of application development experience required
5+ years of SQL development experience
3+ years of GCP experience
Experience working in GCP-based Big Data deployments (batch/real-time) leveraging BigQuery, Bigtable, Google Cloud Storage, Pub/Sub, Data Fusion, Dataflow, Dataproc, Airflow, etc.
2+ years of professional development experience in Java or Python.

Job Description:

       We're seeking an experienced Software Engineer who is familiar with Lean Agile practices.

       You will analyze and manipulate large datasets that support the enterprise, activating data assets to support Enabling Platforms and analytics on Google Cloud Platform.

       You will be responsible for designing the transformation and modernization effort on Google Cloud Platform, as well as landing data from source applications into GCP.

       Experience with large-scale delivery and operationalization of data warehouses, data lakes, and analytics platforms on Google Cloud Platform or another cloud environment is a must.

       We are looking for candidates who have a broad set of technology skills across these areas and who can demonstrate the ability to design the right solutions with an appropriate combination of GCP and third-party technologies for deployment on Google Cloud Platform. You will:

       Work in a collaborative environment, including pairing and mobbing with other cross-functional engineers

       Work on a small agile team to deliver working, tested software

       Work effectively with fellow data engineers, product owners, data champions and other technical experts

       Demonstrate technical knowledge/leadership skills and advocate for technical excellence

       Develop exceptional analytics data products using streaming and batch ingestion patterns on Google Cloud Platform with solid data warehouse principles (see the sketch below).
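
For illustration only, not part of the client's requirements: a minimal sketch of the streaming ingestion pattern described above, using Apache Beam on Dataflow to read from Pub/Sub and write to BigQuery. It assumes apache-beam[gcp] is installed; the project, topic, bucket, and table names are placeholders.

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions


def parse_event(message: bytes) -> dict:
    # Decode a Pub/Sub message payload into a BigQuery row dict.
    record = json.loads(message.decode("utf-8"))
    return {"event_id": record["id"], "payload": json.dumps(record)}


def run():
    # Placeholder project, region, and bucket; swap in real values.
    options = PipelineOptions(
        runner="DataflowRunner",          # or DirectRunner for local testing
        project="my-gcp-project",
        region="us-central1",
        temp_location="gs://my-bucket/tmp",
    )
    options.view_as(StandardOptions).streaming = True

    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/my-gcp-project/topics/events")
            | "ParseJson" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                table="my-gcp-project:analytics.events",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()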

Skills Required:

       Work as part of an implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment.

       Implement methods for automation of all parts of the pipeline to minimize labor in development and production.

       This includes designing and deploying a pipeline with automated data lineage.

       Identify, develop, evaluate, and summarize proofs of concept to prove out solutions.

       Test and compare competing solutions and report a point of view on the best solution, including the integration between GCP Data Catalog and Informatica EDC.

       Design and build production data engineering solutions to deliver our pipeline patterns using Google Cloud Platform (GCP) services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine (a batch-load sketch follows below).
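
For illustration only, not part of the client's requirements: a minimal sketch of a batch load from Cloud Storage into a partitioned BigQuery warehouse table, using the google-cloud-bigquery client. The project, bucket, dataset, table, and event_date partition column are placeholders.

from google.cloud import bigquery


def load_daily_batch():
    # Placeholder project, bucket, and table ids; swap in real values.
    client = bigquery.Client(project="my-gcp-project")

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.PARQUET,
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
        # Partition the warehouse table by a hypothetical event_date column.
        time_partitioning=bigquery.TimePartitioning(field="event_date"),
    )

    load_job = client.load_table_from_uri(
        "gs://my-bucket/landing/events/*.parquet",
        "my-gcp-project.analytics.events",
        job_config=job_config,
    )
    load_job.result()  # block until the load job completes
    print(f"Loaded {load_job.output_rows} rows")


if __name__ == "__main__":
    load_daily_batch()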

Skills Preferred:

       Strong drive for results and the ability to multitask and work independently

       Self-starter with proven innovation skills

       Ability to work with cross-functional teams and all levels of management

       Demonstrated commitment to quality and project timing

       Demonstrated ability to document complex systems

       Experience in creating and executing detailed test plans

Experience Required:

       Work with the data team to analyze data, build models, and integrate massive datasets from multiple data sources for data modeling

       Implement methods for automation of all parts of the predictive pipeline to minimize labor in development and production.

       Formulate business problems as technical data problems while ensuring key business drivers are captured in collaboration with product management

       Extracting, loading, transforming, cleaning, and validating data

       Designing pipelines and architectures for data processing

       1+ year of designing and building production data pipelines from ingestion to consumption within a hybrid big data architecture, using Java, Python, etc.

       1+ year of hands-on GCP experience, with at least one solution designed and implemented at production scale

Experience Preferred:

       Architecting and implementing next-generation data and analytics platforms on GCP

       Experience in building solution architectures, provisioning infrastructure, and delivering secure and reliable data-centric services and applications in GCP

       Experience with Informatica EDC is preferred

       Experience with development ecosystems such as Git, Jenkins, and CI/CD; exceptional problem-solving and communication skills

       Experience in working with Agile and Lean methodologies

       Team player with attention to detail

       Performance tuning experience

Education Required:

       Bachelor's degree in computer science, IT, or a related scientific field

Education Preferred:

       Master's degree in computer science or a related field

       GCP Professional Data Engineer Certified

       2+ years mentoring engineers

       In-depth software engineering knowledge

E-mail is the best way to reach me if I don't pick up your call.

Thanks & Regards

Yogesh

Sspearhead Inc

Keywords: access management
11:12 PM 18-Nov-22





