HCL America Requirements at Chicago, Illinois, USA
Email: [email protected]
http://bit.ly/4ey8w48
https://jobs.nvoids.com/job_details.jsp?id=1828402&uid=

From: Nora West, W3Global [email protected]
Reply to: [email protected]

Hello,

Please share suitable resumes with me at [email protected].
Note: Please do not send a profile that has already been submitted to HCL.

(Remote) Sr. Technical COBOL Lead
Mandatory skills: COBOL, Assembler, JCL, DB2/VSAM
- Extensive experience in designing, developing, testing, and deploying COBOL, Assembler, SQL, JCL, IMS DB/DC, and DB2
- Expert with Mainframe, SOA services, and Mainframe integration
- Hands-on experience in TSO/ISPF, Endevor, File Manager, File Manager IMS utility, File Manager DB2 utility, SDSF, IF, TABLE BASE, Fault Analyzer, IBM De
Good-to-have skills:
- Insurance domain knowledge
- Well versed in JIRA and Confluence
- Understands Agile concepts and has hands-on experience working on Agile
- DXC COTS products like CyberLife are a plus

Senior Technical Architect - Remote
- 12-15+ years of overall IT experience, with at least 3+ years as an Azure Cloud architect
- 3-5+ years of working knowledge of Azure PaaS services and cloud design patterns
- Experience in analyzing enterprise applications for migration to the Azure cloud
- Ability to classify application treatments per the Gartner 7R model and create the corresponding technology mapping and cloud solutions
- Ability to create an Azure cloud target-state architecture
- Exposure to integration services like Azure Service Bus
- Good understanding of application-level cloud security mechanisms and a basic understanding of network security

Lead Data Engineer - Chicago, IL (need locals)
Mandatory skills (Lead Data Engineer skill set):
- Python/PySpark scripting
- Glue / Lambda
- Snowflake (SQL, stored procedures)
- Airflow or an equivalent orchestration tool
- AWS CloudFormation
- Informatica / ETL
Location is Chicago, IL - 3 days onsite; locals preferred. Need solid hands-on candidates; a coding test will be part of the evaluation. Submit along with a skill matrix.
Profiles will not be considered without the skill matrix; add the skill matrix as the first page of the resume itself for ease of reading.

Technical Product Owner - Chicago, IL
Mandatory skills: AWS, Snowflake
We are looking for a Product Owner with the below key requirements:
- Strong experience in data ingestion and consumption, particularly with microservices and the Data API layer
- Deep functional experience in the Finance and Insurance domains, which is crucial for this position
- Experience in data management and in building data solutions such as a Data Lake and a Data Warehouse
- Required core technology experience in AWS cloud services, Snowflake, Informatica, and BI tools like Tableau, Cognos, and BO

Informatica Lead - Chicago, IL and Charlotte, NC
Mandatory skills: Informatica PowerCenter, Unix, Oracle / strong SQL
- Design and implement data pipelines to extract, transform, and load data from various sources into a data warehouse
- Develop and maintain data models and schemas to support business intelligence and analytics applications
- Optimize data storage and retrieval systems for performance and scalability
- Collaborate with data scientists and analysts to develop and deploy machine learning models and algorithms
- Implement data governance policies and procedures to ensure data quality, security, and compliance
- Develop and maintain ETL workflows using tools
- Design and implement data integration solutions for third-party applications and services
- Conduct performance tuning and optimization of database systems and queries
- Mentor and train junior data engineers and other team members on best practices and emerging technologies in data engineering
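The extract-transform-load pattern that both the Lead Data Engineer and Informatica Lead roles above revolve around can be sketched in minimal pure Python. This is a library- and warehouse-agnostic stand-in for a PySpark/Glue job or an Informatica mapping; all function, table, and field names here are illustrative assumptions, not part of any posting:

```python
# Minimal ETL sketch: extract from a source, transform, load into a target.
# Stand-in for a PySpark/Glue or Informatica pipeline; all names are illustrative.

def extract(source_rows):
    """Extract: read raw records from a source system (here, an in-memory list)."""
    return list(source_rows)

def transform(rows):
    """Transform: normalize fields and filter out records failing quality checks."""
    out = []
    for row in rows:
        if row.get("amount") is None:
            continue  # drop records that fail a basic data-quality check
        out.append({
            "customer": row["customer"].strip().upper(),
            "amount": round(float(row["amount"]), 2),
        })
    return out

def load(rows, warehouse):
    """Load: append cleaned records to the target table (here, a dict of lists)."""
    warehouse.setdefault("fact_sales", []).extend(rows)
    return len(rows)

source = [
    {"customer": " acme ", "amount": "19.991"},
    {"customer": "Globex", "amount": None},  # rejected by the quality check
]
warehouse = {}
loaded = load(transform(extract(source)), warehouse)
print(loaded)                      # 1 row loaded
print(warehouse["fact_sales"][0])  # {'customer': 'ACME', 'amount': 19.99}
```

In a real pipeline each stage would be a Spark job, Glue script, or Informatica mapping, and an orchestrator such as Airflow would only sequence extract, transform, and load and handle scheduling and retries.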
- Building and managing DevOps and DataOps CI/CD pipelines and automated deployments
- Prepare technical design documents and SLA agreements

SQL DBA with MarkLogic and Netezza - Branchville, NJ (Hybrid)
An L3 role typically involves advanced-level responsibilities, including resolving complex issues, overseeing system stability, and providing expert guidance. Here's a breakdown based on the split across SQL (70%), MarkLogic (20%), and Netezza (10%):

1. SQL
- Database Design and Optimization: Designing efficient database schemas, indexing strategies, and query optimization to improve performance.
- Performance Tuning: Monitoring and optimizing the performance of SQL queries, stored procedures, and overall database health.
- Troubleshooting: Handling escalated issues from L2 support, including complex SQL errors, deadlocks, and data inconsistencies.
- Data Integrity and Security: Implementing security measures, such as access controls and encryption, and ensuring data integrity and compliance with standards.
- Backup and Recovery: Managing and maintaining database backups and ensuring disaster recovery processes are in place.
- Automation: Creating and maintaining scripts for database tasks, including backups, monitoring, and alerts.
- Collaboration: Working with development teams to optimize database interactions and provide guidance on best practices.

2. MarkLogic
- Database Administration: Managing the MarkLogic NoSQL database, including installation, configuration, and upgrades.
- Query Optimization: Optimizing XQuery and search performance in MarkLogic, ensuring efficient data retrieval.
- Data Management: Managing document-based data structures, indexing, and retrieval, ensuring the system handles large-scale data efficiently.
- Troubleshooting and Maintenance: Handling escalated MarkLogic-specific issues, including performance bottlenecks and system failures.
- Security: Managing security features like user roles, permissions, and encryption in MarkLogic.
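The indexing-strategy and query-optimization duties listed under the SQL portion above can be illustrated with a minimal sqlite3 sketch (standard library only). The table and index names are illustrative assumptions; a DBA on SQL Server, Oracle, or Netezza would do the equivalent investigation with that engine's own plan tooling:

```python
import sqlite3

# Minimal sketch of an indexing strategy: compare the query plan for a
# filtered lookup before and after adding an index on the filter column.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

query = "SELECT total FROM orders WHERE customer_id = ?"

# Before: with no index on customer_id, SQLite scans the whole table.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchone()[-1]
print(plan_before)  # e.g. "SCAN orders"

# After: an index lets the lookup become a search instead of a full scan.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchone()[-1]
print(plan_after)  # e.g. "SEARCH orders USING INDEX idx_orders_customer (customer_id=?)"
```

The principle, organizing data so that filters become index searches rather than table scans, is engine-independent, even though the plan output and the available structures (B-tree indexes, zone maps on Netezza) differ per platform.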
3. Netezza
- System Administration: Administering the Netezza appliance, including system health checks, upgrades, and maintenance.
- Performance Tuning: Optimizing Netezza-specific SQL queries, particularly those involving large datasets.
- Troubleshooting: Handling escalated issues specific to the Netezza environment, such as performance lags or database errors.
- Data Load Management: Overseeing data load processes to ensure efficient ETL operations.

Thanks & Regards,
Nora West
Recruiter
E: [email protected]
LinkedIn: https://www.linkedin.com/in/j-pooja-nora-west-609474191/
US: 1701 Legacy Dr, Suite #1000, Frisco, Texas 75034
CA: 18 King Street East, Suite 1400, Toronto, ON M5C 1C4
10:54 PM 09-Oct-24