
Rajeev - Python Developer/Lead/Architect
[email protected]
Location: Phoenix, Arizona, USA
Relocation: any
Visa: H1B
Rajeev Ojha Email: [email protected]

Phone: 248-876-0218 Ext 122
Location: Phoenix, AZ, 85083

SUMMARY:
18+ years of IT experience leading full-stack application development, cloud migration, and legacy modernization projects across diverse industries including healthcare, finance, and insurance.
Accomplished Technical Project Manager and Solution Architect, adept at bridging communication between business stakeholders and engineering teams to deliver scalable, production-grade solutions.
7+ years of expertise in designing microservices architectures using Python frameworks such as Flask, Django, FastAPI, and Pydantic, ensuring secure, high-performance RESTful APIs and async-ready endpoints via WSGI/ASGI (Gunicorn, Uvicorn) servers.
Extensive experience designing and implementing robust ETL pipelines using Python and cloud-native services to ingest, transform, and migrate large-scale datasets across healthcare, finance, and enterprise systems.
Proficient in building scalable, production-grade ETL workflows leveraging AWS (S3, Lambda, Redshift), enabling clean data flows, automated validation, and analytics readiness.
Proficient in Python packaging and environment management tools including Wheel, Conda, and virtualenv for reproducible builds and deployments.
Skilled in CI/CD practices using GitHub Actions, CircleCI, Jenkins, GitLab CI/CD, and experience with container orchestration on AWS ECS and EKS for scalable cloud deployments.
Proficient in Terraform and CloudFormation for infrastructure as code, delivering repeatable and auditable environments across AWS, Azure, and GCP.
Strong command of SQL and NoSQL databases, including PostgreSQL, MySQL, SQL Server, DB2, MongoDB, Couchbase, DynamoDB, Redshift, and Neo4j.
Extensive experience with data visualization and BI tools including Power BI, Power Query, and Microsoft 365 for reporting and stakeholder-facing dashboards.
Well-versed in software engineering best practices: TDD, DDD, MVC, and agile delivery methodologies.
Skilled in debugging, performance tuning, and monitoring using tools like Datadog, Postman, PDB, and VS Code, supporting rapid issue resolution in production environments.
Effective team leader and mentor with experience managing cross-functional Agile teams, guiding junior developers, and coordinating global delivery efforts.

TECHNICAL SKILLS:
APIs & Integration: RESTful APIs, GraphQL (basic), Postman, RabbitMQ, Power Query
Architecture & Design: Microservices, Domain-Driven Design (DDD), Event-Driven Architecture, MVC
BI & Reporting: Power BI, Power Query, Microsoft 365 (Excel, SharePoint, Teams integration)
Cloud & Serverless: AWS (EC2, Lambda, RDS, S3, CloudFormation), Azure (Functions, Storage), GCP, Serverless Architecture, IAM
Databases: PostgreSQL, MySQL, MongoDB, Couchbase, DynamoDB, Neo4j, Redshift, SQLite, DB2, Redis, Oracle
Development Methodologies: Agile, Scrum, Waterfall, CI/CD
Frameworks & Libraries: Django, Flask, FastAPI, React, Node.js, Pydantic, SQLAlchemy, Pandas, NumPy, Matplotlib, scikit-learn
IDEs & Tools: VS Code, VI, Eclipse, Android Studio
Infrastructure & DevOps: Terraform, Docker, Kubernetes, GitHub Actions, CircleCI, GitLab CI/CD, Jenkins, Git
Mainframe Modernization: COBOL, JCL, VSAM, IDMS, DB2, CICS, Endevor, Changeman, Migration Strategies
Programming Languages: Python, JavaScript, Bash, VBScript, Shell Scripting, SQL
Testing & Debugging: Pytest, TDD, PDB, VS Code Debugger
Web Servers & Concurrency: Gunicorn (WSGI), Uvicorn (ASGI), AsyncIO



EDUCATION
Master of Computer Applications (MCA), DDU University, Gorakhpur, India (2002)
Bachelor of Science in Mathematics (Hons.), BHU University, Varanasi, India (1998)

CERTIFICATIONS
AWS Certified Solutions Architect - Associate (2020)
IBM Certified DB2 Database Administrator (2009)
Multiple IBM Talent & Employee of the Month Awards

PROFESSIONAL EXPERIENCE
Senior Python Architect | KeyBank | June 2024 - Present | Remote (Phoenix, AZ, US)
Led the migration of mainframe authentication (RACF, DB2) to SailPoint IdentityIQ, enabling modern, scalable identity governance across systems.
Designed and implemented secure, scalable APIs for digital asset management using Python, ensuring robust access controls and integration with IAM policies.
Automated data analysis and reporting workflows using Python, Excel VBA, and ServiceNow, significantly reducing manual effort and time to insights.
Performed comprehensive access and file storage audits across employees, systems, and teams.
Identified over-permissioned users and conducted root-cause analysis for access policy violations.
Flagged unauthorized and expired access to critical systems and data.
Reviewed security rules and access policies to align with enterprise compliance and risk frameworks.
Supported data access updates and visibility controls at the database level.
Ensured compliance with data protection standards and identified security risks for mitigation.
Documented findings and strategic recommendations for leadership review and compliance reporting.
Assessed existing RBAC models and devised implementation strategies for role-based access enforcement in SailPoint.
Automated exception reporting to highlight access discrepancies and irregularities across applications.
Instituted regular access review processes to ensure permission accuracy and reduce audit risk.
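The over-permissioned-user analysis described above boils down to a set comparison between granted entitlements and a role's allowed entitlements. The sketch below is an illustrative assumption of that logic only; the role names, entitlement names, and data shape are invented, not KeyBank or SailPoint data.

```python
# Illustrative access-audit sketch: flag users whose grants exceed what
# their assigned role allows. All names and shapes here are hypothetical.
ROLE_ENTITLEMENTS = {
    "teller": {"read_accounts"},
    "analyst": {"read_accounts", "run_reports"},
}

def flag_over_permissioned(users):
    """Return {user: excess_entitlements} for users holding grants
    beyond their role's allowed set."""
    findings = {}
    for name, info in users.items():
        allowed = ROLE_ENTITLEMENTS.get(info["role"], set())
        excess = set(info["grants"]) - allowed
        if excess:
            findings[name] = sorted(excess)
    return findings

users = {
    "alice": {"role": "teller", "grants": {"read_accounts", "approve_wires"}},
    "bob": {"role": "analyst", "grants": {"read_accounts", "run_reports"}},
}
print(flag_over_permissioned(users))  # {'alice': ['approve_wires']}
```

In a real identity-governance audit this comparison runs against exported entitlement data; the exception report highlighted in the bullets is essentially the `findings` dict serialized for reviewers.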
Python Architect, DBA & Data Engineer | PwC (Global) | May 2022 - June 2024 | Remote (Phoenix, AZ, US)
Led a large-scale, medallion-architecture-based data cleanup using Python, enabling robust data-quality checks, deduplication, and enrichment across the bronze, silver, and gold layers.
Served as Lead Data Engineer and DBA, spearheading the migration of on-premise Oracle databases to Azure SQL Server, ensuring data integrity and performance tuning.
Designed and implemented high-performance, scalable ETL pipelines using Python, Docker, and M Query to power an audit and analytics platform.
Standardized and restructured data from multiple heterogeneous sources such as Oracle, SQL Server, and flat files into the unified format required by the in-house Neptune platform, improving downstream analytics and interoperability.
Built and integrated Power BI dashboards for reporting.
Leveraged Microsoft 365 tools (Power BI, Excel, SharePoint, Teams) for enhanced team collaboration, documentation, and streamlined operational workflows.
Experimented with integrating lightweight Python-based ML models (scikit-learn) into API services for internal anomaly-detection and audit-insight recommendation proofs of concept.
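A typical bronze-to-silver step in a medallion cleanup like the one above normalizes raw values and deduplicates on a business key. The sketch below is a hedged, stdlib-only illustration of that pattern; the field names (`customer_id`, `email`) are assumptions, and the production pipeline additionally involved Docker and M Query against Oracle/SQL Server sources.

```python
# Hypothetical bronze-to-silver cleanup: normalize string fields and
# deduplicate records on a business key. Field names are illustrative.
def to_silver(bronze_rows, key="customer_id"):
    seen = set()
    silver = []
    for row in bronze_rows:
        # Normalize: strip whitespace and lowercase every string field.
        clean = {k: v.strip().lower() if isinstance(v, str) else v
                 for k, v in row.items()}
        if clean[key] in seen:
            continue  # drop duplicates on the business key
        seen.add(clean[key])
        silver.append(clean)
    return silver

bronze = [
    {"customer_id": "C1", "email": " A@X.COM "},
    {"customer_id": "C1", "email": "a@x.com"},   # duplicate key
    {"customer_id": "C2", "email": "b@y.com"},
]
print(len(to_silver(bronze)))  # 2
```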
Lead Python Developer | FliptRx | Nov 2021 - May 2022 | Remote (Phoenix, AZ, US)
Acted as Lead Python Developer supporting a Pharmacy Benefit Manager (PBM) startup focused on Medicare/Medicaid workflows, ensuring sub-second API responses for real-time prescription pricing and approvals.
Designed and developed Flask-based microservices, integrated with Couchbase and AWS, to deliver scalable, low-latency backend systems for pharmacy transactions.
Built a custom parser for NCPDP EDI formats, enabling accurate adjudication of pharmacy claims data received from clients like ChangeRX.
Led the development of ETL pipelines to ingest, transform, and migrate large volumes of healthcare and prescription data to AWS S3 and Redshift, supporting analytics, auditing, and reporting needs.
Standardized logging across services using the Python logging module, with real-time observability and Datadog integration, enabling proactive monitoring and debugging.
Developed and maintained GraphQL endpoints, ensuring seamless communication between backend APIs and React-based frontend applications.
Engineered TCP socket communication endpoints for real-time, bidirectional data exchange with pharmacy partners.
Supported RabbitMQ-based asynchronous workflows, contributing to scalable, event-driven system design compatible with ML model trigger pipelines.
Collaborated with distributed teams using GitHub, Jira, and Confluence, delivering high-quality features on tight deadlines through aggressive sprint cycles and continuous deployment into AWS-hosted microservices.
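The logging standardization mentioned above usually means a shared formatter that every service installs, so log lines are uniform and machine-parseable (the kind of structured output a Datadog agent can ingest). This is an illustrative sketch using only the stdlib `logging` module; the field names are assumptions, not the actual FliptRx schema.

```python
import json
import logging

class JsonFormatter(logging.Formatter):
    """Emit each log record as a single JSON object with a fixed schema.
    The field set here is a hypothetical example of such a schema."""
    def format(self, record):
        return json.dumps({
            "level": record.levelname,
            "service": getattr(record, "service", "unknown"),
            "message": record.getMessage(),
        })

def build_logger(service):
    # Every microservice calls this once, so all services share one format.
    logger = logging.getLogger(service)
    handler = logging.StreamHandler()
    handler.setFormatter(JsonFormatter())
    logger.addHandler(handler)
    logger.setLevel(logging.INFO)
    return logger
```

A service-specific field like `service` can be injected per-record via `logging.LoggerAdapter` or a filter; the formatter above falls back to `"unknown"` when it is absent.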

Lead Python Developer | ASI | Aug 2021 - Nov 2021 | Remote (Rhode Island, US)
Acted as Technical Project Manager and Architect, leading the modernization of legacy Visual FoxPro applications to a fully cloud-native AWS architecture using Django, MySQL, and serverless technologies.
Architected and delivered a centralized authentication system using Django REST Framework, integrated with AWS Cognito and API Gateway for secure, scalable access control across microservices and potential ML API integrations.
Spearheaded the automation of the invoice-to-PO reconciliation process, converting PDFs from diverse vendor formats into a standardized structure using Python-based ETL pipelines on AWS Lambda and S3, forming a clean data foundation for ML models.
Oversaw CI/CD deployment pipelines using GitHub, Bitbucket, and AWS CodePipeline, supported by CloudFormation to ensure reproducible infrastructure aligning with best practices for automated model deployment and environment consistency.
Led full-stack development efforts on AWS Elastic Beanstalk, ensuring scalable service delivery and operational readiness for potential machine learning model endpoints.
Collaborated on early-stage MLOps readiness by aligning data processing workflows, deployment architecture, and observability standards to support future ML model integration and monitoring.
Managed cross-functional teams including developers, testers, and cloud engineers, ensuring timely, high-quality deliverables while aligning with enterprise security and architectural standards.
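Once vendor PDFs are parsed into a standard record shape, the invoice-to-PO reconciliation described above reduces to matching invoices against purchase orders by PO number and flagging amount mismatches. The sketch below illustrates that matching step only; the record fields and tolerance are assumptions, and the production flow ran as Python ETL on AWS Lambda and S3.

```python
# Hypothetical reconciliation core: match invoices to POs by number,
# flag missing POs and amount mismatches. Field names are illustrative.
def reconcile(invoices, pos):
    po_index = {p["po_number"]: p for p in pos}
    matched, exceptions = [], []
    for inv in invoices:
        po = po_index.get(inv["po_number"])
        if po is None:
            exceptions.append((inv["po_number"], "no matching PO"))
        elif abs(po["amount"] - inv["amount"]) > 0.01:
            exceptions.append((inv["po_number"], "amount mismatch"))
        else:
            matched.append(inv["po_number"])
    return matched, exceptions

invoices = [{"po_number": "PO-1", "amount": 100.0},
            {"po_number": "PO-2", "amount": 95.0},
            {"po_number": "PO-9", "amount": 10.0}]
pos = [{"po_number": "PO-1", "amount": 100.0},
       {"po_number": "PO-2", "amount": 90.0}]
matched, exceptions = reconcile(invoices, pos)
print(matched)  # ['PO-1']
```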

MongoDB & Sr. Python Developer | Here | May 2021 - Aug 2021 | Remote (Rhode Island, US)
Resolved chronic deployment issues for MongoDB on AWS, improving system stability.
Developed AWS CloudFormation Templates (CFTs) to automate capacity management.
Optimized database performance by tuning indexing strategies for MongoDB clusters.
Implemented monitoring solutions using AWS CloudWatch and Python-based scripts.
Mainframe Migration Architect | Amica Mutual Insurance | May 2017 - May 2021 | Lincoln, Rhode Island, US
Migrated mainframe legacy reporting systems to modern, cloud-ready solutions using Python and MongoDB, with workloads prepared for deployment on AWS infrastructure.
Built a data mapping tool using Python, Neo4j, and Visual Basic to support mainframe-to-cloud modernization and enable traceability for data migration to AWS-based targets.
Automated the migration of 650K+ reports from Control-D to IBM Content Manager using Python, with batch jobs restructured to support future migration to AWS S3 and archival workflows.
Designed an interactive Mainframe Artifacts Mapping Tool [https://priorart.ip.com/IPCOM/00265610D] using Neo4j, Python, and Visual Basic, helping stakeholders visualize dependencies for smoother cloud migration planning.
Decommissioned 800+ programs and thousands of jobs, reducing legacy footprint and enabling the shift of workloads to AWS-hosted services.
Automated and refactored repetitive mainframe tasks, saving over 2 person-months of effort and laying the groundwork for cloud-native replacements using AWS Lambda and Python.
Supported QA automation by helping develop a Python-Selenium testing suite for Guidewire and mainframe applications, ensuring consistency as systems transitioned to hybrid and AWS-hosted environments.
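A migration of 650K+ reports like the one above is typically driven in fixed-size batches so each job run is bounded and restartable. The sketch below illustrates that batching pattern in stdlib Python only; the batch size and report-ID format are assumptions, not details of the Control-D or Content Manager tooling.

```python
# Illustrative batch driver for a large report migration: stream IDs in
# fixed-size chunks so each run is bounded. Sizes/IDs are hypothetical.
def batches(items, size):
    """Yield successive fixed-size batches from an iterable."""
    batch = []
    for item in items:
        batch.append(item)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:
        yield batch  # final partial batch

report_ids = [f"RPT{n:06d}" for n in range(10)]
chunks = list(batches(report_ids, 4))
print([len(c) for c in chunks])  # [4, 4, 2]
```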

Mainframe Sr. Developer/Architect | American Express | Oct 2015 - May 2017 | Phoenix, AZ
Led batch job performance optimization, reducing processing time and saving $300K.
Assisted customers in architecting and re-engineering applications. Part of Mainframe Modernization team.
Implemented major projects that exceeded business requirements, delivered ahead of schedule, under budget, and defect-free. Served as the mainframe liaison to business teams, with experience in agile processes and transformation.
Developed and executed plan to migrate source code from COBOL to COBOL5/6.
Improved multiple batch jobs by fine-tuning DB2 SQL constructs, SORT utility usage, and dataset compression.
Targeted and remediated chronic incidents; the process improvements were valued at over $300K in savings.
Automated parts of the source-code-to-CA-R load (CA-R is CA's IT asset catalog and reporting tool).
Identified multiple redundant, obsolete processes in Dev/QA and had them shut down, yielding MIPS savings.
Upgraded COBOL applications to next version while ensuring system reliability and cost optimization.
Used ServiceNow for troubleshooting and triaging tickets and digital workflows. Followed agile practices on this assignment.

Mainframe Lead Analyst | Hertz | Apr 2008 - Sep 2015 | Oklahoma City, OK
Improved the ailing nightly batch cycle, realizing a gain of over 4 hours through an 8-month effort.
Improved multiple batch jobs by leveraging DB2 fine-tuning, VSAM optimization, and SORT usage, resulting in over USD 400K/year in savings for the client.
Automated the Traffic Violation Tracking (TVT) process, replacing an 8-person team.
Instrumental in a successful knowledge transfer from the previous team.
Played a crucial role in IT integration when Hertz acquired Dollar Thrifty.
Supported two key applications: Vehicle Monitoring and Vehicle Information System.
Facilitated the migration of key IDMS records to DB2.
Performed requirements gathering, application design, coding, testing, and documentation.
Led an offshore team of 15 and trained multiple resources on DB2 and IDMS.
Used IDMS, ADSO, DB2, IDD, IDML, ADSC, CULPRIT, COBOL, JCL, VSAM, CICSWEB, Changeman and Endevor.
Used IBM Maximo for troubleshooting and triaging tickets, and Changeman and RAT for digital workflow.
Analyzed source code and identified issues dealing with incorrect and inefficient coding.

IDMS Database Administrator | AmerisourceBergen | May 2005 - Mar 2008 | Orange, CA / Pune, India
Performed physical and logical database design and changes, performance monitoring, and database backup and restore.
Trained and assisted application programmers on IDMS.
Assisted in the migration of applications from IDMS to DB2.
Well-versed with IDMS tools: IDD, IDDM, OLQ, ADSC, ADSO, DMLO, CULPRIT, and PMON.
Planned migrations, procedural changes, and training to ease the transition from one release to another, making it as transparent as possible to users. Followed the waterfall model on this assignment.

Software Developer | General Electric | May 2003 - Apr 2005 | Mumbai, India
Responsible for coding, testing, analysis, and production support for various applications for GE.
Used IDMS, ADSO, COBOL, Visual Basic, Oracle, JCL, Easytrieve.
