
Anantharaj Essakimuthu - Java Developer
[email protected]
Location: Dublin, Ohio, USA
Relocation: Yes
Visa: H1B
Anantharaj Essakimuthu | +1 380 2325094 | [email protected]
LinkedIn: https://www.linkedin.com/in/anantharaj-essakimuthu/

Professional Summary:
14+ years of expertise in software development and data engineering, specializing in Java (up to Java 21) and Spring Boot, and leveraging modern Java 8 and Java 11 features such as Lambda Expressions, the Streams API, and Optional for efficient programming solutions.
Proven track record of architecting and implementing high-performance, large-scale applications with a focus on data engineering, analytics, and leveraging Spring Boot for microservices-based architectures.
Extensive experience in developing and optimizing Apache Spark applications for large-scale data processing, utilizing Spark Core, Spark SQL, and PySpark for efficient ETL workflows and analytics.
Experienced in electronic trading with SGX and Moomoo SG; worked with traders, quant researchers, and clients on product development for low-latency applications.
Proficient in building distributed data processing pipelines using Apache Spark integrated with Databricks and AWS, ensuring scalability, fault tolerance, and high performance for structured and unstructured data.
Skilled in integrating Apache Spark with AWS services such as Lambda, S3, and EMR, enabling real-time analytics and efficient handling of Parquet and other file formats.
Expertise in Spring Boot for developing and deploying lightweight, scalable microservices, enhancing modularity and maintainability of applications.
Experienced with Monetary Authority of Singapore (MAS) financial regulations, creating rules, guidelines, and business logic to support anti-money laundering (AML) protections.
Extensive experience with AWS services, including EC2, S3, RDS, Lambda, and ECS, for creating secure, efficient, and scalable cloud-based solutions.
Designed and implemented data pipelines using Databricks, AWS Glue, and Apache Spark, incorporating Java 8/11 Stream API for improved data transformations and processing.
Developed and deployed containerized applications using AWS ECS and Kubernetes, ensuring seamless scaling and high reliability for modern data engineering workflows.
Proficient in Python scripting for data transformation, automation, and integration with advanced data engineering workflows in cloud environments.
Hands-on expertise in Scala and Java 8/11 for optimizing ETL workflows and building robust data processing pipelines.
Strong background in data modeling and storage using SQL (MySQL, PostgreSQL) and NoSQL (e.g., DynamoDB), optimized for analytical and transactional use cases.
Designed event-driven architectures using Kafka, AWS SNS, and SQS, enabling real-time data processing and streaming analytics.
Implemented Infrastructure as Code (IaC) solutions with AWS CloudFormation and Terraform, ensuring consistent and automated provisioning of cloud resources.
Built and optimized CI/CD pipelines using AWS tools like CodePipeline, CodeBuild, and CodeDeploy, supporting automated testing and deployment of data engineering applications.
Designed monitoring and logging solutions with AWS CloudWatch, Databricks Monitoring, and Prometheus, ensuring system observability and reliability.
Proficient in Java 8/11 features such as CompletableFuture, Streams, and Optional for asynchronous programming and functional-style coding in data engineering solutions (see the sketch after this list).
Extensive experience with distributed system design patterns and best practices for fault-tolerant, scalable applications using Spring Boot and Kubernetes.
Adept at leveraging Databricks notebooks to integrate Python, Scala, and Java for advanced data transformation and machine learning workflows.
Skilled in managing AWS IAM roles, VPC configurations, and implementing security policies for safeguarding data pipelines and cloud environments.
Proficient in version control using GitHub/GitLab and implementing efficient branching strategies for collaborative development in data engineering teams.
Experienced in mentoring teams on best practices in Spring Boot, Java 8/11, and data engineering workflows, ensuring high standards of code quality and performance.
Strong analytical skills with a proactive approach to addressing scalability, efficiency, and reliability challenges in data engineering and software development.
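
To illustrate the Java 8/11 style named above, here is a minimal, self-contained sketch combining the Streams API, Optional, and CompletableFuture. The "ORDER_ID:AMOUNT" record format and all names are invented for illustration only:

    import java.util.List;
    import java.util.Optional;
    import java.util.concurrent.CompletableFuture;
    import java.util.stream.Collectors;

    public class StreamsSketch {
        // Hypothetical "ORDER_ID:AMOUNT" record format, invented for this example
        private static int amount(String record) {
            return Integer.parseInt(record.split(":")[1]);
        }

        public static void main(String[] args) {
            List<String> records = List.of("ORD-1:250", "ORD-2:75", "ORD-3:310");

            // Streams API: declarative filtering instead of boilerplate loops
            List<String> large = records.stream()
                    .filter(r -> amount(r) > 100)
                    .collect(Collectors.toList());

            // Optional: the "no match" case is explicit rather than a null return
            Optional<String> largest = records.stream()
                    .max((a, b) -> Integer.compare(amount(a), amount(b)));
            largest.ifPresent(r -> System.out.println("largest: " + r));

            // CompletableFuture: compute the total asynchronously
            CompletableFuture<Integer> total = CompletableFuture.supplyAsync(
                    () -> records.stream().mapToInt(StreamsSketch::amount).sum());
            System.out.println("large: " + large + ", total: " + total.join());
        }
    }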

Skill Set:
Programming Languages: Java (up to Java 21), Python, JavaScript, TypeScript, SQL, HTML/HTML5, CSS2/CSS3, XML, Shell Scripting, Scala.
Frameworks and Libraries: Angular (Angular 14), React, Vue.js, Spring Boot, Spring Core, Spring MVC, Spring REST, Hibernate, Django, Flask, Dropwizard, Bootstrap, jQuery, AJAX, GraphQL
Databases: MySQL, PostgreSQL, Oracle, DB2, Sybase, MongoDB, Cassandra
Tools: Docker, Kubernetes, Azure Kubernetes Service (AKS), Jenkins, Git (GitHub, GitLab, Bitbucket), JIRA, Maven, ANT, Photoshop, Chrome Developer Tools, Apache Kafka
Cloud Platforms: AWS (including S3, EC2)
Development Practices: CI/CD (Continuous Integration/Continuous Deployment), Agile (Scrum), Design-Driven Development
Testing Frameworks: JUnit, Cucumber, Selenium, TestNG, Tosca, OSV, Perfecto
Others: Linux, UNIX, Apache, REST APIs
Banking Systems: ISO 8583, ISO 20022 (SWIFT), T24

Education:
Bachelor of Computer Science from M.S. University, 2010.
Master of Computer Applications from Anna University, 2014.

Professional Experience:
Client: State of Ohio / Strategic Systems USA    Feb 2025 – Present
Role: Full-Stack Developer
Project Description:

As a Senior Developer on a State of Ohio system, I create and maintain code that connects data from back-end systems to a variety of endpoints and processes data for financial systems, and I build domain APIs for service engineering. The role involves understanding business problems, then designing, developing, configuring, testing, and deploying software that provides the solution, using agile techniques as part of a larger, cross-functional team. My responsibilities include building pipelines, defining test strategy, and provisioning cloud environments used during development, with an overall mandate to keep software delivery high quality and as automated as possible.

Client: United Overseas Bank / Gientech    Dec 2022 – Dec 2024
Role: Senior Developer
Project Description:

Gov-Wallet is a digital wallet module that Government agencies can use to disburse monies and credits to citizens and beneficiaries conveniently and securely. Gov-Wallet is part of the Service layer of the Singapore Government Tech Stack (SGTS).

As a Senior Developer on the United Overseas Bank payment and processing team, I have been instrumental in designing and implementing high-performance, scalable applications aligned with critical business and technical goals. Leveraging expertise in Java 21, I architected microservices on Kubernetes and integrated cloud-native solutions, enabling resilient, containerized deployments for large-scale data operations. Additionally, I automated CI/CD pipelines and implemented Infrastructure as Code using Terraform and CloudFormation, ensuring seamless deployment and management of secure, scalable environments.
Responsibilities:
Architected and implemented high-performance, scalable applications using Java 21, ensuring alignment with business requirements and technical goals in data engineering workflows.
Optimized Spark applications for performance and scalability by tuning configurations, partitioning strategies, and caching techniques to handle high-throughput workloads.
Integrated Apache Spark with Databricks and AWS services like S3 and EMR, enabling seamless data ingestion, transformation, and storage in Parquet and other file formats (illustrated in the sketch after this list).
Designed, deployed, and managed microservices architectures on Kubernetes, enabling scalable, resilient, and containerized data applications in production environments.
Proficient in using Java 8 features such as Lambda Expressions, Streams API, and Optional to write cleaner, more concise, and functional-style code for data transformation and processing.
Hands-on experience with Java 11 features, including HTTP Client API, Local-Variable Syntax for Lambda Parameters, and performance enhancements, for building efficient and modern applications.
Designed and implemented microservices using Spring Boot, integrated with RESTful APIs and secured with Spring Security, ensuring robust and scalable solutions.
Utilized Lambda Expressions and the Streams API in Java 8/11 for processing large datasets efficiently, reducing boilerplate code and enhancing code readability.
Leveraged AWS services like EC2, S3, RDS, and Lambda to architect and optimize cloud-native applications, ensuring high availability and reliability for large-scale data pipelines.
Developed and maintained RESTful APIs and backend services using Java 21 and Python, integrating with data engineering tools to support scalable and secure data processing.
Built and automated CI/CD pipelines using AWS CodePipeline, CodeBuild, and CodeDeploy for seamless deployment of data engineering workflows.
Utilized Python for advanced automation tasks, data transformation, and integration with Databricks and other analytics platforms to streamline data operations.
Created Infrastructure as Code (IaC) templates using AWS CloudFormation and Terraform, automating the provisioning and management of AWS cloud environments.
Implemented containerization best practices by building Docker images and orchestrating deployments with Kubernetes, ensuring scalability and robust data processing pipelines.
Optimized data processing performance using Databricks, integrating Parquet file formats for efficient storage and query performance in analytics workflows.
Designed and implemented secure access controls with AWS Identity and Access Management (IAM), VPC configurations, and role-based access control, safeguarding sensitive data.
Leveraged AWS S3, DynamoDB, and RDS for efficient, reliable, and scalable data storage solutions tailored to data engineering needs.
Conducted code reviews, performance tuning, and system optimization for Java-based applications, ensuring low latency and high throughput.
Guided teams on best practices for implementing Kubernetes, AWS services, and data engineering tools, ensuring alignment with architectural and operational standards.
Integrated AWS SNS, SQS, and Kafka for event-driven architectures, enabling real-time messaging and efficient data pipeline orchestration.
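
As an illustration of the Spark-with-S3/Parquet pipelines described in this list, here is a minimal Java sketch; the bucket paths and column names are placeholders, not actual project resources:

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;

    public class ParquetEtlSketch {
        public static void main(String[] args) {
            SparkSession spark = SparkSession.builder()
                    .appName("parquet-etl-sketch")
                    .getOrCreate();

            // Ingest: read Parquet input from S3 (placeholder bucket/path)
            Dataset<Row> txns = spark.read().parquet("s3a://example-bucket/raw/transactions/");

            // Transform: keep positive amounts and aggregate per day
            Dataset<Row> daily = txns
                    .filter("amount > 0")
                    .groupBy("trade_date")
                    .sum("amount");

            // Store: write the result back to S3 as Parquet
            daily.write().mode("overwrite").parquet("s3a://example-bucket/curated/daily_totals/");
            spark.stop();
        }
    }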

Environment: Local & DEV, followed by three non-prod environments (UAT, CAT & PERF) and two prod environments (SANDBOX & PROD)
Tech Stack & Tools: IntelliJ, Bitbucket, Jules, Golang, Oracle, Java 21, Terraform, AWS

Client: Highmark Health / Thryve Digital    Feb 2018 – Nov 2022
Role: Java Developer
Project Description:
The Claims Billing System (CBS) is a web-based application used to produce invoices for self-funded health insurance clients. The system can invoice paid claims and related expenses as well as administrative fees and miscellaneous items. Users navigate the web interface via drop-down menus and a ribbon of tabs, each containing panels dedicated to specific functions. Hyperlinks within the panels expand to display information and perform functions related to that record. Users view nearly all data stored within CBS by accessing a bill account. Every bill account houses the details that drive production of the client invoices and produces a single suite of invoice reports.

As a Backend Java Developer on a USA healthcare application development team, I designed and implemented scalable, real-time healthcare claims systems using Java and Angular 14 to meet dynamic business needs. Leveraging cloud platforms like AWS and Azure, I developed robust ETL pipelines with Apache Spark and Databricks, ensuring efficient data transformation and real-time analytics. I created secure RESTful APIs and automated infrastructure deployments using Terraform and CloudFormation, optimizing resource provisioning and system performance. Additionally, I implemented event-driven architectures with Kafka and containerized applications with Docker and Kubernetes, ensuring reliability and scalability across distributed systems.


Responsibilities:
Directed the analysis, design, development, and testing phases of the SDLC using Agile (Scrum) and Design-Driven Development, improving project efficiency and delivery timelines.
Designed and developed complex data engineering applications using Java and Angular 14, meeting business requirements with optimized performance and scalability.
Architected robust, scalable data systems for real-time processing using AWS services, including EC2, S3, and RDS, to manage workloads with low latency and high reliability.
Developed backend services and ETL pipelines using Python, Databricks, and Apache Spark, ensuring efficient data transformation and seamless integration with cloud infrastructure.
Developed and optimized Apache Spark-based ETL pipelines to process large-scale datasets, enabling efficient data transformation and real-time analytics.
Designed secure, robust APIs with Spring Boot, incorporating OAuth2 and JWT-based authentication, ensuring compliance with modern security standards.
Leveraged Java 11's improved garbage collection algorithms (G1GC and ZGC) to optimize memory management in high-performance applications.
Implemented reactive programming paradigms in Spring Boot using Reactor and WebFlux, improving scalability for asynchronous and event-driven applications (see the sketch after this list).
Streamlined cloud-native development with Spring Boot Actuator for monitoring and managing applications in production environments.
Migrated legacy Java applications to modern Spring Boot and Java 8/11, introducing Lambda Expressions and Streams API for cleaner and more efficient code.
Implemented performance tuning strategies (e.g., partitioning, caching, resource configuration) for Spark applications to ensure low latency, high throughput, and scalability.
Integrated Apache Spark with cloud platforms (e.g., Databricks, AWS EMR) and file formats like Parquet, streamlining data ingestion, transformation, and analysis workflows.
Designed and implemented RESTful APIs to enable secure, real-time communication across platforms, integrating Parquet file formats for optimized data storage and retrieval.
Utilized AWS services like Elastic Load Balancing (ELB) and Aurora to manage distributed application servers and data storage, ensuring fault tolerance and scalability.
Automated infrastructure deployment using Terraform and CloudFormation, enabling consistent provisioning and scaling of AWS resources.
Orchestrated real-time data processing solutions with Kubernetes and Azure Kubernetes Service (AKS), ensuring robust scaling and fault-tolerant operations.
Implemented event-driven architectures with Apache Kafka, enabling high-throughput messaging and efficient data stream processing across systems.
Built CI/CD pipelines using Jenkins, AWS CodePipeline, and CodeDeploy to automate deployment workflows, reducing deployment time by 30%.
Created and maintained data workflows using Databricks notebooks, integrating Scala and Python for scalable data transformations and analytics.
Managed and optimized NoSQL and SQL databases, including Cassandra, MongoDB, Postgres, and DB2, for efficient storage and querying of large datasets.
Designed secure access control policies using AWS IAM, ensuring compliance with security standards and safeguarding sensitive data.
Automated routine tasks and deployments using Python scripts, improving development workflows and integrating data engineering tools for backend logic optimization.
Processed and analyzed large-scale geographic data using Parquet file formats and Google Protocol Buffers (PBF) for efficient storage and processing.
Built interactive, dynamic user interfaces using React.js and Redux, enhancing performance and user experience in data-heavy applications.
Leveraged Databricks monitoring tools to track pipeline performance and ensure consistent data delivery within SLAs.
Enhanced deployment reliability by containerizing applications with Docker and managing environments across Linux and UNIX servers.
Conducted unit testing with JUnit and Mockito to ensure code quality and reliability, adhering to best practices in development workflows.
Implemented rigorous code reviews and collaborated with cross-functional teams to drive adherence to high standards of data engineering and cloud computing practices.
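
A small sketch of the reactive Spring WebFlux style referenced in this list; the Claim payload and endpoint are hypothetical, and a Java record is used for brevity even though the engagement itself centered on Java 8/11:

    import org.springframework.web.bind.annotation.GetMapping;
    import org.springframework.web.bind.annotation.PathVariable;
    import org.springframework.web.bind.annotation.RestController;
    import reactor.core.publisher.Mono;

    // Hypothetical claim payload, invented for this example
    record Claim(String id, String status, double amount) {}

    @RestController
    class ClaimController {
        @GetMapping("/claims/{id}")
        Mono<Claim> getClaim(@PathVariable String id) {
            // Non-blocking response; a real service would defer to a reactive repository
            return Mono.just(new Claim(id, "PAID", 125.50));
        }
    }
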
Environment: Agile, Java, Angular 14, J2EE, Spring, Hibernate, HTML5, CSS3, JavaScript, React, Redux, REST APIs, Oracle, DB2, MongoDB, PostgreSQL, Cassandra, Docker, Kubernetes, Azure, Apache Kafka, UNIX, Linux, Jenkins, JIRA.

Client: BMO Harris / FIS Global    Jan 2014 – Sep 2017
Role: Java Full Stack Developer
Project Description:
This system was developed for real-time ATM withdrawals, balance inquiries, deposits, and funds transfers. It is a real banking system: transaction details are made available to end users in near real time, so settlement positions are available minutes after the end of a business day. The system focuses on transaction management, cash management, and customer bank account maintenance, with reports that provide a detailed view of account status. CONNEX HP Advantage is a well-known EFT product running on the HP NonStop platform, offering ATM and POS device drivers and transaction switching. Major components of the product include the Terminal Handler (TH), Communication Handler (CH), Processor Interface (PI), Authorization Processor (AP), Primary Message Control (PMC), and CONNEX Environment Database (CED).

As a Java Full Stack Developer for FIS Banking Product team, I spearheaded the development of modules for card management, underwriting, and payment processing, enhancing operational efficiency and user experience. Leveraging Java with modern frameworks, I built scalable microservices and integrated self-service portals, increasing customer engagement by 25%. I ensured seamless integration with third-party systems, regulatory compliance, and secure data exchange while optimizing database performance using Oracle and Cassandra. Additionally, I implemented CI/CD pipelines and automated test suites, reducing deployment time by 40% and ensuring robust, high-quality solutions for insurance workflows.
Responsibilities:
Led the analysis, design, development and testing phases of the SDLC using Agile methodology (Scrum) and Design-Driven Development, improving project delivery timelines by 30%.
Developed modules for creating, updating and managing card processing, enhancing operational efficiency and user experience.
Implemented features for card activation, fraud assessment, and settlement calculation utilizing both Java and Python, resulting in streamlined processes and reduced error rates.
Built and maintained systems for fraud navigator rules, evaluation, and settlement, ensuring compliance with regulatory standards.
Integrated fraud detection algorithms into applications, enhancing security measures and reducing fraudulent claims.
Developed CRM functionalities to manage customer profiles, interactions and communication, improving customer satisfaction and retention.
Implemented self-service portals for customers to view policies, submit claims and make payments, increasing customer engagement by 25%.
Integrated with third-party systems, including payment gateways, external databases and regulatory bodies, to ensure seamless operations.
Ensured secure and reliable data exchange between systems, safeguarding sensitive information.
Designed and developed new microservices components using Dropwizard within a global architecture to support both UK and US market strategic business flows.
Utilized UNIX and Linux for project deployment in production environments ensuring smooth transitions.
Managed databases, including Oracle, Cassandra and SQL Server, optimizing data storage and retrieval.
Developed user interfaces using React.js, incorporating props, states, keys, refs and React Router for client-side navigation, resulting in dynamic single-page applications (SPAs).
Set up and maintained CI/CD pipelines using Jenkins for build and release automation, enhancing deployment efficiency by 40%.
Utilized DB2 for data persistence of user operations, such as form submissions and policy selections.
Developed reporting tools to track user transactions using shell scripting and DB2, providing insights into operational performance.
Designed middleware and microservices to ensure seamless communication between Java-based and Ruby-based systems.
Planned and executed data migration strategies for transitioning legacy insurance data to new systems, maintaining data integrity and consistency.
Troubleshot Linux errors during implementation, ensuring minimal downtime and efficient operations.
Developed and executed automated test suites using Selenium and JUnit (Java) to enhance code reliability (see the sketch after this list).
Conducted manual testing to validate complex authorization-based scenarios and workflows, ensuring comprehensive test coverage.
Identified, analyzed, and resolved defects in Java codebases, improving software quality.
Conducted root cause analysis for recurring issues and implemented preventive measures to reduce future occurrences.
Designed and maintained database schemas to support insurance-specific data structures, facilitating data management.
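
A brief sketch of the JUnit-based automated testing described in this list; SettlementCalculator is a hypothetical helper invented purely for illustration, not actual product code:

    import static org.junit.Assert.assertEquals;

    import org.junit.Test;

    // Hypothetical settlement helper, invented purely for this example
    class SettlementCalculator {
        private final double feeRate;

        SettlementCalculator(double feeRate) { this.feeRate = feeRate; }

        double netAmount(double gross) {
            // Net settlement = gross amount minus the interchange fee
            return gross - gross * feeRate;
        }
    }

    public class SettlementCalculatorTest {
        @Test
        public void deductsInterchangeFeeFromGross() {
            SettlementCalculator calc = new SettlementCalculator(0.015);
            assertEquals(98.5, calc.netAmount(100.0), 0.0001);
        }
    }
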
Environment: Agile, Java 1.8, J2EE, HTML5, CSS3, Microservices, Bootstrap, Spring Tool Suite, Kafka, JavaScript, Angular, Spring MVC, Spring JPA, Swagger, MongoDB.

Client: West Corporation / MRR Embedded Systems    Jun 2010 – Dec 2013
Role: Developer Analyst
Project Description:
As a Developer Analyst at MRR Embedded Systems, I contributed to the development of security-focused and embedded applications that restricted server-location access to authorized users. I developed and maintained full-stack applications, designed RESTful APIs, and implemented robust microservices to support seamless integration with security ecosystems. By optimizing database performance and implementing caching strategies, I ensured high system responsiveness and scalability. Additionally, I enhanced application security and conducted comprehensive automated and manual testing to deliver reliable, secure solutions aligned with industry standards.
Responsibilities:
Collaborated in an Agile development environment, participating in daily SCRUM meetings to enhance team communication and project efficiency.
Developed and maintained full-stack applications utilizing C# for backend services and Java technologies for various components, improving overall system functionality.
Designed and implemented RESTful APIs and web services in C#, facilitating seamless integration with multiple front-end applications.
Authored automated tests using RSpec, Minitest (Ruby) and JUnit (Java), enhancing code quality and reliability by identifying potential issues early in the development process.
Conducted manual testing for complex scenarios and edge cases ensuring comprehensive test coverage.
Profiled and optimized application performance, addressing bottlenecks in both Ruby and Java codebases to enhance user experience.
Implemented caching strategies and efficient algorithms to improve application performance significantly (see the sketch after this list).
Established security measures and best practices in both Ruby and Java applications, safeguarding sensitive data and ensuring compliance with industry standards.
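
As a sketch of the caching strategies mentioned in this list (illustrative only, assuming a simple memoization use case):

    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;
    import java.util.function.Function;

    // Minimal thread-safe memoizing cache; the loader computes a value on first access
    final class MemoCache<K, V> {
        private final Map<K, V> cache = new ConcurrentHashMap<>();
        private final Function<K, V> loader;

        MemoCache(Function<K, V> loader) {
            this.loader = loader;
        }

        V get(K key) {
            // computeIfAbsent loads and stores the value atomically on first access
            return cache.computeIfAbsent(key, loader);
        }
    }
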
Environment: Web Services, Git, Jenkins, Jira, Agile, Windows, Multithreading, Spring MVC, MySQL, Spring ORM, Hibernate, Singleton, Data Access Objects, REST API, Ruby.
