RAGHU RAM - Business Data Analyst / Data Analyst
[email protected]
Location: Dallas, Texas, USA
Relocation: yes
Visa: H-1B
Raghu Ram

Contact: +1 (919)-399-9956

Email:[email protected]



Professional Summary

Business Data Analyst with 9+ years of experience in leveraging data to optimize business processes and drive growth. Proven ability to translate business requirements into technical solutions, with expertise in data analysis, data warehousing, and Agile methodologies.

Proficient in data cleaning, standardization, and transformation, coupled with experience in ETL processes for comprehensive and accurate data preparation.

In-depth knowledge of banking data sets, including products (loans, credit cards), accounts (savings, checking), customers, and transactions (deposits, withdrawals, wire transfers).

Automated pipeline tasks using Python scripts and Airflow workflows, reducing operational costs. Optimized batch processing with PySpark for high performance and scalability.

Proficient in Python, SQL, R, and SAS for data extraction and analysis, utilizing Jupyter Notebook for data modeling and statistical analysis.

Successfully managed Agile and Scrum projects, ensuring adherence to project timelines, KPIs, and performance metrics.

Collaborated with stakeholders and cross-functional teams, playing a pivotal role in executing Business Continuity Plans and leading change management initiatives.

Consistently delivered solutions that enhance team productivity, improve customer value, and align with organizational goals.

Effectively tracked issues and managed change requests using JIRA and Asana, ensuring smooth project execution and timely resolution of system- and process-related challenges.

Led Value Stream Mapping workshops, contributed to Business Process Modeling and Re-engineering, and authored comprehensive BRDs and FRDs to support product development and operational efficiency.

Advanced Tableau, Power BI, and Amazon QuickSight skills for creating interactive dashboards and visualizations, uncovering insights that drive strategic decision-making.

Experienced with the BABOK Guide, Agile project management, the Software Development Life Cycle (SDLC), and Waterfall methodologies, leading teams in developing user stories, epics, and system requirement documents. Successfully translated product vision into actionable development using a well-defined project roadmap.

Collaborated with UI/UX teams to enhance design and usability while also supporting API documentation and integrations for Web Service APIs, ensuring efficient data exchange and functionality across applications.

Proven track record of collaborating effectively with cross-functional teams (Data Engineers, Product Managers, and Business Partners) to deliver impactful data-driven solutions.

Expertise in implementing and managing A/B and multivariate tests to optimize user experience.
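The A/B testing noted above typically reduces to a significance check on two conversion rates. A minimal sketch as a two-proportion z-test, where the conversion counts are hypothetical and Python's standard library stands in for a statistics package:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for an A/B experiment.

    Returns the z statistic comparing the conversion rates of variants A and B.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical experiment: variant B converts 120/1000 vs. A's 100/1000.
z = two_proportion_z(100, 1000, 120, 1000)
print(round(z, 2))  # → 1.43 (below 1.645, so not significant one-sided at 5%)
```

The normal approximation is assumed throughout; a multivariate test would repeat the comparison per variant with a multiple-testing correction.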


Technical Skills

Data Visualization & Business Intelligence: Tableau Desktop, Tableau Online, Tableau Server, Power BI, Tableau Prep, Google Analytics, Qlik, IBM Cognos

ETL & Data Integration: SSIS, Informatica PowerCenter, Alteryx Designer

API Development: Postman, Swagger/OpenAPI, RESTful APIs, SOAP APIs

Programming & Scripting: Python, SQL, SAS, PySpark

Cloud Platforms: Snowflake, AWS (S3, EC2, Lambda), Redshift, Azure Databricks

Data Analysis & BPM: Jupyter Notebook, SAS, Excel (Pivot Tables, VLOOKUP), Erwin, Microsoft Visio, Miro

Database Management: SQL Server, Oracle, MySQL, NoSQL, T-SQL

Project Management & Methodologies: Agile, Scrum, Waterfall

Collaboration Tools: Confluence, Slack


CERTIFICATIONS

Business Analysis Professional Certificate by Microsoft and LinkedIn

Lean Six Sigma Yellow Belt Certificate by KPMG

Certified Anti-Money Laundering Specialist (CAMS), issued by ACAMS

Python for Data Science, AI & Development by IBM from Coursera


Education

Texas A&M University-Commerce

Master of Science in Business Analytics

SRM University

Bachelor of Technology in Electronics and Communication Engineering


Professional Experience


Client: Walmart, Dallas, TX, USA March 2024 - Present

Role: Business Data Analyst

Responsibilities:

Conducted business reviews with stakeholders to identify improvement areas and implemented data-backed recommendations.

Designed and maintained data models and ETL pipelines to integrate patient and billing data from various sources.

Optimized export forecasting and maintained optimal shelf life, reducing carrying costs by 15% and increasing turnover by 16.8% using Power BI dashboards.

Contributed to building models to resolve capacity breaches and optimize replenishment processes across FMCG categories.

Maintained automation of the Tableau dashboard refresh process, improving reporting for daily, weekly, and monthly compliance reviews.

Leveraged Redshift and Redshift Spectrum to design and implement an automated data pipeline for processing and analyzing large datasets, enabling efficient data loading and transformation from various sources, including data residing in Amazon S3.

Reduced data processing time by 50% by optimizing Redshift table design (distribution keys, sort keys) and query performance and by utilizing Redshift materialized views to pre-compute and store frequently accessed data, enabling faster turnaround for critical analyses and reporting.

Acted as a liaison between technical teams and business stakeholders, translating technical jargon into actionable business insights.

Built interactive Tableau dashboards to provide real-time insights into patient care, billing trends, and revenue cycle management.

Developed and maintained detailed user stories and acceptance criteria in JIRA, translating business requirements into actionable tasks for IT teams and ensuring successful delivery in an Agile environment.

Enhanced data accuracy by eliminating 40% of redundancies, improving the reliability of insights delivered to stakeholders.

Conducted risk assessments, identifying vulnerabilities in transaction processing and customer behavior using Alteryx for data mapping and analysis.

Developed business reports with Power BI, increasing strategic planning accuracy by 12%.

Improved compliance accuracy by 15% through precise Schedule B and ECCN classification for US export shipments.

Reduced compliance risks by conducting Restricted Party Screening for export transactions.

Streamlined SOPs and policies, increasing operational efficiency through Alteryx-based automation of compliance workflows.

Environment/Tools: Confluence, SQL, Power BI, Informatica PowerCenter, Tableau, Python, Excel, Workday, Agile, Scrum, SharePoint
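The inventory metrics cited in this role (turnover, carrying cost) reduce to simple formulas. A minimal sketch in plain Python, where the dollar figures and the 25% carrying rate are hypothetical:

```python
def inventory_turnover(cogs, beginning_inventory, ending_inventory):
    """Inventory turnover = cost of goods sold / average inventory value."""
    avg_inventory = (beginning_inventory + ending_inventory) / 2
    return cogs / avg_inventory

def carrying_cost(avg_inventory_value, carrying_rate=0.25):
    """Annual carrying cost as a fraction of average inventory value.

    The 25% default is a hypothetical rule-of-thumb rate, not a Walmart figure.
    """
    return avg_inventory_value * carrying_rate

# Hypothetical year: $12M COGS against inventory falling from $2.5M to $1.5M.
turns = inventory_turnover(12_000_000, 2_500_000, 1_500_000)
print(turns)  # → 6.0 turns per year
```

A dashboard tracking these two numbers over time is enough to surface the carrying-cost and turnover movements described above.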



Company: Revolut India Operations Pvt. (Bank) Limited, India Jan 2022 - Nov 2022

Role: FCC Business Data Analyst

Responsibilities:

Collaborated with diverse teams at Revolut to ensure compliance with regulatory standards while enhancing AML monitoring systems using Fiserv's AML Risk Manager. This strategic partnership effectively addressed client needs and facilitated seamless integration into existing workflows.

Utilized Redshift's concurrency scaling feature to handle peak loads during high-volume trading periods, ensuring consistent performance of the financial risk analysis platform.

Developed Qlik dashboards to visualize key AML metrics, including transaction monitoring, exception handling, and suspicious activity trends, enabling compliance teams to identify high-risk patterns effectively.

Worked as a secondary contact in the system design process. Supported the primary API developers by documenting RESTful APIs, outlining key functionalities, endpoints, and integration points for seamless data exchange between FCC systems.

Demonstrated a strong understanding of data integrity and privacy practices within the financial sector, ensuring adherence to stringent risk management frameworks and regulatory requirements and contributing to robust compliance processes.

Managed compliance and AML monitoring for a range of banking products, including Payments, Transfers, Current Accounts, Savings Accounts, Credit Cards, Corporate Loans, and Investment Platforms, ensuring regulatory adherence (OFAC, FinCEN) and mitigating risks through advanced data analysis.

Designed and implemented complex ETL processes using Informatica PowerCenter and Snowflake, optimizing data integration from diverse sources to support regulatory reporting and improve data accuracy in compliance assessments.

Developed interactive dashboards and reports in Tableau Server and Power BI, visualizing KPIs and compliance metrics.

Facilitated data-driven decision-making across executive management by providing clear, actionable insights into regulatory compliance and operational performance.

Applied Scrum methodologies to streamline processes, safeguard sensitive information, and maintain compliance with regulatory standards. Led successful deployments of data analytics solutions using Agile methodologies and JIRA, enhancing operational efficiency and compliance with AML regulations.

Partnered with UX/UI designers to improve user interfaces, enhancing data accessibility and compliance-related visualization tools, resulting in more intuitive and user-friendly solutions.

Utilized Selenium for automated testing, ensuring the reliability and performance of data solutions. Managed UAT processes to validate new features, ensuring they met business and regulatory requirements.

Environment/Tools: SQL, Power BI, Python, Tableau, Informatica PowerCenter, Snowflake, Agile, Scrum, ERP, ECRM, UI/UX
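A minimal sketch of the kind of transaction-monitoring rule behind AML work like the above: flagging repeated cash deposits just under a reporting threshold (structuring). The $10,000 threshold echoes the usual reporting limit, but the parameters, field names, and sample data are hypothetical and are not Fiserv's actual rule set:

```python
from collections import defaultdict

def flag_structuring(transactions, threshold=10_000, near=0.9, min_count=3):
    """Flag customers with repeated deposits just under the reporting threshold.

    `transactions` is a list of (customer_id, amount) tuples; a customer is
    flagged once `min_count` deposits fall in [near * threshold, threshold).
    """
    near_hits = defaultdict(int)
    for customer, amount in transactions:
        if near * threshold <= amount < threshold:
            near_hits[customer] += 1
    return {c for c, n in near_hits.items() if n >= min_count}

# Hypothetical activity: customer C1 makes three deposits just under $10,000.
txns = [("C1", 9_500), ("C1", 9_800), ("C1", 9_200), ("C2", 500), ("C2", 12_000)]
print(flag_structuring(txns))  # → {'C1'}
```

Real monitoring adds time windows, customer risk scores, and exception handling; this shows only the core threshold logic feeding the dashboards described above.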



Company: Infosys Limited, Hyderabad, India April 2021 - December 2021

Role: Associate Consultant

Responsibilities:

Built automated ETL processes using Python (pyodbc, pymysql) to integrate data sources, enabling real-time analysis seamlessly.

Built and optimized data pipelines to handle large volumes of claims and eligibility data, ensuring accurate reporting and analytics for healthcare providers, insurers, and other stakeholders.

Designed and implemented scalable data models using DBT, automating the transformation of raw data into actionable insights, while improving data quality and processing speed for healthcare claims, benefits, and eligibility data.

Integrated cloud platforms such as Azure and Snowflake to manage large volumes of structured and unstructured healthcare data.

Utilized Postman and SOAP UI for API testing and validation, ensuring robustness, performance, and adherence to security standards.

Facilitated data reconciliation processes post-migration, ensuring minimal business disruption and full compliance with data integrity standards.

Designed and developed a workflow for data migration from Centene to the CVS/Aetna data warehouse, and helped create views and complex stored procedures to reduce data retrieval time on Aetna systems.

Performed Data validations after the migration and created reports on data distribution and documentation on data mapping.

Collected requirements and prepared data models to develop automated Tableau dashboards visualizing medical-claims metrics, and managed reports on Tableau Server, reducing working time by 80%.

Developed workflows with scheduled refreshes in Tableau Prep to clean data across different data sources, then used the output in Tableau Desktop to create stories.

Automated Python scripts for data pulls across multiple databases, followed by data cleaning, manipulation, and analysis.

Worked cross-functionally with IT and Provider teams to automate the reporting process in Tableau. This automation has significantly reduced the daily report generation time by 85%.

Reported turnaround time for cases/tickets and identified delays, initiating conversations between representatives, Medicaid providers/practitioners, and provider analysts within the organization.

Developed interactive dashboards with Power BI and Tableau for granular analysis of patterns and trends, facilitating better decision-making.

Collaborated effectively with internal stakeholders to assess and enhance data-driven processes, leveraging data analysis to improve team decision-making.

Managed project timelines and resources using Agile methodologies and JIRA, ensuring timely delivery and effective collaboration.

Partnered with QA to create test cases and actively participated in UAT system testing.

Environment/Tools: SQL, Power BI, Python, Tableau, Informatica PowerCenter, Snowflake, DBT, Agile, Scrum, ERP
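The automated Python ETL pattern described in this role (driver connections such as pyodbc/pymysql feeding a warehouse) can be sketched with the standard library's sqlite3 standing in for the real database drivers; the table and column names are hypothetical:

```python
import sqlite3

def run_etl(conn):
    """Extract raw claims, transform (normalize status, drop bad rows), load."""
    cur = conn.cursor()
    # Extract: raw claims as they might arrive from a source system.
    rows = cur.execute("SELECT claim_id, status, amount FROM raw_claims").fetchall()
    # Transform: trim and uppercase status, drop non-positive amounts.
    clean = [(cid, status.strip().upper(), amt)
             for cid, status, amt in rows if amt > 0]
    # Load: into the reporting table.
    cur.executemany(
        "INSERT INTO claims(claim_id, status, amount) VALUES (?, ?, ?)", clean)
    conn.commit()
    return len(clean)

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_claims(claim_id TEXT, status TEXT, amount REAL);
    CREATE TABLE claims(claim_id TEXT, status TEXT, amount REAL);
    INSERT INTO raw_claims VALUES ('A1', ' paid ', 120.0), ('A2', 'denied', -5.0);
""")
print(run_etl(conn))  # → 1 (only the valid row is loaded)
```

With pyodbc or pymysql the only change is the connection line; the extract-transform-load shape stays the same.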



Company: Deloitte USI, Hyderabad, India Feb 2019 - Mar 2021

Role: Business Data Analyst

Responsibilities:

Applied Agile methodologies to improve process workflows, reducing process cycle time and increasing on-time delivery performance. Tools used: JIRA, Trello.

Utilized Tableau and Google Data Studio to analyze freight and pricing data, improving anomaly detection accuracy and enhancing pricing strategy.

Enhanced risk detection by 15% through data analysis in Tableau, identifying suspicious activity patterns and ensuring regulatory adherence.

Enhanced data accuracy by 40% through data cleansing and deduplication techniques, using Teradata's Quality and Transformations capabilities.

Developed data-driven solutions with cross-functional teams, leading to a 23% increase in high-risk transaction identification.

Managed user acceptance testing (UAT) processes to ensure successful implementation of AML compliance solutions.

Designed and implemented efficient stored procedures in SQL Server to streamline data access during a warehouse migration. This resulted in improved operational efficiency.

Prepared detailed weekly and monthly reports for leadership, facilitating data-driven decision-making.

Applied UML diagrams to visualize and document AML process flows, including data and compliance workflows.

Performed gap analysis of client requirements and generated workflow processes, flow charts, and relevant artifacts.

Analyzed and documented Business Requirement Documents (BRDs), Functional Requirement Documents (FRDs), Functional Specification Documents, and System Requirement Documents using UML methodologies, Six Sigma techniques, and Caliber RM.

Implemented SSIS and Microsoft Visio for ETL process visualization within SQL Server Integration Services, ensuring seamless data integration and regulatory compliance.

Provided AML compliance training to new hires, ensuring knowledge transfer and team competency.

Conducted thorough data analysis using SQL queries and Excel VLOOKUP to support AML investigations.

Utilized Python for data manipulation and automation tasks, improving operational efficiency in data processing.

Collaborated with IT teams to enhance data security measures, ensuring compliance with data privacy regulations.

Performed a thorough evaluation of the existing AML transaction monitoring procedures, utilizing SQL and Excel for data retrieval and analysis. This assessment revealed areas of inefficiency and discrepancies in data integration and alert creation, resulting in a 25% decrease in false positive alerts.

Applied advanced data-profiling methods, leveraging Python for automation and statistical analysis, resulting in a 15% improvement in identifying previously unnoticed suspicious transactions within the AML system.

Participated in cross-functional meetings to discuss and implement strategies for improving AML processes and compliance effectiveness.

Environment/Tools: SQL, MySQL, T-SQL, Python, Tableau, Informatica PowerCenter, Apache Hadoop, Apache Spark, Big Data, Agile, Scrum, Data Management
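The Python data-profiling approach described in this role can be sketched with the standard library's `statistics` module: establish a baseline and flag amounts far from it. The sigma multiplier and sample values are hypothetical, not the actual AML thresholds used:

```python
import statistics

def flag_outliers(amounts, sigmas=3.0):
    """Return amounts more than `sigmas` standard deviations from the mean."""
    mean = statistics.fmean(amounts)
    stdev = statistics.pstdev(amounts)  # population stdev of the observed sample
    return [a for a in amounts if abs(a - mean) > sigmas * stdev]

# Hypothetical daily transaction amounts with one anomalous spike.
amounts = [100, 102, 98, 101, 99, 100, 103, 97, 100, 101, 99, 1000]
print(flag_outliers(amounts))  # → [1000]
```

In practice robust statistics (median absolute deviation) behave better when outliers inflate the standard deviation; the mean/stdev version here shows only the core profiling idea.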



Company: HDFC AMC, Hyderabad, India June 2018 - Feb 2019

Role: Business Analyst

Responsibilities:

Improved client segmentation by 15%: utilized SQL to extract and segment client data by investment preferences and risk tolerance at HDFC. Further analysis with advanced Excel techniques revealed hidden spending patterns related to asset allocation, boosting targeted-marketing effectiveness.

Analyzed CRM data (through Salesforce) at HDFC to segment clients and personalize marketing, driving a 22% increase in cross-selling rates. Additionally, utilized R for market basket analysis to identify upsell and cross-sell opportunities.

Created functional requirement documents (FRDs) for system enhancements, aligning technical deliverables with business objectives.

Eliminated manual report-generation time through automation and optimization of reporting processes.

Led training initiatives on Cams Edge and HDFC Chatbot at HDFC, equipping teams to handle client inquiries efficiently. This yielded a 10% reduction in transaction processing time and a 5% increase in customer satisfaction scores related to online transactions.

Acted as the primary interface with the programmers and the business customer to perform User Acceptance Testing (UAT) of Oracle Applications Financials.

Designed interactive Tableau dashboards at HDFC, simplifying complex investment data and fostering a 27% increase in client inquiries and follow-up actions on investment opportunities.

Utilized SQL and Python to analyze client data at HDFC, identifying and rectifying inconsistencies in investment product performance across the allotted region.

Environment/Tools: SQL, MySQL, T-SQL, Python, Tableau, Informatica PowerCenter, Apache Hadoop, Apache Spark, Big Data, Agile, Scrum, Data Management, Salesforce, Oracle
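The SQL-driven segmentation by investment preference and risk tolerance described above can be sketched as a simple rule-based bucketing in Python; the segment names, thresholds, and sample clients are all hypothetical:

```python
def segment_client(age, equity_pct):
    """Assign a client to a hypothetical risk-tolerance segment.

    `equity_pct` is the share of the portfolio held in equities (0.0-1.0);
    the cutoffs below are illustrative, not HDFC's actual criteria.
    """
    if equity_pct >= 0.7 and age < 45:
        return "aggressive"
    if equity_pct >= 0.4:
        return "balanced"
    return "conservative"

# Hypothetical clients: (client_id, age, equity share of portfolio).
clients = [("C1", 30, 0.85), ("C2", 52, 0.55), ("C3", 61, 0.20)]
segments = {cid: segment_client(age, eq) for cid, age, eq in clients}
print(segments)  # → {'C1': 'aggressive', 'C2': 'balanced', 'C3': 'conservative'}
```

Each segment then gets its own marketing treatment, which is what drives the targeting lift described above.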



Company: Temenos India Private Limited, Hyderabad, India July 2016 - May 2018

Role: Data Analyst

Responsibilities:

Generated SSIS packages to load data from source systems into the data warehouse database.

Acquired data from Oracle, CSV, Excel, and text files and loaded them into destination tables.

Created packages using complex transformations and control flow tasks such as Script, File System, Bulk Insert, FTP, Send Mail, Merge, Merge Join, and Lookup.

Constructed logging files & configuration files to deploy packages from the development to the production environment.

Created jobs and schedules to automate the process of running packages in SQL Server.

Developed complex summary reports using SSRS to show region- and channel-wise sales.

Automated the delivery of reports using subscriptions in SSRS.

Generated complex Transact-SQL (T-SQL) queries and subqueries, leveraging SQL Server constraints (primary keys, foreign keys, defaults, check, and unique) for data integrity.

Applied analytical and troubleshooting skills to tune and optimize performance, achieving rapid issue resolution in large-scale production environments located globally.

Environment/Tools: SQL, Power BI, Python, Tableau, Apache Hadoop, Apache Spark, Big Data, Qlik, Informatica PowerCenter
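The constraint-backed T-SQL work described in this role can be illustrated with the standard library's sqlite3, since SQLite's constraint syntax mirrors SQL Server's for this subset (primary key, default, check, unique); the schema is hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE sales(
        sale_id   INTEGER PRIMARY KEY,            -- primary key constraint
        region    TEXT NOT NULL,
        channel   TEXT NOT NULL DEFAULT 'online', -- default constraint
        amount    REAL CHECK (amount > 0),        -- check constraint
        order_ref TEXT UNIQUE                     -- unique constraint
    )
""")
conn.execute(
    "INSERT INTO sales(region, amount, order_ref) VALUES ('South', 250.0, 'O-1')")

# The check constraint rejects non-positive amounts.
try:
    conn.execute(
        "INSERT INTO sales(region, amount, order_ref) VALUES ('North', -5.0, 'O-2')")
except sqlite3.IntegrityError as e:
    print("rejected:", e)

# A subquery: each row's share of total sales, region- and channel-wise.
row = conn.execute(
    "SELECT region, amount / (SELECT SUM(amount) FROM sales) FROM sales"
).fetchone()
print(row)  # → ('South', 1.0)
```

The same DDL pattern, written in T-SQL against SQL Server, is what keeps bad rows out of the warehouse before reports are built on top of it.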



Company: Dot Scripts Technologies, Hyderabad, India May 2015 - June 2016

Role: Data Analyst (Financial Services, E-commerce, Health)

Responsibilities:

Developed and hosted web service endpoints using both SOAP and REST protocols on IIS to securely facilitate data extraction and streamline access workflows for ETL processes in SQL Server Integration Services (SSIS), optimizing data retrieval for reporting and analysis in financial, e-commerce, and healthcare projects.

Generated SSIS packages to load data from various sources including Oracle, CSV, Excel, and text files into the data warehouse, utilizing complex transformations and control flow tasks such as Script, File System, Bulk Insert, FTP, Send Mail, Merge, Merge Join, and Look Up.

Constructed logging and configuration files to ensure smooth deployment of packages from development to production environments and automated package executions in SQL Server through scheduled jobs.

Designed comprehensive summary reports in SQL Server Reporting Services (SSRS) to track region- and channel-wise sales, with automated report delivery using SSRS subscriptions.

Wrote and optimized complex T-SQL queries and sub-queries, leveraging SQL Server constraints (primary keys, foreign keys, defaults, check, and unique) for data integrity and performance.

Applied strong analytical and troubleshooting skills to tune and optimize performance, achieving rapid issue resolution in large-scale, high-demand production environments globally.

Environment/Tools: SQL, Power BI, Python, Tableau, Apache Hadoop, Apache Spark, Big Data, Qlik, Informatica PowerCenter



Company: Adaps IT Private Limited, Hyderabad, India Nov 2014 - April 2015

Role: Data Analyst

Responsibilities:

Participated in all phases of the project life cycle, including data collection, data mining, data cleaning, model development, validation, and report creation.

Utilized the K-Means clustering technique to identify outliers and classify unlabeled data.

Developed segmentation models using K-means Clustering to discover new segments of users.

Created Power BI dashboards using stacked bars, bar graphs, scatter plots, and geographical maps.

Evaluated statistical information to determine risk.

Interacted with business analysts, SMEs, and data architects to understand business needs and functionality for various project solutions.

Carried out specified data processing and statistical techniques such as sampling, estimation, and hypothesis testing.
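The K-Means segmentation described in this role can be sketched in pure Python for one-dimensional data; the sample values and k are hypothetical, and production work would use a library such as scikit-learn:

```python
def kmeans_1d(values, k=2, iters=20):
    """Naive 1-D k-means: returns final centroids and each value's cluster index."""
    # Spread initial centroids across the sorted range (naive initialization).
    centroids = sorted(values)[:: max(1, len(values) // k)][:k]
    for _ in range(iters):
        # Assignment step: each value goes to its nearest centroid.
        labels = [min(range(k), key=lambda c: abs(v - centroids[c]))
                  for v in values]
        # Update step: each centroid becomes the mean of its cluster.
        for c in range(k):
            members = [v for v, lab in zip(values, labels) if lab == c]
            if members:
                centroids[c] = sum(members) / len(members)
    return centroids, labels

# Hypothetical spend values with two clear groups (low vs. high spenders).
values = [10, 12, 11, 90, 95, 92]
centroids, labels = kmeans_1d(values, k=2)
print(sorted(centroids))  # two centroids, near 11 and 92.3
```

The same assign-then-update loop, run over multi-dimensional feature vectors, is what separates unlabeled users into the segments described above; points far from every centroid serve as outlier candidates.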