Shivany - Senior Data Analyst
[email protected]
Location: Garner, North Carolina, USA
Relocation: Open to relocating anywhere in the USA
Visa: H1B
Resume file: Shivany Data Analyst M_1745525802429.docx
Shivany
Power BI Developer / Sr. Data Analyst | [email protected] | 954-998-2358

Professional Summary:
- Around 8 years of experience in software design, development, integration, implementation, and maintenance of Business Intelligence applications and related database platforms, including 4+ years of data visualization using Power BI and Tableau.
- Data Analyst with a solid understanding of data modeling and evaluating data sources, and a strong understanding of Data Warehouse/Data Mart design, ETL, BI, client/server, and cloud applications.
- Extensive experience in data analysis, data migration, data cleansing, transformation, and integration.
- Extensively developed Business Requirements Documents, Functional Design Documents, and Technical Design Documents.
- Excellent hands-on experience building predictive models using classification and regression algorithms in R and Python.
- Excellent knowledge of Health Insurance Portability and Accountability Act (HIPAA) standards and compliance issues.
- Developed processes for ingesting and importing data from various sources into MongoDB databases, ensuring data quality, consistency, and integrity.
- Keen interest in the newer technology stack that Google Cloud Platform (GCP) adds.
- Experience in aggregating data and configuring reports from different data sources using data blending.
- Experience working with ETL teams on data migration using SQL, SQL Azure, Azure Storage, Azure Data Factory, SSIS, and PowerShell.
- Experience in testing and writing SQL statements - stored procedures, functions, triggers, and packages; proficient in Snowflake, Teradata, SQL Server, Oracle, etc.
- Proficient in building Analysis Services reporting models and developing visual reports, KPI scorecards, and dashboards using Power BI Desktop.
- Able to work in parallel across both GCP and Azure clouds.
- Experience in writing complex SQL queries using stored procedures, common table expressions (CTEs), and temporary tables to support Power BI.
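The CTE pattern mentioned above can be sketched in Python using the standard-library sqlite3 module; the sales table, its columns, and the values are purely illustrative assumptions, not taken from any project on this resume.

```python
import sqlite3

# Hypothetical sales table for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("East", 100.0), ("East", 250.0), ("West", 75.0)])

# The CTE pre-aggregates per region before the outer query filters it,
# the same shaping pattern used to feed a Power BI dataset.
query = """
WITH region_totals AS (
    SELECT region, SUM(amount) AS total
    FROM sales
    GROUP BY region
)
SELECT region, total FROM region_totals WHERE total > 100
"""
rows = conn.execute(query).fetchall()
print(rows)  # [('East', 350.0)]
```

Pushing the aggregation into a CTE like this keeps the outer query simple and lets the database, rather than the report layer, do the heavy lifting.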
- Analyzed data generated from SNS notifications to extract insights, trends, and patterns, providing visibility into system events and user interactions.
- Utilized various data analysis techniques to extract insights, trends, and patterns from large datasets stored on EC2 instances.
- Created different Power BI reports using both the desktop client and the online service, and scheduled refreshes.
- Worked with end users on problems installing Power BI, installing and configuring the Personal and On-Premises gateways, connecting to data sources, and adding users.
- Created reports in the Power BI preview portal utilizing SSAS Tabular via the Analysis Services connector.
- Proficient in different report visualizations using custom visuals such as bar charts, pie charts, line charts, cards, slicers, and maps, and in using transformations inside the Power Query Editor to clean up the data.
- Expertise in the design and development of Tableau visualization and dashboard solutions using Tableau Desktop, and publishing them on Tableau Server/Public/Reader and external websites.
- Extensive experience with various Tableau Desktop objects such as measures, dimensions, folders, hierarchies, extracts, filters, table calculations, calculated fields, sets, groups, parameters, forecasting, blending, and trend lines.
- Gathered healthcare data from various sources such as electronic health records (EHRs), claims data, patient satisfaction surveys, and medical billing systems.
- Strong management, reporting, and analytical/problem-solving skills with attention to detail.

Education Details:
Bachelor's, Srikrishna Devaraya University (2017)

Technical Skills:
Reporting Tools: Power BI Desktop, Power BI Service, Power BI Gateway, Power BI Report Server, Tableau Desktop, Tableau Server, SQL Server Reporting Services (SSRS), SQL Server Analysis Services (SSAS)
Cloud Technologies: Azure and Power BI Service
Data Analysis: NumPy, Pandas, Python, R, SQL, Excel, ANOVA, chi-square, A/B testing
ETL Tools: Informatica, SQL Server Integration Services (SSIS)
Databases: Snowflake, SQL Server, Azure SQL DB, Oracle, Teradata
Operating Systems: Windows, UNIX
Languages: DAX, Python, SQL, PL/SQL, T-SQL
Application Tools: TOAD, MS Office, FTP, SQL Assistant, Rally, JIRA, GitHub, Hadoop
Methodologies: Agile/Scrum, Waterfall

Professional Experience:

WholeSale EZ, Oct 2024 - Present
Role: Power BI Developer / Sr. Data Analyst
Responsibilities:
- Involved in sprint planning sessions and sizing user stories in an Agile environment.
- Responsible for gathering, analyzing, and documenting business requirements, functional requirements, and data specifications to generate reports.
- Converted functional requirements into technical specifications and performed the required data analysis, data mapping, functional testing, unit testing, and test case preparation using the tools appropriate to project needs.
- Used Power BI to develop data analysis prototypes, and used Power View and Power Map to visualize reports.
- Expertise in creating different visualizations using slicers, lines, pies, histograms, maps, scatter plots, bullets, heat maps, tree maps, etc.
- Analyzed COVID-19 datasets, built visualizations, and recommended insights for the business using Python, SQL, and Power BI.
- Built data pipelines in Airflow on GCP for ETL jobs using different Airflow operators.
- Used DAX (Data Analysis Expressions) and MDX functions to create calculations and measures in Tabular models and multidimensional cubes.
- Worked closely with the Enterprise Data Warehouse team and the Business Intelligence Architecture team to understand the repository objects that support the business requirements and processes.
- Developed and optimized SQL queries and data processing workflows to efficiently retrieve and manipulate data stored on EC2 instances.
- Identified and troubleshot data-related issues such as data discrepancies, data quality problems, and system failures, resolving them in a timely manner to minimize the impact on operations.
- Cleansed and preprocessed raw healthcare data to ensure accuracy, consistency, and integrity, addressing issues such as missing values, duplicates, and outliers.
- Designed and maintained the data model for the Enterprise Data Warehouse (EDW), ensuring that it accurately represents the organization's data and supports efficient data retrieval and analysis.
- Created visualizations, dashboards, and reports using tools such as MongoDB Charts, Tableau, and Power BI to communicate analysis results and insights to stakeholders.
- Enriched SNS message data with additional contextual information from other data sources to enhance analysis and provide deeper insight into events and trends.
- Used Pandas, NumPy, seaborn, SciPy, Matplotlib, and NLTK in Python to develop various machine learning models, including linear regression and multivariate regression.
- Implemented Azure Logic Apps, Azure Functions, Azure Storage, and Service Bus queues for enterprise-level systems.
- Integrated Lambda functions with other AWS services such as S3, DynamoDB, RDS, Redshift, and Elasticsearch to orchestrate complex data workflows and data pipeline architectures.
- Used the Cloud Shell SDK in GCP to configure services such as Dataproc, Storage, and BigQuery.
- Implemented Azure Storage (storage accounts and blob storage) and Azure SQL Server.
- Implemented scheduled data refreshes with on-premises data gateways per the business requirements.
- Developed Power BI reports using time intelligence such as year to date (YTD), month to date (MTD), and same period last year.
- Collected HR data from various sources including HRIS (Human Resources Information Systems), payroll systems, employee surveys, and performance management systems.
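The YTD and MTD time-intelligence measures mentioned above have a straightforward plain-Python analogue; the daily revenue figures and function names below are made up for illustration, not actual project code.

```python
from datetime import date

# Toy daily revenue series (hypothetical values for illustration).
daily = {
    date(2024, 1, 10): 100.0,
    date(2024, 2, 5): 200.0,
    date(2024, 2, 20): 50.0,
}

def ytd(series, as_of):
    """Year-to-date: sum of values from Jan 1 of as_of's year through as_of."""
    return sum(v for d, v in series.items()
               if d.year == as_of.year and d <= as_of)

def mtd(series, as_of):
    """Month-to-date: sum of values from the 1st of as_of's month through as_of."""
    return sum(v for d, v in series.items()
               if d.year == as_of.year and d.month == as_of.month and d <= as_of)

as_of = date(2024, 2, 15)
print(ytd(daily, as_of))  # 300.0  (Jan 10 + Feb 5; Feb 20 is after as_of)
print(mtd(daily, as_of))  # 200.0  (Feb 5 only)
```

In DAX the equivalent logic is expressed declaratively with built-in time-intelligence functions over a date table; the sketch above just makes the running-window semantics explicit.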
- Integrated data from disparate sources to create a unified dataset for analysis.
- Programmed a utility in Python that used multiple packages (SciPy, NumPy, pandas).
- Created row-level security so that end users view data based on their role.
- Analyzed reporting requirements and developed various dashboards using trend lines and reference lines.
- Created a number of views, calculated tables, calculated columns, and measures.
- Used bookmarks, sync slicers, and buttons extensively to deliver the best user experience.
- Involved in UAT of the applications, providing users with test cases and scenarios and guiding them during the testing process.
- Developed and maintained performance metrics and key performance indicators (KPIs) to monitor healthcare quality, patient safety, provider performance, and operational efficiency.
- Designed Lambda functions to process batch data stored in Amazon S3 and other storage services, enabling scalable and cost-effective data analysis workflows.
- Monitored data systems specific to the P&C insurance domain, such as policy administration systems, claims management systems, and underwriting platforms, to ensure they were functioning correctly and meeting performance metrics and service level agreements (SLAs).
- Developed Python scripts to automate data dumps.
- Reviewed the data model and reporting requirements for Cognos reports with the data warehouse/ETL and reporting teams.
- Developed and maintained ETL processes to extract data from source systems, transform it to fit the EDW schema, and load it into the data warehouse, ensuring data quality and integrity throughout the process.
- Provided walkthroughs to help end users understand functionality and answered questions related to the data.
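A minimal sketch of the kind of automated Python data dump described above, using only the standard library; the record fields and the dump_csv helper are hypothetical, and in practice the records would come from a database query rather than a literal list.

```python
import csv
import io

# Hypothetical rows to dump; a real script would fetch these from a database.
rows = [
    {"id": 1, "status": "open"},
    {"id": 2, "status": "closed"},
]

def dump_csv(records, fh):
    """Write a list of dictionaries out as a CSV dump with a header row."""
    writer = csv.DictWriter(fh, fieldnames=list(records[0]))
    writer.writeheader()
    writer.writerows(records)

# Dump to an in-memory buffer here; a scheduled job would open a real file.
buf = io.StringIO()
dump_csv(rows, buf)
print(buf.getvalue())
```

A scheduler (cron, Task Scheduler, or an Airflow operator) can then run such a script on a fixed cadence to keep the dumps current.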
Environment: Power BI Desktop, Power BI Service, Power BI On-Premises Gateway (standard mode and personal mode), SSIS, SSRS, SQL Server, DAX, M language, Power Query, Excel, Azure Cloud, Snowflake, DBeaver, Python, Jira.

RBH Crown Plaza, Nov 2022 - Sep 2024
Role: Power BI Developer / Sr. Data Analyst
Responsibilities:
- Worked with business users to gather requirement specifications for new dashboards.
- Communicated with users about their requirements, converted the requirements into functional specifications, and developed advanced visualizations.
- Analyzed HR data to identify trends, patterns, and correlations related to employee demographics, retention, turnover, performance, compensation, and other HR metrics; used statistical techniques and data visualization tools to present findings.
- Worked on data cleansing using Power Query in the Power BI Desktop editor.
- Wrote DAX expressions to create new measures and calculations per business requirements.
- Effectively used the data blending, filters, actions, and hierarchies features in Power BI.
- Published the developed dashboards and reports to the Power BI Service so that end users could view the data.
- Participated in the configuration of an on-premises Power BI gateway to refresh the datasets of Power BI reports and dashboards.
- Cleansed and preprocessed raw data stored in S3 buckets to ensure consistency, accuracy, and suitability for analysis, using tools such as Python, SQL, and Apache Spark.
- Monitored the performance of EC2 instances and associated data infrastructure, identifying and troubleshooting issues related to performance, scalability, and resource utilization.
- Implemented data quality checks and validation processes to ensure that data met ANSI standards for completeness, accuracy, and integrity.
- Implemented processes and controls to ensure the quality and consistency of data in the data warehouse, including data cleansing, deduplication, and error handling.
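The cleansing and deduplication steps described above (dropping duplicate rows, filling missing values) can be sketched in plain Python; the clean helper, its field names, and the sample rows are illustrative assumptions, not the actual project code.

```python
def clean(records, default_region="UNKNOWN"):
    """Drop exact duplicate rows and fill missing 'region' values.

    A minimal stand-in for the cleansing done in Power Query or pandas.
    """
    seen, out = set(), []
    for rec in records:
        # Hashable fingerprint of the row, independent of key order.
        key = tuple(sorted(rec.items(), key=lambda kv: kv[0]))
        if key in seen:
            continue  # skip duplicate row
        seen.add(key)
        out.append({**rec, "region": rec.get("region") or default_region})
    return out

raw = [
    {"id": 1, "region": "East"},
    {"id": 1, "region": "East"},   # duplicate
    {"id": 2, "region": None},     # missing value
]
cleaned = clean(raw)
print(cleaned)  # [{'id': 1, 'region': 'East'}, {'id': 2, 'region': 'UNKNOWN'}]
```

At scale the same two operations map onto drop_duplicates/fillna in pandas or Remove Duplicates/Replace Values in Power Query.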
- Moved data between GCP and Azure using Azure Data Factory.
- Expertise in the development of high-level, conceptual, logical, and physical designs for databases, data warehousing, and many distributed IT systems.
- Created tables, views, and SQL joins, and defined roles and privileges to implement business rules and control access to the database.
- Conducted root cause analysis of data-related issues within the P&C insurance domain, considering factors such as policy coverage, claims processing rules, and underwriting guidelines, and developed solutions to prevent recurrence.
- Developed machine learning models and algorithms to detect patterns, anomalies, and outliers in SNS message data, providing early warnings for potential issues or opportunities.
- Built multifunction readmission reports using Python pandas and the Django framework.
- Wrote standard and complex SQL queries to perform data validation and graph validation, making sure test results matched the expected results based on business requirements.
- Wrote complex SQL queries in Access and created various joins (inner and outer) to fetch the desired output for data analysis.
- Used various sources to pull data into Power BI, such as SQL Server, Excel, the cloud, SQL Azure, etc.
- Used DAX (Data Analysis Expressions) functions to create calculations and measures in Tabular models.
- Managed metadata for the EDW, including data definitions, data lineage, and data dictionaries, to provide documentation and context for data analysis and reporting.
- Worked on HIPAA transaction and code set standards according to the test scenarios, such as 837 health care claim transactions.
- Created effective reports using visualizations such as bar charts, clustered column charts, waterfall charts, gauges, pie charts, tree maps, and KPIs in Power BI.
- Strong experience connecting various data sources in Power BI.
- Validated previously developed Python reports, fixed the identified bugs, and redeployed them.
- Created various data models in Power BI, linking the tables across various dimensions.
- Reverse engineered SSRS reports and converted them into Power BI reports.
- Leveraged a broad stack of technologies (Python, Docker, AWS, Airflow, and Spark) to reveal the insights hidden within huge volumes of numeric and textual data.
- Developed and maintained multiple Power BI dashboards/reports and content packs.
- Wrote calculated columns and measure queries in Power BI Desktop to demonstrate good data analysis techniques.
- Used Power BI Power Pivot to develop data analysis prototypes, and used Power View and Power Map to visualize reports.
- Documented data-related processes, procedures, and troubleshooting steps specific to P&C insurance operations, and shared knowledge and best practices with team members and insurance business users to facilitate collaboration and continuous improvement.
- Scheduled automatic refreshes in the Power BI Service.

Environment: Power BI Desktop, Python, Power BI Service, Power BI Data Gateway, DAX, T-SQL, SQL Profiler, MS Excel, MS SQL Server 2018, Snowflake, SSRS, SSIS, Azure, Windows.

Matrix IT Software, Hyderabad, India, Jul 2019 - Oct 2022
Role: BI Developer (Power BI)
Description: I was involved in data analysis, data profiling, data integration, migration, data governance, metadata management, master data management, and configuration management. The project involved developing dashboards in the Tableau and Power BI environments for the Metrics and Compliance department.
Responsibilities:
- Gathered requirements from business users and documented the technical design documents for building the Power BI reports.
- Designed and modeled various datasets to build reports.
- Extensive experience building dashboards using Tableau.
- Introduced Tableau to clients to produce different views of data visualizations, presenting dashboards on web and desktop platforms to end users and helping them make effective business decisions.
- Experience with the creation of users, groups, projects, workbooks, and the appropriate permission sets for Tableau Server logons and security checks.
- Created incremental refreshes for data sources on Tableau Server.
- Involved in requirements gathering/analysis, design, development, testing, and production rollout of reporting and analysis projects.
- Performed exploratory data analysis (EDA) and statistical analysis on datasets stored in S3 buckets to extract insights, identify trends, and answer business questions.
- Applied advanced SQL skills, fluency in R and Python, and advanced Microsoft Office skills, particularly Excel and analytical platforms.
- Documented data standards, procedures, and metadata definitions following ANSI documentation standards, ensuring clarity and accessibility for stakeholders.
- Creatively stretched the capabilities of Tableau to customize and develop stunning, dynamic visualizations.
- Developed HR reports and dashboards to visualize key HR metrics and KPIs, customizing reports to meet the needs of various stakeholders including HR managers, department heads, and executive leadership.
- Created Tableau dashboards with interactive views, trends, and drill-downs along with user-level security.
- Experience creating different visualizations using bars, lines, pies, maps, scatter plots, Gantt charts, bubbles, histograms, bullets, heat maps, and highlight tables.
- Developed reports using SQL Server and Power BI as part of a POC.
- Used Power BI Power Pivot to develop data analysis prototypes, and used Power View and Power Map to visualize reports.
- Used query languages such as SQL and tools like AWS Athena to query data directly from S3 buckets, enabling ad hoc analysis and exploration of large datasets.
- Wrote calculated columns and measure queries in Power BI Desktop to demonstrate good data analysis techniques.
- Hands-on experience with both load performance tuning and report performance tuning.
Environment: Tableau 2019.x/2018.x/10.x, Power BI, Azure, Oracle 12c, SQL Server, TOAD, PL/SQL, JavaScript.

Hexa IT Solutions, Jun 2017 - Apr 2019
Role: Data Modeler / Data Analyst
Description: Liquidity analysis and reporting analyze liquidity risk across the enterprise and deliver reports to senior management and regulators. Liquidity analysis depends on funding and/or asset markets, assets and liabilities (including off-balance-sheet items), and comprehensive stress testing of the cash flows. As a BI Analyst, I was responsible for assessing and documenting data requirements and client-specific requirements to develop user-friendly BI solutions: reports, dashboards, and decision aids.
Responsibilities:
- Interacted with business users, gathering requirements for risk adjustment analytics reporting.
- Developed intuitive visualization products using appealing tables, graphs, charts, maps, images, and logos, with drill-down and drill-up hierarchical relationships and actions; for example, developed three dashboards where the first summarizes at a higher level and the rest detail the middle and lower levels.
- Trained and guided business users on how to effectively use dashboards and worksheets in slicing and dicing the reports and exporting them to different media for further analysis and printing.
- Developed business stories composed of dashboards, worksheets, images, and text for presentation purposes.
- Responsible for creating interactive dashboards for regulatory metrics and scorecards.
- Prepared a proof of concept for best practices in Tableau deployment and visualization.
- Prepared KPI and year-over-year comparison reports per user requirements.
- Created dashboards from scratch that included drill-throughs, filters, heat/tree maps, pie charts, donut charts, bar charts, line/area charts, geographic maps, scatter plots, and bullet graphs.
- Experience working with high-volume extracts and connecting to multiple data sources such as Oracle, Excel, and MySQL using data blending.
- Performed data cleaning, feature scaling, and feature engineering using the pandas and NumPy packages in Python.
- Generated reports that included filters, charts, scorecards, and drill-downs.
- Created Tableau scorecards and dashboards using stacked bars, bar graphs, scatter plots, geographical maps, and Gantt charts.
- Expertise in visual design principles such as color, layout, and graphics.
- Built various Excel Power Pivot reports for business users.
- Continuously collected and analyzed reporting requirements from business users with an Agile mindset, created appealing new dashboards, and enhanced/upgraded existing dashboards to better inform end users and enable informed business decisions.
- Improved report performance by extracting data sources into (.hyper) format and scheduling timely refreshes of dashboards and worksheets using Tableau Server.
- Created new schedules and checked the tasks daily on the server.

Environment: Tableau 10.x/2018.x, Teradata, Oracle, TOAD, SQL Server, Informatica, Jira