Sravanthi - Sr. Data Analyst/Data Engineer |
[email protected] |
Location: Plano, Texas, USA |
Relocation: Yes |
Visa: H4EAD |
Resume file: Sravanthi Manda - Resume 2025_1750886283152.docx |
Professional Summary
14+ years of IT experience as a Data Analyst/Technical Engineer across the Banking, Finance, Healthcare, Retail, Transportation and Communications industries, in software development, data warehousing, ETL, SQL and AWS cloud technology.
Strong knowledge of Teradata SQL and import/export utilities: TPT, BTEQ, MultiLoad, FastLoad, TPump and FastExport.
Experienced in data mapping activities, with prior working experience in Big Data technologies like Hadoop.
Experienced in optimizing SQL queries.
Experienced in application development using Python, Django, React, Java, JavaScript and AWS.
Hands-on experience with Unified Data Analytics on Databricks: the Databricks Workspace user interface, managing Databricks notebooks, Delta Lake with Python and Delta Lake with Spark SQL (illustrative sketch below).
Worked on newer database systems such as AWS, Snowflake and Redash.
Experience with the Snowflake cloud data warehouse and AWS S3 buckets for integrating data from multiple source systems.
Experience working with other ETL tools: Informatica PowerCenter/Developer/Analyst, Teradata, Oracle PL/SQL, DataStage and Pervasive ETL Data Integrator.
Ability to work well both in a team and individually.
Experienced with technologies like AWS, S3, Python, PySpark and T-SQL while working with clients such as CapitalOne.
Expertise in data warehousing techniques: data cleansing, Slowly Changing Dimensions, surrogate key assignment and change data capture.
Involved in the Extraction, Transformation and Loading (ETL) of data into the data warehouse using Informatica PowerCenter, as well as Teradata, Ab Initio and Pervasive ETL Data Integrator for the backend.
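A minimal sketch, assuming a Databricks cluster or a local PySpark session with the delta-spark package configured, of the Delta Lake with Python and Delta Lake with Spark SQL usage mentioned above; the path, table name and columns are illustrative placeholders, not taken from any project.

```python
from pyspark.sql import SparkSession

# Assumes Delta Lake is available (built in on Databricks, or via the
# delta-spark package locally); everything below is a generic illustration.
spark = SparkSession.builder.appName("delta-sketch").getOrCreate()

# Delta Lake with Python: write a DataFrame as a Delta table.
loans = spark.createDataFrame(
    [(1001, "ACTIVE", 25000.0), (1002, "CLOSED", 0.0)],
    ["loan_id", "status", "balance"],
)
loans.write.format("delta").mode("overwrite").save("/tmp/delta/loans")

# Delta Lake with Spark SQL: register the location and query it.
spark.sql("CREATE TABLE IF NOT EXISTS loans USING DELTA LOCATION '/tmp/delta/loans'")
spark.sql("SELECT status, COUNT(*) AS cnt FROM loans GROUP BY status").show()
```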
Technical Skills:
Tools: TOAD, Teradata Studio/SQL Assistant 13.10/14.0/15.11, Oracle SQL Developer 3.0, Putty, Eclipse 5.0, HP Quality Center (QC) 10.0, HP QuickTest Pro (QTP) 10.0/9.5, SoapUI 4.0, Jira 6.1, Selenium WebDriver 5.x, Selenium IDE 1.10.0/1.9.0, TFS (Team Foundation Server)
ETL Tools: Ab Initio GDE 3.1.2/1.15, Co>Operating System 2.14, Informatica PowerCenter 8.6, Informatica Developer, Informatica Analyst, Pervasive ETL Data Integrator 8.0, SQL Server Management Studio 10.0
Databases: Oracle 10g, Teradata, DB2, SQL Server, Snowflake, AWS file systems, Big Data - Hadoop file system (HDFS)
Operating Systems: Windows NT/2000/XP Pro, Windows 7/8, UNIX, DOS, Red Hat Linux
Programming Languages: C, Java, Python, SQL, PL/SQL, T-SQL, UNIX Shell Scripting, Hadoop file system
Education: Bachelor's Degree in Electronics and Communication from JNTU, Kakinada, 2010
Professional Experience:
CapitalOne | Sep 2024 - Present | Sr. Data Analyst/Data Engineer
In the Heist/Bookkeeper team, in the Dealer Financials/Dealer Risk area, which deals with dealer and loan data. The responsibilities in this project are predominantly maintaining the jobs that load data into the new database system, POTF (Platform of the Future), from an existing legacy database that is ready to be decommissioned, performing RCA, making code changes, monitoring jobs, loading backdated data from the legacy system and providing off-hours support.
Attended daily standup meetings to discuss the status of tasks for self and others in the team, impediments/roadblocks, needed information, etc.
Worked on Jira to update stories and was responsible for maintaining the Jira board for the project.
Ensured everybody's work was in sync with their corresponding Jira stories and work hours, added comments whenever there was an update or progress, and assigned stories to the right team members.
Expertise in Python and Scala; developed user-defined functions (UDFs) for Snowflake using PySpark and Java within Quantum, an in-house CapitalOne framework (a generic PySpark-to-Snowflake sketch follows this section).
Worked with tools like Microsoft SQL Server Management Studio and Microsoft Visual Studio to check SQL stored procedures and SQL packages developed by the developers.
Created many of the documents required for the project, such as the BRD, FRD, mapping document and metadata specifications.
Redesigned views in Snowflake to improve performance.
Developed test strategies and performed end-to-end testing on the data loaded from source to target to make sure the data quality was good and the data was loaded correctly.
Made code fixes in IntelliJ and Microsoft Visual Studio Code to the JSON files containing the Quantum code, and to shell scripts, for any logic changes needed.
Worked on migrating jobs from an in-house job scheduler (AROW - Automatic for Broadcom Vendor) to the Redwood RunMyJobs scheduler, and designed and maintained the jobs on an as-needed basis.
Developed data modeling documents using Microsoft Visio for physical database design and shared them with business users.
Tools: Java, PySpark, SQL, Snowflake, OneLake, AWS file system, IntelliJ, Microsoft Visual Studio, Teradata SQL Studio, SharePoint, Unix (Putty), GitHub, Ab Initio GDE, Co>Op System, Informatica PowerCenter, Jira, MS Office, Google Suite, Redwood RunMyJobs scheduler.
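The Quantum framework is CapitalOne-internal, so the sketch below shows only the generic pattern it wraps: a PySpark UDF applied to a DataFrame that is then written to Snowflake through the open-source Spark Snowflake connector. The connection options, table name and banding rule are assumptions for illustration only.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import udf
from pyspark.sql.types import StringType

spark = SparkSession.builder.appName("snowflake-udf-sketch").getOrCreate()

@udf(returnType=StringType())
def risk_band(balance):
    # Hypothetical banding rule used purely for illustration.
    return "HIGH" if balance is not None and balance > 50000 else "LOW"

dealers = spark.createDataFrame(
    [(1, 72000.0), (2, 12000.0)], ["dealer_id", "balance"]
).withColumn("risk_band", risk_band("balance"))

# Placeholder connection options for the Spark Snowflake connector.
sf_options = {
    "sfURL": "<account>.snowflakecomputing.com",
    "sfUser": "<user>",
    "sfPassword": "<password>",
    "sfDatabase": "DEALER_DB",
    "sfSchema": "PUBLIC",
    "sfWarehouse": "ANALYTICS_WH",
}
(dealers.write.format("net.snowflake.spark.snowflake")
    .options(**sf_options)
    .option("dbtable", "DEALER_RISK")
    .mode("append")
    .save())
```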
Amtrak | Feb 2022 - Aug 2024 | Sr. Data Analyst/Data Engineer
Manual Receipts/Revenue Transformation is a project that binds together a number of different components to generate revenue data from the manual or electronic receipts/tickets lifted by conductors on board the train, station agents, etc.; the accounts team uses this data to calculate the revenue of the lifted tickets. The Amtrak components involved, through which the data flows step by step to produce the final data, are the COB (Conductor on Board)/EMM (Electronic Mobile Machines) team, the Arrow (Amtrak legacy DW) team, the SDR/F&B (Food & Beverage)/VTrak team, the Mulesoft team, the ASAP/RailRes/STARS/Agent Sales team and the AIM/ODS team.
Worked on Jira to update stories and was responsible for maintaining the Jira board for the project.
Ensured everybody's work was in sync with their corresponding Jira stories and work hours, added comments whenever there was an update or progress, and assigned stories to the right team members.
Expertise in Python and Scala; developed user-defined functions (UDFs) for Hive and Pig using Python (a Hive streaming sketch follows this section).
Worked with tools like Microsoft SQL Server Management Studio and Microsoft Visual Studio to check SQL stored procedures and SQL packages developed by the developers.
Developed test strategies and performed end-to-end testing on the data loaded from source to target to make sure the data quality was good and the data was loaded correctly; since there was no testing team, end-to-end testing was done by the DEV team, so this role required wearing the hats of team lead, developer and tester wherever needed.
Developed data modeling documents using Microsoft Visio for physical database design and shared them with business users.
Tools: Informatica, Java, SQL, SQL Server, Teradata, Microsoft Visual Studio, Teradata SQL Studio, SharePoint, Unix (Putty), Sourcetree, Bitbucket, GitHub, Ab Initio GDE, Co>Op System, Informatica PowerCenter, Jira, MS Office, etc.
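A common way to use Python as a Hive "UDF" is a streaming script invoked through TRANSFORM; the sketch below assumes that pattern and uses made-up column names, so it is illustrative rather than project code.

```python
#!/usr/bin/env python
# Hive streaming UDF: reads tab-separated rows from stdin and writes
# tab-separated rows to stdout. Used from HiveQL roughly as:
#   ADD FILE normalize_ticket.py;
#   SELECT TRANSFORM (ticket_id, fare)
#   USING 'python normalize_ticket.py'
#   AS (ticket_id, fare_usd)
#   FROM lifted_tickets;
import sys

for line in sys.stdin:
    ticket_id, fare = line.rstrip("\n").split("\t")
    try:
        fare_usd = "{:.2f}".format(float(fare))
    except ValueError:
        fare_usd = "0.00"  # default for malformed fare values
    print("{}\t{}".format(ticket_id, fare_usd))
```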
Charter Communications | Jan 2021 - Jan 2022 | Data Analyst/Data Engineer
Location Intelligence, as the name suggests, deals with the location data of customers or potential customers in a particular area. All the location information is processed and geocoding is applied to obtain specifics such as the premise and building number, and this information is loaded into the target database, which is used by other departments within Charter, such as the accounts department, and by some customers.
Attended daily standup meetings to discuss the status of tasks for self and others in the team, impediments/roadblocks, needed information, etc.
Worked on a Python Flask API to modify the metric SQLs as part of an enterprise migration (see the sketch after this section).
Proficient in object-oriented design, with extensive experience in Python-C/C++ binding using Boost.Python and Python ctypes.
Used Pandas and other Python libraries for AWS connections.
Worked with tools like Microsoft SQL Server Management Studio and Microsoft Visual Studio to check SQL stored procedures and SQL packages developed by the developers.
Developed complex SQL to implement business logic and optimized performance by modifying the T-SQL queries used as the data source when comparing data between different sources with a data validation tool.
Documented all required specifications before developing database objects such as dimension tables, fact tables, stored procedures and views, with T-SQL scripting in SSMS, to satisfy business needs.
Tools: SSIS/SSMS/SSAS, Ab Initio, Informatica, Visual Studio, SQL Server, Teradata, Microsoft SSIS/SSMS/SSAS services, Unix (Putty), Ab Initio GDE, Informatica PowerCenter, Sourcetree, Bitbucket, GitHub, Jira, MS Office.
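The metric SQL service itself is internal, so the Flask sketch below only shows the general shape of an endpoint that serves and updates metric SQL text; the routes, payload fields and in-memory store are assumptions made for illustration.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# Hypothetical in-memory store standing in for wherever the metric SQL
# definitions actually live (a table, a config repo, etc.).
METRIC_SQL = {
    "daily_revenue": "SELECT sale_date, SUM(amount) FROM sales GROUP BY sale_date",
}

@app.route("/metrics/<name>", methods=["GET"])
def get_metric(name):
    sql = METRIC_SQL.get(name)
    if sql is None:
        return jsonify({"error": "unknown metric"}), 404
    return jsonify({"name": name, "sql": sql})

@app.route("/metrics/<name>", methods=["PUT"])
def update_metric(name):
    body = request.get_json(force=True)
    METRIC_SQL[name] = body["sql"]  # replace the metric's SQL text
    return jsonify({"name": name, "sql": METRIC_SQL[name]})

if __name__ == "__main__":
    app.run(port=5000)
```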
CapitalOne | Feb 2018 - Jan 2021 | Data Analyst/Data Engineer
DRM - Floga is the team that deals with data from different PODs and maps it from the different sources to one target, the Snowflake database, and also performs tests on the data in Snowflake against the legacy database systems. This role required coding and mapping data to the target with the right transformations, as well as testing it, since there was no separate testing team.
Worked with different tools like Informatica PowerCenter and Ab Initio for coding and loading data from heterogeneous sources to one target (Snowflake) database.
Took on different roles in the project, such as onsite lead and facilitator for notifications between the Floga team and client managers, and tried to guide the team in the right direction.
Attended daily standups and met with the business leads (SMEs) from different PODs to discuss the team's requirement questions and relay the answers back to the team without knowledge gaps.
Expertise in writing UAT test cases; conducted UAT testing while working with this team.
Wrote SQL queries using Teradata SQL Assistant, Oracle SQL Developer, SQL Server Management Studio, the Hadoop file system, AWS (S3 parquet files) and Snowflake for finding duplicates, identifying issues, tracking bad data, updating flags/fields before running jobs, etc. (example query below).
Conducted different forms of testing on heterogeneous database systems to check the data and data standards, covering positive and negative scenario test cases to test the data thoroughly and make sure no bugs slipped through the cracks.
Experience working with Spark SQL, T-SQL and Python, writing and running many commands.
This team did all the development and end-to-end testing and also acted as the business analyst in gathering requirements from the SMEs, so the role required wearing multiple hats, which she handled as required.
Designed test strategies, wrote test scenarios/test cases and user stories in Jira for every sprint (and in HP QC whenever needed), maintained all needed documentation for the test cases, and attached it to the Jira tickets by the end of every sprint.
Worked on heterogeneous databases: AWS file systems, HDFS (Hadoop file system), Snowflake, SQL Server, Oracle and Teradata.
Discussed the test plan with the team and performed regression testing at the end of every quarter on all data accumulated so far, to make sure both the existing data and the newly loaded data met business standards.
Took responsibility for many issues, fixed and tested them, discussed them with other team members to ensure they would not repeat, and documented the steps taken so any similar issues could be fixed easily in future.
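A minimal sketch of the kind of duplicate-finding check mentioned above, run here through the Snowflake Python connector; the table, key columns and connection parameters are placeholders, not project values.

```python
import snowflake.connector

# Placeholder connection parameters; in practice these come from a vault or
# environment configuration, never hard-coded.
conn = snowflake.connector.connect(
    account="<account>", user="<user>", password="<password>",
    warehouse="ANALYTICS_WH", database="DRM_DB", schema="PUBLIC",
)

# Flag natural keys that appear more than once in the target table.
DUPLICATE_CHECK = """
    SELECT dealer_id, load_date, COUNT(*) AS dup_count
    FROM dealer_loans
    GROUP BY dealer_id, load_date
    HAVING COUNT(*) > 1
    ORDER BY dup_count DESC
"""

cur = conn.cursor()
try:
    for dealer_id, load_date, dup_count in cur.execute(DUPLICATE_CHECK):
        print(dealer_id, load_date, dup_count)
finally:
    cur.close()
    conn.close()
```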
Charter Communications | Mar 2017 - Jan 2018 | Data Analyst (Lead)
Interim Reporting, as the name suggests, is a temporary reporting arrangement for the existing Charter data after the merger of Charter, Bright House Networks (BHN) and Time Warner Cable (TWC). It takes the legacy data from each of the sources (Charter, BHN and TWC) and merges it into one database, which is then used by the downstream Health Status Report (HSR) for users' reporting purposes.
Worked with different tools like Informatica PowerCenter, Appworx (UC4) and Teradata SQL Assistant for carrying out support activities in the project.
Worked with Unix scripts, changing existing scripts and creating new ones for running parts of the code and for complex functionality.
Worked on Unix boxes to check the data in files, using commands like grep and awk.
Did SQL code review, implementation and validation as and where needed.
Wrote SQL queries using Teradata SQL Assistant for finding duplicates, identifying issues, tracking bad data and updating flags/fields before running jobs.
Attended and led many status calls, took the lead in explaining complex issues to the client, and documented minutes of meetings (MoMs) to distribute to a larger audience and keep them informed about the progress being made.
Tools: Informatica, PL/SQL, Teradata, Informatica PowerCenter/Developer/Analyst 9.6, Teradata Studio/SQL Assistant 14.0, Jira 7.1, SharePoint, Unix (Putty), Appworx (Application Manager UC4) 9.1.0, Filezilla, Notepad++, MS Office.
Cognizant Technology Solutions - Monsanto Company | Aug 2016 - Feb 2017 | Data Analyst (Lead)
GSC has over 2,000 users across the world, and it is the GSC team's responsibility to ensure that data flows smoothly from the DS through to SW and to address and resolve any data discrepancies reported by the users. BODS (SAP), Java and Informatica jobs are monitored by the GSC team, each carrying out different responsibilities in extracting data from the SAP source to the GSC DS and finally to SW.
Worked with different tools like Informatica PowerCenter, Ab Initio GDE, SAP Data Services Management Console and the Java monitoring console for carrying out different roles in the project.
Took on different roles in the project, such as GSC support onsite lead and facilitator for notifications between the GSC DS and the users, and carried out every role responsibly.
Executed SQL queries using TOAD to check the status of table entries after running jobs and to update flags/fields before running jobs.
Ran different Autosys jobs and fixed many issues faced by the jobs.
Monitored, fixed bugs in and enhanced many of the Informatica workflows and Ab Initio graphs loading data into the GSC Datastore, which required expertise in both ETL tools.
Worked with Unix scripts, changing existing scripts and creating new ones for running parts of the code and for complex functionality.
Worked on Unix boxes to check the data in files.
Created Ab Initio graphs and Informatica workflows for functionality exclusive to the GSC Datastore and received client appreciation for them.
Worked on many Ab Initio components such as Reformat, Sort, Filter by Expression, Aggregate, Replicate, Join, Rollup, Scan, Fuse and Partition by Key, and incorporated simple and complex transformations into the graphs with ease.
Tools: Informatica, Ab Initio, Oracle PL/SQL, Teradata, Oracle, Informatica PowerCenter/Developer/Analyst 9.6, Ab Initio GDE 3.1.4, Oracle 10g, Teradata SQL Assistant/Studio 14.0, SQL, Hadoop file system, Autosys r11.3.
Cognizant Technology Solutions - US Bank | Jun 2015 - Mar 2016 | Data Analyst
Analyzed and compared the existing logic in the latest ETL spec with the requirements.
Wore different hats in the project, such as developer, profiler and tester, and carried out every role responsibly.
Developed Ab Initio graphs for the ELZ team to load their data into the Teradata database, which is used for reporting purposes.
Developed different workflows, mappings, profiles and objects using different Informatica tools and tested them to ensure they worked as per the business requirements.
Validated the data and gave sign-off for UAT testing to be done by PMs or business users.
Ran different Autosys jobs and fixed many issues faced by the jobs.
Gathered all resources required for project development, i.e. BTEQ, MLOAD and FLOAD import scripts (a BTEQ invocation sketch follows this section).
Worked on heterogeneous sources like Teradata, Hadoop, Oracle and DB2 in the same project and handled them effectively.
Worked with Unix scripts, changing existing scripts and creating new ones for running parts of the code and for complex functionality.
Worked with many Unix commands like grep and awk.
Tools: Teradata, Ab Initio, Oracle, Hadoop file system, DB2, Teradata Studio/SQL Assistant 14.0, Ab Initio GDE 3.1.4, Co>Op 3.1.3, Oracle 10g, Informatica PowerCenter/Developer/Analyst 9.6, TOAD 14.0, SQL, HP Quality Center 11.52, Lotus Notes, SharePoint, Autosys r11.3, Putty, Filezilla, Notepad++, MS Office.
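A rough sketch of how a BTEQ import script can be driven from Python via subprocess, assuming the Teradata BTEQ client is installed on the host; the logon string, staging table and file path are placeholders, not project values.

```python
import subprocess

# Placeholder BTEQ script: logs on, imports a delimited file into a staging
# table, then logs off. Real credentials would come from a secure source.
BTEQ_SCRIPT = """
.LOGON tdprod/etl_user,<password>;
.IMPORT VARTEXT '|' FILE = /data/incoming/accounts.txt;
.REPEAT *
USING (acct_id VARCHAR(20), acct_name VARCHAR(100))
INSERT INTO stg.accounts (acct_id, acct_name)
VALUES (:acct_id, :acct_name);
.LOGOFF;
.QUIT;
"""

# BTEQ reads its commands from stdin, so feed the script text directly.
result = subprocess.run(["bteq"], input=BTEQ_SCRIPT, capture_output=True, text=True)
print(result.stdout)
if result.returncode != 0:
    raise RuntimeError("BTEQ import failed:\n" + result.stderr)
```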
Cognizant Technology Solutions - Express Scripts Inc. | Oct 2014 - May 2015 | ETL Business Data Analyst
The CBM/BWS PSO teams are production support teams at ESI that deal with issues arising in code that has been promoted to production for different parts of the CBM and BWS applications.
Resolved many tickets that came to production support and performed many hot fixes for code that broke or had problems.
Analyzed and reviewed the specification (design) documents provided by the client.
Prepared unit test plans and unit test cases when working with the Teradata DEV team.
Developed and enhanced many Ab Initio graphs that were not functioning as expected and made sure they met expectations.
Made sure the Teradata DEV team had all resources required for project development, i.e. BTEQ, MLOAD and FLOAD import scripts.
Worked on many Ab Initio components such as Reformat, Sort, Filter by Expression, Aggregate, Replicate, Join, Rollup, Scan, Fuse and Partition by Key, and incorporated simple and complex transformations into the graphs with ease.
Wrote shell scripts to schedule DataStage jobs and other scenarios in the MDM Prod Support project.
Optimized SQL queries by tuning them to run in a shorter duration, fetching data faster and helping jobs complete sooner.
Proficient in data warehousing techniques: data cleansing, Slowly Changing Dimensions, surrogate key assignment and change data capture.
Wrote Unix scripts to schedule any new or updated DataStage jobs.
Used the DataStage Designer and Director environments to design DataStage jobs and sequences and to monitor whether jobs had been deployed successfully during deployments.
Worked with Unix scripts, changing existing scripts and creating new ones for running parts of the code and for complex functionality.
Loaded data in various environments (DEV, QA, PROD, etc.) by running the corresponding IIS and MDM scripts to trigger the related jobs whenever requested by a related team, as part of prod support responsibilities.
Tools: Ab Initio, Teradata, DataStage, Oracle, DB2, Ab Initio GDE 3.1.4, Co>Op 3.1.3, DataStage InfoSphere/DataSphere 8.0, Oracle 10g, SQL, Remedy tracking system, Quality Center 10, IMT tracking system, INC tracking system, Teradata SQL Assistant 14.0, Teradata Studio 14.0, Putty, Filezilla, Notepad++, MS Office.
SV IT Inc. - MasterCard | May 2014 - Oct 2014 | Data Analyst/QA Analyst
Validated the MasterPass application being developed by the developers, helped identify any issues with the UI and reported them to the developers for fixes.
Performed smoke, functionality, incremental and regression tests based on analysis and understanding of system requirements, non-functional specifications and end-user needs (a minimal WebDriver sketch follows this section).
Participated in a major R1 MasterPass release of the LightBox version of MasterPass and received a lot of appreciation and acclaim from the St. Louis and Kansas City clients, who were able to see and use the improved application.
Tuned the SQL queries that were taking longer to run and reduced their runtime so they would not interfere with or hinder the performance of other jobs running on the same database.
Worked with Unix scripts, changing existing scripts and creating new ones for running parts of the code and for complex functionality.
Worked on Unix boxes to check the data in files, using commands like grep and awk.
Tools: QA, Oracle, SQL Server, Jira 5.1.2, HP QC 11.0, Toad SQL client 10.2, IE 9/MFF 32.0.0/Chrome 37.0.2, iPhone 5s/6 (mobile testing), MS Office, Snagit 8.0.2, Selenium IDE, Selenium WebDriver, iMacros (for Firefox).
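A minimal sketch of the kind of WebDriver smoke check implied by the Selenium tooling above; the URL and element locator are hypothetical, chosen only to show the pattern.

```python
from selenium import webdriver
from selenium.webdriver.common.by import By

# Placeholder URL and locator; a real smoke suite would cover the key
# checkout and wallet flows rather than a single title check.
driver = webdriver.Chrome()
try:
    driver.get("https://example.com/masterpass-checkout")
    assert "MasterPass" in driver.title, "unexpected page title"
    # Verify a critical UI element is present before deeper regression runs.
    driver.find_element(By.ID, "checkout-button")
    print("smoke check passed")
finally:
    driver.quit()
```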
SV IT Inc. - MTM, Inc. | Jun 2013 - May 2014 | ETL Data Analyst
Attended the business requirement review meetings conducted by the BA team to better understand the application functionality and to build the mapping document used to create the test plan and test scenarios.
Received many appreciations for finding many bugs in the loaded data, identifying their causes and helping get them fixed, and for maintaining excellent documentation on the bugs created, for quick and easy understanding.
Worked with Unix scripts, changing existing scripts and creating new ones for running parts of the code and for complex functionality.
Worked on Unix boxes to check the data in files.
Worked on many Ab Initio components such as Reformat, Sort, Filter by Expression, Aggregate, Replicate, Join, Rollup, Scan, Fuse and Partition by Key, and incorporated simple and complex transformations into the graphs with ease.
Tuned SQL queries to optimize query runtime and support better testing on the tables.
Worked on heterogeneous sources like Teradata, Oracle and DB2 in the same project and handled them effectively.
Prepared ETL SSIS packages using the Microsoft SQL Server suite to extract data from the source, transform it and load it into the end tables used by other teams.
SV IT Inc. - NARS/Integrity Solutions | Mar 2013 - May 2013 | ETL Developer
Developed processes in Pervasive Data Integrator according to the business rules and requirements sent by the client.
Developed different mappings in the Pervasive Map Designer that were reused by other members of the team.
Was involved in all phases of the ETL cycle and prepared the detailed-level design that depicts the transformation logic used in Pervasive ETL.
Created many automated graphs in Ab Initio for many of the manual validations, reducing time and effort with progressive results.
Performed query tuning so SQL queries completed faster and had a shorter runtime, improving the overall job runtime on the server.
Developed Ab Initio graphs for specific requirements as and where needed.
Was involved in unit testing the Pervasive processes, quality assurance and bug fixing.
Prepared and ran processes using the Pervasive ETL tool according to the source-to-target table mappings with their attributes specified.
SV IT Inc. - Express Scripts Inc. | Nov 2012 - Feb 2013 | Data Analyst
Developed graphs according to the business rules and requirements sent by the client.
Was involved in all phases of the ETL cycle and prepared the detailed-level design that depicts the transformation logic used in Ab Initio.
Prepared and ran graphs using the Ab Initio tool according to the source-to-target table mappings with their attributes specified.
Provided production support from the ETL team and created numerous instant Ab Initio graphs to resolve errors in the live environment on the spot.
Experience in ETL mapping and coding Teradata scripts.
Performed query tuning so SQL queries completed faster and had a shorter runtime, improving the overall job runtime on the server.
TCS - The Home Depot | Sep 2011 - Feb 2012 | ETL/BIS Tester
Requirement analysis and communicating gaps in the requirements to the business.
Analyzing the existing Informatica routines.
Building the database modules for the project using PL/SQL.
Development of new ETL routines in Informatica, creating new mappings and validating sessions to upload new data to the target Oracle database.
Performance tuning of the PL/SQL code and Informatica routines.
TCS - The Home Depot | Sep 2010 - Aug 2011 | Ab Initio Developer
Attended daily stand-up calls in the morning and analyzed the business requirement mapping documents sent by the onsite business analyst team.
Built and maintained SQL scripts, indexes and complex queries for data analysis and extraction.