Sirish - SQL/SSIS/ETL/Power BI Developer
[email protected]
Location: Houston, Texas, USA
Relocation:
Visa: H1B
Resume file: Shirish_SQL_SSIS_ETL_1746199391507.docx
SIRISH
SQL/SSIS/BI Lead Developer
Phone: +1 (469) 639-0138 | Email: [email protected]

Summary:
- Highly motivated SQL Server/BI Lead with 10+ years of IT experience across multiple skill sets, with extensive experience in data warehousing and Business Intelligence technologies; expertise in SQL Server development, the MSBI stack (T-SQL, SSIS, SSAS, SSRS), Azure, and Power BI for building, deploying, and managing applications and services through a global network of Microsoft-managed data centers.
- Experience analyzing, designing, and developing Business Intelligence (BI) database applications across all segments of the Software Development Life Cycle (SDLC), using MS SQL Server, SQL Server Integration Services (SSIS), SQL Server Reporting Services (SSRS), and SQL Server Analysis Services (SSAS).
- Expertise in installing, configuring, and maintaining SQL Server database systems.
- Excellent T-SQL development skills working with stored procedures, views, user-defined functions, triggers, and SQL Agent jobs.
- Hands-on experience with SQL programming: creating tables, stored procedures, triggers, user-defined functions, views, indexes, user profiles, and relational database models.
- Experience in performance tuning, query optimization, and database consistency checks using DBCC utilities; extensively used tools such as SQL Profiler, Index Tuning Wizard, and Database Engine Tuning Advisor for monitoring and tuning MS SQL Server performance.
- Experience migrating application code and objects from development/test environments to production.
- Data analysis experience entailing the flow of data and its interdependencies among QNXT core modules such as the Claim, Provider, Member, and Affiliates modules.
- Optimized data loading performance by implementing bulk insert operations within SSIS packages when interacting with external APIs, significantly reducing processing time.
- Experience working with SSIS PowerPack ZappySys components to perform multi-functional transmissions in a single ZappySys Task component.
- Well-versed in SSIS Control Flow items (Execute Package/Execute SQL Tasks, Script Task, Foreach Loop, Sequence Containers, etc.) and SSIS Data Flow items (Conditional Split, Data Conversion, Fuzzy Lookup, Merge Join, Pivot, etc.).
- Experience creating master and child packages, package configurations, event handling, and SSIS event logging.
- Expert in extracting and transforming data (ETL) from various heterogeneous sources and creating packages using SSIS.
- Good knowledge of defining, developing, and deploying Star Schema, Snowflake Schema, and dimensional data modeling using MS SQL Server Analysis Services (SSAS) on an EDW.
- Experience creating jobs, alerts, and SQL Mail Agent, and scheduling SSIS packages.
- Worked with tabular, matrix, gauge and chart, parameterized, sub-, ad-hoc, and drill-down reports using SQL Server Reporting Services (SSRS); extensively used Report Wizard, Report Builder, and Report Manager for developing and deploying reports in SSRS.
- Experience writing extensive SQL queries for back-end testing of an Oracle database.
- Experience creating and scheduling automated jobs using Autosys and SQL Agent to run SSIS packages and EPPlus jobs.
- Created logs for ETL loads at the package and task level to record the number of records processed by each package and each task, using SSIS.
- Implemented error handling and rollback processes in ETL loads.
- Configured SSIS packages using the Package Configuration Wizard to allow packages to run in different environments.
- Used FACETS support systems to enable inbound/outbound HIPAA EDI transactions in support of HIPAA 834 transactions.
- Extensively worked with Facets TriZetto batch scheduling and monitored daily for updates across various Facets domains.
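The error-handling and rollback pattern referenced above can be sketched in T-SQL as a transactional load with TRY/CATCH; table and log names here are hypothetical, not from the resume:

```sql
-- Hypothetical staging-to-warehouse load with rollback on failure.
BEGIN TRY
    BEGIN TRANSACTION;

    -- Load the fact table from staging.
    INSERT INTO dbo.FactSales (OrderID, Amount, LoadDate)
    SELECT OrderID, Amount, GETDATE()
    FROM   stg.Sales;

    -- Record the row count for package-level logging.
    INSERT INTO dbo.EtlLoadLog (TaskName, RowsProcessed, LoadDate)
    VALUES ('Load FactSales', @@ROWCOUNT, GETDATE());

    COMMIT TRANSACTION;
END TRY
BEGIN CATCH
    -- Roll back the partial load and re-raise so the SSIS task fails visibly.
    IF XACT_STATE() <> 0
        ROLLBACK TRANSACTION;
    THROW;
END CATCH;
```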
- Worked extensively on optimization and tuning to increase query and database performance.
- Highly analytical, with solid programming skills and competence in handling critical situations.
- Good verbal and written communication skills combined with strong interpersonal, conflict-resolution, and analytical skills.

Technical Skills:
Languages: C#, C, C++, VB 6.0
Web Services: WCF, RESTful, and Web Services
Scripting Languages: JavaScript, VBScript
Markup Languages: XML, XAML, HTML, XSL, XSLT, CSS
Server/.NET Technologies: C#, Entity Framework, Aspose, EPPlus, ASP.NET MVC 4/3, ADO.NET
Methodology: Agile, Scrum, Waterfall
Databases: SQL Server 2015/12/08, Oracle
Data Modeling: Snowflake Schema, Dimensional Data Modeling, SSAS (Tabular and Multidimensional)
ETL Tool: SQL Server Integration Services
Reporting Tools: SQL Server Reporting Services, Power BI
Operating Systems: Windows 10/8/7/XP/Vista, UNIX
Software: Visual Studio .NET, IIS 5/6/7.5, SSRS, SSIS, Visual SourceSafe, TFS
Cloud: AWS, Azure, Snowflake

Professional Experience:

Bank of America, Charlotte, NC, Oct 2023 - Present
Role: SQL/SSIS Developer
Responsibilities:
- Wrote complex SQL queries using joins, subqueries, and correlated subqueries to retrieve data from the database.
- Used different joins, subqueries, and nested queries in SQL.
- Wrote various database objects such as user-defined functions and views, and used indexes to accomplish multiple tasks.
- Implemented an API web service connection within an SSIS package to pull trade inventory data from an application, streamlining data synchronization and improving accuracy.
- Designed and deployed SSIS packages to extract trade history data from a company-internal API, using data transformations to create forecast segmentation insights.
- Designed and constructed dependable SSIS packages with error-handling and notification strategies to maintain integrity and alert responsible parties in the event of an error.
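The correlated-subquery work described above can be illustrated with a query that returns each customer's most recent order date; the table and column names are hypothetical:

```sql
-- Hypothetical example: latest order per customer via a correlated subquery.
SELECT c.CustomerID,
       c.CustomerName,
       (SELECT MAX(o.OrderDate)
        FROM   dbo.Orders AS o
        WHERE  o.CustomerID = c.CustomerID) AS LastOrderDate
FROM   dbo.Customers AS c
WHERE  EXISTS (SELECT 1
               FROM   dbo.Orders AS o
               WHERE  o.CustomerID = c.CustomerID);
```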
- Integrated AWS CodeBuild and CodeDeploy with Git repositories to automate SQL script execution and version control.
- Implemented AWS CodePipeline to automate database schema migrations and deployment of SQL scripts.
- Worked with SQL Server and SSIS technologies, with good knowledge of data warehouses and related tools.
- Created SSIS packages using transformations such as Lookup, Derived Column, Conditional Split, Aggregate, Execute SQL Task, Data Flow Task, Execute Package Task, etc.
- Participated in data mapping workshops and the development of a data dictionary.
- Used daily scrums and biweekly sprints to manage projects and stay on schedule, adhering to Agile methodology.
- Collaborated closely with the visual design team to choose the best renderings for BI outputs.
- Scheduled Power BI gateway and dashboard refreshes, on-premises and personal, at regular intervals.
- Worked with the business to define and establish KPIs for calculations in the Power BI and SSRS systems.
- Optimized the performance of stored procedures and SQL source queries for data loads.
- Created SSIS packages that combine data from different sources into a single source for Power BI reports.
- Designed and developed data marts and business intelligence solutions using multi-dimensional models (star schema).
- Created DAX expressions and implemented partitions in Tabular models.
- Created various Tableau dashboards for management to depict important KPIs.
- Built, published, and scheduled customized interactive reports and dashboards using Tableau Server.
- Tuned SQL queries and stored procedures to improve performance and stability.
- Created external and internal tables in Azure SQL Data Warehouse and wrote stored procedures to move data from external to internal tables.
- Participated in the entire Software Development Life Cycle (SDLC).
- Developed custom Azure Data Factory pipeline activities, including the Copy activity.
- Used SQL, SQL Azure, Azure Storage, Azure Data Factory, SSIS, and PowerShell, mostly for data migration.
- Developed reusable SSIS packages to extract data from Excel, XML, and multi-formatted flat files into the database.
- Used DAX to troubleshoot and design complex calculations.
- Helped decide which KPIs and metrics to use in the reports.

Environment: MS SQL Server 2015/17/19, T-SQL, Windows 10/8/7 Servers, TFS, SSIS 2015, SSRS 2015, SQL Server Management Studio, LINQ, Visual Studio 2017/15, PL/SQL, Web Services, XML Schema, XSD, XSL, JavaScript, Autosys, Git, Power BI Desktop, Tableau, CI/CD Pipeline, AWS

Wells Fargo, Charlotte, NC, Mar 2021 - Oct 2023
Role: SQL/SSIS Developer
Responsibilities:
- Worked closely with developers, quality assurance analysts, architects, business partners, and other technology teams to ensure requirements and design were completely understood.
- Designed and developed SSIS packages to load data from sources such as SQL, Excel, CSV, and multi-formatted flat files into a SQL Server database.
- Optimized data loading performance by implementing bulk insert operations within SSIS packages when interacting with external APIs, significantly reducing processing time.
- Extensively used transformations such as Conditional Split, Data Conversion, Derived Column, Multicast, Row Count, Merge, Union All, Script Task, Execute Package Task, Expression Task, Bulk Insert Task, and Execute SQL Task, along with the Foreach Loop, For Loop, and Sequence containers.
- Wrote C# scripts in Script Tasks in SSIS packages; scheduled and monitored regular jobs and ETL packages; used ADO.NET connections and .NET/C# code in SSIS Script Tasks.
- Used custom Script Tasks in SSIS to parse JSON responses from a weather API, transforming the data into a structured format for loading into a weather database.
- Created and scheduled Autosys automated jobs to run SSIS packages and EPPlus jobs.
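The JSON parsing described above was done in a C# Script Task; a set-based alternative, if the shredding is pushed into the database, is T-SQL's OPENJSON (SQL Server 2016+). The payload shape and field names below are hypothetical:

```sql
-- Hypothetical weather payload shredded into rows with OPENJSON (SQL Server 2016+).
DECLARE @json nvarchar(max) = N'{
  "city": "Charlotte",
  "readings": [
    { "ts": "2023-05-01T06:00:00", "tempF": 61.2 },
    { "ts": "2023-05-01T12:00:00", "tempF": 74.8 }
  ]
}';

SELECT JSON_VALUE(@json, '$.city') AS City,
       r.ts,
       r.tempF
FROM OPENJSON(@json, '$.readings')
     WITH (ts    datetime2 '$.ts',
           tempF float     '$.tempF') AS r;
```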
- Created databases, tables, clustered/non-clustered indexes, unique/check constraints, views, stored procedures, triggers, rules, and defaults.
- Created and modified T-SQL stored procedures for validating data integrity.
- Designed and developed the user interface using the MVC framework in C# with Entity Framework, following best coding practices.
- Configured CodePipeline to trigger when new SQL scripts are pushed to the repository.
- Used AWS CodeBuild to validate and package SQL migration scripts.
- Deployed SQL scripts using AWS CodeDeploy, AWS Lambda, or direct execution in Amazon RDS/Aurora via automated scripts.
- Used the Model, Database, and Code First approaches in Entity Framework.
- Used the Bulk Copy Program (BCP) to transfer data between servers.
- Created and published parameterized, linked, sub-, drill-down, and drill-through reports on large data sources; enhanced and modified SSRS reports.
- Configured the SSRS Report Manager and report server, generated the web URL for the report server, and created a virtual directory for the report server where deployed reports are stored as a repository.
- Created database objects (tables, views, stored procedures, and table-valued functions) for use in integration (ETL) and reporting projects.
- Enhanced and migrated databases, ETL processes, SQL Agent jobs, and reports from SQL Server 2008 R2/2012/2014 to SQL Server 2016.
- Performed data reconciliation, validation, and error handling after extracting data into SQL Server during server migrations.

Environment: MS SQL Server 2015/17/19, T-SQL, Windows 10/8/7 Servers, TFS, SSIS 2015, SSRS 2015, SQL Server Management Studio, LINQ, Visual Studio 2017/15, PL/SQL, CI/CD Pipeline, AWS, Web Services, XML Schema, XSD, XSL, JavaScript, Autosys, Azure Utility, Aspose, EPPlus, Entity Framework
NCDST (North Carolina Department of State Treasurer), Raleigh, NC, Aug 2020 - Mar 2021
Role: SQL/SSIS Developer
Responsibilities:
- Worked closely with developers, quality assurance analysts, architects, business partners, and other technology teams to ensure requirements and design were completely understood.
- Interacted with data modelers and business analysts to understand the requirements and the impact of the ETL on the business.
- Created business-critical stored procedures and functions to support efficient data storage and manipulation.
- Extracted, transformed, and loaded (ETL) data from Excel and flat files into MS SQL Server using DTS and SSIS.
- Worked with SSIS PowerPack ZappySys components to perform multi-functional transmissions in a single ZappySys Task component.
- Created SSIS packages with error handling; worked with different methods of logging in SSIS.
- Extensively worked on fact and Slowly Changing Dimension (SCD) tables.
- Loaded historical data into the Enterprise Data Warehouse using full and incremental loads.
- Built data marts and multi-dimensional models such as Star Schema and Snowflake Schema.
- Filtered data from the transient stage to the EDW using complex T-SQL statements in Execute SQL Tasks and in transformations, and implemented various constraints and triggers for data consistency and to preserve data integrity.
- Wrote T-SQL scripts, dynamic SQL, complex stored procedures, functions, triggers, and SQLCMD scripts.
- Maintained jobs for data messaging from the development server to the test server for generating daily reports.
- Used data conversion tasks in SSIS to load data from flat files into the SQL Server database.
- Performed MS SQL Server configuration, performance tuning, client-server connectivity, query optimization, database maintenance plans, and database consistency checks using DBCC commands.
- Built MDX queries and Data Mining Extensions (DMX) queries for Analysis Services cubes and Reporting Services.
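The incremental loads against SCD tables described above are commonly expressed as a T-SQL MERGE; a minimal Type 1 sketch (overwrite changed attributes) with hypothetical table names:

```sql
-- Hypothetical Type 1 SCD upsert: overwrite changed attributes, insert new members.
MERGE dbo.DimCustomer AS tgt
USING stg.Customer    AS src
      ON tgt.CustomerKey = src.CustomerKey
WHEN MATCHED AND (tgt.City <> src.City OR tgt.Segment <> src.Segment) THEN
    UPDATE SET tgt.City        = src.City,
               tgt.Segment     = src.Segment,
               tgt.UpdatedDate = GETDATE()
WHEN NOT MATCHED BY TARGET THEN
    INSERT (CustomerKey, City, Segment, UpdatedDate)
    VALUES (src.CustomerKey, src.City, src.Segment, GETDATE());
```

A Type 2 dimension would instead expire the current row and insert a new one with fresh effective dates.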
- Performed database transfers, query tune-ups, integrity verification, data cleansing, analysis, and interpretation.
- Conducted ETL process design and sizing estimates for large-scale data warehousing projects.
- Wrote and analyzed several SQL queries to validate business rules in an Oracle database through SQL Developer as part of back-end testing.
- Designed ETL packages dealing with different data sources (SQL Server, flat files) and loaded the data into target data sources by performing different kinds of transformations using SQL Server Integration Services (SSIS).
- Coded SSIS processes to import data into the data warehouse from Excel spreadsheets, flat files, and OLE DB sources.
- Built and maintained SSIS packages to import and export data from various data sources using BIDS, based on the design data models.
- Developed, monitored, and deployed SSIS packages; created documentation for the self-service deployment and execution process for SSIS packages.
- Constructed OLTP and OLAP databases.
- Created complex SSAS cubes with multiple fact measure groups and multiple dimension hierarchies based on OLAP reporting needs.
- Created calculated fields with MDX code to meet additional measure needs calculated from existing measures.
- Implemented cell-level security in cubes using MDX expressions in SSAS to prevent users of one region from seeing another region's data.
- Performed unit and system testing, troubleshooting, and bug fixing in development and QA environments.

Environment: SQL Server 2008/2012, Windows Server, SSIS, CI/CD Pipeline, AWS, SSAS, SSRS, Excel, T-SQL, PL/SQL, MS Access, Team Foundation Server (TFS), Mainframe VSAM & Flat Files, Power BI

Wells Fargo, Charlotte, NC, Aug 2018 - Aug 2020
Role: SQL Developer
Responsibilities:
- Performed T-SQL tuning and optimization of long-running queries using MS SQL Profiler and Database Engine Tuning Advisor.
- Reverse-engineered the backend database to change T-SQL scripts and create views, stored procedures, triggers, and functions, improving performance drastically.
- Experienced working in Agile methodology.
- Improved application performance by replacing old cursor logic in various triggers with complex set-based T-SQL queries.
- Performed data modeling to understand the relationships between entities in the database.
- Worked with SQL Server BIDS 2012.
- Designed and developed SSIS (ETL) packages to validate, extract, transform, and load data from the OLTP system and the staging database (PSD) into the data warehouse (PDW).
- Applied various data transformations such as Aggregate, Sort, Multicast, Conditional Split, and Derived Column in SSIS.
- Developed and deployed SSIS packages for ETL from OLTP and various sources to staging, and from staging to the data warehouse, using Lookup, Fuzzy Lookup, Derived Column, Conditional Split, Term Extraction, Slowly Changing Dimension, and more.
- Performed ETL mappings using MS SQL Server Integration Services.
- Created VB.NET scripts for data flow and error handling using the Script Component in SSIS.
- Developed and deployed data transfer packages using SSIS and maintained the jobs running on the servers.
- Extensively used transformations such as Conditional Split, Data Conversion, Derived Column, Multicast, Row Count, Merge, Union All, Script Task, Execute Package Task, Expression Task, Bulk Insert Task, and Execute SQL Task, along with the Foreach Loop, For Loop, and Sequence containers.
- Wrote C# scripts in Script Tasks in SSIS packages.
- Used the Bulk Copy Program (BCP) to transfer data between servers.
- Created and published parameterized, linked, sub-, drill-down, and drill-through reports on large data sources; enhanced and modified SSRS reports.

Environment: MS SQL Server 2015/13, T-SQL, PL/SQL, Web Services, XML Schema, XSD, XSL, JavaScript, Windows 10/8/7 Servers, TFS, SSIS 2015, SSRS 2015, SQL Server Management Studio, LINQ,
Visual Studio 2017/15, .NET Framework 4.0 and 4.5, C#.NET, ADO.NET, Autosys, Azure Utility, Aspose, EPPlus, Entity Framework

Gainwell/DXC Technologies, Conway, AR, May 2017 - Aug 2018
Role: SQL/SSIS Developer
Responsibilities:
- Generated database SQL scripts and deployed databases, including installation and configuration.
- Created and developed stored procedures and triggers to handle complex business rules, history data, and audit analysis.
- Filtered bad and inconsistent data using complex T-SQL statements and implemented various constraints and triggers to maintain data consistency.
- Used SQL scripts to find heavily fragmented indexes and reorganized them accordingly to get the desired performance from the tables.
- Used joins and subqueries to simplify complex queries involving multiple tables.
- Migrated data from heterogeneous data sources and legacy systems to a centralized SQL Server using SQL Server Integration Services (SSIS) to overcome transformation constraints.
- Implemented error handlers and event handlers in SSIS packages.
- Created packages to validate, extract, transform, and load data to the centralized SQL Server using OLE DB providers from the existing diversified data sources.
- Performed visual mapping to write data from the upstream pipeline to the appropriate MongoDB field.
- Performed data analysis entailing the flow of data and its interdependencies among QNXT core modules such as the Claim, Provider, Member, and Affiliates modules.
- Involved in the requirement analysis, design, and development phases of the Agile SDLC model.
- Involved in FACETS implementation, including end-to-end testing of Facets, claim processing, and the Subscriber/Member module.
- Developed various SQL objects such as functions, tables, views, triggers, indexes, and constraints, and created dynamic and static stored procedures and queries to develop a decision engine.
- Identified and resolved problems related to EDI processes.
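The index maintenance described above typically queries the fragmentation DMV and then reorganizes or rebuilds; a sketch against the current database (the 5%/30% thresholds are common rules of thumb, and the maintenance statements use hypothetical names):

```sql
-- Find fragmented indexes in the current database and suggest a maintenance action.
SELECT OBJECT_NAME(ips.object_id)          AS TableName,
       i.name                              AS IndexName,
       ips.avg_fragmentation_in_percent,
       CASE WHEN ips.avg_fragmentation_in_percent > 30 THEN 'REBUILD'
            WHEN ips.avg_fragmentation_in_percent > 5  THEN 'REORGANIZE'
            ELSE 'OK'
       END AS SuggestedAction
FROM   sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN   sys.indexes AS i
       ON i.object_id = ips.object_id AND i.index_id = ips.index_id
WHERE  i.name IS NOT NULL
ORDER  BY ips.avg_fragmentation_in_percent DESC;

-- Example maintenance statements (table/index names hypothetical):
-- ALTER INDEX IX_Claims_MemberID ON dbo.Claims REORGANIZE;
-- ALTER INDEX IX_Claims_MemberID ON dbo.Claims REBUILD;
```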
- Followed Agile methodologies (Scrum, Extreme Programming) and test-driven development.

Environment: Visual Studio 2015/13/12, .NET Framework 4.0 and 4.5, C#.NET, ASP.NET 4.0, ADO.NET, MS SQL Server 2014/12, T-SQL, PL/SQL, Web Services, XML Schema, XSD, XSL, JavaScript, Windows 10/8/7 Servers, TFS, SSIS, SSRS, LINQ, SQL Server, MongoDB

TriZetto (Cognizant), Phoenix, AZ, Jan 2016 - May 2017
Role: Data Engineer/ETL Developer
Responsibilities:
- Initially loaded data into staging tables from different source database systems using various transformations, control flow tasks, loop containers, and data flow tasks, then moved the data into different schemas.
- Created, analyzed, designed, implemented, developed, and validated 837P, 837I, and 834 EDI transactions.
- Involved in the claims adjudication process of the FACETS application.
- Experienced with Medicare and Medicaid (MMIS) claims processing, Medicaid billing, Medicare membership and eligibility verification, and care management.
- Used FACETS support systems to enable inbound/outbound HIPAA EDI transactions in support of HIPAA 834 transactions.
- Worked with Facets TriZetto batch scheduling and monitored daily for updates across various Facets domains.
- Collected and analyzed requirements from clients to design suitable software for them.
- Performed general data mapping of claim/encounter EDI files (837I, 837P, 834), including coding, translation, development, testing, and deployment.
- Programmed, including writing T-SQL in SQL Server and SSIS packages, to support exporting claim/encounter EDI files.
- Wrote and analyzed several SQL queries to validate business rules in Oracle.
- Designed ETL packages dealing with different data sources (SQL Server, flat files) and loaded the data into target data sources by performing different kinds of transformations using SQL Server Integration Services (SSIS).
- Implemented SSIS data loads, created maintenance procedures, and provided data integrity strategies.
- Applied knowledge of SQL Server and the ability to design and develop against processed EDI data.
- Created ETL packages using SSIS to move data from various heterogeneous data sources.
- Used SSIS to create ETL packages to validate, extract, transform, and load data into data warehouse and data mart databases, and processed SSAS cubes to store data in OLAP databases.
- Expert in removing duplicate rows (de-duplication) from a table and applying Fuzzy Lookup in SSIS.
- Performed day-to-day database maintenance tasks, including database monitoring, backups, space management, and resource utilization.
- Strong knowledge of Medicaid and Medicare transactions.

Environment: .NET Framework 4.0, C#, ASP.NET, HTML, VB.NET, JavaScript, CSS, XML, XSLT, AJAX 2.0, ADO.NET, Web Services, WPF, HTML5, jQuery, WCF, SQL Server 2008, SSRS, TFS, Web Forms, IIS, Windows Server XP clients, Visual Studio 2008, Microsoft Office

Accenture, Hyderabad, India, Jan 2013 - Dec 2014
Role: SQL Developer/SSIS Developer
Responsibilities:
- Analyzed and implemented an enhancement for an existing data mart by adding new dimensions and fact tables.
- Used the SSIS ETL system to bring together data from different systems, including text files, spreadsheets, SQL Server, and Oracle databases.
- Developed SQL queries, functions, stored procedures, and triggers to perform back-end testing of data.
- Wrote PL/SQL procedures and batch processes.
- Used the Model, Database, and Code First approaches in Entity Framework.
- Developed various database queries and stored procedures using SQL Server and T-SQL.
- Designed and developed SSIS packages, stored procedures, configuration files, tables, views, and functions; implemented best practices to maintain optimal performance.
- Managed the daily scheduled run of the data mart jobs to ensure that data is available for the business and marketing analysts first thing in the morning.
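The de-duplication mentioned above is commonly done in T-SQL with ROW_NUMBER over the business key, deleting all but the newest row; names below are hypothetical:

```sql
-- Hypothetical de-duplication: keep the newest row per business key, delete the rest.
WITH Ranked AS (
    SELECT *,
           ROW_NUMBER() OVER (PARTITION BY MemberID, ClaimID
                              ORDER BY LoadDate DESC) AS rn
    FROM   dbo.StagingClaims
)
DELETE FROM Ranked
WHERE  rn > 1;
```

Deleting through the CTE removes rows from the underlying table, since the CTE is updatable in SQL Server.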
- Developed unit testing strategies for the ETL processes and provided improvement suggestions on requirements to business users.
- Implemented error handling and rollback processes in the ETL load.
- Configured SSIS packages using the Package Configuration Wizard to allow packages to run in different environments.
- Used SSIS to create ETL packages to validate, extract, transform, and load data into data warehouse and data mart databases, and processed SSAS cubes to store data in OLAP databases.
- Removed duplicate rows (de-duplication) from tables and applied Fuzzy Lookup in SSIS.
- Created information and process flow diagrams, as well as data flow diagrams, that captured user data and reporting requirements for direct mail and database marketing systems.
- Performed analysis and database design using SQL Server.
- Wrote T-SQL stored procedures and batch queries in SQL Server.
- Responsible for all facets of the Software Development Life Cycle (SDLC), including requirements gathering, establishing technical specifications, setting deadlines and milestones, design, coding, testing, quality assurance, and production deployment.
- Generated business reports using SQL Server 2005 Reporting Services (SSRS).
- Used SQL Server Integration Services (SSIS) to transform data from flat files to SQL Server 2005.
- Created Windows Services to run the SSIS package to load data into a database daily.

Environment: MS SQL Server 2008 R2/2012/2014, T-SQL, SSRS 2008 R2, SSIS 2008 R2, SSAS 2008 R2, Oracle, PL/SQL, Facets, Crystal Reports, Windows 8/7, ASP.NET, WCF