Karthik - Data Architect
[email protected]
Location: USA
Relocation: Yes
Visa: H1B
Resume file: Sri Karthik_Senior Data Architect and Senior BI Technical Project Manager_1745501665976.docx
KARTHEEK PS
SENIOR DATA ARCHITECT & SENIOR TECHNICAL PROJECT MANAGER
+1 (469) 444-8898

PROFESSIONAL SUMMARY
- 17+ years of IT experience as Senior Data Architect/Technical Project Manager with expertise in Business Intelligence, Data Warehousing, Data Analytics, Big Data and Cloud (BI/DW), implementing solutions for various business situations
- Hands-on experience in projects using data analytics/visualization tools like Tableau, Power BI and Qlik for data analysis and business insights
- Extensive experience in Data Architecture, Data Integration and Data Modeling, along with knowledge of Information Management in the areas of Enterprise Architecture Planning and Data Governance, and in-depth expertise in Data Warehousing
- Experience in creating Data Governance policies, Data Management, Data Dictionary, Data Mapping, Reference Data, Metadata, Data Lineage and Data Quality rules; documenting data lineage for the data landscape (both detailed and high-level views), identifying authoritative data sources and documenting all data quality controls
- Good knowledge and understanding of implementing predictive analytics (statistical, predictive and machine learning)
- Working as a training specialist on Data Management programs focused on virtualization, governance, master data, quality, catalog, metadata and data analytics for in-house client technology teams and business data teams; online classes on Data Architecture; and corporate trainer for small-scale technology-service-driven organizations
- Hands-on experience in database design and development of business intelligence applications in Microsoft SQL Server environments, including migrations from one server version to another
- Hands-on experience implementing migration projects on cloud platforms such as AWS, on data services related to data warehousing and data lakes
- Knowledgeable and certified, with hands-on experience and POC implementations, in architecture and design (healthcare application data) for data stream ingestion using AWS and the Snowflake cloud data warehouse
- Hands-on in Snowflake data modeling, ELT using Snowflake SQL, implementing stored procedures, and standard DWH ETL concepts; responsible for technical delivery review, resolution of architecture issues, and migration of existing on-premises applications to the AWS/Snowflake platform
- Worked on Snowflake components (Snowpipe, SnowSQL, stored procedures, streams, tasks, etc.) for data transformations
- Hands-on and strong experience in SSIS package development and report/dashboard development using native SSRS; experience in writing MDX queries, multi-dimensional modeling, creation and maintenance of cubes, processing cubes, and creating perspectives, partitions, attribute relationships, aggregations and KPIs in SSAS
- Full life cycle experience in modeling, design and development of Enterprise Data Warehouse (EDW) and Data Lake implementations
- Hands-on experience supporting web-based, mission-critical applications, including logical and physical database design, development, performance tuning and change management
- Experience in all phases of the SDLC, focusing on translating business requirements into viable technical solutions
- Proven ability to work with business users to take requirements and translate them into technical designs to facilitate BI reports and analytics
- Experience using business requirements to create functional specifications, mapping documents, test plans and status reports, and confirming the specifications with business analysts and stakeholders
- Worked in Healthcare, Marketing, Manufacturing, Oil & Gas supply chain, Insurance and Banking domains
- Worked directly with clients in India, Singapore, USA, UK, Australia, China, Netherlands and Hong Kong
- Worked on implementing test plans and strategy, review of test cases written at unit and integration level, and review of database and ETL code
- Guide, mentor and train BI team members through technical solution deliverables
- Experience training business users on the use of data visualization tools like Power BI, Tableau or similar Business Intelligence software
- Prepare and dispense work assignments, provide guidance, and/or review the work of local and offshore development team members
- Develop technical documentation to define system components, the development environment and implementation planning/strategies
- Certified in AWS & Snowflake clouds as Architect and Data Engineer, and in Tableau & Power BI Data Analytics & Admin
- Certified in PMP (PMI) and CDMP (DAMA International)
- Various roles played: Programmer, Senior Analyst, Senior Technical Consultant, Data Modeler, Technical Lead, Team Lead, Project Lead and Solution Architect
- Latest role: Senior Consultant/Technical Project Lead/Technical Project Manager, Solution and Senior Data Architect in BI, Data Warehousing, Data Analytics and Cloud

EDUCATIONAL QUALIFICATION
- Bachelor of Technology (B.Tech) in Electronics and Communication from JNTU, India (2004 - 2007)
- Diploma in Electronics and Communication from SBTET, India (2001 - 2004)

TECHNICAL SKILLS
Microsoft Technologies : Microsoft Business Intelligence (MSBI), Microsoft Power BI
Databases : SQL Server 2005/2008 R2/2012/2014/2016/2019/2022, Vertica, PostgreSQL, Oracle
SQL Analysis Services (BI) : MDX, DMX, XMLA
Data Modeling : Relational, Dimensional, ODS, Structured/Unstructured, OLTP, OLAP
Life Cycle Expertise : Requirements analysis, Design, Coding, Testing, Database tuning
Hadoop Platform : Cloudera Hadoop distributions, Hive, Impala and Presto
Data WH and Integration : MSBI - SSIS, SSRS and SSAS, Oracle Data Integrator, Pentaho, Matillion, AWS Glue, ADF
Data Modeling Tools : Erwin Data Modeler, ER/Studio, Lucidchart, Draw.io, SSMS, SqlDBM and Toad Data Modeler
Data Catalog Tool : Collibra (Data Governance and Data Quality)
Project Management : JIRA, Asana, Microsoft Project
Version Control : Visual SourceSafe (VSS), Team Foundation Server (TFS), SVN and Git
Predictive Analytics : Statistical Modeling, Predictive Modeling and Machine Learning
Cloud Data Platform : Snowflake Cloud (Snowpipe, Streams & Tasks, Snowpark, clustering, SnowSQL, Data Sharing, Security & Governance)
Cloud Technologies : AWS (Amazon Web Services) S3, EC2, RDS, Redshift, Athena, EMR, Glue, Data Pipeline, DMS, SCT, CloudWatch, Lambda, QuickSight, IAM, KMS, DataZone, CloudTrail, Kinesis, DataBrew, Aurora, DynamoDB, Transfer & Snowball
Analytics/Reporting Tools : ProClarity, Crystal Reports, PowerPivot (BI), Tableau, Power BI, QlikView/Qlik Sense and Oracle DVD

ACHIEVEMENTS
- Best Employee of the Year (Development Team) 2009 - awarded by Mahindra Satyam
- A+ rating in yearly performance reviews (2009 and 2010) - awarded by Mahindra Satyam
- Spot Performance Awards in projects (2013) - awarded by KPIT Technologies
- Received appreciation for leading and handling project deliveries
- Best Technical Trainer appreciation for specific analytical tools and BI platform technologies from large clients in Singapore, especially from data and stakeholder teams

PROFESSIONAL EXPERIENCE

PROJECT: #18
Project : Program Data Management and Open Data Upgrade
Company : Universal Service Administrative Co., USA
Technologies : SQL Server, Postgres, Oracle, Vertica, Tableau, Pentaho, MSBI, Collibra, Jira, Confluence, AWS Data Platform Services
Duration : May 2023 to Present
Team Handled : 4
Position : Senior Data Architect & BI Manager

PROJECT DESCRIPTION
USAC (Universal Service Administrative Company) is an independent not-for-profit designated by the FCC to administer the Universal Service Fund (USF). With the guidance of policy created by the FCC, USAC collects and delivers funding through four programs (E-Rate, Rural Health Care, Lifeline and High Cost) focused on places where broadband and connectivity needs are critical. These programs serve people in rural, underserved, and difficult-to-reach areas. The Data Architect's main role for these programs is to support and audit the existing systems and to provide expert advice and solutions in data management, focusing on data quality, security and governance practices, with standardization and migration to cloud services for the different program departments in the organization.
RESPONSIBILITIES
- Collaborating with key stakeholders, including division-based data stewards as well as technical system owners, to establish an understanding of the organization's vast processes and data
- Designing and implementing an efficient and adaptable data model to support data exploration, data science, analytics and GIS, resulting in insights that inform key business decisions
- Participating in process improvement initiatives and in setting best-practice standards around Data Architecture, Metadata, Governance, Master Data Management and Data Quality flows
- Investigating opportunities to increase the value of the data to the business while focusing on projects that improve business processes
- Designing, building and maintaining flexible and scalable data warehouse solutions to support Business Intelligence (BI) and reporting projects
- Providing expertise in the creation and refinement of agile technical stories for data architecture and development
- Serving as Subject Matter Expert in data warehouse concepts (Kimball, Inmon, star schema, and normalized and de-normalized data models)
- Migrating Postgres and Vertica DWHs and DBs to the AWS cloud
- Helping to develop, implement and maintain appropriate data governance on the data warehouse and associated metadata for purposes of data quality, integrity, availability and security
- Migrating Pentaho ETL jobs to the AWS cloud using AWS Data Pipeline
- Monitoring and analyzing performance to identify and recommend optimizations
- Coordinating with team members and leading the implementation of Master Data Management in conjunction with the new data warehouse
- Leading the integration of the data warehouse with a modern, leading business intelligence platform

Snowflake Cloud Project:
Responsible as Senior Architect and Consulting Engineer for the data warehouse migration from Vertica.
- Main implementation tasks: data architecture, data analysis and design, migration planning, Snowflake setup, schema mapping, data split and staging, query optimization with cluster keys, and data validation
- Migration implemented using both ETL and replication methods with Matillion, Snowpipe and SnowSQL (see the illustrative SQL sketch after this project)
- Creating technical documentation
- Planning training sessions for the technology team

Collaborative Data Sharing Platform and Source Data Integration (POC):
Responsible as Senior Consulting Engineer.
Created a data-sharing platform for program units under the FCC within the company to securely share data for analytics and reporting. Leveraged AWS DataZone to centralize metadata management, set up secure data-sharing protocols, and simplify cross-team collaboration. Developed an automated data integration pipeline that brought together data from multiple external and internal sources into AWS, using AWS DataZone to catalog and organize the datasets for ease of use. The solution streamlined data discovery and ensured data was cataloged according to governance policies.
Responsibilities:
- Configured AWS DataZone to enable safe and efficient data sharing across teams, maintaining data security and integrity
- Integrated AWS Redshift and Athena to perform data analysis and reporting on shared data
- Created secure access policies and managed user permissions to ensure only authorized personnel could access sensitive data
Key Achievements:
- Enhanced Collaboration: increased collaboration among business teams by 62%, with seamless access to shared data
- Reduced Duplication: prevented duplicate data silos by centralizing data storage and ensuring efficient sharing practices
- Security and Compliance: ensured compliance with data privacy regulations by establishing granular data access controls and audit logs
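The staged-load-and-cluster pattern used in the Vertica-to-Snowflake migration above can be sketched in Snowflake SQL roughly as follows. This is a minimal, illustrative sketch only: the schema, stage, table and column names, the S3 location and the storage integration are hypothetical placeholders, not the actual USAC objects.

    -- Hypothetical landing schema and stage over the exported Vertica files.
    CREATE SCHEMA IF NOT EXISTS dwh;

    CREATE OR REPLACE STAGE dwh.migration_stage
      URL = 's3://example-bucket/vertica-export/'          -- assumed export bucket
      STORAGE_INTEGRATION = s3_int                          -- assumed pre-created integration
      FILE_FORMAT = (TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '"' SKIP_HEADER = 1);

    CREATE OR REPLACE TABLE dwh.program_disbursements (
      program_code   STRING,
      funding_year   NUMBER(4,0),
      disbursed_amt  NUMBER(18,2),
      disbursed_date DATE
    );

    -- Bulk load the staged export files for this table.
    COPY INTO dwh.program_disbursements
      FROM @dwh.migration_stage/program_disbursements/
      ON_ERROR = 'ABORT_STATEMENT';

    -- Query optimization with cluster keys on commonly filtered columns.
    ALTER TABLE dwh.program_disbursements CLUSTER BY (funding_year, program_code);

    -- Simple data validation: reconcile the row count against the Vertica extract.
    SELECT COUNT(*) AS loaded_rows FROM dwh.program_disbursements;

Cluster keys would typically be added only on large, frequently filtered tables, with row counts reconciled against the source extracts as part of the data validation step.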
PROJECT: #17
Project : Neuroglee Connect Platform
Company : Neuroglee Therapeutics, Singapore and USA (Remote)
Technologies : AWS, MySQL, SQL Server, PostgreSQL, Tableau Desktop 19.x/20.x, Tableau Server, Python, TypeORM, React and Node JS, Java, Spring Boot, RudderStack, Snowflake
Duration : July 2022 to Sep 2022, plus temp support (Nov 2022 to Apr 2023)
Team Handled : 5
Position : Senior Data Architect

PROJECT DESCRIPTION
The Neuroglee Connect platform is a virtual care program focusing on patients with neurological conditions like mild cognitive impairment and Alzheimer's disease. The system consists of four sub-components:
- iOS app for the patient
- iOS and Android app for the patient's loved ones and care partners
- Clinical dashboard for hospital systems to remotely manage and monitor patients
- AI engine in the AWS cloud providing personalized care to the patients
Patient app interaction data is sent to the AWS cloud as a data stream and stored in a data lake, where it is analyzed by the AI engine. The NG cloud service NG-MCI is the central hub for the care partner, patient and dashboard. The cloud service provides GraphQL-based APIs for both the applications and the dashboard to interact with the cloud.

RESPONSIBILITIES
- Architected and designed a cloud-agnostic, open, modular, highly scalable, expandable and maintainable architecture used as the core data analytics product model in Neuroglee
- Worked with the product owner to develop and maintain a product architectural roadmap aligned with business and enterprise architecture strategies and standards
- Created, maintained and tested the optimal data pipeline architecture
- Defined POC implementation rationale and mappings/data collections, based on discussions with the data science team, for the different data streams from IoT applications collecting data on patients', doctors' and caretakers' video meetings, video calls and gaming modules, to improve new features and add-on supplements in the applications
- Created two architectural solution designs for POCs using Python, RudderStack streams, Kinesis streams, S3, SNS, Snowpipe and Snowflake (see the illustrative SQL sketch after this project)
- Worked closely with the data science and platform engineering teams to integrate into the production systems and to ensure the functionality of the analytics engine
- Mentored data engineers in building and maintaining robust and scalable data pipelines and data stores through good design and architecture
- Managed the joint operations team that maintains the computing infrastructure for internal product & analytics modelling
- Assisted internal customers in crafting technical content for analytics-related tender responses
- Ensured data governance, data security, and compliance with data retention policies (e.g., HIPAA, GDPR)
- Drafted and implemented the ELT and data pipelines based on the requirements using cloud services and tools
- Acted as final reviewer for quality control of product development code
- Conducted periodic technology scans and reviews of potentially useful or open-source software components or products, as well as competitor product analysis
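One of the POC designs above (event files landed in S3 via SNS notifications, continuously ingested into Snowflake and transformed on a schedule) follows the standard Snowpipe -> stream -> task pattern. The sketch below is illustrative only: it assumes a pre-created external stage over the event bucket, and all schema, table, column and warehouse names are hypothetical placeholders.

    -- Landing and analytics tables for raw JSON app events (names are placeholders).
    CREATE SCHEMA IF NOT EXISTS raw;
    CREATE SCHEMA IF NOT EXISTS analytics;

    CREATE OR REPLACE TABLE raw.app_events (
      event_json VARIANT,
      loaded_at  TIMESTAMP_NTZ DEFAULT CURRENT_TIMESTAMP()
    );

    CREATE OR REPLACE TABLE analytics.app_events (
      patient_id STRING,
      event_type STRING,
      event_ts   TIMESTAMP_NTZ
    );

    -- Snowpipe: S3 event notifications (SNS/SQS) trigger the loads automatically.
    CREATE OR REPLACE PIPE raw.app_events_pipe
      AUTO_INGEST = TRUE
    AS
      COPY INTO raw.app_events (event_json)
      FROM (SELECT $1 FROM @raw.app_events_stage)           -- assumed external stage
      FILE_FORMAT = (TYPE = JSON);

    -- Stream tracks only the newly loaded rows.
    CREATE OR REPLACE STREAM raw.app_events_stream ON TABLE raw.app_events;

    -- Scheduled task flattens the delta into the analytics table.
    CREATE OR REPLACE TASK analytics.load_app_events
      WAREHOUSE = transform_wh                               -- assumed warehouse
      SCHEDULE  = '5 MINUTE'
    WHEN SYSTEM$STREAM_HAS_DATA('raw.app_events_stream')
    AS
      INSERT INTO analytics.app_events (patient_id, event_type, event_ts)
      SELECT event_json:patientId::STRING,
             event_json:eventType::STRING,
             event_json:eventTs::TIMESTAMP_NTZ
      FROM raw.app_events_stream;

    ALTER TASK analytics.load_app_events RESUME;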
PROJECT: #16
Project : Single Source of Truth (SSoT) and Security Layer Enhancement
Company : HCL, Singapore and Facebook, USA
Technologies : Hadoop, Hive, Presto, MySQL, SQL Server, Oracle, Tableau Desktop 19.x/20.x, Tableau Server, Python
Duration : Sep 2021 to Aug 2022
Client : Meta (Facebook)
Team Handled : 67
Position : Senior Data Architect/TPM (Technical Project Manager)

PROJECT DESCRIPTION
This project covers the enhancement of the Single Source of Truth (SSoT) and access security: analyzing and identifying existing and new sources from systems such as messenger, tasks, live chat, Bomgar, the helpdesk ticketing system, mailing, facilities management, internal bot engagement and HR within the company, to gather data into a single location for reporting and analysis; and identifying requirements from stakeholders, business technical teams and data strategy teams to integrate source data into the SSoT and build enterprise/ad-hoc dashboards using Tableau.

RESPONSIBILITIES
As a Senior Data Architect:
- Reviewing all data source systems to understand what is available and completing an audit of the current state of the SSoT and dashboards/reports
- Performing an internal audit of all systems implemented in the existing environment
- Identifying gaps and issues in the existing environment and providing recommendations with usable solutions
- Documenting the data mapping and data dictionary for all the data sources and building the data flow architecture
- Developing the short-term and long-term data architecture strategy for enhancement of the SSoT
- Designing, optimizing and maintaining effective database solutions and data models for the storage and retrieval of data
- Partnering with Data Engineering and other cross-functional partners to aid development of core datasets
- Working with the business to analyze and document structural requirements for new systems and applications, designing conceptual and logical data models and flowcharts
- Researching and properly evaluating sources of information to determine possible limitations in reliability or usability
- Planning and managing data integrity/data quality improvement and replenishment initiatives
- Defining governance structures and processes to manage information supporting the control systems
- Assisting in migration of data from legacy systems to new solutions
- Recommending solutions to improve new and existing database systems
- Monitoring and optimizing performance by performing regular tests, troubleshooting and integrating new features
- Creating the access control layer for the SSoT tables so that users' permissions are resolved through authorization controls (see the illustrative SQL sketch after this project)
- Analyzing the Tableau dashboards to be developed based on the requirements and guiding the team through implementation
- Reviewing the dashboards for performance issues and setting up development standards for implementation and improvements
As a TPM (Technical Project Manager):
- Analyzing and planning based on the requirements provided by the business team and implementing standards for the scheduled project
- Assigning and overseeing the daily tasks of technical personnel while ensuring all subordinates are actively working toward established milestones
- Establishing and implementing training processes, where required, for the business analytics and data strategy teams and for specific technical personnel
- Holding regular technical team meetings to determine progress and address any questions or challenges regarding specific data projects initiated by different senior strategic managers
- Updating and maintaining all production technologies, ensuring proper maintenance and installation
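A minimal sketch of the view-plus-grant pattern such an access control layer typically relies on, written as Hive SQL and assuming SQL-standard-based authorization is enabled. The database, table, view, role and user names below are hypothetical placeholders rather than the actual SSoT objects.

    -- Restricted view exposing only the columns a reporting role may see;
    -- the base table (with requester PII) stays out of direct reach.
    CREATE VIEW IF NOT EXISTS ssot.v_helpdesk_tickets AS
    SELECT ticket_id,
           category,
           status,
           opened_date,
           closed_date
    FROM ssot.helpdesk_tickets;

    -- Role-based grants resolve user permissions at the view level.
    CREATE ROLE reporting_analyst;
    GRANT SELECT ON TABLE ssot.v_helpdesk_tickets TO ROLE reporting_analyst;
    GRANT ROLE reporting_analyst TO USER example_analyst;    -- placeholder user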
PROJECT: #15
Project : MOT Performance Management Reporting for P&L Accounts/Balance Sheet Statements
Company : PERSOLKELLY, Singapore (PERSOL Holdings)
Technologies : Hadoop, Hive, Impala, Presto, Oracle, Tableau Desktop 19.x/20.x, Tableau Server, Tivoli Workload Scheduler (TWS), QlikView, Qlik Sense
Duration : 1.2 years [July 2020 to Sep 2021]
Client : DBS Bank, Singapore
Team Handled : 26
Position : Technical Project Lead/Senior Data Architect

PROJECT DESCRIPTION
This project covers the development of enterprise performance management reporting dashboards for the MOT (Middle Office Technology) Finance group. The dashboards relate to Balance Sheet and Profit & Loss account statements, based on the agreed wireframes.

RESPONSIBILITIES
- Led a team of consultants, analyzing the business needs, identifying a suitable solution and guiding them in the design and development of the solution
- Managed and supervised a team of 26 personnel comprising ETL developers, data analytics engineers and admins
- Maintained active relationships with business partners/stakeholders to understand business requirements
- Reviewed and performed a full audit of the existing data model in Hive, scripts, ETL logic, data dictionary and data pipeline, and documented data gaps and issues for further review by the business and subsequent implementation changes
- Prepared data architecture and data flow diagrams of entities
- Implemented security access to Hive tables for different groups of users, and data project access lists for use in extraction pipelines
- Developed dashboards for P&L and Balance Sheet using Tableau
- Identified and ingested data from different data sources such as databases and file systems
- Performed analysis on source data for quality checks and evaluated detailed business and technical requirements
- Involved with the ETL team in the design and build of the Hadoop Data Lake for data storage, data processing and visualization using Hive, Impala, SQL and Tableau dashboard reporting (see the illustrative SQL sketch after this project)
- Designed physical data mappings for P&L and Balance Sheet attribute specifications with the help of the business users' team
- Involved with the ETL team in developing tables/views using the Spark API over Cloudera Hadoop to perform analytics on data in Hive
- Worked on the TWS scheduler to automate jobs based on event triggers between ETL jobs and Tableau extract jobs on Tableau Server using shell scripts
- Worked with Agile development methodologies, tools and processes; participated in Scrum activities, sprint planning, story writing, backlog grooming and mid-sprint reviews
- Designed and developed Tableau dashboards/workbooks for Balance Sheet and P&L
- Maintained Tableau Server activities for user logins, content management (permissions, projects, sites, etc.), administrative activities, server licensing, version upgrades, and deployment activities for UAT and Production
- Mentored others on coding standards, data mapping design, code integration and performance tuning
- Collaborated with various cross-functional teams (infrastructure, network, database and application) on development, setup and framework rollout activities
- Supported the QlikView project for existing Balance Sheet report dashboards and their change requests
- Developed dashboards using Qlik Sense for POCs for business users
- Involved in modifying, testing and implementing existing QlikView dashboards by updating QV data models and QV objects extracted from multiple data sources such as SQL, Oracle and flat files (CSV, Excel)
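A minimal HiveQL sketch of how data-lake tables for this kind of P&L reporting are commonly laid out: an external, date-partitioned table over curated extracts, plus an aggregated view for the Impala/Tableau layer. The database, column and path names are hypothetical placeholders, not the bank's actual model.

    -- External table over curated P&L extracts landed in the data lake.
    CREATE DATABASE IF NOT EXISTS finance_dwh;

    CREATE EXTERNAL TABLE IF NOT EXISTS finance_dwh.pnl_entries (
      gl_account    STRING,
      cost_centre   STRING,
      currency_code STRING,
      amount        DECIMAL(18,2)
    )
    PARTITIONED BY (business_date STRING)
    STORED AS PARQUET
    LOCATION '/data/finance/pnl_entries';            -- placeholder HDFS path

    -- Register a newly landed partition once the ETL drop completes.
    ALTER TABLE finance_dwh.pnl_entries ADD IF NOT EXISTS
      PARTITION (business_date = '2021-06-30');

    -- Aggregated reporting view consumed by the Impala/Tableau layer.
    CREATE VIEW IF NOT EXISTS finance_dwh.v_pnl_by_account AS
    SELECT business_date,
           gl_account,
           SUM(amount) AS total_amount
    FROM finance_dwh.pnl_entries
    GROUP BY business_date, gl_account;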
PROJECT: #14
Project : MAS Notice 610 (Monetary Authority of Singapore) and Regulatory Reporting for the Singapore Region
Company : OneAston, Singapore
Technologies : SQL Server 2016, SSIS, SSRS, VERMEG AgileReporter and JAMS Scheduler
Duration : 1.1 years [May 2019 to June 2020]
Client : ABN AMRO Bank, Singapore
Team Handled : 14
Position : Technical Lead/Data Architect (DW/Analytics)

PROJECT DESCRIPTION
This project concerns MAS 610 regulatory reporting. As per the MAS circular, all financial institutions and banks had to submit reports in the new Notice 610 reporting format before 1 April 2020; subsequent releases of notice amendments followed, and the parallel run for producing the report to MAS was extended until 1 October 2020.
Challenges: the bank has huge volumes of data coming from different source systems at monthly, weekly and daily frequencies. After receiving the data, the bank has to process it in the DWH and generate the reports for Vermeg AgileReporter. Linking data from the different source systems and creating reports in the prescribed notice formats is very challenging.
Solution: we receive the data from the different source systems, then analyze and process it into the data warehouse. Only about 60% of the MAS requirements were covered by available data; the remaining 40% was a data gap to be discussed with the finance team to define the business reporting logic. From the data warehouse, the data is sent to Vermeg, which generates the reports in the new MAS 610 regulatory reporting format.
RESPONSIBILITIES
- Leading the team and assigning development tasks as per requirements
- Managed and supervised a team of 14 personnel comprising BI data engineers, analytics developers and system analysts
- Preparing the development tasks and time schedule to meet the deadlines set by the business reporting requirements
- Providing technical and business-logic guidance (as defined with Finance) to the team for implementation
- Training the developers on technical design and implementation of SSIS packages
- Designing the DWH dimensional model and developing the SQL databases for the External, Staging, DWH and Vermeg Reporting layers
- Designing SSIS packages for loading data from the different systems into the SQL databases: External to Staging, Staging to DWH, and DWH to the Vermeg ETL reporting layer, as per the BI design and requirements (see the illustrative T-SQL sketch after Project #13)
- Involved in code review of SQL logic scripts and SSIS packages
- Conducted and managed daily agile sprints with the team members and business analysts using MS Project and SharePoint tasks
- Documenting the detailed technical design, business logic and source system mappings, coding standards and test documents
- Involved in project status meetings and updates to management about progress
- Involved in meetings with the BA, Finance and Vermeg teams to prepare mapping requirements for reporting wherever needed for understanding

PROJECT: #13
Project : Enterprise Data Warehouse (EDW) for the APAC Region and Tableau Reporting Solutions
Company : OneAston, Singapore
Technologies : SQL Server 2016, SSIS, Tableau
Duration : 1.1 years [May 2019 to June 2020]
Client : ABN AMRO Bank, Singapore
Position : Technical Lead/Data Architect

PROJECT DESCRIPTION
The purpose of this project is to prepare the Enterprise DW for the whole APAC region (Australia, China, Singapore and Hong Kong) for regulatory reporting and other financial progress reports. This involves migrating all reports previously developed manually using MS Access, MS Excel and SSRS and converting them into dashboards.

RESPONSIBILITIES
- Designing and developing the EDW for the APAC region according to each country's requirements, using source files from different source systems
- Developed data models and assisted in the development of data architecture policies and procedures
- Ensured data consistency, quality and integrity between databases and operational systems
- Implemented data retention and recovery policies and coordinated with teams to optimize database performance
- Managed secure user access to databases and ensured that data architecture tasks were executed within deadlines
- Prepared the analysis of reports to be migrated, which were developed using different solutions
- Presented the value and usage of Tableau for visualization and analysis to various business users, PMs and leads
- Gathered additional requirements and changes to existing reports for designing the dashboards and reports in Tableau
- Provided basic training on understanding and building dashboards and reports using Tableau (Desktop and Server)
- Involved in a POC migrating selected finance reports to dashboards using the EDW database
- Involved in preparing the dedicated team for developing dashboards using Tableau
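A minimal T-SQL sketch of the Staging-to-DWH upsert pattern behind the loads described in the two ABN AMRO projects above; the table and column names are illustrative placeholders rather than the bank's actual schema, and the surrounding SSIS orchestration is omitted.

    -- Placeholder staging and DWH tables for one reference entity.
    CREATE TABLE dbo.StgCounterparty (
        CounterpartyCode VARCHAR(20)  NOT NULL,
        CounterpartyName VARCHAR(200) NOT NULL,
        CountryCode      CHAR(2)      NOT NULL
    );

    CREATE TABLE dbo.DimCounterparty (
        CounterpartyKey  INT IDENTITY(1,1) PRIMARY KEY,
        CounterpartyCode VARCHAR(20)  NOT NULL UNIQUE,
        CounterpartyName VARCHAR(200) NOT NULL,
        CountryCode      CHAR(2)      NOT NULL,
        UpdatedAt        DATETIME2    NOT NULL
    );

    -- Incremental upsert from the Staging layer into the DWH dimension.
    MERGE dbo.DimCounterparty AS tgt
    USING dbo.StgCounterparty AS src
        ON tgt.CounterpartyCode = src.CounterpartyCode
    WHEN MATCHED AND (tgt.CounterpartyName <> src.CounterpartyName
                   OR tgt.CountryCode      <> src.CountryCode) THEN
        UPDATE SET tgt.CounterpartyName = src.CounterpartyName,
                   tgt.CountryCode      = src.CountryCode,
                   tgt.UpdatedAt        = SYSUTCDATETIME()
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (CounterpartyCode, CounterpartyName, CountryCode, UpdatedAt)
        VALUES (src.CounterpartyCode, src.CounterpartyName, src.CountryCode, SYSUTCDATETIME());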
PROJECT: #12
Project : Internal Clinical BI Systems
Company : NKF, Singapore
Technologies : Power BI, SQL Server, Oracle, R, Python
Duration : 4 months [Nov 2018 to Feb 2019]
Team Handled : 4
Position : Senior IT Specialist

PROJECT DESCRIPTION
Development of internal clinical systems on the Power BI platform: designing, architecting and developing different dashboards based on the requirements of the clinical stakeholders' team and higher management, and training users on dashboard and standard report development on the BI platform.

RESPONSIBILITIES
- Created the new team dedicated to the BI systems development project
- Worked with key stakeholders to spearhead new technology, including planning for the next upgrade of our clinical system
- Managed cross-agency interfaces for major IT systems/projects
- Led, drove and mentored team members in business requirement reviews and design
- Developed the prototype and functional/technical specifications and coordinated the development
- Developed the necessary test plans for system integration testing
- Maintained and reviewed the access matrix for application systems
- Kept up to date with current system development technologies to stay technically relevant
- Worked with the infrastructure team and vendors to develop and maintain the database backup & recovery strategy
- Monitored system efficiency and recommended design or process changes to enhance performance and flexibility

PROJECT: #11
Project : TTSH and IMH Hospitals (EDW and Tableau Dashboards/Reports Support)
Company : Jobline Consultancy, Singapore
Technologies : Microsoft Business Intelligence (SSIS and SSAS), SAP BODS, Tableau, SQL Server 2012, Oracle
Duration : 1 year [Nov 2017 to Nov 2018]
Client : IHiS (Integrated Health Information Systems), Singapore
Team Handled : 10
Position : Senior Lead Analyst (BI)

PROJECT DESCRIPTION
EDW development using SSIS/SAP BODS, with job monitoring in SQL Server Agent and Task Scheduler. TTSH (Tan Tock Seng Hospital) and IMH (Institute of Mental Health) Health Care Intelligence Applications - implementation of dashboards developed using Tableau and SSAS.

RESPONSIBILITIES
- Managed and supervised a team of 10 personnel comprising data engineers and system data analysts
- Attended to user queries on data issues as part of production support for the existing Enterprise Data Warehouse and data mart applications
- Provided support and development work using ETL tools (MS SQL SSIS and SAP BODS) for existing production ETL jobs and change requests
- Attended user meetings to document and analyze requirements on change requests for EDW and data mart applications
- Performed data profiling and mapping to understand the data requirements for change requests
- Imported data from data sources and prepared the data as per user requirements
- Migrated reports and dashboards originally developed in OBIEE to Tableau
- Designed and created data visualizations (new reports and dashboards) using interactive dashboard design in Tableau
- Created meaningful dashboards and KPIs using Tableau functions such as calculations, parameters, filters, formatting and actions
- Leveraged the full range of Tableau platform technologies to design and implement production-ready solutions
- Created prompts, sets, groups, table calculations, conditions and filters (local, global) for dashboards
- Effectively used the level-of-detail and hierarchy features in Tableau for effective interaction
- Used data blending to build visualizations from multiple data sources, and used Excel sheets, flat files and CSVs to generate Tableau ad-hoc reports
- Delivered dashboards to convey the story points of the data
- Published customized interactive reports and dashboards, with extract scheduling, using Tableau Server
- Managed Tableau Server, including content admin activities such as managing sites, projects and user groups, and monitoring refresh/subscription schedules
- Administered users, user groups and scheduled instances for reports in Tableau Server
- Created and maintained sites, projects and schedules, and assigned permissions to user groups in Tableau Server
- Liaised closely with vendors to design, configure and test enhancements in accordance with IHiS project methodologies and policies
- Supported the team in analyzing and documenting project requirements by reviewing ETL source code from similar projects for cluster hospitals such as Alexandra Campus and KTPH

PROJECT: #10
Project : Toner Forecast Process Optimization
Company : MindTeck, Singapore
Technologies : Microsoft Business Intelligence (SSRS, SSIS), Microsoft SQL Server 2016, MS Office, MS Visio, GitHub, Essbase, Crystal Reports, Power BI and Amazon Web Services (AWS)
Duration : 6 months (May 2017 to Nov 2017)
Client : HP, Singapore
Team Handled : 3
Position : Technical Lead (BI/DW)

PROJECT DESCRIPTION
Data sources were previously gathered and uploaded manually. The scope of this project is to automate the data explosion and loading: at a high level, extract data from Excel and database sources, transform it (the explosion process) and load it into Essbase. This was being done manually, which was tedious and error prone, so the process was automated using ETL, feeding the data directly from source into Essbase. The project also covers maintenance of existing Power BI dashboards for ad-hoc requests, creation of new dashboards, and migration of the database and packages to AWS.

RESPONSIBILITIES
- Designed and developed SSIS packages to extract data from various Excel and other database sources, and automated the packages for incremental loading based on daily process checks
- Conducted daily meetings with business stakeholders and director-level members for requirements gathering and presented solutions using best practices
- Designed and developed Power BI visualizations for dashboard and ad-hoc reporting solutions, connecting to different data sources and databases based on user requirements on an ad-hoc basis
- As a Power BI SME, responsible for the design, development and UAT/Production support
- Expertise in data preparation, with experience in blending multiple data connections and creating multiple joins across the same and various data sources
- Used and created various visualizations in reports, such as scatter plots, box plots, Sankey charts, bar graphs, Gantt charts, trend lines, waterfall charts, heat maps and geo-maps, allowing end users to utilize the full functionality of the dashboard
- Created custom calculations using DAX in Power BI where required in dashboards, as per business user needs
- Managed the Power BI portal - data refreshes and user management - and set up gateway data refresh
- Used parameters and quick filters, and implemented table calculations
- Migrated on-premises SQL databases to AWS RDS SQL Server using AWS DMS
- Migrated SSIS packages to AWS and scheduled the packages to run using SQL Server Agent jobs
PROJECT: #9
Project : Information Management (Application System Managing All User Requests)
Company : RSystems Singapore Pte Ltd
Technologies : Microsoft Business Intelligence (SSRS, SSIS, SSAS), Microsoft SQL Server 2012, JIRA, MS Visio, TFS
Duration : 1.1 years [May 2016 to May 2017]
Client : NTUC Income, Singapore
Team Handled : 5
Position : Senior Technical Consultant

PROJECT DESCRIPTION
Information Management - Data Management handles all data-related activities and processing for all NTUC departments. This also covers report development and all enhancements of existing reports, ETL and batch processes.

RESPONSIBILITIES
- Managed and supervised a team of 5 personnel comprising BI developers
- Documented and created impact analyses on existing Business Intelligence components (SSRS, SSIS and SSAS) and data requirements for new requirements and enhancements
- Worked closely and collaborated with application and infrastructure support teams to drive resolutions
- Planned, analyzed, designed, built, tested, deployed, optimized and supported new and existing data models, ETL and report processes in the production environment
- Supported users on a day-to-day basis to address business-as-usual tasks related to their report data
- Analyzed and prioritized all incident logs created in the BMC Remedy tool by business users to ensure SLAs were met
- Assigned incidents and requests to the appropriate Information Management teams to conduct investigations and resolutions
- Escalated incident tickets on time to L2 support when needed to ensure immediate resolution
- Documented all issues and incidents until closure on a day-to-day basis
- Used TFS as source control for maintaining SQL scripts and packages, and as the source for deployment activities in UAT & PROD
- Analyzed and fixed UAT defects logged by business users during UAT testing
- Created all appropriate documentation required to maintain the system and data
- Used Jira Agile for planning the project tasks per ticket based on the workflow environments (daily scrum meetings, sprint assignment backlogs, 1-on-1 meetings based on user discussions)
- Managed timelines and conducted estimates on existing tasks
- Created and supported deployment plans and executions
- Prepared pre- and post-verification scripts for data conversion initiatives

PROJECT: #8
Project : Tableau Dashboard Implementation for All NTUC Income Departments
Company : RSystems Singapore Pte Ltd
Technologies : Tableau 9/10, Microsoft SQL Server 2012
Duration : 1.1 years [May 2016 to May 2017]
Client : NTUC Income, Singapore
Position : Senior Tableau Consultant

PROJECT DESCRIPTION
Creation of Tableau dashboards and data models for all NTUC Income departments, such as Sales, Corporate, Income Motor and Non-Motor, Finance and Operations.

RESPONSIBILITIES
- Involved in creating database objects such as tables, views, procedures, triggers and functions using T-SQL to provide definition and structure and to maintain data efficiently
- Built and published customized interactive reports and dashboards, with report scheduling, using Tableau Server
- Created action filters, parameters and calculated sets for preparing dashboards and worksheets in Tableau
- Developed Tableau workbooks to perform year-over-year, quarter-over-quarter, YTD, QTD and MTD types of analysis
- Built dashboards for measures with forecasts, trend lines and reference lines
- Restricted data for particular users using row-level security and user filters (see the illustrative T-SQL sketch after this project)
- Developed Tableau visualizations and dashboards using Tableau Desktop
- Developed Tableau workbooks from multiple data sources using data blending
- Published the dashboards to Tableau Server
- On the server, added users, created and managed groups, and created separate folders to maintain workbooks and projects, database views, data sources and data connections
- Added and edited sites and the corresponding administration rights and roles for users, department-wise
- Created, modified and managed server task schedules
- Monitored server activity and usage statistics to identify possible performance issues/enhancements
- Created subscriptions to send the dashboard reports to users through the TABCMD command-line utility
- Provided production support for Tableau users
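One common way to back Tableau user filters with database-side row-level restrictions is an entitlement table joined into a filtered view, as sketched below in T-SQL. This is an illustrative pattern only, not necessarily the approach used at NTUC Income; all object names and logins are hypothetical placeholders.

    -- Placeholder fact table and entitlement mapping (login -> departments allowed).
    CREATE TABLE dbo.PolicySales (
        PolicyNo      VARCHAR(20)   NOT NULL,
        Department    VARCHAR(50)   NOT NULL,
        PremiumAmount DECIMAL(18,2) NOT NULL,
        SaleDate      DATE          NOT NULL
    );

    CREATE TABLE dbo.ReportEntitlement (
        LoginName  SYSNAME     NOT NULL,
        Department VARCHAR(50) NOT NULL,
        CONSTRAINT PK_ReportEntitlement PRIMARY KEY (LoginName, Department)
    );

    INSERT INTO dbo.ReportEntitlement (LoginName, Department)
    VALUES (N'DOMAIN\alice', 'Motor'),
           (N'DOMAIN\bob',   'Finance');
    GO

    -- Filtered view: each login sees only rows for its entitled departments.
    -- Reporting tools connect to the view instead of the base table.
    CREATE VIEW dbo.vw_PolicySales
    AS
    SELECT s.PolicyNo, s.Department, s.PremiumAmount, s.SaleDate
    FROM dbo.PolicySales AS s
    JOIN dbo.ReportEntitlement AS e
      ON e.Department = s.Department
     AND e.LoginName  = SUSER_SNAME();   -- evaluated for the querying login
    GO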
PROJECT: #7
Project : AWACS (Advanced Weekly Analysis and Control Systems)
Company : WebSynergies Pte Ltd, Singapore
Technologies : SQL Server 2014, SSIS, SSRS, SSAS, Power BI and Tableau 9.0
Duration : 6 months [Oct 2015 to May 2016]
Client : Sony (HQ), Singapore
Team Handled : 7
Position : Senior Technical Consultant

PROJECT DESCRIPTION
AWACS is a regional web-based platform supporting the SEA weekly sell-through operation. It harmonizes the sell-through collection process and generates common reports for branches and HQ (same view, same time). It supports sell-through data from dealers on a daily or weekly basis, and generates sell-through data reports for analysis and information sharing with dealers. As part of this project for the SEA region, the AWACS system provides weekly and daily reports on Sell-In and Sell-Through key-figure variances across application systems, and dealer forecasting reports for PSI and sell-through figure changes. An enhancement allows the upload of Sell-In and Sell-Through forecast key figures (quantity, amount) by channel group and model, to automate and integrate report generation as well as improve ease of report distribution for sales analysis.

RESPONSIBILITIES
- Managed and supervised a team of 7 personnel comprising data engineers and BI report developers
- Involved in requirements gathering with business users and implemented specification/design documents for the project
- Designed flow charts for the design of interfaces and reports
- Designed the proofs of concept for the ETL interface, created the mapping documents between the different source file formats and target tables, and implemented the ETL interface development using SSIS
- Guided the team of developers, involved in the development and helping to solve issues
- Implemented the new template for SSIS package execution for all the interfaces
- This involved index creation, index removal, index modification, filegroup modifications, and adding scheduled jobs to re-index and update statistics in the databases (see the illustrative T-SQL sketch after this project)
- Developed and optimized database structures, stored procedures, dynamic management views, DDL triggers and user-defined functions
- Responsible for documentation of system-related activities and provided technical documentation of the system
- Developed SSRS reports for error reporting and summary analysis reports using SSAS cubes
- Developed an SSAS Tabular model and created Power Pivot reports utilizing the tabular model via the analysis connector
- Using the SSAS DB as source, developed dashboards in Power BI for various system users to access the data for computation and sales analysis purposes
- Involved in query optimization to increase the performance of the reports and of the SSIS packages
- Developed test plans and produced test scenarios and repeatable test cases/scripts through all parts of the development
- Created specific dashboards using Tableau for the operations department according to custom needs on an ad-hoc basis
- Created users, groups and projects on Tableau Server
- Involved in UAT and Production deployment activities of reports and dashboards to the Power BI Service and Tableau Server
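A minimal T-SQL sketch of the kind of scheduled index and statistics maintenance referenced above: check fragmentation, rebuild or reorganize accordingly, and refresh statistics. The table and index names are hypothetical placeholders, and the thresholds reflect common practice rather than this project's actual jobs.

    -- Check fragmentation to decide between rebuild and reorganize.
    SELECT i.name AS index_name,
           ps.avg_fragmentation_in_percent
    FROM sys.dm_db_index_physical_stats(DB_ID(), OBJECT_ID('dbo.FactSellThru'),
                                        NULL, NULL, 'LIMITED') AS ps
    JOIN sys.indexes AS i
      ON i.object_id = ps.object_id AND i.index_id = ps.index_id;

    -- Heavily fragmented index (roughly > 30%): rebuild.
    ALTER INDEX IX_FactSellThru_DealerDate
        ON dbo.FactSellThru REBUILD WITH (FILLFACTOR = 90);

    -- Lightly fragmented index (roughly 5-30%): reorganize.
    ALTER INDEX IX_FactSellThru_Model
        ON dbo.FactSellThru REORGANIZE;

    -- Refresh statistics after the data loads.
    UPDATE STATISTICS dbo.FactSellThru WITH FULLSCAN;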
PROJECT: #6
Project : Content Management Data Warehouse and Statements Regeneration
Company : Comtel Solution Pte Ltd, Singapore
Technologies : SQL Server 2012, SSIS, Power BI
Duration : 6 months [May 2015 to Oct 2015]
Client : Bank of Singapore
Position : Senior Technical Consultant

PROJECT DESCRIPTION
Content management for long-term retention of computer-generated reports and customer documents such as bank statements, credit card statements, policy documents and compliance reports. Regeneration of monthly and daily client statements from data files generated by the T24 banking software for backdated dates. Development of reports for user analysis using visualization tools such as Power BI and QlikView.

RESPONSIBILITIES
- Involved in requirements gathering from the business content-management report users
- Involved in the system requirements and specifications for the server setup for the project
- Designed flow charts for the server and system setups using Microsoft Visio
- Worked with the data vault methodology to build the warehouse
- Designed the proof of concept for ETL code generation as a parser
- Created the mapping documents between source and target tables, and implemented the ETL design and development using SSIS
- Implemented the new structure for SSIS package execution by creating the jobs
- This involved index creation, index removal, index modification, filegroup modifications, and adding scheduled jobs to re-index and update statistics in the databases
- Responsible for documentation of system-related activities and provided technical documentation of the system
- Involved in query optimization to increase the performance of the reports
- Developed test plans and produced test scenarios and repeatable test cases/scripts through all parts of the development
- Analyzed the test data used for unit and integration testing
- Developed analytical reports/dashboards for users, specifically related to credit card and policy management, and shared them across the teams for data analysis
- Implemented the Power BI gateway, matched different sources to the analysis criteria, and implemented further modeling for Power Pivot and Power View

PROJECT: #5
Project : Data Warehouse for Credit Control and Business Management Departments
Company : Comtel Solution Pte Ltd, Singapore
Technologies : SQL Server 2008 R2, SQL Server 2012, BIDS, SSAS, SSIS, SSRS
Duration : 1 year [April 2014 to May 2015]
Client : CIMB Bank, Singapore
Position : Senior Technical Consultant (individual contributor for implementation and main lead of the project)

PROJECT DESCRIPTION
CIMB Securities represents the retail securities businesses of CIMB Group, which deals with financing and banking. CIMB Securities is a licensed member of the stock exchanges in Singapore and provides trade execution and distribution of capital market products for individual investors. This project mainly deals with how clients' trade information is managed and handled in daily business in relation to the stock exchanges in the market. For this purpose, data marts were built for the Credit Control and Business Management departments so that users can analyze the data; users perform their data analysis through Excel using the BI tool. This data warehouse is used by the business users of the respective departments and by higher management.
RESPONSIBILITIES
- Involved in requirements gathering from the business users
- Involved in the system requirements and specifications for the server setup for the project
- Designed flow charts for the server and system setups using Microsoft Visio
- Analyzed database requirements in detail with the project stakeholders by conducting Joint Requirements Development sessions
- Developed a conceptual model using Erwin based on the requirements analysis
- Developed normalized logical and physical database models to design the OLTP system
- Created the dimensional model for the reporting system by identifying the required dimensions and facts
- Used forward engineering to create a physical data model with DDL that best suits the requirements from the logical data model
- Designed the proofs of concept for the ETL and SSAS cube designs, and developed them using SSIS and SSAS
- Developed an SSAS Tabular model and created Power Pivot reports with calculations written in DAX
- Involved in analyzing the requirements and designing the tables and their relationships from the different source tables
- Created the mapping documents between source and target tables, and implemented the ETL design and development using SSIS
- Implemented the new structure for SSIS package execution by creating jobs with the Autosys automation tool, and acted as administrator for the job implementation in Autosys
- This involved index creation, index removal, index modification, filegroup modifications, and adding scheduled jobs to re-index and update statistics in the databases
- Developed and optimized database structures, stored procedures, dynamic management views, DDL triggers and user-defined functions
- Designed and implemented data marts with facts, dimensions and OLAP cubes using dimensional modeling standards in SQL Server 2008 R2; implemented star schema and snowflake schema dimensional models (see the illustrative T-SQL sketch at the end of this section)
- Involved in the complete data warehouse development life cycle and actively supported business users with change requests
- Built and tested OLAP cubes with SSAS, added calculations using MDX, created and managed OLAP cubes, and created calculated members for the SSAS cubes
- Provided daily support of system-wide replication tasks, including monitoring, alerting and problem resolution
- Generated periodic reports based on statistical analysis of the data across various time frames and divisions using SQL Server Reporting Services (SSRS), accessing the cubes
- Developed various operational drill-through and drill-down reports using SSRS
- Developed different kinds of reports, such as sub-reports, charts, matrix reports and linked reports from different datasets
- Involved in query optimization to increase the performance of the reports
- Developed reports for business end users using Report Builder, with statistics updates, and used cascaded parameters to generate reports
- Aided in the development and management of Excel workbooks accessing the cubes using macros
- Developed test plans and produced test scenarios and repeatable test cases/scripts through all parts of the development
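A minimal T-SQL sketch of the star-schema shape behind the trade data marts described above; all table and column names are illustrative placeholders rather than the bank's actual model.

    -- Dimension tables.
    CREATE TABLE dbo.DimClient (
        ClientKey  INT IDENTITY(1,1) PRIMARY KEY,
        ClientCode VARCHAR(20)  NOT NULL,
        ClientName VARCHAR(100) NOT NULL,
        Segment    VARCHAR(50)  NULL
    );

    CREATE TABLE dbo.DimSecurity (
        SecurityKey INT IDENTITY(1,1) PRIMARY KEY,
        Ticker      VARCHAR(20) NOT NULL,
        Exchange    VARCHAR(20) NOT NULL
    );

    CREATE TABLE dbo.DimDate (
        DateKey      INT PRIMARY KEY,     -- e.g. 20150430
        CalendarDate DATE     NOT NULL,
        MonthNumber  TINYINT  NOT NULL,
        CalendarYear SMALLINT NOT NULL
    );

    -- Fact table referencing the dimensions (one row per trade).
    CREATE TABLE dbo.FactTrade (
        TradeKey    BIGINT IDENTITY(1,1) PRIMARY KEY,
        DateKey     INT NOT NULL REFERENCES dbo.DimDate (DateKey),
        ClientKey   INT NOT NULL REFERENCES dbo.DimClient (ClientKey),
        SecurityKey INT NOT NULL REFERENCES dbo.DimSecurity (SecurityKey),
        TradeQty    DECIMAL(18,2) NOT NULL,
        TradeAmount DECIMAL(18,2) NOT NULL
    );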