Prashanth Bogineni - Python Full Stack Developer |
[email protected] |
Location: Farmington, Michigan, USA |
Relocation: open |
Visa: H-1B |
Resume file: Prashanth Bogineni Python Developer_1750169532532.doc |
Python Developer with over 9 years of IT experience in designing, developing, testing, and implementing stand-alone and client-server enterprise application software across multiple domains.
Good experience in developing web applications implementing MVT/MVC architecture using the Django and Flask web frameworks.
Extensive knowledge of AWS services such as S3, ELB, EBS, Auto Scaling, Route 53, CloudFront, IAM, CloudWatch, and RDS.
Developed and optimized ETL workflows in both legacy and distributed environments.
Experience in object-oriented design and programming concepts in Python.
Experience with Hadoop ecosystem components such as HDFS, Spark (with Scala and Python), and Zookeeper.
Knowledge of rendering large data sets in the application view using ReactJS.
Working experience with various Python packages such as NumPy, SQLAlchemy, Beautiful Soup, pickle, PySide, PyMongo, SciPy, and PyTables.
Good experience delivering datasets from Snowflake to the One Lake Data Warehouse; built CI/CD pipelines using Jenkins and AWS Lambda; imported data from DynamoDB to Redshift in batches using AWS Batch with the TWS scheduler.
Progressive experience in Tableau Desktop, Tableau Public, and Power BI for business analysis, delivering customized ad-hoc reports and building actionable dashboards for enterprise applications.
Hands-on experience designing, implementing, testing, and maintaining database solutions on Azure, with strong analytical and problem-solving skills for Hadoop data architecture.
Experienced in NoSQL technologies like MongoDB, Redis, and Cassandra, and relational databases like SQLite, PostgreSQL, DynamoDB, and MySQL.
Good experience working with Amazon Web Services such as EC2, Virtual Private Clouds (VPCs), storage models (EBS, S3, instance storage), and Elastic Load Balancers (ELBs).
Observed data behavior and wrote reusable scripts to help other teams handle similar cases.
Worked on the backend team of an Android application using AppSync, Lambda, SNS, RDS Aurora Serverless, DynamoDB, and API Gateway; created mutations, resolvers, queries, and schemas for AppSync.
Experience in job workflow scheduling and monitoring tools like Airflow and Autosys.
Experience with specific technologies/platforms such as JavaScript and React/Redux.
Experienced in working with Docker to create images deployed on AWS as microservices; experienced in managing local deployments in Kubernetes, creating local clusters, and deploying application containers.
Hands-on experience analyzing client data using Scala, Spark, and Spark SQL, and defining an end-to-end data lake presentation for the team.
Hands-on experience developing ETL data pipelines using PySpark and AWS Glue on AWS EMR.
Experience working with network monitoring tools and automation.
Proficient in developing web services (SOAP, RESTful) in Python using XML and JSON.
Experience with unit testing/test-driven development (TDD) and load testing; worked on the Celery task queue with RabbitMQ as the service broker.
Hands-on experience with AWS services such as Amazon EC2, Amazon S3, Amazon Redshift, Amazon EMR, Amazon SQS, Glue, Athena, AWS CloudWatch, and Lambda.
Involved in all phases of the Software Development Life Cycle (SDLC) using the project management tools JIRA, Redmine, and Bugzilla.
PROFESSIONAL SKILLS
Languages: Python, Scala, C, C++, Shell Script, Java, PHP, PL/SQL, Ruby
IDEs: Sublime Text, PyCharm, Spyder, Anaconda, Jupyter Notebook
Big Data: Spark, HDFS, Hive, Hadoop, MapReduce, Spark Core, Spark SQL, Scala
Machine Learning: Regression, Polynomial Regression, Random Forest, Logistic Regression, Decision Trees, Classification, Clustering, Association, Simple/Multiple Linear Regression, Kernel SVM, K-Nearest Neighbors (K-NN), Natural Language Processing (NLP)
Version Control: Git, GitHub, SonarQube, SVN, CVS
Test Management Tools: Agilent, Regression Manager, JIRA, Redmine
Build Tools: PyBuilder, pip, npm, virtualenv, Coverage, Jenkins, Docker
Defect Tracking Tools: JIRA, Bugzilla, Redmine, HP ALM
Data/ETL and Monitoring Tools: Tableau, Cron, Matplotlib, Pandas, Flume, Splunk, Bubbles (ETL), PySpark, Kafka, Boto3 (AWS)
Web Technologies: WSGI, JavaScript, HTML, CSS, jQuery, AngularJS, Ext JS, Node.js, Vue.js, React JS, JSON, AJAX, and Bootstrap
Frameworks: Django, Flask, Pyramid, FastAPI
Scripting Languages: JavaScript, UNIX Shell Scripting, Python
Web Services: SOAP, Apache Axis, RESTful
Databases: PostgreSQL, MySQL, MariaDB, Oracle 8i/9i/10g, MS Access, SQL Server, Sybase, SQLite, SQLAlchemy
NoSQL Databases: MongoDB, Redis, Cassandra
Operating Systems: Linux, Windows, iOS
Cloud Platform/CI-CD Tools: AWS, Aurora, SageMaker, Snowflake, SnowSQL, Snowpipe, Azure, Docker, Kubernetes, Bamboo
PROFESSIONAL EXPERIENCE
ACG-AAA Insurance Jan 2025 - Present
Professional Consultant / Java-to-Python Integration Infra Engineer
Responsibilities
Led the Java Spring Boot to Python conversion, ensuring seamless migration and system stability.
Developed and optimized APIs using FastAPI and implemented Pydantic for data validation.
Integrated Liquibase for database version control and schema management.
Automated CI/CD pipelines using Jenkins for seamless deployments and improved efficiency.
Managed and deployed applications on GCP with Cloud Run, enhancing scalability and reliability.
Designed and developed database schemas, ensuring efficient data storage and retrieval using PostgreSQL and Oracle SQL.
Collaborated with cross-functional teams to support low-level design (LLD) and application development.
Implemented dependency management and package handling using Poetry for Python projects.
Assisted in backend development by leveraging Python and FastAPI for scalable application design.
Worked extensively on data validation using Pydantic, improving application data integrity.
Developed Liquibase scripts for database migrations and ensured version control of database changes.
Implemented SonarQube for code quality and security analysis, ensuring high coding standards.
Worked with Bitbucket and Git for version control, ensuring smooth collaboration and code management.
Designed and developed a microservices architecture to improve application modularity and scalability.
Maintained CI/CD pipelines using Jenkins for efficient software deployment and testing.
Contributed to designing database tables, indexing strategies, and performance optimization.
Environment: Python, Java Spring Boot, LLD, microservices, Pydantic, SonarQube, GCP, Cloud Run, FastAPI, RESTful web services, JSON, Liquibase, Jenkins, Docker, PostgreSQL, Oracle SQL, CI/CD pipelines, Poetry, Bitbucket, Git
Advanced Micro Devices (AMD), Austin, TX Jan 2023 - Dec 2024
Software Engineer / Python Developer
Responsibilities
Developed tools and new features using Python for silicon wafer testing in an agile environment with a Continuous Integration/Continuous Deployment (CI/CD) pipeline, and published them as Python packages.
Completed a highly immersive data science program involving data manipulation and visualization, web scraping, machine learning, Python programming, SQL, Git, Unix commands, MySQL, MongoDB, and Hadoop.
Developed web-based applications using SaaS, Python, Django, Kafka, RPC, CSS, AI/ML, HTML, JavaScript, and jQuery, with Ansible automation scripts, Angular.js, Boto3, and React.
Experience in Python, Jupyter, and the scientific computing stack (NumPy, SciPy, Pandas, and Matplotlib).
Conducted data blending and data preparation using SQL for Tableau consumption and published dashboards to Tableau Server.
Experience with the Microsoft Azure cloud platform and integrating it with Python.
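The FastAPI/Pydantic request-validation pattern described in the ACG-AAA role above can be sketched without the frameworks themselves. This stdlib-only example (the `PolicyRequest` model, its fields, and its rules are hypothetical) mirrors how a Pydantic model rejects a malformed payload before any business logic runs:

```python
from dataclasses import dataclass

# Hypothetical payload model mirroring a Pydantic BaseModel:
# validation runs on construction and raises on bad data.
@dataclass
class PolicyRequest:
    policy_id: str
    premium: float

    def __post_init__(self):
        if not self.policy_id:
            raise ValueError("policy_id must be non-empty")
        if not isinstance(self.premium, (int, float)) or self.premium <= 0:
            raise ValueError("premium must be a positive number")

def validate_payload(payload: dict) -> PolicyRequest:
    """Parse an incoming JSON-like dict into a validated request object."""
    return PolicyRequest(
        policy_id=str(payload.get("policy_id", "")),
        premium=payload.get("premium", 0),
    )
```

In the actual FastAPI stack, declaring a `PolicyRequest(BaseModel)` parameter on an endpoint gives the same check plus an automatic 422 response for invalid input.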
Worked with Microsoft Azure cloud services to migrate on-premises data from RDBMS sources (PostgreSQL) and FTP servers to Azure Data Lake Storage Gen1 and Gen2.
Integrated Azure cloud storage with Python to store data in the cloud with high security.
Converted Talend Joblets to support Snowflake functionality.
Automated RabbitMQ cluster installation and configuration using Python/Bash.
Deployed patches for Linux and application servers; performed Red Hat Linux kernel tuning.
Designed data marts following Star Schema and Snowflake Schema methodologies, using industry-leading data modeling tools like Erwin.
Unit tested the data between Redshift and Snowflake.
Used Python unit and functional testing modules such as unittest, unittest2, and mock, plus custom frameworks, in line with agile software development methodologies.
Generated Python Django forms to record online users' data and used PyTest for writing test cases.
Analyzed and dispositioned problematic lots for low yield, quality deviations, and customer returns.
Implemented and modified various SQL queries, functions, cursors, and triggers per client requirements.
Used a microservice architecture, with Spring Boot-based services interacting via REST and Kafka.
Performed day-to-day tasks in Red Hat Linux, including upgrading RPMs, the kernel, and HBA drivers, and configuring SAN disks, multipathing, and LVM file systems.
Managed datasets using Pandas DataFrames and MySQL; queried the MySQL database from Python using the Python-MySQL connector and the MySQLdb package to retrieve information.
Scripted the build tool (Bamboo) to use Docker, run tests, and release packages to the local Artifactory (automation) with its own CI/CD pipeline.
Used Pandas to represent data in time-series and tabular formats for data manipulation and retrieval.
Set up new products and new process flows, routes, and process options as needed.
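The unittest/mock testing approach mentioned above can be illustrated with a small stdlib example. The wafer-lot lookup function and its external dependency are hypothetical; the point is patching the external call so the unit test runs in isolation:

```python
import unittest
from unittest import mock

# Hypothetical production code: fetch_yield would normally call an
# external test-data service, so it is left unimplemented here.
def fetch_yield(lot_id: str) -> float:
    raise NotImplementedError("talks to an external service in production")

def lot_status(lot_id: str, threshold: float = 0.9) -> str:
    """Classify a wafer lot as PASS/FAIL based on its yield."""
    return "PASS" if fetch_yield(lot_id) >= threshold else "FAIL"

class LotStatusTest(unittest.TestCase):
    def test_low_yield_lot_fails(self):
        # Replace the external call with a canned value for the test.
        with mock.patch(f"{__name__}.fetch_yield", return_value=0.42):
            self.assertEqual(lot_status("LOT-1"), "FAIL")
```

Run with `python -m unittest` in the usual way; the patch is scoped to the `with` block, so other tests see the real function.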
Analyzed various generated logs and predicted/forecasted the next occurrence of events using various Python libraries.
Environment: Python, Django, Pandas, Jupyter Notebook, Tableau Server, Linux, ER Studio, machine learning, artificial intelligence, MLlib, regression, OLTP, random forest, OLAP, NLP, Azure, RabbitMQ, wafer fab, RESTful web services, shell scripting, Docker, Bamboo, MySQL, Ansible, CI/CD pipelines, Bitbucket, Git
Anywhere (Realogy), Madison, NJ Jun 2022 - Dec 2022
Software Engineer / Sr. Python AWS Developer
Responsibilities
Interacted with the business team to gather requirements.
Worked extensively with AWS services like S3, ELB, EBS, Auto Scaling, Route 53, CloudFront, IAM, CloudWatch, and RDS.
Used the AWS CLI to suspend an AWS Lambda function and to automate backups of ephemeral data stores to S3 buckets and EBS.
Deployed Airflow (Celery Executor) on S3 mounted to EFS as a central directory, with SQS as the broker, metadata stored in RDS, and logs sent to S3 buckets.
Designed, developed, and implemented performant ETL pipelines using the Python API (PySpark) of Apache Spark and AWS Glue on AWS EMR.
Developed ETL jobs per requirements to update data in the staging database (Postgres) from various data sources and REST APIs.
Compared IRDB on-prem data with VN standard market data in VNML using Python in an Anaconda environment.
Recorded online users' data using Python Django forms and implemented test cases using PyTest.
Developed analytical queries in Teradata, SQL Server, and Oracle.
Involved in infrastructure as code, execution plans, resource graphs, and change automation using Terraform.
Managed AWS infrastructure as code using Terraform and CloudFormation.
Developed merge jobs in Python to extract and load data into a MySQL database.
Created various plans in IDQ to enhance the source data by comparing it with reference tables and dictionaries.
Created Terraform scripts for EC2 instances, Elastic Load Balancers, and S3 buckets.
Implemented Terraform to manage the AWS infrastructure and managed servers using configuration management tools like Chef and Ansible.
Created Athena tables and integrated them with AWS Glue, a fully managed ETL service that can categorize data.
Configured an AWS Virtual Private Cloud (VPC) and database subnet group to isolate resources within the Amazon RDS Aurora DB cluster.
Designed and managed API system deployment using a fast HTTP server and Amazon AWS architecture.
Built numerous Lambda functions using Python and automated the process using the events created.
Created an AWS Lambda architecture to monitor AWS S3 buckets, with triggers for processing source data.
Developed a data science pipeline using AWS SageMaker and scheduled it successfully in production.
Worked with packages like socket, REST APIs, and Django.
Wrote unit and integration tests for all the ETL services.
Performed S3 bucket creation and policy setup, including IAM role-based policies, MFA, and customizing the JSON templates.
Automated various service and application deployments with Ansible on CentOS and RHEL in AWS.
Worked with DevOps practices using AWS, Elastic Beanstalk, and Docker with Kubernetes.
Worked on the MySQL database, writing simple queries and stored procedures for normalization.
Deployed the project into Jenkins using the Git version control system.
Learned to index and search/query many documents in Elasticsearch.
Understanding of secure cloud configuration, CloudTrail, cloud security technologies (VPC, security groups, etc.), and cloud permission systems (IAM).
Used version control systems like Git and SVN.
Consumed web services performing CRUD operations.
Used the Python library Beautiful Soup 4 for web scraping to extract data for building graphs.
Used AngularJS as the development framework to build a single-page application.
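The S3-triggered Lambda pattern described above (a Lambda architecture monitoring S3 buckets with triggers for processing source data) can be sketched as follows. The event-parsing helper is pure so it can be tested offline; the processing in `lambda_handler` is a minimal placeholder, not the actual job:

```python
# Sketch of an S3 put-event Lambda. The S3 notification shape
# (Records -> s3 -> bucket/object) is AWS's standard event format.

def extract_s3_object(event: dict) -> tuple:
    """Pull (bucket, key) out of a standard S3 event notification."""
    record = event["Records"][0]["s3"]
    return record["bucket"]["name"], record["object"]["key"]

def lambda_handler(event, context):
    bucket, key = extract_s3_object(event)
    # boto3 is imported lazily so the parsing logic stays testable
    # without AWS credentials or the boto3 package installed.
    import boto3
    body = boto3.client("s3").get_object(Bucket=bucket, Key=key)["Body"].read()
    return {"bucket": bucket, "key": key, "size": len(body)}
```

Wiring the trigger itself (bucket notification configuration to the Lambda ARN) would live in the Terraform/CloudFormation templates mentioned above.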
Environment: Amazon Web Services (AWS), cloud environment, Snowflake, Lambda, Airflow, Celery, AWS SageMaker, DynamoDB, Python 3.4, Django, API Gateway, REST API, Anaconda, Jupyter Notebook, Teradata, Spark, Spark API, Spark SQL, Spark Streaming, Spring framework, Amazon S3, CloudWatch, Eclipse, MS SQL Server, Git, Jira
Bank of America (BOFA), Charlotte, NC Mar 2021 - Jun 2022
Python Application Developer
Responsibilities
Part of the Cloud Platform Engineering team, implementing REST APIs in Python using a micro-framework (Flask) with SQLAlchemy in the backend for managing the data center resources on which OpenStack would be deployed.
Used Jenkins pipelines to drive all microservice builds to the Docker registry and then deployed them to Kubernetes; created and managed pods using Kubernetes.
Wrote and executed various MySQL database queries from Python using the Python-MySQL connector and the MySQLdb package.
Built SQL queries for performing various CRUD operations.
Worked extensively with automation tools like Jenkins, SonarQube, Chef, and Puppet for continuous integration and continuous delivery (CI/CD) and to implement end-to-end automation.
Designed backend schemas and database JSON structures.
Configured VMware HA and DRS to achieve higher efficiency for the VMware infrastructure.
Used Scala to convert Hive/SQL queries into RDD transformations in Apache Spark.
Worked with cloud-based solutions like Turbonomic for deploying on-demand Windows and Linux machines.
Configured tenants, blueprints, security groups, endpoints, and resource pools in the vRA infrastructure; deployed and managed blueprints for Linux and Windows systems in Dev, QA, and Prod environments.
Configured the Splunk Add-on for AppDynamics and the Splunk App for Jenkins.
Installed, configured, and managed the JFrog Artifactory repository manager and all its repositories.
Responsible for building key-based SSH authentication with nodes, creating an inventory of remote hosts, and creating playbooks.
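The parameterized CRUD queries described above can be sketched as below. The production stack used MySQL via the Python-MySQL connector; this sketch uses the stdlib sqlite3 module instead so it is self-contained, and the `resources` table is a hypothetical example:

```python
import sqlite3

# Parameterized CRUD helpers over a hypothetical 'resources' table.
# sqlite3 (stdlib) stands in for the production MySQL database.

def make_db() -> sqlite3.Connection:
    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE resources (id INTEGER PRIMARY KEY, name TEXT, state TEXT)"
    )
    return conn

def create_resource(conn, name, state="new"):
    cur = conn.execute(
        "INSERT INTO resources (name, state) VALUES (?, ?)", (name, state)
    )
    return cur.lastrowid

def read_resource(conn, rid):
    return conn.execute(
        "SELECT name, state FROM resources WHERE id = ?", (rid,)
    ).fetchone()

def update_resource(conn, rid, state):
    conn.execute("UPDATE resources SET state = ? WHERE id = ?", (state, rid))

def delete_resource(conn, rid):
    conn.execute("DELETE FROM resources WHERE id = ?", (rid,))
```

Note that MySQL connectors use `%s` placeholders rather than sqlite3's `?`; the parameterized style (never string formatting) is the same in both and is what prevents SQL injection.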
Created Ansible playbooks, the entry point for Ansible provisioning, where the automation is defined through tasks in YAML format, and ran Ansible scripts to provision Dev servers.
Converted a slow, manual procedure into dynamic, API-generated procedures in Ansible.
Experience building various versions of RHEL Linux VMs.
Installed and configured Active Directory, Group Policy, and Windows authentication methods in an enterprise environment.
Collaborated within a team using an agile development workflow and widely accepted collaboration practices using Git.
Developed a fully automated continuous integration system using Git, Jenkins, MySQL, and custom tools developed in Python and Bash.
Environment: Python, Flask, RESTful web services, MySQL, Ansible, Jenkins, SharePoint, SonarQube, JFrog, Bitbucket, Git, GitLab, Turbonomic, ESXi, vCenter Server, Tableau Desktop, Tableau Server, cloning, Hive/SQL, Scala, DRS snapshots, HA in clusters, DRS, Windows Server OS 2003/2008/2012, Windows Client OS Win 7/Win 8/Win 8.1, vCAC, VDI, BMC Track-It, NetApp FC SAN/NAS storage, Avocent KVM, Cisco UCS servers, HP ProLiant G3, G5, and G7 servers, HP Blade Servers
Northwestern Mutual Life, Milwaukee, WI June 2020 - January 2021
Software Engineer (Python/AWS/DevOps)
Responsibilities
Designed front-end applications and user-interactive (UI) web pages using web technologies like HTML, XHTML, and CSS.
Performed web design and development using HTML5, CSS3, JavaScript, React JS, and Ajax.
Extracted, transformed, and loaded data sources to generate CSV data files with Python programs and SQL queries.
Analyzed application requirements and created JILs for use with Autosys.
Added several options to the application to choose the algorithm for data and address generation.
Developed a Python framework using various Python libraries like Pandas and NumPy for data validation.
Set up and built AWS infrastructure resources (VPC, EC2, S3, IAM, EBS, security groups, Auto Scaling, and RDS) in CloudFormation JSON templates.
Developed single-page applications using the React/Redux architecture, ES6, webpack, and Grunt.
Architected and developed the back end in Python and Django and the front-end application in React, webpack, and Redux, with PostgreSQL as the database.
Wrote Bash and Python scripts integrating Boto3 to supplement the automation provided by Ansible and Terraform for tasks such as encrypting EBS volumes backing AMIs and scheduling Lambda functions for routine AWS tasks.
Implemented Docker containers to create application images and dynamically provisioned slaves for Jenkins CI/CD pipelines, Git, and SonarQube.
Experienced with the AWS cloud platform and its features, including EC2, S3, Route 53, VPC, SNS, RDS, and CloudWatch.
Used the AWS CLI to suspend an AWS Lambda function and to automate backups of ephemeral data stores to S3 buckets and EBS.
Created Terraform scripts for EC2 instances, Elastic Load Balancers, and S3 buckets.
Implemented Terraform to manage the AWS infrastructure and managed servers using configuration management tools like Chef and Ansible.
Implemented an HTTP REST API using Node.js and extensively tested RESTful services using Postman.
Involved in daily SCRUM meetings to keep track of the project status.
Involved in all phases of the Software Development Life Cycle (SDLC), including requirements analysis, design and development, bug fixing, supporting QA teams, and debugging production issues.
Maintained the code base and version control with Git.
Environment: Python 3.x, Django 1.9, AWS, Airflow, Autosys, HTML5, CSS3, React JS, JavaScript, jQuery, MySQL, MongoDB, PostgreSQL, Angular JS, JIRA, RETS, REST APIs, Jenkins, Docker, Kubernetes, Git, SonarQube, Linux
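The Boto3 scripting described above for encrypting EBS volumes starts with finding the volumes that still need it. A minimal sketch, with the filtering kept as a pure function so it runs without AWS credentials (the region and the scan wrapper are illustrative):

```python
# Find EBS volumes that are not yet encrypted. The 'Encrypted' flag in
# each volume dict matches the shape of EC2's describe_volumes response.

def unencrypted_volume_ids(volumes: list) -> list:
    """Return IDs of volumes whose 'Encrypted' flag is false or missing."""
    return [v["VolumeId"] for v in volumes if not v.get("Encrypted")]

def scan_region(region: str) -> list:
    import boto3  # lazy import: only needed when actually calling AWS
    ec2 = boto3.client("ec2", region_name=region)
    resp = ec2.describe_volumes()  # {"Volumes": [{"VolumeId": ..., "Encrypted": ...}]}
    return unencrypted_volume_ids(resp["Volumes"])
```

The remediation step (snapshot, copy with encryption, swap the volume) would follow for each returned ID, typically driven by the Ansible/Terraform automation mentioned above.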
American Family Insurance, Madison, Wisconsin Jan 2019 - Jun 2020
Python Developer
Responsibilities
Analyzed and gathered business requirements by interacting with the client and reviewing business requirement specification documents.
Implemented REST APIs using Python and the Django framework.
Developed ETL jobs using AWS Glue per the requirements to update data in the staging database (Postgres) from various data sources and REST APIs.
Used data types like dictionaries and tuples, and object-oriented concepts such as inheritance, to build complex network algorithms.
Used the Python unittest library for testing many programs in Python and other code.
Wrote Python scripts to parse JSON documents and load the data into the database.
Implemented web applications in the Flask framework following the MVC architecture.
Developed data transition programs from DynamoDB to AWS Redshift (ETL process) using AWS Lambda, creating Python functions for certain events based on use cases.
Worked on AWS SQS and consumed data from S3 buckets.
Imported data from different sources like AWS S3 and the local file system into Spark RDDs.
Responsible for delivering datasets from Snowflake to the One Lake Data Warehouse; built CI/CD pipelines using Jenkins and AWS Lambda; imported data from DynamoDB to Redshift in batches using AWS Batch with the TWS scheduler.
Used Celery with RabbitMQ, MySQL, Django, and Flask to create a distributed worker framework.
Implemented SQLAlchemy, a Python library providing full access to SQL.
Implemented a full CI/CD pipeline by integrating SCM (Git) with the automated testing tool Gradle, deploying with Jenkins and Dockerized containers in production, and engaged with DevOps tools such as Ansible, Chef, AWS CloudFormation, AWS CodePipeline, Terraform, and Kubernetes.
Utilized Python libraries like Boto3 and NumPy for AWS.
Used a test-driven development (TDD) approach for developing the services required for the application.
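The distributed worker framework mentioned above (Celery with RabbitMQ) can be sketched in miniature with the stdlib: `queue.Queue` stands in for the RabbitMQ broker and threads stand in for Celery workers. The doubling task is a hypothetical placeholder:

```python
import queue
import threading

# Local stand-in for a Celery worker pool: payloads go onto a queue
# (the "broker") and a pool of worker threads consumes them.

def worker(task_q: queue.Queue, results: list) -> None:
    while True:
        item = task_q.get()
        if item is None:          # sentinel: shut this worker down
            task_q.task_done()
            break
        results.append(item * 2)  # hypothetical task: double the payload
        task_q.task_done()

def run_jobs(payloads, n_workers: int = 2):
    task_q, results = queue.Queue(), []
    threads = [threading.Thread(target=worker, args=(task_q, results))
               for _ in range(n_workers)]
    for t in threads:
        t.start()
    for p in payloads:
        task_q.put(p)
    for _ in threads:
        task_q.put(None)          # one sentinel per worker
    for t in threads:
        t.join()
    return sorted(results)
```

With Celery, each task would be a function decorated with `@app.task`, the broker URL would point at RabbitMQ, and the worker pool would run as separate processes or hosts rather than threads.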
Used Git to manage and version the Python and portlet code.
Environment: Python 3.x, Django 1.9, TeamCity, AWS, S3 buckets, AWS Lambda, Redmine, HTML5, CSS3, JavaScript, jQuery, MySQL, Docker, Kubernetes, MongoDB, Angular JS, JIRA, RabbitMQ, Selenium, web services, Jenkins, Git, Linux
Exxon Mobil, Spring, Texas Aug 2017 - Jan 2019
Python Developer
Responsibilities
Responsible for the SDLC process: gathering requirements, system analysis, design, development, testing, and deployment.
Used PySpark to expose the Spark API to Python.
Designed and developed a data management system using MySQL.
Rewrote existing Python/Django modules to deliver data in specific formats.
Responsible for debugging and troubleshooting the web application.
Developed a server-based web traffic statistical analysis tool using Flask and Pandas.
Wrote Python scripts to parse XML documents and load the data into the database.
Used jQuery for all client-side JavaScript manipulation.
Created a unit test/regression test framework for working and new code.
Created a Python-based GUI application for freight tracking and processing.
Worked on application development in a UNIX environment and familiar with all its commands.
Environment: Python, Flask, PHP, Ruby, HTML5, CSS, JavaScript, jQuery, AJAX, web services, GitHub, Selenium, MySQL, PostgreSQL, MongoDB
Social Solutions, Austin, Texas Jul 2016 - Aug 2017
Python Developer
Responsibilities
Developed MVC templates for candidate login and registration with an end-to-end protocol to save details in the database.
Designed forms, modules, views, and templates using Django and Python.
Installed, configured, and maintained MySQL and MongoDB databases.
Collected logs using Logcat and reported bugs in Redmine.
Installed, configured, and maintained Apache servers on all machines (production and development servers).
Extensive experience deploying, managing, and developing Oracle SQL Developer clusters.
Rewrote an existing Python/Flask module to deliver data in specific formats.
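The "parse XML documents and load the data in the database" task described in the Exxon Mobil role above can be sketched with stdlib modules. The `<shipment>` schema is hypothetical, and sqlite3 stands in for the production MySQL database so the example is self-contained:

```python
import sqlite3
import xml.etree.ElementTree as ET

# Parse an XML document and bulk-load its records into a database table.
# The shipment/weight schema is illustrative only.

def load_shipments(xml_text: str, conn: sqlite3.Connection) -> int:
    conn.execute("CREATE TABLE IF NOT EXISTS shipments (id TEXT, weight REAL)")
    root = ET.fromstring(xml_text)
    rows = [(s.get("id"), float(s.findtext("weight")))
            for s in root.iter("shipment")]
    conn.executemany("INSERT INTO shipments VALUES (?, ?)", rows)
    return len(rows)
```

For large feeds, `ET.iterparse` would replace `fromstring` so the document streams instead of loading fully into memory.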
Created a Python script for calling REST APIs.
Responsible for debugging and troubleshooting the web application.
Collaborated with internal teams to convert end-user feedback into meaningful and improved solutions.
Environment: Python, Django, Flask, JIRA, Redmine, Jenkins, Apache Web Server, Apache Spark, XML, MySQL, SQLAlchemy, MongoDB, REST, CSS, SASS, AJAX, HTML, shell scripting, Sublime, XHTML, SVC
CMS Info Systems, India Nov 2014 - Aug 2015
Python Developer
Responsibilities
Participated in the complete SDLC process and used PHP to develop website functionality.
Designed and developed the UI of the website using HTML, XHTML, AJAX, CSS, and JavaScript.
Developed entire front-end and back-end modules using Python on the Django web framework.
Designed and developed a data management system using MySQL.
Built application logic using Python 2.7.
Used Django APIs for database access.
Used Python to extract information from XML files.
Developed a shopping cart for the library and integrated web services to access payments (e-commerce).
Designed and developed horizontally scalable APIs using Python Flask.
Designed the Cassandra schema for the APIs.
Implemented monitoring and established best practices around using Elasticsearch.
Effectively communicated with external vendors to resolve queries.
Used Git for version control.
Environment: Python 2.6/2.7, JavaScript, Django Framework 1.3, CSS, SQL, MySQL, LAMP, jQuery, Adobe Dreamweaver, Apache Web Server
Gate Info Solutions, India Feb 2013 - Nov 2014
Java/SQL Developer (Intern)
Responsibilities
Involved in developing the UI pages using HTML, DHTML, CSS, JavaScript, JSON, jQuery, and Ajax.
Involved in writing test cases using JUnit.
Involved in developing HTML and JavaScript for client-side presentation and data validation within forms.
Created a database access layer using JDBC and PL/SQL stored procedures.
Designed the object model, data model, tables, constraints, and necessary stored procedures, functions, triggers, and packages for the Oracle database.
Worked on Java-based connectivity for client requirements over JDBC connections.
Involved in peer code reviews and performed integration testing of the modules.
EDUCATION & CERTIFICATIONS
Harrisburg University of Science and Technology, Harrisburg, PA, Grad Year 2018
Master's in Computer Information Systems
University of Wales, Cardiff, UK, Grad Year 2013
Master's in Computer Applications
Kakatiya University, Warangal, Telangana, India, Undergrad Year 2009
Bachelor's in Pharmaceutical Sciences