I'm Pavan Kumar Dharomoju, a keen AI enthusiast, data scientist, and passionate photographer. I am an engineering graduate from CBIT with a data science specialization from IIT Madras, now delving deeper into AI at Northwestern University. I've gained substantial professional experience at Deloitte USI, along with technical skills and multiple certifications.
When not immersed in technology and data, I love capturing the world through my lens. This space offers a glimpse into my journey of innovation, leadership, research, and photography.
Welcome to my world where technology, data, and creativity intertwine.
Led an AI engineering team focused on accelerating CRISPR research. Developed a scalable system that leveraged over 6 million PubMed Central documents to enhance data analysis and bioinformatics workflows. Reduced model inference time by 40% by incorporating multithreading and GPU acceleration, significantly improving research efficiency. Designed RESTful APIs on AWS to handle large-scale data processing, directly supporting breakthroughs in CRISPR technology and contributing to advancements in gene-editing research.
Worked as a Business Technology Analyst, implementing machine learning models within Salesforce CRM to improve customer recommendations and client satisfaction. Streamlined processes through automated workflows and CI/CD pipelines, boosting efficiency and reducing project delivery times by 30%. Deployed predictive models on AWS to enable faster, data-driven decision-making for clients.
Applied predictive modeling to boost lead generation by 40%. Automated data pipelines, cutting processing time by half. Optimized marketing budgets to increase return on investment by 18%. This role showcased the power of data-driven strategies in driving business growth.
Enhanced machine learning models, improving neural network accuracy by 11% through parameter tuning. Contributed to the development of models using TensorFlow and Scikit-Learn, deepening my practical understanding of AI techniques.
Pursuing advanced studies in AI, focusing on machine learning, natural language processing, and robotics. Engaged in research projects addressing real-world problems, such as developing models to predict disease outbreaks. This program is enhancing my technical expertise and problem-solving skills, preparing me to contribute innovatively to the field of AI.
Completed a diploma in data science while working full-time. Studied statistical analysis, machine learning algorithms, and big data technologies. Applied learning to real projects, such as analyzing consumer behavior to improve business strategies. This education enhanced my ability to derive insights from complex data and make data-driven decisions.
Gained a strong foundation in electrical circuits, electronics, and power systems through a blend of theoretical and practical learning. Participated in lab work and collaborative projects, including designing an energy-efficient power distribution system. This education ignited my passion for engineering and prepared me to tackle complex technical challenges.
Developed a Streamlit dashboard for marketing teams to visualize and forecast the effectiveness of marketing campaigns. It surfaces feature importance from pre-trained models (Ridge, Bayesian Ridge, ElasticNet), forecasts ROI using ARIMA time-series analysis, and offers visualizations such as correlation heatmaps and scatterplots. This tool aids in optimizing marketing strategies and budget allocation.
Tech Stack: Python, Streamlit, Machine Learning, Time-Series Analysis.
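As a rough illustration of how the dashboard's two main pieces fit together, here is a minimal Streamlit sketch pairing Ridge-based feature importance with an ARIMA ROI forecast. The data file, channel columns, and ARIMA order are illustrative placeholders, not the production setup.

```python
# Minimal sketch: Ridge feature importance plus an ARIMA ROI forecast in Streamlit.
# The CSV path, column names, and ARIMA order are hypothetical placeholders.
import pandas as pd
import streamlit as st
from sklearn.linear_model import Ridge
from statsmodels.tsa.arima.model import ARIMA

st.title("Marketing Campaign Dashboard")

df = pd.read_csv("campaigns.csv")                        # hypothetical campaign dataset
features = ["tv_spend", "social_spend", "email_spend"]   # illustrative spend channels

# Feature importance: absolute Ridge coefficients per spend channel
ridge = Ridge(alpha=1.0).fit(df[features], df["revenue"])
importance = pd.Series(abs(ridge.coef_), index=features).sort_values()
st.bar_chart(importance)

# ROI forecast: fit ARIMA on historical monthly ROI and project 6 periods ahead
roi = df.set_index("month")["roi"]
forecast = ARIMA(roi, order=(1, 1, 1)).fit().forecast(steps=6)
st.line_chart(forecast)
```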
Implemented a Reinforcement Learning solution using Deep Q-Networks (DQN) to optimize maintenance strategies for a fleet of trucks. Created a custom OpenAI Gym environment to simulate truck fleet dynamics and tire wear. The RL agent helps reduce operational costs by optimizing tire rotation and maintenance schedules. This project demonstrates advanced AI techniques applied to real-world logistics challenges.
Tech Stack: Python, PyTorch, OpenAI Gym, Reinforcement Learning.
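A minimal sketch of the core pieces follows: a toy Gym environment tracking tire wear and a small PyTorch Q-network. The state variables, reward shape, and maintenance costs are illustrative stand-ins for the actual fleet simulation.

```python
# Illustrative sketch only: the wear dynamics, costs, and network size are assumptions.
import gym
import numpy as np
import torch
import torch.nn as nn

class TruckTireEnv(gym.Env):
    """Toy fleet environment: state = per-truck tire wear; actions = wait / rotate / replace."""
    def __init__(self, n_trucks=5):
        super().__init__()
        self.n_trucks = n_trucks
        self.observation_space = gym.spaces.Box(0.0, 1.0, shape=(n_trucks,), dtype=np.float32)
        self.action_space = gym.spaces.Discrete(3)
        self.wear = np.zeros(n_trucks, dtype=np.float32)

    def reset(self):
        self.wear = np.random.uniform(0.0, 0.3, self.n_trucks).astype(np.float32)
        return self.wear.copy()

    def step(self, action):
        self.wear += np.random.uniform(0.01, 0.05, self.n_trucks)  # wear accumulates each trip
        cost = 0.0
        if action == 1:       # rotate tires: slows wear at a small service cost
            self.wear *= 0.9
            cost = 1.0
        elif action == 2:     # replace tires: resets wear at a higher cost
            self.wear[:] = 0.0
            cost = 5.0
        breakdown_penalty = float((self.wear > 1.0).sum()) * 10.0   # penalty for worn-out tires
        reward = -(cost + breakdown_penalty)
        done = bool((self.wear > 1.0).all())
        return self.wear.copy(), reward, done, {}

class QNetwork(nn.Module):
    """Maps the wear vector to Q-values for the three maintenance actions."""
    def __init__(self, n_trucks=5, n_actions=3):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(n_trucks, 64), nn.ReLU(), nn.Linear(64, n_actions))

    def forward(self, x):
        return self.net(x)

# Greedy action selection for a single state (training loop omitted for brevity)
env = TruckTireEnv()
q_net = QNetwork()
state = env.reset()
q_values = q_net(torch.tensor(state).unsqueeze(0))
action = int(q_values.argmax(dim=1))
```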
Deployed a scalable web application using Kubernetes and MongoDB, demonstrating container orchestration and database integration. Set up ConfigMaps, Secrets, Deployments, and Services in Kubernetes, and utilized Docker and Minikube for local development and testing. This project showcases DevOps practices and automates deployment processes, enhancing reliability and scalability of web applications.
Tech Stack: Kubernetes, Docker, MongoDB, YAML, DevOps.
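The deployment itself is defined through YAML manifests; as a rough Python illustration, the sketch below builds an equivalent Deployment with the official Kubernetes Python client and wires a MongoDB connection string in from a Secret. The image name, secret name, and port are hypothetical.

```python
# Sketch of creating a Deployment programmatically; in the project this lives in YAML.
# "example/web-app:latest", "mongo-secret", and port 8080 are placeholder values.
from kubernetes import client, config

config.load_kube_config()            # uses the local kubeconfig (e.g. Minikube)
apps = client.AppsV1Api()

deployment = client.V1Deployment(
    api_version="apps/v1",
    kind="Deployment",
    metadata=client.V1ObjectMeta(name="web-app"),
    spec=client.V1DeploymentSpec(
        replicas=2,
        selector=client.V1LabelSelector(match_labels={"app": "web-app"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "web-app"}),
            spec=client.V1PodSpec(containers=[
                client.V1Container(
                    name="web-app",
                    image="example/web-app:latest",
                    ports=[client.V1ContainerPort(container_port=8080)],
                    # MongoDB URI injected from a Secret rather than hard-coded
                    env=[client.V1EnvVar(
                        name="MONGO_URI",
                        value_from=client.V1EnvVarSource(
                            secret_key_ref=client.V1SecretKeySelector(
                                name="mongo-secret", key="uri")))],
                )
            ]),
        ),
    ),
)

apps.create_namespaced_deployment(namespace="default", body=deployment)
```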
Predicted electricity load using Long Short-Term Memory (LSTM) neural networks to assist in energy distribution and management. The project involved time series analysis and developing models to forecast future electricity demand, helping utility companies optimize their operations. Included data preprocessing, feature engineering, and model evaluation to ensure high prediction accuracy and reliability.
Tech Stack: Python, Keras, TensorFlow, Time Series Analysis, Deep Learning.
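A minimal Keras sketch of the forecasting setup is shown below: a sliding window over the load series feeding a single-layer LSTM. The window length, layer size, and synthetic stand-in series are illustrative, not the actual utility data or tuned architecture.

```python
# Illustrative sketch: a sliding-window LSTM for one-step-ahead load forecasting.
import numpy as np
from tensorflow import keras

def make_windows(series, window=24):
    """Turn a 1-D load series into (samples, window, 1) inputs and next-step targets."""
    X, y = [], []
    for i in range(len(series) - window):
        X.append(series[i:i + window])
        y.append(series[i + window])
    return np.array(X)[..., None], np.array(y)

# Synthetic stand-in for the real electricity load series
load = np.sin(np.linspace(0, 50, 2000)) + np.random.normal(0, 0.1, 2000)
X, y = make_windows(load)

model = keras.Sequential([
    keras.layers.LSTM(64, input_shape=(X.shape[1], 1)),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=32, validation_split=0.1)

next_step = model.predict(X[-1:])   # forecast from the most recent window
```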
Explored and optimized decision tree models through parameter tuning, cross-validation, and feature importance analysis to improve predictive accuracy. The project includes data preparation steps, such as handling missing values and encoding categorical variables, as well as model evaluation using metrics like accuracy and confusion matrices. Provides insights into enhancing decision tree performance for various classification tasks.
Tech Stack: Python, Scikit-Learn, Machine Learning, Data Analysis.
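The tuning workflow looks roughly like the sketch below: a cross-validated grid search over tree depth and split criteria, followed by accuracy, confusion-matrix, and feature-importance checks. The parameter grid and the stand-in dataset are illustrative.

```python
# Illustrative sketch on the iris dataset; the real project uses its own data and grid.
from sklearn.datasets import load_iris
from sklearn.metrics import accuracy_score, confusion_matrix
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Cross-validated search over depth, minimum split size, and split criterion
grid = GridSearchCV(
    DecisionTreeClassifier(random_state=42),
    param_grid={
        "max_depth": [3, 5, None],
        "min_samples_split": [2, 5, 10],
        "criterion": ["gini", "entropy"],
    },
    cv=5,
)
grid.fit(X_train, y_train)

best = grid.best_estimator_
pred = best.predict(X_test)
print("Best params:", grid.best_params_)
print("Accuracy:", accuracy_score(y_test, pred))
print("Confusion matrix:\n", confusion_matrix(y_test, pred))
print("Feature importances:", best.feature_importances_)
```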
Built a machine learning model to predict customer churn for a telecommunications company. By identifying at-risk customers, the company can take preventive measures to retain them, reducing churn and increasing profitability. The project involved data preprocessing, feature engineering, model training using Random Forest Classifier with hyperparameter tuning, and model evaluation using classification reports and AUC-ROC score.
Tech Stack: Python, Scikit-Learn, Machine Learning.
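A condensed sketch of the churn pipeline is shown below: preprocessing, a Random Forest, grid-searched hyperparameters, and evaluation with a classification report and AUC-ROC. The file name, column names, and parameter grid are hypothetical placeholders for the actual telecom dataset.

```python
# Illustrative sketch; "telco_churn.csv" and the feature columns are placeholders.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report, roc_auc_score
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

df = pd.read_csv("telco_churn.csv")
numeric = ["tenure", "monthly_charges"]            # illustrative numeric features
categorical = ["contract", "payment_method"]       # illustrative categorical features
X = df[numeric + categorical]
y = (df["churn"] == "Yes").astype(int)             # assumes a Yes/No churn label

preprocess = ColumnTransformer([
    ("num", StandardScaler(), numeric),
    ("cat", OneHotEncoder(handle_unknown="ignore"), categorical),
])
pipeline = Pipeline([("prep", preprocess), ("rf", RandomForestClassifier(random_state=42))])

# Hyperparameter tuning over forest size and depth, scored by AUC-ROC
grid = GridSearchCV(
    pipeline,
    {"rf__n_estimators": [200, 400], "rf__max_depth": [None, 10, 20]},
    cv=5,
    scoring="roc_auc",
)
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, test_size=0.2, random_state=42)
grid.fit(X_train, y_train)

proba = grid.predict_proba(X_test)[:, 1]
print(classification_report(y_test, grid.predict(X_test)))
print("AUC-ROC:", roc_auc_score(y_test, proba))
```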
Able to analyze large datasets, identify patterns, and make data-driven decisions.
Adept in utilizing LLMs like GPT and BERT, excelling in complex language processing.
Proficient in creating and implementing machine learning models to solve business problems.
Expertise in using statistical techniques for hypothesis testing and predictive analysis.
Skilled in creating intuitive data visualizations to effectively communicate findings.
Experienced in maintaining and improving infrastructure, and automating software development processes.
Proficient in data extraction, transformation, loading (ETL) and the setup of data pipelines.
Specializes in cutting-edge deep learning techniques, including CNNs, RNNs, and LSTMs.