
Hi, I'm Rahul 👋

Data Migration Consultant

I specialize in SAP BODS development, helping enterprises streamline their data migration processes for seamless system transitions. With expertise in data integration and transformation, I enable businesses to optimize their data strategies for greater efficiency and insight.

About

Throughout my journey as an SAP BODS Developer, I've been driven by a clear purpose: data should empower businesses, not complicate them. At the intersection of data migration and quality management, I focus on simplifying complex data processes to ensure seamless transitions and reliable insights.

Currently, I'm working as a Data Migration Consultant at TCS, focusing on optimizing data migration processes for seamless system transitions. I specialize in leveraging SAP Data Services and the Legacy Transfer Migration Cockpit to enhance data quality and accessibility. Prior to TCS, I gained valuable experience at Capgemini, working on diverse projects that honed my skills in ETL processes and data migration. My dedication to helping organizations harness the power of data has driven me to deliver effective solutions for informed decision-making.

Experience

Sep 2024 — Present

SAP BODS Consultant • TCS

Currently working on a greenfield implementation project for an Austrian client. Responsible for data migration and conversion tasks, ensuring seamless integration of legacy data into the new SAP environment. Utilizing SAP BODS and LTMC to drive efficient data extraction, transformation, and loading processes. Collaborating with cross-functional teams to ensure data accuracy and successful implementation of the migration strategy.

SAP Data Services 4.2
Migration Cockpit
SQL
Azure
Agile

Apr 2021 — Sep 2024

SAP BODS Developer • Capgemini

Successfully completed 2 Go-Live projects, ensuring smooth transitions and implementations. Assisted senior developers in the development and testing of SAP BODS conversion jobs, gaining hands-on experience in data extraction, transformation, and loading techniques. Conducted data validation and quality checks against established standards using custom profiling rules in Information Steward. Prepared the relevancy, pre-load, and post-load reports required for the migration process. Worked on DEV, QA, pre-production, and production loads and fixed defects raised in the HP ALM tool. Conducted performance tuning and troubleshooting to enhance data processing efficiency and minimize system downtime. Participated in scrum meetings, contributed ideas for process improvements, and assisted in documentation efforts.
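
To give a flavour of what those pre-load and post-load reconciliation reports cover, here is a minimal sketch in Python/pandas; the file names, key column, and checks are illustrative assumptions, not the actual project reports or Information Steward rules.

```python
# Minimal sketch of a pre-load vs. post-load reconciliation check.
# File names, the key column, and the checks are assumed examples,
# not the actual project reports or Information Steward rules.
import pandas as pd

def reconcile(preload_csv: str, postload_csv: str, key: str = "MATNR") -> pd.DataFrame:
    pre = pd.read_csv(preload_csv, dtype=str)
    post = pd.read_csv(postload_csv, dtype=str)

    summary = {
        "preload_records": len(pre),
        "postload_records": len(post),
        # Basic quality checks on the pre-load extract
        "duplicate_keys_in_preload": int(pre[key].duplicated().sum()),
        "blank_keys_in_preload": int((pre[key].fillna("").str.strip() == "").sum()),
        # Keys present before the load but missing afterwards
        "keys_missing_after_load": len(set(pre[key]) - set(post[key])),
    }
    return pd.DataFrame([summary])

if __name__ == "__main__":
    print(reconcile("material_preload.csv", "material_postload.csv"))
```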

SAP Data Services 4.1
Information Steward
MS SQL Server
LTMC
Confluence

Projects


Data Migration with SAP BODS - Udemy Course - Work in Progress

Unlock the power of SAP BODS for seamless data migration with my comprehensive Udemy course. Dive deep into the fundamentals of data migration from ECC to S/4HANA, and master the tools and techniques to elevate your skills. In the meantime, explore more insights on my blog: rahultanwar.hashnode.dev

SAP Data Services 4.2
SQL
Udemy
Blogs

Azure Data Engineering Project

The project aims to develop a robust data pipeline that ingests JSON files from Azure Blob Storage, processes the data into CSV format, performs necessary data cleaning operations, and visualizes business metrics using Tableau. The project will leverage Azure Data Factory for orchestrating data movement and Azure Databricks for data transformation and cleaning.
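
A rough sketch of the Databricks transformation step is below, assuming the Blob Storage container is already mounted and using made-up column names; the real pipeline, schema, and paths differ.

```python
# Rough PySpark sketch of the Databricks step: read JSON from Blob Storage,
# apply basic cleaning, and write a CSV for Tableau. The mount point and
# column names (order_id, order_amount, order_date) are assumed examples.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("blob-json-to-csv").getOrCreate()

# Assumes the raw container is mounted at /mnt/raw (e.g. via dbutils.fs.mount)
raw = spark.read.json("/mnt/raw/orders/*.json")

clean = (
    raw.dropDuplicates(["order_id"])                       # drop duplicate orders
       .filter(F.col("order_amount").isNotNull())          # remove rows with no amount
       .withColumn("order_date", F.to_date("order_date"))  # normalise date strings
)

# Single CSV output for downstream Tableau consumption
clean.coalesce(1).write.mode("overwrite").option("header", True).csv("/mnt/curated/orders_csv")
```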

Azure Blob Storage
Azure Data Factory
Azure Databricks (PySpark and Spark SQL)
Tableau

Yelp.com Data Extractor

Developed an automated solution using UiPath to extract and process data from Yelp.com for various geographic locations. The bot was designed to collect structured data, transform it into a predefined format based on client requirements, and deliver the final output via automated email to the client. This project significantly streamlined the data extraction and delivery process, enhancing operational efficiency.

UiPath
GitHub
UiPath Studio
XAML
JSON

Mac OS Clock

A simple clock website developed using HTML, CSS, and JavaScript. The website displays the current time in a user-friendly interface and provides basic functionalities such as time updating and customization options.

HTML
CSS
JavaScript
Netlify
GitHub

Suicides Data Analysis

Analyzed the dataset of suicides in India during 2001–2012. Performed the ETL process in Azure Data Factory, stored the resulting dataset in SQL Server, and ran different types of queries to identify trends in the data.
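
For illustration, a rough pandas equivalent of a couple of the trend queries is shown below; the project itself ran T-SQL against Azure SQL, and the file and column names here are assumptions.

```python
# Rough pandas equivalent of a couple of trend queries (the project used
# T-SQL on Azure SQL). The file name and column names are assumptions.
import pandas as pd

df = pd.read_csv("suicides_india_2001_2012.csv")

# Yearly totals, to see the overall trend across 2001-2012
yearly_totals = df.groupby("year")["count"].sum().reset_index()

# Top 5 states by total reported cases over the whole period
top_states = (
    df.groupby("state")["count"].sum()
      .sort_values(ascending=False)
      .head(5)
)

print(yearly_totals)
print(top_states)
```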

SQL Server Management Studio
Azure Data Factory
Azure SQL Server
Azure SQL Database
T-SQL

Get In Touch

Looking to chat? Feel free to DM me on Telegram - the fastest way to reach out to me!

Coded in Visual Studio Code. Built with Next.js, Tailwind CSS, and shadcn/ui; deployed with Netlify.