Worked on Life Science projects such as Contract & Pricing, Claims, and Government Pricing.
● Set up and built the entire database model and architecture of the project on PostgreSQL.
● Built a pipeline in Python to send structured data to Amazon S3 buckets (see the first sketch after this list).
● Processed the data using Amazon EMR.
● Worked on AWS EC2 and RDS for DB development.
● Used Sqoop for data ingestion.
● Optimized existing data pipelines using PySpark and SQL, resulting in a 40% reduction in processing time (see the PySpark sketch after this list).
● Created reports using Amazon Redshift and Oracle.
● Performed query optimization and analysis using EXPLAIN plans and database configuration tuning.
● Assisted in the development and deployment of data pipelines using CI/CD methodologies, Docker containerization, and Git version control.
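A minimal sketch of the Python-to-S3 load described above, using boto3; the bucket name, object key, and file path are illustrative placeholders, not values from the project:

```python
import boto3

BUCKET = "example-pricing-data"            # hypothetical bucket name
KEY = "claims/2020/claims_extract.csv"     # hypothetical object key

def upload_extract(local_path: str) -> None:
    """Upload one structured-data extract to S3 (credentials from standard AWS config)."""
    s3 = boto3.client("s3")
    s3.upload_file(local_path, BUCKET, KEY)

if __name__ == "__main__":
    upload_extract("claims_extract.csv")
```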
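And a sketch of the kind of PySpark optimization mentioned above; the paths, column names, and broadcast-join choice are assumptions for illustration, not the project's actual pipeline:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("pricing_pipeline").getOrCreate()

# Read only the columns needed downstream (projection pushdown).
claims = (spark.read.parquet("s3://example-bucket/claims/")
          .select("claim_id", "account_id", "amount"))

# Aggregate before joining to cut shuffle volume.
totals = claims.groupBy("account_id").agg(F.sum("amount").alias("total_amount"))

# Broadcast the small aggregated side to avoid a full shuffle join.
accounts = spark.read.parquet("s3://example-bucket/accounts/")
result = accounts.join(F.broadcast(totals), "account_id")

result.write.mode("overwrite").parquet("s3://example-bucket/claims_summary/")
```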
Worked on various major projects such as Boarding Reconciliation, the Baggage system, Lost and Found, Contract Tracker, and Inventory/Integrity violation reports.
As Database Developer: Nov 2018 - Dec 2020
● Set up and built the entire database model and architecture of the project on Oracle.
● Developed and maintained data models to improve data quality and consistency across various data sources.
● Worked on Stored Procedures, Packages, Views, Functions, Cursors, Triggers and XML.
● Created various market-share and regulatory reports.
● Optimized queries for faster reports across different databases, including Oracle, PostgreSQL, and MS SQL Server.
● Performed query optimization and analysis using Oracle AWR reports and EXPLAIN PLAN (sketched below).
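For illustration, one way an EXPLAIN PLAN analysis like the above can be driven from Python with the cx_Oracle driver; the connection string, table, and query are hypothetical:

```python
import cx_Oracle

# Hypothetical DSN; real credentials would come from configuration.
conn = cx_Oracle.connect("report_user/secret@dbhost:1521/ORCL")
cur = conn.cursor()

# Generate an execution plan for a slow report query, then read it back.
cur.execute("EXPLAIN PLAN FOR SELECT * FROM sales WHERE region = 'EU'")
cur.execute("SELECT plan_table_output FROM TABLE(DBMS_XPLAN.DISPLAY())")
for (line,) in cur:
    print(line)
```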
As Data Engineer: Apr 2020 - Nov 2021
● Designed and implemented data pipelines for ground services operations using Oracle and AWS, resulting in a 50% reduction in data processing time.
● Used Informatica to load data from multiple Oracle source servers into a single destination.
● Worked on ETL batch jobs.
● Loaded data into Amazon S3 and used Redshift as the database.
● Worked on the data lake and staging area, where data is staged in Avro file format (sketched below).
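A minimal sketch of staging a batch in Avro, assuming the fastavro library; the schema and records are illustrative, not from the actual ground-services feed:

```python
from fastavro import writer, parse_schema

# Hypothetical schema for one staged record.
schema = parse_schema({
    "name": "FlightBags",
    "type": "record",
    "fields": [
        {"name": "flight_id", "type": "string"},
        {"name": "bags_loaded", "type": "int"},
    ],
})

records = [
    {"flight_id": "AA100", "bags_loaded": 182},
    {"flight_id": "BA220", "bags_loaded": 95},
]

# Write the batch to a local Avro file before it is pushed to the S3 staging area.
with open("flight_bags.avro", "wb") as out:
    writer(out, schema, records)
```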
DCH (Data Clearing House)
● Worked on client user stories, requirement gathering, and developing the final solution.
● Worked on Stored Procedures, Packages, Views, Triggers, and XML processing.
● Performed query optimization and analysis using Oracle AWR reports and EXPLAIN PLAN.
● Conducted code reviews and knowledge sharing across the team to eliminate single-person dependencies.
● Delivered custom client reports on an ad-hoc basis.
● Worked on job scheduling using crontab.
● Worked mainly on DML operations on a daily basis.
● Created a reporting tool using Oracle PL/SQL, Perl, and APEX to automate client requests (see the sketch after this list).
● Modified and maintained databases using Oracle SQL, PL/SQL, and UNIX/Linux.
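Illustrative only: one way such a PL/SQL report procedure might be invoked from a crontab-scheduled Python job; the package, procedure, and connection details are hypothetical:

```python
import cx_Oracle

# Scheduled via crontab, e.g.: 0 6 * * * python generate_report.py
conn = cx_Oracle.connect("report_user/secret@dbhost:1521/DCH")
cur = conn.cursor()

# Call a hypothetical PL/SQL procedure that builds one client report.
cur.callproc("report_pkg.generate_client_report", ["CLIENT_42", "2021-10"])

conn.commit()
conn.close()
```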
Smart Axiata
● Worked on data migration and post-migration data integrity checks.
● Tested all plans against the requirements in the Product Design Documents by preparing detailed test cases covering all scenarios.
Celcom (Kenan Billing system)
● Worked on the Account Move Tool (AMT), which handles the data of customers with post-paid connections.
● Built triggers to prevent accounts that were not eligible for a move from being scheduled.
Skills: SQL, PL/SQL, Python, React
Microsoft Certified Data Analyst Associate.
Databricks Certified Data Engineer Associate