Ramarao Bhashyam

Singapore

Summary

  • Experienced IT professional with over 14 years of expertise in Big Data platform implementation across Banking and Financial Services, Retail, Lending, and Fleet Management domains.
  • Played a key role in the conceptualization and design of a confidential Big Data platform for extracting both structured and unstructured data from enterprise transactional systems, applying advanced analytics for customer sentiment and risk analysis.
  • Skilled in end-to-end Data Warehousing and Business Intelligence solutions across on-premises and cloud environments. Proficient in Cloudera/Hortonworks Hadoop ecosystems, Spark, and various data analytics and ETL tools.
  • Demonstrated experience in data preparation, modeling, and insight generation using cloud platforms and traditional infrastructure.
  • Hands-on experience with technologies including Python, SQL, Microsoft Excel, Hive, PySpark, and Spark SQL for data analysis, feature engineering, and machine learning. Adept in the MSBI stack (SSIS, SSRS, SSAS 2012–2016), Azure Data Factory, Power BI (Power Query, Power Pivot, Power View), and Informatica BDM. Well-versed in tools like Denodo, SQL Developer, Tableau, Informatica PowerCenter, and UNIX.
  • Strong programming skills in Python, PL/SQL, T-SQL, PowerShell, C#, and PySpark API. Expertise in designing data warehouses and databases for regulatory reporting and analytics, with a focus on performance tuning and query optimization.
  • Deep knowledge of AWS and Azure database services, including implementation and migration.
  • Successfully led database migration projects from on-premises Microsoft SQL Server to AWS RDS PostgreSQL. Familiar with Agile, Scrum, and Kanban methodologies. Experienced in DevOps tools such as GitHub, GitLab, Jenkins, Docker, and Ansible for automated deployment.
  • Demonstrated ability in incident predictability by proactively identifying and monitoring customer incidents, automating root cause analysis, and reducing incident occurrences through proof-of-concept solutions.
  • A result-oriented, fast learner with strong planning and execution capabilities.

Overview

15 years of professional experience
1 Certification

Work History

Technical Lead (AVP)

OCBC Bank
09.2023 - Current
  • Working as Technical Lead for the Finance squad's regulatory reporting deliverables in private banking, covering both the Venom and Carnage squads. The OP23 deliverables are listed below:
  • SDIC submission changes
  • MAS 757
  • MAS 637
  • FRR - FCR
  • MAS 610s
  • Bank of Singapore is a subsidiary of OCBC Bank dedicated to private banking. Through its internet bank, bankofsingapore.com, it offers a suite of banking products designed to help customers achieve a range of financial goals.

Data Engineer

Beyondsoft
11.2022 - 09.2023
  • The SPDB Bonds twin-trader monitoring system is a recommendation engine built to provide traders with recommendations on US and China bonds for future trading and to give traders insights after they accept or reject recommendations. The work included the activities below.
  • Requirements study & Analysis
  • Prepared the design documents (Functional and Technical).
  • Worked on the data modeling for the recommendation engine back end.
  • Worked on data engineering and feature engineering for the project.
  • Data engineering included the tasks below:
  • Built a pipeline to download data from the internet to SFTP, scheduled via an Azure Function App.
  • Built a pipeline to transfer files from SFTP to S3 object storage in the DMZ zone.
  • Built an ETL pipeline to ingest data from the S3 bucket into a MySQL database using Python (a minimal sketch follows the tools list below).
  • Worked on a Python feature engineering pipeline to apply the transformation logic required for machine learning model training.
  • Built a Docker image and set up a cron job by scheduling it on the on-premises Kubernetes cluster.
  • Implemented an archival and retention strategy on both the S3 object storage and the MySQL database.
  • Tools Used: Docker, Kubernetes, MySQL, S3, Python, Django, Python APIs (boto3, numpy, pandas, akshare, fredapi, pymysql, mysql-connector-python)
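
A minimal Python sketch of the S3-to-MySQL ingestion step referenced above. The bucket, key, table, and column names are hypothetical placeholders; the production pipeline wrapped this core in scheduling, archival, and error handling.

    import io

    import boto3
    import pandas as pd
    import pymysql

    BUCKET = "bond-data"          # hypothetical bucket name
    KEY = "daily/us_bonds.csv"    # hypothetical object key

    def load_s3_csv_to_mysql() -> None:
        # Read the CSV object from S3 into a DataFrame.
        s3 = boto3.client("s3")
        body = s3.get_object(Bucket=BUCKET, Key=KEY)["Body"].read()
        df = pd.read_csv(io.BytesIO(body))

        # Insert the rows into MySQL; connection details are placeholders.
        conn = pymysql.connect(host="mysql-host", user="etl",
                               password="change-me", database="bonds")
        try:
            with conn.cursor() as cur:
                cur.executemany(
                    "INSERT INTO us_bonds (trade_date, isin, yield_pct) "
                    "VALUES (%s, %s, %s)",
                    list(df[["trade_date", "isin", "yield_pct"]]
                         .itertuples(index=False, name=None)),
                )
            conn.commit()
        finally:
            conn.close()

    if __name__ == "__main__":
        load_s3_csv_to_mysql()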

Data Engineer

Accenture
12.2021 - 11.2022
  • The Return Collection Consumption (RCCZZ) system is a regulatory reporting project for submitting returns from financial institutions to Banking, Insurance, and PPD users. The work included the activities below.
  • Top 100 Borrowers Group System (TBGS)
  • MAS122
  • MAS637
  • Requirements study & Analysis
  • Prepared the design documents (Functional and Technical).
  • Prepared source data mapping and report design documents.
  • Created Hive DDL scripts from Zeppelin notebooks.
  • Worked on a data ingestion framework to load data from XML sources into Hive tables in the EDL using Informatica BDM and Linux shell scripts (a PySpark sketch of this pattern follows the tools list below).
  • Created shell scripts to execute Hive SQL scripts and file-handling operations.
  • Created Denodo base views on Hive tables located in the Enterprise Data Lake.
  • Created operational metadata handling tables and stored procedures in a SQL Server database.
  • Prepared the AutoSys job setup to trigger all the Informatica workflows from a shell script.
  • Created Denodo base views on the operational metadata tables to provision data for reporting and support needs.
  • Created a BO universe for the integrated reporting needs of business users.
  • Created SAP BO detailed and summarized reports for business users.
  • Built a Tableau dashboard for business users' regulatory requirements.
  • Worked on the access control matrix for the DSL and Reporting layers.
  • Prepared SIT test case documents for the ETL, DSL, and Reporting layers.
  • Worked on the operational monitoring dashboard design in Tableau and prepared the source mapping.
  • Tools Used: Hortonworks Hadoop Data Platform, Hive, Zeppelin, Python (PySpark), Informatica BDM, Denodo, SAP BI 4.2, Tableau, SQL Server
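
A minimal PySpark sketch, in the spirit of the Zeppelin-notebook work referenced above, of landing staged data as a Hive table in the EDL. The paths and table names are hypothetical; the actual ingestion framework was driven by Informatica BDM and shell scripts.

    from pyspark.sql import SparkSession

    # Hive support lets Spark SQL create and query Hive tables in the data lake.
    spark = (
        SparkSession.builder
        .appName("rcczz-ingest")
        .enableHiveSupport()
        .getOrCreate()
    )

    # Assume the XML sources have already been parsed and staged as Parquet.
    df = spark.read.parquet("/edl/staging/returns")  # hypothetical path

    # Persist the data as a Hive-managed table for the DSL and reporting layers.
    df.write.mode("overwrite").saveAsTable("edl.returns")  # hypothetical table

    # Downstream Hive SQL (or Denodo base views) can then query the same table.
    spark.sql("SELECT COUNT(*) AS row_count FROM edl.returns").show()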

Senior Analyst

Quesscorp Singapore Pte Ltd
07.2018 - 12.2021
  • The Fraud Trade and Surveillance System is a data analytics reporting project to detect and prevent fraud early. The work included the activities below.
  • Requirements study & Analysis
  • Prepared the design documents (Functional and Technical).
  • Prepared the source data mapping and assisted the SAS vendor with mapping clarifications.
  • Involved in data profiling for source-system data received at daily frequency.
  • Involved in the new data mart design for all source data.
  • Worked on the interface SSIS ETL to generate a daily file feed from relational systems and ingest it onto the Enterprise Business Platform (EBP) and the data warehouse (in Teradata).
  • Set up Control-M file transfer jobs for all the source systems to transfer the files.
  • Worked on the data ingestion framework to load structured, semi-structured, and unstructured data onto the Enterprise Business Platform (EBP) using Linux shell scripts and Teradata.
  • Implemented a C# POST API call to push voice files from the application server staging area to the NICE Actimize landing area (a Python sketch of the same pattern follows the tools list below).
  • Worked on the Teradata ETL framework (using the Teradata MultiLoad utility and shell scripts) to load structured data into the Teradata warehouse and the Risk data mart.
  • Worked on the XMAP design and development to standardize reference data coming from different source systems.
  • Worked on the SAS downstream data and control file feed generation from the data mart in Teradata.
  • Involved in the SAS and NICE Actimize lower-environment setup with the help of the vendor.
  • Prepared the metadata seed for a large sample of voice data used by transcribers to build the language package.
  • Worked on the Control-M job setup for the interface ETL, Teradata (DW, DM), and data lake loading jobs.
  • Worked with source teams to resolve source data quality issues.
  • Worked on the user access control data configuration setup for the project.
  • Tools Used: Cloudera PCloud Hadoop, Hive, Python, Teradata, SQL Server, SAS Viya, SAS 9.x, SAS VA & VTA, NICE Actimize, SSIS
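
The voice-file push referenced above was implemented in C#; to keep a single example language, a Python sketch of the same POST pattern is shown here. The endpoint URL, form field, and file path are hypothetical.

    from pathlib import Path

    import requests

    LANDING_URL = "https://actimize-landing.example.com/api/voice-files"  # hypothetical endpoint

    def push_voice_file(path: Path) -> None:
        # Stream the voice file to the landing area as a multipart POST.
        with path.open("rb") as fh:
            resp = requests.post(
                LANDING_URL,
                files={"file": (path.name, fh, "audio/wav")},
                timeout=60,
            )
        resp.raise_for_status()  # surface HTTP errors to the caller

    push_voice_file(Path("/staging/voice/call_0001.wav"))  # hypothetical path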

Module Lead

Mindtree Ltd
12.2015 - 04.2018
  • Itemization of Fees and Charges for Europe is an enhancement to an existing client-facing reporting project, automating report extraction by building an operational data store from which client-facing reports are generated. The work included the activities below.
  • Requirements study & Analysis
  • Prepared the design documents (Functional and Technical).
  • Designed a new operational data store (ODS) for YTD client fees and charges statement generation.
  • Worked on the data model design for the reporting.
  • Worked on the dynamic data ingestion framework in SSIS 2012 to load transactional data from the Data Vault to the ODS.
  • Created extraction stored procedures to get account-level data from the Data Vault.
  • Created a stored procedure to extract YTD fees and charges data from the ODS.
  • Prepared the Control-M job setup for the initial load, daily batch load, manual data feeds, and archival.
  • Submitted code to Git and configured a Jenkins/Ansible pipeline to deploy code to higher environments.
  • Deployed code to SIT and UAT after review.
  • Represented CAC and promoted code to production with the help of the release team.
  • Migrated nine existing QlikView dashboards to Power BI.
  • Tools Used: SQL Server, SSIS, SSRS, Visual Studio 2012, Git, Jenkins, Nexus, Ansible, Power BI

Senior Software Engineer

SenecaGlobal IT Services Pvt. Ltd
03.2015 - 12.2015
  • ODP Downstream and Reporting is one of the Data Vault migration projects under GWP 1.0. The work included the activities below.
  • Requirements study & Analysis
  • Prepared the design documents (Functional and Technical).
  • Implemented structural changes (column length increments) in the Data Vault.
  • Analyzed the existing ETL in SSIS 2010 and implemented the code change to switch the source from a SQL table to a file.
  • Inserted new configuration entries for all the SSIS packages and disabled the legacy configuration entries.
  • Developed new ETL in SSIS 2010 for new modules to retrofit data into the existing Data Vault.
  • Performed impact analysis and documented the impacted object inventory list system-wise.
  • Implemented the data migration for existing data in the Data Vault.
  • Modified all the stored procedures, functions, and views as part of the migration.
  • Logged post-migration data quality issues coming from upstream and followed them up to closure.
  • Created a customer data interface ETL for data analytics from the Data Vault to the EDW and Hadoop.
  • Involved in setting up multiple environments (TR, ORT1, ORT2, UAT) for testing end-to-end data migration across multiple applications.
  • Implemented code changes in all the downstream extraction stored procedures.
  • Worked with each downstream system's users on testing migration issues.
  • Migrated the existing EUC detailed and summarized reports from QlikView to SSRS.
  • Designed and developed different kinds of reports (tabular, matrix, charts) using SSRS.
  • Analyzed existing Control-M jobs and set up a new Control-M flow by adding new jobs.
  • Worked on release preparation and code deployment for higher environments (TR, ORT, UAT, PROD).
  • Tools Used: SQL Server 2012/2014, SSIS, Control-M, SSRS, Visual Studio 2012, TFS, Bitbucket, JIRA

Senior Software Engineer

Fidelitone
03.2015 - 11.2015
  • LeaseHub is an in-house project developed using the data virtualization concept. The work included the activities below.
  • Requirements study & Analysis; involved in daily status calls.
  • Participated in project effort estimation and project planning.
  • Prepared design documents (Functional and Technical).
  • Prepared the mapping document by analyzing the existing system.
  • Defined and implemented the required business transformation rules and logic.
  • Provided high-quality service within the scheduled time frame.
  • Designed and developed ETL packages using SSIS 2012 to load data onto LeaseHub and on to downstream consumers.
  • Implemented data quality business rules in the stage-load ETL.
  • Implemented audit logging, error handling, and notification services for the SSIS packages.
  • Built a reconciliation ETL to check data accuracy for the regulatory reporting audit check.
  • Involved in code review and unit testing to reduce post-release defects.
  • Involved in SIT test case preparation and test execution.
  • Tools Used: SQL Server 2012, SSIS 2012, Power BI, Denodo, Control-M, Oracle 11g, SQL Developer, Informatica MDM

Senior Programmer Analyst

McMillan Shakespeare Group(MMSG)
06.2012 - 03.2015
  • BI Application BAU Support is a support-cum-enhancement engagement for the existing ETL and reporting jobs in production. The work included the activities below.
  • Requirement analysis and preparation of the design documents (Functional and Technical).
  • Analyzed the existing reports in Business Objects and migrated them to SSRS 2008.
  • Executed this project using the Agile Scrum methodology.
  • Defined and implemented the required business transformation rules and logic.
  • Provided high-quality service within the scheduled time frame.
  • Designed and developed dashboards using SSRS 2008.
  • Designed and developed documents, invoices, and invoice data files using SSRS 2008.
  • Designed and developed different kinds of reports (tabular, matrix, charts) using SSRS 2008.
  • Implemented multiple scheduling types for reports based on client requirements.
  • Tools Used: SQL Server 2008, SSIS 2008, SSRS 2008, TFS, RDS, ManageEngine, Visual Studio 2008

Programmer Analyst

SPI (Software Paradigms InfoTech Pvt. Ltd)
11.2010 - 03.2015
  • Data Integration Legacy is one of the migration projects moved to Azure cloud services. The work included the activities below.
  • Requirements study & Analysis; involved in daily status calls for enhancements and issues.
  • Participated in project effort estimation and project planning.
  • Involved in preparing impact analyses for change requests and defects received from customers.
  • Prepared design documents (Functional and Technical).
  • Prepared the mapping document by analyzing the existing system.
  • Defined and implemented the required business transformation rules and logic.
  • Provided high-quality service within the scheduled time frame.
  • Resolved tickets within the time frame defined by priority.
  • Created new stored procedures and optimized the performance of existing procedures.
  • Monitored process chains on daily, weekly, and monthly schedules.
  • Analyzed issues received from downstream consumers (regulatory reporting, credit risk, QRM teams, etc.) and provided a quick fix or timelines for a fix.
  • Involved in unit testing, integration testing, regression testing, and user acceptance tests for enhancements.
  • Coordinated with both offshore and onshore teams to provide quality service to customers.
  • Updated the daily Kanban board and coordinated with the team.
  • Involved in reverse engineering activities to provide mapping information to migration teams.
  • Tools Used: SQL Server 2012, SSAS, SSIS 2012, Power BI, Denodo, Control-M, Oracle 12c EDW, SQL Developer, Informatica MDM

Senior Software Engineer

HBC
06.2011 - 05.2012
  • The objective of the Income List Mapping project is to automate the conversion of income list files (TXT, XLS, XLSX, PDF, and CSV) received from employers into a COMFIN-compliant format (CSV) and reduce the manual effort in this activity using SSIS 2008. Worked on the below-mentioned modules in Rexx:
  • Phase 1 BI Development project modules:
  • Electronic Statements – Reporting
  • Second Authority – Reporting
  • Terminations – Reporting
  • Part Payments – Reporting
  • Income List Mapping - ETL
  • Tools Used: Visual Studio 2008, SQL Server 2008, SSIS 2008, SSRS 2008, TFS, VB.NET, ASP.NET

Software Engineer

Park Mobile
01.2011 - 05.2011
  • The Park Mobile Business Analyzer (PBA) project developed a solution that helps Park Mobile managers analyze information, trends, and details of the parking business across regions, suppliers, zones, etc. It also enables Park Mobile executives to take corrective actions for business growth. The following features are available in PBA:
  • EDW Modeling and development
  • OLAP Modeling and development
  • ETL Modeling and development
  • Reports/Scorecards/Dashboards development
  • User Interface development
  • Security (Forms based authentication only)
  • Tools Used: SQL Server 2008 R2, SSIS 2008 R2, SSAS 2008 R2, SSRS 2008 R2, MDX, C#.NET, SharePoint, SVN

Education

MCA - Computer Science

QIS College of Engineering & Technology
01.2009

B.Sc. - Maths, Physics, Chemistry

JKC College
01.2006

Skills

  • Data Engineering
  • Data Modelling
  • Master Data Management
  • Cloud Computing
  • Data Visualization
  • Data Analytics
  • Data Warehousing
  • Data Virtualization
  • Microsoft Business Intelligence
  • Big Data Management
  • Feature Engineering

Certification

  • Certified in:
  • DP-203: Data Engineering on Microsoft Azure (Microsoft)
  • PL-300: Microsoft Power BI Data Analyst (Microsoft)
  • SAS Viya Administration: Essentials (SAS)
  • DP-200: Implementing an Azure Data Solution (Microsoft)
  • SQL Server 2008 Business Intelligence Development (Microsoft)
  • Querying Microsoft SQL Server 2012 (Microsoft)

Profile Summary

  • Achievement-driven professional targeting assignments in application design and development, data engineering, feature engineering, and administration with a reputable organization, preferably in the private investment banking, financial services, and retail industries.
  • Strong experience in Big Data platform implementations in banking and financial services, retail, lending, and fleet management services.
  • Involved in the conceptualization and design of a confidential Big Data platform to extract structured and unstructured data from enterprise transactional systems and apply Big Data analysis techniques for customer sentiment and risk analysis.
  • IT professional with over 12 years of experience in the analysis, design, development, and testing of data warehousing and business intelligence implementations on on-premises and cloud platforms.
  • Hands-on experience with the Cloudera/Hortonworks Hadoop data platform, multiple data analytics and ETL tools, Spark, and various IaaS/PaaS services.
  • Extensive experience in data preparation and modeling of datasets for user insights on both on-premises and cloud platforms.
  • Experience in analyzing data using Python, SQL, Microsoft Excel, Hive, PySpark, and Spark SQL for feature engineering and machine learning.
  • Expert in implementing end-to-end BI solutions using the full MSBI stack (SSIS, SSRS, and SSAS 2012/2014/2016), Azure Data Factory, Power BI (Power Query, Power Pivot, Power View), and Informatica BDM.
  • Comprehensive knowledge of technical tools such as Denodo, SQL Developer, Tableau, Informatica PowerCenter, and UNIX, and of programming languages such as Python, PL/SQL, T-SQL, PowerShell scripting, C#, and the PySpark API.
  • Extensive experience in database/data warehouse design for regulatory reporting and data analytics, including performance tuning and query optimization.
  • Contributed across the Software Development Life Cycle (SDLC): analysis, design, programming, testing, and documentation.
  • Developed solutions to technical specifications, installed application software, deployed customizations, and contributed to code reviews.
  • Extensive knowledge of AWS and Azure database service implementations and migrations, including migrations from on-premises to cloud and between cloud services; worked on database migration from on-premises Microsoft SQL Server to AWS RDS PostgreSQL, covering tables and data.
  • Experience using GitHub, GitLab, Jenkins, Docker, and Ansible to fully automate deployments.
  • Experienced in Agile, Scrum, and Kanban project development practices.
  • Incident predictability: proactively understood and monitored customer incidents and automated root cause analysis and incident prevention; a proof of concept demonstrated a reduction in incidents.
  • Result-oriented, quick learner, and keen planner.

Personal Information

Title: Technical Lead

Timeline

Technical Lead (AVP)

OCBC Bank
09.2023 - Current

Data Engineer

Beyondsoft
11.2022 - 09.2023

Data Engineer

Accenture
12.2021 - 11.2022

Senior Analyst

Quesscorp Singapore Pte Ltd
07.2018 - 12.2021

Module Lead

Mindtree Ltd
12.2015 - 04.2018

Senior Software Engineer

SenecaGlobal IT Services Pvt. Ltd
03.2015 - 12.2015

Senior Software Engineer

Fidelitone
03.2015 - 11.2015

Senior Programmer Analyst

McMillan Shakespeare Group(MMSG)
06.2012 - 03.2015

Senior Software Engineer

HBC
06.2011 - 05.2012

Software Engineer

Park Mobile
01.2011 - 05.2011

Programmer Analyst

SPI (Software Paradigms InfoTech Pvt. Ltd)
11.2010 - 03.2015

B.Sc. - Maths, Physics, Chemistry

JKC College

MCA - Computer Science

QIS College of Engineering & Technology