Raghavendra Prasad Nettalam

Data Architect
Singapore

Summary

  • 14+ years of overall experience in software development, integration, maintenance, and application support
  • 5+ years of experience in Master Data Management (MDM) using NeoXam DataHub
  • Worked on MDM to handle source data for instrument classifications including Equity Spot, Equity Indices, FX Spot, IR Swaption, IR Cap/Floor, Volatility Curves, Credit Spread Curves, Commodity Curves, and IR Curves
  • Knowledge of market data and scenario generation for VaR and Stress Expected Shortfall
  • Experience in requirement capturing, functional analysis, solution design, SDLC, and Agile
  • Worked on data feeds from Bloomberg, Refinitiv, ICE, and Markit
  • Experience in analyzing, developing, testing, and running fully automated ETL: matching, enriching, mastering, and exporting quality data to downstream systems
  • Experience in business analysis, product design, data integration, process improvement, requirement gathering, data analysis, and data quality management
  • Work closely with business users on various implementation projects
  • As part of ETL development, design process flows, write and review specifications, mapping documents, and unit test cases, and perform manual testing
  • Thorough understanding of and involvement in SDLC and PDLC, including requirement gathering, design, implementation, and testing
  • Experience with databases including Oracle, SQL Server, and DB2
  • Solid understanding of ETL design principles and good practical knowledge of ETL design processes

Overview

14+ years of professional experience

Work History

UOB

Lead Data Architect
09.2022 - Current

Job overview

  • Currently working on delivering FRTB-IMA regulatory project using Market Risk Scenarios Solution (MRS) application that runs on NeoXam DataHub
  • Leading a team of data professionals, providing guidance, mentoring and support
  • Ensure that the data architecture and processes adhere to FRTB regulatory requirements
  • Define and maintain the overall data architecture, considering scalability and performance
  • Address data-related challenges and issues that arise during the project, providing effective solutions
  • Implement data governance practices to ensure data quality, accuracy and compliance with regulatory requirements
  • Coordinate, review and support the code/design provided by the NeoXam product team for the FRTB business requirements
  • Led the data sourcing initiative for historical market data (2007 onward) from multiple providers to support regulatory compliance under the FRTB IMA framework
  • Ensured data completeness, accuracy, and consistency to meet requirements for downstream modules, including Stress Expected Shortfall (SES) and Default Risk Charge (DRC)
  • Developed a scalable ETL pipeline for ingesting and processing diverse datasets, facilitating seamless integration with the SES and DRC modules
  • Collaborated with global data providers (Bloomberg, LSEG, ICE, and S&P) to source granular market data for equities, credit, interest rates, and commodities, ensuring alignment with risk factor eligibility criteria
  • Implemented data validation and cleansing processes to address discrepancies and enhance reliability for risk modeling and capital computations
  • Streamlined workflows between data sourcing and downstream modules, optimizing the input for stress testing and default risk analysis
  • Created automated reconciliation mechanisms to monitor data lineage and ensure consistency across SES and DRC reporting frameworks
  • Performed gap analysis on historical data coverage, bridging gaps with alternative sources to enhance the robustness of risk factor modeling
  • Supported cross-functional teams by providing high-quality datasets tailored to SES stress scenarios and DRC calculations under regulatory guidelines
  • Documented and reported key findings on data quality and impact analysis, ensuring readiness for regulatory audits and supervisory reviews

Accenture Pte Ltd

Business and Technology Delivery Associate Manager
11.2021 - 09.2022

Job overview

  • Part of the GIC Technology Group; worked on a Master Data Management project using NeoXam DataHub
  • Performed business analysis to develop functional specifications from business requirements to comply with key regulatory initiatives
  • Contributed to the design of the data model and design architecture
  • Wrote Analysis & Design (A&D) Specification Documents detailing the design approach using NeoXam DataHub
  • Built and configured the design in NeoXam using various DataHub components such as Feed Procedures, Feed Maps, Value Maps, Business Rules, Subscription Modules, Scheduled Tasks, and DQCs
  • Performed unit and system testing of the builds and gave build validation demonstrations to the data stewards and users

Luxoft Information Technology Singapore Pte Ltd

Senior Consultant
08.2019 - 11.2021

Job overview

  • Part of the UOB NeoXam product team; worked on the Market Data Services (MDS) and Market Risk Scenarios Solution (MRS) projects
  • Customized the NeoXam DataHub application based on business requirements agreed with the client, following standard best practices and internal convention rules defined by the product support team
  • Tested the deliverables end to end before packaging the customization
  • Provided support during the SIT and UAT phases, identifying and resolving issues to the client’s satisfaction within established time frames and within the NeoXam-agreed scope
  • Deliverables: core data dictionary enhancements, Feed Maps, Feed Procs, input and output value mapping, client subscriptions, DSP/DSR, Business Rules, Data Derivation service, and Scheduled Tasks
  • Managed change requests and delivery of changes and backlogs for Market Risk Scenarios

Standard Chartered Bank

System Analyst
04.2015 - 08.2019

Job overview

  • Part of the SCB Liquidity Risk team; worked on the ALM Fermat, GT Fermat, IRRBB, and Basel III projects
  • Implemented pricing and valuation functionalities for IRPV01 and EVE computation based on business requirements
  • Implemented liquidity risk computation functionalities, including LCR and NSFR, and made reports available to users
  • Set up configurations for the Moody’s and SCB customized lookup tables used for ETL purposes
  • Analyzed requirements and offered suggestions on the ETL requirements
  • Interacted with group finance users to gather data requirements
  • Prepared Technical Specifications, unit test case, and unit test result documents per SCB standards
  • Developed and unit tested PL/SQL code and DataStage jobs per the mapping specifications provided by the business team to meet user requirements
  • Debugged PL/SQL code and DataStage jobs based on issues raised by the testing team
  • Participated effectively in the SIT/UAT phases of the project to ensure the batch jobs ran with defect-free code
  • Maintained (created/modified) Control-M jobs and interfaces
  • Triggered ETL batch runs through UNIX/Control-M as part of PDLC
  • Monitored Control-M jobs for failure alerts, job flow analysis, and troubleshooting
  • Raised Change Requests (CRs) and prepared the Application Installation Guide (AIG) for PSS to implement CRs on the production system
  • Worked on Moody’s Fermat application for data analysis, querying, and debugging
  • Coordinated with multiple teams to assess impact during the code change process
  • Made extensive use of JIRA to raise, report, and track issues/tasks
  • Generated source code packages using Git and Jenkins
  • Extracted data in various formats from different source systems like psftp, Sophis, Murex, etc.
  • Coordinated with the onsite team and users, reporting to the onsite/offshore manager
  • Analyzed and resolved techno-functional problems based on specifications, mappings, and issues reported by the team and business users
  • Translated functional requirements into technical requirements
  • Interacted with business representatives to gather data requirements

Deloitte Consulting India Pvt. Ltd.

Associate Developer
09.2014 - 02.2015

Job overview

  • Part of the ConvergeHEALTH team; worked on a healthcare project
  • Worked with clients from varying geographies and was part of various implementation/migration projects
  • Understood business mapping documents and translated them into functional specifications
  • Involved in every phase of ETL implementation, including build, test, and deploy
  • Improved performance tuning of DataStage jobs wherever necessary
  • Worked with the development team to build change requests from the business on top of the existing application
  • Prepared the Data Migration document containing the instructions for code promotion from one environment to another
  • Troubleshot DataStage and PL/SQL code
  • Developed SQL queries per the business requirements

IBM India Pvt. Ltd.

ETL Developer
09.2010 - 09.2014

Job overview

  • Part of the ETL team that worked on the ‘Web Intelligence System’ project
  • Created and provided lists of bad URLs to delete from the Search Index for both ibm.com and w3
  • The application is made up of three major components: ETL (Extract, Transform, and Load) using IBM’s BACC infrastructure for DataStage, databases (DB2 and Netezza), and IBM’s BACC infrastructure for Cognos reporting
  • The DB2 and Netezza databases are in hosted environments, and data is loaded into them through DataStage jobs and UNIX scripts
  • Learned different technologies (Cognos, JavaScript, DB2, etc.) and worked with the team to integrate DataStage jobs with other areas
  • Effectively involved in monthly DataStage processing activities
  • Established an SSH connection between the DataStage server and the database server so that UNIX scripts could be triggered from the DataStage server while running on the database server, improving data-processing performance
  • Automated UNIX scripts to run through DataStage Sequence jobs
  • Used DataStage Designer to design jobs that extract, transform, and load data from source to target
  • Used DataStage Designer to export and import DataStage jobs and table metadata
  • Improved performance tuning of DataStage jobs wherever necessary
  • Designed various job sequences for the developed jobs to send email notifications to users
  • Scheduled ETL jobs through DataStage Director
  • Migrated jobs from DataStage 8.7 to 9.1 to meet the business goals of the project
  • Developed SQL queries per the business requirements
  • Prepared the ETL Operations Manual for implementing changes
  • Coordinated with QA and provided inputs for integration testing

Education

Anna University

B.E. in Electronics and Communications Engineering
04.2008

Skills

  • Products: NeoXam Datahub, IBM DataStage, Traditional ETL Tools
  • Programming Skills: Python, PL/SQL, Unix Shell Scripting
  • Database: SQL Server, Oracle, DB2
  • Scheduling Tools: Control-M, Unix CLI - Cron
  • Code/Defect Management Tools: JIRA, SVN Tortoise, GitLab, SharePoint, Jenkins
  • SDLC/PDLC: Agile, Waterfall, Continuous Integration/Delivery

Accomplishments

    1. AWS Certified Cloud Practitioner

    2. IBM Certified Solution Developer – InfoSphere DataStage v8.5

    3. Oracle Database 11g Administrator Certified Associate

Timeline

Lead Data Architect

UOB
09.2022 - Current

Business and Technology Delivery Associate Manager

Accenture Pte Ltd
11.2021 - 09.2022

Senior Consultant

Luxoft Information Technology Singapore Pte Ltd
08.2019 - 11.2021

System Analyst

Standard Chartered Bank
04.2015 - 08.2019

Associate Developer

Deloitte Consulting India Pvt. Ltd.
09.2014 - 02.2015

ETL Developer

IBM India Pvt. Ltd.
09.2010 - 09.2014

Anna University

B.E. in Electronics and Communications Engineering