
Amanullah Khan Ameer Khan

Chennai

Summary

  • As a Principal Data Engineering Lead at HCLTech Malaysia, I don't just strategize; I'm a hands-on architect driving digital transformation through robust data integration and enhanced operational efficiency.
  • My deep expertise across ETL/DWBI, Big Data (Cloudera, Spark, Hive), and diverse database platforms (Oracle, Teradata) empowers me to deliver impactful solutions that optimize data flow and enable critical business objectives.
  • I am an exceptional communicator and an adept problem-solver, skilled at navigating complex stakeholder dynamics and resolving conflicts. This commitment to excellence is consistently recognized through top-tier performance honors and client accolades for delivering innovative, quality-driven solutions that meet stringent business and regulatory standards.
  • Oh, a quick tip: when my email lands in your inbox, consider it a digital promise that a solution is typically on its way. It's a principle I live by, and it's even in my signature: "U~WE~DATA - United We Deliver Data, Together We Enable Analytics"

My career journey has been consistently marked by significant contributions, validated through prestigious honors and certifications from esteemed organizations and clients, which I've saved for the final section :)

Overview

13 years of professional experience
8 certifications

Work History

Principal Data Engineering & Operations Lead

HCLTech Malaysia, Client: Maybank
09.2023 - Current

Spearheading Maybank's M25+ Digital Transformation Initiatives:

Drove the data architecture and integration strategies for key M25+ Digital Transformation projects, ensuring seamless data flow and integrity across new and legacy systems, and enabling strategic business objectives and BCBS (Basel Committee on Banking Supervision) implementations.

  • Sp2A - Trade Transformation: Led the complex data migration and integration from legacy Oracle trade systems to the new Finastra vendor platform. Orchestrated a parallel-run strategy in production, managing data reconciliation during the transition until the eventual decommissioning of the legacy TBLS system. Applied state-of-the-art data principles and architectural guidelines within REDW-Teradata to accommodate concurrent data flows and ensure uninterrupted downstream consumption.
  • SP1F - Loan Origination System Transformation: Spearheaded the robust data integration of Maybank's new in-house PESTOS Loan Origination System with the REDW-Hadoop DataLake. Implemented a hybrid data integration strategy, seamlessly combining real-time online data consumption via API calls with traditional batch outbound file integration, ensuring comprehensive and timely data availability for analytics and reporting.
  • SP4e - SME Purchasing Transactions Optimization: Led the data-centric implementation for changes related to new SME Purchasing Transactions rules. Aligned REDW data structures and pipelines with the latest product behavior for Maybank's SME customers, directly enabling new product rollouts, enhanced customer insights, and optimized business processes.

CARISMA (Capital Risk Management) Initiative: Orchestrated the end-to-end data integration strategy and development for this critical initiative, a cornerstone for Maybank's Capital Risk Management.

  • My role encompassed designing, developing, and managing complex data flows from REDW outbounds into Maybank's Universal DataLake (UDL), a distinct CDH cluster meticulously designed to serve Malaysia (MY) and 11 other Overseas Units (OUs) (e.g., Philippines, China, Indonesia, Hong Kong, Brunei, Vietnam, Cambodia).
  • My contributions in CARISMA extended to developing the data provisioning mechanisms from UDL to a diverse suite of critical downstream applications, including LiMa (Limit Management), CPA (Customer Profitability Analytics), Pre-Deal & Post-Deal/Pricing, ECL (Expected Credit Loss), EC (Economic Capital), Credit Status & Account-Risk Tagging, and BSM (Balance Sheet Management). I also enabled seamless data accessibility for downstream consumers by leveraging BDSQL services for efficient querying of UDL Hive tables, ensuring timely and accurate data delivery for advanced risk analytics, financial reporting, and strategic decision-making across the group.

Big Data Visualization & Key Risk Indicator (KRI) Analytics Initiative:

  • Architected, developed, and deployed the first-ever OAS/OBIEE dashboard directly atop a BigData-Hadoop system within Maybank, establishing a groundbreaking capability for real-time risk intelligence and strategic oversight.
  • Designed a comprehensive KRI dashboard solution, visually representing 7 critical trade-related Key Risk Indicators for executive and risk management oversight. Beyond standard metrics, these included: Trade Settlement Failure Rate, Open Trade Exposure (Value), Trade Reconciliation Discrepancy Count, Counterparty Exposure Limits Breached, Average Trade Processing Time, Number of Exceptions in Trade Flow, and BGs Processed.

Led the end-to-end OAS/OBIEE development lifecycle:

  • RPD Development: Designed and implemented the full Oracle Business Intelligence Repository (RPD) across Physical (establishing secure data source connections to Hadoop), Business Model & Mapping (defining complex logical tables, precise join conditions, advanced calculated measures), and Presentation layers (structuring intuitive subject areas, hierarchies, custom groups) to ensure optimal query performance on Hadoop data.
  • Catalog Management: Established robust OBIEE Catalog management protocols, including granular folder structures, access permissions, object lifecycle management (e.g., migration between environments), and scheduling mechanisms for automated report generation.
  • Dashboard & Report Creation: Crafted highly interactive dashboards and detailed analytical reports featuring advanced visualizations, dynamic prompts, precise filters, drill-down capabilities, and embedded performance tuning, enabling daily monitoring and deep-dive analysis of critical risk trends.
  • Successfully piloted and implemented Kerberos authentication, including the meticulous setup of krb5.conf configurations and keytab file generation (ktutil) on the OAS/OBIEE WebLogic server, ensuring secure and seamless data access to the underlying REDW-Hadoop system.
  • Devised and deployed a resilient Linux-level OS cron job to proactively auto-renew the Kerberos keytab file. This strategic automation mitigated intermittent ODBC driver errors caused by expired keytabs, ensuring continuous and stable data fetching for the OAS/OBIEE dashboard and maintaining uninterrupted connection integrity between the Hadoop system and the OBIEE RPD Physical layer.
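The proactive keytab renewal described above can be sketched as follows. This is an illustrative Python sketch of the renewal logic only, not the production cron job; the keytab path, principal, and 12-hour threshold are assumed placeholders.

```python
import subprocess
from datetime import datetime, timedelta

# Illustrative sketch of the keytab auto-renewal logic; the real job ran
# as a Linux cron entry. The path, principal, and threshold below are
# hypothetical placeholders, not production values.
KEYTAB = "/etc/security/keytabs/obiee.keytab"
PRINCIPAL = "obiee@EXAMPLE.REALM"
RENEW_BEFORE = timedelta(hours=12)

def build_kinit_command(keytab: str, principal: str) -> list[str]:
    """Command that refreshes the Kerberos ticket cache from a keytab."""
    return ["kinit", "-kt", keytab, principal]

def needs_renewal(ticket_expiry: datetime, now: datetime) -> bool:
    """Renew proactively, before the ticket actually expires."""
    return ticket_expiry - now <= RENEW_BEFORE

def renew_if_needed(ticket_expiry: datetime, now: datetime, dry_run: bool = True):
    """Return the kinit command when renewal is due, else None."""
    if not needs_renewal(ticket_expiry, now):
        return None
    cmd = build_kinit_command(KEYTAB, PRINCIPAL)
    if not dry_run:
        subprocess.run(cmd, check=True)  # executed by the cron wrapper
    return cmd
```

A cron entry would then invoke this script on a fixed schedule (for example every few hours), so the ticket is refreshed well before the ODBC driver could hit an expired-keytab error.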

Technical Lead

HCLTech Malaysia, Client: Maybank
Kuala Lumpur
11.2018 - 09.2023

Architected and optimized Maybank's Regional & OU Data Warehouse/Lake, integrating business data from diverse legacy source systems into Oracle, Teradata, and Hadoop data warehouses, including:

  • Directed end-to-end data engineering for Maybank's bifurcated Regional Enterprise Data Warehouse (REDW), integrating a Teradata-based analytical platform (REDW-Teradata) with a Hadoop data lake (REDW-Hadoop) on a Cloudera Distribution Hadoop (CDH) cluster via a Big Data Appliance (BDA) box.
  • Pioneered the technical implementation of Teradata FSLDM (Financial Services Logical Data Model) on REDW-Teradata, customizing it for intricate financial data architecture and regulatory compliance. Simultaneously, developed and managed the REDW-Hadoop environment using ODI (Oracle Data Integrator) as the ELT tool and PySpark for efficient Sqoop operations.
  • For REDW-Hadoop, architected and implemented a robust, multi-layer data ingestion and transformation pipeline: progressing data from raw sources through an ODS filestore, an external Hive Staging Layer (schema-on-read with all datatypes as string), to an internal Hive Semantic Integration (SI) Layer (with precise data type casting), and finally into specialized DataMarts (e.g., Loan Origination) for downstream consumption.
  • Engineered a comprehensive Audit Balancing Control (ABC) Framework for all ODI workflows, systematically capturing critical metadata (e.g., workflow start/end times, total source rows, total inserted rows) into a centralized Hive table (backed by HDFS). Leveraged PySpark DataFrames to automate real-time updates and status tracking within this audit table, ensuring data integrity, traceability, and operational oversight across the entire data lifecycle at REDW-Hadoop.
  • Developed and optimized complex ETL/ELT pipelines using Informatica PowerCenter for structured data ingestion into Teradata and ODI/PySpark for Hadoop. Maintained operational excellence within an AIX IBM Unix environment, implementing advanced shell scripting for automation, job orchestration, and robust system monitoring to ensure data pipeline integrity and performance.
  • OFSAA and IBM Core-Banking Legacy Source for comprehensive financial product data (Term-Loans, Block Discounting, FCL, Lombard-tranche & STRC, OD).
  • CARDLINK mainframe (CDPCRD) for detailed cardholder demographics and transaction history, including account lifecycle events (e.g., account opening and closure dates) and regulatory/ledger balances at account level.
  • Trade Systems (TBLS, Tradeline - Oracle-based): Integrated high-velocity, high-volume transactional trade data, encompassing order lifecycles, execution details, settlement information, and critically, comprehensive counterparty information, 3rd Party Global Customer Identification File (GCIF) data, and specialized Earmark accounts. This granular data was crucial for real-time risk assessment, precise compliance reporting, and supporting complex front-to-back office trading operations and financial reconciliation.
  • Individually led and delivered high-impact data integration projects for key financial products and regulatory initiatives, including Rent-to-Own (RTO), Lombard Lending, and FATCA (Foreign Account Tax Compliance Act), requiring deep dives into financial product lifecycles, complex data mapping, and strict adherence to regulatory standards.
  • Ensured seamless data provisioning from Warehouse to critical downstream applications and regulatory frameworks, directly supporting: LCR (Liquidity Coverage Ratio), RDMS (Risk Data Mart System), MFRS9 (Malaysian Financial Reporting Standard 9), and GTRC (Global Trade Repository Reporting and Compliance)
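The ABC framework's audit record-keeping can be sketched in plain Python. The production framework built these records as PySpark DataFrame rows appended to a Hive table; the field names and status labels below are assumptions for illustration.

```python
from datetime import datetime

# Illustrative shape of one ABC (Audit Balancing Control) record; in
# production these were PySpark DataFrame rows written to a Hive table
# on HDFS. Field names and status labels here are assumed, not actual.
def make_audit_record(workflow: str, start: datetime, end: datetime,
                      source_rows: int, inserted_rows: int) -> dict:
    """Build one audit row; a source/target row-count mismatch is flagged."""
    balanced = source_rows == inserted_rows
    return {
        "workflow_name": workflow,
        "start_time": start.isoformat(),
        "end_time": end.isoformat(),
        "total_source_rows": source_rows,
        "total_inserted_rows": inserted_rows,
        "status": "BALANCED" if balanced else "MISMATCH",
    }
```

Appending one such record per ODI workflow run is what gives the framework its traceability: any load where inserted rows diverge from source rows is immediately visible in the audit table.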

Senior Software Engineer

Accenture, Client: Cigna Healthcare
04.2017 - 06.2018
  • Designed and developed highly complex ETL processes using Informatica PowerCenter, creating intricate mappings, transformations, and workflows to integrate high-volume, disparate healthcare data sources into the Teradata Enterprise Data Warehouse for Cigna Healthcare.
  • Contributed as a Senior Data Engineer to the critical data integration efforts following the Healthspring acquisition into Cigna, establishing Cigna HealthSpring (CHS). This involved extensive ETL development and data warehousing initiatives, delivering robust, scalable, and compliant data solutions essential for integrating disparate healthcare datasets and enabling strategic analytics and operational reporting for the combined entity.
  • Ensured data integrity, consistency, and accuracy through rigorous validation and error handling mechanisms, delivering robust, scalable, and compliant data solutions essential for strategic healthcare analytics and operational reporting.
  • Applied advanced Teradata database management and performance optimization techniques, writing highly optimized SQL queries, developing efficient stored procedures, and strategically utilizing Teradata utilities for efficient data loading, indexing strategies, and query tuning on massive datasets. This directly contributed to enhanced query performance and reduced data latency for critical analytical applications.
  • Oversaw and secured critical healthcare data domains, including comprehensive patient demographics, complex medical claims, extensive provider networks, prescription fulfillment data, and detailed clinical outcomes. Ensured strict adherence to all healthcare data privacy regulations, including HIPAA compliance, throughout the entire data lifecycle (ingestion, processing, storage, and access).
  • Operated on a Kanban-based full Agile project lifecycle (sprint/PI planning, backlog management, UT/SIT/UAT/regression testing, production rollout), providing technical guidance to junior team members and overseeing seamless deployments.
  • Developed an automated Python script for proactive data quality profiling of massive outbound source files. Using the Pandas library for efficient chunked file reading and robust pattern matching, the script analyzed the initial 25 rows for critical data quality issues such as inconsistent date formats, junk characters, or unexpected literals. It produced immediate anomaly reports (output to a text file) that were sent straight to source teams, bypassing lengthy data warehouse loading processes for initial feedback and significantly accelerating data remediation cycles for diverse file types (e.g., CSV, delimited).
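The first-rows profiling idea above can be sketched with the standard library alone (the production script used Pandas chunked reads); the date pattern, junk-character rule, and column names below are illustrative assumptions.

```python
import re

# Stdlib-only sketch of the first-N-rows data-quality check described
# above; the production script used Pandas chunked reads. The patterns
# and the 25-row window mirror the description; column rules are assumed.
DATE_RE = re.compile(r"^\d{4}-\d{2}-\d{2}$")   # expected ISO date shape
JUNK_RE = re.compile(r"[^\x20-\x7E]")          # non-printable / junk characters

def profile_rows(rows, date_columns, max_rows=25):
    """Scan the first `max_rows` dict rows (e.g., from csv.DictReader)
    and return (row_index, column, issue) anomaly tuples."""
    anomalies = []
    for i, row in enumerate(rows):
        if i >= max_rows:
            break
        for col, value in row.items():
            if JUNK_RE.search(value):
                anomalies.append((i, col, "junk characters"))
            if col in date_columns and not DATE_RE.match(value):
                anomalies.append((i, col, "unexpected date format"))
    return anomalies
```

Writing the returned tuples to a text report gives source teams immediate, file-level feedback without waiting for a warehouse load to fail.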

Software Engineer

Accenture, Client: APLL (APL Logistics) & NOL (Neptune Orient Lines)
Bangalore
04.2015 - 06.2018
  • Contributed as a Software Engineer to APLL NOL, focusing on complex data integration initiatives leveraging Oracle Data Integrator (ODI) to centralize critical logistics and order management data for enhanced operational visibility.
  • Integrated data from disparate logistics source systems:
    a) Oracle OTM (Oracle Transportation Management): Processed comprehensive shipment, location, and freight details, ensuring accurate reflection of transportation operations.
    b) LSS (Logistics Services Suite): Ingested booking and purchase-order information from this custom Accenture application, where front-end entries mapped to an underlying Oracle database table via database synonyms.

Developed intricate ETL solutions using Oracle Data Integrator (ODI), applying advanced features and customizations to optimize data flow and ensure reliability:

  • Reusable Mappings: Implemented re-usable mappings to standardize common transformation logic, significantly reducing development effort and ensuring consistency across multiple data flows.
  • Flow (Yellow) Mappings: Designed complex flow mappings to define sophisticated data transformations and integration patterns between source and target systems.
  • LKM/IKM Customization: Modified and extended Loading Knowledge Modules (LKMs) and Integration Knowledge Modules (IKMs) to tailor data loading and integration strategies. This included custom pre/post-processing steps, error handling routines, specific data validations, and optimization for performance-critical scenarios, directly impacting data throughput and reliability.
  • Change Data Capture (CDC) Enablement: Implemented CDC mechanisms on source tables (e.g., using Journalizing components) to enable efficient, real-time incremental data loads. This involved setting up subscribers, managing journal tables, and developing ODI interfaces to process only changed data, vastly improving load times and system resource utilization.
  • Advanced ODI Development: Utilized advanced ODI components such as procedures, variables, sequences, and user-defined functions to handle complex business logic, orchestration of multi-step processes, and robust error logging. Focused on creating highly performant and maintainable ODI interfaces, ensuring data quality and operational efficiency for critical logistics data.
  • Developed pixel-perfect BI Publisher (XML Publisher) reports for operational and analytical needs, ensuring precise data presentation and formatting. This involved designing complex data models (SQL-based), and creating versatile report templates (e.g., RTF, PDF, XSL-FO) using BI Publisher Desktop.
  • Implemented report bursting capabilities for efficient, targeted distribution, including cloud-bursting options to manage high-volume outputs and scale.
  • Configured iBots for automated, rule-based email distribution, ensuring reports were delivered to specific recipients with dynamic content, subject lines, and attachments only when predefined business conditions were met.
  • Applied advanced XML Publisher features such as conditional formatting, dynamic grouping, sub-templates, and secure parameter handling to enhance report functionality and user experience.
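The CDC pattern in the bullets above — consuming only journalized changes and merging them into the target — can be sketched generically. This illustrates the apply step only, not ODI's actual Journalizing components, and the change-record shape is an assumption.

```python
# Generic sketch of the CDC apply step: consume ordered journal entries
# (captured changes) and merge only those rows into a keyed target table.
# Illustrates the pattern, not ODI's Journalizing implementation; the
# {key, op, data} record shape is assumed for this example.
def apply_journal(target: dict, journal: list[dict]) -> dict:
    """Apply ordered change records to the target: I/U upsert, D delete."""
    for entry in journal:
        key, op = entry["key"], entry["op"]
        if op in ("I", "U"):       # insert or update: upsert the row
            target[key] = entry["data"]
        elif op == "D":            # delete: drop the row if present
            target.pop(key, None)
    return target
```

Processing only the journal rather than rescanning full source tables is what gives CDC its load-time and resource advantage over full reloads.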

Associate Software Engineer

DXC (previously CSC - Computer Sciences Corp), Client: Arabtec
03.2014 - 04.2015
  • Contributed as an Associate Software Professional to the implementation of Oracle Business Intelligence Applications (OBIA) 11.x, focusing on establishing robust enterprise-level analytics solutions.

Specialized in OBIA 11.x configuration and deployment utilizing BIACM (Business Intelligence Applications Configuration Manager) as the central tool for managing the entire application lifecycle. This included:

  • Functional Area Setup: Configured specific OBIA functional analytics areas (e.g., Financial Analytics, Supply Chain Analytics) based on business requirements.
  • Source System & Data Load Parameter Definition: Defined and managed source system configurations (e.g., Oracle EBS, PeopleSoft) within BIACM, setting up data load parameters and execution plans for various analytical components.
  • ETL Orchestration via BIACM and ODI: Used BIACM to deploy and manage ETL processes. Leveraged Oracle Data Integrator (ODI) as the primary ETL tool, working with pre-built OBIA mappings and customizing them as needed for specific client data models and transformations. Explored or utilized GoldenGate integration with ODI for advanced real-time Change Data Capture (CDC) scenarios, ensuring highly efficient incremental data updates.
  • End-to-End Installation & Configuration: Performed comprehensive installation and configuration of OBIA 11.x components, including the Oracle BI Server, Presentation Services, ODI, and WebLogic Server instances.
  • BI Repository (RPD) & Presentation Services Customization: Configured and extended the OBIA Repository (RPD) across all layers (Physical, Business Model, Presentation) to support custom reporting requirements, new hierarchies, and derived measures. Developed and customized dashboards and analyses within OBIEE Presentation Services.
  • Security & Performance: Implemented role-based security within OBIEE and assisted with performance tuning activities to optimize data loads and report query execution.
  • Data Load Management: Executed and monitored full and incremental data loads, troubleshooting and resolving issues to ensure timely and accurate data availability for business intelligence.

Associate Software Engineer

DXC (previously CSC - Computer Sciences Corp), Client: Al-Abbar
07.2013 - 03.2014
  • Contributed as an Associate Software Professional at CSC (now DXC Technology) for Al-Abbar, a key client in the manufacturing and retail sector, focusing on the end-to-end implementation of Oracle Business Intelligence Applications (OBIA) 7.x. This initiative aimed to establish a robust analytics platform for critical business insights.

Managed the complete OBIA 7.x implementation lifecycle, covering core components and stages:

  • Prerequisites & Initial Setup: Assisted with foundational database setup for the data warehouse (e.g., creating schemas, tablespaces) and ensured all prerequisite software installations were met.
  • System & Application Installation: Executed end-to-end installation of OBIA 7.x components, including the Oracle BI Server, Presentation Services, and associated web components. This also involved the installation and configuration of Informatica PowerCenter (repository and integration services) and DAC (Data Warehouse Administration Console) client and server.
  • Source System Configuration & Data Ingestion: Configured diverse source data systems (e.g., Oracle EBS, flat files like CSV) and performed initial full-load data ingestion, critical for foundational dimensions such as the Chart of Accounts (COA). This involved defining source system parameters within DAC and mapping data for OBIA's pre-built ETL.
  • ETL Development & Customization: Worked extensively with Informatica PowerCenter workflows and mappings within the OBIA framework. This included customizing out-of-the-box (OOTB) ETL to align with client-specific data models and business rules, ensuring efficient data extraction, transformation, and loading into the OBIA data warehouse.
  • BI Repository (RPD) Development & Configuration: Contributed to the setup and customization of the OBIA Repository (RPD) across its physical, logical, and presentation layers. This involved modifying existing subject areas, hierarchies, and measures, or creating new ones to address specific client reporting requirements.
  • Presentation Services & Reporting: Configured OBIEE Presentation Services, enabling access to out-of-the-box analytical dashboards and reports. Assisted in building and customizing new dashboards, analyses, and alerts to meet business user demands for critical insights.
  • Security Configuration: Implemented user and group security within OBIEE, defining object and data level permissions to control access to dashboards and reports.
  • Data Load Execution & Monitoring: Executed and monitored full and incremental data loads through DAC, ensuring data integrity, proactively resolving load failures, and optimizing execution plans for timely data availability in the BI dashboards.
  • Post-Implementation & Support: Conducted comprehensive system verification, assisted with data validation, and provided initial support for post-go-live operations.

Associate Senior Software Engineer

DXC (previously CSC - Computer Sciences Corp), Client: Unikai
Chennai
07.2012 - 06.2013
  • Contributed as an Associate Software Professional at CSC (now DXC Technology) for Unikai, a prominent client in the manufacturing and distribution sector. This role involved providing foundational support in Oracle Database Administration and WebLogic Server Management for various client environments.

Performed essential junior-level DBA activities, including:

  • Database installation, configuration, and patching assistance.
  • User, role, and privilege management to maintain database security.
  • Executing routine backup and recovery operations (e.g., using RMAN scripts for cold and hot backups).
  • Monitoring database performance, space utilization (tablespaces, datafiles), and troubleshooting connectivity issues.
  • Managing database objects and ensuring operational stability.

Handled core WebLogic administration tasks for Oracle and OBIEE environments, including:

  • Installation, configuration, and domain management of WebLogic Server instances.
  • Deployment and undeployment of Java EE applications (WAR/EAR files).
  • Configuring JDBC data sources and connection pools for application connectivity.
  • Managing JMS resources for messaging services.
  • Monitoring server health, performance metrics, and managing log files via the WebLogic Administration Console.
  • Provided direct support for OBIEE application server components configured within WebLogic, ensuring their availability and performance.
  • Executed server startup, shutdown, and general troubleshooting of application deployment and runtime issues.

Education

Bachelor of Engineering - Computer Science

Aalim Muhammed Salegh College Of Engineering
Chennai, Tamil Nadu, India
11-2012

Standard XII - Higher Secondary School

Velammal Matriculation Higher Secondary School
Chennai, Tamil Nadu, India
03-2008

Standard X - High School

Spartan Matriculation School
Chennai, Tamil Nadu, India
03-2006

Skills

Technical Skills & Expertise

Databases & Data Warehousing:

  • Oracle Database
  • Teradata
  • Hive (including HDFS)
  • Teradata FSLDM

ETL & Data Integration Tools:

  • Informatica PowerCenter
  • Oracle Data Integrator (ODI)
  • Oracle GoldenGate real-time replication (working familiarity; used for CDC)
  • Change Data Capture (CDC) (via ODI)

Big Data Ecosystem:

  • Cloudera Big Data Platform (CDH)
  • HDFS Operations
  • Hive Querying through Hue/Beeline Interface
  • Apache Spark/Sqoop (via PySpark)

Business Intelligence & Reporting:

  • Oracle Business Intelligence (OBIEE/OAS)
  • Oracle BI Publisher (XML Publisher)
  • DAC (Data Warehouse Administration Console)
  • BIACM (Business Intelligence Applications Configuration Manager)

Programming & Scripting Languages:

  • Python
  • Oracle PL/SQL
  • Shell Scripting
  • SQL (General)

Operating Systems & Administration:

  • Unix Command Proficiency
  • IBM AIX
  • Red Hat Enterprise Linux
  • WebLogic Server Administration (including JDBC/JMS configuration)
  • Kerberos Authentication
  • RMAN (Recovery Manager)

Methodologies & Compliance:

  • Agile Methodology (including Sprint & PI Planning)
  • Data Quality Frameworks
  • Data Governance Protocols
  • HIPAA Compliance
  • Audit Balancing Control (ABC) Framework

Functional & Operational Skills
  • Exceptional Communicator: I consistently articulate complex technical and business concepts clearly across all organizational levels, fostering understanding and driving alignment
  • Adept Problem-Solver: I excel at identifying the root causes of intricate technical, functional, and operational challenges, then developing and implementing effective and sustainable solutions
  • Skilled in Conflict Resolution & Stakeholder Management: I effectively mediate high-level disputes, including those between senior directors, to achieve consensus, mitigate risks, and maintain project momentum
  • Strategic & Commercially Astute: I proactively explore new billability opportunities, identify avenues for value creation, and contribute directly to business growth within project scopes
  • Team Empowerment & Mentorship: I provide critical support and guidance to teams encountering technical, functional, or operational roadblocks, ensuring timely issue resolution and fostering autonomy and growth
  • Committed to Project & Process Optimization: I actively contribute to streamlining workflows, enhancing operational efficiencies, and ensuring rigorous adherence to project timelines, quality standards, and overall success

Certification

  • Generative AI Fundamentals by Google, 11358302, 09/01/24
  • Data Science With Python by Great Learning, 07/01/24
  • AWS Partner: Generative AI Essentials by Amazon Web Services, 02/01/24
  • Google Analytics Individual Qualification (GAIQ), 94084287, 10/01/22
  • IBM Data Science by Coursera, 10/01/20
  • BigData Essentials by Skillsoft, 15848530, 04/01/20
  • Oracle Products Support Specialist by Oracle, 03/01/13
  • BI Challenge to Go (BIC2G) Certificate by Oracle, 03/01/13

Citizenship

India, Tamil Nadu, Chennai

Honors And Awards

  • Distinct Performance Award, HCL Technologies, 2023-24, 2022-23, 2019-20, 2018-19
  • Client Plaque Award for Excellence, Maybank, HCLTech, 2022-23, 2023-24
  • High Performance Award, Accenture, 2016-17
  • Client Spot Award, Al-Abbar, CSC, 2013-14
  • Second Runner-up in Amateur Kickboxing League, AMAFC, 2022

Accomplishments

  • FY2013-14: Client Spot Award – Recognized by Al-Abbar, CSC, for immediate and impactful contributions.
  • FY2016-17 (Issued April 2016): High Performance Award – Received from Accenture, acknowledging outstanding performance and dedication.
  • FY2018-19 (Issued November 2019): Distinct Performance Recognition – Earned this top-tier performance honor from HCL Technologies, highlighting exceptional contributions in this fiscal year.
  • FY2019-20 (Issued November 2020): Distinct Performance Recognition – Consistently recognized by HCL Technologies with this prestigious performance award for the subsequent fiscal year.
  • FY2022-23: I) Distinct Performance Honor – Achieved this top-most performance recognition from HCL Technologies for consistent excellence. II) Client Plaque Award for Excellence – Garnered this significant accolade from Maybank, in collaboration with HCLTech, for delivering innovative and scalable solutions with impeccable quality and business impact.
  • FY2023-24: I) Distinct Performance Honor – Earned HCL Technologies' top-most performance recognition for the fourth time. II) Client Plaque Award for Excellence – Received for the second consecutive year from Maybank, via HCLTech, reinforcing my impact in delivering high-quality, business-critical solutions.
  • FY2024-25: Distinct Performance Honor – Achieved this top-most performance recognition from HCL Technologies for the fifth consecutive fiscal year, a testament to hard-earned, sustained excellence.
