
Krishna Meenon S

Summary

Dynamic and results-driven professional with over 20 years of experience in the software development life cycle (SDLC), specializing in the implementation and management of software applications. Expertise includes 11 years in Big Data platform administration, overseeing environments with more than 1,000 nodes and petabytes of data, along with proficiency in managing Azure cloud environments for HDInsight clusters and data lake flows. Proven track record in data warehouse projects utilizing Datastage, Oracle, Db2, and Teradata, complemented by hands-on experience in maintaining Hadoop ecosystems while implementing robust security measures. Skilled in establishing standards and processes across multiple clusters, ensuring high availability and efficient resource management within Hadoop environments.

Overview

20 years of professional experience

7 Certifications

Work History

Head, Data Platform Engineering & SRE

Standard Chartered Bank
10.2021 - Current
  • Patching the platforms and upgrading platform versions.
  • Successfully upgraded HDInsight from 4.1.0 to 5.1.0 to improve the platform's resiliency and security against zero-day vulnerabilities and CVEs.
  • Proved successful working within tight deadlines and a fast-paced environment.
  • Used critical thinking to break down problems, evaluate solutions and make decisions.
  • Managing the platform services that host 30+ applications.
  • Automated cost alerts and periodically optimized the platform on Azure, tuning data access patterns, avoiding non-standardized patterns, and minimizing heavy egress costs from cloud to the on-prem data lake.
  • Built manifests using Terraform and ADO pipelines to automate infrastructure provisioning end to end, through to the setup of HDInsight platform services and configurations.
  • Designed YARN scheduling patterns to make the best use of capacity across applications, reducing overall capacity needs by 33% (a capacity-scheduler sketch follows this list).
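
For illustration, a minimal sketch of the kind of YARN capacity-scheduler layout this describes; the queue names and percentage shares are hypothetical, not the actual platform configuration.

```python
# Hypothetical sketch: generate yarn.scheduler.capacity.* properties for a set
# of application queues. Queue names and shares are illustrative only.
QUEUES = {
    "ingest": {"capacity": 40, "maximum-capacity": 60},
    "etl":    {"capacity": 35, "maximum-capacity": 70},
    "adhoc":  {"capacity": 25, "maximum-capacity": 40},
}

def capacity_scheduler_properties(queues):
    """Return capacity-scheduler properties as a flat dict.

    Guaranteed capacities must sum to 100% of the root queue; maximum-capacity
    lets queues borrow idle capacity, which is what reduces the total capacity
    the platform has to reserve.
    """
    total = sum(q["capacity"] for q in queues.values())
    if total != 100:
        raise ValueError(f"root queue capacities must sum to 100, got {total}")

    props = {"yarn.scheduler.capacity.root.queues": ",".join(queues)}
    for name, cfg in queues.items():
        prefix = f"yarn.scheduler.capacity.root.{name}"
        props[f"{prefix}.capacity"] = str(cfg["capacity"])
        props[f"{prefix}.maximum-capacity"] = str(cfg["maximum-capacity"])
    return props

if __name__ == "__main__":
    for key, value in sorted(capacity_scheduler_properties(QUEUES).items()):
        print(f"{key} = {value}")
```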

Head, Data Platform Engineering & SRE

Standard Chartered Bank
02.2018 - Current
  • Managing the platform services that host 180+ source systems and 95+ consumption systems across a variety of tech stacks, including Dremio for query acceleration, Dataiku for ML operations, and MicroStrategy and Tableau for BI reporting.
  • Successfully delivered the data center migration for the entire data lake platform, including the HDP to CDP stack upgrade, 4 PB of data, 100+ services, and 95 applications that run 230K jobs per day.
  • Efficiently operated the platform with a TCO of $34 million, with ongoing cost savings and optimization.
  • Built the operating model for effectively recharging the overall platform cost to the platform's tenants (a usage-based recharge sketch follows this list).
  • Migrated the data lake platforms from physical infrastructure to a fully virtualized environment, segregating compute and storage resources for independent scaling.
  • Delivered the platform fully compliant with security requirements for data-in-motion encryption, data-at-rest encryption, MFA, network micro-segmentation, use of HSM modules, and data tokenization capability for tenants.
  • Developed and deployed ADO pipelines for end-to-end automation, from infrastructure provisioning through the setup of CDP platform services and configurations.
  • Configured and optimized YARN resource management, increasing cluster utilization by 40% without additional hardware spend.
  • Automated Apache NiFi template deployment, which was pivotal in reducing application code deployments during product migrations by 95%.
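
For illustration, a minimal sketch of a usage-based recharge model of the kind described above; the tenant names, usage figures, weights, and cost value are hypothetical.

```python
# Hypothetical sketch of a usage-based recharge model: split the platform's
# total cost across tenants in proportion to a blended usage metric
# (storage plus compute consumption). All figures are illustrative.
def recharge(total_cost, usage_by_tenant, storage_weight=0.5, compute_weight=0.5):
    """Return each tenant's share of total_cost.

    usage_by_tenant maps tenant -> (storage_tb, vcore_hours).
    """
    total_storage = sum(s for s, _ in usage_by_tenant.values())
    total_compute = sum(c for _, c in usage_by_tenant.values())
    charges = {}
    for tenant, (storage, compute) in usage_by_tenant.items():
        share = (storage_weight * storage / total_storage
                 + compute_weight * compute / total_compute)
        charges[tenant] = round(total_cost * share, 2)
    return charges

if __name__ == "__main__":
    usage = {"risk": (1200, 900_000), "finance": (800, 400_000), "retail": (2000, 700_000)}
    print(recharge(34_000_000, usage))
```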

Senior Support Manager

Standard Chartered Bank
03.2014 - 03.2019
  • The Enterprise Data Management (EDM) Platform is a Group asset that supports direct end-user and self-service access to data for strategic and operational analytics, holding the majority of the bank's analytic data in a single integrated and validated data structure. Acquired extracts enter EDM untransformed in Tier 1, where they are staged and a source image is kept using the Big Data/Hadoop framework; the data is then transformed into Tier 2 (Core Integrated 3NF data model) using the Teradata FSLDM design, together with Tier 3 (semantic view layer), which holds the subset of data an individual, business, or country requires, accessed through reporting tools such as MicroStrategy and Tableau.
  • Perform batch runs to load data from the different TP systems using a customized ingestion framework with Hortonworks DataFlow and NiFi.
  • Administer the cluster for day-to-day activities.
  • Prepare capacity utilization, planning, and forecasting reports and dashboards.
  • Create roles and provide access to technology and business operational users with ACLs, Kerberos, and Ranger policies.
  • Configured HA for HDFS and YARN; monitored multiple Hadoop cluster environments using Ganglia, Nagios, SmartSense, and Grafana.
  • Configure replication settings for folders to transfer data using WANdisco and DistCp (a DistCp/balancer sketch follows this list).
  • Expand the existing cluster with new nodes and create separate YARN queues for different application workloads.
  • Perform stack upgrades and node expansions as required.
  • Develop and implement automated housekeeping scripts for logs and system files.
  • Check HDFS balancing across the DataNodes and initiate data rebalancing.
  • Tune YARN queues in line with cluster growth and set user quotas.
  • Perform periodic maintenance activities on the Hive metastore (Postgres).
  • Communicate batch and cluster status with stakeholders, including upgrades and downtimes.
  • Implement changes for application code deployment and configure CDC mappings for source-system DDL changes to source the data.
  • Coordinate with vendors (IBM, Teradata, and Hortonworks) on product bugs and resolve issues per severity guidelines.
  • Monitor Teradata Viewpoint for cluster health status, locks, explain plans, and user sessions per TASM settings.
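
For illustration, a minimal sketch of the folder replication and rebalancing routine referenced above, driving the standard hadoop distcp and hdfs balancer CLIs from Python; the NameNode hosts and paths are placeholders.

```python
# Hypothetical sketch: replicate a folder between clusters with DistCp and
# trigger HDFS rebalancing. Hosts and paths are placeholders, not real systems.
import subprocess

SRC = "hdfs://nn-prod:8020/data/landing"   # hypothetical source NameNode/path
DST = "hdfs://nn-dr:8020/data/landing"     # hypothetical DR NameNode/path

def replicate(src=SRC, dst=DST):
    """Incrementally copy src to dst; -update copies only changed files."""
    subprocess.run(["hadoop", "distcp", "-update", src, dst], check=True)

def rebalance(threshold_pct=10):
    """Run the HDFS balancer until DataNode utilisation is within the threshold."""
    subprocess.run(["hdfs", "balancer", "-threshold", str(threshold_pct)], check=True)

if __name__ == "__main__":
    replicate()
    rebalance()
```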

Technical Lead

Standard Chartered Bank
07.2012 - 03.2014
  • Standard Chartered Bank (SCB) sends Statements of Transactions (SOT) and Statements of Advice (SOA) to its customers periodically, as per financial guidelines and regulations. The transaction statements (SOT) contain information on deposits, withdrawals, cheques paid, interest earned, and service charges or penalties incurred on an account; they show the cumulative effect of these transactions on the account's balance, up to the date the report was prepared.
  • Lead the team for day-to-day batch runs.
  • Review the exceptions generated during loading and liaise with source systems on data quality issues.
  • Participate in severity calls, recovering and regenerating statements to avert reputational-loss situations.
  • Fix code issues in statement formatting and logic issues in populating data into the layout.
  • Participate in disaster recovery exercises and perform auto failovers.
  • Drive System Improvement Plan calls with different stakeholders on potential issues.
  • Responsible for capacity planning estimations and management reports.

Senior Product Analyst

Standard Chartered Bank
09.2009 - 06.2012
  • The project aims at creating a Data Integration Hub (DIH) that holds data from various source applications within the bank, e.g., EBBS, HOGAN, SCI. The DIH is used as the source to generate interface files for downstream applications such as B&CPR, CDW, DORIS, R-FRAME, etc., thereby decoupling the downstream applications from the TP systems for interface file generation. In future, the Data Integration Hub is intended to be the single point of data source for all downstream applications.
  • Replicate data from the source systems to the target through daily subscriptions.
  • Design and develop the CDC flows from the source to the target (a purely illustrative sketch of the pattern follows this list).
  • Create jobs using the Copy, Remove Duplicates, and Aggregator stages.
  • Responsible for updating subscriptions as per source changes.
  • Involved in the creation of datastores and subscriptions.
  • Raise PMRs with IBM on CDC product-related issues.
  • Carry out IDS movement runs for file generation.
  • Responsible for data issues in IDS runs.
  • Responsible for capacity planning and EOD activities.
  • Responsible for preparing tracking reports.
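
For illustration, a purely schematic sketch of the change-data-capture apply pattern described above; it does not reflect IBM InfoSphere CDC's actual API and uses an in-memory SQLite table as a stand-in target.

```python
# Illustrative-only sketch of a CDC apply loop; IBM InfoSphere CDC handles this
# inside the product, so nothing here reflects its real API.
import sqlite3

def apply_changes(conn, changes):
    """Apply (operation, key, value) change records to a simple target table."""
    for op, key, value in changes:
        if op == "I":
            conn.execute("INSERT INTO target(id, val) VALUES (?, ?)", (key, value))
        elif op == "U":
            conn.execute("UPDATE target SET val = ? WHERE id = ?", (value, key))
        elif op == "D":
            conn.execute("DELETE FROM target WHERE id = ?", (key,))
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE target(id INTEGER PRIMARY KEY, val TEXT)")
    apply_changes(conn, [("I", 1, "a"), ("I", 2, "b"), ("U", 1, "a2"), ("D", 2, None)])
    print(conn.execute("SELECT * FROM target").fetchall())  # [(1, 'a2')]
```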

Production support

Standard Chartered Bank
08.2007 - 09.2009
  • The Credit Data Warehouse uses a data model developed by IBM; the objective is to source and acquire the necessary data from the Amadeus system and store it in a single repository, the WB Basel/Credit Data Warehouse, covering the ETL process (Extract, Transform, and Load) that delivers the data warehouse.
  • Responsible for file collection in production for different systems and different countries.
  • Perform source validation checks, test source files in the UT environment, and send reports to users.
  • Responsible for daily and monthly runs.
  • Responsible for resolving problem tickets raised by users regarding CRs and checking data mismatches in production.
  • Responsible for fixing job/sequencer issues before the next run.
  • Created crontab entries to back up the source files on the server (a backup sketch follows this list).
  • Responsible for managing change requests and production implementations.
  • Created server migration plans for production servers and OAT checks.
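
For illustration, a minimal sketch of a source-file backup that a crontab entry could schedule; the directory names and schedule are hypothetical.

```python
# Hypothetical sketch: archive the day's source files into a dated tarball,
# suitable for scheduling from crontab (e.g. "0 1 * * * python3 backup.py").
# Directory names are placeholders.
import tarfile
from datetime import date
from pathlib import Path

SRC_DIR = Path("/data/source_files")       # hypothetical landing directory
BACKUP_DIR = Path("/backup/source_files")  # hypothetical backup directory

def backup_sources(src_dir=SRC_DIR, backup_dir=BACKUP_DIR):
    backup_dir.mkdir(parents=True, exist_ok=True)
    archive = backup_dir / f"sources_{date.today():%Y%m%d}.tar.gz"
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(str(src_dir), arcname=src_dir.name)
    return archive

if __name__ == "__main__":
    print(f"Created {backup_sources()}")
```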

ETL Developer

AT&T
01.2006 - 08.2007
  • Successfully migrated data into the Universal Platform (UP), i.e., CADM segments, after implementing several transformations using Ascential DataStage.
  • Prepared technical specifications from the client-provided functional specifications and templates, according to business logic.
  • Contributed additions and modifications to project designs; performed lookup operations using hashed files to obtain the input data files.
  • Created job sequences to run batch jobs and handle success and failure events, creating job parameters in DataStage to run jobs.
  • Testing: unit testing and IPWT.

Education

Computer Applications Development

Bharath Institute of Higher Education & Research
Chennai India
05-2005

Skills

Expertise in big data platform engineering & SRE

Migrated data lake platforms from physical infrastructure to a fully virtualized environment, segregating compute and storage resources for independent scaling

Proficient in streamlining processes through automation and minimizing manual tasks

Experienced in cloud environment administration and migrating platforms from on-prem to cloud

Handled the platforms' security remediation programmes for zero-day vulnerabilities, CVEs, and their patches

Skilled in managing Cloudera and Hortonworks environments

Hadoop Ecosystem: HDFS, Atlas, Kerberos, Knox, Ranger, MapReduce2, YARN, Tez, Hive, HBase, Sqoop, Oozie, ZooKeeper, Falcon, Kafka, Spark, NiFi & Solr

Big Data Distributions: Cloudera (CDP), Hortonworks (HDP) & Apache distribution

SQL Query Engine: Presto

Hadoop Replication: WANdisco, Cloudera Replication Manager, Isilon SyncIQ

Databases: Db2 9.x/10.x, Oracle 11g, Teradata 15.10

Platforms: Windows, Red Hat Linux, SUSE Linux, IBM AIX

ETL Tool: IBM InfoSphere Information Server 8.5

Replication Tool: IBM Change Data Capture 11.3.x

Reporting Tools: MicroStrategy 10.3, Tableau 9.x

Job Scheduler: BMC Control-M 8.1, Apache Airflow

Query Tools: AQT v9, Toad, SQL Developer, SQuirreL, Teradata Studio, DbVisualizer

  • Strong understanding of total cost of ownership (TCO) analysis and implementing cost-saving methods; handled platforms with a TCO of $34 million.
  • Effective leadership with a net promoter score of 43.

Accomplishments

  • Winner of the Data Impact Award in the category of Data Governance & Fabric Excellence (Operational Excellence) at EVOLVE25 APAC.
  • Winner of the IDC Future of Intelligence award, 2023.
  • Winner of ET Data Elevate, 2021.

Certification

  • Microsoft Certified: Azure Solutions Architect Expert
  • Microsoft Certified: Azure Administrator Associate
  • Microsoft Certified: Azure Fundamentals
  • Hortonworks HDP Certified Administrator
  • Amazon Web Services Solutions Architect Associate
  • Certified ScrumMaster®
  • IBM Certified Solution Developer - InfoSphere DataStage v8.5
