
Arvind Abraham

Senior Data Engineer | ML & Data Engineering Specialist
📍 Bengaluru, Karnataka, India

About Me

6+ Years Experience | 200+ Students Mentored | 3 Leading Tech Companies
Hi, I'm Arvind Abraham, a Senior Software Engineer who believes technology is more than just code. It’s the art of turning data into decisions, ideas into impact, and complexity into something beautifully simple.

For the past 6+ years, I’ve had the privilege of working with visionary companies like Zscaler, Kayzen (a Berlin-based AdTech startup), and Nokia. Along the way, I’ve built large-scale data systems, real-time analytics pipelines, and machine learning solutions that not only solve problems but unlock possibilities.

What excites me most is the blend of engineering and storytelling: designing pipelines that whisper insights, models that predict tomorrow, and systems that empower people to make smarter choices.

Outside of work, I love mentoring aspiring engineers, contributing to the tech community, and chasing experiences that push me out of my comfort zone, from mentoring 200+ students, to skydiving over Prague at 15,000 feet, to diving deep into the blue waters of the Andamans.

I see every project as an adventure and every challenge as a chance to grow, and I bring that same spirit to every team I work with. 🚀

Certifications, Honours and Achievements

  • Mentored over 200 students at Christ University
  • Met in person with Micha Weis (Head of the Finance Cyber Unit, Ministry of Finance, Israel) at a cybersecurity meet
  • First Prize in Debugging (C/C++) at Inter-College fest associated with Anna University
  • First Prize in Paper Presentation at Inter-College fest associated with Anna University
  • Won the district-level versification competition among 80,000 students at the CBSE Youth Festival
  • Received commendation from Kapil Sibal, Minister of Human Resource Development, for successfully completing the inaugural batch of CCE
  • Completed a 15,000-foot skydive over Prague, an incredible adventure
  • Completed two scuba dives in the Andaman & Nicobar Islands
  • Participated in the Google Agentic AI Hackathon at Zscaler, sponsored by Google
  • Industry Nominee on the Board of Studies at Vimal Jyothi Engineering College, and Education Mentor (Computer Science Department) at Christ College, Irinjalakuda
Core Skills

🤖 Machine Learning
TensorFlow, PyTorch, Scikit-learn, MLflow, A/B Testing, Feature Engineering

⚡ Data Engineering
Apache Spark, Apache Kafka, Apache Airflow, ClickHouse, Hadoop HDFS, Delta Lake

☁️ Cloud & DevOps
Microsoft Azure, Kubernetes, Docker, CI/CD, ARM Templates, Monitoring

💾 Databases
MySQL, PostgreSQL, ClickHouse, Redis, MongoDB

Technology Stack

Python, Kafka, Apache Spark, Apache Airflow, Hadoop HDFS, Azure, dbt, ClickHouse, MySQL, TensorFlow, ChatGPT, Kubernetes, Docker, Git

Professional Experience

Sr. Software Development Engineer

Zscaler | Bangalore, India | May 2024 - Present

• [Data Engineering] Architected real-time streaming data pipeline on Azure using ADX and Event Hubs, fully automated with ARM templates – enabling scalable, low-latency insights for enterprise clients.
• [Analytics Platform] Built powerful, reusable reporting framework with intelligent network data mapping, sanitization, validation, and anomaly detection capabilities (see the sketch below this list).
• [Security Analytics] Led IP-level fraud detection initiatives, extracting behavioral patterns and surfacing high-risk indicators in network traffic.
• [DevOps Excellence] Designed end-to-end CI/CD pipeline integrating functional testing, semantic versioning, and security best practices, boosting developer velocity by 40%.
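
To give a flavour of the sanitization, validation, and anomaly-detection ideas behind such a reporting framework, here is a minimal Python sketch. The field names (src_ip, bytes_out) and the simple z-score rule are illustrative placeholders, not the production implementation.

```python
# Minimal sketch: sanitize network records and flag volume anomalies.
# Field names and the z-score threshold are illustrative assumptions.
import ipaddress
import statistics
from typing import Iterable, List, Optional

def sanitize(record: dict) -> Optional[dict]:
    """Drop records with malformed IPs or byte counts; normalize field types."""
    try:
        record["src_ip"] = str(ipaddress.ip_address(record["src_ip"].strip()))
        record["bytes_out"] = int(record["bytes_out"])
    except (KeyError, ValueError):
        return None
    return record

def flag_anomalies(records: Iterable[dict], threshold: float = 3.0) -> List[dict]:
    """Mark records whose bytes_out deviates from the mean by more than `threshold` std devs."""
    rows = [r for r in (sanitize(dict(r)) for r in records) if r is not None]
    values = [r["bytes_out"] for r in rows]
    if len(values) < 2:
        return rows
    mean = statistics.mean(values)
    stdev = statistics.pstdev(values) or 1.0
    for r in rows:
        r["anomalous"] = abs(r["bytes_out"] - mean) / stdev > threshold
    return rows
```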

DevOps Engineer (AI/ML & Data)

Kayzen | Berlin & Bangalore | Aug 2019 - May 2024

• [ML Impact] Built end-to-end ML backend for user retargeting, driving median monthly revenue of $30k with advanced feature engineering and model deployment.
• [Performance Optimization] Optimized Parquet ETL processes, reducing ClickHouse-to-HDFS data transfer time from 24+ hours to 40-50 minutes for 1.5TB datasets (see the sketch below this list).
• [Streaming Excellence] Enhanced JSON data ingestion from Kafka to ClickHouse, increasing throughput from 48K to 93K messages per second.
• [Data Architecture] Architected modern data lake and warehouse using Hive, Spark, HDFS, ClickHouse, Iceberg, and Delta technologies.
• [Innovation Leadership] Pioneered Apache Airflow adoption during early development phase, creating scalable workflow management solutions.
• [International Experience] Selected for employee exchange program in Berlin, leading cross-cultural technology initiatives.
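
The Parquet ETL speed-up mentioned above comes down to parallelizing the ClickHouse reads and writing partitioned Parquet straight to HDFS. A minimal PySpark sketch of that pattern follows; the JDBC URL, driver class, table, column names, and partition bounds are illustrative assumptions, not the actual Kayzen pipeline.

```python
# Minimal sketch: parallel JDBC export from ClickHouse to partitioned Parquet on HDFS.
# Hostnames, table and column names, and partition bounds are illustrative assumptions.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("clickhouse_to_hdfs_parquet")
    .getOrCreate()
)

# Read the source table with 24 parallel JDBC partitions split on a numeric hour column.
events = (
    spark.read.format("jdbc")
    .option("url", "jdbc:clickhouse://clickhouse.internal:8123/analytics")
    .option("driver", "com.clickhouse.jdbc.ClickHouseDriver")
    .option("dbtable", "events")
    .option("partitionColumn", "event_hour")
    .option("lowerBound", "0")
    .option("upperBound", "23")
    .option("numPartitions", "24")
    .load()
)

# Write date-partitioned Parquet to HDFS so downstream jobs can prune partitions.
(
    events.write.mode("overwrite")
    .partitionBy("event_date")
    .parquet("hdfs:///warehouse/events_parquet/")
)
```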

R&D BigData Engineer

Nokia | Bengaluru, India | Aug 2018 - Aug 2019

• [Security Integration] Implemented secure Schema Registry integration with Kafka for enterprise-grade data governance (see the sketch below this list).
• [Infrastructure Security] Developed SSL certificate generation scripts for internal authentication on 1K+ node Kubernetes clusters.
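
A secure Schema Registry integration of this kind can be sketched with the confluent-kafka Python client. The broker and registry hostnames, certificate paths, topic, and Avro schema below are illustrative assumptions rather than the actual Nokia setup.

```python
# Minimal sketch: Avro producer using a TLS-secured Confluent Schema Registry.
# Hostnames, certificate paths, topic, and schema are illustrative assumptions.
from confluent_kafka import SerializingProducer
from confluent_kafka.schema_registry import SchemaRegistryClient
from confluent_kafka.schema_registry.avro import AvroSerializer
from confluent_kafka.serialization import StringSerializer

schema_str = """
{
  "type": "record",
  "name": "Metric",
  "fields": [
    {"name": "node", "type": "string"},
    {"name": "value", "type": "double"}
  ]
}
"""

# Schema Registry client over HTTPS with an internal CA certificate.
registry = SchemaRegistryClient({
    "url": "https://schema-registry.internal:8081",
    "ssl.ca.location": "/etc/kafka/ssl/ca.pem",
})

# Producer using mutual TLS to the brokers and Avro serialization for values.
producer = SerializingProducer({
    "bootstrap.servers": "kafka.internal:9093",
    "security.protocol": "SSL",
    "ssl.ca.location": "/etc/kafka/ssl/ca.pem",
    "ssl.certificate.location": "/etc/kafka/ssl/client.pem",
    "ssl.key.location": "/etc/kafka/ssl/client.key",
    "key.serializer": StringSerializer("utf_8"),
    "value.serializer": AvroSerializer(registry, schema_str),
})

producer.produce(topic="node-metrics", key="node-42", value={"node": "node-42", "value": 0.97})
producer.flush()
```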