Naman  ·  Lead Snowflake Data Engineer  ·  9+ yrs

Senior
9+ years experience · Remote
Available within 48 hrs

Proof of scale

1 billion rows of data handled
350 clients upgraded weekly
40% reduction in query processing time

About Naman

Naman is a Data Engineer with 9+ years of experience in developing and deploying ETL pipelines and managing data solutions. He has a strong background in data ingestion, processing, and optimization across various cloud platforms.


Skills (16)

Python · PySpark · Airflow · Snowflake · Kubernetes · Cassandra · AWS · Docker · Oracle Data Integrator (ODI) · Ansible · Jenkins · Oracle GoldenGate · SQL · PL/SQL · Kafka · AWS S3

Why hire Naman?

Production deploy authority · Mentored 5+ juniors

Led development of ETL pipelines handling over 1 billion rows of telecom network data.

Achieved a 40% reduction in Snowflake query processing time through performance tuning.

Managed Kubernetes deployments for critical data infrastructure, ensuring high availability.

Developed and deployed a client upgrade system with Ansible and Jenkins, capable of upgrading over 350 clients weekly.

Configured automated ETL jobs with comprehensive monitoring and alerting.
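As a rough illustration of the monitoring-and-alert pattern described above, the sketch below wraps an ETL job in retry logic and fires an alert when retries are exhausted. All names, thresholds, and the alert hook are hypothetical, not Naman's actual implementation.

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl_monitor")

def run_with_monitoring(job, name, max_retries=3, alert=None):
    """Run an ETL job callable, retrying on failure and alerting on exhaustion."""
    for attempt in range(1, max_retries + 1):
        try:
            result = job()
            log.info("job %s succeeded on attempt %d", name, attempt)
            return result
        except Exception as exc:
            log.warning("job %s failed on attempt %d: %s", name, attempt, exc)
            # A real system would back off here (e.g. sleep 2 ** attempt seconds).
    if alert:
        alert(f"ETL job {name} failed after {max_retries} attempts")
    raise RuntimeError(f"{name} exhausted retries")

# Usage: a flaky job that succeeds on the second attempt.
attempts = {"n": 0}
def flaky_job():
    attempts["n"] += 1
    if attempts["n"] < 2:
        raise IOError("transient source outage")
    return 42

print(run_with_monitoring(flaky_job, "daily_load"))  # prints 42
```

In production this wrapper would typically be an Airflow task callback or an ODI scheduling hook rather than a hand-rolled loop.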

Project highlights(4)

Telecommunications Data Processing · Lead Developer

Overview: Developed an application mediation layer to ingest and transform huge volumes of data from telecom network sites for efficient storage and analytical use.

Responsibilities:

  • Developed ETL pipelines from scratch using Python, PySpark, Airflow, and Cassandra, and deployed them in Snowflake.

  • Managed the Kubernetes deployment for PySpark and Cassandra.

  • Created data models and conducted POCs for data storage technologies (NoSQL and NewSQL).

  • Configured ODI's scheduling framework for automated ETL jobs with monitoring and alert systems.

Python · PySpark · Airflow · Cassandra · Snowflake · Kubernetes

Key outcomes:

  • Handled and ingested over 1 billion rows of data successfully.

SIEM Data Ingestion · Senior Data Developer

Overview: Contributed to a Security Incident and Event Management (SIEM) tool that collects logs from hundreds of sources and analyzes them to detect cyber-attacks and vulnerabilities.

Responsibilities:

  • Developed and containerized an AWS CloudTrail connector/ingester agent using Docker for easy deployment.

  • Created Python-based connectors to fetch and ETL data from multiple input sources.

  • Developed a client upgrade system using Ansible and Jenkins, capable of upgrading over 350 clients weekly.

  • Implemented performance tuning for ODI mappings and robust error handling for ETL processes.

Python · ODI · Docker · AWS · Ansible · Jenkins

Key outcomes:

  • Capable of upgrading over 350 clients in a single week.
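A minimal sketch of the Python connector pattern this project describes: extract records from a raw CloudTrail-style JSON payload and normalize each into a flat event for the SIEM. The field names and output schema are illustrative assumptions, not the tool's actual schema.

```python
import json

def normalize_event(raw):
    """Map a CloudTrail-style record to a flat SIEM event (illustrative schema)."""
    return {
        "timestamp": raw["eventTime"],
        "source": "aws.cloudtrail",
        "action": raw["eventName"],
        "actor": raw.get("userIdentity", {}).get("arn", "unknown"),
    }

def ingest(payload):
    """ETL step: extract the Records array from a raw payload and normalize each."""
    records = json.loads(payload).get("Records", [])
    return [normalize_event(r) for r in records]

# Usage with a single synthetic record.
payload = json.dumps({"Records": [{
    "eventTime": "2024-01-01T00:00:00Z",
    "eventName": "ConsoleLogin",
    "userIdentity": {"arn": "arn:aws:iam::123456789012:user/alice"},
}]})
events = ingest(payload)
print(events[0]["action"])  # prints ConsoleLogin
```

A containerized connector like the one described would run this loop continuously against the CloudTrail API, which is omitted here to keep the sketch self-contained.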

Real-Time Data Integration · Senior ETL Developer

Overview: Led a critical initiative to implement a real-time data integration solution using Oracle Data Integrator (ODI) and Oracle GoldenGate, synchronizing data between operational systems and a central data repository.

Responsibilities:

  • Designed and implemented a real-time ETL pipeline for streaming transactional data from multiple source systems (ERP, CRM) with minimal latency.

  • Incorporated data validation rules and quality checks to ensure data accuracy and consistency.

  • Optimized ODI mappings and GoldenGate processes for high throughput and minimal latency, and set up continuous monitoring dashboards.

Oracle Data Integrator (ODI) · Oracle GoldenGate · SQL · PL/SQL

Key outcomes:

  • Achieved a 40% reduction in query processing time through performance optimization.

Snowflake Data Warehouse Development · Data Engineer

Overview: Designed and implemented a scalable cloud-based data warehouse on Snowflake to support business intelligence and analytics.

Responsibilities:

  • Designed and deployed the Snowflake data warehouse architecture, optimizing for cost and scalability.

  • Developed and optimized ETL pipelines using Snowflake, Kafka, and Airflow for data ingestion and transformation.

  • Created and maintained star and snowflake schemas for effective querying.

  • Optimized query performance using clustering keys, materialized views, and partitioning, achieving a 40% reduction in query processing time.

Snowflake · Kafka · Airflow · AWS S3

Key outcomes:

  • Achieved a 40% reduction in query processing time through performance optimization.
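The clustering-key and materialized-view optimizations described in this project correspond to Snowflake DDL along the lines sketched below (assembled here as Python strings, since the profile has no code of its own). The table, column, and view names are hypothetical, not the actual warehouse schema.

```python
# Clustering the fact table on columns that appear in common filters lets
# Snowflake prune micro-partitions instead of scanning the whole table.
cluster_ddl = (
    "ALTER TABLE fact_orders "
    "CLUSTER BY (order_date, region)"
)

# A materialized view precomputes a frequent aggregation so dashboards
# read the stored result instead of re-aggregating the fact table.
mview_ddl = (
    "CREATE MATERIALIZED VIEW daily_sales AS "
    "SELECT order_date, region, SUM(amount) AS total "
    "FROM fact_orders "
    "GROUP BY order_date, region"
)

print(cluster_ddl)
print(mview_ddl)
```

Which columns to cluster on, and which aggregations justify a materialized view's maintenance cost, depend on the query workload; a 40% speedup is plausible only when the hot queries actually filter or aggregate on those keys.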

Industry experience

SaaS / B2B

1 project
  • Real-Time Data Integration · Senior ETL Developer · Oracle Data Integrator (ODI) · Oracle GoldenGate · SQL · PL/SQL

Cybersecurity

1 project
  • SIEM Data Ingestion · Senior Data Developer · Python · ODI · Docker · AWS +2

Telecom

1 project
  • Telecommunications Data Processing · Lead Developer · Python · PySpark · Airflow · Cassandra +2

Ready to work with Naman?

Schedule an interview and onboard within 48 hours. No long hiring cycles.

At a Glance

Experience: 9+ years
Work mode: Remote
Starting from: ₹1.7 L/mo
Direct hire: Possible
Start within: 48 hours

Single contract. No agency markup confusion.

Typically responds within 4 business hours.

5-day replacement guarantee
48-hour onboarding, single invoice
Direct chat — no recruiter middleman
Seniority signals
Owns production deploys · Greenfield architect · System owner · Code reviewer · Mentor / leads juniors
Verified · Vetted by Witarist
Technical skills assessed & verified
Background & identity checked
English communication verified
Ready to onboard in 48 hours

Not sure if this is the right fit?

Tell us your requirements and we'll match you with the best candidates.
