Snowflake Data Engineer

Experience: 4 to 6 years
Location: Bengaluru, Gurgaon, Pune
Job code: 101489
Posted on: Apr 23, 2026

About Us:
AceNet Consulting is a fast-growing global business and technology consulting firm that leverages a consultative approach, deep domain expertise and strong technology capabilities across business transformation, IT strategy & architecture, digital transformation, data engineering & analytics, AI/ML, cloud & infrastructure, enterprise applications and emerging technologies to deliver value to our global clients. AceNet's marquee clients include Tier-1 and Tier-2 banking & financial services, asset & wealth management, healthcare, consumer retail, eCommerce & logistics, engineering, government & public sector, consulting and technology firms. With a presence across Abu Dhabi (UAE), Texas (USA) and India (Bangalore, Gurgaon & Pune), AceNet brings strong consulting and delivery capabilities across project staffing, managed services, outsourcing and offshoring.

Job Summary:
We are looking for a Snowflake Data Engineer who specializes in building, optimizing, and maintaining ingestion pipelines and transformation workflows that bring data from a diverse set of source patterns into the platform reliably and at scale. This role operates across the full Snowflake ingestion surface: ADF-orchestrated loads, Snowflake-native tasks and streams, file-based and API-driven ingestion, and push/pull patterns across SFTP and cloud storage.

Key Responsibilities & Required Skills:
Snowflake Platform:
*3+ years hands-on Snowflake development.
*Snowflake architecture: warehouses, databases, schemas, stages, file formats, storage integrations.
*Snowflake Streams, Tasks, Pipes — design and production use.
*Performance tuning: clustering, materialized views, warehouse sizing, query profiling.
*Snowflake RBAC, column masking, row-level security.
Ingestion & Orchestration:
*Azure Data Factory — pipelines, linked services, triggers, monitoring.
*File-based ingestion: CSV, JSON, Parquet, Avro from Azure Blob / ADLS or equivalent.
*REST API ingestion with auth patterns (OAuth2, API key) and pagination.
*SFTP automation and file handling pipelines.
*Push/pull ingestion pattern design and implementation.
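To illustrate the REST API ingestion pattern listed above, the sketch below shows a cursor-based pagination loop in plain Python. All names are hypothetical: `fetch_page` is a local stand-in for an authenticated HTTP call (in a real pipeline it would be a `requests` call carrying an OAuth2 bearer token or API key header), not any specific API.

```python
# Hypothetical sketch of cursor-based API pagination for ingestion.
# fetch_page simulates a REST endpoint that returns a batch of records plus a
# next-page cursor; a real pipeline would make an authenticated HTTP request here.

def fetch_page(cursor=None, page_size=2):
    """Stand-in for a paginated GET call; returns (records, next_cursor)."""
    data = [{"id": i} for i in range(1, 6)]  # pretend upstream dataset of 5 rows
    start = cursor or 0
    page = data[start:start + page_size]
    next_cursor = start + page_size if start + page_size < len(data) else None
    return page, next_cursor

def ingest_all(fetch):
    """Pagination loop: keep fetching pages until no next cursor remains."""
    records, cursor = [], None
    while True:
        page, cursor = fetch(cursor)
        records.extend(page)
        if cursor is None:
            return records

rows = ingest_all(fetch_page)
print(len(rows))  # all 5 upstream records collected across 3 pages
```

The same loop shape applies whether the API uses opaque cursors, page numbers, or `next` links; only `fetch_page` changes.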
SQL & Development:
*Expert-level SQL: complex joins, window functions, CTEs, query optimization.
*Python for pipeline scripting, API integration, and data manipulation (pandas, requests, SQLAlchemy).
*dbt or equivalent transformation framework — model development, testing, documentation.
*Version control with Git; experience in code review workflows.
Data Engineering Practices:
*Incremental load patterns: watermark, CDC, merge/upsert logic.
*Error handling, retry logic, and pipeline alerting.
*Data quality checks embedded within pipelines.
*Cost management and monitoring on cloud data platforms.
*Experience with Agile/Scrum delivery environments.
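The incremental-load bullets above (watermark filtering combined with merge/upsert logic) can be sketched in plain Python as follows. This is an illustrative model only: dicts stand in for tables, and in a production Snowflake pipeline the upsert would be a `MERGE` statement with the watermark persisted in a control table.

```python
# Illustrative sketch of a watermark-driven incremental load with merge/upsert
# semantics. Dicts stand in for Snowflake tables; names are hypothetical.

def incremental_load(target, source_rows, watermark):
    """Apply only source rows newer than the watermark; upsert on key 'id'.

    Returns the advanced watermark (max updated_at seen)."""
    new_watermark = watermark
    for row in source_rows:
        if row["updated_at"] <= watermark:  # skip rows already loaded
            continue
        target[row["id"]] = row             # insert or overwrite (merge/upsert)
        new_watermark = max(new_watermark, row["updated_at"])
    return new_watermark

target = {1: {"id": 1, "val": "old", "updated_at": 10}}
source = [
    {"id": 1, "val": "new", "updated_at": 15},  # update to an existing row
    {"id": 2, "val": "x",   "updated_at": 12},  # brand-new row
    {"id": 3, "val": "y",   "updated_at": 5},   # older than watermark: skipped
]
wm = incremental_load(target, source, watermark=10)
print(wm, sorted(target))  # watermark advances to 15; target holds rows 1 and 2
```

The same skip-then-upsert shape underlies CDC consumption as well, with the stream offset playing the role of the watermark.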

Role Requirements and Qualifications:
*Bachelor's degree in Computer Science, Information Systems, Engineering, or equivalent practical experience.
*4–6 years of demonstrated data engineering experience with a focus on Snowflake as the target platform.
*Proven delivery of production-grade ingestion pipelines across multiple source patterns in a team environment.
Preferred:
*SnowPro Core Certification (or equivalent Snowflake-issued credential).
*Experience working in offshore or near-shore delivery models with onshore stakeholders across time zones.
*Prior exposure to enterprise data platforms in regulated industries (financial services, oil & gas, public sector, or similar).
*Familiarity with Azure ecosystem services: ADLS Gen2, Azure Key Vault, Azure Monitor, Event Hub.
Nice to Have:
*Experience with Snowpark (Python or Java in-Snowflake compute) for advanced transformation workloads.
*Familiarity with Snowflake Data Sharing or Marketplace for cross-organization data exchange.
*Exposure to real-time or streaming ingestion patterns (Kafka, Azure Event Hub, Kinesis) that land into Snowflake.
*Experience with data catalog or metadata tools (Collibra, Alation, Purview) for documenting Snowflake objects.
*Familiarity with Terraform or other IaC tooling for Snowflake object provisioning.

Why Join Us:
* Opportunities to work on transformative projects, cutting-edge technology and innovative solutions with leading global firms across industry sectors.
* Continuous investment in employee growth and professional development, with a strong focus on upskilling and reskilling.
* Competitive compensation & benefits, ESOPs and international assignments.
* Supportive environment with healthy work-life balance and a focus on employee well-being.
* Open culture that values diverse perspectives, encourages transparent communication and rewards contributions.

How to Apply:
If you are interested in joining our team and meet the qualifications listed above, please apply and submit your resume highlighting why you are the ideal candidate for this position.