Senior Python Developer
Location: Bengaluru, Chennai, Gurgaon, Hyderabad, Mysore, Pune
Job code: 101465
Posted on: Mar 30, 2026
About Us:
AceNet Consulting is a fast-growing global business and technology consulting firm leveraging a consultative approach, deep domain expertise and strong technology capabilities across business transformation, IT strategy & architecture, digital transformation, data engineering & analytics, AI/ML, cloud & infrastructure, enterprise applications and emerging technologies to deliver value to our global clients. AceNet’s marquee clients include Tier-1 and Tier-2 banking & financial services, asset & wealth management, healthcare, consumer retail, eCommerce & logistics, engineering, government & public sector, consulting and technology firms. With a presence across Abu Dhabi (UAE), Texas (USA) and India (Bangalore, Gurgaon & Pune), AceNet brings strong consulting and delivery capabilities across project staffing, managed services, outsourcing and offshoring.
Job Summary:
We are looking for an experienced Senior Python Developer with strong expertise in designing, developing, and deploying scalable applications. The ideal candidate should have hands-on experience in Python-based frameworks, backend development, APIs, and cloud technologies, along with a solid understanding of software engineering best practices.
Key Responsibilities:
* Design, develop, and optimize scalable data pipelines and ETL processes using Azure Data Services and Databricks.
* Implement data transformations and analytics using PySpark and Python.
* Configure and manage Databricks Unity Catalog for secure data governance and access control.
* Orchestrate workflows using Apache Airflow for scheduling and automation.
* Collaborate with cross-functional teams to deliver high-quality data solutions in an Agile environment.
* Set up and maintain CI/CD pipelines for data workflows and deployments.
* Manage version control and branching strategies using Git.
* Write and execute unit tests to ensure code quality and reliability.
* Monitor, troubleshoot, and optimize data processes for performance and cost efficiency.
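To make the transformation and unit-testing responsibilities above concrete, here is a minimal, hypothetical sketch of a pure-Python transform step with a unit test. The function and field names (`clean_orders`, `order_id`, `amount`) are illustrative only, not from an actual AceNet codebase; in practice, logic like this would typically run as a PySpark job on Databricks.

```python
def clean_orders(records):
    """Drop records without an order id and normalise amounts to float.

    A tiny stand-in for the kind of cleansing step an ETL pipeline
    might apply before loading data into a lakehouse table.
    """
    cleaned = []
    for rec in records:
        if not rec.get("order_id"):
            continue  # skip rows missing the business key
        cleaned.append(
            {
                "order_id": rec["order_id"],
                "amount": float(rec.get("amount", 0)),
            }
        )
    return cleaned


def test_clean_orders_drops_rows_without_id():
    raw = [
        {"order_id": "A1", "amount": "10.5"},
        {"order_id": "", "amount": "3.0"},  # dropped: empty id
        {"amount": "7.0"},                  # dropped: missing id
    ]
    out = clean_orders(raw)
    assert [r["order_id"] for r in out] == ["A1"]
    assert out[0]["amount"] == 10.5


if __name__ == "__main__":
    test_clean_orders_drops_rows_without_id()
```

A test like this can be collected and run by pytest as-is, which is one common way the "unit testing frameworks for data pipelines" requirement below is met.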
Role Requirements and Qualifications:
* Azure Knowledge: Strong experience with Azure Data Factory, Azure Data Lake, Azure Synapse, or related services.
* Programming: Proficiency in Python and PySpark for data engineering tasks.
* Databricks: Hands-on experience in building and managing workflows in Databricks, including Unity Catalog.
* Orchestration: Experience with Apache Airflow for workflow scheduling and automation.
* CI/CD: Familiarity with tools like Azure DevOps or Jenkins for automated deployments.
* Version Control: Expertise in Git for code management.
* Testing: Strong understanding of unit testing frameworks for data pipelines.
* Agile Methodology: Experience working in Agile teams and using tools like Jira or Azure Boards.
* 5+ years of relevant experience in Python development.
* Mandatory skills: Python, Databricks, Git.
* Desired skills:
  - Deep expertise in Python.
  - Experience with Databricks, PySpark and Git.
  - Experience with the Azure cloud platform.
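As an illustration of the Git branching expertise listed above, the following is a minimal sketch of a feature-branch workflow in a throwaway repository. All repo, branch, and file names are illustrative only; real teams would layer their own branching strategy (e.g. trunk-based or GitFlow) on these same commands.

```shell
set -e
tmp=$(mktemp -d)                           # throwaway working area
cd "$tmp"
git init -q demo-pipeline
cd demo-pipeline
git config user.email "dev@example.com"    # local identity so commits work anywhere
git config user.name "Demo Dev"
git switch -qc main                        # start from an explicit main branch
echo "print('extract -> transform -> load')" > pipeline.py
git add pipeline.py
git commit -qm "Add initial pipeline script"
git switch -qc feature/add-validation      # develop the change on a feature branch
echo "# TODO: schema validation" >> pipeline.py
git commit -qam "Add schema validation placeholder"
git switch -q main
git merge -q --no-edit feature/add-validation
git log --oneline                          # history now contains both commits
```

In a team setting, the merge step would normally happen through a pull request with review rather than a local `git merge`.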
Why Join Us:
* Opportunities to work on transformative projects, cutting-edge technology and innovative solutions with leading global firms across industry sectors.
* Continuous investment in employee growth and professional development with a strong focus on up & re-skilling.
* Competitive compensation & benefits, ESOPs and international assignments.
* Supportive environment with healthy work-life balance and a focus on employee well-being.
* Open culture that values diverse perspectives, encourages transparent communication and rewards contributions.
How to Apply:
If you are interested in joining our team and meet the qualifications listed above, please apply and submit your resume highlighting why you are the ideal candidate for this position.
