Location: Hyderabad (On-site/Hybrid, as applicable)
Salary Range: ₹25 – ₹30 LPA
Experience Required: 7+ Years

About the Role:
We are seeking a highly skilled and motivated Data Engineer to lead the migration and optimization of our data pipelines from Snowflake to Databricks. This role will focus on building scalable, efficient, and high-performance ETL workflows using PySpark and SQL, ensuring seamless data operations for analytical and business needs.

Key Responsibilities:
• Migrate and optimize existing data pipelines from Snowflake to Databricks.
• Design and maintain efficient, scalable ETL workflows using PySpark and SQL.
• Develop performance-tuned, reliable data processing solutions for large-scale systems.
• Troubleshoot and resolve issues in data pipelines, ensuring data quality and reliability.
• Analyze technical requirements, then independently propose and implement effective solutions.
• Collaborate with business and technical stakeholders for smooth project execution.
• Apply advanced Spark performance tuning strategies.
• Maintain clear documentation for all workflows, data pipelines, and architecture changes.

Required Qualifications:
• Proficiency in Python, SQL, and Apache Spark (PySpark preferred).
• Hands-on experience with both Databricks and Snowflake, including pipeline design, optimization, and migration.
• Strong knowledge of ETL processes, data modeling, and distributed computing.
• Experience with orchestration tools like Apache Airflow or Databricks Workflows.
• Proven ability to work independently and manage priorities in a fast-paced environment.
• Familiarity with AWS cloud services (e.g., S3, EC2, Lambda) is a strong advantage.

Why Join Us?
• Opportunity to lead critical data migration and transformation projects.
• Work in a data-driven organization focused on innovation and excellence.
• Competitive salary with performance-based growth opportunities.
• Collaborative and inclusive work environment in a growing tech-driven setup.
Apply Now to be part of a forward-thinking team shaping the future of data engineering.

Equal Employment Opportunity (EEO) Statement:
We are an equal opportunity employer and value diversity at all levels of the organization. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status. All qualified applicants will receive consideration for employment without regard to any of the above characteristics.

Anti-Discrimination and Anti-Harassment Policy:
We are committed to maintaining a workplace free from discrimination, harassment, and retaliation. We promote a culture of respect, safety, and inclusion, and expect all team members to uphold these values.

Recruitment Disclaimer:
We do not charge any fees at any stage of the recruitment process and have not authorized any third party to do so. If you receive a suspicious communication, please verify with us directly.
We're Hiring: Agentic AI Developer
Location: Pune (Hybrid)
Apply at: [HIDDEN TEXT]
Experience: 3–5 years (Hands-on)

DATADOVE is on the lookout for an exceptional Agentic AI Developer to join our fast-growing team! If you're passionate about building AI systems that make real impact, especially in marketing and operations, this is your opportunity to shine.

What You'll Be Doing:
• AI/ML Development & Deployment for marketing and operations use cases
• Deep Learning: Work with CNNs, RNNs, Transformers, and Attention Mechanisms
• Generative AI: Utilize OpenAI (GPT, DALL·E, Whisper), Anthropic (Claude)
• Agentic AI: Build on platforms like AutoGen, CrewAI, AWS Bedrock
• Multimodal AI: Develop intelligent agents using text, voice, and more
• Python Expertise: Use NumPy, Pandas, Matplotlib, TensorFlow, PyTorch
• Traditional ML: Apply supervised/unsupervised learning, PCA, tuning
• Data Analytics: Run predictive analysis, clustering, A/B testing, KPI tracking
• MLOps & CI/CD: Handle model versioning, pipelines, monitoring
• Cloud Platforms: Leverage AWS (S3, Lambda, EC2, SageMaker, Bedrock)
• Dev Tools: Work with Cursor for streamlined code reviews and collaboration
• Collaboration: Communicate clearly with clients and teams to align delivery

What You Bring:
• Degree: B.Tech / BE / M.Tech / MCA / M.Sc
• 3+ years of hands-on AI/ML development experience
• Strong skills in Python, Deep Learning, and Generative AI frameworks
• Familiarity with AWS cloud services & MLOps best practices
• Excellent communication and stakeholder management skills
• Bonus: Experience with Power BI / Tableau
• Domain Edge: Experience in marketing-focused AI solutions

Why Join DATADOVE?
At DATADOVE, we're creating AI-powered solutions that help businesses scale faster, smarter, and more efficiently. Be part of a team where your work truly drives impact and where collaboration and innovation are core values.
Apply Now: Send your resume and portfolio to [HIDDEN TEXT]