5.0 - 9.0 years
0 Lacs
Karnataka
On-site
You will be joining an innovative company that is revolutionizing retail checkout experiences by using cutting-edge Computer Vision technology to replace traditional barcodes. Our platform creates seamless, faster, and smarter checkout processes for both retailers and consumers. As we grow rapidly, we are seeking an experienced Senior Data Engineer to join our team and help shape the future of retail technology.

As a Senior Data Engineer, you will be an integral part of our expanding data team. Your primary responsibilities will involve building and optimizing the data infrastructure, pipelines, and tooling that support analytics, machine learning, and product development. This role requires a strong background in cloud-native data engineering, a passion for scalable systems, and the ability to work independently with minimal supervision.

Key Responsibilities:
- Design, build, and maintain scalable data pipelines and ETL/ELT workflows using tools such as Kestra or Prefect (a minimal Prefect sketch appears at the end of this listing).
- Architect and manage cloud-based data infrastructure on platforms such as Snowflake, MySQL, and LanceDB.
- Implement and uphold data quality, lineage, and governance best practices.
- Collaborate with analytics, BI, and product teams to establish data models for reporting, experimentation, and operational use cases.
- Optimize query performance, storage costs, and data reliability across platforms.
- Oversee data ingestion from internal and external systems through APIs, CDC, or streaming technologies such as Kafka and MQTT.
- Develop automated data validation, testing, and monitoring frameworks to ensure data integrity.
- Contribute to infrastructure-as-code and deployment processes using CI/CD pipelines and version control systems such as Git.
- Work independently and drive projects forward with minimal supervision.

Skills and Qualifications:
- 5+ years of experience as a data engineer or software engineer working on large-scale data systems.
- Proficiency in SQL, Python, and modern data transformation frameworks.
- Hands-on experience building and maintaining production-level ETL/ELT pipelines.
- Familiarity with cloud data warehouses such as Snowflake and Redpanda Cloud.
- Expertise in workflow orchestration tools such as Airflow, Kestra, or Prefect.
- Understanding of data modeling techniques such as dimensional modeling and normalization.
- Experience with cloud platforms such as AWS and Azure for data infrastructure and services.
- Ability to work independently and lead projects with minimal guidance.

Nice to Have:
- Experience with streaming data technologies, specifically Redpanda.
- Knowledge of data security, privacy, and compliance practices, including GDPR and HIPAA.
- Background in DevOps for data, including containerization and observability tools.
- Previous experience in a retail or e-commerce data environment.

Software Qualifications:
- Languages: Python, SQL, Rust
- Data Warehousing: Snowflake, MySQL
- ETL/ELT Orchestration Tools: Kestra, Prefect
- Version Control & CI/CD: Git, GitHub Actions
- Orchestration & Infrastructure: Docker, Kubernetes, Redpanda, Cloudflare
- Monitoring: OpenObserve, Keep

Why Join Us:
- Become part of a forward-thinking company shaping the future of retail technology.
- Collaborate with a dynamic and innovative team that values creativity.
- Contribute to cutting-edge projects and grow your skills.
- Competitive salary and benefits package.
- Enjoy a flexible work environment with opportunities for career growth.
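To make the orchestration responsibility above concrete, here is a minimal, hypothetical sketch of an ETL flow using Prefect, one of the tools the listing names. The flow name, data, and transformation logic are illustrative assumptions, not part of the listing.

```python
# Minimal Prefect flow sketch: extract -> transform -> load.
# The data, flow name, and load target are hypothetical placeholders.
from prefect import flow, task

@task(retries=2)
def extract() -> list[dict]:
    # A real pipeline would pull from an API, CDC stream, or object store.
    return [{"sku": "A1", "qty": 3}, {"sku": "B2", "qty": 0}]

@task
def transform(rows: list[dict]) -> list[dict]:
    # Example transformation: keep only rows with a positive quantity.
    return [r for r in rows if r["qty"] > 0]

@task
def load(rows: list[dict]) -> None:
    # A real implementation would write to Snowflake or MySQL here.
    print(f"loading {len(rows)} rows")

@flow(name="checkout-events-etl")
def etl() -> None:
    load(transform(extract()))

if __name__ == "__main__":
    etl()
```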
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Maharashtra
On-site
Job Description: As a Senior Azure Cloud Data Engineer based in Bangalore with a hybrid working model, you will play a pivotal role in processing and analyzing IoT data generated by our connected products. Your primary objective will be to derive meaningful insights from this data for both internal teams and external clients. Your core responsibilities will center on creating and maintaining scalable, high-performance analytical data solutions using Azure cloud technologies.

Key Skills: You bring a minimum of 3 years of hands-on experience with Azure analytics tools such as Data Factory, Synapse, and Event Hubs. Proficiency in SQL, Python, and PySpark is essential for this role. Your expertise should also extend to ETL/ELT processes, data streaming technologies such as Kafka and Event Hubs, and handling unstructured data. A sound understanding of data modeling, data governance, and real-time data processing is crucial. Familiarity with DevOps practices, CI/CD pipelines, and Agile methodologies is an added advantage.

Soft Skills: Strong analytical acumen and exceptional problem-solving skills are key strengths for this role, along with exemplary verbal and written communication. You should be able to work autonomously as well as collaboratively within a team, and be detail-oriented and quality-focused in delivering accurate, efficient results. Strong organizational and multitasking skills will help you manage varied responsibilities, and the capacity to adapt quickly in a dynamic, fast-paced environment is essential. A self-driven, proactive approach will serve you well in this role.
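As an illustration of the PySpark work this listing describes, here is a minimal, hypothetical sketch that aggregates IoT telemetry by device. The storage path, schema, and column names are assumptions for illustration only.

```python
# Minimal PySpark sketch: aggregate hypothetical IoT telemetry by device.
# The input path and column names are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("iot-telemetry-agg").getOrCreate()

# Read raw telemetry landed in a data lake as JSON (placeholder path).
telemetry = spark.read.json("abfss://raw@example.dfs.core.windows.net/telemetry/")

# Compute per-device average temperature and event counts.
summary = (
    telemetry
    .groupBy("device_id")
    .agg(
        F.avg("temperature").alias("avg_temperature"),
        F.count("*").alias("event_count"),
    )
)

summary.show()
```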
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
Delhi
On-site
We are looking for a highly motivated and enthusiastic Senior Data Scientist with 5-8 years of experience to join our dynamic team. The ideal candidate will have a strong background in AI/ML analytics and a passion for leveraging data to drive business insights and innovation.

As a Senior Data Scientist, your key responsibilities will include developing and implementing machine learning models and algorithms. You will work closely with project stakeholders to understand requirements and translate them into deliverables, and you will use statistical and machine learning techniques to analyze and interpret complex data sets. You are expected to stay current with the latest advancements in AI/ML technologies and methodologies and to collaborate with cross-functional teams to support various AI/ML initiatives.

To qualify for this role, you should have a Bachelor's degree in Computer Science, Data Science, or a related field, along with a strong understanding of machine learning, deep learning, and Generative AI concepts.

Preferred skills:
- Experience with machine learning techniques such as regression, classification, predictive modeling, clustering, and deep learning in Python (a minimal example follows this listing).
- Experience with cloud infrastructure for AI/ML on AWS (SageMaker, QuickSight, Athena, Glue).
- Expertise in building enterprise-grade, secure data ingestion pipelines (ETL/ELT) for unstructured data.
- Proficiency in Python, TypeScript, NodeJS, and ReactJS, and in frameworks such as pandas, NumPy, scikit-learn, OpenCV, and SciPy, along with Glue crawlers and ETL, and data visualization tools such as Matplotlib, Seaborn, and QuickSight.
- Knowledge of deep learning frameworks such as TensorFlow, Keras, and PyTorch.
- Experience with version control systems such as Git and CodeCommit.
- Strong knowledge of and experience in Generative AI/LLM-based development, including key LLM model APIs (e.g., AWS Bedrock, Azure OpenAI/OpenAI) and LLM frameworks (e.g., LangChain, LlamaIndex), plus effective text chunking techniques and text embeddings.

Good to have:
- Knowledge and experience in building knowledge graphs in production.
- An understanding of multi-agent systems and their applications in complex problem-solving scenarios.

Pentair is an Equal Opportunity Employer that values diversity and believes that a diverse workforce contributes different perspectives and creative ideas, enabling continuous improvement.
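A minimal sketch of the kind of classification work named above, using scikit-learn on a bundled toy dataset; the model choice and hyperparameters are illustrative assumptions.

```python
# Minimal scikit-learn classification sketch on a bundled toy dataset.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Fit a simple baseline model; hyperparameters here are placeholders.
model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)

preds = model.predict(X_test)
print(f"accuracy: {accuracy_score(y_test, preds):.3f}")
```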
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
Apply Digital is a global digital transformation partner for change agents, offering expertise in Business Transformation Strategy, Product Design & Development, Commerce, Platform Engineering, Data Intelligence, and Change Management. Our goal is to help clients modernize their organizations and deliver meaningful impact to their business and customers, whether they are initiating, accelerating, or optimizing their digital transformation journey. We specialize in implementing composable tech, building smart products, and using AI tools to drive value. With over 650 team members, we have transformed global companies including Kraft Heinz, NFL, Moderna, Lululemon, Atlassian, Sony, American Express, and Harvard Business School. Founded in 2016 in Vancouver, Canada, Apply Digital has expanded to nine cities across North America, South America, the UK, and Europe, and we are excited to launch a new office in Delhi NCR, India.

At Apply Digital, we embrace the "One Team" approach, operating within a "pod" structure that brings together senior leadership, subject matter experts, and cross-functional skill sets. Our teams work within a common tech and delivery framework supported by well-organized scrum and sprint cadences, with regular retrospectives keeping everyone aligned on desired outcomes. We envision Apply Digital as a safe, empowered, respectful, and fun community wherever we operate globally, and our team strives to embody our SHAPE values - smart, humble, active, positive, and excellent - creating a space for connection, growth, and mutual support.

Apply Digital is a hybrid-friendly organization with remote options available. The preferred candidate should be based in or within commutable distance of Delhi/NCR, India, and work hours that overlap with the Eastern Standard Time (EST) zone.

The client is seeking an experienced Data Engineer to design, build, and maintain scalable data pipelines and architectures that support analytical and operational workloads. Responsibilities include developing and optimizing ETL/ELT pipelines, integrating data pipelines into cloud-native applications, managing cloud data warehouses, implementing data governance and security best practices, collaborating with analytics teams, maintaining data documentation, monitoring and optimizing data pipelines, and staying current with emerging data engineering technologies.

**Requirements:**
- Strong proficiency in English (written and verbal communication).
- Experience working with remote teams across different time zones.
- 5+ years of data engineering experience with expertise in building scalable data pipelines.
- Proficiency in SQL and Python for data modeling and processing.
- Experience with Google Cloud Platform (GCP) and tools such as BigQuery, Cloud Storage, and Pub/Sub (a minimal BigQuery sketch follows this list).
- Knowledge of ETL/ELT frameworks, workflow orchestration tools, and data privacy and security best practices.
- Strong problem-solving skills and excellent communication abilities.

**Nice to have:**
- Experience with real-time data streaming solutions, machine learning workflows, BI tools, Terraform, and data integrations.
- Knowledge of Infrastructure as Code (IaC) in data environments.
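To illustrate the GCP tooling named in the requirements, here is a minimal, hypothetical sketch that runs an aggregation query with the google-cloud-bigquery client. The project, dataset, and table names are placeholders, and authentication is assumed to come from application-default credentials.

```python
# Minimal BigQuery sketch: run an aggregation query and print the results.
# The project, dataset, and table names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()

query = """
    SELECT event_type, COUNT(*) AS events
    FROM `example-project.analytics.events`
    GROUP BY event_type
    ORDER BY events DESC
    LIMIT 10
"""

# Execute the query and iterate over the result rows.
for row in client.query(query).result():
    print(f"{row.event_type}: {row.events}")
```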
Apply Digital offers comprehensive benefits, including private healthcare coverage, Provident Fund contributions, a gratuity bonus, a flexible vacation policy, engaging projects with global impact, an inclusive and safe work environment, and learning opportunities. Apply Digital is dedicated to celebrating differences, promoting equal opportunity, and creating an inclusive culture where individual uniqueness is valued and recognized.
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
Tirupati, Andhra Pradesh
On-site
You are an experienced Snowflake Data Engineer with expertise in Python and SQL, holding a Snowflake certification and at least 4 years of hands-on experience with Snowflake. Your primary responsibility will be to design, develop, and maintain robust data pipelines in a cloud environment, ensuring efficient data integration, transformation, and storage within the Snowflake data platform.

Key responsibilities:
- Design and develop data pipelines that handle large volumes of structured and unstructured data using Snowflake and SQL.
- Develop and maintain efficient ETL/ELT processes to integrate data from various sources into Snowflake, ensuring data quality and availability.
- Write Python scripts to automate data workflows, implement data transformation logic, and integrate with external APIs for data ingestion (a minimal connector sketch follows this listing).
- Create and optimize complex SQL queries for data extraction, transformation, and reporting.
- Develop and maintain data models to support business intelligence and analytics, following Snowflake best practices.
- Ensure proper data governance, security, and compliance within the Snowflake environment by implementing access controls, encryption, and monitoring.
- Work closely with data scientists, analysts, and business stakeholders to understand data requirements and deliver scalable solutions.

Required qualifications:
- An active Snowflake certification and 4+ years of hands-on experience with Snowflake.
- Strong experience with Python for data processing, automation, and API integration.
- Expertise in writing and optimizing complex SQL queries, plus experience with data warehousing and database management.
- Hands-on experience designing and implementing ETL/ELT pipelines with Snowflake.
- Familiarity with cloud environments such as AWS, GCP, or Azure, especially for data storage and processing.
- Experience implementing data governance frameworks and security protocols in a cloud data platform.
- A Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field.

Preferred skills:
- Experience with CI/CD pipelines for data projects.
- Knowledge of Apache Airflow or other orchestration tools.
- Familiarity with big data technologies and distributed systems.

Soft skills:
- Strong problem-solving and analytical skills.
- Excellent communication skills for interacting with both technical and non-technical stakeholders.
- Ability to work in a fast-paced, agile environment.
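As a minimal illustration of the Python-plus-Snowflake work this listing describes, here is a hypothetical sketch using the snowflake-connector-python package. The account, credentials, and connection parameters are placeholders; a real pipeline would source them from a secrets manager rather than hard-coding them.

```python
# Minimal Snowflake connector sketch: run a query from Python.
# Account, credentials, and object names are hypothetical placeholders;
# real code would load credentials from a secrets manager, not literals.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example-account",
    user="EXAMPLE_USER",
    password="example-password",
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)

try:
    cur = conn.cursor()
    cur.execute("SELECT CURRENT_VERSION()")
    print(cur.fetchone()[0])
finally:
    conn.close()
```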
Posted 1 month ago