0.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Software Engineer (Backend) (SDE-1)

DViO is one of the largest independent, highly awarded, digital-first marketing companies, with a team of 175+ people operating across India, the Middle East, and South East Asia. We are a full-service digital marketing agency with a focus on ROI-driven marketing.

We are looking for a Software Engineer (Backend) to join our team. The ideal candidate will have a strong background in software development and experience with backend technologies, will be passionate about backend system design, and will be looking to grow in this field.

Responsibilities
You will work with a team responsible for developing services for various applications, such as marketing automation, campaign optimization, recommendation, and analytical systems. The candidate will work on developing backend services, including REST APIs, data processing pipelines, and database management.
- Develop backend services for various business use cases
- Write clean, maintainable code
- Collaborate with other team members
- Improve code based on feedback
- Work on bug fixes, refactoring, and performance improvements
- Track technology changes and keep our applications up to date

Requirements
Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field
- 0-1 year of experience in software development

Must-have skills:
- Proficiency in PHP, Python, or Node.js
- Experience with a backend MVC framework such as Laravel, Rails, Express, or Django
- Experience with a database such as MySQL, PostgreSQL, or MongoDB
- Experience with REST APIs, Docker, Bash, and Git

Good-to-have skills:
- Experience with WebSockets, Socket.io, etc.
- Experience with search technologies such as Meilisearch, Typesense, or Elasticsearch
- Experience with caching technologies such as Redis or Memcached
- Experience with cloud platforms such as AWS, GCP, or Azure
- Experience with monolithic architecture
- Experience with data warehouses or data lakes such as Snowflake, Amazon Redshift, Google BigQuery, or Databricks

Benefits
DViO offers an innovative and challenging work environment with the opportunity to work on cutting-edge technologies. Join us and be part of a dynamic team that is passionate about software development and builds applications that will shape the future of digital marketing.
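The must-have skills above center on building REST APIs in a backend stack. As a hedged illustration only, and not DViO's actual codebase, here is a minimal sketch of a JSON GET endpoint using just the Python standard library; the `campaigns` resource, field names, and routing are invented for the example (a real service would use one of the listed frameworks such as Django or Express):

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# Hypothetical in-memory "campaigns" resource for a marketing-automation backend.
CAMPAIGNS = {1: {"id": 1, "name": "spring_sale", "status": "active"}}

class CampaignHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Route: GET /campaigns/<id> -> JSON body, or 404 otherwise.
        parts = self.path.strip("/").split("/")
        if len(parts) == 2 and parts[0] == "campaigns" and parts[1].isdigit():
            campaign = CAMPAIGNS.get(int(parts[1]))
            if campaign is not None:
                body = json.dumps(campaign).encode()
                self.send_response(200)
                self.send_header("Content-Type", "application/json")
                self.send_header("Content-Length", str(len(body)))
                self.end_headers()
                self.wfile.write(body)
                return
        self.send_response(404)
        self.end_headers()

    def log_message(self, *args):
        # Silence per-request logging for the demo.
        pass

# Bind to port 0 so the OS picks a free port, then serve in a daemon thread.
server = HTTPServer(("127.0.0.1", 0), CampaignHandler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

with urlopen(f"http://127.0.0.1:{port}/campaigns/1") as resp:
    payload = json.loads(resp.read())
server.shutdown()
print(payload)
```

A framework like Django or Laravel replaces the hand-rolled routing and serialization here with declarative URL patterns and model layers, which is why the posting asks for MVC framework experience rather than raw socket handling.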
Posted 1 week ago
10.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Position Overview:
ShyftLabs is seeking an experienced Databricks Architect to lead the design, development, and optimization of big data solutions using the Databricks Unified Analytics Platform. This role requires deep expertise in Apache Spark, SQL, Python, and cloud platforms (AWS/Azure/GCP). The ideal candidate will collaborate with cross-functional teams to architect scalable, high-performance data platforms and drive data-driven innovation.

ShyftLabs is a growing data product company that was founded in early 2020 and works primarily with Fortune 500 companies. We deliver digital solutions built to accelerate business growth across various industries by focusing on creating value through innovation.

Job Responsibilities
- Architect, design, and optimize big data and AI/ML solutions on the Databricks platform.
- Develop and implement highly scalable ETL pipelines for processing large datasets.
- Lead the adoption of Apache Spark for distributed data processing and real-time analytics.
- Define and enforce data governance, security policies, and compliance standards.
- Optimize data lakehouse architectures for performance, scalability, and cost efficiency.
- Collaborate with data scientists, analysts, and engineers to enable AI/ML-driven insights.
- Oversee and troubleshoot Databricks clusters, jobs, and performance bottlenecks.
- Automate data workflows using CI/CD pipelines and infrastructure-as-code practices.
- Ensure data integrity, quality, and reliability across all data processes.

Basic Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
- 10+ years of hands-on experience in data engineering, with at least 5 years working as a Databricks architect with Apache Spark.
- Proficiency in SQL, Python, or Scala for data processing and analytics.
- Extensive experience with cloud platforms (AWS, Azure, or GCP) for data engineering.
- Strong knowledge of ETL frameworks, data lakes, and Delta Lake architecture.
- Hands-on experience with CI/CD tools and DevOps best practices.
- Familiarity with data security, compliance, and governance best practices.
- Strong problem-solving and analytical skills in a fast-paced environment.

Preferred Qualifications:
- Databricks certifications (e.g., Databricks Certified Data Engineer, Spark Developer).
- Hands-on experience with MLflow, Feature Store, or Databricks SQL.
- Exposure to Kubernetes, Docker, and Terraform.
- Experience with streaming data architectures (Kafka, Kinesis, etc.).
- Strong understanding of business intelligence and reporting tools (Power BI, Tableau, Looker).
- Prior experience working with retail, e-commerce, or ad-tech data platforms.

We are proud to offer a competitive salary alongside a strong insurance package. We pride ourselves on the growth of our employees, offering extensive learning and development resources.
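The ETL pipelines described above would normally be written against Spark DataFrames on Databricks, but the extract-transform-load shape itself is framework-agnostic. As a hedged, standard-library-only sketch (the sales records, column names, and aggregation are invented; a production Databricks job would read from cloud storage or a Delta table and use PySpark):

```python
import csv
import io
from collections import defaultdict

# Extract stage stand-in: in a real pipeline this would come from
# cloud storage or a Delta table rather than an inline string.
RAW = """region,amount,status
north,120.50,complete
south,80.00,complete
north,15.25,cancelled
south,40.00,complete
"""

def extract(raw_csv):
    """Parse raw CSV into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows):
    """Filter out cancelled orders, then aggregate revenue per region."""
    totals = defaultdict(float)
    for row in rows:
        if row["status"] == "complete":
            totals[row["region"]] += float(row["amount"])
    return dict(totals)

def load(totals):
    """Load stage stand-in: a real job would write a table or file."""
    return {region: round(total, 2) for region, total in sorted(totals.items())}

result = load(transform(extract(RAW)))
print(result)  # {'north': 120.5, 'south': 120.0}
```

On Databricks the same three stages map onto `spark.read`, DataFrame transformations, and a Delta `write`, with the scheduler distributing the transform across the cluster.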
Posted 1 week ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description

Responsibilities:
● Work with multiple Agile teams delivering data and analytics solutions.
● Serve as Scrum Master for teams supporting a global manufacturing enterprise.
● Collaborate with Product Owners to manage and refine backlogs aligned to business needs.
● Facilitate Agile ceremonies: Sprint Planning, Stand-ups, Reviews, Retrospectives, etc.
● Drive data-focused sprint delivery: ingestion, transformation, integration, and reporting.
● Identify and resolve blockers; champion continuous improvement and delivery velocity.
● Partner with cross-functional stakeholders: data engineers, analysts, and architects.
● Promote Agile practices across platforms like SAP ECC, IBP, HANA, BOBJ, Databricks, Tableau.
● Track Agile metrics (velocity, burndown, throughput) to improve team performance.
● Support capacity planning, sprint forecasting, and risk identification.
● Foster a high-performance culture built on adaptability, collaboration, and customer focus.
● Orient the team toward outcome-based progress: “building outcomes” vs. “completing tasks”.
● Help break down efforts into small, incremental work units for better delivery flow.
● Ensure story clarity with detailed descriptions and acceptance criteria.
● Lead daily stand-ups with a focus on “completion” over “in progress”.

Must-Have Skills:
● 3–5 years of experience as a Scrum Master in Data & Analytics environments.
● Experience working with SAP, HANA, and related analytics tools/platforms.
● Strong knowledge of Agile principles beyond just the ceremonies.
● Ability to guide teams in behavior and mindset change, not just process compliance.
● Skilled in tracking sprint metrics and helping set achievable sprint goals.
● Strong organizational, interpersonal, analytical, and communication skills.
● Comfortable working with global teams and flexible across time zones.
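Two of the metrics named above, velocity and burndown, reduce to simple arithmetic over completed story points. A hedged sketch with invented sprint numbers, just to make the definitions concrete:

```python
# Story points completed in the last four sprints (hypothetical data).
completed_per_sprint = [21, 18, 24, 19]

# Velocity: average completed points per sprint, used for capacity forecasting.
velocity = sum(completed_per_sprint) / len(completed_per_sprint)

# Burndown: points remaining in the current sprint after each day,
# given the cumulative points marked done day by day.
committed = 22
done_by_day = [0, 3, 8, 8, 13, 17, 20, 22]
burndown = [committed - done for done in done_by_day]

print(velocity)  # 20.5
print(burndown)  # [22, 19, 14, 14, 9, 5, 2, 0]
```

The flat segment in the burndown (day 2 to day 3) is the kind of stall a Scrum Master would probe for blockers in stand-up.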
Posted 1 week ago
3.0 - 6.0 years
0 Lacs
Navi Mumbai, Maharashtra, India
On-site
Job Title: Data Scientist
Location: Navi Mumbai
Experience: 3-6 Years
Duration: Full-time

Job Summary:
We are looking for a highly skilled Data Scientist with deep expertise in time series forecasting, particularly in demand forecasting and customer lifecycle analytics (CLV). The ideal candidate will be proficient in Python or PySpark, have hands-on experience with tools like Prophet and ARIMA, and be comfortable working in Databricks environments. Familiarity with classic ML models and optimization techniques is a plus.

Key Responsibilities
• Develop, deploy, and maintain time series forecasting models (Prophet, ARIMA, etc.) for demand forecasting and customer behavior modeling.
• Design and implement Customer Lifetime Value (CLV) models to drive customer retention and engagement strategies.
• Process and analyze large datasets using PySpark or Python (Pandas).
• Partner with cross-functional teams to identify business needs and translate them into data science solutions.
• Leverage classic ML techniques (classification, regression) and boosting algorithms (e.g., XGBoost, LightGBM) to support broader analytics use cases.
• Use Databricks for collaborative development, data pipelines, and model orchestration.
• Apply optimization techniques where relevant to improve forecast accuracy and business decision-making.
• Present actionable insights and communicate model results effectively to technical and non-technical stakeholders.

Required Qualifications
• Strong experience in time series forecasting, with hands-on knowledge of Prophet, ARIMA, or equivalent (mandatory).
• Proven track record in demand forecasting (highly preferred).
• Experience in modeling Customer Lifetime Value (CLV) or similar customer analytics use cases (highly preferred).
• Proficiency in Python (Pandas) or PySpark (mandatory).
• Experience with Databricks (mandatory).
• Solid foundation in statistics, predictive modeling, and machine learning.
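Prophet and ARIMA are the named tools, but demand-forecasting work usually starts by beating a baseline. As a hedged illustration (the daily demand numbers are invented, and this is a naive baseline rather than a Prophet or ARIMA model), the seasonal-naive method forecasts each future step as the value observed one full season earlier:

```python
def seasonal_naive_forecast(history, season_length, horizon):
    """Forecast each future step as the value one season earlier.

    A standard baseline that Prophet/ARIMA models are judged against.
    """
    series = list(history)
    forecast = []
    for _ in range(horizon):
        forecast.append(series[-season_length])  # look back one season
        series.append(forecast[-1])              # extend so long horizons work
    return forecast

# Two weeks of hypothetical daily demand with a weekly (7-day) cycle.
demand = [100, 90, 95, 110, 130, 180, 160,
          102, 88, 97, 115, 128, 185, 158]

print(seasonal_naive_forecast(demand, season_length=7, horizon=3))
# [102, 88, 97]
```

If a trained Prophet model cannot outperform this one-liner on held-out weeks, the extra model complexity is not paying for itself, which is why baselines like this belong in every forecasting evaluation.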
Posted 1 week ago
6.0 years
0 Lacs
India
Remote
Job Title: Senior Data Engineer
Experience: 6+ Years
Location: Remote
Employment Type: Full Time

Job Summary:
We are seeking a highly skilled and experienced Senior Data Engineer to join our dynamic data engineering team. The ideal candidate will have deep expertise in C#, Azure Data Factory (ADF), Databricks, SQL Server, and Python, along with a strong understanding of modern CI/CD practices. You will be responsible for designing, developing, and maintaining scalable and efficient data pipelines and solutions to support analytics, reporting, and operational systems.

Key Responsibilities:
- Design, develop, and optimize complex data pipelines using Azure Data Factory, Databricks, and SQL Server.
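Orchestrators such as Azure Data Factory run pipeline activities in dependency order. As a hedged sketch of that scheduling idea only (this is not ADF's API, and the activity names are invented), a topological sort over a small dependency graph shows how a valid execution order falls out:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: each activity maps to the activities it depends on.
pipeline = {
    "ingest_orders": set(),
    "ingest_customers": set(),
    "clean": {"ingest_orders", "ingest_customers"},
    "aggregate": {"clean"},
    "publish_report": {"aggregate"},
}

# static_order() yields the activities so that every dependency
# appears before the activity that needs it.
order = list(TopologicalSorter(pipeline).static_order())
print(order)
```

Real orchestrators extend this with parallelism (the two ingest activities here have no mutual dependency and could run concurrently), retries, and triggers, but dependency-ordered execution is the core contract.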
Posted 1 week ago
7.0 - 10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
The Advisor will work closely in a consultative capacity with senior CoE management and global sales leadership on multiple projects related to customer and pricing analytics. The Advisor is expected to provide industry-recognized thought leadership at the enterprise level related to data analysis and visualization; create the logic for and implement strategies; provide requirements to data analysts and technology teams on data attributes, models, and platform requirements; and communicate with global stakeholders to ensure we deliver the best possible customer experience.

Provides regular expert consultative advice to senior leadership and champions the design and development of innovative solutions. Should possess and demonstrate understanding of core business and commercial concepts, including financial metrics, market dynamics, and competitive landscapes. Communicates results to a broad range of audiences. Effectively uses current and emerging technologies to evaluate trends and develop actionable insights and recommendations for management, via understanding of the business model and the information available for analysis.

Grade: 11

Please note that the job posting will close at 12am on the posting close date, so please submit your application prior to the close date.

What Your Main Responsibilities Are
The key responsibilities of this role are:
- Lead and guide teams that leverage the most advanced descriptive and diagnostic techniques and/or other approaches in the analysis of complex business situations.
- Work cross-functionally with teams to analyze usage and uncover key, actionable insights.
- Champion, develop, and implement innovative solutions from initial concept to fully tested production, and communicate results to a broad range of audiences.
- Make expert use of, investigate, and implement the most current and emerging technologies to evaluate trends and develop actionable insights and recommendations for management that will enable process transformation.
- Design and measure controlled experiments to determine the potential impact of new approaches.
- Help with various data analysis and modelling projects.
- Place actionable data points and trends in context for leadership to understand actual performance and uncover opportunities.
- Take ownership of the end-to-end system, from problem statement to solution delivery, and leverage other teams if required.
- Mentor less senior staff.
- Lead cross-functional projects and programs, formally preparing and presenting to management.
- Routinely work on multiple highly complex assignments concurrently.
- Provide expert consultation to senior leadership routinely and present insights with strong storytelling skills.

What We Are Looking For
Key skills needed for this role:

Skills
- Strong financial acumen, particularly of pricing models/systems, revenue and cost structures, contribution and operating margins, and P&L views.
- Excellent stakeholder management skills, particularly with team members across different regions, to achieve common goals.
- Strong communication skills to communicate with people across all levels, including senior management, and the ability to tell logical stories by crafting solid, visually appealing presentations.
- Excellent project management skills.
- Strong analytical skills to deliver accurate results and actionable recommendations.
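One responsibility listed above is designing and measuring controlled experiments. As a hedged sketch (the conversion counts are invented), a two-proportion z-test is one common way to judge whether a new approach beat the control in such an experiment:

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """z statistic for the difference between two conversion rates."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical pricing experiment: control vs. new discount logic,
# 10,000 customers per arm.
z = two_proportion_z(success_a=200, n_a=10_000, success_b=260, n_b=10_000)
print(round(z, 2))  # 2.83; |z| > 1.96 suggests significance at the 5% level
```

In practice the experiment design (sample size, randomization unit, stopping rule) matters as much as the test itself, which is why the posting frames this as designing experiments, not just analyzing them.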
Key behaviors & mindsets:
- Consultative mindset
- Innovation mindset
- Bias for action with a focus on transformation of legacy processes
- Sense of ownership

Qualification
Master’s degree in business, information systems, computer science, or a quantitative discipline from tier 1/2 institutes.

Experience requirement: 7-10 years of relevant analytics/consulting/leadership experience.

Tools/platforms: Oracle, SQL, Teradata, R, Python, Power BI, Ab Initio, SAS, Azure Databricks.

FedEx was built on a philosophy that puts people first, one we take seriously. We are an equal opportunity/affirmative action employer, and we are committed to a diverse, equitable, and inclusive workforce in which we enforce fair treatment and provide growth opportunities for everyone. All qualified applicants will receive consideration for employment regardless of age, race, color, national origin, genetics, religion, gender, marital status, pregnancy (including childbirth or a related medical condition), physical or mental disability, or any other characteristic protected by applicable laws, regulations, and ordinances.

Our Company
FedEx is one of the world's largest express transportation companies and has consistently been selected as one of the top 10 World’s Most Admired Companies by "Fortune" magazine. Every day FedEx delivers for its customers with transportation and business solutions, serving more than 220 countries and territories around the globe. We can serve this global network due to our outstanding team of FedEx team members, who are tasked with making every FedEx experience outstanding.

Our Philosophy
The People-Service-Profit philosophy (P-S-P) describes the principles that govern every FedEx decision, policy, or activity. FedEx takes care of our people; they, in turn, deliver the impeccable service demanded by our customers, who reward us with the profitability necessary to secure our future.
The essential element in making the People-Service-Profit philosophy such a positive force for the company is where we close the circle, returning these profits back into the business and investing back in our people. Our success in the industry is attributed to our people. Through our P-S-P philosophy, we have a work environment that encourages team members to be innovative in delivering the highest possible quality of service to our customers. We care for their well-being and value their contributions to the company.

Our Culture
Our culture is important for many reasons, and we intentionally bring it to life through our behaviors, actions, and activities in every part of the world. The FedEx culture and values have been a cornerstone of our success and growth since we began in the early 1970s. While other companies can copy our systems, infrastructure, and processes, our culture makes us unique and is often a differentiating factor as we compete and grow in today’s global marketplace.
Posted 1 week ago
5.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
We’re Hiring: MLOps Engineer (Azure)
Contact: harshita.panchariya@tecblic.com
Location: Ahmedabad, Gujarat
Experience: 3–5 Years
Employment Type: Full-Time
An immediate joiner will be preferred.

Job Summary:
We are seeking a skilled and proactive MLOps/DataOps Engineer with strong experience in the Azure ecosystem to join our team. You will be responsible for streamlining and automating machine learning and data pipelines, supporting scalable deployment of AI/ML models, and ensuring robust monitoring, governance, and CI/CD practices across the data and ML lifecycle.

Key Responsibilities

MLOps:
- Design and implement CI/CD pipelines for machine learning workflows using Azure DevOps, GitHub Actions, or Jenkins.
- Automate model training, validation, deployment, and monitoring using tools such as Azure ML, MLflow, or Kubeflow.
- Manage model versioning, performance tracking, and rollback strategies.
- Integrate machine learning models with APIs or web services using Azure Functions, Azure Kubernetes Service (AKS), or Azure App Services.

DataOps:
- Design, build, and maintain scalable data ingestion, transformation, and orchestration pipelines using Azure Data Factory, Synapse Pipelines, or Apache Airflow.
- Ensure data quality, lineage, and governance using Azure Purview or other metadata management tools.
- Monitor and optimize data workflows for performance and cost efficiency.
- Support batch and real-time data processing using Azure Stream Analytics, Event Hubs, Databricks, or Kafka.

DevOps & Infrastructure:
- Provision and manage infrastructure using Infrastructure-as-Code tools such as Terraform, ARM Templates, or Bicep.
- Set up and manage compute environments (VMs, AKS, AML Compute), storage (Blob, Data Lake Gen2), and networking in Azure.
- Implement observability using Azure Monitor, Log Analytics, and Application Insights.

Skills:
- Strong hands-on experience with Azure Machine Learning, Azure Data Factory, Azure DevOps, and Azure Storage solutions.
- Proficiency in Python, Bash, and scripting for automation.
- Experience with Docker, Kubernetes, and containerized deployments in Azure.
- Good understanding of CI/CD principles, testing strategies, and ML lifecycle management.
- Familiarity with monitoring, logging, and alerting in cloud environments.
- Knowledge of data modeling, data warehousing, and SQL.

Preferred Qualifications:
- Azure certifications (e.g., Azure Data Engineer Associate, Azure AI Engineer Associate, or Azure DevOps Engineer Expert).
- Experience with Databricks, Delta Lake, or Apache Spark on Azure.
- Exposure to security best practices in ML and data environments (e.g., identity management, network security).

Soft Skills:
- Strong problem-solving and communication skills.
- Ability to work independently and collaboratively with data scientists, ML engineers, and platform teams.
- Passion for automation, optimization, and driving operational excellence.
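Among the MLOps responsibilities above is managing model versioning and rollback. A hedged, toy sketch of those registry semantics (this is not Azure ML's or MLflow's actual API; the class and method names are invented to show the idea):

```python
class ModelRegistry:
    """Minimal in-memory stand-in for a model registry with rollback."""

    def __init__(self):
        self.versions = []      # (version, metric) in registration order
        self.production = None  # index into self.versions, or None

    def register(self, version, metric):
        self.versions.append((version, metric))

    def promote(self, version):
        """Mark a registered version as the production model."""
        for i, (v, _) in enumerate(self.versions):
            if v == version:
                self.production = i
                return
        raise KeyError(version)

    def rollback(self):
        """Fall back to the previously registered version."""
        if self.production is None or self.production == 0:
            raise RuntimeError("nothing to roll back to")
        self.production -= 1

    @property
    def live(self):
        return self.versions[self.production][0]

registry = ModelRegistry()
registry.register("v1", metric=0.91)
registry.register("v2", metric=0.87)  # regressed once in production
registry.promote("v2")
registry.rollback()
print(registry.live)  # v1
```

Real registries (MLflow Model Registry, Azure ML) add stages, audit trails, and storage of the model artifacts themselves, but the promote/rollback state machine is the part CI/CD pipelines automate.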
Posted 1 week ago
7.0 - 10.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
The Advisor will work closely in a consultative capacity with senior CoE management and global sales leadership on multiple projects related to customer and pricing analytics. The Advisor is expected to provide industry-recognized thought leadership at the enterprise level related to data analysis and visualization; create the logic for and implement strategies; provide requirements to data analysts and technology teams on data attributes, models, and platform requirements; and communicate with global stakeholders to ensure we deliver the best possible customer experience.

Provides regular expert consultative advice to senior leadership and champions the design and development of innovative solutions. Should possess and demonstrate understanding of core business and commercial concepts, including financial metrics, market dynamics, and competitive landscapes. Communicates results to a broad range of audiences. Effectively uses current and emerging technologies to evaluate trends and develop actionable insights and recommendations for management, via understanding of the business model and the information available for analysis.

Grade: 11

Please note that the job posting will close at 12am on the posting close date, so please submit your application prior to the close date.

What Your Main Responsibilities Are
The key responsibilities of this role are:
- Lead and guide teams that leverage the most advanced descriptive and diagnostic techniques and/or other approaches in the analysis of complex business situations.
- Work cross-functionally with teams to analyze usage and uncover key, actionable insights.
- Champion, develop, and implement innovative solutions from initial concept to fully tested production, and communicate results to a broad range of audiences.
- Make expert use of, investigate, and implement the most current and emerging technologies to evaluate trends and develop actionable insights and recommendations for management that will enable process transformation.
- Design and measure controlled experiments to determine the potential impact of new approaches.
- Help with various data analysis and modelling projects.
- Place actionable data points and trends in context for leadership to understand actual performance and uncover opportunities.
- Take ownership of the end-to-end system, from problem statement to solution delivery, and leverage other teams if required.
- Mentor less senior staff.
- Lead cross-functional projects and programs, formally preparing and presenting to management.
- Routinely work on multiple highly complex assignments concurrently.
- Provide expert consultation to senior leadership routinely and present insights with strong storytelling skills.

What We Are Looking For
Key skills needed for this role:

Skills
- Strong financial acumen, particularly of pricing models/systems, revenue and cost structures, contribution and operating margins, and P&L views.
- Excellent stakeholder management skills, particularly with team members across different regions, to achieve common goals.
- Strong communication skills to communicate with people across all levels, including senior management, and the ability to tell logical stories by crafting solid, visually appealing presentations.
- Excellent project management skills.
- Strong analytical skills to deliver accurate results and actionable recommendations.
Key behaviors & mindsets:
- Consultative mindset
- Innovation mindset
- Bias for action with a focus on transformation of legacy processes
- Sense of ownership

Qualification
Master’s degree in business, information systems, computer science, or a quantitative discipline from tier 1/2 institutes.

Experience requirement: 7-10 years of relevant analytics/consulting/leadership experience.

Tools/platforms: Oracle, SQL, Teradata, R, Python, Power BI, Ab Initio, SAS, Azure Databricks.

FedEx was built on a philosophy that puts people first, one we take seriously. We are an equal opportunity/affirmative action employer, and we are committed to a diverse, equitable, and inclusive workforce in which we enforce fair treatment and provide growth opportunities for everyone. All qualified applicants will receive consideration for employment regardless of age, race, color, national origin, genetics, religion, gender, marital status, pregnancy (including childbirth or a related medical condition), physical or mental disability, or any other characteristic protected by applicable laws, regulations, and ordinances.

Our Company
FedEx is one of the world's largest express transportation companies and has consistently been selected as one of the top 10 World’s Most Admired Companies by "Fortune" magazine. Every day FedEx delivers for its customers with transportation and business solutions, serving more than 220 countries and territories around the globe. We can serve this global network due to our outstanding team of FedEx team members, who are tasked with making every FedEx experience outstanding.

Our Philosophy
The People-Service-Profit philosophy (P-S-P) describes the principles that govern every FedEx decision, policy, or activity. FedEx takes care of our people; they, in turn, deliver the impeccable service demanded by our customers, who reward us with the profitability necessary to secure our future.
The essential element in making the People-Service-Profit philosophy such a positive force for the company is where we close the circle, returning these profits back into the business and investing back in our people. Our success in the industry is attributed to our people. Through our P-S-P philosophy, we have a work environment that encourages team members to be innovative in delivering the highest possible quality of service to our customers. We care for their well-being and value their contributions to the company.

Our Culture
Our culture is important for many reasons, and we intentionally bring it to life through our behaviors, actions, and activities in every part of the world. The FedEx culture and values have been a cornerstone of our success and growth since we began in the early 1970s. While other companies can copy our systems, infrastructure, and processes, our culture makes us unique and is often a differentiating factor as we compete and grow in today’s global marketplace.
Posted 1 week ago
5.0 - 8.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Skill required: Tech for Operations - Automation Anywhere
Designation: App Automation Eng Senior Analyst
Qualifications: Any Graduation, BE
Years of Experience: 5 - 8 Years

About Accenture
Accenture is a global professional services company with leading capabilities in digital, cloud, and security. Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world’s largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. We embrace the power of change to create value and shared success for our clients, people, shareholders, partners, and communities. Visit us at www.accenture.com.

What would you do?
The RPA Lead Developer will be responsible for the design and development of end-to-end RPA automation leveraging A360 tools and technologies. You should anticipate, identify, track, and resolve technical issues and risks affecting delivery, and understand the Automation Anywhere RPA platform, its features, capabilities, and best practices. You will need to be proficient in designing and implementing automation workflows that optimize business processes.

What are we looking for?
- Minimum 5-8 years of strong software design and development experience.
- Minimum 5-6 years of programming experience in Automation Anywhere A360, Document Automation, Co-pilot, and Python.
- Effective Gen AI prompt creation for data extraction using Gen AI OCR.
- Experience with APIs, data integration, and automation best practices.
- Experience in VBA, VB, and Python script programming.
- Good knowledge of Gen AI and machine learning.
- Strong hands-on knowledge of core .NET concepts and OOP programming.
- Understands OO concepts and consistently applies them in client engagements.
- Hands-on experience with SQL and T-SQL queries, including creating complex stored procedures.
- Exceptional presentation, written, and verbal communication skills (English).
- Good understanding of workflow-based logic and hands-on experience using process templates and VBO design and build.
- Understanding of process analysis and pipeline build for the automation process.
- Automation Anywhere A360 Master/Advanced certification.
- Strong programming knowledge of HTML and JavaScript/VB scripts.
- Experience with Agile development methodology.
- Exposure to SAP automation is preferred.
- Exposure to A360 Control Room features.
- Azure Machine Learning, Azure Databricks, and other Azure AI services.
- Exposure to GDPR compliance is preferred.
- Agile development methodologies are an added advantage.

Roles and Responsibilities:
- Lead the team to develop automation bots and processes using the A360 platform.
- Utilize A360's advanced features (AARI, WLM, API consumption, Document Automation, Co-pilot) to automate complex tasks, streamline processes, and optimize efficiency.
- Integrate A360 with various APIs, databases, and third-party tools to ensure seamless data flow and interaction between systems.
- Identify and build common components to be used across projects.
- Collaborate with cross-functional teams, including business analysts and process architects, to deliver holistic automation solutions that cater to various stakeholder needs.
- Strong SQL database management and troubleshooting skills.
- Serve as a technical expert on development projects.
- Review code for compliance and reuse, and ensure code complies with RPA architectural industry standards.
- Lead the problem identification/error resolution process, including tracking, repairing, and reporting defects.
- Create and maintain documentation to support role responsibilities for training, cross-training, and disaster recovery.
- Monitor and maintain license utilization and subscriptions.
- Maintain and monitor RPA environments (Dev/Test/Prod).
- Review and ensure automation runbooks are complete and maintained.
- Design, develop, document, test, and debug new robotic process automation (RPA) applications for internal use.

Qualifications: Any Graduation, BE
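The error-resolution responsibilities above (tracking, repairing, and reporting defects in bot runs) usually lean on automatic retries for transient failures. A hedged, generic sketch of that pattern (the flaky step is simulated; A360 itself configures retries declaratively in the Control Room rather than in Python):

```python
import time

def retry(task, attempts=3, base_delay=0.01):
    """Run task(), retrying with exponential backoff on failure."""
    for attempt in range(attempts):
        try:
            return task()
        except Exception:
            if attempt == attempts - 1:
                raise  # exhausted retries: surface the defect for reporting
            time.sleep(base_delay * (2 ** attempt))

# Simulated flaky automation step: fails twice, then succeeds.
calls = {"count": 0}

def flaky_step():
    calls["count"] += 1
    if calls["count"] < 3:
        raise ConnectionError("transient failure")
    return "processed"

result = retry(flaky_step)
print(result, calls["count"])  # processed 3
```

The important design split is the same in any RPA stack: transient errors are retried silently with backoff, while persistent failures escalate into the defect-tracking process.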
Posted 1 week ago
8.0 - 10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
We are seeking a dynamic and experienced Life Sciences Pre-Sales Specialist to join our team. This role focuses on supporting sales initiatives in the pharmaceutical, biotechnology, and medical device sectors, specifically targeting commercial analytics and R&D. You will play a critical role in establishing credibility for Tredence as an industry subject matter expert. This position combines technical expertise, deep industry knowledge, and exceptional communication skills to bridge the gap between customer needs and our solutions.

Key Responsibilities:

Solution Discovery & Qualification:
- Support the account team and partner sales to conduct detailed discovery sessions to understand client requirements and pain points.

Product Demonstrations & Value Articulation:
- Deliver compelling solution demonstrations that showcase the value of analytics, AI/ML, and data-driven insights in addressing life sciences commercial challenges.
- Develop and present customized proposals, Proof of Concept (PoC) plans, and solution roadmaps based on client-specific requirements.
- Advise on and help build solution accelerators that address market needs.

Technical & Industry Expertise:
- Leverage domain expertise in life sciences to position relevant solutions, including data engineering, advanced analytics, and AI capabilities (e.g., in-silico drug discovery or omnichannel optimization).
- Stay updated on industry trends, regulatory changes, and competitive landscapes to enhance solution offerings and market relevance.

Stakeholder Engagement:
- Collaborate with cross-functional teams, including data engineering, data science, and marketing teams, to align on solution approach and positioning.
- Serve as a trusted advisor to prospective clients, building relationships with key stakeholders (e.g., brand managers, commercial excellence leaders, data science heads).

Proposal Development & Support:
- Assist in RFP responses, ensuring proposals reflect a deep understanding of life sciences use cases.
Create technical documentation, whitepapers, client success stories, and other marketing materials to support the sales process. Create SOWs and funding requests from technology partners.

Required Skills and Experience:
Domain Expertise: Strong understanding of life sciences commercial operations, including brand management, salesforce optimization, market access, and marketing analytics. Strong understanding of life sciences R&D, including drug discovery, clinical research, and real-world evidence.
Technical Proficiency: Experience with data platforms, analytics tools (e.g., Databricks, Snowflake, Tableau, Power BI), AI/ML solutions, and CRM systems in life sciences use cases. Ability to design, explain, and communicate complex data architectures and analytics solutions.
Communication & Sales Skills: Excellent communication and presentation skills with the ability to convey technical concepts to non-technical audiences. Strong interpersonal skills to build relationships with clients and internal teams.
Industry Experience: Minimum 8-10 years of experience in life sciences consulting, pre-sales, or solution engineering roles. Experience working with pharmaceutical, biotech, or medical device companies is preferred.

Preferred Qualifications: Bachelor's/Master's degree or equivalent in Engineering, Business, Life Sciences, Data Science, or related fields. Industry knowledge in diagnostics, medical devices, and clinical research.

What We Offer: Competitive salary and performance-based incentives. Opportunity to work on cutting-edge commercial analytics solutions in the life sciences sector. Collaborative and innovative work environment with ongoing learning and development opportunities.
Posted 1 week ago
10.0 years
0 Lacs
Greater Ahmedabad Area
Remote
Job Title: Engineering Manager
Experience: 10+ Years
Location: Ahmedabad
Department: Engineering Management

About Simform
Simform is a premier digital engineering company specializing in Cloud, Data, AI/ML, and Experience Engineering to create seamless digital experiences and scalable products. Simform is a strong partner for Microsoft, AWS, Google Cloud, and Databricks. With a presence in 5+ countries, Simform primarily serves North America, the UK, and the Northern European market. Simform takes pride in being one of the most reputed employers in the region, having created a thriving work culture with a high work-life balance that gives a sense of freedom and opportunity to grow.

Role Overview
We are seeking an experienced Engineering Manager to lead and execute complex technical projects for large-scale client accounts. This role requires a blend of strong technical leadership, hands-on engineering capabilities, and strategic project oversight. You will work closely with cross-functional teams including development, QA, DevOps, and architecture leads to design and deliver robust, scalable, and secure software solutions. The ideal candidate has deep technical expertise in backend and cloud technologies, strong stakeholder management skills, and a track record of driving engineering excellence across distributed teams in fast-paced environments. This role also involves contributing to pre-sales efforts, internal capability building, and enforcing best practices across project lifecycles.

Key Responsibilities
Lead the delivery of large, technically complex projects by designing, validating, and optimizing technical architectures across diverse tech stacks. Translate functional requirements into technical solutions for development teams, assisting with implementation and troubleshooting while acting as the project owner.
Identify delivery risks, technical bottlenecks, or resource constraints early and implement mitigation strategies in collaboration with relevant stakeholders. Track and report on engineering KPIs such as sprint velocity, defect leakage, and deployment frequency to ensure quality and timely delivery. Work with Project Managers focusing on PoC, prototyping, and technical solutioning, or solely manage the overall project, as needed. Maintain a hands-on approach to technology, with the ability to perform code analysis, reviews, audits, and troubleshooting. Ensure adherence to engineering best practices and enforce secure coding standards across the project SDLC. Collaborate with the QA team to define test cases and review/validate test scripts and test results, ensuring comprehensive functional and non-functional testing. Advocate for process improvements, technical proof of concepts (PoCs), and the reduction of technical debt. Nurture and grow client accounts by ensuring optimised and robust solution delivery with the highest quality standards. Serve as a liaison between technical and business stakeholders, facilitating clear communication and alignment. Provide technical support for pre-sales initiatives and client interactions. Help define and implement architectural standards, guidelines, principles, guardrails, and governance practices, working with different Tech Stack Leads to drive consistency and quality across projects. Contribute to internal initiatives such as technical training, building accelerators, managing technical audits, and creating reusable components.

Required Skills And Qualifications
10+ years of technical experience in web/cloud/mobile application development with a broad range of backend technologies and in-depth expertise in at least one backend language (e.g. Node.js, Python, .NET, PHP, etc.) and cloud platforms (AWS, Azure or GCP).
2+ years of experience in engineering team management, technical project management, or large multi-team customer account management. Strong knowledge of system design principles, including security, scalability, caching, availability, fault tolerance, performance optimization, observability (logging, alerting and monitoring) and maintainability. Hands-on expertise in at least one backend tech stack, with the ability to conduct code reviews, audits, and deep troubleshooting. Proven experience in designing and delivering robust, secure, and highly optimized production-grade software systems at scale. In-depth, hands-on understanding of cloud services (compute, storage, networking, security) and cloud-native solution design on AWS, Azure, or GCP. Familiarity with DevOps practices and CI/CD pipelines, including tools such as Jenkins, GitLab CI, GitHub Actions, or similar. Strong interpersonal skills and stakeholder management capabilities. Excellent verbal and written communication skills; capable of mentoring, stakeholder presentation, and influencing technical teams and other stakeholders. Demonstrated ability to collaborate cross-functionally with technical and non-technical, internal and external teams to ensure end-to-end delivery. Solution-oriented mindset with the ability to drive incremental technical execution in the face of ambiguity and constraints. Strong understanding of Agile/Scrum methodologies with experience leading Agile teams, ceremonies, and sprint planning. Understanding of architectural documentation and artifacts such as HLD, LLD, architecture diagrams, entity relationship diagrams (ERDs), process flows, and sequence diagrams. Awareness of compliance, data privacy, and regulatory frameworks such as GDPR, HIPAA, SOC 2. Working knowledge of frontend technologies (e.g., React, Angular) and how they integrate with backend and cloud components. Strong adaptability and a continuous learning mindset in fast-paced, high-growth environments.
Preferred Skills
Certifications in cloud architecture (e.g., AWS Certified Solutions Architect, Azure Solutions Architect Expert, or equivalent) are a plus. Exposure to a diverse range of projects including cutting-edge technologies such as Data Engineering, AI or ML. Knowledge of various testing tools and frameworks, e.g. JMeter, LoadRunner or equivalent. Familiarity with Mobile Testing frameworks, e.g. Appium, Calabash or equivalent. Experience with SaaS platforms or multi-tenant architecture is a strong plus.

Skills
Technical Project Management, Engineering Management, Application Development, Team Building, Training and Development, System Design, Solution Architecture, Azure, AWS, Python/Node.js/.NET/PHP/MEAN, DevOps, CI/CD, Cloud-Native Design, Microservices, Event-Driven and Serverless Architecture

Why Join Us
Young Team, Thriving Culture. Flat-hierarchical, friendly, engineering-oriented, and growth-focused culture. Well-balanced learning and growth opportunities. Free health insurance. Office facilities with a game zone, in-office kitchen with affordable lunch service, and free snacks. Sponsorship for certifications/events and library service. Flexible work timing, leaves for life events, WFH and hybrid options (ref:hirist.tech)
Posted 1 week ago
5.0 - 10.0 years
20 - 35 Lacs
Bengaluru
Hybrid
As an Advanced Analytics & Reporting Analyst 3 in the Mobile Analytics domain, you will be responsible for analyzing and interpreting data related to the entire mobile user journey. Your insights will be crucial in understanding user behavior, optimizing app performance, and supporting our GTM team.

Key Responsibilities:
Mobile User Journey Analysis: Analyze stages such as downloads (organic/paid), app ranking, category penetration, and user transitions from freemium to paid models.
App Store Analytics: Track and interpret metrics from Google and Apple consoles, focusing on key indicators like app ratings, reviews, and crash rates.
App Launch Metrics: Monitor and analyze the frequency of app launches by users.
Feature Usage Analysis: Evaluate the usage of specific app features.
3rd Party Tools Expertise: Utilize tools like Sensor Tower, Data.AI, Google App Console, App Store Console, AppTweak, and Branch to track performance and understand paid media effectiveness.
Regular Reporting: Compile and present mobile DDOM scorecards and paid media performance reports every Monday.
Deep-Dive Analysis: Conduct in-depth analyses and support monthly category mobile RTBs (e.g., DC RTB, LrM RTB).
Project Support: Assist in unlocking growth and driving various analytical projects.

Qualifications:
Proven experience in the mobile analytics domain.
Strong understanding of the overall mobile user journey and key metrics.
Proficiency with 3rd party tools such as Sensor Tower/Data.AI, Google App Console, App Store Console, AppTweak, and Branch.
Excellent analytical and problem-solving skills.
Ability to compile and present detailed reports.
Strong communication and collaboration skills.
Experience working in an app agency or similar environment is highly valuable.
Posted 1 week ago
4.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Title: Senior Data Scientist
Location: Bangalore

About Unilever
Be part of the world’s most successful, purpose-led business. Work with brands that are well-loved around the world, that improve the lives of our consumers and the communities around us. We promote innovation, big and small, to make our business win and grow; and we believe in business as a force for good. Unleash your curiosity, challenge ideas and disrupt processes; use your energy to make this happen. Our brilliant business leaders and colleagues provide mentorship and inspiration, so you can be at your best. Every day, nine out of ten Indian households use our products to feel good, look good and get more out of life – giving us a unique opportunity to build a brighter future. Every individual here can bring their purpose to life through their work. Join us and you’ll be surrounded by inspiring leaders and supportive peers. Among them, you’ll channel your purpose, bring fresh ideas to the table, and simply be you. As you work to make a real impact on the business and the world, we’ll work to help you become a better you.

About UniOps
Unilever Operations (UniOps) is the global technology and operations engine of Unilever, offering business services, technology, and enterprise solutions. UniOps serves over 190 locations and, through a network of specialized service lines and partners, delivers insights and innovations, user experiences and end-to-end seamless delivery, making Unilever Purpose Led and Future Fit.

Background
For Unilever to remain competitive in the future, the business needs to continue on the path to become data intelligent. The Data & Analytics team will persevere to make Unilever Data Intelligent, powering key decisions with data, insights, advanced analytics and AI.
Our ambition is to enable democratization of data, information and insights as a completely agile organization that builds fantastic careers for our people and is accountable for delivering great work that maximizes impact and delivers growth. This Data & Analytics function endeavours to create clear accountability for all aspects of Data Strategy, Data Management, Information Management, Analytics, and Insights. We are accountable for the impact of solutions, maintaining market relevance and minimising unnecessary overlaps in analytics products, ensuring simplicity and that our solutions better meet the needs of our users. We partner with the Digital and Data Legal Counsel to ensure that our Data Defence (Privacy, Governance, Quality, etc.) is well structured and sufficiently robust to use data and AI correctly throughout the enterprise. We democratize information across the business, while supporting the culture shift required for data-driven decision making.

Our vision is to make Unilever data intelligent, partnering with the business to power key decisions with data, advanced analytics and AI to accelerate growth. Our 5 strategies to achieve this are:
Accelerate & simplify access to relevant data, information and insights
Build in-house, leading-edge data, information, insights & analytics capability
Lead the data & insights culture and careers to empower employees across Unilever
Rapidly embed analytics products, solutions and services to drive growth
Advance Information Automation at Scale

The Senior Data Scientist is an exciting role in the Data Foundation. This team builds state-of-the-art machine learning algorithms, maximising the impact of analytic solutions in driving enterprise performance.
Typical initiatives include optimizing trade promotion investments, accurately forecasting customer demand, using NLP to glean insight on consumer trends from search data, and making individual assortment recommendations for each of the millions of stores that sell Unilever products.

Main Purpose Of The Job
The Senior Data Scientist improves business performance in the functional area of Unilever they serve, through the application of world-class data science capability. They own delivery of data science on moderate projects or specific modules of a major global initiative.

Key Accountabilities
Interact with relevant teams to identify business challenges where data science can help
Apply comprehensive data science knowledge to propose optimal techniques for key business challenges
Create detailed data science proposals and project plans, flagging any limitations of the proposed solution
Design and prototype experimental solutions, particularly machine learning models
Design scaled solutions and ensure high quality and timely delivery
Facilitate industrialization and ongoing operation of solutions through well-organised code, clear documentation and collaboration with ML Ops resources
Govern the work of 3rd party vendors where needed to support delivery, while maximising creation of Unilever IP
Represent Data Science in cross-functional governance of projects, engaging with stakeholders up to Director level
Highlight recent developments in data science capability which could solve additional challenges
Lead a team of up to 1-2 data scientists / interns, providing career mentorship and line management
Provide technical guidance to data scientists across D&A, particularly on the projects you lead
Support the growth of D&A’s data science capability by contributing to activities such as tool and vendor selection, best practice definition, recruitment, and creation of training materials
Build the reputation of D&A’s data science capability within Unilever and externally,
through activities such as community engagement (e.g. Yammer), publications or blogs
Provide ad-hoc & immediate support to the business when needed (for example Covid-19 crisis support)

Depending on the specific project, the Senior Data Scientist can expect 60-90% of their work to be hands-on prototyping solutions, with the remainder spent planning and designing, overseeing and reviewing the work of project staff, interfacing with stakeholders and managing team members.

Experience And Qualifications Required
Standards of Leadership Required in This Role: Personal Mastery (data science and advanced analytics), Agility, Business acumen, Passion for High Performance

Key Skills Required
Professional Skills:
Machine learning - Expert
Statistical modelling - Expert
Forecasting - Expert
Optimisation techniques and tools - Fully Operational
Python coding - Fully Operational
Data science platform tools e.g. MS Azure, Databricks - Fully Operational
Deep learning (and applications to NLP & Computer Vision) - Fully Operational
Collaborative development using Git repos - Fully Operational
Automated Machine Learning platforms - Foundational knowledge

While a broad data science technical background is required, the role will benefit from deeper skills (for example graduate studies or prior work experience) in one of the following areas: optimization, simulation, forecasting, natural language processing, computer vision or geospatial analysis.

General Skills:
Project Management - Expert
Communication / presentation skills - Expert
3rd party resource management - Expert
CPG Industry analytics - Expert

Strong communication and stakeholder engagement skills are essential, including the ability to influence peers and senior business stakeholders across Unilever.

Relevant Experience
Minimum of a B.E. in a relevant technical field (e.g.
Computer Science, Engineering, Statistics, Operations Research); preferably a postgraduate (Masters or Doctorate) degree
At least 4 years of experience building data science solutions to solve business problems, preferably in the CPG industry (less experience may be acceptable if balanced by strong post-grad qualifications)
Experience with open source languages (e.g. Python) and preferably with distributed computing (PySpark)
Experience deploying solutions in a modern cloud-based architecture
Experience managing the work of team members and 3rd party resource vendors
Experience presenting insights and influencing decisions of senior non-technical stakeholders

Key Interfaces
Internal: Unilever operational, marketing, customer development, supply chain, product & finance teams; internal D&A teams (Engagement teams; Data CoE; Solution Factory; BDL Factory; Information Factory; Tech Transformation)
External: 3rd party Data Science vendors; universities; industry bodies

At HUL, we believe that every individual irrespective of their race, colour, religion, gender, sexual orientation, gender identity or expression, age, nationality, caste, disability or marital status can bring their purpose to life. So apply to us, to unleash your curiosity, challenge ideas and disrupt processes; use your energy to make the world a better place. As you work to make a real impact on the business and the world, we’ll work to help you become a better you!
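For candidates wondering what the demand-forecasting work mentioned in this posting can look like at its very simplest, here is a deliberately minimal, illustrative sketch (the sales figures and the trailing moving-average baseline are invented for illustration; production solutions would use far richer models):

```python
# Minimal illustration of baseline demand forecasting: a trailing
# moving average used as the naive forecast for the next period.
# All numbers are invented for illustration.

def moving_average_forecast(history, window=3):
    """Forecast the next period as the mean of the last `window` observations."""
    if len(history) < window:
        raise ValueError("not enough history for the chosen window")
    recent = history[-window:]
    return sum(recent) / window

# Twelve periods of (fictional) unit sales for one SKU.
sales = [120, 130, 125, 140, 150, 145, 160, 155, 165, 170, 175, 180]

forecast = moving_average_forecast(sales, window=3)
print(forecast)  # mean of 170, 175, 180 -> 175.0
```

A baseline like this is often the benchmark that more sophisticated statistical or ML forecasts must beat before they are adopted.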
Posted 1 week ago
11.0 - 17.0 years
20 - 35 Lacs
Indore, Hyderabad
Work from Office
Greetings of the Day!! We have a job opening for Microsoft Fabric + ADF with one of our clients. If you are interested in this position, please share your updated resume at this email id: shaswati.m@bct-consulting.com

Primary Skill: Microsoft Fabric
Secondary Skill 1: Azure Data Factory (ADF)

12+ years of experience in Microsoft Azure Data Engineering for analytical projects. Proven expertise in designing, developing, and deploying high-volume, end-to-end ETL pipelines for complex models, including batch and real-time data integration frameworks using Azure, Microsoft Fabric and Databricks. Extensive hands-on experience with Azure Data Factory, Databricks (with Unity Catalog), Azure Functions, Synapse Analytics, Data Lake, Delta Lake, and Azure SQL Database for managing and processing large-scale data integrations. Experience in Databricks cluster optimization and workflow management to ensure cost-effective and high-performance processing. Sound knowledge of data modelling, data governance, data quality management, and data modernization processes. Develop architecture blueprints and technical design documentation for Azure-based data solutions. Provide technical leadership and guidance on cloud architecture best practices, ensuring scalable and secure solutions. Keep abreast of emerging Azure technologies and recommend enhancements to existing systems. Lead proof of concepts (PoCs) and adopt agile delivery methodologies for solution development and delivery.
Posted 1 week ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Please note, this is a 12-month initial contract, with the possibility of extensions. This role is hybrid in 560037, Bengaluru. Insight Global are looking for a Data Management Business Analyst to join one of their premium clients in the financial services space. You will play a pivotal role in bridging the gap between business needs and technical solutions, with a strong emphasis on data governance and data management. You will ensure that the company's data assets are effectively governed, secure, and aligned with business objectives, with a specific focus on supporting the capture of data lineage across the technology estate. You will be the liaison for internal stakeholders when it comes to understanding requirements. You may also be involved in manipulating data at the same time.

Must-haves:
5+ years' experience in a Business Analyst and/or Data Analyst role with a focus on data governance, data management, or data quality
Strong technical understanding of data systems, including databases (for example, SQL), data modelling, and data integration tools
Proficiency in data analysis tools and techniques (such as Python, R, or Excel)
Experience in developing and implementing data governance frameworks, policies, or standards
Excellent communication and stakeholder management skills, with the ability to translate complex technical concepts into simplified business language
Experience creating business requirement documentation (BRD)
Strong understanding of regulatory compliance requirements related to data (for example GDPR, DORA, or industry-specific regulations)
Bachelor's degree in a relevant field such as Computer Science, Information Systems, Data Science, Business Administration, or equivalent

Plusses:
Hands-on experience with data governance tools (such as Collibra, Informatica or Solidatus)
Familiarity with cloud-based data platforms (such as Azure, AWS or Google Cloud)
Knowledge of modern data platforms (for example Snowflake, Databricks or Azure Data Lake)
Knowledge of data visualization tools for presenting insights (for example Tableau or Power BI)
Experience writing user stories
Experience working in an Agile environment (using tools such as Jira is advantageous)
Experience working in financial services or other regulated industries
Understanding of machine learning or advanced analytics concepts
An advanced degree in Data Science, Business Analytics, or related fields
Professional certifications in business analysis (such as CBAP, CCBA), data analysis, or data governance (such as DAMA CDMP, CISA) are highly desirable
Posted 1 week ago
2.0 years
0 Lacs
Gurugram, Haryana, India
Remote
Associate, Product Operations, NPS Prism

Title: Associate, Product Operations

Company Profile:
NPS Prism is a market-leading, cloud-based CX benchmarking and operational improvement platform owned by Bain & Company. NPS Prism provides its customers with actionable insights and analysis that guide the creation of game-changing customer experiences. Based on rock-solid sampling, research, and analytic methodology, it lets customers see how they compare to their competitors on overall NPS®, and on every step of the customer journey. Launched in 2019, NPS Prism has rapidly grown to a team of over 200, serving dozens of clients around the world. While NPS Prism is its own company, NPS Prism is 100% owned by Bain & Company, one of the top management consulting firms in the world and a company consistently recognized as one of the world’s best places to work. We believe that diversity, inclusion, and collaboration are key to building extraordinary teams. We hire people with exceptional talents, abilities, and potential, then create an environment where you can become the best version of yourself and thrive professionally and personally.

Position Summary:
NPS Prism has experienced tremendous growth as a standalone software and data business over the past few years and is making the leap from being a consulting-led business to a technology-led business. Given that shift, we are looking to build our team with world-class team members to help drive business growth to its full potential in this next phase. This is a great opportunity to help build the largest startup owned by Bain & Company and take NPS Prism into the future.
Key Responsibilities:
Independently owns the workstreams assigned to the individual and develops client-ready visualization dashboards using tools like Tableau with minimum guidance
Works with multiple stakeholders, across instruments and workstreams, to provide expertise in data management
Provides structured support to junior team members, including helping them understand tools, logic flows, and standard work practices
Builds reusable capabilities and templates that improve team efficiency and scalability, ensuring that they are well documented and shared across service lines
Analyze the data using tools like SQL, Alteryx, and Databricks
Design and execute new benchmarking survey instruments, including assessing data needs and designing the primary research surveys and sample specification
Respond to onshore team/client questions around the data and insights
Translates complex technical concepts such as screener logic, dashboard workflows, and research methodologies into clear explanations for internal teams and client-facing discussions
Provide ongoing support for subscription customers, such as additional data cuts and responding to questions about the data/methodology with precision and speed
Support commercialization efforts, including conducting data-driven analysis for proposals, building custom outputs, and conducting product demos with prospective clients

Required Qualifications, Experience & Skills:
Education:
Required: Graduate/post graduate from a top-tier institute or have pursued a statistical/analytical course from a tier 1 university
Preferred: Concentration in a quantitative discipline such as Statistics, Mathematics, Engineering, Computer Science, Econometrics, Business Analytics, or Market Research

Experience:
2+ years of experience in areas related to Data Management, Business Intelligence or Business Analytics.
Hands-on experience in managing end-to-end customer surveys, data analysis and visualizations

Technical Skills:
Extensive hands-on experience with Tableau, with strong command of dashboard design, interactivity, and performance optimization
Proven experience using Alteryx for data prep, transformation, and validation at scale
Proficient in applying statistical and data mining techniques to derive meaningful insights from customer feedback and survey data
Comfort with SQL and Python (preferred), especially for data manipulation, automation, or analytics tasks

Other Skills:
Proactive problem-solver with a strong sense of ownership and attention to detail
Experience mentoring junior team members and contributing to collaborative team culture
Excellent interpersonal, written, and verbal communication skills
Comfort working in a hybrid or remote environment with distributed teams
Demonstrates resilience and adaptability in navigating change and feedback
Consistently seeks feedback and iterates to improve performance and team outcomes
Posted 1 week ago
3.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and contribute to the overall data strategy of the organization, ensuring that data solutions are efficient, scalable, and aligned with business objectives.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with stakeholders to gather and analyze data requirements.
- Design and implement robust data pipelines to support data processing and analytics.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data modeling and database design principles.
- Experience with ETL tools and data integration techniques.
- Familiarity with cloud platforms and services related to data storage and processing.
- Knowledge of programming languages such as Python or Scala for data manipulation.

Additional Information:
- The candidate should have a minimum of 3 years of experience with the Databricks Unified Data Analytics Platform.
- This position is based at our Pune office.
- A minimum of 15 years of full-time education is required.
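The extract-transform-load (ETL) pattern this role centres on can be sketched in a toy form. This is a stand-in using stdlib CSV parsing and SQLite rather than Databricks; the table, columns, and data are invented for illustration:

```python
import csv
import io
import sqlite3

# Toy ETL pipeline: extract rows from a CSV source, transform (clean and
# cast), and load into a SQLite table. The same extract/transform/load
# shape applies on Databricks or Delta Lake at much larger scale.

RAW_CSV = """order_id,amount,country
1,100.5,IN
2,,IN
3,250.0,US
"""

def extract(text):
    # Parse the raw CSV into a list of dict records.
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    # Drop rows with a missing amount (a simple data-quality rule) and cast types.
    return [
        {"order_id": int(r["order_id"]), "amount": float(r["amount"]), "country": r["country"]}
        for r in rows
        if r["amount"]
    ]

def load(rows, conn):
    # Write the cleaned records into the target table.
    conn.execute("CREATE TABLE IF NOT EXISTS orders (order_id INT, amount REAL, country TEXT)")
    conn.executemany("INSERT INTO orders VALUES (:order_id, :amount, :country)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
total = conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone()
print(total)  # (2, 350.5) - the empty-amount row was dropped
```

Keeping extract, transform, and load as separate, individually testable steps is the same discipline the posting asks for in production pipelines.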
Posted 1 week ago
8.0 years
0 Lacs
India
Remote
Looking for: Data/ML Engineer
Job Type: Contract
Location: Remote (India)
Work Time: Remote; time coverage up to 12 AM IST

Job Description - Required Skills & Experience:
• Hands-on code mindset with a deep understanding of the relevant technologies/skillset and an ability to see the larger picture.
• Sound knowledge of architectural patterns, best practices and non-functional requirements.
• Overall, 8-10 years of experience in heavy-volume data processing, data platform, data lake, big data, data warehouse, or equivalent.
• 5+ years of experience with strong proficiency in Python and Spark (must-have).
• 3+ years of hands-on experience in ETL workflows using Spark and Python.
• 4+ years of experience with large-scale data loads, feature extraction, and data processing pipelines in different modes - near real time, batch, real time.
• Solid understanding of data quality and data accuracy concepts and practices.
• 2+ years of solid experience in building and deploying ML models in a production setup. Ability to quickly adapt and take care of data preprocessing, feature engineering, and model engineering as needed.
• 2+ years of experience working with Python deep learning libraries, any one or more of: PyTorch, TensorFlow, Keras or equivalent.
• Prior experience working with LLMs and transformers. Must be able to work through all phases of model development as needed.
• Experience integrating with various data stores, including:
o SQL/NoSQL databases
o In-memory stores like Redis
o Data lakes (e.g., Delta Lake)
• Experience with Kafka streams, producers & consumers.
• Required: Experience with Databricks or a similar data lake / data platform.
• Required: Java and Spring Boot experience with respect to data processing - near real time and batch based.
• Familiarity with notebook-based environments such as Jupyter Notebook.
• Adaptability: Must be open to learning new technologies and approaches.
• Initiative: Ability to take ownership of tasks, learn independently, and innovate. • With technology landscape changing rapidly, ability and willingness to learn new technologies as needed and produce results on job. Preferred Skills: • Ability to pivot from conventional approaches and develop creative solutions. Show more Show less
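The Spark-and-Python ETL work described above follows a standard extract-transform-load shape. A minimal sketch in plain Python (in practice this logic would run as a Spark job; the record fields and values are invented for illustration):

```python
import json

# Hypothetical raw event lines, as they might land in a data lake.
raw_events = [
    '{"user_id": 1, "amount": "12.50", "ts": "2024-01-01"}',
    '{"user_id": 2, "amount": "bad", "ts": "2024-01-02"}',
    '{"user_id": 1, "amount": "7.25", "ts": "2024-01-03"}',
]

def extract(lines):
    """Parse raw JSON lines into records."""
    return [json.loads(line) for line in lines]

def transform(records):
    """Enforce numeric types; drop records that fail the quality check."""
    clean = []
    for r in records:
        try:
            r["amount"] = float(r["amount"])
            clean.append(r)
        except ValueError:
            pass  # data-quality rule: discard non-numeric amounts
    return clean

def load(records):
    """Aggregate per user, as a stand-in for writing to a warehouse table."""
    totals = {}
    for r in records:
        totals[r["user_id"]] = totals.get(r["user_id"], 0.0) + r["amount"]
    return totals

totals = load(transform(extract(raw_events)))
print(totals)  # {1: 19.75}
```

The same three stages map directly onto Spark's `read`, `DataFrame` transformations, and `write` in a production pipeline.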
Posted 1 week ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Overview

The Data Impact Analyst is part of the Data Impact team. The purpose of this team is to deliver business impact through data and industry-leading analytics, all in close collaboration with the NWE commercial departments (DX, Sales and Marketing) and Europe sector teams/CoEs (Advanced Analytics, Data & Analytics, Digital, Reporting & Insights, Perfect Store). The associate will support PepsiCo NWE Commercial Data & Analytics strategy definition. He/she will own and maintain the commercial reporting landscape and play an instrumental role in data democratization, making sure that data and insights are available to everyone in an easy and effective way through standardization and new tools development (e.g. dashboarding). As a member of the Data Impact team, he/she will lead the translation of strategic business questions into analytics use cases and ultimately business impact through capturing needs, preparing relevant data sources and applying advanced analytics methods.

Responsibilities

- Co-own the data management strategy: define the way we collect, store, maintain and automate commercial data sources, and assess improvement potential for the existing strategy
- Conduct periodic data quality checks
- Own and maintain the existing commercial reporting landscape; assess automation and harmonization potential and align with commercial stakeholders on their reporting needs
- Based on the assessment, transform the existing reporting into Power BI dashboards; develop new reports if needed
- Use tools like Python/PySpark in Azure Databricks to prepare data for analytics use cases
- Work with commercial teams and translate their strategic business questions into analytics use cases
- Act as a data and analytics evangelist; be at the forefront of data-driven models and insights and lead others to leverage data in their decision making

Qualifications

- Excellent analytical skills with a natural ability to visualize data in a way that uncovers new insights
- Ability to process and work with large and diversified datasets
- Strong experience in Power BI (backend and frontend)
- Previous experience with data preparation tools like Alteryx and Azure Databricks
- Experience in Databricks and Python/PySpark; should be able to perform an ETL transformation of a mid-to-large-scale dataset
- Previous experience with data visualization tools, preferably MS Power BI
- Good to have, but not mandatory: knowledge of R, understanding and running simple machine learning models, SQL
- Attention to detail, accuracy and ability to work towards tight deadlines
- Intellectually curious, with an interest in how analytics can be leveraged to derive business value
- Effective verbal and written communication skills
- E2E project management experience is preferable, i.e. from collecting/understanding business requirements through development to implementation and evaluation
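The periodic data-quality checks this role owns usually boil down to a few column-level rules. A minimal sketch in plain Python (production checks would typically run as PySpark in Azure Databricks; the rows and thresholds below are invented):

```python
# Hypothetical commercial records with one null and one duplicate.
rows = [
    {"store": "A", "sales": 100},
    {"store": "B", "sales": None},
    {"store": "A", "sales": 100},
]

def quality_report(rows, key, value_col):
    """Count rows, null values, and duplicate (key, value) pairs."""
    nulls = sum(1 for r in rows if r[value_col] is None)
    seen, dupes = set(), 0
    for r in rows:
        fingerprint = (r[key], r[value_col])
        if fingerprint in seen:
            dupes += 1
        seen.add(fingerprint)
    return {"rows": len(rows), "null_values": nulls, "duplicates": dupes}

report = quality_report(rows, "store", "sales")
print(report)  # {'rows': 3, 'null_values': 1, 'duplicates': 1}
```

In PySpark the same checks map onto `df.filter(col(...).isNull()).count()` and `df.groupBy(...).count()` style aggregations.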
Posted 1 week ago
8.0 years
0 Lacs
India
Remote
Data Architect
Long Term Contract (initially 6 months rolling) - Fully Remote
$25 - $30 per hour ($4,000 - $4,800)
Start Date: ASAP (must be available within 30 days of offer)

KEY SKILLS
- An experienced Data Architect to build robust data architecture solutions using a range of cutting-edge technologies.
- Must have a minimum of 8-12 years of Big Data experience
- Strong and recent PySpark (5+ years) and Python
- Any experience of Palantir Foundry or similar products such as Databricks would be highly advantageous.
Posted 1 week ago
5.0 years
0 Lacs
Kochi, Kerala, India
On-site
Overview

As an MDM Technical Delivery Manager, you will be responsible for leading and overseeing the end-to-end delivery of Master Data Management (MDM) solutions. You will collaborate with cross-functional teams to drive technical implementation, ensure data governance, and align with business objectives. Your expertise in MDM platforms, integration strategies, and project execution will be key to delivering high-quality solutions.

Key Responsibilities

- Oversee a team of experienced professionals, fostering collaboration and high performance.
- Guide and mentor team members, supporting their job performance and career growth.
- Lead the technical delivery of MDM implementations, ensuring successful project execution.
- Define MDM architecture, strategy, and integration frameworks with enterprise systems.
- Collaborate with business stakeholders to understand data requirements and align solutions.
- Oversee data governance, quality, and compliance with regulatory standards.
- Manage MDM development teams, ensuring adherence to best practices and standards.
- Optimize data models, workflows, and processes for efficient MDM operations.
- Drive continuous improvements in MDM technologies, methodologies, and performance.
- Communicate project updates, risks, and resolutions to leadership and stakeholders.

Required Qualifications

- Bachelor’s degree in Computer Engineering, Computer Science, or a related field.
- 5-7+ years of experience in software development and data management.
- 5+ years of expertise in MDM implementation, with hands-on experience in Reltio, Databricks, Azure, Oracle, and Snowflake.
- Strong background in integration design and development, including ETL processes and API development.
- At least 2+ years in an MDM Technical Lead and Delivery role.
- Proven track record in leading MDM projects and cross-functional teams.
- Solid understanding of diverse data sets, sources, and country-specific data models.
- Experience in life sciences MDM implementations; experience in the life sciences, healthcare, or pharmaceutical industries is a plus.
- Excellent communication, leadership, and problem-solving skills.

IQVIA is a leading global provider of clinical research services, commercial insights and healthcare intelligence to the life sciences and healthcare industries. We create intelligent connections to accelerate the development and commercialization of innovative medical treatments to help improve patient outcomes and population health worldwide. Learn more at https://jobs.iqvia.com
Posted 1 week ago
3.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Who We Are

We are Metyis, a forward-thinking, global company that collaborates with business partners and clients to create and implement the capacities and capabilities required to improve future performance. We operate across a wide range of industries, and with our embedded partnership model we commit to driving sustainable growth for industry-leading organisations, elevating their potential with a long-term vision. At Metyis, we develop integrated solutions that enable growth by working within our business partners’ organisations with diverse and multidisciplinary teams. This collaborative environment allows Metyis to build strategies and execute them through an ecosystem that combines Big Data solutions, Digital Commerce solutions, Marketing & Design solutions, and Advisory services. Our teams are set up so that you have plenty of room to build bigger and bolder ideas by speaking your mind and being creative with your knowledge. Imagine the things you could achieve with a multidisciplinary team that encourages you to be the best version of yourself.

We are Metyis. Partners for Impact.

What We Offer

- Interact with senior stakeholders at our clients on a regular basis to drive their business towards impactful change.
- Become the go-to person for end-to-end infrastructure management and deployment processes.
- Lead your team in supporting data management, data visualization, and analytics product teams to deliver optimal solutions.
- Become part of a fast-growing, international and diverse team.

What you will do

- Assist in the creation and maintenance of data pipelines for data management, visualization, and analytics products.
- Support the design of platform infrastructure, network, and security, and collaborate with senior engineers to ensure security, compliance, and cost efficiency in cloud environments.
- Support the automation of infrastructure provisioning and configuration management using Terraform.
- Help ensure our data platform operates efficiently and remains online.
- Collaborate with senior team members to develop and deploy automated services and APIs.
- Gain experience with Unix/Linux and network fundamentals.
- Support deployment processes with CI/CD tools like Azure DevOps and GitHub Actions.
- Work with Azure and its data platform and analytics components.
- Develop skills in automation and open-source tools.

What You’ll Bring

- Understanding of IT operations and process re-engineering.
- 3+ years of experience in CloudOps, DevOps, or IT infrastructure.
- Familiarity with Microsoft Azure, cloud fundamentals and Azure DevOps (other cloud platforms are a plus).
- Proficiency with scripting languages and tooling (e.g. Python, YAML, Git, Bash, PowerShell).
- Proficiency with CI/CD tools (e.g. Azure DevOps Pipelines, GitHub Actions).
- Azure certifications (AZ-104, AZ-400, AZ-305, or DP-203) will be considered a plus.
- Experience working on Azure data platform projects.
- Familiarity with software engineering best practices in coding, software design patterns, testing and debugging.
- Strong problem-solving skills and a willingness to learn.
- Interest in learning about data analytics and cloud operations.
- Ability to work collaboratively in a team environment.

Good to have

- Experience with testing and data quality (e.g. pytest, Great Expectations).
- Experience working in an Agile environment.
- Experience with data processing frameworks like Spark and the Databricks environment, as DevOps or DataOps.

In a changing world, diversity and inclusion are core values for team well-being and performance. At Metyis, we want to welcome and retain all talents, regardless of gender, age, origin or sexual orientation, and irrespective of whether or not they are living with a disability, as each of them has their own experience and identity.
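The testing practice this role values (pytest-style unit tests wired into CI/CD) can be sketched minimally. The pipeline step below is hypothetical; pytest discovers any function named `test_*` and uses plain `assert` statements:

```python
def normalize_region(value: str) -> str:
    """Hypothetical pipeline step: canonicalize free-text region codes."""
    return value.strip().upper()

def test_normalize_region():
    # pytest would collect and run this automatically in a CI pipeline.
    assert normalize_region(" nwe ") == "NWE"
    assert normalize_region("NWE") == "NWE"

test_normalize_region()  # invoked manually here; CI would call pytest
```

In an Azure DevOps or GitHub Actions pipeline, a step running `pytest` on every push is usually all it takes to gate merges on these checks.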
Posted 1 week ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
We’re Hiring – Join Our Growing Data Team! 🚨

Hi LinkedIn Network 👋 We are actively hiring for multiple Data Science and Data Engineering roles. If you’re passionate about data and want to work on impactful projects with a collaborative team, check out the openings below:

🔹 Sr. Data Scientist
📍 Experience: 5+ years
🔍 Must have: Strong experience in machine learning algorithms; hands-on with regression and classification models

🔹 Data Analyst (Data Science)
📍 Experience: 3+ years
🔍 Must have: Snowflake, R, Python, SQL; understanding of ML algorithms and data interpretation

🔹 Sr. Data Engineer
📍 Experience: 4+ years
🔍 Must have: Databricks, Python; data visualization, ETL pipelines; Snowflake expertise

🔹 Data Engineer (Spark)
📍 Experience: 3+ years
🔍 Must have: Strong in SQL, Spark, Hadoop; experience with CI/CD, Snowflake, ETL tools

If any of these roles resonate with your experience or career goals, feel free to apply or reach out directly. Drop your resume at: gayatri.yeolekar@mbww.com
Posted 1 week ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Transform data into a format that can be easily analyzed by developing, maintaining, and testing infrastructures for data generation. Data engineers work closely with data scientists and are largely in charge of architecting solutions that enable them to do their jobs. The role involves creating data pipelines and integrating, transforming and enabling data for wider enterprise use.

Job Description

Duties for this role include, but are not limited to: supporting the design, build, test and maintenance of data pipelines at big data scale; assisting with updating data from multiple data sources; working on batch processing of collected data and matching its format to the stored data, making sure that the data is ready to be processed and analyzed; assisting with keeping the ecosystem and the pipeline optimized and efficient; troubleshooting standard performance and data-related problems and providing L3 support; implementing parsers, validators, transformers and correlators to reformat, update and enhance the data; providing recommendations for highly complex problems; and providing guidance to those in less senior positions.

Additional Job Description

Data Engineers play a pivotal role within Dataworks, focused on creating and driving engineering innovation and facilitating the delivery of key business initiatives. Acting as a “universal translator” between IT, business, software engineers and data scientists, data engineers collaborate across multi-disciplinary teams to deliver value. Data Engineers will work on those aspects of the Dataworks platform that govern the ingestion, transformation, and pipelining of data assets, both to end users within FedEx and into data products and services that may be externally facing. Day-to-day, they will be deeply involved in code reviews and large-scale deployments.
Essential Job Duties & Responsibilities

- Understanding in depth both the business and technical problems Dataworks aims to solve
- Building tools, platforms and pipelines to enable teams to clearly and cleanly analyze data, build models and drive decisions
- Scaling up from “laptop-scale” to “cluster-scale” problems, in terms of both infrastructure and problem structure and technique
- Collaborating across teams to drive the generation of data-driven operational insights that translate to high-value optimized solutions
- Delivering tangible value very rapidly, collaborating with diverse teams of varying backgrounds and disciplines
- Codifying best practices for future reuse in the form of accessible, reusable patterns, templates, and code bases
- Interacting with senior technologists from the broader enterprise and outside of FedEx (partner ecosystems and customers) to create synergies and ensure smooth deployments to downstream operational systems

Skill/Knowledge Considered a Plus

- Technical background in computer science, software engineering, database systems, distributed systems
- Fluency with distributed and cloud environments and a deep understanding of optimizing computational considerations with theoretical properties
- Experience in building robust cloud-based data engineering and curation solutions to create data products useful for numerous applications
- Detailed knowledge of the Microsoft Azure tooling for large-scale data engineering efforts and deployments is highly preferred; experience with any combination of the following Azure tools: Azure Databricks, Azure Data Factory, Azure SQL D, Azure Synapse Analytics
- Developing and operationalizing capabilities and solutions, including under near real-time, high-volume streaming conditions
- Hands-on development skills with the ability to work at the code level and help debug hard-to-resolve issues
- A compelling track record of designing and deploying large-scale technical solutions which deliver tangible, ongoing value
- Direct experience having built and deployed robust, complex production systems that implement modern data processing methods at scale
- Ability to context-switch, to provide support to dispersed teams which may need an “expert hacker” to unblock an especially challenging technical obstacle, and to work through problems as they are still being defined
- Demonstrated ability to deliver technical projects with a team, often working under tight time constraints to deliver value
- An “engineering” mindset, willing to make rapid, pragmatic decisions to improve performance, accelerate progress or magnify impact
- Comfort with working with distributed teams on code-based deliverables, using version control systems and code reviews
- Ability to conduct data analysis, investigation, and lineage studies to document and enhance data quality and access
- Use of agile and DevOps practices for project and software management, including continuous integration and continuous delivery
- Demonstrated expertise working with some of the following common languages and tools: Spark (Scala and PySpark), Kafka and other high-volume data tools; SQL and NoSQL storage tools such as MySQL, Postgres, MongoDB/CosmosDB; Java and Python data tools
- Azure DevOps experience to track work, develop using git-integrated version control patterns, and build and utilize CI/CD pipelines
- Working knowledge and experience implementing data architecture patterns to support varying business needs
- Experience with different data types (JSON, XML, Parquet, Avro, unstructured) for both batch and streaming ingestions
- Use of Azure Kubernetes Service, Event Hubs, or other related technologies to implement streaming ingestions
- Experience developing and implementing alerting and monitoring frameworks
- Working knowledge of Infrastructure as Code (IaC) through Terraform to create and deploy resources
- Implementation experience across different data stores, messaging systems, and data processing engines
- Data integration through APIs and/or REST services
- Power Platform (Power BI, Power Apps, Power Automate) development experience a plus

Minimum Qualifications

Data Engineer I: Bachelor’s Degree in Information Systems, Computer Science or a quantitative discipline such as Mathematics or Engineering and/or one (1) year of equivalent formal training or work experience. Basic knowledge of data engineering and machine learning frameworks, including design, development and implementation of highly complex systems and data pipelines. Basic knowledge of Information Systems, including design, development and implementation of large batch or online transaction-based systems. Experience as a junior member of multi-functional project teams. Strong oral and written communication skills. A related advanced degree may offset the related experience requirements.

Data Engineer II: Bachelor’s Degree in Computer Science, Information Systems, a related quantitative field such as Engineering or Mathematics, or equivalent formal training or work experience. Two (2) years of equivalent work experience in measurement and analysis, quantitative business problem solving, simulation development and/or predictive analytics. Strong knowledge of data engineering and machine learning frameworks, including design, development and implementation of highly complex systems and data pipelines. Strong knowledge of Information Systems, including design, development and implementation of large batch or online transaction-based systems. Strong understanding of the transportation industry, competitors, and evolving technologies. Experience as a member of multi-functional project teams. Strong oral and written communication skills. A related advanced degree may offset the related experience requirements.
Data Engineer III: Bachelor’s Degree in Information Systems, Computer Science or a quantitative discipline such as Mathematics or Engineering and/or equivalent formal training or work experience. Three to four (3-4) years of equivalent work experience in measurement and analysis, quantitative business problem solving, simulation development and/or predictive analytics. Extensive knowledge of data engineering and machine learning frameworks, including design, development and implementation of highly complex systems and data pipelines. Extensive knowledge of Information Systems, including design, development and implementation of large batch or online transaction-based systems. Strong understanding of the transportation industry, competitors, and evolving technologies. Experience providing leadership in a general planning or consulting setting. Experience as a senior member of multi-functional project teams. Strong oral and written communication skills. A related advanced degree may offset the related experience requirements.

Data Engineer Lead: Bachelor’s Degree in Information Systems, Computer Science, or a quantitative discipline such as Mathematics or Engineering and/or equivalent formal training or work experience. Five to seven (5-7) years of equivalent work experience in measurement and analysis, quantitative business problem solving, simulation development and/or predictive analytics. Extensive knowledge of data engineering and machine learning frameworks, including design, development and implementation of highly complex systems and data pipelines. Extensive knowledge of Information Systems, including design, development and implementation of large batch or online transaction-based systems. Strong understanding of the transportation industry, competitors, and evolving technologies. Experience providing leadership in a general planning or consulting setting. Experience as a leader or a senior member of multi-functional project teams. Strong oral and written communication skills. A related advanced degree may offset the related experience requirements.

Key skills: Analytical Skills, Accuracy & Attention to Detail, Planning & Organizing Skills, Influencing & Persuasion Skills, Presentation Skills

FedEx was built on a philosophy that puts people first, one we take seriously. We are an equal opportunity/affirmative action employer, and we are committed to a diverse, equitable, and inclusive workforce in which we enforce fair treatment and provide growth opportunities for everyone. All qualified applicants will receive consideration for employment regardless of age, race, color, national origin, genetics, religion, gender, marital status, pregnancy (including childbirth or a related medical condition), physical or mental disability, or any other characteristic protected by applicable laws, regulations, and ordinances.

Our Company

FedEx is one of the world’s largest express transportation companies and has consistently been selected as one of the top 10 World’s Most Admired Companies by Fortune magazine. Every day FedEx delivers for its customers with transportation and business solutions, serving more than 220 countries and territories around the globe. We can serve this global network due to our outstanding team of FedEx team members, who are tasked with making every FedEx experience outstanding.

Our Philosophy

The People-Service-Profit philosophy (P-S-P) describes the principles that govern every FedEx decision, policy, or activity. FedEx takes care of our people; they, in turn, deliver the impeccable service demanded by our customers, who reward us with the profitability necessary to secure our future. The essential element in making the People-Service-Profit philosophy such a positive force for the company is where we close the circle, returning these profits back into the business and investing back in our people. Our success in the industry is attributed to our people.
Through our P-S-P philosophy, we have a work environment that encourages team members to be innovative in delivering the highest possible quality of service to our customers. We care for their well-being and value their contributions to the company.

Our Culture

Our culture is important for many reasons, and we intentionally bring it to life through our behaviors, actions, and activities in every part of the world. The FedEx culture and values have been a cornerstone of our success and growth since we began in the early 1970s. While other companies can copy our systems, infrastructure, and processes, our culture makes us unique and is often a differentiating factor as we compete and grow in today’s global marketplace.
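The batch and streaming ingestion modes this listing calls for can be contrasted in a minimal sketch. Plain Python stands in for what would be Spark batch jobs versus Structured Streaming; the records are invented for illustration:

```python
def batch_ingest(records):
    """Batch mode: process the whole dataset at once, return one result."""
    return sum(r["value"] for r in records)

def streaming_ingest(record_stream):
    """Streaming mode: process records one at a time, yield a running
    aggregate after each arrival (the shape of a near real-time pipeline)."""
    total = 0
    for r in record_stream:
        total += r["value"]
        yield total

data = [{"value": 1}, {"value": 2}, {"value": 3}]
print(batch_ingest(data))                   # 6
print(list(streaming_ingest(iter(data))))   # [1, 3, 6]
```

The practical difference is the same one the listing draws: a batch job sees a complete, bounded input, while a streaming ingestion must emit correct intermediate state as unbounded input arrives.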
Posted 1 week ago
5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Position Overview

We are looking for an experienced Lead Data Engineer to join our dynamic team. If you are passionate about building scalable software solutions and working collaboratively with cross-functional teams to define requirements and deliver solutions, we would love to hear from you.

ShyftLabs is a growing data product company that was founded in early 2020 and works primarily with Fortune 500 companies. We deliver digital solutions built to help accelerate the growth of businesses in various industries by focusing on creating value through innovation.

Job Responsibilities:

- Develop and maintain data pipelines and ETL/ELT processes using Python
- Design and implement scalable, high-performance applications
- Work collaboratively with cross-functional teams to define requirements and deliver solutions
- Develop and manage near real-time data streaming solutions using Pub/Sub or Beam
- Contribute to code reviews, architecture discussions, and continuous improvement initiatives
- Monitor and troubleshoot production systems to ensure reliability and performance

Basic Qualifications:

- 5+ years of professional software development experience with Python
- Strong understanding of software engineering best practices (testing, version control, CI/CD)
- Experience building and optimizing ETL/ELT processes and data pipelines
- Proficiency with SQL and database concepts
- Experience with data processing frameworks (e.g., Pandas)
- Understanding of software design patterns and architectural principles
- Ability to write clean, well-documented, and maintainable code
- Experience with unit testing and test automation
- Experience working with any cloud provider (GCP is preferred)
- Experience with CI/CD pipelines and Infrastructure as Code
- Experience with containerization technologies like Docker or Kubernetes
- Bachelor’s degree in Computer Science, Engineering, or a related field (or equivalent experience)
- Proven track record of delivering complex software projects
- Excellent problem-solving and analytical thinking skills
- Strong communication skills and ability to work in a collaborative environment

Preferred Qualifications:

- Experience with GCP services, particularly Cloud Run and Dataflow
- Experience with stream processing technologies (Pub/Sub)
- Familiarity with big data technologies (Airflow)
- Experience with data visualization tools and libraries
- Knowledge of CI/CD pipelines with GitLab and Infrastructure as Code with Terraform
- Familiarity with platforms like Snowflake, BigQuery or Databricks
- GCP Data Engineer certification

We are proud to offer a competitive salary alongside a strong insurance package. We pride ourselves on the growth of our employees, offering extensive learning and development resources.
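The Pub/Sub streaming responsibility above can be shown in shape, at least, with Python's standard-library queue. A real deployment would use the Google Cloud Pub/Sub client library; the topic name and messages here are invented:

```python
import queue

# Stand-in for a Pub/Sub topic: publishers put messages, subscribers pull.
topic = queue.Queue()

def publish(message: str) -> None:
    """Publisher side: enqueue a message onto the topic."""
    topic.put(message)

def consume(n: int) -> list[str]:
    """Subscriber side: pull the next n messages in arrival order."""
    return [topic.get() for _ in range(n)]

for event in ("order_created", "order_paid"):
    publish(event)

print(consume(2))  # ['order_created', 'order_paid']
```

The decoupling is the point: producers and consumers never call each other directly, which is what makes near real-time pipelines built on Pub/Sub (or Beam on top of it) scale independently on each side.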
Posted 1 week ago
Databricks is a popular technology in the field of big data and analytics, and the job market for Databricks professionals in India is growing rapidly. Companies across various industries are actively looking for skilled individuals with expertise in Databricks to help them harness the power of data. If you are considering a career in Databricks, here is a detailed guide to help you navigate the job market in India.
The average salary range for Databricks professionals in India varies based on experience level:
- Entry-level: INR 4-6 lakhs per annum
- Mid-level: INR 8-12 lakhs per annum
- Experienced: INR 15-25 lakhs per annum
In the field of Databricks, a typical career path may include:
- Junior Developer
- Senior Developer
- Tech Lead
- Architect
In addition to Databricks expertise, other skills that are often expected or helpful include:
- Apache Spark
- Python/Scala programming
- Data modeling
- SQL
- Data visualization tools
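Interview exercises for Databricks roles often combine these skills in a small Spark-style transformation. A plain-Python stand-in for a PySpark `groupBy(...).count()` word count (the sentences are invented for illustration):

```python
from collections import Counter

lines = ["spark makes big data simple", "big data needs spark"]

# Equivalent in spirit to exploding each line into words and grouping:
# df.select(explode(split(col("line"), " ")).alias("word"))
#   .groupBy("word").count()
words = [w for line in lines for w in line.split()]
counts = Counter(words)

print(counts["spark"])  # 2
print(counts["data"])   # 2
```

Being able to express the same logic three ways, in plain Python, in PySpark, and in SQL, is exactly the fluency these interviews tend to probe.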
As you prepare for Databricks job interviews, make sure to brush up on your technical skills, stay updated with the latest trends in the field, and showcase your problem-solving abilities. With the right preparation and confidence, you can land your dream job in the exciting world of Databricks in India. Good luck!