
1578 Data Processing Jobs - Page 47

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

3.0 - 5.0 years

11 - 13 Lacs

Mumbai, Pune, Bengaluru

Work from Office


This is a key position supporting a client organization with strong analytics and data science capabilities. There is significant revenue and there are future opportunities associated with this role.

Job Description:
- Develop and maintain data tables (management, extraction, harmonization, etc.) using GCP, SQL, and Snowflake: design, implement, and write optimized code; maintain complex SQL queries to extract, transform, and load (ETL) data from various tables and sources; and ensure data integrity and accuracy throughout the data pipeline.
- Create and manage data visualizations using Tableau or Power BI: design and develop interactive dashboards and reports; ensure visualizations are user-friendly, insightful, and aligned with business requirements; and regularly update and maintain dashboards to reflect the latest data and insights.
- Generate insights and reports to support business decision-making: analyze data trends and patterns to provide actionable insights, prepare comprehensive reports that summarize key findings and recommendations, and present data-driven insights to stakeholders to inform strategic decisions.
- Handle ad-hoc data requests and provide timely solutions: respond to urgent data requests from various departments; quickly gather, analyze, and deliver accurate data to meet immediate business needs; and ensure ad-hoc solutions are scalable and reusable for future requests.
- Collaborate with stakeholders to understand and solve open-ended questions: engage with business users to identify their data needs and challenges, work closely with cross-functional teams to develop solutions for complex open-ended problems, and translate business questions into analytical tasks that deliver meaningful results.
- Design and create wireframes and mockups for data visualization projects: develop wireframes and mockups to plan and communicate visualization ideas, collaborate with stakeholders to refine and finalize visualization designs, and ensure that they align with user requirements and best practices.
- Communicate findings and insights effectively to both technical and non-technical audiences: prepare clear, concise presentations for diverse audiences, tailor communication styles to the audience's technical proficiency, and use storytelling techniques to make data insights more engaging and understandable.
- Perform data manipulation and analysis using Python: use libraries such as Pandas, NumPy, and SciPy for data cleaning, transformation, and analysis; develop scripts and automation tools to streamline data processing tasks; and conduct statistical analysis to generate insights from large datasets.
- Implement basic machine learning models using Python: develop and apply basic models to enhance data analysis, use libraries such as scikit-learn and TensorFlow for model development and evaluation, and interpret and communicate model results to stakeholders.
- Automate data processes using Python: create automation scripts for repetitive data tasks, implement scheduling and monitoring of automated processes to ensure reliability, and continuously improve automation workflows to increase efficiency.

Requirements:
- 3 to 5 years of experience in data analysis, reporting, and visualization, including a proven track record of working on data projects and delivering impactful results, and experience in a similar role within a fast-paced environment.
- Proficiency in GCP, SQL, Snowflake, and Python for data manipulation: strong knowledge of GCP/SQL/Snowflake services and tools, advanced SQL skills for complex query writing and optimization, and expertise in Python for data analysis and automation.
- Strong experience with Tableau, Power BI, or Looker Studio for data visualization: demonstrated ability to create compelling, informative dashboards, and familiarity with best practices in data visualization and user experience design.
- Excellent communication skills, with the ability to articulate complex information clearly and explain technical concepts to non-technical stakeholders.
- Proven ability to solve open-ended questions and handle ad-hoc requests: creative problem-solving skills, a proactive approach to challenges, and flexibility to adapt to changing priorities and urgent requests.
- Strong problem-solving skills and attention to detail: a keen eye for detail and accuracy in data analysis and reporting, and the ability to identify and resolve data quality issues.
- Experience in creating wireframes and mockups: proficiency in design tools and the ability to translate ideas into visual representations effectively.
- Ability to work independently and as part of a team: self-motivated, able to manage multiple tasks simultaneously, with a collaborative mindset and willingness to support team members.

Location: Bangalore | Brand: Merkle | Time Type: Full time | Contract Type: Permanent
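The Python data-manipulation duties described above can be pictured with a short, hedged sketch; the dataset and column names are invented for illustration and are not from the posting:

```python
import pandas as pd

# Hypothetical raw sales extract; "region"/"revenue" are illustrative only.
raw = pd.DataFrame({
    "region": ["North", "North", "South", None, "South"],
    "revenue": ["1200", "950", "abc", "780", "1100"],
})

# Clean: coerce bad numeric strings to NaN, drop rows missing key fields.
raw["revenue"] = pd.to_numeric(raw["revenue"], errors="coerce")
clean = raw.dropna(subset=["region", "revenue"])

# Analyze: aggregate revenue by region for downstream dashboards.
summary = clean.groupby("region", as_index=False)["revenue"].sum()
print(summary)
```

The same clean-then-aggregate shape scales from small extracts like this to warehouse tables queried via SQL.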

Posted 3 weeks ago

Apply

1.0 - 3.0 years

5 - 8 Lacs

Mohali

Work from Office


About the role
As a Product Analyst, you will play a critical role in helping us build data-driven, user-centric features on the BotPenguin platform. You will work closely with the Product Management, Design, Engineering, Marketing, and Customer Success teams to analyze user behavior, validate feature performance, and uncover growth opportunities through actionable insights. This is an exciting opportunity to join a high-growth product team and influence strategic decisions at the intersection of data, product design, and customer experience.

What you need for this role
- Education: Bachelor's degree in Computer Science, Business Analytics, Engineering, Statistics, or a related field.
- Experience: 1-3 years in a product or data analyst role within a SaaS or tech product environment.
- Technical skills: strong expertise in MongoDB and data visualization tools (e.g., Tableau, Power BI, Metabase); familiarity with Google Analytics, Mixpanel, Hotjar, or other product analytics platforms; hands-on experience with Excel/Google Sheets, building dashboards, and extracting user insights; knowledge of the product lifecycle, user funnels, A/B testing, and cohort analysis. Bonus: exposure to Python, R, or basic scripting for data processing.
- Soft skills: excellent analytical and problem-solving skills; strong communication and storytelling abilities, able to translate data into strategic insights; a proactive attitude with a willingness to own initiatives and drive improvements; a keen interest in product design, user experience, and tech innovation.

What you will be doing
- Collaborate with Product Managers to define key metrics, success criteria, and feature adoption benchmarks.
- Analyze platform usage, customer behavior, and market data to discover pain points and opportunity areas.
- Generate and maintain weekly/monthly product reports and dashboards for cross-functional teams.
- Design and evaluate A/B tests, feature rollouts, and experiments to improve user engagement and retention.
- Work with the Engineering team to ensure accurate data tracking and event instrumentation.
- Monitor product KPIs and proactively raise red flags for anomalies or unexpected trends.
- Participate in roadmap discussions, contributing insights backed by data.
- Assist in user segmentation and support the Marketing and CS teams with insights for personalized communication and retention strategies.
- Assist with any other product development or management tasks as required.

Top reasons to work with us
- Lead the architecture and evolution of a fast-growing AI product used globally.
- Be part of a cutting-edge AI startup driving innovation in chatbot automation.
- Work with a passionate and talented team that values knowledge-sharing and problem-solving.
- A growth-oriented environment with ample learning opportunities.
- Exposure to top-tier global clients and projects with real-world impact.
- Flexible work hours and an emphasis on work-life balance.
- A culture that fosters creativity, ownership, and collaboration.
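As a rough illustration of the A/B-test evaluation work mentioned above, here is a minimal two-proportion z-test in pure Python; the conversion counts are hypothetical and the 1.96 cutoff assumes a two-sided test at the 5% level:

```python
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided z-statistic for a difference in conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical rollout: variant B converts 58/1000 vs control A's 41/1000.
z = two_proportion_ztest(41, 1000, 58, 1000)
print(f"z = {z:.2f}, significant at 5%: {abs(z) > 1.96}")
```

In practice a product analytics platform or a stats library would run this test, but the arithmetic is the same.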

Posted 3 weeks ago

Apply

1.0 - 6.0 years

1 - 4 Lacs

Hyderabad

Work from Office


Daily update of MIS reports. Computer skills are a must: MS Excel and MS Office. Data entry into the system, ensuring data accuracy and on-time data updates in the system.

Posted 3 weeks ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Pune

Work from Office


Design, develop, and maintain data pipelines and ETL/ELT processes using PySpark/Databricks. Optimize performance for large datasets through techniques such as partitioning, indexing, and Spark optimization. Collaborate with cross-functional teams to resolve technical issues and gather requirements.

Your Key Responsibilities
- Ensure data quality and integrity through data validation and cleansing processes.
- Analyze existing SQL queries, functions, and stored procedures for performance improvements.
- Develop database routines such as procedures, functions, and views.
- Participate in data migration projects and understand technologies like Delta Lake/warehouse.
- Debug and solve complex problems in data pipelines and processes.

Your skills and experience that will help you excel
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Strong understanding of distributed data processing platforms like Databricks and BigQuery.
- Proficiency in the Python, PySpark, and SQL programming languages.
- Experience with performance optimization for large datasets.
- Strong debugging and problem-solving skills.
- Fundamental knowledge of cloud services, preferably Azure or GCP.
- Excellent communication and teamwork skills.

Nice to have: experience in data migration projects; understanding of technologies like Delta Lake/warehouse.
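The role targets Databricks/BigQuery, but the "database routines like procedures, functions, and views" pattern can be sketched with stdlib sqlite3 so it runs anywhere; the table, index, and column names are invented for illustration:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE events (dt TEXT, user_id INT, amount REAL)")
con.executemany("INSERT INTO events VALUES (?,?,?)",
                [("2024-01-01", 1, 10.0), ("2024-01-01", 2, 5.0),
                 ("2024-01-02", 1, 7.5)])

# Index the filter column so date-range scans stay fast as data grows.
con.execute("CREATE INDEX idx_events_dt ON events (dt)")

# A view packages the aggregation for reuse by reports and dashboards.
con.execute("""CREATE VIEW daily_revenue AS
               SELECT dt, SUM(amount) AS revenue
               FROM events GROUP BY dt""")
rows = con.execute("SELECT * FROM daily_revenue ORDER BY dt").fetchall()
print(rows)
```

On a warehouse platform the equivalents are partitioned tables and materialized views, but the design idea (index selective filters, encapsulate reusable logic) carries over.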

Posted 3 weeks ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Gurugram

Work from Office


Most companies try to meet expectations; dunnhumby exists to defy them. Using big data, deep expertise and AI-driven platforms to decode the 21st-century human experience, then redefine it in meaningful and surprising ways that put customers first. Across digital, mobile and retail. For brands like Tesco, Coca-Cola, Procter & Gamble and PepsiCo.

We're looking for a Big Data Engineer who expects more from their career. It's a chance to extend and improve dunnhumby's Data Engineering Team, and an opportunity to work with a market-leading business to explore new opportunities for us and influence global retailers. Joining our team, you'll work with world-class, passionate people as part of Innovation Technology. You will be responsible for working with stakeholders on the development of data technology that meets the goals of the dunnhumby technology strategy and data principles. Additionally, you will be called upon to contribute to a growing list of dunnhumby data best practices.

Key Responsibilities
- Build end-to-end data solutions, including data lakes, data warehouses, ETL/ELT pipelines, APIs, and analytics platforms.
- Build scalable, low-latency data pipelines using tools like Apache Kafka, Flink, or Spark Streaming to handle high-velocity data streams.
- Automate data pipelines and processes end-to-end using orchestration frameworks such as Apache Airflow to manage complex workflows and dependencies.
- Develop intelligent systems that can detect anomalies, trigger alerts, and automatically reroute or restart processes to maintain data integrity and availability.
- Develop pipelines for real-time data processing.
- Implement data governance, metadata management, and data quality standards.
- Explore appropriate tools, platforms, and technologies aligned with organizational standards.
- Ensure security, compliance, and regulatory requirements are addressed in all data solutions.
- Evaluate and recommend improvements to existing data architecture and processes.

Technical Expertise
- Bachelor's or master's degree in Computer Science, Information Systems, Data Science, or a related field.
- 3+ years of experience in data architecture, data engineering, or a related field.
- Proficient in data pipeline tools such as Apache Spark, Kafka, Airflow, or similar.
- Good experience with cloud platforms (Azure or Google Cloud), especially cloud-native data services.
- Familiarity with API design and data security best practices.
- Familiarity with data mesh, data fabric, or other emerging architectural patterns.
- Experience working in Agile or DevOps environments.
- Extensive experience with high-level programming languages, primarily Python; Java or Scala is a plus.
- Experience with Hive, Oozie, Airflow, HBase, MapReduce, and Spark, along with working knowledge of Hadoop/Spark toolsets.
- Experience working with Git and process automation.
- In-depth understanding of relational database management systems (RDBMS) and data flow development.

Soft Skills
- Problem-solving: strong analytical skills to troubleshoot and resolve complex data pipeline issues.
- Communication: ability to articulate technical concepts to non-technical stakeholders and document processes clearly.
- Collaboration: experience working in cross-functional teams.
- Adaptability: willingness to learn new tools and technologies to stay ahead in the rapidly evolving data landscape.

What you can expect from us
We won't just meet your expectations, we'll defy them. You'll enjoy the comprehensive rewards package you'd expect from a leading technology company, but also a degree of personal flexibility you might not expect, plus thoughtful perks like flexible working hours and your birthday off. You'll also benefit from an investment in cutting-edge technology that reflects our global ambition, but with a nimble, small-business feel that gives you the freedom to play, experiment and learn.
And we don't just talk about diversity and inclusion. We live it every day, with thriving networks including dh Gender Equality Network, dh Proud, dh Family, dh One and dh Thrive as the living proof. Everyone's invited.

We want everyone to have the opportunity to shine and perform at their best throughout our recruitment process. Please let us know how we can make this process work best for you. For an informal and confidential chat, please contact stephanie.winson@dunnhumby.com to discuss how we can meet your needs.

Our approach to Flexible Working
At dunnhumby, we value and respect difference and are committed to building an inclusive culture by creating an environment where you can balance a successful career with your commitments and interests outside of work. We believe that you will do your best at work if you have a work/life balance. Some roles lend themselves to flexible options more than others, so if this is important to you please raise it with your recruiter, as we are open to discussing agile working opportunities during the hiring process.

For further information about how we collect and use your personal information, please see our Privacy Notice, which can be found (here).
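The anomaly-detection responsibility in this listing can be sketched as a rolling z-score check in plain Python; this is a deliberately simplified stand-in for the kind of intelligent monitoring described, and the window size and threshold are arbitrary choices:

```python
from collections import deque
import statistics

def detect_anomalies(stream, window=5, threshold=3.0):
    """Flag indices whose value sits more than `threshold` standard
    deviations from the mean of the preceding `window` values."""
    history = deque(maxlen=window)
    flagged = []
    for i, value in enumerate(stream):
        if len(history) == window:
            mean = statistics.mean(history)
            stdev = statistics.pstdev(history)
            if stdev > 0 and abs(value - mean) > threshold * stdev:
                flagged.append(i)
        history.append(value)
    return flagged

# Hypothetical per-minute record counts with one obvious spike.
counts = [100, 102, 98, 101, 99, 100, 500, 101, 100]
print(detect_anomalies(counts))
```

A production system would feed such a check from a streaming pipeline and wire the flags to alerting rather than a print statement.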

Posted 3 weeks ago

Apply

0.0 years

1 - 2 Lacs

Chennai

Work from Office


Greetings! Your responsibilities include collecting and entering data into databases and maintaining accurate records of medical documents.
- Document splitting process
- Move the cover sheet
- Typing speed: 30 WPM
- Communication
- Email drafting

Posted 3 weeks ago

Apply

0.0 - 3.0 years

2 - 5 Lacs

Mumbai

Work from Office


The primary role of Jr. Software Developer will be to carry out a variety of software/web application development activities to support internal and external projects, including but not limited to:
- Facilitate solution efficiencies, scalability, and technology stack leadership.
- Ensure foolproof and robust applications through unit tests and other quality control measures.
- Follow an agile development process, and enable rapid solutioning to business challenges.
- Take inputs from internal and external clients and constantly strive to improve solutions.
- Follow software design, development, testing, and documentation best practices.
- Data engineering: extract and parse data from online and local data sources; clean up data and audit it for accuracy, consistency, and completeness.
- Data processing and visualization: summarize insights in simple yet powerful charts, reports, slides, etc.
- Data storage and management: MySQL (AWS RDS), MongoDB.
- Application frameworks: React, React Native, Django.
- Data integration technologies: RESTful APIs, AWS S3, and UI data uploads.
- Project operations: for internal and external client projects, use our proprietary tools for data engineering, analytics, and visualization activities; be responsible for project deliveries, escalation, continuous improvement, and customer success.
- Modify software to fix errors, adapt it to new hardware, improve its performance, or upgrade interfaces.
- Follow the system testing and validation procedures.

Qualifications: Toppers and tech geeks who are BTech/BE in Computer Science/Engineering, with 0 to 3 years of work experience.
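The extract-parse-audit duty described in this listing can be sketched with the stdlib csv module; the sample data and audit rules below are invented for illustration:

```python
import csv
import io

# Hypothetical local extract; in practice the source could be an API
# response, an S3 object, or an uploaded file.
raw_csv = """name,email,age
Asha,asha@example.com,29
,missing@example.com,35
Ravi,ravi@example.com,notanumber
"""

rows = list(csv.DictReader(io.StringIO(raw_csv)))

# Audit: completeness (no blank names) and accuracy (numeric ages).
issues = []
for lineno, row in enumerate(rows, start=2):  # line 1 is the header
    if not row["name"]:
        issues.append((lineno, "missing name"))
    if not row["age"].isdigit():
        issues.append((lineno, "non-numeric age"))
print(issues)
```

An audit report like this is typically fed back to the data owner before the cleaned rows move on to storage or visualization.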

Posted 3 weeks ago

Apply

7.0 - 12.0 years

9 - 14 Lacs

Gurugram

Work from Office


We're looking for a Sr Big Data Engineer who expects more from their career. It's a chance to extend and improve dunnhumby's Data Engineering Team, and an opportunity to work with a market-leading business to explore new opportunities for us and influence global retailers.

Key Responsibilities
- Design end-to-end data solutions, including data lakes, data warehouses, ETL/ELT pipelines, APIs, and analytics platforms.
- Architect scalable, low-latency data pipelines using tools like Apache Kafka, Flink, or Spark Streaming to handle high-velocity data streams.
- Design and orchestrate end-to-end automation using orchestration frameworks such as Apache Airflow to manage complex workflows and dependencies.
- Design intelligent systems that can detect anomalies, trigger alerts, and automatically reroute or restart processes to maintain data integrity and availability.
- Develop scalable data architecture strategies that support advanced analytics, machine learning, and real-time data processing.
- Define and implement data governance, metadata management, and data quality standards.
- Lead architectural reviews and technical design sessions to guide solution development.
- Partner with business and IT teams to translate business needs into data architecture requirements.
- Explore appropriate tools, platforms, and technologies aligned with organizational standards.
- Ensure security, compliance, and regulatory requirements are addressed in all data solutions.
- Evaluate and recommend improvements to existing data architecture and processes.
- Provide mentorship and guidance to data engineers and technical teams.

Technical Expertise
- Bachelor's or master's degree in Computer Science, Information Systems, Data Science, or a related field.
- 7+ years of experience in data architecture, data engineering, or a related field.
- Proficient in data pipeline tools such as Apache Spark, Kafka, Airflow, or similar.
- Experience with data governance frameworks and tools (e.g., Collibra, Alation, OpenMetadata).
- Strong knowledge of cloud platforms (Azure or Google Cloud), especially cloud-native data services.
- Strong understanding of API design and data security best practices.
- Familiarity with data mesh, data fabric, or other emerging architectural patterns.
- Experience working in Agile or DevOps environments.
- Experience with modern data stack tools (e.g., dbt, Snowflake, Databricks).
- Extensive experience with high-level programming languages: Python, Java, and Scala.
- Experience with Hive, Oozie, Airflow, HBase, MapReduce, and Spark, along with working knowledge of Hadoop/Spark toolsets.
- Extensive experience working with Git and process automation.
- In-depth understanding of relational database management systems (RDBMS) and data flow development.

Soft Skills
- Problem-solving: strong analytical skills to troubleshoot and resolve complex data pipeline issues.
- Communication: ability to articulate technical concepts to non-technical stakeholders and document processes clearly.
- Collaboration: experience working in cross-functional teams and managing stakeholder expectations.
- Adaptability: willingness to learn new tools and technologies to stay ahead in the rapidly evolving data landscape.
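Managing complex workflows and dependencies is what an Airflow DAG does in production; as a minimal stand-in that runs without Airflow installed, Python's stdlib graphlib can resolve the same kind of task ordering. The task names here are hypothetical:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline tasks mapped to their upstream dependencies,
# in the spirit of an Airflow DAG definition.
deps = {
    "extract": set(),
    "validate": {"extract"},
    "transform": {"validate"},
    "load_warehouse": {"transform"},
    "refresh_dashboard": {"load_warehouse"},
}

order = list(TopologicalSorter(deps).static_order())
print(order)
```

An orchestrator adds scheduling, retries, and alerting on top of this ordering, which is where the anomaly-handling and restart logic in the responsibilities above would live.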

Posted 3 weeks ago

Apply

0.0 - 1.0 years

2 - 3 Lacs

Faridabad

Work from Office


We are seeking a highly detail-oriented and technically adept 3D Data Annotation Specialist to join our growing team. This role is critical in shaping high-quality datasets for training cutting-edge AI and computer vision models, particularly in domains such as LiDAR data processing and 3D object detection.

Qualifications:
- B.Tech in Computer Science, IT, or a related field preferred (others may also apply; strong analytical and software-learning abilities required).
- Strong analytical and reasoning skills, with attention to spatial geometry and object relationships in 3D space.
- Basic understanding of 3D data formats (e.g., .LAS, .LAZ, .PLY) and visualization tools.
- Ability to work independently while maintaining high quality standards.
- Excellent communication skills and the ability to collaborate in a fast-paced environment.
- Attention to detail and the ability to work with precision in visual/manual tasks.
- Good understanding of basic geometry, coordinate systems, and file handling.

Preferred Qualifications:
- Prior experience in 3D data annotation or LiDAR data analysis.
- Exposure to computer vision workflows.
- Comfortable working with large datasets and remote sensing data.

Key Responsibilities:
- Annotate 3D point cloud data with precision using specialized tools (training would be provided).
- Label and segment objects within LiDAR data, aerial scans, or 3D models.
- Follow annotation guidelines while applying logical and spatial reasoning to 3D environments.
- Collaborate with ML engineers and data scientists to ensure annotation accuracy and consistency.
- Provide feedback to improve annotation tools and workflow automation.
- Participate in quality control reviews and conduct re-annotation as needed.
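As a rough illustration of the spatial reasoning involved, here is a sketch that derives an axis-aligned 3D bounding box from a toy point cloud with NumPy; real data would come from .LAS/.PLY files via dedicated tools, and the coordinates below are invented:

```python
import numpy as np

# Toy labeled point cloud: each row is an (x, y, z) point belonging to
# one annotated object.
points = np.array([
    [1.0, 2.0, 0.5],
    [1.5, 2.4, 0.9],
    [1.2, 2.1, 0.7],
])

# Axis-aligned 3D bounding box: per-axis extremes give center and size.
mins, maxs = points.min(axis=0), points.max(axis=0)
center = (mins + maxs) / 2
size = maxs - mins
print("center:", center, "size:", size)
```

Annotation tools compute boxes like this (often oriented rather than axis-aligned) from the points a specialist selects, which is why attention to coordinate systems matters in this role.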

Posted 3 weeks ago

Apply

3.0 - 7.0 years

5 - 9 Lacs

Bengaluru

Work from Office


Experience Level: 3-7 Years
Education Level: BE/B.Tech/ME/M.Tech (CS or ECE)

Requirements (ML Engineer - Data and ML Training Pipeline Engineer):
- Proficient in the Python language; familiar with TensorFlow/Keras frameworks.
- Experience in building ML pipelines and setting up big data processing pipelines.
- Able to evaluate models and benchmark various model versions on different data, with report generation.
- Experience with vision-domain ML models would be an advantage but is not a must-have.
- Comfortable generating data in model formats such as TFRecords, LMDB, or similar.
- Comfortable with data analysis and visualization.
- A knack for debugging Python compilation and data-reading issues.
- Proficiency with development tools: git/Gerrit, static/dynamic analysis tools, code coverage, and test and performance analysis tools.

Good to have:
- Comfortable building data analysis and annotation scripts and exploring tools for ease of data understanding and annotation.
- A knack for data debugging: verifying which classes/types of data are causing the ML model to fail, writing scripts to filter out and analyze such images, and drawing conclusions after verifying the failure cases.
- Experience in data visualization and cleaning for AI/ML projects.
- Synthetic data generation experience.
- Able to track and version data used in the ML pipeline, with report generation.
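The "track and version data used in the ML pipeline" item can be sketched as a content fingerprint over a dataset manifest; this is a lightweight, hypothetical stand-in for dedicated data-versioning tools, and the file names and labels are invented:

```python
import hashlib
import json

def dataset_fingerprint(records):
    """Deterministic short hash of a dataset manifest, so each training
    run can record exactly which data it saw."""
    payload = json.dumps(records, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()[:12]

train_v1 = [{"img": "a.png", "label": "cat"}, {"img": "b.png", "label": "dog"}]
train_v2 = train_v1 + [{"img": "c.png", "label": "cat"}]

# Any change to the manifest (added file, relabel) changes the fingerprint,
# so benchmark reports can be tied to an exact data version.
print(dataset_fingerprint(train_v1), dataset_fingerprint(train_v2))
```

Logging this fingerprint next to each model's metrics makes benchmark comparisons across data versions reproducible.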

Posted 3 weeks ago

Apply

1.0 - 3.0 years

3 - 5 Lacs

Mumbai, Navi Mumbai

Work from Office


Job Responsibilities
- Collaborate with data scientists, software engineers, and business stakeholders to understand data requirements and design efficient data models.
- Develop, implement, and maintain robust, scalable data pipelines, ETL processes, and data integration solutions.
- Extract, transform, and load data from various sources, ensuring data quality, integrity, and consistency.
- Optimize data processing and storage systems to handle large volumes of structured and unstructured data efficiently.
- Perform data cleaning, normalization, and enrichment tasks to prepare datasets for analysis and modelling.
- Monitor data flows and processes; identify and resolve data-related issues and bottlenecks.
- Contribute to the continuous improvement of data engineering practices and standards within the organization.
- Stay up to date with industry trends and emerging technologies in data engineering, artificial intelligence, and dynamic pricing.

Candidate Profile
- Strong passion for data engineering, artificial intelligence, and problem-solving.
- Solid understanding of data engineering concepts, data modeling, and data integration techniques.
- Proficiency in Python, SQL, and web scraping.
- Understanding of databases (NoSQL, relational, and in-memory) and technologies like MongoDB, Redis, and Apache Spark would be an add-on.
- Knowledge of distributed computing frameworks and big data technologies (e.g., Hadoop, Spark) is a plus.
- Excellent analytical and problem-solving skills, with a keen eye for detail.
- Strong communication and collaboration skills, with the ability to work effectively in a team-oriented environment.
- Self-motivated, a quick learner, and adaptable to changing priorities and technologies.

Sciative is on a mission to create the future of dynamic pricing powered by artificial intelligence and big data. Our Software-as-a-Service products are used globally across various industries: retail, ecommerce, travel, and entertainment. We are a fast, growth-oriented startup with 60+ employees based in a plush office in Navi Mumbai. With our amazing result-oriented product portfolio, we want to become the most customer-oriented company on the planet. To get there, we need exceptionally talented, bright, and driven people. If you'd like to help us build the place to find and buy anything online, this is your chance to make history. We are looking for a dynamic, organized self-starter to join our Tech Team.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Mumbai

Work from Office


The primary role of Tech Architect will be to lead the technical architecture of highly scalable, data-computation-heavy SaaS products for AI-powered dynamic price optimization.

Responsibilities
- Product performance: ensure applications are scalable, fault tolerant, highly available, handle a high degree of concurrency, deliver low latency, and perform AI tasks at massive scale (5 billion data-processing operations per day).
- Tech architecture design: lead the end-to-end tech design of SaaS applications.
- Project management: work with other software and full-stack developers and oversee that developments are (a) adhering to the tech design, (b) following the software development and documentation standards, and (c) consistently of the highest quality. Follow an agile development process and enable rapid solutioning to business challenges.
- Plan integrations: plan API-based integrations with third-party tech platforms (e.g., GDSs operating in the travel and hospitality industries; seller services in marketplaces like Amazon and Flipkart; public apps in Shopify and WooCommerce; sales enablement platforms like Salesforce and Microsoft Dynamics).
- Oversee data engineering: build a complete understanding of the data architecture; write data processing scripts to prepare data for feeding to the AI engine(s).
- AI and ML modules: ensure seamless integration of AI/ML modules at production scale.
- App performance optimization: enhance the data processing and tech architecture to deliver superior computational performance and scalability of the apps.
- Product enhancements: interact with client and business-side teams to develop a roadmap for product enhancements.

Candidate Profile
- An entrepreneurial mindset with a positive attitude is a must.
- Prior experience (minimum 5 years) in tech architecture design of multi-tenant SaaS full-stack application development.
- Expert knowledge of new-generation technologies is a must (e.g., message queues, MSA architecture, PWA architecture, React, NoSQL, serverless, Redis, Python, Django, SQL, Docker/containers/Kubernetes, etc.).
- Strong acumen in algorithmic problem solving.
- Self-learner: highly curious, a self-starter, able to work with minimal supervision and guidance.
- A track record of excellence in academic or non-academic areas, with significant accomplishments.
- Excellent written and oral communication and interpersonal skills, with a high degree of comfort working in teams and making teams successful.

Qualifications: Toppers and tech geeks who are BTech/BE in Computer Science/Engineering or IT; 5-7 years of work experience in deep-tech roles is a must.

Posted 3 weeks ago

Apply

4.0 - 7.0 years

6 - 9 Lacs

Bengaluru

Work from Office


We help the world run better

Job Title: Engineer - Business Data Cloud - Data Product Runtime Team

Job Description: As an Engineer in the Data Product Runtime Team, you will be a crucial part of the team expansion in Bangalore, contributing to SAP Business Data Cloud initiatives. This role offers an opportunity to learn from experienced engineers and develop skills in Spark optimization, scalable data processing, and data transformation pipelines as part of SAP's Data & AI strategy.

Responsibilities:
- Support the implementation and evolution of a scalable data processing framework running on Spark.
- Participate in the creation and optimization of pluggable data transformation pipelines.
- Optimize CI/CD workflows using GitOps practices.
- Apply SQL skills in support of data transformation initiatives.
- Learn to incorporate AI & ML technologies into engineering workflows.
- Engage with SAP HANA Spark in data processing tasks.
- Collaborate with global colleagues for effective project contribution.

Qualifications:
- Basic experience in data engineering and distributed data processing.
- Proficiency in Python (PySpark) is essential; knowledge of Scala and Java is beneficial.
- Familiarity with Spark optimization and scalable data processing.
- A foundation in developing data transformation pipelines.
- Experience with Kubernetes, GitOps, and modern cloud stacks.
- Interest in AI & ML technologies and industry trends.
- Good communication skills for effective collaboration in a global team.
- Eagerness to learn about SAP Data Processing solutions and platform initiatives.

Bring out your best. Successful candidates might be required to undergo a background verification with an external vendor.

Requisition ID: 426942 | Work Area: Software-Design and Development | Expected Travel: 0 - 10% | Career Status: Professional | Employment Type: Regular Full Time | Additional Locations: #LI-Hybrid
Posted Date: May 27, 2025
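The "pluggable data transformation pipeline" this role describes can be sketched in plain Python. This is a hypothetical illustration of the pattern only, not SAP's actual framework; in a real Spark deployment each step would operate on a DataFrame rather than a list of dicts:

```python
# Minimal sketch of a pluggable data transformation pipeline.
# Hypothetical illustration only -- not SAP's actual framework; a real
# Spark pipeline would apply each step to a DataFrame, not a list of dicts.
from typing import Callable, Dict, List

Record = Dict[str, object]
Step = Callable[[List[Record]], List[Record]]

class Pipeline:
    def __init__(self) -> None:
        self.steps: List[Step] = []

    def register(self, step: Step) -> "Pipeline":
        """Plug a transformation step into the pipeline."""
        self.steps.append(step)
        return self

    def run(self, records: List[Record]) -> List[Record]:
        """Apply every registered step in order."""
        for step in self.steps:
            records = step(records)
        return records

# Example steps: drop incomplete rows, then normalize a currency field.
drop_nulls = lambda rows: [r for r in rows if r.get("amount") is not None]
to_cents = lambda rows: [{**r, "amount": int(r["amount"] * 100)} for r in rows]

pipeline = Pipeline().register(drop_nulls).register(to_cents)
result = pipeline.run([{"amount": 1.5}, {"amount": None}, {"amount": 2.0}])
print(result)  # [{'amount': 150}, {'amount': 200}]
```

The value of the pattern is that new steps can be registered without touching the framework, which is what makes such pipelines "pluggable".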

Posted 3 weeks ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Bengaluru

Work from Office


We help the world run better. Job Title: Senior Engineer - Business Data Cloud - Data Product Runtime Team. Job Description: As a Senior Engineer in the Data Product Runtime Team, you will be an essential asset to the expansion of our Foundation Services team in Bangalore. Your expertise in Spark optimization, scalable data processing, and data transformation pipelines will drive SAP Business Data Cloud initiatives, supporting SAP's Data & AI strategy. Responsibilities: Design, implement and evolve capabilities of a highly scalable data processing framework running on Spark. Implement and refine pluggable data transformation pipelines for enhanced data processing efficiency. Enhance CI/CD workflows through GitOps methodologies for greater operational efficiency. Apply SQL skills to execute and optimize data transformations in large-scale projects. Integrate AI & ML technologies into engineering processes, adhering to industry best practices. Utilize SAP HANA Spark for effective data processing solutions. Communicate effectively with global colleagues to ensure successful project delivery. Qualifications: Solid experience in data engineering, distributed data processing, and SAP HANA or related databases. Proficiency in Python (PySpark) is required; Scala and Java are beneficial. Experience in Spark optimization and scalable data processing. Expertise in developing data transformation pipelines. Familiarity with Kubernetes, GitOps, and modern cloud stacks. Understanding of AI & ML technologies and industry trends. Excellent communication skills for teamwork in a global context. Background in SAP Data Processing solutions is a plus. Bring out your best. Successful candidates might be required to undergo a background verification with an external vendor. Requisition ID: 426935 | Work Area: Software-Design and Development | Expected Travel: 0 - 10% | Career Status: Professional | Employment Type: Regular Full Time | Additional Locations: #LI-Hybrid.
Posted Date: May 27, 2025

Posted 3 weeks ago

Apply

9.0 - 13.0 years

35 - 40 Lacs

Bengaluru

Work from Office


We help the world run better. Job Title: Engineering Expert - Business Data Cloud - Data Product Runtime Team. Job Description: As an Engineering Expert, you will be instrumental in the expansion of our Foundation Services team in Bangalore. Your profound expertise in distributed data processing, Spark optimization, and data transformation pipelines will drive scalable data solutions, reinforcing SAP Business Data Cloud's pivotal role in SAP's Data & AI strategy. Responsibilities: Lead the design and execution of optimized data processing solutions. Spearhead Spark optimization strategies to maximize performance and scalability in data processing. Architect and refine pluggable data transformation pipelines for efficient processing of large datasets. Implement GitOps practices to advance CI/CD pipelines and operational efficiency. Apply advanced SQL skills to refine data transformations across vast datasets. Integrate AI & ML technologies into high-performance data solutions, staying informed on emerging trends. Utilize SAP HANA Spark to drive innovation in data engineering processes. Mentor junior engineers, fostering a culture of continuous improvement and technical excellence. Collaborate effectively with global stakeholders to achieve successful project outcomes. Qualifications: Extensive experience in data engineering, distributed data processing, and expertise in SAP HANA or similar databases. Proficiency in Python (PySpark) is essential; knowledge of Scala and Java is advantageous. Advanced understanding of Spark optimization and scalable data processing techniques. Proven experience in architecting data transformation pipelines. Knowledge of Kubernetes, GitOps, and modern cloud stacks. Strong understanding of AI & ML technologies and industry trends. Effective communication skills within a global, multi-cultural environment. Proven track record of leadership in data processing and platform initiatives.

Posted 3 weeks ago

Apply

2.0 - 3.0 years

3 - 4 Lacs

Jaipur, Rajasthan

Work from Office


We are hiring an experienced MIS Executive for our Jaipur office with 3+ years of expertise in data management, reporting, and automation. The ideal candidate must be proficient in Google Sheets, Advanced Excel, and scripting tools like Google Apps Script or Python. The role involves automating dashboards, analyzing large datasets, generating MIS reports, and ensuring data accuracy across business functions. Strong analytical, coordination, and technical skills are a must. Experience with BI tools like Tableau or Power BI is an added advantage. Key Responsibilities: Develop, manage, and maintain MIS reports using Google Sheets and Excel. Automate reports, dashboards, and data processes to improve efficiency and accuracy. Utilize Advanced Excel functions such as Pivot Tables, VLOOKUP, HLOOKUP, XLOOKUP, Macros, and VBA/Apps Script scripting. Work with Google Sheets scripts (Google Apps Script) for automation and data processing. Extract, clean, and analyze large datasets for business insights and decision-making. Coordinate with various departments to gather and consolidate data for accurate reporting. Develop custom scripts and applications to enhance data processing and management. Troubleshoot and resolve data discrepancies and ensure data integrity. Assist in database management. Generate periodic and ad-hoc reports as required by management.
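The report-automation work described above (clean raw data, then aggregate a pivot-style summary) can be illustrated with a minimal sketch. This is a hypothetical example using only the Python standard library; in practice the same logic would run against Google Sheets via Apps Script or the Sheets API, and the field names are invented for illustration:

```python
# Hypothetical MIS automation sketch: normalize raw rows, drop rows with
# missing amounts, then build a pivot-style SUM per department (stdlib only).
from collections import defaultdict

raw_rows = [
    {"dept": "Sales", "amount": "1200"},
    {"dept": "sales", "amount": "800"},   # inconsistent casing -> normalized
    {"dept": "Ops", "amount": ""},        # missing value -> dropped
    {"dept": "Ops", "amount": "500"},
]

def clean(rows):
    """Normalize department names and drop rows with missing amounts."""
    return [
        {"dept": r["dept"].strip().title(), "amount": float(r["amount"])}
        for r in rows
        if r["amount"].strip()
    ]

def pivot_total(rows):
    """Aggregate amounts per department, like a pivot-table SUM."""
    totals = defaultdict(float)
    for r in rows:
        totals[r["dept"]] += r["amount"]
    return dict(totals)

summary = pivot_total(clean(raw_rows))
print(summary)  # {'Sales': 2000.0, 'Ops': 500.0}
```

Separating the cleaning step from the aggregation step is what makes such reports easy to re-run on each refresh and to audit when discrepancies appear.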

Posted 3 weeks ago

Apply

0.0 - 1.0 years

0 - 1 Lacs

Jaipur

Work from Office


Job Role: * The role involves managing and updating travel details such as hotel fares, promotions, stop sales, excursions and hotel profiles on the travel system. * Work closely with the team lead and stakeholders to upload correct information to the system with the aim of maintaining quality. * Manage emails and tasks in Google Sheets and Outlook. * Manage daily production, which includes contract loading, promotion loading, and QC. * He/she must have a strong command of Excel and English communication. * Edit, proofread and improve writers' posts. Key Competencies: * The applicant should have a strong command of English communication and contract understanding. * The applicant should have an analytical approach to problem-solving and be able to think "user first".

Posted 3 weeks ago

Apply


3.0 - 8.0 years

10 - 14 Lacs

Bengaluru

Work from Office


Project Role: Application Lead. Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have skills: PySpark. Good to have skills: NA. Minimum 3 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will oversee the application development process and ensure successful project delivery. Roles & Responsibilities: - Expected to perform independently and become an SME. - Required active participation/contribution in team discussions. - Contribute to providing solutions to work-related problems. - Lead the design and development of applications. - Act as the primary point of contact for application-related queries. - Collaborate with team members to ensure project success. - Provide technical guidance and mentorship to junior team members. - Stay updated on industry trends and best practices. Professional & Technical Skills: - Must Have Skills: Proficiency in PySpark. - Strong understanding of big data processing and analytics. - Experience with cloud platforms like AWS or Azure. - Hands-on experience in building scalable applications. - Knowledge of data modeling and database design. Additional Information: - The candidate should have a minimum of 3 years of experience in PySpark. - This position is based at our Mumbai office. - A 15 years full-time education is required. Qualification: 15 years full time education.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Bengaluru

Work from Office


Project Role: Application Developer. Project Role Description: Design, build and configure applications to meet business process and application requirements. Must have skills: PySpark. Good to have skills: NA. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As an Application Developer, you will be involved in designing, building, and configuring applications to meet business process and application requirements. Your typical day will revolve around creating innovative solutions to address various business needs and ensuring seamless application functionality. Roles & Responsibilities: - Expected to be an SME. - Collaborate with and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Lead the development and implementation of complex applications. - Conduct code reviews and provide technical guidance to team members. - Stay updated on industry trends and best practices to enhance application development processes. Professional & Technical Skills: - Must Have Skills: Proficiency in PySpark. - Strong understanding of distributed computing and data processing. - Experience in building scalable and efficient data pipelines. - Knowledge of cloud platforms such as AWS or Azure. - Hands-on experience with data manipulation and transformation techniques. Additional Information: - The candidate should have a minimum of 5 years of experience in PySpark. - This position is based at our Bengaluru office. - A 15 years full-time education is required. Qualification: 15 years full time education.

Posted 3 weeks ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Pune

Work from Office


Project Role: Application Developer. Project Role Description: Design, build and configure applications to meet business process and application requirements. Must have skills: PySpark. Good to have skills: NA. Minimum 3 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. Your typical day will involve collaborating with team members to develop innovative solutions and ensure seamless application functionality. Roles & Responsibilities: - Expected to perform independently and become an SME. - Required active participation/contribution in team discussions. - Contribute to providing solutions to work-related problems. - Develop and implement efficient PySpark applications. - Collaborate with cross-functional teams to analyze and address application requirements. - Optimize application performance and troubleshoot issues. - Stay updated with industry trends and best practices in PySpark development. - Provide technical guidance and mentor junior team members. Professional & Technical Skills: - Must Have Skills: Proficiency in PySpark. - Strong understanding of data processing and manipulation using PySpark. - Experience in building scalable and efficient data pipelines. - Hands-on experience with PySpark libraries and functions. - Good To Have Skills: Experience with Apache Spark. Additional Information: - The candidate should have a minimum of 3 years of experience in PySpark. - This position is based at our Pune office. - A 15 years full-time education is required. Qualification: 15 years full time education.
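The "scalable and efficient data pipelines" these PySpark roles call for typically follow a map/shuffle/reduce shape. Below is a toy stand-in for that pattern in plain Python so it runs anywhere; real PySpark code would use SparkSession and the DataFrame/RDD APIs, and the word-count task here is just an illustrative example:

```python
# Toy stand-in for the Spark pipeline pattern: map -> shuffle (group by key)
# -> reduce. Real code would use SparkSession / DataFrame.groupBy; this
# plain-Python sketch only illustrates the shape of the computation.
from collections import defaultdict
from typing import Callable, Iterable, List, Tuple

def map_phase(records: Iterable[str], fn: Callable[[str], List[Tuple[str, int]]]):
    """Emit (key, value) pairs for each input record."""
    for rec in records:
        yield from fn(rec)

def shuffle(pairs: Iterable[Tuple[str, int]]):
    """Group values by key, as Spark does between map and reduce stages."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups, fn: Callable[[List[int]], int]):
    """Combine each key's values into a single result."""
    return {key: fn(values) for key, values in groups.items()}

lines = ["spark makes pipelines", "pipelines scale", "spark pipelines"]
tokenize = lambda line: [(word, 1) for word in line.split()]

counts = reduce_phase(shuffle(map_phase(lines, tokenize)), sum)
print(counts["pipelines"])  # 3
```

In Spark the shuffle is the expensive, network-bound stage, which is why the "Spark optimization" skills in these postings largely come down to minimizing or restructuring shuffles.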

Posted 3 weeks ago

Apply

1.0 - 2.0 years

1 - 5 Lacs

Bhiwadi

Work from Office


Job Description: Power BI, Power Automate, Power Apps Specialist. Location: Bhiwadi, Rajasthan. Timings: 7 AM - 3 PM (majorly). Position Overview: We are seeking a dedicated and experienced specialist in Power BI, Power Automate, and Power Apps. The ideal candidate will have a solid background in Office 365 skills, be capable of working on-site in Bhiwadi, and manage multiple projects and priorities. Key Responsibilities: Develop and manage Power BI dashboards, Power Automate workflows, and Power Apps solutions. Ensure timely delivery of projects based on the schedule. Link and integrate data from different sources to provide comprehensive business solutions. Maintain operational discipline across all assigned tasks and projects. Collaborate with cross-functional teams to ensure the effective implementation of solutions. Required Skills and Qualifications: Minimum of 1.5-2 years of experience with Office 365 skills, particularly in Power BI, Power Automate, and Power Apps. Strong domain knowledge in the mentioned skills. Technical capability to deliver projects on time based on the schedule. Excellent ability to link and integrate data from various sources. Strong operational discipline and organizational skills. Preferred Qualifications: Previous experience in a similar role. Strong problem-solving and analytical skills. Excellent communication and interpersonal skills. Ability to manage multiple projects and priorities effectively. Thanks & Regards, Your Manpower Manager: DIVYA SHARMA, Officer - TA | HR, Contact No: 6262000413, Ashkom.hr1@ashkom.com / Divya.ashkom@gmail.com, Ashkom Media India Private Limited, Website: www.ashkom.com

Posted 3 weeks ago

Apply

9.0 - 13.0 years

20 - 25 Lacs

Bengaluru

Work from Office


Senior Software Engineer (12-15 yrs), supply chain retail technology. Who We Are: Wayfair runs the largest custom e-commerce large parcel network in the United States, approximately 1.6 million square meters of logistics space. The network is inherently a highly variable ecosystem that requires flexible, reliable, and resilient systems to operate efficiently. We are looking for a passionate Backend Software Engineer to join the Fulfilment Optimisation team. What You'll Do: Partner with your business stakeholders to provide them with transparency, data, and resources to make informed decisions. Be a technical leader within and across the teams you work with. Drive high-impact architectural decisions and hands-on development, including inception, design, execution, and delivery, following good design and coding practices. Obsessively focus on production readiness for the team, including testing, monitoring, deployment, documentation and proactive troubleshooting. Identify risks and gaps in technical approaches and propose solutions to meet team and project goals. Create proposals and action plans to garner support across the organization. Influence and contribute to the team's strategy and roadmap. Tenacity for learning: curious, and constantly pushing the boundary of what is possible. We Are a Match Because You Have: A Bachelor's Degree in Computer Science or a related engineering field. At least 12 years of experience in a senior engineer or technical lead role. Should have mentored 10-12 people. Experience developing and designing scalable distributed systems, with a deep understanding of architectural and design patterns, object-oriented design, and modern programming languages. Excellent communication skills and the ability to work effectively with engineers, product managers, data scientists, analysts and business stakeholders. Passion for mentoring and leading peer engineers. Experience designing APIs and microservices. Experience working with cloud technologies, specifically GCP, is a plus. Deep understanding of data processing and data pipelines. Common open source platforms, tools and frameworks, e.g. Kafka, Kubernetes, containerization, Java microservices, GraphQL APIs, Aerospike etc. Designing and developing recommendation systems and productionalizing ML models for real-time decisions, large-scale data processing and event-driven systems and technologies is a plus. About Wayfair Inc.: Wayfair is one of the world's largest online destinations for the home. Whether you work in our global headquarters in Boston or Berlin, or in our warehouses or offices throughout the world, we're reinventing the way people shop for their homes. Through our commitment to industry-leading technology and creative problem-solving, we are confident that Wayfair will be home to the most rewarding work of your career. If you're looking for rapid growth, constant learning, and dynamic challenges, then you'll find that amazing career opportunities are knocking. No matter who you are, Wayfair is a place you can call home. We're a community of innovators, risk-takers, and trailblazers who celebrate our differences, and know that our unique perspectives make us stronger, smarter, and well-positioned for success. We value and rely on the collective voices of our employees, customers, community, and suppliers to help guide us as we build a better Wayfair and world for all. Every voice, every perspective matters. That's why we're proud to be an equal opportunity employer. We do not discriminate on the basis of race, color, ethnicity, ancestry, religion, sex, national origin, sexual orientation, age, citizenship status, marital status, disability, gender identity, gender expression, veteran status, genetic information, or any other legally protected characteristic. Your personal data is processed in accordance with our Candidate Privacy Notice (https://wayfair.com/careers/privacy). If you have any questions or wish to exercise your rights under applicable privacy and data protection laws, please contact us at dataprotectionofficer@wayfair.com.

Posted 3 weeks ago

Apply

2.0 - 5.0 years

9 - 14 Lacs

Gurugram

Work from Office


About This Role: BlackRock is seeking a highly skilled and motivated Analyst to support its growing and dynamic Client Data function! In this role, you will be responsible for driving the accuracy, quality and consistent use of the most impactful, globally relevant data fields, facilitating scale and efficiency across BLK's global sales and service ecosystem. You will work closely with cross-functional teams, including business stakeholders and technical teams for Client Data, to establish standards for the entry and maintenance of client data, implement exception monitoring to identify data inconsistencies, and complete high-risk updates where required. At BlackRock, we are dedicated to encouraging an inclusive environment where every team member can thrive and contribute to our world-class success. This is your chance to be part of a firm that is not only ambitious but also committed to delivering flawless and proven investment strategies. Key Responsibilities: As a Data Analyst, you will play a pivotal role in ensuring the accuracy and efficiency of our client data. Your responsibilities will include: Data Governance & Quality: Monitor data health and integrity, and ensure data products meet strict standards for accuracy, completeness, and consistency. Conduct regular assessments to identify deficiencies and opportunities for improvement. Data Management: Maintain, cleanse and update records within the Client Relationship Management systems. This may include researching information across a variety of data sources, working with internal client support groups to create data structures that mimic client asset pools, and connecting client information across data sources. Process Improvement and Efficiency: Identify and complete process improvements from initial ideation to implementation. Collaborate with cross-functional teams (product managers, engineers, and business stakeholders) to plan, design, and deliver data products. Quality Assurance: Collaborate with teams to test new CRM features, ensuring tools function accurately and identifying defects for resolution. Collaboration & Communication: Prioritize effectively with various collaborators across BlackRock. Ensure efficient and timely data governance and maintenance in an agile environment. Qualifications & Requirements: We seek candidates who are ambitious, diligent, and have a proven track record in data management. The ideal candidate will possess the following qualifications: Experience: MBA or equivalent experience required; major in Business, Finance, MIS, Computer Science or related fields preferred. 1 to 4 years of experience in data management or data processing. Financial services industry experience is a plus but not required. Skills and Qualifications: Proficiency in SQL; Python experience a plus. Proficiency in data management/reporting tools and technologies such as Power BI a plus. Experience with business applications including Excel and PowerPoint. Experience working with CRM platforms; Microsoft Dynamics experience a plus. Organized and detail-oriented with strong time management skills. Self-motivated with a strong focus on service and the ability to liaise with many groups across the company. Excellent online research skills. Exceptional written and verbal communication skills. Our Benefits: To help you stay energized, engaged and inspired, we offer a wide range of benefits including a strong retirement plan, tuition reimbursement, comprehensive healthcare, support for working parents and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about. Our hybrid work model: BlackRock's hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person, aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock. About BlackRock: At BlackRock, we are all connected by one mission: to help more and more people experience financial well-being. Our clients, and the people they serve, are saving for retirement, paying for their children's educations, buying homes and starting businesses. Their investments also help to strengthen the global economy: support businesses small and large; finance infrastructure projects that connect and power cities; and facilitate innovations that drive progress. This mission would not be possible without our smartest investment: the one we make in our employees. It's why we're dedicated to creating an environment where our colleagues feel welcomed, valued and supported with networks, benefits and development opportunities to help them thrive. For additional information on BlackRock, please visit @blackrock | Twitter: @blackrock | LinkedIn: linkedin.com/company/blackrock. BlackRock is proud to be an Equal Opportunity Employer. We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation and other protected attributes at law. #EarlyCareers

Posted 3 weeks ago

Apply

3.0 - 6.0 years

20 - 30 Lacs

Bengaluru

Work from Office


Job Title: Data Scientist. Reports to: Lead – Data Science. About UPL: UPL is focused on emerging as a premier global provider of total crop solutions designed to secure the world's long-term food supply. Winning farmers' hearts across the globe while leading the way with innovative products and services that make agriculture sustainable, UPL is the fastest growing company in the industry. UPL has a rich history of 50+ years with a presence in 120+ countries. Based on the recognition that humankind is one community, UPL's overarching commitment is to improve areas of its presence, workplace, and customer engagement. Our purpose is 'OpenAg': an open agriculture network that feeds sustainable growth for all. No limits, no borders. In order to create a sustainable food chain, UPL is now working to build a future-ready, analytics-driven organization that will be even more efficient, more innovative, and more agile. We are setting up a "Digital and Analytics" CoE to work on some disruptive projects that will have an impact that matters for the planet. It will help us reimagine our business to ensure the best outcomes for growers, consumers, employees, and our planet. Work with us to get exposure to cutting-edge solutions in digital & advanced analytics, mentorship from senior leaders & domain experts, and access to a great work environment. Job Responsibilities: The Data Scientists will leverage expertise in advanced statistical and modelling techniques to design, prototype, and build the next-generation analytics engines and services. They will work closely with the analytics teams and business teams to derive actionable insights, helping the organization achieve its strategic goals. Their work will involve high levels of interaction with the integrated analytics team, including data engineers, translators, and more senior data scientists.
Has expertise in implementing complex statistical analyses for data processing, exploration, model building and implementation. Leads teams of 2-3 associate data scientists in the use-case building and delivery process. Can communicate complex technical concepts to both technical and non-technical audiences. Plays a key role in driving ideation around the modelling process and developing models; can conceptualize and drive re-iteration and fine-tuning of models. Contributes to knowledge building and sharing by researching best practices, documenting solutions, and continuously iterating on new ways to solve problems. Mentors junior team members to do the same. REQUIRED EDUCATION AND EXPERIENCE: Master's degree in Computer Science, Statistics, Math, Operations Research, Economics, or a related field. Advanced programming skills in at least one coding language (R/Python/Scala). Practical experience of developing advanced statistical and machine learning models. At least 2 years of relevant analytics experience. Experience in using large database systems preferred. Has developed niche expertise in at least one functional domain. REQUIRED SKILLS: Ability to work well in agile environments in diverse teams with multiple stakeholders. Experience of leading small teams. Able to break complex problems down into simpler parts. Ability to effectively communicate complex analytical and technical content. A high-energy, passionate individual who can work closely with other team members. Strong entrepreneurial drive to test new, out-of-the-box techniques. Able to prioritize workstreams and adopt an agile approach. Willing to adopt an iterative approach; an experimental mindset to drive innovation. LOCATION: Bangalore. What's in it for you? Disruptive projects: Work on 'breakthrough' digital-and-analytics projects to enable UPL's vision of building a future-ready organization.
It involves deploying solutions to help us increase our sales, sustain our profitability, improve our speed to market, supercharge our R&D efforts, and support the way we work internally. Help us ensure we have access to the best business insights that our data analysis can offer us. Cross-functional leadership exposure: Work directly under the guidance of functional leadership at UPL on the most critical business problems for the organization (and the industry) today. It will give you exposure to a large cross-functional team (e.g. spanning manufacturing, procurement, commercial, quality, and IT/OT experts), allowing multi-functional learning in D&A deployment. An environment fostering professional and personal development: Strengthen professional learning in a highly impact-oriented and meritocratic environment that is focused on delivering disproportionate business value through innovative solutions. It will be supported by on-the-job coaching from experienced domain experts, and continuous feedback from a highly motivated and capable set of peers. Comprehensive training programs for continuous development through UPL's D&A academy will help in accelerating growth opportunities. Come join us in this transformational journey! Let's collectively change the game with Digital & Analytics!

Posted 3 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
