
4894 Data Processing Jobs - Page 28

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 10.0 years

13 - 18 Lacs

Pune

Work from Office

Senior Software Developers collaborate with Business and Quality Analysts, Designers, Project Managers and more to design software solutions that create meaningful change for our clients. They listen thoughtfully to understand the context of a business problem and write clean, iterative code to deliver a powerful end result, while consistently advocating for better engineering practices. By balancing strong opinions with a willingness to find the right answer, Senior Software Developers bring integrity to technology, ensuring all voices are heard. For a team to thrive, it needs collaboration and room for healthy, respectful debate. Senior Developers are the technologists who cultivate this environment while driving teams toward delivering on an aspirational tech vision and acting as mentors for more junior consultants. You will leverage deep technical knowledge to solve complex business problems and proactively assess your team's health, code quality and non-functional requirements.

Job responsibilities:
You will learn and adopt best practices like writing clean and reusable code using TDD, pair programming and design patterns.
You will use and advocate for continuous delivery practices to deliver high-quality software, and value to end customers, as early as possible.
You will work in collaborative, value-driven teams to build innovative customer experiences for our clients.
You will create large-scale distributed systems out of microservices.
You will collaborate with a variety of teammates to build features, design concepts and interactive prototypes, ensuring best practices and UX specifications are embedded along the way.
You will apply the latest technology thinking to solve client problems.
You will efficiently utilize DevSecOps tools and practices to build and deploy software, advocating a DevOps culture and shifting security left in development.
You will oversee or take part in the entire cycle of software consulting and delivery, from ideation to deployment and everything in between.
You will act as a mentor for less-experienced peers through both your technical knowledge and leadership skills.

Job qualifications

Technical skills:
We are looking for an experienced Scala Developer with 5+ years of expertise in building scalable data processing solutions.
Excellent Scala and Apache Spark development skills.
Experience with HDFS, Hive and Impala.
Proficiency in OOP, design patterns and coding best practices.
Experience in building real-time analytics applications, microservices and ETL pipelines.
You are comfortable with Agile methodologies such as Extreme Programming (XP), Scrum and/or Kanban.
You have a good awareness of TDD, continuous integration and continuous delivery approaches/tools.
Bonus points if you have working knowledge of cloud technology such as AWS, Azure, Kubernetes and Docker.

Professional skills:
You enjoy influencing others and always advocate for technical excellence while being open to change when needed.
Presence in the external tech community: you willingly share your expertise with others via speaking engagements, contributions to open source, blogs and more.
You're resilient in ambiguous situations and can approach challenges from multiple perspectives.
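The TDD practice called out above can be illustrated with a minimal sketch, written test-first: the assertions exist before the implementation is finished, then the code is made to pass them. Shown in Python for brevity (the role itself centres on Scala); the function name and data shape are invented for illustration.

```python
# A minimal TDD-style sketch: tests first, then the smallest code that passes.
# The event-deduplication task here is hypothetical, not from the posting.

def dedupe_events(events):
    """Return events with duplicate ids removed, keeping the first occurrence."""
    seen = set()
    result = []
    for event in events:
        if event["id"] not in seen:
            seen.add(event["id"])
            result.append(event)
    return result

# The "red" phase assertions, kept as a living spec:
assert dedupe_events([]) == []
assert dedupe_events([{"id": 1}, {"id": 1}]) == [{"id": 1}]
assert [e["id"] for e in dedupe_events(
    [{"id": 2}, {"id": 1}, {"id": 2}])] == [2, 1]
```

In red-green-refactor terms, the three assertions are written before the loop body; the implementation is then the simplest code that turns them green.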

Posted 2 weeks ago

Apply

0.0 - 1.0 years

0 Lacs

Pune

Work from Office

Job Title: Python Developer Intern
Location: Pune, India
Position Type: Full-time

About the Role: We are seeking a talented and motivated Python Developer to join our dynamic team. As a Python Developer, you will play a crucial role in developing innovative features using popular Python web frameworks like Django and Flask. You will collaborate with our team to manipulate and analyze data using powerful libraries such as Pandas and NumPy, ensuring our applications are robust, efficient and scalable.

Responsibilities:
Feature Development: Utilize your expertise in Django and/or Flask to design and implement new features, contributing to our cutting-edge projects.
Data Manipulation: Work with data manipulation libraries, specifically Pandas and NumPy, to process, analyze and interpret complex datasets.
Code Documentation: Maintain well-documented code, ensuring it is understandable, maintainable and reusable.
Testing and Debugging: Assist in the testing and debugging process to enhance the quality and performance of our applications.

Requirements:
Proficiency in Python: Demonstrated experience and expertise in Python programming.
Web Frameworks: Hands-on knowledge and practical experience with Django and/or Flask.
Data Processing: Strong understanding of data processing and analysis using Pandas and NumPy.
Database Skills: Proficiency with SQL and relational databases, especially PostgreSQL, as well as non-relational databases like MongoDB.
Version Control: Familiarity with Git and GitHub for version control and collaboration.
Learning and Contribution: An eagerness to learn new technologies and methodologies, with a proactive approach to contributing to team efforts.

Benefits:
Competitive salary with performance-based bonuses.
Generous vacation and paid time off.
Company-sponsored training and professional development opportunities.
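As a rough illustration of the Pandas/NumPy data manipulation this role describes, here is a minimal sketch of an aggregate-and-derive workflow; the dataset and column names are invented for the example.

```python
import pandas as pd
import numpy as np

# Hypothetical sales records; columns are illustrative only.
df = pd.DataFrame({
    "city": ["Pune", "Pune", "Mumbai"],
    "sales": [100, 150, 200],
})

# Aggregate with pandas, then derive a share column with NumPy.
totals = df.groupby("city", as_index=False)["sales"].sum()
totals["share"] = np.round(totals["sales"] / totals["sales"].sum(), 2)
```

`groupby` sorts the group keys by default, so `totals` lists Mumbai before Pune; `as_index=False` keeps `city` as an ordinary column rather than the index.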

Posted 2 weeks ago

Apply

0.0 - 5.0 years

2 - 7 Lacs

Hyderabad

Work from Office

Project Overview: Media Mix Optimization (MMO)
Our MMO platform is an in-house initiative designed to empower clients with data-driven decision-making in marketing strategy. By applying Bayesian and frequentist approaches to media mix modeling, we are able to quantify channel-level ROI, measure incrementality, and simulate outcomes under varying spend scenarios.

Key components of the project include:
Data Integration: Combining client first-party, third-party, and campaign-level data across digital, offline, and emerging channels into a unified modeling framework.
Model Development: Building and validating media mix models (MMM) using advanced statistical and machine learning techniques such as hierarchical Bayesian regression, regularized regression (Ridge/Lasso), and time-series modeling.
Scenario Simulation: Enabling stakeholders to forecast outcomes under different budget allocations through simulation and optimization algorithms.
Deployment & Visualization: Using Streamlit to build interactive, client-facing dashboards for model exploration, scenario planning, and actionable recommendation delivery.
Scalability: Engineering the system to support multiple clients across industries with varying data volumes, refresh cycles, and modeling complexities.

Responsibilities:
Develop, validate, and maintain media mix models to evaluate cross-channel marketing effectiveness and return on investment.
Engineer and optimize end-to-end data pipelines for ingesting, cleaning, and structuring large, heterogeneous datasets from multiple marketing and business sources.
Design, build, and deploy Streamlit-based interactive dashboards and applications for scenario testing, optimization, and reporting.
Conduct exploratory data analysis (EDA) and advanced feature engineering to identify drivers of performance.
Apply Bayesian methods, regularization, and time-series analysis to improve model accuracy, stability, and interpretability.
Implement optimization and scenario-planning algorithms to recommend budget allocation strategies that maximize business outcomes.
Collaborate closely with product, engineering, and client teams to align technical solutions with business objectives.
Present insights and recommendations to senior stakeholders in both technical and non-technical language.
Stay current with emerging tools, techniques, and best practices in media mix modeling, causal inference, and marketing science.

Qualifications:
Bachelor's or Master's degree in Data Science, Statistics, Computer Science, Applied Mathematics, or a related field.
Proven hands-on experience in media mix modeling, marketing analytics, or econometrics.
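The regularized regression (Ridge) component mentioned above can be sketched in a few lines of NumPy using the closed-form solution. This is a toy illustration on synthetic spend data, not the platform's actual model; a real MMM would add adstock and saturation transforms, seasonality, and (in the Bayesian variant) priors.

```python
import numpy as np

# Toy media-mix sketch: two channels' weekly spend vs. sales.
# All numbers are synthetic; the "true" channel effects are 3.0 and 1.5.
rng = np.random.default_rng(0)
n = 52  # one year of weekly observations
X = rng.uniform(0, 100, size=(n, 2))                 # spend on channels A and B
y = 3.0 * X[:, 0] + 1.5 * X[:, 1] + rng.normal(0, 5, n)

# Closed-form ridge estimate: beta = (X'X + lam*I)^-1 X'y
lam = 1.0
beta = np.linalg.solve(X.T @ X + lam * np.eye(2), X.T @ y)
```

The recovered `beta` lands close to the true effects (3.0, 1.5); the `lam` penalty shrinks coefficients slightly toward zero, which is the stabilizing behavior Ridge brings when channels are correlated.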

Posted 2 weeks ago

Apply

1.0 - 3.0 years

3 - 5 Lacs

Bengaluru

Work from Office

We are looking for a forward-thinking Data & AI Engineer with 1-3 years of experience in data engineering and a passion for using modern AI tools to accelerate development workflows. The ideal candidate is proficient in Python, SQL and PySpark, and has experience working in on-premise big data environments (e.g., Spark, Hadoop, Hive, HDFS). This role is ideal for someone eager to blend traditional data engineering practices with AI-augmented software development, helping us build high-performance pipelines and deliver faster, smarter solutions.

What you'll be doing:
Develop and maintain robust ETL/ELT pipelines using Python, SQL, and PySpark.
Work with on-premise big data platforms such as Spark, Hadoop, Hive, and HDFS.
Optimize and troubleshoot workflows to ensure performance, reliability, and quality.
Use AI tools to assist with code generation, testing, debugging, and documentation.
Collaborate with data scientists, analysts, and engineers to support data-driven use cases.
Maintain up-to-date documentation using AI summarization tools.
Apply AI-augmented software engineering practices, including automated testing, code reviews, and CI/CD.
Identify opportunities for automation and process improvement across the data lifecycle.

Requirements:
1-3 years of hands-on experience as a Data Engineer or in a similar data-focused engineering role.
Proficiency in Python for data manipulation, automation, and scripting.
Solid understanding
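As an illustration of the "transform" step such ETL/ELT pipelines perform, here is a minimal sketch in plain Python. On the stack named above the same logic would typically live in a PySpark job; the field names and cleaning rules are invented for the example.

```python
# ETL transform sketch: reject malformed records, normalise strings,
# and cast types. Schema and rules are hypothetical.

def transform(rows):
    """Clean raw rows: drop records missing an id, normalise names,
    and cast amount to float."""
    cleaned = []
    for row in rows:
        if not row.get("id"):
            continue  # reject records without a usable id
        cleaned.append({
            "id": row["id"],
            "name": row.get("name", "").strip().lower(),
            "amount": float(row.get("amount", 0)),
        })
    return cleaned

raw = [
    {"id": "a1", "name": "  Asha ", "amount": "10.5"},
    {"id": None, "name": "bad"},            # dropped: no id
    {"id": "b2", "name": "Ravi", "amount": 7},
]
```

Keeping the transform as a pure function of rows-in, rows-out is what makes the automated testing mentioned above straightforward, whether the runtime is plain Python or Spark.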

Posted 2 weeks ago

Apply

1.0 - 5.0 years

3 - 7 Lacs

Chennai, Bengaluru

Work from Office

Data Analyst

Position Overview: We are looking for a Data Analyst to support hospital operations through data-driven insights. The role involves analyzing workforce-related data, tracking performance metrics, and building dashboards that enable leadership to make informed decisions and optimize organizational efficiency.

Key Responsibilities:
Collect, clean, and analyze workforce and hospital operational data to identify trends and gaps.
Build dashboards and reports using Power BI, Tableau, and Excel for management decision-making.
Monitor and evaluate key metrics such as attendance, manpower utilization, shift planning, training effectiveness, and performance outcomes.
Provide insights on staff deployment, process efficiency, and resource allocation to improve overall hospital operations.
Automate recurring reports and streamline reporting processes to enhance accuracy and timeliness.
Conduct predictive and ad-hoc analyses to support strategic planning and workforce optimization.
Ensure accuracy, confidentiality, and compliance in handling sensitive organizational data.

Required Skills & Competencies:
Strong proficiency in Excel (advanced functions, pivot tables, dashboards).
Hands-on experience with Power BI and Tableau for visualization and reporting.
Basic knowledge of Python and SQL for data processing and automation.
Understanding of business intelligence analytics and workforce data interpretation.
Strong analytical and problem-solving skills with attention to detail.
Excellent communication skills to present data insights clearly to management.
Proactive, quick-learning and team-oriented mindset.

Educational & Professional Background:
Bachelor's degree in Business Analytics, Data Analytics, Statistics, or a related field.
Exposure to analytics in the hospital/healthcare/operations domain is an added advantage.
Certifications in Business Intelligence, R Programming, Excel, or Project Management are an added advantage.
Job Experience: 0-5 years. Job Location: Rajaji Nagar. No. of vacancies: 1.
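The kind of SQL-based metric reporting this role describes (e.g. an attendance percentage per staff member) can be sketched with an in-memory SQLite table; the schema and numbers here are invented for illustration.

```python
import sqlite3

# Hypothetical shift log: 1 = present, 0 = absent.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE shifts (staff TEXT, present INTEGER)")
con.executemany("INSERT INTO shifts VALUES (?, ?)",
                [("A", 1), ("A", 0), ("B", 1), ("B", 1)])

# Attendance percentage per staff member, the sort of KPI fed to a dashboard.
rows = con.execute("""
    SELECT staff, ROUND(AVG(present) * 100, 1) AS attendance_pct
    FROM shifts
    GROUP BY staff
    ORDER BY staff
""").fetchall()
```

A query like this would typically be the data source behind a Power BI or Tableau attendance tile rather than run by hand.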

Posted 2 weeks ago

Apply

1.0 - 5.0 years

3 - 7 Lacs

Gurugram

Work from Office

Associate Economics Analyst, Global Growth and Operations-1

Overview
The Mastercard Economics Institute (MEI) is an economics lab powering scale at Mastercard by owning economic thought leadership in support of Mastercard's efforts to build a more inclusive and sustainable digital economy. MEI was launched in 2020 to analyze economic trends through the lens of the consumer and deliver tailored, actionable insights on economic issues for customers, partners and policymakers. The Institute is composed of a team of economists and data scientists who utilize and synthesize the anonymized and aggregated data from the Mastercard network, together with public data, to bring powerful insights to life, in the form of 1:1 presentations, global thought leadership, media participation, and commercial work through the company's product suites.

About the Role
Mastercard's Economics Institute is seeking a talented Economics Analyst with R or Python programming skills to join our Global Growth and Operations team. Reporting to the Director, Growth and Operations, this individual will blend advanced economic research with strong programming and data visualization skills. This is a unique opportunity for someone passionate about applying economic thinking and technical expertise to real-world questions at the intersection of economics, retail, and commerce. The role involves working on large-scale data analytics, developing innovative economic insights, and building compelling visualizations that help communicate these insights to diverse audiences.

Responsibilities:
Support client and stakeholder engagements across MEI.
Collaborate with team economists, econometricians, developers, visualization experts and industry partners.
Develop and test hypotheses at the intersection of economics, retail, and commerce.
Structure work and manage small project streams, delivering impactful results.
Identify creative analyses and develop proprietary diagnostic indices using large and complex datasets, including macroeconomic and big data sources.
Generate insights, synthesize analyses into impactful storylines and interactive visuals, and help write reports and client presentations.
Enhance existing products and partner with internal stakeholders to develop new economic solutions.
Help create thought leadership and intellectual capital.
Create and format analytical content using Jupyter Notebooks, R Markdown and/or Quarto.
Work with databases and data platforms, including Databricks, SQL and Hadoop.
Write clear, well-documented code that others can understand and maintain.
Collaborate using Git for version control.

All About You:
Bachelor's degree in Economics (preferred), Statistics, Mathematics, or a related field.
Proficient in working with relational databases and writing SQL queries.
Expertise in working with large-scale data processing frameworks and tools, including Hadoop, Apache Spark, Apache Hive, and Apache Impala.
Proficient in R or Python, with experience in data processing packages.
Skilled in creating data visualizations to communicate complex economic insights to diverse audiences.
Experience using data visualization tools such as Tableau or Power BI.
Proficiency in machine learning, econometric and statistical techniques, including predictive modeling, survival analysis, time series modeling, classification, regression and clustering methods, is desirable.
Strong problem-solving skills and critical thinking abilities.
Excellent communication skills, both written and verbal.
Organized and able to prioritize work across multiple projects.
Collaborative team player who can also work independently.
Passionate about data, technology, and creating impactful economic insights.
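As a small illustration of the kind of trend analysis such a role starts from, here is a year-over-year growth computation on a hypothetical monthly spend index (the numbers are invented; real MEI work uses anonymized, aggregated network data).

```python
# Hypothetical monthly spend index across two years (24 monthly values).
index = [100, 104, 103, 110, 115, 112, 118, 121, 125, 130, 128, 134,   # year 1
         108, 112, 111, 121, 127, 124, 131, 135, 140, 147, 144, 151]   # year 2

# Year-over-year growth (%) for each month of year 2: compare each value
# with the value 12 months earlier.
yoy = [round((index[i] / index[i - 12] - 1) * 100, 1)
       for i in range(12, len(index))]
```

Comparing against the same month a year earlier, rather than the previous month, removes seasonality, which is why YoY series like this are a common first lens on consumer spending trends.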

Posted 2 weeks ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Pune

Work from Office

Join us as a Machine Learning Engineer at Barclays, responsible for supporting the successful delivery of Location Strategy projects to plan, budget, and agreed quality and governance standards. You'll spearhead the evolution of our digital landscape, driving innovation and excellence. You will harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences.

To be successful as a Machine Learning Engineer you should have experience with:
Data Science concepts: understanding of the machine learning life cycle (EDA, data processing, model choice, evals).
Analytics/machine learning, with demonstrated application of machine learning to multiple use cases using Python.
GenAI concepts such as RAG, MCP, etc.
Data Engineering concepts, including handling large-volume data using Spark etc.
A sound understanding of generalization techniques such as ensembling and stacking.
Understanding of time series concepts pertaining to machine learning.

Some other highly valued skills may include:
Proficiency in the Python programming language.
Ability to quickly build prototypes/UIs in Streamlit, React, or other Python/JS-based frameworks.
Exposure to infrastructure tools such as AWS CloudFormation, Terraform, etc.
Understanding of cloud security best practices and exposure to IAM, VPCs, security groups, etc.
Knowledge of cloud monitoring systems.
Exposure to version control systems (Git, Bitbucket, etc.).
Familiarity with database systems and basic SQL queries.

You may be assessed on the key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is based in Pune.
Purpose of the role
To use innovative data analytics and machine learning techniques to extract valuable insights from the bank's data reserves, leveraging these insights to inform strategic decision-making, improve operational efficiency, and drive innovation across the organisation.

Accountabilities:
Identification, collection, and extraction of data from various sources, both internal and external.
Performing data cleaning, wrangling, and transformation to ensure its quality and suitability for analysis.
Development and maintenance of efficient data pipelines for automated data acquisition and processing.
Design and conduct of statistical and machine learning models to analyse patterns, trends, and relationships in the data.
Development and implementation of predictive models to forecast future outcomes and identify potential risks and opportunities.
Collaboration with business stakeholders to seek out opportunities to add value from data through Data Science.

Analyst Expectations:
Perform prescribed activities in a timely manner and to a high standard, consistently driving continuous improvement.
The role requires in-depth technical knowledge and experience in the assigned area of expertise, and a thorough understanding of the underlying principles and concepts within that area.
If the position has leadership responsibilities, they lead and supervise a team, guiding and supporting professional development, allocating work requirements and coordinating team resources. People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L: Listen and be authentic; E: Energise and inspire; A: Align across the enterprise; D: Develop others. For an individual contributor, they instead develop technical expertise in their work area, acting as an advisor where appropriate, and have an impact on the work of related teams within the area.
Partner with other functions and business areas.
Take responsibility for the end results of a team's operational processing and activities.
Escalate breaches of policies/procedures appropriately.
Take responsibility for embedding new policies/procedures adopted due to risk mitigation.
Advise and influence decision-making within own area of expertise.
Take ownership for managing risk and strengthening controls in relation to the work you own or contribute to.
Deliver your work and areas of responsibility in line with relevant rules, regulations and codes of conduct.
Maintain and continually build an understanding of how your own sub-function integrates with the function, alongside knowledge of the organisation's products, services and processes within the function.
Demonstrate understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation's sub-function.
Make evaluative judgements based on the analysis of factual information, paying attention to detail.
Resolve problems by identifying and selecting solutions through the application of acquired technical experience, guided by precedents.
Guide and persuade team members and communicate complex/sensitive information.
Act as a contact point for stakeholders outside of the immediate function, while building a network of contacts outside the team and external to the organisation.
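The ensembling/stacking technique named in the skills list above can be sketched with synthetic data: two base predictors are combined with weights fit by least squares, which by construction can do no worse in-sample than either predictor alone (each base predictor is itself one of the candidate combinations). This is a toy sketch of the idea, not the bank's method.

```python
import numpy as np

# Synthetic target and two noisy base predictors of it.
rng = np.random.default_rng(1)
y = rng.normal(size=200)
base1 = y + rng.normal(0, 0.5, 200)   # less noisy predictor
base2 = y + rng.normal(0, 1.0, 200)   # noisier predictor

# Stacking: learn combination weights by least squares over base predictions.
P = np.column_stack([base1, base2])
w, *_ = np.linalg.lstsq(P, y, rcond=None)
stacked = P @ w
```

In a real pipeline the weights (or a richer meta-model) would be fit on held-out predictions to avoid leaking the base models' training data into the combiner.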

Posted 2 weeks ago

Apply

3.0 - 7.0 years

5 - 9 Lacs

Pune

Work from Office

Reporting & Analytics SAC/BPC Consultant, with expertise in SAP Analytics Cloud (SAC) and Business Planning and Consolidation (BPC) software, responsible for designing, developing, and implementing data-driven reporting and analytics solutions to meet business needs: collaborating with stakeholders to gather requirements and building dashboards, visualizations, and reports utilizing the capabilities of both the SAC and BPC platforms.

Key Responsibilities:
Requirement Gathering: Conduct workshops and interviews with stakeholders to understand business needs, identify key performance indicators (KPIs), and define reporting requirements.
Solution Design: Develop functional specifications, data models, and design architecture for reporting and analytics applications within SAC and BPC based on gathered requirements.
Data Integration and Modeling: Extract, transform, and load data from various sources into SAC and BPC, creating data models to facilitate analysis and reporting.
Report Development: Design and build interactive dashboards, reports, and visualizations using SAC capabilities, including charts, graphs, tables, and filtering options.
Planning and Forecasting: Utilize BPC functionalities to enable users to create budget plans and forecasts and perform scenario analysis.
User Training and Support: Provide training to end users on how to access, navigate, and utilize the developed reports and dashboards within SAC and BPC.
Performance Optimization: Monitor system performance, identify bottlenecks, and optimize data processing and query execution within SAC and BPC.
Project Management: Contribute to the planning, execution, and delivery of SAC/BPC implementations, ensuring adherence to timelines and budget.

Required Skills:
Technical Expertise: Proficient in SAP Analytics Cloud (SAC) and BPC functionalities, including data modeling, calculations, data visualization, and reporting capabilities.
Business Acumen: Strong understanding of business processes, financial reporting, and key performance metrics relevant to the client's industry.
Communication Skills: Excellent ability to communicate complex technical concepts to both technical and non-technical stakeholders.
Analytical Skills: Strong analytical skills to identify trends, patterns, and insights from data.
Project Management Skills: Experience in managing project phases, timelines, and deliverables.

Relevant Experience:
Proven experience in implementing and supporting SAP Analytics Cloud (SAC) and BPC solutions.
Expertise in data integration, data cleansing, and data transformation techniques.
Experience working with various data sources, such as ERP systems, databases, and flat files.
Knowledge of data warehousing concepts is beneficial.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

5 - 9 Lacs

Gurugram

Work from Office

Associate Economics Analyst, Global Growth and Operations-2

Overview
The Mastercard Economics Institute (MEI) is an economics lab powering scale at Mastercard by owning economic thought leadership in support of Mastercard's efforts to build a more inclusive and sustainable digital economy. MEI was launched in 2020 to analyze economic trends through the lens of the consumer and deliver tailored, actionable insights on economic issues for customers, partners and policymakers. The Institute is composed of a team of economists and data scientists who utilize and synthesize the anonymized and aggregated data from the Mastercard network, together with public data, to bring powerful insights to life, in the form of 1:1 presentations, global thought leadership, media participation, and commercial work through the company's product suites.

About the Role
Mastercard's Economics Institute is seeking a talented Economics Analyst with R or Python programming skills to join our Global Growth and Operations team. Reporting to the Director, Growth and Operations, this individual will blend advanced economic research with strong programming and data visualization skills. This is a unique opportunity for someone passionate about applying economic thinking and technical expertise to real-world questions at the intersection of economics, retail, and commerce. The role involves working on large-scale data analytics, developing innovative economic insights, and building compelling visualizations that help communicate these insights to diverse audiences.

Responsibilities:
Support client and stakeholder engagements across MEI.
Collaborate with team economists, econometricians, developers, visualization experts and industry partners.
Develop and test hypotheses at the intersection of economics, retail, and commerce.
Structure work and manage small project streams, delivering impactful results.
Identify creative analyses and develop proprietary diagnostic indices using large and complex datasets, including macroeconomic and big data sources.
Generate insights, synthesize analyses into impactful storylines and interactive visuals, and help write reports and client presentations.
Enhance existing products and partner with internal stakeholders to develop new economic solutions.
Help create thought leadership and intellectual capital.
Create and format analytical content using Jupyter Notebooks, R Markdown and/or Quarto.
Work with databases and data platforms, including Databricks, SQL and Hadoop.
Write clear, well-documented code that others can understand and maintain.
Collaborate using Git for version control.

All About You:
Bachelor's degree in Economics (preferred), Statistics, Mathematics, or a related field.
Proficient in working with relational databases and writing SQL queries.
Expertise in working with large-scale data processing frameworks and tools, including Hadoop, Apache Spark, Apache Hive, and Apache Impala.
Proficient in R or Python, with experience in data processing packages.
Skilled in creating data visualizations to communicate complex economic insights to diverse audiences.
Experience using data visualization tools such as Tableau or Power BI.
Proficiency in machine learning, econometric and statistical techniques, including predictive modeling, survival analysis, time series modeling, classification, regression and clustering methods, is desirable.
Strong problem-solving skills and critical thinking abilities.
Excellent communication skills, both written and verbal.
Organized and able to prioritize work across multiple projects.
Collaborative team player who can also work independently.
Passionate about data, technology, and creating impactful economic insights.

Posted 2 weeks ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Bengaluru

Work from Office

Analytics Engineering Manager

HubSpot's mission is to help millions of organizations grow better. HubSpot's Data, Systems, & Intelligence team is core to this goal by supporting the HubSpotters who attract, engage, and delight HubSpot's customers. Our goal is to embed advanced analytics and algorithmic intelligence into the DNA of decision-making across Operations. Flexible, well-connected, well-architected data are the backbone of our strategy, along with the Analytics Engineers who create that landscape. We believe in the power of data to drive effective decision-making. We leverage data for both day-to-day operations (improving efficiency by surfacing the right data at the right time) and for strategy (fostering deep understanding of what makes our flywheel spin, and projecting how to spin it faster in the future). In this role, you'll be a member of the Data, Systems, & Intelligence team, designing and building a modern data foundation for efficient models and analyses. Your work will directly support analytic and ML products that will change the way we make decisions within Operations at HubSpot. We're looking for a manager for a team of Analytics Engineers with a strong interest in building and maintaining flexible, reusable architecture for our highest-value data assets. In this role, you will develop a deep understanding of the business and analytics priorities to lead a team to create solutions that will unlock our ability to scale and drive actionable insights.

In this role, you'll get to:
Lead a team of Analytics Engineers, supporting their development, aligning them with the most impactful projects, setting the team's vision, and establishing a clear operating model.
Provide the analyst's viewpoint on cross-functional data projects, ensuring solutions are robust, practical, and useful.
Own and champion data assets that answer HubSpot's most critical questions to help our business and customers grow, reinforcing them as the Source of Truth across the organization.
Develop a program for team enablement around reporting best practices, leveraging HubSpot's data assets within Snowflake SQL, dbt, LookML, Atlan, Monte Carlo, Cursor, and git.
Define team-wide coding standards, documentation requirements and development processes (code reviews, testing requirements, etc.).
Work with the Data, Systems, & Intelligence leadership team to set the future strategic direction for Operations, particularly in the area of model building and AI applications.

We are looking for people with experience in the following areas:

Software Development & Analytics:
Strong understanding of the analyst's workflow as it relates to both structured and unstructured datasets.
Technical execution of high-complexity business intelligence and analytics projects in a cloud-native environment.
Strong understanding of data pipelines, architectures and data sets; experience diagramming architecture and entity relationships with Lucidchart (for example).
Hands-on experience with advanced SQL, cloud data warehouses (e.g. Snowflake) and relational databases.
Experience with script-based analytic transformation tools like dbt.
Bonus: conceptual knowledge of Looker (LookML, Looks and Dashboards).

Analytics Engineering Leadership:
Hiring and retaining top Analytics Engineering talent.
Patience, empathy, and finding success in the team's success; giving and receiving feedback frequently.
Training and onboarding colleagues onto an analytics environment, especially on a remote-first / globally distributed team.
Experience identifying and driving process improvement around data modeling and tool usage.
Best-practice definition: identifying the minimal set of rules for the greatest team-wide gains in clarity, accuracy, discoverability, and reusability.

Project Management:
Experience discovering, defining, evaluating and prioritizing new projects or areas of opportunity.
Leading cross-functional projects, especially the construction of new data products or capabilities.
Influence without direct control: driving alignment and accountability across peer teams.
Driving execution: defining project objectives and success criteria, roadmaps, and responsibilities.
Effective communication across multiple channels (video, wikis, decks, guided exercises) and a knack for choosing the best format for the task and audience at hand.

We know the confidence gap and impostor syndrome can get in the way of meeting spectacular candidates, so please don't hesitate to apply; we'd love to hear from you. If you need accommodations or assistance due to a disability, please reach out to us using this form.

At HubSpot, we value both flexibility and connection. Whether you're a Remote employee or work from the Office, we want you to start your journey here by building strong connections with your team and peers. If you are joining our Engineering team, you will be required to attend a regional HubSpot office for in-person onboarding. If you join our broader Product team, you'll also attend other in-person events such as your Product Group Summit and other gatherings to continue building on those connections. If you require an accommodation due to travel limitations or other reasons, please inform your recruiter during the hiring process. We are committed to supporting candidates who may need alternative arrangements.

Massachusetts Applicants: It is unlawful in Massachusetts to require or administer a lie detector test as a condition of employment or continued employment. An employer who violates this law shall be subject to criminal penalties and civil liability.
Germany Applicants: (m/f/d) - link to HubSpot's Career Diversity page here.
India Applicants: link to HubSpot India's equal opportunity policy here.

About HubSpot
HubSpot (NYSE: HUBS) is an AI-powered customer platform with all the software, integrations, and resources customers need to connect marketing, sales, and service. HubSpot's connected platform enables businesses to grow faster by focusing on what matters most: customers. At HubSpot, bold is our baseline. Our employees around the globe move fast, stay customer-obsessed, and win together. Our culture is grounded in four commitments: Solve for the Customer, Be Bold, Learn Fast, Align, Adapt & Go!, and Deliver with HEART. These commitments shape how we work, lead, and grow. We're building a company where people can do their best work. We focus on brilliant work, not badge swipes. By combining clarity, ownership, and trust, we create space for big thinking and meaningful progress. And we know that when our employees grow, our customers do too. Recognized globally for our award-winning culture by Comparably, Glassdoor, Fortune, and more, HubSpot is headquartered in Cambridge, MA, with employees and offices around the world. Explore more: HubSpot Careers, and Life at HubSpot on Instagram.

By submitting your application, you agree that HubSpot may collect your personal data for recruiting, global organization planning, and related purposes. Refer to HubSpot's Recruiting Privacy Notice for details on data processing and your rights.

Posted 2 weeks ago

Apply

4.0 - 7.0 years

6 - 9 Lacs

coimbatore

Work from Office

ROS/ROS2 Developer
Experience: 3 - 6 | Positions: 1 | Location: Coimbatore | In-Office Work | Employment Type: Full Time / Long Term
Key Skills: ROS/ROS2, C++ and Python, OpenCV, PCL, LiDAR, cameras, IMUs, Gazebo, RViz, Webots, Linux and real-time systems

Position Overview:
We are looking for a talented and motivated ROS/ROS2 Developer to join our team. In this role, you will work on designing, developing, and integrating robotic software systems using the Robot Operating System (ROS/ROS2). You will contribute to the development of autonomous systems, robotic platforms, and software tools while collaborating closely with cross-functional teams.

Key Responsibilities:
Design, develop, and implement software modules using ROS/ROS2 for robotic systems.
Create and optimize robot navigation, localization, perception, and manipulation systems.
Develop custom ROS nodes and interfaces for hardware integration and data processing.
Work on sensor integration, including LiDAR, cameras, IMUs, and other peripherals.
Collaborate with hardware and software teams to ensure seamless integration of robotic systems.
Optimize robotic algorithms for real-time performance and efficiency.
Conduct testing and debugging of robotic systems in simulation and real-world environments.
Write clear and maintainable documentation for developed software.

Qualifications
Required Skills:
Proficiency in developing applications using ROS/ROS2.
Strong programming skills in C++ and Python.
Experience with robotic perception (e.g., OpenCV, PCL) and control systems.
Familiarity with robotics hardware, such as sensors, actuators, and embedded systems.
Hands-on experience with simulation tools like Gazebo, RViz, or Webots.
Knowledge of SLAM, path planning, and motion control algorithms.
Experience with Linux operating systems and command-line tools.
Understanding of real-time systems and multi-threaded programming.

Preferred Skills:
Familiarity with DDS (Data Distribution Service) middleware in ROS2.
Experience with hardware drivers and low-level interfaces.
Knowledge of machine learning or deep learning for robotics.
Exposure to Agile development methodologies and version control systems like Git.
Familiarity with Docker and CI/CD pipelines for robotics projects.
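The required skills above call out multi-threaded programming around sensor streams. As a rough, framework-free illustration (plain Python rather than ROS; the SensorReading type, queue name, and outlier threshold are all invented for the example), a producer thread can publish readings while a consumer thread filters them:

```python
import threading
import queue

# Minimal sketch of the producer/consumer pattern behind many ROS-style
# sensor pipelines: one thread publishes readings, another consumes and
# filters them. All names and values here are illustrative only.

class SensorReading:
    def __init__(self, source: str, value: float):
        self.source = source
        self.value = value

def produce(q: queue.Queue, readings):
    for r in readings:
        q.put(r)
    q.put(None)  # sentinel: no more data

def consume(q: queue.Queue, out: list):
    while True:
        r = q.get()
        if r is None:
            break
        # crude outlier rejection, as a perception node might apply
        if abs(r.value) < 100.0:
            out.append(r.value)

imu_queue: queue.Queue = queue.Queue()
accepted: list = []
samples = [SensorReading("imu", v) for v in (0.5, 1.2, 999.0, -0.3)]

producer = threading.Thread(target=produce, args=(imu_queue, samples))
consumer = threading.Thread(target=consume, args=(imu_queue, accepted))
producer.start(); consumer.start()
producer.join(); consumer.join()
# accepted now holds the filtered values; the 999.0 outlier is dropped
```

A real ROS 2 node would subscribe via rclpy callbacks instead of a hand-built queue, but the thread-safety concern is the same.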

Posted 2 weeks ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

noida

Work from Office

Python - AI / ML Engineer : Noida
Location: Noida | Experience: 4 to 8 Years

Role Summary & Key Objectives
We are seeking an experienced AI/ML Engineer with 4-8 years of industry experience in designing, developing, and deploying scalable machine learning solutions. The role involves working across the full lifecycle of ML systems, from data acquisition and feature engineering to model training, optimization, deployment, and monitoring. The ideal candidate will collaborate with product, data, and engineering teams to solve real-world problems using advanced machine learning and AI techniques.

Key Objectives:
Translate business problems into ML/AI solutions with measurable impact.
Build and optimize production-grade ML models that are scalable and reliable.
Ensure robustness, fairness, and efficiency in ML pipelines.
Drive adoption of AI/ML best practices across the organization.

Core Responsibilities
Design and implement ML algorithms for prediction, classification, recommendation, NLP, or computer vision use cases.
Collect, clean, and preprocess large datasets for model development.
Develop, train, validate, and fine-tune machine learning / deep learning models.
Deploy ML models into production using MLOps best practices (CI/CD pipelines, monitoring, retraining).
Collaborate with cross-functional teams (data engineers, product managers, software developers) to integrate AI features into products.
Continuously evaluate new ML techniques, frameworks, and research to improve model accuracy and performance.
Document solutions, conduct knowledge-sharing sessions, and mentor junior engineers.

Must-Have Skills (Technical & Soft)
Technical:
Strong programming skills in Python (NumPy, Pandas, Scikit-learn, PyTorch, TensorFlow).
Experience with machine learning algorithms (supervised, unsupervised, reinforcement learning).
Hands-on expertise in deep learning (CNNs, RNNs, Transformers).
Proficiency in data preprocessing, feature engineering, and statistical analysis.
Experience with SQL/NoSQL databases and handling large datasets.
Familiarity with MLOps tools (MLflow, Kubeflow, Airflow, Docker, Kubernetes).
Knowledge of cloud platforms (AWS SageMaker, Azure ML, GCP AI Platform).
Generative AI models (e.g., LLMs, GANs, Diffusion Models, VAEs, Transformers).
Fine-tuning and optimizing foundation models for domain-specific applications.
Agentic AI frameworks (AutoGen, LangChain/LangGraph, CrewAI, Microsoft Semantic Kernel, OpenAI).
Experience in multimodal AI (text, image, audio, video generation).
Familiarity with prompt engineering & fine-tuning LLMs.
Knowledge of vector databases (Pinecone, Weaviate, FAISS, Milvus, etc.) for retrieval-augmented generation (RAG).

Soft Skills:
Strong problem-solving and analytical thinking.
Excellent communication and presentation skills.
Ability to work in collaborative, cross-functional teams.
Self-driven and proactive in exploring new technologies.

Good-to-Have Skills
Exposure to NLP frameworks (Hugging Face, spaCy, NLTK).
Computer Vision experience (OpenCV, YOLO, Detectron).
Experience with big data frameworks (Spark, Hadoop).
Knowledge of generative AI (LLMs, diffusion models, prompt engineering).
Contribution to research papers, open-source projects, or Kaggle competitions.
Familiarity with A/B testing and experimentation frameworks.

Experience Requirements
4 to 8 years of professional experience in AI/ML development.
Proven track record of building and deploying ML models into production.
Experience in solving business problems through applied machine learning.

KPIs / Success Metrics
Accuracy, precision, recall, F1-score, or other relevant model performance metrics.
Successful deployment of ML models with minimal downtime and robust monitoring.
Reduction in data processing or inference time (efficiency improvements).
Measurable business impact (e.g., improved predictions, reduced churn, better personalization).
Contribution to team learning through code reviews, mentoring, and documentation.
Staying updated on and adopting relevant cutting-edge ML/AI techniques.
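The KPIs listed above (precision, recall, F1-score) reduce to simple counting for a binary classifier. A minimal hand-rolled sketch, with made-up labels purely for illustration:

```python
# Precision, recall, and F1 for a binary problem, computed from first
# principles; the label arrays below are invented example data.

def prf1(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

y_true = [1, 0, 1, 1, 0, 1]
y_pred = [1, 0, 0, 1, 1, 1]
p, r, f = prf1(y_true, y_pred)  # tp=3, fp=1, fn=1
```

In practice these come from `sklearn.metrics`, but knowing the counting behind them helps when debugging a surprising score.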

Posted 2 weeks ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

bengaluru

Work from Office

":" Experience: Minimum 4+ years (relevant experience mandatory) Key Skills Required Strong hands-on experience with Scala (mandatory) and Apache Spark Experience with Hadoop ecosystem HDFS, Hive, Impala, Sqoop Data ingestion and pipeline development for large-scale systems Proficiency in Java and distributed data processing Knowledge of data warehousing and query optimization Job Description We are seeking a skilled Data Engineer (Spark & Scala) with hands-on expertise in big data technologies and large-scale data processing. The role involves building and optimizing data ingestion pipelines , working with the Hadoop ecosystem , and ensuring high-performance data workflows. Responsibilities Design, develop, and optimize data ingestion pipelines using Spark and Scala. Work with Hadoop ecosystem tools (HDFS, Hive, Impala, Sqoop) for large-scale data processing. Collaborate with cross-functional teams to integrate structured and unstructured data sources. Implement data transformation, validation, and quality checks. Optimize data workflows for scalability, performance, and fault tolerance. Write clean, efficient, and maintainable code in Scala and Java. Ensure compliance with best practices for data governance and security. Desired Candidate Profile Minimum 4+ years of experience in data engineering. Strong expertise in Scala (mandatory) and Apache Spark . Hands-on experience with Hadoop ecosystem tools (HDFS, Hive, Impala, Sqoop). Proficiency in Java for distributed system development. Strong problem-solving and analytical skills. Ability to work in fast-paced, collaborative environments. " , "Job_Opening_ID":"ZR_3382_JOB" , "Job_Type":"Contract" , "Job_Opening_Name":"Data Engineer (Spark & Scala)" , "State":"Karnataka" , "Currency":"INR" , "Country":"India" , "Zip_Code":"560001" , "id":"40099000030883728" , "Publish":true , "Keep_on_Career_Site":false , "Date_Opened":"2025-08-29"}]);

Posted 2 weeks ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

bengaluru

Work from Office

We are seeking a highly skilled and experienced Senior Generative AI Engineer to join our innovative team, with a paramount focus on developing and rigorously evaluating sophisticated multi-agent AI systems. This role is crucial for designing, building, deploying, and ensuring the accuracy and reliability of cutting-edge generative AI solutions that leverage collaborative AI agents. The ideal candidate will possess a deep understanding of generative models, combined with robust MLOps practices, strong back-end engineering skills in microservices architectures on cloud platforms like AWS or GCP, and an absolute mastery of Python, Langgraph, and Langchain. Proven experience with evaluation methodologies, including working with evaluation datasets and measuring the accuracy of multi-agent systems using tools like Langsmith or other open-source alternatives, is a must-have.

Key Responsibilities:

Generative AI Development & Multi-Agent Systems:
Design, develop, and implement advanced generative AI models (LLMs) for various applications, from ideation to production.
Architect, build, and deploy intelligent multi-agent AI systems, enabling collaborative behaviors and complex decision-making workflows.
Utilize and extend frameworks like Langchain and Langgraph extensively for building sophisticated, multi-step AI applications, intelligent agents, and agentic workflows, with a strong focus on their evaluability.
Fine-tune and adapt pre-trained generative models to specific business needs and datasets, often as components within agentic systems.
Develop strategies for prompt engineering and RAG (Retrieval Augmented Generation) to enhance model performance and control, particularly in multi-agent contexts.
Research and stay abreast of the latest advancements in generative AI, natural language processing, multi-agent systems, and autonomous AI.

Multi-Agent System Evaluation & Accuracy:
Design, develop, and execute comprehensive evaluation strategies for multi-agent systems, defining key performance indicators (KPIs) and success metrics.
Create, manage, and utilize high-quality evaluation datasets to rigorously test the accuracy, coherence, consistency, and robustness of multi-agent system outputs.
Implement and leverage tools like Langsmith or other open-source solutions (e.g., TruLens, Ragas, custom frameworks) to trace agent interactions, analyze trajectories, and measure the accuracy and effectiveness of multi-agent system behavior.
Perform root cause analysis for evaluation failures and drive iterative improvements to agent design and system performance.
Develop methods for assessing inter-agent communication efficiency, task allocation accuracy, and collaborative problem-solving success.

MLOps & Deployment:
Establish and implement robust MLOps pipelines for training, evaluating, deploying, monitoring, and managing generative AI models and multi-agent systems in production environments.
Ensure model and agent system scalability, reliability, and performance in a production setting.
Implement version control for models, data, and code.
Monitor model drift, performance degradation, and data quality, implementing proactive solutions for both individual models and interconnected agents.

Back-end Engineering (Microservices on AWS/GCP):
Develop and maintain highly scalable and resilient microservices to integrate generative AI models and orchestrate multi-agent systems into larger applications.
Design and implement APIs for model inference and agent interaction and coordination.
Deploy and manage microservices on cloud platforms such as AWS or GCP, utilizing services like EC2, S3, Lambda, EKS/ECS, SageMaker, GCP Compute Engine, GCS, GKE, Vertex AI, etc., with a focus on supporting agentic architectures.
Implement best practices for security, logging, monitoring, and error handling in microservices, especially concerning inter-agent communication and system resilience.

Collaboration & Leadership:
Collaborate closely with data scientists, machine learning engineers, product managers, and other stakeholders to translate business requirements into technical solutions, with a keen eye on opportunities for multi-agent automation and their measurable impact.
Mentor junior engineers and contribute to the growth of the team's technical capabilities, particularly in agentic AI development and rigorous evaluation.
Participate in code reviews, architectural discussions, and technical design sessions, championing best practices for multi-agent system design and testability.

Required Qualifications:
Bachelor's or Master's degree in Computer Science, Artificial Intelligence, Machine Learning, or a related quantitative field.
4+ years of experience in software engineering, with at least 2+ years focused on Machine Learning Engineering or Generative AI development.
Demonstrable prior experience in multi-agent product development, including designing, implementing, and deploying systems with interacting AI agents.
Mandatory experience in working with evaluation datasets, defining metrics, and assessing the accuracy and performance of multi-agent systems using tools like Langsmith or comparable open-source alternatives.
Exceptional proficiency in Python and its ecosystem for machine learning (e.g., PyTorch, TensorFlow, Hugging Face Transformers).
Deep expertise with Langgraph and Langchain for building complex LLM applications, intelligent agents, and orchestrating multi-agent workflows.
Solid understanding and practical experience with various generative AI models (LLMs).
Proven experience with MLOps principles and tools (e.g., MLflow, Kubeflow, Data Version Control (DVC), CI/CD for ML), with an emphasis on agent system lifecycle management and continuous evaluation.
Extensive experience designing, developing, and deploying microservices architectures on either AWS or GCP.
Proficiency with containerization technologies (Docker) and orchestration (Kubernetes).
Strong understanding of API design and development (RESTful, gRPC).
Excellent problem-solving skills, with a focus on building robust, scalable, and maintainable solutions.
Strong communication and collaboration skills.

Preferred Qualifications:
Experience with Apache Spark for large-scale data processing.
Experience with specific AWS services (e.g., SageMaker, Lambda, EKS) or GCP services (e.g., Vertex AI, GKE, Cloud Functions) for deploying and managing agentic systems.
Familiarity with other distributed computing frameworks.
Contributions to open-source projects in the AI/ML space, especially those related to multi-agent systems or agent frameworks (e.g., AutoGen, CrewAI).
Experience with real-time inference for generative models and real-time agent decision-making and evaluation.
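The evaluation work described above, scoring a system against an evaluation dataset, boils down to a small loop at its simplest. A minimal sketch with an invented dataset, a stub standing in for the agent call, and exact-match as the (deliberately simplistic) metric; real setups would trace the full agent trajectory with a tool like Langsmith:

```python
# Score a (stubbed) agent against a tiny evaluation dataset.
# Questions, answers, and the matching rule are all illustrative.

eval_set = [
    {"question": "capital of France", "expected": "paris"},
    {"question": "2 + 2",             "expected": "4"},
    {"question": "largest planet",    "expected": "jupiter"},
]

def fake_agent(question: str) -> str:
    # stand-in for a Langgraph/Langchain pipeline invocation
    answers = {"capital of France": "Paris",
               "2 + 2": "4",
               "largest planet": "Saturn"}  # deliberately wrong answer
    return answers[question]

def exact_match(pred: str, gold: str) -> bool:
    return pred.strip().lower() == gold.strip().lower()

correct = sum(exact_match(fake_agent(e["question"]), e["expected"])
              for e in eval_set)
accuracy = correct / len(eval_set)  # 2 of 3 correct
```

Swapping exact-match for an LLM-as-judge or semantic-similarity scorer changes only the `exact_match` function; the harness stays the same.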

Posted 2 weeks ago

Apply

5.0 - 9.0 years

7 - 11 Lacs

hyderabad

Work from Office

Career Category: Information Systems

Job Description
Join Amgen's Mission of Serving Patients
At Amgen, if you feel like you're part of something bigger, it's because you are. Our shared mission to serve patients living with serious illnesses drives all that we do. Since 1980, we've helped pioneer the world of biotech in our fight against the world's toughest diseases. With our focus on four therapeutic areas (Oncology, Inflammation, General Medicine, and Rare Disease) we reach millions of patients each year. As a member of the Amgen team, you'll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you'll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.

Senior Associate Software Engineer - Cloud Ops

What you will do
Let's do this. Let's change the world. In this vital role you will be responsible for designing, developing, and maintaining software applications and solutions that meet business needs, and for ensuring the availability and performance of critical systems and applications. This role involves working closely with product managers, designers, and other engineers to create high-quality, scalable software solutions, as well as automating operations, monitoring system health, and responding to incidents to minimize downtime.

Roles & Responsibilities:
Design and implement systems and processes to improve the reliability, scalability, and performance of applications
Automate routine operational tasks, such as deployments, monitoring, and incident response, to improve efficiency and reduce human error
Develop and maintain monitoring tools and dashboards to track system health, performance, and availability, along with optimizing cloud cost and performance (e.g., right-sizing instances)
Respond to and resolve incidents promptly, conducting root cause analysis and implementing preventive measures
Provide ongoing maintenance and support for existing systems, ensuring that they are secure, efficient, and reliable
Work on integrating various software applications and platforms to ensure seamless operation across the organization
Implement and maintain security measures to protect systems from unauthorized access and other threats
Focus on managing and operating cloud infrastructure (compute, storage, networking, security)
Ensure day-to-day cloud environments (AWS, GCP, Azure) are healthy, cost-optimized, and secure
Work closely on provisioning, configuration, and lifecycle management of cloud resources

What we expect of you
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
Master's or Bachelor's degree in computer science or STEM majors and 5 to 9 years of experience, with a minimum of 1-2 years of Information Systems experience
Working experience with various cloud services on AWS (Azure, GCP) and containerization technologies (Docker, Kubernetes)
Programming skills in languages such as Python
Working experience with infrastructure as code (IaC) tools (Terraform, CloudFormation)
Working experience with monitoring and alerting tools (Prometheus, Grafana, etc.)
Working experience with DevOps/MLOps practices and CI/CD pipelines

Good-to-Have Skills:
Understanding of cloud platforms (e.g., AWS, GCP, Azure) and containerization technologies (e.g., Docker, Kubernetes)
Experience with monitoring and logging tools (e.g., Prometheus, Grafana, Splunk)
Experience with data processing tools like Hadoop, Spark, or similar

Professional Certifications:
AWS Certified DevOps Engineer - Associate certification (preferred)
AWS Certified CloudOps Engineer - Associate certification (preferred)

Soft Skills:
Excellent analytical and troubleshooting skills
Strong verbal and written communication skills
Ability to work effectively with global, virtual teams
High degree of initiative and self-motivation
Ability to manage multiple priorities successfully
Team-oriented, with a focus on achieving team goals
Strong presentation and public speaking skills

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
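For context on the availability side of cloud operations work like this, an availability target implies a concrete yearly downtime budget, and the arithmetic is quick. A sketch using the common "five nines" (99.999%) figure as an example target:

```python
# Yearly downtime budget implied by an availability target.
# 99.999% ("five nines") is used here purely as an example figure.

def downtime_minutes_per_year(availability: float) -> float:
    minutes_per_year = 365 * 24 * 60  # 525,600 minutes
    return (1 - availability) * minutes_per_year

budget = downtime_minutes_per_year(0.99999)
# five nines leaves roughly 5.26 minutes of downtime per year
```

This kind of number is what turns an SLO into an operational constraint: a 5-minute annual budget rules out manual incident response as the primary recovery path.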

Posted 2 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

bengaluru

Work from Office

Data Engineer

Roles and Responsibilities:
Create and maintain efficient and scalable data models as per business needs
Create and maintain optimal data pipelines against multiple data sources like SQL and Big Data on Azure / AWS cloud
Assemble and process large, complex data sets to meet both functional and non-functional business requirements
Analyze and improve existing data models, pipelines, related infrastructure and processes
Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics
Monitor and improve data quality and data governance policies
Collaborate with stakeholders including the executive, product, data and design teams to assist with data-related technical issues and support their data infrastructure needs
Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader
Work with data and analytics experts to strive for greater functionality in our data systems

Must Have Skills:
5+ years of experience working with distributed data technologies (e.g., Hadoop, MapReduce, Spark, Kafka, Flink, etc.) for building efficient, large-scale big data pipelines
Strong software engineering experience with proficiency in at least one of the following programming languages: Java, Scala, Python or equivalent
Experience implementing data ingestion pipelines, both real-time and batch, using best practices
Experience building stream-processing applications using Apache Flink, Kafka Streams or others
Experience with cloud computing platforms like Azure, Amazon AWS, Google Cloud, etc.
Experience supporting and working with cross-functional teams in a dynamic environment
Experience with relational SQL and NoSQL databases, including Postgres and Cassandra
Experience with the ELK stack
Ability to work in a Linux environment

Nice to Have:
Experience in building distributed, high-volume data services
Experience with the big data processing and analytics stack in AWS: EMR, S3, EC2, Athena, Kinesis, Lambda, QuickSight, etc.
Knowledge of data science tools and their integration with data lakes
Experience with container technologies like Docker/Kubernetes

Qualification:
Bachelor of Science in Computer Science or equivalent technical training and professional work experience.

About Nomiso:
Nomiso is a product and services engineering company. We are a team of Software Engineers, Architects, Managers, and Cloud Experts with expertise in Technology and Delivery Management. Our mission is to empower and enhance the lives of our customers through efficient solutions for their complex business problems. At Nomiso we encourage an entrepreneurial spirit - to learn, grow and improve. A great workplace thrives on ideas and opportunities; that is a part of our DNA. We're in pursuit of colleagues who share similar passions, are nimble and thrive when challenged. We offer a positive, stimulating and fun environment with opportunities to grow, a fast-paced approach to innovation, and a place where your views are valued and encouraged. We invite you to push your boundaries and join us in fulfilling your career aspirations!
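The stream-processing experience mentioned above (Apache Flink, Kafka Streams) centers on windowed aggregation. A framework-free sketch of a tumbling-window sum, with invented events and a 10-second window, just to show the core bucketing idea:

```python
# Tumbling-window sum over a stream of (timestamp, value) events.
# Events and the 10-second window size are invented for the example.

events = [
    (1, 10), (4, 20), (9, 5),   # land in window [0, 10)
    (12, 7), (18, 3),           # land in window [10, 20)
    (25, 100),                  # lands in window [20, 30)
]

def tumbling_sum(events, window_size):
    windows = {}
    for ts, value in events:
        start = (ts // window_size) * window_size  # window the event belongs to
        windows[start] = windows.get(start, 0) + value
    return windows

result = tumbling_sum(events, 10)
```

Flink and Kafka Streams add the hard parts this sketch omits: out-of-order events, watermarks, and fault-tolerant state, but the window assignment is the same arithmetic.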

Posted 2 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

pune

Work from Office

[{"Remote_Job":false , "Posting_Title":"Data Engineer" , "Is_Locked":false , "City":"Pune" , "Industry":"IT Services" , "Job_Opening_ID":"RRF_5743" , "Job_Description":" Who are we Fulcrum Digital is an agile and next-generation digital accelerating company providing digital transformation and technology services right from ideation to implementation. These services have applicability across a variety of industries, including banking & financial services, insurance, retail, higher education, food, health care, and manufacturing. TheRole Designing and building optimized data pipelines using cutting-edge technologies in a cloud environment to drive analytical insights. Constructing infrastructure for efficient ETL processes from various sources and storage systems. Leading the implementation of algorithms and prototypes to transform raw data into useful information. Architecting, designing, and maintaining database pipeline architectures, ensuring readiness for AI/ML transformations. Creating innovative data validation methods and data analysis tools. Ensuring compliance with data governance and security policies. Interpreting data trends and patterns to establish operational alerts. Developing analytical tools, programs, and reporting mechanisms. Conducting complex data analysis and presenting results effectively. Preparing data for prescriptive and predictive modeling. Continuously exploring opportunities to enhance data quality and reliability. Applying strong programming and problem-solving skills to develop scalable solutions. Requirements Experience in the Big Data technologies (Hadoop, Spark, Nifi,Impala) 5+ years of hands-on experience designing, building, deploying, testing, maintaining, monitoring, and owning scalable, resilient, and distributed data pipelines. High proficiency in Scala/Java and Spark for applied large-scale data processing. Expertise with big data technologies, including Spark, Data Lake, and Hive. 
Solid understanding of batch and streaming data processing techniques. Proficient knowledge of the Data Lifecycle Management process, including data collection, access, use, storage, transfer, and deletion. Expert-level ability to write complex, optimized SQL queries across extensive data volumes. Experience on HDFS, Nifi, Kafka. Experience on Apache Ozone, Delta Tables, Databricks, Axon(Kafka), Spring Batch, Oracle DB Familiarity with Agile methodologies. Obsession for service observability, instrumentation, monitoring, and alerting. Knowledge or experience in architectural best practices for building data lakes. " , "Job_Type":"Contract" , "Job_Opening_Name":"Data Engineer" , "State":"Maharashtra" , "Country":"India" , "Zip_Code":"411001" , "id":"613047000046931476" , "Publish":true , "Keep_on_Career_Site":false , "Date_Opened":"2025-08-19"}]
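As a toy illustration of the SQL aggregation skills listed above, here is an in-memory sqlite3 example (the table and data are invented, and a warehouse query would of course run at far larger scale over Hive or Delta tables):

```python
import sqlite3

# Tiny sqlite3 stand-in for a grouped aggregation query: total sales per
# region, filtered and sorted. Table name and rows are made up.

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("APAC", 100.0), ("APAC", 50.0), ("EMEA", 75.0)])

rows = conn.execute(
    "SELECT region, SUM(amount) AS total FROM sales "
    "GROUP BY region HAVING total > 60 ORDER BY total DESC"
).fetchall()
conn.close()
```

On real volumes the same GROUP BY shape is where optimization work happens: partition pruning, pre-aggregation, and avoiding shuffles.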

Posted 2 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

bengaluru

Work from Office

We are the AI Platform Team! We are looking for a highly motivated, self-reliant, experienced SRE and customer support engineer who is passionate about driving major transformation within the AI organization. This role will support the AI Platform and services that provide machine learning infrastructure to researchers and data scientists across the company. You'll be expected to stay in touch with the latest technology developments, drive implementation of DevOps practices across the organization, and provide customer support.

Job Functions
You will be a member of our AI Platform Team, supporting the next-generation AI architecture for various research and engineering teams within the organization.
You'll partner with vendors and the infrastructure engineering team for security and service availability.
You'll fix production issues with engineering teams, researchers, and data scientists, including performance and functional issues.
Diagnose and solve customer technical problems.
Participate in training customers and prepare reports on customer issues.
Be responsible for customer service improvements and recommend product improvements.
Write support documentation.
You'll design and implement zero-downtime monitoring to sustain a highly available service (99.999%).
As a support engineer, find opportunities to automate as part of the problem management process, creating automation to avoid issues.
Define engineering excellence for operational maturity.
You'll work together with AI platform developers to provide the CI/CD model to deploy and configure the production system automatically.
Develop and follow operational standard processes for tools and automation development, including style guides, versioning practices, source control, branching and merging patterns, and advising other engineers on development standards.
Deliver solutions that accelerate the activities phenomenal engineers would perform, through automation, deep domain expertise, and knowledge sharing.

Required Skills:
Demonstrated ability in designing, building, refactoring and releasing software written in Python.
Hands-on experience with ML frameworks such as PyTorch, TensorFlow, Triton.
Ability to handle framework-related issues, version upgrades, and compatibility with data processing / model training environments.
Experience with AI/ML model training and inferencing platforms is a big plus.
Experience with LLM fine-tuning systems is a big plus.
Debugging and triaging skills.
Cloud technologies like Kubernetes, Docker and Linux fundamentals.
Familiarity with DevOps practices and continuous testing.
DevOps pipelines and automation: app deployment/configuration & performance monitoring.
Test automation, Jenkins CI/CD.
Excellent communication, presentation, and leadership skills to be able to work and collaborate with partners, customers and engineering teams.
Well organized and able to manage multiple projects in a fast-paced and demanding environment.
Good oral/reading/writing English ability.
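One small example of the automation mindset this posting describes is wrapping a flaky health check in retry-with-exponential-backoff rather than paging a human. A minimal sketch (the helper name and the check are invented; the sleep is commented out so the example runs instantly):

```python
# Retry-with-exponential-backoff helper, the kind of small automation an
# SRE might add around a transiently failing health check.

def retry(fn, attempts=3, base_delay=1.0):
    """Call fn up to `attempts` times, doubling the wait between tries."""
    delays = []
    for i in range(attempts):
        try:
            return fn(), delays
        except Exception:
            if i == attempts - 1:
                raise  # budget exhausted: surface the failure
            delays.append(base_delay * (2 ** i))  # 1s, 2s, 4s, ...
            # time.sleep(delays[-1]) would go here in real code

calls = {"n": 0}
def flaky_check():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("unhealthy")
    return "ok"

status, waited = retry(flaky_check)  # succeeds on the third attempt
```

Production versions add jitter to the delay and a cap on total wait, so simultaneous retries from many clients don't synchronize into a thundering herd.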

Posted 2 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

bengaluru

Work from Office

About the Team Our team is dedicated to unlocking the rich knowledge embedded within Elsevier's content through our semantic data platform; this empowers researchers, clinicians, and innovators worldwide to gain new insights, make informed decisions, and accelerate progress across research, healthcare, and life sciences. We lead the ongoing transformation of Elsevier's vast, unstructured information into richly interconnected knowledge graphs that capture the full depth and nuance of scientific meaning. Through our dynamic knowledge discovery platform, we combine graph-powered agentic AI with advanced search technologies to deliver contextually relevant, trustworthy, and precise answers. As part of the Data Engineering team, you'll contribute to the systems and infrastructure that fuel this mission. This role focuses on building reusable platform components and delivering Data Products that enable scalable, reliable, and high-performance data solutions across Elsevier's ecosystem. The Role We are looking for a Senior Software Engineer II who can lead the design and implementation of complex systems, mentor team members, and contribute to our evolving data engineering architecture. You'll work on large-scale data pipelines, orchestration frameworks, and services that support our data products, collaborating closely with product, platform, and other engineering teams to deliver impactful, high-quality solutions. Responsibilities Design and develop scalable data processing workflows and microservices using Spark, Spark Streaming, and Airflow. Write modular, testable code in Python or Scala, aligned with software engineering best practices. Lead implementation of system components that span multiple services and modules. Diagnose and resolve complex technical issues in distributed data systems. Participate in architecture discussions, design reviews, and engineering rituals. Develop and maintain data models to support analytical and operational use cases.
Collaborate with cross-functional stakeholders to translate requirements into engineering solutions. Contribute to mentoring and onboarding of junior engineers. Champion continuous improvement and knowledge-sharing across the team. What We're Looking For 5+ years of professional experience in software or data engineering. Proven track record building and optimizing large-scale batch and streaming data systems. Proficiency with Spark, Spark Streaming, Airflow, and either Python or Scala. Deep understanding of distributed system design, data modeling, and performance optimization. Strong experience with test-driven development and CI/CD practices. Ability to independently drive technical outcomes from problem to deployment. Familiarity with Agile or other iterative development methodologies. Nice to Have Exposure to graph-based data models or knowledge graph architecture. Experience building internal platforms or reusable engineering components. Knowledge of observability best practices for data systems (e.g., logging, metrics, alerts). Career Progression This role provides a strong foundation for advancement to Software Engineering Lead, overseeing larger domains and mentoring broader teams, or Principal Software Engineer, driving architecture and cross-team technical strategy. There are also opportunities for horizontal moves into Systems Engineering, Program Management, or Quality Engineering depending on your career interests. Why Elsevier At Elsevier, you'll work on systems that underpin some of the world's most critical research and innovation workflows. Our products and platforms are used by scientists, clinicians, and experts in over 180 countries, and you'll be part of the team responsible for building the data infrastructure that powers them. You'll be supported by a culture that values engineering quality, autonomy, and deep technical exploration.
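The "modular, testable code" the role describes can be sketched in plain Python. This is only an illustrative stand-in under stated assumptions: the record fields (`id`, `title`) and function names are invented here, and a real implementation would use Spark DataFrames orchestrated by an Airflow DAG rather than in-memory lists:

```python
# Sketch of a pipeline built from small, independently testable stages.
# Hypothetical record shape; not the actual Elsevier schema or Spark code.

def clean_record(record: dict) -> dict:
    """Normalise a raw record: lower-case keys, trim string values."""
    return {k.lower(): v.strip() if isinstance(v, str) else v
            for k, v in record.items()}

def deduplicate(records: list, key: str) -> list:
    """Keep the first record seen for each value of `key`."""
    seen, out = set(), []
    for r in records:
        if r[key] not in seen:
            seen.add(r[key])
            out.append(r)
    return out

def run_pipeline(raw: list) -> list:
    """Compose the stages; each is unit-testable in isolation."""
    return deduplicate([clean_record(r) for r in raw], key="id")

raw = [{"ID": " a1 ", "Title": "Graph paper "},
       {"id": "a1", "title": "duplicate"},
       {"id": "b2", "title": "Knowledge graphs"}]
result = run_pipeline(raw)  # → two records, duplicate "a1" dropped
```

Keeping each stage a pure function is what makes the TDD practices the listing asks for straightforward: every stage can be asserted against small fixtures before it ever runs at scale.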
We are actively transforming legacy systems and pioneering modern approaches to data and knowledge representation at scale. Work Environment This is a hybrid role. Our teams operate in a flexible hybrid work model, combining in-person collaboration with remote flexibility. You'll be expected to participate in regular team meetings and engineering rituals in line with your team's cadence. We are committed to providing a fair and accessible hiring process. If you have a disability or other need that requires accommodation or adjustment, please let us know by completing our Applicant Request Support Form or contact 1-855-833-5120. Criminals may pose as recruiters asking for money or personal information. We never request money or banking details from job applicants. Learn more about spotting and avoiding scams. Please read our Candidate Privacy Policy. We are an equal opportunity employer; qualified applicants are considered for and treated during employment without regard to race, color, creed, religion, sex, national origin, citizenship status, disability status, protected veteran status, age, marital status, sexual orientation, gender identity, genetic information, or any other characteristic protected by law. USA Job Seekers: EEO Know Your Rights.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

hyderabad

Work from Office

Career Category Information Systems Job Description Join Amgen's Mission of Serving Patients At Amgen, if you feel like you're part of something bigger, it's because you are. Our shared mission to serve patients living with serious illnesses drives all that we do. Since 1980, we've helped pioneer the world of biotech in our fight against the world's toughest diseases. With our focus on four therapeutic areas, Oncology, Inflammation, General Medicine, and Rare Disease, we reach millions of patients each year. As a member of the Amgen team, you'll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science based. If you have a passion for challenges and the opportunities that lie within them, you'll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career. Specialist Software Engineer Cloud Ops What you will do Let's do this. Let's change the world. In this vital role you will be responsible for designing, developing, and maintaining software applications and solutions that meet business needs and ensuring the availability and performance of critical systems and applications. This role involves working closely with product managers, designers, and other engineers to create high-quality, scalable software solutions, automating operations, monitoring system health, and responding to incidents to minimize downtime.
Roles & Responsibilities: Design and implement systems and processes to improve the reliability, scalability, and performance of applications. Automate routine operational tasks, such as deployments, monitoring, and incident response, to improve efficiency and reduce human error. Develop and maintain monitoring tools and dashboards to track system health, performance, and availability, and optimize cloud cost and performance (e.g., right-sizing instances). Respond to and resolve incidents promptly, conducting root cause analysis and implementing preventive measures. Provide ongoing maintenance and support for existing systems, ensuring that they are secure, efficient, and reliable. Work on integrating various software applications and platforms to ensure seamless operation across the organization. Implement and maintain security measures to protect systems from unauthorized access and other threats. Focus on managing and operating cloud infrastructure (compute, storage, networking, security). Ensure day-to-day cloud environments (AWS, GCP, Azure) are healthy, cost-optimized, and secure. Work closely on provisioning, configuration, and lifecycle management of cloud resources. What we expect of you We are all different, yet we all use our unique contributions to serve patients. Basic Qualifications: Doctorate degree / Master's degree / Bachelor's degree in computer science or other STEM majors with a minimum of 5 years of Information Systems experience. Working experience with various cloud services on AWS (Azure, GCP) and containerization technologies (Docker, Kubernetes). Strong programming skills in languages such as Python. Working experience with infrastructure as code (IaC) tools (Terraform, CloudFormation). Working experience with monitoring and alerting tools (Prometheus, Grafana, etc.).
Working experience with DevOps/MLOps practices and CI/CD pipelines. Proficiency in automated testing tools and frameworks (e.g., Selenium, JUnit, pytest), incident management, production issue root cause analysis, and improving system quality. Preferred Qualifications: Strong understanding of cloud platforms (e.g., AWS, GCP, Azure) and containerization technologies (e.g., Docker, Kubernetes). Experience with monitoring and logging tools (e.g., Prometheus, Grafana, Splunk). Experience with data processing tools like Hadoop, Spark, or similar. Experience with SAP integration technologies. Professional Certifications: AWS Certified DevOps Engineer Associate or Professional certification (preferred). AWS Certified CloudOps Engineer Associate or Professional certification (preferred). Soft Skills: Excellent analytical and troubleshooting skills. Strong verbal and written communication skills. Ability to work effectively with global, virtual teams. High degree of initiative and self-motivation. Ability to manage multiple priorities successfully. Team-oriented, with a focus on achieving team goals. Strong presentation and public speaking skills. What you can expect of us As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease.
Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Posted 2 weeks ago

Apply

6.0 - 10.0 years

8 - 12 Lacs

bengaluru

Work from Office

You will be a key member of our Data Engineering team, focused on designing, developing, and maintaining robust data solutions in on-prem environments. You will work closely with internal teams and client stakeholders to build and optimize data pipelines and analytical tools using Python, Scala, SQL, Spark, and Hadoop ecosystem technologies. This role requires deep hands-on experience with big data technologies in traditional data centre environments (non-cloud). What you'll be doing: Design, build, and maintain on-prem data pipelines to ingest, process, and transform large volumes of data from multiple sources into data warehouses and data lakes. Develop and optimize Scala-Spark and SQL jobs for high-performance batch and real-time data processing. Ensure the scalability, reliability, and performance of data infrastructure in an on-prem setup. Collaborate with data scientists, analysts, and business teams to translate their data requirements into technical solutions. Troubleshoot and resolve issues in data pipelines and data processing workflows. Monitor, tune, and improve Hadoop clusters and data jobs for cost and resource efficiency. Stay current with on-prem big data technology trends and suggest enhancements to improve data engineering capabilities. Bachelor's degree in software engineering or a related field. 5+ years of experience in data engineering or a related domain. Strong programming skills in Python & S
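As a toy illustration of the group-and-aggregate work such Scala-Spark and SQL jobs typically perform, here is a pure-Python equivalent of a group-by-and-sum. The event fields (`region`, `amount`) are invented for the example; on the stack this listing describes, the same logic would be a Spark `groupBy`/`agg` over data in HDFS or Hive:

```python
# Pure-Python stand-in for a batch "group by key, sum values" job.
# Hypothetical fields; not the actual job or schema from the listing.
from collections import defaultdict

def aggregate_by_key(events, key_field, value_field):
    """Sum value_field per distinct value of key_field."""
    totals = defaultdict(float)
    for e in events:
        totals[e[key_field]] += e[value_field]
    return dict(totals)

events = [{"region": "east", "amount": 10.0},
          {"region": "west", "amount": 5.0},
          {"region": "east", "amount": 2.5}]
totals = aggregate_by_key(events, "region", "amount")  # → {"east": 12.5, "west": 5.0}
```

The point of the sketch is the shape of the computation, not the engine: tuning the distributed version (partitioning, shuffle behavior, memory) is where the Hadoop-cluster experience the listing asks for comes in.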

Posted 2 weeks ago

Apply

8.0 - 10.0 years

25 - 30 Lacs

hyderabad

Work from Office

Career Category Sales & Marketing Operations Job Description ABOUT AMGEN Amgen harnesses the best of biology and technology to fight the world's toughest diseases, and make people's lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what's known today. ABOUT THE ROLE Role Description: The Patient Data Management Senior Manager will be responsible for building and growing the Patient Data Management team and capabilities, insourcing work from external vendors to drive operational excellence in our Innovation Center in India. The Sr. Manager will collaborate with other data and analytics colleagues across the organization to manage Amgen's patient data assets (e.g., data generated from Patient Support Programs and Specialty Pharmacy patient data contracts). This also includes managing solutions to compliantly connect patient data assets that will serve as the foundation for driving patient journey analytics that inform enhanced patient experiences. This position is responsible for overseeing cross-functional processes, SOPs, and governance to ensure secure and appropriate access to these sensitive data sets.
Roles & Responsibilities: Collaborate with cross-functional data & analytics leads to define use cases and deliver high quality data that supports longitudinal patient journey analytics. Deliver end-to-end data curation, including aggregation, processing, integration, and secure access for new and existing patient data assets, ensuring adherence to all Amgen privacy and compliance SOPs. Implement patient data connectivity across Amgen's first party data and third party syndicated data assets to enable longitudinal analytics across the patient journey. Drive productivity and improve cycle times with automation in data processing and data quality. Drive data requirements for patient data contracting with Specialty Pharmacies and Sites of Care organizations. Develop guidelines and SOPs for appropriate access and usage of Patient Data, with guidance from Legal, Privacy and Compliance leads. Drive standardization of data deliverables, including source data layouts, reference data, and reporting solutions. Manage and oversee outsourced data aggregation vendor partners, ensuring adherence to compliance guidelines, processes, and documentation related to processing of Patient Data. Collaborate and build strong partnerships across the organization, including Commercial Analytics, Data Platforms, Data Sciences, Patient Access, Privacy, Legal, Compliance, IS, and external vendor partners. Basic Qualifications and Experience: Master's degree and 8 to 10 years of data management experience OR Bachelor's degree and 10 to 14 years of relevant data management experience. Proven experience in building teams, managing, and retaining talent in India. Demonstrated success in managing complex transitions. Strong leadership, stakeholder management, and communication skills. Excellent English oral and written communication. Comfortable working in global teams and across time zones and cultures.
Functional Skills: Must-Have Skills: Experience in Data Management, Analytics, Sales and/or Marketing Operations, Value & Access, or a related field. Experience working in the life sciences data domain, ideally in the areas of Data Platforms and Patient Data. Expertise with relevant data, analytics, and technology solutions (AWS or other cloud platforms, SQL, Databricks, Tableau, SAS, Python, R, etc.). Direct experience in analyzing large data sets and translating data into information and insights. Excellent communication skills, including interpersonal skills to foster collaboration and success in a highly matrixed environment; strong oral/written presentation skills. Strong project management skills: design, lead, and manage project teams in a matrixed and, at times, ambiguous environment. Prior experience in partnering with global stakeholders and managing teams on planning and execution of multi-year programs. Quick learning agility to understand new concepts, such as our end-to-end Commercial data flows, to identify and remediate interdependencies. Alignment with best practices and a commitment to championing new and innovative methodologies and tools to enhance patient data strategies and solutions to drive business outcomes. Good-to-Have Skills: Strong expertise and experience working with patient-level datasets, including first party sources such as Patient Support Programs, Specialty Pharmacies, syndicated claims data, digital, etc. Ability to effectively translate business stakeholder requirements to IT partners to ensure high quality, timely, integrated data assets. Ability to prioritize and deliver an aggressive set of data deliveries each quarter through agile execution and continuous improvement. Demonstrated self-starter, able to work under limited supervision. Strong interpersonal skills, negotiation skills, active listening, and relationship management skills. Strong vendor management skills. Familiarity with and application of the Scaled Agile Framework (SAFe).
Soft Skills: Excellent leadership and team management skills. Exceptional collaboration and communication skills. Strong data & analytics/critical-thinking and decision-making abilities. Able to perform well in a fast-paced, changing environment. Strong oral, written, and presentation skills, with the ability to articulate complex concepts and controversial findings clearly and compellingly.

Posted 2 weeks ago

Apply

0.0 - 3.0 years

2 - 2 Lacs

pune

Work from Office

Freshers are welcome! This is a non-technical job and includes data cleansing, data processing, and validation for financial investors. Excel and language proficiency is a must. For more information, please feel free to call +918657778477. Benefits: overtime allowance, Employee State Insurance, gratuity.

Posted 2 weeks ago

Apply

8.0 - 13.0 years

25 - 35 Lacs

hyderabad

Work from Office

Career Category Information Systems Job Description Join Amgen's Mission of Serving Patients At Amgen, if you feel like you're part of something bigger, it's because you are. Our shared mission to serve patients living with serious illnesses drives all that we do. Since 1980, we've helped pioneer the world of biotech in our fight against the world's toughest diseases. With our focus on four therapeutic areas, Oncology, Inflammation, General Medicine, and Rare Disease, we reach millions of patients each year. As a member of the Amgen team, you'll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science based. If you have a passion for challenges and the opportunities that lie within them, you'll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career. Senior Data Engineer What you will do Let's do this. Let's change the world. The role is responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and executing data governance initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed.
The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes. Roles & Responsibilities: Design, develop, and maintain data solutions for data generation, collection, and processing. Be a key team member assisting in the design and development of the data pipeline. Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems. Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions. Take ownership of data pipeline projects from inception to deployment; manage scope, timelines, and risks. Collaborate with cross-functional teams to understand data requirements and design solutions that meet business needs. Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency. Implement data security and privacy measures to protect sensitive data. Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions. Collaborate and communicate effectively with product teams. Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines that meet fast-paced business needs across geographic regions. Identify and resolve complex data-related challenges. Adhere to best practices for coding, testing, and designing reusable code/components. Explore new tools and technologies that will help improve ETL platform performance. Participate in sprint planning meetings and provide estimations on technical implementation. What we expect of you We are all different, yet we all use our unique contributions to serve patients.
Basic Qualifications: Doctorate degree / Master's degree / Bachelor's degree and 8 to 13 years in Computer Science, IT, or a related field. Hands-on experience with big data technologies and platforms, such as Databricks and Apache Spark (PySpark, SparkSQL), workflow orchestration, and performance tuning of big data processing. Proficiency in data analysis tools (e.g., SQL) and experience with data visualization tools. Excellent problem-solving skills and the ability to work with large, complex datasets. Solid understanding of data governance frameworks, tools, and standard methodologies. Knowledge of data protection regulations and compliance requirements (e.g., GDPR, CCPA). Preferred Qualifications: Experience with ETL tools such as Apache Spark, and various Python packages related to data processing and machine learning model development. Solid understanding of data modeling, data warehousing, and data integration concepts. Knowledge of Python/R, Databricks, SageMaker, and cloud data platforms. Professional Certifications (Preferred): AWS Certified Data Engineer / Data Analyst (preferred on Databricks or cloud environments). Soft Skills: Excellent critical-thinking and problem-solving skills. Strong communication and collaboration skills. Demonstrated awareness of how to function in a team setting. Demonstrated presentation skills. What you can expect of us As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
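The extract-transform-load flow this listing keeps returning to can be sketched as three composable stages in plain Python. This is a minimal illustration with invented field names and an in-memory "warehouse"; the production version would be PySpark/Databricks jobs, not generators over lists:

```python
# Minimal ETL sketch: extract -> transform (with a data-quality filter) -> load.
# Hypothetical row shape; purely illustrative, not an actual Amgen pipeline.

def extract(source_rows):
    """Extract stage: in practice, read from a source system."""
    yield from source_rows

def transform(rows):
    """Transform stage: drop bad rows, normalise types."""
    for row in rows:
        if row.get("value") is not None:   # basic data-quality check
            yield {"id": row["id"], "value": float(row["value"])}

def load(rows, warehouse):
    """Load stage: upsert rows into the target store by id."""
    for row in rows:
        warehouse[row["id"]] = row
    return warehouse

warehouse = load(transform(extract([
    {"id": 1, "value": "3.5"},
    {"id": 2, "value": None},   # dropped by the quality filter
])), {})
```

Because each stage is a generator, rows stream through without materialising intermediate lists, which is the same pipelining idea that distributed ETL engines apply at cluster scale.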

Posted 2 weeks ago

Apply

8.0 - 13.0 years

25 - 35 Lacs

hyderabad

Work from Office

Experience Required: 8+ years Mode of work: Remote Skills Required: Azure Databricks, Event Hub, Kafka, Architecture, Azure Data Factory, PySpark, Python, SQL, Spark Notice Period: Immediate joiners / permanent / contract role (can join by September 15th, 2025) Design, develop, and maintain scalable and robust data solutions in the cloud using Apache Spark and Databricks. Gather and analyse data requirements from business stakeholders and identify opportunities for data-driven insights. Build and optimize data pipelines for data ingestion, processing, and integration using Spark and Databricks. Ensure data quality, integrity, and security throughout all stages of the data lifecycle. Collaborate with cross-functional teams to design and implement data models, schemas, and storage solutions. Optimize data processing and analytics performance by tuning Spark jobs and leveraging Databricks features. Provide technical guidance and expertise to junior data engineers and developers. Stay up to date with emerging trends and technologies in cloud computing, big data, and data engineering. Contribute to the continuous improvement of data engineering processes, tools, and best practices. Bachelor's or master's degree in computer science, engineering, or a related field. 10+ years of experience as a Data Engineer, Software Engineer, or similar role, with a focus on building cloud-based data solution

Posted 2 weeks ago

Apply