10.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Job purpose: Design, develop, and deploy end-to-end AI/ML systems, focusing on large language models (LLMs), prompt engineering, and scalable system architecture. Leverage technologies such as Java/Node.js/.NET to build robust, high-performance solutions that integrate with enterprise systems. Who You Are: Education: Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field. A PhD is a plus. 10+ years of experience in AI/ML development, with at least 2 years working on LLMs or NLP. Proven expertise in end-to-end system design and deployment of production-grade AI systems. Hands-on experience with Java/Node.js/.NET for backend development. Proficiency in Python and ML frameworks (TensorFlow, PyTorch, Hugging Face Transformers). Key Responsibilities: 1. Model Development & Training: Design, train, and fine-tune large language models (LLMs) for tasks such as natural language understanding, generation, and classification. Implement and optimize machine learning algorithms using frameworks like TensorFlow, PyTorch, or Hugging Face. 2. Prompt Engineering: Craft high-quality prompts to maximize LLM performance for specific use cases, including chatbots, text summarization, and question-answering systems. Experiment with prompt tuning and few-shot learning techniques to improve model accuracy and efficiency. 3. End-to-End System Design: Architect scalable, secure, and fault-tolerant AI/ML systems, integrating LLMs with backend services and APIs. Develop microservices-based architectures using Java/Node.js/.NET for seamless integration with enterprise applications. Design and implement data pipelines for preprocessing, feature engineering, and model inference. 4. Integration & Deployment: Deploy ML models and LLMs to production environments using containerization (Docker, Kubernetes) and cloud platforms (AWS/Azure/GCP). Build RESTful or GraphQL APIs to expose AI capabilities to front-end or third-party applications. 5. Performance Optimization: Optimize LLMs for latency, throughput, and resource efficiency using techniques like quantization, pruning, and model distillation. Monitor and improve system performance through logging, metrics, and A/B testing. 6. Collaboration & Leadership: Work closely with data scientists, software engineers, and product managers to align AI solutions with business objectives. Mentor junior engineers and contribute to best practices for AI/ML development. What will excite us: Strong understanding of LLM architectures and prompt engineering techniques. Experience with backend development using Java/Node.js (Express)/.NET Core. Familiarity with cloud platforms (AWS, Azure, GCP) and DevOps tools (Docker, Kubernetes, CI/CD). Knowledge of database systems (SQL, NoSQL) and data pipeline tools (Apache Kafka, Airflow). Strong problem-solving and analytical skills. Excellent communication and teamwork abilities. Ability to work in a fast-paced, collaborative environment. What will excite you: Lead AI innovation in a fast-growing, technology-driven organization. Work on cutting-edge AI solutions, including LLMs, autonomous AI agents, and Generative AI applications. Engage with top-tier enterprise clients and drive AI transformation at scale. Location: Ahmedabad
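The prompt-engineering duties above revolve around crafting and iterating on prompts for tasks like classification, summarization, and question answering. As a rough illustration only (not part of the original listing), the sketch below shows one way a few-shot classification prompt could be driven through the Hugging Face Transformers pipeline API the posting mentions; the model choice, labels, and ticket text are hypothetical placeholders, and a production setup would use a far more capable instruction-tuned model.

```python
from transformers import pipeline

# Illustrative only: any instruction-tuned causal LM from the Hugging Face Hub
# could be substituted; "gpt2" is used purely as a small placeholder model.
generator = pipeline("text-generation", model="gpt2")

# Few-shot prompt: two worked examples, then the new input to classify.
FEW_SHOT_PROMPT = """Classify the support ticket as Billing, Technical, or Other.

Ticket: I was charged twice this month.
Category: Billing

Ticket: The app crashes when I upload a file.
Category: Technical

Ticket: {ticket}
Category:"""

def classify(ticket: str) -> str:
    prompt = FEW_SHOT_PROMPT.format(ticket=ticket)
    # Greedy decoding; only a few new tokens are needed for the label.
    output = generator(prompt, max_new_tokens=5, do_sample=False)[0]["generated_text"]
    completion = output[len(prompt):].strip()
    return completion.splitlines()[0] if completion else ""

print(classify("My invoice shows the wrong amount."))
```

The day-to-day prompt-engineering loop the posting describes is essentially iterating on this template (label wording, number and ordering of examples) and measuring accuracy against a held-out evaluation set.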
Posted 1 week ago
1.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About Role It is an exciting opportunity to work with global team to act as backbone of all reporting and analysis for the P&C Finance team. Roles And Responsibilities We're looking for someone who enjoys working with data and is comfortable wearing multiple hats — from working in raw messy files to shaping dashboards people can actually use. Here's what the day-to-day may look like: Collaborating with finance and non-finance teams to retrieve data timely across systems and divisions. Designing and building robust data pipelines in Palantir Foundry — working with large datasets and Foundry Ontology models. Using PySpark and Apache-based logic to enrich, align, and transform raw data from multiple source systems into streamlined data models. Supporting and enhancing existing reporting platforms, particularly in Power BI, by updating datasets, fixing DAX, or adjusting visuals as per stakeholder needs. Building new reporting tools or dashboards that help visualize financial and operational data clearly and efficiently. Constantly looking for ways to automate manual reporting tasks — whether via data flows, transformation logic, or re-usable queries. Working closely with stakeholders (finance, ops, and others) to understand their problems, resolve data queries, and offer practical, scalable solutions. Taking ownership of reporting problems with a solution-first mindset — if something breaks, you're the type who dives in to figure out why and how to fix it. About You You don’t need years of experience — but you do need curiosity, ownership, and a willingness to learn fast. This could be a perfect fit if: You’re a fresher or someone with 6 months to 1 year experience, ideally in a data, analytics, or reporting-heavy role. You’re comfortable with SQL and Python, and you've built things using Advanced Excel, Power Query, or Power BI. You’ve written some DAX logic, or are excited to learn more about how to shape metrics and KPIs. You like working with big messy datasets and finding ways to clean and align them so others can use them with confidence. You’re comfortable talking to business users, not just writing code — and can explain your logic without needing to “sound technical”. Maybe you’ve worked on a college or internship project where you pulled together data from different places and made something useful. That’s great. Prior experience with Palantir Foundry, or working with finance data, is a big plus We're more interested in how you think and solve problems than just checking boxes. So if you're eager to learn, open to feedback, and enjoy finding insights in data — we’d love to hear from you. Nice to Have (but not mandatory) These Aren’t Must-haves, But If You’ve Worked On Any Of The Following, It’ll Definitely Make You Stand Out You've written user-defined functions (UDFs) in PySpark to make your transformation logic reusable and cleaner across multiple pipelines. You try to follow systematic coding practices — like organizing logic into steps, adding meaningful comments, or handling edge cases cleanly. You’ve worked with version control (Git or similar), and understand how to manage updates to code or revert changes if something breaks. You care about performance optimization — like reducing pipeline runtime, minimizing joins, or improving how fast visuals load in tools like Power BI. 
You’re comfortable thinking not just about “how to get it to work” but also “how to make it better, faster, and easier to maintain.” About Swiss Re Swiss Re is one of the world’s leading providers of reinsurance, insurance and other forms of insurance-based risk transfer, working to make the world more resilient. We anticipate and manage a wide variety of risks, from natural catastrophes and climate change to cybercrime. We cover both Property & Casualty and Life & Health. Combining experience with creative thinking and cutting-edge expertise, we create new opportunities and solutions for our clients. This is possible thanks to the collaboration of more than 14,000 employees across the world. Our success depends on our ability to build an inclusive culture encouraging fresh perspectives and innovative thinking. We embrace a workplace where everyone has equal opportunities to thrive and develop professionally regardless of their age, gender, race, ethnicity, gender identity and/or expression, sexual orientation, physical or mental ability, skillset, thought or other characteristics. In our inclusive and flexible environment everyone can bring their authentic selves to work and their passion for sustainability. If you are an experienced professional returning to the workforce after a career break, we encourage you to apply for open positions that match your skills and experience. Keywords Reference Code: 134825
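The Foundry and PySpark responsibilities in this listing amount to aligning raw extracts from several source systems into one streamlined reporting model. The sketch below is a minimal PySpark illustration of that kind of enrich-and-align step, assuming invented paths, schemas, and column names; it is not Swiss Re's actual data model, and inside Palantir Foundry the same logic would normally be wrapped in a Foundry transform.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("pnc_finance_alignment").getOrCreate()

# Hypothetical inputs: raw bookings from two source systems plus a
# cost-centre reference table. Paths and column names are illustrative.
bookings_a = spark.read.parquet("/data/raw/system_a/bookings")
bookings_b = spark.read.parquet("/data/raw/system_b/bookings")
cost_centres = spark.read.parquet("/data/ref/cost_centres")

# Align the two source schemas to a common shape before unioning.
aligned_a = bookings_a.select(
    F.col("booking_id"),
    F.col("cc_code").alias("cost_centre"),
    F.col("amount_usd").alias("amount"),
)
aligned_b = bookings_b.select(
    F.col("id").alias("booking_id"),
    F.col("cost_centre"),
    (F.col("amount_local") * F.col("fx_rate")).alias("amount"),
)

# Enrich with reference data and produce a streamlined reporting model
# that a Power BI dataset could sit on top of.
report_model = (
    aligned_a.unionByName(aligned_b)
    .join(cost_centres, "cost_centre", "left")
    .groupBy("division", "cost_centre")
    .agg(F.sum("amount").alias("total_amount"))
)
report_model.write.mode("overwrite").parquet("/data/curated/finance_report_model")
```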
Posted 1 week ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Our Purpose Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential. Title And Summary Lead Software Engineer (Java, Spring boot, Cloud) Overview Who is Mastercard? Mastercard is a global technology company in the payments industry. Our mission is to connect and power an inclusive, digital economy that benefits everyone, everywhere by making transactions safe, simple, smart, and accessible. Using secure data and networks, partnerships and passion, our innovations and solutions help individuals, financial institutions, governments, and businesses realize their greatest potential. Our decency quotient, or DQ, drives our culture and everything we do inside and outside of our company. With connections across more than 210 countries and territories, we are building a sustainable world that unlocks priceless possibilities for all. The Fraud Products team (part of O&T) is developing new capabilities for MasterCard's Decision Management Platform, which serves as the core for multiple business solutions to combat fraud and validate cardholder identity. Our patented Java-based platform processes billions of transactions per month in tens of milliseconds using a multi-tiered, message-oriented approach for high performance and availability. MasterCard software engineering teams leverage Agile development principles, advanced development and design practices, and an obsession over security, reliability, and performance to deliver solutions that delight our customers. We're looking for talented software development engineers and architects to develop advanced technologies and applications that are revolutionizing payments. Would you like to develop industry leading solutions for fighting fraud? Are you motivated by speeding business solutions to market? Do you want to innovate, using cutting edge technologies on challenging business problems? Do you want to work for a company that offers above and beyond benefits including paid parental leave, flexible work hours, gift matching, and even volunteer incentives while encouraging your own professional learning and development? Do you thrive in a place where you are continuously learning more while growing your skills and career? Role Successfully lead major projects and complex assignments with broad scope and long-term business implications. Work closely with other technical leads on assigned projects to assist in design and implementation tasks. Assist with production support issues by acting as a subject matter expert in resolving incidents and problem tickets. Plan, design and develop technical solutions and alternatives to meet business requirements in adherence with MasterCard standards, processes, and best practices. Participate in PoCs (Proof of Concept) and help the Department with selection of Vendor Solutions, Technologies, Methodologies and Frameworks. Design and build technical roadmaps to optimize services and functions with a focus on performance and cost/benefit optimization Conduct brownbag sessions on new and upcoming technologies, methodologies, and application appropriate frameworks. 
Actively look for opportunities to enhance standards and improve process efficiency. Be an integral part of the Agile SAFe discover and elaboration sessions. Perform requirements and design reviews, peer code reviews and PCI security reviews to ensure compliance with MasterCard standards. Have strong ownership of your team’s software and are deep in the maintenance characteristics, runtime properties and dependencies including hardware, operating system, and build. Communicate, collaborate, and work effectively in a global environment. Public speaking as a technology evangelist for Mastercard. All About You Must be high-energy, detail-oriented, proactive and can function under pressure in an independent environment. Must provide the necessary skills to have a high degree of initiative and self-motivation to drive results. Possesses strong communication skills -- both verbal and written – and strong relationship, collaborative skills, and organizational skills. Willingness and ability to learn and take on challenging opportunities and to work as a member of matrix based diverse and geographically distributed project team. Knowledge of software development processes including agile processes and test-driven development Experience with the design and development of complex, multi-tier cloud native architectures. Degree in Computer Science or related field Essential Skills Required Technical experience using Java/J2EE Spring Framework (including Spring Boot) Distributed Computing at scale Cloud technologies like cloud foundry, Kubernetes Strong Linux and shell scripting Oracle & PL/SQL and advanced SQL scripting IDE such as JBoss Developer Studio/IntelliJ Desirable Skills Experience working in at-scale distributed compute such as Gemfire, Apache Spark, Distributed Redis, Hazelcast, GridGain or similar Messaging – MQ and JMS Experience integrating vendor and open-source products into an overall system Corporate Security Responsibility All activities involving access to Mastercard assets, information, and networks comes with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must: Abide by Mastercard’s security policies and practices; Ensure the confidentiality and integrity of the information being accessed; Report any suspected information security violation or breach, and Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines.
Posted 1 week ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Our Purpose Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential. Title And Summary Dev Engineer Overview Be part of the Operations & Technology Fraud Products team developing new capabilities for MasterCard's Decision Management Platform, which serves as the core for multiple business solutions to combat fraud and validate cardholder identity. Our patented Java-based platform processes billions of transactions per month in tens of milliseconds using a multi-tiered, message-oriented approach for high performance and availability. Would you like to develop industry leading solutions for fighting fraud? Are you motivated by speeding business solutions to market? Do you want to innovate, using cutting edge technologies on challenging business problems? Role Deliver solutions by providing direct development of software. Work closely with technical leads for assigned projects to assist in design and implementation tasks Assist with production support issues by acting as a subject matter expert in resolving incidents and problem tickets. Plan, design and develop technical solutions and alternatives to meet business requirements in adherence with Mastercard standards, processes and best practices. Lead day to day system development and maintenance activities of the team to meet service level agreements (SLAs) and create solutions with high level of innovation, cost effectiveness, high quality and faster time to market. Accountable for full systems development life cycle including creating high quality requirements documents, use-cases, design and other technical artifacts including but not limited to detailed test strategy/test design, performance benchmarking, release rollout and deployment plans, contingency/back-out plans, feasibility study, cost and time analysis and detailed estimates. Participate in PoCs (Proof of Concept) and help the Department with selection of Vendor Solutions, Technologies, Methodologies and Frameworks. Conduct brownbag sessions on new and upcoming technologies, methodologies and application appropriate frameworks. Ensure knowledge transfer of vendor technology to Mastercard staff. Provide technical training to the other team members. Actively look for opportunities to enhance standards and improve process efficiency. Mentor and guide other team members during all phases of the SDLC. Ensure adequate test coverage in Unit Testing, System Testing/Integration Testing and Performance Testing. Perform Quality Inspections and Walkthroughs throughout the SDLC including Requirements Review, Design Review, Code Review and Security Review to ensure compliance with Mastercard standards. All About You Must be high-energy, detail-oriented, proactive and have the ability to function under pressure in an independent environment. Must provide the necessary skills to have a high degree of initiative and self-motivation to drive results. Possesses strong communication skills -- both verbal and written – and strong relationship, collaborative skills and organizational skills. 
Willingness and ability to learn and take on challenging opportunities and to work as a member of matrix based diverse and geographically distributed project team. Good knowledge of Agile software development processes. Experience with the design and development of complex, multi-tier software solutions. Comfortable working in a Linux environment, using VI editor and general command line proficiency Essential Skills: ○ Creating and debugging J2EE REST Web Services and Web Applications ○ Database experience including Oracle and SQL scripting ○ Experience with Spring Framework (including Spring Boot) and Maven ○ Experience writing unit tests with Junit and Mockito ○ Experience working with Apache Tomcat ○ Experience with Git Desirable skills ○ Experience working with containerised environments, such as Kubernetes/OpenShift/CloudFoundry ○ Experience with integration frameworks such as Apache Camel/Spring Integration ○ Experience with monitoring service performance ○ Experience with Angular or modern SPA frameworks such as React + Redux. Experience with HTML5, ES5+ES6 and/or Typescript, SASS and CSS3. Corporate Security Responsibility All activities involving access to Mastercard assets, information, and networks comes with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must: Abide by Mastercard’s security policies and practices; Ensure the confidentiality and integrity of the information being accessed; Report any suspected information security violation or breach, and Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines.
Posted 1 week ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description: We are seeking a highly skilled and motivated Java Developer to join our development team. The ideal candidate will have strong experience in building high-performing, scalable, enterprise-grade applications. You will be responsible for Java application development while providing expertise in the full software development lifecycle. Responsibilities: • Understand integration workflows, architectures, development processes, deployment processes, and support processes. • Develop, deliver, and support integration modules and services (API services, integration adapters) on the existing platform (AWS Cloud, AWS API Gateway, etc.). • Develop unit tests and integration test cases to make sure integration flows work as required. • Monitor integration workflows and analyze incidents, defects, bugs, and issues in the integration area. • Apply sound software development practices and design principles to code. • Maintain a good sense of urgency, prioritize work appropriately, and understand and adopt changes quickly and reasonably. • Be willing to work in a team and able to communicate efficiently and concisely. • Enjoy optimizing everything from how your code is compiled to how it scales across servers to provide the best end-user experience. • Be able to coach others and initiate innovative ideas (senior role). Qualifications: • Strong in the Java programming language and Java frameworks (Spring, Apache Camel, etc.). • Good experience in the software integration area (middle and senior level), or willingness to learn software integration. • Experience in event messaging, including Apache Kafka, JMS, Apache ActiveMQ, RabbitMQ, AWS SQS, AWS Kinesis, etc. • Experience with Git, AWS Cloud, and other AWS services. • Good experience in developing web services, both REST and SOAP APIs, and API security (certificates, OAuth 2, basic authentication, etc.). • Experience using ELK or other application log management tools (e.g., Splunk). • Able to influence and drive projects to meet key milestones and overcome challenges.
Posted 1 week ago
5.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Our Purpose Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential. Title And Summary Senior Analyst, Big Data Analytics & Engineering Overview Job Title: Sr. Analyst, Data Engineering, Value Quantification Team (Based in Pune, India) About Mastercard Mastercard is a global technology leader in the payments industry, committed to powering an inclusive, digital economy that benefits everyone, everywhere. By leveraging secure data, cutting-edge technology, and innovative solutions, we empower individuals, financial institutions, governments, and businesses to achieve their potential. Our culture is driven by our Decency Quotient (DQ), ensuring inclusivity, respect, and integrity guide everything we do. Operating across 210+ countries and territories, Mastercard is dedicated to building a sustainable world with priceless opportunities for all. Position Overview This is a techno-functional position that combines strong technical skills with a deep understanding of business needs and requirements, with 5-7 years of experience. The role focuses on developing and maintaining advanced data engineering solutions for pre-sales value quantification within the Services business unit. As a Sr. Analyst, you will be responsible for creating and optimizing data pipelines, managing large datasets, and ensuring the integrity and accessibility of data to support Mastercard’s internal teams in quantifying the value of services, enhancing customer engagement, and driving business outcomes. The role requires close collaboration across teams to ensure data solutions meet business needs and deliver measurable impact. Role Responsibilities Data Engineering & Pipeline Development: Develop and maintain robust data pipelines to support the value quantification process. Utilize tools such as Apache NiFi, Azure Data Factory, Pentaho, Talend, SSIS, and Alteryx to ensure efficient data integration and transformation. Data Management and Analysis: Manage and analyze large datasets using SQL, Hadoop, and other database management systems. Perform data extraction, transformation, and loading (ETL) to support value quantification efforts. Advanced Analytics Integration: Use advanced analytics techniques, including machine learning algorithms, to enhance data processing and generate actionable insights. Leverage programming languages such as Python (Pandas, NumPy, PySpark) and Impala for data analysis and model development. Business Intelligence and Reporting: Utilize business intelligence platforms such as Tableau and Power BI to create insightful dashboards and reports that communicate the value of services. Generate actionable insights from data to inform strategic decisions and provide clear, data-backed recommendations. Cross-Functional Collaboration & Stakeholder Engagement: Collaborate with Sales, Marketing, Consulting, Product, and other internal teams to understand business needs and ensure successful data solution development and deployment.
Communicate insights and data value through compelling presentations and dashboards to senior leadership and internal teams, ensuring tool adoption and usage. All About You Data Engineering Expertise: Proficiency in data engineering tools and techniques to develop and maintain data pipelines. Experience with data integration tools such as Apache NiFi, Azure Data Factory, Pentaho, Talend, SSIS, and Alteryx. Advanced SQL Skills: Strong skills in SQL for querying and managing large datasets. Experience with database management systems and data warehousing solutions. Programming Proficiency: Knowledge of programming languages such as Python (Pandas, NumPy, PySpark) and Impala for data analysis and model development. Business Intelligence and Reporting: Experience in creating insightful dashboards and reports using business intelligence platforms such as Tableau and Power BI. Statistical Analysis: Ability to perform statistical analysis to identify trends, correlations, and insights that support strategic decision-making. Cross-Functional Collaboration: Strong collaboration skills to work effectively with Sales, Marketing, Consulting, Product, and other internal teams to understand business needs and ensure successful data solution development and deployment. Communication and Presentation: Excellent communication skills to convey insights and data value through compelling presentations and dashboards to senior leadership and internal teams. Execution Focus: A results-driven mindset with the ability to balance strategic vision with tactical execution, ensuring that data solutions are delivered on time and create measurable business value. Education Bachelor’s degree in Data Science, Computer Science, Business Analytics, Economics, Finance, or a related field. Advanced degrees or certifications in analytics, data science, AI/ML, or an MBA are preferred. Why Us? At Mastercard, you’ll have the opportunity to shape the future of internal operations by leading the development of data engineering solutions that empower teams across the organization. Join us to make a meaningful impact, drive business outcomes, and help Mastercard’s internal teams create better customer engagement strategies through innovative value-based ROI narratives. Location: Gurgaon/Pune, India Employment Type: Full-Time Corporate Security Responsibility All activities involving access to Mastercard assets, information, and networks comes with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must: Abide by Mastercard’s security policies and practices; Ensure the confidentiality and integrity of the information being accessed; Report any suspected information security violation or breach, and Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines.
Posted 1 week ago
3.0 - 8.0 years
0 Lacs
Indore, Madhya Pradesh, India
On-site
Role Expectations: Perform functional, performance, and load testing of web applications using tools such as JMeter and Postman. Develop, maintain, and execute automated test scripts using Selenium with Java for web application testing. Design and implement tests for RESTful APIs using REST Assured (Java library) for testing HTTP responses and ensuring proper API functionality. Collaborate with development teams to identify and resolve software defects through effective debugging and testing. Utilize the Robot Framework with Python for acceptance testing and acceptance test-driven development. Conduct end-to-end testing and ensure that systems meet all functional requirements. Ensure quality and compliance of software releases by conducting thorough test cases and evaluating product quality. Qualifications: Postman API Testing: Experience in testing RESTful APIs and web services using Postman. Experience Range 3 to 8 years Java: Strong knowledge of Java for test script development, particularly with Selenium and REST Assured. JMeter: Experience in performance, functional, and load testing using Apache JMeter. Selenium with Java: Expertise in Selenium WebDriver for automated functional testing, including script development and maintenance using Java. REST Assured: Proficient in using the REST Assured framework (Java library) for testing REST APIs and validating HTTP responses. Robot Framework: Hands-on experience with the Robot Framework for acceptance testing and test-driven development (TDD) in Python. Networking Knowledge: Deep understanding of networking concepts, specifically around RAN elements and network architectures (ORAN, SMO, RIC, OSS). ORAN/SMO/RIC/OSS Architecture: In-depth knowledge of ORAN (Open Radio Access Network), SMO (Service Management Orchestration), RIC (RAN Intelligent Controller), and OSS (Operations Support Systems) architectures. Monitoring Tools: Experience with Prometheus, Grafana, and Kafka for real-time monitoring and performance tracking of applications and systems. Keycloak: Familiarity with Keycloak for identity and access management.
Posted 1 week ago
6.0 years
0 Lacs
Gurugram, Haryana, India
On-site
We are seeking a talented individual to join our Technology team at Mercer. This role will be based in Gurugram. This is a hybrid role that has a requirement of working at least three days a week in the office. Role: Senior DevOps Engineer We are looking for an ideal candidate with a minimum of 6 years of experience in DevOps. The candidate should have a strong and deep understanding of Amazon Web Services (AWS) and DevOps tools like Terraform, Ansible, and Jenkins. Location: Gurgaon Functional Area: Engineering Education Qualification: Graduate/Postgraduate Experience: 6-9 Years We will count on you to: Deploy infrastructure on AWS cloud using Terraform Deploy updates and fixes Build tools to reduce occurrence of errors and improve customer experience Perform root cause analysis of production errors and resolve technical issues Develop scripts for automation Perform troubleshooting and maintenance What you need to have: 6+ years of technical experience in the DevOps area. Knowledge of the following technologies and applications: AWS Terraform Linux Administration, Shell Script Ansible CI Server: Jenkins Apache/Nginx/Tomcat Good to have: Experience in the following technologies: Python What makes you stand out: Excellent verbal and written communication skills, comfortable interfacing with business users Good troubleshooting and technical skills Able to work independently Why join our team: We help you be your best through professional development opportunities, interesting work and supportive leaders. We foster a vibrant and inclusive culture where you can work with talented colleagues to create new solutions and have impact for colleagues, clients and communities. Our scale enables us to provide a range of career opportunities, as well as benefits and rewards to enhance your well-being. Mercer, a business of Marsh McLennan (NYSE: MMC), is a global leader in helping clients realize their investment objectives, shape the future of work and enhance health and retirement outcomes for their people. Marsh McLennan is a global leader in risk, strategy and people, advising clients in 130 countries across four businesses: Marsh, Guy Carpenter, Mercer and Oliver Wyman. With annual revenue of $23 billion and more than 85,000 colleagues, Marsh McLennan helps build the confidence to thrive through the power of perspective. For more information, visit mercer.com, or follow on LinkedIn and X. The Mercer Assessments business, one of the fastest-growing verticals within the Mercer brand, is a leading global provider of talent measurement and assessment solutions. As part of Mercer, the world’s largest HR consulting firm and a wholly owned subsidiary of Marsh McLennan, we are dedicated to delivering talent foresight that empowers organizations to make informed, critical people decisions. Leveraging a robust, cloud-based assessment platform, Mercer Assessments partners with over 6,000 corporations, 31 sector skill councils, government agencies, and more than 700 educational institutions across 140 countries. Our mission is to help organizations build high-performing teams through effective talent acquisition, development, and workforce transformation strategies. Our research-backed assessments, advanced technology, and comprehensive analytics deliver transformative outcomes for both clients and their employees. We specialize in designing tailored assessment solutions across the employee lifecycle, including pre-hire evaluations, skills assessments, training and development, certification exams, competitions and more.
At Mercer Assessments, we are committed to enhancing the way organizations identify, assess, and develop talent. By providing actionable talent foresight, we enable our clients to anticipate future workforce needs and make strategic decisions that drive sustainable growth and innovation.
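Terraform and Ansible carry the provisioning work in this role, with Python listed as a nice-to-have for scripting. Purely to illustrate the "develop scripts for automation" bullet (and not as a prescribed tool or account setup), here is a small boto3 sketch that reports running EC2 instances missing a required tag, the kind of error-reducing check such a role might automate; the tag key and region are hypothetical.

```python
import boto3

REQUIRED_TAG = "CostCenter"  # hypothetical tagging policy
REGION = "ap-south-1"        # hypothetical region

def instances_missing_tag(tag_key: str, region: str) -> list[str]:
    """Return IDs of running EC2 instances that lack the given tag."""
    ec2 = boto3.client("ec2", region_name=region)
    missing = []
    paginator = ec2.get_paginator("describe_instances")
    for page in paginator.paginate(
        Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
    ):
        for reservation in page["Reservations"]:
            for instance in reservation["Instances"]:
                tags = {t["Key"] for t in instance.get("Tags", [])}
                if tag_key not in tags:
                    missing.append(instance["InstanceId"])
    return missing

if __name__ == "__main__":
    for instance_id in instances_missing_tag(REQUIRED_TAG, REGION):
        print(f"Untagged instance: {instance_id}")
```

A check like this would typically run on a schedule (for example from Jenkins) and feed a report or alert rather than printing to stdout.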
Posted 1 week ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Title: GCP Data Engineer Location: Chennai 34350 Type: Contract Budget: Up to ₹18 LPA Notice Period: Immediate joiners preferred 🧾 Job Description We are seeking an experienced Google Cloud Platform (GCP) Data Engineer to join our team in Chennai. This role is centered on designing and building cloud-based data solutions that support AI/ML, analytics, and business intelligence use cases. You will develop scalable and high-performance pipelines, integrate and transform data from various sources, and support both real-time and batch data needs. 🛠️ Key Responsibilities Design and implement scalable batch and real-time data pipelines using GCP services such as BigQuery, Dataflow, Dataform, Cloud Composer (Airflow), Data Fusion, Dataproc, Cloud SQL, Compute Engine, and others. Build data products that combine historical and live data for business insights and analytical applications. Lead efforts in data transformation, ingestion, integration, data mart creation, and activation of data assets. Collaborate with cross-functional teams including AI/ML, analytics, DevOps, and product teams to deliver robust cloud-native solutions. Optimize pipelines for performance, reliability, and cost-effectiveness. Contribute to data governance, quality assurance, and security best practices. Drive innovation by integrating AI/ML features, maintaining strong documentation, and applying continuous improvement strategies. Provide production support, troubleshoot failures, and meet SLAs using GCP’s monitoring tools. Work within an Agile environment, follow CI/CD practices, and apply test-driven development (TDD). ✅ Skills Required Strong experience in: BigQuery, Dataflow, Dataform, Data Fusion, Cloud SQL, Compute Engine, Dataproc, Airflow (Cloud Composer), Cloud Functions, Cloud Run Programming experience with Python, Java, PySpark, or Apache Beam Proficient in SQL (5+ years) for complex data handling Hands-on with Terraform, Tekton, Cloud Build, GitHub, Docker Familiarity with Apache Kafka, Pub/Sub, Kubernetes GCP Certified (Associate or Professional Data Engineer) ⭐ Skills Preferred Deep knowledge of cloud architecture and infrastructure-as-code tools Experience in data security, regulatory compliance, and data governance Experience with AI/ML solutions or platforms Understanding of DevOps pipelines, CI/CD using Cloud Build, and containerization Exposure to financial services data or similar regulated environments Experience in mentoring and leading engineering teams Tools: JIRA, Artifact Registry, App Engine 🎓 Education Required: Bachelor's Degree (in Computer Science, Engineering, or related field) Preferred: Master’s Degree 📌 Additional Details Role Type: Contract-based Work Location: Chennai, Onsite Target Candidates: Mid to Senior level with minimum 5+ years of data engineering experience Skills: gcp,apache,pyspark,data,docker
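Because the role pairs Python and Apache Beam with Dataflow and BigQuery, a minimal batch-pipeline sketch may help picture the work. Every project, bucket, table, and schema name below is a made-up placeholder rather than the client's actual environment.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_order(line: str) -> dict:
    """Turn one raw JSON line into a BigQuery-ready row."""
    record = json.loads(line)
    return {
        "order_id": record["order_id"],
        "amount": float(record["amount"]),
        "region": record.get("region", "UNKNOWN"),
    }

options = PipelineOptions(
    runner="DataflowRunner",        # or "DirectRunner" for local testing
    project="example-project",      # placeholder project ID
    region="us-central1",
    temp_location="gs://example-bucket/tmp",
)

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadRaw" >> beam.io.ReadFromText("gs://example-bucket/raw/orders-*.json")
        | "Parse" >> beam.Map(parse_order)
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "example-project:analytics.orders",
            schema="order_id:STRING,amount:FLOAT,region:STRING",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        )
    )
```

A streaming variant would swap the text source for a Pub/Sub read and add windowing, with Cloud Composer scheduling and monitoring the batch version.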
Posted 1 week ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
Remote
This role is for one of the Weekday's clients Min Experience: 5 years Location: Remote (India), Bengaluru, Chennai JobType: full-time We are seeking a skilled ML (Data) Platform Engineer to help scale a next-generation AutoML platform. This role sits at the critical intersection of machine learning, data infrastructure, and platform engineering. You will work on systems central to feature engineering, data management, and time series forecasting at scale. This is not your typical ETL role — the position involves building powerful data platforms that support automated model development, experimentation workflows, and high-reliability data lineage systems. If you're passionate about building scalable systems for both ML and analytics use cases, this is a high-impact opportunity. Requirements Key Responsibilities: Design, build, and scale robust data management systems that power AutoML and forecasting platforms. Own and enhance feature stores and associated engineering workflows. Establish and enforce strong data SLAs and build lineage systems for time series pipelines. Collaborate closely with ML engineers, infrastructure, and product teams to ensure platform scalability and usability. Drive key architectural decisions related to data versioning, distribution, and system composability. Contribute to designing reusable platforms to address diverse supply chain challenges. Must-Have Qualifications: Strong experience with large-scale and distributed data systems. Hands-on expertise in ETL workflows, data lineage, and reliability tooling. Solid understanding of ML feature engineering and experience building or maintaining feature stores. Exposure to time series forecasting systems or AutoML platforms. Strong analytical and problem-solving skills, with the ability to deconstruct complex platform requirements. Good-to-Have Qualifications: Familiarity with modern data infrastructure tools such as Apache Iceberg, ClickHouse, or Data Lakes. Product-oriented mindset with an ability to anticipate user needs and build intuitive systems. Experience with building composable, extensible platform components. Previous exposure to AutoML frameworks such as SageMaker, Vertex AI, or equivalent internal ML platforms. Skills: MLOps, Data Engineering, Big Data, ETL, Feature Store, Feature Engineering, AutoML, Forecasting Pipelines, Data Management
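To make the time-series feature-engineering angle concrete, here is a small pandas sketch of lag and rolling-window features, the sort of derived columns a feature store behind an AutoML forecasting platform might serve. Column names, windows, and the sample data are illustrative assumptions, not details from the client's system.

```python
import pandas as pd

def add_time_series_features(df: pd.DataFrame) -> pd.DataFrame:
    """Add per-item lag and rolling-window features suitable for a feature store."""
    df = df.sort_values(["item_id", "date"]).copy()
    grouped = df.groupby("item_id")["demand"]

    # Lag features: demand 1 and 7 periods ago for each item.
    df["demand_lag_1"] = grouped.shift(1)
    df["demand_lag_7"] = grouped.shift(7)

    # Rolling statistics on past values only; shifting before rolling keeps
    # the current period out of its own feature (no target leakage).
    df["demand_roll_mean_7"] = grouped.transform(lambda s: s.shift(1).rolling(7).mean())
    df["demand_roll_std_7"] = grouped.transform(lambda s: s.shift(1).rolling(7).std())
    return df

if __name__ == "__main__":
    sample = pd.DataFrame({
        "item_id": ["A"] * 10,
        "date": pd.date_range("2024-01-01", periods=10, freq="D"),
        "demand": [5, 7, 6, 8, 9, 4, 6, 7, 8, 10],
    })
    print(add_time_series_features(sample).tail())
```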
Posted 1 week ago
10.0 - 15.0 years
0 Lacs
Hyderabad, Telangana
On-site
You should have 10 to 15 years of experience in PHP Full Stack Development and Product Support. In this role, you will be responsible for taking ownership of product support and enhancements, managing team members including Solution Architects, Coders, Database Admins, IT Infra, Testing, and Product Service Delivery. Additionally, you will handle B2B client escalation calls and provide support during odd hours. Your expertise should lie in full stack PHP development on both server and client sides, with a strong command of PHP coding. You should possess a sound understanding of PHP web frameworks such as CakePHP and Yii, and have previous experience in building CMS and CRM products. Proficiency in working with REST APIs, JSON, API creation, and integration is crucial. Hands-on experience with MySQL and a good understanding of SQL are required. Furthermore, familiarity with Apache, NGINX, Composer, AngularJS, Node.js, JavaScript, HTML5, LAMP development, and object-oriented programming in PHP is essential. Experience with version control systems like GitHub and deployment tools such as Ansistrano is preferred. Knowledge of Agile development methodology, code review, and optimization is advantageous. Strong problem-solving skills, analytical abilities, self-reliance, and effective communication and interpersonal skills are necessary for this role. The ideal candidate should have prior experience in PHP e-commerce product development and support, exhibiting high motivation, energy, and organizational skills. You should be adept at handling PHP Full Stack Development and Product Support for B2B clients in Europe and America. Strong solution architecture capabilities, core PHP fundamentals, and expertise in database design are key requirements. A deep understanding of supporting and enhancing e-commerce platforms in a distributed architecture setup is highly desirable. Please note that we are seeking individuals who are willing to work flexible hours beyond the standard 9:00 AM to 6:00 PM, especially for remote business support.
Posted 1 week ago
8.0 years
0 Lacs
Gujarat, India
On-site
Job Summary : We are seeking a highly skilled and motivated Lead DevOps Engineer with Solution Architect expertise to manage end-to-end infrastructure projects across cloud, hybrid, and dedicated server environments. This role demands hands-on experience with WHM/cPanel, OpenPanel, load balancers , and deep knowledge of modern DevOps practices. The ideal candidate will also lead a team of DevOps engineers, drive technical excellence, and serve as the go-to expert for scalable, secure, and high-availability infrastructure solutions. Key Responsibilities : DevOps & Infrastructure Management Architect, implement, and maintain scalable infrastructure solutions across cloud and dedicated server environments. Manage hosting infrastructure including WHM/cPanel, OpenPanel , Apache/Nginx, MySQL, DNS, mail servers, and firewalls. Design and configure load balancing strategies using HAProxy, NGINX, or cloud-native load balancers. Automate provisioning, configuration, deployment, and monitoring using tools like Ansible , Terraform , CI/CD (Jenkins, GitLab CI) . Ensure infrastructure reliability, security, and disaster recovery processes are in place. Solution Architecture Translate business and application requirements into robust infrastructure blueprints. Lead design reviews and architectural discussions for client and internal projects. Create documentation and define architectural best practices for hosting and DevOps. Team Management & Leadership Lead and mentor a team of DevOps engineers across multiple projects. Allocate resources, manage project timelines, and ensure successful delivery. Foster a culture of innovation, continuous improvement, and collaboration. Conduct performance reviews, provide training, and support career development of team members. Monitoring, Security & Optimization Set up and maintain observability systems (e.g., Prometheus, Grafana, Zabbix). Conduct performance tuning, cost optimization, and environment hardening. Ensure compliance with internal policies and external standards (ISO, GDPR, SOC2, etc.). Required Skills & Experience : 8+ years of experience in DevOps, systems engineering, or cloud infrastructure management . 3+ years of experience in team leadership or technical management . Proven expertise in hosting infrastructure , including WHM/cPanel, OpenPanel, Plesk, DNS, and mail configurations. Strong experience with Linux servers , networking , security , and automation scripting (Bash, Python). Hands-on experience with cloud platforms (AWS, Azure, GCP) and hybrid environments. Deep understanding of CI/CD pipelines , Docker/Kubernetes , and version control (Git). Familiarity with load balancing , high availability, and failover strategies. Preferred Qualifications : Certifications such as AWS Solutions Architect , RHCE , CKA , or Linux Foundation Certified Engineer . Experience in IT services or hosting/cloud consulting environments. Knowledge of compliance frameworks (e.g., ISO 27001, SOC 2, PCI-DSS). Familiarity with agile methodologies and DevOps lifecycle management tools.
Posted 1 week ago
8.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together. Primary Responsibilities Develop comprehensive digital analytics solutions utilizing Adobe Analytics for web tracking, measurement, and insight generation Design, manage, and optimize interactive dashboards and reports using Power BI to support business decision-making Lead the design, development, and maintenance of robust ETL/ELT pipelines integrating diverse data sources Architect scalable data solutions leveraging Python for automation, scripting, and engineering tasks Oversee workflow orchestration using Apache Airflow to ensure timely and reliable data processing Provide leadership and develop robust forecasting models to support sales and marketing strategies Develop advanced SQL queries for data extraction, manipulation, analysis, and database management Implement best practices in data modeling and transformation using Snowflake and DBT; exposure to Cosmos DB is a plus Ensure code quality through version control best practices using GitHub Collaborate with cross-functional teams to understand business requirements and translate them into actionable analytics solutions Stay updated with the latest trends in digital analytics; familiarity or hands-on experience with Adobe Experience Platform (AEP) / Customer Journey Analytics (CJO) is highly desirable Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). 
The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so Required Qualifications Master’s or Bachelor’s degree in Computer Science, Information Systems, Engineering, Mathematics, Statistics, Business Analytics, or a related field 8+ years of progressive experience in digital analytics, data analytics or business intelligence roles Experience with data modeling and transformation using tools such as DBT and Snowflake; familiarity with Cosmos DB is a plus Experience developing forecasting models and conducting predictive analytics to drive business strategy Advanced proficiency in web and digital analytics platforms (Adobe Analytics) Proficiency in ETL/ELT pipeline development and workflow orchestration (Apache Airflow) Skilled in creating interactive dashboards and reports using Power BI or similar BI tools Deep understanding of digital marketing metrics, KPIs, attribution models, and customer journey analysis Industry certifications relevant to digital analytics or cloud data platforms Ability to deliver clear digital reporting and actionable insights to stakeholders at all organizational levels At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission. #NJP
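Since workflow orchestration with Apache Airflow features prominently in this role, a bare-bones DAG sketch may be useful context. The DAG id, schedule, and placeholder callables are invented for illustration and are not Optum's actual pipelines.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_web_metrics(**context):
    """Placeholder: pull the previous day's digital-analytics extract."""
    print("extracting metrics for", context["ds"])

def load_to_warehouse(**context):
    """Placeholder: load transformed metrics into the reporting warehouse."""
    print("loading metrics for", context["ds"])

with DAG(
    dag_id="digital_analytics_daily",
    start_date=datetime(2024, 1, 1),
    schedule="0 6 * * *",  # daily at 06:00; older Airflow versions use schedule_interval
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=10)},
) as dag:
    extract = PythonOperator(task_id="extract_web_metrics", python_callable=extract_web_metrics)
    load = PythonOperator(task_id="load_to_warehouse", python_callable=load_to_warehouse)

    extract >> load
```

In practice the load step would hand off to DBT models in Snowflake, with Power BI datasets refreshed downstream.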
Posted 1 week ago
7.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Role : Big Data Developer Location : Chennai Experience : 7+ years Work Mode : Work from Office Key Skills Required Google Cloud Platform (GCP) BigQuery (BQ) Dataflow Dataproc Cloud Spanner Strong knowledge of distributed systems, data processing frameworks, and big data architecture. Proficiency in programming languages like Python, Java, or Scala. Roles And Responsibilities BigQuery (BQ): Design and develop scalable data warehouses using BigQuery. Optimize SQL queries for performance and cost-efficiency in BigQuery. Implement data partitioning and clustering strategies. Dataflow: Build and maintain batch and streaming data pipelines using Apache Beam on GCP Dataflow. Ensure data transformation, enrichment, and cleansing as per business needs. Monitor and troubleshoot pipeline performance issues. Dataproc: Develop and manage Spark and Hadoop jobs on GCP Dataproc. Perform ETL/ELT operations using PySpark, Hive, or other tools. Automate and orchestrate jobs for scheduled data workflows. Cloud Spanner: Design and manage globally distributed, scalable transactional databases using Cloud Spanner. Optimize schema and query design for performance and reliability. Implement high availability and disaster recovery strategies. General Responsibilities: Collaborate with data architects, analysts, and business stakeholders to understand data requirements. Implement data quality and data governance best practices. Ensure security and compliance with GCP data handling standards. Participate in code reviews, CI/CD deployments, and Agile development cycles.
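As a small illustration of the BigQuery partitioning and clustering work called out above, the sketch below uses the google-cloud-bigquery Python client; the project, dataset, table, and column names are hypothetical.

```python
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # placeholder project

# Create a date-partitioned, clustered table so downstream queries can
# prune partitions and reduce scanned bytes (and therefore cost).
ddl = """
CREATE TABLE IF NOT EXISTS `example-project.analytics.transactions`
(
  txn_id STRING,
  txn_ts TIMESTAMP,
  merchant_id STRING,
  amount NUMERIC
)
PARTITION BY DATE(txn_ts)
CLUSTER BY merchant_id
"""
client.query(ddl).result()

# Filtering on the partitioning column means only the needed partitions are read.
query = """
SELECT merchant_id, SUM(amount) AS total_amount
FROM `example-project.analytics.transactions`
WHERE DATE(txn_ts) BETWEEN @start AND @end
GROUP BY merchant_id
"""
job = client.query(
    query,
    job_config=bigquery.QueryJobConfig(
        query_parameters=[
            bigquery.ScalarQueryParameter("start", "DATE", "2024-01-01"),
            bigquery.ScalarQueryParameter("end", "DATE", "2024-01-31"),
        ]
    ),
)
for row in job.result():
    print(row.merchant_id, row.total_amount)
```

Partition pruning on DATE(txn_ts) combined with clustering on merchant_id is the usual lever for the performance and cost optimization the posting mentions.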
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
kochi, kerala
On-site
We are seeking a talented Full Stack Developer to create scalable software solutions as part of a collaborative team responsible for the entire software development lifecycle, from inception to deployment. As a Full Stack Developer, you should possess proficiency in both front-end and back-end coding languages, development frameworks, and third-party libraries. A strong team player with skills in visual design and utility is ideal for this role. Familiarity with Agile methodologies is a plus. Location: Kochi, Kerala (India) Requirements and Skills: - 4 to 8 years of experience - Demonstrated experience as a Full Stack Developer or in a similar role - Experience in developing desktop and mobile applications - Knowledge of common stacks - Proficiency in multiple front-end languages and libraries (e.g., HTML/CSS, JavaScript, XML, jQuery) - Proficiency in multiple back-end languages (e.g., .NET, Java, Python) and JavaScript frameworks (e.g., Angular, React, Node.js) - Familiarity with databases (e.g., MySQL, MongoDB), web servers (e.g., Apache), and UI/UX design - Excellent communication and teamwork skills - Strong attention to detail - Organizational skills - Analytical mindset - Degree in Computer Science, Statistics, or a relevant field Responsibilities: - Collaborate with development teams and product managers to conceptualize software solutions - Design client-side and server-side architecture - Develop visually appealing front-end applications - Create and maintain functional databases and applications - Write efficient APIs - Conduct software testing to ensure responsiveness and effectiveness - Troubleshoot, debug, and enhance software - Implement security and data protection measures - Develop mobile-responsive features and applications - Prepare technical documentation - Collaborate with data scientists and analysts to enhance software performance,
Posted 1 week ago
5.0 years
6 - 10 Lacs
Bangalore City, Bengaluru, Karnataka
On-site
INetFrame Technologies is a dynamic IT services company focused on delivering innovative solutions to clients globally. We foster a collaborative culture that encourages growth, learning, and cutting-edge technology adoption. We are seeking a skilled Performance Test Engineer to join our client’s team on a contract-to-hire basis, transitioning to a full-time role with INetFrame Technologies. Preferred Qualifications Graduate or Post Graduate degree in Computer Science or equivalent qualification Minimum 5 years of deep experience in the performance testing domain for large-scale web applications. Sound understanding of and technical depth in Apache JMeter and the plugins used for JMeter test development. Deep knowledge of monitoring tools such as Dynatrace, AppDynamics, and Grafana, including creating custom dashboards. Ability to identify memory leaks, connection issues, and other bottleneck problems in the application. Ability to analyze and identify issues with database (Oracle, MySQL, SQL Server, etc.) integrations using web services/REST. Job Type: Contract Pay: ₹600,000.00 - ₹1,000,000.00 per year Ability to commute/relocate: Bangalore City, Bengaluru, Karnataka: Reliably commute or planning to relocate before starting work (Preferred) Work Location: In person
Posted 1 week ago
7.0 - 11.0 years
0 Lacs
Coimbatore, Tamil Nadu
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Role: SAP BTP Full Stack Developer Level: Manager (7-10 years experience) As a SAP BTP Full Stack Developer at EY, you will work with development teams and product managers to ideate software solutions. You will understand and design client-side and server-side architecture, build the front-end of applications through appealing visual design, write and integrate effective APIs, and test software to ensure responsiveness and efficiency. Additionally, you will develop and manage well-functioning databases and applications, write technical documentation, and have an understanding of SAP BTP, its services, and deployment approach. Skills required for this role include proven experience as a Full Stack Developer or similar role, experience in developing desktop and mobile applications, knowledge of multiple front-end languages and libraries such as HTML/CSS, JavaScript, XML, knowledge of multiple back-end languages like C#, Java, Python, and JavaScript frameworks like Angular, React, Node.js. Familiarity with databases like MySQL, MongoDB, web servers like Apache, and UI/UX design is essential. Excellent communication and teamwork skills are also crucial, along with hands-on experience with SAP BTP service uses and deployment strategy. EY exists to build a better working world, helping to create long-term value for clients, people, and society, and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.,
Posted 1 week ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About Company: They balance innovation with an open, friendly culture and the backing of a long-established parent company known for its ethical reputation. We guide customers from what’s now to what’s next by unlocking the value of their data and applications to solve their digital challenges, achieving outcomes that benefit both business and society.
About Client: Our client is a global digital solutions and technology consulting company headquartered in Mumbai, India. The company generates annual revenue of over $4.29 billion (₹35,517 crore), reflecting a 4.4% year-over-year growth in USD terms. It has a workforce of around 86,000 professionals operating in more than 40 countries and serves a global client base of over 700 organizations. Our client operates across several major industry sectors, including Banking, Financial Services & Insurance (BFSI), Technology, Media & Telecommunications (TMT), Healthcare & Life Sciences, and Manufacturing & Consumer. In the past year, the company achieved a net profit of $553.4 million (₹4,584.6 crore), marking a 1.4% increase from the previous year. It also recorded a strong order inflow of $5.6 billion, up 15.7% year-over-year, highlighting growing demand across its service lines. Key focus areas include Digital Transformation, Enterprise AI, Data & Analytics, and Product Engineering, reflecting its strategic commitment to driving innovation and value for clients across industries.
Job Description: Python API / FastAPI Developer
Location: Hyderabad
Who are we looking for?
We are seeking a Python Developer with strong expertise in Python and databases and hands-on experience in Azure cloud technologies. The role will focus on migrating processes from the current third-party RPA modules to Apache Airflow, ensuring seamless orchestration and automation of workflows. The ideal candidate will bring technical proficiency, problem-solving skills, and a deep understanding of workflow automation, along with a strong grasp of North America insurance industry processes.
Responsibilities:
· Design, develop, and implement workflows using Apache Airflow to replace the current third-party RPA modules.
· Build and optimize Python scripts to enable automation and integration with Apache Airflow pipelines.
· Leverage Azure cloud services for deployment, monitoring, and scaling of Airflow.
· Collaborate with cross-functional teams to understand existing processes, dependencies, and business objectives.
· Lead the migration of critical processes such as Auto, Package, Work Order Processing, and Policy Renewals within CI, Major Accounts, and Middle Market LOBs.
· Ensure the accuracy, efficiency, and scalability of new workflows post-migration.
· Perform unit testing, troubleshooting, and performance tuning for workflows and scripts.
· Document workflows, configurations, and technical details to maintain clear and comprehensive project records.
· Mentor junior developers and share best practices for Apache Airflow and Python.
Technical Skills:
· Proficiency in Python programming for API development, scripting, data transformation, process automation, and database interactions.
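As a loose illustration of the Airflow work described in this posting, here is a minimal sketch (assuming Apache Airflow 2.4+) of a DAG that could stand in for an RPA process being migrated. The DAG name, schedule, and extract/load callables are hypothetical placeholders, not the actual workflows referenced above.

```python
# A minimal sketch (assuming Apache Airflow 2.4+) of a DAG replacing an RPA-driven process
# such as policy renewals. All names, schedules, and helper functions are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_renewals(**context):
    # Placeholder: pull pending policy-renewal records from the source system.
    print("extracting renewal records")


def load_renewals(**context):
    # Placeholder: write transformed records to the target system.
    print("loading renewal records")


with DAG(
    dag_id="policy_renewals_migration_demo",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                        # Airflow 2.4+ parameter name
    catchup=False,
    tags=["example"],
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_renewals)
    load = PythonOperator(task_id="load", python_callable=load_renewals)

    extract >> load
```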
Posted 1 week ago
5.0 years
0 Lacs
Bengaluru, Karnataka
Remote
Who we are
At Twilio, we’re shaping the future of communications, all from the comfort of our homes. We deliver innovative solutions to hundreds of thousands of businesses and empower millions of developers worldwide to craft personalized customer experiences. Our dedication to remote-first work and a strong culture of connection and global inclusion means that no matter your location, you’re part of a vibrant team with diverse experiences making a global impact each day. As we continue to revolutionize how the world interacts, we’re acquiring new skills and experiences that make work feel truly rewarding. Your career at Twilio is in your hands.
See yourself at Twilio
Join the team as our next Senior Machine Learning Engineer (L3) in our Comms Platform Engineering team.
About the job
This position is needed to scope, design, and deploy machine learning systems into the real world; the individual will closely partner with Product & Engineering teams to execute the roadmap for Twilio's AI/ML products and services. Twilio is looking for a Senior Machine Learning Engineer to join the rapidly growing Comms Platform Engineering team of our Messaging business unit. You will understand the needs of our customers and build data products that solve those needs at a global scale. Working side by side with other engineering teams and product counterparts, you will own end-to-end execution of ML solutions. To thrive in this role, you must have a background in ML engineering and a track record of solving data and machine-learning problems at scale. You are a self-starter, embody a growth mindset, and collaborate effectively across the entire Twilio organization.
Responsibilities
In this role, you'll:
Build and maintain scalable machine learning solutions in production
Train and validate both deep learning-based and statistical models, considering use case, complexity, performance, and robustness
Demonstrate end-to-end understanding of applications and develop a deep understanding of the "why" behind our models and systems
Partner with product managers, tech leads, and stakeholders to analyze business problems, clarify requirements, and define the scope of the systems needed
Work closely with data platform teams to build robust, scalable batch and real-time data pipelines
Work closely with software engineers and build tools to enhance productivity and to ship and maintain ML models
Drive engineering best practices around code reviews, automated testing, and monitoring
Qualifications:
Not all applicants will have skills that match a job description exactly. Twilio values diverse experiences in other industries, and we encourage everyone who meets the required qualifications to apply. While having "desired" qualifications makes for a strong candidate, we encourage applicants with alternative experiences to also apply. If your career is just starting or hasn't followed a traditional path, don't let that stop you from considering Twilio. We are always looking for people who will bring something new to the table!
Required:
5+ years of applied ML experience
Proficiency in Python is preferred; we will also consider strong quantitative candidates with a background in other programming languages
Strong background in the foundations of machine learning and the building blocks of modern deep learning
Track record of building, shipping, and maintaining machine learning models in production in an ambiguous and fast-paced environment
You have a clear understanding of frameworks like PyTorch, TensorFlow, or Keras, and of why and how these frameworks do what they do
Familiarity with MLOps concepts related to maintaining models in production, such as testing, retraining, and monitoring
Demonstrated ability to ramp up, understand, and operate effectively in new application/business domains
You've explored some of the modern data storage, messaging, and processing tools (Kafka, Apache Spark, Hadoop, Presto, DynamoDB, etc.)
Experience working in an agile team environment with changing priorities
Experience working on AWS
Desired:
Experience with Large Language Models
Location
This role will be remote and based in India (only in Karnataka, Tamil Nadu, Maharashtra, Telangana, and New Delhi).
Travel
We prioritize connection and opportunities to build relationships with our customers and each other. For this role, you may be required to travel occasionally to participate in project or team in-person meetings.
What We Offer
Working at Twilio offers many benefits, including competitive pay, generous time off, ample parental and wellness leave, healthcare, a retirement savings program, and much more. Offerings vary by location.
Twilio thinks big. Do you? We like to solve problems, take initiative, pitch in when needed, and are always up for trying new things. That's why we seek out colleagues who embody our values, something we call Twilio Magic. Additionally, we empower employees to build positive change in their communities by supporting their volunteering and donation efforts. So, if you're ready to unleash your full potential, do your best work, and be the best version of yourself, apply now! If this role isn't what you're looking for, please consider other open positions.
Twilio is proud to be an equal opportunity employer. We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, reproductive health decisions, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, genetic information, political views or activity, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law. Qualified applicants with arrest or conviction records will be considered for employment in accordance with the Los Angeles County Fair Chance Ordinance for Employers and the California Fair Chance Act. Additionally, Twilio participates in the E-Verify program in certain locations, as required by law.
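As an illustration of the train-and-validate responsibility listed above, here is a minimal PyTorch sketch on synthetic data. The model size, data, and hyperparameters are placeholders and do not represent any production system.

```python
# A minimal, illustrative PyTorch train/validate loop. Synthetic data and hyperparameters
# are placeholders chosen only to keep the sketch self-contained.
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# Synthetic binary-classification data: 1,000 samples, 20 features.
X = torch.randn(1000, 20)
y = (X.sum(dim=1) > 0).float().unsqueeze(1)
train_ds = TensorDataset(X[:800], y[:800])
val_ds = TensorDataset(X[800:], y[800:])

model = nn.Sequential(nn.Linear(20, 32), nn.ReLU(), nn.Linear(32, 1))
loss_fn = nn.BCEWithLogitsLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

for epoch in range(5):
    model.train()
    for xb, yb in DataLoader(train_ds, batch_size=64, shuffle=True):
        optimizer.zero_grad()
        loss = loss_fn(model(xb), yb)
        loss.backward()
        optimizer.step()

    # Validation: accuracy on the held-out split.
    model.eval()
    correct = total = 0
    with torch.no_grad():
        for xb, yb in DataLoader(val_ds, batch_size=64):
            preds = (torch.sigmoid(model(xb)) > 0.5).float()
            correct += (preds == yb).sum().item()
            total += yb.numel()
    print(f"epoch {epoch}: val accuracy {correct / total:.3f}")
```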
Posted 1 week ago
8.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Category: Software Development/Engineering
Main location: India, Karnataka, Bangalore
Position ID: J0725-1837
Employment Type: Full Time
Position Description:
Company Profile: At CGI, we’re a team of builders. We call our employees members because all who join CGI are building their own company - one that has grown to 72,000 professionals located in 40 countries. Founded in 1976, CGI is a leading IT and business process services firm committed to helping clients succeed. We have the global resources, expertise, stability and dedicated professionals needed to achieve results for our clients - and for our members. Come grow with us. Learn more at www.cgi.com.
This is a great opportunity to join a winning team. CGI offers a competitive compensation package with opportunities for growth and professional development. Benefits for full-time, permanent members start on the first day of employment and include a paid time-off program and profit participation and stock purchase plans. We wish to thank all applicants for their interest and effort in applying for this position; however, only candidates selected for interviews will be contacted. No unsolicited agency referrals please.
Job Title: Lead Data Engineer and Developer
Position: Tech Lead
Experience: 8+ Years
Category: Software Development
Main location: Hyderabad, Chennai
Position ID: J0625-0503
Employment Type: Full Time
Your future duties and responsibilities:
Lead Data Engineers and Developers with clarity on execution, design, architecture and problem solving. Strong understanding of Cloud engineering concepts, particularly AWS. Participate in Sprint planning and squad operational activities to guide the team on the right prioritization.
Required qualifications to be successful in this role:
Must have skills:
SQL - Expert (Must have)
AWS (Redshift/Lambda/Glue/SQS/SNS/CloudWatch/Step Functions/CDK (or Terraform)) - Expert (Must have)
PySpark - Intermediate/Expert
Python - Intermediate (Must have, or PySpark knowledge)
Good to have skills:
AWS Airflow - Intermediate (Nice to have)
Skills: Apache Spark, Python, SQL
What you can expect from us: Together, as owners, let’s turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect and belonging. Here, you’ll reach your full potential because… You are invited to be an owner from day 1 as we work together to bring our Dream to life. That’s why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company’s strategy and direction. Your work creates value.
You’ll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise. You’ll shape your career by joining a company built to grow and last. You’ll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons. Come join our team—one of the largest IT and business consulting services firms in the world.
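For illustration of the PySpark and AWS skills this posting calls for, a minimal sketch of a batch aggregation follows. The S3 path, table layout, and column names are hypothetical placeholders.

```python
# Illustrative only: a small PySpark aggregation of the sort a lead data engineer on an
# AWS stack (Glue/Redshift) might review. Paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily_summary_demo").getOrCreate()

orders = spark.read.parquet("s3://example-bucket/orders/")  # hypothetical S3 location

daily_summary = (
    orders
    .filter(F.col("status") == "COMPLETED")
    .groupBy(F.to_date("order_ts").alias("order_date"))
    .agg(
        F.count("*").alias("order_count"),
        F.sum("amount").alias("total_amount"),
    )
    .orderBy("order_date")
)

daily_summary.show(10, truncate=False)
```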
Posted 1 week ago
5.0 - 8.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Category: Testing/Quality Assurance
Main location: India, Karnataka, Bangalore
Position ID: J0725-1838
Employment Type: Full Time
Position Description:
Company Profile: At CGI, we’re a team of builders. We call our employees members because all who join CGI are building their own company - one that has grown to 72,000 professionals located in 40 countries. Founded in 1976, CGI is a leading IT and business process services firm committed to helping clients succeed. We have the global resources, expertise, stability and dedicated professionals needed to achieve results for our clients - and for our members. Come grow with us. Learn more at www.cgi.com.
This is a great opportunity to join a winning team. CGI offers a competitive compensation package with opportunities for growth and professional development. Benefits for full-time, permanent members start on the first day of employment and include a paid time-off program and profit participation and stock purchase plans. We wish to thank all applicants for their interest and effort in applying for this position; however, only candidates selected for interviews will be contacted. No unsolicited agency referrals please.
Job Description:
Job Title: ETL Testing
Experience: 5-8 Years
Location: Chennai, Bangalore
Employment Type: Full Time
Job Type: Work from Office (Monday - Friday)
Shift Timing: 12:30 PM to 9:30 PM
Required Skills: Analytical skills to understand requirements and develop test cases, the ability to understand and manage data, strong SQL skills, hands-on testing of data pipelines built using Glue, S3, Redshift and Lambda, collaboration with developers to build automated testing where appropriate, and an understanding of data concepts such as data lineage, data integrity and quality; experience testing financial data is a plus.
Your future duties and responsibilities:
Expert-level analytical and problem-solving skills; able to show flexibility regarding testing. Awareness of Quality Management tools and techniques. Ensures best-practice quality assurance of deliverables; understands and works within agreed architectural, process, data and organizational frameworks. Advanced communication skills; fluent in English (written/verbal) and the local language as appropriate. Open-minded; able to share information and transfer knowledge and expertise to team members.
Required qualifications to be successful in this role:
Must have skills: ETL, SQL, hands-on testing of data pipelines, Glue, S3, Redshift, data lineage, data integrity
Good to have skills: Experience testing financial data is a plus.
Skills: Apache Spark, Python, SQL
What you can expect from us: Together, as owners, let’s turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect and belonging. Here, you’ll reach your full potential because… You are invited to be an owner from day 1 as we work together to bring our Dream to life. That’s why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company’s strategy and direction. Your work creates value.
You’ll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise. You’ll shape your career by joining a company built to grow and last. You’ll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons. Come join our team—one of the largest IT and business consulting services firms in the world.
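To illustrate the kind of automated data-pipeline checks this posting describes, here is a minimal pytest sketch that reconciles row counts and checks for null business keys in Redshift, queried over its PostgreSQL-compatible interface via psycopg2. The connection details, schema, and table names are hypothetical.

```python
# Illustrative pytest-style data checks: row-count reconciliation and a basic integrity rule.
# Connection details and table names are hypothetical; Redshift is reached via psycopg2.
import psycopg2
import pytest


@pytest.fixture(scope="module")
def cursor():
    conn = psycopg2.connect(
        host="example-cluster.redshift.amazonaws.com",  # hypothetical endpoint
        port=5439,
        dbname="analytics",
        user="etl_tester",
        password="change-me",
    )
    yield conn.cursor()
    conn.close()


def test_row_counts_match(cursor):
    # Staging and target should hold the same number of rows after the load.
    cursor.execute("SELECT COUNT(*) FROM staging.transactions")
    staged = cursor.fetchone()[0]
    cursor.execute("SELECT COUNT(*) FROM mart.transactions")
    loaded = cursor.fetchone()[0]
    assert staged == loaded, f"row count mismatch: staging={staged}, mart={loaded}"


def test_no_null_business_keys(cursor):
    cursor.execute("SELECT COUNT(*) FROM mart.transactions WHERE transaction_id IS NULL")
    assert cursor.fetchone()[0] == 0, "null transaction_id values found in mart.transactions"
```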
Posted 1 week ago
5.0 - 8.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Category: Business Analysis (functional and technical)
Main location: India, Karnataka, Bangalore
Position ID: J0725-1836
Employment Type: Full Time
Position Description:
Company Profile: At CGI, we’re a team of builders. We call our employees members because all who join CGI are building their own company - one that has grown to 72,000 professionals located in 40 countries. Founded in 1976, CGI is a leading IT and business process services firm committed to helping clients succeed. We have the global resources, expertise, stability and dedicated professionals needed to achieve results for our clients - and for our members. Come grow with us. Learn more at www.cgi.com.
This is a great opportunity to join a winning team. CGI offers a competitive compensation package with opportunities for growth and professional development. Benefits for full-time, permanent members start on the first day of employment and include a paid time-off program and profit participation and stock purchase plans. We wish to thank all applicants for their interest and effort in applying for this position; however, only candidates selected for interviews will be contacted. No unsolicited agency referrals please.
Job Title: Business System Analyst
Experience: 5-8 Years
Location: Chennai, Bangalore
Employment Type: Full Time
Job Type: Work from Office (Monday - Friday)
Shift Timing: 12:30 PM to 9:30 PM
Strong Financial Services (preferably Banking) experience; translate financial and accounting concepts into business and systems requirements; data analysis; identify data anomalies and provide remediation options; data mapping; strong database design concepts; good familiarity with SQL; assist in the creation of metadata, data lineage and data flow diagrams; support UAT planning and execution functions.
Your future duties and responsibilities:
Collaborate with business stakeholders to understand their goals, processes, and requirements. Gather, document, and analyze business requirements, user stories, and workflows. Translate business needs into functional and technical specifications. Liaise between business units and IT teams to ensure solutions align with business objectives. Assist in the design, testing, implementation, and support of business systems and applications. Develop process models, data flow diagrams, and use cases. Support system integrations, data migrations, and application enhancements.
Required qualifications to be successful in this role:
Must have skills: Financial and accounting concepts, data analysis, identifying data anomalies, SQL.
Good to have: Strong Financial Services (preferably Banking) experience; creation of metadata, data lineage and data flow diagrams; UAT planning and execution.
Skills: Apache Spark, Python, SQL
What you can expect from us: Together, as owners, let’s turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect and belonging. Here, you’ll reach your full potential because… You are invited to be an owner from day 1 as we work together to bring our Dream to life. That’s why we call ourselves CGI Partners rather than employees.
We benefit from our collective success and actively shape our company’s strategy and direction. Your work creates value. You’ll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise. You’ll shape your career by joining a company built to grow and last. You’ll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons. Come join our team—one of the largest IT and business consulting services firms in the world.
Posted 1 week ago
0.0 - 8.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Category: Software Development/Engineering
Main location: India, Karnataka, Bangalore
Position ID: J0725-1834
Employment Type: Full Time
Position Description:
Company Profile: At CGI, we’re a team of builders. We call our employees members because all who join CGI are building their own company - one that has grown to 72,000 professionals located in 40 countries. Founded in 1976, CGI is a leading IT and business process services firm committed to helping clients succeed. We have the global resources, expertise, stability and dedicated professionals needed to achieve results for our clients - and for our members. Come grow with us. Learn more at www.cgi.com.
This is a great opportunity to join a winning team. CGI offers a competitive compensation package with opportunities for growth and professional development. Benefits for full-time, permanent members start on the first day of employment and include a paid time-off program and profit participation and stock purchase plans. We wish to thank all applicants for their interest and effort in applying for this position; however, only candidates selected for interviews will be contacted. No unsolicited agency referrals please.
Job Description:
Job Title: Data Engineer
Experience: 5-8 Years
Location: Chennai, Bangalore
Employment Type: Full Time
Job Type: Work from Office (Mon-Fri)
Shift Timing: 12:30 PM to 9:30 PM
Required Skills:
5-8 years of experience as a back-end data engineer. Strong experience in SQL. Strong knowledge of and experience with Python and PySpark. Experience in AWS. Experience in Docker and OpenShift. Hands-on experience with REST concepts. Design and develop business solutions on the data front. Experience in implementing new enhancements and handling defect triage. The candidate must have strong analytical abilities.
Skills/Competencies Additionally Preferred:
Jira, Bitbucket. Experience with Kafka. Experience with Snowflake. Domain knowledge in Banking. Analytical skills. Excellent communication skills. Working knowledge of Agile.
Your future duties and responsibilities:
Design and develop business solutions on the data front. Implement new enhancements and handle defect triage. The candidate must have strong analytical abilities.
Required qualifications to be successful in this role:
Must have skills: SQL, Python and PySpark, AWS, Docker and OpenShift, REST concepts.
Good to have skills: Jira, Bitbucket, experience with Kafka, experience with Snowflake, domain knowledge in Banking, analytical skills, excellent communication skills, working knowledge of Agile.
Skills: Apache Spark, Python, SQL
What you can expect from us: Together, as owners, let’s turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect and belonging. Here, you’ll reach your full potential because… You are invited to be an owner from day 1 as we work together to bring our Dream to life. That’s why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company’s strategy and direction. Your work creates value.
You’ll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise. You’ll shape your career by joining a company built to grow and last. You’ll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons. Come join our team—one of the largest IT and business consulting services firms in the world.
Posted 1 week ago
0.0 - 3.0 years
0 Lacs
Hyderabad, Telangana
On-site
Location: Hyderabad, Telangana
Time type: Full time
Job level: Senior Associate
Job type: Regular
Category: Technology Consulting
ID: JR111910
About us
We are the leading provider of professional services to the middle market globally; our purpose is to instill confidence in a world of change, empowering our clients and people to realize their full potential. Our exceptional people are the key to our unrivaled, inclusive culture and talent experience and our ability to be compelling to our clients. You’ll find an environment that inspires and empowers you to thrive both personally and professionally. There’s no one like you and that’s why there’s nowhere like RSM.
Snowflake Engineer
We are currently seeking an experienced Snowflake Engineer for our Data Analytics team. This role involves designing, building, and maintaining our Snowflake cloud data warehouse. Candidates should have strong Snowflake, SQL, and cloud data solutions experience.
Responsibilities
Design, develop, and maintain efficient and scalable data pipelines in Snowflake, encompassing data ingestion, transformation, and loading (ETL/ELT).
Implement and manage Snowflake security, including role-based access control, network policies, and data encryption.
Develop and maintain data models optimized for analytical reporting and business intelligence.
Collaborate with data analysts, scientists, and stakeholders to understand data requirements and translate them into technical solutions.
Monitor and troubleshoot Snowflake performance, identifying and resolving bottlenecks.
Automate data engineering processes using scripting languages (e.g., Python, SQL) and orchestration tools (e.g., Airflow, dbt).
Design, develop, and deploy APIs within Snowflake using stored procedures and user-defined functions (UDFs).
Lead and mentor a team of data engineers and analysts, providing technical guidance, coaching, and professional development opportunities.
Stay current with the latest Snowflake features and best practices.
Contribute to the development of data engineering standards and best practices.
Document data pipelines, data models, and other technical specifications.
Qualifications
Bachelor’s degree or higher in Computer Science, Information Technology, or a related field.
A minimum of 5 years of experience in data engineering and management, including over 3 years of working with Snowflake.
Strong understanding of data warehousing concepts, including dimensional modeling, star schemas, and snowflake schemas.
Proficiency in SQL and experience with data transformation and manipulation.
Experience with ETL/ELT tools and processes.
Experience with Apache Iceberg.
Strong analytical and problem-solving skills.
Excellent communication and collaboration skills.
Preferred qualifications
Snowflake certifications (e.g., SnowPro Core Certification).
Experience with scripting languages (e.g., Python) and automation tools (e.g., Airflow, dbt).
Experience with cloud platforms (e.g., AWS, Azure, GCP).
Experience with data visualization tools (e.g., Tableau, Power BI).
Experience with Agile development methodologies.
Experience with Snowflake Cortex, including Cortex Analyst, Arctic TILT, and Snowflake AI & ML Studio.
At RSM, we offer a competitive benefits and compensation package for all our people. We offer flexibility in your schedule, empowering you to balance life’s demands, while also maintaining your ability to serve clients. Learn more about our total rewards at https://rsmus.com/careers/india.html.
RSM does not tolerate discrimination and/or harassment based on race; colour; creed; sincerely held religious beliefs, practices or observances; sex (including pregnancy or disabilities related to nursing); gender (including gender identity and/or gender expression); sexual orientation; HIV Status; national origin; ancestry; familial or marital status; age; physical or mental disability; citizenship; political affiliation; medical condition (including family and medical leave); domestic violence victim status; past, current or prospective service in the Indian Armed Forces; Indian Armed Forces Veterans, and Indian Armed Forces Personnel status; pre-disposing genetic characteristics or any other characteristic protected under applicable provincial employment legislation. Accommodation for applicants with disabilities is available upon request in connection with the recruitment process and/or employment/partnership. RSM is committed to providing equal opportunity and reasonable accommodation for people with disabilities. If you require a reasonable accommodation to complete an application, interview, or otherwise participate in the recruiting process, please send us an email at careers@rsmus.com.
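As a loose illustration of the "APIs within Snowflake using stored procedures and user-defined functions (UDFs)" responsibility listed earlier in this posting, here is a minimal sketch that deploys and calls a SQL UDF from Python using the snowflake-connector-python package. The connection parameters and the UDF itself are hypothetical placeholders.

```python
# A minimal sketch of deploying and calling a SQL UDF in Snowflake from Python.
# Requires snowflake-connector-python; all connection parameters and names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",   # hypothetical account identifier
    user="example_user",
    password="change-me",
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)

create_udf = """
CREATE OR REPLACE FUNCTION mask_email(email STRING)
RETURNS STRING
AS
$$
    CASE
        WHEN email IS NULL THEN NULL
        ELSE CONCAT('***@', SPLIT_PART(email, '@', 2))
    END
$$
"""

try:
    cur = conn.cursor()
    cur.execute(create_udf)                                   # deploy the UDF
    cur.execute("SELECT mask_email('jane.doe@example.com')")  # call it
    print(cur.fetchone()[0])                                  # expected: ***@example.com
finally:
    conn.close()
```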
Posted 1 week ago