
6241 Scala Jobs - Page 47

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

12.0 - 18.0 years

40 - 75 Lacs

Bengaluru

Hybrid

- Backend applications using Java/J2EE, RESTful web services, HTTP and JSON
- 5+ years in a techno-managerial role
- Expertise in Python and Java, with a deep understanding of their ecosystems and frameworks
- Expertise with Node.js / JavaScript / Scala

Posted 3 weeks ago

Apply

5.0 - 10.0 years

30 - 32 Lacs

Pune

Hybrid

Let me tell you about the role: We are looking for an Information Security Engineering Specialist with strong knowledge of security fundamentals who is eager to apply it in complex environments. In this role, you will assist in implementing security controls, executing vulnerability assessments, and supporting automation initiatives. The position will emphasize one or more of the following areas: cloud security, infrastructure security, and/or data security. You will have the opportunity to learn and grow under the mentorship of senior engineers, while also contributing to critical security tasks that keep our organization safe.

What you will deliver
- Define security policies that improve our cloud, infrastructure, or data security posture.
- Integrate our vulnerability assessment tooling into our environments to provide continuous scans, uncovering vulnerabilities, misconfigurations, or potential security gaps.
- Work with engineering teams to support the remediation and validation of vulnerability mitigations and fixes.
- Integrate security validations into continuous integration/continuous delivery (CI/CD) pipelines and develop scripts to automate security tasks.
- Maintain clear, detailed documentation of security procedures and policies, including how to embed and measure security in our cloud, infrastructure, or data environments.

What you will need to be successful (experience and qualifications)
- Seasoned security professional with 3+ years delivering security engineering services and/or building security solutions within a complex organization.
- Practical experience designing, planning, productizing, maintaining, and documenting reliable and scalable data, infrastructure, cloud, and/or platform solutions in complex environments.
- Firm foundation in information and cyber security principles and standard processes.
- Professional and technical security certifications such as CISSP, CISM, GEVA, CEH, OSCP, or equivalent are a plus.
- Development experience in one or more object-oriented programming languages (e.g., Python, Scala, Java, C#) and/or cloud environments (including AWS, Azure, Alibaba, etc.).
- Exposure to or experience with full-stack development.
- Experience with security tooling (vulnerability scanners, CNAPP, endpoint and/or DLP) and with automation and scripting for security tasks (e.g., CI/CD integration).
- Familiarity with security frameworks such as NIST CSF, NIST 800-53, ISO 27001, etc.
- Foundational knowledge of security standards, industry laws, and regulations such as the Payment Card Industry Data Security Standard (PCI-DSS), General Data Protection Regulation (GDPR), California Consumer Privacy Act (CCPA), and Sarbanes-Oxley (SOX).
- A continuous learning and improvement approach.

This position is a hybrid of office/remote working.
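The "scripts to automate security tasks" in CI/CD pipelines mentioned above often amount to small gate checks on scanner output. A minimal Python sketch, assuming a hypothetical report format (the `findings`/`severity` fields are invented; real tools such as Trivy or Grype each define their own JSON schema):

```python
import json

# Hypothetical scanner report; real scanners emit their own JSON schemas.
SAMPLE_REPORT = json.dumps({
    "findings": [
        {"id": "CVE-2024-0001", "severity": "HIGH", "package": "libexample"},
        {"id": "CVE-2024-0002", "severity": "LOW", "package": "libother"},
    ]
})

def gate(report_json, blocked_severities=("CRITICAL", "HIGH")):
    """Return (passed, blocking_finding_ids) for a CI/CD security gate."""
    findings = json.loads(report_json)["findings"]
    blocking = [f["id"] for f in findings if f["severity"] in blocked_severities]
    return (not blocking, blocking)

passed, blocking = gate(SAMPLE_REPORT)
print(passed, blocking)  # False ['CVE-2024-0001']
```

In a pipeline, a non-empty blocking list would translate into a non-zero exit code that fails the build.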

Posted 3 weeks ago

Apply

7.0 - 12.0 years

9 - 12 Lacs

Bengaluru

Work from Office

Responsibilities: * Design, develop, test & maintain Scala applications using Spark. * Collaborate with cross-functional teams on project delivery. * Optimize application performance through data analysis.
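Spark applications like those described above are built around map and reduce steps over partitioned data. A plain-Python sketch of that pattern (the input lines are invented; in Spark itself this would be a `flatMap`/`reduceByKey` chain in Scala):

```python
from collections import Counter
from functools import reduce

# Toy input standing in for partitions of a distributed dataset.
lines = ["spark makes big data small", "big data big wins"]

mapped = [Counter(line.split()) for line in lines]       # map: count per partition
totals = reduce(lambda a, b: a + b, mapped, Counter())   # reduce: merge partial counts
print(totals["big"])  # 3
```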

Posted 3 weeks ago

Apply

2.0 - 7.0 years

15 - 30 Lacs

Bengaluru

Hybrid

Role & responsibilities
- Design, deliver, and maintain significant features in data pipelines, ML processing, and/or service infrastructure
- Optimize software performance to achieve the required throughput and/or latency
- Work with your manager, peers, and Product Managers to scope projects and features
- Come up with a sound technical strategy, taking into consideration the project goals, timelines, and expected impact
- Take point on some cross-team efforts, taking ownership of a business problem and ensuring the different teams are in sync and working towards a coherent technical solution
- Take an active part in knowledge sharing across the organization - both teaching and learning from others

Requirements
- 2+ years of software design and development experience, tackling non-trivial problems in backend services and/or data pipelines
- A solid foundation in Data Structures, Algorithms, Object-Oriented Programming, Software Design, and core Statistics knowledge
- Experience in production-grade coding in Java, and Python/Scala
- Experience in the close examination of data and computation of statistics
- Experience in using and operating Big Data processing pipelines, such as Hadoop and Spark
- Good verbal and written communication and collaboration skills
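The "close examination of data and computation of statistics" called out above can start as simply as grouped summaries over pipeline records. A stdlib-only Python sketch (the event records and field names are invented):

```python
from collections import defaultdict
from statistics import mean, median

# Toy records standing in for pipeline output; field names are illustrative only.
events = [
    {"service": "api", "latency_ms": 120},
    {"service": "api", "latency_ms": 95},
    {"service": "batch", "latency_ms": 640},
    {"service": "api", "latency_ms": 210},
]

def latency_summary(records):
    """Group latencies by service and compute simple summary statistics."""
    by_service = defaultdict(list)
    for r in records:
        by_service[r["service"]].append(r["latency_ms"])
    return {
        svc: {"count": len(xs), "mean": mean(xs), "median": median(xs)}
        for svc, xs in by_service.items()
    }

print(latency_summary(events))
```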

Posted 3 weeks ago

Apply

10.0 years

0 Lacs

Gurugram, Haryana, India

On-site

About The Role
Grade Level (for internal use): 10
Role: Sr. React Fullstack Developer

The Team: C&RS (Credit & Risk Solutions) is part of the Market Intelligence group within S&P Global. Financial Risk Analytics (FRA) delivers information-centric capital markets and risk solutions for trading desks and their risk business partners, supporting risk regulatory compliance. The UI products cover counterparty credit risk, xVA, and market risk for both buy-side and sell-side firms. We are currently investing in our technology and data platform to develop a number of new revenue-generating products, leveraging open-source, big data, and cloud technologies. This role is for a software developer within the FRA software engineering team, building React (TypeScript) UI applications and services and working with databases/cloud.

Responsibilities
- Design and implement UI applications and services.
- Participate in system architecture and design decisions.
- Continuously improve development and testing best practices.
- Interpret and analyse business use-cases and translate feature requests into technical designs and development tasks.
- Take ownership of development tasks and participate in regular design and code review meetings.
- Be delivery focused and keen to participate in the successful implementation and evolution of technology products in close coordination with product managers and colleagues.

Basic Qualifications
- Bachelor's degree in Computer Science, Applied Mathematics, Engineering, or a related discipline, or equivalent experience.
- 10+ years of strong software development experience
- React, TypeScript/JS (ES6), Node.js (Express)
- Experience with SQL relational databases such as PostgreSQL
- Demonstrable experience of using RESTful APIs in a production setting
- Test frameworks (e.g. Jest, Jasmine, Playwright)
- Understanding of CI/CD pipelines
- Linux/Unix, Git
- Agile and XP (Scrum, Kanban, TDD)

Desirable
- Highcharts, DevExtreme, TanStack React components, Bootstrap, HTML5
- Understanding and implementation of security and data protection
- GitLab, containerization platforms
- AWS - CLI, CloudFront, Cognito, S3
- Python, Java/Scala

What's In It For You: You can effectively manage timelines and enjoy working within a team. You can follow relevant technology trends, actively evaluate new technologies, and use this information to improve the product. You get a lot of satisfaction from on-time delivery. Happy clients are important to you. You take pride in your work.

Competencies: You love to solve complex problems, whether that's making the user experience as responsive as possible or understanding complex client requirements. You can confidently present your own ideas and solutions, as well as guide technical discussions. Your welcoming attitude encourages people to approach you when they have a problem you can help them solve.

About S&P Global Market Intelligence: At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep, and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence.

Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology - the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow.
At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our Benefits Include Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. 
Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring And Opportunity At S&P Global At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. Recruitment Fraud Alert If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, “pre-employment training” or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here. Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. 
If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning) Job ID: 284397 Posted On: 2025-07-18 Location: Gurgaon, India

Posted 3 weeks ago

Apply

5.0 years

15 - 18 Lacs

Goregaon, Maharashtra, India

Remote

Business Intelligence Developer - Mumbai (Goregaon East) 27165
Work Mode: Hybrid (4 days office, 1 day WFH)
Shift Timings: 12:30 PM - 9:30 PM
Location: Goregaon East, Nesco (max 1-hour commute preferred)
Interview: 2 rounds, in-person

Responsibilities
- Design and develop ETL pipelines integrating diverse data sources into BI environments.
- Develop dashboards and reports using Microsoft Power BI, SSRS, and other BI tools.
- Ensure data quality, maintain the data catalog/dictionary, and support data marts/lakes.
- Collaborate with business partners to understand needs and translate them into BI solutions.
- Lead development and maintenance of complex BI dashboards and reports.
- Provide user training and support adoption of BI tools.
- Proactively identify opportunities for business growth, risk mitigation, and efficiency.
- Support Microsoft BI platform technologies and innovate solutions for scalability and reuse.

Must-Have Skills & Experience
- 5-7+ years working with the Microsoft BI platform: SQL Server DB, SSIS, SSRS, SSAS, Power BI, Azure Cloud services.
- Strong experience building and maintaining large-scale data integration and ETL processes.
- Proficient in data warehouse architecture, data modeling, and dashboard/report development.
- Expertise in optimizing data integration routines and database design.
- Excellent communication and documentation skills.
- Ability to work independently in a fast-paced environment.

Nice-to-Haves
- Experience with other BI tools like QlikView, Tableau, MicroStrategy, or open-source reporting.
- Cloud-based data platforms (Azure, AWS, Snowflake).
- DevOps experience and CI/CD deployment knowledge.
- Experience with data lakes and Power BI Report Server administration.
- Knowledge of analytics tools like R, Python, Scala, SAS.
Skills: Power BI, business intelligence, communication, dashboards, Azure, data, dashboard/report development, SSAS, documentation, SSRS, data warehouse architecture, ETL, cloud, data integration, SSIS, SQL Server DB, data modeling, Azure Cloud services, design, Microsoft
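ETL pipelines of the kind this role builds follow an extract / transform / load shape. A toy stdlib-only Python sketch (the CSV columns and the filter rule are invented; a real pipeline would target SSIS, a warehouse, or a data lake rather than a list):

```python
import csv
import io

# Invented sample data standing in for an extracted source file.
RAW = "region,sales\nNorth,100\nSouth,-5\nNorth,250\n"

def run_etl(raw_csv):
    rows = csv.DictReader(io.StringIO(raw_csv))            # extract
    clean = [
        {"region": r["region"], "sales": int(r["sales"])}  # transform: cast types
        for r in rows
        if int(r["sales"]) >= 0                            # transform: drop bad rows
    ]
    return clean                                           # load target would go here

print(run_etl(RAW))
```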

Posted 3 weeks ago

Apply

10.0 - 15.0 years

35 - 40 Lacs

Bengaluru

Work from Office

About AQR Capital Management: AQR is a global investment management firm built at the intersection of financial theory and practical application. We strive to deliver superior, long-term results for our clients by seeking to filter out market noise to identify and isolate what matters most, and by developing ideas that stand up to rigorous testing. Underpinning this philosophy is an unrelenting commitment to excellence in the technology powering our insights and analysis. This unique combination has made us leaders in alternative and traditional strategies, with more than $125 billion in assets under management.

Job description:

The Team: Our Bengaluru office is a key component of our global engineering strategy. Our software engineers work in research, portfolio implementation, trading, and enterprise engineering teams. The Quantitative Research Development (QRD) team partners closely with business teams to build the quant models, infrastructure, applications, and tools that power our quantitative research and quantitative investment process. The Portfolio Implementation team is part of the QRD team.

Your Role: As a Tech Lead in the Portfolio Implementation Engineering team, you will design and develop:
- A global asset risk estimation system incorporating large amounts of data
- A high-performance historical simulation engine
- Portfolio construction systems for our quantitative investment strategies
- Portfolio optimization systems to incorporate real-world constraints on research strategies
- Solutions to implement business processes that rebalance the portfolios based on quantitative models and interface with trading systems to generate orders

You will partner with not only the local but also the global team of engineers and researchers for successful product delivery. You will be expected to lead initiatives in both technology transformation and business-driven projects, alongside significant individual contribution and guiding and mentoring junior team members.

What You'll Bring
- Bachelor's/Master's/PhD in Computer Science, Engineering, or a related discipline
- 10+ years of software development experience
- Expertise in the Java programming language
- Outstanding coding, debugging, and analytical skills
- Experience in design and architecture, including object-oriented design, distributed systems, cloud-native applications, and microservices
- Ability to lead technology initiatives through the development lifecycle
- Ability to manage multiple workstreams with task allocation, execution, and monitoring
- Ability to manage teams and guide team members
- Experience working with cloud technologies and containers would be a plus
- Knowledge of other programming languages (Python, C++, Go, Scala) would be a plus
- Knowledge and experience of finance is desirable
- Excellent communication skills, both verbal and written
- Willingness to learn and work on new technologies and domain concepts

Who You Are
- Mature, thoughtful, and a natural fit for a collaborative, team-oriented culture
- Hard-working and eager to learn in a fast-paced, innovative environment
- Committed to intellectual integrity, transparency, and openness
- Motivated by the transformational effects of technology-at-scale
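Portfolio rebalancing of the kind described above turns target weights into orders against current holdings. A deliberately tiny Python sketch (prices, holdings, and weights are all invented; real systems add optimizers, real-world constraints, and lot handling):

```python
# Invented sample portfolio for illustration only.
prices = {"AAA": 10.0, "BBB": 20.0}
holdings = {"AAA": 50, "BBB": 10}   # shares currently held
targets = {"AAA": 0.5, "BBB": 0.5}  # target portfolio weights

def rebalance_orders(prices, holdings, targets):
    """Compute share deltas (+buy / -sell) that move holdings toward target weights."""
    nav = sum(prices[s] * q for s, q in holdings.items())  # portfolio value
    orders = {}
    for sym, weight in targets.items():
        target_shares = (nav * weight) / prices[sym]
        delta = round(target_shares - holdings.get(sym, 0))
        if delta:
            orders[sym] = delta
    return orders

print(rebalance_orders(prices, holdings, targets))
```

In production these deltas would feed the trading systems mentioned in the role description rather than be printed.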

Posted 3 weeks ago

Apply

6.0 - 11.0 years

8 - 14 Lacs

Bengaluru

Work from Office

GCP cloud architecture. Knowledge of the model deployment lifecycle, including creating training and serving pipelines. Familiarity with at least one workflow tool: Kubeflow, Airflow, MLflow, Argo, etc. Strong in Python, with adequate SQL skills.
Must-have skills: Python, SQL, ML engineering (model deployment/MLOps), ML pipelines (Kubeflow, Airflow, MLflow, Argo, etc.)
Preferred skills: PyTorch, TensorFlow, experience with a hyperscaler/cloud service, deep learning frameworks

Posted 3 weeks ago

Apply

7.0 - 12.0 years

12 - 16 Lacs

Bengaluru

Work from Office

We are looking for lead or principal software engineers to join our Data Cloud team. Our Data Cloud team is responsible for the Zeta Identity Graph platform, which captures billions of behavioural, demographic, environmental, and transactional signals for people-based marketing. As part of this team, the data engineer will be designing and growing our existing data infrastructure to democratize data access, enable complex data analyses, and automate optimization workflows for business and marketing operations.

Job Description:

Essential Responsibilities: As a Lead or Principal Data Engineer, your responsibilities will include:
- Building, refining, tuning, and maintaining our real-time and batch data infrastructure
- Daily use of technologies such as HDFS, Spark, Snowflake, Hive, HBase, Scylla, Django, FastAPI, etc.
- Maintaining data quality and accuracy across production data systems
- Working with Data Engineers to optimize data models and workflows
- Working with Data Analysts to develop ETL processes for analysis and reporting
- Working with Product Managers to design and build data products
- Working with our DevOps team to scale and optimize our data infrastructure
- Participating in architecture discussions, influencing the road map, and taking ownership of and responsibility for new projects
- Participating in a 24/7 on-call rotation (being available by phone or email in case something goes wrong)

Desired Characteristics:
- Minimum 7 years of software engineering experience
- Proven long-term experience with, and enthusiasm for, distributed data processing at scale; eagerness to learn new things
- Expertise in designing and architecting distributed, low-latency, scalable solutions in either cloud or on-premises environments
- Exposure to the whole software development lifecycle, from inception to production and monitoring
- Fluency in Python, or solid experience in Scala or Java
- Proficiency with relational databases and advanced SQL
- Expertise in using services like Spark, HDFS, Hive, and HBase
- Experience with schedulers such as Apache Airflow, Apache Luigi, Chronos, etc.
- Experience using cloud services (AWS) at scale
- Experience with agile software development processes
- Excellent interpersonal and communication skills

Nice to have:
- Experience with large-scale / multi-tenant distributed systems
- Experience with columnar / NoSQL databases: Vertica, Snowflake, HBase, Scylla, Couchbase
- Experience with real-time streaming frameworks: Flink, Storm
- Experience with web frameworks such as Flask, Django
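Schedulers such as Apache Airflow, Luigi, and Chronos, named above, model a pipeline as a dependency DAG and topologically sort it before dispatching tasks. A stdlib Python sketch of that ordering step (the task names and dependencies are invented):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: each task maps to the set of tasks it depends on.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}

# A scheduler resolves an execution order consistent with the dependencies.
order = list(TopologicalSorter(dag).static_order())
print(order)  # ['extract', 'transform', 'load', 'report']
```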

Posted 3 weeks ago

Apply

5.0 - 10.0 years

25 - 30 Lacs

Gurugram

Hybrid

Job Title: Lead Data Engineer
Location: Gurgaon
Department: Data Engineering / Technology
Experience Required: 5-10 years (with 1-2 years in a lead role preferred)

About the Role: We are looking for a highly skilled and motivated Lead Data Engineer to join our growing data team. The ideal candidate will have hands-on experience in designing, building, and optimizing scalable data pipelines and architectures. You will work closely with data scientists, analysts, and product teams to enable data-driven decisions across the organization.

Key Responsibilities:
- Design, develop, and maintain large-scale distributed data processing systems using Spark (on EMR) and Scala
- Build and manage real-time data pipelines with Apache Kafka
- Leverage SQL, Athena, and other AWS data tools for efficient data querying and transformation
- Orchestrate workflows using Apache Airflow
- Deploy and manage infrastructure using AWS components (e.g., EKS, EMR, S3, etc.)
- Collaborate with stakeholders to build interactive dashboards and reports using Superset or other data visualization tools
- Ensure high-quality data availability, integrity, and governance across systems
- Provide technical leadership, code reviews, and mentoring to junior engineers

Required Skills:
- Strong experience with Apache Spark and Scala
- Hands-on expertise in Apache Kafka for streaming data solutions
- Strong command of SQL; experience with Athena is a plus
- Experience working with AWS services (especially EMR, EKS, S3, etc.)
- Experience with Airflow for job scheduling and orchestration
- Familiarity with Superset or similar data visualization tools (e.g., Tableau, Power BI)
- Understanding of data warehouse technologies like Hive and Presto (deep expertise not required if strong in SQL)

Preferred Qualifications:
- Prior experience in a leadership or mentoring role
- Exposure to best practices in data engineering, data governance, and security
- Strong problem-solving skills and ability to work in a fast-paced environment

What We Offer:
- Opportunity to work on cutting-edge data technologies
- Collaborative and inclusive work culture
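Real-time pipelines like the Kafka work this role describes often reduce to windowed aggregations over a stream of events. A minimal stdlib Python sketch (the window size and values are invented; consuming from Kafka itself is out of scope here):

```python
from collections import deque

class SlidingSum:
    """Keep a fixed-size window over a stream and report the running window sum."""

    def __init__(self, window=3):
        self.window = deque(maxlen=window)  # old values fall off automatically

    def push(self, value):
        self.window.append(value)
        return sum(self.window)

# Each pushed value stands in for one consumed stream event.
agg = SlidingSum(window=3)
results = [agg.push(v) for v in [1, 2, 3, 4]]
print(results)  # [1, 3, 6, 9]
```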

Posted 3 weeks ago

Apply

8.0 - 13.0 years

10 - 14 Lacs

Bengaluru

Work from Office

About the Role: Grade Level (for internal use): 11
S&P Global Mobility
The Role: Lead Data Engineer (AWS Cloud, Python)

We are seeking a Senior Data Engineer with deep expertise in AWS cloud development to join our fast-paced data engineering organization. This role is critical to both the development of new data products and the modernization of existing platforms. The ideal candidate is a seasoned data engineer with hands-on experience designing, building, and optimizing large-scale data pipelines and architectures in both on-premises (e.g., Oracle) and cloud environments (especially AWS). This individual will also serve as a cloud development expert, mentoring and guiding other data engineers as they enhance their cloud skillsets.

Responsibilities

Data Engineering & Architecture
- Design, build, and maintain scalable data pipelines and data products.
- Develop and optimize ELT/ETL processes using a variety of data tools and technologies.
- Support and evolve data models that drive operational and analytical workloads.
- Modernize legacy Oracle-based systems and migrate workloads to cloud-native platforms.

Cloud Development & DevOps (AWS-Focused)
- Build, deploy, and manage cloud-native data solutions using AWS services (e.g., S3, Lambda, Glue, EMR, Redshift, Athena, Step Functions).
- Implement CI/CD pipelines and IaC (e.g., Terraform or CloudFormation), and monitor cloud infrastructure for performance and cost optimization.
- Ensure data platform security, scalability, and resilience in the AWS cloud.

Technical Leadership & Mentoring
- Act as a subject matter expert on cloud-based data development and DevOps best practices.
- Mentor data engineers on AWS architecture, infrastructure as code, and cloud-first design patterns.
- Participate in code and architecture reviews, enforcing best practices and high-quality standards.

Cross-functional Collaboration
- Work closely with product managers, data analysts, software engineers, and other stakeholders to understand business needs and deliver end-to-end solutions.
- Support and evolve the roadmap for data platform modernization and new product delivery.

What We're Looking For

Required Qualifications
- 8+ years of experience in data engineering or an equivalent technical role.
- 5+ years of hands-on experience with AWS cloud development and DevOps.
- Strong expertise in SQL, data modeling, and ETL/ELT pipelines.
- Deep experience with Oracle (PL/SQL, performance tuning, data extraction).
- Proficiency in Python and/or Scala for data processing tasks.
- Strong knowledge of cloud infrastructure (networking, security, cost optimization).
- Experience with infrastructure as code (Terraform).
- Familiarity with CI/CD pipelines and DevOps tooling (e.g., Jenkins, GitHub Actions).

Preferred (Nice to Have)
- Experience with Google Cloud Platform (GCP), Snowflake
- Knowledge of containerization and orchestration tools
- Experience with modern orchestration tools (e.g., Airflow, dbt)
- Exposure to data cataloging, governance, and quality tools

Statement: S&P Global delivers essential intelligence that powers decision making. We provide the world's leading organizations with the right data, connected technologies, and expertise they need to move ahead. As part of our team, you'll help solve complex challenges that equip businesses, governments, and individuals with the knowledge to adapt to a changing economic landscape. S&P Global Mobility turns invaluable insights captured from automotive data into help for our clients in understanding today's market, reaching more customers, and shaping the future of automotive mobility.

About S&P Global Mobility: At S&P Global Mobility, we provide invaluable insights derived from unmatched automotive data, enabling our customers to anticipate change and make decisions with conviction.
Our expertise helps them to optimize their businesses, reach the right consumers, and shape the future of mobility. We open the door to automotive innovation, revealing the buying patterns of today and helping customers plan for the emerging technologies of tomorrow. For more information, visit www.spglobal.com/mobility.
20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.2 - Middle Professional Tier II (EEO Job Group), SWP Priority Ratings - (Strategic Workforce Planning)

Posted 3 weeks ago

Apply

7.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About the Role: We are seeking a highly skilled and experienced Machine Learning Engineer to join our dynamic team. As a Machine Learning Engineer, you will be responsible for the design, development, deployment, and maintenance of machine learning models and systems that drive our [mention specific business area or product, e.g., recommendation engine, fraud detection system, autonomous vehicles]. You will work closely with data scientists, software engineers, and product managers to translate business needs into scalable and reliable machine learning solutions. This is a key role in shaping the future of CBRE and requires a strong technical foundation combined with a passion for innovation and problem-solving. Responsibilities: Model Development & Deployment: Design, develop, and deploy machine learning models using various algorithms (e.g., regression, classification, clustering, deep learning) to solve complex business problems. Select appropriate datasets and features for model training, ensuring data quality and integrity. Implement and optimize model training pipelines, including data preprocessing, feature engineering, model selection, and hyperparameter tuning. Deploy models to production environments using containerization technologies (e.g., Docker, Kubernetes) and cloud platforms (e.g., AWS, GCP, Azure). Monitor model performance in production, identify and troubleshoot issues, and implement model retraining and updates as needed. Infrastructure & Engineering: Develop and maintain APIs for model serving and integration with other systems. Write clean, well-documented, and testable code. Collaborate with software engineers to integrate models into existing products and services. Research & Innovation: Stay up to date with the latest advancements in machine learning and related technologies. Research and evaluate new algorithms, tools, and techniques to improve model performance and efficiency. 
Contribute to the development of new machine learning solutions and features. Proactively identify opportunities to leverage machine learning to solve business challenges. Collaboration & Communication: * Collaborate effectively with data scientists, software engineers, product managers, and other stakeholders. * Communicate technical concepts and findings clearly and concisely to both technical and non-technical audiences. * Participate in code reviews and contribute to the team's knowledge sharing. Qualifications: * Experience: 7+ years of experience in machine learning engineering or a related field. Technical Skills: Programming Languages: Proficient in Python; experience with other languages (e.g., Java, Scala, R) is a plus. Machine Learning Libraries: Strong experience with machine learning libraries and frameworks such as scikit-learn, TensorFlow, PyTorch, Keras, etc. Data Processing: Experience with data manipulation and processing using libraries like Pandas, NumPy, and Spark. Model Deployment: Experience with model deployment frameworks and platforms (e.g., TensorFlow Serving, TorchServe, Seldon, AWS SageMaker, Google AI Platform, Azure Machine Learning). Databases: Experience with relational and NoSQL databases (e.g., SQL, MongoDB, Cassandra). Version Control: Experience with Git and other version control systems. DevOps: Familiarity with DevOps practices and tools. Strong understanding of machine learning concepts and algorithms: Regression, Classification, Clustering, Deep Learning etc. Soft Skills: Excellent problem-solving and analytical skills. Strong communication and collaboration skills.
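The model-selection and hyperparameter-tuning workflow this role describes can be sketched in miniature. The following stdlib-only Python example is purely illustrative (the synthetic data, the `best_k` helper, and the tiny nearest-neighbour classifier are all invented, not any employer's system): it grid-searches a single hyperparameter against a held-out validation split, which is the core loop behind tools like scikit-learn's `GridSearchCV`.

```python
import random

def knn_predict(train, k, x):
    """Predict a label by majority vote among the k nearest training points (1-D)."""
    nearest = sorted(train, key=lambda p: abs(p[0] - x))[:k]
    votes = sum(label for _, label in nearest)
    return 1 if votes * 2 >= k else 0

def accuracy(train, val, k):
    """Fraction of validation points classified correctly."""
    hits = sum(knn_predict(train, k, x) == y for x, y in val)
    return hits / len(val)

def best_k(train, val, grid):
    """Grid search: pick the hyperparameter with the best validation accuracy."""
    return max(grid, key=lambda k: accuracy(train, val, k))

random.seed(0)
# Synthetic 1-D data: class 1 clusters near 1.0, class 0 near 0.0.
data = [(random.gauss(y, 0.3), y) for y in [0, 1] * 50]
train, val = data[:70], data[70:]
k = best_k(train, val, grid=[1, 3, 5, 7])
print("chosen k:", k, "val accuracy:", accuracy(train, val, k))
```

A production pipeline would add cross-validation, feature engineering, and model registry steps around this same select-on-validation pattern.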

Posted 3 weeks ago

Apply

6.0 - 11.0 years

11 - 16 Lacs

Noida

Work from Office

Data Engineering - Technical Lead Paytm is India's leading digital payments and financial services company, which is focused on driving consumers and merchants to its platform by offering them a variety of payment use cases. Paytm provides consumers with services like utility payments and money transfers, while empowering them to pay via Paytm Payment Instruments (PPI) like Paytm Wallet, Paytm UPI, Paytm Payments Bank Netbanking, Paytm FASTag and Paytm Postpaid - Buy Now, Pay Later. To merchants, Paytm offers acquiring devices like Soundbox, EDC, QR and Payment Gateway where payment aggregation is done through PPI and also other banks' financial instruments. To further enhance merchants' business, Paytm offers merchants commerce services through advertising and the Paytm Mini app store. Operating on this platform leverage, the company then offers credit services such as merchant loans, personal loans and BNPL, sourced by its financial partners. About the Role: This position requires someone to work on complex technical projects and closely work with peers in an innovative and fast-paced environment. For this role, we require someone with a strong product design sense, specialized in Hadoop and Spark technologies. Requirements: Minimum 6+ years of experience in Big Data technologies. The position: Grow our analytics capabilities with faster, more reliable tools, handling petabytes of data every day. Brainstorm and create new platforms that can help in our quest to make data available to cluster users in all shapes and forms, with low latency and horizontal scalability. Make changes to our systems, diagnosing any problems across the entire technical stack. Design and develop a real-time events pipeline for data ingestion for real-time dashboarding. Develop complex and efficient functions to transform raw data sources into powerful, reliable components of our data lake. Design & implement new components and various emerging technologies in the Hadoop ecosystem, and ensure successful execution of various projects. 
Be a brand ambassador for Paytm - Stay Hungry, Stay Humble, Stay Relevant! Skills that will help you succeed in this role: Strong hands-on experience with Hadoop, MapReduce, Hive, Spark, PySpark etc. Excellent programming/debugging skills in Python/Java/Scala. Experience with any scripting language such as Python, Bash etc. Good to have experience of working with NoSQL databases like HBase, Cassandra. Hands-on programming experience with multithreaded applications. Good to have experience in Database, SQL, messaging queues like Kafka. Good to have experience in developing streaming applications e.g. Spark Streaming, Flink, Storm, etc. Good to have experience with AWS and cloud technologies such as S3. Experience with caching architectures like Redis etc. Why join us: Because you get an opportunity to make a difference, and have a great time doing that. You are challenged and encouraged here to do stuff that is meaningful for you and for those we serve. You should work with us if you think seriously about what technology can do for people. We are successful, and our successes are rooted in our people's collective energy and unwavering focus on the customer, and that's how it will always be. Compensation: If you are the right fit, we believe in creating wealth for you. With enviable 500 mn+ registered users, 21 mn+ merchants and depth of data in our ecosystem, we are in a unique position to democratize credit for deserving consumers & merchants, and we are committed to it. India's largest digital lending story is brewing here. It's your opportunity to be a part of the story!
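The real-time events pipeline this role describes boils down to consuming a stream and maintaining incremental per-key aggregates for dashboarding. The stdlib-only Python sketch below is an analogy, not Paytm's pipeline: the event shape, merchant IDs, and `run_pipeline` name are invented, and a production version would use Spark Streaming or Flink consuming from Kafka.

```python
from collections import Counter

def event_stream():
    """Stand-in for a Kafka topic: yields (merchant_id, amount) payment events."""
    events = [("m1", 120), ("m2", 80), ("m1", 40), ("m3", 300), ("m2", 10)]
    yield from events

def run_pipeline(stream):
    """Incrementally aggregate the stream, as a streaming job would per micro-batch."""
    counts, totals = Counter(), Counter()
    for merchant, amount in stream:
        counts[merchant] += 1       # events seen per merchant
        totals[merchant] += amount  # gross payment value per merchant
    return counts, totals

counts, totals = run_pipeline(event_stream())
print("m1 total:", totals["m1"])  # 120 + 40 = 160
```

At petabyte scale the same fold runs partitioned by key across executors, with the aggregates checkpointed and served to dashboards at low latency.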

Posted 3 weeks ago

Apply

5.0 - 9.0 years

2 - 5 Lacs

Noida

Work from Office

Paytm is India's leading digital payments and financial services company, which is focused on driving consumers and merchants to its platform by offering them a variety of payment use cases. Paytm provides consumers with services like utility payments and money transfers, while empowering them to pay via Paytm Payment Instruments (PPI) like Paytm Wallet, Paytm UPI, Paytm Payments Bank Net banking, Paytm FASTag and Paytm Postpaid - Buy Now, Pay Later. To merchants, Paytm offers acquiring devices like Soundbox, EDC, QR and Payment Gateway where payment aggregation is done through PPI and also other banks' financial instruments. To further enhance merchants' business, Paytm offers merchants commerce services through advertising and the Paytm Mini app store. Operating on this platform leverage, the company then offers credit services such as merchant loans, personal loans and BNPL, sourced by its financial partners. About the team: Paytm Ads is a digital advertising vertical that offers innovative ad solutions to clients across industries. It offers advertisers the opportunity to engage with 300Mn+ users who interact with over 200 payment and retail services, online and offline, offered on the Paytm app. Paytm Ads maps user transactions to their lifestyle choices and creates customized segmentation cohorts for sharply targeted ad campaigns to the most relevant TG. Expectations/ Requirements 1. Proficient in SQL/Hive and deep expertise in building scalable business reporting solutions 2. Past experience in optimizing business strategy, product or process using data & analytics 3. Working knowledge of at least one programming language like Scala, Java or Python 4. Working knowledge of dashboard visualization. Ability to execute cross-functional initiatives. 5. 
Maintaining product & funnel dashboards, metrics on Pulse, Looker, Superset 6. Campaign analytics and debugs 7. Data reporting for business asks, MBR, Lucky Wheel revenue, growth experiments Superpowers/ Skills that will help you succeed in this role 1. 5 to 9 years of work experience in a business intelligence and analytics role in financial services, e-commerce, consulting or technology domain 2. Demonstrated ability to directly partner with business owners to understand product requirements 3. Effective spoken and written communication to senior audiences, including strong data presentation and visualization skills 4. Prior success in working with extremely large datasets using big data technologies 5. Detail-oriented, with an aptitude for solving unstructured problems Why join us - A collaborative, output-driven program that brings cohesiveness across businesses through technology - Solid 360 feedback from your peer teams on your support of their goals With enviable 500 mn+ registered users, 21 mn+ merchants and depth of data in our ecosystem, we are in a unique position to democratize credit for deserving consumers & merchants, and we are committed to it. India's largest digital lending story is brewing here. It's your opportunity to be a part of the story.
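Since this role leans on SQL/Hive for funnel and campaign reporting, here is a minimal sketch of a funnel query using Python's built-in sqlite3 module. The table, column names, and events are invented for illustration; the real reporting would run on Hive and surface in Looker or Superset, but the step-wise distinct-user count is the same idea.

```python
import sqlite3

# In-memory toy warehouse with an invented ad-events table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE ad_events (user_id TEXT, step TEXT);
INSERT INTO ad_events VALUES
  ('u1','impression'),('u1','click'),('u1','conversion'),
  ('u2','impression'),('u2','click'),
  ('u3','impression');
""")

# Funnel: distinct users reaching each step, widest step first.
rows = conn.execute("""
    SELECT step, COUNT(DISTINCT user_id) AS users
    FROM ad_events
    GROUP BY step
    ORDER BY users DESC
""").fetchall()
for step, users in rows:
    print(step, users)
```

From these counts, step-over-step conversion rates (click/impression, conversion/click) fall out with one more division; on Hive the identical SQL scales to the full event log.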

Posted 3 weeks ago

Apply

2.0 - 6.0 years

25 - 30 Lacs

Hyderabad

Work from Office

Job Title - Retail Specialized Data Scientist Level 9 SnC GN Data & AI Management Level: 09 - Consultant Location: Bangalore / Gurgaon / Mumbai / Chennai / Pune / Hyderabad / Kolkata Must have skills: A solid understanding of retail industry dynamics, including key performance indicators (KPIs) such as sales trends, customer segmentation, inventory turnover, and promotions. Strong ability to communicate complex data insights to non-technical stakeholders, including senior management, marketing, and operational teams. Meticulous in ensuring data quality, accuracy, and consistency when handling large, complex datasets. Gather and clean data from various retail sources, such as sales transactions, customer interactions, inventory management, website traffic, and marketing campaigns. Strong proficiency in Python for data manipulation, statistical analysis, and machine learning (libraries like Pandas, NumPy, Scikit-learn). Expertise in supervised and unsupervised learning algorithms. Use advanced analytics to optimize pricing strategies based on market demand, competitor pricing, and customer price sensitivity. Good to have skills: Familiarity with big data processing platforms like Apache Spark, Hadoop, or cloud-based platforms such as AWS or Google Cloud for large-scale data processing. Experience with ETL (Extract, Transform, Load) processes and tools like Apache Airflow to automate data workflows. Familiarity with designing scalable and efficient data pipelines and architecture. Experience with tools like Tableau, Power BI, Matplotlib, and Seaborn to create meaningful visualizations that present data insights clearly. Job Summary: The Retail Specialized Data Scientist will play a pivotal role in utilizing advanced analytics, machine learning, and statistical modeling techniques to help our retail business make data-driven decisions. 
This individual will work closely with teams across marketing, product management, supply chain, and customer insights to drive business strategies and innovations. The ideal candidate should have experience in retail analytics and the ability to translate data into actionable insights. Roles & Responsibilities: Leverage Retail Knowledge: Utilize your deep understanding of the retail industry (merchandising, customer behavior, product lifecycle) to design AI solutions that address critical retail business needs. Gather and clean data from various retail sources, such as sales transactions, customer interactions, inventory management, website traffic, and marketing campaigns. Apply machine learning algorithms, such as classification, clustering, regression, and deep learning, to enhance predictive models. Use AI-driven techniques for personalization, demand forecasting, and fraud detection. Use advanced statistical methods to help optimize existing use cases and build new products to serve new challenges and use cases. Stay updated on the latest trends in data science and retail technology. Collaborate with executives, product managers, and marketing teams to translate insights into business actions. Professional & Technical Skills: Strong analytical and statistical skills. Expertise in machine learning and AI. Experience with retail-specific datasets and KPIs. Proficiency in data visualization and reporting tools. Ability to work with large datasets and complex data structures. Strong communication skills to interact with both technical and non-technical stakeholders. A solid understanding of the retail business and consumer behavior. 
Programming Languages: Python, R, SQL, Scala Data Analysis Tools: Pandas, NumPy, Scikit-learn, TensorFlow, Keras Visualization Tools: Tableau, Power BI, Matplotlib, Seaborn Big Data Technologies: Hadoop, Spark, AWS, Google Cloud Databases: SQL, NoSQL (MongoDB, Cassandra) Additional Information: - Qualification Experience: Minimum 3 year(s) of experience is required Educational Qualification: Bachelor's or Master's degree in Data Science, Statistics, Computer Science, Mathematics, or a related field.
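Customer segmentation, listed above as a core retail KPI topic, is often a clustering problem. The sketch below is a stdlib-only, 1-D k-means with invented monthly-spend figures and fixed starting centroids (so the run is deterministic); a real engagement would use scikit-learn's `KMeans` on multi-dimensional customer features.

```python
def kmeans_1d(values, centroids, iters=20):
    """Plain 1-D k-means: assign each point to the nearest centroid, then re-average."""
    for _ in range(iters):
        clusters = [[] for _ in centroids]
        for v in values:
            idx = min(range(len(centroids)), key=lambda i: abs(v - centroids[i]))
            clusters[idx].append(v)
        # Recompute each centroid as the mean of its cluster (keep old if empty).
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids, clusters

# Invented monthly-spend figures: a low-spend and a high-spend segment.
spend = [10, 12, 11, 9, 200, 210, 190, 205]
centroids, clusters = kmeans_1d(spend, centroids=[0.0, 100.0])
print("segment centers:", sorted(centroids))
```

The two recovered centers (roughly 10.5 and 201.25 for this toy data) become the segment profiles that marketing cohorts are built around.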

Posted 3 weeks ago

Apply

5.0 - 10.0 years

8 - 12 Lacs

Kochi

Work from Office

Job Title: Data Scientist Management Level: Location: Kochi, Coimbatore, Trivandrum Must have skills: Big Data, Python or R Good to have skills: Scala, SQL Job Summary A Data Scientist is expected to be hands-on, delivering end-to-end projects undertaken in the Analytics space. They must have a proven ability to drive business results with their data-based insights. They must be comfortable working with a wide range of stakeholders and functional teams. The right candidate will have a passion for discovering solutions hidden in large data sets and working with stakeholders to improve business outcomes. Roles and Responsibilities Identify valuable data sources and collection processes Supervise preprocessing of structured and unstructured data Analyze large amounts of information to discover trends and patterns for the insurance industry. Build predictive models and machine-learning algorithms Combine models through ensemble modeling Present information using data visualization techniques Collaborate with engineering and product development teams Hands-on knowledge of implementing various AI algorithms and best-fit scenarios Has worked on Generative AI based implementations Professional and Technical Skills 3.5-5 years' experience in Analytics systems/program delivery; at least 2 Big Data or Advanced Analytics project implementations Experience using statistical computer languages (R, Python, SQL, PySpark, etc.) to manipulate data and draw insights from large data sets; familiarity with Scala, Java or C++ Knowledge of a variety of machine learning techniques (clustering, decision tree learning, artificial neural networks, etc.) and their real-world advantages/drawbacks Knowledge of advanced statistical techniques and concepts (regression, properties of distributions, statistical tests and proper usage, etc.) 
and experience with applications Hands-on experience in Azure/AWS analytics platforms (3+ years) Experience using variations of Databricks or similar analytical applications in AWS/Azure Experience using business intelligence tools (e.g. Tableau) and data frameworks (e.g. Hadoop) Strong mathematical skills (e.g. statistics, algebra) Excellent communication and presentation skills Deploying data pipelines in production based on Continuous Delivery practices. Additional Information Multi-industry domain experience Expert in Python, Scala, SQL Knowledge of Tableau/Power BI or similar self-service visualization tools Interpersonal and team skills should be top notch Prior leadership experience is nice to have Qualification Experience: 3.5-5 years of experience is required Educational Qualification: Graduation
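"Combine models through ensemble modeling," mentioned in the responsibilities above, can be illustrated with a stdlib-only majority-vote combiner. The three toy "models" here are hypothetical stand-ins for real trained estimators; in practice this is what scikit-learn's `VotingClassifier` does over fitted models.

```python
from collections import Counter

def majority_vote(predictions):
    """Combine per-model predictions into one label per sample by majority vote."""
    return [Counter(sample).most_common(1)[0][0] for sample in zip(*predictions)]

# Three hypothetical models' class predictions over four samples.
model_a = [1, 0, 1, 1]
model_b = [1, 1, 1, 0]
model_c = [0, 0, 1, 1]
print(majority_vote([model_a, model_b, model_c]))  # majority label per sample
```

With an odd number of voters there are no ties, and the ensemble is correct whenever any two of the three base models agree on the right label, which is why diverse, weakly correlated models ensemble well.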

Posted 3 weeks ago

Apply

5.0 - 8.0 years

4 - 7 Lacs

Mumbai

Work from Office

Excellent knowledge of Spark; the professional must have a thorough understanding of the Spark framework, performance tuning etc. Excellent knowledge and hands-on experience of at least 4+ years in Scala and PySpark. Excellent knowledge of the Hadoop ecosystem; knowledge of Hive is mandatory. Strong Unix and shell scripting skills. Excellent interpersonal skills and, for experienced candidates, excellent leadership skills. Good knowledge of any of the CSPs like Azure, AWS or GCP is mandatory; certifications on Azure will be an additional plus. Mandatory Skills: PySpark. Experience: 5-8 Years.
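One staple of the Spark performance tuning this role asks about is broadcasting a small dimension table so a join avoids a shuffle of the large fact table. The stdlib Python sketch below is only an analogy with invented tables: the small table becomes a local dict on every "worker", mirroring what `broadcast()` from `pyspark.sql.functions` does in real Spark.

```python
# Small "dimension" table: in Spark this is shipped to every executor via broadcast().
merchant_city = {"m1": "Mumbai", "m2": "Pune"}

# Large "fact" table: in Spark this would be a partitioned DataFrame of payments.
payments = [("m1", 120), ("m2", 80), ("m1", 40)]

# Map-side join: each fact record is enriched by a local dict lookup,
# so no expensive shuffle of the large table is needed.
joined = [(m, amt, merchant_city.get(m, "unknown")) for m, amt in payments]
print(joined)
```

The design trade-off is memory for network: the dimension table must fit on each executor, in exchange for eliminating the shuffle stage that dominates wide-join runtimes.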

Posted 3 weeks ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Mumbai

Work from Office

5+ years of professional experience with SQL, ETL, data modeling, and at least one programming language (e.g., Python, C++, C#, Scala, etc.) 3+ years of professional experience with Enterprise Domains like HR, Finance, Supply Chain 4+ years of professional experience with more than one SQL and relational databases including expertise in Presto, Spark, and MySQL Professional experience designing and implementing real-time pipelines (Apache Kafka, or similar technologies) 5+ years of professional experience in custom ETL design, implementation, and maintenance 3+ years of professional experience with Data Modeling including expertise in Data Warehouse design and Dimensional Modeling 5+ years of professional experience working with cloud or on-premises Big Data/MPP analytics platforms (Teradata, AWS Redshift, Google BigQuery, Azure Synapse Analytics, or similar) Experience with data quality and validation (using Apache Airflow) Experience with anomaly/outlier detection Experience with Data Science workflows (Jupyter Notebooks, Bento, or similar tools) Experience with Airflow or similar workflow management systems Experience querying massive datasets using Spark, Presto, Hive, or similar Experience building systems integrations, tooling interfaces, implementing integrations for ERP systems (Oracle, SAP, Salesforce, etc.) Experience in data visualization using Power BI and Tableau. Proficiency in the Python programming language and Python libraries, with a focus on data engineering and data science applications. Professional fluency in English required Mandatory Skills: Data Analysis. Experience: 3-5 Years.
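The anomaly/outlier detection called out above often starts with a simple z-score check on pipeline metrics. The following stdlib-only sketch uses invented daily row counts (the numbers and the `zscore_outliers` helper are illustrative, not a specific library's API); in practice such a check might run as an Airflow data-quality task.

```python
import statistics

def zscore_outliers(values, threshold=3.0):
    """Flag points whose z-score magnitude exceeds the threshold."""
    mean = statistics.fmean(values)
    sd = statistics.pstdev(values)
    if sd == 0:
        return []  # no spread means nothing can be an outlier
    return [v for v in values if abs(v - mean) / sd > threshold]

# Invented daily row counts for a load job; ~10k is normal, 50 is a broken load.
daily_rows = [10000, 10200, 9900, 10100, 9800, 10400, 50]
print(zscore_outliers(daily_rows, threshold=2.0))
```

A known weakness worth noting: the outlier itself inflates the mean and standard deviation, so robust variants (median/MAD) are preferred once outliers are frequent.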

Posted 3 weeks ago

Apply

5.0 - 8.0 years

7 - 10 Lacs

Hyderabad

Work from Office

Long Description Bachelor's degree preferred, or equivalent combination of education, training, and experience. 5+ years of professional experience with SQL, ETL, data modeling, and at least one programming language (e.g., Python, C++, C#, Scala, etc.) 3+ years of professional experience with Enterprise Domains like HR, Finance, Supply Chain 6+ years of professional experience with more than one SQL and relational databases including expertise in Presto, Spark, and MySQL Professional experience designing and implementing real-time pipelines (Apache Kafka, or similar technologies) 5+ years of professional experience in custom ETL design, implementation, and maintenance 3+ years of professional experience with Data Modeling including expertise in Data Warehouse design and Dimensional Modeling 5+ years of professional experience working with cloud or on-premises Big Data/MPP analytics platforms (Teradata, AWS Redshift, Google BigQuery, Azure Synapse Analytics, or similar) Experience with data quality and validation (using Apache Airflow) Experience with anomaly/outlier detection Experience with Data Science workflows (Jupyter Notebooks, Bento, or similar tools) Experience with Airflow or similar workflow management systems Experience querying massive datasets using Spark, Presto, Hive, or similar Experience building systems integrations, tooling interfaces, implementing integrations for ERP systems (Oracle, SAP, Salesforce, etc.) Experience in data visualization using Power BI and Tableau. Proficiency in the Python programming language and Python libraries, with a focus on data engineering and data science applications. Professional fluency in English required Mandatory Skills: Data Analysis. Experience: 5-8 Years.

Posted 3 weeks ago

Apply

5.0 - 9.0 years

9 - 13 Lacs

Gurugram

Work from Office

At Capgemini Invent, we believe difference drives change. As inventive transformation consultants, we blend our strategic, creative and scientific capabilities, collaborating closely with clients to deliver cutting-edge solutions. Join us to drive transformation tailored to our client's challenges of today and tomorrow. Informed and validated by science and data. Superpowered by creativity and design. All underpinned by technology created with purpose. Your role As a Senior Data Scientist, you are expected to develop and implement Artificial Intelligence based solutions across various disciplines for the Intelligent Industry vertical of Capgemini Invent. You are expected to work as an individual contributor or along with a team to help design and develop ML/NLP models as per the requirement. You will work closely with the Product Owner, Systems Architect and other key stakeholders right from conceptualization till the implementation of the project. You should take ownership while understanding the client requirement, the data to be used, security & privacy needs and the infrastructure to be used for the development and implementation. The candidate will be responsible for executing data science projects independently to deliver business outcomes and is expected to demonstrate domain expertise, develop and execute program plans, and proactively solicit feedback from stakeholders to identify improvement actions. This role requires a strong technical background, excellent problem-solving skills, and the ability to work collaboratively with stakeholders from different functional and business teams. The role also requires the candidate to collaborate on ML asset creation and be eager to learn and impart training to fellow data science professionals. We expect thought leadership from the candidate, especially on proposing to build a ML/NLP asset based on expected industry requirements. Experience in building industry-specific (e.g. 
Manufacturing, R&D, Supply Chain, Life Sciences etc), production ready AI Models using microservices and web-services is a plus. Programming Languages: Python (NumPy, SciPy, Pandas, Matplotlib, Seaborn) Databases: RDBMS (MySQL, Oracle etc.), NoSQL Stores (HBase, Cassandra etc.) ML/DL Frameworks: scikit-learn, TensorFlow (Keras), PyTorch, Big data ML Frameworks - Spark (Spark-ML, GraphX), H2O Cloud: Azure/AWS/GCP Your Profile Predictive and Prescriptive modelling using Statistical and Machine Learning algorithms including but not limited to Time Series, Regression, Trees, Ensembles, Neural-Nets (Deep & Shallow, CNN, LSTM, Transformers etc.). Experience with open-source OCR engines like Tesseract, Speech recognition, Computer Vision, face recognition, emotion detection etc. is a plus. Unsupervised learning: Market Basket Analysis, Collaborative Filtering, Dimensionality Reduction, good understanding of common matrix decomposition approaches like SVD. Various Clustering approaches: Hierarchical, Centroid-based, Density-based, Distribution-based, Graph-based clustering like Spectral. NLP: Information Extraction, Similarity Matching, Sentiment Analysis, Text Clustering, Semantic Analysis, Document Summarization, Context Mapping/Understanding, Intent Classification, Word Embeddings, Vector Space Models; experience with libraries like NLTK, spaCy, Stanford CoreNLP is a plus. Usage of Transformers for NLP and experience with LLMs (like ChatGPT, Llama) and usage of RAG (with frameworks like LangChain & LangGraph), building Agentic AI applications. Model Deployment: ML pipeline formation, data security and scrutiny checks, and MLOps for productionizing a built model on-premises and on cloud. Required Qualifications Master's degree in a quantitative field such as Mathematics, Statistics, Machine Learning, Computer Science or Engineering, or a bachelor's degree with relevant experience. 
Good experience in programming with languages such as Python/Java/Scala, SQL and experience with data visualization tools like Tableau or Power BI. Preferred Experience Experienced in the Agile way of working, managing team effort and tracking through JIRA Experience in proposal, RFP, RFQ and pitch creation and delivery to large forums Experience in POC, MVP, PoV and asset creation with innovative use cases Experience working in a consulting environment is highly desirable. Prerequisite: High-impact client communication The job may also entail sitting as well as working at a computer for extended periods of time. Candidates should be able to effectively communicate by telephone, email, and face to face. What you will love about working here We recognize the significance of flexible work arrangements to provide support. Be it remote work, or flexible work hours, you will get an environment to maintain a healthy work-life balance. At the heart of our mission is your career growth. Our array of career growth programs and diverse professions are crafted to support you in exploring a world of opportunities. Equip yourself with valuable certifications in the latest technologies such as Generative AI.
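Market Basket Analysis, listed under unsupervised learning in this profile, reduces to counting itemset co-occurrence. The stdlib-only sketch below computes support and confidence for a single association rule over invented baskets (the data and `rule_stats` helper are illustrative; libraries such as mlxtend's apriori do this at scale).

```python
# Invented transaction baskets, each a set of purchased items.
baskets = [
    {"bread", "butter"}, {"bread", "butter", "jam"},
    {"bread", "milk"}, {"butter", "milk"},
]

def rule_stats(baskets, antecedent, consequent):
    """Support and confidence for the rule antecedent -> consequent."""
    n = len(baskets)
    both = sum(1 for b in baskets if antecedent <= b and consequent <= b)
    ante = sum(1 for b in baskets if antecedent <= b)
    # support = P(antecedent and consequent); confidence = P(consequent | antecedent)
    return both / n, (both / ante if ante else 0.0)

support, confidence = rule_stats(baskets, {"bread"}, {"butter"})
print("support:", support, "confidence:", confidence)
```

Here "bread -> butter" holds in 2 of 4 baskets (support 0.5) and in 2 of the 3 baskets containing bread (confidence about 0.67); rules are typically ranked by lift as well, to discount merely popular items.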

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Entity: Technology Job Family Group: IT&S Group Job Description: You will work with You will be part of a high-energy, top-performing team of engineers and product managers, working alongside technology and business leaders to support the execution of transformative data initiatives that make a real impact. Let me tell you about the role As a Senior Data Platform Services Engineer, you will play a strategic role in shaping and securing enterprise-wide technology landscapes, ensuring their resilience, performance, and compliance. You will provide deep expertise in security, infrastructure, and operational excellence, driving large-scale transformation and automation initiatives. Your role will encompass platform architecture, system integration, cybersecurity, and operational continuity. You will be collaborating with engineers, architects, and business partners, working to establish robust governance models, technology roadmaps, and innovative security frameworks to safeguard critically important enterprise applications. What You Will Deliver Contribute to enterprise technology architecture, security frameworks, and platform engineering for our core data platform. Support end-to-end security implementation across our unified data platform, ensuring compliance with industry standards and regulatory requirements. Help drive operational excellence by supporting system performance, availability, and scalability. Contribute to modernization and transformation efforts, assisting in integration with enterprise IT systems. Assist in the design and execution of automated security monitoring, vulnerability assessments, and identity management solutions. Apply DevOps, CI/CD, and Infrastructure-as-Code (IaC) approaches to improve deployment and platform consistency. Support disaster recovery planning and high availability for enterprise platforms. Collaborate with engineering and operations teams to ensure platform solutions align with business needs. 
Provide guidance on platform investments, security risks, and operational improvements. Partner with senior engineers to support long-term technical roadmaps that reduce operational burden and improve scalability! What you will need to be successful (experience and qualifications) Technical Skills We Need From You Bachelor’s degree in technology, engineering, or a related technical discipline. 3–5 years of experience in enterprise technology, security, or platform operations in large-scale environments. Experience with CI/CD pipelines, DevOps methodologies, and Infrastructure-as-Code (e.g., AWS CDK, Azure Bicep). Knowledge of ITIL, Agile delivery, and enterprise governance frameworks. Proficiency with big data technologies such as Apache Spark, Hadoop, Kafka, and Flink. Experience with cloud platforms (AWS, GCP, Azure) and cloud-native data solutions (BigQuery, Redshift, Snowflake, Databricks). Strong skills in SQL, Python, or Scala, and hands-on experience with data platform engineering. Understanding of data modeling, data warehousing, and distributed systems architecture. Essential Skills Technical experience in Microsoft Azure, AWS, Databricks, and Palantir. Understanding of data ingestion pipelines, governance, security, and data visualization. Experience supporting multi-cloud data platforms at scale—balancing cost, performance, and resilience. Familiarity with performance tuning, data indexing, and distributed query optimization. Exposure to both real-time and batch data streaming architectures. Skills That Set You Apart Proven success navigating global, highly regulated environments, ensuring compliance, security, and enterprise-wide risk management. AI/ML-driven data engineering expertise, applying intelligent automation to optimize workflows. About Bp Our purpose is to deliver energy to the world, today and tomorrow. For over 100 years, bp has focused on discovering, developing, and producing oil and gas in the nations where we operate. 
We are one of the few companies globally that can provide governments and customers with an integrated energy offering. Delivering our strategy sustainably is fundamental to achieving our ambition to be a net zero company by 2050 or sooner! We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform crucial job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation. We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status. Travel Requirement Up to 10% travel should be expected with this role Relocation Assistance: This role is eligible for relocation within country Remote Type: This position is a hybrid of office/remote working Skills: Agility core practices, Analytics, API and platform design, Business Analysis, Cloud Platforms, Coaching, Communication, Configuration management and release, Continuous deployment and release, Data Structures and Algorithms (Inactive), Digital Project Management, Documentation and knowledge sharing, Facilitation, Information Security, iOS and Android development, Mentoring, Metrics definition and instrumentation, NoSql data modelling, Relational Data Modelling, Risk Management, Scripting, Service operations and resiliency, Software Design and Development, Source control and code management {+ 4 more} Legal Disclaimer: We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, socioeconomic status, neurodiversity/neurocognitive functioning, veteran status or disability status. 
Individuals with an accessibility need may request an adjustment/accommodation related to bp’s recruiting process (e.g., accessing the job application, completing required assessments, participating in telephone screenings or interviews, etc.). If you would like to request an adjustment/accommodation related to the recruitment process, please contact us. If you are selected for a position and depending upon your role, your employment may be contingent upon adherence to local policy. This may include pre-placement drug screening, medical review of physical fitness for the role, and background checks.

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Entity: Technology
Job Family Group: IT&S Group

Job Description:

You will work with
You will be part of a high-energy, top-performing team of engineers and product managers, working alongside technology and business leaders to support the execution of transformative data initiatives that make a real impact.

Let me tell you about the role
As a Senior Data Tooling Services Engineer, you will play a strategic role in shaping and securing enterprise-wide technology landscapes, ensuring their resilience, performance, and compliance. You will provide deep expertise in security, infrastructure, and operational excellence, driving large-scale transformation and automation initiatives. Your role will encompass platform architecture, system integration, cybersecurity, and operational continuity. You will collaborate with engineers, architects, and business partners to establish robust governance models, technology roadmaps, and innovative security frameworks that safeguard critical enterprise applications.

What you will deliver
Contribute to enterprise technology architecture, security frameworks, and platform engineering for our core data platform.
Support end-to-end security implementation across our unified data platform, ensuring compliance with industry standards and regulatory requirements.
Help drive operational excellence by supporting system performance, availability, and scalability.
Contribute to modernization and transformation efforts, assisting in integration with enterprise IT systems.
Assist in the design and execution of automated security monitoring, vulnerability assessments, and identity management solutions.
Apply DevOps, CI/CD, and Infrastructure-as-Code (IaC) approaches to improve deployment and platform consistency.
Support disaster recovery planning and high availability for enterprise platforms.
Collaborate with engineering and operations teams to ensure platform solutions align with business needs.
Provide guidance on platform investments, security risks, and operational improvements.
Partner with senior engineers to support long-term technical roadmaps that reduce operational burden and improve scalability.

What you will need to be successful (experience and qualifications)
Technical skills we need from you:
Bachelor's degree in technology, engineering, or a related technical discipline.
3–5 years of experience in enterprise technology, security, or platform operations in large-scale environments.
Experience with CI/CD pipelines, DevOps methodologies, and Infrastructure-as-Code (e.g., AWS CDK, Azure Bicep).
Knowledge of ITIL, Agile delivery, and enterprise governance frameworks.
Proficiency with big data technologies such as Apache Spark, Hadoop, Kafka, and Flink.
Experience with cloud platforms (AWS, GCP, Azure) and cloud-native data solutions (BigQuery, Redshift, Snowflake, Databricks).
Strong skills in SQL, Python, or Scala, and hands-on experience with data platform engineering.
Understanding of data modeling, data warehousing, and distributed systems architecture.

Essential skills:
Technical experience in Microsoft Azure, AWS, Databricks, and Palantir.
Understanding of data ingestion pipelines, governance, security, and data visualization.
Experience supporting multi-cloud data platforms at scale, balancing cost, performance, and resilience.
Familiarity with performance tuning, data indexing, and distributed query optimization.
Exposure to both real-time and batch data streaming architectures.

Skills that set you apart:
Proven success navigating global, highly regulated environments, ensuring compliance, security, and enterprise-wide risk management.
AI/ML-driven data engineering expertise, applying intelligent automation to optimize workflows.

About bp
Our purpose is to deliver energy to the world, today and tomorrow. For over 100 years, bp has focused on discovering, developing, and producing oil and gas in the nations where we operate.
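As a brief illustration of the SQL and Python data-platform skills this role asks for, here is a minimal sketch using Python's standard-library sqlite3 module as a stand-in for a warehouse engine; the events table, its columns, and the sample rows are hypothetical, not part of any bp system.

```python
import sqlite3

# An in-memory database stands in for a real warehouse table (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [("EU", 10.0), ("EU", 5.0), ("US", 7.5)],
)

# A typical aggregate query: total amount per region, largest first.
rows = conn.execute(
    "SELECT region, SUM(amount) AS total FROM events "
    "GROUP BY region ORDER BY total DESC"
).fetchall()
print(rows)  # [('EU', 15.0), ('US', 7.5)]
```

The same GROUP BY pattern carries over directly to engines such as BigQuery, Redshift, or Spark SQL, which the role lists as target platforms.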
We are one of the few companies globally that can provide governments and customers with an integrated energy offering. Delivering our strategy sustainably is fundamental to achieving our ambition to be a net zero company by 2050 or sooner.

We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation. We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status.

Travel Requirement: Up to 10% travel should be expected with this role.
Relocation Assistance: This role is eligible for relocation within country.
Remote Type: This position is a hybrid of office/remote working.

Skills: Agility core practices, Analytics, API and platform design, Business Analysis, Cloud Platforms, Coaching, Communication, Configuration management and release, Continuous deployment and release, Data Structures and Algorithms (Inactive), Digital Project Management, Documentation and knowledge sharing, Facilitation, Information Security, iOS and Android development, Mentoring, Metrics definition and instrumentation, NoSql data modelling, Relational Data Modelling, Risk Management, Scripting, Service operations and resiliency, Software Design and Development, Source control and code management {+ 4 more}

Legal Disclaimer: We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, socioeconomic status, neurodiversity/neurocognitive functioning, veteran status or disability status.
Individuals with an accessibility need may request an adjustment/accommodation related to bp’s recruiting process (e.g., accessing the job application, completing required assessments, participating in telephone screenings or interviews, etc.). If you would like to request an adjustment/accommodation related to the recruitment process, please contact us. If you are selected for a position and depending upon your role, your employment may be contingent upon adherence to local policy. This may include pre-placement drug screening, medical review of physical fitness for the role, and background checks.

Posted 3 weeks ago

Apply

6.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Location: Pune

About the team & role
As a Senior Software Engineer (SSE) in the Continuous Product Delivery (CPD) team, you will play a key role in providing long-term stability and last-mile delight to our customers. You will lead a small team of engineers and work closely with the core engineering team and the product and support organizations. You will work across Rubrik releases on our data backup & management offering. You are expected to develop a strong understanding of our product and engineering architecture, such as our distributed job framework, data lifecycle management, filesystem, and metadata store.

Within CPD, you will work closely with the Platform and Systems Engineering team at Rubrik. The mission of this team is to develop a highly reliable, secure, scalable, and performant software-defined platform that radically simplifies building, deploying, and managing physical and virtual appliances on-premise and in the cloud.

Rubrik CPD SSEs are self-starters, driven, and able to manage themselves. We believe in giving engineers responsibility, not tasks. Our goal is to motivate and challenge you to do your best work by empowering you to make your own decisions. To do that, we have a very transparent structure that gives people the freedom to exercise judgment, even in critical scenarios. This develops more capable engineers and keeps everyone engaged and happy, ultimately leading to customer delight.

Key Responsibilities
Ownership of features, including design, implementation, and testing.
Design and develop infrastructure services and processes for regularly performing Linux kernel and Ubuntu OS upgrades.
Diagnose and resolve problems in complex customer environments.
Develop and maintain code written in Python and/or Scala, where required.
Troubleshoot complex software problems in a timely and accurate manner.
Collaborate with cross-functional teams to define, design, and ship new features.
Write and maintain technical documentation for software systems and applications.
Participate in code reviews and ensure adherence to coding standards.
Continuously improve software quality through process improvement initiatives.
Keep up to date with emerging trends in software development.

About You
BTech/MTech/PhD in Computer Science.
6-10 years of software development experience on Linux, preferably in the Platform/Systems/Kernel or Networking domain.
Strong fundamentals in data structures, algorithms, and distributed systems design.
Solid grasp of major Linux distributions, such as Ubuntu.
Strong background in systems programming.
Expertise in debugging and troubleshooting performance and system-level issues.
Good experience with performing Linux kernel upgrades (or equivalent) and kernel debugging.
Excellent troubleshooting, problem-solving, and analytical skills.
Strong communication skills and ability to work in a team environment.
Proficient in a scripting language and either C++, Java, or Scala.
Large distributed systems design and development experience is preferred.
Knowledge of Storage, Filesystems, or Data Protection technologies is a plus.

Join Us in Securing the World's Data
Rubrik (NYSE: RBRK) is on a mission to secure the world's data. With Zero Trust Data Security™, we help organizations achieve business resilience against cyberattacks, malicious insiders, and operational disruptions. Rubrik Security Cloud, powered by machine learning, secures data across enterprise, cloud, and SaaS applications. We help organizations uphold data integrity, deliver data availability that withstands adverse conditions, continuously monitor data risks and threats, and restore businesses with their data when infrastructure is attacked.

LinkedIn | X (formerly Twitter) | Instagram | Rubrik.com

Inclusion @ Rubrik
At Rubrik, we are dedicated to fostering a culture where people from all backgrounds are valued, feel they belong, and believe they can succeed.
Our commitment to inclusion is at the heart of our mission to secure the world’s data. Our goal is to hire and promote the best talent, regardless of background. We continually review our hiring practices to ensure fairness and strive to create an environment where every employee has equal access to opportunities for growth and excellence. We believe in empowering everyone to bring their authentic selves to work and achieve their fullest potential. Our inclusion strategy focuses on three core areas of our business and culture: Our Company: We are committed to building a merit-based organization that offers equal access to growth and success for all employees globally. Your potential is limitless here. Our Culture: We strive to create an inclusive atmosphere where individuals from all backgrounds feel a strong sense of belonging, can thrive, and do their best work. Your contributions help us innovate and break boundaries. Our Communities: We are dedicated to expanding our engagement with the communities we operate in, creating opportunities for underrepresented talent and driving greater innovation for our clients. Your impact extends beyond Rubrik, contributing to safer and stronger communities. Equal Opportunity Employer/Veterans/Disabled Rubrik is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, or protected veteran status and will not be discriminated against on the basis of disability. Rubrik provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, sex, national origin, age, disability or genetics. In addition to federal law requirements, Rubrik complies with applicable state and local laws governing nondiscrimination in employment in every location in which the company has facilities. 
This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training. Federal law requires employers to provide reasonable accommodation to qualified individuals with disabilities. Please contact us at hr@rubrik.com if you require a reasonable accommodation to apply for a job or to perform your job. Examples of reasonable accommodation include making a change to the application process or work procedures, providing documents in an alternate format, using a sign language interpreter, or using specialized equipment. EEO IS THE LAW NOTIFICATION OF EMPLOYEE RIGHTS UNDER FEDERAL LABOR LAWS
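Returning to the technical core of this role: automated Linux kernel upgrade pipelines typically begin with a version guard, proceeding only when the running kernel is strictly older than the target. A minimal Python sketch follows; the release strings are illustrative, and a real upgrade process would involve far more validation than this.

```python
def kernel_tuple(release: str) -> tuple:
    """Parse a release string like '5.15.0-91-generic' into (5, 15, 0, 91)."""
    parts = release.split("-")
    nums = parts[0].split(".")
    # Include the ABI/patch number when present, so -91 vs -97 compares correctly.
    if len(parts) > 1 and parts[1].isdigit():
        nums.append(parts[1])
    return tuple(int(n) for n in nums)

def needs_upgrade(current: str, target: str) -> bool:
    """True when the running kernel is strictly older than the target."""
    return kernel_tuple(current) < kernel_tuple(target)

print(needs_upgrade("5.15.0-91-generic", "5.15.0-97-generic"))  # True
print(needs_upgrade("6.5.0-14-generic", "5.15.0-97-generic"))   # False
```

Tuple comparison gives correct ordering without string tricks (so "5.15" sorts below "5.9" numerically, not lexically).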

Posted 3 weeks ago

Apply

6.0 years

0 Lacs

Pune, Maharashtra, India

On-site

PFB the detailed JD:
🔹 Experience: 6 to 8+ years (hands-on)
🔹 Location: Pune (WFO)
🔹 Notice Period: 0-30 days

Must Have:
Proficiency in at least one of the following programming languages: Java, Scala, or Python
Good understanding of SQL
Experience developing and deploying at least one end-to-end data storage/processing pipeline
Strong experience in Spark development, both batch and streaming
Intermediate-level expertise in HDFS and Hive
Experience with PySpark and data engineering
ETL implementation and migration to Spark
Experience working with a Hadoop cluster
Python, PySpark, and Databricks development, with knowledge of cloud
Experience with Kafka and Spark streaming (DStream and Structured Streaming)
Experience with Jupyter notebooks or any other developer tool
Experience with Airflow or other workflow engines
Good communication and logical skills

Good to Have Skills:
Prior experience writing Spark jobs in Java is highly appreciated
Prior experience working with Cloudera Data Platform (CDP)
Hands-on experience with NoSQL databases such as HBase, Cassandra, Elasticsearch, etc.
Experience with Maven and Git
Agile Scrum methodologies
Flink and Kudu streaming
Automation of workflows
CI/CD
NiFi streaming and transformation
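Several of the must-haves above center on Spark Structured Streaming over Kafka. The core idea, grouping events into fixed (tumbling) time windows and aggregating per key, can be sketched in plain Python without a Spark cluster; the event timestamps and keys below are made up for illustration.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """Count events per (window_start, key), analogous to a Structured
    Streaming groupBy(window(...), key).count() over a bounded batch."""
    counts = defaultdict(int)
    for ts, key in events:
        # Each event lands in exactly one non-overlapping window.
        window_start = (ts // window_seconds) * window_seconds
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(0, "click"), (3, "click"), (7, "view"), (12, "click")]
print(tumbling_window_counts(events, 10))
# {(0, 'click'): 2, (0, 'view'): 1, (10, 'click'): 1}
```

Spark adds distribution, fault tolerance, and watermark-based handling of late data on top of this basic windowing logic.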

Posted 3 weeks ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Entity: Technology
Job Family Group: IT&S Group

Job Description:

You will work with
This team is responsible for the response to and management of cyber incidents, applying an intelligence-led approach for identification, mitigation, and rapid response to safeguard bp on a global scale. By applying lessons learned and data analytics, they establish engineering principles and enhance the technology stack to continuously bolster bp's cybersecurity posture.

Let me tell you about the role
We are looking for a Security Engineering Specialist who will support a team dedicated to enabling security experts and software engineers to write, deploy, integrate, and maintain security standards and to develop secure applications and automations. You will advocate for and help ensure that cloud, infrastructure, and data teams adhere to secure policies, uncover vulnerabilities and provide remediation insights, and contribute to the adoption of secure practices. You will stay informed on industry and technology trends to strengthen bp's security posture and contribute to a culture of excellence.

What you will deliver
Support the development and implementation of platform security standards, co-design schemas, ensure quality at the source of infrastructure build and configuration, and find opportunities to automate manual security processes wherever possible.
Work with business partners to implement security strategies and to coordinate remediation activities, ensuring products safely meet business requirements.
Contribute as a subject matter expert in at least one domain (cloud, infrastructure, or data).
Provide hands-on support to teams on secure configuration and remediation strategies.
Align strategy, processes, and decision-making across teams.
Actively participate in a positive engagement and governance framework and contribute to an inclusive work environment with teams and collaborators including engineers, developers, product owners, product managers, and portfolio managers.
Evolve the security roadmap to meet anticipated future requirements and needs.
Provide support to the squads and teams through technical guidance and by managing dependencies and risks.
Create and articulate materials on how to embed and measure security in our cloud, infrastructure, or data environments.
Contribute to mentoring and promote a culture of continuous development.

What you will need to be successful (experience and qualifications)
3+ years of experience in security engineering or technical infrastructure roles.
A minimum of 3 years of cybersecurity experience in one of the following areas: cloud (AWS and Azure), infrastructure (IAM, network, endpoint, etc.), or data (DLP, data lifecycle management, etc.).
Deep, hands-on experience designing security architectures and solutions for reliable and scalable data infrastructure, cloud, and data products in complex environments.
Development experience in one or more object-oriented programming languages (e.g., Python, Scala, Java, C#) and/or development experience in one or more cloud environments (including AWS, Azure, Alibaba, etc.).
Exposure to or experience with full-stack development.
Experience with automation and scripting for security tasks (e.g., IaC, CI/CD integration) and security tooling (e.g., vulnerability scanners, CNAPP, endpoint and/or DLP).
Deep knowledge and hands-on experience with technologies across all data lifecycle stages.
Foundational knowledge of security standards, industry laws, and regulations such as the Payment Card Industry Data Security Standard (PCI-DSS), the General Data Protection Regulation (GDPR), the California Consumer Privacy Act (CCPA), and Sarbanes-Oxley (SOX).
Strong collaborator management and the ability to influence teams through technical guidance.
A continuous learning and improvement approach.

About bp
Our purpose is to deliver energy to the world, today and tomorrow.
For over 100 years, bp has focused on discovering, developing, and producing oil and gas in the nations where we operate. We are one of the few companies globally that can provide governments and customers with an integrated energy offering. Delivering our strategy sustainably is fundamental to achieving our ambition to be a net zero company by 2050 or sooner.

We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation. We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status. Even though the job is advertised as full time, please contact the hiring manager or the recruiter, as flexible working arrangements may be considered.

Travel Requirement: Up to 10% travel should be expected with this role.
Relocation Assistance: This role is eligible for relocation within country.
Remote Type: This position is a hybrid of office/remote working.

Skills: Automation system digital security, Client Counseling, Conformance review, Digital Forensics, Incident management, incident investigation and response, Information Assurance, Information Security, Information security behaviour change, Intrusion detection and analysis, Legal and regulatory environment and compliance, Risk Management, Secure development, Security administration, Security architecture, Security evaluation and functionality testing, Solution Architecture, Stakeholder Management, Supplier security management, Technical specialism

Legal Disclaimer: We are an equal opportunity employer and value diversity at our company.
We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, socioeconomic status, neurodiversity/neurocognitive functioning, veteran status or disability status. Individuals with an accessibility need may request an adjustment/accommodation related to bp’s recruiting process (e.g., accessing the job application, completing required assessments, participating in telephone screenings or interviews, etc.). If you would like to request an adjustment/accommodation related to the recruitment process, please contact us. If you are selected for a position and depending upon your role, your employment may be contingent upon adherence to local policy. This may include pre-placement drug screening, medical review of physical fitness for the role, and background checks.
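As an example of the "automation and scripting for security tasks" this role calls for, here is a hedged sketch that filters scanner findings by severity; the JSON layout is hypothetical and does not match any specific scanner's real output format.

```python
import json

# Hypothetical scanner output; real tools (e.g., CNAPP suites) each define their own schema.
raw = json.dumps({
    "findings": [
        {"id": "F-1", "severity": "critical", "host": "web-01"},
        {"id": "F-2", "severity": "low", "host": "web-02"},
        {"id": "F-3", "severity": "high", "host": "db-01"},
    ]
})

def actionable(report_json, levels=("critical", "high")):
    """Return the IDs of findings whose severity is in the given levels."""
    report = json.loads(report_json)
    return [f["id"] for f in report["findings"] if f["severity"] in levels]

print(actionable(raw))  # ['F-1', 'F-3']
```

A remediation workflow would feed the resulting IDs into ticketing or CI/CD gates; that integration is out of scope for this sketch.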

Posted 3 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies