
6071 Scala Jobs - Page 12

Set up a job alert
JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

2.0 - 7.0 years

30 - 35 Lacs

Bengaluru

Work from Office

Senior Data Engineer Apply now Date: 27 Jul 2025 Location: Bangalore, IN Company: kmartaustr A place you can belong: We celebrate the rich diversity of the communities in which we operate and are committed to creating inclusive and safe environments where all our team members can contribute and succeed. We believe that all team members should feel valued, respected, and safe irrespective of gender, ethnicity, indigeneity, religious beliefs, education, age, disability, family responsibilities, sexual orientation, and gender identity, and we encourage applications from all candidates. Job Description: 5-7 years of experience as a Data Engineer. 3+ years with AWS services like IAM, API Gateway, EC2, S3. 2+ years of experience creating and deploying containers on Kubernetes. 2+ years of experience with CI/CD pipelines like Jenkins, GitHub. 2+ years of experience with Snowflake data warehousing. 5-7 years with the ETL/ELT paradigm. 5-7 years with big data technologies like Spark, Kafka. Strong skills in Python, Java, or Scala. Apply now Find similar jobs:
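The posting above asks for years of experience with the ETL/ELT paradigm in Python, Java, or Scala. As a rough illustration of what that paradigm means, here is a minimal extract-transform-load sketch in plain Python; the data, field names, and in-memory "warehouse" are purely hypothetical (the real stack would involve Spark, Kafka, and Snowflake, which are not assumed here).

```python
# Minimal ETL sketch: extract raw records, normalise them, load into a sink.
# All names and data below are illustrative, not from any real system.
def extract(rows):
    """Extract: yield raw records from an in-memory 'source'."""
    yield from rows

def transform(record):
    """Transform: normalise field names and types."""
    return {"user_id": int(record["id"]), "country": record["country"].upper()}

def load(records, sink):
    """Load: append cleaned records to a destination list (stand-in for a warehouse)."""
    sink.extend(records)
    return len(sink)

source = [{"id": "1", "country": "in"}, {"id": "2", "country": "au"}]
warehouse = []
load((transform(r) for r in extract(source)), warehouse)
print(warehouse[0])  # {'user_id': 1, 'country': 'IN'}
```

The same extract/transform/load separation scales up directly: in Spark the transform would be a DataFrame operation and the load a write to a warehouse table.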

Posted 5 days ago

Apply

2.0 - 6.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Diverse Lynx is looking for a PySpark Developer to join our dynamic team and embark on a rewarding career journey. Responsibilities include: designing and developing big data applications using the PySpark framework to meet the needs of the business; writing and optimizing Spark SQL statements to extract and manipulate large datasets; developing and deploying Spark algorithms to perform data processing and analytics tasks, such as machine learning and graph processing; debugging and troubleshooting Spark code to resolve issues and improve application performance; collaborating with cross-functional teams, such as data engineers and data analysts, to ensure that the PySpark applications are integrated with other systems; and creating and maintaining documentation so that the big data architecture, design, and functionality are well understood by others. Candidates should be detail-oriented, with excellent problem-solving and communication skills.
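The Spark SQL work described above is largely grouped aggregation over large datasets. A sketch of the same logic in plain Python (pyspark is not assumed to be installed here; in Spark this would be a GROUP BY over a DataFrame, and the table and column names are illustrative):

```python
# Equivalent of: SELECT city, SUM(amount) FROM sales GROUP BY city
# written in plain Python to show the shape of the aggregation.
from collections import defaultdict

def total_by_key(rows, key, value):
    totals = defaultdict(float)
    for row in rows:
        totals[row[key]] += row[value]
    return dict(totals)

sales = [
    {"city": "Bengaluru", "amount": 120.0},
    {"city": "Pune", "amount": 80.0},
    {"city": "Bengaluru", "amount": 40.0},
]
print(total_by_key(sales, "city", "amount"))  # {'Bengaluru': 160.0, 'Pune': 80.0}
```

In PySpark the equivalent would be `df.groupBy("city").sum("amount")` or the SQL above via `spark.sql(...)`; the point is the same shuffle-and-aggregate pattern.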

Posted 5 days ago

Apply

3.0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site

Project Role : Data Engineer Project Role Description : Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills : Databricks Unified Data Analytics Platform Good to have skills : NA Minimum 3 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting and optimizing existing data workflows to enhance performance and reliability. Roles & Responsibilities: - Expected to be an SME. - Collaborate and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute on key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Mentor junior team members to enhance their skills and knowledge in data engineering. - Continuously evaluate and improve data processes to ensure efficiency and effectiveness. Professional & Technical Skills: - Must To Have Skills: Proficiency in Databricks Unified Data Analytics Platform. - Good To Have Skills: Experience with Apache Spark and data warehousing solutions. - Strong understanding of data modeling and database design principles. - Familiarity with cloud platforms such as AWS, Azure, or Google Cloud. - Experience in programming languages such as Python or Scala for data processing. 
Additional Information: - The candidate should have minimum 5 years of experience in Databricks Unified Data Analytics Platform. - This position is based at our Bengaluru office. - A 15 years full time education is required.

Posted 5 days ago

Apply

5.0 - 7.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Shift Ahead Technologies, based in Pune, requires a couple of Senior Engineers (5 to 7 years of experience) in Scala development who are self-sufficient and can work autonomously under customer supervision. This role is work from home. Candidates should be willing to join within about 7 days. Excellent English communication and the ability to work independently with the client are desirable. Design, develop, and maintain Scala-based applications and software solutions. Write clean, efficient, and scalable code following functional programming principles and best practices. Participate in architectural decisions and contribute to the design and development process of projects. Test, debug, and optimize applications to ensure high performance, security, and scalability. Collaborate with cross-functional teams including developers, analysts, QA engineers, and stakeholders throughout the development cycle. Integrate Scala solutions with other platforms, frameworks (such as Akka, Play, or Spark), and APIs for data or service integration. Confident candidates may apply or mail to careers@shiftahead.tech

Posted 5 days ago

Apply

10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Our Purpose Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential. Title And Summary Manager Software Engineer Overview We are the global technology company behind the world’s fastest payments processing network. We are a vehicle for commerce, a connection to financial systems for the previously excluded, a technology innovation lab, and the home of Priceless®. We ensure every employee has the opportunity to be a part of something bigger and to change lives. We believe as our company grows, so should you. We believe in connecting everyone to endless, priceless possibilities. Our Team Within Mastercard – Data & Services The Data & Services team is a key differentiator for Mastercard, providing the cutting-edge services that are used by some of the world's largest organizations to make multi-million dollar decisions and grow their businesses. Focused on thinking big and scaling fast around the globe, this agile team is responsible for end-to-end solutions for a diverse global customer base. Centered on data-driven technologies and innovation, these services include payments-focused consulting, loyalty and marketing programs, business Test & Learn experimentation, and data-driven information and risk management services. Targeting Analytics Program Within the D&S Technology Team, the Targeting Analytics program is a relatively new program that is comprised of a rich set of products that provide accurate perspectives on Credit Risk, Portfolio Optimization, and Ad Insights. 
Currently, we are enhancing our customer experience with new user interfaces, moving to API-based data publishing to allow for seamless integration in other Mastercard products and externally, utilizing new data sets and algorithms to further analytic capabilities, and generating scalable big data processes. We are seeking an innovative Lead Software Engineer to lead our team in designing and building a full stack web application and data pipelines. The goal is to deliver custom analytics efficiently, leveraging machine learning and AI solutions. This individual will thrive in a fast-paced, agile environment and partner closely with other areas of the business to build and enhance solutions that drive value for our customers. Engineers work in small, flexible teams. Every team member contributes to designing, building, and testing features. The range of work you will encounter varies from building intuitive, responsive UIs to designing backend data models, architecting data flows, and beyond. There are no rigid organizational structures, and each team uses processes that work best for its members and projects. Here are a few examples of products in our space: Portfolio Optimizer (PO) is a solution that leverages Mastercard’s data assets and analytics to allow issuers to identify and increase revenue opportunities within their credit and debit portfolios. Audiences uses anonymized and aggregated transaction insights to offer targeting segments that have high likelihood to make purchases within a category to allow for more effective campaign planning and activation. Credit Risk products are a new suite of APIs and tooling to provide lenders real-time access to KPIs and insights serving thousands of clients to make smarter risk decisions using Mastercard data. Help found a new, fast-growing engineering team! 
Position Responsibilities As a Lead Software Engineer, you will: Lead the scoping, design and implementation of complex features Lead and push the boundaries of analytics and powerful, scalable applications Design and implement intuitive, responsive UIs that allow issuers to better understand data and analytics Build and maintain analytics and data models to enable performant and scalable products Ensure a high-quality code base by writing and reviewing performant, well-tested code Mentor junior software engineers and teammates Drive innovative improvements to team development processes Partner with Product Managers and Customer Experience Designers to develop a deep understanding of users and use cases and apply that knowledge to scoping and building new modules and features Collaborate across teams with exceptional peers who are passionate about what they do Ideal Candidate Qualifications 10+ years of engineering experience in an agile production environment. Experience leading the design and implementation of complex features in full-stack applications. Proficiency with object-oriented languages, preferably Java/Spring. Proficiency with modern front-end frameworks, preferably React with Redux, TypeScript. High proficiency in using Python or Scala, Spark, Hadoop platforms & tools (Hive, Impala, Airflow, NiFi, Sqoop) Fluent in the use of Git, Jenkins. Solid experience with RESTful APIs and JSON/SOAP-based APIs Solid experience with SQL, multi-threading, message queuing. Experience in building and deploying production-level data-driven applications and data processing workflows/pipelines and/or implementing machine learning systems at scale in Java, Scala, or Python, delivering analytics across all phases. Desirable Capabilities Hands-on experience of cloud-native development using microservices. Hands-on experience with Kafka, ZooKeeper. Knowledge of security concepts and protocols in enterprise applications. 
Expertise with automated E2E and unit testing frameworks. Knowledge of Splunk or other alerting and monitoring solutions. Core Competencies Strong technologist eager to learn new technologies and frameworks. Experience coaching and mentoring junior teammates. Customer-centric development approach Passion for analytical / quantitative problem solving Ability to identify and implement improvements to team development processes Strong collaboration skills with experience collaborating across many people, roles, and geographies Motivation, creativity, self-direction, and desire to thrive on small project teams Superior academic record with a degree in Computer Science or related technical field Strong written and verbal English communication skills #AI3 Corporate Security Responsibility All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must: Abide by Mastercard’s security policies and practices; Ensure the confidentiality and integrity of the information being accessed; Report any suspected information security violation or breach; and Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines.

Posted 5 days ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

We are looking for a skilled Data Engineer with a solid background in building and maintaining scalable data pipelines and systems. You will work closely with data analysts, engineering teams, and business stakeholders to ensure seamless data flow across platforms. Responsibilities Design, build, and optimize robust, scalable data pipelines (batch and streaming). Develop ETL/ELT processes using tools like Airflow, DBT, or custom scripts. Integrate data from various sources (e.g., APIs, S3, databases, SaaS tools). Collaborate with analytics and product teams to ensure high-quality datasets. Monitor pipeline performance and troubleshoot data quality or latency issues. Work with cloud data warehouses (e.g., Redshift, Snowflake, BigQuery). Implement data validation, error handling, and alerting for production jobs. Maintain documentation for pipelines, schemas, and data sources. Requirements 3+ years of experience in Data Engineering or similar roles. Strong in SQL and experience with data modeling and transformation. Hands-on experience with Python or Scala for scripting/data workflows. Experience working with Airflow, AWS (S3, Redshift, Lambda), or equivalent cloud tools. Knowledge of version control (Git) and CI/CD workflows. Strong problem-solving and communication skills. Good To Have Experience with DBT, Kafka, or real-time data processing. Familiarity with BI tools (e.g., Tableau, Looker, Power BI). Exposure to Docker, Kubernetes, or DevOps practices. This job was posted by Harika K from Invictus.
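"Implement data validation, error handling, and alerting for production jobs" usually starts with splitting a batch into valid rows and rejects with reasons. A minimal sketch in plain Python (the field names and batch contents are hypothetical, not from any particular pipeline):

```python
# Split a batch of records into valid rows and rejects, recording why
# each reject failed. Rejects would typically be routed to a dead-letter
# store and trigger an alert if their share exceeds a threshold.
def validate_batch(rows, required=("id", "ts")):
    valid, rejects = [], []
    for row in rows:
        missing = [f for f in required if not row.get(f)]
        if missing:
            rejects.append({"row": row, "reason": f"missing: {missing}"})
        else:
            valid.append(row)
    return valid, rejects

batch = [{"id": 1, "ts": "2025-07-27"}, {"ts": "2025-07-27"}]
good, bad = validate_batch(batch)
print(len(good), len(bad))  # 1 1
```

Keeping the reject reason alongside the failed row makes the later alerting and replay steps straightforward.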

Posted 5 days ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Project Role : Data Engineer Project Role Description : Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills : Databricks Unified Data Analytics Platform Good to have skills : NA Minimum 5 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting and optimizing existing data workflows to enhance performance and reliability. Roles & Responsibilities: - Expected to be an SME. - Collaborate and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute on key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Mentor junior team members to enhance their skills and knowledge in data engineering. - Continuously evaluate and improve data processes to ensure efficiency and effectiveness. Professional & Technical Skills: - Must To Have Skills: Proficiency in Databricks Unified Data Analytics Platform. - Good To Have Skills: Experience with Apache Spark and data warehousing solutions. - Strong understanding of data modeling and database design principles. - Familiarity with cloud platforms such as AWS, Azure, or Google Cloud. - Experience in programming languages such as Python or Scala for data processing. 
Additional Information: - The candidate should have minimum 5 years of experience in Databricks Unified Data Analytics Platform. - This position is based at our Bengaluru office. - A 15 years full time education is required.

Posted 5 days ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Project Role : Data Engineer Project Role Description : Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills : Databricks Unified Data Analytics Platform Good to have skills : NA Minimum 5 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs, while also troubleshooting any issues that arise in the data flow. Roles & Responsibilities: - Expected to be an SME. - Collaborate and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute on key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Mentor junior team members to enhance their skills and knowledge in data engineering. - Continuously evaluate and improve data processes to enhance efficiency and effectiveness. Professional & Technical Skills: - Must To Have Skills: Proficiency in Databricks Unified Data Analytics Platform. - Good To Have Skills: Experience with Apache Spark and data warehousing solutions. - Strong understanding of data modeling and database design principles. - Experience with cloud platforms such as AWS, Azure, or Google Cloud. - Familiarity with programming languages such as Python or Scala for data manipulation. 
Additional Information: - The candidate should have minimum 7.5 years of experience in Databricks Unified Data Analytics Platform. - This position is based in Hyderabad. - A 15 years full time education is required.

Posted 5 days ago

Apply

5.0 years

0 Lacs

Navi Mumbai, Maharashtra, India

On-site

Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Databricks Unified Data Analytics Platform Good to have skills : NA Minimum 5 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, participate in team meetings, and contribute to the overall success of projects by delivering high-quality applications that enhance operational efficiency and user experience. Roles & Responsibilities: - Expected to be an SME, collaborate and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute on key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Facilitate knowledge sharing sessions to enhance team capabilities. - Monitor project progress and ensure timely delivery of application features. Professional & Technical Skills: - Must To Have Skills: Proficiency in Databricks Unified Data Analytics Platform. - Experience with data integration and ETL processes. - Strong understanding of cloud computing concepts and services. - Familiarity with programming languages such as Python or Scala. - Ability to work with data visualization tools to present insights effectively. Additional Information: - The candidate should have minimum 5 years of experience in Databricks Unified Data Analytics Platform. - This position is based at our Mumbai office. - A 15 years full time education is required.

Posted 5 days ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

We are looking for a skilled AWS Data Engineer to design, develop, and maintain scalable data pipelines and cloud-based data infrastructure on Amazon Web Services (AWS). The ideal candidate will work closely with data scientists, analysts, and software engineers to ensure high availability and performance of data solutions across the organization. Responsibilities Build/support applications using speech-to-text AWS services like Transcribe and Comprehend, along with Bedrock. Experience working with BI tools like QuickSight. Design, build, and manage scalable data pipelines using AWS services (e.g., Glue, Lambda, Step Functions, S3, EMR, Kinesis, Snowflake). Optimize data storage and retrieval for large-scale datasets in data lakes or data warehouses. Monitor, debug, and optimize the performance of data jobs and workflows. Ensure data quality, consistency, and security across environments. Collaborate with analytics, engineering, and business teams to understand data needs. Automate infrastructure deployment using IaC tools like CloudFormation or Terraform. Apply best practices for cloud cost optimization, data governance, and DevOps. Stay current with AWS services and recommend improvements to data architecture. Understanding of machine learning pipelines and MLOps (nice to have). Requirements Bachelor's degree in computer science or a related field. 5+ years of experience as a Data Engineer, with at least 3 years focused on AWS. Strong experience with AWS services, including Transcribe, Bedrock, and QuickSight. Familiarity with Glue, S3, Snowflake, Lambda, Step Functions, Kinesis, Athena, EC2/EMR, Power BI, or Tableau. Proficient in Python, PySpark, or Scala for data engineering tasks. Hands-on experience with SQL and data modeling. Familiarity with CI/CD pipelines and version control (e.g., Git, CodePipeline). Experience with orchestration tools (e.g., Airflow, Step Functions). Knowledge of data security, privacy, and compliance standards (GDPR, HIPAA, etc.).
Good To Have Skills AWS certifications (e.g., AWS Certified Data Analytics - Specialty, AWS Certified Solutions Architect). Experience with containerization (Docker, ECS, EKS). Experience working in Agile/Scrum environments. This job was posted by Shailendra Singh from PearlShell Softech.
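Data lakes like the one described above usually partition objects by date so that query engines such as Athena can prune what they scan. A small sketch of building Hive-style partitioned S3 keys; the bucket prefix, table name, and filename below are hypothetical:

```python
# Build Hive-style partitioned object keys, e.g.
# raw/events/year=2025/month=07/day=27/part-0000.parquet
# Prefix, table, and filename are illustrative placeholders.
from datetime import date

def partitioned_key(prefix, table, day, filename):
    return (f"{prefix}/{table}/year={day.year}/month={day.month:02d}/"
            f"day={day.day:02d}/{filename}")

key = partitioned_key("raw", "events", date(2025, 7, 27), "part-0000.parquet")
print(key)  # raw/events/year=2025/month=07/day=27/part-0000.parquet
```

The `key=value` path segments are what Glue crawlers and Athena recognize as partitions, so queries filtered on year/month/day read only the matching prefixes.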

Posted 5 days ago

Apply

3.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Project Role : Data Engineer Project Role Description : Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills : Databricks Unified Data Analytics Platform Good to have skills : NA Minimum 3 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs, while also troubleshooting any issues that arise in the data flow. Roles & Responsibilities: - Expected to be an SME. - Collaborate and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute on key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Mentor junior team members to enhance their skills and knowledge in data engineering. - Continuously evaluate and improve data processes to enhance efficiency and effectiveness. Professional & Technical Skills: - Must To Have Skills: Proficiency in Databricks Unified Data Analytics Platform. - Good To Have Skills: Experience with Apache Spark and data warehousing solutions. - Strong understanding of data modeling and database design principles. - Experience with cloud platforms such as AWS, Azure, or Google Cloud. - Familiarity with programming languages such as Python or Scala for data manipulation. 
Additional Information: - The candidate should have minimum 7.5 years of experience in Databricks Unified Data Analytics Platform. - This position is based in Hyderabad. - A 15 years full time education is required.

Posted 5 days ago

Apply

6.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

PayPay India is looking for a Backend engineer to work on our payment system to deliver the best payment experience for our customers. Responsibilities Design large-scale systems with high complexity to support our high-throughput applications. Understand how to leverage infrastructure for solving such large-scale problems. Develop tools and contribute to open source wherever possible. Adopt problem solving as a way of life - always go to the root cause! Support the code you write in production. Requirements Tech Stack: Java, Kotlin, Scala, Spring Boot, JUnit, Resilience4j, Feign, MySQL/AuroraDB, DynamoDB, ELK, Kafka, Redis, TiDB, Docker, Kubernetes, ArgoCD, AWS, GCP, GitHub, IntelliJ, Gradle, Maven, npm/yarn, Flyway, Jenkins, Snyk, BigQuery, Kibana, Spark, PlantUML, draw.io, Miro.com, Slack, Zoom. 6 years of experience with excellent skills in Java and any other generalized programming language, such as Scala, Python, or Go. Interest and ability to learn other coding languages as needed. Experience with SQL and NoSQL databases, along with distributed cache. Strong fundamentals in data structures, algorithms, and object-oriented programming. In-depth understanding of concurrency and distributed computing. Experience implementing platform components such as RESTful APIs, Pub/Sub Systems, and Database Clients. Experience with microservices. Experience designing high-traffic systems. Degree in Computer Engineering or Computer Science, or 5+ years equivalent experience in SaaS platform development. Business-level English or Japanese. Preferred Qualifications Experience in working on system development in finance, payment, or similar industries. Language ability in Japanese and English is a plus (We have a professional translator, but it is nice to have language skills). Experience with AWS services. This job was posted by Tanu Jha from PayPay India.
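Payment APIs like the one described are typically made safe to retry via idempotency keys, so a client that resends a request after a timeout cannot double-charge a customer. A minimal in-memory sketch (a real system would back the key store with a database or Redis, as in the stack above; the class and field names here are illustrative):

```python
# Idempotent charge handler: the first call with a given key does the work,
# any retry with the same key returns the stored result unchanged.
class PaymentService:
    def __init__(self):
        self._processed = {}  # idempotency_key -> stored result

    def charge(self, idempotency_key, amount):
        if idempotency_key in self._processed:
            return self._processed[idempotency_key]  # replay, no double charge
        result = {"status": "captured", "amount": amount}
        self._processed[idempotency_key] = result
        return result

svc = PaymentService()
first = svc.charge("key-123", 500)
retry = svc.charge("key-123", 500)
print(first is retry)  # True - the retry is served from the stored result
```

In a distributed deployment the key check and the write must be atomic (e.g., a conditional put), which is where the concurrency fundamentals the posting asks for come in.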

Posted 5 days ago

Apply

4.0 years

0 Lacs

India

Remote

Role Highlights: Position: Big Data Engineer Experience: 4+ years Location: All India - Remote; Hyderabad - Hybrid Notice Period: Immediate/7-day joiners mandatory Job Overview: Must-have skills - Big Data, Scala, AWS, and Python or Java

Posted 5 days ago

Apply

10.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Role - Data Analytics Architect Exp. - 10+ years Location - PAN India (Prefer - Thane, Mumbai, Hyderabad) Required Technical Skill Set - Snowflake Desired Competencies (Technical/Behavioural Competency): Experience in architecture definition, design and implementation of data lake solutions on Azure/AWS/Snowflake. Designs and models data lake architecture; implements standards, best practices and processes to improve the management of information and data throughout its lifecycle across this platform. Design and implement data engineering, ingestion and curation functions on the data lake using native components or custom programming (Azure/AWS). Proficient in tools/technologies – Azure (Azure Data Factory, Synapse, ADLS, Databricks etc.)/AWS (Redshift, S3, Glue, Athena, DynamoDB etc.)/Snowflake technology stack/Talend/Informatica. Analyse data requirements, application and processing architectures, data dictionaries and database schema(s). Analyse complex data systems and document data elements, data flow, relationships, and dependencies. Collaborate with Infrastructure and Security Architects to ensure alignment with Enterprise standards and designs. Data Modelling, Data Warehousing, Dimensional Modelling, Data Modelling for Big Data & Metadata Management. Knowledge of data catalogue tools, metadata management and data quality management. Experience in design & implementation of dashboards using tools like Power BI, Qlik etc. Strong oral and written communication skills. Good presentation skills. Analytical skills. Business orientation & acumen (exposure). Advisory experience, to be able to be positioned or seen as an expert. Willingness to travel internationally, collocate with clients for short or long term. Basic knowledge of advanced analytics. Exposure to leveraging Artificial Intelligence and Machine Learning for analysis of complex and large datasets. Tools like Python/Scala etc. 
Responsibilities Executing various consulting & implementation engagements for Data lake solutions Data integration, Data modelling, data delivery, statistics, analytics and math Identify right solutions to business problems Learn and Leverage tools/ technologies and product solutions in Data & Analytics area Implement advanced analytics, cognitive analytics models Support RFPs by providing business perspective, participate in RFP discussions, coordination within support groups in TCS Conduct business research and demonstrate thought leadership through analyst engagements, white papers and participation in industry focus areas
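The dimensional modelling the role above calls for centres on the star schema: fact rows hold surrogate keys that resolve against dimension tables. A tiny sketch of that join in plain Python (the table contents and column names are illustrative only; in a warehouse this would be a SQL join between fact and dimension tables):

```python
# Star-schema lookup: enrich fact rows with dimension attributes
# by resolving the surrogate key. Data below is made up for illustration.
dim_customer = {1: {"name": "Acme", "segment": "Retail"}}
fact_sales = [{"customer_key": 1, "amount": 250.0}]

def enrich(facts, dim):
    """Join each fact row to its dimension row via the surrogate key."""
    return [{**f, **dim[f["customer_key"]]} for f in facts]

print(enrich(fact_sales, dim_customer))
```

Keeping descriptive attributes in the dimension and only keys and measures in the fact table is what keeps the fact table narrow and the model easy to query.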

Posted 5 days ago

Apply

1.0 - 4.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Responsible for the analysis, definition, design, construction, testing, installation, modification, and maintenance of properly engineered information systems, containing software as the major component, to meet agreed business needs. To view the full job code description, copy and paste the following url into your browser to access the GRF site, https://reachingourpeople.com/career-development/global-role-framework/. About the Role: The Legal & Research Technology team in Bangalore provides systems development and support for the content pathways and content processing needs of Westlaw. The group oversees and executes on a wide range of project types, ranging from cost-saving infrastructure to revenue-driving product development initiatives. We are looking for a highly motivated, innovative, and detail-oriented individual who will make an impact by contributing to the team's development needs right away. The key area of focus for this position is serving as a Software Engineer for a multi-year project to deliver new and re-engineered systems using AWS and its capabilities, with excellent proficiency in Python, Groovy, JavaScript, and Angular 6+. 
About the role: Development of high-quality code/script in Python, Groovy, JavaScript and/or Angular 6+. Work with XML content. Write Lambdas with self-service and extensible configurations. Adhere to best practices for development in Python, Groovy, JavaScript, and Angular. Come up with functional unit test cases for the requirements in Python, Groovy, JavaScript, and Angular. Actively participate in code reviews of your own and your peers' work. Work with different AWS capabilities. Understand integration points of upstream and downstream processes. Learn new frameworks that are needed for implementation. Maintain and update the Agile/Scrum dashboard for accurate tracking of your own tasks. Proactively pick up tasks and work toward completing them on aggressive timelines. Understand the existing functionality of the systems and suggest how we can improve it. About you: You're a fit for the role of Software Engineer if you have: very strong OO design patterns and concepts; strength in Python, Groovy, and/or Scala; strong Angular scripting; a good understanding of cloud concepts; a strong understanding of Agile and Scrum methodologies; strong written and verbal communication skills; the ability to work under pressure; attention to detail; working knowledge of some of the AWS capabilities; knowledge of Agile/Scrum tracking tools; keenness to pick up newer technologies; a team-player mindset, interacting with internal/external teams; adaptability and flexibility. #LI-NP1 What's in it for You? Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected. Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset.
This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance. Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future. Industry Competitive Benefits: We offer comprehensive benefit plans to include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing. Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together. Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives. Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world. Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media.
Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world-leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward. As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.

Posted 5 days ago

Apply

7.0 - 12.0 years

8 - 13 Lacs

Bengaluru

Work from Office

We are looking for a self-motivated individual with an appetite to learn new skills and be part of a fast-paced team that is delivering cutting-edge solutions that drive new products and features that are critical for our customers. Our senior software engineers are responsible for designing, developing and ensuring the quality, reliability and availability of key systems that provide critical data and algorithms. Responsibilities of this role will include developing new and enhancing existing applications, and you will work collaboratively with technical leads and architects to design, develop and test these critical applications. About the role: Actively participate in the full life cycle of software delivery, including analysis, design, implementation and testing of new projects and features using Hadoop, Spark/PySpark, Scala or Java, Hive, SQL, and other open-source tools and design patterns. Python knowledge is a bonus for this role. Working experience with HUDI, Snowflake or similar. Must-have technologies include big data and AWS services like EMR, S3, Lambdas, Elastic, Step Functions. Actively participate in the development and testing of features for assigned projects with little to no guidance. The position holds opportunities to work under technical experts and also to provide guidance and assistance to less experienced team members or new joiners on the project. An appetite for learning will be a key attribute for doing well in the role, as the organization is very dynamic and has tremendous scope across various technical landscapes. We consider AI adoption key to excelling in this role; we want dynamic candidates who use AI tools as build partners and share experiences to energize the organization.
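The HUDI experience mentioned above centers on upsert semantics: merging incoming records into a table by record key, with the newest version winning. A greatly simplified sketch of that idea in plain Python (the field names record_key and ts are assumptions for illustration, not HUDI's actual API):

```python
# Hedged sketch: lakehouse-style "upsert by record key, latest wins",
# reduced to plain Python dicts. Not HUDI's real API.

def upsert(table: dict, incoming: list) -> dict:
    """Merge incoming records into table, keyed by record_key.

    A record replaces an existing one only if its ts is at least as new,
    mirroring the merge step a copy-on-write table performs on write.
    """
    for rec in incoming:
        key = rec["record_key"]
        current = table.get(key)
        if current is None or rec["ts"] >= current["ts"]:
            table[key] = rec
    return table
```

In a real pipeline this merge is what the table format performs under the hood when Spark writes a batch with upsert enabled; the sketch only shows the key-and-timestamp logic.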
Proactively share knowledge and best practices on using new and emerging technologies across all of the development and testing groups. Create, review and maintain technical documentation of software development and testing artifacts. Work collaboratively with others in a team-based environment. Identify and participate in the resolution of issues with the appropriate technical and business resources. Generate innovative approaches and solutions to technology challenges. Effectively balance and prioritize multiple projects concurrently. About you: Bachelor's or Master's degree in computer science or a related field. 7+ years of experience in the IT industry; product and platform development preferred. Strong programming skills in Java or Scala. Must-have technologies include big data and AWS; exposure to services like EMR, S3, Lambdas, Elastic, Step Functions. Knowledge of Python preferred. Experience with Agile methodology, continuous integration and/or test-driven development. Self-motivated with a strong desire for continual learning. Takes personal responsibility to impact results and deliver on commitments. Effective verbal and written communication skills. Ability to work independently or as part of an agile development team. #LI-SP1 What's in it for You? Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected. Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance.
Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future. Industry Competitive Benefits: We offer comprehensive benefit plans to include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing. Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together. Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives. Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world. Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media.
Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world-leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward. As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.

Posted 5 days ago

Apply

0 years

0 Lacs

Bengaluru East, Karnataka, India

On-site

Engineer - Data and AI Wroclaw, Poland. AXA XL recognises that digital, data and information assets are critical for the business, both in terms of managing risk and enabling new business opportunities. Data and applied AI assets should not only be high quality, but also drive a sustained competitive advantage, deliver a superior experience to our internal and external customers, and improve efficiency. Our Innovation, Data and Analytics (IDA) function is focused on driving innovation by optimizing how we leverage digital, data and AI to drive strategy and differentiate ourselves from the competition. As we develop an enterprise-wide data and digital strategy that moves us toward a greater focus on the use of data and strengthens our digital and AI capabilities, we are seeking an Engineer - Data and AI. In this role, you will work under the guidance of an Engineering Manager to apply engineering best practice and the latest technology to lead the design and execution of data and AI applications, utilizing the capacity of AXA XL’s Data and AI Platforms and cloud technologies. Our tech stack continues to evolve together with the Azure data and AI offering and relies on the Azure Databricks and Azure AI pillars. Additionally, AXA XL consumes the wider technology offering from AXA Group, such as managed OpenShift, VM and DevOps platforms. We use the Scrum methodology. What You’ll Be DOING What will your essential responsibilities include? Act as a key engineering team member, collaborating with product owners on the timely and compliant delivery of a wide range of data & ML applications and business tools. Understand current and future data and AI consumer needs to contribute to application design that is scalable, economical, and focused on end-user value. Support the execution of data and AI projects by collaborating with stakeholders, refining requirements, and assisting in execution planning, with a focus on outcomes that align with IDA objectives.
Develop maintainable source code which follows best practices that you can instil in more junior team members. Participate in peer reviews and pair programming where beneficial. Manage technical debt strategically to achieve project outcomes while considering longer-term objectives such as simplicity, maintainability, and control of the IT estate. Participate in research and hands-on execution of POCs into new technologies such as AI tooling, storage and query solutions, and governance tools. Provide support to the data science and data engineering community in maximizing the value of their work through automation, tooling, and engineering. Advise on engineering best practices and skill gaps when necessary. In this role, you will report to the Engineering Manager - Data and AI. What You Will BRING We’re looking for someone who has these abilities and skills: An engineering professional with hands-on skills in designing, building, and optimizing scalable, cost-efficient data systems and applications in a cloud-first environment. Familiarity with engineering practices such as CI/CD, release lifecycle, observability, and testing, with a proven ability to implement and deliver changes in a controlled, informed, and safe manner. Programming experience, ideally in Python or with a willingness to use it. Familiarity with some of the following: microservices, Databricks or Spark, analytical and operational (SQL, NoSQL) databases, Kubernetes, orchestration tools, monitoring tools, infrastructure as code, Docker, and streaming technologies. Proven experience as a Data Engineer / Software Engineer / ML Engineer / AI Application Engineer on an open-source tech stack, with experience in Python, Java/Scala, Spark, cloud platforms, and database technologies. Who WE Are AXA XL, the P&C and specialty risk division of AXA, is known for solving complex risks. For mid-sized companies, multinationals and even some inspirational individuals we don’t just provide re/insurance, we reinvent it. How?
By combining a comprehensive and efficient capital platform, data-driven insights, leading technology, and the best talent in an agile and inclusive workspace, empowered to deliver top client service across all our lines of business − property, casualty, professional, financial lines and specialty. With an innovative and flexible approach to risk solutions, we partner with those who move the world forward. Learn more at axaxl.com What We OFFER Inclusion AXA XL is committed to equal employment opportunity and will consider applicants regardless of gender, sexual orientation, age, ethnicity and origins, marital status, religion, disability, or any other protected characteristic. At AXA XL, we know that an inclusive culture enables business growth and is critical to our success. That’s why we have made a strategic commitment to attract, develop, advance and retain the most inclusive workforce possible, and create a culture where everyone can bring their full selves to work and reach their highest potential. It’s about helping one another — and our business — to move forward and succeed. Five Business Resource Groups focused on gender, LGBTQ+, ethnicity and origins, disability and inclusion with 20 Chapters around the globe. Robust support for Flexible Working Arrangements Enhanced family-friendly leave benefits Named to the Diversity Best Practices Index Signatory to the UK Women in Finance Charter Learn more at axaxl.com/about-us/inclusion-and-diversity. AXA XL is an Equal Opportunity Employer. Total Rewards AXA XL’s Reward program is designed to take care of what matters most to you, covering the full picture of your health, wellbeing, lifestyle and financial security. It provides competitive compensation and personalized, inclusive benefits that evolve as you do. We’re committed to rewarding your contribution for the long term, so you can be your best self today and look forward to the future with confidence.
Sustainability At AXA XL, Sustainability is integral to our business strategy. In an ever-changing world, AXA XL protects what matters most for our clients and communities. We know that sustainability is at the root of a more resilient future. Our 2023-26 Sustainability strategy, called “Roots of resilience”, focuses on protecting natural ecosystems, addressing climate change, and embedding sustainable practices across our operations. Our Pillars Valuing nature: How we impact nature affects how nature impacts us. Resilient ecosystems - the foundation of a sustainable planet and society - are essential to our future. We’re committed to protecting and restoring nature - from mangrove forests to the bees in our backyard - by increasing biodiversity awareness and inspiring clients and colleagues to put nature at the heart of their plans. Addressing climate change: The effects of a changing climate are far-reaching and significant. Unpredictable weather, increasing temperatures, and rising sea levels cause both social inequalities and environmental disruption. We're building a net zero strategy, developing insurance products and services, and mobilizing to advance thought leadership and investment in societal-led solutions. Integrating ESG: All companies have a role to play in building a more resilient future. Incorporating ESG considerations into our internal processes and practices builds resilience from the roots of our business. We’re training our colleagues, engaging our external partners, and evolving our sustainability governance and reporting. AXA Hearts in Action: We have established volunteering and charitable giving programs to help colleagues support causes that matter most to them, known as AXA XL’s “Hearts in Action” programs. These include our Matching Gifts program, Volunteering Leave, and our annual volunteering day - the Global Day of Giving. For more information, please see axaxl.com/sustainability. AXA XL is an Equal Opportunity Employer.

Posted 5 days ago

Apply

5.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office

This role involves the development and application of engineering practice and knowledge in the following technologies: standards and protocols, application software and embedded software for wireless and satellite networks, fixed networks and enterprise networks; connected devices (IoT and device engineering), connected applications (5G/edge, B2X apps); and Telco Cloud, Automation and Edge Compute platforms. This role also involves the integration of network systems and their operations, related to the above technologies. - Grade Specific: Focus on connectivity and network engineering. Develops competency in own area of expertise. Shares expertise and provides guidance and support to others. Interprets clients' needs. Completes own role independently or with minimum supervision. Identifies problems and relevant issues in straightforward situations and generates solutions. Contributes to teamwork and interacts with customers.

Posted 5 days ago

Apply

2.0 - 5.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Data engineers are responsible for building reliable and scalable data infrastructure that enables organizations to derive meaningful insights, make data-driven decisions, and unlock the value of their data assets. - Grade Specific: The role involves leading and managing a team of data engineers, defining and executing the data engineering strategy, and ensuring the effective delivery of data solutions. They provide technical expertise, drive innovation, and collaborate with stakeholders to deliver high-quality, scalable, and reliable data infrastructure and solutions.

Posted 5 days ago

Apply

1.0 years

0 Lacs

Bengaluru East, Karnataka, India

On-site

Company Description At Nielsen, we are passionate about our work to power a better media future for all people by providing powerful insights that drive client decisions and deliver extraordinary results. Our talented, global workforce is dedicated to capturing audience engagement with content - wherever and whenever it’s consumed. Together, we are proudly rooted in our deep legacy as we stand at the forefront of the media revolution. When you join Nielsen, you will join a dynamic team committed to excellence, perseverance, and the ambition to make an impact together. We champion you, because when you succeed, we do too. We enable your best to power our future. Job Description Job Purpose: Develop and enhance our flagship Video, Audio, Automotive and Sports metadata software solutions. Design applications with a platform-first mentality where scale, consistency and reliability are at the core of every decision. Job Description: As a Software Engineer, you will be responsible for designing, developing, and maintaining high-quality software applications using various technologies. You will be contributing to solutions which acquire, enrich and deliver the world’s richest content metadata to customers across the globe using a technology-first approach. Key Responsibilities: Design, develop, and maintain scalable and robust Java applications. Write clean, maintainable, and efficient code following best practices and coding standards. Own the entire SDLC, from requirements gathering and design through implementation, testing, deployment, and maintenance, ensuring high-quality software solutions that meet business objectives. Thrive on continuous learning, demonstrate a keen interest in emerging technologies, and proactively seek opportunities to expand your skillset. Troubleshoot and debug applications to optimize performance and resolve issues. Participate in the full software development lifecycle, including planning, development, testing, and deployment.
Stay up-to-date with emerging technologies and industry trends to continuously improve skills and knowledge. Qualifications: Bachelor's in Computer Science, Engineering or a related field. 1 to 3 years of professional experience in software development. Strong analytical and logical reasoning skills with a passion for problem-solving and innovation. Excellent math capabilities for algorithm design, optimization, and data analysis. Solid understanding of data structures, algorithms, and computer science fundamentals. Proficiency in one or more programming languages such as Java, Python, C++, Scala, Go. Knowledge of data science is an added advantage. Excellent communication, teamwork and adaptability skills. What We Offer: Competitive salary and benefits package. Opportunities for professional growth and development. A collaborative and inclusive work environment. Additional Information: Please be aware that job-seekers may be at risk of targeting by scammers seeking personal data or money. Nielsen recruiters will only contact you through official job boards, LinkedIn, or email with a nielsen.com domain. Be cautious of any outreach claiming to be from Nielsen via other messaging platforms or personal email addresses. Always verify that email communications come from an @nielsen.com address. If you're unsure about the authenticity of a job offer or communication, please contact Nielsen directly through our official website or verified social media channels.

Posted 5 days ago

Apply

3.0 - 6.0 years

25 - 35 Lacs

Pune

Work from Office

Role & responsibilities: Develop and maintain search functionality on the Lucidworks Fusion platform. Experience: 3 to 6 years. Connect databases for pulling data into Fusion from various types of data sources. Implement real-time indexing of large-scale data sets residing in database files and other sources, using Fusion as the search platform. Proven experience in implementing and maintaining enterprise search solutions in large-scale environments. Experience developing and deploying search solutions in a public cloud such as AWS. Proficient in high-level programming languages: Java, Scala, Python. Familiarity with containerization, scripting, cloud platforms, and CI/CD. Work with business analysts and customers to translate business needs into software solutions. Have an understanding of the software development process, version control, etc.
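The indexing work described above rests on one core structure: an inverted index mapping terms to the documents that contain them. A minimal sketch of that idea in Python follows; real engines such as Fusion/Solr add analyzers, relevance scoring, and segment files, so this only illustrates the indexing-and-lookup concept.

```python
# Hypothetical sketch of an inverted index: term -> set of doc ids.
# Not the Fusion/Solr API; whitespace tokenization stands in for a
# real analyzer chain.
from collections import defaultdict

class InvertedIndex:
    def __init__(self):
        self._postings = defaultdict(set)  # term -> doc ids containing it

    def index(self, doc_id: str, text: str) -> None:
        """Add one document's terms to the postings lists."""
        for term in text.lower().split():
            self._postings[term].add(doc_id)

    def search(self, query: str) -> set:
        """Return doc ids containing every query term (AND semantics)."""
        terms = query.lower().split()
        if not terms:
            return set()
        result = set(self._postings.get(terms[0], set()))
        for t in terms[1:]:
            result &= self._postings.get(t, set())
        return result
```

"Real-time indexing" in this picture means calling index() as records arrive from the connected databases, so new documents become searchable without a batch rebuild.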

Posted 5 days ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Noida

Work from Office

Changing the world through digital experiences is what Adobe's all about. We give everyone from emerging artists to global brands everything they need to design and deliver exceptional digital experiences. We're passionate about empowering people to craft beautiful and powerful images, videos, and apps, and transform how companies interact with customers across every screen. We're on a mission to hire the very best and are committed to building exceptional employee experiences where everyone is respected and has access to equal opportunity. We realize that new ideas can come from everywhere in the organization, and we know the next big idea could be yours! Digital Experience (DX) (https://www.adobe.com/experience-cloud.html) is a USD 3B+ business serving the needs of enterprise businesses, including 95%+ of Fortune 500 organizations. Adobe Journey Optimizer (AJO) within DX provides a platform for designing cross-channel customer experiences and provides an environment for visual campaign orchestration, real-time interaction management and cross-channel execution. It is built natively on the Adobe Experience Platform and combines a unified, real-time customer profile, an API-first open framework, centralized offer decisioning, and artificial intelligence (AI) and machine learning (ML) for personalization and optimization. Beyond the usual responsibility of designing, developing, documenting, and thoroughly testing code, Computer Scientists @ Adobe own features of varying complexity, which may require understanding interactions with other parts of the system, moderately sophisticated algorithms and good design judgment. We are looking for strong and passionate engineers to join our team as we scale the business by building the next-gen products and contributing to our existing offerings. What you'll do This is an individual contributor position. Expectations will be on the below lines: Responsible for design and architecture of new products.
Work in full DevOps mode; be responsible for all phases of engineering, from early specs, design/architecture, technology choice, development, unit-testing/integration automation, and deployment. Collaborate with architects, product management and other engineering teams to build the technical vision and road map for the team. Build technical specifications, prototypes and presentations to communicate your ideas. Be well versed in emerging industry technologies and trends, and have the ability to communicate that knowledge to the team and use it to influence product direction. Orchestrate with the team to develop a product or parts of a large product. Requirements B.Tech / M.Tech degree in Computer Science from a premier institute. 7-9.5 years of relevant experience in software development. Should have excellent computer science fundamentals and a good understanding of design and performance of algorithms. Proficient in Java/Scala programming. Proficient in writing code that is reliable, maintainable, secure, and performant. Knowledge of Azure services and/or AWS. Internal Opportunities We're glad that you're pursuing career development opportunities at Adobe. Here's what you'll need to do: Apply with your complete LinkedIn profile or resume/CV. Schedule a Check-in meeting with your manager to discuss this internal opportunity and your career aspirations. Check-ins should include ongoing discussions about expectations, feedback and career development. Learn more about Check-in here. Learn more about the internal career opportunities process in this FAQ. If you're contacted for an interview, here are some tips. At Adobe, you will be immersed in an exceptional work environment that is recognized throughout the world on Best Companies lists. You will also be surrounded by colleagues who are committed to helping each other grow through our unique Check-In approach where ongoing feedback flows freely. If you're looking to make an impact, Adobe's the place for you.
Discover what our employees are saying about their career experiences on the Adobe Life blog and explore the meaningful benefits we offer. Adobe is an equal opportunity employer. We welcome and encourage diversity in the workplace regardless of gender, race or color, ethnicity or national origin, age, disability, religion, sexual orientation, gender identity or expression, or veteran status.

Posted 5 days ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office

We are seeking talented Senior Software Engineers to join our Engineering team, supporting Search Engineering efforts. In this role, you will play a key part in designing and optimizing data infrastructure, enabling real-time and batch data processing to enhance search retrieval, ranking, and product experiences. You will work closely with backend and ML engineers, data scientists, and product teams to build robust, scalable, and high-performance data systems that power personalized user experiences. What the Candidate Will Do: Develop serving infrastructure to improve system latency, throughput, and reliability. Enhance search relevance by improving indexing, retrieval, and ranking mechanisms. Develop and optimize search algorithms, ranking models, and query processing techniques. Implement and maintain scalable search pipelines and distributed indexing systems. Work with machine learning engineers to integrate AI-driven search ranking and personalization models. Analyze search performance metrics and run A/B experiments to measure improvements. Optimize latency, throughput, and scalability of search infrastructure. Contribute to system design and architecture decisions to improve search quality and efficiency. Write clean, efficient, and maintainable code in Python, Java, or Go. Collaborate with cross-functional teams to enhance search relevance and user experience. Monitor and troubleshoot search-related production issues to ensure system reliability. Basic Qualifications: 5+ years of experience in software engineering. Expertise in big data technologies such as Apache Spark, Kafka, Flink, Airflow, Presto, or Snowflake. Strong experience with search and recommendation systems, working with Elasticsearch, OpenSearch, Solr, or similar technologies. Proficiency in distributed data processing frameworks and real-time streaming architectures.
Deep understanding of data modeling, ETL pipelines, and data warehousing principles. Strong programming skills in Golan, Python, Scala, or Java. Experience with cloud platforms (AWS, GCP, or Azure) and modern data infrastructure tools. Ability to work on high-scale distributed systems and troubleshoot performance bottlenecks. Strong problem-solving and analytical skills, with a passion for data-driven decision-making. Preferred Qualifications ---- Hands-on experience with search technologies such as Elasticsearch, OpenSearch, Solr, or Vespa. Familiarity with search retrieval, ranking techniques, query understanding, and text processing. Ubers mission is to reimagine the way the world moves for the better. Here, bold ideas create real-world impact, challenges drive growth, and speed fuelds progress. What moves us, moves the world - let s move it forward, together. Offices continue to be central to collaboration and Ubers cultural identity. Unless formally approved to work fully remotely, Uber expects employees to spend at least half of their work time in their assigned office. For certain roles, such as those based at green-light hubs, employees are expected to be in-office for 100% of their time. Please speak with your recruiter to better understand in-office expectations for this role.
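The retrieval-and-ranking work this listing describes ultimately comes down to scoring documents against a query. As a rough illustration only (nothing here comes from the posting; the corpus, query, and parameter values are invented), a minimal BM25 scorer, the classic ranking function behind Elasticsearch and Solr defaults, can be sketched in plain Python:

```python
import math

def bm25_score(query_terms, doc, corpus, k1=1.5, b=0.75):
    """Score one tokenized document against a query using simplified BM25.

    corpus is a list of tokenized documents; k1 and b are the usual
    BM25 free parameters (values here are common defaults).
    """
    avgdl = sum(len(d) for d in corpus) / len(corpus)  # average doc length
    n_docs = len(corpus)
    score = 0.0
    for term in query_terms:
        df = sum(1 for d in corpus if term in d)  # document frequency
        if df == 0:
            continue
        idf = math.log(1 + (n_docs - df + 0.5) / (df + 0.5))
        tf = doc.count(term)
        # Term-frequency saturation, normalized by document length
        score += idf * (tf * (k1 + 1)) / (tf + k1 * (1 - b + b * len(doc) / avgdl))
    return score

# Invented toy corpus and query for the example
corpus = [
    ["cheap", "flights", "to", "paris"],
    ["paris", "hotel", "deals"],
    ["machine", "learning", "jobs"],
]
query = ["paris", "flights"]
ranked = sorted(range(len(corpus)),
                key=lambda i: bm25_score(query, corpus[i], corpus),
                reverse=True)
```

The document containing both query terms ranks first; a real engine adds inverted indexes, analyzers, and learned re-ranking on top of this kind of base score.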

Posted 5 days ago

Apply

5.0 - 10.0 years

15 - 25 Lacs

Hyderabad/Secunderabad, Bangalore/Bengaluru, Delhi / NCR

Hybrid

Ready to shape the future of work? At Genpact, we don't just adapt to change; we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Senior Principal Consultant, AWS Data Lake!

Responsibilities
- Knowledge of Data Lake on AWS services, with exposure to creating External Tables and Spark programming.
- Able to work with Python; writing effective and scalable Python code for automation, data wrangling, and ETL.
- Design and implement robust applications and build automations using Python.
- Debug applications to ensure low latency and high availability.
- Write optimized custom SQL queries.
- Experienced in team and client handling.
- Strong documentation skills covering systems, design, and delivery.
- Integrate user-facing elements into applications.
- Knowledge of External Tables and Data Lake concepts.
- Able to allocate tasks, collaborate on status exchanges, and drive work to successful closure.
- Implement security and data protection solutions.
- Capable of writing SQL queries for validating dashboard outputs.
- Able to translate visual requirements into detailed technical specifications.
- Well versed in handling Excel, CSV, text, JSON, and other unstructured file formats using Python.
- Expertise in at least one popular Python framework (such as Django, Flask, or Pyramid).
- Good understanding of and exposure to Git, Bamboo, Confluence, and Jira.
- Good with DataFrames and ANSI SQL using pandas.
- Team player with a collaborative approach and excellent communication skills.

Qualifications we seek in you!

Minimum Qualifications
- BE/B Tech/MCA
- Excellent written and verbal communication skills
- Good knowledge of Python and PySpark

Preferred Qualifications/Skills
- Strong ETL knowledge of any ETL tool is good to have.
- Good to have knowledge of AWS cloud and Snowflake.
- Knowledge of PySpark is a plus.

Why join Genpact?
- Be a transformation leader: work at the cutting edge of AI, automation, and digital innovation.
- Make an impact: drive change for global enterprises and solve business challenges that matter.
- Accelerate your career: get hands-on experience, mentorship, and continuous learning opportunities.
- Work with the best: join 140,000+ bold thinkers and problem-solvers who push boundaries every day.
- Thrive in a values-driven culture: our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress.

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.
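Several of the requirements above (handling CSV/JSON files with Python, working with DataFrames, validating dashboard outputs with SQL-style queries) amount to loading raw files into pandas and cross-checking aggregates. A minimal sketch with invented sample data, not taken from the posting, might look like:

```python
import io

import pandas as pd

# Invented sample data standing in for a raw CSV extract
csv_data = io.StringIO("id,region,sales\n1,North,100\n2,South,250\n3,North,175\n")
df = pd.read_csv(csv_data)

# pandas equivalent of the ANSI SQL:
#   SELECT region, SUM(sales) AS sales FROM df GROUP BY region
summary = df.groupby("region", as_index=False)["sales"].sum()

# Validate a dashboard figure against the raw data
north_total = summary.loc[summary["region"] == "North", "sales"].iloc[0]
```

In practice the same `read_*` family (`read_excel`, `read_json`, `read_fwf`) covers the other file formats the listing mentions, with the groupby result compared against the warehouse query output.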
Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Posted 5 days ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Pune, Gurugram, Bengaluru

Work from Office

What You'll Do
- Build, refine, and use ML engineering platforms and components.
- Scale machine learning algorithms to work on massive data sets under strict SLAs.
- Build and orchestrate model pipelines, including feature engineering, inferencing, and continuous model training.
- Implement MLOps, including model KPI measurement, tracking, model drift detection, and a model feedback loop.
- Collaborate with client-facing teams to understand business context at a high level and contribute to technical requirement gathering.
- Implement basic features aligned with technical requirements.
- Write production-ready code that is easily testable, understood by other developers, and accounts for edge cases and errors.
- Ensure the highest quality of deliverables by following architecture/design guidelines, coding best practices, and periodic design/code reviews.
- Write unit tests as well as higher-level tests that handle expected edge cases and errors gracefully, as well as happy paths.
- Use bug tracking, code review, version control, and other tools to organize and deliver work.
- Participate in scrum calls and agile ceremonies, and effectively communicate work progress, issues, and dependencies.
- Consistently contribute to researching and evaluating the latest architecture patterns and technologies through rapid learning, proof-of-concepts, and prototype solutions.

What You'll Bring
- A master's or bachelor's degree in Computer Science or a related field from a top university.
- 5+ years of hands-on experience in ML development.
- Good fundamentals of machine learning.
- Strong programming expertise in Python and PySpark/Scala.
- Expertise in crafting ML models for high performance and scalability.
- Experience implementing feature engineering, inferencing pipelines, and real-time model predictions.
- Experience in MLOps to measure and track model performance; experience working with MLflow.
- Experience with Spark or other distributed computing frameworks.
- Experience with ML platforms such as SageMaker and Kubeflow.
- Experience with pipeline orchestration tools such as Airflow.
- Experience deploying models to cloud services such as AWS, Azure, GCP, or Azure ML.
- Expertise in SQL and SQL databases.
- Knowledge of core CS concepts such as common data structures and algorithms.
- Ability to collaborate well with teams of different backgrounds, expertise, and functions.

Additional Skills
- Understanding of DevOps, CI/CD, and data security; experience designing on a cloud platform.
- Experience in data engineering on Big Data systems.
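The MLOps bullets above call for measuring model drift as part of KPI tracking. One widely used drift KPI is the Population Stability Index (PSI), which compares how a score distribution shifts between a baseline window and a current window. This is a hand-rolled illustrative sketch with invented score samples, not code from the posting or from MLflow:

```python
import math

def psi(expected, actual, bins=4):
    """Population Stability Index between two score samples.

    Buckets both samples over a shared range and sums
    (actual_frac - expected_frac) * ln(actual_frac / expected_frac).
    """
    lo = min(expected + actual)
    hi = max(expected + actual)
    width = (hi - lo) / bins or 1.0  # guard against a degenerate range

    def bucket_fracs(values):
        counts = [0] * bins
        for v in values:
            counts[min(int((v - lo) / width), bins - 1)] += 1
        # Floor at a tiny fraction so empty buckets don't produce log(0)
        return [max(c / len(values), 1e-6) for c in counts]

    e, a = bucket_fracs(expected), bucket_fracs(actual)
    return sum((ai - ei) * math.log(ai / ei) for ei, ai in zip(e, a))

# Invented model-score samples: baseline window vs. current window
baseline = [0.1, 0.2, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7]
current = [0.1, 0.2, 0.25, 0.3, 0.35, 0.5, 0.55, 0.7]
drift = psi(baseline, current)
```

A common rule of thumb reads PSI below roughly 0.1 as stable and above roughly 0.25 as significant drift; in a real pipeline the value would be logged per run (e.g. as an MLflow metric) and alarmed on, rather than inspected by hand.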

Posted 5 days ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies