
2896 Scala Jobs - Page 23

JobPe aggregates listings so they are easy to browse in one place; applications are submitted directly on the original job portal.

2.0 - 3.0 years

5 - 9 Lacs

Kochi

Work from Office

Naukri logo

Job Title - Management Level: Location: Kochi, Coimbatore, Trivandrum. Must-have skills: Python/Scala, PySpark/PyTorch. Good-to-have skills: Redshift. Experience: 3.5-5 years of experience is required. Educational Qualification: Graduation.

Job Summary: You'll capture user requirements and translate them into business and digitally enabled solutions across a range of industries. Your responsibilities will include:

Roles and Responsibilities: Designing, developing, optimizing, and maintaining data pipelines that adhere to ETL principles and business goals. Solving complex data problems to deliver insights that help the business achieve its goals. Sourcing data (structured and unstructured) from various touchpoints and organizing it into an analyzable format. Creating data products for analytics team members to improve productivity. Calling AI services such as vision and translation to generate outcomes that can be used in further steps along the pipeline. Fostering a culture of sharing, re-use, design, and operational efficiency of data and analytical solutions. Preparing data to create a unified database and building tracking solutions that ensure data quality. Creating production-grade analytical assets deployed using the guiding principles of CI/CD.

Professional and Technical Skills: Expert in at least two of Python, Scala, PySpark, PyTorch, and JavaScript. Extensive experience in data analysis (big data, Apache Spark environments), data libraries (e.g. Pandas, SciPy, TensorFlow, Keras), and SQL, with 2-3 years of hands-on experience on these technologies. Experience in a BI tool such as Tableau, Power BI, or Looker. Good working knowledge of key concepts in data analytics, such as dimensional modeling, ETL, reporting/dashboarding, data governance, dealing with structured and unstructured data, and corresponding infrastructure needs. Worked extensively in Microsoft Azure (ADF, Function Apps, ADLS, Azure SQL), AWS (Lambda, Glue, S3), Databricks analytical platforms/tools, and the Snowflake cloud data warehouse.

Additional Information: Experience working in cloud data warehouses such as Redshift or Synapse. Certification in any one of the following or equivalent: AWS Certified Data Analytics - Specialty; Microsoft Certified Azure Data Scientist Associate; Snowflake SnowPro Core / Data Engineer; Databricks Data Engineering.

Qualification: Experience: 3.5-5 years of experience is required. Educational Qualification: Graduation.
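The role above centres on Spark-based ETL pipelines. A minimal, purely illustrative Scala sketch of the extract-transform-load pattern it describes; the paths, column names, and object name are invented for the example and are not part of the posting.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object OrdersEtl {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("orders-etl")
      .getOrCreate()

    // Extract: read raw CSV landed by an upstream source (path is illustrative)
    val raw = spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv("s3://raw-zone/orders/")

    // Transform: basic cleansing and a derived partition column
    val cleaned = raw
      .filter(col("order_id").isNotNull)
      .withColumn("order_date", to_date(col("order_ts")))
      .dropDuplicates("order_id")

    // Load: write partitioned Parquet for downstream analytics
    cleaned.write
      .mode("overwrite")
      .partitionBy("order_date")
      .parquet("s3://curated-zone/orders/")

    spark.stop()
  }
}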

Posted 6 days ago

Apply

2.0 - 3.0 years

5 - 9 Lacs

Kochi

Work from Office

Naukri logo

Job Title - Management Level: Location: Kochi, Coimbatore, Trivandrum. Must-have skills: Python/Scala, PySpark/PyTorch. Good-to-have skills: Redshift.

Job Summary: You'll capture user requirements and translate them into business and digitally enabled solutions across a range of industries. Your responsibilities will include:

Roles and Responsibilities: Designing, developing, optimizing, and maintaining data pipelines that adhere to ETL principles and business goals. Solving complex data problems to deliver insights that help the business achieve its goals. Sourcing data (structured and unstructured) from various touchpoints and organizing it into an analyzable format. Creating data products for analytics team members to improve productivity. Calling AI services such as vision and translation to generate outcomes that can be used in further steps along the pipeline. Fostering a culture of sharing, re-use, design, and operational efficiency of data and analytical solutions. Preparing data to create a unified database and building tracking solutions that ensure data quality. Creating production-grade analytical assets deployed using the guiding principles of CI/CD.

Professional and Technical Skills: Expert in at least two of Python, Scala, PySpark, PyTorch, and JavaScript. Extensive experience in data analysis (big data, Apache Spark environments), data libraries (e.g. Pandas, SciPy, TensorFlow, Keras), and SQL, with 2-3 years of hands-on experience on these technologies. Experience in a BI tool such as Tableau, Power BI, or Looker. Good working knowledge of key concepts in data analytics, such as dimensional modeling, ETL, reporting/dashboarding, data governance, dealing with structured and unstructured data, and corresponding infrastructure needs. Worked extensively in Microsoft Azure (ADF, Function Apps, ADLS, Azure SQL), AWS (Lambda, Glue, S3), Databricks analytical platforms/tools, and the Snowflake cloud data warehouse.

Additional Information: Experience working in cloud data warehouses such as Redshift or Synapse. Certification in any one of the following or equivalent: AWS Certified Data Analytics - Specialty; Microsoft Certified Azure Data Scientist Associate; Snowflake SnowPro Core / Data Engineer; Databricks Data Engineering.

Qualification: Experience: 3.5-5 years of experience is required. Educational Qualification: Graduation.

Posted 6 days ago

Apply

5.0 - 10.0 years

8 - 12 Lacs

Kochi

Work from Office

Naukri logo

Job Title - Management Level: Location: Kochi, Coimbatore, Trivandrum. Must-have skills: Big Data, Python or R. Good-to-have skills: Scala, SQL.

Job Summary: A Data Scientist is expected to be hands-on and deliver end to end on projects undertaken in the Analytics space. They must have a proven ability to drive business results with their data-based insights, and must be comfortable working with a wide range of stakeholders and functional teams. The right candidate will have a passion for discovering solutions hidden in large data sets and for working with stakeholders to improve business outcomes.

Roles and Responsibilities: Identify valuable data sources and collection processes. Supervise preprocessing of structured and unstructured data. Analyze large amounts of information to discover trends and patterns for the insurance industry. Build predictive models and machine-learning algorithms. Combine models through ensemble modeling. Present information using data visualization techniques. Collaborate with engineering and product development teams. Hands-on knowledge of implementing various AI algorithms and their best-fit scenarios. Has worked on Generative AI based implementations.

Professional and Technical Skills: 3.5-5 years' experience in Analytics systems/program delivery, including implementation experience on at least 2 Big Data or Advanced Analytics projects. Experience using statistical computer languages (R, Python, SQL, PySpark, etc.) to manipulate data and draw insights from large data sets; familiarity with Scala, Java or C++. Knowledge of a variety of machine learning techniques (clustering, decision tree learning, artificial neural networks, etc.) and their real-world advantages/drawbacks. Knowledge of advanced statistical techniques and concepts (regression, properties of distributions, statistical tests and proper usage, etc.) and experience with applications. Hands-on experience in an Azure/AWS analytics platform (3+ years). Experience using Databricks or similar analytical applications in AWS/Azure. Experience using business intelligence tools (e.g. Tableau) and data frameworks (e.g. Hadoop). Strong mathematical skills (e.g. statistics, algebra). Excellent communication and presentation skills. Deploying data pipelines in production based on Continuous Delivery practices.

Additional Information: Multi-industry domain experience. Expert in Python, Scala, SQL. Knowledge of Tableau/Power BI or similar self-service visualization tools. Interpersonal and team skills should be top notch. Leadership experience is nice to have.

Qualification: Experience: 3.5-5 years of experience is required. Educational Qualification: Graduation.
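The data scientist role above mentions building predictive models and combining them through ensemble modeling. A minimal sketch of one such ensemble (gradient-boosted trees) using Spark MLlib from Scala; the input path, feature columns, and label column are hypothetical.

import org.apache.spark.ml.Pipeline
import org.apache.spark.ml.classification.GBTClassifier
import org.apache.spark.ml.evaluation.BinaryClassificationEvaluator
import org.apache.spark.ml.feature.VectorAssembler
import org.apache.spark.sql.SparkSession

object ChurnModel {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("churn-gbt").getOrCreate()

    // Hypothetical training table with numeric features and a 0/1 label column
    val data = spark.read.parquet("s3://curated-zone/policy_features/")

    val assembler = new VectorAssembler()
      .setInputCols(Array("premium", "claims_count", "tenure_months"))
      .setOutputCol("features")

    // Gradient-boosted trees: one common ensemble technique
    val gbt = new GBTClassifier()
      .setLabelCol("churned")
      .setFeaturesCol("features")
      .setMaxIter(50)

    val Array(train, test) = data.randomSplit(Array(0.8, 0.2), seed = 42)
    val model = new Pipeline().setStages(Array(assembler, gbt)).fit(train)

    // Evaluate on the held-out split (area under ROC by default)
    val auc = new BinaryClassificationEvaluator()
      .setLabelCol("churned")
      .evaluate(model.transform(test))
    println(s"Test AUC: $auc")

    spark.stop()
  }
}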

Posted 6 days ago

Apply

3.0 - 4.0 years

5 - 9 Lacs

Kochi

Work from Office

Naukri logo

Job Title - Management Level: Location: Kochi, Coimbatore, Trivandrum. Must-have skills: Python, PySpark. Good-to-have skills: Redshift.

Job Summary: We are seeking a highly skilled and experienced Senior Data Engineer to join our growing Data and Analytics team. The ideal candidate will have deep expertise in Databricks and cloud data warehousing, with a proven track record of designing and building scalable data pipelines, optimizing data architectures, and enabling robust analytics capabilities. This role involves working collaboratively with cross-functional teams to ensure the organization leverages data as a strategic asset. Your responsibilities will include:

Roles & Responsibilities: Design, build, and maintain scalable data pipelines and ETL processes using Databricks and other modern tools. Architect, implement, and manage cloud-based data warehousing solutions on Databricks (Lakehouse architecture). Develop and maintain optimized data lake architectures to support advanced analytics and machine learning use cases. Collaborate with stakeholders to gather requirements, design solutions, and ensure high-quality data delivery. Optimize data pipelines for performance and cost efficiency. Implement and enforce best practices for data governance, access control, security, and compliance in the cloud. Monitor and troubleshoot data pipelines to ensure reliability and accuracy. Lead and mentor junior engineers, fostering a culture of continuous learning and innovation. Excellent communication skills. Ability to work independently as well as with clients based in Western Europe.

Professional & Technical Skills: Designing, developing, optimizing, and maintaining data pipelines that adhere to ETL principles and business goals. Solving complex data problems to deliver insights that help the business achieve its goals. Sourcing data (structured and unstructured) from various touchpoints and organizing it into an analyzable format. Creating data products for analytics team members to improve productivity. Calling AI services such as vision and translation to generate outcomes that can be used in further steps along the pipeline. Fostering a culture of sharing, re-use, design, and operational efficiency of data and analytical solutions. Preparing data to create a unified database and building tracking solutions that ensure data quality. Creating production-grade analytical assets deployed using the guiding principles of CI/CD. Expert in at least two of Python, Scala, PySpark, PyTorch, and JavaScript. Extensive experience in data analysis (big data, Apache Spark environments), data libraries (e.g. Pandas, SciPy, TensorFlow, Keras), and SQL, with 3-4 years of hands-on experience on these technologies. Experience in a BI tool such as Tableau, Power BI, or Looker. Good working knowledge of key concepts in data analytics, such as dimensional modeling, ETL, reporting/dashboarding, data governance, dealing with structured and unstructured data, and corresponding infrastructure needs. Worked extensively in Microsoft Azure (ADF, Function Apps, ADLS, Azure SQL), AWS (Lambda, Glue, S3), Databricks analytical platforms/tools, and the Snowflake cloud data warehouse.

Additional Information: Experience working in cloud data warehouses such as Redshift or Synapse. Certification in any one of the following or equivalent: AWS Certified Data Analytics - Specialty; Microsoft Certified Azure Data Scientist Associate; Snowflake SnowPro Core / Data Engineer; Databricks Data Engineering.

Qualification: Experience: 5-8 years of experience is required. Educational Qualification: Graduation.
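The posting above highlights Databricks and the Lakehouse architecture. As an illustrative sketch only (the table paths, column names, and the bronze/silver layering are assumptions, not taken from the posting, and the Delta format requires Databricks or the delta-spark library), a simple bronze-to-silver refresh in Scala might look like this:

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object SilverRefresh {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("silver-refresh").getOrCreate()

    // Read the raw ("bronze") Delta table
    val bronze = spark.read.format("delta").load("/mnt/lake/bronze/events")

    // Apply basic quality filters and a derived column
    val silver = bronze
      .filter(col("event_id").isNotNull)
      .withColumn("ingest_date", to_date(col("ingest_ts")))

    // Overwrite the cleansed ("silver") Delta table
    silver.write
      .format("delta")
      .mode("overwrite")
      .option("overwriteSchema", "true")
      .save("/mnt/lake/silver/events")

    spark.stop()
  }
}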

Posted 6 days ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Naukri logo

Project Role: Data Engineer. Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must-have skills: Neo4j, Stardog. Good-to-have skills: Java. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years full time education.

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand their data needs and provide effective solutions, ensuring that the data infrastructure is robust and scalable to meet the demands of the organization.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge in data engineering.
- Continuously evaluate and improve data processes to enhance efficiency and effectiveness.

Professional & Technical Skills:
- Must-have skills: Proficiency in Neo4j.
- Good-to-have skills: Experience with Java.
- Strong understanding of data modeling and graph database concepts.
- Experience with data integration tools and ETL processes.
- Familiarity with data quality frameworks and best practices.
- Proficient in programming languages such as Python or Scala for data manipulation.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Neo4j.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Qualification: 15 years full time education.
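The Neo4j role above centres on graph data modeling and moving data into a graph database. A minimal sketch, assuming the official Neo4j Java driver (4.x) used from Scala against a local bolt endpoint; the node labels, relationship type, and credentials are invented for illustration.

import org.neo4j.driver.{AuthTokens, GraphDatabase, Values}

object GraphLoad {
  def main(args: Array[String]): Unit = {
    val driver = GraphDatabase.driver(
      "bolt://localhost:7687", AuthTokens.basic("neo4j", "password"))
    val session = driver.session()
    try {
      // Upsert two nodes and a relationship with a parameterized Cypher statement
      session.run(
        "MERGE (a:Person {name: $from}) " +
        "MERGE (b:Person {name: $to}) " +
        "MERGE (a)-[:KNOWS]->(b)",
        Values.parameters("from", "Alice", "to", "Bob"))

      // Read back the neighbours of one node
      val result = session.run(
        "MATCH (:Person {name: $from})-[:KNOWS]->(p) RETURN p.name AS name",
        Values.parameters("from", "Alice"))
      while (result.hasNext) println(result.next().get("name").asString())
    } finally {
      session.close()
      driver.close()
    }
  }
}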

Posted 6 days ago

Apply

3.0 - 8.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Naukri logo

Project Role: Application Lead. Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact. Must-have skills: PySpark. Good-to-have skills: NA. Minimum 3 year(s) of experience is required. Educational Qualification: 15 years full time education.

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for overseeing the application development process and ensuring successful project delivery.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Lead the design and development of applications.
- Act as the primary point of contact for application-related queries.
- Collaborate with team members to ensure project success.
- Provide technical guidance and mentorship to junior team members.
- Stay updated on industry trends and best practices.

Professional & Technical Skills:
- Must-have skills: Proficiency in PySpark.
- Strong understanding of big data processing and analytics.
- Experience with data processing frameworks like Apache Spark.
- Hands-on experience in building scalable data pipelines.
- Knowledge of cloud platforms for data processing.
- Experience in performance tuning and optimization.

Additional Information:
- The candidate should have a minimum of 3 years of experience in PySpark.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.

Qualification: 15 years full time education.

Posted 6 days ago

Apply

15.0 - 25.0 years

10 - 14 Lacs

Gurugram

Work from Office

Naukri logo

Project Role: Application Lead. Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact. Must-have skills: Apache Spark. Good-to-have skills: PySpark, Python (Programming Language). Minimum 5 year(s) of experience is required. Educational Qualification: 15 years full time education.

Summary: As an Application Lead, you will be responsible for designing, building, and configuring applications. Acting as the primary point of contact, you will lead the development team, oversee the delivery process, and ensure successful project execution.

Roles & Responsibilities:
- Act as a Subject Matter Expert (SME) in application development.
- Lead and manage a development team to achieve performance goals.
- Make key technical and architectural decisions.
- Collaborate with cross-functional teams and stakeholders.
- Provide technical solutions to complex problems across multiple teams.
- Oversee the complete application development lifecycle.
- Gather and analyze requirements in coordination with stakeholders.
- Ensure timely and high-quality delivery of projects.

Professional & Technical Skills:
- Must-have skills: Proficiency in Apache Spark.
- Strong understanding of big data processing.
- Experience with data streaming technologies.
- Hands-on experience in building scalable, high-performance applications.
- Knowledge of cloud computing platforms.
- Must-have additional skills: PySpark, Spark SQL / SQL, AWS.

Additional Information:
- This is a full-time, on-site role based in Gurugram.
- Candidates must have a minimum of 5 years of hands-on experience with Apache Spark.
- A minimum of 15 years of full-time formal education is mandatory.

Qualification: 15 years full time education.
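The lead role above lists Apache Spark alongside data streaming technologies and Spark SQL. Purely as a sketch (the source, window length, and sink are illustrative choices, not requirements from the posting), a windowed streaming aggregation with Structured Streaming in Scala:

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object StreamingWindowCounts {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("streaming-demo").getOrCreate()
    import spark.implicits._

    // The built-in "rate" source generates (timestamp, value) rows,
    // so the sketch runs without any external system
    val events = spark.readStream
      .format("rate")
      .option("rowsPerSecond", "10")
      .load()

    // Windowed count: a typical streaming aggregation pattern
    val counts = events
      .withWatermark("timestamp", "30 seconds")
      .groupBy(window($"timestamp", "10 seconds"))
      .count()

    val query = counts.writeStream
      .outputMode("update")
      .format("console")
      .start()

    query.awaitTermination()
  }
}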

Posted 6 days ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Pune

Work from Office

Naukri logo

Project Role: Data Engineer. Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must-have skills: Databricks Unified Data Analytics Platform. Good-to-have skills: NA. Minimum 3 year(s) of experience is required. Educational Qualification: 15 years full time education.

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and contribute to the overall data strategy of the organization, ensuring that data solutions are efficient, scalable, and aligned with business objectives.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with stakeholders to gather and analyze data requirements.
- Design and implement robust data pipelines to support data processing and analytics.

Professional & Technical Skills:
- Must-have skills: Proficiency in the Databricks Unified Data Analytics Platform.
- Strong understanding of data modeling and database design principles.
- Experience with ETL tools and data integration techniques.
- Familiarity with cloud platforms and services related to data storage and processing.
- Knowledge of programming languages such as Python or Scala for data manipulation.

Additional Information:
- The candidate should have a minimum of 3 years of experience in the Databricks Unified Data Analytics Platform.
- This position is based at our Pune office.
- A 15 years full time education is required.

Qualification: 15 years full time education.

Posted 6 days ago

Apply

12.0 - 15.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Naukri logo

Project Role: Data Engineer. Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must-have skills: Data Engineering. Good-to-have skills: Java Enterprise Edition. Minimum 12 year(s) of experience is required. Educational Qualification: 15 years full time education.

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand their data needs and provide effective solutions, ensuring that the data infrastructure is robust and scalable to meet the demands of the organization.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Mentor junior team members to enhance their skills and knowledge in data engineering.
- Continuously evaluate and improve data processes to enhance efficiency and effectiveness.

Professional & Technical Skills:
- Must-have skills: Proficiency in Data Engineering.
- Strong understanding of data modeling and database design principles.
- Experience with ETL tools and frameworks.
- Familiarity with cloud platforms such as AWS, Azure, or Google Cloud.
- Knowledge of data warehousing concepts and technologies.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Data Engineering.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Qualification: 15 years full time education.

Posted 6 days ago

Apply

2.0 - 3.0 years

4 - 8 Lacs

Kochi

Work from Office

Naukri logo

Job Title - Data Engineer Sr. Analyst, ACS SONG. Management Level: Level 10 - Sr. Analyst. Location: Kochi, Coimbatore, Trivandrum. Must-have skills: Python/Scala, PySpark/PyTorch. Good-to-have skills: Redshift.

Job Summary: You'll capture user requirements and translate them into business and digitally enabled solutions across a range of industries. Your responsibilities will include:

Roles and Responsibilities: Designing, developing, optimizing, and maintaining data pipelines that adhere to ETL principles and business goals. Solving complex data problems to deliver insights that help the business achieve its goals. Sourcing data (structured and unstructured) from various touchpoints and organizing it into an analyzable format. Creating data products for analytics team members to improve productivity. Calling AI services such as vision and translation to generate outcomes that can be used in further steps along the pipeline. Fostering a culture of sharing, re-use, design, and operational efficiency of data and analytical solutions. Preparing data to create a unified database and building tracking solutions that ensure data quality. Creating production-grade analytical assets deployed using the guiding principles of CI/CD.

Professional and Technical Skills: Expert in at least two of Python, Scala, PySpark, PyTorch, and JavaScript. Extensive experience in data analysis (big data, Apache Spark environments), data libraries (e.g. Pandas, SciPy, TensorFlow, Keras), and SQL, with 2-3 years of hands-on experience on these technologies. Experience in a BI tool such as Tableau, Power BI, or Looker. Good working knowledge of key concepts in data analytics, such as dimensional modeling, ETL, reporting/dashboarding, data governance, dealing with structured and unstructured data, and corresponding infrastructure needs. Worked extensively in Microsoft Azure (ADF, Function Apps, ADLS, Azure SQL), AWS (Lambda, Glue, S3), Databricks analytical platforms/tools, and the Snowflake cloud data warehouse.

Additional Information: Experience working in cloud data warehouses such as Redshift or Synapse. Certification in any one of the following or equivalent: AWS Certified Data Analytics - Specialty; Microsoft Certified Azure Data Scientist Associate; Snowflake SnowPro Core / Data Engineer; Databricks Data Engineering.

About Our Company | Accenture

Qualification: Experience: 3.5-5 years of experience is required. Educational Qualification: Graduation.

Posted 6 days ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

Pune

Work from Office

Naukri logo

Project Role: Data Engineer. Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must-have skills: Neo4j, Stardog. Good-to-have skills: Java. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years full time education.

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand their data needs and provide effective solutions, ensuring that the data infrastructure is robust and scalable to meet the demands of the organization.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge in data engineering.
- Continuously evaluate and improve data processes to enhance efficiency and effectiveness.

Professional & Technical Skills:
- Must-have skills: Proficiency in Neo4j.
- Good-to-have skills: Experience with Java.
- Strong understanding of data modeling and graph database concepts.
- Experience with data integration tools and ETL processes.
- Familiarity with data quality frameworks and best practices.
- Proficient in programming languages such as Python or Scala for data manipulation.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Neo4j.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Qualification: 15 years full time education.

Posted 6 days ago

Apply

7.0 years

0 Lacs

Greater Kolkata Area

On-site

Linkedin logo

About Agoda: Agoda is an online travel booking platform for accommodations, flights, and more. We build and deploy cutting-edge technology that connects travelers with a global network of 4.7M hotels and holiday properties worldwide, plus flights, activities, and more. Based in Asia and part of Booking Holdings, our 7,100+ employees representing 95+ nationalities in 27 markets foster a work environment rich in diversity, creativity, and collaboration. We innovate through a culture of experimentation and ownership, enhancing the ability for our customers to experience the world.

Our Purpose – Bridging the World Through Travel: We believe travel allows people to enjoy, learn and experience more of the amazing world we live in. It brings individuals and cultures closer together, fostering empathy, understanding and happiness. We are a skillful, driven and diverse team from across the globe, united by a passion to make an impact. Harnessing our innovative technologies and strong partnerships, we aim to make travel easy and rewarding for everyone.

Get to Know Our Team: Fintech is a rapidly expanding industry with endless opportunities for growth and innovation. At Agoda, we are proud to be at the forefront of this exciting field, working closely with our finance business team and product owners to reduce risk, increase efficiency, and seize new market opportunities. Our fintech projects are diverse and varied, ranging from traditional finance to cutting-edge customer-facing solutions. Whether using big data technologies for reconciliation, expanding and enhancing payment options for our customers, or developing lightning-fast tax calculation systems, we are constantly pushing the boundaries of what's possible in fintech. With a talented team of data and backend engineers, we are well-equipped to tackle any challenge that comes our way. Agoda is the perfect place for you if you're passionate about fintech and looking to make a real impact in this dynamic industry.

The Opportunity: We are seeking highly skilled engineers with a range of experience in fintech to join our team. Whether you are a seasoned expert or just starting out in the field, we welcome your application. We are looking for intelligent and agile engineers with strong attention to detail and the ability to work on both back-end and data engineering tasks. If you have a passion for fintech technology and are excited about the opportunity to build and innovate, we would love to hear from you. We value a wide range of experience and backgrounds and encourage all qualified candidates to apply.
In this Role, You Will Get To:
- Think about and own the full life cycle of our products, not just a single piece of code – from business requirements, technology selection, coding standards, agile development, unit and application testing, to CI/CD and proper monitoring.
- Design, develop and maintain platforms and data pipelines across fintech.
- Improve scalability, stability, and efficiency of our existing systems.
- Write great code and help others write great code – mentor people in your team and the wider organisation.
- Collaborate with other teams and departments.
- Help us hire extraordinary talent such as yourself!

What You'll Need To Succeed:
- Minimum 7 years of experience under your belt developing performance-critical applications that run in a production environment using Scala, Java, C# or Kotlin.
- Experience with data tooling: Spark, Kafka, workflow orchestration tools.
- Advanced working SQL knowledge and experience working with relational databases and query authoring (SQL), as well as working familiarity with a variety of databases.
- Experience building and optimizing 'big data' data pipelines, architectures and data sets.
- In-depth knowledge of modeling and design of DB schemas for read and write performance.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Build processes supporting data transformation, data structures, metadata, dependency and workload management.
- Experience supporting and working with Scrum in agile cross-functional teams.
- Excellent verbal and written English communication skills.

It's Great If You Have:
- Deep experience with Spark-based distributed data pipelines; able to deep dive into and solve challenges in Spark query plans and optimize code (see the sketch after this listing).
- Experience in Spark Streaming is a plus.
- Strong experience building finance stack applications including ledgers, revenue recognition, monetary transfers and reconciliations, financial accounting platforms, etc.
- Experience handling financial data risk management and data governance projects.
- Experience supporting internal and external financial audits.
- Experience with data profiling, data lineage, and data cataloging to enhance data governance and documentation.
- Experience in leading projects, initiatives and/or teams, with full ownership of the systems involved.
- Experience with building data pipelines to integrate third parties is a plus.

Equal Opportunity Employer: At Agoda, we pride ourselves on being a company represented by people of all different backgrounds and orientations. We prioritize attracting diverse talent and cultivating an inclusive environment that encourages collaboration and innovation. Employment at Agoda is based solely on a person's merit and qualifications. We are committed to providing equal employment opportunity regardless of sex, age, race, color, national origin, religion, marital status, pregnancy, sexual orientation, gender identity, disability, citizenship, veteran or military status, and other legally protected characteristics. We will keep your application on file so that we can consider you for future vacancies, and you can always ask to have your details removed from the file. For more details please read our privacy policy.

Disclaimer: We do not accept any terms or conditions, nor do we recognize any agency's representation of a candidate, from unsolicited third-party or agency submissions. If we receive unsolicited or speculative CVs, we reserve the right to contact and hire the candidate directly without any obligation to pay a recruitment fee.
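The "great if you have" list above calls out the ability to dig into Spark query plans and optimize code. As a minimal, purely illustrative Scala sketch (the table paths, join key, and columns are hypothetical, and explain("formatted") requires Spark 3.x), inspecting a plan looks like this:

import org.apache.spark.sql.SparkSession

object PlanInspection {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("plan-inspection").getOrCreate()

    val payments = spark.read.parquet("s3://finance/payments/")
    val bookings = spark.read.parquet("s3://finance/bookings/")

    val joined = payments.join(bookings, Seq("booking_id"))
      .groupBy("currency")
      .sum("amount")

    // Print the parsed, analyzed, optimized, and physical plans; typical things
    // to look for are SortMergeJoin vs BroadcastHashJoin, the number of
    // exchanges (shuffles), and whether column/partition pruning happened
    joined.explain("formatted")
  }
}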

Posted 6 days ago

Apply

7.0 years

0 Lacs

New Delhi, Delhi, India

On-site

Linkedin logo

About Agoda: Agoda is an online travel booking platform for accommodations, flights, and more. We build and deploy cutting-edge technology that connects travelers with a global network of 4.7M hotels and holiday properties worldwide, plus flights, activities, and more. Based in Asia and part of Booking Holdings, our 7,100+ employees representing 95+ nationalities in 27 markets foster a work environment rich in diversity, creativity, and collaboration. We innovate through a culture of experimentation and ownership, enhancing the ability for our customers to experience the world.

Our Purpose – Bridging the World Through Travel: We believe travel allows people to enjoy, learn and experience more of the amazing world we live in. It brings individuals and cultures closer together, fostering empathy, understanding and happiness. We are a skillful, driven and diverse team from across the globe, united by a passion to make an impact. Harnessing our innovative technologies and strong partnerships, we aim to make travel easy and rewarding for everyone.

Get to Know Our Team: Fintech is a rapidly expanding industry with endless opportunities for growth and innovation. At Agoda, we are proud to be at the forefront of this exciting field, working closely with our finance business team and product owners to reduce risk, increase efficiency, and seize new market opportunities. Our fintech projects are diverse and varied, ranging from traditional finance to cutting-edge customer-facing solutions. Whether using big data technologies for reconciliation, expanding and enhancing payment options for our customers, or developing lightning-fast tax calculation systems, we are constantly pushing the boundaries of what's possible in fintech. With a talented team of data and backend engineers, we are well-equipped to tackle any challenge that comes our way. Agoda is the perfect place for you if you're passionate about fintech and looking to make a real impact in this dynamic industry.

The Opportunity: We are seeking highly skilled engineers with a range of experience in fintech to join our team. Whether you are a seasoned expert or just starting out in the field, we welcome your application. We are looking for intelligent and agile engineers with strong attention to detail and the ability to work on both back-end and data engineering tasks. If you have a passion for fintech technology and are excited about the opportunity to build and innovate, we would love to hear from you. We value a wide range of experience and backgrounds and encourage all qualified candidates to apply.
In this Role, You Will Get To:
- Think about and own the full life cycle of our products, not just a single piece of code – from business requirements, technology selection, coding standards, agile development, unit and application testing, to CI/CD and proper monitoring.
- Design, develop and maintain platforms and data pipelines across fintech.
- Improve scalability, stability, and efficiency of our existing systems.
- Write great code and help others write great code – mentor people in your team and the wider organisation.
- Collaborate with other teams and departments.
- Help us hire extraordinary talent such as yourself!

What You'll Need To Succeed:
- Minimum 7 years of experience under your belt developing performance-critical applications that run in a production environment using Scala, Java, C# or Kotlin.
- Experience with data tooling: Spark, Kafka, workflow orchestration tools.
- Advanced working SQL knowledge and experience working with relational databases and query authoring (SQL), as well as working familiarity with a variety of databases.
- Experience building and optimizing 'big data' data pipelines, architectures and data sets.
- In-depth knowledge of modeling and design of DB schemas for read and write performance.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Build processes supporting data transformation, data structures, metadata, dependency and workload management.
- Experience supporting and working with Scrum in agile cross-functional teams.
- Excellent verbal and written English communication skills.

It's Great If You Have:
- Deep experience with Spark-based distributed data pipelines; able to deep dive into and solve challenges in Spark query plans and optimize code.
- Experience in Spark Streaming is a plus.
- Strong experience building finance stack applications including ledgers, revenue recognition, monetary transfers and reconciliations, financial accounting platforms, etc.
- Experience handling financial data risk management and data governance projects.
- Experience supporting internal and external financial audits.
- Experience with data profiling, data lineage, and data cataloging to enhance data governance and documentation.
- Experience in leading projects, initiatives and/or teams, with full ownership of the systems involved.
- Experience with building data pipelines to integrate third parties is a plus.

Equal Opportunity Employer: At Agoda, we pride ourselves on being a company represented by people of all different backgrounds and orientations. We prioritize attracting diverse talent and cultivating an inclusive environment that encourages collaboration and innovation. Employment at Agoda is based solely on a person's merit and qualifications. We are committed to providing equal employment opportunity regardless of sex, age, race, color, national origin, religion, marital status, pregnancy, sexual orientation, gender identity, disability, citizenship, veteran or military status, and other legally protected characteristics. We will keep your application on file so that we can consider you for future vacancies, and you can always ask to have your details removed from the file. For more details please read our privacy policy.

Disclaimer: We do not accept any terms or conditions, nor do we recognize any agency's representation of a candidate, from unsolicited third-party or agency submissions. If we receive unsolicited or speculative CVs, we reserve the right to contact and hire the candidate directly without any obligation to pay a recruitment fee.

Posted 6 days ago

Apply

1.0 - 3.0 years

20 - 30 Lacs

Bengaluru

Work from Office

Naukri logo

Skills Required: Kafka, Spark Streaming. Proficiency in one of the programming languages, preferably Java, Scala or Python. Education/Qualification: Bachelor's degree in Computer Science, Engineering, Technology or a related field. Desirable Skills: Kafka, Spark Streaming. Proficiency in one of the programming languages, preferably Java, Scala or Python.
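The listing above asks for Kafka plus Spark Streaming. A minimal Scala sketch of reading a Kafka topic with Structured Streaming; the broker address and topic name are placeholders, and the spark-sql-kafka-0-10 connector is assumed to be on the classpath.

import org.apache.spark.sql.SparkSession

object KafkaIngest {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("kafka-ingest").getOrCreate()

    // Subscribe to a Kafka topic as a streaming source
    val stream = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "localhost:9092")
      .option("subscribe", "clickstream")
      .option("startingOffsets", "latest")
      .load()

    // Kafka records arrive as binary key/value; cast the payload to a string
    val messages = stream.selectExpr("CAST(value AS STRING) AS payload", "timestamp")

    // Write to the console sink for demonstration purposes
    val query = messages.writeStream
      .format("console")
      .outputMode("append")
      .option("truncate", "false")
      .start()

    query.awaitTermination()
  }
}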

Posted 6 days ago

Apply

8.0 - 13.0 years

25 - 30 Lacs

Mumbai

Work from Office

Naukri logo

Entity: Accenture Strategy & Consulting. Team: Global Network - Data & AI. Practice: Banking & Financial Services Analytics. Title: Consultant, Level 9. Job location: Bengaluru, Gurugram, Mumbai.

About S&C - Global Network Data & AI: Accenture's Global Network - Data & AI practice helps our clients grow their business in entirely new ways. Analytics enables our clients to achieve high performance through insights from data - insights that inform better decisions and strengthen customer relationships. From strategy to execution, Accenture works with organizations to develop analytic capabilities - from accessing and reporting on data to predictive modelling - to outperform the competition.

Role Overview: As an experienced financial services professional, your market-leading industry, management and technology expertise will provide critical solutions that answer unparalleled strategic, operational, technology and sourcing demands. And, in doing so, you'll improve the future of the global financial services industry. Accenture serves the world's leading financial services organisations across three industry groups. Central Banks: monetary authorities and reserve banks across the globe. Banking: retail and commercial banks and diversified financial institutions. Capital Markets: investment banks, broker/dealers, asset & wealth management firms, depositories, exchanges, clearing & settlement organisations.

What's in it for you: You'll be part of a diverse, vibrant, global Accenture data science community, continually pushing the boundaries of analytical capabilities. Get to work with top financial clients globally. Build new skills, grow existing skills, and develop new areas of expertise within functional and technical areas of the business. Get access to resources that will allow you to leverage the latest technologies and bring innovation to life with the world's most recognizable companies.

What you would do in this role: Work closely with clients to understand their business goals and challenges. Develop data strategies aligned with business objectives and industry best practices. Collaborate with business stakeholders to understand analytical requirements and deliver actionable insights. Identify key business questions through data collection and ETL, and by performing analyses using a wide range of statistical, machine learning, and applied mathematical techniques to deliver insights. Develop and implement predictive models to assess and manage financial risks, leveraging advanced data science techniques. Build predictive models to assess and maximize Customer Lifetime Value, enabling targeted retention and acquisition strategies. Help in responding to RFPs and designing POVs. Work across client teams to develop and architect Generative AI solutions using ML and GenAI. Implement data security measures to safeguard sensitive financial information and ensure adherence to industry standards.

Qualification - Who we are looking for: 3-8+ years of experience in Data Science, preferably with financial services clients. Bachelor's or master's degree in a relevant field (Computer Science / Statistics / Data Science / Econometrics / Economics / Engineering) from a reputed institute, or an MBA from a Tier 1 college. Proficiency in programming languages such as Python, R, Scala, PySpark. Strong analytical skills to derive meaningful insights from complex data sets. Experience with data visualization tools (Tableau, Power BI, etc.). Knowledge of Data Science and Machine Learning concepts and algorithms such as clustering, regression, classification, forecasting, hyperparameter optimization, NLP, computer vision, and speech processing; understanding of the ML model lifecycle would be an asset. Experience in supervisory analytics (such as network analytics, IFRS9 / Basel, risk analytics, balance of payments analytics, licensing analytics, AML, etc.) is a plus. Strong domain experience in one of the following: central banks (monetary / regulatory / compliance / Basel), commercial banking, asset & wealth management. Proven experience in one of data engineering, data governance, or data science roles. Experience in Generative AI or central / supervisory banking is a plus. Experience with any cloud technologies (MS Azure, GCP, AWS). Familiarity with deep learning concepts and tools (H2O, TensorFlow, PyTorch, Keras, Theano, etc.) is a plus. Excellent communication and client-facing skills. Ability to work independently and collaboratively in a team. Project management skills and the ability to manage multiple tasks concurrently. Strong written and oral communication skills.

Accenture is an equal opportunities employer and welcomes applications from all sections of society and does not discriminate on grounds of race, religion or belief, ethnic or national origin, disability, age, citizenship, marital, domestic or civil partnership status, sexual orientation, gender identity, or any other basis as protected by applicable law.

Posted 6 days ago

Apply

2.0 - 6.0 years

40 - 45 Lacs

Bengaluru

Work from Office

Naukri logo

Management Level: Ind & Func AI Decision Science Consultant. Location: Gurgaon, Mumbai, Bangalore.
Must-have skills: Risk Analytics; Model Development, Validation, and Auditing; Performance Evaluation, Monitoring, Governance; Statistical Techniques: Linear Regression, Logistic Regression, GLM, GBM, XGBoost, Time Series (ARMA/ARIMA); Programming Languages: SAS, R, Python, Spark, Scala; Tools: Tableau, Power BI; Regulatory Knowledge: Basel/CCAR/DFAST/CECL/IFRS9; Risk Reporting and Dashboard Solutions.
Good-to-have skills: Advanced Data Science Techniques, AML, Operational Risk Modelling, Cloud Platform Experience (AWS/Azure/GCP), Machine Learning Interpretability and Bias Algorithms.

Job Summary: We are seeking a highly skilled Ind & Func AI Decision Science Consultant to join the Accenture Strategy & Consulting team in the Global Network Data & AI practice. You will be responsible for risk model development, validation, and auditing activities, ensuring performance evaluation, monitoring, governance, and documentation. This role offers opportunities to work with top financial clients globally, utilizing cutting-edge technologies to drive business capabilities and foster innovation.

Roles & Responsibilities:
Engagement Execution: Work independently or with minimal supervision in client engagements that may involve model development, validation, governance, strategy, transformation, implementation, and end-to-end delivery of risk solutions for Accenture's clients. Ability to manage workstreams of small projects, overseeing the quality of deliverables for junior team members. Demonstrated ability to manage day-to-day interactions with client stakeholders.
Practice Enablement: Guide junior team members. Support development of the practice by driving innovations and initiatives. Develop thought leadership and disseminate information around current and emerging trends in Risk.

Professional & Technical Skills: 2-6 years of relevant Risk Analytics experience at one or more Financial Services firms or in Professional Services/Risk Advisory, with significant exposure to: Credit Risk: PD/LGD/EAD models, CCAR/DFAST loss forecasting, revenue forecasting models, IFRS9/CECL loss forecasting across retail and commercial portfolios. Credit Acquisition/Behavior: modeling, credit policies, limit management, acquisition frauds, collections agent matching/channel allocations across retail and commercial portfolios. Regulatory Capital and Economic Capital models. Liquidity Risk: liquidity models, stress testing models, Basel liquidity reporting standards. Anti-Money Laundering (AML): AML scenarios/alerts, network analysis. Operational Risk: AMA modeling, operational risk reporting. Modeling Techniques: Linear Regression, Logistic Regression, GLM, GBM, XGBoost, CatBoost, Neural Networks, Time Series (ARMA/ARIMA), ML interpretability and bias algorithms. Programming Languages & Tools: SAS, R, Python, Spark, Scala, Tableau, QlikView, Power BI, SAS VA. Strong understanding of Risk functions and their application in client discussions and project implementation.

Additional Information: Master's degree in a quantitative discipline (mathematics, statistics, economics, financial engineering, operations research) or an MBA from a top-tier university. Industry certifications: FRM, PRM, CFA preferred. Excellent communication and interpersonal skills.

About Our Company | Accenture

Qualification: Experience: Minimum 2-6 years of relevant Risk Analytics experience; exposure to Financial Services firms or Professional Services/Risk Advisory. Educational Qualification: Master's degree in a quantitative discipline (mathematics, statistics, economics, financial engineering, operations research) or MBA from top-tier universities; industry certifications such as FRM, PRM, CFA preferred.
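The modeling techniques listed above include logistic regression for credit risk (PD/LGD/EAD). As a purely illustrative Scala sketch with Spark ML (the data path, feature names, and label column are invented for the example, not taken from the posting), a baseline PD model could be fitted like this:

import org.apache.spark.ml.Pipeline
import org.apache.spark.ml.classification.LogisticRegression
import org.apache.spark.ml.feature.VectorAssembler
import org.apache.spark.sql.SparkSession

object PdModel {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("pd-logit").getOrCreate()

    // Hypothetical loan-level feature table with a 0/1 default flag
    val loans = spark.read.parquet("s3://risk/loan_features/")

    val assembler = new VectorAssembler()
      .setInputCols(Array("ltv", "dti", "utilization", "delinquency_count"))
      .setOutputCol("features")

    // Logistic regression: the classical baseline for probability of default
    val logit = new LogisticRegression()
      .setLabelCol("default_flag")
      .setFeaturesCol("features")
      .setMaxIter(100)

    val model = new Pipeline().setStages(Array(assembler, logit)).fit(loans)

    // Scored probabilities can feed monitoring, calibration, and reporting
    model.transform(loans)
      .select("default_flag", "probability")
      .show(5, truncate = false)

    spark.stop()
  }
}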

Posted 6 days ago

Apply

7.0 - 12.0 years

45 - 50 Lacs

Bengaluru

Work from Office

Naukri logo

Management Level: 07 - I&F Decision Sci Practitioner Manager. Location: Mumbai.
Must-have skills: Risk Analytics; Model Development, Validation, and Auditing; Performance Evaluation, Monitoring, Governance; Statistical Techniques: Linear Regression, Logistic Regression, GLM, GBM, XGBoost, CatBoost, Neural Networks; Programming Languages: SAS, R, Python, Spark, Scala; Tools: Tableau, QlikView, Power BI, SAS VA; Regulatory Knowledge: Basel/CCAR/DFAST/CECL/IFRS9; Risk Reporting and Dashboard Solutions.
Good-to-have skills: Advanced Data Science Techniques, AML, Operational Risk Modelling, Cloud Platform Experience (AWS/Azure/GCP), Machine Learning Interpretability and Bias Algorithms.

Job Summary: We are seeking a highly skilled I&F Decision Sci Practitioner Manager to join the Accenture Strategy & Consulting team in the Global Network Data & AI practice. You will be responsible for leading risk model development, validation, and auditing activities, ensuring performance evaluation, monitoring, governance, and documentation. This role also provides opportunities to work with top financial clients globally, utilizing cutting-edge technologies to drive business capabilities and foster innovation.

Roles & Responsibilities:
Engagement Execution: Lead the team in the development, validation, governance, strategy, transformation, implementation, and end-to-end delivery of risk solutions for clients. Manage workstreams for large and small projects, overseeing the quality of deliverables for junior team members. Develop and frame Proofs of Concept for key clients where applicable.
Practice Enablement: Mentor, guide, and counsel analysts and consultants. Support the development of the practice by driving innovations and initiatives. Support efforts of the sales team to identify and win potential opportunities by assisting with RFPs and RFIs. Assist in designing POVs and GTM collateral.

Professional & Technical Skills: 7-12 years of relevant Risk Analytics experience at one or more Financial Services firms or in Professional Services/Risk Advisory, with significant exposure to: Credit Risk: PD/LGD/EAD models, CCAR/DFAST loss forecasting, revenue forecasting models, IFRS9/CECL loss forecasting across retail and commercial portfolios. Credit Acquisition/Behavior: modeling, credit policies, limit management, acquisition frauds, collections agent matching/channel allocations across retail and commercial portfolios. Regulatory Capital and Economic Capital models. Liquidity Risk: liquidity models, stress testing models, Basel liquidity reporting standards. Anti-Money Laundering (AML): AML scenarios/alerts, network analysis. Operational Risk: AMA modeling, operational risk reporting. Modeling Techniques: Linear Regression, Logistic Regression, GLM, GBM, XGBoost, CatBoost, Neural Networks, Time Series (ARMA/ARIMA), ML interpretability and bias algorithms. Programming Languages & Tools: SAS, R, Python, Spark, Scala, Tableau, QlikView, Power BI, SAS VA. Strong understanding of Risk functions and their application in client discussions and project implementation.

Additional Information: Master's degree in a quantitative discipline (mathematics, statistics, economics, financial engineering, operations research) or an MBA from a top-tier university. Industry certifications: FRM, PRM, CFA preferred. Excellent communication and interpersonal skills.

About Our Company | Accenture

Qualification: Experience: Minimum 7-12 years of relevant Risk Analytics experience; exposure to Financial Services firms or Professional Services/Risk Advisory. Educational Qualification: Master's degree in a quantitative discipline (mathematics, statistics, economics, financial engineering, operations research) or MBA from top-tier universities; industry certifications such as FRM, PRM, CFA preferred.

Posted 6 days ago

Apply

10.0 - 15.0 years

15 - 20 Lacs

Bengaluru

Work from Office


To get the best candidate experience, please consider applying for a maximum of 3 roles within 12 months to ensure you are not duplicating efforts.
Job Category: Software Engineering
Job Details: About Salesforce.
Role Description: Salesforce has immediate opportunities for software developers who want their lines of code to have a significant and measurable positive impact for users, the company's bottom line, and the industry. You will be working with a group of world-class engineers to build the breakthrough features our customers will love, adopt, and use while keeping our trusted CRM platform stable and scalable. The software engineer role at Salesforce encompasses architecture, design, implementation, and testing to ensure we build products right and release them with high quality. We pride ourselves on writing high-quality, maintainable code that strengthens the stability of the product and makes our lives easier. We embrace the hybrid model and celebrate the individual strengths of each team member while encouraging everyone on the team to grow into the best version of themselves. We believe that autonomous teams with the freedom to make decisions will empower the individuals, the product, the company, and the customers they serve to thrive.
Your Impact: As a Lead Engineer, your responsibilities will include: Build new and exciting components in an ever-growing and evolving market technology to provide scale and efficiency. Develop high-quality, production-ready code that can be used by millions of users of our applications. Make design decisions on the basis of performance, scalability, and future expansion. Work in a Hybrid Engineering model and contribute to all phases of the SDLC including design, implementation, code reviews, automation, and testing of features. Build efficient components/algorithms in a microservice, multi-tenant SaaS cloud environment. Code review, mentoring junior engineers, and providing technical guidance to the team (depending on the seniority level).
Required Skills: Mastery of multiple programming languages and platforms; 10+ years of software development experience; deep knowledge of object-oriented programming and other languages: Java, Python, Scala, C#, Go, Node.js, and C++; strong SQL skills and experience with relational and non-relational databases (e.g., PostgreSQL, Trino, Redshift, MongoDB); experience developing SaaS applications on public cloud infrastructure (AWS/Azure/GCP); proficiency in queues, locks, scheduling, event-driven architecture, and workload distribution, along with a deep understanding of relational and non-relational databases; a deep understanding of software development best practices and demonstrated leadership skills; degree or equivalent relevant experience required. Experience will be evaluated based on the core competencies for the role (e.g., extracurricular leadership roles, military experience, volunteer roles, work experience, etc.).
BENEFITS & PERKS: Comprehensive benefits package including well-being reimbursement, generous parental leave, adoption assistance, fertility benefits, and more! World-class enablement and on-demand training with Trailhead.com. Exposure to executive thought leaders and regular 1:1 coaching with leadership. Volunteer opportunities and participation in our 1:1:1 model for giving back to the community.

Posted 6 days ago

Apply

2.0 - 7.0 years

12 - 16 Lacs

Bengaluru

Work from Office


Job Title - S&C Global Network - AI - CMT DE - Consultant
Management Level: 9 - Consultant
Location: Open
Must-have skills: Data Engineering
Good to have skills: Ability to leverage design thinking, business process optimization, and stakeholder management skills.
Job Summary: We are looking for a passionate and results-driven Data Engineer to join our growing data team. You will be responsible for designing, building, and maintaining scalable data pipelines and infrastructure that support data-driven decision-making across the organization.
Roles & Responsibilities: Design, build, and maintain robust, scalable, and efficient data pipelines (ETL/ELT). Work with structured and unstructured data across a wide variety of data sources. Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements. Optimize data systems and architecture for performance, scalability, and reliability. Monitor data quality and support initiatives to ensure clean, accurate, and consistent data. Develop and maintain data models and metadata. Implement and maintain best practices in data governance, security, and compliance.
Professional & Technical Skills: 2+ years in data engineering or related fields. Proficiency in SQL and experience with relational databases (e.g., PostgreSQL, MySQL). Strong programming skills in Python, Scala, or Java. Experience with big data technologies such as Spark, Hadoop, or Hive. Familiarity with cloud platforms like AWS, Azure, or GCP, especially services like S3, Redshift, BigQuery, or Azure Data Lake. Experience with orchestration tools like Airflow, Luigi, or similar. Solid understanding of data warehousing concepts and data modeling techniques. Good problem-solving skills and attention to detail. Experience with modern data stack tools like dbt, Snowflake, or Databricks. Knowledge of CI/CD pipelines and version control (e.g., Git). Exposure to containerization (Docker, Kubernetes) and infrastructure as code (Terraform, CloudFormation).
Additional Information: The ideal candidate will possess a strong educational background in a quantitative discipline and experience working with Hi-Tech clients. This position is based in Bengaluru (preferred) or other Accenture AI locations. About Our Company | Accenture
Qualification Experience: 4+ years
Educational Qualification: B.Tech/BE

Posted 6 days ago

Apply

12.0 - 14.0 years

12 - 17 Lacs

Bengaluru

Work from Office


Job Title - Ind & Func AI Decision Science Manager
Management Level: 7 - Manager
Location: Bengaluru, BDC7C
Must-have skills: Risk Analytics
Good to have skills: Experience in financial modeling, valuation techniques, and deal structuring.
Job Summary: This role involves driving strategic initiatives, managing business transformations, and leveraging industry expertise to create value-driven solutions.
Roles & Responsibilities: Provide strategic advisory services, conduct market research, and develop data-driven recommendations to enhance business performance.
WHAT'S IN IT FOR YOU? The Accenture CFO & EV team, under the Data & AI team, has a comprehensive suite of capabilities in Risk, Fraud, Financial Crime, and Finance. Within the risk realm, our focus revolves around the development, validation, and auditing of models. Additionally, our work extends to ongoing performance evaluation, vigilant monitoring, meticulous governance, and thorough documentation of models. Get to work with top financial clients globally. Access resources enabling you to utilize cutting-edge technologies, fostering innovation with the world's most recognizable companies. Accenture will continually invest in your learning and growth and will support you in expanding your knowledge. You'll be part of a diverse and vibrant team, collaborating with talented individuals from various backgrounds and disciplines, continually pushing the boundaries of business capabilities and fostering an environment of innovation.
What you would do in this role:
Engagement Execution: Work independently or with minimal supervision on client engagements that may involve model development, validation, governance, strategy, transformation, implementation, and end-to-end delivery of risk solutions for Accenture's clients. Ability to manage workstreams of small projects, with responsibility for the quality of deliverables from junior team members. Demonstrated ability to manage day-to-day interactions with client stakeholders.
Practice Enablement: Guide junior team members. Support development of the Practice by driving innovations and initiatives. Develop thought capital and disseminate information around current and emerging trends in Risk.
Professional & Technical Skills: Relevant experience in the required domain. Strong analytical, problem-solving, and communication skills. Ability to work in a fast-paced, dynamic environment. Development, validation, and audit of: Credit Risk - PD/LGD/EAD Models, CCAR/DFAST Loss Forecasting and Revenue Forecasting Models, IFRS9/CECL Loss Forecasting Models across Retail and Commercial portfolios; Credit Acquisition/Behavior/Collections/Recovery Modeling and Strategies, Credit Policies, Limit Management, Acquisition Frauds, Collections Agent Matching/Channel Allocations across Retail and Commercial portfolios; Regulatory Capital and Economic Capital Models; Liquidity Risk - liquidity models, stress testing models, Basel liquidity reporting standards; Anti-Money Laundering (AML) - AML scenarios/alerts, Network Analysis; Operational Risk - AMA modeling, operational risk reporting. Conceptual understanding of Basel/CCAR/DFAST/CECL/IFRS9 and other risk regulations. Experience in conceptualizing and creating risk reporting and dashboarding solutions. Experience in modeling with statistical techniques such as linear regression, logistic regression, GLM, GBM, XGBoost, CatBoost, Neural Networks, time series (ARMA/ARIMA), ML interpretability and bias algorithms, etc.
Programming Languages - SAS, R, Python, Spark, Scala, etc.; tools such as Tableau, QlikView, PowerBI, SAS VA, etc. Strong understanding of Risk functions and the ability to apply them in client discussions and project implementation.
Academic: Master's degree in a quantitative discipline (mathematics, statistics, economics, financial engineering, operations research, or a related field) or MBA from top-tier universities. Strong academic credentials and publications, if applicable. Industry certifications such as FRM, PRM, CFA preferred. Excellent communication and interpersonal skills.
Additional Information: Opportunity to work on innovative projects. Career growth and leadership exposure. About Our Company | Accenture
Qualification Experience: 12-14 years
Educational Qualification: Any Degree

Posted 6 days ago

Apply

5.0 - 10.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Naukri logo

Project Role: Software Development Engineer
Project Role Description: Analyze, design, code, and test multiple components of application code across one or more clients. Perform maintenance, enhancements, and/or development work.
Must have skills: PySpark
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: BE
Key Responsibilities: Overall 8 years of experience working on Data Analytics projects. Work on client projects to deliver AWS, PySpark, and Databricks based data engineering and analytics solutions. Build and operate very large data warehouses or data lakes. ETL optimization, designing, coding, and tuning big data processes using Apache Spark. Build data pipeline applications to stream and process datasets at low latencies. Show efficiency in handling data - tracking data lineage, ensuring data quality, and improving discoverability of data.
Technical Experience: Minimum of 2 years of experience delivering Databricks engineering solutions on any of the cloud platforms using PySpark. Minimum of 5 years of experience in ETL, Big Data/Hadoop, and data warehouse architecture delivery. Minimum 3 years of experience in one or more programming languages (Python, Java, Scala). Experience using Airflow for data pipelines in at least one project. 2 years of experience developing CI/CD pipelines using Git, Jenkins, Docker, Kubernetes, Shell Scripting, and Terraform. Must be able to understand ETL technologies and translate them into cloud-native tools (AWS, Azure, Google Cloud) or PySpark.
Professional Attributes: 1. Should have been involved in a data engineering project from the requirements phase to delivery. 2. Good communication skills to interact with the client and understand the requirements. 3. Should be able to work independently and guide the team.
Qualification: BE

Posted 6 days ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Naukri logo

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must have skills: Microsoft Azure Databricks
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years of full-time education
Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will play a crucial role in developing solutions that align with organizational goals and enhance operational efficiency.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to understand project requirements and deliver high-quality solutions.
- Develop and maintain applications using Microsoft Azure Databricks.
- Troubleshoot and debug applications to ensure optimal performance.
- Implement best practices for application development and deployment.
- Stay updated with the latest technologies and trends in application development.
Professional & Technical Skills:
- Must Have Skills: Proficiency in Microsoft Azure Databricks.
- Strong understanding of cloud computing principles and services.
- Experience with data processing and analytics using Azure services.
- Knowledge of programming languages such as Python, Scala, or SQL.
- Hands-on experience building and deploying applications on the Azure cloud platform.
Additional Information:
- The candidate should have a minimum of 3 years of experience in Microsoft Azure Databricks.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Qualification: 15 years of full-time education

Posted 6 days ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Chennai

Work from Office

Naukri logo

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must have skills: Databricks Unified Data Analytics Platform, Payroll SAP Integration Support, SAP S4 integration support
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years of full-time education
Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and debugging processes to deliver high-quality applications that meet user expectations and business goals.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application specifications and user guides.
- Collaborate with cross-functional teams to gather requirements and provide technical insights.
Professional & Technical Skills:
- Must Have Skills: Proficiency in Databricks Unified Data Analytics Platform.
- Good To Have Skills: Experience with cloud computing platforms.
- Strong understanding of application development methodologies.
- Familiarity with data integration and ETL processes.
- Experience in programming languages such as Python or Scala.
Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Chennai office.
- 15 years of full-time education is required.
Qualification: 15 years of full-time education

Posted 6 days ago

Apply

7.0 - 11.0 years

50 - 60 Lacs

Bengaluru

Work from Office

Naukri logo

Developing Scala Spark pipelines that are resilient, modular, and tested. Help automate and scale governance through technology enablement. Enable users to find the "right data" for the "right use case". Participate in identifying and proposing solutions to data quality issues and data management solutions. Support technical implementation of solutions through data pipeline development. Maintain technical processes and procedures for data management. Very good understanding of MS Azure Data Lake and associated setups. ETL knowledge to build semantic layers for reporting. Creation/modification of pipelines based on source and target systems. User and access management, and training end users.

Posted 6 days ago

Apply

10.0 - 15.0 years

20 - 60 Lacs

Hyderabad

Work from Office

Naukri logo

WHO WE ARE: As part of our software engineering team, you will take the lead in building the next-generation infrastructure and platforms for Zinnia, including but not limited to: scalable data storage infrastructure, analytics platform, streams processing and data pipelines, cutting-edge search platform, best-in-class AI/ML infrastructure, Kubernetes compute infrastructure, document storage infrastructure, etc. You will work and learn among the best, putting to use your passion for distributed technologies and algorithms, API design and systems design, and your passion for writing code that performs at massive scale. We also work with industry-standard open-source infrastructure technologies like Kubernetes, gRPC, and GraphQL - come join our infrastructure teams and share the knowledge with a broader community while making a real impact within our company. As a Sr. Staff Engineer, you will be a key technical leader and role model within the organization. We are looking for a technical lead who designs and develops technology to serve business and technology objectives, aligns points of view across teams, and makes trade-offs to help achieve the goals of individual teams as well as Zinnia's broader goals. You will foster Zinnia's culture and values around "be bold", "team up", and "deliver value". You will work closely with technical leadership and management within and outside our organization to contribute to building best-in-class core systems infrastructure for Zinnia.
WHAT YOU'LL DO: Deliver impact by driving innovation while building and shipping software at scale. Provide architectural guidance and mentorship to up-level the engineering organization. Actively improve the level of craftsmanship at Zinnia by developing best practices and defining the best strategies. Design products/services/tools and code that can be used by others while weighing the operational impact of all decisions. Function as the tech lead for multiple key initiatives, identify problems and opportunities, and lead teams to architect, design, implement, and operationalize systems. Partner closely with teams within the org and with customers to execute on the vision for long-term success of our core infrastructure teams. Work closely with the open-source community to participate in and influence cutting-edge open-source projects. Keep a platform-first approach while designing products/services.
WHAT YOU'LL NEED: BS/BA in Computer Science or a related technical field, or equivalent technical experience. 10+ years of industry experience in software design, development, and algorithm-related solutions. 5+ years of experience programming in object-oriented languages such as Java, Go, Rust, Python, Scala. 3+ years of experience as an architect or in a technical leadership position. Hands-on experience developing large-scale distributed systems and databases.
BONUS POINTS: MS or PhD degree in Computer Science or a related technical discipline. 10+ years of experience in software design, development, and algorithm-related solutions, with at least 5 years in a technical leadership position. 10+ years of experience in an object-oriented programming language such as Java, Go, Rust, Python, Scala. 5+ years of experience with large-scale distributed systems and client-server architectures. Experience in architecting and designing large-scale distributed systems related to data infrastructure, IaaS, Kubernetes, and platforms. Distributed Systems; Technical Leadership; Infrastructure as a Service (IaaS); Systems Infrastructure.
WHAT'S IN IT FOR YOU? We're looking for the best and brightest innovators in the industry to join our team. At Zinnia, you collaborate with smart, creative professionals who are dedicated to delivering cutting-edge technologies, deeper data insights, and enhanced services to transform how insurance is done. Visit our website at www.zinnia.com for more information. Apply by completing the online application on the careers section of our website. We are an Equal Opportunity employer committed to a diverse workforce. We do not discriminate based on race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability.

Posted 6 days ago

Apply

Exploring Scala Jobs in India

Scala is a popular programming language that is widely used in India, especially in the tech industry. Job seekers looking for opportunities in Scala can find a variety of roles across different cities in the country. In this article, we will dive into the Scala job market in India and provide valuable insights for job seekers.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Chennai
  5. Mumbai

These cities are known for their thriving tech ecosystem and have a high demand for Scala professionals.

Average Salary Range

The salary range for Scala professionals in India varies based on experience levels. Entry-level Scala developers can expect to earn around INR 6-8 lakhs per annum, while experienced professionals with 5+ years of experience can earn upwards of INR 15 lakhs per annum.

Career Path

In the Scala job market, a typical career path may look like:

  1. Junior Developer
  2. Scala Developer
  3. Senior Developer
  4. Tech Lead

As professionals gain more experience and expertise in Scala, they can progress to higher roles with increased responsibilities.

Related Skills

In addition to Scala expertise, employers often look for candidates with the following skills:

  • Java
  • Spark
  • Akka
  • Play Framework
  • Functional programming concepts

Having a good understanding of these related skills can enhance a candidate's profile and increase their chances of landing a Scala job.
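Many of the roles listed above pair Scala with Spark and a functional style of programming. The sketch below is a minimal, illustrative example of that combination; the object name, sample data, and local master setting are assumptions made for demonstration, not taken from any specific listing.

```scala
import org.apache.spark.sql.SparkSession

object WordCountSketch {
  def main(args: Array[String]): Unit = {
    // Local session for illustration only; a real job would be packaged and spark-submitted to a cluster.
    val spark = SparkSession.builder()
      .appName("scala-spark-sketch")
      .master("local[*]")
      .getOrCreate()

    import spark.implicits._

    // A tiny in-memory Dataset standing in for real input data.
    val lines = Seq("scala spark jobs", "scala akka jobs").toDS()

    // A functional pipeline: flatMap into words, group by the word, count, then trigger an action.
    val counts = lines
      .flatMap(_.split("\\s+"))
      .groupByKey(word => word)
      .count()

    counts.show()
    spark.stop()
  }
}
```

When adapting a sketch like this, the in-memory Seq and the local master would be swapped for a real input source and cluster configuration.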

Interview Questions

Here are 25 interview questions that you may encounter when applying for Scala roles; two short code sketches after the list illustrate several of the concepts:

  • What is Scala and why is it used? (basic)
  • Explain the difference between val and var in Scala. (basic)
  • What is pattern matching in Scala? (medium)
  • What are higher-order functions in Scala? (medium)
  • How does Scala support functional programming? (medium)
  • What is a case class in Scala? (basic)
  • Explain the concept of currying in Scala. (advanced)
  • What is the difference between map and flatMap in Scala? (medium)
  • How does Scala handle null values? (medium)
  • What is a trait in Scala and how is it different from an abstract class? (medium)
  • Explain the concept of implicits in Scala. (advanced)
  • What is the Akka toolkit and how is it used in Scala? (medium)
  • How does Scala handle concurrency? (advanced)
  • Explain the concept of lazy evaluation in Scala. (advanced)
  • What is the difference between List and Seq in Scala? (medium)
  • How does Scala handle exceptions? (medium)
  • What are Futures in Scala and how are they used for asynchronous programming? (advanced)
  • Explain the concept of type inference in Scala. (medium)
  • What is the difference between object and class in Scala? (basic)
  • How can you create a Singleton object in Scala? (basic)
  • What is a higher-kinded type in Scala? (advanced)
  • Explain the concept of for-comprehensions in Scala. (medium)
  • How does Scala support immutability? (medium)
  • What are the advantages of using Scala over Java? (basic)
  • How do you implement pattern matching in Scala? (medium)
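The snippet below is a minimal, self-contained sketch (all names and values are illustrative only) covering several of the basic and medium questions above: val vs var, case classes with pattern matching, singleton objects, map vs flatMap, and Option as the alternative to null.

```scala
object ScalaBasicsDemo extends App {

  // val vs var: a val binding is immutable, a var can be reassigned.
  val fixed = 10
  var counter = 10
  counter += 5

  // A case class gets equals, hashCode, toString, copy and pattern-matching support for free.
  case class Account(id: String, balance: Double)

  // Pattern matching with a guard, a literal pattern, and an extractor.
  def describe(a: Account): String = a match {
    case Account(_, b) if b < 0.0 => "overdrawn"
    case Account(_, 0.0)          => "empty"
    case Account(id, _)           => s"account $id is in credit"
  }

  // An object is a singleton: one lazily created instance, no `new` required.
  object AccountRegistry {
    def find(id: String): Option[Account] =
      if (id == "A1") Some(Account("A1", 250.0)) else None
  }

  // map transforms each element; flatMap transforms and then flattens the nested result.
  val mapped    = List("1 2", "3 4").map(_.split(" ").toList)     // List(List("1", "2"), List("3", "4"))
  val flattened = List("1 2", "3 4").flatMap(_.split(" ").toList) // List("1", "2", "3", "4")

  // Option models absence explicitly, instead of returning null.
  val balance = AccountRegistry.find("A1").map(_.balance).getOrElse(0.0)

  println(describe(Account("A1", balance)))
  println(mapped)
  println(flattened)
}
```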

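A second sketch, again with purely illustrative names, touches on a few of the more advanced topics: traits vs abstract classes, currying, lazy evaluation, for-comprehensions, and Futures.

```scala
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._

object ScalaAdvancedDemo extends App {

  // A trait can be mixed into many classes; a class may extend only one abstract class.
  trait Auditable { def auditTag: String = "audited" }
  abstract class Instrument(val name: String)
  class Bond(name: String) extends Instrument(name) with Auditable

  // Currying: the method takes its arguments one parameter list at a time.
  def applyRate(rate: Double)(amount: Double): Double = amount * rate
  val withTax: Double => Double = applyRate(1.18) // partially applied

  // Lazy evaluation: the right-hand side runs only on first access and is then cached.
  lazy val expensive: Int = { println("computed once"); 42 }

  // For-comprehensions desugar to map/flatMap/withFilter, here over Options.
  val total: Option[Double] =
    for {
      a <- Some(100.0)
      b <- Some(250.0) if b > 0
    } yield a + b

  // Futures run asynchronously on an ExecutionContext; Await keeps this demo simple.
  val projected: Future[Double] = Future(withTax(total.getOrElse(0.0)))

  println(new Bond("10Y").auditTag)
  println(expensive)
  println(Await.result(projected, 2.seconds))
}
```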
Closing Remark

As you explore Scala jobs in India, remember to showcase your expertise in Scala and related skills during interviews. Prepare well, stay confident, and you'll be on your way to a successful career in Scala. Good luck!

