
3773 Scala Jobs - Page 41

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 - 10.0 years

0 Lacs

Greater Kolkata Area

On-site


Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary: A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. Creating business intelligence from data requires an understanding of the business, the data, and the technology used to store and analyse that data. Using our Rapid Business Intelligence Solutions, data visualisation and integrated reporting dashboards, we can deliver agile, highly interactive reporting and analytics that help our clients run their business more effectively, understand which business questions can be answered, and unlock the answers.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

Responsibilities:
- Analyse current business practices, processes, and procedures, and identify future business opportunities for leveraging Microsoft Azure Data & Analytics Services.
- Provide technical and thought leadership as a senior member of the Analytics Practice in areas such as data access and ingestion, data processing, data integration, data modeling, database design and implementation, data visualization, and advanced analytics.
- Engage and collaborate with customers to understand business requirements and use cases, and translate them into detailed technical specifications.
- Develop best practices, including reusable code, libraries, patterns, and consumable frameworks for cloud-based data warehousing and ETL.
- Maintain best-practice standards for the development of cloud-based data warehouse solutions, including naming standards.
- Design and implement highly performant data pipelines from multiple sources using Apache Spark and/or Azure Databricks.
- Integrate the end-to-end data pipeline, taking data from source systems to target data repositories while ensuring that data quality and consistency are always maintained.
- Work with other members of the project team to support delivery of additional project components (e.g., API interfaces).
- Evaluate the performance and applicability of multiple tools against customer requirements.
- Work within an Agile delivery / DevOps methodology to deliver proofs of concept and production implementations in iterative sprints.
- Integrate Databricks with other technologies (ingestion tools, visualization tools).
Requirements:
- Proven experience working as a data engineer.
- Highly proficient in the Spark framework (Python and/or Scala).
- Extensive knowledge of data warehousing concepts, strategies, and methodologies.
- Direct experience building data pipelines using Azure Data Factory and Apache Spark (preferably in Databricks).
- Hands-on experience designing and delivering solutions using Azure, including Azure Storage, Azure SQL Data Warehouse, Azure Data Lake, Azure Cosmos DB, and Azure Stream Analytics.
- Experience in designing and hands-on development of cloud-based analytics solutions.
- Expert-level understanding of Azure Data Factory, Azure Synapse, Azure SQL, Azure Data Lake, and Azure App Service is required.
- Experience designing and building data pipelines using API ingestion and streaming ingestion methods.
- Knowledge of DevOps processes (including CI/CD) and infrastructure as code is essential.
- Thorough understanding of Azure cloud infrastructure offerings.
- Strong experience in common data warehouse modelling principles, including Kimball.
- Working knowledge of Python is desirable.
- Experience developing security models.
- Databricks & Azure Big Data Architecture certification would be a plus.
- Must be team-oriented, with strong collaboration, prioritization, and adaptability skills.

Mandatory skill sets: Azure Databricks
Preferred skill sets: Azure Databricks
Years of experience required: 3-10 years
Education qualification: BE, B.Tech, MCA, M.Tech
Degrees/fields of study required: Master of Engineering, Bachelor of Engineering
Required skills: Microsoft Azure Databricks
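The Spark pipeline work described in this listing is easiest to picture with a small example. The following is a minimal sketch, in Scala, of the kind of batch job such a role involves: read raw files from a landing zone, apply a simple transformation, and write a partitioned dataset to a target store. The storage paths, column names (order_id, order_ts), and job name are illustrative assumptions, not details from the posting.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object SalesIngestJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("sales-ingest")
      .getOrCreate()

    // Ingest raw CSV files from a landing zone (path is a placeholder).
    val raw = spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv("abfss://landing@examplestorage.dfs.core.windows.net/sales/")

    // Basic cleanup: normalize a column name, derive a date partition key,
    // and drop rows missing the business key.
    val cleaned = raw
      .withColumnRenamed("Order ID", "order_id")
      .withColumn("order_date", to_date(col("order_ts")))
      .filter(col("order_id").isNotNull)

    // Write to the curated zone, partitioned by date.
    cleaned.write
      .mode("overwrite")
      .partitionBy("order_date")
      .parquet("abfss://curated@examplestorage.dfs.core.windows.net/sales/")

    spark.stop()
  }
}
```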

Posted 1 week ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Company Description
👋🏼 We're Nagarro. We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale across all devices and digital mediums, and our people exist everywhere in the world (18,000+ experts across 38 countries, to be exact). Our work culture is dynamic and non-hierarchical. We're looking for great new colleagues. That's where you come in!

Job Description

Requirements:
- Total experience of 5+ years.
- Hands-on working experience in data engineering.
- Strong working experience in SQL, and in Python or Scala.
- Deep understanding of cloud design patterns and their implementation.
- Experience working with Snowflake as a data warehouse solution.
- Experience with Power BI data integration.
- Experience designing, developing, and maintaining scalable data pipelines and ETL processes.
- Experience working with structured and unstructured data from multiple sources (APIs, databases, flat files, cloud platforms).
- Strong understanding of data modelling, warehousing (e.g., star/snowflake schema), and relational database systems (PostgreSQL, MySQL, etc.).
- Hands-on experience with ETL tools such as Apache Airflow, Talend, Informatica, or similar.
- Strong problem-solving skills and a passion for continuous improvement.
- Strong communication skills and the ability to collaborate effectively with cross-functional teams.

Responsibilities:
- Writing and reviewing high-quality code.
- Understanding the client's business use cases and technical requirements, and converting them into a technical design that elegantly meets those requirements.
- Mapping decisions to requirements and translating them for developers.
- Identifying different solutions and narrowing down to the best option that meets the client's requirements.
- Defining guidelines and benchmarks for NFR considerations during project implementation.
- Writing and reviewing design documents that explain the overall architecture, framework, and high-level design of the application for developers.
- Reviewing architecture and design on aspects such as extensibility, scalability, security, design patterns, user experience, and NFRs, and ensuring that all relevant best practices are followed.
- Developing and designing the overall solution for defined functional and non-functional requirements, and defining the technologies, patterns, and frameworks to materialize it.
- Understanding technology integration scenarios and applying those learnings in projects.
- Resolving issues raised during code review through exhaustive, systematic root-cause analysis, and justifying the decisions taken.
- Carrying out POCs to make sure the suggested design and technologies meet the requirements.

Qualifications: Bachelor's or master's degree in Computer Science, Information Technology, or a related field.
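This posting asks for a strong understanding of star/snowflake schema modelling. As a hedged illustration in Scala (the fact and dimension table names and columns below are invented for the example, not taken from the posting), a typical star-schema query joins a fact table to its dimensions and aggregates by their descriptive attributes:

```scala
import org.apache.spark.sql.SparkSession

object StarSchemaExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("star-schema-demo").getOrCreate()

    // A fact table and two dimension tables, as in a Kimball-style model.
    val factSales  = spark.table("fact_sales")   // sale_id, product_key, store_key, amount
    val dimProduct = spark.table("dim_product")  // product_key, category
    val dimStore   = spark.table("dim_store")    // store_key, region

    // Join the fact to its dimensions, then aggregate by their attributes.
    val revenueByRegionAndCategory = factSales
      .join(dimProduct, "product_key")
      .join(dimStore, "store_key")
      .groupBy("region", "category")
      .sum("amount")

    revenueByRegionAndCategory.show()
    spark.stop()
  }
}
```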

Posted 1 week ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Associate

Job Description & Summary: At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities:
- Analyse current business practices, processes, and procedures, and identify future business opportunities for leveraging Microsoft Azure Data & Analytics Services.
- Provide technical and thought leadership as a senior member of the Analytics Practice in areas such as data access and ingestion, data processing, data integration, data modeling, database design and implementation, data visualization, and advanced analytics.
- Engage and collaborate with customers to understand business requirements and use cases, and translate them into detailed technical specifications.
- Develop best practices, including reusable code, libraries, patterns, and consumable frameworks for cloud-based data warehousing and ETL.
- Maintain best-practice standards for the development of cloud-based data warehouse solutions, including naming standards.
- Design and implement highly performant data pipelines from multiple sources using Apache Spark and/or Azure Databricks.
- Integrate the end-to-end data pipeline, taking data from source systems to target data repositories while ensuring that data quality and consistency are always maintained.
- Work with other members of the project team to support delivery of additional project components (e.g., API interfaces).
- Evaluate the performance and applicability of multiple tools against customer requirements.
- Work within an Agile delivery / DevOps methodology to deliver proofs of concept and production implementations in iterative sprints.
- Integrate Databricks with other technologies (ingestion tools, visualization tools).
Requirements:
- Proven experience working as a data engineer.
- Highly proficient in the Spark framework (Python and/or Scala).
- Extensive knowledge of data warehousing concepts, strategies, and methodologies.
- Direct experience building data pipelines using Azure Data Factory and Apache Spark (preferably in Databricks).
- Hands-on experience designing and delivering solutions using Azure, including Azure Storage, Azure SQL Data Warehouse, Azure Data Lake, Azure Cosmos DB, and Azure Stream Analytics.
- Experience in designing and hands-on development of cloud-based analytics solutions.
- Expert-level understanding of Azure Data Factory, Azure Synapse, Azure SQL, Azure Data Lake, and Azure App Service is required.
- Experience designing and building data pipelines using API ingestion and streaming ingestion methods.
- Knowledge of DevOps processes (including CI/CD) and infrastructure as code is essential.
- Thorough understanding of Azure cloud infrastructure offerings.
- Strong experience in common data warehouse modeling principles, including Kimball.
- Working knowledge of Python is desirable.
- Experience developing security models.
- Databricks & Azure Big Data Architecture certification would be a plus.

Mandatory skill sets: ADE, ADB, ADF
Preferred skill sets: ADE, ADB, ADF
Years of experience required: 3-7 years
Education qualification: BE, B.Tech, MCA, M.Tech
Degrees/fields of study required: Bachelor of Technology, Bachelor of Engineering, Master of Business Administration
Required skills: Data Engineering, GCP Dataflow
Optional skills: Accepting Feedback, Active Listening, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis, Intellectual Curiosity, Java (Programming Language), Market Development {+ 7 more}
Travel requirements: Not Specified
Available for work visa sponsorship? No
Government clearance required? No
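Both PwC listings call for pipelines built with "streaming ingestion methods". As a minimal sketch of what that can look like in Scala, assuming a Kafka source and a data-lake sink (the broker address, topic name, and paths below are placeholders, not details from the posting):

```scala
import org.apache.spark.sql.SparkSession

object KafkaStreamIngest {
  def main(args: Array[String]): Unit = {
    // Requires the spark-sql-kafka-0-10 connector on the classpath.
    val spark = SparkSession.builder().appName("kafka-stream-ingest").getOrCreate()

    // Read a stream of events from Kafka.
    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")
      .option("subscribe", "orders")
      .load()

    // Kafka delivers key/value as binary; cast the payload to a string.
    val parsed = events.selectExpr("CAST(value AS STRING) AS json", "timestamp")

    // Land the raw stream in the lake, with checkpointing for fault tolerance.
    val query = parsed.writeStream
      .format("parquet")
      .option("path", "/mnt/datalake/raw/orders")
      .option("checkpointLocation", "/mnt/datalake/_checkpoints/orders")
      .start()

    query.awaitTermination()
  }
}
```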

Posted 1 week ago

Apply

10.0 years

2 - 9 Lacs

Hyderābād

On-site

Job description

Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further.

HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team. In this role, you will:

1. Spark Application Development
- Design, develop, and optimize distributed data processing pipelines using Apache Spark.
- Implement ETL processes for large-scale data ingestion, transformation, and storage.
- Write efficient, maintainable, and scalable Spark jobs in PySpark, Scala, or Java.
- Collaborate with data engineers and analysts to ensure data quality and reliability.

2. DevOps Leadership
- Design and implement CI/CD pipelines for Spark applications and big data workflows.
- Automate deployment, monitoring, and scaling of Spark jobs in cloud or on-premises environments.
- Manage infrastructure as code (IaC) using tools like Terraform, Ansible, or CloudFormation.
- Ensure system reliability, availability, and performance through proactive monitoring and alerting.

3. Technical Leadership
- Lead a team of developers and DevOps engineers, providing technical guidance and mentorship.
- Define best practices for Spark development, DevOps, and cloud-native architectures.
- Conduct code reviews, enforce coding standards, and ensure adherence to project timelines.
- Collaborate with stakeholders to gather requirements and translate them into technical solutions.

4. Cloud and Big Data Ecosystem
- Work with cloud platforms (AWS, Azure, GCP) to deploy and manage Spark applications.
- Integrate Spark with big data tools like Hadoop, Hive, Kafka, and Delta Lake.
- Optimize resource usage and cost efficiency in cloud-based Spark clusters.

Requirements

To be successful in this role, you should meet the following requirements:

1. Technical expertise:
- Strong experience with Apache Spark 3.x and Delta Lake (PySpark, Scala, or Java).
- Proficiency in big data technologies (Hadoop, Hive, Kafka, etc.).
- Hands-on experience with CI/CD tools (Jenkins, GitHub, Ansible, etc.).
- Knowledge of containerization and orchestration (Docker, Kubernetes).
- Experience with cloud platforms (AWS EMR, Azure Databricks, GCP Dataproc).

2. DevOps skills:
- Expertise in infrastructure automation tools (Terraform, Ansible, etc.).
- Strong understanding of monitoring and logging tools (Prometheus, Grafana, Splunk).
- Experience with version control systems (Git) and branching strategies.

3. Leadership and communication:
- Proven experience leading development and DevOps teams.
- Strong problem-solving skills and ability to make technical decisions.
- Excellent communication and collaboration skills.

Preferred qualifications:
- Experience with real-time data processing frameworks (e.g., Spark Streaming, Flink).
- Knowledge of data lake architectures and Delta Lake.
- Certifications in cloud platforms (AWS, Azure, GCP).
- Familiarity with Agile/Scrum methodologies.

Education and experience:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 10+ years of experience in Spark application development and DevOps.
- 3+ years of experience in a technical leadership role.

You'll achieve more when you join HSBC. www.hsbc.com/careers

HSBC is committed to building a culture where all employees are valued and respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.

Issued by: HSBC Software Development India
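The role pairs Apache Spark 3.x with Delta Lake. One Delta pattern worth illustrating is an upsert (merge) of a change batch into an existing table. The sketch below, in Scala, assumes the delta-spark library is on the classpath; the table paths and the account_id key are invented for the example, not taken from the posting.

```scala
import io.delta.tables.DeltaTable
import org.apache.spark.sql.SparkSession

object DeltaUpsertJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("delta-upsert").getOrCreate()

    // Incoming batch of changed rows; source path and schema are placeholders.
    val updates = spark.read.parquet("/data/incoming/accounts")

    // Upsert into the existing Delta table, keyed on account_id.
    DeltaTable.forPath(spark, "/data/delta/accounts")
      .as("target")
      .merge(updates.as("source"), "target.account_id = source.account_id")
      .whenMatched().updateAll()     // refresh existing rows
      .whenNotMatched().insertAll()  // add new rows
      .execute()

    spark.stop()
  }
}
```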

Posted 1 week ago

Apply

14.0 years

1 - 8 Lacs

Hyderābād

On-site

Job Description: Lead Data Engineer

About AT&T Chief Data Office
The Chief Data Office (CDO) at AT&T is responsible for leveraging data as a strategic asset to drive business value. The team focuses on data governance, data engineering, artificial intelligence, and advanced analytics to enhance customer experience, optimize operations, and enable innovation.

Candidates will:
- Work on cutting-edge cloud technologies, AI/ML, and data-driven solutions as part of a dynamic and innovative team driving digital transformation.
- Lead high-impact Agile initiatives with top talent in the industry.
- Get the opportunity to grow and implement Agile at an enterprise level.
- Be offered competitive compensation, a flexible work culture, and learning opportunities.

Shift timing: 12:30 to 9:30 IST (Bangalore) / 1:00-10:00 pm (Hyderabad)
Work mode: Hybrid (3 days mandatory in office)
Location: Bangalore, Hyderabad
Job Title: Lead Data Engineer

Roles and responsibilities:
- Create the product roadmap and project plan.
- Design, develop, and maintain scalable ETL pipelines using Azure services to process, transform, and load large datasets into cloud platforms.
- Collaborate with cross-functional teams, including data architects, analysts, and business stakeholders, to gather data requirements and deliver efficient data solutions.
- Design, implement, and maintain data pipelines for data ingestion, processing, and transformation in Azure.
- Work with data scientists, architects, and analysts to understand data needs and create effective data workflows.
- Exposure to the Snowflake warehouse.
- Big data engineering with a solid background in the larger Hadoop ecosystem and real-time analytics tools, including PySpark/Scala-Spark/Hive/Hadoop CLI/MapReduce/Storm/Kafka/Lambda architecture.
- Implement data validation and cleansing techniques.
- Improve the scalability, efficiency, and cost-effectiveness of data pipelines.
- Experience in designing and hands-on development of cloud-based analytics solutions.
- Expert-level understanding of Azure Data Factory, Azure Data Lake, Snowflake, and PySpark is required.
- Good to have: a full-stack development background with Java and JavaScript/CSS/HTML; knowledge of ReactJS/Angular is a plus.
- Design and build data pipelines using API ingestion and streaming ingestion methods.
- Unix/Linux expertise; comfortable with the Linux operating system and shell scripting.
- Knowledge of DevOps processes (including CI/CD) and infrastructure as code is desirable.
- PL/SQL and RDBMS background with Oracle/MySQL.
- Comfortable with microservices, CI/CD, Docker, and Kubernetes.
- Strong experience in common Data Vault data warehouse modelling principles.
- Create/modify Docker images and deploy them via Kubernetes.

Additional skills required. The ideal candidate should have 14+ years of experience in IT, in addition to the following:
- 10+ years of extensive development experience using Snowflake or a similar data warehouse technology.
- Working experience with dbt and other technologies of the modern data stack, such as Snowflake, Azure, Databricks, and Python.
- Experience in Agile processes, such as Scrum.
- Extensive experience in writing advanced SQL statements and performance tuning.
- Experience in data ingestion techniques using custom or SaaS tools.
- Experience in data modelling; able to optimize existing and new data models.
- Experience in data mining, data warehouse solutions, and ETL, and in using databases in a business environment with large-scale, complex datasets.

Technical qualifications (preferred):
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- Experience in high-tech, software, or telecom industries is a plus.
- Strong analytical skills to translate insights into impactful product initiatives.

#DataEngineering

Weekly Hours: 40
Time Type: Regular
Location: Hyderabad, Andhra Pradesh, India

It is the policy of AT&T to provide equal employment opportunity (EEO) to all persons regardless of age, color, national origin, citizenship status, physical or mental disability, race, religion, creed, gender, sex, sexual orientation, gender identity and/or expression, genetic information, marital status, status with regard to public assistance, veteran status, or any other characteristic protected by federal, state or local law. In addition, AT&T will provide reasonable accommodations for qualified individuals with disabilities. AT&T is a fair chance employer and does not initiate a background check until an offer is made.

Job ID: R-63188
Date posted: 06/05/2025

Benefits: Your needs? Met. Your wants? Considered. Take a look at our comprehensive benefits: Paid Time Off, Tuition Assistance, Insurance Options, Discounts, Training & Development.
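The responsibilities above include implementing data validation and cleansing techniques. A minimal Scala sketch of one common approach is to split a batch into valid and quarantined rows using rule predicates; the column names (customer_id, email, amount), paths, and rules below are invented for the example.

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.functions._

object ValidateAndCleanse {
  // Split a batch into valid and rejected rows based on simple rules.
  def validate(df: DataFrame): (DataFrame, DataFrame) = {
    val rules = col("customer_id").isNotNull &&
      col("email").rlike("^[^@\\s]+@[^@\\s]+$") &&
      col("amount") >= 0

    val valid    = df.filter(rules)
    val rejected = df.filter(not(rules))
    (valid, rejected)
  }

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("validate-cleanse").getOrCreate()
    val raw = spark.read.parquet("/data/raw/customers")

    // Valid rows go to the clean zone; failures go to quarantine for review.
    val (valid, rejected) = validate(raw)
    valid.write.mode("append").parquet("/data/clean/customers")
    rejected.write.mode("append").parquet("/data/quarantine/customers")
    spark.stop()
  }
}
```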

Posted 1 week ago

Apply

5.0 - 8.0 years

6 - 9 Lacs

Hyderābād

On-site

About the Role:
Grade Level (for internal use): 10

Market Intelligence
The Role: Senior Full Stack Developer

The Team: You will work with a team of intelligent, ambitious, and hard-working software professionals. The team is responsible for the architecture, design, development, quality, and maintenance of the next-generation financial data web platform. Other responsibilities include transforming product requirements into technical design and implementation. You will be expected to participate in the design review process, write high-quality code, and work with a dedicated team of QA analysts and infrastructure teams.

The Impact: Market Intelligence is seeking a software developer to handle software design, development, and maintenance for data processing applications. This person will be part of a development team that manages and supports the internal and external applications supporting the business portfolio. The role involves data processing and big data application development, in teams that learn to work effectively together while collaborating with the larger group of developers on our platform.

What's in it for you:
- Opportunity to contribute to the development of a world-class Platform Engineering team.
- Engage in a highly technical, hands-on role designed to elevate team capabilities and foster continuous skill enhancement.
- Be part of a fast-paced, agile environment that processes massive volumes of data, ideal for advancing your software development and data engineering expertise while working with a modern tech stack.
- Contribute to the development and support of Tier-1, business-critical applications that are central to operations.
- Gain exposure to and work with cutting-edge technologies including AWS Cloud, EMR, and Apache NiFi.
- Grow your career within a globally distributed team, with clear opportunities for advancement and skill development.

Responsibilities:
- Design and develop applications, components, and common services based on development models, languages, and tools, including unit testing, performance testing, monitoring, and implementation.
- Support business and technology teams as necessary during design, development, and delivery to ensure scalable and robust solutions.
- Build data-intensive applications and services to support and enhance fundamental financials in appropriate technologies (C#, .NET Core, Databricks, Spark, Python, Scala, NiFi, SQL).
- Build data models, achieve performance tuning, and apply data architecture concepts.
- Develop applications adhering to secure coding practices and industry-standard coding guidelines, ensuring compliance with security best practices (e.g., OWASP) and internal governance policies.
- Implement and maintain CI/CD pipelines to streamline build, test, and deployment processes; develop comprehensive unit test cases and ensure code quality.
- Provide operations support to resolve issues proactively and with utmost urgency.
- Effectively manage time and multiple tasks.
- Communicate effectively, especially in writing, with the business and other technical groups.

What We're Looking For:

Basic qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or equivalent.
- Minimum 5 to 8 years of strong hands-on development experience in C#, .NET Core, cloud-native development, and MS SQL Server backend development.
- Proficiency with object-oriented programming.
- Advanced SQL programming skills.
- Preferred experience or familiarity with tools and technologies such as OData, Grafana, Kibana, big data platforms, Apache Kafka, GitHub, AWS EMR, Terraform, and emerging areas like AI/ML and GitHub Copilot.
- Highly recommended skill set in Databricks, Spark, and Scala technologies.
- Understanding of database performance tuning on large datasets.
- Ability to manage multiple priorities efficiently and effectively within specific timeframes.
- Excellent logical, analytical, and communication skills, with strong verbal and writing proficiency.
- Knowledge of financial fundamentals or the financial industry is highly preferred.
- Experience conducting application design and code reviews.
- Proficiency with the following technologies: object-oriented programming; programming languages (C#, .NET Core); cloud computing; database systems (SQL, MS SQL). Nice to have: NoSQL (Databricks, Spark, Scala, Python) and scripting (Bash, Scala, Perl, PowerShell).

Preferred qualifications:
- Hands-on experience with cloud computing platforms including AWS, Azure, or Google Cloud Platform (GCP).
- Proficient in working with Snowflake and Databricks for cloud-based data analytics and processing.

About S&P Global Market Intelligence
At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence.

What's In It For You?

Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People: We're more than 35,000 strong worldwide, so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability, to analyzing energy transition across the supply chain, to building workflow solutions that make it easy to tap into insight and apply it, we are changing the way people see things and empowering them to make an impact on the world we live in. We're committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We're constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.

Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business.
We care about our people. That's why we provide everything you, and your career, need to thrive at S&P Global. Our benefits include:
- Health & Wellness: Health care coverage designed for the mind and body.
- Flexible Downtime: Generous time off helps keep you energized for your time on.
- Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.

For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Equal Opportunity Employer: S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority - Ratings (Strategic Workforce Planning)

Job ID: 316195
Posted On: 2025-06-06
Location: Hyderabad, Telangana, India

Posted 1 week ago

Apply

9.0 years

6 - 9 Lacs

Hyderābād

On-site

Lead, Software Engineering
Hyderabad, India | Information Technology | Job ID 313260

About The Role:
Grade Level (for internal use): 11

The Team: Our team is responsible for the design, architecture, and development of our client-facing applications, using a variety of tools that are regularly updated as new technologies emerge. You will have the opportunity every day to work with people from a wide variety of backgrounds, and will be able to develop a close team dynamic with coworkers from around the globe.

The Impact: The work you do will be used every single day; it's the essential code you'll write that provides the data and analytics required for crucial, daily decisions in the capital and commodities markets.

What's in it for you:
- Build a career with a global company.
- Work on code that fuels the global financial markets.
- Grow and improve your skills by working on enterprise-level products and new technologies.

Responsibilities:
- Solve problems; analyze and isolate issues.
- Provide technical guidance and mentoring to the team, and help them adopt change as new processes are introduced.
- Champion best practices and serve as a subject matter authority.
- Develop solutions to support key business needs.
- Engineer components and common services based on standard development models, languages, and tools.
- Produce system design documents and lead technical walkthroughs.
- Produce high-quality code.
- Collaborate effectively with technical and non-technical partners.
- As a team member, continuously improve the architecture.

Basic qualifications:
- 9-12 years of experience designing and building data-intensive solutions using distributed computing.
- Proven experience implementing and maintaining enterprise search solutions in large-scale environments.
- Experience working with business stakeholders and users, providing research direction and solution design, and writing robust, maintainable architectures and APIs.
- Experience developing and deploying search solutions in a public cloud such as AWS.
- Proficient programming skills in high-level languages: Java, Scala, Python.
- Solid knowledge of at least one machine learning research framework.
- Familiarity with containerization, scripting, cloud platforms, and CI/CD.
- 5+ years' experience with Python, Java, Kubernetes, and data and workflow orchestration tools.
- 4+ years' experience with Elasticsearch, SQL, NoSQL, Apache Spark, Flink, Databricks, and MLflow.
- Prior experience operationalizing data-driven pipelines for large-scale batch and stream processing analytics solutions.
- Good to have: contributions to GitHub and open-source initiatives, research projects, and/or participation in Kaggle competitions.
- Ability to quickly, efficiently, and effectively define and prototype solutions with continual iteration within aggressive product deadlines.
- Strong communication and documentation skills for both technical and non-technical audiences.

Preferred qualifications:
- Search technologies: querying and indexing content for Apache Solr, Elasticsearch, etc.
- Proficiency in search query languages (e.g., Lucene query syntax) and experience with data indexing and retrieval.
- Experience with machine learning models and NLP techniques for search relevance and ranking.
- Familiarity with vector search techniques and embedding models (e.g., BERT, Word2Vec).
- Experience with relevance tuning using A/B testing frameworks.
- Big data technologies: Apache Spark, Spark SQL, Hadoop, Hive, Airflow.
- Data science search technologies: personalization and recommendation models, Learning to Rank (LTR).
- Preferred languages: Python, Java.
- Database technologies: MS SQL Server platform, with stored procedure programming experience using Transact-SQL.
- Ability to lead, train, and mentor.
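As a small, hedged illustration of the search-query skills this role names (Lucene query syntax against an index), the sketch below sends a query_string search to an Elasticsearch REST endpoint using only the JDK's HTTP client from Scala. The host, index name, and field names are placeholders, not details from the posting.

```scala
import java.net.URI
import java.net.http.{HttpClient, HttpRequest, HttpResponse}

object SearchQueryExample {
  def main(args: Array[String]): Unit = {
    val client = HttpClient.newHttpClient()

    // A query_string query lets Elasticsearch accept Lucene query syntax.
    val query =
      """{ "query": { "query_string": {
        |    "query": "title:(spark AND scala) AND year:[2020 TO *]"
        |} } }""".stripMargin

    val request = HttpRequest.newBuilder()
      .uri(URI.create("http://localhost:9200/articles/_search"))
      .header("Content-Type", "application/json")
      .POST(HttpRequest.BodyPublishers.ofString(query))
      .build()

    val response = client.send(request, HttpResponse.BodyHandlers.ofString())
    println(response.body()) // raw JSON hits
  }
}
```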
Job ID: 313260
Posted On: 2025-04-28
Location: Hyderabad, Telangana, India

Posted 1 week ago

Apply

2.0 years

4 - 6 Lacs

Hyderābād

On-site

Basic qualifications:
- 2+ years of data engineering experience.
- Experience with data modeling, warehousing, and building ETL pipelines.
- Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala).
- Experience with one or more scripting languages (e.g., Python, KornShell).
- Knowledge of AWS infrastructure.
- Knowledge of writing and optimizing SQL queries in a business environment with large-scale, complex datasets.
- Strong analytical and problem-solving skills. Curious, self-motivated, and a self-starter with a can-do attitude. Comfortable working in a fast-paced, dynamic environment.

The Data Engineer will own the data infrastructure for the Reverse Logistics team, which includes collaborating with software development teams to build the data infrastructure and maintain a highly scalable, reliable, and efficient data system to support the fast-growing business. You will work with analytic tools, write excellent SQL scripts, optimize the performance of SQL queries, and partner with internal customers to answer key business questions. We look for candidates who are self-motivated, flexible, hardworking, and who like to have fun.

About the team: The Reverse Logistics team at the Amazon Hyderabad Development Center is an agile team whose charter is to deliver the next generation of the Reverse Logistics platform. As a member of this team, your mission will be to design, develop, document, and support a massively scalable, distributed data warehousing, querying, and reporting system.

Preferred qualifications:
- Bachelor's degree in a quantitative/technical field such as computer science, engineering, or statistics.
- Proven track record of strong interpersonal and communication (verbal and written) skills.
- Experience developing insights across various areas of customer-related data: financial, product, and marketing.
- Proven problem-solving skills, attention to detail, and exceptional organizational skills.
- Ability to deal with ambiguity and competing objectives in a fast-paced environment.
- Knowledge of software engineering best practices across the development lifecycle, including agile methodologies, coding standards, code reviews, source management, build processes, testing, and operations.

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
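This role centers on writing and optimizing SQL over large datasets, with SparkSQL among the named query languages. Below is a minimal Scala sketch of a SparkSQL aggregation; the returns table, its columns, and the partition layout are invented for the example, and the date filter shows the kind of partition-column predicate that lets Spark prune partitions instead of scanning the full table.

```scala
import org.apache.spark.sql.SparkSession

object ReturnsReportJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("returns-report").getOrCreate()

    // Expose a dataset to SQL; assume it is partitioned by return_date.
    spark.read.parquet("/warehouse/returns").createOrReplaceTempView("returns")

    // Filtering on the partition column keeps the scan to one week of data.
    val weekly = spark.sql("""
      SELECT return_reason,
             COUNT(*)           AS return_count,
             SUM(refund_amount) AS total_refund
      FROM returns
      WHERE return_date >= date_sub(current_date(), 7)
      GROUP BY return_reason
      ORDER BY total_refund DESC
    """)

    weekly.show()
    spark.stop()
  }
}
```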

Posted 1 week ago

Apply

10.0 years

6 - 9 Lacs

Hyderābād

On-site

Lead, Application Development
Hyderabad, India; Ahmedabad, India; Gurgaon, India | Information Technology | Job ID 316185

About The Role:
Grade Level (for internal use): 11

S&P Global EDO
The Role: Lead, Software Engineering (IT Application Development)

Join Our Team: Step into a dynamic team at the cutting edge of data innovation! You'll collaborate daily with talented professionals from around the world, designing and developing next-generation data products for our clients. Our team thrives on a diverse toolkit that evolves with emerging technologies, offering you the chance to work in a vibrant, global environment that fosters creativity and teamwork.

The Impact: As a Lead Software Developer at S&P Global, you'll be a driving force in shaping the future of our data products. Your expertise will streamline software development and deployment, aligning cutting-edge solutions with business needs. By ensuring seamless integration and continuous delivery, you'll enhance product capabilities, delivering high-quality systems that meet the highest standards of availability, security, and performance. Your work will empower our clients with impactful, data-driven solutions, making a real difference in the financial world.

What's in it for You:
- Career development: Build a rewarding career with a global leader in financial information and analytics, supported by continuous learning and a clear path to advancement.
- Dynamic work environment: Thrive in a fast-paced, forward-thinking setting where your ideas fuel innovation and your contributions shape groundbreaking solutions.
- Skill enhancement: Elevate your expertise on an enterprise-level platform, mastering the latest tools and techniques in software development.
- Versatile experience: Dive into full-stack development with hands-on exposure to cloud computing, big data, and GenAI technologies.
- Leadership opportunities: Guide and inspire a skilled team, steering the direction of our products and leaving your mark on the future of technology at S&P Global.

Responsibilities:
- Architect and develop scalable big data and cloud applications, harnessing a range of cloud services to create robust, high-performing solutions.
- Design and implement advanced CI/CD pipelines, automating software delivery for fast, reliable deployments.
- Tackle complex challenges head-on, troubleshooting and resolving issues to ensure our products run flawlessly for clients.
- Lead by example, providing technical guidance and mentoring to your team, driving innovation and embracing new processes.
- Deliver top-tier code and detailed system design documents, setting the standard with technical walkthroughs.
- Bridge the gap between technical and non-technical stakeholders, turning complex requirements into elegant, actionable solutions.
- Mentor junior developers, nurturing their growth and helping them build skills and careers under your leadership.

What We're Looking For: We're seeking a passionate, experienced professional with:
- 10-13 years of hands-on experience designing and building data-intensive solutions using distributed computing, showcasing mastery of scalable architectures.
- Proven success implementing and maintaining enterprise search solutions in large-scale environments, ensuring peak performance and reliability.
- A history of partnering with business stakeholders and users to shape research directions and craft robust, maintainable products.
- Extensive experience deploying data engineering solutions in public clouds such as AWS, GCP, or Azure.
- Advanced programming skills in Python, Java, .NET, or Scala, backed by a portfolio of projects.
- Strong knowledge of GenAI tools (e.g., GitHub Copilot, ChatGPT, Claude, or Gemini) and their power to boost developer productivity.
- Expertise in containerization, scripting, cloud platforms, and CI/CD practices.
- 5+ years working with Python, Java, .NET, Kubernetes, and data/workflow orchestration tools.
- Deep experience with SQL, NoSQL, Apache Spark, Airflow, or similar tools, operationalizing data-driven pipelines for large-scale batch and stream processing.
- A knack for rapid prototyping and iteration, delivering high-quality solutions under tight deadlines.
- Outstanding communication and documentation skills, adept at explaining complex ideas to technical and non-technical audiences alike.

Take the Next Step: Ready to elevate your career and make a lasting impact in data and technology? Join us at S&P Global and help shape the future of financial information and analytics. Apply today!

Return to Work: Have you taken time out for caring responsibilities and are now looking to return to work? As part of our Return-to-Work initiative, we are encouraging enthusiastic and talented returners to apply and will actively support your return to the workplace.
Job ID: 316185
Posted On: 2025-06-06
Location: Hyderabad, Telangana, India

Posted 1 week ago

Apply

6.0 years

7 - 10 Lacs

Hyderābād

On-site

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data, and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits, and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

Primary responsibilities:
- Implement end-to-end features and user stories, from analysis and design through building, validation, deployment, and post-deployment support.
- Implement the core and complex user stories and components of the platform.
- Implement POCs as required to make sure that suggested designs and technologies meet the requirements.
- Identify and create reusable components.
- Create Scala/Spark jobs for data transformation and aggregation; implement user stories in data processing pipelines in AWS/Azure.
- Produce unit tests for Spark transformations and helper methods.
- Perform code reviews and provide meaningful feedback to improve code quality.
- Possess or acquire solid troubleshooting skills, and be interested in troubleshooting issues across disparate technologies and environments.
- Identify and integrate well across all integration points in the context of a project, as well as other applications in the environment.
- Provide solutions to any issues raised during code review, and be able to justify the decisions taken.
- Help teams with complex and unusual bugs and troubleshooting scenarios.
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required qualifications:
- Bachelor's degree in Computer Science, Information Technology, or equivalent experience.
- 6+ years of working experience with Scala/Python, Spark, Hadoop (MapR), Kafka, Kubernetes, and CI/CD.
- Experience in Java is an added advantage.
- Understanding of all non-functional requirements and the ability to address them in design and code.
- Ability to understand and relate technology integration scenarios, and apply those learnings in complex troubleshooting scenarios.
- Understanding of CI/CD pipelines.
- Proficiency in programming languages such as Scala, Python, and Java, and experience with IDEs such as Eclipse and IntelliJ.
- Understanding of agile methodology.
- Proven to be proactive and self-motivated; spots improvement opportunities within and outside the project and presents them.
- Proven solid written and verbal communication skills, including explaining complex concepts effectively to technical and non-technical audiences.

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location, and income, deserves the opportunity to live their healthiest life.
Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes — an enterprise priority reflected in our mission.
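As a purely illustrative aside (not part of the posting), the following is a minimal sketch of the kind of Scala/Spark transformation-and-aggregation job described above, with the core logic kept as a pure function so it can be unit tested against small in-memory DataFrames. The object name, storage paths, and column names are all hypothetical.

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.functions._

object ClaimsAggregationJob {
  // Pure transformation, separated from I/O so a unit test can feed it
  // a tiny DataFrame and assert on the result.
  def dailyCostByMember(claims: DataFrame): DataFrame =
    claims
      .filter(col("status") === "APPROVED")
      .groupBy(col("member_id"), to_date(col("service_ts")).as("service_date"))
      .agg(sum("allowed_amount").as("total_allowed"))

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("claims-aggregation").getOrCreate()
    // Hypothetical locations; in practice these would come from job configuration.
    val claims = spark.read.parquet("s3://example-bucket/claims_raw/")
    dailyCostByMember(claims)
      .write.mode("overwrite")
      .parquet("s3://example-bucket/claims_daily/")
    spark.stop()
  }
}
```

Keeping the transformation free of side effects is what makes the posting's "unit tests for Spark transformations and helper methods" requirement practical in day-to-day work.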

Posted 1 week ago

Apply

3.0 years

2 - 7 Lacs

Hyderābād

On-site

Note: By applying to this position you will have an opportunity to share your preferred working location from the following: Hyderabad, Telangana, India; Bengaluru, Karnataka, India; Gurugram, Haryana, India. Minimum qualifications: Bachelor's degree in Computer Science, Mathematics, a related technical field, or equivalent practical experience. 3 years of experience in building Machine Learning or Data Science solutions. Experience in Python, Scala, R, or related, with data structures, algorithms, and software design. Ability to travel up to 30% of the time as needed. Preferred qualifications: Experience with recommendation engines, data pipelines, or distributed machine learning with data analytics, data visualization techniques and software, and deep learning frameworks. Experience in software development, professional services, solution engineering, technical consulting with architecting and rolling out new technology and solution initiatives. Experience with Data Science techniques. Knowledge of data warehousing concepts, including data warehouse technical architectures, infrastructure components, Extract, Transform, and Load/Extract, Load and Transform (ETL/ELT) and reporting tools and environments. Knowledge of cloud computing, including virtualization, hosted services, multi-tenant cloud infrastructures, storage systems, and content delivery networks. Excellent communication skills. About the job The Google Cloud Platform team helps customers transform and build what's next for their business — all with technology built in the cloud. Our products are developed for security, reliability and scalability, running the full stack from infrastructure to applications to devices and hardware. Our teams are dedicated to helping our customers — developers, small and large businesses, educational institutions and government agencies — see the benefits of our technology come to life. As part of an entrepreneurial team in this rapidly growing business, you will play a key role in understanding the needs of our customers and help shape how businesses of all sizes use technology to connect with customers, employees and partners. In this role, you will play a role in ensuring that customers have the best experience moving to the Google Cloud machine learning (ML) suite of products. You will design and implement machine learning solutions for customer use cases, leveraging core Google products. You will work with customers to identify opportunities to transform their business with machine learning, and will travel to customer sites to deploy solutions and deliver workshops designed to educate and empower customers to realize the potential of Google Cloud. You will have access to Google’s technology to monitor application performance, debug and troubleshoot product code, and address customer and partner needs. You will lead the execution of adapting Google Cloud Platform solutions to the customer’s requirements. Google Cloud accelerates every organization’s ability to digitally transform its business and industry. We deliver enterprise-grade solutions that leverage Google’s cutting-edge technology, and tools that help developers build more sustainably. Customers in more than 200 countries and territories turn to Google Cloud as their trusted partner to enable growth and solve their most critical business problems. Responsibilities Deliver big data and machine learning solutions and solve technical customer challenges. Act as a trusted technical advisor to Google’s customers. 
Identify new product features and feature gaps, provide guidance on existing product challenges, and collaborate with Product Managers and Engineers to influence the roadmap of Google Cloud Platform. Deliver recommendations, tutorials, blog articles, and technical presentations, adapting to different levels of business and technical stakeholders. Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.
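Since the preferred qualifications mention recommendation engines and distributed machine learning, a minimal Spark MLlib collaborative-filtering sketch in Scala may help make the expectation concrete. This is an assumption-laden illustration, not anything specified by the posting: the dataset path, Rating schema, and hyperparameters are invented.

```scala
import org.apache.spark.ml.recommendation.ALS
import org.apache.spark.sql.SparkSession

object RecommenderSketch {
  // Hypothetical ratings schema: one row per (user, item) interaction.
  case class Rating(userId: Int, itemId: Int, rating: Float)

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("als-recommender").getOrCreate()
    import spark.implicits._

    val ratings = spark.read
      .option("header", "true").option("inferSchema", "true")
      .csv("gs://example-bucket/ratings.csv") // illustrative location
      .as[Rating]

    val als = new ALS()
      .setUserCol("userId").setItemCol("itemId").setRatingCol("rating")
      .setRank(10).setMaxIter(10).setRegParam(0.1)
      .setColdStartStrategy("drop") // avoid NaN predictions for unseen users/items

    val model = als.fit(ratings.toDF())
    model.recommendForAllUsers(5).show(truncate = false) // top-5 items per user
    spark.stop()
  }
}
```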

Posted 1 week ago

Apply

7.0 years

0 Lacs

Greater Chennai Area

Remote


Your work days are brighter here. At Workday, it all began with a conversation over breakfast. When our founders met at a sunny California diner, they came up with an idea to revolutionize the enterprise software market. And when we began to rise, one thing that really set us apart was our culture. A culture which was driven by our value of putting our people first. And ever since, the happiness, development, and contribution of every Workmate is central to who we are. Our Workmates believe a healthy employee-centric, collaborative culture is the essential mix of ingredients for success in business. That’s why we look after our people, communities and the planet while still being profitable. Feel encouraged to shine, however that manifests: you don’t need to hide who you are. You can feel the energy and the passion; it's what makes us unique. Inspired to make a brighter work day for all and transform with us to the next stage of our growth journey? Bring your brightest version of you and have a brighter work day here. About The Team Are you interested in an exciting new adventure building developer tooling? The Product Developer Tooling organization develops software and tools to support all of Workday Application Development and Testing and is extremely passionate about improving developer productivity. As a Software Engineer in our Tooling organization, you will be at the foundation of Workday’s technology, building software that empowers engineering teams to rapidly develop, test and deliver high quality products. Our team currently serves the almost 3,000-strong Workday development community by providing scalable development and testing tools that are vital to support an efficient continuous delivery platform. We have a work environment that is not driven by external product launches, but instead by the needs of our own development community, which allows us to focus on producing well thought-out solutions that enhance our development environment, automated testing and delivery pipeline. About The Role We are looking for a passionate, experienced Sr. Software Engineer to join us on our mission to help shape the next generation of our Workday Developer Tools! We want someone who will be at the forefront of shaping the development and test lifecycle of the other passionate developers who build our Workday Products. Our team follows a hybrid remote model and is built on collaborative teamwork and trust. We love Slack and Zoom to enable our varied communication models, but also value face-to-face time during the moments that matter to our team. This role is for you if you are... Passionate about technology and building world-class applications and frameworks in a fast-paced, fun, agile work environment! 
A proficient OO and/or functional programmer, enthusiastic about learning and applying sound architectural principles to build scalable/performant designs Someone who is eager to contribute to the scoping, planning, architecture, design, implementation, testing and delivery of key Product features Enthusiastic about collaborating with peers, engineering managers and senior/principal engineers on the technical designs and implementation of new features Interested in participating in the release planning process by understanding the details of the upcoming features (design, effort, risk, priority, size) Interested in Product quality, testing and functional test methodologies (Unit testing, TDD, BDD, etc.) About You Basic Qualifications 7+ years of Object Oriented and/or Functional Design and Programming (Java, JavaScript, Ruby, Scala, etc.) Experience working with automation, CI/CD or web testing software Proficient with HTTP, REST, SOAP, XML and JSON, as well as key web frameworks (e.g. React, Angular) Demonstrated ability to deliver on time, working in a fast-paced agile environment Competence in communicating design ideas cohesively using UML or technical presentations Agile Methodologies, Code Reviews, Java, JavaScript, Python (Programming Language), Software Development BS/MS in Computer Science or related technical field Other Qualifications Test-focused with good TDD / Unit & System Testing, debugging and profiling skills Experienced with common IDE, build & CI/CD tools (e.g. IntelliJ, Git, Gradle, Maven, Jenkins, TeamCity, Artifactory) Good code review skills and capacity to both provide and act on constructive feedback Excellent collaboration and communication skills Pursuant to applicable Fair Chance law, Workday will consider for employment qualified applicants with arrest and conviction records. Workday is an Equal Opportunity Employer including individuals with disabilities and protected veterans. Are you being referred to one of our roles? If so, ask your connection at Workday about our Employee Referral process!
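Given the posting's emphasis on TDD and unit testing, a small ScalaTest example may help picture the working style. The SemVer helper and its behaviour are hypothetical, invented purely to illustrate the test-first pattern.

```scala
import org.scalatest.funsuite.AnyFunSuite

// Hypothetical utility under test: the kind of small, pure helper a
// developer-tooling team would specify through unit tests first (TDD).
object SemVer {
  def bumpPatch(version: String): String =
    version.split('.') match {
      case Array(major, minor, patch) => s"$major.$minor.${patch.toInt + 1}"
      case _ => throw new IllegalArgumentException(s"not a semver string: $version")
    }
}

class SemVerSpec extends AnyFunSuite {
  test("bumpPatch increments the patch component") {
    assert(SemVer.bumpPatch("1.4.2") == "1.4.3")
  }
  test("bumpPatch rejects malformed versions") {
    intercept[IllegalArgumentException](SemVer.bumpPatch("1.4"))
  }
}
```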

Posted 1 week ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


hackajob is collaborating with Zipcar to connect them with exceptional tech professionals for this role. Who are we? Glad you asked! Avis Budget Group is a leading provider of mobility options, with brands including Avis, Budget & Budget Truck, and Zipcar. With more than 70 years of experience and 11,000 locations in 180 countries, we are shaping the future of our industry and want you to join us in our mission. Zipcar is the world’s leading car-sharing network, found in urban areas and university campuses in more than 500 cities and towns. Our team is smart, creative and fun, and we’re driven by a mission: to enable simple and responsible urban living. Apply today to get connected to an exciting career, a supportive family of employees and a world of opportunities. What is ABG’s strategy in India? At our India Build Center, you will play a key role in driving the digital transformation narrative of ABG. Being at the core of ABG’s growth strategy, we will develop technology-led offerings that would position Avis and its brands as the best vehicle rental company in the world. Our goal is to create the future of customer experience through technology. The India Build Center is based in Bengaluru, India. We are currently located at WeWork Kalyani Roshni Tech Hub in Marathahalli on Outer Ring Road, strategically located close to product companies and multiple tech-parks like Embassy Tech Park, ETV, RMZ Ecospace, Kalyani Tech Park, EPIP Zone and ITPL among others. The Fine Print We encourage Zipsters to bring their whole selves to work - unique perspectives, personal experiences, backgrounds, and however they identify. We are proud to be an equal opportunity employer - M/F/D/V. This document does not constitute a promise or guarantee of employment. This document describes the general nature and level of this position only. Essential functions and responsibilities may change as business needs require. This position may be with any affiliate of Avis Budget Group. Data Engineer/SDE 3 (Data Engineering) Location: Bengaluru, India | 100% on-site The Impact You’ll Make We are looking for a talented and passionate senior engineer to lead the way on the development and maintenance of Zipcar’s core platform services. These are the underlying services that support our car sharing mobile and web ecommerce products - the primary driver of $9B in annual revenue. This role requires a resourceful individual, a persistent problem solver, and a strong hands-on engineer. This is a great opportunity to have a big impact as part of a growing team in the midst of technology and product transformation. Watch our talk at a recent AWS re:Invent conference here. What You’ll Do Build a deep understanding of existing systems. Participate in or lead design reviews with peers and stakeholders. Develop robust, testable code that meets design requirements. Review code developed by other developers, providing feedback on style, functional correctness, testability, and efficiency. Triage system-wide issues and identify root cause of incidents. Work independently and participate in or contribute to architecture discussions. Identify and resolve existing critical technical debt. Build transparent systems with proper monitoring, observability, and alerting. 
Plan for robust build, test, and deployment automation Work with product stakeholders and front-end developers to understand the essence of requirements and to provide pragmatic solutions Work within an Agile framework What We’re Looking For 3-5 years of professional experience designing, building, and maintaining highly available data and analytics platforms. 3+ years of experience in data engineering, with a focus on building large-scale data processing systems. Hands-on experience with AWS or similar cloud platform building data engineering solutions for analytics and science. (2+ years) Must have experience building complex data pipelines - batch and/or real-time event-based processing (2+ years) Strong experience in designing, building and maintaining data warehouses in Redshift or similar cloud-based solutions. (2+ years) Experience in Matillion or similar ETL/ELT tools for developing data ingestion and curation flows (2+ years) Must have strong hands-on experience in SQL. (2+ years) Strong hands-on experience in modern scripting languages using Python. (2+ years) Experience building complex ETL using Spark (Scala or Python) for event-based big data processing (1+ years) Strong hands-on experience with NoSQL DBs - MongoDB, Cassandra or DynamoDB (1+ years) Strong experience with AWS deployment using CI/CD pipeline is preferred. (1+ years) Experience in infrastructure-as-code services like Terraform preferred. (1+ years) Experience building mission critical systems, running 24x7. Desire to work within a team of engineers at all levels of experience. Desire to mentor junior developers, maximizing their productivity. Good written and spoken communication skills.
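As a concrete companion to the event-based pipeline requirement above, here is a minimal, assumed Scala/Spark batch ETL sketch that cleans raw JSON events and writes analytics-friendly partitioned Parquet. The bucket names, event schema, and columns are hypothetical, not Zipcar's actual systems.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object TripEventsEtl {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("trip-events-etl").getOrCreate()

    // Hypothetical landing zone of JSON reservation/trip events.
    val raw = spark.read.json("s3://example-lake/landing/trip_events/")

    val cleaned = raw
      .filter(col("event_id").isNotNull)
      .dropDuplicates("event_id")                // events may be delivered more than once
      .withColumn("event_date", to_date(col("event_ts")))

    cleaned.write
      .mode("overwrite")
      .partitionBy("event_date")                 // partitioning keeps downstream scans cheap
      .parquet("s3://example-lake/curated/trip_events/")

    spark.stop()
  }
}
```

Deduplicating on a stable event identifier is one simple way to keep a pipeline like this idempotent when upstream producers deliver at-least-once.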

Posted 1 week ago

Apply

10.0 years

3 - 7 Lacs

Gurgaon

On-site

About the Role: Grade Level (for internal use): 11 S&P Global EDO The Role: Lead, Software Engineering - IT Application Development. Join Our Team: Step into a dynamic team at the cutting edge of data innovation! You’ll collaborate daily with talented professionals from around the world, designing and developing next-generation data products for our clients. Our team thrives on a diverse toolkit that evolves with emerging technologies, offering you the chance to work in a vibrant, global environment that fosters creativity and teamwork. The Impact: As a Lead Software Developer at S&P Global, you’ll be a driving force in shaping the future of our data products. Your expertise will streamline software development and deployment, aligning cutting-edge solutions with business needs. By ensuring seamless integration and continuous delivery, you’ll enhance product capabilities, delivering high-quality systems that meet the highest standards of availability, security, and performance. Your work will empower our clients with impactful, data-driven solutions, making a real difference in the financial world. What’s in it for You: Career Development: Build a rewarding career with a global leader in financial information and analytics, supported by continuous learning and a clear path to advancement. Dynamic Work Environment: Thrive in a fast-paced, forward-thinking setting where your ideas fuel innovation and your contributions shape groundbreaking solutions. Skill Enhancement: Elevate your expertise on an enterprise-level platform, mastering the latest tools and techniques in software development. Versatile Experience: Dive into full-stack development with hands-on exposure to cloud computing, big data, and revolutionary GenAI technologies. Leadership Opportunities: Guide and inspire a skilled team, steering the direction of our products and leaving your mark on the future of technology at S&P Global. Responsibilities: Architect and develop scalable big data and cloud applications, harnessing a range of cloud services to create robust, high-performing solutions. Design and implement advanced CI/CD pipelines, automating software delivery for fast, reliable deployments that keep us ahead of the curve. Tackle complex challenges head-on, troubleshooting and resolving issues to ensure our products run flawlessly for clients. Lead by example, providing technical guidance and mentoring to your team, driving innovation and embracing new processes. Deliver top-tier code and detailed system design documents, setting the standard with technical walkthroughs that inspire excellence. Bridge the gap between technical and non-technical stakeholders, turning complex requirements into elegant, actionable solutions. Mentor junior developers, nurturing their growth and helping them build skills and careers under your leadership. What We’re Looking For: We’re seeking a passionate, experienced professional with: 10-13 years of hands-on experience designing and building data-intensive solutions using distributed computing, showcasing your mastery of scalable architectures. Proven success implementing and maintaining enterprise search solutions in large-scale environments, ensuring peak performance and reliability. A history of partnering with business stakeholders and users to shape research directions and craft robust, maintainable products. Extensive experience deploying data engineering solutions in public clouds like AWS, GCP, or Azure, leveraging cloud power to its fullest. 
Advanced programming skills in Python, Java, .NET or Scala, backed by a portfolio of impressive projects. Strong knowledge of GenAI tools (e.g., GitHub Copilot, ChatGPT, Claude, or Gemini) and their power to boost developer productivity. Expertise in containerization, scripting, cloud platforms, and CI/CD practices, ready to shine in a modern development ecosystem. 5+ years working with Python, Java, .NET, Kubernetes, and data/workflow orchestration tools, proving your technical versatility. Deep experience with SQL, NoSQL, Apache Spark, Airflow, or similar tools, operationalizing data-driven pipelines for large-scale batch and stream processing. A knack for rapid prototyping and iteration, delivering high-quality solutions under tight deadlines. Outstanding communication and documentation skills, adept at explaining complex ideas to technical and non-technical audiences alike. Take the Next Step: Ready to elevate your career and make a lasting impact in data and technology? Join us at S&P Global and help shape the future of financial information and analytics. Apply today! Return to Work Have you taken time out for caring responsibilities and are now looking to return to work? As part of our Return-to-Work initiative, we are encouraging enthusiastic and talented returners to apply and will actively support your return to the workplace. About S&P Global Market Intelligence At S&P Global Market Intelligence, a division of S&P Global we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence . What’s In It For You? Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability, to analyzing energy transition across the supply chain, to building workflow solutions that make it easy to tap into insight and apply it, we are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. 
We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include: Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. ----------------------------------------------------------- Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf ----------------------------------------------------------- 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.2 - Middle Professional Tier II (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning) Job ID: 316190 Posted On: 2025-06-06 Location: Hyderabad, Telangana, India
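To ground the CI/CD and packaging expectations in this posting, here is a minimal, assumed build.sbt sketch (sbt build definitions are themselves Scala) showing how a Spark service might be wired so an automated pipeline can run `sbt test` as a quality gate and then package a deployable artifact via the sbt-assembly plugin (declared separately in project/plugins.sbt). Project names and versions are illustrative only.

```scala
// build.sbt -- illustrative sketch, not a prescribed S&P Global setup.
ThisBuild / scalaVersion := "2.12.18"
ThisBuild / organization := "com.example"

lazy val root = (project in file("."))
  .settings(
    name := "data-platform-service",
    libraryDependencies ++= Seq(
      // Provided scope: the Spark runtime comes from the cluster, not the jar.
      "org.apache.spark" %% "spark-sql" % "3.5.1" % Provided,
      "org.scalatest"    %% "scalatest" % "3.2.18" % Test
    )
  )
```

A CI job would then typically run the test suite on every commit and publish the assembled jar only from the release branch, which is one common way to realize the "fast, reliable deployments" described above.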

Posted 1 week ago

Apply

3.0 years

3 - 8 Lacs

Gurgaon

On-site

Note: By applying to this position you will have an opportunity to share your preferred working location from the following: Hyderabad, Telangana, India; Bengaluru, Karnataka, India; Gurugram, Haryana, India. Minimum qualifications: Bachelor's degree in Computer Science, Mathematics, a related technical field, or equivalent practical experience. 3 years of experience in building Machine Learning or Data Science solutions. Experience in Python, Scala, R, or related, with data structures, algorithms, and software design. Ability to travel up to 30% of the time as needed. Preferred qualifications: Experience with recommendation engines, data pipelines, or distributed machine learning with data analytics, data visualization techniques and software, and deep learning frameworks. Experience in software development, professional services, solution engineering, technical consulting with architecting and rolling out new technology and solution initiatives. Experience with Data Science techniques. Knowledge of data warehousing concepts, including data warehouse technical architectures, infrastructure components, Extract, Transform, and Load/Extract, Load and Transform (ETL/ELT) and reporting tools and environments. Knowledge of cloud computing, including virtualization, hosted services, multi-tenant cloud infrastructures, storage systems, and content delivery networks. Excellent communication skills. About the job The Google Cloud Platform team helps customers transform and build what's next for their business — all with technology built in the cloud. Our products are developed for security, reliability and scalability, running the full stack from infrastructure to applications to devices and hardware. Our teams are dedicated to helping our customers — developers, small and large businesses, educational institutions and government agencies — see the benefits of our technology come to life. As part of an entrepreneurial team in this rapidly growing business, you will play a key role in understanding the needs of our customers and help shape how businesses of all sizes use technology to connect with customers, employees and partners. In this role, you will play a role in ensuring that customers have the best experience moving to the Google Cloud machine learning (ML) suite of products. You will design and implement machine learning solutions for customer use cases, leveraging core Google products. You will work with customers to identify opportunities to transform their business with machine learning, and will travel to customer sites to deploy solutions and deliver workshops designed to educate and empower customers to realize the potential of Google Cloud. You will have access to Google’s technology to monitor application performance, debug and troubleshoot product code, and address customer and partner needs. You will lead the execution of adapting Google Cloud Platform solutions to the customer’s requirements. Google Cloud accelerates every organization’s ability to digitally transform its business and industry. We deliver enterprise-grade solutions that leverage Google’s cutting-edge technology, and tools that help developers build more sustainably. Customers in more than 200 countries and territories turn to Google Cloud as their trusted partner to enable growth and solve their most critical business problems. Responsibilities Deliver big data and machine learning solutions and solve technical customer challenges. Act as a trusted technical advisor to Google’s customers. 
Identify new product features and feature gaps, provide guidance on existing product challenges, and collaborate with Product Managers and Engineers to influence the roadmap of Google Cloud Platform. Deliver recommendations, tutorials, blog articles, and technical presentations, adapting to different levels of business and technical stakeholders. Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.

Posted 1 week ago

Apply

9.0 years

0 Lacs

Bengaluru

On-site

JOB DESCRIPTION About KPMG in India KPMG entities in India are professional services firm(s). These Indian member firms are affiliated with KPMG International Limited. KPMG was established in India in August 1993. Our professionals leverage the global network of firms, and are conversant with local laws, regulations, markets and competition. KPMG has offices across India in Ahmedabad, Bengaluru, Chandigarh, Chennai, Gurugram, Hyderabad, Jaipur, Kochi, Kolkata, Mumbai, Noida, Pune, Vadodara and Vijayawada. KPMG entities in India offer services to national and international clients in India across sectors. We strive to provide rapid, performance-based, industry-focused and technology-enabled services, which reflect a shared knowledge of global and local industries and our experience of the Indian business environment. Equal employment opportunity information KPMG India has a policy of providing equal opportunity for all applicants and employees regardless of their color, caste, religion, age, sex/gender, national origin, citizenship, sexual orientation, gender identity or expression, disability or other legally protected status. KPMG India values diversity and we request you to submit the details below to support us in our endeavor for diversity. Providing the below information is voluntary and refusal to submit such information will not be prejudicial to you. Major Duties & Responsibilities • Work with business stakeholders and cross-functional SMEs to deeply understand business context and key business questions • Create proofs of concept (POCs) / minimum viable products (MVPs), then guide them through to production deployment and operationalization of projects • Influence machine learning strategy for Digital programs and projects • Make solution recommendations that appropriately balance speed to market and analytical soundness • Explore design options to assess efficiency and impact, develop approaches to improve robustness and rigor • Develop analytical / modelling solutions using a variety of commercial and open-source tools (e.g., Python, R, TensorFlow) • Formulate model-based solutions by combining machine learning algorithms with other techniques such as simulations. • Design, adapt, and visualize solutions based on evolving requirements and communicate them through presentations, scenarios, and stories. • Create algorithms to extract information from large, multiparametric data sets. • Deploy algorithms to production to identify actionable insights from large databases. • Compare results from various methodologies and recommend optimal techniques. 
• Develop and embed automated processes for predictive model validation, deployment, and implementation • Work on multiple pillars of AI including cognitive engineering, conversational bots, and data science • Ensure that solutions exhibit high levels of performance, security, scalability, maintainability, repeatability, appropriate reusability, and reliability upon deployment • Lead discussions at peer review and use interpersonal skills to positively influence decision making • Provide thought leadership and subject matter expertise in machine learning techniques, tools, and concepts; make impactful contributions to internal discussions on emerging practices • Facilitate cross-geography sharing of new ideas, learnings, and best practices Required Qualifications • Bachelor of Science or Bachelor of Engineering at a minimum. • 9+ years of work experience as a Data Scientist • A combination of business focus, strong analytical and problem-solving skills, and programming knowledge to be able to quickly cycle hypotheses through the discovery phase of a project • Advanced skills with statistical/programming software (e.g., R, Python) and data querying languages (e.g., SQL, Hadoop/Hive, Scala) • Good hands-on skills in both feature engineering and hyperparameter optimization • Experience producing high-quality code, tests, documentation • Experience with Microsoft Azure or AWS data management tools such as Azure Data Factory, Data Lake, Azure ML, Synapse, Databricks • Understanding of descriptive and exploratory statistics, predictive modelling, evaluation metrics, decision trees, machine learning algorithms, optimization & forecasting techniques, and/or deep learning methodologies • Proficiency in statistical concepts and ML algorithms • Good knowledge of Agile principles and process • Ability to lead, manage, build, and deliver customer business results through data scientists or professional services teams • Ability to share ideas in a compelling manner, to clearly summarize and communicate data analysis assumptions and results • Self-motivated and a proactive problem solver who can work independently and in teams QUALIFICATIONS B.Tech/M.Tech/MCA/M.Sc
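As an illustrative sketch of the modelling-to-validation loop described above (not KPMG's actual stack), a compact Spark MLlib pipeline in Scala might look like the following; the feature table, column names, and storage path are assumptions invented for the example.

```scala
import org.apache.spark.ml.Pipeline
import org.apache.spark.ml.classification.LogisticRegression
import org.apache.spark.ml.evaluation.BinaryClassificationEvaluator
import org.apache.spark.ml.feature.VectorAssembler
import org.apache.spark.sql.SparkSession

object ChurnModelSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("churn-model").getOrCreate()

    // Hypothetical feature table with a binary `label` column.
    val data = spark.read.parquet("abfss://lake@example.dfs.core.windows.net/features/churn/")
    val Array(train, test) = data.randomSplit(Array(0.8, 0.2), seed = 42)

    // Assemble raw columns into the single feature vector MLlib expects.
    val assembler = new VectorAssembler()
      .setInputCols(Array("tenure_months", "monthly_spend", "support_tickets"))
      .setOutputCol("features")

    val lr = new LogisticRegression().setLabelCol("label").setFeaturesCol("features")
    val model = new Pipeline().setStages(Array(assembler, lr)).fit(train)

    // Held-out AUC as a simple automated validation gate before deployment.
    val auc = new BinaryClassificationEvaluator()
      .setLabelCol("label")
      .evaluate(model.transform(test))
    println(f"Test AUC = $auc%.3f")
    spark.stop()
  }
}
```

Wrapping feature assembly and the estimator in a single Pipeline is what makes the "automated processes for predictive model validation and deployment" duty tractable: the fitted pipeline can be persisted and promoted as one artifact.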

Posted 1 week ago

Apply

10.0 years

12 - 20 Lacs

Bengaluru

Remote

Data Architect Kadel Labs is a leading IT services company delivering top-quality technology solutions since 2017, focused on enhancing business operations and productivity through tailored, scalable, and future-ready solutions. With deep domain expertise and a commitment to innovation, we help businesses stay ahead of technological trends. As a CMMI Level 3 and ISO 27001:2022 certified company, we ensure best-in-class process maturity and information security, enabling organizations to achieve their digital transformation goals with confidence and efficiency. Experience: 10+ Yrs Location: Udaipur, Jaipur, Bangalore Domain: Telecom Job Description: We are seeking an experienced Telecom Data Architect to join our team. In this role, you will be responsible for designing comprehensive data architecture and technical solutions specifically for telecommunications industry challenges, leveraging TMforum frameworks and modern data platforms. You will work closely with customers and technology partners to deliver data solutions that address complex telecommunications business requirements including customer experience management, network optimization, revenue assurance, and digital transformation initiatives. Key Responsibilities: Design and articulate enterprise-scale telecom data architectures incorporating TMforum standards and frameworks, including SID (Shared Information/Data Model), TAM (Telecom Application Map), and eTOM (enhanced Telecom Operations Map) Develop comprehensive data models aligned with TMforum guidelines for telecommunications domains such as Customer, Product, Service, Resource, and Partner management Create data architectures that support telecom-specific use cases including customer journey analytics, network performance optimization, fraud detection, and revenue assurance Design solutions leveraging Microsoft Azure and Databricks for telecom data processing and analytics Conduct technical discovery sessions with telecom clients to understand their OSS/BSS architecture, network analytics needs, customer experience requirements, and digital transformation objectives Design and deliver proof of concepts (POCs) and technical demonstrations showcasing modern data platforms solving real-world telecommunications challenges Create comprehensive architectural diagrams and implementation roadmaps for telecom data ecosystems spanning cloud, on-premises, and hybrid environments Evaluate and recommend appropriate big data technologies, cloud platforms, and processing frameworks based on telecom-specific requirements and regulatory compliance needs. Design data governance frameworks compliant with telecom industry standards and regulatory requirements (GDPR, data localization, etc.) 
Stay current with the latest advancements in data technologies including cloud services, data processing frameworks, and AI/ML capabilities Contribute to the development of best practices, reference architectures, and reusable solution components for accelerating proposal development Required Skills: 10+ years of experience in data architecture, data engineering, or solution architecture roles with at least 5 years in telecommunications industry Deep knowledge of TMforum frameworks including SID (Shared Information/Data Model), eTOM, TAM, and their practical implementation in telecom data architectures Demonstrated ability to estimate project efforts, resource requirements, and implementation timelines for complex telecom data initiatives Hands-on experience building data models and platforms aligned with TMforum standards and telecommunications business processes Strong understanding of telecom OSS/BSS systems, network management, customer experience management, and revenue management domains Hands-on experience with data platforms including Databricks, and Microsoft Azure in telecommunications contexts Experience with modern data processing frameworks such as Apache Kafka, Spark and Airflow for real-time telecom data streaming Proficiency in Azure cloud platform and its respective data services with an understanding of telecom-specific deployment requirements Knowledge of system monitoring and observability tools for telecommunications data infrastructure Experience implementing automated testing frameworks for telecom data platforms and pipelines Familiarity with telecom data integration patterns, ETL/ELT processes, and data governance practices specific to telecommunications Experience designing and implementing data lakes, data warehouses, and machine learning pipelines for telecom use cases Proficiency in programming languages commonly used in data processing (Python, Scala, SQL) with telecom domain applications Understanding of telecommunications regulatory requirements and data privacy compliance (GDPR, local data protection laws) Excellent communication and presentation skills with ability to explain complex technical concepts to telecom stakeholders Strong problem-solving skills and ability to think creatively to address telecommunications industry challenges Good to have TMforum certifications or telecommunications industry certifications Relevant data platform certifications such as Databricks, Azure Data Engineer are a plus Willingness to travel as required Educational Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field. Visit us: https://kadellabs.com/ https://in.linkedin.com/company/kadel-labs https://www.glassdoor.co.in/Overview/Working-at-Kadel-Labs-EI_IE4991279.11,21.htm Job Types: Full-time, Permanent Pay: ₹1,287,062.21 - ₹2,009,304.16 per year Benefits: Flexible schedule Health insurance Leave encashment Paid time off Provident Fund Work from home Schedule: Day shift Monday to Friday Supplemental Pay: Overtime pay Performance bonus Quarterly bonus Yearly bonus Ability to commute/relocate: Bengaluru, Karnataka: Reliably commute or planning to relocate before starting work (Required) Application Question(s): How many years of experience do you have in Telecom Data Engineering? Experience: Data science: 9 years (Required) Location: Bengaluru, Karnataka (Required) Work Location: In person
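To make the real-time telecom streaming requirement concrete, here is a minimal, assumed Scala sketch using Spark Structured Streaming over Kafka (it requires the spark-sql-kafka connector on the classpath). The call-detail-record schema, topic name, broker address, and output paths are hypothetical simplifications, not a prescribed Kadel Labs design.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._
import org.apache.spark.sql.types._

object CdrStreamIngest {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("cdr-stream-ingest").getOrCreate()

    // Hypothetical call-detail-record schema.
    val cdrSchema = new StructType()
      .add("msisdn", StringType)
      .add("cell_id", StringType)
      .add("duration_s", IntegerType)
      .add("event_ts", TimestampType)

    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092") // illustrative broker
      .option("subscribe", "cdr-events")
      .load()
      .select(from_json(col("value").cast("string"), cdrSchema).as("cdr"))
      .select("cdr.*")

    // 5-minute usage per cell, tolerating 10 minutes of late-arriving events.
    val usage = events
      .withWatermark("event_ts", "10 minutes")
      .groupBy(window(col("event_ts"), "5 minutes"), col("cell_id"))
      .agg(sum("duration_s").as("total_duration_s"))

    usage.writeStream
      .format("parquet")
      .option("path", "/lake/curated/cell_usage/")
      .option("checkpointLocation", "/lake/checkpoints/cell_usage/")
      .outputMode("append")
      .start()
      .awaitTermination()
  }
}
```

The watermark is the key design choice here: it bounds state for the windowed aggregation, which is what keeps a 24x7 network-analytics stream stable.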

Posted 1 week ago

Apply

3.0 years

3 - 10 Lacs

Bengaluru

On-site

Note: By applying to this position you will have an opportunity to share your preferred working location from the following: Hyderabad, Telangana, India; Bengaluru, Karnataka, India; Gurugram, Haryana, India. Minimum qualifications: Bachelor's degree in Computer Science, Mathematics, a related technical field, or equivalent practical experience. 3 years of experience in building Machine Learning or Data Science solutions. Experience in Python, Scala, R, or related, with data structures, algorithms, and software design. Ability to travel up to 30% of the time as needed. Preferred qualifications: Experience with recommendation engines, data pipelines, or distributed machine learning with data analytics, data visualization techniques and software, and deep learning frameworks. Experience in software development, professional services, solution engineering, technical consulting with architecting and rolling out new technology and solution initiatives. Experience with Data Science techniques. Knowledge of data warehousing concepts, including data warehouse technical architectures, infrastructure components, Extract, Transform, and Load/Extract, Load and Transform (ETL/ELT) and reporting tools and environments. Knowledge of cloud computing, including virtualization, hosted services, multi-tenant cloud infrastructures, storage systems, and content delivery networks. Excellent communication skills. About the job The Google Cloud Platform team helps customers transform and build what's next for their business — all with technology built in the cloud. Our products are developed for security, reliability and scalability, running the full stack from infrastructure to applications to devices and hardware. Our teams are dedicated to helping our customers — developers, small and large businesses, educational institutions and government agencies — see the benefits of our technology come to life. As part of an entrepreneurial team in this rapidly growing business, you will play a key role in understanding the needs of our customers and help shape how businesses of all sizes use technology to connect with customers, employees and partners. In this role, you will play a role in ensuring that customers have the best experience moving to the Google Cloud machine learning (ML) suite of products. You will design and implement machine learning solutions for customer use cases, leveraging core Google products. You will work with customers to identify opportunities to transform their business with machine learning, and will travel to customer sites to deploy solutions and deliver workshops designed to educate and empower customers to realize the potential of Google Cloud. You will have access to Google’s technology to monitor application performance, debug and troubleshoot product code, and address customer and partner needs. You will lead the execution of adapting Google Cloud Platform solutions to the customer’s requirements. Google Cloud accelerates every organization’s ability to digitally transform its business and industry. We deliver enterprise-grade solutions that leverage Google’s cutting-edge technology, and tools that help developers build more sustainably. Customers in more than 200 countries and territories turn to Google Cloud as their trusted partner to enable growth and solve their most critical business problems. Responsibilities Deliver big data and machine learning solutions and solve technical customer challenges. Act as a trusted technical advisor to Google’s customers. 
Identify new product features and feature gaps, provide guidance on existing product challenges, and collaborate with Product Managers and Engineers to influence the roadmap of Google Cloud Platform. Deliver recommendations, tutorials, blog articles, and technical presentations, adapting to different levels of business and technical stakeholders. Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.

Posted 1 week ago

Apply

7.0 years

3 - 10 Lacs

Bengaluru

On-site

Job ID: 28021 Location: Bangalore, IN Area of interest: Technology Job type: Regular Employee Work style: Office Working Opening date: 28 May 2025 Job Summary Responsible for the design and development of analytical data models, reports and dashboards Provide design specifications to both onsite & offshore teams Develop functional specifications and document the requirements clearly Develop ETL, reports, data mapping documents and technical design specifications for the project deliverables. Analyse the business requirements and come up with the end-to-end design including technical implementation. Key Responsibilities Strategy Responsible for the design and development of analytical data models, reports and dashboards Provide design specifications to both onsite & offshore teams Develop functional specifications and document the requirements clearly Develop data mapping documents and technical design specifications for the project deliverables. Analyse the business requirements and come up with the end-to-end design including technical implementation. Expert in Power BI, MSTR, Informatica, Hadoop Platform Ecosystem, SQL, Java, R, Python, JavaScript, Hive, Spark, Linux scripts Good knowledge of installation, upgrade, administration, and troubleshooting of ETL & reporting tools such as Power BI, MSTR, Informatica, Oracle and Hadoop Implement performance tuning techniques for reports, ETL and data migration Develop ETL procedures to ensure conformity, compliance with standards and translate business rules and functionality requirements into ETL procedures. Assess and review the report performance, come up with performance optimization techniques using VLDB settings and explain plans. Develop scripts to automate the production deployments. Conduct product demonstrations and user training sessions for business users Work with testing teams to improve the quality of the testing in adopting the automated testing tools and management of the application environments. Business Collaborate and partner with product owners, business users and senior business stakeholders to understand the data and reporting requirements of the business and clearly document them for further analysis Work closely with architects and infrastructure teams and review the solutions Interact with support teams periodically and get input on the various business users’ needs Provide all the required input and assistance to the business users in performing the data validation and ensure that the data and reporting are delivered with accurate numbers. Processes Process oriented and experienced in onsite-offshore project delivery, using agile methodology and best practices Well versed with agile based project delivery methodology. Should have successfully implemented or delivered projects using best practices on technology delivery and project release automation in the banking and financing industry Deployment automation for Oracle Database and Hadoop, Informatica workflows, integration, and the BI layer, including MicroStrategy and PBI components, to the feasible extent Actively participate in discussions with business users and seek endorsements and approvals wherever necessary w.r.t. technology project delivery. People & Talent Minimum 7 years of experience in the business intelligence and data warehouse domain. 
Create project estimations, solution and design documentation, operational guidelines and production handover documentation Should have excellent technical, analytical, interpersonal and delivery capabilities in the areas of complex reporting for the banking domain, especially in the area of Client Analytics and CRM. Full life-cycle Business Intelligence (BI) and Data Warehousing project experience, starting with requirements analysis, proof-of-concepts, design, development, testing, deployment and administration Shall be a good team player with excellent written and verbal communication skills. Process oriented and experienced in onsite-offshore project delivery, using agile methodology and best practices Should be able to play an individual contributor role Risk Management Assess and evaluate the risks that are related to the project delivery and update the stakeholders with appropriate remediation and mitigation approach. Review the technical solutions and deliverables with architects and key technology stakeholders and ensure that the deliverables are adhering to the risk governance rules Governance Work with Technology Governance and support teams and establish standards for simplifying the existing MicroStrategy reports and Informatica batch programs Take end-to-end ownership of managing and administering Informatica, Hadoop, MSTR and Power BI. Regulatory & Business Conduct Display exemplary conduct and live by the Group’s Values and Code of Conduct. Take personal responsibility for embedding the highest standards of ethics, including regulatory and business conduct, across Standard Chartered Bank. This includes understanding and ensuring compliance with, in letter and spirit, all applicable laws, regulations, guidelines and the Group Code of Conduct. Effectively and collaboratively identify, escalate, mitigate and resolve risk, conduct and compliance matters. [Fill in for regulated roles] Lead the [country / business unit / function/XXX [team] to achieve the outcomes set out in the Bank’s Conduct Principles: [Fair Outcomes for Clients; Effective Financial Markets; Financial Crime Compliance; The Right Environment.] * [Insert local regulator e.g. PRA/FCA prescribed responsibilities and Rationale for allocation]. [Where relevant - Additionally, for subsidiaries or relevant non -subsidiaries] Serve as a Director of the Board of [insert name of entities] Exercise authorities delegated by the Board of Directors and act in accordance with Articles of Association (or equivalent) Key stakeholders Business and Operations (Product Owners) Sales Enablement Client Coverage Reporting Technology Services Teams Production Support Teams Skills and Experience Design, development of ETL procedures using Informatica PowerCenter Performance tuning of star schemas to optimize load and query performance of SQL queries. Hive, HiveQL, HDFS, Scala, Spark, Sqoop, HBase, YARN, Presto, Dremio Experience in Oracle 11g, 19c. 
Strong knowledge and understanding of SQL and ability to write SQL, PL/SQL BI and analytical dashboards, reporting design and development using PBI tools and the MicroStrategy Business Intelligence product suite (MicroStrategy Intelligence Server, MicroStrategy Desktop, MicroStrategy Web, MicroStrategy Architect, MicroStrategy Object Manager, MicroStrategy Command Manager, MicroStrategy Integrity Manager, MicroStrategy Office, Visual Insight, Mobile Development) Design of dimensional models such as star and snowflake schemas Setting up connections to Hadoop big data (data lake) clusters through Kerberos authentication mechanisms Banking and Finance specific to financial markets, Collateral, Trade Life Cycle, Operational CRM, Analytical CRM and client-related reporting Design and implementation of Azure Data Solution and Microsoft Azure Cloud Qualifications About Standard Chartered We're an international bank, nimble enough to act, big enough for impact. For more than 170 years, we've worked to make a positive difference for our clients, communities, and each other. We question the status quo, love a challenge and enjoy finding new opportunities to grow and do better than before. If you're looking for a career with purpose and you want to work for a bank making a difference, we want to hear from you. You can count on us to celebrate your unique talents and we can't wait to see the talents you can bring us. Our purpose, to drive commerce and prosperity through our unique diversity, together with our brand promise, to be here for good are achieved by how we each live our valued behaviours. When you work with us, you'll see how we value difference and advocate inclusion. Together we: Do the right thing and are assertive, challenge one another, and live with integrity, while putting the client at the heart of what we do Never settle, continuously striving to improve and innovate, keeping things simple and learning from doing well, and not so well Are better together, we can be ourselves, be inclusive, see more good in others, and work collectively to build for the long term What we offer In line with our Fair Pay Charter, we offer a competitive salary and benefits to support your mental, physical, financial and social wellbeing. Core bank funding for retirement savings, medical and life insurance, with flexible and voluntary benefits available in some locations. Time-off including annual leave, parental/maternity (20 weeks), sabbatical (12 months maximum) and volunteering leave (3 days), along with minimum global standards for annual and public holiday, which is combined to 30 days minimum. Flexible working options based around home and office locations, with flexible working patterns. Proactive wellbeing support through Unmind, a market-leading digital wellbeing platform, development courses for resilience and other human skills, global Employee Assistance Programme, sick leave, mental health first-aiders and all sorts of self-help toolkits. A continuous learning culture to support your growth, with opportunities to reskill and upskill and access to physical, virtual and digital learning. Being part of an inclusive and values-driven organisation, one that embraces and celebrates our unique diversity, across our teams, business functions and geographies - everyone feels respected and can realise their full potential. www.sc.com/careers
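As a purely illustrative aside on the star-schema reporting and tuning themes above, the following Scala sketch queries a hypothetical fact/dimension model through Spark SQL with Hive support, using a broadcast hint as one common performance-tuning technique. The table and column names are invented, not Standard Chartered's model.

```scala
import org.apache.spark.sql.SparkSession

object StarSchemaReport {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("star-schema-report")
      .enableHiveSupport() // resolve dimensional tables from the Hive metastore
      .getOrCreate()

    // Broadcasting the small client dimension avoids shuffling the large
    // fact table, a standard star-schema query optimization.
    val report = spark.sql("""
      SELECT /*+ BROADCAST(c) */
             c.client_segment,
             d.calendar_month,
             SUM(f.notional_amount) AS total_notional
      FROM   fact_trade f
      JOIN   dim_client c ON f.client_key = c.client_key
      JOIN   dim_date   d ON f.trade_date_key = d.date_key
      GROUP  BY c.client_segment, d.calendar_month
    """)

    report.show(20, truncate = false)
    spark.stop()
  }
}
```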

Posted 1 week ago

Apply

5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Company Description 👋🏼We're Nagarro. We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale across all devices and digital mediums, and our people exist everywhere in the world (18000+ experts across 38 countries, to be exact). Our work culture is dynamic and non-hierarchical. We're looking for great new colleagues. That's where you come in. Job Description REQUIREMENTS: Total experience of 5+ years. Hands-on working experience in data engineering. Strong working experience in SQL, Python or Scala. Deep understanding of Cloud Design Patterns and their implementation. Experience working with Snowflake as a data warehouse solution. Experience with Power BI data integration. Design, develop, and maintain scalable data pipelines and ETL processes. Work with structured and unstructured data from multiple sources (APIs, databases, flat files, cloud platforms). Strong understanding of data modelling, warehousing (e.g., Star/Snowflake schema), and relational database systems (PostgreSQL, MySQL, etc.) Hands-on experience with ETL tools such as Apache Airflow, Talend, Informatica, or similar. Strong problem-solving skills and a passion for continuous improvement. Strong communication skills and the ability to collaborate effectively with cross-functional teams. RESPONSIBILITIES: Writing and reviewing great quality code. Understanding the client's business use cases and technical requirements and being able to convert them into a technical design that elegantly meets the requirements. Mapping decisions to requirements and being able to translate them for developers. Identifying different solutions and being able to narrow down the best option that meets the clients' requirements. Defining guidelines and benchmarks for NFR considerations during project implementation. Writing and reviewing design documents explaining overall architecture, framework, and high-level design of the application for the developers. Reviewing architecture and design on various aspects like extensibility, scalability, security, design patterns, user experience, NFRs, etc., and ensuring that all relevant best practices are followed. Developing and designing the overall solution for defined functional and non-functional requirements; and defining technologies, patterns, and frameworks to materialize it. Understanding and relating technology integration scenarios and applying these learnings in projects. Resolving issues that are raised during code review, through exhaustive systematic analysis of the root cause, and being able to justify the decision taken. Carrying out POCs to make sure that suggested design/technologies meet the requirements. Qualifications Bachelor’s or master’s degree in Computer Science, Information Technology, or a related field.
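To illustrate the scalable-pipeline requirement above, here is a minimal, assumed Scala/Spark sketch of an idempotent incremental load from a relational source into a partitioned staging area. The connection details, tables, and watermark handling are hypothetical simplifications, not a prescribed Nagarro design.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object IncrementalLoad {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("incremental-load").getOrCreate()

    // In practice this watermark would be read from a job-state store.
    val lastRunTs = "2024-01-01 00:00:00"

    // Pull only rows changed since the last successful run.
    val changed = spark.read
      .format("jdbc")
      .option("url", "jdbc:postgresql://example-host:5432/sales")
      .option("dbtable", s"(SELECT * FROM orders WHERE updated_at > '$lastRunTs') src")
      .option("user", "etl")
      .option("password", sys.env.getOrElse("DB_PASSWORD", ""))
      .load()
      .withColumn("order_date", to_date(col("created_at")))

    // Dynamic mode replaces only the date partitions actually touched,
    // which keeps re-runs of the same window idempotent.
    spark.conf.set("spark.sql.sources.partitionOverwriteMode", "dynamic")
    changed.write
      .mode("overwrite")
      .partitionBy("order_date")
      .parquet("s3://example-lake/staging/orders/")

    spark.stop()
  }
}
```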

Posted 1 week ago

Apply

8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Reference # 312796BR
Job Type: Full Time

Your role
Do you want to drive the next level of application architecture in the Compliance IT space? Would you like to apply Agile at scale while developing a state-of-the-art model deployment and execution platform on the cloud? How about shaping the future of Compliance technology together with a fun-loving cross-functional team? We are looking for a seasoned Azure cloud data engineer to:
develop curated data pipelines to load, transform and egress data on the cloud
help us build state-of-the-art solutions that underpin M&S surveillance models and alert generation, using cutting-edge technologies like Scala, Spark, Databricks, Azure Data Lake and Airflow
demonstrate superior analytical and problem-solving skills
demonstrate design skills and knowledge of various design patterns
demonstrate superior collaboration skills in working closely with other development, testing and implementation teams to roll out important regulatory and business improvement programs

Your team
You will be a key member of a young and expanding team that is part of the Compliance Technology function. We are a small, friendly bunch who take pride in the quality of work that we do. As a team, we provide AI-based solutions on top of a big data platform, working closely with peers from the business-led data science team, as well as other engineering teams.

Your expertise
a degree-level education, preferably in Computer Science (Bachelor's or Master's degree)
8+ years of hands-on design and development experience in several of the relevant technology areas (Scala, Azure Data Lake, Apache Spark, Airflow, Oracle/Postgres, etc.)
strong coding skills in Scala and Spark (must have), with an understanding of ETL paradigms
Agile, Test Driven Development and DevOps practices as part of your DNA
experience in developing applications using various design patterns
experience working in MS Azure (added advantage)
coding skills in Python (added advantage)
strong communication skills, both towards senior management and teams
background in IB or the finance domain with a good understanding of finance principles (added advantage)
strong analytical and problem-solving skills
a collaborative approach towards problem solving, working closely with colleagues in the global team and sensitivity towards diversity

About Us
UBS is the world's largest and the only truly global wealth manager. We operate through four business divisions: Global Wealth Management, Personal & Corporate Banking, Asset Management and the Investment Bank. Our global reach and the breadth of our expertise set us apart from our competitors. We have a presence in all major financial centers in more than 50 countries.

How We Hire
We may request you to complete one or more assessments during the application process.

Disclaimer / Policy Statements
UBS is an Equal Opportunity Employer. We respect and seek to empower each individual and support the diverse cultures, perspectives, skills and experiences within our workforce.
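A minimal sketch of the "load, transform, egress" pattern this role centres on, written in Scala on Spark. The ADLS paths, column names, and dataset are assumptions for illustration; the Delta format used in the egress step is available on Databricks by default.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object CuratedPipeline {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("curated-pipeline").getOrCreate()

    // Load: raw alert data landed in the data lake (hypothetical ADLS path)
    val raw = spark.read.parquet("abfss://raw@lake.dfs.core.windows.net/alerts")

    // Transform: curate, deduplicate, and standardise timestamps
    val curated = raw
      .dropDuplicates("alert_id")
      .withColumn("event_time", to_timestamp(col("event_time_str")))
      .filter(col("event_time").isNotNull)

    // Egress: publish the curated layer as a Delta table for downstream models
    curated.write
      .format("delta")
      .mode("overwrite")
      .save("abfss://curated@lake.dfs.core.windows.net/alerts")
  }
}
```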

Posted 1 week ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Description
Are you interested in innovating to deliver a world-class level of service to Amazon's Selling Partners? At Amazon International Seller Services our mission is to make Sellers successful on Amazon. We are the team in Amazon which has a charter to work with sellers from every country. This provides a great opportunity to get diverse experience and a worldwide view. In this role you will be part of our mission to improve the Customer and Selling Partner experience across EU expansion marketplaces. You will partner with business leaders, product managers and the BI manager to help drive specific business goals and be the analytical engine in specific areas. You will help identify, track and manage Key Performance Indicators (KPIs), and partner with internal and external teams to identify root causes and automate iterative workflows using Amazon's tools. The ideal candidate should have a proven ability to independently deliver complex analytics projects. This role has high leadership visibility and requires efficient communication with tech and non-tech stakeholders. To be successful in this role, you should be comfortable dealing with large and complex data sets, have expertise in SQL querying and Excel, and have experience building self-service dashboards and using visualization tools, while always applying analytical rigor to solve business problems. You should have excellent judgment, be passionate about high standards (never satisfied with the status quo), and deliver innovative solutions.

Key job responsibilities
Collaborate with leaders, multiple account managers, team managers, etc. to understand business requirements and to prioritize and deliver data and reporting independently.
Design, develop and maintain scalable, automated, user-friendly systems, reports, dashboards, etc. that will support our analytical and business needs.
Analyze key metrics to uncover trends and root causes of issues, and build simplified solutions for account managers and product managers to consume.
Apply statistical and machine learning methods to specific business problems and data.
Utilize code (Python, R, Scala, etc.) for analyzing data and building statistical models.
Lead deep dives working backwards from business hypotheses and anecdotes, and build visualizations and automation to reduce iterative manual effort.
Automate workflows using Amazon tools to improve sales teams' productivity.
Collaborate with other analysts to adopt best practices.
Continually upskill in new technologies and adopt them in day-to-day work.

Basic Qualifications
Bachelor's degree in a quantitative field (e.g., Computer Science, Mathematics, Statistics, Finance).
5+ years of experience with data querying languages (e.g., SQL), scripting languages (e.g., Python) or statistical/mathematical software (e.g., R, SAS, Matlab).
Experience working with SQL and at least one data visualization tool (e.g., Power BI, Tableau, Amazon QuickSight) in a business environment.

Preferred Qualifications
Proficiency in Python.
Knowledge of Java, JSON and Amazon technologies: AWS S3, RS, Lambda.
Experience with machine learning/statistical modeling, data analysis tools and techniques, and the parameters that affect their performance.
Experience as a business analyst, data analyst or in a similar role.

Our inclusive culture empowers Amazonians to deliver the best results for our customers.
If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Company - ADCI HYD 13 SEZ
Job ID: A2983325
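For flavour, here is a sketch of the kind of KPI roll-up such a role might automate, written in Scala with Spark SQL. The table (seller_metrics) and its columns are invented for the example; it assumes the table is registered in the session's catalog.

```scala
import org.apache.spark.sql.SparkSession

object KpiRollup {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("kpi-rollup")
      .master("local[*]")
      .getOrCreate()

    // Weekly roll-up of active sellers and GMS per marketplace
    // (seller_metrics is a hypothetical registered table)
    spark.sql(
      """
        |SELECT marketplace,
        |       date_trunc('week', order_date) AS week_start,
        |       COUNT(DISTINCT seller_id)      AS active_sellers,
        |       SUM(gms)                       AS weekly_gms
        |FROM seller_metrics
        |GROUP BY marketplace, date_trunc('week', order_date)
        |ORDER BY week_start
      """.stripMargin
    ).show()

    spark.stop()
  }
}
```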

Posted 1 week ago

Apply

8.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary
At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.

Why PWC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities
Analyse current business practices, processes, and procedures, and identify future business opportunities for leveraging Microsoft Azure Data & Analytics Services.
Provide technical leadership and thought leadership as a senior member of the Analytics Practice in areas such as data access & ingestion, data processing, data integration, data modeling, database design & implementation, data visualization, and advanced analytics.
Engage and collaborate with customers to understand business requirements/use cases and translate them into detailed technical specifications.
Develop best practices including reusable code, libraries, patterns, and consumable frameworks for cloud-based data warehousing and ETL.
Maintain best-practice standards for the development of cloud-based data warehouse solutions, including naming standards.
Design and implement highly performant data pipelines from multiple sources using Apache Spark and/or Azure Databricks.
Integrate the end-to-end data pipeline to take data from source systems to target data repositories, ensuring the quality and consistency of data is always maintained.
Work with other members of the project team to support delivery of additional project components (API interfaces).
Evaluate the performance and applicability of multiple tools against customer requirements.
Work within an Agile delivery / DevOps methodology to deliver proof of concept and production implementation in iterative sprints.
Integrate Databricks with other technologies (ingestion tools, visualization tools).
Requirements
Proven experience working as a data engineer.
Highly proficient in using the Spark framework (Python and/or Scala).
Extensive knowledge of data warehousing concepts, strategies and methodologies.
Direct experience building data pipelines using Azure Data Factory and Apache Spark (preferably in Databricks).
Hands-on experience designing and delivering solutions using Azure, including Azure Storage, Azure SQL Data Warehouse, Azure Data Lake, Azure Cosmos DB and Azure Stream Analytics.
Experience in designing and hands-on development of cloud-based analytics solutions.
Expert-level understanding of Azure Data Factory, Azure Synapse, Azure SQL, Azure Data Lake, and Azure App Service is required.
Designing and building data pipelines using API ingestion and streaming ingestion methods.
Knowledge of DevOps processes (including CI/CD) and Infrastructure as Code is essential.
Thorough understanding of Azure cloud infrastructure offerings.
Strong experience in common data warehouse modeling principles, including Kimball.
Working knowledge of Python is desirable.
Experience developing security models.
Databricks & Azure Big Data Architecture certification would be a plus.

Mandatory Skill Sets: ADE, ADB, ADF
Preferred Skill Sets: ADE, ADB, ADF
Years of Experience Required: 8-13 years
Education Qualification: BE, B.Tech, MCA, M.Tech

Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor of Engineering, Master of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)

Required Skills: Android Debug Bridge (ADB), Microsoft Azure
Optional Skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis {+ 12 more}

Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date
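The "streaming ingestion" requirement above could look roughly like the following Spark Structured Streaming sketch in Scala. The broker address, topic, and storage paths are placeholders, and the job assumes the Kafka and Delta connectors are on the classpath (as they are on Databricks).

```scala
import org.apache.spark.sql.SparkSession

object StreamingIngest {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("streaming-ingest").getOrCreate()

    // Read a Kafka-compatible stream; broker and topic are placeholders
    val stream = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")
      .option("subscribe", "events")
      .load()

    // Kafka delivers raw bytes; cast the payload to a string for downstream parsing
    val events = stream.selectExpr(
      "CAST(value AS STRING) AS payload",
      "timestamp AS ingest_time"
    )

    // Land the stream in a bronze Delta table; the checkpoint enables recovery on restart
    val query = events.writeStream
      .format("delta")
      .option("checkpointLocation", "/mnt/checkpoints/events")
      .start("/mnt/bronze/events")

    query.awaitTermination()
  }
}
```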

Posted 1 week ago

Apply

3.0 - 6.0 years

0 Lacs

Noida

Remote

Eightfold was founded with a vision to solve for employment in our society. For decades, the connection between individuals and opportunities has been based on who they are and their network's strength vs. their potential. Eightfold leverages artificial intelligence to transform how to think about skills and capabilities for individuals and how jobs and career decisions are made. Eightfold offers the industry’s first AI-powered Talent Intelligence Platform to transform how organizations plan, hire, develop and retain a diverse workforce, enabling individuals to transform their careers. To date, Eightfold AI has received more than $410 million in funding and a valuation of over $2B from leading investors to further our mission of finding the right career for everyone in the world. If you are passionate about solving one of the most fundamental challenges of our society - employment, working on hard business problems, and being part of an amazing growth story - Eightfold is the place to be!

Our customer stories: https://eightfold.ai/customers/customer-stories/
Press: https://eightfold.ai/about/press

About the role
We are looking for a Data Engineer II, Analytics to join our growing team and help build scalable, high-performance data pipelines that enable meaningful business insights. You’ll work with modern data tools, collaborate with cross-functional teams, and contribute to building a robust data foundation that supports Eightfold’s AI-driven analytics and reporting needs.

What You Will Learn To Do
Design & Develop Pipelines: Build, maintain, and optimize reliable ETL/ELT pipelines to ingest, process, and transform large datasets from diverse sources using Databricks and Amazon Redshift.
Data Modeling & Architecture: Support the design and implementation of scalable data models and architectures to meet evolving analytics and business intelligence needs.
Data Quality & Integration: Ensure accuracy, consistency, and quality of integrated data from structured and unstructured sources across systems.
Performance Tuning: Optimize the performance of queries, pipelines, and databases to ensure high efficiency and reliability for analytics workloads.
Collaboration & Delivery: Partner with analytics engineers, data scientists, product managers, and business stakeholders to deliver high-quality data solutions that enable business insights and product innovations.
Documentation & Best Practices: Contribute to documentation, promote data engineering best practices, and ensure data governance, security, and compliance standards are upheld.

What We Need
Experience: 3-6 years of hands-on experience as a Data Engineer or in a similar data engineering role, ideally in analytics-focused environments.
Databricks: Practical experience building and managing pipelines and workflows using Databricks.
Amazon Redshift: Strong understanding of Amazon Redshift, including data modeling and query optimization.
Programming: Proficiency in SQL and working knowledge of Python or Scala for data processing tasks.
ETL/ELT Tools: Hands-on experience developing and maintaining ETL/ELT processes.
Big Data Tools (Good to Have): Familiarity with Apache Spark, Hadoop, Kafka, or other big data technologies is a plus.
Analytical & Problem-Solving Skills: Ability to troubleshoot data issues and optimize performance effectively.
Communication & Collaboration: Strong communication skills with a collaborative mindset in fast-paced environments.
Hybrid Work @ Eightfold: We embrace a hybrid work model that aims to boost collaboration, enhance our culture, and drive innovation through a blend of remote and in-person work. We are committed to creating a dynamic and flexible work environment that nurtures the collaborative spirit of our team. Starting February 1, 2024, our employees will return to the office twice a week. We have offices in Bangalore and Noida in India. Eightfold.ai provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, or disability. Experience our comprehensive benefits with family medical, vision and dental coverage, a competitive base salary, and eligibility for equity awards and discretionary bonuses or commissions.
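A hedged sketch of a Databricks-to-Redshift hand-off of the kind this role involves, using Spark's generic JDBC writer in Scala. The cluster URL, table name, and input path are placeholders, and the Redshift JDBC driver is assumed to be on the classpath; production pipelines often prefer S3 UNLOAD/COPY over direct JDBC writes.

```scala
import org.apache.spark.sql.SparkSession

object RedshiftLoad {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("redshift-load").getOrCreate()

    // Curated data produced upstream in Databricks (hypothetical path)
    val df = spark.read.parquet("/mnt/curated/talent_events")

    // Append into a Redshift table over JDBC; URL and table are placeholders
    df.write
      .format("jdbc")
      .option("url", "jdbc:redshift://example-cluster:5439/analytics")
      .option("dbtable", "public.talent_events")
      .option("user", sys.env.getOrElse("REDSHIFT_USER", ""))
      .option("password", sys.env.getOrElse("REDSHIFT_PASSWORD", ""))
      .mode("append")
      .save()
  }
}
```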

Posted 1 week ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

About AkzoNobel
Since 1792, we’ve been supplying the innovative paints and coatings that help to color people’s lives and protect what matters most. Our world-class portfolio of brands – including Dulux, International, Sikkens and Interpon – is trusted by customers around the globe. We’re active in more than 150 countries and use our expertise to sustain and enhance the fabric of everyday life. Because we believe every surface is an opportunity. It’s what you’d expect from a pioneering and long-established paints company that’s dedicated to providing sustainable solutions and preserving the best of what we have today – while creating an even better tomorrow. Let’s paint the future together. For more information please visit www.akzonobel.com. © 2024 Akzo Nobel N.V. All rights reserved.

Job Purpose
IT Delivery – Advanced Analytics. The Advanced Analytics team is positioned within IT Delivery. This team serves as the brain of the business, where insights are generated, processes are automated, and solutions are built. Our solutions are mainly focused on Supply Chain and Commercial. We are looking for (senior) data engineers to join our high-performing team. We aim to attract and further develop the best Data Science & Supply Chain talent.

Key Activities
The role and its responsibilities:
Collect business requirements from the various stakeholders (Data Scientists, Data Visualizers, Product Managers, IM).
Translate business requirements into technical requirements, systems, and solutions.
Design, develop and document the technical solution: creation of data pipelines for data transfer between different storage services.
Maintain the solution: actively monitor and manage the performance of the system/solution in close contact with the solution architect.
Work in a DevOps/Agile way of working.

Basic Qualifications
BSc in Computer Science, Statistics, Mathematics, Physics, or any quantitative field.
Experience in designing and building advanced ETL pipelines in a big data environment.
Big data frameworks: Apache Hadoop, Apache Spark, RapidMiner, Cloudera.
Programming languages: Scala, Python, PySpark, SQL.
Sound communication skills; a team player with a curious mind.

Preferred Qualifications
MSc in Computer Science, Statistics, Mathematics, Physics, or any quantitative field.
Platform: Azure Data Factory, Databricks, Power BI.
Experience in supply chain.
Familiarity with Agile.

At AkzoNobel we are highly committed to ensuring an inclusive and respectful workplace where all employees can be their best self. We strive to embrace diversity in a context of tolerance. Our talent acquisition process plays an integral part in this journey, as it sets the foundations for a diverse environment. For this reason we train and educate on the implications of unconscious bias so that our TA team and hiring managers are mindful of them and take corrective action where applicable. In our organization, all qualified applicants receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age or disability.

Requisition ID: 46233
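As an illustration of the "data pipelines for data transfer between different storage services" activity above, here is a small typed-Dataset sketch in Scala. The Shipment schema and the landing/curated paths are assumptions made up for the example.

```scala
import org.apache.spark.sql.SparkSession

object ShipmentEtl {
  // Immutable, typed record for the pipeline (hypothetical schema)
  case class Shipment(shipmentId: String, plant: String, qty: Long)

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("shipment-etl")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // Read raw JSON from a landing zone and bind it to a typed Dataset
    val shipments = spark.read.json("/landing/shipments").as[Shipment]

    // Typed, compile-checked filter: keep only large shipments
    val largeShipments = shipments.filter(_.qty > 1000)

    // Persist to the curated zone in Parquet
    largeShipments.write.mode("overwrite").parquet("/curated/large_shipments")

    spark.stop()
  }
}
```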

Posted 1 week ago

Apply

Exploring Scala Jobs in India

Scala is a popular programming language that is widely used in India, especially in the tech industry. Job seekers looking for opportunities in Scala can find a variety of roles across different cities in the country. In this article, we will dive into the Scala job market in India and provide valuable insights for job seekers.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Chennai
  5. Mumbai

These cities are known for their thriving tech ecosystem and have a high demand for Scala professionals.

Average Salary Range

The salary range for Scala professionals in India varies based on experience levels. Entry-level Scala developers can expect to earn around INR 6-8 lakhs per annum, while experienced professionals with 5+ years of experience can earn upwards of INR 15 lakhs per annum.

Career Path

In the Scala job market, a typical career path may look like:

  1. Junior Developer
  2. Scala Developer
  3. Senior Developer
  4. Tech Lead

As professionals gain more experience and expertise in Scala, they can progress to higher roles with increased responsibilities.

Related Skills

In addition to Scala expertise, employers often look for candidates with the following skills:

  • Java
  • Spark
  • Akka
  • Play Framework
  • Functional programming concepts

Having a good understanding of these related skills can enhance a candidate's profile and increase their chances of landing a Scala job.
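The snippet below ties a few of the functional programming concepts from that list together in plain, self-contained Scala: immutable case classes, a higher-order function, and pattern matching. The job titles and figures are illustrative only.

```scala
object FunctionalBasics extends App {
  // Immutable data modelled with a case class
  final case class Job(title: String, salaryLakhs: Double)

  val jobs = List(
    Job("Junior Developer", 7.0),
    Job("Scala Developer", 12.0),
    Job("Senior Developer", 18.0)
  )

  // Higher-order function: filter takes a predicate function as an argument
  val senior = jobs.filter(_.salaryLakhs >= 15.0)

  // Pattern matching deconstructs the case class fields
  senior.foreach {
    case Job(title, salary) => println(s"$title pays about INR $salary lakhs")
  }
}
```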

Interview Questions

Here are 25 interview questions that you may encounter when applying for Scala roles; a short code sketch illustrating several of them follows the list:

  • What is Scala and why is it used? (basic)
  • Explain the difference between val and var in Scala. (basic)
  • What is pattern matching in Scala? (medium)
  • What are higher-order functions in Scala? (medium)
  • How does Scala support functional programming? (medium)
  • What is a case class in Scala? (basic)
  • Explain the concept of currying in Scala. (advanced)
  • What is the difference between map and flatMap in Scala? (medium)
  • How does Scala handle null values? (medium)
  • What is a trait in Scala and how is it different from an abstract class? (medium)
  • Explain the concept of implicits in Scala. (advanced)
  • What is the Akka toolkit and how is it used in Scala? (medium)
  • How does Scala handle concurrency? (advanced)
  • Explain the concept of lazy evaluation in Scala. (advanced)
  • What is the difference between List and Seq in Scala? (medium)
  • How does Scala handle exceptions? (medium)
  • What are Futures in Scala and how are they used for asynchronous programming? (advanced)
  • Explain the concept of type inference in Scala. (medium)
  • What is the difference between object and class in Scala? (basic)
  • How can you create a Singleton object in Scala? (basic)
  • What is a higher-kinded type in Scala? (advanced)
  • Explain the concept of for-comprehensions in Scala. (medium)
  • How does Scala support immutability? (medium)
  • What are the advantages of using Scala over Java? (basic)
  • How do you implement pattern matching in Scala? (medium)
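For a handful of the questions above, here is a compact, self-contained Scala sketch of the ideas involved (val vs var, case classes and pattern matching, Option instead of null, lazy evaluation, and currying). These are generic illustrations, not model answers:

```scala
object InterviewSnippets extends App {
  // val vs var: val bindings are immutable, var bindings can be reassigned
  val fixed = 42
  var mutable = 0
  mutable += 1

  // Case classes give you equality, hashCode, copy, and pattern-matching support
  case class Point(x: Int, y: Int)

  // Pattern matching with deconstruction and a guard
  def describe(p: Point): String = p match {
    case Point(0, 0)          => "origin"
    case Point(x, _) if x > 0 => "right half-plane"
    case _                    => "elsewhere"
  }
  println(describe(Point(3, 4)))

  // Option is Scala's idiomatic alternative to null; Option(null) yields None
  val maybeName: Option[String] = Option(null)
  println(maybeName.getOrElse("anonymous"))

  // Lazy evaluation: the right-hand side runs only on first access
  lazy val expensive = { println("computed once"); 99 }
  println(expensive + expensive)

  // Currying: a function taking arguments in multiple parameter lists
  def add(a: Int)(b: Int): Int = a + b
  val addTen = add(10) _
  println(addTen(5)) // prints 15
}
```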

Closing Remark

As you explore Scala jobs in India, remember to showcase your expertise in Scala and related skills during interviews. Prepare well, stay confident, and you'll be on your way to a successful career in Scala. Good luck!
