4.0 - 8.0 years
20 - 25 Lacs
Noida
Work from Office
Key Responsibilities
- Senior, hands-on enterprise-level architect and solution leader with deep experience in Data Engineering technologies on public clouds such as AWS, Azure, or GCP.
- Engage with client managers to understand their current state and business problems/opportunities, conceptualize solution options, discuss and finalize them with client stakeholders, and help bootstrap a team to deliver PoCs, PoTs, MVPs, etc.
- Help build overall competency, both within teams working on related client engagements and across the rest of Iris, in Data & Analytics: Data Engineering, Analytics, Data Science, AI/ML, ML and Data Ops, Data Governance, and related solution patterns, platforms, tools, and technology.
- Stay up to date on best practices, new and emerging tools, and trends in Data and Analytics.
- Focus on building practice competencies in Data & Analytics.
Professional Experience and Qualifications
- Bachelor's or Master's degree in a software discipline.
- Experience with data architecture and the implementation of large-scale, enterprise-level Data Lake/Data Warehousing, Big Data, and Analytics applications.
- Data Engineering professional who has led multiple Data Engineering engagements across solutioning, architecture, and delivery.
- Excellent written and verbal English communication.
Technology
- For the above skill areas, must have lifecycle experience with some of the tools such as AWS Glue, Redshift, Azure Data Lake, Databricks, Snowflake, etc.
- Database experience and programming experience with Spark: Spark SQL, PySpark, Python, etc. (see the sketch below).
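The Technology bullet above names Spark SQL and PySpark; here is a minimal, hedged sketch of that combination. The app name, path, table, and column names are illustrative assumptions, not details from the posting.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("sales-aggregation").getOrCreate()

# Load raw data (the S3 path is a placeholder) and register it for Spark SQL.
orders = spark.read.parquet("s3://example-bucket/raw/orders/")
orders.createOrReplaceTempView("orders")

# The same aggregation expressed once in Spark SQL and once in the DataFrame API.
daily_sql = spark.sql("""
    SELECT order_date, SUM(amount) AS total_amount
    FROM orders
    GROUP BY order_date
""")
daily_df = orders.groupBy("order_date").agg(F.sum("amount").alias("total_amount"))
```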
Posted 1 week ago
7.0 - 9.0 years
6 - 10 Lacs
Chennai
Work from Office
As a Technical Lead - Cloud Data Platform (AWS) at Incedo, you will be responsible for designing, deploying, and maintaining cloud-based data platforms on AWS. You will work with data engineers, data scientists, and business analysts to understand business requirements and design scalable, reliable, and cost-effective solutions that meet those requirements.
Roles & Responsibilities:
- Designing, developing, and deploying cloud-based data platforms using Amazon Web Services (AWS)
- Integrating and processing large amounts of structured and unstructured data from various sources
- Implementing and optimizing ETL processes and data pipelines
- Developing and maintaining security and access controls
- Collaborating with other teams to ensure the consistency and integrity of data
- Troubleshooting and resolving data platform issues
Technical Skills Requirements:
- In-depth knowledge of AWS services and tools such as AWS Glue, AWS Redshift, and AWS Lambda
- Experience building scalable and reliable data pipelines using AWS services, Apache Spark, and related big data technologies (see the sketch below)
- Familiarity with cloud-based infrastructure and deployment, specifically on AWS
- Strong knowledge of programming languages such as Python, Java, and SQL
- Excellent communication skills, with the ability to convey complex technical information to non-technical stakeholders clearly and concisely
- Understands and aligns with the company's long-term vision; open to new ideas and willing to learn and develop new skills; able to work well under pressure and manage multiple tasks and priorities
Qualifications
- 7-9 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred
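As a rough illustration of the AWS Glue pipeline work this role describes, here is a hedged PySpark-based Glue job skeleton. The catalog database, table, column, and bucket names are assumptions for illustration only.

```python
import sys
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read from the Glue Data Catalog, drop rows missing the key, write Parquet to S3.
source = glue_context.create_dynamic_frame.from_catalog(
    database="example_db", table_name="raw_events"   # hypothetical catalog entries
)
cleaned = source.toDF().dropna(subset=["event_id"])  # "event_id" is an assumed column
cleaned.write.mode("overwrite").parquet("s3://example-bucket/curated/events/")

job.commit()
```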
Posted 1 week ago
8.0 - 12.0 years
30 - 35 Lacs
Chennai
Work from Office
Technical Skills
- Experience building data transformation pipelines using DBT and SSIS
- Moderate programming experience with Python
- Moderate experience with AWS Glue
- Strong experience with SQL, with the ability to write efficient code and manage it through Git repositories
Nice-to-have skills
- Experience working with SSIS
- Experience working in the wealth management industry
- Experience with agile development methodologies
Posted 1 week ago
4.0 - 6.0 years
6 - 10 Lacs
Gurugram
Work from Office
Role Description
As a Senior Cloud Data Platform (AWS) Specialist at Incedo, you will be responsible for designing, deploying, and maintaining cloud-based data platforms on AWS. You will work with data engineers, data scientists, and business analysts to understand business requirements and design scalable, reliable, and cost-effective solutions that meet those requirements.
Roles & Responsibilities:
- Designing, developing, and deploying cloud-based data platforms using Amazon Web Services (AWS)
- Integrating and processing large amounts of structured and unstructured data from various sources
- Implementing and optimizing ETL processes and data pipelines
- Developing and maintaining security and access controls
- Collaborating with other teams to ensure the consistency and integrity of data
- Troubleshooting and resolving data platform issues
Technical Skills Requirements:
- In-depth knowledge of AWS services and tools such as AWS Glue, AWS Redshift, and AWS Lambda
- Experience building scalable and reliable data pipelines using AWS services, Apache Spark, and related big data technologies
- Familiarity with cloud-based infrastructure and deployment, specifically on AWS
- Strong knowledge of programming languages such as Python, Java, and SQL
- Excellent communication skills, with the ability to convey complex technical information to non-technical stakeholders clearly and concisely
- Understands and aligns with the company's long-term vision
- Provides leadership, guidance, and support to team members, ensuring the successful completion of tasks and promoting a positive work environment that fosters collaboration and productivity, taking responsibility for the whole team
Qualifications
- 4-6 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
Maharashtra
On-site
Job Description: We are looking for a skilled PySpark Developer with 2-3 or 4-5 years of experience to join our team. As a PySpark Developer, you will be responsible for developing and maintaining data processing pipelines using PySpark, Apache Spark's Python API. You will work closely with data engineers, data scientists, and other stakeholders to design and implement scalable and efficient data processing solutions.
- A Bachelor's or Master's degree in Computer Science, Data Science, or a related field is required.
- Strong expertise in the Big Data ecosystem, including Spark, Hive, Sqoop, HDFS, MapReduce, Oozie, YARN, HBase, and NiFi.
- The candidate should be below 35 years of age.
- Design, develop, and maintain PySpark data processing pipelines that process large volumes of structured and unstructured data.
- Collaborate with data engineers and data scientists to understand data requirements and design efficient data models and transformations.
- Optimize and tune PySpark jobs for performance, scalability, and reliability.
- Implement data quality checks, error handling, and monitoring mechanisms to ensure data accuracy and pipeline robustness.
- Develop and maintain documentation for PySpark code, data pipelines, and data workflows.
- Experience developing production-ready Spark applications using Spark RDD APIs, DataFrames, Datasets, Spark SQL, and Spark Streaming is required.
- Strong experience with Hive bucketing and partitioning, and with writing complex Hive queries using analytical functions, is essential; knowledge of writing custom UDFs in Hive to support custom business requirements is a plus (see the sketch below).
If you meet the above qualifications and are interested in this position, please email your resume, mentioning the position applied for in the subject line, to: careers@cdslindia.com
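Since the posting emphasizes Hive bucketing/partitioning and custom UDFs alongside PySpark, here is a minimal sketch of both, assuming a Hive-enabled Spark session; the table, columns, and masking logic are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import udf
from pyspark.sql.types import StringType

spark = SparkSession.builder.enableHiveSupport().getOrCreate()

# A partitioned and bucketed Hive table (standard HiveQL DDL).
spark.sql("""
    CREATE TABLE IF NOT EXISTS txn (
        txn_id STRING,
        amount DOUBLE
    )
    PARTITIONED BY (txn_date STRING)
    CLUSTERED BY (txn_id) INTO 32 BUCKETS
    STORED AS ORC
""")

# A custom UDF standing in for bespoke business logic (here, masking an id).
mask_id = udf(lambda s: s[:4] + "****" if s else None, StringType())
spark.table("txn").withColumn("masked_id", mask_id("txn_id")).show()
```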
Posted 1 week ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Description
As a member of the Support organization, your focus is to deliver post-sales support and solutions to the Oracle customer base while serving as an advocate for customer needs. This involves resolving post-sales non-technical customer inquiries via phone and electronic means, as well as technical questions regarding the use of and troubleshooting for our Electronic Support Services. As a primary point of contact for customers, you are responsible for facilitating customer relationships with Support and providing advice and assistance to internal Oracle employees on diverse customer situations and escalated issues.
Career Level - IC3
Responsibilities
As a Sr. Support Engineer, you will be the technical interface to customers, Original Equipment Manufacturers (OEMs), and Value-Added Resellers (VARs) for the resolution of problems related to the installation, recommended maintenance, and use of Oracle products. You should have an understanding of all Oracle products in your competency area and in-depth knowledge of several products and/or platforms. You should also be highly experienced on multiple platforms and able to complete assigned duties with minimal direction from management. In this position, you will routinely act independently while researching and developing solutions to customer issues.
RESPONSIBILITIES:
- Manage and resolve Service Requests logged by customers (internal and external) on Oracle products, and contribute to proactive support activities according to the product support strategy and model
- Own and resolve problems and manage customer expectations throughout the Service Request lifecycle in accordance with global standards
- Work towards, adopt, and contribute to new processes and tools (diagnostic methodology, health checks, scripting tools, etc.)
- Contribute to Knowledge Management content creation and maintenance
- Work with development on product improvement programs (testing, SRP, BETA programs, etc.) as required
- Operate within Oracle business processes and procedures
- Respond to and resolve customer issues within Key Performance Indicator targets
- Maintain product expertise within the team
- Maintain up-to-date, in-depth knowledge of new products released in the market for the supported products
QUALIFICATIONS:
- Bachelor's degree in Computer Science, Engineering, or a related technical field
- 5+ years of proven professional and technical experience with Big Data Appliance (BDA), Oracle Cloud Infrastructure (OCI), Linux OS, and areas such as the Cloudera distribution for Hadoop (CDH), HDFS, YARN, Spark, Hive, Sqoop, Oozie, and Intelligent Data Lake
- Excellent verbal and written skills in English
SKILLS & COMPETENCIES:
Minimum technical skills:
- As a member of the Big Data Appliance (BDA) team, troubleshoot highly complex technical issues related to the Big Data Appliance and areas such as the Cloudera distribution for Hadoop (CDH), HDFS, YARN, Spark, Hive, Sqoop, Oozie, and Intelligent Data Lake
- Good hands-on experience with Linux systems and Cloudera Hadoop architecture, administration, and troubleshooting, with good knowledge of different technology products, services, and processes
- Resolve complex issues for BDA customers, including issues pertaining to Cloudera Hadoop, Big Data SQL, and BDA upgrades, patches, and installs; collaborate with other teams such as Hardware, Development, ODI, and Oracle R to help resolve customer issues on the BDA machine
- Interact with customer counterparts on a regular basis and serve as the technology expert on the customer's behalf
- Experience in a multi-tier architecture environment is required
- Fundamental understanding of computer networking, systems, and database technologies
Personal competencies:
- Desire to learn, or expand knowledge, about Oracle Database and associated products
- Customer focus
- Structured problem recognition and resolution
- Experience contributing to a shared knowledge base
- Experience with support-level work, such as resolving customer problems and managing customer expectations and escalations
- Communication, planning and organizing, working globally, quality, team working, and results orientation
About Us
As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all.
Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs.
We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States.
Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans' status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
Posted 2 weeks ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Experience: 8+ years
Key Responsibilities:
- Collaborate with business teams to gather and understand requirements
- Lead hands-on design, development, and deployment of data pipelines and integration workflows (see the sketch below)
- Support testing, go-live, and hypercare phases
- Act as a mentor and guide to offshore Kafka developers; review code and ensure quality deliverables
- Take full ownership of assigned project deliverables
Required Skills: Python | MS SQL | Java | Azure Databricks | Spark | Kinesis | Kafka | Sqoop | Hive | Apache NiFi | Unix Shell Scripting
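A hedged sketch of the Kafka-to-Spark ingestion pattern implied by this role's stack (Kafka, Spark, Databricks). The broker address, topic, and paths are illustrative assumptions.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

# Subscribe to a Kafka topic as a streaming source.
stream = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")  # placeholder broker
    .option("subscribe", "orders")                      # placeholder topic
    .load()
)

# Kafka delivers key/value as binary; cast to strings before downstream parsing.
parsed = stream.select(col("key").cast("string"), col("value").cast("string"))

# Land the raw records as Parquet with checkpointing for exactly-once tracking.
query = (
    parsed.writeStream.format("parquet")
    .option("path", "/data/bronze/orders")
    .option("checkpointLocation", "/chk/orders")
    .start()
)
```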
Posted 2 weeks ago
5.0 - 8.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Role Purpose
The purpose of this role is to design, test, and maintain software programs for operating systems or applications that need to be deployed at a client end, ensuring they meet 100% of quality assurance parameters.
Do
1. Be instrumental in understanding the requirements and design of the product/software:
- Develop software solutions by studying information needs, systems flow, data usage, and work processes
- Investigate problem areas throughout the software development life cycle
- Facilitate root cause analysis of system issues and problem statements
- Identify ideas to improve system performance and availability
- Analyze client requirements and convert them into feasible designs
- Collaborate with functional teams or systems analysts who carry out the detailed investigation into software requirements
- Confer with project managers to obtain information on software capabilities
2. Perform coding and ensure optimal software/module development:
- Determine operational feasibility by evaluating analysis, problem definition, requirements, and proposed software
- Develop and automate processes for software validation by setting up and designing test cases/scenarios/usage cases and executing them
- Modify software to fix errors, adapt it to new hardware, improve its performance, or upgrade interfaces
- Analyze information to recommend and plan the installation of new systems or modifications to an existing system
- Ensure that code is error-free, with no bugs or test failures
- Prepare reports on programming project specifications, activities, and status
- Ensure all defects are raised as per the norms defined for the project/program/account, with clear descriptions and replication patterns
- Compile timely, comprehensive, and accurate documentation and reports as requested
- Coordinate with the team on daily project status and progress, and document it
- Provide feedback on usability and serviceability, trace results to quality risks, and report them to the concerned stakeholders
3. Status reporting and customer focus on an ongoing basis with respect to the project and its execution:
- Capture all requirements and clarifications from the client for better-quality work
- Take feedback on a regular basis to ensure smooth and on-time delivery
- Participate in continuing education and training to remain current on best practices, learn new programming languages, and better assist other team members
- Consult with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements
- Document and demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments, and clear code
- Document all necessary details and reports formally, so the software is properly understood from client proposal to implementation
- Ensure good quality of interaction with the customer w.r.t. email content, fault report tracking, voice calls, business etiquette, etc.
- Respond to customer requests in a timely manner, with no instances of complaints either internally or externally
Mandatory Skills: Scala programming
Experience: 5-8 years
Posted 2 weeks ago
4.0 - 6.0 years
6 - 10 Lacs
Chennai
Work from Office
As a Senior Cloud Data Platform (AWS) Specialist at Incedo, you will be responsible for designing, deploying, and maintaining cloud-based data platforms on AWS. You will work with data engineers, data scientists, and business analysts to understand business requirements and design scalable, reliable, and cost-effective solutions that meet those requirements.
Roles & Responsibilities:
- Designing, developing, and deploying cloud-based data platforms using Amazon Web Services (AWS)
- Integrating and processing large amounts of structured and unstructured data from various sources
- Implementing and optimizing ETL processes and data pipelines
- Developing and maintaining security and access controls
- Collaborating with other teams to ensure the consistency and integrity of data
- Troubleshooting and resolving data platform issues
Technical Skills Requirements:
- In-depth knowledge of AWS services and tools such as AWS Glue, AWS Redshift, and AWS Lambda
- Experience building scalable and reliable data pipelines using AWS services, Apache Spark, and related big data technologies
- Familiarity with cloud-based infrastructure and deployment, specifically on AWS
- Strong knowledge of programming languages such as Python, Java, and SQL
- Excellent communication skills, with the ability to convey complex technical information to non-technical stakeholders clearly and concisely
- Understands and aligns with the company's long-term vision
- Provides leadership, guidance, and support to team members, ensuring the successful completion of tasks and promoting a positive work environment that fosters collaboration and productivity, taking responsibility for the whole team
Qualifications
- 4-6 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred
Posted 2 weeks ago
5.0 - 8.0 years
3 - 7 Lacs
Hyderabad
Work from Office
Long Description
- Experience and expertise in at least one of the following languages: Java, Scala, Python
- Experience and expertise in Spark architecture
- Experience in the range of 6-10+ years
- Good problem-solving and analytical skills
- Ability to comprehend business requirements and translate them into technical requirements
- Good communication and collaboration skills, with fellow team members and across vendors
- Familiarity with the development life cycle, including CI/CD pipelines
- Proven experience in, and interest in, supporting existing strategic applications
- Familiarity with agile methodology
Mandatory Skills: Scala programming
Experience: 5-8 years
Posted 2 weeks ago
5.0 - 8.0 years
6 - 10 Lacs
Pune
Hybrid
Mandatory Skills: Cloud-PaaS-GCP-Google Cloud Platform
Position: Cloud Data Engineer
Experience Required: 5-8 years; additional roles at 8-13 years
Work Location: Wipro, PAN India
Work Arrangement: Hybrid model with 3 days per week in a Wipro office
Job Description:
- Strong expertise in SQL
- Proficient in Python
- Excellent knowledge of any cloud technology (AWS, Azure, GCP, etc.), with a preference for GCP
- Familiarity with PySpark is preferred (see the sketch below)
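As a rough sketch of the preferred GCP + PySpark combination, the following assumes the spark-bigquery connector is available on the cluster; the project, dataset, table, and bucket names are placeholders.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("gcp-etl").getOrCreate()

# Read a BigQuery table into a DataFrame via the spark-bigquery connector.
df = (
    spark.read.format("bigquery")
    .option("table", "example-project.sales.orders")   # placeholder table
    .load()
)

# A simple SQL-style aggregation, then write the result back to BigQuery.
summary = df.groupBy("region").sum("amount")
(
    summary.write.format("bigquery")
    .option("table", "example-project.sales.orders_by_region")
    .option("temporaryGcsBucket", "example-temp-bucket")  # staging bucket
    .mode("overwrite")
    .save()
)
```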
Posted 2 weeks ago
5.0 - 8.0 years
7 - 10 Lacs
Mumbai
Work from Office
Role Purpose
The purpose of this role is to design, test, and maintain software programs for operating systems or applications that need to be deployed at a client end, ensuring they meet 100% of quality assurance parameters.
Responsibilities:
- Design and implement the data modeling, data ingestion, and data processing for various datasets
- Design, develop, and maintain an ETL framework for various new data sources
- Develop data ingestion using AWS Glue/EMR and data pipelines using PySpark, Python, and Databricks
- Build orchestration workflows using Airflow and Databricks job workflows (see the sketch below)
- Develop and execute ad hoc data ingestion to support business analytics
- Proactively interact with vendors on any questions and report status accordingly
- Explore and evaluate tools and services to support business requirements
- Ability to learn to create a data-driven culture and impactful data strategies
- Aptitude for learning new technologies and solving complex problems
Qualifications:
- Minimum of a bachelor's degree, preferably in Computer Science, Information Systems, or Information Technology
- Minimum 5 years of experience on cloud platforms such as AWS, Azure, and GCP
- Minimum 5 years of experience with Amazon Web Services such as VPC, S3, EC2, Redshift, RDS, EMR, Athena, IAM, Glue, DMS, Data Pipeline & API, Lambda, etc.
- Minimum 5 years of experience in ETL and data engineering using Python, AWS Glue, AWS EMR/PySpark, and Airflow for orchestration
- Minimum 2 years of experience in Databricks, including Unity Catalog, data engineering job workflow orchestration, and dashboard generation based on business requirements
- Minimum 5 years of experience in SQL, Python, and source control such as Bitbucket, with CI/CD for code deployment
- Experience with PostgreSQL, SQL Server, MySQL, and Oracle databases
- Experience with MPP systems such as AWS Redshift, AWS EMR, and Databricks SQL warehouses and compute clusters
- Experience in distributed programming with Python, Unix scripting, MPP, and RDBMS databases for data integration
- Experience building distributed high-performance systems using Spark/PySpark and AWS Glue, and developing applications for loading and streaming data into Databricks SQL warehouses and Redshift
- Experience with Agile methodology
- Proven ability to write technical specifications for data extraction and good-quality code
- Experience with big data processing techniques using Sqoop, Spark, and Hive is an additional plus
- Experience with data visualization tools, including Power BI and Tableau
- Nice to have: experience building UIs using the Python Flask framework and Angular
Mandatory Skills: Python for Insights
Experience: 5-8 years
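For the orchestration responsibility above, here is a minimal, hedged Airflow DAG that triggers a Glue job via the Amazon provider package. The DAG id, Glue job name, region, and schedule are assumptions; the `schedule` argument follows Airflow 2.4+ syntax.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.operators.glue import GlueJobOperator

with DAG(
    dag_id="daily_ingestion",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Triggers a Glue job assumed to be defined separately in AWS.
    run_glue_job = GlueJobOperator(
        task_id="run_ingestion_glue_job",
        job_name="ingest_orders",       # placeholder Glue job name
        region_name="us-east-1",
    )
```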
Posted 2 weeks ago
1.0 - 2.0 years
3 - 4 Lacs
Gurugram, Bengaluru
Work from Office
About the Role:
Grade Level (for internal use): 08
S&P Global Mobility
The Role: Data Engineer
The Team: We are the Research and Modeling team, driving innovation by building robust models and tools to support the Vehicle & Powertrain Forecast team. Our work includes all aspects of development of, and ongoing support for, our business line data flows, analyst modelling solutions and forecasts, new apps, new client-facing products, and many other work areas besides. We value ownership, adaptability, and a passion for learning, while fostering an environment where diverse perspectives and mentorship fuel continuous growth.
The Impact: We are seeking a motivated and talented Data Engineer to be a key player in building the robust data infrastructure and flows that support our advanced forecasting models. Your initial focus will be to create a robust data factory to ensure smooth collection and refresh of actual data, a critical component that feeds our forecast. Additionally, you will assist in developing mathematical models and support the work of ML engineers and data scientists. Your work will significantly impact our ability to deliver timely and insightful forecasts to our clients.
What's in it for you:
- Opportunity to build foundational data infrastructure that directly impacts advanced forecasting models and client delivery.
- Exposure to, and support for, the development of sophisticated mathematical models, Machine Learning, and Data Science applications.
- Contribute significantly to delivering timely and insightful forecasts, influencing client decisions in the automotive sector.
- Work in a collaborative environment that fosters continuous learning, mentorship, and professional growth in data engineering and related analytical fields.
Responsibilities
- Data Pipeline Development: Design, build, and maintain scalable and reliable data pipelines for efficient data ingestion, processing, and storage, primarily focusing on creating a data factory for our core forecasting data.
- Data Quality and Integrity: Implement robust data quality checks and validation processes to ensure the accuracy and consistency of data used in our forecasting models (see the sketch below).
- Mathematical Model Support: Collaborate with other data engineers to develop and refine the mathematical logic and models that underpin our forecasting methodologies.
- ML and Data Science Support: Provide data support to our Machine Learning Engineers and Data Scientists.
- Collaboration and Communication: Work closely with analysts, developers, and other stakeholders to understand data requirements and deliver effective solutions.
- Innovation and Improvement: Continuously explore and evaluate new technologies and methodologies to enhance our data infrastructure and forecasting capabilities.
What We're Looking For:
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- Minimum of 1-2 years of experience in data engineering, with a proven track record of building and maintaining data pipelines.
- Strong proficiency in SQL and experience with relational and non-relational databases.
- Strong Python programming skills, with experience in data manipulation and processing libraries (e.g., Pandas, NumPy).
- Experience with mathematical modelling and supporting ML and data science teams.
- Experience with cloud platforms (e.g., AWS, Azure, GCP) and cloud-based data services.
- Strong communication and collaboration skills, with the ability to work effectively in a team environment.
- Experience in the automotive sector is a plus.
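A minimal sketch, in the Pandas idiom the posting lists, of the data-quality validation step described under Responsibilities; the column names and rules are illustrative assumptions, not the team's actual checks.

```python
import pandas as pd

def validate_actuals(df: pd.DataFrame) -> pd.DataFrame:
    """Run basic quality checks before data feeds the forecast models."""
    # Required columns must be present (hypothetical schema).
    required = {"vehicle_id", "period", "units"}
    missing = required - set(df.columns)
    if missing:
        raise ValueError(f"missing columns: {missing}")

    # No duplicate keys and no negative volumes.
    if df.duplicated(subset=["vehicle_id", "period"]).any():
        raise ValueError("duplicate (vehicle_id, period) keys found")
    if (df["units"] < 0).any():
        raise ValueError("negative units found")
    return df
```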
Statement: S&P Global delivers essential intelligence that powers decision making. We provide the world's leading organizations with the right data, connected technologies, and expertise they need to move ahead. As part of our team, you'll help solve complex challenges that equip businesses, governments, and individuals with the knowledge to adapt to a changing economic landscape. S&P Global Mobility turns invaluable insights captured from automotive data into tools that help our clients understand today's market, reach more customers, and shape the future of automotive mobility.
About S&P Global Mobility: At S&P Global Mobility, we provide invaluable insights derived from unmatched automotive data, enabling our customers to anticipate change and make decisions with conviction. Our expertise helps them to optimize their businesses, reach the right consumers, and shape the future of mobility. We open the door to automotive innovation, revealing the buying patterns of today and helping customers plan for the emerging technologies of tomorrow. For more information, visit www.spglobal.com/mobility.
What's In It For You
Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments, and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress.
Our People, Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.
Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global.
- Health & Wellness: Health care coverage designed for the mind and body.
- Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.
For more information on benefits by country, visit https://spgbenefits.com/benefit-summaries
Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.
Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, pre-employment training, or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here.
Equal Opportunity Employer: S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.
US Candidates Only: The EEO is the Law Poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf
203 - Entry Professional (EEO Job Group) (inactive), 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH203 - Entry Professional (EEO Job Group)
Posted 2 weeks ago
5.0 - 8.0 years
4 - 7 Lacs
Mumbai
Work from Office
- Excellent knowledge of Spark; the professional must have a thorough understanding of the Spark framework, performance tuning, etc. (see the sketch below)
- Excellent knowledge of, and at least 4+ years of hands-on experience with, Scala and PySpark
- Excellent knowledge of the Hadoop ecosystem; knowledge of Hive is mandatory
- Strong Unix and shell scripting skills
- Excellent interpersonal skills and, for experienced candidates, excellent leadership skills
- Good knowledge of any of the CSPs such as Azure, AWS, or GCP is mandatory; Azure certifications are an additional plus
Mandatory Skills: PySpark
Experience: 5-8 years
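A short, hedged PySpark sketch of the performance-tuning themes above: an explicit shuffle-partition setting, caching a reused DataFrame, and a broadcast join. Paths, column names, and the partition count are illustrative assumptions.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("tuning-demo").getOrCreate()
spark.conf.set("spark.sql.shuffle.partitions", "200")  # size to cluster and data

# Repartition the large fact table on its natural key and cache it for reuse.
facts = spark.read.parquet("/data/facts").repartition("txn_date")
facts.cache()

# Broadcasting the small dimension table avoids shuffling the large fact table.
dims = spark.read.parquet("/data/dims")
joined = facts.join(broadcast(dims), "dim_id")
```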
Posted 2 weeks ago
2.0 - 6.0 years
3 - 7 Lacs
Gurugram
Work from Office
We are looking for a PySpark Developer who loves solving complex problems across a full spectrum of technologies. You will help ensure our technological infrastructure operates seamlessly in support of our business objectives.
Responsibilities
- Develop and maintain data pipelines implementing ETL processes.
- Take responsibility for Hadoop development and implementation.
- Work closely with a data science team implementing data analytic pipelines.
- Help define data governance policies and support data versioning processes.
- Maintain security and data privacy, working closely with the Data Protection Officer internally.
- Analyse a vast number of data stores and uncover insights.
Skillset Required
- Ability to design, build, and unit test applications in PySpark.
- Experience with Python development and Python data transformations.
- Experience with SQL scripting on one or more platforms: Hive, Oracle, PostgreSQL, MySQL, etc.
- In-depth knowledge of Hadoop, Spark, and similar frameworks.
- Strong knowledge of data management principles.
- Experience with normalizing/de-normalizing data structures and developing tabular, dimensional, and other data models.
- Knowledge of YARN, clusters, executors, and cluster configuration.
- Hands-on experience working with different file formats such as JSON, Parquet, and CSV (see the sketch below).
- Experience with the CLI on Linux-based platforms.
- Experience analysing current ETL/ELT processes, and defining and designing new processes.
- Experience analysing business requirements in a BI/analytics context and designing data models to transform raw data into meaningful insights.
- Good to have: knowledge of data visualization.
- Experience processing large amounts of structured and unstructured data, including integrating data from multiple sources.
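To make the file-format point concrete, here is a minimal PySpark sketch reading JSON and CSV and writing Parquet; the paths and join key are placeholder assumptions.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("formats-demo").getOrCreate()

# The same DataFrame API covers all formats; only the reader call changes.
json_df = spark.read.json("/landing/events/*.json")
csv_df = (
    spark.read.option("header", True)
    .option("inferSchema", True)
    .csv("/landing/ref.csv")
)

# Parquet is the usual columnar target for downstream analytics.
# "event_type" is an assumed shared column used as the join key.
enriched = json_df.join(csv_df, "event_type")
enriched.write.mode("overwrite").parquet("/curated/events")
```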
Posted 2 weeks ago
1.0 - 3.0 years
9 - 13 Lacs
Pune
Work from Office
Overview
We are hiring an Associate Data Engineer to support our core data pipeline development efforts and gain hands-on experience with industry-grade tools like PySpark, Databricks, and cloud-based data warehouses. The ideal candidate is curious, detail-oriented, and eager to learn from senior engineers while contributing to the development and operationalization of critical data workflows.
Responsibilities
- Assist in the development and maintenance of ETL/ELT pipelines using PySpark and Databricks under senior guidance (see the sketch below).
- Support data ingestion, validation, and transformation tasks across Rating Modernization and Regulatory programs.
- Collaborate with team members to gather requirements and document technical solutions.
- Perform unit testing, data quality checks, and process monitoring activities.
- Contribute to the creation of stored procedures, functions, and views.
- Support troubleshooting of pipeline errors and validation issues.
Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related discipline.
- 3+ years of experience in data engineering, or internships in data/analytics teams.
- Working knowledge of Python and SQL, and ideally PySpark.
- Understanding of cloud data platforms (Databricks, BigQuery, Azure/GCP).
- Strong problem-solving skills and eagerness to learn distributed data processing.
- Good verbal and written communication skills.
What we offer you
- Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing.
- Flexible working arrangements, advanced technology, and collaborative workspaces.
- A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results.
- A global network of talented colleagues, who inspire, support, and share their expertise to innovate and deliver for our clients.
- A Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro, and tailored learning opportunities for ongoing skills development.
- Multi-directional career paths that offer professional growth and development through new challenges, internal mobility, and expanded roles.
- An environment that builds a sense of inclusion, belonging, and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women's Leadership Forum.
At MSCI we are passionate about what we do, and we are inspired by our purpose: to power better investment decisions. You'll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards, and perform beyond expectations for yourself, our clients, and our industry.
MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process. MSCI Inc. is an equal opportunity employer.
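A hedged sketch of the kind of small, unit-testable PySpark transformation an associate engineer might own; the function, columns, and rules are hypothetical, not MSCI's actual pipeline code.

```python
from pyspark.sql import DataFrame
from pyspark.sql import functions as F

def standardize_ratings(raw: DataFrame) -> DataFrame:
    """Trim identifiers, normalize the rating code, and drop unusable rows."""
    return (
        raw.withColumn("entity_id", F.trim("entity_id"))
           .withColumn("rating_code", F.upper(F.col("rating_code")))
           .dropna(subset=["entity_id", "rating_code"])
    )

# In a unit test: build a tiny DataFrame, apply the function, assert on the rows.
```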
It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries.
To all recruitment agencies: MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes.
Note on recruitment scams: We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try to elicit personal information from job seekers. Read our full note on careers.msci.com.
Posted 2 weeks ago
6.0 - 8.0 years
1 - 4 Lacs
Chennai
Hybrid
Job Title: Snowflake Developer
Experience: 6-8 years
Location: Chennai - Hybrid
Job Description:
- 3+ years of experience as a Snowflake Developer or Data Engineer.
- Strong knowledge of SQL, SnowSQL, and Snowflake schema design (see the sketch below).
- Experience with ETL tools and data pipeline automation.
- Basic understanding of US healthcare data (claims, eligibility, providers, payers).
- Experience working with large-scale datasets and cloud platforms (AWS, Azure, GCP).
- Familiarity with data governance, security, and compliance (HIPAA, HITECH).
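As a rough illustration of the SQL-centric Snowflake work described, the following assumes the snowflake-connector-python package and placeholder credentials, stage, and table names.

```python
import snowflake.connector

# All connection values below are placeholders, not real credentials.
conn = snowflake.connector.connect(
    account="example_account",
    user="example_user",
    password="********",
    warehouse="ETL_WH",
    database="HEALTHCARE",
    schema="CLAIMS",
)
cur = conn.cursor()
try:
    # A typical pipeline step: load staged files into a claims table.
    cur.execute("COPY INTO claims_raw FROM @claims_stage FILE_FORMAT = (TYPE = CSV)")
    cur.execute("SELECT COUNT(*) FROM claims_raw")
    print(cur.fetchone())
finally:
    cur.close()
    conn.close()
```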
Posted 2 weeks ago
0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data, and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits, and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.
Primary Responsibilities
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary, or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.
Required Qualifications
- Good experience building data pipelines using ADF
- Good experience with programming languages such as Python and PySpark
- Solid proficiency in SQL and complex queries
- Demonstrated ability to learn and adapt to new data technologies
- Proven good skills in Azure data processing tools such as Azure Data Factory and Azure Databricks (see the sketch below)
- Proven good problem-solving skills
- Proven good communication skills
- Proven technical skills: Python and Azure data processing tools
Other Requirements
- Collaborate with the team, architects, and product stakeholders to understand the scope and design of a deliverable
- Participate in product support activities as needed by the team
- Understand the product architecture and features being built, and come up with product improvement ideas and POCs
- Individual contributor for Data Engineering: data pipelines, data modelling, and data warehousing
Preferred Qualifications
- Knowledge of or experience with containerization: Docker, Kubernetes
- Knowledge of or experience with the big data/Hadoop ecosystem: Spark, Hive, HBase, Sqoop, etc.
- Experience with APIs and integrating external data sources
- Experience with build or deployment automation: Jenkins
- Knowledge of or experience using Microsoft Visio and PowerPoint
- Knowledge of Agile or Scrum
At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location, and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups, and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.
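A minimal sketch of an Azure Databricks step that an ADF pipeline might trigger, reading Delta data from ADLS Gen2; the storage account, container, table names, and filter column are assumptions.

```python
from pyspark.sql import SparkSession, functions as F

# In a Databricks notebook `spark` is preconfigured; getOrCreate() keeps the
# sketch self-contained elsewhere. The abfss:// path below is a placeholder.
spark = SparkSession.builder.getOrCreate()

raw = spark.read.format("delta").load(
    "abfss://bronze@examplestorage.dfs.core.windows.net/claims"
)
curated = (
    raw.filter(F.col("status") == "ACTIVE")   # "status" is an assumed column
       .withColumn("load_date", F.current_date())
)
curated.write.format("delta").mode("append").saveAsTable("silver.claims_active")
```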
Posted 2 weeks ago
4.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Title: Assoc Data Engineering Mgr
About UHG
United Health Group is a leading health care company serving more than 85 million people worldwide. The organization is ranked 5th among Fortune 500 companies. UHG serves its customers through two different platforms: United Health Care (UHC) and Optum. UHC is responsible for providing healthcare coverage and benefits services, while Optum provides information- and technology-enabled health services. India operations of UHG are aligned to Optum. The Optum Global Analytics Team, part of Optum, develops broad-based and targeted analytics solutions across different verticals for all lines of business.
Qualifications & Requirements
- Bachelor's degree / 4-year university degree
- Experience: 7-10 years of experience with the skills below
Must-Have Skills
- Good understanding of Spark architecture
- Good understanding of data architecture and solutions in Azure
- Good skills in Azure data processing tools such as Azure Data Factory and Azure Databricks
- Good experience building data pipelines using ADF
- Strong proficiency in SQL and complex queries
- Good experience with programming languages such as Python and PySpark
- Ability to learn and adapt to new data technologies
- Knowledge of and experience with containerization: Docker, Kubernetes
- Knowledge of and experience with the big data and Hadoop ecosystem: Spark, Hive, HBase, Sqoop, etc.
- Build/deployment automation: Jenkins
- Good problem-solving skills
- Good communication skills
- Act in a strategic capacity as a senior technical expert for all current Azure cloud-based solutions while keeping abreast of industry cloud solutions
- Lead projects independently and with minimal guidance, including high-level client communication and project planning
Good to Have
- Technical lead for Data Engineering: data pipelines, data modelling, and data warehousing
- Experience developing cloud-based API gateways
- Experience with, and exposure to, API integration frameworks
- Certified in Azure Data Engineering (AZ-205)
- Excellent time management, communication, decision-making, and presentation skills
Position Responsibilities
- Work under the supervision of Data Architects to gather requirements and create data models for Data Science and Business Intelligence projects
- Work closely with Data Architects to create project plans and list the exhaustive set of activities required to implement the solution
- Engage in client communications for all important functions, including data understanding/exploration and strategizing solutions
- Document metadata information about the data sources used in the project and present that information to team members during team meetings
- Design and develop data marts, de-normalized views, and data models for projects (see the sketch below)
- Design and develop data quality control processes around the data sets used for analysis
- Mentor and groom junior engineers
- Lead and drive knowledge-sharing sessions within the team
- Own the technical deliveries of the team
- Work with senior team members to develop new capabilities for the team
- Be accountable, possess drive, and be passionate about software development
- Look forward to building and applying technical and functional skills
- Focus on understanding goals, priorities, and plans
- Possess a problem-solving approach
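For the data marts and de-normalized views responsibility referenced above, here is a minimal Spark SQL sketch that flattens hypothetical fact and dimension tables into a reporting view; all table and column names are assumptions.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.enableHiveSupport().getOrCreate()

# A de-normalized reporting view joining a fact table to two dimensions.
# The mart/fact/dim databases and columns are illustrative placeholders.
spark.sql("""
    CREATE OR REPLACE VIEW mart.member_claims_flat AS
    SELECT c.claim_id,
           c.amount,
           m.member_name,
           p.provider_name
    FROM fact.claims c
    JOIN dim.members m ON c.member_id = m.member_id
    JOIN dim.providers p ON c.provider_id = p.provider_id
""")
```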
Posted 2 weeks ago
0.0 - 4.0 years
1 - 3 Lacs
Tiruvannamalai, Chennai, Vellore
Work from Office
We are looking for a highly motivated and experienced Branch Receivable Officer to join our team at Equitas Small Finance Bank. The ideal candidate will have 0-4 years of experience in the BFSI industry, preferably with knowledge of Assets, Inclusive Banking, SBL, Mortgages, and Receivables.
Roles and Responsibilities
- Manage and oversee branch receivables operations for timely and accurate payments.
- Develop and implement strategies to improve receivables management and reduce delinquencies.
- Collaborate with cross-functional teams to resolve customer complaints and issues.
- Analyze and report on receivables performance metrics to senior management.
- Ensure compliance with regulatory requirements and internal policies.
- Maintain accurate records and reports of receivables transactions.
Job Requirements
- Strong understanding of BFSI industry trends and regulations.
- Excellent communication and interpersonal skills.
- Ability to work in a fast-paced environment and meet deadlines.
- Proficiency in MS Office and other relevant software applications.
- Strong analytical and problem-solving skills.
- Ability to build strong relationships with customers and stakeholders.
- Experience working with Equitas Small Finance Bank is preferred.
Location: Chennai, Vellore, Tiruvannamalai, Tirupathur
Posted 2 weeks ago
1.0 - 3.0 years
1 - 3 Lacs
Tamil Nadu
Work from Office
We are looking for a highly motivated and experienced Branch Receivable Officer to join our team at Equitas Small Finance Bank. The ideal candidate will have 1-3 years of experience in the BFSI industry, preferably with knowledge of Assets, Inclusive Banking, SBL, Mortgages, and Receivables.
Roles and Responsibilities
- Manage and oversee branch receivables operations for timely and accurate payments.
- Develop and implement strategies to improve receivables management and reduce delinquencies.
- Collaborate with cross-functional teams to resolve customer complaints and issues.
- Analyze and report on receivables performance metrics to senior management.
- Ensure compliance with regulatory requirements and internal policies.
- Maintain accurate records and reports of receivables transactions.
Job Requirements
- Strong understanding of BFSI industry trends and regulations.
- Experience managing branch receivables operations and improving efficiency.
- Excellent communication and interpersonal skills.
- Ability to work in a fast-paced environment and meet deadlines.
- Strong analytical and problem-solving skills.
- Proficiency in using financial software and systems.
Posted 2 weeks ago
2.0 - 5.0 years
1 - 3 Lacs
Salem, Erode
Work from Office
We are looking for a highly skilled and experienced Branch Receivable Officer to join our team at Equitas Small Finance Bank. The ideal candidate will have 2-5 years of experience in the BFSI industry, preferably with a background in Assets, Inclusive Banking, SBL, Mortgages, or Receivables.
Roles and Responsibilities
- Manage and oversee branch receivables operations for timely and accurate payments.
- Develop and implement strategies to improve receivables management and reduce delinquencies.
- Collaborate with cross-functional teams to resolve customer complaints and issues.
- Analyze and report on receivables performance metrics to senior management.
- Ensure compliance with regulatory requirements and internal policies.
- Maintain accurate records and reports of receivables transactions.
Job Requirements
- Strong knowledge of BFSI industry trends and regulations.
- Experience managing branch receivables operations and improving efficiency.
- Excellent communication and interpersonal skills.
- Ability to work in a fast-paced environment and meet deadlines.
- Strong analytical and problem-solving skills.
- Proficiency in financial software and systems.
Posted 2 weeks ago
1.0 - 6.0 years
1 - 3 Lacs
Chennai, Kanchipuram
Work from Office
We are looking for a highly skilled and experienced Branch Receivable Officer to join our team at Equitas Small Finance Bank. The ideal candidate will have 1 to 6 years of experience in the BFSI industry, with expertise in Assets, Inclusive Banking, SBL, Mortgages, and Receivables.
Roles and Responsibilities
- Manage and oversee branch receivables operations for efficient cash flow.
- Develop and implement strategies to improve receivables management.
- Collaborate with cross-functional teams to resolve customer issues and enhance service quality.
- Analyze and report on receivables performance metrics to senior management.
- Ensure compliance with regulatory requirements and internal policies.
- Train and guide junior staff members to improve their skills and knowledge.
Job Requirements
- Strong understanding of BFSI industry trends and regulations.
- Excellent communication and interpersonal skills.
- Ability to work in a fast-paced environment and meet deadlines.
- Proficiency in MS Office and other relevant software applications.
- Strong analytical and problem-solving skills.
- Experience managing and leading a team of receivables professionals.
Posted 2 weeks ago
2.0 - 3.0 years
2 - 6 Lacs
Tamil Nadu
Work from Office
We are looking for a highly skilled and experienced Legal Receivable Officer to join our team at Equitas Small Finance Bank. The ideal candidate will have 2-3 years of experience in the BFSI industry, preferably with a background in Inclusive Banking - SBL, Mortgages, or Legal Receivables.
Roles and Responsibilities
- Manage and oversee legal receivables, ensuring timely recovery of outstanding amounts.
- Develop and implement effective strategies to minimize legal receivables and improve cash flow.
- Collaborate with cross-functional teams to resolve customer disputes and issues related to legal receivables.
- Analyze and report on legal receivables performance metrics, providing insights for improvement.
- Ensure compliance with regulatory requirements and internal policies related to legal receivables.
- Maintain accurate records and documentation of legal receivables transactions and interactions.
Job Requirements
- Strong knowledge of legal receivables principles, practices, and procedures.
- Excellent communication and interpersonal skills, with the ability to work effectively with customers and internal stakeholders.
- Ability to analyze complex data sets and provide actionable insights to support business decisions.
- Strong problem-solving skills, with the ability to think critically and creatively to resolve challenging issues.
- Proficiency in Microsoft Office applications, particularly Excel, Word, and PowerPoint.
- Experience working in a fast-paced environment, prioritizing multiple tasks and deadlines while maintaining attention to detail.
Posted 2 weeks ago