4.0 - 5.0 years
12 - 14 Lacs
Mumbai, New Delhi, Bengaluru
Work from Office
ETL Process Design: Designing and developing ETL processes using Talend for data integration and transformation.
Data Extraction: Extracting data from various sources, including databases, APIs, and flat files.
Data Transformation: Transforming data to meet business requirements and ensuring data quality.
Data Loading: Loading transformed data into target systems, such as data warehouses or data lakes.
Job Scheduling: Scheduling and automating ETL jobs using Talend's scheduling tools.
Performance Optimization: Optimizing ETL workflows for efficiency and performance.
Error Handling: Implementing robust error handling and logging mechanisms in ETL processes.
Data Profiling: Performing data profiling to identify data quality issues and inconsistencies.
Documentation: Documenting ETL processes, data flow diagrams, and technical specifications.
Collaboration with Data Teams: Working closely with data analysts, data scientists, and other stakeholders to understand data requirements.
Experience: Minimum 4 to maximum 7 years of relevant experience.
Locations: Mumbai, Delhi NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote
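The extract-transform-load stages listed above can be sketched in miniature. The following is an illustrative Python sketch using only the standard library; the file layout and field names are hypothetical, and a real Talend job would build each stage from graphical components rather than code:

```python
import csv
import io
import sqlite3

# Extract: read raw records from a flat file (an in-memory CSV stand-in here).
raw = io.StringIO("id,amount\n1,100\n2,\n3,250\n")
rows = list(csv.DictReader(raw))

# Transform: drop rows that fail a basic quality check and normalize types.
clean = [(int(r["id"]), int(r["amount"])) for r in rows if r["amount"]]

# Load: write the transformed rows into a target table.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, amount INTEGER)")
con.executemany("INSERT INTO sales VALUES (?, ?)", clean)
total = con.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
```

The same three-stage shape carries over regardless of tool: only the extraction connectors, transformation rules, and load targets change.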
Posted 1 month ago
8.0 years
25 - 30 Lacs
India
Remote
Job Title: Informatica ETL Developer
Primary Skills: Informatica ETL, SQL, Data Migration, Redshift
Total Experience Required: 8+ Years
Relevant Informatica Experience: Minimum 5+ Years
Location: Remote
Employment Type: Contract

Job Overview
We are seeking an experienced Informatica ETL Developer with strong expertise in ETL development, data migration, and SQL optimization, especially on Amazon Redshift. The ideal candidate will have a solid foundation in data warehousing principles and hands-on experience with Informatica PowerCenter and/or Informatica Cloud, along with proven experience migrating ETL processes from platforms like Talend or IBM DataStage to Informatica.

Key Responsibilities
Lead or support the migration of ETL processes from Talend or DataStage to Informatica.
Design, develop, and maintain efficient ETL workflows using Informatica PowerCenter or Informatica Cloud.
Write, optimize, and troubleshoot complex SQL queries, especially in Amazon Redshift.
Work on data lakehouse architectures and ensure smooth integration with ETL processes.
Understand and implement data warehousing concepts, including star/snowflake schema design, SCDs, and data partitioning.
Ensure ETL performance, data integrity, scalability, and data quality at all stages of processing.
Collaborate with business analysts, data engineers, and other developers to gather requirements and design end-to-end data solutions.
Perform performance tuning, issue resolution, and support production ETL jobs as needed.
Contribute to design and architecture discussions, documentation, and code reviews.
Work with structured and unstructured data and transform it per business logic and reporting needs.

Required Skills & Qualifications
Minimum 8+ years of experience in data engineering or ETL development.
At least 5+ years of hands-on experience with Informatica PowerCenter and/or Informatica Cloud.
Experience in ETL migration projects, specifically from Talend or DataStage to Informatica.
Proficiency in Amazon Redshift and advanced SQL scripting, tuning, and debugging.
Strong grasp of data warehousing principles, dimensional modeling, and ETL performance optimization.
Experience working with data lakehouse architecture (e.g., S3, Glue, Athena, etc. with Redshift).
Ability to handle large data volumes, complex transformations, and data reconciliation.
Strong understanding of data integrity, security, and governance best practices.
Effective communication skills and the ability to work cross-functionally with both technical and non-technical stakeholders.

Nice to Have
Experience with CI/CD for data pipelines or version control tools like Git.
Exposure to Agile/Scrum development methodologies.
Familiarity with Informatica Intelligent Cloud Services (IICS).
Experience with Python or shell scripting for automation.

Skills: ETL performance optimization, data integrity, Redshift, Informatica ETL, Amazon Redshift, SQL, Informatica, data migration, data lakehouse architecture, ETL development, data warehousing, data governance, architecture
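The slowly changing dimension (SCD) concept named in the responsibilities can be illustrated with a minimal Type 2 sketch: rather than overwriting an attribute, the current dimension row is expired and a new versioned row is inserted. Table and column names here are hypothetical, and a production implementation would typically live in Informatica mappings or warehouse SQL rather than Python:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE dim_customer (
    customer_id INTEGER, city TEXT,
    valid_from TEXT, valid_to TEXT, is_current INTEGER)""")
# Existing current row for customer 1.
con.execute("INSERT INTO dim_customer VALUES (1, 'Mumbai', '2024-01-01', NULL, 1)")

def scd2_update(con, customer_id, new_city, as_of):
    """Type 2 SCD: expire the current row, then insert a new current version."""
    cur = con.execute(
        "SELECT city FROM dim_customer WHERE customer_id=? AND is_current=1",
        (customer_id,)).fetchone()
    if cur and cur[0] != new_city:
        con.execute(
            "UPDATE dim_customer SET valid_to=?, is_current=0 "
            "WHERE customer_id=? AND is_current=1", (as_of, customer_id))
        con.execute("INSERT INTO dim_customer VALUES (?, ?, ?, NULL, 1)",
                    (customer_id, new_city, as_of))

scd2_update(con, 1, 'Pune', '2024-06-01')
history = con.execute(
    "SELECT city, is_current FROM dim_customer ORDER BY valid_from").fetchall()
```

After the update, the dimension preserves the full attribute history: the Mumbai row is closed out and the Pune row becomes the current version.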
Posted 1 month ago
8.0 - 10.0 years
0 Lacs
Gurugram, Haryana, India
On-site
As Associate Manager, Data Engineering, you will:
Lead the team of Data Engineers and develop innovative approaches to performance optimization and automation.
Analyze enterprise specifics to understand the current-state data schema and data model, and contribute to defining the future-state data schema, data normalization, and schema integration as required by the project.
Apply coding expertise, best practices, and guidance in Python, SQL, Informatica, and cloud data platform development to members of the team.
Collaborate with clients to harden, scale, and parameterize code to be scalable across brands and regions.
Understand business objectives and develop business intelligence applications that help monitor and improve critical business metrics.
Monitor project timelines, ensuring deliverables are met by team members.
Communicate frequently with stakeholders on project requirements, statuses, and risks.
Manage the monitoring of productionized processes to ensure pipelines execute successfully every day, communicating delays to stakeholders as required.
Contribute to the design of scalable data integration frameworks to move and transform a variety of large data sets.
Develop robust work products by following best practices through all stages of development, testing, and deployment.

Skills and Qualifications
B.Tech or master's degree in a quantitative field (statistics, business analytics, computer science).
Team management experience is a must. 8-10 years of experience (with at least 2-4 years of experience managing a team).
Vast background in all things data related.
Intermediate proficiency with Python and data-related libraries (PySpark, Pandas, etc.).
High proficiency with SQL (Snowflake a big plus). Snowflake is REQUIRED; we need someone with a high level of Snowflake experience.
Certification is a big plus.
AWS data platform development experience.
High proficiency with Data Warehousing and Data Modeling.
Experience with ETL tools (Informatica, Talend, DataStage) required. Informatica is our tool and is required; IICS or PowerCenter is accepted.
Ability to coach team members, setting them up for success in their roles.
Capable of connecting with team members and inspiring them to be their best.

The Yum! Brands story is simple. We have four distinctive, relevant and easy global brands (KFC, Pizza Hut, Taco Bell and The Habit Burger Grill) born from the hopes and dreams, ambitions and grit of passionate entrepreneurs. And we want more of this to create our future! As the world's largest restaurant company we have a clear and compelling mission: to build the world's most loved, trusted and fastest-growing restaurant brands. The key and not-so-secret ingredient in our recipe for growth is our unrivaled talent and culture, which fuels our results. We're looking for talented, motivated, visionary and team-oriented leaders to join us as we elevate and personalize the customer experience across our 48,000 restaurants, operating in 145 countries and territories around the world! We put pizza, chicken and tacos in the hands of customers through customized ordering, unique delivery approaches, app experiences, and click-and-collect services and consumer data analytics, creating unique customer dining experiences, and we are only getting started. Employees may work for a single brand and potentially grow to support all company-owned brands depending on their role. Regardless of where they work, as a company opening an average of 8 restaurants a day worldwide, the growth opportunities are endless.
Taco Bell has been named one of the 10 Most Innovative Companies in the World by Fast Company; Pizza Hut delivers more pizzas than any other pizza company in the world; and KFC still uses its 75-year-old finger lickin' good recipe, including secret herbs and spices, to hand-bread its chicken every day. Yum! and its brands have offices in Chicago, IL; Louisville, KY; Irvine, CA; Plano, TX; and other markets around the world. We don't just say we are a great place to work; our commitments to the world and our employees show it. Yum! has been named to the Dow Jones Sustainability North America Index and ranked among the top 100 Best Corporate Citizens by Corporate Responsibility Magazine, in addition to being named to the Bloomberg Gender-Equality Index. Our employees work in an environment where the value of "believe in all people" is lived every day, enjoying benefits including but not limited to: 4 weeks' vacation PLUS holidays, sick leave and 2 paid days to volunteer at the cause of their choice, and a dollar-for-dollar matching gift program; generous parental leave; competitive benefits including medical, dental, vision and life insurance, as well as a 6% 401(k) match, all encompassed in Yum!'s world-famous recognition culture.
Posted 1 month ago
3.0 years
0 Lacs
Gurugram, Haryana
On-site
About the Role: Grade Level (for internal use): 09
S&P Global Mobility
The Role: ETL Developer

The Team: The ETL team forms an integral part of Global Data Operations (GDO) and caters to the North America & EMEA automotive business line. Core responsibilities include translating business requirements into technical design and ETL jobs, along with unit testing, integration testing, regression testing, deployments, and production operations. The team has an energetic and dynamic group of individuals, always looking to work through a challenge. Ownership, raising the bar, and innovation are what the team runs on!

The Impact: The ETL team, being part of GDO, caters to the automotive business line and helps stakeholders with an optimum solution for their data needs. The role requires close coordination with global teams such as other development teams, research analysts, quality assurance analysts, architects, etc. The role is vital for the automotive business as it involves providing highly efficient data solutions with high accuracy to various stakeholders. The role forms a bridge between the business and technical stakeholders.

What's in it for you:
Constant learning, working in a dynamic and challenging environment!
Total Rewards. Monetary, beneficial, and developmental rewards!
Work Life Balance. You can't do a good job if your job is all you do!
Diversity & Inclusion. HeForShe!
Internal Mobility. Grow with us!

Responsibilities:
Using prior experience with file loading, cleansing, and standardization, translate business requirements into ETL design and efficient ETL solutions using Informatica PowerCenter (mandatory) and Talend Enterprise (preferred). Knowledge of TIBCO would be a preferred skill as well.
Understand relational database technologies and data warehousing concepts and processes.
Using prior experience with high-volume data processing, be able to deal with complex technical issues.
Work closely with all levels of management and employees across the automotive business line.
Participate as part of cross-functional teams responsible for investigating issues, proposing solutions, and implementing corrective actions.
Good communication skills are required for interfacing with various stakeholder groups; detail oriented with analytical skills.

What We're Looking For: The ETL development team within the Mobility domain is looking for a Software Engineer to work on design, development, and operations efforts in the ETL (Informatica) domain.

Primary skills and qualifications required:
Experience with Informatica and/or Talend ETL tools.
Bachelor's degree in Computer Science, with at least 3+ years of development and maintenance of ETL systems on Informatica PowerCenter and 1+ year of SQL experience.
3+ years of Informatica design and architecture experience and 1+ years of optimization and performance tuning of ETL code on Informatica.
1+ years of Python development experience, plus SQL and XML experience.
Working knowledge (or greater) of cloud-based technologies, development, and operations is a plus.

About S&P Global Mobility: At S&P Global Mobility, we provide invaluable insights derived from unmatched automotive data, enabling our customers to anticipate change and make decisions with conviction. Our expertise helps them to optimize their businesses, reach the right consumers, and shape the future of mobility. We open the door to automotive innovation, revealing the buying patterns of today and helping customers plan for the emerging technologies of tomorrow. For more information, visit www.spglobal.com/mobility .

What's In It For You?
Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world.
Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include: Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. 
Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com . S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, "pre-employment training" or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here.
----------------------------------------------------------- Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf ----------------------------------------------------------- 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning) Job ID: 316976 Posted On: 2025-06-25 Location: Gurgaon, Haryana, India
Posted 1 month ago
3.0 years
0 Lacs
Itanagar, Arunachal Pradesh, India
Remote
Location: Remote
Mode: Contract (2 months with possible extension)
Years of experience: 3+ years
Shift: UK shift

Job Summary: We are looking for a highly motivated and detail-oriented Data Engineer with a strong background in data cleansing, Python scripting, and SQL to join our team. The ideal candidate will play a critical role in ensuring data quality, transforming raw datasets into actionable insights, and supporting data-driven decision-making across the organization.

Key Responsibilities:
Design and implement efficient data cleansing routines to remove duplicates, correct anomalies, and validate data integrity.
Write robust Python scripts to automate data processing, transformation, and integration tasks.
Develop and optimize SQL queries for data extraction, aggregation, and reporting.
Work closely with data analysts, business stakeholders, and engineering teams to understand data requirements and deliver clean, structured datasets.
Build and maintain data pipelines that support large-scale data processing.
Monitor data workflows and troubleshoot issues to ensure accuracy and reliability.
Contribute to documentation of data sources, transformations, and cleansing logic.

Requirements:
Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field.
3+ years of hands-on experience in data engineering, with a focus on data quality and cleansing.
Strong proficiency in Python, including libraries like Pandas and NumPy.
Expert-level knowledge of SQL and experience working with relational databases (e.g., PostgreSQL, MySQL, SQL Server).
Familiarity with data profiling tools and techniques.
Excellent problem-solving skills and attention to detail.
Good communication and documentation skills.

Preferred Qualifications:
Experience with cloud platforms (AWS, Azure, GCP) and data services (e.g., S3, BigQuery, Redshift).
Knowledge of ETL tools like Apache Airflow, Talend, or similar.
Exposure to data governance and data cataloging practices.
(ref: hirist.tech)
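A minimal sketch of the kind of data cleansing routine described above, using pandas; the DataFrame contents and column names are hypothetical, and real cleansing rules would come from the business's data-quality requirements:

```python
import pandas as pd

# Hypothetical raw extract containing a duplicate, a negative anomaly,
# and a missing value.
raw = pd.DataFrame({
    "order_id": [1, 1, 2, 3, 4],
    "amount":   [100, 100, -50, None, 250],
})

def cleanse(df):
    """Basic cleansing: deduplicate on the key, drop negative or missing amounts."""
    df = df.drop_duplicates(subset="order_id")
    df = df[df["amount"].notna() & (df["amount"] > 0)]
    return df.reset_index(drop=True)

clean = cleanse(raw)
```

In practice such a routine would also log what it dropped, so that data-integrity issues can be reported back to the source system rather than silently discarded.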
Posted 1 month ago
4.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Job Description: ETL + QA Engineer

Role Overview: We are looking for a skilled ETL + QA Engineer to join our team and ensure the quality and reliability of data pipelines, ETL processes, and applications. The ideal candidate should have expertise in both ETL testing and QA methodologies, with a strong understanding of data validation, automation, and performance testing.

Key Responsibilities:

ETL Testing:
Validate ETL workflows, data transformations, and data migration processes.
Perform data integrity checks across multiple databases (SQL, NoSQL, data lakes).
Verify data extraction, transformation, and loading (ETL) processes from source to destination.
Write and execute SQL queries to validate data accuracy and consistency.
Identify and report issues related to data loss, truncation, and incorrect transformations.

Quality Assurance:
Develop, maintain, and execute test cases, test plans, and test scripts for ETL processes and applications.
Perform manual and automation testing for APIs, UI, and backend systems.
Conduct functional, integration, system, and regression testing.
Utilize API testing tools (Postman, RestAssured) to validate endpoints and data responses.
Automate test cases using Selenium, Python, or another scripting language.
Participate in performance and scalability testing for ETL jobs.

CI/CD & Automation (preferable):
Implement automation frameworks for ETL testing.
Integrate tests within CI/CD pipelines using Jenkins, GitHub Actions, or Azure DevOps.
Collaborate with developers and data engineers to improve testing strategies and defect resolution.

Required Skills & Experience:

Technical Skills:
Strong experience in ETL testing and data validation.
Proficiency in SQL and PL/SQL for complex queries and data verification.
Hands-on experience with ETL tools like Informatica, Talend, SSIS, or Apache NiFi.
Experience in API testing (Postman, RestAssured, SoapUI).
Knowledge of automation tools (Selenium, Python, Java, or C#).
Familiarity with cloud platforms (AWS, GCP, Azure) and data warehousing solutions (Snowflake, Redshift, BigQuery).
Experience working with CI/CD pipelines and Git-based workflows.
Knowledge of scripting languages (Python, Shell, or PowerShell) is a plus.

Soft Skills:
Strong analytical and problem-solving skills.
Excellent communication and documentation skills.
Ability to work in an Agile/Scrum environment.
Strong collaboration skills to work with cross-functional teams.

Preferred Qualifications:
Bachelor's/Master's degree in Computer Science, IT, or a related field.
4-7 years of experience in ETL testing and QA.
Certifications in ISTQB, AWS, Azure, or any relevant ETL tool are a plus.
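The ETL validation described above often reduces to reconciling row counts and aggregates between source and target after a load. A minimal Python sketch against an in-memory SQLite database (table names hypothetical; a real check would run the same queries against the actual source and warehouse):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE src (id INTEGER, amount INTEGER);
    CREATE TABLE tgt (id INTEGER, amount INTEGER);
    INSERT INTO src VALUES (1, 100), (2, 200), (3, 300);
    INSERT INTO tgt SELECT * FROM src;  -- stand-in for the ETL load under test
""")

def reconcile(con):
    """Compare row counts and amount totals between source and target tables."""
    s_cnt, s_sum = con.execute("SELECT COUNT(*), SUM(amount) FROM src").fetchone()
    t_cnt, t_sum = con.execute("SELECT COUNT(*), SUM(amount) FROM tgt").fetchone()
    return s_cnt == t_cnt and s_sum == t_sum

ok = reconcile(con)
```

Count-and-sum checks catch dropped or duplicated rows cheaply; column-level checksums or row-by-row diffs are the next step when a mismatch is found.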
Posted 1 month ago
10.0 - 15.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Designation: Data Architect
Location: Pune
Experience: 10-15 years

Skills:
Azure Expertise: The architect should have experience in architecting large-scale analytics solutions using native services such as Azure Synapse, Data Lake, Data Factory, HDInsight, Databricks, Azure Cognitive Services, Azure ML, and Azure Event Hub.
Architecture Creation: Assist with creation of a robust, sustainable architecture that supports requirements and provides for expansion with secured access.
BFSI Experience: Experience in building/running large data environments for BFSI clients.
Collaboration: Work with customers, end users, technical architects, and application designers to define the data requirements and data structure for BI/Analytics solutions.
Data Modeling: Design conceptual and logical models for the data lake, data warehouse, data mart, and semantic layer (data structure, storage, and integration). Lead the database analysis, design, and build effort.
Communication: Communicate physical database designs to the lead data architect/database administrator.
Data Model Evolution: Evolve data models to meet new and changing business requirements.
Business Analysis: Work with business analysts to identify and understand requirements and source data systems.
Big Data Technologies: Expert in big data technologies on Azure/GCP.
ETL Platforms: Experience with ETL platforms like ADF, Glue, Ab Initio, Informatica, Talend, and Airflow.
Data Visualization: Experience with data visualization tools like Tableau, Power BI, etc.
Data Engineering & Management: Experience in a data engineering, metadata management, database modeling, and development role.
Streaming Data Handling: Strong experience in handling streaming data with Kafka.
Data API Understanding: Understanding of data APIs and web services.
Data Security: Experience in data security, data archiving/backup, and encryption, and in defining standard processes for the same.
DataOps/MLOps: Experience in setting up DataOps and MLOps.
Database Design: Ensure that the database designs fulfill the requirements, including data volume, frequency, and long-term BI/Analytics growth requirements. Integration: Work with other architects to ensure that all components work together to meet objectives and performance goals as defined in the requirements. System Performance: Improve system performance by conducting tests, troubleshooting, and integrating new elements. Data Science Coordination: Coordinate with the Data Science Teams to identify future data needs and requirements and creating pipelines for them. Soft Skills: Soft skills such as communication, leading the team, taking ownership and accountability to successful engagement. Quality Management: Participate in quality management reviews. Customer Management: Managing customer expectation and business user interactions. Research and Development: Deliver key research (MVP, POC) with an efficient turn-around time to help make strong product decisions. Mentorship: Demonstrate key understanding and expertise on modern technologies, architecture, and design. Mentor the team to deliver modular, scalable, and high-performance code. Innovation: Be a change agent on key innovation and research to keep the product, team at the cutting edge of technical and product innovation.
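Since the role calls for handling streaming data with Kafka, here is a minimal pure-Python sketch of the kind of per-key rollup a streaming consumer might perform. No Kafka client is used; the event list stands in for messages read from a topic, and all names (aggregate_events, the account/amount fields) are illustrative assumptions, not from the posting.

```python
from collections import Counter

def aggregate_events(events):
    """Count transactions per account, as a simple streaming-style rollup."""
    counts = Counter()
    for event in events:
        counts[event["account"]] += 1
    return counts

# Stand-in for messages consumed from a Kafka topic.
stream = [
    {"account": "A1", "amount": 100},
    {"account": "A2", "amount": 50},
    {"account": "A1", "amount": 75},
]
totals = aggregate_events(stream)
```

In a real deployment the loop body would run inside a consumer poll loop, with the aggregate periodically flushed to the serving layer.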
Posted 1 month ago
3.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS – Data and Analytics (D&A) – SSIS – Senior
We’re looking for Informatica or SSIS engineers with a cloud background (AWS, Azure).

Primary skills:
Has played key roles in multiple large global transformation programs on business process management.
Experience in database querying using SQL.
Experience building/integrating data into a data warehouse.
Experience in data profiling and reconciliation.
Informatica PowerCenter/IBM DataStage/SSIS development.
Strong proficiency in SQL/PLSQL.
Good experience in performance tuning ETL workflows and suggesting improvements.
Expertise in complex data management or application integration solutions and their deployment in areas of data migration, data integration, application integration, or data quality.
Experience in data processing, orchestration, parallelization, transformations, and ETL fundamentals.
Leverages a variety of programming languages and data crawling/processing tools to ensure data reliability, quality, and efficiency (optional).
Experience with cloud data-related tools (Microsoft Azure, Amazon S3, or data lakes).
Knowledge of cloud infrastructure; knowledge of Talend Cloud is an added advantage.
Knowledge of data modelling principles.
Knowledge of Autosys scheduling.
Good experience in database technologies.
Good knowledge of Unix systems.

Responsibilities:
Work as a team member contributing to various technical streams of data integration projects.
Provide product- and design-level technical best practices.
Interface and communicate with the onsite coordinators.
Complete assigned tasks on time and report status regularly to the lead.
Build a quality culture.
Use an issue-based approach to deliver growth, market, and portfolio strategy engagements for corporates.
Strong communication, presentation, and team-building skills, with experience producing high-quality reports, papers, and presentations.
Experience in executing and managing research and analysis of companies and markets, preferably from a commercial due diligence standpoint.

Qualification:
BE/BTech/MCA (must) with industry experience of 3-7 years.
Experience in Talend jobs, joblets, and custom components.
Knowledge of error handling and performance tuning in Talend.
Experience in big data technologies such as Sqoop, Impala, Hive, YARN, Spark, etc.
Informatica PowerCenter/IBM DataStage/SSIS development.
Strong proficiency in SQL/PLSQL.
Good experience in performance tuning ETL workflows and suggesting improvements.
At least 3-4 clients for short-duration projects (6-8+ months), or at least 2 clients for projects lasting 1-2 years or more.
People with commercial acumen, technical experience, and enthusiasm to learn new things in this fast-moving environment.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 month ago
3.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Company Overview
With 80,000 customers across 150 countries, UKG is the largest U.S.-based private software company in the world. And we’re only getting started. Ready to bring your bold ideas and collaborative mindset to an organization that still has so much more to build and achieve? Read on. At UKG, you get more than just a job. You get to work with purpose. Our team of U Krewers are on a mission to inspire every organization to become a great place to work through our award-winning HR technology built for all. Here, we know that you’re more than your work. That’s why our benefits help you thrive personally and professionally, from wellness programs and tuition reimbursement to U Choose — a customizable expense reimbursement program that can be used for more than 200 needs that best suit you and your family, from student loan repayment, to childcare, to pet insurance. Our inclusive culture, active and engaged employee resource groups, and caring leaders value every voice and support you in doing the best work of your career. If you’re passionate about our purpose — people — then we can’t wait to support whatever gives you purpose. We’re united by purpose, inspired by you.

Job Description
The Analytics Consultant II (Level-2) is a business-intelligence-focused expert who participates in the delivery of analytics solutions and reporting for various UKG products such as Pro, UKG Dimensions, and UKG Datahub. The candidate is also responsible for interacting with other business and technical project stakeholders to gather business requirements and ensure successful delivery. The candidate should be able to leverage the strengths and capabilities of the software tools to provide an optimized solution to the customer. The Analytics Consultant II will also be responsible for developing custom analytics solutions and reports to specifications provided, and supporting the solutions delivered.
The candidate must be able to effectively communicate ideas both verbally and in writing at all levels in the organization, from executive staff to technical resources. The role requires working with the Program/Project Manager, the Management Consultant, and the Analytics Consultants to deliver the solution based upon the defined design requirements and ensure it meets the scope and customer expectations.

Responsibilities Include:
Interact with other business and technical project stakeholders to gather business requirements.
Deploy and configure the UKG Analytics and Data Hub products based on the design documents.
Develop and deliver best-practice visualizations and dashboards using BI tools such as Cognos, BIRT, or Power BI.
Put together a test plan, validate the deployed solution, and document the results.
Provide support during production cutover; after go-live, act as the first level of support for any requests that come through from the customer or other consultants.
Analyse the customer’s data to spot trends and issues, and present the results back to the customer.

Qualification:
3+ years’ experience designing and delivering analytical/business intelligence solutions required.
Cognos, BIRT, Power BI, or other business intelligence toolset experience required.
ETL experience using Talend or other industry-standard ETL tools strongly preferred.
Advanced SQL proficiency is a plus.
Knowledge of Google Cloud Platform, Azure, or similar is desired, but not required.
Knowledge of Python is desired, but not required.
Willingness to learn new technologies and adapt quickly is required.
Strong interpersonal and problem-solving skills.
Flexibility to support customers in different time zones is required.

Where we’re going
UKG is on the cusp of something truly special. Worldwide, we already hold the #1 market share position for workforce management and the #2 position for human capital management.
Tens of millions of frontline workers start and end their days with our software, with billions of shifts managed annually through UKG solutions today. Yet it’s our AI-powered product portfolio designed to support customers of all sizes, industries, and geographies that will propel us into an even brighter tomorrow! UKG is proud to be an equal-opportunity employer and is committed to promoting diversity and inclusion in the workplace, including the recruitment process. Disability Accommodation For individuals with disabilities that need additional assistance at any point in the application and interview process, please email UKGCareers@ukg.com
Posted 1 month ago
3.0 years
0 Lacs
Coimbatore, Tamil Nadu, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS – Data and Analytics (D&A) – SSIS – Senior
We’re looking for Informatica or SSIS engineers with a cloud background (AWS, Azure).

Primary skills:
Has played key roles in multiple large global transformation programs on business process management.
Experience in database querying using SQL.
Experience building/integrating data into a data warehouse.
Experience in data profiling and reconciliation.
Informatica PowerCenter/IBM DataStage/SSIS development.
Strong proficiency in SQL/PLSQL.
Good experience in performance tuning ETL workflows and suggesting improvements.
Expertise in complex data management or application integration solutions and their deployment in areas of data migration, data integration, application integration, or data quality.
Experience in data processing, orchestration, parallelization, transformations, and ETL fundamentals.
Leverages a variety of programming languages and data crawling/processing tools to ensure data reliability, quality, and efficiency (optional).
Experience with cloud data-related tools (Microsoft Azure, Amazon S3, or data lakes).
Knowledge of cloud infrastructure; knowledge of Talend Cloud is an added advantage.
Knowledge of data modelling principles.
Knowledge of Autosys scheduling.
Good experience in database technologies.
Good knowledge of Unix systems.

Responsibilities:
Work as a team member contributing to various technical streams of data integration projects.
Provide product- and design-level technical best practices.
Interface and communicate with the onsite coordinators.
Complete assigned tasks on time and report status regularly to the lead.
Build a quality culture.
Use an issue-based approach to deliver growth, market, and portfolio strategy engagements for corporates.
Strong communication, presentation, and team-building skills, with experience producing high-quality reports, papers, and presentations.
Experience in executing and managing research and analysis of companies and markets, preferably from a commercial due diligence standpoint.

Qualification:
BE/BTech/MCA (must) with industry experience of 3-7 years.
Experience in Talend jobs, joblets, and custom components.
Knowledge of error handling and performance tuning in Talend.
Experience in big data technologies such as Sqoop, Impala, Hive, YARN, Spark, etc.
Informatica PowerCenter/IBM DataStage/SSIS development.
Strong proficiency in SQL/PLSQL.
Good experience in performance tuning ETL workflows and suggesting improvements.
At least 3-4 clients for short-duration projects (6-8+ months), or at least 2 clients for projects lasting 1-2 years or more.
People with commercial acumen, technical experience, and enthusiasm to learn new things in this fast-moving environment.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 month ago
3.0 years
0 Lacs
Kanayannur, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS – Data and Analytics (D&A) – SSIS – Senior
We’re looking for Informatica or SSIS engineers with a cloud background (AWS, Azure).

Primary skills:
Has played key roles in multiple large global transformation programs on business process management.
Experience in database querying using SQL.
Experience building/integrating data into a data warehouse.
Experience in data profiling and reconciliation.
Informatica PowerCenter/IBM DataStage/SSIS development.
Strong proficiency in SQL/PLSQL.
Good experience in performance tuning ETL workflows and suggesting improvements.
Expertise in complex data management or application integration solutions and their deployment in areas of data migration, data integration, application integration, or data quality.
Experience in data processing, orchestration, parallelization, transformations, and ETL fundamentals.
Leverages a variety of programming languages and data crawling/processing tools to ensure data reliability, quality, and efficiency (optional).
Experience with cloud data-related tools (Microsoft Azure, Amazon S3, or data lakes).
Knowledge of cloud infrastructure; knowledge of Talend Cloud is an added advantage.
Knowledge of data modelling principles.
Knowledge of Autosys scheduling.
Good experience in database technologies.
Good knowledge of Unix systems.

Responsibilities:
Work as a team member contributing to various technical streams of data integration projects.
Provide product- and design-level technical best practices.
Interface and communicate with the onsite coordinators.
Complete assigned tasks on time and report status regularly to the lead.
Build a quality culture.
Use an issue-based approach to deliver growth, market, and portfolio strategy engagements for corporates.
Strong communication, presentation, and team-building skills, with experience producing high-quality reports, papers, and presentations.
Experience in executing and managing research and analysis of companies and markets, preferably from a commercial due diligence standpoint.

Qualification:
BE/BTech/MCA (must) with industry experience of 3-7 years.
Experience in Talend jobs, joblets, and custom components.
Knowledge of error handling and performance tuning in Talend.
Experience in big data technologies such as Sqoop, Impala, Hive, YARN, Spark, etc.
Informatica PowerCenter/IBM DataStage/SSIS development.
Strong proficiency in SQL/PLSQL.
Good experience in performance tuning ETL workflows and suggesting improvements.
At least 3-4 clients for short-duration projects (6-8+ months), or at least 2 clients for projects lasting 1-2 years or more.
People with commercial acumen, technical experience, and enthusiasm to learn new things in this fast-moving environment.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 month ago
10.0 years
0 Lacs
Indore, Madhya Pradesh, India
On-site
Role: Lead Data Engineer
Location: Indore
Experience required: 10+ years

Job Description:
Build and maintain data pipelines for ingesting and processing structured and unstructured data.
Ensure data accuracy and quality through validation checks and sanity reports.
Improve data infrastructure by automating manual processes and scaling systems.
Support internal teams (Product, Delivery, Onboarding) with data issues and solutions.
Analyze data trends and provide insights to inform key business decisions.
Collaborate with program managers to resolve data issues and maintain clear documentation.

Must-Have Skills:
Proficiency in SQL, Python (Pandas, NumPy), and R.
Experience with ETL tools (e.g., Apache NiFi, Talend, AWS Glue).
Cloud experience with AWS (S3, Redshift, EMR, Athena, RDS).
Strong understanding of data modeling, warehousing, and data validation.
Familiarity with data visualization tools (Tableau, Power BI, Looker).
Experience with Apache Airflow, Kubernetes, Terraform, Docker.
Knowledge of data lake architectures, APIs, and custom data formats (JSON, XML, YAML).
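The posting asks for data quality enforced "through validation checks and sanity reports". A minimal stdlib-only sketch of such a row-level validation step is below; all names (validate_rows, REQUIRED_FIELDS, the field names) are illustrative assumptions, not from the posting.

```python
# Hypothetical required schema for incoming records.
REQUIRED_FIELDS = {"id", "email", "country"}

def validate_rows(rows):
    """Split rows into valid and rejected, recording a reason per rejection."""
    valid, rejected = [], []
    for row in rows:
        missing = REQUIRED_FIELDS - row.keys()
        if missing:
            rejected.append((row, f"missing fields: {sorted(missing)}"))
        elif "@" not in str(row.get("email", "")):
            rejected.append((row, "malformed email"))
        else:
            valid.append(row)
    return valid, rejected

rows = [
    {"id": 1, "email": "a@x.com", "country": "IN"},
    {"id": 2, "country": "IN"},                      # missing email
    {"id": 3, "email": "not-an-email", "country": "IN"},
]
valid, rejected = validate_rows(rows)
```

In a pipeline, the rejected list would typically feed the sanity report while valid rows continue downstream.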
Posted 1 month ago
2.0 - 6.0 years
5 - 9 Lacs
Pune
Work from Office
Data Engineer
Common skills: SQL, GCP BigQuery, ETL pipelines using Python/Airflow, experience with Spark/Hive/HDFS, and data modeling for data conversion. Resources: 4.
Prior experience working on a conversion/migration HR project is an additional skill needed along with the skills mentioned above.
The Data Engineer should know the HR domain; all other requirements for the functional area are given by the customer.
Customer name: Uber
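The skills above center on ETL pipelines built with Python/Airflow for data conversion. A bare-bones sketch of the extract-transform-load step sequence such a job might chain is shown here, using plain Python so it stands alone; the functions, field names, and normalization rule are all illustrative assumptions, not details from the listing.

```python
def extract():
    """Stand-in for reading source HR records (e.g. from files or BigQuery)."""
    return [{"emp_id": "001", "dept": "HR "}, {"emp_id": "002", "dept": " eng"}]

def transform(records):
    """Normalize department codes, as a conversion/migration job might."""
    return [{**r, "dept": r["dept"].strip().upper()} for r in records]

def load(records, target):
    """Stand-in for writing to the warehouse; returns rows loaded."""
    target.extend(records)
    return len(records)

warehouse = []
loaded = load(transform(extract()), warehouse)
```

In Airflow, each of these functions would typically become its own task, with the scheduler handling ordering and retries.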
Posted 1 month ago
5.0 - 6.0 years
6 - 10 Lacs
Hyderabad, Pune
Work from Office
Sr Workato Developer
Required:
Experience: 5-6 years of experience in designing, developing, and deploying integration solutions using Workato.
Technical Proficiency: Strong hands-on experience with Workato, including creating recipes, managing triggers, actions, and jobs. Expertise in API integrations and working with cloud applications (Salesforce, NetSuite, ServiceNow, etc.).
Integration Experience: Proficiency in integrating SaaS and on-premise applications. Hands-on experience with RESTful and SOAP APIs, webhooks, and other integration protocols.
Programming Skills: Solid experience with scripting languages (e.g., JavaScript, Python) to handle data transformations, custom actions, and other logic.
Cloud Platforms: Experience working with cloud platforms like AWS, Azure, or Google Cloud.
Troubleshooting & Debugging: Strong problem-solving skills to identify, troubleshoot, and resolve complex integration issues.
Project Management: Ability to manage multiple projects, prioritize tasks, and meet deadlines in a fast-paced environment.
Communication: Strong communication skills, with the ability to articulate technical concepts clearly to both technical and non-technical stakeholders.
Preferred:
Certifications: Workato Certification or other relevant integration certifications.
ETL Tools: Knowledge of ETL processes and tools like Talend, Informatica, or similar is a plus.
Agile Experience: Familiarity with Agile methodologies and working in Scrum teams.
Database Knowledge: Experience with databases such as MySQL, SQL Server, or PostgreSQL.
Advanced Skills: Experience with advanced features in Workato, such as advanced mappings, custom connectors, and error handling.
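The posting's "Programming Skills" item covers scripting for data transformations between systems. Below is an illustrative Python sketch of the kind of source-to-target payload mapping such a step might perform; this is not Workato's API, and the function and field names (to_target_schema, externalId, amountCents) are assumptions chosen for the example.

```python
def to_target_schema(source):
    """Map a hypothetical source CRM record onto a hypothetical target API payload."""
    return {
        "externalId": source["id"],
        # Combine name parts; strip in case one part is empty.
        "fullName": f'{source["first_name"]} {source["last_name"]}'.strip(),
        # Convert a decimal-string amount to integer cents for the target API.
        "amountCents": round(float(source.get("amount", 0)) * 100),
    }

payload = to_target_schema(
    {"id": "OPP-9", "first_name": "Asha", "last_name": "Rao", "amount": "12.50"}
)
```

The same mapping logic would be expressed in a recipe's scripting step, with connector fields replacing the hard-coded dictionaries.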
Posted 1 month ago
5.0 - 7.0 years
7 - 11 Lacs
Bengaluru
Work from Office
Design, develop, and maintain system integration solutions that connect various applications and platforms using APIs, middleware, or other integration tools.
Collaborate with business analysts, architects, and development teams to gather requirements and implement robust integration workflows.
Monitor and troubleshoot integration processes, ensuring data consistency, accuracy, and performance.
Create technical documentation, perform testing, and resolve any integration-related issues.
Ensure compliance with security and data governance standards while optimizing system connectivity and scalability.
Stay updated with integration trends and tools to enhance system interoperability.
Posted 1 month ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your Role And Responsibilities
As a Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You’ll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you’ll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you’ll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In This Role, Your Responsibilities May Include:
Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques.
Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements.
Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours.
Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results.

Preferred Education: Master's Degree

Required Technical And Professional Expertise:
Expertise in designing and implementing scalable data warehouse solutions on Snowflake, including schema design, performance tuning, and query optimization.
Strong experience in building data ingestion and transformation pipelines using Talend to process structured and unstructured data from various sources.
Proficiency in integrating data from cloud platforms into Snowflake using Talend and native Snowflake capabilities.
Hands-on experience with dimensional and relational data modelling techniques to support analytics and reporting requirements.

Preferred Technical And Professional Experience:
Understanding of optimizing Snowflake workloads, including clustering keys, caching strategies, and query profiling.
Ability to implement robust data validation, cleansing, and governance frameworks within ETL processes.
Proficiency in SQL and/or shell scripting for custom transformations and automation tasks.
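Among the preferred skills above is implementing "data validation, cleansing, and governance frameworks within ETL processes". A tiny illustrative sketch of a cleansing step (trim, case-normalize, deduplicate on a key) that such a framework might apply before loading into the warehouse is shown here; the function name and email field are assumptions for the example, not part of the posting.

```python
def cleanse(records, key):
    """Trim and lowercase string fields, then deduplicate on the given key."""
    seen, out = set(), []
    for record in records:
        clean = {
            k: v.strip().lower() if isinstance(v, str) else v
            for k, v in record.items()
        }
        if clean[key] not in seen:
            seen.add(clean[key])
            out.append(clean)
    return out

rows = [{"email": " A@X.COM "}, {"email": "a@x.com"}, {"email": "b@y.com"}]
deduped = cleanse(rows, "email")
```

In a Talend job the equivalent logic would usually live in mapping and deduplication components, with rejected rows routed to a governance/audit flow.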
Posted 1 month ago
10.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Summary
Stakeholder Management: Engage with senior stakeholders from CIB, Private Banking, and FP&A to gather requirements and translate business needs into technical solutions.
Business-as-Usual (BAU) Operations: Oversee daily operations, ensuring the stability and accuracy of performance management dashboards and reports. Participate in projects covering the full data lifecycle, end to end, from design, implementation, and testing to documentation, delivery, support, and maintenance.
Change Management: Drive digital transformation by supporting the migration from legacy data platforms to a modern technology stack.
Data Engineering: Develop and optimize data pipelines, ensuring data accuracy, consistency, and accessibility for reporting.
Team Leadership: Manage a team of three data professionals, providing guidance, setting priorities, and fostering a collaborative work environment.
Process Improvement: Identify opportunities to enhance data processes, reporting efficiency, and automation. Understand financial processes end to end and work efficiently towards delivery of on-demand CFO analytics to support specific business scenarios.
Collaboration: Work closely with technology teams to implement scalable and efficient data solutions that meet business needs.
Controls and Governance: Ensure data integrity, security, and compliance with internal frameworks, implementing robust controls across reporting and data management processes. Perform and document data analysis, data validation, and data mapping/design.

Regulatory & Business Conduct
Display exemplary conduct and live by the Group’s Values and Code of Conduct. Take personal responsibility for embedding the highest standards of ethics, including regulatory and business conduct, across Standard Chartered Bank. This includes understanding and ensuring compliance with, in letter and spirit, all applicable laws, regulations, guidelines and the Group Code of Conduct.
Lead the [country / business unit / function/XXX [team] to achieve the outcomes set out in the Bank’s Conduct Principles: [Fair Outcomes for Clients; Effective Financial Markets; Financial Crime Compliance; The Right Environment.] Effectively and collaboratively identify, escalate, mitigate and resolve risk, conduct and compliance matters. [Insert local regulator e.g. PRA/FCA prescribed responsibilities and Rationale for allocation]. [Where relevant - Additionally, for subsidiaries or relevant non-subsidiaries] Serve as a Director of the Board of [insert name of entities]. Exercise authorities delegated by the Board of Directors and act in accordance with Articles of Association (or equivalent).

Key stakeholders:
GCFO COO Management & Leadership Team
CFO – CIB, WRB and products
Business Finance Teams
Regional and Country CFOs
TTO Finance and Functions

Key Responsibilities
Skills and Experience:
Process Management
Data Visualization and Financial Reporting
Project Planning and Agile Project Management
Management of project scope
Business Intelligence
Identification & Management of project Risks/Assumptions/Issues/Dependencies
Proficiency in analytical tools and platforms
Stakeholder Management
Communication and presentation skills
Banking Products
Team Management
SQL | PL-SQL
Data Modelling | Data Pipeline | ETL

Qualifications:
10+ years of commercial project implementation experience (Database | Data Warehousing | ETL | BI | Big Data | Data Science)
Commercial experience working in Business Intelligence (SAP Analytics Cloud | Tableau | MS Power BI | MicroStrategy | Qlik)
Proficiency in SQL (DML | DDL | DCL | TCL)
Commercial experience working on database/data warehouse projects (Oracle | Exasol | SAP BW | Teradata)
Commercial experience creating ETL processes using data integration tools (Dataiku | Informatica | Oracle DI | Talend | Datasphere | IBM DataStage)
Effective communication skills, including presenting to and influencing senior management.
A high degree of integrity and the ability to challenge the views and actions of others in a constructive manner. Ability to work effectively under pressure, multitask, lead through ambiguities, influence where he/she does not have direct authority, and build on unstructured, formative situations. A leader and a team player with the management ability and track record to secure the confidence and respect of peers, stakeholders, and the executive management team. A strong analytical and strategic mindset, coupled with a thorough understanding of business performance management outcomes. Should have worked in a business finance function with a strong FP&A background. Ability to understand and connect business drivers and the rationale for, and application of, those relevant to the cost management process. Significant experience of working with senior management teams and of interfacing with and influencing senior stakeholders. Strong ability to understand financial statements and their drivers and synthesize them into meaningful analyses as required. Develop analyses and interpretations as required to facilitate management decision making. Ability to operate in a diverse, international team environment and lead and inspire multi-disciplinary teams.

About Standard Chartered
We're an international bank, nimble enough to act, big enough for impact. For more than 170 years, we've worked to make a positive difference for our clients, communities, and each other. We question the status quo, love a challenge and enjoy finding new opportunities to grow and do better than before. If you're looking for a career with purpose and you want to work for a bank making a difference, we want to hear from you. You can count on us to celebrate your unique talents and we can't wait to see the talents you can bring us. Our purpose, to drive commerce and prosperity through our unique diversity, together with our brand promise, to be here for good, are achieved by how we each live our valued behaviours.
When you work with us, you'll see how we value difference and advocate inclusion. Together We Do the right thing and are assertive, challenge one another, and live with integrity, while putting the client at the heart of what we do Never settle, continuously striving to improve and innovate, keeping things simple and learning from doing well, and not so well Are better together, we can be ourselves, be inclusive, see more good in others, and work collectively to build for the long term What We Offer In line with our Fair Pay Charter, we offer a competitive salary and benefits to support your mental, physical, financial and social wellbeing. Core bank funding for retirement savings, medical and life insurance, with flexible and voluntary benefits available in some locations. Time-off including annual leave, parental/maternity (20 weeks), sabbatical (12 months maximum) and volunteering leave (3 days), along with minimum global standards for annual and public holiday, which is combined to 30 days minimum. Flexible working options based around home and office locations, with flexible working patterns. Proactive wellbeing support through Unmind, a market-leading digital wellbeing platform, development courses for resilience and other human skills, global Employee Assistance Programme, sick leave, mental health first-aiders and all sorts of self-help toolkits A continuous learning culture to support your growth, with opportunities to reskill and upskill and access to physical, virtual and digital learning. Being part of an inclusive and values driven organisation, one that embraces and celebrates our unique diversity, across our teams, business functions and geographies - everyone feels respected and can realise their full potential.
Posted 1 month ago
4.0 - 9.0 years
10 - 14 Lacs
Pune
Work from Office
Job Title: Sales Excellence - Client Success - Data Engineering Specialist - CF
Management Level: ML9
Location: Open
Must-have skills: GCP, SQL, Data Engineering, Python
Good-to-have skills: Managing ETL pipelines

Job Summary:
Who we are: Sales Excellence at Accenture empowers our people to compete, win and grow. We provide everything they need to grow their client portfolios, optimize their deals and enable their sales talent, all driven by sales intelligence. The team is aligned to Client Success, a new function supporting Accenture's approach to putting client value and client experience at the heart of everything we do to foster client love. Our ambition is that every client loves working with Accenture and believes we're the ideal partner to help them create and realize their vision for the future, beyond their expectations.

You are: A builder at heart, curious about new tools and their usefulness, eager to create prototypes, and adaptable to changing paths. You enjoy sharing your experiments with a small team and are responsive to the needs of your clients.

The work: The Center of Excellence (COE) enables Sales Excellence to deliver best-in-class service offerings to Accenture leaders, practitioners, and sales teams. As a member of the COE Analytics Tools & Reporting team, you will help build and enhance the data foundation for reporting and analytics tools, providing insights on underlying trends and key drivers of the business.

Roles & Responsibilities:
- Collaborate with Client Success, the Analytics COE, the CIO Engineering/DevOps team, and stakeholders to build and enhance the Client Success data lake.
- Write complex SQL scripts to transform data for dashboards and reports, and validate the accuracy and completeness of the data.
- Build automated solutions to support business operations and data transfers.
- Document and build efficient data models for reporting and analytics use cases.
- Ensure Data Lake accuracy, consistency, and timeliness, along with user acceptance and satisfaction.
- Work with Client Success, Sales Excellence COE members, the CIO Engineering/DevOps team and Analytics Leads to standardize data in the data lake.

Professional & Technical Skills:
- Bachelor's degree or equivalent experience in Data Engineering, analytics, or a similar field.
- At least 4 years of professional experience developing and managing ETL pipelines.
- A minimum of 2 years of GCP experience.
- Ability to write complex SQL and prepare data for dashboarding.
- Experience managing and documenting data models.
- Understanding of data governance and policies.
- Proficiency in Python and SQL scripting.
- Ability to translate business requirements into technical specifications for the engineering team.
- Curiosity, creativity, a collaborative attitude, and attention to detail.
- Ability to explain technical information to technical as well as non-technical users.
- Ability to work remotely with minimal supervision in a global environment.
- Proficiency with Microsoft Office tools.

Additional Information (nice to have):
- Master's degree in analytics or a similar field.
- Data visualization or reporting using text data as well as sales, pricing, and finance data.
- Ability to prioritize workload and manage downstream stakeholders.

About Our Company | Accenture

Qualification
Experience: Minimum 5+ years of experience is required
Educational Qualification: Bachelor's degree or equivalent experience in Data Engineering, analytics, or a similar field
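The SQL transformation and validation duties described above can be sketched as follows; this is a minimal illustration using Python's sqlite3 standing in for a warehouse, and all table and column names are hypothetical:

```python
import sqlite3

# Hypothetical schema: a raw sales fact table landed in the data lake.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_sales (client TEXT, fiscal_qtr TEXT, revenue REAL);
    INSERT INTO raw_sales VALUES
        ('Acme', '2024-Q1', 120.0), ('Acme', '2024-Q2', 150.0),
        ('Globex', '2024-Q1', 90.0), ('Globex', '2024-Q2', NULL);
""")

# Transform: aggregate revenue per client per quarter for a dashboard,
# treating missing revenue as zero so totals stay complete.
rows = conn.execute("""
    SELECT client, fiscal_qtr, SUM(COALESCE(revenue, 0.0)) AS total_revenue
    FROM raw_sales
    GROUP BY client, fiscal_qtr
    ORDER BY client, fiscal_qtr
""").fetchall()

# Validate completeness: every source row must be represented downstream.
source_count = conn.execute("SELECT COUNT(*) FROM raw_sales").fetchone()[0]
assert source_count == 4 and len(rows) == 4  # one row per client/quarter here
```

The completeness check at the end is the kind of accuracy validation the role calls for: compare source and transformed row coverage before a dashboard consumes the output.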
Posted 1 month ago
3.0 - 8.0 years
5 - 9 Lacs
Gurugram
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Data Warehouse ETL Testing
Good-to-have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with team members to develop innovative solutions and ensure seamless application functionality.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Develop and implement ETL test cases to ensure data accuracy.
- Conduct data validation and reconciliation processes.
- Collaborate with cross-functional teams to troubleshoot and resolve data issues.
- Create and maintain test documentation for ETL processes.
- Identify opportunities for process improvement and optimization.

Professional & Technical Skills:
- Must-have: Proficiency in Data Warehouse ETL Testing.
- Strong understanding of SQL and database concepts.
- Experience with ETL tools such as Informatica or Talend.
- Knowledge of data warehousing concepts and methodologies.
- Hands-on experience in data quality assurance and testing.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Data Warehouse ETL Testing.
- This position is based at our Gurugram office.
- A 15 years full-time education is required.

Qualification: 15 years full-time education
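A typical data validation and reconciliation check of the kind this role performs can be sketched as below; sqlite3 stands in for the warehouse, and the table names are illustrative:

```python
import sqlite3

# Illustrative source and target tables for an ETL reconciliation test.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE source_orders (order_id INTEGER, amount REAL);
    CREATE TABLE target_orders (order_id INTEGER, amount REAL);
    INSERT INTO source_orders VALUES (1, 10.0), (2, 25.5), (3, 40.0);
    INSERT INTO target_orders VALUES (1, 10.0), (2, 25.5), (3, 40.0);
""")

def reconcile(conn, source, target):
    """Compare row counts and summed amounts between source and target."""
    src_cnt, src_sum = conn.execute(
        f"SELECT COUNT(*), SUM(amount) FROM {source}").fetchone()
    tgt_cnt, tgt_sum = conn.execute(
        f"SELECT COUNT(*), SUM(amount) FROM {target}").fetchone()
    return {"count_match": src_cnt == tgt_cnt, "sum_match": src_sum == tgt_sum}

result = reconcile(conn, "source_orders", "target_orders")
assert result == {"count_match": True, "sum_match": True}
```

Count-and-checksum comparisons like this are a common first-line ETL test; column-level comparisons would follow once the aggregates agree.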
Posted 1 month ago
15.0 - 20.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: Talend ETL
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, addressing any challenges that arise, and providing guidance to your team members. You will also engage in strategic discussions to align project goals with organizational objectives, ensuring that the applications developed meet the highest standards of quality and functionality. Your role will require you to balance technical expertise with effective communication, fostering a collaborative environment that encourages innovation and problem-solving.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and implement necessary adjustments to meet deadlines.

Professional & Technical Skills:
- Must-have: Proficiency in Talend ETL.
- Strong understanding of data integration processes and methodologies.
- Experience with data warehousing concepts and practices.
- Familiarity with SQL and database management systems.
- Ability to troubleshoot and optimize ETL processes for performance.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Talend ETL.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.

Qualification: 15 years full-time education
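Talend jobs are built graphically, but the extract-transform-load pattern with reject handling that they implement can be sketched in plain Python; the data and field names here are purely illustrative:

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl_job")

def extract():
    # Illustrative in-memory source; a real job would read a database or file.
    return [{"id": "1", "amount": "10.5"}, {"id": "2", "amount": "bad"}]

def transform(row):
    # Cast types; raise on bad data so the caller can route the row to rejects.
    return {"id": int(row["id"]), "amount": float(row["amount"])}

def run_job():
    loaded, rejected = [], []
    for row in extract():
        try:
            loaded.append(transform(row))
        except (ValueError, KeyError) as exc:
            log.warning("rejecting row %r: %s", row, exc)
            rejected.append(row)  # kept for later review instead of failing the job
    return loaded, rejected

loaded, rejected = run_job()
assert len(loaded) == 1 and len(rejected) == 1
```

Routing bad rows to a reject store while logging the cause, rather than aborting the whole load, is the robustness property interviewers usually probe for in ETL troubleshooting questions.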
Posted 1 month ago
5.0 - 10.0 years
5 - 9 Lacs
Pune
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: SAP Native HANA SQL Modeling & Development
Good-to-have skills: Talend ETL, SAP BusinessObjects Data Services
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. Your typical day will involve collaborating with teams to develop innovative solutions and ensure seamless application functionality.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Lead and mentor junior professionals.
- Conduct regular team meetings to discuss progress and challenges.
- Stay updated on industry trends and technologies to enhance team performance.

Professional & Technical Skills:
- Must-have: Proficiency in SAP Native HANA SQL Modeling & Development.
- Good to have: Experience with Talend ETL, SAP BusinessObjects Data Services.
- Strong understanding of database management and optimization.
- Expertise in data modeling and schema design.
- Hands-on experience with SAP HANA Studio and SAP HANA Cloud Platform.
- Knowledge of data integration and data warehousing concepts.

Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP Native HANA SQL Modeling & Development.
- This position is based at our Pune office.
- A 15 years full-time education is required.

Qualification: 15 years full-time education
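The data modeling and schema design skills named above boil down to joining fact tables against dimensions, as modeled in HANA calculation views. A minimal sketch, with sqlite3 standing in for HANA and all table and column names invented for illustration:

```python
import sqlite3

# Minimal star schema: one dimension, one fact table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales (product_id INTEGER, qty INTEGER, price REAL);
    INSERT INTO dim_product VALUES (1, 'Hardware'), (2, 'Software');
    INSERT INTO fact_sales VALUES (1, 3, 10.0), (1, 1, 10.0), (2, 2, 50.0);
""")

# Aggregate fact rows against the dimension, the core of a reporting model.
rows = conn.execute("""
    SELECT d.category, SUM(f.qty * f.price) AS revenue
    FROM fact_sales f
    JOIN dim_product d ON d.product_id = f.product_id
    GROUP BY d.category
    ORDER BY d.category
""").fetchall()
print(rows)  # [('Hardware', 40.0), ('Software', 100.0)]
```

In HANA the same join-and-aggregate logic would typically live in a calculation view rather than an ad-hoc query, so that it can be reused and optimized by the engine.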
Posted 1 month ago
15.0 - 20.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Data Analysis & Interpretation
Good-to-have skills: Data Engineering
Minimum 5 year(s) of experience is required
Educational Qualification: Minimum 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications are optimized for performance and usability. You will also engage in problem-solving activities, providing insights and recommendations to enhance application functionality and user experience.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure alignment with business goals.

Professional & Technical Skills:
- Must-have: Advanced proficiency in Snowflake Data Cloud technology, DBT and cloud data warehousing.
- Good to have: Experience with Talend.
- Strong analytical skills to interpret complex data sets.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Cloud technology.
- A minimum of 15 years of full-time education is required.

Qualification: Minimum 15 years of full-time education
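The Snowflake/dbt skills listed above center on incremental models, which merge new or changed records into an existing table instead of rebuilding it. A rough sketch of that upsert pattern, using sqlite3's `ON CONFLICT` clause in place of Snowflake's `MERGE`, with an invented customer dimension:

```python
import sqlite3

# Existing dimension table, already loaded by a previous run.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, name TEXT);
    INSERT INTO dim_customer VALUES (1, 'Acme'), (2, 'Globex');
""")

# New batch: one updated record (2) and one brand-new record (3).
batch = [(2, 'Globex Corp'), (3, 'Initech')]
conn.executemany("""
    INSERT INTO dim_customer (customer_id, name) VALUES (?, ?)
    ON CONFLICT(customer_id) DO UPDATE SET name = excluded.name
""", batch)

rows = conn.execute(
    "SELECT customer_id, name FROM dim_customer ORDER BY customer_id").fetchall()
print(rows)  # [(1, 'Acme'), (2, 'Globex Corp'), (3, 'Initech')]
```

In dbt this logic would be expressed declaratively as an incremental model with a `unique_key`, and dbt would generate the merge statement against Snowflake.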
Posted 1 month ago
3.0 - 5.0 years
30 - 35 Lacs
Kolkata
Work from Office
Job Title: Enterprise Performance Management (Consolidation) - Consultant - S&C GN - CFO&EV
Management Level: 09 (Consultant)
Location: Gurgaon, Mumbai, Bangalore, Pune, Hyderabad
Must-have skills: Anaplan, Oracle EPM, SAP GR, SAC, OneStream, Tagetik, Workiva
Good-to-have skills: FP&A, data visualization tools
Experience: 3-5 years
Educational Qualification: MBA (Finance) or CA or CMA

Job Summary:
- Prepare and facilitate sessions on application design and process design.
- Apply financial concepts to translate functional requirements into functional/technical solution designs.
- Design and develop application components/objects in one of the EPM technologies (Oracle FCCS/HFM, OneStream, Tagetik etc.) based on the application design.
- Independently troubleshoot and resolve application/functional process challenges in a timely manner; map complex processes into logical design components for future-state processes.
- Lead individual work streams associated with a consolidation implementation. Examples include consolidations process lead, application and unit testing lead, training lead, and UAT lead.
- Assist with conversion and reconciliation of financial data for consolidations.
- Prepare key deliverables such as design documents, test documentation, training materials and administration/procedural guides.

Roles & Responsibilities:
- Strong understanding of accounting/financial concepts and of close and consolidation processes.
- Proven ability to work creatively and analytically in a problem-solving environment.
- Strong hands-on experience in at least one consolidation tool (Oracle FCCS/HFM, OneStream, Tagetik etc.).
- Strong communication (written and verbal), analytical and organizational skills.
- Proven success in contributing to a team-oriented environment; client experience preferred.

Professional & Technical Skills:
- 2-3 full implementations of consolidation solutions.
- 3-5 years of relevant experience implementing financial consolidation solutions in at least one EPM tool (Oracle FCCS/HFM, OneStream, Tagetik, SAP Group Reporting etc.) and financial consolidation processes.
- Strong hands-on experience in data conversion and reconciliation.
- Experience with HFM, HFR, FDMEE is a plus.

Additional Information:
- An opportunity to work on transformative projects with key G2000 clients.
- Potential to co-create with leaders in strategy, industry experts, enterprise function practitioners and business intelligence professionals to shape and recommend innovative solutions that leverage emerging technologies.
- Ability to embed responsible business into everything, from how you service your clients to how you operate as a responsible professional.
- Personalized training modules to develop your strategy and consulting acumen and to grow your skills, industry knowledge and capabilities.
- Opportunity to thrive in a culture that is committed to accelerating equality for all.
- Engage in boundaryless collaboration across the entire organization.

About Our Company | Accenture

Qualification
Experience: 3-5 years
Educational Qualification: MBA (Finance) or CA or CMA
Posted 1 month ago
7.0 - 12.0 years
5 - 9 Lacs
Coimbatore
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: PySpark, Python (programming language), Talend ETL
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will be responsible for utilizing your expertise in the Databricks Unified Data Analytics Platform to develop efficient and effective solutions. Your typical day will involve collaborating with the team, analyzing business requirements, designing and implementing applications, and ensuring the applications meet the desired functionality and performance standards.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Design, build, and configure applications based on business process and application requirements.
- Analyze business requirements and translate them into technical specifications.
- Collaborate with cross-functional teams to ensure the successful implementation of applications.
- Perform code reviews and provide guidance to junior developers.
- Stay updated with the latest industry trends and technologies to continuously improve application development processes.

Professional & Technical Skills:
- Must-have: Proficiency in Databricks Unified Data Analytics Platform, Python (programming language), Talend ETL, PySpark.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Pune office.
- A 15 years full-time education is required.

Qualification: 15 years full-time education
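The data munging steps named above (cleaning, transformation, normalization) can be sketched in plain Python; on Databricks the same logic would typically run over PySpark DataFrames, and the records and field names here are illustrative:

```python
# Illustrative raw records with the usual defects: stray whitespace,
# inconsistent casing, and a missing value.
raw = [
    {"name": "  Alice ", "score": "80"},
    {"name": "Bob", "score": None},      # incomplete record, to be dropped
    {"name": "carol", "score": "100"},
]

# Cleaning and transformation: drop incomplete records, trim whitespace,
# normalize casing, and cast scores to numbers.
clean = [
    {"name": r["name"].strip().title(), "score": float(r["score"])}
    for r in raw if r["score"] is not None
]

# Normalization: rescale scores to the [0, 1] range (min-max scaling).
lo = min(r["score"] for r in clean)
hi = max(r["score"] for r in clean)
for r in clean:
    r["score_norm"] = (r["score"] - lo) / (hi - lo)

print(clean)
```

Each step maps directly onto a PySpark equivalent (`filter`, `withColumn`, and an aggregation for the min/max), which is why interviewers often ask for the plain-Python version first.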
Posted 1 month ago