
1052 ETL Processes Jobs - Page 13

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

As a Data Integration Specialist (Talend Developer), you will be responsible for designing, developing, and implementing data integration solutions using Talend and other ETL tools. With a minimum of 5 years of experience in data integration, you will play a key role in ensuring seamless integration of data from various sources, including databases, APIs, and flat files. Your expertise in ETL processes and proficiency in Java will be essential in creating and maintaining ETL workflows and processes to extract, transform, and load data into target systems.

In this role, you will be involved in designing and managing data pipelines to support data ingestion, processing, and storage. You will collaborate with cross-functional teams to deliver high-quality solutions on time, working closely with business analysts, data architects, and stakeholders to understand data integration requirements. Your strong background in Java development will enable you to develop and maintain custom Java components and scripts to support data integration and transformation needs, ensuring code quality, reusability, and adherence to best practices. Furthermore, you will be responsible for monitoring and troubleshooting pipeline performance, resolving issues promptly, and optimizing ETL processes for performance, scalability, and reliability. Your solid knowledge of relational databases, SQL, and data modeling will be crucial in ensuring the efficiency and effectiveness of data integration solutions. Additionally, you will create and maintain comprehensive documentation for data integration processes and workflows, providing regular updates and reports on project progress and data integration activities.

To excel in this role, you must have a minimum of 5 years of experience in data integration and ETL processes, hands-on experience with Talend data integration tools, and proficiency in Java programming for data-related tasks. A strong understanding of ETL concepts and best practices, experience in designing and managing data pipelines, familiarity with data governance and data quality principles, excellent problem-solving skills, attention to detail, and strong communication and collaboration abilities are also required.
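The listing above centers on the extract-transform-load pattern. As a rough illustration only (the role's actual stack is Talend and Java, and every file, table, and column name below is hypothetical), a minimal Python sketch of the same pattern might look like this:

```python
import sqlite3

import pandas as pd

# Extract: read a flat-file source (file and column names are hypothetical)
orders = pd.read_csv("orders.csv", parse_dates=["order_date"])

# Transform: cleanse rows and derive a daily aggregate for the target model
orders = orders.dropna(subset=["order_id", "customer_id"])
orders["amount"] = orders["amount"].astype(float)
orders["order_day"] = orders["order_date"].dt.date
daily_totals = (
    orders.groupby("order_day", as_index=False)["amount"]
          .sum()
          .rename(columns={"amount": "total_amount"})
)

# Load: write into the target system (SQLite stands in for the real warehouse)
with sqlite3.connect("finance_dw.db") as conn:
    daily_totals.to_sql("daily_order_totals", conn, if_exists="replace", index=False)
```

In Talend the equivalent steps would be modeled as job components rather than hand-written code; the sketch only shows the shape of the extract, transform, and load stages.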

Posted 2 weeks ago

Apply

0.0 - 5.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

The Applications Development Programmer Analyst position is an intermediate-level role where you will be responsible for participating in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. Your main objective will be to contribute to applications systems analysis and programming activities.

In this role, you will utilize your knowledge of applications development procedures, concepts, and basic knowledge of other technical areas to identify and define necessary system enhancements. You will be responsible for identifying and analyzing issues, making recommendations, and implementing solutions. Additionally, you will utilize your knowledge of business processes, system processes, and industry standards to solve complex issues. As an Applications Development Programmer Analyst, you will conduct testing and debugging, utilize script tools, and write basic code for design specifications. It is important to assess the applicability of similar experiences and evaluate options under circumstances not covered by procedures. Developing a working knowledge of Citi's information systems, procedures, standards, client server application development, network operations, database administration, systems administration, data center operations, and PC-based applications will be essential in this role.

When making business decisions, you must appropriately assess risk, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients, and assets. This includes driving compliance with applicable laws, rules, and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct, and business practices, and escalating, managing, and reporting control issues with transparency.

To be successful in this role, you should have 0-2 years of relevant experience and experience in programming/debugging used in business applications. A working knowledge of industry practice and standards, comprehensive knowledge of specific business areas for application development, and working knowledge of program languages are required. You should consistently demonstrate clear and concise written and verbal communication. Education requirements include a Bachelor's degree/University degree or equivalent experience.

This job description provides a high-level review of the types of work performed, and other job-related duties may be assigned as required. For individuals with disabilities needing accommodations to use search tools or apply for career opportunities, please review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster for further information.

Posted 2 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Lead Data Analyst in Finance, you will be an integral part of our team, utilizing your expertise in analyzing large datasets and developing data-driven solutions to provide valuable financial insights. With a minimum of 6 years of experience, you will play a crucial role in supporting business decisions through your proficiency in SQL, Power BI, Azure, AWS, OneStream, and SAP-related financial analytics.

Your primary responsibilities will include analyzing extensive financial and transactional datasets to identify trends, developing and maintaining SQL queries and data models, and designing Power BI dashboards to visualize key financial KPIs and operational performance indicators. Collaboration with IT, finance, and operations teams will be essential as you work towards building scalable analytical solutions. Additionally, you will be involved in ETL processes for structured and unstructured financial data, ensuring data accuracy and integrity, and enhancing financial planning, consolidation, and reporting functionalities using OneStream and SAP. Your role will also involve researching, evaluating, and implementing new data analytics and business intelligence tools to enhance financial reporting capabilities.

Your qualifications should include a Bachelor's degree in accounting, finance, computer science, mathematics, or engineering, along with at least 6 years of experience in financial data analysis and business intelligence. Proficiency in SQL, Power BI, Azure, AWS, and SAP-related financial analytics is required, as well as expertise in ETL processes, data modeling, and financial data validation. Strong communication skills are crucial in this role, as you will be responsible for effectively conveying financial insights to finance leaders, stakeholders, and executive teams.

Preferred qualifications include a Master's degree in IT, Business Administration (MBA), or Data Science, experience in designing interactive dashboards in Power BI, familiarity with ETL tools such as Alteryx, and practical experience with OneStream and SAP for financial modeling. A Data Analytics Certification would be a plus.
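As a hedged illustration of the SQL-plus-dashboard work this listing describes, the sketch below computes a simple month-over-month spend KPI with pandas over a hypothetical general_ledger table (SQLite stands in for the real warehouse, and all table and column names are assumptions):

```python
import sqlite3

import pandas as pd

# Hypothetical ledger table: (posting_date TEXT, cost_center TEXT, amount REAL)
query = """
SELECT strftime('%Y-%m', posting_date) AS month,
       cost_center,
       SUM(amount) AS actual_spend
FROM general_ledger
GROUP BY month, cost_center
ORDER BY month
"""

with sqlite3.connect("finance.db") as conn:
    actuals = pd.read_sql(query, conn)

# Month-over-month change per cost center, the kind of KPI surfaced in a Power BI dashboard
actuals["mom_change_pct"] = (
    actuals.sort_values("month")
           .groupby("cost_center")["actual_spend"]
           .pct_change() * 100
)
print(actuals.head())
```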

Posted 2 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

Navi Mumbai, Maharashtra

On-site

About Alkem: Alkem Laboratories Limited is an Indian multinational pharmaceutical company headquartered in Mumbai that manufactures and sells pharmaceutical generics, formulations, and nutraceuticals in India and in over 50 countries globally. The company has consistently been ranked amongst the top five pharmaceutical companies in India. Alkem's portfolio includes renowned brands like Clavam, Pan, Pan-D, and Taxim-O, which are featured among the top 50 pharmaceutical brands in India.

Purpose of the Role: As a System Analyst at Alkem, your role is crucial in supporting the analysis, development, and continuous improvement of business processes and IT systems. Working closely with senior IT team members, you will be responsible for implementing and maintaining effective system solutions that align with organizational needs. A primary focus of your role will be on Power BI development, creating interactive reports, dashboards, and data visualizations to facilitate data-driven decision-making. Additionally, you will collaborate with senior developers and business stakeholders to gather requirements and transform data into meaningful insights.

Key Responsibilities:
- Assist in designing, modeling, and optimizing system architectures.
- Collaborate with senior management to devise technical solutions that meet business requirements.
- Analyze business needs and translate them into scalable and efficient system designs.
- Participate in creating and updating system architecture documentation.
- Develop Power BI reports, dashboards, and visualizations based on business requirements.
- Engage with business users to understand data needs and reporting requirements.
- Create and refine data models to ensure efficient reporting and analysis.
- Develop DAX (Data Analysis Expressions) queries for creating complex calculations and measures.
- Support data extraction, transformation, and loading (ETL) processes from various data sources.
- Assist in connecting Power BI to different data sources (SQL Server, Excel, APIs, etc.).
- Maintain Power BI reports and dashboards to ensure they reflect the latest data.
- Ensure data integrity, accuracy, and consistency in reports and dashboards.
- Collaborate with the development team to address any performance issues with Power BI reports.
- Contribute to documenting BI solutions, processes, and reports.
- Participate in testing and troubleshooting reports and dashboards for optimal functionality.
- Stay abreast of new features and best practices in line with industry standards.

Skills and Qualifications:
- Minimum 2-3 years of experience as a system analyst.
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Basic knowledge of system analysis, design, and development processes.
- Familiarity with programming languages (e.g., SQL, Java, Python) is advantageous.
- Strong analytical and problem-solving skills.
- Effective communication skills and ability to collaborate within a team.
- Eagerness to learn and thrive in a technical environment.
- Detail-oriented with the capacity to manage multiple tasks efficiently.
- Knowledge of database management systems (DBMS) and fundamental IT concepts is a plus.
- Familiarity with software development methodologies (e.g., Agile, Scrum).
- Understanding of cloud technologies, virtualization, and distributed systems is advantageous.
- Knowledge of databases, networking, and security concepts is beneficial.
- Proficiency in S4 HANA SAP is beneficial.
- Experience with Excel or other data analysis tools is advantageous.
- Proficiency in Power BI is essential for this role.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Chandigarh

On-site

As a Data Engineer, you will provide support to the Global BI team for Isolation Valves in their migration to Microsoft Fabric. Your primary focus will be on data gathering, modeling, integration, and database design to facilitate efficient data management. Your responsibilities will include developing and optimizing scalable data models to meet analytics and reporting requirements and utilizing Microsoft Fabric and Azure technologies for high-performance data processing.

In this role, you will collaborate with cross-functional teams, including data analysts, data scientists, and business collaborators, to understand their data needs and deliver effective solutions. You will leverage Fabric Lakehouse for data storage, governance, and processing to support Power BI and automation initiatives. Expertise in data modeling, with a specific emphasis on data warehouse and lakehouse design, will be essential. You will be responsible for designing and implementing data models, warehouses, and databases using MS Fabric, Azure Synapse Analytics, Azure Data Lake Storage, and other Azure services. Additionally, you will develop ETL processes using tools such as SQL Server Integration Services (SSIS) and Azure Synapse Pipelines to prepare data for analysis and reporting. Implementing data quality checks and governance practices to ensure data accuracy, consistency, and security will also be part of your role.

Your tasks will involve supervising and optimizing data pipelines and workflows for performance, scalability, and cost efficiency, utilizing Microsoft Fabric for real-time analytics and AI-powered workloads. Proficiency in Business Intelligence (BI) tools like Power BI and Tableau, along with experience in data integration and ETL tools such as Azure Data Factory, will be beneficial. You are expected to have expertise in Microsoft Fabric or similar data platforms and a deep understanding of the Azure Cloud Platform, particularly in data warehousing and storage solutions. Strong communication skills are essential, as you will need to convey technical concepts to both technical and non-technical stakeholders. The ability to work independently as well as within a team environment is crucial.

Preferred qualifications for this role include 3-5 years of experience in Data Warehousing with on-premises or cloud technologies, strong analytical abilities, and proficiency in database management, SQL query optimization, and data mapping. A willingness to work flexible hours based on project requirements, strong documentation skills, and advanced SQL skills are also required. Hands-on experience with Medallion Architecture for data processing, prior experience in a manufacturing environment, and the ability to quickly learn new technologies are advantageous. Travel up to 20% may be required. A Bachelor's degree or equivalent experience in Science, with a focus on MIS, Computer Science, Engineering, or a related field, is preferred. Good interpersonal skills in English for efficient collaboration with overseas teams and Agile certification are also desirable.

At Emerson, we value an inclusive workplace where every employee is empowered to grow and contribute. Our commitment to ongoing career development and fostering an innovative and collaborative environment ensures that you have the support to succeed. We provide competitive benefits plans, medical insurance options, employee assistance programs, recognition, and flexible time off plans to prioritize employee wellbeing.

Emerson is a global leader in automation technology and software, serving industries such as life sciences, energy, power, renewables, and advanced factory automation. We are committed to diversity, equity, and inclusion, and offer opportunities for career growth and development. Join our team at Emerson and be part of a community dedicated to making a positive impact through innovation and collaboration.
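The preferred qualifications above mention Medallion Architecture. A minimal PySpark sketch of that bronze/silver/gold layering is shown below; it assumes a Delta-enabled Spark session, and the storage paths, table names, and columns are hypothetical rather than anything specific to Fabric or this role:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion-demo").getOrCreate()

# Bronze: land raw source data as-is (path and schema are hypothetical)
bronze = spark.read.json("abfss://lake@account.dfs.core.windows.net/raw/valve_telemetry/")
bronze.write.mode("append").format("delta").save("/lakehouse/bronze/valve_telemetry")

# Silver: cleanse, deduplicate, and conform types
silver = (
    spark.read.format("delta").load("/lakehouse/bronze/valve_telemetry")
         .dropDuplicates(["device_id", "event_ts"])
         .withColumn("event_ts", F.to_timestamp("event_ts"))
         .filter(F.col("pressure_psi").isNotNull())
)
silver.write.mode("overwrite").format("delta").save("/lakehouse/silver/valve_telemetry")

# Gold: business-level aggregate ready for Power BI
gold = (
    silver.groupBy("device_id", F.to_date("event_ts").alias("event_date"))
          .agg(F.avg("pressure_psi").alias("avg_pressure_psi"))
)
gold.write.mode("overwrite").format("delta").save("/lakehouse/gold/daily_valve_pressure")
```

In Microsoft Fabric or Azure Synapse the same layering is typically expressed through Lakehouse tables and pipelines; the point of the sketch is only the progressive refinement from raw data to business-ready aggregates.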

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Telangana

On-site

As a skilled and detail-oriented Business Analyst with proven experience in the Healthcare domain, you will be playing a critical role in bridging the gap between business requirements and technical solutions. Your primary focus will be on data validation and analysis to ensure data integrity and support healthcare data initiatives. You will collaborate with business stakeholders, product owners, and development teams to gather and analyze business and data requirements in healthcare-related projects.

Your responsibilities will include translating business needs into detailed functional and non-functional specifications, user stories, and data mapping documents. Additionally, you will be performing end-to-end data validation to ensure accuracy, consistency, and completeness of healthcare data across systems. Understanding and documenting data flows across multiple healthcare platforms such as EHRs, claims systems, and data warehouses will be a key aspect of your role. You will create and execute test cases for data validation, ensuring data quality before, during, and after implementation. Supporting User Acceptance Testing (UAT) by validating data outputs and resolving data-related issues will also be part of your responsibilities. Identifying data discrepancies and anomalies, investigating root causes, and recommending solutions will be crucial. You will work closely with data engineers and QA teams to develop validation rules, test plans, and automated checks. Your role will also involve preparing comprehensive documentation including business requirement documents (BRDs), data dictionaries, traceability matrices, and validation reports while maintaining compliance with HIPAA and other healthcare regulations when handling sensitive data.

To qualify for this role, you should have a Bachelor's degree in Computer Science, Information Systems, Healthcare Administration, or a related field, along with at least 3 years of experience as a Business Analyst, preferably in the Healthcare industry. Strong experience in data validation, data profiling, and data quality assessment is required. Familiarity with healthcare data standards such as HL7, FHIR, ICD, CPT, LOINC, EDI 837/835, etc., and proficiency in writing SQL queries for data extraction, analysis, and validation are essential. Experience with tools like Excel, JIRA, Confluence, and data visualization/reporting tools (e.g., Tableau, Power BI) is preferred. Excellent communication, analytical thinking, and documentation skills are necessary for this role.

Preferred qualifications include experience with cloud data platforms (e.g., AWS, Azure) or healthcare data lakes, knowledge of ETL processes, familiarity with Agile/Scrum methodologies, and certification in healthcare analytics or business analysis (e.g., CBAP, PMI-PBA, AHIMA, or similar). Join us to be part of a mission-driven team transforming healthcare through data. You will have the opportunity to work on impactful projects with top-tier clients and healthcare providers, while also enjoying continuous learning and opportunities for career advancement.
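Since the listing emphasizes SQL-based data validation, here is a small, hedged Python sketch of the kind of completeness and referential-integrity checks it describes. The claims and members tables, column names, and rules are hypothetical:

```python
import sqlite3

# Hypothetical claims/members tables; the checks mirror typical data quality rules
checks = {
    "missing_member_id":
        "SELECT COUNT(*) FROM claims WHERE member_id IS NULL",
    "negative_paid_amount":
        "SELECT COUNT(*) FROM claims WHERE paid_amount < 0",
    "claims_without_matching_member":
        """SELECT COUNT(*) FROM claims c
           LEFT JOIN members m ON m.member_id = c.member_id
           WHERE m.member_id IS NULL""",
}

with sqlite3.connect("healthcare_dw.db") as conn:
    for name, sql in checks.items():
        failures = conn.execute(sql).fetchone()[0]
        status = "PASS" if failures == 0 else f"FAIL ({failures} rows)"
        print(f"{name}: {status}")
```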

Posted 2 weeks ago

Apply

0.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

At Capgemini Engineering, the world leader in engineering services, we bring together a global team of engineers, scientists, and architects to help the world's most innovative companies unleash their potential. From autonomous cars to life-saving robots, our digital and software technology experts think outside the box as they provide unique R&D and engineering services across all industries. Join us for a career full of opportunities. Where you can make a difference. Where no two days are the same.

Job Description:
- Design, develop, and maintain ETL processes using Pentaho Data Integration (Kettle).
- Extract data from various sources including databases, flat files, APIs, and cloud platforms.
- Transform and cleanse data to meet business and technical requirements.
- Load data into data warehouses, data lakes, or other target systems.
- Monitor and optimize ETL performance and troubleshoot issues.
- Collaborate with data architects, analysts, and business stakeholders to understand data requirements.
- Ensure data quality, integrity, and security throughout the ETL lifecycle.
- Document ETL processes, data flows, and technical specifications.

Job Description - Grade Specific: Focus on Industrial Operations Engineering. Develops competency in own area of expertise. Shares expertise and provides guidance and support to others. Interprets clients' needs. Completes own role independently or with minimum supervision. Identifies problems and relevant issues in straightforward situations and generates solutions. Contributes to teamwork and interacts with customers.

Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong heritage of over 55 years, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, generative AI, cloud and data, combined with its deep industry expertise and partner ecosystem.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

Greetings from Teknikoz!

You should have at least 5 years of experience in the following areas:
- Databricks skill set with PySpark and SQL (a sketch follows this listing)
- Strong proficiency in PySpark and SQL
- Understanding of data warehousing concepts
- ETL processes and data pipeline building with ADB/ADF
- Experience with the Azure cloud platform and knowledge of data manipulation techniques

In this role, you will be responsible for:
- Working with business teams to convert requirements into technical stories for migration
- Leading technical discussions and implementing solutions
- Experience with multi-tenant architecture and successful project delivery in the Databricks + Azure combination
- Experience with Unity Catalog is considered beneficial.

If you have the required experience and skills, we would like to hear from you!
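As a rough sketch of the PySpark-plus-SQL pipeline work this listing asks for, the snippet below mixes the DataFrame API with Spark SQL. It assumes a Databricks-style environment where the bronze/silver/gold table names exist; all names are hypothetical:

```python
from pyspark.sql import SparkSession, functions as F

# `spark` is provided automatically in a Databricks notebook; getOrCreate() covers other setups
spark = SparkSession.builder.getOrCreate()

# Bronze -> silver: cleanse a raw table (all table and column names are hypothetical)
raw = spark.read.table("bronze.sales_raw")
cleaned = (
    raw.filter(F.col("order_id").isNotNull())
       .withColumn("order_date", F.to_date("order_date"))
       .dropDuplicates(["order_id"])
)
cleaned.write.mode("overwrite").saveAsTable("silver.sales_clean")

# Silver -> gold: the listing pairs PySpark with SQL, so the aggregate is written in Spark SQL
daily = spark.sql("""
    SELECT order_date, SUM(amount) AS total_amount
    FROM silver.sales_clean
    GROUP BY order_date
""")
daily.write.mode("overwrite").saveAsTable("gold.daily_sales")
```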

Posted 2 weeks ago

Apply

3.0 - 8.0 years

0 Lacs

Kerala

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

We are seeking a highly skilled and analytical Reporting & Analytics Senior to join our Reporting & Analytics Center of Excellence (COE) based in Bangalore. The ideal candidate will possess a strong background in financial reporting and have expert-level experience in developing Power BI dashboards. This role requires a strategic thinker with a knack for problem-solving and the ability to build SQL procedures, analyze and interpret data to create reports that support business decisions, use logic and creative thinking, and apply advanced data analysis skills.

Key Responsibilities:
- Have a strong understanding of financial concepts and conduct analysis to identify trends, variances, and potential business risks, and present findings to stakeholders.
- Handle complex ad hoc reports from stakeholders in a timely manner.
- Create and maintain reports and dashboards to track key performance indicators (KPIs) and identify trends.
- Experienced in financial reporting with a strong grasp of report creation, data analysis, and the processes involved in delivering accurate and timely financial insights.
- Build and maintain SQL queries for data extraction, transformation, and loading (ETL) processes, and read/decode/interpret SQL queries for reporting and analysis.
- Create financial dashboards using Power BI by transforming raw data into cohesive, valuable reports capturing meaningful business insights.
- Design, develop, deploy, and maintain interactive interfaces using Power BI, including data visualizations, dashboards, and reports.
- Create DAX calculations and measures to support data analysis.
- Work closely with global stakeholders to understand their needs and deliver high-quality reports and visualizations that meet their expectations.
- Ensure accuracy, integrity, and security of financial data by conducting regular reviews, data validation, and troubleshooting, and maintaining detailed documentation.
- Stay up to date with industry trends and advancements in reporting and analytics tools and techniques.

Requirements:
- Proficiency in MS Excel (advanced functions, formulas, pivots, charts, etc.).
- Strong experience with ETL processes through Power Query.
- Experience in developing and optimizing Power BI solutions.
- Proficiency in building, reading, and editing SQL queries.
- Ability to design and implement scalable data models.
- Good knowledge of DAX calculations and multidimensional data modeling.
- Understanding of Microsoft Power Platform tools such as Power Apps and Power Automate would be highly desirable.

Qualifications:
- Bachelor's or Master's degree in Finance, Data Analytics, Computer Science, Information Science, or related fields.
- 3 to 8 years of experience in financial reporting and hands-on experience as a Power BI developer. Power BI certifications are a plus.
- Proven analytical, critical thinking, and problem-solving abilities.
- Detail-oriented with an unwavering commitment to accuracy and quality.
- Strong communication and interpersonal skills, with the ability to effectively communicate technical concepts to non-technical stakeholders.

Preferred Characteristics:
- A proactive, self-starter attitude with the initiative to seek out opportunities for improvement.
- A collaborative mindset that thrives in a team-oriented environment.
- A continuous learner who is passionate about staying current with industry best practices and emerging technologies.
- A poised and thoughtful demeanor in high-pressure situations.
- Continuous learning mindset and the ability to adapt to evolving reporting and analytics technologies.

What We Offer:
- A dynamic and supportive work environment that fosters development and growth.
- Opportunities to work on challenging projects and make a significant impact on the business.
- Competitive compensation package and benefits.
- A culture that values diversity, inclusion, and teamwork.

EY exists to build a better working world, helping to create long-term value for clients, people, and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 2 weeks ago

Apply

0.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job Description: Seeking a skilled and detail-oriented OAS/OBIEE Consultant to join our data and analytics team. The ideal candidate will be responsible for designing, developing, and maintaining business intelligence (BI) and dashboarding solutions to support smelter operations and decision-making processes. You will work closely with cross-functional teams to transform raw data into actionable insights using modern BI tools and ETL processes.

Key Responsibilities:
- Develop and maintain interactive dashboards and reports using Microsoft Power BI and Oracle Analytics.
- Design and implement ETL processes using Oracle Data Integrator and other tools to ensure efficient data integration and transformation.
- Collaborate with stakeholders to gather business requirements and translate them into technical specifications.
- Perform data analysis and validation to ensure data accuracy and consistency across systems.
- Optimize queries and data models for performance and scalability.
- Maintain and support Oracle Database and other RDBMS platforms used in analytics workflows.
- Ensure data governance, quality, and security standards are met.
- Provide technical documentation and user training as needed.

Required Skills and Qualifications:
- Proven experience in BI solutions, data analysis, and dashboard development.
- Strong hands-on experience with Microsoft Power BI, Oracle Analytics, and Oracle Data Integrator.
- Proficiency in Oracle Database, SQL, and relational database concepts.
- Solid understanding of ETL processes, data management, and data processing.
- Familiarity with business intelligence and business analytics best practices.
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration abilities.

Preferred Qualifications:
- Experience in the smelting or manufacturing industry is a plus.
- Knowledge of scripting languages (e.g., Python, Shell) for automation.
- Certification in Power BI, Oracle Analytics, or related technologies.
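To illustrate the kind of validation query an analytics consultant might run against the Oracle sources mentioned above, here is a hedged Python sketch using the python-oracledb driver. The connection details, table, and columns are placeholders, and in practice Oracle Data Integrator and Oracle Analytics would handle most of this work natively:

```python
import oracledb

# Connection details are placeholders; in practice they come from a secure config store
conn = oracledb.connect(user="analytics_ro", password="***", dsn="smelter-db:1521/ANALYTICS")

# Hypothetical daily-output table, checked for row counts and totals over the last week
validation_sql = """
    SELECT batch_date, COUNT(*) AS row_count, SUM(metal_output_tonnes) AS total_output
    FROM smelter_daily_output
    GROUP BY batch_date
    ORDER BY batch_date DESC
    FETCH FIRST 7 ROWS ONLY
"""

with conn:
    with conn.cursor() as cur:
        cur.execute(validation_sql)
        for batch_date, row_count, total_output in cur:
            print(batch_date, row_count, total_output)
```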

Posted 2 weeks ago

Apply

12.0 - 14.0 years

0 Lacs

India

On-site

About the Role:

The Team: Are you ready to dive into the world of data and uncover insights that shape global commodity markets? We're looking for a passionate BI Developer to join our Business Intelligence team within the Commodity Insights division at S&P Global. At S&P Global, we are on a mission to harness the power of data to unlock insights that propel our business forward. We believe in innovation, collaboration, and the relentless pursuit of excellence. Join our dynamic team and be a part of a culture that celebrates creativity and encourages you to push the boundaries of what's possible.

Key Responsibilities:

Unlocking the Power of Data: Collaborate on the end-to-end data journey, helping collect, cleanse, and transform diverse data sources into actionable insights that shape business strategies for functional leaders. Work alongside senior BI professionals to build powerful ETL processes, ensuring data quality, consistency, and accessibility.

Crafting Visual Storytelling: Develop eye-catching, impactful dashboards and reports that tell the story of commodity trends, prices, and global market dynamics. Bring data to life for stakeholders across the company, including executive teams, analysts, and developers, by helping to create visually compelling and interactive reporting tools. Mentor and train users on dashboard usage for efficient utilization of insights.

Becoming a Data Detective: Dive deep into commodities data to uncover trends, patterns, and hidden insights that influence critical decisions in real time. Demonstrate strong analytical skills to swiftly grasp business needs and translate them into actionable insights. Collaborate with stakeholders to define key metrics and KPIs and contribute to data-driven decisions that impact the organization's direction.

Engaging with Strategic Minds: Work together with cross-functional teams within business operations to turn complex business challenges into innovative data solutions. Gather, refine, and translate business requirements into insightful reports and dashboards that push our BI team to new heights. Provide ongoing support to cross-functional teams, addressing issues and adapting to changing business processes.

Basic Qualifications:
- 12+ years of professional experience in BI projects, focusing on dashboard development using Power BI or similar tools and deploying them on their respective online platforms for easy access.
- Proficiency in working with various databases such as Redshift, Oracle, and Databricks, using SQL for data manipulation, and implementing ETL processes for BI dashboards.
- Ability to identify meaningful patterns and trends in data to provide valuable insights for business decision-making.
- Knowledge of Generative AI, Microsoft Copilot, and Microsoft Fabric is a plus.
- Skilled in requirement gathering and developing BI solutions.
- Candidates with a strong background/proficiency in Power BI and Power Platform tools such as Power Automate/Apps, and intermediate to advanced proficiency in Python, are preferred.
- Essential understanding of data modeling techniques tailored to problem statements.
- Familiarity with cloud platforms (e.g., Azure, AWS) and data warehousing.
- Exposure to GenAI concepts and tools such as ChatGPT.
- Experience with Agile project implementation methods.
- Excellent written and verbal communication skills.
- Must be able to self-start and succeed in a fast-paced environment.
- Ability to write complex SQL queries or enhance the performance of existing ETL pipelines is a must.
- Familiarity with Azure DevOps will be an added advantage.

We're a trusted connector that brings together thought leaders, market participants, governments, and regulators to co-create solutions that lead to progress. Vital to navigating the Energy Transition, S&P Global Commodity Insights coverage includes oil and gas, power, chemicals, metals, agriculture and shipping. S&P Global Commodity Insights is a division of S&P Global (NYSE: SPGI). S&P Global is the world's foremost provider of credit ratings, benchmarks, analytics and workflow solutions in the global capital, commodity and automotive markets. With every one of our offerings, we help many of the world's leading organizations navigate the economic landscape so they can plan for tomorrow, today. For more information, visit .

What's In It For You

Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology-the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People: We're more than 35,000 strong worldwide-so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We're committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We're constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.

Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you-and your career-need to thrive at S&P Global. Our benefits include:
- Health & Wellness: Health care coverage designed for the mind and body.
- Flexible Downtime: Generous time off helps keep you energized for your time on.
- Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: It's not just about you.
S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards-small perks can make a big difference. For more information on benefits by country visit: Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, pre-employment training or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity. ----------------------------------------------------------- Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - ----------------------------------------------------------- IFTECH202.2 - Middle Professional Tier II (EEO Job Group)

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

Autodesk is a global leader in design and make technology, spanning various industries such as architecture, engineering, construction, media, entertainment, and manufacturing. Our mission is to empower innovators worldwide to tackle significant challenges by providing solutions that enable them to envision, design, and create a better world. The Content Experience Platform team plays a crucial role in delivering seamless and insightful content experiences to our diverse user base.

As a Knowledge Engineer at Autodesk, you will be an integral part of our Content Experience Platform team, responsible for building and maintaining the foundational knowledge platform that drives help documentation, learning resources, and community forums. You will collaborate with a team of passionate individuals who are dedicated to leveraging technology to ensure users have access to the information they need precisely when they need it. Located in Bengaluru, IN (Hybrid), you will contribute to enhancing the core infrastructure and systems that support Autodesk's extensive content ecosystem.

Your key responsibilities will include maintaining the knowledge platform architecture and infrastructure to ensure scalability, reliability, and performance. You will manage knowledge graph technologies, semantic models, and ontologies to effectively structure and connect our content. Additionally, you will work closely with content strategists, information architects, and other teams to translate content models into robust technical solutions. Monitoring system performance, troubleshooting issues, and contributing to technical documentation and best practices for content management are also vital aspects of this role.

To be successful in this position, you should hold a Bachelor's degree in Computer Science, Information Science, or a related field, along with 5+ years of experience in software engineering, data engineering, or a similar role focusing on knowledge management or content platforms. Strong knowledge of knowledge graph concepts, semantic web technologies, data modeling, and problem-solving skills are essential. Excellent communication, collaboration, and familiarity with emerging technologies in knowledge engineering and content management systems are highly valued.

Preferred qualifications include experience with content management systems (CMS), content delivery networks (CDNs), search technologies like Elasticsearch and Solr, as well as machine learning and natural language processing (NLP) techniques related to knowledge management. An understanding of Agile development practices, information architecture principles, and content strategy would be advantageous.

Join Autodesk in shaping a better future where you can be your authentic self, contribute meaningfully, and have a positive impact. Discover endless opportunities with us and become a part of our innovative and inclusive culture, where your potential can thrive. If you are ready to make a difference and shape the world, come and join us at Autodesk! Autodesk offers a competitive compensation package that includes base salaries, annual cash bonuses, stock grants, commissions, and comprehensive benefits. We are committed to fostering a diverse and inclusive workplace where everyone can flourish. To learn more about our commitment to diversity and belonging, please visit: https://www.autodesk.com/company/diversity-and-belonging.
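The role above revolves around knowledge graphs, semantic models, and ontologies. As a small, generic illustration (not Autodesk's actual platform), the Python sketch below uses rdflib to model a couple of content classes and a relation, then queries them with SPARQL; the namespace and resource names are invented:

```python
from rdflib import Graph, Literal, Namespace, RDF, RDFS

HELP = Namespace("https://example.com/help-content/")  # hypothetical namespace

g = Graph()
g.bind("help", HELP)

# Model a small slice of a content ontology: articles, products, and a linking property
g.add((HELP.Article, RDF.type, RDFS.Class))
g.add((HELP.Product, RDF.type, RDFS.Class))
g.add((HELP.documentsProduct, RDF.type, RDF.Property))

g.add((HELP.walls_howto, RDF.type, HELP.Article))
g.add((HELP.walls_howto, RDFS.label, Literal("How to create walls")))
g.add((HELP.design_app, RDF.type, HELP.Product))
g.add((HELP.walls_howto, HELP.documentsProduct, HELP.design_app))

# Query the graph: which articles document which products?
results = g.query("""
    SELECT ?article ?product WHERE {
        ?article <https://example.com/help-content/documentsProduct> ?product .
    }
""")
for article, product in results:
    print(article, "->", product)
```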

Posted 2 weeks ago

Apply

15.0 - 17.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job description

Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Associate Director - Software Engineering. In this role, you will be involved in Software Development, Kubernetes / Google Cloud Platform (GCP) Data Pipeline, MLOps, DevOps and Team Leadership. The ideal candidate will play a pivotal role in designing, implementing, and optimizing scalable solutions while mentoring and grooming the team to achieve technical excellence.

To be successful in this role, you should meet the following requirements:
- Over 15 years in software development, preferably with trade data.
- Hands-on expertise with Java 1.8+, Apache Spark (2.3/3.x), Hadoop (Spark/HDFS/Yarn), GCP, Elastic Search, RDBMS, SQL, Unix scripting, and ETL processes.
- Codes regularly, leads technical discussions, aligns with business objectives, takes ownership of all technical aspects of the platform, and stays updated with relevant technologies, patterns, and tools.
- Skilled in designing data frame objects, optimizing memory usage, and understanding database/file system write operations (see the sketch after this listing).
- Strong background in system and solution architecture, including cluster management for Spark workloads.
- Familiarity with microservices architecture, API-centric systems, and Spring Boot (4+), including reactive programming.
- Practical knowledge of cloud deployments, especially GCP or similar providers, and cloud infrastructure optimization.
- Knowledgeable in big data concepts, DevOps methodologies, and containerization (Docker, Kubernetes).
- Skilled in using Bitbucket/GitHub, Jenkins, and similar CI/CD tools; designs and maintains CI/CD pipelines.
- Provides mentorship, technical guidance, and code reviews for team members; establishes frameworks for junior developers.
- Builds relationships with other technical leads and principal engineers; promotes a collaborative, innovative, and growth-oriented team culture; conducts performance evaluations and delivers feedback.
- Prepares detailed technical designs based on functional requirements and manages technical tasks/tickets.
- Engages with business analysts, product owners, and other technical teams for requirement clarification and integration.

Principal responsibilities:
- Lead architecture design for items in alignment with the Future State Architecture.
- Establish, document, and implement best practices for end-to-end application initiation and deployment processes.
- Drive continuous improvement initiatives to enhance customer satisfaction.
- Demonstrate flexibility and adaptability according to project requirements.
- Attend and actively participate in relevant project meetings.
- System performance: ensure deliverables satisfy non-functional requirements.
- Industrialisation: ensure robust solutions are being developed and tech debt is reduced.
- Innovation: ensure that we are continually improving and benefitting from industry advancements.
- Ensure that assigned work packages (EPIC, Story, Sub-Tasks) align with the definition of ready and the definition of done.
- Ensure high-quality testing automation (e.g. unit, functional) is in place and meets the agreed level for delivered outputs.
- Technical excellence: influence the pod to deliver technically excellent solutions.
- The technical backlog is also an area of interest and responsibility for the Tech Lead position. The Tech Lead sets standards, ensures principles like DRY, SOLID, and Clean Code are followed, and ensures code quality, security, and scalability.

Requirements

Must have:
- Degree in Computer Science, Engineering, or a closely related discipline (Bachelor's or Master's).
- Over 15 years of expertise in software engineering and cloud platforms, particularly Google Cloud Platform (GCP).
- Deep knowledge of DevOps technologies such as Jenkins, GitLab CI/CD, Terraform, Kubernetes, and Docker.
- Practical experience with version control, automation, and orchestration tools like GIT, Jenkins, Ansible/Puppet, and Kubernetes.
- Advanced coding abilities in languages like Python and Java.
- Strong grasp of data engineering, pipeline architecture, and ETL methodologies.
- Excellent verbal and written communication, with strong interpersonal skills.
- Well-versed in DevOps strategies and containerization.
- Experienced with continuous integration and deployment tools (e.g., Jenkins, GitLab CI).
- Knowledgeable about cloud infrastructure and infrastructure-as-code concepts.
- Adept at handling multiple tasks, prioritizing, and collaborating across teams to achieve results.
- Collaborative team member, able to work across functions and engage with domain experts.
- Comfortable working with international teams and diverse cultures, with strong communication skills.

Good to have:
- Surveillance in general or Trade Surveillance domain knowledge.
- Experience with other cloud platforms (AWS, Azure) is a plus.
- Familiarity with monitoring tools like Prometheus, Grafana, or Stackdriver.
- Knowledge of data governance and compliance frameworks.
- Certifications in GCP (e.g., Professional Data Engineer, Professional Cloud Architect).
- Experience working with resources in geographically dispersed teams, appreciating and respecting local cultures.

You'll achieve more when you join HSBC. www.hsbc.com/careers

HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website. Issued by - HSBC Software Development India
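One requirement above is designing data frame objects and optimizing Spark memory usage for trade data. The following hedged PySpark sketch shows two common tactics, broadcasting a small reference table and repartitioning plus caching before repeated aggregation; the paths, columns, and partition count are illustrative assumptions, not HSBC specifics:

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("trade-pipeline-tuning").getOrCreate()

trades = spark.read.parquet("/data/trades")            # large fact table (path hypothetical)
instruments = spark.read.parquet("/data/instruments")  # small reference table

# Broadcast the small dimension to avoid shuffling the large trade set during the join
enriched = trades.join(broadcast(instruments), on="instrument_id", how="left")

# Repartition by the key used downstream so later aggregations shuffle once, then cache
enriched = enriched.repartition(200, "trade_date").cache()

daily_notional = (
    enriched.groupBy("trade_date", "asset_class")
            .agg(F.sum("notional").alias("total_notional"))
)
daily_notional.write.mode("overwrite").parquet("/data/output/daily_notional")
```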

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

At EY, you will have the opportunity to develop a career that is as unique as you are, supported by a global scale, inclusive culture, and cutting-edge technology to help you reach your full potential. Your distinctive voice and perspective are essential in contributing to EY's continuous improvement. By joining us, you will not only create an exceptional experience for yourself but also contribute to building a better working world for all.

As an RCE - Risk Data Engineer/Lead, you will be part of our dynamic Technology team that is dedicated to developing innovative digital solutions on a global scale in the Financial and Non-Financial services sector. In this senior technical role, you will utilize your expertise in data engineering, cloud infrastructure, platform operations, and production support using advanced cloud and big data technologies. The ideal candidate, with 3-6 years of experience, will have strong technical skills, a passion for learning, and a focus on supporting Financial Crime, Financial Risk, and Compliance technology transformation. Collaboration, adaptability, and a willingness to acquire new skills are key attributes needed for success in this fast-paced environment.

As a Senior Data & BI Analyst, you will play a vital role in the Enterprise Analytics Center of Excellence (COE), driving self-service analytics adoption within the organization using tools like ThoughtSpot and Tableau. Your responsibilities will include contributing to the COE's growth, serving as a subject matter expert in enterprise data, collaborating with business users to understand their analytics needs, and providing training sessions to enhance user proficiency with BI tools. Your technical expertise, strategic consulting skills, and passion for data-driven decision-making will be crucial in supporting the organization's analytics initiatives.

Key Responsibilities:
- Contribute to the development of the Enterprise Analytics COE and establish governance standards.
- Guide end users in identifying and utilizing key data sources within the enterprise lakehouse.
- Collaborate with business users to address their analytics requirements and offer advice on BI tool selection and data strategies.
- Conduct training sessions for ThoughtSpot and Tableau users to enhance self-service analytics capabilities.
- Perform hands-on data analysis to uncover insights and showcase tool functionalities.
- Create reusable analytics templates, dashboards, and frameworks to drive consistency and efficiency.
- Support teams in dashboard development and implementation of best practices.
- Stay informed about the latest BI trends, ThoughtSpot/Tableau advancements, and AI-driven analytics to drive innovation.

Required Qualifications:
- Experience as a Technical Data Analyst, BI Analyst, or Business Analyst in an enterprise setting.
- Proficiency in SQL, large dataset handling, data warehousing, ETL processes, and BI reporting tools like Tableau, Power BI, Qlik, and ThoughtSpot.
- Strong analytical and problem-solving skills with the ability to collaborate effectively across teams.
- Excellent stakeholder management skills and the capacity to communicate technical insights to non-technical audiences.

Preferred Qualifications:
- Hands-on experience with the ThoughtSpot BI Platform.
- Knowledge of financial services and financial data structures.
- Familiarity with cloud-based BI platforms such as AWS, Azure, etc.
- Understanding of Natural Language Processing (NLP), AI-driven data aggregation, and automated reporting technologies.

At EY, we are dedicated to building a better working world by creating long-term value for clients, people, and society while fostering trust in the capital markets. Our diverse teams across 150 countries leverage data and technology to provide assurance, drive growth, transformation, and operational excellence for our clients. By asking better questions and finding innovative solutions, EY teams address the complex challenges of today's world.

Posted 2 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

Haryana

On-site

Every day, tens of millions of people come to Roblox to explore, create, play, learn, and connect with friends in 3D immersive digital experiences all created by our global community of developers and creators. At Roblox, we are building the tools and platform that empower our community to bring any experience that they can imagine to life. Our vision is to reimagine the way people come together, from anywhere in the world, and on any device. We are on a mission to connect a billion people with optimism and civility, and we are looking for amazing talent to help us get there. A career at Roblox means you will be working to shape the future of human interaction, solving unique technical challenges at scale, and helping to create safer, more civil shared experiences for everyone.

The Roblox Operating System (ROS) team is responsible for the foundational technology and services that power all experiences on Roblox. This critical team ensures a seamless, performant, and reliable platform for our global community of users and developers. As the first Product Manager hire for our India office, you will report to Theresa Johnson, the Head of Product for ROS. You will play a pivotal role in building and enhancing our data analytics capabilities within the Roblox operating system, collaborating closely with the India-based Data Engineering team, which includes an Engineering Manager, three engineers, and multiple data scientists. This is a full-time onsite role based out of our Gurugram office with a shift time of 2:00 PM - 10:30 PM IST (cabs will be provided).

**Key Responsibilities:**
- Collaborate with data engineering and product engineering teams in India to build integrated analytics tooling.
- Develop cross-functional data visualization and reporting capabilities.
- Implement advanced insights extraction methodologies.
- Develop self-service data exploration tools.
- Integrate data analytics capabilities into the Roblox operating system.
- Ensure seamless data flow across organizational platforms.
- Implement cutting-edge data infrastructure solutions.
- Build a scalable data registry that will allow us to understand, register, classify, and govern data across all of ROS.
- Partner with Data Scientists to process and transform data into actionable insights.
- Contribute to achieving key outcomes such as reducing data access request resolution time, increasing self-service data exploration adoption, and achieving data pipeline reliability.

**Requirements:**
- A Bachelor's degree or equivalent experience in Computer Science, Computer Engineering, or a similar technical field.
- 8+ years of product management experience, with a focus on data platforms, analytics, or developer tools.
- Strong understanding of data infrastructure, data warehousing, and ETL processes.
- Proven ability to work autonomously in ambiguous environments.
- Experience collaborating with engineering and data science teams.
- Excellent communication and interpersonal skills.

**Desired Skills:**
- Strong product intuition.
- Highly organized.
- Collaborative team player.
- Adaptable and comfortable working in a fast-paced environment.
- Strategic thinker with a focus on delivering measurable results.

Please note that roles based in our San Mateo, CA Headquarters have specific in-office days.

Posted 2 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

You are a Lead / Senior Tableau Admin with AWS, having 8 to 12 years of experience. Your role involves managing and maintaining Tableau Server, ensuring its reliability, performance, and security. You will handle user management, security measures, data source connections, license management, backup and recovery, performance optimization, scaling, troubleshooting, version upgrades, monitoring and logging, training and support, collaboration, documentation, governance, integration, usage analytics, and staying current with Tableau updates and best practices.

Your responsibilities include installing, configuring, and maintaining Tableau Server, managing user accounts and permissions, implementing security measures for data protection, setting up data source connections, monitoring server performance and optimizing configurations, scaling resources as needed, diagnosing and resolving issues, planning and executing server upgrades, providing training and support to users, collaborating with teams for integration, generating reports on Tableau usage, and staying updated with Tableau features.

To excel in this role, you must have proven experience as a Tableau Administrator with strong skills in Tableau Server and Tableau Desktop. You should be familiar with AWS services relevant to hosting Tableau Server, have knowledge of SQL and data integration principles, possess problem-solving skills, and excel in communication and collaboration. Relevant certifications in Tableau and AWS are advantageous.

As a Lead / Senior Tableau Admin with AWS, you will contribute to the effective utilization of Tableau within the organization, enabling users to leverage data visualization and analytics for informed decision-making. Your qualifications should include a Bachelor's degree in Computer Science or a related field.

Posted 2 weeks ago

Apply

0.0 years

0 Lacs

noida, uttar pradesh, india

On-site

Ready to shape the future of work? At Genpact, we don't just adapt to change - we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions - we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Inviting applications for the role of Lead Consultant, Senior Data Scientist! In this role, you will have a strong background in Gen AI implementations, data engineering, developing ETL processes, and utilizing machine learning tools to extract insights and drive business decisions. The Data Scientist will be responsible for analysing large datasets, developing predictive models, and communicating findings to various stakeholders. Responsibilities: Develop and maintain machine learning models to identify patterns and trends in large datasets. Utilize Gen AI and various LLMs to design and develop production-ready use cases. Collaborate with cross-functional teams to identify business problems and develop data-driven solutions. Communicate complex data findings and insights to non-technical stakeholders in a clear and concise manner. Continuously monitor and improve the performance of existing models and processes. Stay up to date with industry trends and advancements in data science and machine learning. Design and implement data models and ETL processes to extract, transform, and load data from various sources. Good hands-on experience with AWS Bedrock models, SageMaker, Lambda, etc. Data Exploration & Preparation - Conduct exploratory data analysis and clean large datasets for modeling. Business Strategy & Decision Making - Translate data insights into actionable business strategies. Mentor Junior Data Scientists - Provide guidance and expertise to junior team members. Collaborate with Cross-Functional Teams - Work with engineers, product managers, and stakeholders to align data solutions with business goals. Qualifications we seek in you! Minimum Qualifications: Bachelor's / Master's degree in Computer Science, Statistics, Mathematics, or a related field. Relevant years of experience in a data science or analytics role. Strong proficiency in SQL and experience with data warehousing and ETL processes. Experience with programming languages such as Python or R is a must (either one). Familiarity with machine learning tools and libraries such as Pandas, scikit-learn, and AI libraries. Excellent knowledge of Gen AI, RAG, and LLM models, with a strong understanding of prompt engineering. Proficiency in Azure OpenAI and AWS SageMaker implementation. Good understanding of statistical techniques and advanced machine learning. Experience with data warehousing and ETL processes. Proficiency in SQL and database management.
Familiarity with cloud-based data platforms such as AWS, Azure, or Google Cloud. Experience with Azure ML Studio is desirable. Knowledge of different machine learning algorithms and their applications. Familiarity with data preprocessing and feature engineering techniques. Preferred Qualifications / Skills: Experience with model evaluation and performance metrics. Understanding of deep learning and neural networks is a plus. Certification in AWS Machine Learning or as an AWS infrastructure engineer is a plus. Why join Genpact? Be a transformation leader - Work at the cutting edge of AI, automation, and digital innovation. Make an impact - Drive change for global enterprises and solve business challenges that matter. Accelerate your career - Get hands-on experience, mentorship, and continuous learning opportunities. Work with the best - Join 140,000+ bold thinkers and problem-solvers who push boundaries every day. Thrive in a values-driven culture - Our courage, curiosity, and incisiveness - built on a foundation of integrity and inclusion - allow your ideas to fuel progress. Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together. Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
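As a rough illustration of the modelling side of this role, the sketch below trains a simple classifier with pandas and scikit-learn, the kind of libraries the posting lists. The file name and the "churned" label column are hypothetical stand-ins for a real dataset.

```python
# Illustrative model-training sketch; "customer_features.csv" and the
# "churned" column are hypothetical placeholders, not a real dataset.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

df = pd.read_csv("customer_features.csv")
X = df.drop(columns=["churned"])
y = df["churned"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# Report precision/recall so stakeholders can see the trade-offs plainly.
print(classification_report(y_test, model.predict(X_test)))
```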

Posted 2 weeks ago

Apply

0.0 years

0 Lacs

bengaluru, karnataka, india

On-site

Ready to shape the future of work? At Genpact, we don't just adapt to change - we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions - we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Inviting applications for the role of Principal Consultant - Data Engineer, Azure + Python! Responsibilities: Hands-on experience with Azure, PySpark, and Python with Kafka. Monitor and optimize the performance of cloud resources to ensure efficient utilization and cost-effectiveness. Implement and maintain security measures to protect data and systems within the Azure environment, including IAM policies, security groups, and encryption mechanisms. Develop application programs using Big Data technologies like Apache Hadoop, Apache Spark, etc. Build data pipelines by building ETL processes (Extract-Transform-Load). Implement backup, disaster recovery, and business continuity strategies for cloud-based applications and data. Responsible for analysing business and functional requirements, which involves reviewing existing system configurations and operating methodologies as well as understanding evolving business needs. Analyse requirements and user stories in business meetings, strategize the impact of requirements on different platforms/applications, and convert business requirements into technical requirements. Participate in design reviews to provide input on functional requirements, product designs, schedules, and potential problems. Understand the current application infrastructure and suggest cloud-based solutions that reduce operational cost, require minimal maintenance, and provide high availability with improved security. Perform unit testing on modified software to ensure that new functionality works as expected while existing functionality continues to work in the same way. Coordinate with release management and other supporting teams to deploy changes to the production environment. Qualifications we seek in you! Minimum Qualifications: Experience in designing and implementing data pipelines, building data applications, and data migration on Azure. Experience with Databricks is an added advantage. Strong experience in Python and SQL. Strong understanding of security principles and best practices for cloud-based environments. Experience with monitoring tools and implementing proactive measures to ensure system availability and performance. Excellent problem-solving skills and the ability to troubleshoot complex issues in a distributed, cloud-based environment. Strong communication and collaboration skills to work effectively with cross-functional teams.
Preferred Qualifications / Skills: Master's degree in Computer Science, Electronics, or Electrical. Azure Data Engineering and cloud certifications, Databricks certifications. Experience of working with Oracle ERP. Experience with multiple data integration technologies and cloud platforms. Knowledge of Change & Incident Management processes. Why join Genpact? Be a transformation leader - Work at the cutting edge of AI, automation, and digital innovation. Make an impact - Drive change for global enterprises and solve business challenges that matter. Accelerate your career - Get hands-on experience, mentorship, and continuous learning opportunities. Work with the best - Join 140,000+ bold thinkers and problem-solvers who push boundaries every day. Thrive in a values-driven culture - Our courage, curiosity, and incisiveness - built on a foundation of integrity and inclusion - allow your ideas to fuel progress. Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together. Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
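To make the "PySpark and Python with Kafka" requirement concrete, here is a minimal structured-streaming sketch. The broker address, topic name, schema, and output paths are assumptions, and the Kafka source additionally needs the matching spark-sql-kafka package available on the cluster.

```python
# Minimal PySpark structured-streaming sketch: read JSON events from Kafka
# and land them as Parquet. Broker, topic, schema, and paths are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import DoubleType, StringType, StructType

spark = SparkSession.builder.appName("orders-ingest").getOrCreate()

schema = StructType().add("order_id", StringType()).add("amount", DoubleType())

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "orders")
    .load()
)

# Kafka delivers the payload as bytes in the "value" column; parse it as JSON.
parsed = raw.select(from_json(col("value").cast("string"), schema).alias("o")).select("o.*")

query = (
    parsed.writeStream.format("parquet")
    .option("checkpointLocation", "/mnt/checkpoints/orders")
    .option("path", "/mnt/curated/orders")
    .start()
)
query.awaitTermination()
```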

Posted 2 weeks ago

Apply

0.0 years

0 Lacs

hyderabad, telangana, india

On-site

Ready to build the future with AI? At Genpact, we don't just keep up with technology - we set the pace. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, innovation-driven environment, love building and deploying cutting-edge AI solutions, and want to push the boundaries of what's possible, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions - we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Inviting applications for the role of Principal Consultant - ETL Manual Tester! We are looking for an experienced Test Manager to oversee and manage testing activities for our Data/ETL (Extract, Transform, Load) program. The ideal candidate will be responsible for ensuring the quality and reliability of our data processing systems. This role involves developing testing strategies, managing a team of test engineers, and collaborating with other departments to ensure the successful delivery of the program. Responsibilities: Test Strategy & Planning - Develop and implement a comprehensive testing strategy for the data transformation/ETL program, ensuring alignment with the program's objectives and timelines. Plan, design, and manage the execution of test cases, scripts, and procedures for data validation and ETL processes. Oversee the preparation of test data and environments, ensuring they meet the requirements of complex data workflows. Team Management & Leadership - Lead and mentor a team of test engineers, setting clear goals and expectations, and providing regular feedback on performance. Foster a culture of quality and continuous improvement within the team. Coordinate with project managers, data engineers, and business analysts to ensure effective communication and resolution of issues. Testing & Quality Assurance - Ensure that testing activities effectively identify defects and issues in data processing and ETL workflows. Implement and maintain quality assurance policies and procedures to ensure data integrity and reliability. Monitor and report on testing activities, including test results, defect tracking, and quality metrics. Stakeholder Engagement - Act as the primary point of contact for all testing-related activities within the Data/ETL program. Communicate testing progress, risks, and outcomes to program stakeholders, including senior management and project teams. Collaborate with business users to understand requirements and ensure that testing strategies align with business objectives. Technology & Tools - Stay abreast of the latest testing methodologies, tools, and technologies relevant to data and ETL processes. Recommend and implement tools and technologies to improve testing efficiency and effectiveness.
Ensure the testing team is trained and proficient in the use of testing tools and technologies. Qualifications we seek in you! Minimum Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field. Experience in a testing role with a focus on data and ETL processes. Proven experience managing a testing team and leading testing activities for large-scale data projects. Strong understanding of data modelling, ETL processes, and data warehousing principles. Proficiency in SQL and experience with database technologies. Experience with test automation tools and frameworks. Excellent analytical, problem-solving, and communication skills. Ability to work collaboratively in a team environment and manage multiple priorities. Preferred Skills: Experience with cloud-based data warehousing solutions, such as AWS Redshift, Google BigQuery, or Azure Synapse Analytics. Knowledge of Agile methodologies and experience working in an Agile environment. Why join Genpact? Lead AI-first transformation - Build and scale AI solutions that redefine industries. Make an impact - Drive change for global enterprises and solve business challenges that matter. Accelerate your career - Gain hands-on experience, world-class training, mentorship, and AI certifications to advance your skills. Grow with the best - Learn from top engineers, data scientists, and AI experts in a dynamic, fast-moving workplace. Committed to ethical AI - Work in an environment where governance, transparency, and security are at the core of everything we build. Thrive in a values-driven culture - Our courage, curiosity, and incisiveness - built on a foundation of integrity and inclusion - allow your ideas to fuel progress. Come join the 140,000+ coders, tech shapers, and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together. Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
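A common building block for the data validation described in this role is a reconciliation test run after each load. The pytest sketch below assumes hypothetical connection strings and table names and checks both row counts and a control total between source and target.

```python
# Reconciliation sketch for an ETL load: compare row counts and a control
# total between source and target. Connection strings and tables are placeholders.
import pandas as pd
import sqlalchemy as sa

SOURCE_URL = "postgresql://etl:***@source-db:5432/sales"
TARGET_URL = "postgresql://etl:***@warehouse-db:5432/dw"


def fetch_scalar(engine, sql: str):
    """Run a single-value query and return that value."""
    return pd.read_sql(sql, engine).iloc[0, 0]


def test_orders_load_reconciles():
    src = sa.create_engine(SOURCE_URL)
    tgt = sa.create_engine(TARGET_URL)

    # Row-count check: nothing dropped or duplicated by the pipeline.
    assert fetch_scalar(src, "SELECT COUNT(*) FROM orders") == \
        fetch_scalar(tgt, "SELECT COUNT(*) FROM stg_orders")

    # Control-total check: amounts survive the transform unchanged.
    assert fetch_scalar(src, "SELECT COALESCE(SUM(amount), 0) FROM orders") == \
        fetch_scalar(tgt, "SELECT COALESCE(SUM(amount), 0) FROM stg_orders")
```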

Posted 2 weeks ago

Apply

0.0 years

0 Lacs

bengaluru, karnataka, india

On-site

Ready to build the future with AI? At Genpact, we don't just keep up with technology - we set the pace. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, innovation-driven environment, love building and deploying cutting-edge AI solutions, and want to push the boundaries of what's possible, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions - we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Inviting applications for the role of Principal Consultant, Snowflake + Matillion! We are looking for skilled and experienced professionals to join our team. In this role, you will work intensively with Snowflake and Matillion to fulfil project deliverables. Responsibilities: Proficient in creating and managing Snowflake objects such as tables, stored procedures, views, and materialized views. Skilled in query optimization for enhanced performance (PL/SQL). Strong understanding of fact and dimension tables within a data warehouse schema. Solid grasp of data warehousing concepts and architecture. Well-versed in Snowflake-specific features and functionalities. Experience in ETL processes for loading data across various layers in Snowflake. Capable of performing root cause analysis for critical issues within tight timeframes. Experienced in data reconciliation and validation. Qualifications we seek in you! Minimum Qualifications: B.Tech or relevant educational experience. Industry and hands-on experience in Snowflake and Matillion. Preferred Qualifications / Skills: Familiarity with AWS services and infrastructure. Basic understanding of Power BI for data visualization and reporting.
Why join Genpact? Lead AI-first transformation - Build and scale AI solutions that redefine industries. Make an impact - Drive change for global enterprises and solve business challenges that matter. Accelerate your career - Gain hands-on experience, world-class training, mentorship, and AI certifications to advance your skills. Grow with the best - Learn from top engineers, data scientists, and AI experts in a dynamic, fast-moving workplace. Committed to ethical AI - Work in an environment where governance, transparency, and security are at the core of everything we build. Thrive in a values-driven culture - Our courage, curiosity, and incisiveness - built on a foundation of integrity and inclusion - allow your ideas to fuel progress. Come join the 140,000+ coders, tech shapers, and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together. Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
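As a small illustration of the Snowflake object work listed above, the sketch below uses the snowflake-connector-python package to create a reporting view over hypothetical fact and dimension tables. The account, credentials, warehouse, and object names are placeholders, not details from this posting.

```python
# Sketch: create a reporting view over hypothetical fact/dimension tables
# using snowflake-connector-python. All identifiers below are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.eu-west-1",
    user="ETL_USER",
    password="***",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="MART",
)

try:
    cur = conn.cursor()
    cur.execute(
        """
        CREATE OR REPLACE VIEW DAILY_SALES AS
        SELECT d.calendar_date, SUM(f.amount) AS total_amount
        FROM FACT_ORDERS f
        JOIN DIM_DATE d ON f.date_key = d.date_key
        GROUP BY d.calendar_date
        """
    )
    # Quick sanity check against the new view.
    cur.execute("SELECT COUNT(*) FROM DAILY_SALES")
    print("rows in view:", cur.fetchone()[0])
finally:
    conn.close()
```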

Posted 2 weeks ago

Apply

0.0 years

0 Lacs

pune, maharashtra, india

On-site

Ready to shape the future of work? At Genpact, we don't just adapt to change - we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions - we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Inviting applications for the role of Consultant, Senior Data Scientist! In this role, you will have a strong background in Gen AI implementations, data engineering, developing ETL processes, and utilizing machine learning tools to extract insights and drive business decisions. The Data Scientist will be responsible for analysing large datasets, developing predictive models, and communicating findings to various stakeholders. Responsibilities: Develop and maintain machine learning models to identify patterns and trends in large datasets. Utilize Gen AI and various LLMs to design and develop production-ready use cases. Collaborate with cross-functional teams to identify business problems and develop data-driven solutions. Communicate complex data findings and insights to non-technical stakeholders in a clear and concise manner. Continuously monitor and improve the performance of existing models and processes. Stay up to date with industry trends and advancements in data science and machine learning. Design and implement data models and ETL processes to extract, transform, and load data from various sources. Good hands-on experience with AWS Bedrock models, SageMaker, Lambda, etc. Data Exploration & Preparation - Conduct exploratory data analysis and clean large datasets for modeling. Business Strategy & Decision Making - Translate data insights into actionable business strategies. Mentor Junior Data Scientists - Provide guidance and expertise to junior team members. Collaborate with Cross-Functional Teams - Work with engineers, product managers, and stakeholders to align data solutions with business goals. Qualifications we seek in you! Minimum Qualifications: Bachelor's / Master's degree in Computer Science, Statistics, Mathematics, or a related field. Relevant years of experience in a data science or analytics role. Strong proficiency in SQL and experience with data warehousing and ETL processes. Experience with programming languages such as Python or R is a must (either one). Familiarity with machine learning tools and libraries such as Pandas, scikit-learn, and AI libraries. Excellent knowledge of Gen AI, RAG, and LLM models, with a strong understanding of prompt engineering. Proficiency in Azure OpenAI and AWS SageMaker implementation. Good understanding of statistical techniques and advanced machine learning. Experience with data warehousing and ETL processes. Proficiency in SQL and database management.
Familiarity with cloud-based data platforms such as AWS, Azure, or Google Cloud. Experience with Azure ML Studio is desirable. Knowledge of different machine learning algorithms and their applications. Familiarity with data preprocessing and feature engineering techniques. Preferred Qualifications / Skills: Experience with model evaluation and performance metrics. Understanding of deep learning and neural networks is a plus. Certification in AWS Machine Learning or as an AWS infrastructure engineer is a plus. Why join Genpact? Be a transformation leader - Work at the cutting edge of AI, automation, and digital innovation. Make an impact - Drive change for global enterprises and solve business challenges that matter. Accelerate your career - Get hands-on experience, mentorship, and continuous learning opportunities. Work with the best - Join 140,000+ bold thinkers and problem-solvers who push boundaries every day. Thrive in a values-driven culture - Our courage, curiosity, and incisiveness - built on a foundation of integrity and inclusion - allow your ideas to fuel progress. Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together. Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

vijayawada, andhra pradesh

On-site

Job Description: Navasal Inc. is a Digital Technology agency based in Texas and a Bronze Solution Partner of Adobe. We collaborate with over 50 clients to deliver advanced digital technology services, specializing in Enterprise Content Management Systems such as Adobe Experience Manager (AEM) and Digital Commerce Platforms including Oracle ATG Commerce and Commerce Tools. Additionally, we offer services on Cloud Platforms like AWS and Azure and are experts in AEM Solutioning, AEM Web Design, and Oracle ATG Commerce Web Design Implementations. This is a full-time/w2 on-site role for a Senior Data Engineer, based in Vijayawada. As a Senior Data Engineer at Navasal Inc., you will be responsible for designing, developing, and maintaining data infrastructure and processes. Your main tasks will include data modeling, ETL processes, data warehousing, and data analytics. Furthermore, you will collaborate with cross-functional teams to ensure data quality and efficiency in our projects. To excel in this role, you should possess strong skills in Data Engineering and Data Modeling, along with hands-on experience in Extract Transform Load (ETL) processes and Data Warehousing. Proficiency in Data Analytics, excellent problem-solving abilities, and analytical skills are also essential. You should be able to work collaboratively in an on-site environment and hold a Bachelor's degree in Computer Science, Information Technology, or a related field. Experience with enterprise-level data systems would be a plus. At Navasal Inc., we support H-1B visa transfer and prefer immediate joining, although we are flexible for the right candidate. Join us in driving digital innovation and delivering cutting-edge solutions to our clients. Note: The above description is for informational purposes only and is not exhaustive. Other duties, responsibilities, and activities may be assigned as required.,

Posted 2 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

ahmedabad, gujarat

On-site

As a Power BI Developer, you will be responsible for designing, developing, and maintaining interactive dashboards and reports using Power BI. You will collaborate with stakeholders to understand their reporting needs and deliver data-driven insights that drive business performance. Key Responsibilities - Develop and design interactive reports and dashboards using Power BI to visualize complex data sets. - Collaborate with business users to gather reporting requirements and translate them into technical specifications. - Ensure data accuracy and integrity by performing data validation and troubleshooting issues. - Create and maintain data models, including data transformation and aggregation processes using Power Query and DAX. - Optimize Power BI reports for performance and user experience. - Implement best practices for Power BI development, including version control and documentation. - Stay updated on the latest Power BI features and industry trends to enhance reporting capabilities. - Provide training and support to end-users on Power BI tools and functionalities. - Collaborate with cross-functional teams to integrate Power BI solutions with existing systems and processes. - Participate in data governance and security initiatives to ensure compliance with company policies. Qualifications - Bachelor's / master's degree in Computer Science, Information Technology, Data Analytics, or a related field. - Proven experience as a Power BI Developer or similar role. - Strong proficiency in Power BI, including DAX, Power Query, and data visualization techniques. - Experience with SQL for data extraction and manipulation. - Knowledge of data warehousing concepts and ETL processes is a plus. - Excellent analytical and problem-solving skills. - Strong communication and interpersonal skills, with the ability to collaborate effectively with technical and non-technical stakeholders.,

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

pune, maharashtra

On-site

As a Senior Data Engineer (Snowflake) at our new-age, AI-first Digital & Cloud Engineering Services company, you will have the opportunity to play a crucial role in building and scaling our data infrastructure. Your primary responsibility will be to design, develop, and maintain efficient and reliable data pipelines for both ELT (Extract, Load, Transform) and ETL (Extract, Transform, Load) processes. By collaborating with stakeholders, you will translate data requirements into efficient data models and pipelines to facilitate data-driven decision-making across the organization. Your key responsibilities will include designing, developing, and maintaining robust and scalable data pipelines for ELT and ETL processes, ensuring data accuracy, completeness, and timeliness. Additionally, you will work closely with stakeholders to understand data requirements and build optimized data pipelines using technologies such as Elastic Search, AWS S3, Snowflake, and NFS. You will also be responsible for implementing data warehouse schemas and ETL/ELT processes to support business intelligence and analytics needs, ensuring data integrity through data quality checks and monitoring. To be successful in this role, you must possess a Bachelor's degree in Computer Science, Engineering, or a related field, along with 5+ years of experience as a Data Engineer focusing on ELT/ETL processes. You should have at least 3+ years of experience with Snowflake data warehousing technologies and creating/maintaining Airflow ETL pipelines. Proficiency in Python for data manipulation and automation, as well as experience with Elastic Search, SQL, and cloud-based data storage solutions like AWS S3, are essential requirements. Moreover, staying updated with industry best practices, CI/CD/DevSecFinOps, Scrum, and emerging technologies in data engineering will be crucial. Your strong problem-solving and analytical skills, coupled with excellent communication and collaboration abilities, will enable you to work effectively with data scientists, analysts, and other team members to ensure data accessibility and usability for various analytical purposes. If you are a passionate and talented Data Engineer with a keen interest in Snowflake, data pipelines, and data-driven decision-making, we invite you to join our growing data team and contribute to the development and enhancement of our data warehouse architecture.,
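For the Airflow and Snowflake pipeline work described above, a minimal DAG might look like the sketch below (recent Airflow 2.x syntax). The task bodies, S3 bucket, and Snowflake table are hypothetical placeholders, not a specific pipeline from this role.

```python
# Minimal Airflow 2.x DAG sketch: extract from S3, then load into Snowflake.
# Bucket, table, and task bodies are placeholders for illustration only.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_from_s3(**context):
    # Placeholder: pull the day's files from s3://example-bucket/raw/ to staging.
    print("extracting raw files for", context["ds"])


def load_into_snowflake(**context):
    # Placeholder: COPY the staged files into a hypothetical analytics.raw_orders table.
    print("loading staged files for", context["ds"])


with DAG(
    dag_id="s3_to_snowflake_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_from_s3)
    load = PythonOperator(task_id="load", python_callable=load_into_snowflake)
    extract >> load
```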

Posted 2 weeks ago

Apply

12.0 - 16.0 years

0 Lacs

haryana

On-site

Join us as a Principal Data Engineer! You will drive the development of software and tools to achieve project and departmental objectives by translating functional and non-functional requirements into a suitable design. In addition to managing the technical delivery of one or more software engineering teams, you will lead broader participation in internal and industry-wide events, conferences, and other activities. Your role will also involve leading the planning, specification, development, and deployment of high-performance, robust, and resilient systems. It will be crucial to ensure that these systems adhere to excellent architectural and engineering principles and are well-suited for their intended purposes. This position is offered at the vice president level. As a Principal Engineer, you will oversee the productivity of software engineering teams and ensure the consistent use of shared platform components and technologies. Engaging with senior stakeholders, you will explore and recommend appropriate technical solutions to meet the required product features. You will also be responsible for monitoring technical progress against plans, ensuring functionality, scalability, and performance, and providing progress updates to stakeholders. Additionally, you will deliver software components to support the delivery of platforms, applications, and services for the organization. Designing and developing high-volume, high-performance, high-availability applications using established frameworks and technologies will be part of your responsibilities. You will also design reusable libraries and APIs for organization-wide use and write unit and integration tests within automated test environments to maintain code quality. To excel in this role, we are seeking an individual with a background in data engineering, software design, or database design and architecture. Significant experience in developing software in an SOA or micro-services paradigm is essential. The ideal candidate will also have a history of leading software development teams, introducing and implementing technical strategies, and hands-on development experience in one or more programming languages. In addition, the following skills and qualifications are highly desirable: - At least 12 years of experience using industry-recognized frameworks and development tools - Excellent understanding of data pipeline development, data integration, database management, and ETL processes - Experience with test-driven development, automated test frameworks, mocking, stubbing, and unit testing tools - Knowledge of working with code repositories, bug tracking tools, and wikis - Background in designing or implementing APIs and in-depth knowledge of large-scale database and NoSQL design and optimization If you are a seasoned professional with a passion for data engineering and software development, and possess the skills and experience outlined above, we invite you to consider this exciting opportunity as a Principal Data Engineer.,
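The unit-testing responsibility mentioned in this role often looks like the sketch below: a small, reusable transform function paired with a pytest case. The function, column names, and expectations are purely illustrative, not part of any particular library.

```python
# Illustrative reusable transform plus a unit test; function name, columns,
# and expected behaviour are hypothetical examples, not a specific library.
import pandas as pd


def normalise_orders(df: pd.DataFrame) -> pd.DataFrame:
    """Drop duplicate orders and standardise the currency code."""
    out = df.drop_duplicates(subset="order_id").copy()
    out["currency"] = out["currency"].str.upper()
    return out


def test_normalise_orders_dedupes_and_uppercases():
    raw = pd.DataFrame(
        {"order_id": [1, 1, 2], "currency": ["usd", "usd", "eur"]}
    )
    result = normalise_orders(raw)
    assert len(result) == 2
    assert set(result["currency"]) == {"USD", "EUR"}
```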

Posted 2 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies