
1,052 ETL Processes Jobs - Page 6

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

66degrees is a leading consulting and professional services company specializing in developing AI-focused, data-led solutions leveraging the latest advancements in cloud technology. With unmatched engineering capabilities and vast industry experience, we help the world's leading brands transform their business challenges into opportunities and shape the future of work. At 66degrees, we believe in embracing the challenge and winning together. These values guide us in achieving our goals as a company and for our people. We are dedicated to creating a significant impact for our employees by fostering a culture that sparks innovation and supports professional and personal growth. We're on the hunt for a seasoned Data Analytics Architect who's not afraid to roll up their sleeves and dive into the data trenches with us. As a Data Analytics Architect, you'll be the mastermind behind designing and implementing cutting-edge data solutions that solve real-world business problems for our clients across various industries. You'll be the bridge between the technical and the business worlds, translating complex data insights into actionable strategies that drive tangible results.

Responsibilities:
- Collaborate with stakeholders to understand business requirements and translate them into technical solutions.
- Work with clients to enable them on the Looker platform, teaching them how to construct an analytics ecosystem in Looker from the ground up.
- Advise clients on how to develop their analytics centers of excellence, defining and designing processes to promote a scalable, governed analytics ecosystem.
- Utilize Looker to design and develop interactive and visually appealing dashboards and reports for end users.
- Write clean, efficient, and scalable code (LookML, Python as applicable).
- Conduct performance tuning and optimization of data analytics solutions to ensure efficient processing and query performance.
- Stay up to date with the latest trends and best practices in cloud data analytics, big data technologies, and data visualization tools.
- Collaborate with other teams to ensure seamless integration of data analytics solutions with existing systems and processes.
- Provide technical guidance and mentorship to junior team members, sharing knowledge and promoting best practices.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- Proven track record as a Data Analytics Architect or in a similar role, with a minimum of 5 years' experience in data analytics and visualization.
- Excellent comprehension of Looker and its implementation, with additional analytics platform experience a major plus - especially MicroStrategy.
- Experience with Google Cloud and its data services is preferred; however, experience with other major cloud platforms (AWS, Azure) and their data services will be considered.
- Strong proficiency with SQL required.
- Demonstrated experience working with clients in various industry verticals, understanding their unique data challenges and opportunities.
- Excellent programming skills in Python and experience working with Python-enabled capabilities in analytics platforms.
- Solid understanding of data modeling, ETL processes, and data integration techniques. Experience working with dbt or Dataform is a plus.
- Strong problem-solving skills and the ability to translate business requirements into technical solutions.
- Excellent communication and collaboration skills, with the ability to work effectively in a cross-functional team environment.
- Google Cloud certifications and any analytics platform certifications are a plus.
- A desire to stay ahead of the curve and continuously learn new technologies and techniques.

Posted 1 week ago

Apply

10.0 - 14.0 years

0 Lacs

Karnataka

On-site

At PwC, the focus in data and analytics engineering is on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. As a data engineer at PwC, your role will involve transforming raw data into actionable insights to enable informed decision-making and drive business growth. You will primarily concentrate on designing and building data infrastructure and systems to facilitate efficient data processing and analysis. Your responsibilities will include developing and implementing data pipelines, data integration, and data transformation solutions.

The ideal candidate for this position should possess a strong background in data warehouse and data migration projects. They should demonstrate expertise in ETL processes, data comparison/profiling, and source-to-target validation across various platforms, including relational and non-relational databases, APIs, BI tools, flat files, Excel, CSV, XML, JSON, and others. The ETL Test Automation Architect will be instrumental in assessing project feasibility, recommending solutions, collaborating with vendors, and leading automation strategies. Proficiency in Python and experience with the Datagaps tool are highly desirable.

With 10 to 13 years of experience, your key responsibilities will include assessing ETL project feasibility based on project requirements and business goals, designing and implementing ETL automation frameworks/solutions, and evaluating and recommending tools to streamline ETL validation and testing processes. You will conduct thorough validation of all data sources, provide guidance and troubleshooting support, collaborate with vendors, develop and implement automation strategies, and stay updated with market trends and emerging ETL technologies.

Required skills for this role include extensive experience in data warehouse and data migration projects, strong knowledge of ETL processes, database validation, and diverse data source validation, expertise in tools like Datagaps for ETL validation and automation, proficiency in Python for scripting and plugin development, hands-on experience with various data sources, familiarity with APIs, BI tools, flat files, Excel, and CSV data integration, and experience with CI/CD pipeline configuration for ETL processes. Preferred skills include Datagaps experience, knowledge of GenAI and cloud platforms, strong analytical and troubleshooting skills, and excellent collaboration and communication skills. Additional requirements for this role include the ability to manage multiple projects, deliver results in a fast-paced environment, willingness to take ownership and lead initiatives, and strong organizational and problem-solving skills.
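For illustration only (not part of the posting): a minimal sketch of the kind of source-to-target validation this role describes, written in Python with pandas rather than a specific tool such as Datagaps. The connection strings, table names, and columns below are placeholders.

```python
# Minimal source-to-target row-count and checksum comparison with pandas.
# Connection strings and table names are illustrative placeholders.
import hashlib
import pandas as pd
from sqlalchemy import create_engine

source = create_engine("postgresql://user:pass@source-host/sales")   # hypothetical source system
target = create_engine("postgresql://user:pass@dwh-host/warehouse")  # hypothetical data warehouse

src_df = pd.read_sql("SELECT order_id, amount FROM orders", source)
tgt_df = pd.read_sql("SELECT order_id, amount FROM dim_orders", target)

def frame_checksum(df: pd.DataFrame) -> str:
    """Order-independent checksum of a DataFrame's contents."""
    canonical = df.sort_values(list(df.columns)).to_csv(index=False)
    return hashlib.sha256(canonical.encode()).hexdigest()

assert len(src_df) == len(tgt_df), "Row counts differ between source and target"
assert frame_checksum(src_df) == frame_checksum(tgt_df), "Content mismatch detected"
print("Source-to-target validation passed")
```

An order-independent checksum catches content drift that a plain row-count comparison would miss.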

Posted 1 week ago

Apply

3.0 - 12.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a professional in this role, you will have the primary responsibility of supporting the development and enhancement of SAP BW data models, including InfoObjects, InfoProviders, and Composite Providers. Your expertise will be crucial in assisting in the construction and upkeep of SAC dashboards and reports that align with business requirements. You will play a key role in the data extraction, transformation, and loading (ETL) process from SAP source systems to SAP BW, ensuring efficiency and accuracy. In this position, you will be expected to develop and optimize SAP BW data models, queries, and reports to guarantee high performance and usability. Your skills will be utilized to design and create SAC dashboards and stories that provide actionable insights supporting strategic decision-making within the organization. Collaboration with senior analysts and business users will be essential to comprehend reporting needs and translate them into technical solutions effectively. Monitoring and troubleshooting BW processes to maintain system performance, data accuracy, and reliability will be part of your routine tasks. Additionally, you will be responsible for creating and updating documentation for SAP BW processes, encompassing data flows, reports, and system configurations. Your support will be crucial during SAP system upgrades, migrations, and performance tuning activities. To excel in this role, you must stay abreast of advancements in SAP BW/BI tools and recommend enhancements to existing solutions. The ideal candidate should hold a Bachelor's degree in Computer Science, Information Technology, or a related field, along with approximately 3-5 years of experience with SAP BW and at least 12 years with SAP Analytics Cloud (SAC). Hands-on experience in designing and building SAC dashboards with live and imported models is a critical requirement. Your skill set should include a strong understanding of SAP ERP processes and integration points between modules, expertise in SAP BW data modeling, ETL processes, and BEx Query Designer. Experience with SAP HANA modeling and reporting is preferred, along with proficiency in SAP BW performance tuning and optimization. Strong analytical, problem-solving, and communication skills are essential, along with the ability to work independently and collaborate effectively within a team environment. In terms of technical skills, you should be proficient in SAP BW/BI tools such as data modeling, BEx Query Designer, ETL processes, Info Objects, InfoCubes, DSO, and MultiProviders. Familiarity with SAP Modules like Sales (SD), Materials Management (MM), Finance (FI/CO), database technologies including SAP HANA and SQL, reporting tools such as SAP Analytics Cloud, Analysis for Office, and Power BI, as well as integration tools like SAP Data Services, is required. Experience with performance optimization tools like SAP BW Accelerator and HANA-specific optimizations will be advantageous. Having experience with SAP BW on HANA or SAP BW/4HANA, familiarity with ABAP for SAP BW enhancements, and holding SAP BW/BI certification will be considered as additional assets in this role.,

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

You will be responsible for developing, maintaining, and optimizing reports and dashboards using SAP Business Objects, SAP BW, and Power BI. Collaborating with business users, you will understand their data needs and translate them into functional BI solutions. Extracting, transforming, and loading data from SAP and other sources will be essential to build efficient and scalable data models. Your role will ensure data accuracy, consistency, and integrity across BI platforms. Analyzing business processes and providing insights using SAP functional knowledge will be a key aspect of your responsibilities. Additionally, you will develop and maintain SQL queries, stored procedures, and data validation processes. Supporting ad-hoc reporting and business analytics needs and collaborating with IT and business teams for seamless integration of BI solutions are also part of this role. Monitoring BI system performance and troubleshooting issues as needed and training end-users on BI tools and best practices will be critical to your success. To excel in this role, you must possess a Bachelor's degree in computer science, Information Systems, Business Analytics, or a related field. You should have 5-7 years of experience as a BI Analyst or in a similar role. Hands-on experience with SAP Business Objects (BOBJ), SAP BW (Business Warehouse), and Power BI is mandatory. A strong understanding of SAP functional modules (e.g., Finance, Supply Chain, Sales & Distribution, HR, etc.) is required. Proficiency in SQL and data modeling, as well as experience with ETL processes and data warehousing concepts, are essential qualifications. Desirable qualifications include a Power BI certification. Your technical, business, and leadership skills will play a crucial role in this position. You should have the ability to analyze complex datasets and provide actionable insights. Being highly skilled in communicating business data with a focus on UI/UX and intelligent dashboard design is important. A solid understanding of leading and contemporary practices and capabilities in information management, data governance, reporting, and analytics is necessary. Experience in production support/BAU, working knowledge of change control processes and impacts, a high degree of problem-solving and technical skills, and the ability to evaluate and prioritize tasks are also required. Involvement in a mixture of new application development, maintenance, and technical support, as well as effectively liaising with internal customers at all levels within the organization and external parties when necessary, are key aspects of this role.,

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

You will have the opportunity to build a unique career at EY, with the global scale, support, inclusive culture, and technology to help you become the best version of yourself. Your unique voice and perspective are crucial in helping EY improve as well. Join us to create an exceptional experience for yourself and contribute to building a better working world for all. As the Managed Services Lead, you will oversee all aspects of our AWS/ETL project, including the development, deployment, and maintenance of ETL processes and AWS cloud services. The ideal candidate should possess a strong background in AWS cloud services, ETL processes, and team leadership. Responsibilities: - Lead a team of cloud and ETL engineers to provide high-quality managed services to our clients. - Design and implement robust ETL pipelines to integrate multiple data sources into AWS cloud environments. - Ensure the reliability, scalability, and security of AWS services and ETL processes. - Monitor system performance, troubleshoot issues, and drive continuous improvements. - Collaborate with cross-functional teams to align managed services with business objectives. - Manage client relationships, including regular updates and issue resolution. - Develop and uphold service level agreements (SLAs) and ensure service delivery meets or exceeds standards. - Provide technical leadership and mentorship to team members, promoting a culture of excellence and growth. - Stay informed about the latest AWS features and ETL best practices to recommend and implement cutting-edge solutions. Qualifications: - Bachelor's degree in Computer Science, Information Technology, or related field. - Minimum of 5 years of experience in managing cloud services, with a focus on AWS and ETL processes. - Proven track record of leading technical teams in a managed services environment. - Strong understanding of AWS services such as EC2, S3, RDS, Lambda, and AWS Glue. - Experience with ETL tools and processes, and knowledge of data warehousing concepts. - Excellent problem-solving skills and ability to work under pressure. - Strong communication and interpersonal skills, with a focus on customer service. - Relevant AWS certifications (e.g., AWS Certified Solutions Architect, AWS Certified DevOps Engineer) are highly desirable. At EY, we strive to build a better working world by creating long-term value for clients, people, and society while building trust in the capital markets. Our diverse teams in over 150 countries leverage data and technology to provide assurance and help clients grow, transform, and operate across various sectors.,
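For illustration only: a hedged sketch of how a managed-services team might trigger and monitor an AWS Glue ETL job from Python with boto3, reflecting the AWS Glue responsibility this listing names. The job name and region are placeholders, not details from the posting.

```python
# Sketch: trigger an AWS Glue ETL job and poll its status with boto3.
# The job name is a placeholder; credentials come from the environment.
import time
import boto3

glue = boto3.client("glue", region_name="ap-south-1")

run = glue.start_job_run(JobName="nightly-orders-etl")  # hypothetical job name
run_id = run["JobRunId"]

while True:
    state = glue.get_job_run(JobName="nightly-orders-etl", RunId=run_id)["JobRun"]["JobRunState"]
    if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
        break
    time.sleep(30)  # avoid hammering the API while the job runs

print(f"Glue job finished with state: {state}")
```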

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

Haryana

On-site

The Vice President position in our Enterprise Technology division at Macquarie is a pivotal leadership role where you will drive strategic direction and operational excellence. You will be responsible for leading a talented team of engineers, fostering a culture of innovation and collaboration. With our advantage of bringing together diverse individuals, you will have the opportunity to shape various possibilities within our global financial services group. Your main responsibilities will include mentoring the engineering team, developing and executing strategic roadmaps for enterprise data platforms, ensuring alignment with business objectives, and collaborating with cross-functional teams to deliver effective data solutions. We are looking for someone with extensive experience in data engineering, strong leadership skills, proficiency in data architecture, data warehousing, ETL processes, big data technologies, AWS services, Kubernetes, and Docker. Additionally, you should possess strong analytical and problem-solving skills for data-driven decision-making, along with excellent communication and interpersonal skills to engage and influence stakeholders. At Macquarie, we value hearing from individuals who are inspired to contribute to building a better future with us. If you are excited about this role or working at Macquarie, we encourage you to apply and become part of our friendly and supportive team where everyone's ideas contribute to driving outcomes. Technology plays a crucial role in every aspect of Macquarie, empowering our people, customers, and communities. Our global team is passionate about accelerating the digital enterprise, connecting people and data, building platforms and applications, and designing tomorrow's technology solutions. As a Macquarie employee, you will have access to a wide range of benefits including hybrid and flexible working arrangements, wellbeing leave days, paid parental leave, paid volunteer leave, donation matching, and other benefits to support your physical, mental, and financial wellbeing. You will also have access to various learning and development opportunities. Macquarie is committed to diversity, equity, and inclusion. We aim to provide reasonable adjustments to individuals who may need support during the recruitment process and through working arrangements. If you require additional assistance, please let us know in the application process.,

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

You are a talented Lead Data Product Architect and Engineer responsible for designing, developing, and maintaining data solutions across APAC Advisory products. Your role involves ensuring data quality, security, scalability, and performance of the data products while collaborating with various stakeholders to understand business requirements and translate them into data models and architectures. Your responsibilities include defining a robust and scalable data architecture, facilitating the design and development of high-quality data resources, collaborating with other teams for implementation of data models and flows, communicating data methodology and results to stakeholders, and integrating various data sources with internal and external platforms. You will also be responsible for ensuring data quality, accuracy, and consistency across APAC Advisory products, monitoring and troubleshooting data issues, supporting data governance initiatives, providing data expertise to business stakeholders, and staying updated with industry trends related to data architecture, big data, and cloud solutions. As a Lead Data Product Architect and Engineer, you will provide support for applications and products on digital platforms, develop comprehensive data architecture and design elements, evaluate and recommend technologies for Data Lake development, and create a data marketplace and data dictionary. You will work with business stakeholders to gather requirements, design efficient data models, support analytics and reporting needs, and ensure data models support operational requirements. In addition, you will be responsible for data migration, integration, testing, and validation planning, producing responsive design solutions, partnering in delivering data governance best practices, and ensuring data quality, security, scalability, and performance of data systems and products. Collaboration with product managers, developers, analysts, and stakeholders to translate business requirements into data models and architectures is also a key aspect of your role. To be successful in this role, you should have a Bachelor's degree in Computer Science or related field, 5+ years of experience in managing data platforms and architecture, proficiency in data modeling, ETL processes, SQL, and big data technologies, and knowledge of data integration techniques and governance frameworks. Experience with cloud platforms and application development frameworks is highly desirable, along with strong communication, collaboration, and problem-solving skills. Joining Cushman & Wakefield will offer you the opportunity to be part of a global real estate services firm committed to career development, diversity, and inclusion. You will benefit from a growing company, career progression opportunities, and a flexible work environment focused on technology and autonomy. Continuous learning and development opportunities, as well as a comprehensive employee benefits program, are also part of the work culture at Cushman & Wakefield.,

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

Pune, Maharashtra

On-site

We are looking for a highly skilled Qlik Sense Developer to join our growing team. In this role, you will leverage your expertise in Qlik Sense and data analytics to design, develop, and deploy powerful data visualization solutions. As a Qlik Sense Developer, you will be responsible for end-to-end engineering design, development, testing, deployment, and operations of Qlik Sense dashboards. You will design and develop ETL processes to validate and transform data in SQL & Qlik Sense, and write SQL queries for RDBMS such as MS SQL Server, Oracle, AWS Databricks, etc. Additionally, you will define, implement, and standardize metrics, reports, and dashboards leveraging state-of-the-art business intelligence tools and enforce data quality standards. You should be able to comprehend and translate complex functional, technical, and business requirements into executable architectural designs, create and maintain technical documentation, implement complex logics in SQL & Qlik Sense, and work in an Agile environment using advanced technologies like Extensions. The ideal candidate will bring strong proficiency in Qlik Sense development, including scripting, data modeling, and dashboard/report creation, as well as a solid understanding of data warehousing concepts, ETL processes, and SQL. Experience in designing and developing interactive visualizations and reports, working with large datasets, creating efficient data models in Qlik Sense, and knowledge of Qlik Sense security features is required. You should also have experience with performance tuning and optimization of Qlik Sense applications, strong problem-solving skills, attention to detail, excellent communication skills, and the ability to collaborate with cross-functional teams. Troubleshooting data issues and finding solutions to challenges faced during dashboard development will be part of your responsibilities as an individual contributor. In return, we offer a competitive salary and benefits package, a culture focused on talent development, opportunities to work with cutting-edge technologies, employee engagement initiatives, annual health check-ups, and insurance coverage for self, spouse, children, and parents. We are committed to fostering diversity and inclusion in the workplace, offering hybrid work options, flexible working hours, and accessible facilities for employees with disabilities. At our company, you can accelerate growth professionally and personally, impact the world positively using the latest technologies, enjoy collaborative innovation, and unlock global opportunities to work and learn with the industry's best. Join us at Persistent and unleash your full potential. We are an Equal Opportunity Employer and prohibit discrimination and harassment of any kind.,

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

About us: Vertiv brings together hardware, software, analytics, and ongoing services to ensure its customers' vital applications run continuously, perform optimally, and scale with business needs. Vertiv solves the most important challenges facing today's data centers, communication networks, and commercial and industrial facilities with a portfolio of power, cooling, and IT infrastructure solutions and services that extends from the cloud to the edge of the network.

Job Summary: As an AI Developer at Vertiv, you will be responsible for designing, developing, and implementing AI models and algorithms to address complex business challenges. You will work with large datasets, collaborate with cross-functional teams, and integrate AI solutions into existing systems. Your role will involve optimizing AI models for performance and scalability, developing ETL processes, and deploying models on cloud platforms. You will also stay updated with the latest advancements in AI and machine learning technologies.

Key Duties and Responsibilities:
- Design, develop, and implement AI models and algorithms to solve complex business problems.
- Work with large datasets to extract meaningful insights and build predictive models.
- Collaborate with cross-functional teams to integrate AI solutions into existing systems.
- Optimize and maintain AI models for performance and scalability.
- Develop and maintain data pipelines.
- Utilize SQL and Big Data platforms (preferably Snowflake) for data manipulation and analysis.
- Deploy AI models on cloud platforms such as AWS or similar.
- Create AI web apps (preferably Streamlit).
- Implement version control and CI/CD pipelines (preferably using GitLab).
- Stay updated with the latest advancements in AI and machine learning technologies.
- Document and present findings and solutions to stakeholders.

Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Strong experience with SQL and Big Data platforms (Snowflake preferred).
- Knowledge of cloud platforms (AWS or similar).
- Proficiency in Python programming.
- Knowledge of version control and CI/CD pipelines (GitLab preferred).
- Experience with machine learning projects and large language models (e.g., LLaMA, Mistral, Claude).
- Fluent in English with excellent communication skills.
- Strong problem-solving skills and attention to detail.
- Ability to work independently and as part of a team.

Preferred Qualifications:
- Master's degree in a related field.
- Experience with Snowflake platforms and the AWS environment.
- Proficiency in Python programming for data transformation and ML processes.
- Previous experience in a similar role.
- Experience with ETL tools (Matillion preferred).
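For illustration only (not part of the posting): a minimal sketch of the "AI web app on Streamlit backed by Snowflake" pattern this listing mentions. The account, credentials, table, and column names are placeholders.

```python
# Sketch: a minimal Streamlit app that pulls a dataset from Snowflake
# and lets a user explore it. Connection details and the table are placeholders.
import pandas as pd
import snowflake.connector
import streamlit as st

@st.cache_data
def load_data() -> pd.DataFrame:
    conn = snowflake.connector.connect(
        account="my_account",      # hypothetical account identifier
        user="ai_developer",
        password="***",
        warehouse="ANALYTICS_WH",
        database="SENSOR_DB",
        schema="PUBLIC",
    )
    try:
        return pd.read_sql("SELECT device_id, reading_ts, temperature FROM readings", conn)
    finally:
        conn.close()

st.title("Device temperature explorer")
df = load_data()
device = st.selectbox("Device", sorted(df["device_id"].unique()))
st.line_chart(df[df["device_id"] == device].set_index("reading_ts")["temperature"])
```

Caching the query with st.cache_data keeps Snowflake from being re-queried on every widget interaction.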

Posted 1 week ago

Apply

7.0 - 9.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Who We Are At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl We are always moving forward - always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities. The Role As a System Administrator at Kyndryl, you'll solve complex problems and identify potential future issues across the spectrum of platforms and services. You'll be at the forefront of new technology and modernization, working with some of our biggest clients - which means some of the biggest in the world. There's never a typical day as a System Administrator at Kyndryl, because no two projects are alike. You'll be managing systems data for clients and providing day-to-day solutions and security compliance. You'll oversee a queue of assignments and work directly with technicians, prioritizing tickets to deliver the best solutions to our clients. One of the benefits of Kyndryl is that we work with clients in a variety of industries, from banking to retail. Whether you want to broaden your knowledge base or narrow your scope and specialize in a specific sector, you can find your opportunity here. You'll also get the chance to share your expertise by recommending modernization options, identifying new business opportunities, and cultivating relationships with other teams and stakeholders. Does the work get challenging at times Yes! But you'll collaborate with a diverse group of talented people and gain invaluable management and organizational skills, which will come in handy as you move forward in your career. Your future at Kyndryl Every position at Kyndryl offers a way forward to grow your career, from Junior System Administrator to Architect. We have opportunities for Cloud Hyperscalers that you won't find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms. One of the benefits of Kyndryl is that we work with clients in a variety of industries, from banking to retail. Whether you want to broaden your knowledge base or narrow your scope and specialize in a specific sector, you can find your opportunity here. Who You Are You're good at what you do and possess the required experience to prove it. However, equally as important - you have a growth mindset keen to drive your own personal and professional development. You are customer-focused - someone who prioritizes customer success in their work. And finally, you're open and borderless - naturally inclusive in how you work with others. Required Technical and Professional Expertise Having 7+ Years of experience in Managing Systrack administration end-to-end, monitoring systrack black box/sensor data and validate dashboard data for any discrepancies. Conducting monthly checks on the latest version of DEX Pack through Systrack kits and managing the Tracker. Analyzing data through Visualizer, Prevent, and AppVision. Managing SysTrack settings, system configuration, sensor configuration, and related tools. Keeping the master tracker data up-to-date with any updates or deletions. Cleaning stale records and updating the record of SysTrack licenses. Providing guidance and monitoring to L2 in case they encounter difficulties. Foster collaboration between clients, partners, and stakeholders for a healthy environment. Managing the DEX Pack end-to-end, including role assignment, configuration setup, and testing. 
For Lakeside tickets, help to understand data discrepancies in the dashboard. On a weekly and monthly basis, present the DEX SysTrack Status Report during the EXL Governance call, showcasing all KPIs and parameters to assess endpoint performance; if performance is poor, identify the root cause and work on resolving it. Design, develop, and optimize complex SQL queries, stored procedures, views, and triggers to support data extraction, transformation, and loading (ETL) processes.

Preferred Technical and Professional Experience
Generate reports daily, weekly, monthly, and on an ad-hoc basis, in line with agreed business endpoint scopes. Develop automation solutions using PowerShell scripts and batch files. Deploy automation solutions end-to-end in various environments, including User Acceptance Testing (UAT) and Production, and conduct testing. Monitor and manage the collection extension, including enabling and deploying it as needed. Foster collaboration between clients, partners, and stakeholders for a healthy environment. Generate reports for antivirus, BitLocker, and compliance. Validate stale records and the DEX Score. Maintain up-to-date master tracker data for support tickets, including raising and following up on them. For Lakeside tickets, help to understand data discrepancies in the automation. Check the data and create a weekly utilization report for the ServiceNow Assist module to support the tech lead. Ensure the performance, reliability, and scalability of SQL databases. Conduct regular database performance tuning and optimization. Develop, implement, and maintain BI solutions using Power BI. Create interactive and visually appealing dashboards and reports to meet business requirements. Perform data analysis to identify trends, patterns, and insights to support business decisions. Collaborate with stakeholders to gather and understand business requirements for BI projects. Integrate data from various sources and ensure data consistency and integrity. Implement row-level security and role-based access in Power BI. Develop custom visuals and leverage Power BI APIs for enhanced functionality. Ensure data quality and accuracy by implementing data governance policies and procedures. Automation experience, especially infrastructure as code (IaC), is preferred.

Being You
Diversity is a whole lot more than what we look like or where we come from, it's how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we're not doing it single-handedly: our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you - and everyone next to you - the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That's the Kyndryl Way.

What You Can Expect
With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees and support you and your family through the moments that matter - wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more.
Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you, we want you to succeed so that together, we will all succeed. Get Referred! If you know someone that works at Kyndryl, when asked How Did You Hear About Us during the application process, select Employee Referral and enter your contact's Kyndryl email address.

Posted 1 week ago

Apply

6.0 - 8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Who We Are At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl We are always moving forward - always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities. The Role Are you ready to dive headfirst into the captivating world of data engineering at Kyndryl As a Data Engineer, you'll be the visionary behind our data platforms, crafting them into powerful tools for decision-makers. Your role Ensuring a treasure trove of pristine, harmonized data is at everyone's fingertips. As a Data Engineer at Kyndryl, you'll be at the forefront of the data revolution, crafting and shaping data platforms that power our organization's success. This role is not just about code and databases it's about transforming raw data into actionable insights that drive strategic decisions and innovation. In this role, you'll be engineering the backbone of our data infrastructure, ensuring the availability of pristine, refined data sets. With a well-defined methodology, critical thinking, and a rich blend of domain expertise, consulting finesse, and software engineering prowess, you'll be the mastermind of data transformation. Your journey begins by understanding project objectives and requirements from a business perspective, converting this knowledge into a data puzzle. You'll be delving into the depths of information to uncover quality issues and initial insights, setting the stage for data excellence. But it doesn't stop there. You'll be the architect of data pipelines, using your expertise to cleanse, normalize, and transform raw data into the final dataset-a true data alchemist. Armed with a keen eye for detail, you'll scrutinize data solutions, ensuring they align with business and technical requirements. Your work isn't just a means to an end it's the foundation upon which data-driven decisions are made - and your lifecycle management expertise will ensure our data remains fresh and impactful. So, if you're a technical enthusiast with a passion for data, we invite you to join us in the exhilarating world of data engineering at Kyndryl. Let's transform data into a compelling story of innovation and growth. Your Future at Kyndryl Every position at Kyndryl offers a way forward to grow your career. We have opportunities that you won't find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms. Whether you want to broaden your knowledge base or narrow your scope and specialize in a specific sector, you can find your opportunity here. Who You Are You're good at what you do and possess the required experience to prove it. However, equally as important - you have a growth mindset keen to drive your own personal and professional development. You are customer-focused - someone who prioritizes customer success in their work. And finally, you're open and borderless - naturally inclusive in how you work with others. 
Required Technical and Professional Expertise
- 6-8 years of experience working as a Data Engineer or in Azure cloud modernization
- Good experience in Power BI for data visualization and dashboard development
- Strong experience in developing data pipelines and using tools such as AWS Glue, Azure Databricks, Synapse, or Google Dataproc
- Proficient in working with both relational and NoSQL databases, including PostgreSQL, DB2, and MongoDB
- Excellent problem-solving, analytical, and critical thinking skills
- Ability to manage multiple projects simultaneously while maintaining a high level of attention to detail
- Expertise in data mining, data storage, and Extract-Transform-Load (ETL) processes

Preferred Technical and Professional Experience
- Experience in data modelling, to create a conceptual model of how data is connected and how it will be used in business processes
- Professional certification, e.g., Open Certified Technical Specialist with Data Engineering Specialization
- Cloud platform certification, e.g., AWS Certified Data Analytics - Specialty, Elastic Certified Engineer, Google Cloud Professional Data Engineer, or Microsoft Certified: Azure Data Engineer Associate
- Understanding of social coding and Integrated Development Environments, e.g., GitHub and Visual Studio
- Degree in a scientific discipline, such as Computer Science, Software Engineering, or Information Technology

Being You
Diversity is a whole lot more than what we look like or where we come from, it's how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we're not doing it single-handedly: our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you - and everyone next to you - the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That's the Kyndryl Way.

What You Can Expect
With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees and support you and your family through the moments that matter - wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you, we want you to succeed so that together, we will all succeed.

Get Referred! If you know someone that works at Kyndryl, when asked How Did You Hear About Us during the application process, select Employee Referral and enter your contact's Kyndryl email address.
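For illustration only (not part of the posting above): a short PySpark sketch of the pipeline-building work this listing describes - ingesting raw CSV, cleansing it, and writing a curated, partitioned Parquet dataset. The paths and column names are placeholders.

```python
# Sketch: a small PySpark pipeline that ingests raw CSV, cleanses it,
# and writes a curated Parquet dataset. Paths and columns are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

raw = (
    spark.read.option("header", True).option("inferSchema", True)
    .csv("s3a://raw-zone/orders/")            # hypothetical landing path
)

curated = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("amount").isNotNull())
       .withColumn("order_date", F.to_date("order_ts"))
       .withColumn("amount", F.col("amount").cast("decimal(12,2)"))
)

(curated.write.mode("overwrite")
        .partitionBy("order_date")
        .parquet("s3a://curated-zone/orders/"))  # hypothetical curated path
```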

Posted 1 week ago

Apply

4.0 - 6.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

About Our Team
Our global team supports education products - electronic health records that introduce students to digital charting and prepare them to document care in today's modern clinical environment. We have a very stable product that we've worked hard to reach and strive to maintain. Our team values trust, respect, collaboration, agility, and quality. The Consumption Domain is a newly established domain, offering an exciting opportunity to play a crucial role in structuring and shaping its foundation. Our team is responsible for ensuring seamless data processing, validation, and operational efficiency, while continuously improving workflow optimization and incident management. We work closely with various stakeholders to drive accuracy, speed, and reliability in delivering high-quality data. With a problem-solving mindset and a data-driven approach, we aim to build scalable solutions that enhance business processes and improve overall user experience.

About the Role
Elsevier is looking for a Senior Analyst to join the Consumption Domain team, where you will play a crucial role in analyzing and interpreting user engagement and content consumption trends. The ideal candidate will possess strong technical expertise in data analytics, Databricks, ETL processes, and cloud storage, coupled with a passion for using data to drive meaningful business decisions.

Responsibilities
- Analyze and interpret large datasets to provide actionable business insights.
- Leverage Databricks, working with RDDs, DataFrames, and Datasets to optimize data workflows.
- Design and implement ETL processes, job automation, and data optimization strategies.
- Work with structured, unstructured, and semi-structured data types, including JSON, XML, and RDF.
- Manage various file formats (Parquet, Delta files, ZIP files) and handle data storage within DBFS, FileStore, and cloud storage solutions (AWS S3, Google Cloud Storage, etc.).
- Write efficient SQL, Python, or Scala scripts to extract and manipulate data.
- Develop insightful dashboards using Tableau or Power BI to visualize key trends and performance metrics.
- Collaborate with cross-functional teams to drive data-backed decision-making.
- Maintain best practices in data governance, utilizing platforms such as Snowflake and Collibra.
- Participate in Agile development methodologies, using tools like Jira and Confluence.
- Ensure proper version control using GitHub.

Requirements
- Bachelor's degree in Computer Science, Data Science, or Statistics.
- Minimum of 4-5 years of experience in data analytics, preferably in a publishing environment.
- Proven expertise in Databricks, including knowledge of RDDs, DataFrames, and Datasets.
- Strong understanding of ETL processes, data optimization, job automation, and Delta Lake.
- Proficiency in handling structured, unstructured, and semi-structured data.
- Experience with various file types, including Parquet, Delta, and ZIP files.
- Familiarity with DBFS, FileStore, and cloud storage solutions such as AWS S3 and Google Cloud Storage.
- Strong programming skills in SQL, Python, or Scala.
- Experience creating dashboards using Tableau or Power BI is a plus.
- Knowledge of Snowflake, Collibra, JSON-LD, SHACL, and SPARQL is an advantage.
- Familiarity with Agile development methodologies, including Jira and Confluence.
- Experience with GitLab for version control is beneficial.

Skills and Competencies
- Ability to handle large datasets efficiently.
- Strong analytical and problem-solving skills.
- Passion for data-driven decision-making and solving business challenges.
- Eagerness to learn new technologies and continuously improve processes.
- Effective communication and data storytelling abilities.
- Experience collaborating with cross-functional teams.
- Project management experience in software systems is a plus.

Work in a way that works for you
We promote a healthy work/life balance across the organization. We offer an appealing working prospect for our people. With numerous wellbeing initiatives, shared parental leave, study assistance, and sabbaticals, we will help you meet your immediate responsibilities and your long-term goals.

Working for You
We understand that your well-being and happiness are essential to a successful career. Here are some benefits we offer:
- Comprehensive Health Insurance: Covers you, your immediate family, and parents.
- Enhanced Health Insurance Options: Competitive rates negotiated by the company.
- Group Life Insurance: Ensuring financial security for your loved ones.
- Group Accident Insurance: Extra protection for accidental death and permanent disablement.
- Flexible Working Arrangement: Achieve a harmonious work-life balance.
- Employee Assistance Program: Access support for personal and work-related challenges.
- Medical Screening: Your well-being is a top priority.
- Modern Family Benefits: Maternity, paternity, and adoption support.
- Long-Service Awards: Recognizing dedication and commitment.
- New Baby Gift: Celebrating the joy of parenthood.
- Subsidized Meals in Chennai: Enjoy delicious meals at discounted rates.
- Various Paid Time Off: Take time off with Casual Leave, Sick Leave, Privilege Leave, Compassionate Leave, Special Sick Leave, and Gazetted Public Holidays.
- Free Transport: Pick-up and drop between home and office (applies in Chennai).

About the Business
A global leader in information and analytics, we help researchers and healthcare professionals advance science and improve health outcomes for the benefit of society. Building on our publishing heritage, we combine quality information and vast data sets with analytics to support visionary science and research, health education and interactive learning, as well as exceptional healthcare and clinical practice. What you do every day will help advance science and healthcare to advance human progress.

We are committed to providing a fair and accessible hiring process. If you have a disability or other need that requires accommodation or adjustment, please let us know or contact 1-855-833-5120. Criminals may pose as recruiters asking for money or personal information. We never request money or banking details from job applicants. Learn more about spotting and avoiding scams. We are an equal opportunity employer: qualified applicants are considered for and treated during employment without regard to race, color, creed, religion, sex, national origin, citizenship status, disability status, protected veteran status, age, marital status, sexual orientation, gender identity, genetic information, or any other characteristic protected by law.
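For illustration only (not part of the posting above): a minimal Databricks-style PySpark sketch of the semi-structured workflow this role describes - flattening JSON from DBFS and persisting a Delta table. The paths, field names, and target table are hypothetical.

```python
# Sketch: flatten semi-structured JSON event data with PySpark on Databricks
# and persist it as a Delta table. Paths and field names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically on Databricks

events = spark.read.json("dbfs:/FileStore/consumption/events/")  # hypothetical DBFS path

flat = events.select(
    F.col("user.id").alias("user_id"),
    F.col("content.isbn").alias("isbn"),
    F.explode("sessions").alias("session"),
).select("user_id", "isbn", "session.started_at", "session.pages_read")

(flat.write.format("delta")
     .mode("append")
     .saveAsTable("consumption.page_reads"))  # hypothetical target schema and table
```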

Posted 1 week ago

Apply

2.0 - 4.0 years

4 - 7 Lacs

Delhi, India

On-site

Response Informatics is looking for a Data Engineer (TPC) to join our dynamic team and embark on a rewarding career journey.

Responsibilities:
- Liaising with coworkers and clients to elucidate the requirements for each task.
- Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed.
- Reformulating existing frameworks to optimize their functioning.
- Testing such structures to ensure that they are fit for use.
- Preparing raw data for manipulation by data scientists.
- Detecting and correcting errors in your work.
- Ensuring that your work remains backed up and readily accessible to relevant coworkers.
- Remaining up to date with industry standards and technological advancements that will improve the quality of your outputs.

Posted 1 week ago

Apply

2.0 - 4.0 years

4 - 7 Lacs

Kolkata, West Bengal, India

On-site

Response Informatics is looking for a Data Engineer (TPC) to join our dynamic team and embark on a rewarding career journey.

Responsibilities:
- Liaising with coworkers and clients to elucidate the requirements for each task.
- Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed.
- Reformulating existing frameworks to optimize their functioning.
- Testing such structures to ensure that they are fit for use.
- Preparing raw data for manipulation by data scientists.
- Detecting and correcting errors in your work.
- Ensuring that your work remains backed up and readily accessible to relevant coworkers.
- Remaining up to date with industry standards and technological advancements that will improve the quality of your outputs.

Posted 1 week ago

Apply

0.0 - 4.0 years

0 Lacs

Karnataka

On-site

The AI Intern role based in Bangalore at Manyata Tech Park is a 6-month paid internship opportunity with a focus on supporting the integration, deployment, and operationalization of AI models, particularly Generative AI, Large Language Models (LLMs), Small Language Models (SLMs), and prompt engineering for conversational AI and workflow automation. As an AI Engineer Intern, you will collaborate with experienced team members to ensure smooth implementation of AI solutions and contribute to extending AI capabilities across various business applications within an enterprise environment. Your responsibilities will include developing end-to-end AI solutions, such as independently creating data pipelines, integrating foundational models like GPT and BERT, implementing backend services, and developing frontend interfaces for user interaction with AI models. You will also be involved in documentation for handover, knowledge sharing sessions, quality assurance and testing, deployment to production environments, monitoring performance, and lifecycle management of AI solutions. To excel in this role, proficiency in Python, experience with Azure AI services, familiarity with data engineering and backend development, knowledge of testing frameworks, and strong communication skills are essential. A self-starting attitude, adaptability, problem-solving mindset, and ability to work collaboratively are key soft skills required for this position. Preferred skills include experience with RAG techniques, prompt engineering, predictive modeling, frontend development, cloud services, advanced AI concepts, and documentation & compliance practices. If you are interested in this opportunity, please share your profile to vijitha.k@blackbox.com.,
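For illustration only: one hedged example of the kind of foundational-model integration an intern might prototype, using the Hugging Face transformers pipeline API with a widely available public BERT-style checkpoint. The wrapper function and example input are hypothetical.

```python
# Sketch: wrap a BERT-style sentiment model behind a simple function that a
# backend service could call. Uses the Hugging Face `transformers` pipeline API.
from transformers import pipeline

# DistilBERT fine-tuned on SST-2 is a common public checkpoint
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

def score_feedback(texts: list[str]) -> list[dict]:
    """Return label/score pairs for a batch of user feedback strings."""
    return classifier(texts)

if __name__ == "__main__":
    print(score_feedback(["The new workflow saves me hours every week."]))
```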

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Database Engineer at our company, you will play a crucial role in managing and enhancing our database systems to ensure smooth integration of data from various sources into our central database. Your primary responsibilities will include administering and maintaining databases to uphold data integrity and security, as well as re-architecting and optimizing database designs to align with evolving business requirements. You will be expected to merge and reconcile data from multiple sources into a centralized master database, alongside developing strategies to handle large datasets efficiently. Additionally, you will be responsible for writing, reviewing, and optimizing complex SQL queries to enhance platform performance, identifying and resolving database performance issues, and leveraging Elastic Search for improved search functionality and data retrieval. To qualify for this role, you should hold a Bachelor's or Master's degree in Computer Science, Information Technology, or related fields, and possess at least 5 years of experience in database design, development, and optimization. An advanced understanding of SQL and query optimization techniques, hands-on experience with database platforms like MySQL, PostgreSQL, or SQL Server, familiarity with Elastic Search and Business Intelligence tools, and proficiency in handling large datasets are essential for this position. Strong problem-solving skills, analytical abilities, and excellent communication skills to collaborate with cross-functional teams are also required. Preferred skills for this role include familiarity with cloud-based database solutions such as AWS RDS or Azure SQL, experience with ETL processes and tools, and knowledge of programming languages like PHP or Python for database-related tasks. Joining our team at eMedEvents will offer you the opportunity to work with cutting-edge technologies that directly impact the healthcare industry. We provide a dynamic work environment, opportunities for personal and professional growth, competitive compensation, and a chance to be part of a team dedicated to making a difference in the healthcare sector.,
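For illustration only: a small sketch of indexing and querying documents with the official Elasticsearch Python client (8.x keyword-argument style), reflecting the search functionality this role mentions. The cluster endpoint, index, and fields are placeholders.

```python
# Sketch: index an event document and run a full-text search with the official
# Elasticsearch Python client (8.x style). All names below are placeholders.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # hypothetical cluster endpoint

es.index(
    index="medical-events",
    id="evt-1001",
    document={"title": "Cardiology CME Conference", "city": "Hyderabad", "credits": 12},
)

hits = es.search(
    index="medical-events",
    query={"match": {"title": "cardiology"}},
    size=10,
)
for hit in hits["hits"]["hits"]:
    print(hit["_id"], hit["_source"]["title"])
```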

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

Haryana

On-site

As an AI Architect at EY in Dublin, you will have the opportunity to shape and implement transformative AI and data-driven solutions for clients in the Technology Consulting field. You will play a crucial role in designing and delivering AI architectures that address complex business challenges, ensuring scalability, security, and compliance. Your technical expertise in AI architecture, data engineering, and Generative AI technologies will be essential in leading cross-functional teams to deliver innovative AI and data solutions. Collaborating with clients, you will develop comprehensive AI architectures and data strategies tailored to meet their specific business needs. Your responsibilities will include leading the design and delivery of AI-powered solutions such as Generative AI models, predictive analytics, and machine learning systems. You will oversee the end-to-end lifecycle of AI and data projects, from discovery and design to deployment and ongoing optimization. Additionally, you will design and implement data pipelines and data engineering frameworks to seamlessly integrate AI models with enterprise systems. To excel in this role, you must possess a bachelor's degree in computer science, data science, engineering, or a related field, along with at least 8 years of experience in AI architecture, data engineering, and solution delivery roles. Your expertise should encompass data modeling, data integration, cloud-native architectures (e.g., Azure, AWS, GCP), Generative AI technologies, machine learning frameworks, and data-driven decision-making approaches. Strong project management skills and the ability to communicate technical concepts to non-technical stakeholders are also key requirements. Ideal candidates will hold a master's degree in artificial intelligence, data science, or a related discipline, and have consulting experience in delivering scalable AI and data architectures for enterprise clients. Deep understanding of data governance, privacy regulations, and the ability to bridge the gap between IT and business needs are highly desirable qualities. At EY, we offer a competitive remuneration package with comprehensive benefits that support flexible working, career development, and personal well-being. You will have the opportunity to work with engaging colleagues, develop new skills, and progress your career in a supportive and inclusive environment. Our commitment to diversity and inclusion ensures that all differences are valued and respected, fostering a culture where everyone can thrive and contribute their unique perspectives. Join us at EY to build a better working world through innovation, inclusivity, and transformative leadership. If you meet the criteria outlined above and are eager to make a meaningful impact in the field of AI architecture, we encourage you to apply now and be a part of our dynamic team.,

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Ahmedabad, Gujarat

On-site

Join us as a Senior Database Developer and drive high-performance data systems for financial services! MSBC is seeking a Senior Database Developer with expertise in Oracle and PL/SQL to design, develop, and optimize complex database solutions. This role offers an exciting opportunity to enhance data integrity, scalability, and performance while working on mission-critical applications. Collaborate with industry experts to deliver efficient and secure database solutions supporting financial services and enterprise applications. If you are passionate about database development and thrive in a fast-paced, technology-driven environment, join us in driving innovation and efficiency through data management. Key Tools and Technologies: - Database Management: Oracle Database, MySQL, PostgreSQL, NoSQL - Development & Optimization: PL/SQL, SQL, Query Optimization, Index Tuning, Execution Plans - Architecture & Data Modeling: Logical & Physical Data Modeling, Normalization, Data Governance - Security & Performance: Data Security, Performance Tuning, Backup & Recovery, Disaster Recovery - Version Control & Deployment: Git, Database Deployment Strategies - Cloud & Automation: Oracle Cloud, AWS RDS, ETL Processes, BI Tools, DevOps Practices Key Responsibilities: - Develop and optimize database solutions ensuring integrity, security, and performance. - Design and maintain database schema, tables, indexes, views, and stored procedures. - Implement data models and governance standards aligned with business requirements. - Conduct performance tuning and troubleshooting to enhance efficiency. - Manage backups, recovery, and disaster recovery strategies. - Collaborate with architects, analysts, and development teams for seamless integration. - Provide technical support and mentorship to junior developers. Skills & Qualifications: - 5+ years of experience in database development with expertise in PL/SQL and SQL. - Strong grasp of database architecture, normalization, and design patterns. - Hands-on experience with database security, performance tuning, and version control. - Familiarity with cloud-based solutions, automation, and DevOps practices. - Additional experience with MySQL, PostgreSQL, or NoSQL databases is a plus. - Oracle Certified Professional (OCP) certification preferred. - Strong problem-solving, attention to detail, and communication skills. Note: Shift timings align with UK working hours. This role is based in Ahmedabad, but candidates from other cities or states are encouraged to apply, as remote or hybrid working options are available. MSBC Group has been a trusted technology partner for over 20 years, delivering the latest systems and software solutions for financial services, manufacturing, logistics, construction, and startup ecosystems. Our expertise includes Accessible AI, Custom Software Solutions, Staff Augmentation, Managed Services, and Business Process Outsourcing. We are at the forefront of developing advanced AI-enabled services and supporting transformative projects, such as state-of-the-art trading platforms, seamless application migrations, and integrating real-time data analytics. With offices in London, California, and Ahmedabad, and operating in every time-zone, MSBC Group is your AI and automation partner.,
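For illustration only: a hedged sketch of calling a PL/SQL procedure and running a bound query from Python with the python-oracledb driver, touching the bind-variable discipline that query optimization relies on. The connection details, procedure, and table names are placeholders, not MSBC systems.

```python
# Sketch: call a PL/SQL stored procedure and run a bound query using the
# python-oracledb driver. Connection details and object names are placeholders.
import oracledb

conn = oracledb.connect(user="app_user", password="***", dsn="dbhost/ORCLPDB1")
cur = conn.cursor()

# Invoke a (hypothetical) procedure that refreshes a reporting snapshot.
cur.callproc("reporting.refresh_daily_positions", [20240131])

# Bind variables keep the shared SQL area reusable and guard against injection.
cur.execute(
    "SELECT account_id, balance FROM positions WHERE trade_date = :d AND balance > :b",
    d=20240131, b=0,
)
for account_id, balance in cur.fetchall():
    print(account_id, balance)

conn.commit()
cur.close()
conn.close()
```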

Posted 1 week ago

Apply

4.0 - 6.0 years

0 Lacs

chennai, tamil nadu, india

On-site

Job Description: Senior ODI Developer (OCI PaaS/IaaS Expertise)

Role Overview: We are seeking a highly skilled Senior ODI Developer with strong hands-on experience in SQL, PL/SQL, and Oracle Data Integrator (ODI) projects, particularly on OCI (Oracle Cloud Infrastructure) PaaS or IaaS platforms. The ideal candidate will design, implement, and optimize ETL processes, leveraging cloud-based solutions to meet evolving business needs. Prior experience in banking projects is a significant advantage.

Skills and Qualifications:

Mandatory Skills:
- Strong hands-on experience with Oracle Data Integrator (ODI) development and administration.
- Proficiency in SQL and PL/SQL for complex data manipulation and query optimization.
- Experience deploying and managing ODI solutions on OCI PaaS/IaaS environments.
- Deep understanding of ETL processes, data warehousing concepts, and cloud data integration.

Preferred Experience:
- Hands-on experience in banking or insurance domain projects, with knowledge of domain-specific data structures.
- Familiarity with OCI services such as Autonomous Database, Object Storage, Compute, and Networking.
- Experience integrating on-premise and cloud-based data sources.

Other Skills:
- Strong problem-solving and debugging skills.
- Excellent communication and teamwork abilities.
- Knowledge of Agile methodologies and cloud-based DevOps practices.

Education and Experience:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 4 to 6 years of experience in ODI development, with at least 2 years on OCI-based projects.
- Domain experience in Flexcube is an added advantage.

Key Responsibilities:
- Design, develop, and deploy ETL processes using Oracle Data Integrator (ODI) on OCI PaaS/IaaS.
- Configure and manage ODI instances on OCI, ensuring optimal performance and scalability.
- Develop and optimize complex SQL and PL/SQL scripts for data extraction, transformation, and loading.
- Implement data integration solutions connecting diverse data sources such as cloud databases, on-premise systems, APIs, and flat files.
- Monitor and troubleshoot ODI jobs running on OCI to ensure seamless data flow and resolve issues promptly.
- Collaborate with data architects and business analysts to understand integration requirements and deliver robust solutions.
- Conduct performance tuning of ETL processes, SQL queries, and PL/SQL procedures.
- Prepare and maintain detailed technical documentation for developed solutions.
- Adhere to data security and compliance standards, particularly in cloud-based environments.
- Provide guidance and best practices for ODI and OCI-based data integration projects.

Career Level - IC1
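
For illustration only: ODI expresses extract-transform-load logic declaratively in mappings, but the underlying pattern is the one sketched below in plain Python, using the standard library's sqlite3 as a stand-in for the Oracle/OCI source and target. All table and column names are invented.

```python
# Minimal ETL sketch (a stand-in for what an ODI mapping would orchestrate).
# Uses sqlite3 from the standard library so it runs anywhere; in the real role
# the source and target would be Oracle/OCI databases and the transformation
# logic would live in ODI mappings and PL/SQL. Names are hypothetical.
import sqlite3

def extract(conn):
    """Pull raw rows from a staging table."""
    return conn.execute("SELECT account_id, amount, currency FROM stg_transactions").fetchall()

def transform(rows):
    """Normalize currency codes and drop rows with non-positive amounts."""
    return [(acct, round(amt, 2), cur.upper())
            for acct, amt, cur in rows if amt > 0]

def load(conn, rows):
    """Insert cleansed rows into the target fact table."""
    conn.executemany(
        "INSERT INTO fct_transactions (account_id, amount, currency) VALUES (?, ?, ?)", rows)
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE stg_transactions (account_id TEXT, amount REAL, currency TEXT)")
    conn.execute("CREATE TABLE fct_transactions (account_id TEXT, amount REAL, currency TEXT)")
    conn.executemany("INSERT INTO stg_transactions VALUES (?, ?, ?)",
                     [("A1", 120.456, "usd"), ("A2", -5.0, "eur"), ("A3", 10.0, "inr")])
    load(conn, transform(extract(conn)))
    print(conn.execute("SELECT * FROM fct_transactions").fetchall())
```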

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

haryana

On-site

You are a highly skilled and motivated Talend Developer with 4-5 years of hands-on experience in designing, developing, and implementing data integration solutions using the Talend platform. Your strong understanding of data pipelines and ETL processes, and your ability to work with large datasets, will be crucial in this role. As a Talend Developer, you will be part of a dynamic team responsible for managing data workflows, ensuring data quality, and enabling seamless integration of data across multiple systems.

Your responsibilities include designing, developing, and implementing data integration processes using Talend Studio (both Talend Open Studio and Talend Data Integration). You will create and maintain ETL processes to extract, transform, and load data from various sources (e.g., databases, flat files, APIs) into target systems. Developing and managing complex data pipelines to support real-time and batch data integration will be part of your daily tasks. Collaboration with business and technical teams is essential as you translate data requirements into efficient, scalable solutions. Your work will involve data cleansing, transformation, and data quality management. Additionally, you will integrate Talend with other tools and technologies such as databases, cloud platforms, and message queues.

In this role, you will troubleshoot and resolve issues related to ETL jobs, data synchronization, and performance to ensure high performance, scalability, and reliability of data integration solutions. Monitoring and optimizing ETL jobs and data workflows to meet performance and operational standards will be part of your routine. Participating in code reviews, testing, and debugging activities is crucial to ensuring code quality. Documenting technical specifications, processes, and procedures accurately is also part of your responsibilities. Keeping up to date with the latest developments in Talend and data integration best practices is essential to excel in this role.
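
For illustration only: a Talend job typically applies cleansing and validation rules through components such as tMap and tFilterRow. The hedged Python sketch below shows the kind of rules such a job encodes; the field names and checks are assumptions, not requirements of the role.

```python
# Illustrative data-cleansing rules of the sort a Talend job (tMap/tFilterRow)
# would apply. Field names, sample data, and validation rules are hypothetical.
import re

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def cleanse(record: dict):
    """Trim text fields, normalize case, and reject records failing basic checks."""
    rec = {k: v.strip() if isinstance(v, str) else v for k, v in record.items()}
    rec["country"] = rec.get("country", "").upper()
    if not rec.get("customer_id"):
        return None                       # reject: missing business key
    if rec.get("email") and not EMAIL_RE.match(rec["email"]):
        rec["email"] = None               # quarantine invalid emails instead of failing the row
    return rec

if __name__ == "__main__":
    raw = [
        {"customer_id": " C001 ", "email": "a@example.com ", "country": "in"},
        {"customer_id": "", "email": "broken", "country": "us"},
    ]
    cleaned = [r for r in (cleanse(rec) for rec in raw) if r is not None]
    print(cleaned)
```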

Posted 1 week ago

Apply

10.0 - 14.0 years

0 Lacs

hyderabad, telangana

On-site

Your opportunity to make a real impact and shape the future of financial services is waiting for you. Let's push the boundaries of what's possible together. As a Senior Director of Software Engineering at JPMorgan Chase within the Consumer and Community Banking division, you will be responsible for leading various technical domains, overseeing the activities of multiple departments, and fostering cross-functional collaboration. Your technical expertise will be utilized across different teams to promote the adoption and implementation of advanced technical methods, helping the firm stay ahead of industry trends, best practices, and technological advancements.

Responsibilities:
- Leads multiple technology and process implementations across departments to achieve firmwide technology objectives.
- Directly manages multiple areas with strategic transactional focus.
- Provides leadership and high-level direction to teams while frequently overseeing employee populations across multiple platforms, divisions, and lines of business.
- Acts as the primary interface with senior leaders, stakeholders, and executives, driving consensus across competing objectives.
- Manages multiple stakeholders, complex projects, and large cross-product collaborations.
- Influences peer leaders and senior stakeholders across the business, product, and technology teams.
- Champions the firm's culture of diversity, equity, inclusion, and respect.

Required qualifications, capabilities, and skills:
- Formal training or certification on data management concepts and 10+ years of applied experience.
- 5+ years of experience leading technologists to manage, anticipate, and solve complex technical items within your domain of expertise.
- Proven experience in designing and developing large-scale data pipelines for batch and stream processing.
- Strong understanding of Data Warehousing, Data Lake, ETL processes, and Big Data technologies (e.g., Hadoop, Snowflake, Databricks, Apache Spark, PySpark, Airflow, Apache Kafka, Java, open file and table formats, Git, CI/CD pipelines, etc.).
- Expertise with public cloud platforms (e.g., AWS, Azure, GCP) and modern data processing and engineering tools.
- Excellent communication, presentation, and interpersonal skills.
- Experience developing or leading large or cross-functional teams of technologists.
- Demonstrated prior experience influencing across highly matrixed, complex organizations and delivering value at scale.
- Experience leading complex projects supporting system design, testing, and operational stability.
- Experience with hiring, developing, and recognizing talent.
- Extensive practical cloud-native experience.
- Expertise in Computer Science, Computer Engineering, Mathematics, or a related technical field.

Preferred qualifications, capabilities, and skills:
- Experience working at the code level and the ability to be hands-on performing PoCs and code reviews.
- Experience in Data Modeling (ability to design Conceptual, Logical, and Physical Models and ERDs, and proficiency in data modeling software such as ERwin).
- Experience with Data Governance, Data Privacy & Subject Rights, Data Quality & Data Security practices.
- Strong understanding of Data Validation / Data Quality.
- Experience supporting large-scale AI/ML data requirements.
- Experience with data visualization and BI tools is a huge plus.
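
For illustration only: the batch-pipeline work referenced above might resemble the small PySpark aggregation below. The input path, schema, and column names are placeholders, and a production pipeline would add schema enforcement, data-quality checks, and orchestration around a step like this.

```python
# A small PySpark batch-aggregation sketch of the kind of pipeline work the
# listing describes. Paths and column names are assumptions for illustration.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

def daily_spend_by_merchant(input_path: str, output_path: str) -> None:
    spark = SparkSession.builder.appName("daily-spend").getOrCreate()
    txns = spark.read.parquet(input_path)            # assumed columns: merchant, ts, amount
    daily = (txns
             .withColumn("day", F.to_date("ts"))
             .groupBy("merchant", "day")
             .agg(F.sum("amount").alias("total_amount"),
                  F.count("*").alias("txn_count")))
    daily.write.mode("overwrite").partitionBy("day").parquet(output_path)
    spark.stop()

if __name__ == "__main__":
    daily_spend_by_merchant("s3://bucket/txns/", "s3://bucket/curated/daily_spend/")
```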

Posted 1 week ago

Apply

5.0 - 8.0 years

0 Lacs

pune, maharashtra, india

On-site

Job Description: Oracle PL/SQL Developer (Capital Markets/Finance/Regulatory Reporting)

Role Overview: We are looking for a talented and experienced Oracle PL/SQL Developer with 5 to 8 years of experience in the Capital Markets, Finance, or Regulatory Reporting domains. The ideal candidate will possess strong expertise in Oracle PL/SQL, with hands-on experience in both on-premise and cloud environments. This role involves designing, developing, and maintaining robust database solutions to support critical financial applications and regulatory reporting initiatives.

Key Responsibilities:
- Performance Tuning: Identify and resolve database performance bottlenecks, including query optimization, indexing strategies, and PL/SQL code tuning.
- Data Modeling & Design: Collaborate with data architects and business analysts to understand data requirements and contribute to logical and physical data model designs.
- On-Premise & Cloud Operations: Work with Oracle databases deployed in both traditional on-premise data centers and cloud platforms (e.g., Oracle Cloud Infrastructure (OCI), AWS RDS for Oracle, Oracle hosted on Azure).
- Data Integration: Develop and maintain ETL processes using PL/SQL to facilitate data movement and integration between various financial systems.
- Regulatory & Financial Data Handling: Ensure data integrity, accuracy, and security for financial data, particularly in the context of regulatory reporting or core finance operations.
- Documentation: Create and maintain technical documentation, including design specifications, data flow diagrams, and support procedures.
- Collaboration: Work closely with cross-functional teams, including application developers, business analysts, QA engineers, and project managers.

Required Skills:
- Oracle PL/SQL Expertise: Strong proficiency in writing, optimizing, and debugging complex Oracle PL/SQL code, including advanced SQL, stored procedures, functions, packages, and triggers.
- Database Performance: Demonstrated experience in Oracle database performance tuning, including AWR/ADDM analysis, SQL tuning, and index optimization.
- On-Premise & Cloud Experience: Practical experience working with Oracle databases in both on-premise environments and cloud platforms (e.g., OCI, AWS, Azure). Familiarity with cloud-specific Oracle services is highly desirable.

Skills (competencies): Verbal Communication, Database Platform - Oracle, Oracle Exadata, Shell Script
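
For illustration only: in roles like this the heavy lifting usually stays inside PL/SQL packages, which are then invoked from application or scheduling code. The sketch below shows one hypothetical way to call such a procedure from Python with python-oracledb; the package, procedure, parameter names, and connection details are invented.

```python
# Hypothetical sketch: invoking a PL/SQL load procedure for a reporting run.
# The package/procedure names, parameters, and connection details are invented;
# the actual ETL and validation logic would live in PL/SQL, as the listing describes.
import oracledb

def run_regulatory_load(report_date: str) -> int:
    conn = oracledb.connect(user="rpt_user", password="***", dsn="dbhost/FINPDB")
    try:
        with conn.cursor() as cur:
            rows_loaded = cur.var(int)                  # holder for an OUT parameter
            cur.callproc("reg_reporting_pkg.load_daily_positions",
                         [report_date, rows_loaded])
            conn.commit()
            return rows_loaded.getvalue()
    finally:
        conn.close()

if __name__ == "__main__":
    print("rows loaded:", run_regulatory_load("2024-03-29"))
```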

Posted 1 week ago

Apply

3.0 - 5.0 years

0 Lacs

bengaluru, karnataka, india

Remote

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Data Engineer (HRIS) to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Job Duties / Key Responsibilities:

Data Management:
- Develop data mapping specifications and transformation rules
- Perform data cleansing, validation, and reconciliation activities
- Create and execute data conversion scripts and processes
- Document data models, mappings, and migration procedures

System Integration:
- Configure integrations between Workday and third-party applications
- Develop integration specifications and technical designs
- Troubleshoot integration issues and implement solutions
- Monitor integration performance and implement optimizations

Technical Configuration & Development:
- Design and implement Snowflake data models optimized for HR analytics
- Create and maintain Snowflake objects (tables, views, stored procedures)
- Develop SQL transformations within Snowflake for reporting requirements
- Configure Workday modules according to design specifications
- Build and maintain business processes, security configurations, and reports
- Develop custom calculations, rules, and business logic
- Create and maintain configuration workbooks and technical documentation
- Participate in system testing, including unit, integration, and regression testing

Reporting & Analytics:
- Develop reporting solutions using Snowflake as the data source
- Create data visualization dashboards connected to Snowflake
- Build self-service reporting capabilities for HR and business users
- Design and implement metrics and KPIs for HR analytics
- Support data analysis needs throughout the implementation lifecycle

Minimum Skills Required:
- 3+ years of experience in HRIS implementation, data management, or system integration
- 2+ years of hands-on experience with Workday/Greenhouse data conversion, integration, or reporting
- Experience with Snowflake or similar cloud data warehouse platforms
- Experience architecting or working with ELT technologies (such as dbt) and data architectures
- Strong SQL skills and experience with data transformation tools
- Experience with ETL processes and data validation techniques
- Understanding of HR data structures and relationships
- Excellent analytical and problem-solving abilities
- Knowledge of programming languages (Python, etc.)
- Experience with data visualization tools
- Understanding of API integration patterns and technologies
- Experience with agile development methodologies

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future.
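
For illustration only: the "SQL transformations within Snowflake" responsibility often takes the form of an upsert from a staging table into a dimension. The sketch below builds such a MERGE statement in Python; the table, column, and connection names are hypothetical, and execution via snowflake-connector-python is shown only as a comment.

```python
# Illustrative Snowflake-style SQL transformation for HR analytics: an upsert
# (MERGE) from a hypothetical Workday staging table into a worker dimension.
# In practice this SQL might be run via snowflake-connector-python or
# materialized as a dbt model; all names here are invented.

MERGE_WORKER_DIM = """
MERGE INTO hr_analytics.dim_worker AS tgt
USING hr_staging.stg_workday_workers AS src
    ON tgt.worker_id = src.worker_id
WHEN MATCHED THEN UPDATE SET
    legal_name = src.legal_name,
    department = src.department,
    hire_date  = src.hire_date,
    updated_at = CURRENT_TIMESTAMP()
WHEN NOT MATCHED THEN INSERT (worker_id, legal_name, department, hire_date, updated_at)
    VALUES (src.worker_id, src.legal_name, src.department, src.hire_date, CURRENT_TIMESTAMP());
"""

if __name__ == "__main__":
    # Hypothetical execution, e.g.:
    #   import snowflake.connector
    #   conn = snowflake.connector.connect(account="...", user="...", password="...",
    #                                      warehouse="...", database="HR", schema="ANALYTICS")
    #   conn.cursor().execute(MERGE_WORKER_DIM)
    print(MERGE_WORKER_DIM)
```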
Visit us at .

Whenever possible, we hire locally to NTT DATA offices or client sites. This ensures we can provide timely and effective support tailored to each client's needs. While many positions offer remote or hybrid work options, these arrangements are subject to change based on client requirements. For employees near an NTT DATA office or client site, in-office attendance may be required for meetings or events, depending on business needs. At NTT DATA, we are committed to staying flexible and meeting the evolving needs of both our clients and employees.

NTT DATA recruiters will never ask for payment or banking information and will only use @nttdata.com and @talent.nttdataservices.com email addresses. If you are requested to provide payment or disclose banking information, please submit a contact us form.

NTT DATA endeavors to make its website accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at . This contact information is for accommodation requests only and cannot be used to inquire about the status of applications.

NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click . If you'd like more information on your EEO rights under the law, please click . For Pay Transparency information, please click .

Posted 1 week ago

Apply

0.0 - 3.0 years

1 - 20 Lacs

hyderabad, telangana, india

On-site

Description:
We are seeking a motivated MDM Associate Data Steward to join our team in India. The ideal candidate will support the management of master data, ensuring its accuracy and consistency across various systems. This is an entry-level position suitable for candidates with 0-3 years of experience who are looking to grow their careers in data management.

Responsibilities:
- Assist in the management and maintenance of Master Data Management (MDM) systems.
- Support data quality initiatives and ensure data accuracy and consistency across systems.
- Collaborate with cross-functional teams to gather and document data requirements.
- Participate in data governance processes and help enforce data standards.
- Conduct data analysis to identify discrepancies and recommend corrective actions.
- Create and maintain documentation related to data processes and workflows.

Skills and Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proficiency in data management tools and technologies.
- Basic understanding of data governance principles and practices.
- Familiarity with SQL and database management systems.
- Strong analytical skills with attention to detail.
- Excellent communication and collaboration skills.
- Ability to work in a fast-paced environment and manage multiple tasks.
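
For illustration only: a typical stewardship task is reconciling the same master records across systems. The small Python sketch below compares hypothetical customer records from two systems and reports missing keys and mismatched fields; the system names, fields, and data are invented.

```python
# Illustrative discrepancy check a data steward might run: compare customer
# master records from two systems keyed on customer_id. All data is invented.

crm = {
    "C001": {"name": "Acme Ltd", "country": "IN"},
    "C002": {"name": "Globex",   "country": "US"},
}
erp = {
    "C001": {"name": "ACME Ltd", "country": "GB"},
    "C003": {"name": "Initech",  "country": "DE"},
}

def find_discrepancies(a: dict, b: dict):
    """Report keys missing from either system and fields whose values differ."""
    issues = []
    for key in sorted(set(a) | set(b)):
        if key not in a or key not in b:
            issues.append((key, "missing in " + ("CRM" if key not in a else "ERP")))
            continue
        for field in a[key]:
            # Case-insensitive, whitespace-tolerant comparison of field values.
            if str(a[key][field]).strip().lower() != str(b[key].get(field, "")).strip().lower():
                issues.append((key, f"{field}: '{a[key][field]}' vs '{b[key].get(field)}'"))
    return issues

if __name__ == "__main__":
    for issue in find_discrepancies(crm, erp):
        print(issue)
```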

Posted 1 week ago

Apply