8.0 - 12.0 years
0 Lacs
Hyderabad, Telangana
On-site
As an Oracle Apps Technical Lead, you will be responsible for leading the Oracle Apps Consulting Team, designing, developing, and maintaining Magnitude products based on Oracle E-Business Suite. Your role will involve providing technical leadership to the team during feature design and review, and you will actively participate in the testing process through test reviews, analysis, test witnessing, and software certification.

With 8 to 10 years of experience, you must have worked in a techno-functional role with a strong background in Oracle EBS applications. Proficiency in SQL and PL/SQL is a must-have skill for this position. Your technical skills should include a strong understanding of Oracle EBS applications, particularly O2C and P2P cycle configuration and customization, and a good grasp of the underlying Financials and SCM module tables, columns, and relationships. The ability to translate business requirements into technical solutions and to solve complex problems through active team participation is crucial. Knowledge of BI tools and data warehousing concepts would be an added advantage, and familiarity with Services and other Oracle EBS modules is desirable.

In addition to your technical expertise, you should possess strong analytical and issue-resolution skills. Experience collaborating with project teams and customers and working across onshore/offshore teams is essential, as are excellent written and verbal communication skills for effective interaction within the team and with stakeholders.
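Since the role leans heavily on SQL against the EBS O2C data model, here is a minimal sketch of the kind of query involved, run from Python via the python-oracledb driver. The table and column names follow the standard EBS order-to-cash base tables; the connection details and status filter are placeholder assumptions, not part of the posting.

```python
# Minimal sketch: counting booked order lines in the EBS O2C base tables.
# Credentials/DSN are placeholders; table names follow the standard EBS schema.
import oracledb

conn = oracledb.connect(user="apps", password="...", dsn="ebs-db:1521/EBSDB")
sql = """
    SELECT h.order_number, h.ordered_date, COUNT(l.line_id) AS line_count
    FROM   oe_order_headers_all h
    JOIN   oe_order_lines_all   l ON l.header_id = h.header_id
    WHERE  h.flow_status_code = 'BOOKED'
    GROUP  BY h.order_number, h.ordered_date
"""
with conn.cursor() as cur:
    # For SELECT statements, execute() returns the cursor, which is iterable.
    for order_number, ordered_date, line_count in cur.execute(sql):
        print(order_number, ordered_date, line_count)
```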
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Maharashtra
On-site
You will be joining Birlasoft as a Genio OpenText ETL Developer. Birlasoft is a global leader in Cloud, AI, and Digital technologies, known for its domain expertise and enterprise solutions. As part of the CKA Birla Group, with over 12,000 professionals, Birlasoft is committed to building sustainable communities and empowering societies worldwide.

As a Genio OpenText ETL Developer, you will play a crucial role in designing, developing, and maintaining ETL workflows to support data integration and migration projects. You will collaborate with business analysts and data architects to understand data requirements and translate them into technical specifications, and you will implement data extraction, transformation, and loading processes to integrate data from various sources. You will optimize ETL workflows for performance and scalability, perform data quality checks, ensure data integrity throughout the ETL process, and troubleshoot and resolve ETL-related issues. Additionally, you will document ETL processes, maintain technical documentation, and provide support and guidance to junior ETL developers.

To qualify for this role, you should have a Bachelor's degree in Computer Science, Information Technology, or a related field, proven experience as an ETL Developer focusing on OpenText Genio, and strong knowledge of ETL concepts, data integration, and data warehousing. Proficiency in SQL, experience with database management systems, familiarity with data modeling and data mapping techniques, excellent problem-solving skills, attention to detail, and strong communication and teamwork abilities are essential. Preferred qualifications include experience with other ETL tools and technologies and knowledge of Agile development methodologies.
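Genio itself is a proprietary graphical ETL tool, so its workflows cannot be reproduced here, but the extract-transform-load pattern the posting describes can be sketched generically in Python. The file names, cleansing rules, and staging path below are illustrative assumptions only.

```python
# Generic extract-transform-load sketch in pandas; Genio implements the same
# pattern graphically. Source/target names and rules are illustrative only.
import pandas as pd

# Extract: pull raw records from a source drop (here, a CSV extract).
raw = pd.read_csv("customer_extract.csv")

# Transform: standardize keys and cleanse obvious quality problems.
raw["email"] = raw["email"].str.strip().str.lower()
clean = raw.dropna(subset=["customer_id"]).drop_duplicates("customer_id")

# Data quality check before load: fail fast if an integrity rule is violated.
assert clean["customer_id"].is_unique, "duplicate keys after cleansing"

# Load: write the conformed records to the target staging area.
clean.to_parquet("staging/customers.parquet", index=False)
```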
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
As a Lead Data Engineer specializing in Data Governance with expertise in Azure Purview, your key responsibilities will include designing and implementing data governance policies, standards, and procedures to guarantee data quality, consistency, and security. You will be tasked with identifying, analyzing, and resolving data quality issues across various data sources and platforms, and collaboration with cross-functional teams to enforce data governance structures and ensure adherence to policies and standards will be a crucial aspect of your role.

Your role will also involve implementing and maintaining monitoring systems to track data quality, compliance, and security. Proficiency in data modelling, data warehousing, ETL processes, and data quality tools is essential, and familiarity with data governance tools like Azure Purview will help you execute your duties effectively. Ensuring that data is safeguarded and complies with privacy regulations, through appropriate access controls and security measures, will be a top priority. You will also facilitate data stewardship activities and guide data stewards on best practices in data governance.

Leveraging Azure OneLake and Azure Synapse Analytics, you will design and implement scalable data storage and analytics solutions that support big data processing and analysis. Your expertise in these areas will be instrumental in meeting the organization's data processing and analysis requirements.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Karnataka
On-site
As a data professional at DataZymes Analytics, you will be involved in setting up data pipelines in Azure, utilizing Azure Data Factory, and configuring Azure Data Lake with both structured and unstructured data. Your role will require you to be a self-starter who can independently design, develop, and deliver solutions efficiently.

Your responsibilities will include demonstrating a strong understanding of Azure services such as Azure Storage (AFS, Blob, Data Lake) and Azure Functions. You should also possess solid knowledge of databases, enabling you to write complex queries and design schemas effectively, and your expertise in setting up CI/CD pipelines in Azure through GitHub will be crucial for seamless development processes.

To excel in this role, you should hold a Bachelor's or Master's degree in Computer Science or Information Technology. Experience with batch job scheduling, identifying data or job dependencies, and proficiency in data warehousing, ETL, and big data processing will be advantageous. Familiarity with the Pharma domain is preferred but not mandatory.

At DataZymes Analytics, we value accuracy, efficiency, and innovation. By leveraging the latest technologies and best practices, we aim to deliver high-quality solutions with fast turnaround times. If you are passionate about data analytics and eager to contribute to a dynamic team that thrives on excellence, we welcome you to apply and join us on our journey of transforming data into actionable insights for our Life Sciences clients.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
You are an experienced Oracle Data Integrator (ODI) Developer with extensive knowledge of Oracle GoldenGate joining our team. With over 5 years of hands-on experience, you excel in building, overseeing, and enhancing data integration processes. Your responsibilities include working with large-scale data environments, ensuring seamless data migrations, and supporting real-time integration strategies.

Your key responsibilities will involve designing, developing, and executing data integration solutions utilizing ODI and Oracle GoldenGate. You will create and manage ETL processes for data migration, transformation, and integration, design real-time data replication setups using GoldenGate across various systems, and optimize and troubleshoot data pipelines for performance and reliability. Collaborating with cross-functional teams to translate business requirements into technical solutions will be crucial. You will also coordinate with DBAs and infrastructure teams to ensure smooth integration and system performance, and manage data warehousing, synchronization, and migration tasks while supporting GoldenGate replication for high availability, disaster recovery, and data synchronization. Ensuring the scalability, performance, and security of integration configurations will be part of your role, as will developing technical documentation, providing training on ODI and GoldenGate processes, supporting production environments, and troubleshooting data integration issues as required.

Your required skills and qualifications include 5+ years of hands-on experience in ODI development and implementation, proven expertise in Oracle GoldenGate, a strong command of SQL, PL/SQL, and scripting for data manipulation, a solid understanding of data modeling, ETL architecture, and multi-system integration, familiarity with Oracle databases and data warehousing concepts, experience with the various ODI components, proficiency in configuring GoldenGate for high availability and disaster recovery, excellent troubleshooting and optimization skills for data pipelines, experience handling complex data migration and synchronization tasks, and the ability to excel in a fast-paced, client-facing environment.

Preferred skills include familiarity with other ETL tools such as Informatica and Talend, knowledge of Oracle Cloud Infrastructure (OCI) or other cloud platforms, certifications in ODI, GoldenGate, or other Oracle technologies, and experience with performance tuning in large-scale data integration projects. Educationally, you hold a Bachelor's degree in Computer Science, Information Technology, or a related field; relevant Oracle certifications (ODI, GoldenGate) are considered a plus.
Posted 1 week ago
10.0 - 14.0 years
0 Lacs
Pune, Maharashtra
On-site
As a seasoned Data Analyst with over 10 years of experience, you will play a crucial role in developing and implementing data analytics strategies, frameworks, and methodologies aligned with the goals and objectives of the organization. Your primary responsibility will be to drive the identification and exploration of data-driven opportunities to optimize business performance, enhance operational efficiency, and achieve strategic objectives.

You will collaborate closely with senior leaders and stakeholders to define analytical requirements, establish key performance indicators (KPIs), and develop metrics to measure business performance and progress. Additionally, you will design and implement advanced analytical models and algorithms to extract insights, uncover patterns, and make predictions using large and complex data sets. Your expertise in data analysis, statistical modeling, and machine learning techniques will be instrumental in addressing complex business problems and supporting strategic decision-making. Furthermore, you will develop and maintain data governance frameworks, data quality standards, and data management practices to ensure the accuracy, integrity, and consistency of data assets.

In this role, you will lead cross-functional teams in the design and delivery of data visualizations, dashboards, and reports that effectively communicate insights and drive action. It will be essential for you to stay updated with emerging trends, technologies, and best practices in data analytics and to recommend their adoption to enhance analytical capabilities. Mentoring and coaching junior analysts to foster their professional growth and support their development of advanced analytical skills will also be part of your responsibilities. You will collaborate closely with data engineering teams to ensure efficient data collection, integration, and preparation for analysis, and you will present complex findings, insights, and recommendations to senior leaders and stakeholders in a clear, concise, and compelling manner. Moreover, you will play a key role in fostering a data-driven culture within the organization by promoting the value of data, advocating for data-driven decision-making, and driving data literacy initiatives.

Requirements:
- Bachelor's degree in a quantitative field such as Statistics, Mathematics, Economics, or Computer Science. A Master's or Ph.D. degree is strongly preferred.
- 10+ years of extensive experience as a Data Analyst, with a significant portion of experience in a senior or lead role.
- Proven track record of designing and implementing data analytics strategies and driving successful data-driven initiatives.
- Expert proficiency in SQL for data extraction, manipulation, and analysis.
- Advanced programming skills in Python/R for statistical analysis, predictive modeling, and machine learning.
- In-depth knowledge of statistical analysis techniques, predictive modeling, and advanced machine learning algorithms.
- Strong experience with data visualization tools such as Tableau, Power BI, or similar.
- Extensive experience with data blending, preprocessing, and automation tools within Power BI or similar.
- Solid understanding of database structures, data warehousing concepts, and data governance principles.
- Exceptional analytical and problem-solving skills, with the ability to tackle complex business challenges and provide innovative solutions.
- Excellent leadership, strategic thinking, and stakeholder management abilities.
- Exceptional communication and presentation skills, with the ability to influence and inspire stakeholders at all levels of the organization.
- Proven ability to work independently, manage multiple projects, and prioritize effectively.

Preferred Qualifications:
- Experience in implementing data analytics solutions in cloud platforms such as AWS, Azure, or Google Cloud.
- Knowledge of big data technologies such as Hadoop, Spark, or similar.
- Familiarity with data science platforms and libraries (e.g., TensorFlow, PyTorch, scikit-learn).
- Strong business acumen and the ability to align data analysis efforts with organizational goals and strategies.
- Experience in leading and managing cross-functional teams.
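Since the role calls for predictive modeling in Python, here is a minimal sketch of the kind of workflow implied; the dataset, feature names, and metric choice are placeholder assumptions, not part of the posting.

```python
# Minimal predictive-modeling sketch with scikit-learn; the CSV file and
# column names are placeholder assumptions.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("customer_churn.csv")
X = df[["tenure_months", "monthly_spend", "support_tickets"]]
y = df["churned"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"holdout AUC: {auc:.3f}")  # one KPI-style metric for stakeholders
```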
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Pune, Maharashtra
On-site
As an ideal candidate for this role, you will be responsible for designing and architecting scalable Big Data solutions within the Hadoop ecosystem. Your key duties will include leading architecture-level discussions for data platforms and analytics systems, constructing and optimizing data pipelines utilizing PySpark and other distributed computing tools, translating business requirements into scalable data models and integration workflows, and ensuring the high performance and availability of enterprise-grade data processing systems. In addition, you will play a crucial role in mentoring development teams and offering guidance on best practices and performance tuning.

To excel in this position, you must possess architect-level experience with the Big Data ecosystem and enterprise data solutions. Proficiency in Hadoop, PySpark, and distributed data processing frameworks is essential, along with strong hands-on experience in SQL and data warehousing concepts. A deep understanding of data lake architecture, data ingestion, ETL, and orchestration tools is also required. Your experience in performance optimization and handling large-scale data sets, coupled with excellent problem-solving, design, and analytical skills, will be highly valued.

While not mandatory, exposure to cloud platforms like AWS, Azure, or GCP for data solutions would be a beneficial asset. Additionally, familiarity with data governance, data security, and metadata management is considered a good-to-have skill set for this role.

Joining our team offers you the opportunity to work with cutting-edge Big Data technologies, gain leadership exposure, and participate directly in architectural decisions. This is a stable, full-time position within a top-tier tech team, offering a conducive work-life balance with a standard 5-day working schedule. If you are passionate about Big Data technologies and eager to contribute to innovative solutions, we welcome your application for this exciting opportunity.
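As a rough illustration of the PySpark pipeline work this posting describes, here is a minimal batch job: ingest raw events, apply a business transformation, and write a partitioned table. The paths, schema, and aggregation are illustrative assumptions.

```python
# Sketch of a PySpark batch pipeline; paths and columns are assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily").getOrCreate()

orders = spark.read.parquet("/datalake/raw/orders")

daily = (
    orders
    .filter(F.col("status") == "COMPLETE")
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date", "region")
    .agg(F.sum("amount").alias("revenue"), F.count("*").alias("order_count"))
)

# Partitioning by date keeps downstream scans cheap on large volumes.
daily.write.mode("overwrite").partitionBy("order_date").parquet(
    "/datalake/curated/orders_daily"
)
```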
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Pune, Maharashtra
On-site
You should be a certified Oracle Database PL/SQL Developer Professional for the role of Mantas Scenario Developer. Your responsibilities will include utilizing Mantas (OFSAA FCCM) and Scenario Manager to customize scenarios. A strong understanding of Oracle DB and PL/SQL is imperative, along with the ability to design and develop load processes and queries for large databases with high transaction volumes. Your problem-solving and debugging skills should be top-notch, and you should communicate effectively and collaborate well within a team.

Your role will entail designing high-quality deliverables in alignment with business requirements, adhering to defined standards and design principles. You will also develop and maintain highly scalable ETL applications, integrate code following CI/CD practices using Maven and uDeploy, and review code modules developed by other team members. Experience in Agile methodology, analyzing existing code to resolve production issues, and working with ETL development tools and data warehousing is essential. Additionally, knowledge of SQL, query tuning, performance tuning, Agile development models, and scrum teams is crucial. Previous experience in the banking domain, design experience in ETL technology, and proficiency in PL/SQL and data warehousing fundamentals are considered advantageous for this role.
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
Karnataka
On-site
As a GCP Senior Data Engineer/Architect, you will play a crucial role in our team by designing, developing, and implementing robust and scalable data solutions on the Google Cloud Platform (GCP). Collaborating closely with Architects and Business Analysts, especially for our US clients, you will translate data requirements into effective technical solutions.

Your responsibilities will include designing and implementing scalable data warehouse and data lake solutions, orchestrating complex data pipelines, leading cloud data lake implementation projects, participating in cloud migration projects, developing containerized applications, optimizing SQL queries, writing automation scripts in Python, and utilizing various GCP data services such as BigQuery, Bigtable, and Cloud SQL.

Your expertise in data warehouse and data lake design and implementation, experience in data pipeline development and tuning, hands-on involvement in cloud migration and data lake projects, proficiency in Docker and GKE, strong SQL and Python scripting skills, and familiarity with GCP services like BigQuery, Cloud SQL, Dataflow, and Composer will be essential for this role. Additionally, knowledge of data governance principles, experience with dbt, and the ability to work effectively within a team and adapt to project needs are highly valued. Strong communication skills, willingness to work UK shift timings, and openness to giving and receiving feedback will contribute to your success in this role.
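The posting pairs Python scripting with BigQuery; as a minimal sketch of that combination using the official google-cloud-bigquery client, with the project, dataset, and table names as placeholder assumptions:

```python
# Minimal BigQuery sketch: run a parameterized query and load the result
# into a dataframe. Project/dataset/table names are assumptions.
import datetime

from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")

sql = """
    SELECT event_date, COUNT(*) AS events
    FROM `my-analytics-project.web.events`
    WHERE event_date >= @start_date
    GROUP BY event_date
    ORDER BY event_date
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter(
            "start_date", "DATE", datetime.date(2025, 1, 1)
        )
    ]
)
df = client.query(sql, job_config=job_config).to_dataframe()
print(df.head())
```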
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
Genpact is a global professional services and solutions firm committed to shaping the future. With a workforce of over 125,000 employees across 30+ countries, we are fueled by curiosity, agility, and the drive to deliver lasting value to our clients. Our purpose is to relentlessly pursue a world that works better for people, serving and transforming leading enterprises, including the Fortune Global 500, through our deep business and industry expertise, digital operations services, and proficiency in data, technology, and AI.

We are looking for candidates for the position of Senior Principal Consultant - Power BI Developer. The responsibilities include:
- Collaborating within a team to identify, design, and implement a consistent and intuitive reporting/dashboarding user experience across environments and report methods that defines security and meets usability and scalability best practices.
- Extracting query data from industry cognitive model/data lake tables and constructing data models using BI tools.
- Applying necessary business logic through data transformation and DAX.
- Proficiency in Power BI data modelling and its various built-in functions.
- Familiarity with report sharing through Workspace/App, access management, dataset scheduling, and Enterprise Gateway.
- Understanding of static and dynamic row-level security.
- Capability to generate wireframes based on user stories and business requirements.
- Basic comprehension of ETL and data warehousing concepts.
- Conceptualizing and developing industry-specific insights in the form of dashboards/reports/analytical web applications to deliver Pilots/Solutions following best practices.

Qualifications we seek:
Minimum Qualifications:
- Graduate

If you are interested in this role, the primary location is India-Bangalore, with a full-time schedule. The education level required is Bachelor's / Graduation / Equivalent. The job posting date is Jun 22, 2025, 11:45:10 PM, and the unposting date is Jul 23, 2025, 1:29:00 PM. This is a full-time job falling under the Master Skills List - Digital category.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a Salesforce Datacloud & Agentforce Solution Architect at NTT DATA in Hyderabad, Telangana (IN-TG), India, you will be responsible for designing, developing, and implementing AI-powered conversational experiences within the Salesforce platform. Your role will involve utilizing Agentforce capabilities to create automated customer interactions across various channels, requiring strong technical skills in Salesforce development and natural language processing (NLP) to build effective virtual agents.

Your core responsibilities will include architecting and building data integration solutions using Salesforce Data Cloud to unify customer data from diverse sources; implementing data cleansing, matching, and enrichment processes to improve data quality; designing and managing data pipelines for efficient data ingestion, transformation, and loading; collaborating with cross-functional teams to understand business requirements and translate them into data solutions; monitoring data quality, identifying discrepancies, and taking corrective actions; and establishing and enforcing data governance policies to maintain data consistency and compliance.

To be successful in this role, you should have expertise in Salesforce Data Cloud features like data matching, cleansing, enrichment, and data quality rules. You should also possess an understanding of data modeling concepts, proficiency in using Salesforce Data Cloud APIs and tools for data integration, knowledge of data warehousing concepts, and experience in implementing Salesforce Data Cloud for Customer 360 initiatives.

Additionally, you will be involved in designing and developing data integration solutions, managing data quality issues, collaborating with business stakeholders to define data requirements and KPIs, building and customizing Agentforce conversational flows, refining NLP models for accurate understanding of customer queries, monitoring Agentforce performance, integrating Agentforce with other Salesforce components, and testing and deploying Agentforce interactions across various channels.

Key skills to highlight on your resume include expertise in Salesforce administration and development, understanding of Salesforce architecture, deep knowledge of Agentforce features, familiarity with NLP techniques, proven ability in conversational design, skills in data analysis, and experience in designing, developing, and deploying solutions on the Salesforce Data Cloud platform.

At NTT DATA, a trusted global innovator of business and technology services, we are committed to helping clients innovate, optimize, and transform for long-term success. With a diverse team of experts, we provide services in business and technology consulting, data and artificial intelligence, industry solutions, application development, infrastructure management, and connectivity. Join us to be part of a global network dedicated to moving confidently and sustainably into the digital future.
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Business Analyst with data warehousing experience, you will play a key role in analyzing and interpreting data to provide valuable insights for the organization. Your responsibilities will include gathering and documenting business requirements, working closely with stakeholders to understand their needs, and translating these requirements into functional specifications for data warehouse solutions.

Experience in the Pharma industry will be advantageous, as you will have the opportunity to leverage your domain knowledge to enhance data analysis and reporting processes. Your ability to communicate effectively with both technical and non-technical stakeholders will be crucial to successful project outcomes.

This is a full-time position based in Pune or Nagpur, with office hours aligned with US EST. If you are a detail-oriented individual with a passion for data analysis and strong business acumen, we encourage you to apply for this exciting opportunity to make a meaningful impact through data-driven insights.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
KLA is a global leader in diversified electronics for the semiconductor manufacturing ecosystem. Virtually every electronic device in the world is produced using our technologies. No laptop, smartphone, wearable device, voice-controlled gadget, flexible screen, VR device or smart car would have made it into your hands without us. KLA invents systems and solutions for the manufacturing of wafers and reticles, integrated circuits, packaging, printed circuit boards and flat panel displays. The innovative ideas and devices that are advancing humanity all begin with inspiration, research and development. KLA focuses more than average on innovation and we invest 15% of sales back into R&D. Our expert teams of physicists, engineers, data scientists and problem-solvers work together with the world's leading technology providers to accelerate the delivery of tomorrow's electronic devices. Life here is exciting and our teams thrive on tackling really hard problems. There is never a dull moment with us.

The Information Technology (IT) group at KLA is involved in every aspect of the global business. IT's mission is to enable business growth and productivity by connecting people, process, and technology. It focuses not only on enhancing the technology that enables our business to thrive but also on how employees use and are empowered by technology. This integrated approach to customer service, creativity and technological excellence enables employee productivity, business analytics, and process excellence.

As a Sr. data engineer on the Data Sciences and Analytics team, you will play a key role in KLA's data strategy principles and techniques. As part of the centralized analytics team, you will help analyze and find key data insights into various business unit processes across the company. You will provide key performance indicators and dashboards to help various business users/partners make business-critical decisions. You will craft and develop analytical solutions by capturing business requirements and translating them into technical specifications, building data models and data visualizations.

Responsibilities:
- Design, develop and deploy Microsoft Fabric solutions, Power BI reports and dashboards.
- Collaborate with business stakeholders to gather requirements and translate them into technical specifications.
- Develop data models and establish data connections to various data sources. Use expert knowledge of Microsoft Fabric architecture, deployment, and management.
- Optimize Power BI solutions for performance and scalability.
- Implement best practices for data visualization and user experience.
- Conduct code reviews and provide mentorship to junior developers.
- Manage permissions and workspaces in Power BI, ensuring a secure and efficient analytics platform.
- Conduct assessments and audits of existing Microsoft Fabric environments to identify areas for improvement.
- Stay current with the latest Fabric and Power BI features and updates.
- Troubleshoot and resolve issues related to Fabric objects, Power BI reports and data sources.
- Create detailed documentation, including design specifications, implementation plans, and user guides.

Minimum Qualifications:
- Doctorate (Academic) Degree and 0 years related work experience; Master's Level Degree and related work experience of 3 years; Bachelor's Level Degree and related work experience of 5 years
- Proven experience as a Power BI Developer, with a strong portfolio of Power BI projects.
- In-depth knowledge of Power BI, including DAX, Power Query, and data modeling.
- Experience with SQL and other data manipulation languages.
- In-depth knowledge of Microsoft Fabric and Power BI, including its components and capabilities.
- Strong understanding of Azure cloud computing, data integration, and data management.
- Excellent problem-solving skills and the ability to work independently and as part of a team.
- Excellent technical problem-solving and performance optimization skills.
- Specialist in SQL and stored procedures with data warehouse concepts.
- Experience with ETL processes (Extract, Transform, Load).
- Exceptional communication and interpersonal skills.
- Expert knowledge of cloud and big data concepts and tools: Azure, AWS, Data Lake, Snowflake, etc.

Nice to have:
- Extremely strong SQL skills
- Foundational knowledge of metadata management, Master Data Management, data governance, and data analytics
- Technical knowledge of Databricks/Data Lake/Spark/SQL.
- Experience in configuring SSO (Single Sign-On), RBAC, and security roles on an analytics platform.
- SAP functional knowledge is a plus
- Microsoft certifications related to Microsoft Fabric/Power BI or Azure/analytics are a plus.
- Good understanding of requirements and converting them into data warehouse solutions

We offer a competitive, family friendly total rewards package. We design our programs to reflect our commitment to an inclusive environment, while ensuring we provide benefits that meet the diverse needs of our employees. KLA is proud to be an equal opportunity employer.

Be aware of potentially fraudulent job postings or suspicious recruiting activity by persons that are currently posing as KLA employees. KLA never asks for any financial compensation to be considered for an interview, to become an employee, or for equipment. Further, KLA does not work with any recruiters or third parties who charge such fees either directly or on behalf of KLA. Please ensure that you have searched KLA's Careers website for legitimate job postings. KLA follows a recruiting process that involves multiple interviews in person or on video conferencing with our hiring managers. If you are concerned that a communication, an interview, an offer of employment, or that an employee is not legitimate, please send an email to talent.acquisition@kla.com to confirm the person you are communicating with is an employee. We take your privacy very seriously and confidentially handle your information.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
You deserve to do what you love, and love what you do - a career that works as hard for you as you do. At Fiserv, we are more than 40,000 FiservProud innovators delivering superior value for our clients through leading technology, targeted innovation and excellence in everything we do. You have choices - if you strive to be a part of a team driven to create with purpose, now is your chance to Find your Forward with Fiserv.

Responsibilities
Requisition ID: R-10358162
Date posted: 07/23/2025
End Date: 07/31/2025
City: Noida
State/Region: Uttar Pradesh
Country: India
Location Type: Onsite

Calling all innovators - find your future at Fiserv.

Professional, Data Architecture
- Good hands-on experience with ETL and BI tools like SSIS, SSRS, Power BI, etc.
- Readiness to play an individual contributor role on the technical front
- Excellent communication skills
- Readiness to travel onsite for short term, as required
- 3-5 years of experience in ETL development, with hands-on experience in a migration or data warehousing project
- Strong database fundamentals and experience in writing unit test cases and test scenarios
- Expert knowledge in writing SQL commands, queries and stored procedures
- Good knowledge of ETL tools like SSIS, Informatica, etc. and data warehousing concepts
- Good knowledge of writing macros
- Good client-handling skills, with onsite experience preferred

Thank you for considering employment with Fiserv. Please:
- Apply using your legal name
- Complete the step-by-step profile and attach your resume (either is acceptable, both are preferable).

Our commitment to Diversity and Inclusion:
Note to agencies: Fiserv does not accept resume submissions from agencies outside of existing agreements. Please do not send resumes to Fiserv associates. Fiserv is not responsible for any fees associated with unsolicited resume submissions.

Warning about fake job posts: Please be aware of fraudulent job postings that are not affiliated with Fiserv. Fraudulent job postings may be used by cyber criminals to target your personally identifiable information and/or to steal money or financial information. Any communications from a Fiserv representative will come from a legitimate Fiserv email address.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Maharashtra
On-site
You are an experienced Databricks on AWS and PySpark Engineer looking to join our team. Your role will involve designing, building, and maintaining large-scale data pipelines and architectures using Databricks on AWS and PySpark, developing and optimizing data processing workflows, collaborating with data scientists and analysts, ensuring data quality, security, and compliance, troubleshooting data pipeline issues, and staying updated with industry trends in data engineering and big data.

Your responsibilities will include:
- Designing, building, and maintaining large-scale data pipelines and architectures using Databricks on AWS and PySpark
- Developing and optimizing data processing workflows using PySpark and Databricks
- Collaborating with data scientists and analysts to design and implement data models and architectures
- Ensuring data quality, security, and compliance with industry standards and regulations
- Troubleshooting and resolving data pipeline issues and optimizing performance
- Staying up-to-date with industry trends and emerging technologies in data engineering and big data

Requirements:
- 3+ years of experience in data engineering, with a focus on Databricks on AWS and PySpark
- Strong expertise in PySpark and Databricks, including data processing, data modeling, and data warehousing
- Experience with AWS services such as S3, Glue, and IAM
- Strong understanding of data engineering principles, including data pipelines, data governance, and data security
- Experience with data processing workflows and data pipeline management

Soft Skills:
- Excellent problem-solving skills and attention to detail
- Strong communication and collaboration skills
- Ability to work in a fast-paced, dynamic environment
- Ability to adapt to changing requirements and priorities

If you are a proactive and skilled professional with a passion for data engineering and a strong background in Databricks on AWS and PySpark, we encourage you to apply for this opportunity.
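To make the Databricks-on-AWS angle concrete, here is a minimal sketch of one job step: read raw JSON from S3, conform it, and write a Delta table. The bucket, columns, and table name are placeholder assumptions; on Databricks a SparkSession is already available as `spark`.

```python
# Sketch of a Databricks-on-AWS step; bucket/schema/table names are
# assumptions. `spark` is the session Databricks provides in a notebook/job.
from pyspark.sql import functions as F

events = spark.read.json("s3://raw-bucket/events/2025/")

conformed = (
    events
    .select("event_id", "user_id", "event_type", "event_ts")
    .withColumn("event_ts", F.to_timestamp("event_ts"))
    .dropDuplicates(["event_id"])  # basic data-quality guard
)

(conformed.write
    .format("delta")
    .mode("append")
    .partitionBy("event_type")
    .saveAsTable("analytics.events_conformed"))
```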
Posted 1 week ago
9.0 - 12.0 years
25 - 35 Lacs
Pune
Hybrid
We have an opening for a "Principal IT Engineer Applications - Data Engineer" role with one of the top US product-based MNCs, Pune.

Total exp.: 9-12 years, NP: up to 60 days
Shift timing: 2 PM to 11 PM
Location: Pune, Hinjewadi (Phase 2)

Must-have skills:
- Minimum of 8 years of hands-on experience in software or data engineering roles
- Deep understanding of data modeling and data architecture design; must be a well-rounded technical expert
- Strong expertise in SQL scripting and performance tuning within large-scale data environments
- Proven experience working with high-volume data systems
- Demonstrated ability to solve complex problems using Informatica
- Excellent communication skills with the ability to clearly articulate and explain technical scenarios
- Experience in building data products or solutions from the ground up
- Proficient in Python, Shell scripting, and PL/SQL, with a strong focus on automation
- Must have strong hands-on experience; not seeking candidates focused on team management or leadership roles

Good-to-have skills:
- Experience supporting Medicare, Medicaid, or other regulatory healthcare data platforms
- Certifications in Informatica Cloud, Oracle, Databricks, and cloud platforms (e.g., Azure, AWS)
Posted 1 week ago
5.0 - 10.0 years
15 - 25 Lacs
Hyderabad/Secunderabad, Bangalore/Bengaluru, Delhi / NCR
Hybrid
Ready to build the future with AI? At Genpact, we don't just keep up with technology - we set the pace. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, innovation-driven environment, love building and deploying cutting-edge AI solutions, and want to push the boundaries of what's possible, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Lead Consultant - Power BI Developer!

Responsibilities:
- Working within a team to identify, design and implement a reporting/dashboarding user experience that is consistent and intuitive across environments and report methods, defines security, and meets usability and scalability best practices
- Gathering query data from tables of the industry cognitive model/data lake and building data models with BI tools
- Applying requisite business logic using data transformation and DAX
- Understanding of Power BI data modelling and its various built-in functions
- Knowledge of report sharing through Workspace/App, access management, dataset scheduling and Enterprise Gateway
- Understanding of static and dynamic row-level security
- Ability to create wireframes based on user stories and business requirements
- Basic understanding of ETL and data warehousing concepts
- Conceptualizing and developing industry-specific insights in the form of dashboards/reports/analytical web applications to deliver Pilots/Solutions following best practices

Qualifications we seek in you!
Minimum Qualifications:
- Graduate

Why join Genpact?
- Lead AI-first transformation - build and scale AI solutions that redefine industries
- Make an impact - drive change for global enterprises and solve business challenges that matter
- Accelerate your career - gain hands-on experience, world-class training, mentorship, and AI certifications to advance your skills
- Grow with the best - learn from top engineers, data scientists, and AI experts in a dynamic, fast-moving workplace
- Committed to ethical AI - work in an environment where governance, transparency, and security are at the core of everything we build
- Thrive in a values-driven culture - our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress

Come join the 140,000+ coders, tech shapers, and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.
Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
Posted 1 week ago
7.0 - 12.0 years
11 - 21 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
- Must have experience in ETL/ELT tools and pipelines
- Working experience with the Python libraries Pandas, NumPy, and SQLAlchemy for ETL
- Strong understanding of data warehousing and development
- Experience with relational SQL and NoSQL databases.
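As a rough sketch of the Pandas/NumPy/SQLAlchemy ETL combination this posting names, here is one step: extract from a source database, transform in pandas, load into a warehouse staging table. The connection strings and table names are placeholder assumptions.

```python
# pandas + SQLAlchemy ETL sketch; connection URLs and tables are placeholders.
import numpy as np
import pandas as pd
from sqlalchemy import create_engine

source = create_engine("postgresql+psycopg2://etl:***@source-db/sales")
target = create_engine("postgresql+psycopg2://etl:***@warehouse/dw")

# Extract
df = pd.read_sql("SELECT order_id, amount, currency, order_ts FROM orders", source)

# Transform: derive columns with NumPy/pandas.
df["amount_usd"] = np.where(df["currency"] == "USD", df["amount"], np.nan)
df["order_date"] = pd.to_datetime(df["order_ts"]).dt.date

# Load
df.to_sql("stg_orders", target, if_exists="append", index=False)
```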
Posted 1 week ago
10.0 - 16.0 years
25 - 40 Lacs
Gurugram, Bengaluru, Delhi / NCR
Work from Office
Acuity Knowledge Partners (Acuity) is a leading provider of bespoke research, analytics and technology solutions to the financial services sector, including asset managers, corporate and investment banks, private equity and venture capital firms, hedge funds and consulting firms. Its global network of over 6,000 analysts and industry experts, combined with proprietary technology, supports more than 600 financial institutions and consulting companies to operate more efficiently and unlock their human capital, driving revenue higher and transforming operations. Acuity is headquartered in London and operates from 10 locations worldwide. The company fosters a diverse, equitable and inclusive work environment, nurturing talent, regardless of race, gender, ethnicity or sexual orientation. Acuity was established as a separate business from Moody's Corporation in 2019, following its acquisition by Equistone Partners Europe (Equistone). In January 2023, funds advised by global private equity firm Permira acquired a majority stake in the business from Equistone, which remains invested as a minority shareholder. For more information, visit www.acuitykp.com

Position Title: Associate Director (Senior Architect - Data)
Department: IT
Location: Gurgaon/Bangalore

Job Summary
The Enterprise Data Architect will enhance the company's strategic use of data by designing, developing, and implementing data models for enterprise applications and systems at conceptual, logical, business area, and application layers. This role advocates data modeling methodologies and best practices. We seek a skilled Data Architect with deep knowledge of data architecture principles, extensive data modeling experience, and the ability to create scalable data solutions. Responsibilities include developing and maintaining enterprise data architecture, ensuring data integrity, interoperability, security, and availability, with a focus on ongoing digital transformation projects.

Key Responsibilities
1. Strategy & Planning
- Develop and deliver long-term strategic goals for data architecture vision and standards in conjunction with data users, department managers, clients, and other key stakeholders.
- Create short-term tactical solutions to achieve long-term objectives and an overall data management roadmap.
- Establish processes for governing the identification, collection, and use of corporate metadata; take steps to assure metadata accuracy and validity.
- Establish methods and procedures for tracking data quality, completeness, redundancy, and improvement.
- Conduct data capacity planning, life cycle, duration, usage requirements, feasibility studies, and other tasks.
- Create strategies and plans for data security, backup, disaster recovery, business continuity, and archiving.
- Ensure that data strategies and architectures are aligned with regulatory compliance.
- Develop a comprehensive data strategy in collaboration with different stakeholders that aligns with the transformational projects' goals.
- Ensure effective data management throughout the project lifecycle.
2. Acquisition & Deployment
- Ensure the success of enterprise-level application rollouts (e.g. ERP, CRM, HCM, FP&A, etc.).
- Liaise with vendors and service providers to select the products or services that best meet company goals.
3. Operational Management
- Assess and determine governance, stewardship, and frameworks for managing data across the organization.
- Develop and promote data management methodologies and standards.
- Document information products from business processes and create data entities.
- Create entity relationship diagrams to show the digital thread across the value streams and enterprise.
- Create data normalization across all systems and databases to ensure there is a common definition of data entities across the enterprise.
- Document enterprise reporting needs and develop the data strategy to enable a single source of truth for all reporting data.
- Address the regulatory compliance requirements of each country and ensure our data is secure and compliant.
- Select and implement the appropriate tools, software, applications, and systems to support data technology goals.
- Oversee the mapping of data sources, data movement, interfaces, and analytics, with the goal of ensuring data quality.
- Collaborate with project managers and business unit leaders for all projects involving enterprise data.
- Address data-related problems regarding systems integration, compatibility, and multiple-platform integration.
- Act as a leader and advocate of data management, including coaching, training, and career development to staff.
- Develop and implement key components as needed to create testing criteria to guarantee the fidelity and performance of data architecture.
- Document the data architecture and environment to maintain a current and accurate view of the larger data picture.
- Identify and develop opportunities for data reuse, migration, or retirement.
4. Data Architecture Design
- Develop and maintain the enterprise data architecture, including data models, databases, data warehouses, and data lakes.
- Design and implement scalable, high-performance data solutions that meet business requirements.
5. Data Governance
- Establish and enforce data governance policies and procedures as agreed with stakeholders.
- Maintain data integrity, quality, and security within Finance, HR and other such enterprise systems.
6. Data Migration
- Oversee the data migration process from legacy systems to the new systems being put in place.
- Define and manage data mappings, cleansing, transformation, and validation to ensure accuracy and completeness.
7. Master Data Management
- Devise processes to manage master data (e.g., customer, vendor, product information) to ensure consistency and accuracy across enterprise systems and business processes.
- Provide data management (create, update and delimit) methods to ensure master data is governed.
8. Stakeholder Collaboration
- Collaborate with various stakeholders, including business users and other system vendors, to understand data requirements.
- Ensure the enterprise system meets the organization's data needs.
9. Training and Support
- Provide training and support to end-users on data entry, retrieval, and reporting within the candidate enterprise systems.
- Promote user adoption and proper use of data.
10. Data Quality Assurance
- Implement data quality assurance measures to identify and correct data issues.
- Ensure the Oracle Fusion and other enterprise systems contain reliable and up-to-date information.
11. Reporting and Analytics
- Facilitate the development of reporting and analytics capabilities within Oracle Fusion and other systems.
- Enable data-driven decision-making through robust data analysis.
12. Continuous Improvement
- Continuously monitor and improve data processes and the data capabilities of Oracle Fusion and other systems.
- Leverage new technologies for enhanced data management to support evolving business needs.

Technology and Tools:
- Oracle Fusion Cloud
- Data modeling tools (e.g., ER/Studio, ERwin)
- ETL tools (e.g., Informatica, Talend, Azure Data Factory)
- Data pipelines: understanding of data pipeline tools like Apache Airflow and AWS Glue
- Database management systems: Oracle Database, MySQL, SQL Server, PostgreSQL, MongoDB, Cassandra, Couchbase, Redis, Hadoop, Apache Spark, Amazon RDS, Google BigQuery, Microsoft Azure SQL Database, Neo4j, OrientDB, Memcached
- Data governance tools (e.g., Collibra, Informatica Axon, Oracle EDM, Oracle MDM)
- Reporting and analytics tools (e.g., Oracle Analytics Cloud, Power BI, Tableau, Oracle BIP)
- Hyperscalers / cloud platforms (e.g., AWS, Azure)
- Big Data technologies such as Hadoop, HDFS, MapReduce, and Spark
- Cloud platforms such as Amazon Web Services (including RDS, Redshift, and S3), Microsoft Azure services like Azure SQL Database and Cosmos DB, and Google Cloud Platform services such as BigQuery and Cloud Storage
- Programming languages (e.g. Java, J2EE, EJB, .NET, WebSphere, etc.):
  - SQL: strong SQL skills for querying and managing databases
  - Python: proficiency in Python for data manipulation and analysis
  - Java: knowledge of Java for building data-driven applications
- Data security and protocols: understanding of data security protocols and compliance standards

Key Competencies
Qualifications:
- Education: Bachelor's degree in Computer Science, Information Technology, or a related field; Master's degree preferred.
- Experience: 10+ years overall and at least 7 years of experience in data architecture, data modeling, and database design. Proven experience with data warehousing, data lakes, and big data technologies. Expertise in SQL and experience with NoSQL databases. Experience with cloud platforms (e.g., AWS, Azure) and related data services. Experience with Oracle Fusion or similar ERP systems is highly desirable.
- Skills: Strong understanding of data governance and data security best practices. Excellent problem-solving and analytical skills. Strong communication and interpersonal skills. Ability to work effectively in a collaborative team environment. Leadership experience with a track record of mentoring and developing team members. Excellent documentation and presentation skills. Good knowledge of applicable data privacy practices and laws.
- Certifications: Relevant certifications (e.g., Certified Data Management Professional, AWS Certified Big Data - Specialty) are a plus.

Behavioral
- A self-starter, an excellent planner and executor and, above all, a good team player
- Excellent communication and interpersonal skills are a must
- Must possess organizational skills, including multi-task capability, priority setting and meeting deadlines
- Ability to build collaborative relationships and effectively leverage networks to mobilize resources
- Initiative to learn the business domain is highly desirable
- Likes a dynamic and constantly evolving environment and requirements
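The tools list above names Apache Airflow for pipeline orchestration. As a minimal illustration of what that entails (targeting the Airflow 2.4+ API; the DAG name, schedule, and task bodies are placeholder assumptions):

```python
# Minimal Apache Airflow DAG sketch; task bodies/names are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull source extracts")


def load_warehouse():
    print("load conformed data into the warehouse")


with DAG(
    dag_id="enterprise_dw_daily",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_load = PythonOperator(task_id="load_warehouse", python_callable=load_warehouse)

    t_extract >> t_load  # load runs only after extraction succeeds
```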
Posted 1 week ago
5.0 - 10.0 years
8 - 18 Lacs
Pune, Delhi / NCR
Hybrid
Skills Required: Automation Testing, ETL, SQL Queries, Data Warehousing, Data Modeling

- 5 to 10 years of experience as a test engineer, especially in ETL testing.
- Corporate banking knowledge/experience would be an added advantage.
- Proven experience in test plan design.
- Understanding of the software development lifecycle and the deliverables created during it.
- Strong analytical skills, creative and critical thinking ability, and problem-solving skills.
- Familiarity with relevant quality assurance industry standard best practices and methodologies.
- Dedication to customer satisfaction.
- Excellent communication skills.
- Excellent time management skills.
- Experience in creating test solutions.
- Use of test management tools similar to JIRA.
- Understanding of executing test automation and interpreting results.
- Proficient in Windows technologies along with Office (Excel, PowerPoint) and Adobe.
- Proficient in all forms of functional testing across all browsers and devices.
- Knowledge of ETL tools.
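A common automated ETL test is source-to-target reconciliation. Here is a minimal pytest sketch of that idea; the connection URLs, table names, and columns are placeholder assumptions, not from the posting.

```python
# Sketch of ETL reconciliation tests with pytest: compare row counts and a
# column checksum between source and target. All names are placeholders.
import pytest
from sqlalchemy import create_engine, text

SOURCE = create_engine("postgresql+psycopg2://qa:***@src-db/core")
TARGET = create_engine("postgresql+psycopg2://qa:***@dwh/edw")


def scalar(engine, sql):
    with engine.connect() as conn:
        return conn.execute(text(sql)).scalar()


def test_row_counts_match():
    src = scalar(SOURCE, "SELECT COUNT(*) FROM loans")
    tgt = scalar(TARGET, "SELECT COUNT(*) FROM stg_loans")
    assert src == tgt, f"row count mismatch: source={src}, target={tgt}"


def test_balance_totals_match():
    src = scalar(SOURCE, "SELECT SUM(balance) FROM loans")
    tgt = scalar(TARGET, "SELECT SUM(balance) FROM stg_loans")
    assert src == pytest.approx(tgt), "balance checksum drifted during load"
```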
Posted 1 week ago
10.0 - 12.0 years
14 - 20 Lacs
Noida
Work from Office
Role & responsibilities

Candidate with database management system knowledge, data modeling in ETL/Snowflake, solid SQL scripting experience, ADF, and data masking.

Experience: Total 10+ yrs, Relevant: 5+ yrs
Job Location: Noida
Mode: Work From Office

Technical Skills:
- 10+ years of experience with Database Management Systems; configures database parameters and prototypes designs against logical data models. Defines data repository requirements, data dictionaries and warehousing requirements. Optimizes database access and allocates/re-allocates database resources for optimum configuration, database performance and cost.
- ADF building knowledge and data masking.
- Experience/knowledge with Microsoft SQL Replication and CDC technology is a must.
- Experience setting up and configuring HA (High Availability)/Replication/AlwaysOn, preferably having researched and fine-tuned a setup, so that the person understands the what and why and can design, implement, and configure a setup that meets client needs.
- Solid SQL scripting experience in general.
- The typical DBA skills/experience, including DB maintenance, table/index maintenance, backups, monitoring, security, data dictionary, integrity checks, configuration, patching, statistics, etc.
- Experience with SQL Server 2008/2012/2014/2016, preferably with multiple editions (Standard, Enterprise, etc.), including installing and configuring SQL Server instances in a similar capacity.
- Thorough understanding of performance tuning both as a system DBA and an application DBA (an application DBA focuses on query/application performance tuning; a system DBA tunes the database itself and its configuration).
- Strong SQL skills and strong data modeling skills, both logical and physical, with database modeling knowledge.
- Strong analytical and problem-solving skills; excellent verbal and written communication skills.
- Strong experience with Microsoft tools and software (Visual Studio 2012/2015, SQL Mgmt. Studio, Microsoft Office, etc.).
- Experience with data warehousing and OLTP database management.
- Measure, track and meet SLA metrics (analytics cycle time, schedule, accuracy, rework, etc.).
- Assist in database performance tuning and troubleshooting database issues in both OLTP and EDW environments.
- Install, configure, and maintain critical SQL Server databases, both dimensional and relational, supporting internal and customer-facing applications.
- Assist in the full range of SQL Server maintenance activities, for example: backups, restores, recovery models, database shrink operations, DBCC commands, and replication; table and index design, creation, and maintenance; ongoing maintenance activities, working to plan and automate as much as possible, increasing availability and performance while reducing manual support time.
- Assist, as part of the team, in designing, building, and maintaining the future state of Merchants' database platforms: SQL Server versions, editions, and components; database server configuration; database and data model standards; business continuity and high availability strategy; overall data architecture; upgrade and patching strategy.
- Work on automation using PowerShell and T-SQL scripts.
- Knowledge of Python, ARM, and Bicep is preferred.
- Knowledge of Snowflake is preferred.
- Knowledge of the ETL integration layer, SSIS and SSRS.
- Troubleshoot and fix issues with JAMS jobs running on SSIS, PowerShell, T-SQL and batch files.
- Knowledge of DevOps and the Azure environment.
- Migration of SQL Server; work with application support to support all CDC and ETL integration-layer processes.
- Installation and configuration of SSRS, SSIS and SSAS.
- Good to have: deep Azure experience.

Preferred candidate profile

Process Skills:
- General SDLC processes
- Understanding of utilizing Agile and Scrum software development methodologies
- Skill in gathering and documenting user requirements and writing technical specifications

Behavioral Skills:
- Good attitude and quick learner
- Well-developed analytical and problem-solving skills
- Strong oral and written communication skills
- Excellent team player, able to work with virtual teams
- Excellent leadership skills, with the ability to lead, guide and groom the team
- Self-motivated and capable of working independently with minimal management supervision
- Able to talk to the client directly and report
Posted 1 week ago
3.0 - 6.0 years
5 - 8 Lacs
Hyderabad
Work from Office
Detailed JD (Roles and Responsibilities):
- 5+ years of experience in Power BI
- Develop Power BI dashboards based on client requirements
- Performance tuning of queries
- Good grasp of data warehouse concepts
- Collaborate with application development teams to optimize database designs

Additional Skills:
- Strong analytical and problem-solving skills
- Knowledge of database performance tuning and optimization techniques
- Experience with different database systems (Snowflake)
- Ability to work independently and as part of a team

Mandatory skills: Power BI
Desired skills: Snowflake, data warehousing concepts
Domain: Financial Services
Posted 1 week ago
2.0 - 7.0 years
2 - 7 Lacs
Chennai
Work from Office
- Service and continuous improvement of the product configuration solution (CONFIGIT ACE); responsible for digital cost calculation
- Implementation and maintenance of our product structures within the product configurator; user support for CRM and CPQ
Posted 1 week ago
5.0 - 10.0 years
5 - 9 Lacs
Pune
Work from Office
Job Description: KPI Partners is seeking an experienced Senior Snowflake Administrator to join our dynamic team. In this role, you will be responsible for managing and optimizing our Snowflake environment to ensure performance, reliability, and scalability. Your expertise will contribute to designing and implementing best practices to facilitate efficient data warehousing solutions.

Key Responsibilities:
- Administer and manage the Snowflake platform, ensuring optimal performance and security.
- Monitor system performance, troubleshoot issues, and implement necessary solutions.
- Collaborate with data architects and engineers to design data models and optimal ETL processes.
- Conduct regular backups and recovery procedures to protect data integrity.
- Implement user access controls and security measures to safeguard data.
- Collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs.
- Participate in the planning and execution of data migration to Snowflake.
- Provide support for data governance and compliance initiatives.
- Stay updated with Snowflake features and best practices, and provide recommendations for continuous improvement.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 5+ years of experience in database administration, with a strong focus on Snowflake.
- Hands-on experience with SnowSQL, SQL, and data modeling.
- Familiarity with data ingestion tools and ETL processes.
- Strong problem-solving skills and the ability to work independently.
- Excellent communication skills and the ability to collaborate with technical and non-technical stakeholders.
- Relevant certifications in Snowflake or cloud data warehousing are a plus.

If you are a proactive, detail-oriented professional with a passion for data and experience in Snowflake administration, we would love to hear from you. Join KPI Partners and be part of a team that is dedicated to delivering exceptional data solutions for our clients.
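As a small illustration of routine Snowflake administration driven from Python, here is a sketch using the official snowflake-connector-python package. The account, credentials, and warehouse name are placeholder assumptions.

```python
# Minimal Snowflake admin sketch; account/credentials/warehouse names are
# placeholders. Uses the official snowflake-connector-python package.
import snowflake.connector

conn = snowflake.connector.connect(
    account="myorg-myaccount",
    user="admin_user",
    password="***",
    role="ACCOUNTADMIN",
)

cur = conn.cursor()
try:
    # Spot-check warehouses before resizing or suspending any of them.
    cur.execute("SHOW WAREHOUSES")
    for row in cur.fetchall():
        print(row[0])  # first column of SHOW WAREHOUSES is the name

    # Example cost guardrail: auto-suspend an idle warehouse after 60 seconds.
    cur.execute("ALTER WAREHOUSE ANALYTICS_WH SET AUTO_SUSPEND = 60")
finally:
    cur.close()
    conn.close()
```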
Posted 1 week ago
3.0 - 8.0 years
7 - 12 Lacs
Bengaluru
Work from Office
Educational Requirements: Bachelor of Engineering
Service Line: Infosys Quality Engineering

Responsibilities: As part of the Infosys delivery team, your primary role would be to ensure effective design, development, validation and support activities, to assure that our clients are satisfied with the high levels of service in the technology domain. You will gather the requirements and specifications to understand the client requirements in a detailed manner and translate the same into system requirements. You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers. You would be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities: This opening is for multiple locations: Bangalore, Bhubaneswar, Mysore, Hyderabad, Chennai, Pune, Coimbatore, Thiruvananthapuram. Please apply only if you have the skills mentioned under the technical requirements.

Technical and Professional Requirements - skills required (SQL, ETL/data warehouse testing):
1. Strong SQL skills for data querying and validation.
2. Experience with ETL and data warehousing testing.
3. BI report testing.

Preferred Skills:
Technology - Data Services Testing - Data Migration Testing
Technology - Data Services Testing - Data Warehouse Testing - ETL tools
Posted 1 week ago