5.0 - 9.0 years
0 Lacs
Karnataka
On-site
As a Software Engineer at Capgemini, you will be part of a collaborative community where you can shape your career, reimagine possibilities, and contribute to unlocking the value of technology for leading organizations. Your role will focus on software engineering: the development, maintenance, and optimization of software solutions/applications. You will apply scientific methods to analyze and solve software engineering problems, lead cross-functional teams, and collaborate with stakeholders.

Key Responsibilities:
- Analyze and solve software engineering problems using scientific methods.
- Develop and apply software engineering practice and knowledge in research, design, development, and maintenance.
- Exercise original thought and judgment while supervising the technical and administrative work of other software engineers.
- Build skills and expertise in the software engineering discipline to meet standard software engineer skills expectations.
- Collaborate and act as a team player with other software engineers and stakeholders.

Qualifications Required:
- 5+ years of experience in BY Planning or related SCM platforms.
- Strong understanding of supply chain and inventory optimization.
- Hands-on experience with SQL, APIs, and scripting languages.
- Familiarity with ETL tools and cloud data platforms.
- Excellent analytical, problem-solving, and communication skills.
- Ability to work independently and lead cross-functional teams.

Preferred Qualifications:
- Experience with Blue Yonder Cloud implementations.
- Certifications in BY Planning or related SCM tools.
- Exposure to Agile or hybrid delivery models.

Capgemini offers a diverse and inclusive workplace that fosters innovation and collaboration. You will have the opportunity to work remotely or in a hybrid model, supporting a healthy work-life balance. Competitive compensation and benefits, career development programs, and certifications in cloud technologies are also provided. Capgemini is a global business and technology transformation partner trusted by clients for over 55 years, delivering end-to-end services and solutions with expertise in AI, cloud, data, and more.
Posted 2 days ago
5.0 - 10.0 years
0 Lacs
Haryana
On-site
Role Overview:
As an Oracle Data Integrator (ODI) developer, your role is crucial in bringing the team's vision to life by designing, developing, and maintaining application software that aligns with business objectives. Your responsibilities also include providing support for application failures and fixes.

Key Responsibilities:
- Utilize your 8-10 years of technology experience in relevant domains, with expertise in ETL tools, especially Oracle Data Integrator (ODI).
- Demonstrate proficiency in scripting, with at least 5 years of experience, preferably in Perl and Python, along with 8+ years of Oracle experience, including SQL and stored procedures.
- Apply your strong experience in ETL development and data analysis, working collaboratively with Agile teams.
- Employ your problem analysis and investigation skills, and work with Control-M scheduling and job configuration changes.
- Utilize your knowledge of Oracle PL/SQL, Oracle Database 19c, Sybase IQ, Control-M, JIRA/Rally, GitHub, ODI/ETL tools, and a platform mindset.
- Leverage your advanced Linux/Unix skills and data-tracing capabilities for efficient data analysis (a small validation sketch follows this posting).
- Apply your background knowledge of the financial services sector.

Qualification Required:
- A tertiary qualification in a technology-related degree.

Additional Details:
The company values individuals with proven abilities in using relevant data analytics approaches and tools for problem-solving and troubleshooting. You should have experience with build languages, a focus on quality and detail, excellent documentation and communication skills, and a strong commitment to quality and auditability. The ability to engage and influence stakeholders, understand their needs, and ensure quality and governance standards are met is essential. You should also enjoy working in a large, complex environment, take on challenges effectively, have experience providing technology client service, be customer-focused, move with speed, and be committed to delivering quality integration solutions.
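To illustrate the kind of post-load validation and data tracing this role describes, here is a minimal sketch using the python-oracledb driver. The table names, `load_dt` column, and connection details are hypothetical stand-ins, not any client's actual schema.

```python
# Minimal post-load reconciliation sketch: compare staging vs. target row
# counts for one load date. Credentials and schema are placeholders.
import oracledb

conn = oracledb.connect(user="etl_user", password="***", dsn="dbhost/ORCLPDB1")
cur = conn.cursor()

load_date = "2024-01-31"
counts = {}
for table in ("STG_TRADES", "DW_TRADES"):  # hypothetical staging/target tables
    # Table names are fixed literals above, so f-string interpolation is safe here.
    cur.execute(
        f"SELECT COUNT(*) FROM {table} "
        "WHERE TRUNC(load_dt) = TO_DATE(:d, 'YYYY-MM-DD')",
        {"d": load_date},
    )
    counts[table] = cur.fetchone()[0]
conn.close()

print(counts)
assert counts["STG_TRADES"] == counts["DW_TRADES"], "staging/target counts differ"
```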
Posted 2 days ago
5.0 - 9.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a Senior Success Guide at Salesforce, you will play a critical role in accelerating customer adoption and outcomes by blending deep product knowledge, hands-on solutioning, and consultative engagement.

Key Responsibilities:
- Engage with customers to gather, analyze, and translate business requirements into actionable Salesforce solutions.
- Deliver org health assessments and advanced security assessments for customers, providing best-practice recommendations and actionable insights.
- Collaborate with architects, developers, and stakeholders to design scalable, secure, and high-performing solutions.
- Lead interactive demos, proofs of concept, and collaborative solution-building sessions.
- Build and co-develop solution components with customers to ensure faster return on value.
- Provide expert coaching sessions, product education, and technical advice to accelerate adoption and drive success.
- Act as a Subject Matter Expert (SME) for Service Cloud and support internal enablement.
- Partner with Guide Leadership to design and deliver training programs that build team skills and maturity.
- Generate positive feedback from customers, internal teams, and leadership by driving measurable outcomes and customer satisfaction.
- Show continued professional growth through certifications, Trailhead learning, and staying current with Salesforce innovations.

Qualifications Required:
- Minimum 5 years of development experience in the Salesforce ecosystem and relevant Salesforce certifications, including Salesforce Certified Platform Developer II.
- Hands-on expertise in Service Cloud capabilities such as Service Cloud Digital Engagement, Service Cloud Voice, Knowledge, Agentforce for Service, and Field Service.
- Experience developing custom solutions in Salesforce Lightning using LWC and Apex.
- Proficiency in front-end technologies such as HTML, JavaScript, and CSS.
- Familiarity with data integration tools and experience integrating Salesforce with various business systems.
- Strong knowledge of SQL, SOQL, Java, JavaScript, SLDS, and custom CSS (a SOQL sketch follows this posting).
- Familiarity with platform authentication patterns and security capabilities.
- Experience building scalable solutions using Visualforce, Apex, Web Services, and APIs.
- Expertise in Salesforce Flows, Process Builder, and advanced declarative automation tools.
- Ability to collaborate with architects, developers, admins, and business stakeholders.
- Strong communication skills and an analytical mindset to design effective solutions that meet business needs.

Join Salesforce to unleash your potential and be limitless in all areas of your life. Shape the future and redefine what's possible for yourself, for AI, and the world. Apply today to make a difference and deliver amazing experiences that customers love.
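Given the posting's emphasis on SOQL and Service Cloud, here is a hedged sketch of the kind of query involved, run from Python via the simple-salesforce library. Credentials are placeholders, and this is purely illustrative, not Salesforce's prescribed tooling for the role.

```python
# Sketch: pull recently created, still-open Cases, a typical Service Cloud
# health check. Org credentials below are placeholders.
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com", password="***", security_token="***")

result = sf.query(
    "SELECT Id, CaseNumber, Status, CreatedDate "
    "FROM Case WHERE IsClosed = false AND CreatedDate = LAST_N_DAYS:7"
)
for record in result["records"]:
    print(record["CaseNumber"], record["Status"])
```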
Posted 2 days ago
15.0 - 20.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
As Vice President of Software Engineering for Spectral Consultants, one of the leading US product-based organizations, located in Noida with a hybrid work mode, you will have the following responsibilities:
- Define and execute platform engineering strategy aligned with product and business goals.
- Lead the global R&D roadmap and product transformation initiatives.
- Build, mentor, and retain a high-performing engineering team.
- Oversee development and operations of a scalable, reliable SaaS platform.
- Collaborate cross-functionally with Product, Design, Sales, and Customer Success.
- Make technical decisions balancing engineering, business, and customer needs.
- Drive continuous improvement in performance, scalability, stability, and security.
- Champion Agile methodologies and foster a data-driven engineering culture.
- Represent engineering to senior stakeholders and report on progress and challenges.

Candidate Requirements:
- 20+ years in software engineering, with 15+ years in SaaS platform/product development.
- Proven leadership managing teams of 100+ engineers across diverse skill sets.
- Expertise in cloud-native architectures, containers, APIs, Infrastructure-as-Code, and serverless computing.
- Strong understanding of software and infrastructure cost management.
- Technical skills: Java, .NET, AWS, Red Hat, ETL tools, and front-end technologies; familiarity with GitHub Copilot and Jira.
- Experience coaching, mentoring, and building engineering talent pipelines.
Posted 2 days ago
5.0 - 10.0 years
0 Lacs
Karnataka
On-site
As an experienced Data Engineer with 6 to 8 years of relevant experience, you have a strong background in data transformation and ETL on large datasets. You also bring:
- 5+ years of data modeling experience across relational, dimensional, columnar, and big data environments.
- 5+ years of complex SQL or NoSQL experience.
- Hands-on experience with industry ETL tools such as Informatica and Unifi.

In the data and technology domain, you excel at designing customer-centric datasets spanning CRM, call center, marketing, offline, and point-of-sale data. Knowledge of Adobe Experience Platform (AEP) is mandatory, along with a solid grasp of advanced data warehouse concepts. Your proficiency extends to big data technologies such as Hadoop, Spark, Redshift, Snowflake, Hive, and Pig, as well as reporting technologies such as Tableau and Power BI. You are also familiar with Adobe Experience Cloud solutions and digital analytics or digital marketing.

Your software development and scripting skills are strong, with experience in professional software development and languages such as Python, Java, or Bash. You are adept at business requirements definition and management, structured analysis, process design, and use-case documentation. Strong verbal and written communication skills enable you to interface effectively with sales teams and lead customers to successful outcomes, and exceptional organizational skills let you multitask across different customer projects. As a self-managed, proactive, and customer-focused individual, you consistently deliver high-quality results.
Posted 3 days ago
8.0 - 12.0 years
0 Lacs
Panchkula, Haryana
On-site
You are an experienced and results-driven ETL & DWH Engineer/Data Analyst with over 8 years of experience in data integration, warehousing, and analytics. You bring deep technical expertise in ETL tools, strong data modeling knowledge, and the ability to lead complex data engineering projects from design to deployment.

Skills and Experience:
- 4+ years of hands-on experience with ETL tools such as SSIS, Informatica, DataStage, or Talend.
- Proficiency in relational databases such as SQL Server and MySQL.
- Strong understanding of Data Mart/EDW methodologies.
- Experience designing star schemas, snowflake schemas, and fact and dimension tables (a minimal star-schema sketch follows this posting).
- Experience with Snowflake or BigQuery.
- Knowledge of reporting and analytics tools such as Tableau and Power BI.
- Scripting and programming proficiency in Python.
- Familiarity with cloud platforms such as AWS or Azure.
- Ability to lead recruitment, estimation, and project execution.
- Exposure to sales and marketing data domains.
- Experience with cross-functional and geographically distributed teams.
- Ability to translate complex data problems into actionable insights.
- Strong communication and client-management skills.
- Self-starter with a collaborative attitude and a problem-solving mindset.

You will be responsible for:
- Delivering high-level and low-level design documents for middleware and ETL architecture.
- Designing and reviewing data integration components, ensuring adherence to standards and best practices.
- Owning delivery quality and timeliness across one or more complex projects.
- Providing functional and non-functional assessments for global data implementations.
- Offering technical problem-solving guidance and support to junior team members.
- Driving QA for deliverables and validating progress against project timelines.
- Leading issue escalation, status tracking, and continuous improvement initiatives.
- Supporting planning, estimation, and resourcing across data engineering efforts.

Contact:
Email: careers@grazitti.com
Address: Grazitti Interactive LLP (SEZ Unit), 2nd Floor, Quark City SEZ, A-40A, Phase VIII Extn., Mohali, SAS Nagar, Punjab 160059, India
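For illustration, here is a minimal star schema of the kind this role designs: one fact table referencing two dimensions. It runs against SQLite purely so the sketch is self-contained; the table and column names are hypothetical.

```python
# Illustrative star schema: a sales fact table with customer and date
# dimensions. SQLite is used only to keep the example runnable anywhere.
import sqlite3

ddl = """
CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY,   -- surrogate key
    customer_id  TEXT NOT NULL,         -- natural key from the source system
    segment      TEXT
);
CREATE TABLE dim_date (
    date_key  INTEGER PRIMARY KEY,      -- e.g. 20240131
    full_date TEXT NOT NULL
);
CREATE TABLE fact_sales (
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key     INTEGER REFERENCES dim_date(date_key),
    order_amount REAL NOT NULL
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(ddl)
print("star schema created")
```

The design choice worth noting: facts hold measures and foreign keys only, while descriptive attributes live in the dimensions, which keeps the fact table narrow and fast to scan.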
Posted 3 days ago
5.0 - 9.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Snowflake Lead with additional expertise in ETL tools and DBT, you will play a crucial role in designing, implementing, and managing Snowflake solutions that enable efficient data management and analytics. Your leadership will be pivotal in guiding the team to deliver high-quality, scalable, and optimized data solutions, and your ETL and DBT proficiency will further strengthen the organization's data processing capabilities.

**Roles & Responsibilities:**
- Analyze the existing data storage system and develop data solutions in Snowflake.
- Design the data warehouse and guide the team through implementation using Snowflake SnowSQL.
- Define and communicate best practices, coding standards, and architectural guidelines for Snowflake development.
- Collaborate with stakeholders to understand business requirements and translate them into effective Snowflake data solutions.
- Architect Snowflake data models, schemas, and pipelines aligned with the organization's data strategy and industry best practices.
- Lead the design and development of complex ETL processes and data transformations using Snowflake and additional ETL tools (a minimal load-step sketch follows this posting).
- Ensure seamless data integration from various sources into Snowflake while maintaining data quality and integrity.
- Use DBT (Data Build Tool) to transform raw data into well-structured, business-ready data models.
- Design and manage DBT workflows, ensuring consistent and reliable data transformations.
- Identify and address performance bottlenecks in Snowflake queries, data loading, and transformations.
- Implement optimization techniques such as caching, indexing, and partitioning to enhance query performance.
- Streamline data movement, transformation, and loading processes beyond Snowflake using ETL tools.
- Evaluate, select, and integrate ETL tools that complement the Snowflake ecosystem and improve overall data processing efficiency.
- Work closely with cross-functional teams to understand their data requirements and deliver optimal solutions.
- Facilitate communication between technical and non-technical teams to ensure successful project outcomes.
- Hands-on experience in Python.
- Strong hands-on experience converting Source Independent Loads, post-load processes, stored procedures, and SQL to Snowflake.
- Experience migrating data from on-premises databases and files to Snowflake.
- Strong understanding of ELT/ETL and integration concepts and design best practices.
- Experience in performance tuning of Snowflake pipelines and the ability to troubleshoot issues quickly.
- Experience with SnowSQL, Snowpipe, and Snowpark.

**Other Qualifications:**
- Experience in data engineering, data warehousing, and analytics.
- Strong hands-on experience with Snowflake, including architecture design, ETL development, and performance optimization.
- Proficiency in ETL tools and experience with DBT.
- Familiarity with cloud platforms (e.g., AWS, Azure, GCP) and their data services.
- Proven leadership experience, including managing technical teams and projects.
- Excellent problem-solving skills and the ability to analyze complex technical challenges.
- Effective communication skills and a collaborative approach to working with diverse teams.

Please share your CV at parul@mounttalent.com.
Location: Bangalore, Chennai, Noida, Pune, Mumbai, and Hyderabad.
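A minimal sketch of one load step this role owns: staging files into a table with COPY INTO, driven from Python via the Snowflake connector. The stage, table, and credentials are hypothetical placeholders.

```python
# Sketch: load staged CSV files into a raw table. Account, warehouse,
# database, stage, and table names are all placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    user="etl_user", password="***", account="myorg-myaccount",
    warehouse="ETL_WH", database="ANALYTICS", schema="RAW",
)
cur = conn.cursor()
try:
    cur.execute("""
        COPY INTO raw.orders               -- hypothetical target table
        FROM @raw.orders_stage             -- hypothetical stage
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        ON_ERROR = 'ABORT_STATEMENT'
    """)
    print(cur.fetchall())                  # per-file load results
finally:
    cur.close()
    conn.close()
```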
Posted 3 days ago
2.0 - 6.0 years
0 Lacs
Jaipur, Rajasthan
On-site
As a Data Engineer with Fabric, you will design, develop, and maintain data pipelines and infrastructure that deliver accurate, timely, and accessible data, driving data-driven decision-making and supporting company growth.

Key Responsibilities:
- Design, develop, and implement data pipelines using Azure Data Factory and Databricks for ingestion, transformation, and movement of data (a minimal Databricks-style sketch follows this posting).
- Develop and optimize ETL processes to ensure efficient data flow and transformation.
- Maintain Azure Data Lake solutions for efficient storage and retrieval of large datasets.
- Build and manage scalable data warehousing solutions using Azure Synapse Analytics for advanced analytics and reporting.
- Integrate various data sources into MS Fabric, ensuring data consistency, quality, and accessibility.
- Optimize data processing workflows and storage solutions to improve performance and reduce costs.
- Manage and optimize SQL and NoSQL databases to support high-performance queries and data storage requirements.
- Implement data quality checks and monitoring to ensure accuracy and consistency of data.
- Collaborate with data scientists, analysts, and stakeholders to understand data requirements and deliver actionable insights.
- Create and maintain comprehensive documentation for data processes, pipelines, infrastructure, architecture, and best practices.
- Identify and resolve issues in data pipelines, data lakes, and warehousing solutions, providing timely support and maintenance.

Qualifications:
- Experience: 2-4 years of experience in data engineering or a related field.
- Technical skills:
  - Proficiency with Azure Data Factory, Azure Synapse Analytics, Databricks, and Azure Data Lake.
  - Experience with Microsoft Fabric is a plus.
  - Strong SQL skills and experience with data warehousing (DWH) concepts.
  - Knowledge of data modeling, ETL processes, and data integration.
  - Hands-on experience with ETL tools and frameworks (e.g., Apache Airflow, Talend).
  - Knowledge of big data technologies (e.g., Hadoop, Spark) is a plus.
  - Familiarity with cloud platforms (e.g., AWS, Azure, Google Cloud) and associated data services (e.g., S3, Redshift, BigQuery).
  - Familiarity with data visualization tools (e.g., Power BI) and experience with programming languages such as Python, Java, or Scala.
  - Experience with schema design and dimensional data modeling.
- Analytical skills: strong problem-solving abilities and attention to detail.
- Communication: excellent verbal and written communication skills, with the ability to explain technical concepts to non-technical stakeholders.
- Education: bachelor's degree in computer science, engineering, mathematics, or a related field. Advanced degrees or certifications are a plus.

Interested candidates can share their CV at sulabh.tailang@celebaltech.com.
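Here is a minimal sketch of the kind of Databricks-style ingestion-and-clean step the role describes: raw CSV in a landing zone read with PySpark, lightly cleaned, and appended to a curated Delta path. The storage paths and column names are hypothetical.

```python
# Sketch: CSV landing zone -> cleaned Delta table. Paths and columns are
# placeholders; assumes a Spark runtime with Delta Lake available.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-ingest").getOrCreate()

raw = (spark.read.option("header", True)
       .csv("abfss://landing@account.dfs.core.windows.net/orders/"))

cleaned = (raw
    .withColumn("order_ts", F.to_timestamp("order_ts"))      # parse timestamps
    .withColumn("amount", F.col("amount").cast("double"))    # enforce numeric type
    .dropDuplicates(["order_id"]))                           # basic quality check

(cleaned.write.format("delta")
    .mode("append")
    .save("abfss://curated@account.dfs.core.windows.net/orders/"))
```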
Posted 3 days ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
Role Overview:
As a Software Engineer at Capgemini, you will work in the area of software engineering: the development, maintenance, and optimization of software solutions/applications. You will apply scientific methods to analyze and solve software engineering problems, and you will be responsible for developing and applying software engineering practice and knowledge. The role requires excellent analytical, problem-solving, and communication skills, as well as the ability to work independently and lead cross-functional teams. You will collaborate with other software engineers and stakeholders, contributing to a more sustainable and inclusive world through technology.

Key Responsibilities:
- Analyze and solve software engineering problems using scientific methods.
- Develop and apply software engineering practice and knowledge in research, design, development, and maintenance.
- Exercise original thought and judgment while supervising the technical and administrative work of other software engineers.
- Build skills and expertise in the software engineering discipline to meet standard expectations for the applicable role.
- Collaborate and act as a team player with other software engineers and stakeholders.

Qualifications Required:
- 5+ years of experience in BY Planning or related SCM platforms.
- Strong understanding of supply chain and inventory optimization.
- Hands-on experience with SQL, APIs, and scripting languages.
- Familiarity with ETL tools and cloud data platforms.
- Experience with Blue Yonder Cloud implementations (preferred).
- Certifications in BY Planning or related SCM tools (preferred).
- Exposure to Agile or hybrid delivery models (preferred).
Posted 4 days ago
4.0 - 8.0 years
0 - 1 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Hiring a Python Developer with 4+ years of experience and proven expertise in the BFSI sector. Must have strong skills in Python, Django/Flask, SQL, and APIs. Experience with data pipelines, ETL tools, and cloud is a plus. BFSI background is a must.
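For context, a minimal sketch of the kind of Flask-plus-SQL endpoint the posting implies; the table, columns, and SQLite datastore are hypothetical stand-ins for a real BFSI backend.

```python
# Sketch: a read-only account-balance endpoint. SQLite and the "accounts"
# table are placeholders for the real database behind the API.
import sqlite3
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/accounts/<int:account_id>/balance")
def balance(account_id: int):
    conn = sqlite3.connect("bank.db")  # stand-in for the production datastore
    row = conn.execute(
        "SELECT balance FROM accounts WHERE id = ?", (account_id,)
    ).fetchone()
    conn.close()
    if row is None:
        return jsonify(error="account not found"), 404
    return jsonify(account_id=account_id, balance=row[0])

if __name__ == "__main__":
    app.run(debug=True)
```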
Posted 4 days ago
2.0 - 6.0 years
0 Lacs
Jaipur, Rajasthan
On-site
As a Data Engineer at our company, you will design, develop, and maintain data pipelines and infrastructure that provide accurate, timely, and accessible data for our organization's growth and data-driven decision-making.

Key Responsibilities:
- Design, develop, and implement data pipelines using Azure Data Factory and Databricks for data ingestion, transformation, and movement.
- Develop and optimize ETL processes to facilitate efficient data flow and transformation.
- Maintain Azure Data Lake solutions for efficient storage and retrieval of large datasets.
- Use Azure Synapse Analytics to build scalable data warehousing solutions for advanced analytics and reporting.
- Integrate various data sources into MS Fabric, ensuring data consistency, quality, and accessibility.
- Optimize data processing workflows and storage solutions to enhance performance and reduce costs.
- Manage and optimize SQL and NoSQL databases for high-performance queries and data storage.
- Implement data quality checks and monitoring processes to ensure data accuracy and consistency.
- Work closely with data scientists, analysts, and stakeholders to understand data requirements and deliver actionable insights.
- Create and maintain comprehensive documentation for data processes, pipelines, infrastructure, architecture, and best practices.
- Identify and resolve issues in data pipelines, data lakes, and warehousing solutions, providing timely support and maintenance.

Qualifications:
- 2-4 years of experience in data engineering or a related field.

Technical Skills:
- Proficiency in Azure Data Factory, Azure Synapse Analytics, Databricks, and Azure Data Lake.
- Experience with Microsoft Fabric is a plus.
- Strong SQL skills and familiarity with data warehousing (DWH) concepts.
- Knowledge of data modeling, ETL processes, and data integration.
- Hands-on experience with ETL tools and frameworks such as Apache Airflow and Talend.
- Familiarity with big data technologies such as Hadoop and Spark.
- Experience with cloud platforms (AWS, Azure, Google Cloud) and associated data services.
- Familiarity with data visualization tools such as Power BI and programming languages such as Python, Java, or Scala.
- Experience with schema design and dimensional data modeling.

Analytical Skills:
- Strong problem-solving abilities and attention to detail.

Communication:
- Excellent verbal and written communication skills to explain technical concepts to non-technical stakeholders.

Education:
- Bachelor's degree in computer science, engineering, mathematics, or a related field. Advanced degrees or certifications are a plus.

Interested candidates can share their CV at sulabh.tailang@celebaltech.com.
Posted 4 days ago
4.0 - 8.0 years
0 Lacs
Bangalore, Karnataka
On-site
In this role as an Axiom Developer at EY, you will work as an individual contributor across multiple clients as part of the regulatory reporting implementation team. Your responsibilities will include:
- Facilitating and encouraging the conversations between stakeholders needed to determine requirements.
- Working independently with minimal supervision.
- Providing technical guidance to the team and the client.
- Developing and delivering training on new initiatives.
- Identifying process improvement areas.
- Actively participating in the selection of new regulatory tools and methodologies, and recommending and assisting in their implementation.

Qualifications required for this position include:
- 4-6 years of experience in AxiomSL regulatory reporting implementation.
- Expertise in Axiom components such as Data Source, Data Model, Portfolio, Aggregation, Shorthand, and Freeform/Tabular/Taxonomy Reporting.
- Experience in CV 10 is an added advantage.
- Experience with CV 9 to CV 10 migration is an added advantage.
- Good exposure to analytics and reporting tools such as Axiom, Vermeg, and OneSumX, plus various finance systems and databases.
- Strong SQL, advanced Excel, and analytical skills.
- A proactive approach to anticipating client needs, issues, and challenges, with strong communication skills.
- Experience with ETL tools, BI tools, JIRA, and Confluence is a plus.
- AxiomSL Controller View certification is a plus.
- Excellent communication, interpersonal, leadership, coaching, and conflict-resolution skills.
- Time- and project-management skills.
- Ability to analyze processes and information, identify problems and trends, and develop effective solutions and strategies.
- A self-starter able to solve problems creatively and deliver results in a dynamic, collaborative, and challenging environment.

At EY, you'll have the opportunity to build a career in a global organization that values inclusivity, technology, and personal growth. Join EY to contribute your unique voice and perspective toward building a better working world for all.
Posted 4 days ago
3.0 - 7.0 years
0 Lacs
Pune, Maharashtra
On-site
Job Description:
You will support complex, enterprise-grade data integration platforms, with primary expertise in Kong (a short Admin API sketch follows this posting). Your responsibilities will include:
- Data integration platform management and strategy
- Platform operations and optimization
- Architecture
- Documentation and standards

Hands-on experience with ETL tools is also required for this role.
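As a flavor of the hands-on Kong work involved, here is a hedged sketch that lists registered services via Kong's Admin API. The Admin API address assumes a default local deployment on port 8001; adjust for your environment.

```python
# Sketch: enumerate Kong services via the Admin API. The URL below assumes
# a default local gateway; production deployments differ.
import requests

ADMIN = "http://localhost:8001"  # assumed Kong Admin API address

services = requests.get(f"{ADMIN}/services", timeout=5).json()["data"]
for svc in services:
    print(svc["name"], "->", f'{svc["protocol"]}://{svc["host"]}:{svc["port"]}')
```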
Posted 4 days ago
13.0 - 17.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a Senior Business Analyst at Bank of America, you will deliver key strategic projects in the credit risk and regulatory risk space. You will engage with the trading desk, business sponsors, and quants to ensure successful project delivery, and your contributions will be crucial in building, maintaining, and supporting sophisticated risk and valuation systems for production rollout.

**Responsibilities:**
- Perform complex analytics support for valuations, front-office risk, and market risk.
- Enhance quantitative models in regulatory spaces such as FRTB, credit flow, and structured products.
- Collaborate with stakeholders to ensure collective success.
- Refine and translate requirements into initiatives with Global Credit Technology and BA teams.
- Gather, document, and translate requirements for the Business COO and Market Risk COOs.
- Translate business requirements into deliverables for PI planning with technology teams.
- Validate and implement pricing, analytics, and market-making models with quants and research teams.
- Conduct back-testing and validate regulatory distributions.
- Assist the Data Engineering team with data connectivity and ingestion from trading venues.
- Collaborate with the enterprise team on sensitivities modeling for FRTB and other regulatory priorities.

**Requirements:**
- Education: BE/B.Tech
- Certifications: CFA/FRM
- Experience range: 13-16 years

**Foundational Skills:**
- Strong domain knowledge of fixed-income products, including credit and rates products.
- Ability to develop relationships with Quant and Financial Engineering teams.
- Ability to troubleshoot and resolve complex issues to improve production stability.
- Strong problem-solving ability and excellent communication skills.
- Ability to collaborate with stakeholders and prioritize effectively.

**Desired Skills:**
- Certifications such as FRM, CFA, or CQF, with knowledge of quantitative models, preferred.
- Knowledge of SQL and ETL tools such as Power BI/Power Query, or BI tools such as 3Forge, preferable.

The position is based in Chennai, Mumbai, Hyderabad, or GIFT City. Working hours are from 11:00 AM to 8:00 PM. Join Bank of America for a great career with opportunities for growth, learning, and making a significant impact.
Posted 4 days ago
6.0 - 10.0 years
0 Lacs
Pune, Maharashtra
On-site
As an ETL Tester, you will validate data movement, transformations, and data quality throughout the ETL process. Strong SQL skills, analytical thinking, and practical experience with ETL tools such as Informatica, Talend, DataStage, or SSIS are essential, along with a good understanding of data warehouse concepts such as star schemas, snowflake schemas, and data modeling.

Key Responsibilities:
- Understand business requirements, source systems, and target data models.
- Design, develop, and execute test plans, test cases, and test scripts for ETL processes.
- Perform data validation, integrity checks, and reconciliation between source and target systems (a reconciliation sketch follows this posting).
- Validate ETL mappings, stored procedures, workflows, and business rules.
- Conduct functional, integration, regression, and performance testing for ETL jobs and data warehouses.
- Write and execute complex SQL queries for data verification.
- Automate ETL test processes where applicable.
- Identify, log, and track defects, and collaborate with developers and analysts on resolution.
- Provide detailed test results, defect reports, and sign-offs for releases.
- Collaborate with data engineers, business analysts, and QA teams to ensure high data quality and reliability.

Qualifications Required:
- 6-8 years of relevant experience in ETL testing.
- Strong proficiency in SQL.
- Hands-on experience with ETL tools such as Informatica, Talend, DataStage, or SSIS.
- Good knowledge of data warehouse concepts such as star schemas, snowflake schemas, and data modeling.

Please note that the job is full-time and requires in-person work at the specified location.
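A minimal sketch of the source-vs-target reconciliation this role automates, using pandas. The connection URLs and table names are placeholders, and the sketch assumes the relevant SQLAlchemy dialects (oracledb and snowflake-sqlalchemy here) are installed.

```python
# Sketch: compare a source table against its warehouse target. URLs, tables,
# and columns are placeholders for the real systems under test.
import pandas as pd
from sqlalchemy import create_engine

src = create_engine("oracle+oracledb://user:***@src-host/ORCL")
tgt = create_engine("snowflake://user:***@account/db/schema")

src_df = pd.read_sql("SELECT order_id, amount FROM orders", src)
tgt_df = pd.read_sql("SELECT order_id, amount FROM dw_orders", tgt)

# Row-count check, then a full outer join to surface differing rows.
assert len(src_df) == len(tgt_df), "row counts differ"
diff = src_df.merge(tgt_df, on="order_id", suffixes=("_src", "_tgt"), how="outer")
# Rows missing on either side show up as NaN and are flagged as mismatches too.
mismatches = diff[diff["amount_src"] != diff["amount_tgt"]]
print(f"{len(mismatches)} mismatched rows")
```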
Posted 5 days ago
5.0 - 10.0 years
20 - 35 Lacs
Pune, Bengaluru, Delhi/NCR
Work from Office
Develop software applications using .NET, C#, .NET Core, SQL, and Angular 9+/React. Expertise in writing SQL queries, stored procedures, and performance tuning. Unit-testing frameworks such as xUnit and NUnit. Exposure to CI/CD tools such as Azure DevOps/GitHub Actions.

Required Candidate Profile:
Familiarity with cloud concepts and services such as Azure Functions, Azure SQL, or App Services. Experience with Azure Data Factory (ADF) or other ETL tools. Understanding of REST APIs and data formats such as JSON/XML.
Posted 5 days ago
4.0 - 8.0 years
0 Lacs
Maharashtra
On-site
We are seeking a Senior Analyst - Marketing Data & Campaigns with a solid background in SQL and ETL tools to oversee end-to-end campaign execution. Your primary responsibilities will include building audience segments, managing campaign data workflows, and collaborating closely with marketing and analytics teams. Proficiency in Adobe Campaign (Classic or Standard) is essential for campaign execution tasks and integrations.

Your duties will involve:
- Managing campaign operations by handling audience extraction, suppression logic, and file delivery through SQL queries (a sketch of this pattern follows this posting).
- Using ETL tools (such as SSIS, Talend, or Informatica) to support campaign data flows and automate processes.
- Creating and scheduling data jobs to prepare campaign-ready lists from raw datasets (CRM, web, app, and product systems).
- Conducting campaign-related data quality assurance, deduplication, and validation tasks.
- Coordinating with campaign managers and business teams to collect campaign briefs and convert them into data execution plans.

In terms of campaign support, you will:
- Aid campaign execution using Adobe Campaign Classic or Standard.
- Assist with delivery configuration, dynamic content setup, and campaign tracking within Adobe Campaign.
- Synchronize campaign data between internal systems and Adobe Campaign through ETL pipelines or data transfer processes.

To succeed in this role, you should have:
- Over 4 years of practical experience crafting complex SQL queries for data extraction, transformation, and validation.
- Hands-on experience with ETL tools such as SSIS, Talend, or Informatica.
- An understanding of marketing campaign data structures, including audience lists, offers, and response tracking.
- Proficiency in Adobe Campaign (Classic or Standard) for support or integration purposes.

This is a full-time position that requires in-person work.
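Here is a hedged sketch of the audience-extraction-with-suppression pattern described above. The tables, columns, DSN, and delivery file are all hypothetical; real campaign workflows vary by stack.

```python
# Sketch: build a campaign-ready list, suppressing unsubscribes and bounces,
# then deliver it as a file. Schema and DSN are placeholders.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("mssql+pyodbc://user:***@crm_dsn")  # placeholder DSN

audience_sql = """
SELECT c.customer_id, c.email
FROM customers c
LEFT JOIN unsubscribes u ON u.customer_id = c.customer_id   -- suppression list
LEFT JOIN bounces b      ON b.customer_id = c.customer_id   -- suppression list
WHERE c.opted_in = 1
  AND u.customer_id IS NULL
  AND b.customer_id IS NULL
"""

audience = pd.read_sql(audience_sql, engine).drop_duplicates("email")
audience.to_csv("campaign_audience.csv", index=False)   # file-delivery step
```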
Posted 5 days ago
6.0 - 10.0 years
0 Lacs
Hyderabad, Telangana
On-site
You will be joining Capgemini Engineering, the world leader in engineering services, where a global team of engineers, scientists, and architects supports innovative companies in unleashing their potential. From autonomous cars to life-saving robots, our digital and software technology experts are known for out-of-the-box thinking and provide unique R&D and engineering services across all industries. A career with us offers endless opportunities to make a difference, and each day brings new challenges and excitement.

We are currently seeking a Trainer with 6 to 10 years of experience to train both experienced employees and freshers on data-related topics. The ideal candidate has:
- Hands-on experience with databases (Oracle or SQL Server) and data warehouses.
- Hands-on experience with ETL tools such as Informatica, Talend, or SSIS.
- Proficiency in a programming language such as Python or Java.
- Familiarity with cloud technologies such as Azure, AWS, or GCP.
- Experience with Hadoop or big data.
- Proficiency in data visualization tools such as Power BI.
- A minimum of 6 to 10 years of experience as a Trainer or Corporate Trainer.
- Strong communication and presentation skills.

The position is based in Pune, Mumbai, or Bangalore.

Capgemini is a global business and technology transformation partner, committed to helping organizations accelerate their digital and sustainable transformation while making a positive impact on enterprises and society. With a diverse team of over 340,000 members in more than 50 countries, Capgemini leverages its 55-year heritage to deliver end-to-end services and solutions that address a wide range of business needs. Our expertise spans strategy, design, engineering, AI, generative AI, cloud, and data, supported by deep industry knowledge and a strong partner ecosystem.
Posted 5 days ago
5.0 - 9.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Technical Product Manager - Integrations at Anchanto, you will drive the strategy, execution, and optimization of third-party integrations. The role requires a deep functional and technical understanding of integrations such as marketplaces, carriers, accounting, robotics, and automation. You will collaborate with engineering, business, and external partners to define the product roadmap, ensure smooth execution, and maintain high-quality documentation and training resources.

Key Responsibilities:
- Lead the functional and technical analysis of third-party integrations.
- Define and execute the integration roadmap.
- Collaborate with engineering teams to design robust APIs and middleware solutions.
- Work closely with partners and service providers, and evaluate and optimize integration performance based on key metrics.
- Establish success metrics for each integration and implement a continuous measurement framework.
- Define requirements for integrations and collaborate with stakeholders.
- Own and maintain integration documentation, and conduct internal training sessions.
- Manage integration projects using tools such as JIRA and Confluence.
- Support customer success teams and resolve integration issues.

Required Skills and Experience:
- 5+ years of experience in technical product management, API integrations, or SaaS-based platforms.
- Strong technical background, with expertise in APIs, webhooks, data mapping, and middleware solutions.
- Experience working with eCommerce marketplaces, logistics carriers, accounting systems, or automation tools.
- Proficiency in defining and executing integration roadmaps, setting up success metrics, and managing requirements.
- Proficiency in tools such as JIRA, Aha!, and Confluence; excellent communication and stakeholder-management skills.
- Hands-on experience with API testing tools, integration platforms, and data transformation processes (preferred).

Preferred Qualifications:
- Experience in eCommerce, logistics, supply chain, or B2B SaaS platforms.
- Knowledge of OAuth, REST, GraphQL, SAML, and messaging protocols.
- Familiarity with cloud platforms and microservices architecture.
- Certifications in Scrum/Agile methodologies and business process analysis.
- Expertise with APIs, XML, and JSON, plus hands-on experience with current technologies and approaches.
- Strong analytical skills, communication, collaboration, and a detail-oriented mindset; application development experience with various technologies is a plus.
Posted 5 days ago
4.0 - 10.0 years
0 Lacs
Pune, Maharashtra
On-site
At Solidatus, we are revolutionizing the way organizations understand their data. We are an award-winning, venture-backed software company, often described as the Git for metadata. Our platform enables businesses to extract, model, and visualize intricate data lineage flows. Through our unique lineage-first approach and active AI development, we offer organizations unparalleled clarity and robust control over their data's journey and significance. As a rapidly growing B2B SaaS business with fewer than 100 employees, your contributions will play a pivotal role in shaping our product. Renowned for our innovation and collaborative culture, we invite you to join us as we expand globally and redefine the future of data understanding.

We are looking for an experienced Data Pipeline Engineer/Data Lineage Engineer to support the development of data lineage solutions for our clients' existing data pipelines. You will collaborate with cross-functional teams to ensure the integrity, accuracy, and timeliness of the lineage solution, working directly with clients to maximize the value they derive from our product and to help them achieve their contractual objectives.

Experience:
- 4-10 years of relevant experience

Qualifications:
- Proven track record as a Data Engineer or in a similar capacity, with hands-on experience building and optimizing data pipelines and infrastructure.
- Demonstrated experience working with big data and related tools.
- Strong problem-solving and analytical skills for diagnosing and resolving complex data-related issues.
- Profound understanding of data engineering principles and practices.
- Exceptional communication and collaboration abilities for working in cross-functional teams and conveying technical concepts to non-technical stakeholders.
- Adaptability to new technologies, tools, and methodologies in a dynamic environment.
- Proficiency in writing clean, scalable, and robust code in Python or a similar language; a software engineering background is advantageous.

Desirable Languages/Tools:
- Programming languages such as Python, Java, Scala, or SQL for data manipulation and scripting.
- Experience with XML in transformation pipelines.
- Familiarity with major database technologies such as Oracle, Snowflake, and MS SQL Server.
- Strong grasp of data modeling concepts, including relational and dimensional modeling.
- Exposure to big data technologies and frameworks such as Databricks, Spark, Kafka, and MS Notebooks.
- Knowledge of modern data architectures such as the lakehouse.
- Experience with CI/CD pipelines and version control systems such as Git.
- Understanding of ETL tools such as Apache Airflow, Informatica, or SSIS (a minimal Airflow sketch follows this posting).
- Familiarity with data governance and best practices in data management.
- Proficiency with cloud platforms and services (AWS, Azure, or GCP) for deploying and managing data solutions.
- Proficiency in SQL for database management and querying.
- Exposure to tools such as OpenLineage, Apache Spark Streaming, and Kafka for real-time data streaming.
- Experience using data tools in at least one cloud service - AWS, Azure, or GCP.

Key Responsibilities:
- Implement robust data lineage solutions using Solidatus products to support business intelligence, analytics, and data governance initiatives.
- Collaborate with stakeholders to understand data lineage requirements and translate them into technical and business solutions.
- Develop and maintain lineage data models, semantic metadata systems, and data dictionaries.
- Ensure data quality, security, and compliance with relevant regulations.
- Uphold Solidatus implementation and data lineage modeling best practices at client sites.
- Stay current with emerging technologies and industry trends to continually enhance data lineage architecture practices.

Additional Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- Proven experience in data architecture, focusing on large-scale data systems across multiple companies.
- Proficiency in data modeling, database design, and data warehousing concepts.
- Experience with cloud platforms (e.g., AWS, Azure, GCP) and big data technologies (e.g., Hadoop, Spark).
- Strong understanding of data governance, data quality, and data security principles.
- Excellent communication and interpersonal skills to thrive in a collaborative environment.

Why Join Solidatus?
- Participate in an innovative company that is shaping the future of data management.
- Collaborate with a dynamic and talented team in a supportive work environment.
- Opportunities for professional growth and career advancement.
- Flexible working arrangements, including hybrid work options.
- Competitive compensation and benefits package.

If you are passionate about data architecture and eager to make a significant impact, we invite you to apply now and become a part of our team at Solidatus.
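As a point of reference for the Airflow experience mentioned above, here is a minimal DAG sketch (Airflow 2.4+ assumed) of the kind of pipeline this role instruments with lineage. The extract and load callables are placeholders for real pipeline steps.

```python
# Minimal Airflow DAG sketch: a daily extract -> load pipeline.
# Task bodies are placeholders for real source/warehouse logic.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull records from the source system")  # placeholder step

def load():
    print("write records to the warehouse")  # placeholder step

with DAG(
    dag_id="orders_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # declare the dependency
```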
Posted 6 days ago
2.0 - 10.0 years
0 Lacs
karnataka
On-site
As an experienced Reltio MDM professional, you will design, develop, and implement Master Data Management solutions on the Reltio platform. The role requires at least 2 years of experience specifically in Reltio MDM and 7-10 years of total relevant experience.

Key Responsibilities:
- Gather and analyze business requirements for Master Data Management.
- Design and configure Reltio MDM Cloud solutions.
- Build data models, match and merge rules, survivorship rules, hierarchies, and workflows in Reltio.
- Integrate Reltio with enterprise applications (a hedged REST sketch follows this posting).
- Implement data quality, governance, and stewardship processes.
- Manage user roles, security, and workflows.
- Collaborate with stakeholders and data stewards.
- Troubleshoot, provide production support, and prepare documentation and training materials.

Required Skills:
- Strong hands-on experience with Reltio MDM Cloud.
- Expertise in data modeling, governance, quality, and metadata management.
- Proficiency in REST APIs, JSON, and XML.
- Knowledge of ETL tools and integration frameworks.
- Strong SQL skills and familiarity with cloud platforms.
- Understanding of Agile methodology and DevOps processes.
- Excellent communication and stakeholder-management skills.

Preferred Skills:
- Reltio certification.
- Experience with other MDM tools.
- Knowledge of graph databases, big data platforms, or data lakes.

If you are looking to further your career as a Reltio MDM Developer/Lead and possess the required skills and experience, we encourage you to apply for this exciting opportunity in our organization.
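For flavor, a hedged sketch of a REST entity lookup against an MDM tenant. The base URL, tenant, entity ID, and response fields are hypothetical; consult the Reltio API documentation for the actual endpoint contract.

```python
# Hedged sketch: fetch one entity from an MDM REST API. Endpoint path,
# tenant, and token are hypothetical placeholders, not verified Reltio URLs.
import requests

BASE = "https://example.reltio.com/reltio/api/MY_TENANT"  # hypothetical
HEADERS = {"Authorization": "Bearer <access-token>"}       # placeholder token

resp = requests.get(f"{BASE}/entities/00AbCd", headers=HEADERS, timeout=10)
resp.raise_for_status()
entity = resp.json()
print(entity.get("type"), list(entity.get("attributes", {}).keys()))
```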
Posted 6 days ago
3.0 - 7.0 years
0 Lacs
maharashtra
On-site
You will be joining our team as a highly skilled Salesforce technical professional, responsible for managing and customizing our Salesforce systems. Your role will involve creating timelines, defining project scopes, and coordinating with team members to deliver solutions that align with our company's needs, as well as troubleshooting and resolving any technical issues that arise. The ideal candidate is a problem solver with a profound understanding of the Salesforce platform and the ability to communicate technical concepts to non-technical team members.

Responsibilities:
- Design, develop, test, document, and deploy high-quality business solutions on the Salesforce platform based on industry best practices and business requirements.
- Manage the process of implementing improvements and new functionality in the Salesforce application.
- Provide support and solutions for any break/fix issues that occur.
- Communicate and collaborate effectively with technical resources and stakeholders to keep all parties updated on status, technical issues, and creative solutions.

Required Skills:
- Proficiency in Salesforce.com development, including Apex classes, controllers and triggers, Visualforce, the Force.com IDE, the Migration Tool, and web services.
- Strong understanding of Salesforce.com best practices and functionality.
- Experience with object-oriented programming (OOP) concepts and patterns.
- A bachelor's degree in Computer Science, Information Technology, or a related field; Salesforce certification is highly preferred.

Preferred Skills:
- Familiarity with the Salesforce Lightning Design System.
- Experience with the Salesforce AppExchange and third-party integrations.
- Knowledge of Agile Scrum methodologies.
- Experience with Salesforce Sales Cloud and Service Cloud modules.
- Understanding of database design and data management.
- Knowledge of JavaScript, HTML5, and CSS3.
- Experience with Salesforce Communities.
- Salesforce Certified Developer or Advanced Developer status.
- Familiarity with Salesforce DX.
- Experience with data migration and ETL tools.
Posted 6 days ago
5.0 - 9.0 years
0 Lacs
haryana
On-site
You will be part of a dynamic team at SpectraMedix, committed to driving innovation and reshaping the healthcare industry. As a Senior Java Full-Stack Developer, you will collaborate closely with lead developers, architects, and data/business analysts to implement, test, and document as required. You will also engage directly with clients, presenting and explaining technical configurations to ensure the SpectraMedix product exceeds customer expectations.

As a key member of the team, you will develop code based on the Sprint scope or project plan, following the high-level design document. Adhering to security and coding standards, you will create unit and integration test scripts using the test-driven development (TDD) approach. Your expertise in Core Java, OOPS, Spring MVC, Spring Security, JavaScript, Angular, React, and Tomcat will be essential in delivering high-quality outputs in design, code, and unit testing.

Requirements:
- 5-7 years of experience in a relevant field and a Bachelor's/Master's degree in engineering or a technical field.
- Proficiency in Core Java and back-end frameworks such as Spring and Hibernate.
- Proficiency in front-end frameworks such as JavaScript, Angular, and React.
- Experience with RESTful web services using JAX-RS and JAX-WS.
- Knowledge of ETL tools, big data technologies, Elasticsearch, and SQL Server, plus deployment on Tomcat Server, is advantageous.

Non-technical competencies are equally essential:
- Experience working with US-based clients in an onsite/offshore delivery model.
- Strong verbal and written communication, technical articulation, listening, and presentation skills.
- Demonstrated expertise in problem-solving, time management, and stakeholder management, with the ability to work under tight deadlines within a matrix organizational structure.

If you are a quick learner, self-starter, proactive, and an effective team player with the required technical and non-technical competencies, we invite you to apply by emailing your resume to shailesh.pathak@spectramedix.com. Join us in creating a culture of belonging and mutual respect while contributing to meaningful change in the healthcare industry.
Posted 6 days ago
6.0 - 10.0 years
0 Lacs
noida, uttar pradesh
On-site
At Aristocrat, we aim to bring happiness through play. We are currently seeking a dedicated Technical Lead - Database QA to join our Global Data team.

As a Technical Lead - Database QA, you will play a critical role in maintaining data system performance, spearheading quality assurance for our data management processes. You will collaborate closely with a team of data stewards, architects, and analysts to ensure the integrity and accuracy of our data. This role puts you at the forefront of data quality assurance, applying new technologies to deliver dependable solutions.

Key Responsibilities:
- Conduct comprehensive analysis and profiling of master data sets to identify inconsistencies, inaccuracies, and anomalies.
- Work with data stewards, data architects, and business analysts to define and understand Master Data Management (MDM) requirements.
- Lead the design and execution of test strategies for Extract, Transform, Load (ETL) pipelines, data transformations, and BI dashboards.
- Analyze large data sets and conduct both manual and automated testing for data ingestion pipelines, data transformations, and business reports.
- Validate Power BI dashboards and reports for data correctness, layout integrity, and performance.
- Automate data validations using SQL and Python, employing tools such as PyTest, dbt tests, or custom scripts (a PyTest sketch follows this posting).

Requirements:
- At least 6 years of experience in QA roles with a focus on Master Data Management.
- Proven experience crafting detailed test plans and test cases, and conducting functional, non-functional, and regression testing to ensure data quality.
- Proficiency in SQL for querying and validating large datasets.
- Experience testing BI reports and dashboards.
- Familiarity with test automation frameworks and ETL tools.

Join our team at Aristocrat and help us deliver outstanding performance and happiness to millions of users worldwide. We offer a robust benefits package, global career opportunities, and a work environment that values individual differences and encourages the realization of potential. Aristocrat is committed to creating a diverse and inclusive workplace where all employees have the opportunity to thrive.

Please note that, depending on the nature of your role, you may be required to register with the Nevada Gaming Control Board (NGCB) and/or other gaming jurisdictions. We are unable to sponsor work visas for this position at this time; candidates must be authorized to work in the job posting location on a full-time basis without the need for current or future visa sponsorship.
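A minimal sketch of the PyTest-style data validations mentioned above. The engine URL and table are placeholders; the pattern is simply "load once per module, assert data-quality rules as tests".

```python
# Sketch: PyTest data-quality checks against a warehouse table.
# The connection URL and dim_product table are placeholders.
import pandas as pd
import pytest
from sqlalchemy import create_engine

@pytest.fixture(scope="module")
def dim_product():
    engine = create_engine("snowflake://user:***@account/db/schema")
    return pd.read_sql("SELECT product_id, price FROM dim_product", engine)

def test_no_duplicate_keys(dim_product):
    # Surrogate keys must be unique in a conformed dimension.
    assert dim_product["product_id"].is_unique

def test_prices_are_positive(dim_product):
    # Business rule: no zero or negative prices should survive the load.
    assert (dim_product["price"] > 0).all()
```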
Posted 6 days ago
7.0 - 11.0 years
0 Lacs
pune, maharashtra
On-site
As a Snowflake ETL Expert at Fiserv, you will design, develop, and maintain ETL processes that support data integration and analytics on the Snowflake platform. You will work closely with data architects, analysts, and other stakeholders to ensure data is processed and stored accurately and efficiently.

Key Responsibilities:
- Design and develop ETL processes covering the extraction, transformation, and loading of data from diverse sources into Snowflake (an incremental-load sketch follows this posting).
- Integrate data from multiple sources while ensuring its quality and consistency.
- Optimize ETL processes for efficiency and scalability.
- Create and maintain data models that support analytical insights and reporting.
- Collaborate with data architects, analysts, and stakeholders to understand data requirements and deliver effective solutions.
- Document ETL processes, data models, and other relevant information.

Requirements:
- At least 7 years of overall IT experience, including a minimum of 3 years of Snowflake data engineering experience.
- Proficiency in writing complex SQL queries, stored procedures, and analytical functions, plus Python scripting experience.
- Strong understanding of data warehousing, ETL concepts, and best practices.
- Familiarity with ETL tools such as Informatica or Talend is advantageous.
- Strong database fundamentals: entity-relationship modeling, data modeling, and DDL/DML statements.
- A Bachelor's degree (relevant work experience may be considered as a substitute).
- Excellent analytical and problem-solving skills, effective communication for collaborating with stakeholders and documenting processes, and the ability to work both independently and as part of a team.

Preferred Qualifications:
- Snowflake SnowPro certification.
- Experience with code-versioning tools such as GitHub.
- Proficiency in AWS/Azure cloud services.
- Exposure to an Agile development environment.
- Domain knowledge in cards, banking, or financial services.
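A minimal sketch of the incremental transformation step such a role builds: merging a staged delta into a core table. Table names and credentials are hypothetical placeholders.

```python
# Sketch: upsert a staged delta into a core Snowflake table via MERGE.
# Account, warehouse, and table names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    user="etl_user", password="***", account="myorg-myaccount",
    warehouse="ETL_WH", database="ANALYTICS", schema="CORE",
)
conn.cursor().execute("""
    MERGE INTO core.customers t          -- hypothetical target
    USING staging.customers_delta s      -- hypothetical staged delta
      ON t.customer_id = s.customer_id
    WHEN MATCHED THEN UPDATE SET t.email = s.email, t.updated_at = s.updated_at
    WHEN NOT MATCHED THEN
      INSERT (customer_id, email, updated_at)
      VALUES (s.customer_id, s.email, s.updated_at)
""")
conn.close()
```

MERGE keeps the load idempotent: re-running the same delta updates rather than duplicates existing customers.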
Posted 6 days ago