4.0 - 6.0 years
4 - 8 Lacs
bengaluru
Work from Office
Key Responsibilities: - Minimum 3 years of hands-on experience with Snowflake. - Design, develop, and maintain integration workflows using Informatica CAI/CDI within IDMC. - Configure and manage Informatica MDM for data consolidation, cleansing, and governance. - Translate business and technical requirements into robust, scalable integration solutions. - Collaborate with analysts, architects, and business stakeholders to ensure delivery meets expectations. - Monitor and troubleshoot integration processes and performance. - Support deployment activities and CI/CD processes. - Maintain documentation of integration designs and data flows. Essential Skills & Experience: - 4+ years of experience in Informatica development (Cloud and/or On-Prem). - Strong experience with CAI (Cloud Application Integration) and CDI (Cloud Data Integration). - Experience configuring and supporting Informatica MDM, including data models, match/merge rules, and hierarchies. - Solid understanding of REST/SOAP APIs, event-driven architecture, and message queues. - Hands-on experience with the IDMC platform and cloud-native integration patterns. - Proficient in SQL and data manipulation techniques. - Experience in data governance, data quality, and master data best practices. - Experience with CI/CD pipelines for Informatica using Git, Jenkins, or similar tools. - Knowledge of cloud platforms (Azure, AWS, or GCP). - Exposure to data warehousing, data lakes, or real-time integration. - Familiarity with Agile/Scrum delivery methods.
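For candidates who want a concrete picture of the Snowflake work mentioned above, here is a minimal, purely illustrative sketch of running a data-quality check through the Snowflake Python connector; the account, credentials, and table names are placeholders, not details from this posting.

```python
# Illustrative sketch: connect to Snowflake and run a simple data-quality check.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_org-my_account",   # placeholder account identifier
    user="INTEGRATION_SVC",        # placeholder service user
    password="***",                # use a secrets manager in practice
    warehouse="INTEGRATION_WH",
    database="MDM_DB",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # Count staged customer records that are still missing a master key
    cur.execute("SELECT COUNT(*) FROM customer_stage WHERE master_id IS NULL")
    missing = cur.fetchone()[0]
    print(f"Records missing a master_id: {missing}")
finally:
    conn.close()
```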
Posted 4 days ago
2.0 - 6.0 years
0 Lacs
karnataka
On-site
Role Overview: At Infoblox, you have the opportunity to join as a BI Engineer in the IT Data & Analytics team based in Bengaluru, India. As a BI Engineer, you will play a crucial role in creating insightful dashboards to support various business functions in their day-to-day operations and long-term strategies. You will work in a dynamic team that has implemented an in-house data lake system and will be involved in the transformative journey of upgrading systems to be AI-ready. Key Responsibilities: - Architect complex and scalable Tableau dashboards and visualizations aligned with business goals and KPIs - Design and implement robust data models, including complex joins, LOD expressions, and calculated fields for deep analytics - Mentor junior developers, conduct code reviews, and establish best practices for Tableau development - Act as a liaison between business units and IT, translating business requirements into technical solutions - Optimize dashboard performance through query tuning, extract strategies, and efficient data source design - Ensure data accuracy, consistency, and compliance with organizational data governance policies - Stay updated with Tableau's latest features and BI trends, proactively suggesting improvements to existing solutions Qualifications Required: - 2-5 years of experience in business intelligence - Proficiency in Tableau or similar BI reporting tools for creating insightful dashboards - Strong SQL writing skills and knowledge of cloud-hosted databases - Hands-on experience with data lake and data warehouse concepts - Experience with cloud infrastructure like AWS, Azure, etc. - Technical knowledge of ETL flows and functional flows such as marketing, sales, lead to order, or order to cash (bonus) - Bachelor's degree in engineering preferred Additional Details: Infoblox is recognized as a great place to work, with awards such as Glassdoor Best Places to Work, Great Place to Work-Certified in five countries, and Cigna Healthy Workforce honors. The company focuses on empowering its employees and building world-class technology, making remarkable careers possible when talented individuals meet first-class technology. In a community that values inclusion and rewards bold ideas, curiosity, and creativity, continuous learning is encouraged. The company offers comprehensive health coverage, generous PTO, flexible work options, learning opportunities, career-mobility programs, leadership workshops, volunteer hours, global employee resource groups, modern offices, culture celebrations, and more. If you are ready to make a difference and thrive in a collaborative and innovative environment, Infoblox invites you to join as a BI Engineer and be part of a team that is shaping the future of networking and security solutions.
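As a rough illustration of the kind of SQL-driven preparation that typically feeds a Tableau KPI dashboard, the sketch below pre-aggregates a monthly revenue table with pandas and SQLAlchemy; the connection string, table, and column names are invented for the example and are not from this posting.

```python
# Illustrative only: stage a pre-aggregated KPI table for a BI dashboard.
import pandas as pd
from sqlalchemy import create_engine

# Any SQLAlchemy-supported warehouse URL would work here; this DSN is a placeholder.
engine = create_engine("postgresql+psycopg2://user:pass@warehouse-host/analytics")

kpi_sql = """
    SELECT region,
           DATE_TRUNC('month', order_date) AS order_month,
           SUM(net_amount)                 AS revenue,
           COUNT(DISTINCT order_id)        AS orders
    FROM   sales_orders
    GROUP  BY 1, 2
"""

kpi = pd.read_sql(kpi_sql, engine)
# Persist as an extract-friendly file a dashboard data source can point at
kpi.to_csv("monthly_revenue_kpi.csv", index=False)
```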
Posted 4 days ago
4.0 - 6.0 years
4 - 8 Lacs
mumbai
Work from Office
Key Responsibilities: - Minimum 3 years of hands-on experience with Snowflake. - Design, develop, and maintain integration workflows using Informatica CAI/CDI within IDMC. - Configure and manage Informatica MDM for data consolidation, cleansing, and governance. - Translate business and technical requirements into robust, scalable integration solutions. - Collaborate with analysts, architects, and business stakeholders to ensure delivery meets expectations. - Monitor and troubleshoot integration processes and performance. - Support deployment activities and CI/CD processes. - Maintain documentation of integration designs and data flows. Essential Skills & Experience: - 4+ years of experience in Informatica development (Cloud and/or On-Prem). - Strong experience with CAI (Cloud Application Integration) and CDI (Cloud Data Integration). - Experience configuring and supporting Informatica MDM, including data models, match/merge rules, and hierarchies. - Solid understanding of REST/SOAP APIs, event-driven architecture, and message queues. - Hands-on experience with the IDMC platform and cloud-native integration patterns. - Proficient in SQL and data manipulation techniques. - Experience in data governance, data quality, and master data best practices. - Experience with CI/CD pipelines for Informatica using Git, Jenkins, or similar tools. - Knowledge of cloud platforms (Azure, AWS, or GCP). - Exposure to data warehousing, data lakes, or real-time integration. - Familiarity with Agile/Scrum delivery methods.
Posted 5 days ago
6.0 - 10.0 years
5 - 9 Lacs
bengaluru
Work from Office
The Azure Data Bricks Engineer plays a critical role in establishing and maintaining an efficient data ecosystem within an organization. This position is integral to the development of data solutions leveraging the capabilities of Microsoft Azure Data Bricks. The engineer will work closely with data scientists and analytics teams to facilitate the transformation of raw data into actionable insights. With increasing reliance on big data technologies and cloud-based solutions, having an expert on board is vital for driving data-driven decision-making processes. The Azure Data Bricks Engineer will also be responsible for optimizing data workflows, ensuring data quality, and deploying scalable data solutions that align with organizational goals. This role requires not only technical expertise in handling large volumes of data but also the ability to collaborate across various functional teams to enhance operational efficiency. - Design and implement scalable data pipelines using Azure Data Bricks. - Develop ETL processes to efficiently extract, transform, and load data. - Collaborate with data scientists and analysts to define and refine data requirements. - Optimize Spark jobs for performance and efficiency. - Monitor and troubleshoot production workflows and jobs. - Implement data quality checks and validation processes. - Create and maintain technical documentation related to data architecture. - Conduct code reviews to ensure best practices are followed. - Work on integrating data from various sources including databases, APIs, and third-party services. - Utilize SQL and Python for data manipulation and analysis. - Collaborate with DevOps teams to deploy and maintain data solutions. - Stay updated with the latest trends and updates in Azure Data Bricks and related technologies. - Facilitate data visualization initiatives for better data-driven insights. - Provide training and support to team members on data tools and practices. - Participate in cross-functional projects to enhance data sharing and access. - Bachelor's degree in Computer Science, Information Technology, or a related field. - Minimum of 6 years of experience in data engineering or a related domain. - Strong expertise in Azure Data Bricks and data lake concepts. - Proficiency with SQL, Python, and Spark. - Solid understanding of data warehousing concepts. - Experience with ETL tools and frameworks. - Familiarity with cloud platforms such as Azure, AWS, or Google Cloud. - Excellent problem-solving and analytical skills. - Ability to work collaboratively in a diverse team environment. - Experience with data visualization tools such as Power BI or Tableau. - Strong communication skills with the ability to convey technical concepts to non-technical stakeholders. - Knowledge of data governance and data quality best practices. - Hands-on experience with big data technologies and frameworks. - A relevant certification in Azure is a plus. - Ability to adapt to changing technologies and evolving business requirements.
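To make the pipeline work described above concrete, here is a minimal Databricks-style PySpark sketch that reads raw files from a data lake, applies basic cleansing, and writes a Delta table; the storage paths and column names are illustrative placeholders.

```python
# Minimal bronze-to-silver pipeline sketch for a Databricks environment.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_bronze_to_silver").getOrCreate()

raw = (
    spark.read
    .option("header", "true")
    .csv("abfss://raw@examplelake.dfs.core.windows.net/orders/")  # placeholder path
)

cleaned = (
    raw.dropDuplicates(["order_id"])                       # basic data-quality rules
       .filter(F.col("order_date").isNotNull())
       .withColumn("net_amount", F.col("net_amount").cast("double"))
)

(
    cleaned.write
    .format("delta")
    .mode("overwrite")
    .save("abfss://curated@examplelake.dfs.core.windows.net/orders_silver/")
)
```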
Posted 5 days ago
6.0 - 10.0 years
9 - 13 Lacs
mumbai
Work from Office
The Azure Data Bricks Engineer plays a critical role in establishing and maintaining an efficient data ecosystem within an organization. This position is integral to the development of data solutions leveraging the capabilities of Microsoft Azure Data Bricks. The engineer will work closely with data scientists and analytics teams to facilitate the transformation of raw data into actionable insights. With increasing reliance on big data technologies and cloud-based solutions, having an expert on board is vital for driving data-driven decision-making processes. The Azure Data Bricks Engineer will also be responsible for optimizing data workflows, ensuring data quality, and deploying scalable data solutions that align with organizational goals. This role requires not only technical expertise in handling large volumes of data but also the ability to collaborate across various functional teams to enhance operational efficiency. - Design and implement scalable data pipelines using Azure Data Bricks. - Develop ETL processes to efficiently extract, transform, and load data. - Collaborate with data scientists and analysts to define and refine data requirements. - Optimize Spark jobs for performance and efficiency. - Monitor and troubleshoot production workflows and jobs. - Implement data quality checks and validation processes. - Create and maintain technical documentation related to data architecture. - Conduct code reviews to ensure best practices are followed. - Work on integrating data from various sources including databases, APIs, and third-party services. - Utilize SQL and Python for data manipulation and analysis. - Collaborate with DevOps teams to deploy and maintain data solutions. - Stay updated with the latest trends and updates in Azure Data Bricks and related technologies. - Facilitate data visualization initiatives for better data-driven insights. - Provide training and support to team members on data tools and practices. - Participate in cross-functional projects to enhance data sharing and access. - Bachelor's degree in Computer Science, Information Technology, or a related field. - Minimum of 6 years of experience in data engineering or a related domain. - Strong expertise in Azure Data Bricks and data lake concepts. - Proficiency with SQL, Python, and Spark. - Solid understanding of data warehousing concepts. - Experience with ETL tools and frameworks. - Familiarity with cloud platforms such as Azure, AWS, or Google Cloud. - Excellent problem-solving and analytical skills. - Ability to work collaboratively in a diverse team environment. - Experience with data visualization tools such as Power BI or Tableau. - Strong communication skills with the ability to convey technical concepts to non-technical stakeholders. - Knowledge of data governance and data quality best practices. - Hands-on experience with big data technologies and frameworks. - A relevant certification in Azure is a plus. - Ability to adapt to changing technologies and evolving business requirements.
Posted 5 days ago
4.0 - 6.0 years
4 - 8 Lacs
hyderabad
Work from Office
Key Responsibilities: - Minimum 3 years of hands-on experience with Snowflake. - Design, develop, and maintain integration workflows using Informatica CAI/CDI within IDMC. - Configure and manage Informatica MDM for data consolidation, cleansing, and governance. - Translate business and technical requirements into robust, scalable integration solutions. - Collaborate with analysts, architects, and business stakeholders to ensure delivery meets expectations. - Monitor and troubleshoot integration processes and performance. - Support deployment activities and CI/CD processes. - Maintain documentation of integration designs and data flows. Essential Skills & Experience: - 4+ years of experience in Informatica development (Cloud and/or On-Prem). - Strong experience with CAI (Cloud Application Integration) and CDI (Cloud Data Integration). - Experience configuring and supporting Informatica MDM, including data models, match/merge rules, and hierarchies. - Solid understanding of REST/SOAP APIs, event-driven architecture, and message queues. - Hands-on experience with the IDMC platform and cloud-native integration patterns. - Proficient in SQL and data manipulation techniques. - Experience in data governance, data quality, and master data best practices. - Experience with CI/CD pipelines for Informatica using Git, Jenkins, or similar tools. - Knowledge of cloud platforms (Azure, AWS, or GCP). - Exposure to data warehousing, data lakes, or real-time integration. - Familiarity with Agile/Scrum delivery methods.
Posted 5 days ago
6.0 - 10.0 years
5 - 9 Lacs
hyderabad
Work from Office
The Azure Data Bricks Engineer plays a critical role in establishing and maintaining an efficient data ecosystem within an organization. This position is integral to the development of data solutions leveraging the capabilities of Microsoft Azure Data Bricks. The engineer will work closely with data scientists and analytics teams to facilitate the transformation of raw data into actionable insights. With increasing reliance on big data technologies and cloud-based solutions, having an expert on board is vital for driving data-driven decision-making processes. The Azure Data Bricks Engineer will also be responsible for optimizing data workflows, ensuring data quality, and deploying scalable data solutions that align with organizational goals. This role requires not only technical expertise in handling large volumes of data but also the ability to collaborate across various functional teams to enhance operational efficiency. - Design and implement scalable data pipelines using Azure Data Bricks. - Develop ETL processes to efficiently extract, transform, and load data. - Collaborate with data scientists and analysts to define and refine data requirements. - Optimize Spark jobs for performance and efficiency. - Monitor and troubleshoot production workflows and jobs. - Implement data quality checks and validation processes. - Create and maintain technical documentation related to data architecture. - Conduct code reviews to ensure best practices are followed. - Work on integrating data from various sources including databases, APIs, and third-party services. - Utilize SQL and Python for data manipulation and analysis. - Collaborate with DevOps teams to deploy and maintain data solutions. - Stay updated with the latest trends and updates in Azure Data Bricks and related technologies. - Facilitate data visualization initiatives for better data-driven insights. - Provide training and support to team members on data tools and practices. - Participate in cross-functional projects to enhance data sharing and access. - Bachelor's degree in Computer Science, Information Technology, or a related field. - Minimum of 6 years of experience in data engineering or a related domain. - Strong expertise in Azure Data Bricks and data lake concepts. - Proficiency with SQL, Python, and Spark. - Solid understanding of data warehousing concepts. - Experience with ETL tools and frameworks. - Familiarity with cloud platforms such as Azure, AWS, or Google Cloud. - Excellent problem-solving and analytical skills. - Ability to work collaboratively in a diverse team environment. - Experience with data visualization tools such as Power BI or Tableau. - Strong communication skills with the ability to convey technical concepts to non-technical stakeholders. - Knowledge of data governance and data quality best practices. - Hands-on experience with big data technologies and frameworks. - A relevant certification in Azure is a plus. - Ability to adapt to changing technologies and evolving business requirements.
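The role above calls out Spark job optimization; as one illustrative example of that kind of tuning, the sketch below broadcasts a small dimension table to avoid shuffling a large fact table and repartitions before writing. The table paths, partition count, and column names are assumptions for the example only.

```python
# Illustrative tuning of a join between a large fact table and a small dimension table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("spark_join_tuning").getOrCreate()

facts = spark.read.format("delta").load("/mnt/lake/silver/transactions")  # large table
dims = spark.read.format("delta").load("/mnt/lake/silver/stores")         # small table

# Broadcasting the small side avoids a full shuffle of the large fact table
joined = facts.join(F.broadcast(dims), on="store_id", how="left")

# Repartition by the column downstream jobs filter on, then write the result
(
    joined.repartition(200, "transaction_date")
          .write.format("delta")
          .mode("overwrite")
          .save("/mnt/lake/gold/transactions_enriched")
)
```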
Posted 5 days ago
4.0 - 6.0 years
4 - 8 Lacs
noida
Work from Office
Key Responsibilities: - Minimum 3 years of hands-on experience with Snowflake. - Design, develop, and maintain integration workflows using Informatica CAI/CDI within IDMC. - Configure and manage Informatica MDM for data consolidation, cleansing, and governance. - Translate business and technical requirements into robust, scalable integration solutions. - Collaborate with analysts, architects, and business stakeholders to ensure delivery meets expectations. - Monitor and troubleshoot integration processes and performance. - Support deployment activities and CI/CD processes. - Maintain documentation of integration designs and data flows. Essential Skills & Experience: - 4+ years of experience in Informatica development (Cloud and/or On-Prem). - Strong experience with CAI (Cloud Application Integration) and CDI (Cloud Data Integration). - Experience configuring and supporting Informatica MDM, including data models, match/merge rules, and hierarchies. - Solid understanding of REST/SOAP APIs, event-driven architecture, and message queues. - Hands-on experience with the IDMC platform and cloud-native integration patterns. - Proficient in SQL and data manipulation techniques. - Experience in data governance, data quality, and master data best practices. - Experience with CI/CD pipelines for Informatica using Git, Jenkins, or similar tools. - Knowledge of cloud platforms (Azure, AWS, or GCP). - Exposure to data warehousing, data lakes, or real-time integration. - Familiarity with Agile/Scrum delivery methods.
Posted 6 days ago
6.0 - 10.0 years
5 - 9 Lacs
noida
Work from Office
The Azure Data Bricks Engineer plays a critical role in establishing and maintaining an efficient data ecosystem within an organization. This position is integral to the development of data solutions leveraging the capabilities of Microsoft Azure Data Bricks. The engineer will work closely with data scientists and analytics teams to facilitate the transformation of raw data into actionable insights. With increasing reliance on big data technologies and cloud-based solutions, having an expert on board is vital for driving data-driven decision-making processes. The Azure Data Bricks Engineer will also be responsible for optimizing data workflows, ensuring data quality, and deploying scalable data solutions that align with organizational goals. This role requires not only technical expertise in handling large volumes of data but also the ability to collaborate across various functional teams to enhance operational efficiency. - Design and implement scalable data pipelines using Azure Data Bricks. - Develop ETL processes to efficiently extract, transform, and load data. - Collaborate with data scientists and analysts to define and refine data requirements. - Optimize Spark jobs for performance and efficiency. - Monitor and troubleshoot production workflows and jobs. - Implement data quality checks and validation processes. - Create and maintain technical documentation related to data architecture. - Conduct code reviews to ensure best practices are followed. - Work on integrating data from various sources including databases, APIs, and third-party services. - Utilize SQL and Python for data manipulation and analysis. - Collaborate with DevOps teams to deploy and maintain data solutions. - Stay updated with the latest trends and updates in Azure Data Bricks and related technologies. - Facilitate data visualization initiatives for better data-driven insights. - Provide training and support to team members on data tools and practices. - Participate in cross-functional projects to enhance data sharing and access. - Bachelor's degree in Computer Science, Information Technology, or a related field. - Minimum of 6 years of experience in data engineering or a related domain. - Strong expertise in Azure Data Bricks and data lake concepts. - Proficiency with SQL, Python, and Spark. - Solid understanding of data warehousing concepts. - Experience with ETL tools and frameworks. - Familiarity with cloud platforms such as Azure, AWS, or Google Cloud. - Excellent problem-solving and analytical skills. - Ability to work collaboratively in a diverse team environment. - Experience with data visualization tools such as Power BI or Tableau. - Strong communication skills with the ability to convey technical concepts to non-technical stakeholders. - Knowledge of data governance and data quality best practices. - Hands-on experience with big data technologies and frameworks. - A relevant certification in Azure is a plus. - Ability to adapt to changing technologies and evolving business requirements.
Posted 6 days ago
6.0 - 10.0 years
5 - 9 Lacs
gurugram
Work from Office
The Azure Data Bricks Engineer plays a critical role in establishing and maintaining an efficient data ecosystem within an organization. This position is integral to the development of data solutions leveraging the capabilities of Microsoft Azure Data Bricks. The engineer will work closely with data scientists and analytics teams to facilitate the transformation of raw data into actionable insights. With increasing reliance on big data technologies and cloud-based solutions, having an expert on board is vital for driving data-driven decision-making processes. The Azure Data Bricks Engineer will also be responsible for optimizing data workflows, ensuring data quality, and deploying scalable data solutions that align with organizational goals. This role requires not only technical expertise in handling large volumes of data but also the ability to collaborate across various functional teams to enhance operational efficiency. - Design and implement scalable data pipelines using Azure Data Bricks. - Develop ETL processes to efficiently extract, transform, and load data. - Collaborate with data scientists and analysts to define and refine data requirements. - Optimize Spark jobs for performance and efficiency. - Monitor and troubleshoot production workflows and jobs. - Implement data quality checks and validation processes. - Create and maintain technical documentation related to data architecture. - Conduct code reviews to ensure best practices are followed. - Work on integrating data from various sources including databases, APIs, and third-party services. - Utilize SQL and Python for data manipulation and analysis. - Collaborate with DevOps teams to deploy and maintain data solutions. - Stay updated with the latest trends and updates in Azure Data Bricks and related technologies. - Facilitate data visualization initiatives for better data-driven insights. - Provide training and support to team members on data tools and practices. - Participate in cross-functional projects to enhance data sharing and access. - Bachelor's degree in Computer Science, Information Technology, or a related field. - Minimum of 6 years of experience in data engineering or a related domain. - Strong expertise in Azure Data Bricks and data lake concepts. - Proficiency with SQL, Python, and Spark. - Solid understanding of data warehousing concepts. - Experience with ETL tools and frameworks. - Familiarity with cloud platforms such as Azure, AWS, or Google Cloud. - Excellent problem-solving and analytical skills. - Ability to work collaboratively in a diverse team environment. - Experience with data visualization tools such as Power BI or Tableau. - Strong communication skills with the ability to convey technical concepts to non-technical stakeholders. - Knowledge of data governance and data quality best practices. - Hands-on experience with big data technologies and frameworks. - A relevant certification in Azure is a plus. - Ability to adapt to changing technologies and evolving business requirements.
Posted 6 days ago
10.0 - 14.0 years
0 Lacs
maharashtra
On-site
As a Data Ops Capability Deployment Analyst at Citi, you will be a seasoned professional contributing to the development of new solutions and techniques for the Enterprise Data function. Your role involves performing data analytics and analysis across various asset classes, as well as building data science capabilities within the team. You will collaborate closely with the wider Enterprise Data team to deliver on business priorities. Working within the B & I Data Capabilities team, you will be responsible for managing the Data quality/Metrics/Controls program and implementing improved data governance and management practices. This program focuses on enhancing Citi's approach to data risk and meeting regulatory commitments in this area. Key Responsibilities: - Hands-on experience with data engineering and a strong understanding of Distributed Data platforms and Cloud services. - Knowledge of data architecture and integration with enterprise applications. - Research and assess new data technologies and self-service data platforms. - Collaboration with Enterprise Architecture Team on refining overall data strategy. - Addressing performance bottlenecks, designing batch orchestrations, and delivering Reporting capabilities. - Performing complex data analytics on large datasets including data cleansing, transformation, joins, and aggregation. - Building analytics dashboards and data science capabilities for Enterprise Data platforms. - Communicating findings and proposing solutions to stakeholders. - Translating business requirements into technical design documents. - Collaboration with cross-functional teams for testing and implementation. - Understanding of banking industry requirements. - Other duties and functions as assigned. Skills & Qualifications: - 10+ years of development experience in Financial Services or Finance IT. - Experience with Data Quality/Data Tracing/Data Lineage/Metadata Management Tools. - Hands-on experience with ETL using PySpark, data ingestion, Spark optimization, and batch orchestration. - Proficiency in Hive, HDFS, Airflow, and job scheduling. - Strong programming skills in Python with data manipulation and analysis libraries. - Proficient in writing complex SQL/Stored Procedures. - Experience with DevOps tools, Jenkins/Lightspeed, Git, CoPilot. - Knowledge of BI visualization tools such as Tableau, PowerBI. - Implementation experience with Datalake/Datawarehouse for enterprise use cases. - Exposure to analytical tools and AI/ML is desired. Education: - Bachelor's/University degree, master's degree in information systems, Business Analysis, or Computer Science. In this role, you will be part of the Data Governance job family focusing on Data Governance Foundation. This is a full-time position at Citi, where you will utilize skills like Data Management, Internal Controls, Risk Management, and more to drive compliance and achieve business objectives. If you require a reasonable accommodation due to a disability to utilize search tools or apply for a career opportunity at Citi, please review the Accessibility at Citi guidelines. Additionally, you can refer to Citi's EEO Policy Statement and the Know Your Rights poster for more information.
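Since the posting mentions Airflow and batch orchestration, here is a minimal sketch of a daily DAG, assuming Airflow 2.x; the DAG id, schedule, and callable are illustrative and do not come from the posting.

```python
# Illustrative Airflow 2.x DAG for a daily data-quality batch.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def run_quality_checks(**context):
    # Placeholder: in practice this would invoke PySpark/SQL data-quality routines
    print("running data quality checks")


with DAG(
    dag_id="edp_daily_quality_metrics",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    quality_checks = PythonOperator(
        task_id="run_quality_checks",
        python_callable=run_quality_checks,
    )
```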
Posted 6 days ago
10.0 - 14.0 years
0 Lacs
pune, maharashtra
On-site
As a Data Ops Capability Deployment Analyst at Citigroup, you will be a seasoned professional contributing to the development of new solutions, frameworks, and techniques for the Enterprise Data function. Your role will involve performing data analytics and analysis across different asset classes, as well as building data science and tooling capabilities within the team. You will work closely with the Enterprise Data team to deliver business priorities. The B & I Data Capabilities team manages the Data quality/Metrics/Controls program and implements improved data governance and data management practices. The Data quality program focuses on enhancing Citigroup's approach to data risk and meeting regulatory commitments. Key Responsibilities: - Hands-on experience with data engineering and distributed data platforms - Understanding of data architecture and integration with enterprise applications - Research and evaluate new data technologies and self-service data platforms - Collaborate with the Enterprise Architecture Team on defining data strategy - Perform complex data analytics on large datasets - Build analytics dashboards and data science capabilities - Communicate findings and propose solutions to stakeholders - Convert business requirements into technical design documents - Work with cross-functional teams for implementation and support - Demonstrate a good understanding of the banking industry - Perform other assigned duties Skills & Qualifications: - 10+ years of development experience in Financial Services or Finance IT - Experience with Data Quality/Data Tracing/Metadata Management Tools - ETL experience using PySpark on distributed platforms - Proficiency in Python, SQL, and BI visualization tools - Strong knowledge of Hive, HDFS, Airflow, and job scheduling - Experience in Data Lake/Data Warehouse implementation - Exposure to analytical tools and AI/ML is desired Education: - Bachelor's/University degree, master's degree in information systems, Business Analysis, or Computer Science. If you are a person with a disability and require accommodation to use search tools or apply for a career opportunity, review Accessibility at Citi.
Posted 6 days ago
4.0 - 6.0 years
4 - 8 Lacs
gurugram
Work from Office
Key Responsibilities: - Minimum 3 years of hands-on experience with Snowflake. - Design, develop, and maintain integration workflows using Informatica CAI/CDI within IDMC. - Configure and manage Informatica MDM for data consolidation, cleansing, and governance. - Translate business and technical requirements into robust, scalable integration solutions. - Collaborate with analysts, architects, and business stakeholders to ensure delivery meets expectations. - Monitor and troubleshoot integration processes and performance. - Support deployment activities and CI/CD processes. - Maintain documentation of integration designs and data flows. Essential Skills & Experience: - 4+ years of experience in Informatica development (Cloud and/or On-Prem). - Strong experience with CAI (Cloud Application Integration) and CDI (Cloud Data Integration). - Experience configuring and supporting Informatica MDM, including data models, match/merge rules, and hierarchies. - Solid understanding of REST/SOAP APIs, event-driven architecture, and message queues. - Hands-on experience with the IDMC platform and cloud-native integration patterns. - Proficient in SQL and data manipulation techniques. - Experience in data governance, data quality, and master data best practices. - Experience with CI/CD pipelines for Informatica using Git, Jenkins, or similar tools. - Knowledge of cloud platforms (Azure, AWS, or GCP). - Exposure to data warehousing, data lakes, or real-time integration. - Familiarity with Agile/Scrum delivery methods.
Posted 1 week ago
10.0 - 14.0 years
0 Lacs
maharashtra
On-site
As a Data Ops Capability Deployment Analyst at Citigroup, you will be a seasoned professional applying your in-depth disciplinary knowledge to contribute to the development of new solutions, frameworks, and techniques for the Enterprise Data function. Your role will involve integrating subject matter expertise and industry knowledge within a defined area, requiring a thorough understanding of how different areas collectively integrate within the sub-function to contribute to the overall business objectives. Your primary responsibilities will include performing data analytics and data analysis across various asset classes, as well as building data science and tooling capabilities within the team. You will collaborate closely with the wider Enterprise Data team, particularly the front to back leads, to deliver on business priorities. Working within the B & I Data Capabilities team in the Enterprise Data function, you will manage the Data quality/Metrics/Controls program and implement improved data governance and data management practices across the region. The Data quality program focuses on enhancing Citigroup's approach to data risk and meeting regulatory commitments in this area. Key Responsibilities: - Utilize a data engineering background to work hands-on with Distributed Data platforms and Cloud services. - Demonstrate a sound understanding of data architecture and data integration with enterprise applications. - Research and evaluate new data technologies, data mesh architecture, and self-service data platforms. - Collaborate with the Enterprise Architecture Team to define and refine the overall data strategy. - Address performance bottlenecks, design batch orchestrations, and deliver Reporting capabilities. - Perform complex data analytics on large datasets, including data cleansing, transformation, joins, and aggregation. - Build analytics dashboards and data science capabilities for Enterprise Data platforms. - Communicate findings and propose solutions to various stakeholders. - Translate business and functional requirements into technical design documents. - Work closely with cross-functional teams to prepare handover documents and manage testing and implementation processes. - Demonstrate an understanding of how the development function integrates within the overall business/technology landscape. Skills & Qualifications: - 10+ years of active development background in Financial Services or Finance IT. - Experience with Data Quality, Data Tracing, Data Lineage, and Metadata Management Tools. - Hands-on experience with ETL using PySpark on distributed platforms, data ingestion, Spark optimization, resource utilization, and batch orchestration. - Proficiency in programming languages such as Python, with experience in data manipulation and analysis libraries. - Strong SQL skills and experience with DevOps tools like Jenkins/Lightspeed, Git, CoPilot. - Knowledge of BI visualization tools like Tableau, PowerBI. - Experience in implementing Datalake/Datawarehouse for enterprise use cases. - Exposure to analytical tools and AI/ML is desired. Education: - Bachelor's/University degree, master's degree in information systems, Business Analysis, or Computer Science. In this role, you will play a crucial part in driving compliance with applicable laws, rules, and regulations while safeguarding Citigroup, its clients, and assets. Your ability to assess risks and make informed business decisions will be essential in maintaining the firm's reputation. 
Please refer to the full Job Description for more details on the skills, qualifications, and responsibilities associated with this position.
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
karnataka
On-site
lululemon is an innovative performance apparel company dedicated to supporting individuals in their yoga, running, training, and other athletic pursuits. With a focus on technical fabrics and functional design, we aim to create transformational products and experiences that encourage movement, growth, connection, and overall well-being. Our success is attributed to our groundbreaking products, our commitment to our team members, and the strong connections we establish within every community we engage with. As a company, we are dedicated to fostering positive change and building a healthier, thriving future. Central to this mission is the establishment of an equitable, inclusive, and growth-oriented environment for all our team members. Our India Tech Hub plays a pivotal role in enhancing our technological capabilities across various domains including Product Systems, Merchandising and Planning, Digital Presence, distribution and logistics, and corporate systems. The team in India collaborates closely with the global team on projects of strategic importance. The Global Digital Technology organization is currently seeking a dynamic team leader to oversee development teams responsible for delivering lululemon's web and mobile experiences. In this role, you will play a crucial part in ensuring the success of your team by identifying high-value deliverables, maintaining a focus on delivering quality code, and collaborating closely with product and project leads to manage project scope and meet stakeholder expectations. Your expertise in building production systems in cloud-native and hybrid environments, emphasizing automation and Infrastructure as Code, equips you to guide teams and partners towards best practices that facilitate quick and sustainable progress. As an independent manager, you excel in leading team members with diverse skill sets, effectively communicating expectations, and taking charge of the technical roadmap for your team. You work closely with partners and stakeholders to communicate timelines, scope, and risks, leading the delivery of major initiatives within defined timelines. Additionally, you play a key role in identifying strategic technical debt, conducting cost/benefit analysis for debt resolution, and proposing prioritized timelines to the management team. Your leadership extends to supporting the growth and development of your team as a collective and on an individual level. At lululemon, goal-setting is paramount, and you facilitate your team's participation in our Vision & Goals program, aligning with what is authentically important for you and your team and working towards realizing those objectives. 
**Core Accountabilities:**
- Lead a team of developers across multiple work streams simultaneously
- Adapt processes and timelines to ensure high-quality deliverables
- Ensure the team produces software that is highly available, monitorable, and maintainable
- Ensure appropriate skills are utilized to address specific challenges within the team
- Guide teams towards best practices and participate in code reviews as necessary
- Identify and eliminate obstacles to success
**Qualifications:**
- 2+ years of experience in senior engineering and 4+ years of experience as a people manager
- Hands-on experience in Guest Data Management and Data Authentication
- Proficiency in Snowflake, MongoDB, data lake, Databricks, and ETL pipelines
- 3+ years of experience in Java, Jsecure, Spring, Microservices
- Thorough understanding of the web platform and its technical foundations
- Ability to work effectively across various browser and server environments
- Comfortable operating in complex and large-scale production settings
- Previous experience in the retail industry is desirable
**Must-haves:**
- Demonstrates personal responsibility and acknowledges the presence of choice in every moment
- Possesses an entrepreneurial spirit and continuously seeks innovation to achieve exceptional results
- Communicates with honesty and kindness, fostering an environment of open communication
- Leads with courage, prioritizing the pursuit of greatness over the fear of failure
- Emphasizes building trusting relationships and putting people first
- Integrates fun and joy into their work, maintaining a light-hearted approach to challenges
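As one illustrative slice of the guest-data and ETL work listed above, the sketch below pulls recently updated profile documents from MongoDB and lands them as Parquet for downstream lake or Snowflake loads; the connection URI, database, and field names are placeholders invented for the example.

```python
# Illustrative incremental extract from MongoDB into a Parquet landing file.
import pandas as pd
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # placeholder URI
profiles = client["guest_db"]["profiles"]

docs = list(
    profiles.find(
        {"updated_at": {"$gte": "2024-01-01"}},              # illustrative window
        {"_id": 0, "guest_id": 1, "email": 1, "updated_at": 1},
    )
)

pd.DataFrame(docs).to_parquet("guest_profiles_delta.parquet", index=False)
client.close()
```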
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
maharashtra
On-site
As a Data Integrator at our company, you will be expected to have 3 to 5 years of experience and hold a B.Tech/B.E. degree. Your role will require excellent communication skills and the ability to thrive in a complex matrix environment. We value self-motivation and a strong team player mentality in our candidates. Your responsibilities will include demonstrating excellent development skills in PySpark, Python 2.x or 3.x, as well as in SQL/PL SQL (Oracle, Teradata/Sybase). Experience working in a Unix/Linux environment is essential, along with exposure to ElasticSearch, GPDB, and ORACLE. Knowledge of data lake and data warehouse concepts is necessary, along with an understanding of physical, logical, and conceptual data models. You will be tasked with creating source-to-target mappings, system test cases and plans, as well as handling code versioning, change management, and production release support. As the Single Point of Contact (SPOC) for Human Resources, you will play a crucial role in the company's operations. If you meet the requirements and are excited about this opportunity, please reach out to our Human Resources department at careers@tcgdigital.com.
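As a rough illustration of the source-to-target mapping work this role describes, the PySpark sketch below renames and casts source columns to a target model; the mapping, column names, and paths are assumptions for the example only.

```python
# Illustrative application of a source-to-target mapping in PySpark.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("stm_apply").getOrCreate()

# target_column: (source_column, target_type) -- placeholder mapping
mapping = {
    "customer_id": ("CUST_NO", "bigint"),
    "full_name":   ("CUST_NAME", "string"),
    "joined_date": ("OPEN_DT", "date"),
}

src = spark.read.parquet("/data/landing/customers/")  # placeholder path

# Rename and cast each source column to its target definition
tgt = src.select(
    *[F.col(source).cast(dtype).alias(target) for target, (source, dtype) in mapping.items()]
)

tgt.write.mode("overwrite").parquet("/data/curated/customers/")
```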
Posted 2 weeks ago
10.0 - 15.0 years
50 - 60 Lacs
hyderabad
Work from Office
Job Title: Enterprise Solution Architect - Utilities Domain Location: Hyderabad, India Work Mode: Work from Office (4 days/week) Experience: 10-15 Years Industry: Utilities (Electricity, Water, Gas, Renewable Energy) Employment Type: Full-time Role Summary We are looking for a technology leader with a strong background in enterprise solution design and solid exposure to the Utilities domain. The role focuses on driving large-scale digital transformation programs using Microsoft technologies and modern cloud platforms. Key Responsibilities Lead solution design and architecture for digital transformation programs in the utilities sector Define enterprise architecture roadmaps aligning business and IT goals Collaborate with business and technical stakeholders across regions Recommend modern cloud-native platforms and integration strategies Document solution designs, patterns, and best practices Stay updated with utilities industry trends and emerging technologies Required Skills & Experience 10 to 15 years of IT experience with at least 5+ years in solution/enterprise architecture Strong understanding of utility operations (metering, billing, outage management, customer service) Hands-on expertise in Microsoft technology stack: Cloud: Azure (mandatory) ERP/CRM Platforms: Microsoft Dynamics 365 (Customer Service, Field Service, Billing modules) Backend: .NET Core / C# Frontend: Angular / React (architectural level, not coding) Integration: REST APIs, Event-driven architecture (Kafka/Azure Service Bus) Data & Analytics: Power BI, Azure Synapse / Data Lake Familiarity with SAP IS-U or Oracle Utilities (good to have, not mandatory) Strong communication and stakeholder management skills Preferred Qualifications Microsoft Certified Azure Solutions Architect, or Dynamics certifications Experience in global utilities transformation projects (Europe/US/UK preferred, not mandatory)
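The posting highlights event-driven integration over Kafka or Azure Service Bus; as one illustrative pattern, the sketch below publishes a utility-style event to a Service Bus queue. The connection string, queue name, and payload fields are placeholders, not details from the posting.

```python
# Illustrative event publication to Azure Service Bus (e.g., an outage notification).
import json

from azure.servicebus import ServiceBusClient, ServiceBusMessage

CONN_STR = "Endpoint=sb://example.servicebus.windows.net/;SharedAccessKeyName=...;SharedAccessKey=..."  # placeholder
QUEUE_NAME = "outage-events"  # placeholder queue

event = {
    "meter_id": "MTR-001",
    "event_type": "OUTAGE_DETECTED",
    "timestamp": "2024-01-01T10:00:00Z",
}

with ServiceBusClient.from_connection_string(CONN_STR) as client:
    with client.get_queue_sender(queue_name=QUEUE_NAME) as sender:
        sender.send_messages(ServiceBusMessage(json.dumps(event)))
```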
Posted 2 weeks ago
5.0 - 7.0 years
0 Lacs
pune, maharashtra, india
On-site
Druva is the leading provider of data security solutions, empowering customers to secure and recover their data from all threats. The Druva Data Security Cloud is a fully managed SaaS solution offering air-gapped and immutable data protection across cloud, on-premises, and edge environments. By centralizing data protection, Druva enhances traditional security measures and enables faster incident response, effective cyber remediation, and robust data governance. Trusted by nearly 7,500 customers, including 75 of the Fortune 500, Druva safeguards business data in an increasingly interconnected world. As a Senior Staff Software Engineer, you will provide technical leadership to create high-quality software by owning low-level design and implementation of services within a product. This role will require excellent communication skills as you will collaborate with Product Management to refine requirements, product architects to propose design changes, and other product owners to drive features to completion with good quality. Key Skills: - AI-first mindset to software development, with experience using genAI during various phases of the software development lifecycle, from design to code to test, using tools like Cursor. - 5-7 years of experience, preferably in a product company, building global-scale distributed SaaS applications that handle petabytes of data. - Hands-on experience in the design and development of complex products. - Extensive hands-on experience in Go/Python/C/C++/Java on Unix/Linux platforms. - A strong understanding of complex concepts related to computer architecture, data structures, algorithms, design concepts, and programming practices. - Data modelling for OLAP workloads, scalability design, and query optimisations. - Understanding of data consistency at cloud scale and eventual consistency models. - Hands-on experience with big data tools and frameworks (Datalake/Lakehouse, ETL), preferably in public cloud ecosystems like AWS and modules like Apache Spark, AWS Glue, Iceberg. Desirable Skills: - Excellent written and verbal communication skills. - Working knowledge of Docker and Kubernetes will be an advantage. Role and Responsibilities: The Senior Staff Software Engineer's role is to be the technical leader in building enterprise-grade scalable, performant systems which deliver the required functionality to the customers and delight them. Should be able to design and implement sufficiently large and complex features and/or architectural improvements to the product. Suggest and propose solutions to complex design problems. Identify areas of engineering improvements to the product and work with product architects and the team to address them. Should be able to technically guide junior engineers with feature design and implementation. Review design and implementation done by junior engineers. Should be able to independently handle complex escalations and guide others as required. Be able to write technical blogs and make technical presentations in internal and external forums. Qualification: B.E./B.Tech or M.E./M.Tech (Computer Science) or equivalent
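Since the listing mentions Spark and Iceberg in a lakehouse setting, here is a minimal sketch of appending a DataFrame to an Iceberg table with Spark's DataFrameWriterV2, assuming an Iceberg catalog named `lake` is already configured in the Spark session; the catalog, table, and paths are placeholders.

```python
# Illustrative append of event data into an Iceberg table (catalog config assumed).
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("iceberg_append_demo").getOrCreate()

events = spark.read.parquet("s3://example-bucket/raw/backup_events/")  # placeholder path

# Append into an Iceberg table registered under the (assumed) `lake` catalog
events.writeTo("lake.analytics.backup_events").append()
```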
Posted 2 weeks ago
3.0 - 5.0 years
5 - 8 Lacs
bengaluru
Remote
As a Senior Azure Data Engineer, your responsibilities will include: - Building scalable data pipelines using Databricks and PySpark - Transforming raw data into usable business insights - Integrating Azure services like Blob Storage, Data Lake, and Synapse Analytics - Deploying and maintaining machine learning models using MLlib or TensorFlow - Executing large-scale Spark jobs with performance tuning on Spark Pools - Leveraging Databricks Notebooks and managing workflows with MLflow. Qualifications: - Bachelor's/Master's in Computer Science, Data Science, or equivalent - 7+ years in Data Engineering, with 3+ years in Azure Databricks - Strong hands-on experience in PySpark, Spark SQL, RDDs, Pandas, NumPy, and Delta Lake - Azure ecosystem: Data Lake, Blob Storage, Synapse Analytics
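For the MLflow workflow management mentioned above, here is a minimal tracking sketch; the experiment name, parameters, and metric values are illustrative only.

```python
# Illustrative MLflow tracking run (experiment and values are placeholders).
import mlflow

mlflow.set_experiment("churn-model-demo")

with mlflow.start_run(run_name="baseline"):
    mlflow.log_param("model_type", "logistic_regression")
    mlflow.log_param("train_rows", 100_000)
    mlflow.log_metric("auc", 0.81)
    # A typical next step on Databricks would be logging the fitted model,
    # e.g. mlflow.spark.log_model(model, "model")
```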
Posted 2 weeks ago
3.0 - 5.0 years
40 - 45 Lacs
kochi, kolkata, bhubaneswar
Work from Office
We are seeking experienced Data Engineers with over 3 years of experience to join our team at Intuit, through Cognizant. The selected candidates will be responsible for developing and maintaining scalable data pipelines, managing data warehousing solutions, and working with advanced cloud environments. The role requires strong technical proficiency and the ability to work onsite in Bangalore. Key Responsibilities: Design, build, and maintain data pipelines to ingest, process, and analyze large datasets using PySpark. Work on Data Warehouse and Data Lake solutions to manage structured and unstructured data. Develop and optimize complex SQL queries for data extraction and reporting. Leverage AWS cloud services such as S3, EC2, EMR, Athena, and Redshift for data storage, processing, and analytics. Collaborate with cross-functional teams to ensure the successful delivery of data solutions that meet business needs. Monitor data pipelines and troubleshoot any issues related to data integrity or system performance. Required Skills: 3 years of experience in data engineering or related fields. In-depth knowledge of Data Warehouses and Data Lakes. Proven experience in building data pipelines using PySpark. Strong expertise in SQL for data manipulation and extraction. Familiarity with AWS cloud services, including S3, EC2, EMR, Athena, Redshift, and other cloud computing platforms. Preferred Skills: Python programming experience is a plus. Experience working in Agile environments with tools like JIRA and GitHub.
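As one illustrative example of the AWS analytics work described above, the sketch below submits an Athena query with boto3; the database name, query, and S3 results bucket are placeholders, not details from the posting.

```python
# Illustrative Athena query submission via boto3.
import boto3

athena = boto3.client("athena", region_name="us-east-1")

response = athena.start_query_execution(
    QueryString="SELECT event_date, COUNT(*) AS events FROM clickstream GROUP BY event_date",
    QueryExecutionContext={"Database": "analytics_db"},                # placeholder database
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},  # placeholder bucket
)
print("query execution id:", response["QueryExecutionId"])
```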
Posted 2 weeks ago
4.0 - 9.0 years
6 - 10 Lacs
kolkata
Work from Office
- Building and operationalizing large-scale enterprise data solutions and applications using one or more AZURE data and analytics services in combination with custom solutions - Azure Synapse/Azure SQL DWH, Azure Data Lake, Azure Blob Storage, Spark, HDInsight, Databricks, CosmosDB, EventHub/IOTHub. - Experience in migrating on-premises data warehouses to data platforms on the AZURE cloud. - Designing and implementing data engineering, ingestion, and transformation functions using Azure Synapse or Azure SQL data warehouse, and Spark on Azure (available in HDInsight and Databricks). - Good customer communication skills. - Good analytical skills.
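To make the ingestion side of this role concrete, here is a minimal sketch of landing an extracted file into Azure Blob Storage / Data Lake ahead of Synapse or Databricks processing; the connection string, container, and blob path are placeholders.

```python
# Illustrative upload of a daily extract into an Azure Blob Storage container.
from azure.storage.blob import BlobServiceClient

CONN_STR = "DefaultEndpointsProtocol=https;AccountName=examplelake;AccountKey=...;EndpointSuffix=core.windows.net"  # placeholder

service = BlobServiceClient.from_connection_string(CONN_STR)
container = service.get_container_client("raw")  # placeholder container

with open("daily_extract.parquet", "rb") as fh:
    container.upload_blob(
        name="sales/2024-01-01/daily_extract.parquet",  # placeholder blob path
        data=fh,
        overwrite=True,
    )
```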
Posted 3 weeks ago
4.0 - 9.0 years
6 - 10 Lacs
surat
Work from Office
- Building and operationalizing large-scale enterprise data solutions and applications using one or more AZURE data and analytics services in combination with custom solutions - Azure Synapse/Azure SQL DWH, Azure Data Lake, Azure Blob Storage, Spark, HDInsight, Databricks, CosmosDB, EventHub/IOTHub. - Experience in migrating on-premises data warehouses to data platforms on the AZURE cloud. - Designing and implementing data engineering, ingestion, and transformation functions using Azure Synapse or Azure SQL data warehouse, and Spark on Azure (available in HDInsight and Databricks). - Good customer communication skills. - Good analytical skills.
Posted 3 weeks ago
4.0 - 9.0 years
6 - 10 Lacs
vadodara
Work from Office
- Building and operationalizing large-scale enterprise data solutions and applications using one or more AZURE data and analytics services in combination with custom solutions - Azure Synapse/Azure SQL DWH, Azure Data Lake, Azure Blob Storage, Spark, HDInsight, Databricks, CosmosDB, EventHub/IOTHub. - Experience in migrating on-premises data warehouses to data platforms on the AZURE cloud. - Designing and implementing data engineering, ingestion, and transformation functions using Azure Synapse or Azure SQL data warehouse, and Spark on Azure (available in HDInsight and Databricks). - Good customer communication skills. - Good analytical skills.
Posted 3 weeks ago
4.0 - 9.0 years
6 - 10 Lacs
hyderabad
Work from Office
- Building and operationalizing large-scale enterprise data solutions and applications using one or more AZURE data and analytics services in combination with custom solutions - Azure Synapse/Azure SQL DWH, Azure Data Lake, Azure Blob Storage, Spark, HDInsight, Databricks, CosmosDB, EventHub/IOTHub. - Experience in migrating on-premises data warehouses to data platforms on the AZURE cloud. - Designing and implementing data engineering, ingestion, and transformation functions using Azure Synapse or Azure SQL data warehouse, and Spark on Azure (available in HDInsight and Databricks). - Good customer communication skills. - Good analytical skills.
Posted 3 weeks ago
4.0 - 9.0 years
6 - 10 Lacs
kanpur
Work from Office
- Building and operationalizing large-scale enterprise data solutions and applications using one or more AZURE data and analytics services in combination with custom solutions - Azure Synapse/Azure SQL DWH, Azure Data Lake, Azure Blob Storage, Spark, HDInsight, Databricks, CosmosDB, EventHub/IOTHub. - Experience in migrating on-premises data warehouses to data platforms on the AZURE cloud. - Designing and implementing data engineering, ingestion, and transformation functions using Azure Synapse or Azure SQL data warehouse, and Spark on Azure (available in HDInsight and Databricks). - Good customer communication skills. - Good analytical skills.
Posted 3 weeks ago