8.0 - 13.0 years
25 - 40 Lacs
Mumbai, Hyderabad
Work from Office
Essential Services: Role & Location Fungibility
At ICICI Bank, we believe in serving our customers beyond our role definition, product boundaries, and domain limitations through our philosophy of Customer 360-degree. In essence, this captures our belief in serving the entire banking needs of our customers as One Bank, One Team. To achieve this, employees at ICICI Bank are expected to be role- and location-fungible, with the understanding that banking is an essential service. The role description gives you an overview of the responsibilities; it is only directional and guiding in nature.

About the Role:
As a Data Warehouse Architect, you will be responsible for managing and enhancing the data warehouse, which handles large volumes of customer life-cycle data flowing in from various applications within the guardrails of risk and compliance. You will manage the day-to-day operations of the data warehouse (Vertica). In this role, you will lead a team of data warehouse engineers covering data modelling, ETL data pipeline design, issue management, upgrades, performance fine-tuning, migration, and the governance and security framework of the data warehouse. This role enables the Bank to maintain huge data sets in a structured manner that is amenable to data intelligence. The data warehouse supports numerous information systems used by various business groups to derive insights. As a natural progression, the data warehouse will be gradually migrated to a Data Lake, enabling better analytical advantage. The role holder will also be responsible for guiding the team through this migration.

Key Responsibilities:
Data Pipeline Design: Design and develop ETL data pipelines that help organise large volumes of data, using data warehousing technologies to ensure that the data warehouse is efficient, scalable, and secure.
Issue Management: Ensure that the data warehouse is running smoothly. Monitor system performance, diagnose and troubleshoot issues, and make necessary changes to optimize system performance.
Collaboration: Collaborate with cross-functional teams to implement upgrades, migrations, and continuous improvements.
Data Integration and Processing: Process, clean, and integrate large data sets from various sources to ensure that the data is accurate, complete, and consistent.
Data Modelling: Design and implement data modelling solutions to ensure that the organization's data is properly structured and organized for analysis.

Key Qualifications & Skills:
Education Qualification: B.E./B.Tech. in Computer Science, Information Technology, or an equivalent domain, with 10 to 12 years of experience and at least 5 years of relevant work experience in Data Warehousing/Mining/BI/MIS.
Experience in Data Warehousing: Knowledge of ETL and data technologies and the ability to outline a future vision in OLTP and OLAP (Oracle / MS SQL). Data Modelling, Data Analysis, and Visualization experience (analytical tools such as Power BI / SAS / QlikView / Tableau). Good to have exposure to Azure Cloud Data platform services such as Cosmos DB, Azure Data Lake, Azure Synapse, and Azure Data Factory.
Synergize with the Team: Regular interaction with business/product/functional teams to create mobility solutions.
Certification: Azure certifications DP-900, PL-300, DP-203, or other Data Platform/Data Analyst certifications.
About the Business Group
The Technology Group at ICICI Bank is at the forefront of our operations and offerings, which are focused on leveraging state-of-the-art technology to provide customer-centric solutions. This group plays a pivotal role in our vision of the transition from Bank to Bank Tech. Further, the group offers round-the-clock support to our entire banking ecosystem. In our persistent efforts to provide products and solutions that genuinely touch customers, unlocking the potential of technology in every single engagement goes a long way in creating customer delight. In this endeavor, we also tirelessly ensure that all our processes, systems, and infrastructure remain well within the guardrails of data security, privacy, and relevant regulations.
Posted 1 week ago
9.0 - 13.0 years
9 - 15 Lacs
Chennai
Work from Office
Petrofac is a leading international service provider to the energy industry, with a diverse client portfolio including many of the world's leading energy companies. We design, build, manage, and maintain infrastructure for our clients. We recruit, reward, and develop our people based on merit, regardless of race, nationality, religion, gender, age, sexual orientation, marital status, or disability. We value our people and treat everyone who works for or with Petrofac fairly and without discrimination. The world is re-thinking its energy supply and energy security needs and planning for a phased transition to alternative energy sources. We are here to help our clients meet these evolving energy needs. This is an exciting time to join us on this journey. Are you ready to bring the right energy to Petrofac and help us deliver a better future for everyone?

JOB TITLE: Data Engineer

KEY RESPONSIBILITIES:
Architect and define data flows for big data/data lake use cases.
Apply excellent knowledge of the full life cycle of data management principles, including data governance, architecture, modelling, storage, security, master data, and quality.
Act as a coach and provide consultancy services and advice to data engineers, offering technical guidance and ensuring that architecture principles, design standards, and operational requirements are met.
Participate in the Technical Design Authority forums.
Collaborate with analytics and business stakeholders to improve the data models that feed BI tools, increasing data accessibility and fostering data-driven decision making across the organization.
Work with a team of data engineers to deliver tasks and achieve weekly and monthly goals, and guide the team to follow best practices and improve deliverables.
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, and re-designing infrastructure for greater scalability.
Estimate cluster and core sizes, and monitor and troubleshoot the Databricks cluster and analysis server to provide optimal capacity for data ingestion.
Deliver master data cleansing and improvement efforts, including automated and cost-effective solutions for processing, cleansing, and verifying the integrity of data used for analysis.
Secure the big data environment, including encryption, tunnelling, access control, and secure isolation.
Guide and build highly efficient OLAP cubes using data modelling techniques to cater to all required business cases and mitigate the limitations of Power BI in analysis services.
Deploy and maintain highly efficient CI/CD DevOps pipelines across multiple environments such as dev, staging, and production.
Strictly follow a Scrum-based Agile approach to development, working from allocated stories.
Apply comprehensive knowledge of data extraction, transformation, and loading from various sources such as Oracle, Hadoop HDFS, flat files, JSON, Avro, Parquet, and ORC.
Experience defining, implementing, and maintaining a global data platform.
Experience building robust and impactful data visualisation solutions and gaining adoption.
Extensive work experience onboarding various data sources using real-time, batch, or scheduled loads; sources can be in the cloud or on premise, SQL or NoSQL databases, or API-based.
Expertise in extracting data through JSON, OData, REST APIs, web services, and XML.
Expertise in data ingestion platforms such as Apache Sqoop, Apache Flume, Amazon Kinesis, Fluentd, Logstash, etc.
Hands-on experience using Databricks, Pig, Scala, Hive, Azure Data Factory, Python, and R.
Operational experience with big data technologies and engines, including Presto, Spark, Hive, and Hadoop environments.
Experience with various databases, including Azure SQL DB, Oracle, MySQL, Cosmos DB, and MongoDB.
Experience supporting and working with cross-functional teams in a dynamic environment.

ESSENTIAL QUALIFICATION & SKILLS:
Bachelor's degree (Master's preferred) in Computer Science, Engineering, or another technology-related field.
10+ years of experience with data analytics platforms and hands-on experience with ETL and ELT transformations, with strong SQL programming knowledge.
5+ years of hands-on experience in big data engineering, distributed storage, and processing massive data into a data lake using Scala or Python.
Proficient knowledge of the Hadoop and Spark ecosystems: HDFS, Hive, Sqoop, Oozie, Spark Core, and Spark Streaming.
Experience with programming languages such as Scala, Java, Python, and shell scripting.
Proven experience pulling data through REST APIs, OData, XML, and web services.
Experience with Azure product offerings and its data platform.
Experience in data modelling (data marts, snowflake/star schemas, normalization, SCD2).
Architect and define data flows and build highly efficient, scalable data pipelines.
Work in tandem with the Enterprise and Domain Architects to understand the business goals and vision, and contribute to the Enterprise Roadmaps.
Strong troubleshooting and problem-solving skills for any issues blocking business progress.
Coordinate with multiple business stakeholders to understand requirements and deliver.
Conduct continuous audits of data management system performance, refine whenever required, and immediately report any breach or loophole to stakeholders.
Allocate tasks to team members, track status, and report on activities to management.
Understand the physical and logical plans of execution and optimize the performance of data pipelines.
Extensive background in data mining and statistical analysis.
Able to understand various data structures and common methods of data transformation.
Ability to work with ETL tools, with strong knowledge of ETL concepts.
Strong focus on delivering outcomes.
Data management: modelling, normalisation, cleaning, and maintenance.
Understand data architectures and data warehousing principles, and be able to participate in the design and development of conventional data warehouse solutions.
Posted 1 week ago
12.0 - 17.0 years
12 - 17 Lacs
Pune
Work from Office
Role Overview:
The Technical Architect specializes in traditional ETL tools such as Informatica Intelligent Cloud Services (IICS) and similar technologies. The jobholder designs, implements, and oversees robust ETL solutions to support our organization's data integration and transformation needs.

Responsibilities:
Design and develop scalable ETL architectures using tools like IICS and other traditional ETL platforms.
Collaborate with stakeholders to gather requirements and translate them into technical solutions.
Ensure data quality, integrity, and security throughout the ETL processes.
Optimize ETL workflows for performance and reliability.
Provide technical leadership and mentorship to development teams.
Troubleshoot and resolve complex technical issues related to ETL processes.
Document architectural designs and decisions for future reference.
Stay updated on emerging trends and technologies in ETL and data integration.

Key Technical Skills & Responsibilities:
12+ years of experience in data integration and ETL development, with at least 3 years in an Informatica architecture role.
Extensive expertise in Informatica PowerCenter, IICS, and related tools (Data Quality, EDC, MDM).
Proven track record of designing ETL solutions for enterprise-scale data environments.
Advanced proficiency in Informatica PowerCenter and IICS for ETL/ELT design and optimization.
Strong knowledge of SQL, Python, or Java for custom transformations and scripting.
Experience with data warehousing platforms (Snowflake, Redshift, Azure Synapse) and data lakes.
Familiarity with cloud platforms (AWS, Azure, GCP) and their integration services.
Expertise in data modeling, schema design, and integration patterns.
Knowledge of CI/CD, Git, and infrastructure-as-code (e.g., Terraform).
Experience working on proposals, customer workshops, assessments, etc., is preferred.
Must have good communication and presentation skills.

Primary Skills:
Informatica, IICS
Data Lineage and Metadata Management
Data Modeling
Data Governance
Data integration architectures
Informatica Data Quality

Eligibility Criteria:
Bachelor's degree in Computer Science, Information Technology, or a related field.
Proven experience in ETL architecture and development using tools like IICS.
Strong understanding of data integration, transformation, and warehousing concepts.
Proficiency in SQL and scripting languages.
Experience with cloud-based ETL solutions is a plus.
Familiarity with Agile development methodologies.
Excellent problem-solving and analytical skills.
Strong communication and leadership abilities.
Knowledge of data governance and compliance standards.
Ability to work in a fast-paced environment and manage multiple priorities.
Posted 1 week ago
10.0 - 15.0 years
35 - 37 Lacs
Hyderabad
Work from Office
About the Job
We are changing the way people think about customer service by leveraging advanced AI and automation, and we need your help! The Vice President of Product is a senior leader who will craft, lead, and execute a multi-product, multi-year strategy that simplifies and aligns with the global growth and scale we are currently experiencing. A successful Vice President of Product needs to be experienced in all areas of the Product Management Lifecycle, as well as collaborative and visionary: a trusted leader who can also build and mentor a team of Product Managers.

As Vice President of Product, You Will:
Plan and lead the strategy and design of new software products and/or enhancements.
Build and manage the ongoing evolution of a multi-product roadmap that aligns with IntouchCX's business objectives.
Build and mentor a team of top-tier talent, including but not limited to: Product Management, Business Analysis, and UI/UX.
Oversee the analysis and design of in-house developed tools as well as various integrations with third-party products.
Partner with various stakeholders to ensure all projects are delivered on time and with a high level of quality.
Ensure the appropriate evolution of IntouchCX software tools.
Ensure technology solutions align with business requirements and direction.
Perform continuous reviews of technologies, industry standards, industry-related developments, and user feedback to make recommendations on the need for further research and associated changes.
Promote and advance IntouchCX culture.
Define and monitor key team and system metrics.
Ensure all projects and resources align with the budget.

As Vice President of Product, You Need:
A minimum of 10 years of experience in Product Management and/or Business Analysis.
A minimum of 7 years of experience in team leadership.
Significant experience managing medium to large software projects.
Ideally a Computer Science, Engineering, or similar degree.
Experience with Product Management Lifecycle tools and methodologies.
Experience working with Agile software development methodologies preferred.
Experience with data integration and reporting is an asset.
Knowledge of industry trends, best practices, and change management.
Excellent speaking and writing abilities.
Ability to prioritize and plan work activities as needed.
Ability to meet aggressive deadlines and handle multiple, complex projects.
Ability to coach and mentor employees, providing career and professional guidance.
Ability to interface with internal and external stakeholders.
Posted 1 week ago
3.0 - 6.0 years
10 - 20 Lacs
Hyderabad
Work from Office
Power BI/SAC, LLMs (ChatGPT/OpenAI), Python, REST APIs, UiPath, SQL/NoSQL, Data Modeling, API Integration, Power Apps, AppSheet, WordPress, Drupal, DNS, Web Hosting, Google Workspace Admin.
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
Karnataka
On-site
Genpact is a global professional services and solutions firm that is committed to delivering outcomes that shape the future. With over 125,000 employees in 30+ countries, we are fueled by our innate curiosity, entrepreneurial agility, and the desire to create lasting value for our clients. Our purpose, the relentless pursuit of a world that works better for people, drives us to serve and transform leading enterprises, including the Fortune Global 500, by leveraging our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

We are currently seeking applications for the position of Principal Consultant - Snowflake Data Modeller. As a Data Engineer, you will be expected to demonstrate the strong expertise in data analysis, data integration, data transformation, and ETL/ELT necessary to excel in this role. Additionally, relevant domain experience in Investment Banking and exposure to Cloud, preferably AWS, are desired qualifications.

Responsibilities:
- Possess hands-on experience in relational, dimensional, and/or analytic work using RDBMS, dimensional data platform technologies, ETL, and data ingestion.
- Demonstrate experience with data warehouse, data lake, and enterprise big data platforms in multi-data-center contexts.
- Exhibit strong communication and presentation skills.
- Assist the team in implementing business and IT data requirements through new data strategies and designs across all data platforms and tools.
- Collaborate with business and application/solution teams to implement data strategies and develop conceptual/logical/physical data models.
- Define and enforce data modeling and design standards, tools, best practices, and related development for enterprise data models.
- Engage in hands-on modeling and mapping between source system data models and data warehouse data models.
- Proactively address project requirements and articulate issues/challenges to reduce project delivery risks regarding modeling and mappings.
- Showcase hands-on experience in writing complex SQL queries.
- Good to have: experience in data modeling for NoSQL objects.

Qualifications we seek in you:
Minimum Qualifications:
- Bachelor's Degree in Computer Science, Mathematics, or Statistics.
- Relevant experience in the field.
- 8+ years of experience in metadata management, data modeling, and related tools (Erwin, ER/Studio, or others). Overall 10+ years of experience in IT.

If you are passionate about leveraging your skills and experience in data modeling and analysis to drive impactful results, we invite you to join us as a Principal Consultant at Genpact. This is a full-time position based in Bangalore, India. Please note that the job posting date is October 7, 2024, and the unposting date is October 12, 2024. We are looking for individuals with a strong digital skill set to contribute to our dynamic team.
Posted 1 week ago
7.0 - 11.0 years
0 Lacs
Karnataka
On-site
The role of UI Backend and Data Pipeline Engineer is based in Hyderabad/Bangalore (Hybrid) and is a full-time position suitable for immediate joiners. As part of a team focused on developing new products to support customer growth and transformation, you will contribute to the delivery of automotive forecasting solutions. This role offers an opportunity to work on cutting-edge technology platforms at scale and to play a key role in driving business strategies.

You will be responsible for designing, developing, and maintaining scalable data pipelines with complex algorithms, as well as building and maintaining UI backend services using Python, C#, or similar technologies to ensure high performance and responsiveness. Your role will also involve ensuring data quality and integrity through robust validation processes, leading data integration projects, and collaborating with cross-functional teams to gather data requirements.

To succeed in this role, you should have a Bachelor's degree in Computer Science or a related field, strong analytical and problem-solving skills, and 7+ years of experience in Data Engineering/Advanced Analytics. Proficiency in Python, experience with Flask for backend development, and a strong understanding of object-oriented programming are essential. Additionally, AWS proficiency, including ECR and containers, is considered a significant advantage.

As a key member of the team, you will have the opportunity to make a significant impact by designing and developing AWS cloud-native solutions that enable analysts to forecast long- and short-term trends in the automotive industry. Your work will contribute to the growth and success of the organization while providing valuable insights to clients. If you are passionate about delivering innovative solutions to complex problems and thrive in a fast-paced, collaborative environment, this role offers an exciting opportunity to showcase your technical expertise and drive business success.
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
Hyderabad, Telangana
On-site
You will be working independently as an SAP BODS developer, familiar with standard concepts, practices, and procedures. Your responsibilities will require experience in data extraction, data integration, data transformation, data quality, and data profiling. You should have experience building integrations in SAP DS with REST/SOAP APIs and cloud platforms like Google Cloud Platform. It is essential to possess deep knowledge and understanding of data warehouse concepts and strong PL/SQL skills for writing complex queries.

You will be responsible for data extraction and integration using different data sources, such as SAP HANA and various warehouse applications. Experience in utilizing all kinds of SAP BODS transformations, such as Data Integrator, Data Quality, and basic transforms, is required. Additionally, experience tuning jobs for high-volume and complex transformation scenarios, and experience with DS components including the job server, repositories, and Designer, will be advantageous.

In addition to the core responsibilities, you should be able to work with clients to identify business challenges and contribute to client deliverables by refining, analyzing, and structuring relevant data. It is crucial to stay aware of the latest technologies and trends in the industry. Logical thinking, problem-solving skills, and the ability to collaborate effectively are essential traits for this role. You must also be capable of assessing current processes, identifying improvement areas, and suggesting appropriate technology solutions.

The mandatory skill for this position is expertise in SAP BODS.
Posted 1 week ago
10.0 - 14.0 years
0 Lacs
Maharashtra
On-site
The Lead Data Management position entails responsibility for overseeing and preserving the accuracy, consistency, and integrity of the master data within all business units of the organization. This role requires developing data management strategies, implementing data governance practices, and leading the implementation of Master Data Management tools on an SAP S/4HANA platform. Additionally, the incumbent will be tasked with ensuring the seamless integration of data management systems. The ideal candidate should possess a solid background in SAP S/4HANA and data management, exceptional analytical skills, and the ability to spearhead data projects within a dynamic work environment. The Lead Data Management will report to the Sr. Director, Business Transformation and Technology, and will not have any direct reports.

Qualifications for this role include a Bachelor's degree in Business Administration, Information Technology, Data Management, or a related field; a Master's degree is highly preferred. The successful candidate should have a minimum of 10 years of experience in data management, with a specific focus on master data management. Proficiency in data governance, data quality, and data integration best practices is essential. The ability to draft policies and procedures to establish data standards and operational models is crucial. Strong project management skills are required, and experience in the pharmaceutical or medical device industry is advantageous, along with an understanding of relevant industry and regional data regulations and standards.

Moreover, experience handling global cross-functional projects across diverse geographical locations, along with the flexibility to operate in different time zones, is beneficial. Proficiency in data management tools like SAP MDM, Informatica, or similar software is necessary. Knowledge of SAP data elements is mandatory, while familiarity with data platforms such as Azure and AWS is desirable.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Pune, Maharashtra
On-site
TresVista is a global enterprise offering a diversified portfolio of services that enables clients to achieve resource optimization through leveraging an offshore capacity model. Our services include investment diligence, industry research, valuation, fund administration, accounting, and data analytics. With over 1,800 employees worldwide, we provide high-caliber support to over 1,000 clients across various geographies and asset classes.

Our Human Resources business unit plays a critical role in empowering TresVista's workforce to drive client impact. Responsibilities of the HR department include recruitment, managing compensation and benefits, enhancing employee productivity and wellbeing, performance reviews, and overall employee lifecycle management. The HR Center of Excellence (CoE) at TresVista is a specialized department focused on driving innovation and best practices in core HR and talent management.

As part of the Transformation team, the Senior Associate role involves collaborating with various departments to develop streamlined processes, innovative strategies, and impactful initiatives that align with organizational objectives and enhance HR effectiveness. Key responsibilities of the Senior Associate include evaluating and implementing HR technologies, driving process improvements, optimizing workflows, managing change initiatives, preparing business requirement documents, integrating data for decision-making, and collaborating with stakeholders to execute transformation projects.

To be successful in this role, candidates should have led at least one HR transformation project, possess certification in SAP SuccessFactors, Workday, or equivalent, demonstrate strong analytical skills, have experience in project management and change methodologies, be familiar with data visualization tools such as Power BI, and have knowledge of AI/ML applications in HR. The ideal candidate will have at least 5 years of experience in HR transformation, HRIS, and HR analytics, along with a PGDM, MBA, or equivalent education. The compensation structure will be in line with industry standards.

Join us at TresVista and be part of a dynamic team driving HR innovation and transformation to support our global client base effectively.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Hyderabad, Telangana
On-site
Genpact is a global professional services and solutions firm dedicated to delivering outcomes that shape the future. With a workforce of over 125,000 individuals spread across more than 30 countries, we are motivated by our inherent curiosity, entrepreneurial agility, and commitment to creating lasting value for our clients. Our driving purpose is the relentless pursuit of a world that works better for people. We cater to and transform leading enterprises, including the Fortune Global 500, leveraging our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

We are currently looking for candidates for the position of Business Analyst - Qlik Sense Developer. We are in search of a highly experienced and skilled professional with a background in Qlik Sense development, particularly in the Insurance domain.

**Responsibilities:**
- Hands-on experience in Qlik Sense development, dashboarding, data modeling, and reporting, including ad hoc report generation techniques.
- Proficiency in building Mashups, NPrinting, GeoAnalytics, and Insight Advisor.
- Strong capabilities in data transformation, creating QVD files, and set analysis.
- Expertise in designing, architecting, developing, and deploying applications using Qlik Sense, with a focus on front-end development and visualization best practices.
- Solid database design and SQL skills, with experience in RDBMSs such as MS SQL Server, Oracle, etc.
- Effective communication skills (verbal/written) to convey technical insights and interpret data reports for clients, as well as to understand and meet client requirements.
- Leadership qualities to implement Qlik Sense best practices thoughtfully and deliver effective solutions.
- Ability to translate complex functional, technical, and business requirements into architectural designs.
- Creation and maintenance of technical documentation.
- Experience in data integration through ETL processes from various sources.

**Qualifications:**

*Minimum Qualifications:*
- Bachelor's or Master's degree in a relevant field is preferred.
- Certification in the BI & DW domain would be advantageous.

*Preferred Qualifications/Skills:*
- Knowledge and experience in prototyping, designing, requirement analysis, and data integration through ETL processes.
- Strong analytical and logical mindset, attention to detail, and ability to work in a team.
- Demonstrated ability to execute projects efficiently, meeting deliverables on time and within scope.
- Effective communication skills to convey project goals, expectations, and updates to team members and stakeholders.

*Job Details:*
- **Job Title:** Business Analyst
- **Primary Location:** India-Hyderabad
- **Schedule:** Full-time
- **Education Level:** Master's / Equivalent
- **Job Posting:** Apr 17, 2025, 8:26:44 AM
- **Unposting Date:** Ongoing
- **Master Skills List:** Operations
- **Job Category:** Full Time

Join us in our pursuit of delivering innovative solutions and shaping a better future through your expertise in Qlik Sense development and business analysis.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Karnataka
On-site
You should have 3-6 years of experience as a developer on the Palantir Foundry platform. Along with this, a strong understanding of data integration, data modeling, and software development principles is required. Proficiency in Python and Spark (PySpark and Scala) is essential. Experience with SQL and relational databases is also a must.

Your responsibilities will include designing, developing, and deploying models and applications within the Palantir Foundry platform. You will integrate data from various sources and ensure the robustness and reliability of data pipelines. Customizing and configuring the platform to meet business requirements will also be part of your role.

The position is at the Consultant level and is based in Hyderabad, Bangalore, Mumbai, Pune, Chennai, Kolkata, or Gurgaon. The notice period for this role is 0-90 days.
Posted 1 week ago
10.0 - 14.0 years
0 Lacs
Coimbatore, Tamil Nadu
On-site
As a Data Engineering Lead/Architect with over 10 years of experience, you will play a crucial role in architecting and designing data solutions that meet business requirements efficiently. Collaborating with cross-functional teams, you will define data architectures, models, and integration strategies to ensure the successful implementation of data pipelines, ETL processes, and data warehousing solutions.

Your expertise in Snowflake technologies will be essential in building and optimizing data warehouses. You will develop and maintain Snowflake data models and schemas, following best practices such as cost analysis, resource allocation, and security configuration, to support reporting and analytics needs effectively.

Utilizing Azure cloud services and the Databricks platform, you will manage and process large datasets efficiently. Your responsibilities will include building, deploying, and maintaining data pipelines on Azure Data Factory, Azure Databricks, and other Azure services. Implementing best practices for data warehousing and ensuring data quality, consistency, and reliability will be a key focus area. You will also create and manage data integration processes, including real-time and batch data movement between systems.

Your mastery of SQL and PL/SQL will be vital in writing complex queries to extract, transform, and load data effectively. You will optimize SQL queries and database performance for high-volume data processing to ensure seamless operations. Continuously monitoring and enhancing the performance of data pipelines and storage systems will be part of your responsibilities. You will troubleshoot and resolve data-related issues promptly to minimize downtime and maintain data availability. Documenting data engineering processes, data flows, and architectural decisions will be crucial for effective collaboration with data scientists, analysts, and stakeholders. Additionally, implementing data security measures and adhering to compliance standards like GDPR and HIPAA will be essential to protect sensitive data.

In addition to your technical skills, you are expected to demonstrate leadership by driving data engineering strategies, engaging in sales and proposal activities, developing strong customer relationships, and mentoring other team members. Your experience with cloud-based data solution architectures, client engagement, and leading technical teams will be valuable assets in this role.

To qualify for this position, you should hold a Bachelor's or Master's degree in Computer Science or a related field. You must have over 10 years of experience in Data Engineering, with a strong focus on architecture. Proven expertise in Snowflake, Azure, and Databricks technologies, along with comprehensive knowledge of data warehousing concepts, ETL processes, and data integration techniques, is required. Exceptional SQL and PL/SQL skills, experience with performance tuning, and strong problem-solving abilities are essential. Excellent communication skills and relevant certifications in technologies like Snowflake and Azure will be advantageous.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Ahmedabad, Gujarat
On-site
As a HubSpot Marketing Hub Specialist within the IMS nHance division, you will be responsible for managing, optimizing, and troubleshooting HubSpot-related tasks to support our marketing stakeholders effectively. Your role is critical in ensuring smooth operations, resolving technical challenges, and enhancing marketing processes within the HubSpot ecosystem.

Your key responsibilities will include handling data imports, exports, and cleansing to maintain high-quality CRM data. You will also play a vital role in ensuring GDPR compliance through effective management of opt-in processes and permissions. As the primary point of contact for troubleshooting HubSpot issues, you will minimize disruptions to marketing activities and assist in form creation and property management for seamless data capture. Furthermore, you will support marketing users with lead management processes, integrate HubSpot with other platforms to ensure correct data flow, and configure and manage HubSpot properties, sequences, and analytics to align with marketing goals. Implementing best practices for RevOps within the marketing function will be essential, along with collaborating closely with marketing teams to optimize workflows and enhance MarTech adoption.

Your proficiency in HubSpot, ability to quickly grasp new technology, data literacy with a commercial mindset, and familiarity with marketing campaign measurements and KPIs are key attributes for success in this role. Effective communication, collaboration skills, and a proactive approach to seeking innovative solutions that benefit our businesses and customers are highly valued. Moreover, your knowledge of governance, expertise in simplifying complex systems and processes, understanding of GDPR compliance, and appreciation of the importance of testing will be instrumental. Your courage to contribute opinions, strong understanding of data management, problem-solving skills, and ability to train marketing teams on HubSpot-related queries are essential requirements.

In summary, you should possess a strong understanding of data management, integrations, and marketing automation, along with knowledge of GDPR compliance and best practices for data handling, and proficiency in troubleshooting HubSpot issues. Your analytical skills, familiarity with HubSpot reports, and ability to support marketing teams effectively will be crucial for driving success in this role.
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
Karnataka
On-site
As a Senior Platform Engineer at Kenvue Data Platforms, you will have an exciting opportunity to be part of our growing Data & Analytics product line team. Your role involves collaborating closely with various teams, such as Business Partners, Product Owners, Data Strategy, Data Platform, Data Science, and Machine Learning (MLOps), to drive innovative data products for end users. You will play a key role in shaping the overall solution and data platforms, ensuring their stability, responsiveness, and alignment with business and cloud computing needs. Your expertise will be crucial in optimizing business outcomes and contributing to the growth and success of the organization.

Your responsibilities will include providing leadership for data platforms in partnership with architecture teams, conducting proofs of concept to deliver secure and scalable platforms, staying updated on emerging technologies, mentoring other platform engineers, and focusing on the execution and delivery of reliable data platforms. You will work closely with Business Analytics leaders to understand business needs and create value through technology. Additionally, you will lead data platform operations, build next-generation data and analytics capabilities, and drive the adoption and scaling of data products within the organization.

To be successful in this role, you should have an undergraduate degree in Technology, Computer Science, applied data sciences, or related fields, with an advanced degree preferred. You should possess strong analytical skills, effective communication abilities, and a proven track record in developing and maintaining data platforms. Experience with cloud platforms such as Azure, GCP, and AWS, cloud-based databases, data streaming platforms, and Agile methodology will be essential. Your ability to define the platform tech stack, prioritize work items, and work effectively in a diverse and inclusive company culture will be critical to your success in this role.

If you are passionate about leveraging data and technology to drive business growth, make a positive impact on personal health, and shape the future of data platforms, then this role at Kenvue Data Platforms is the perfect opportunity for you. Join us in our mission to empower millions of people every day through insights, innovation, and care. We look forward to welcoming you to our team!

Location: Asia Pacific-India-Karnataka-Bangalore
Function: Digital Product Development

Qualifications:
- Undergraduate degree in Technology, Computer Science, applied data sciences, or related fields; advanced degree preferred
- Strong interpersonal and communication skills, with the ability to explain digital concepts to business leaders and vice versa
- 4 years of data platforms experience in Consumer/Healthcare Goods companies
- 6 years of progressive experience in developing and maintaining data platforms
- Minimum 5 years of hands-on experience with cloud platforms and cloud-based databases
- Experience with data streaming platforms, microservices, and data integration
- Proficiency in Agile methodology within a DevSecOps model
- Ability to define a platform tech stack to address data challenges
- Proven track record of delivering high-profile projects within defined resources
- Commitment to diversity, inclusion, and equal opportunity employment
Posted 1 week ago
0.0 - 4.0 years
0 Lacs
Pune, Maharashtra
On-site
As a member of the team, you will be responsible for supporting Data Cleansing & Integration Project Delivery. This involves working closely with the Data Cleansing team to collect missing information from designated sources within specified timelines. Additionally, you will play a key role in Data Sourcing by assisting in gathering accurate data attributes from various sources. We are seeking fresh graduate candidates who possess strong communication skills, a willingness to learn, and proficiency in navigating MS Office and other applications. In addition to technical skills, soft skills such as a readiness to take on new challenges, active listening, and collaboration are highly valued in this role.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a Senior Data Quality Analyst at TriNet, you will play a crucial role in ensuring the accuracy, consistency, and integrity of data within the organization. Your responsibilities will include analyzing and validating data to maintain high quality standards, conducting thorough assessments of data quality across various sources, and collaborating with data governance teams to establish and enforce data quality standards and policies.

You will be expected to develop and implement data quality standards and processes, continuously monitor data quality metrics, and work closely with IT, data management, and business teams to design and implement data strategies and models. Root cause analysis of data quality issues, recommending corrective actions, and providing training and support to data stewards and team members on data quality best practices will also be part of your role.

A Bachelor's Degree in Computer Science, Information Technology, Statistics, or a related field is required for this position, along with a minimum of 5 years of experience in data quality analysis or a related field. Proficiency in SQL and data analysis tools, strong analytical and problem-solving skills, excellent attention to detail, and the ability to work collaboratively with cross-functional teams are essential for success in this role. Additionally, strong communication and presentation skills, proficiency in the Microsoft Office Suite, and experience working with domain structures are desired qualifications.

This is an on-site position based in India with minimal travel requirements. The work environment is clean, pleasant, and comfortable, enabling you to focus on performing the essential functions of the job effectively. TriNet is committed to building a diverse, inclusive, and authentic workplace, and encourages candidates who are excited about the role to apply even if their past experience does not align perfectly with every single qualification in the job description.

In summary, as a Senior Data Quality Analyst at TriNet, you will have the opportunity to contribute to the success of the organization by ensuring that high data quality standards and processes are maintained, collaborating with various teams to address data-related issues, and continuously improving data quality processes and tools. If you are passionate about data quality and eager to make an impact in a dynamic and innovative environment, we invite you to apply for this exciting opportunity.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

We are looking for Senior Level Consultants with expertise in Data Modelling, Data Integration, Data Manipulation, and analysis to join the SCT group of our GDS consulting team. This is a fantastic opportunity to be part of a leading firm while being instrumental in the growth of a new service offering. This role demands a highly technical, extremely hands-on data warehouse modelling consultant who will work closely with our EY Partners and external clients to develop new business as well as drive other initiatives on different business needs. The ideal candidate must have a good understanding of the value of data warehousing and ETL and proven experience in delivering solutions to different lines of business and technical leadership.

**Your Key Responsibilities**
- A minimum of 5+ years of experience in BI/data integration/ETL/DWH solutions on cloud and on-premises platforms such as Informatica PowerCenter/IICS, Alteryx, Talend, Azure Data Factory (ADF), SSIS, SSAS, or SSRS, plus experience with a reporting tool such as Power BI, Tableau, or OBIEE.
- Performing data analysis and data manipulation as per client requirements.
- Expert in data modelling to simplify business concepts.
- Create extensive ER diagrams to help the business in decision making.
- Working experience with large, heterogeneous datasets in building and optimizing data pipelines, pipeline architectures, and integrated datasets using data integration technologies.
- Should be able to develop sophisticated workflows and macros (batch, iterative, etc.) in Alteryx with enterprise data.
- Design and develop ETL workflows and datasets in Alteryx to be used by the BI reporting tool.
- Perform end-to-end data validation to maintain the accuracy of data sets.
- Support client needs by developing SSIS packages in Visual Studio (version 2012 or higher) or Azure Data Factory (extensive hands-on experience implementing data migration and data processing using Azure Data Factory).
- Support client needs by delivering various integrations with third-party applications.
- Experience in pulling data from a variety of data source types using appropriate connection managers as per client needs.
- Develop, customize, deploy, and maintain SSIS packages as per client business requirements.
- Should have thorough knowledge of creating dynamic packages in Visual Studio, covering concepts such as reading multiple files, error handling, archiving, configuration creation, and package deployment.
- Experience working with clients throughout various parts of the implementation lifecycle.
- Proactive, with a solution-oriented mindset and readiness to learn new technologies for client requirements.
- Analysing and translating business needs into long-term solution data models.
- Evaluating existing data warehouses or systems.
- Strong knowledge of database structure systems and data mining.

**Skills And Attributes For Success**
- Deliver large/medium DWH programmes, and demonstrate expert core consulting skills, an advanced level of ODI, Informatica, SQL, PL/SQL, Alteryx, ADF, SSIS, and SSAS knowledge, and the industry expertise to support delivery to clients.
- Demonstrate management skills and an ability to lead projects or teams individually.
- Experience in team management, communication, and presentation.

**To qualify for the role, you must have**
- 5+ years of ETL experience as Lead/Architect.
- Expertise in ETL mappings and data warehouse concepts.
- The ability to design a data warehouse and present solutions as per client needs.
- Thorough knowledge of Structured Query Language (SQL) and experience working on SQL Server, including SQL tuning and optimization using explain plans and SQL trace files.
- Experience in developing SSIS batch jobs, deployment, job scheduling, etc.
- Experience building Alteryx workflows for data integration, modelling, optimization, and data quality.
- Knowledge of Azure components like ADF, Azure Data Lake, and Azure SQL DB.
- Knowledge of data modelling and ETL design.
- The ability to design and develop complex mappings, process flows, and ETL scripts.
- In-depth experience in designing databases and data modelling.

**Ideally, you'll also have**
- Strong knowledge of ELT/ETL concepts, design, and coding.
- Expertise in data handling to resolve any data issues as per client needs.
- Experience in designing and developing DB objects such as tables, views, indexes, materialized views, and analytical functions.
- Experience creating complex SQL queries for retrieving, manipulating, checking, and migrating complex datasets in a DB.
- Experience in SQL tuning and optimization using explain plans and SQL trace files.
- Good knowledge of ETL technologies/tools such as Alteryx, SSAS, SSRS, Azure Analysis Services, and Azure Power Apps.
- Good verbal and written communication in English, and strong interpersonal, analytical, and problem-solving abilities.
- Experience interacting with customers to understand business requirement documents and translate them into ETL specifications and high- and low-level design documents.
- Additional knowledge of BI tools such as Power BI, Tableau, etc. is preferred.
- Experience with cloud databases and multiple ETL tools.

**What We Look For**
The incumbent should be able to drive ETL infrastructure-related developments. Additional knowledge of complex source system data structures, preferably in the Financial Services industry, and reporting-related developments will be an advantage. This is an opportunity to be part of a market-leading, multi-disciplinary team of 10,000+ professionals in the only integrated global transaction business worldwide, with opportunities to work with EY GDS consulting practices globally across leading businesses in a range of industries.

**What Working At EY Offers**
At EY, we're dedicated to helping our clients, from startups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching, and feedback from some of the most engaging colleagues around
- Opportunities to develop new skills and progress your career
- The freedom and flexibility to handle your role in a way that's right for you.
EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people, and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Chandigarh
On-site
As a Senior Data Engineer, you will play a crucial role in supporting the Global BI team for Isolation Valves as it transitions to Microsoft Fabric. Your primary responsibilities will involve data gathering, modeling, integration, and database design to facilitate efficient data management. You will be tasked with developing and optimizing scalable data models to cater to analytical and reporting needs, utilizing Microsoft Fabric and Azure technologies for high-performance data processing.

Your duties will include collaborating with cross-functional teams, such as data analysts, data scientists, and business collaborators, to understand their data requirements and deliver effective solutions. You will leverage Fabric Lakehouse for data storage, governance, and processing to back Power BI and automation initiatives. Additionally, your expertise in data modeling, particularly in data warehouse and lakehouse design, will be essential in designing and implementing data models, warehouses, and databases using MS Fabric, Azure Synapse Analytics, Azure Data Lake Storage, and other Azure services.

Furthermore, you will be responsible for developing ETL processes using tools like SQL Server Integration Services (SSIS), Azure Synapse Pipelines, or similar platforms to prepare data for analysis and reporting. Implementing data quality checks and governance practices to ensure data accuracy, consistency, and security will also fall under your purview. You will supervise and optimize data pipelines and workflows for performance, scalability, and cost efficiency, utilizing Microsoft Fabric for real-time analytics and AI-powered workloads.

Your role will require strong proficiency in Business Intelligence (BI) tools such as Power BI, Tableau, and other analytics platforms, along with experience in data integration and ETL tools like Azure Data Factory. A deep understanding of Microsoft Fabric or similar data platforms, as well as comprehensive knowledge of the Azure cloud platform, particularly in data warehousing and storage solutions, will be necessary. Effective communication skills to convey technical concepts to both technical and non-technical stakeholders, the ability to work both independently and within a team environment, and the willingness to stay abreast of new technologies and business areas are also vital for success in this role.

To excel in this position, you should possess 5-7 years of experience in Data Warehousing with on-premises or cloud technologies, strong analytical abilities to tackle complex data challenges, and proficiency in database management, SQL query optimization, and data mapping. A solid grasp of Excel, including formulas, filters, macros, pivots, and related operations, is essential. Proficiency in Python and SQL/advanced SQL for data transformations and debugging, along with a willingness to work flexible hours based on project requirements, is also required.

Furthermore, hands-on experience with Fabric components such as Lakehouse, OneLake, Data Pipelines, Real-Time Analytics, Power BI integration, and semantic models, as well as advanced SQL skills and experience with complex queries, data modeling, and performance tuning, are highly desired. Prior exposure to implementing the Medallion Architecture for data processing, experience in a manufacturing environment, and familiarity with Oracle, SAP, or other ERP systems will be advantageous.
A Bachelor's degree or equivalent experience in a science-related field, good interpersonal skills in English (spoken and written), and Agile certification will set you apart as a strong candidate for this role.

At Emerson, we are committed to fostering a workplace where every employee is valued, respected, and empowered to grow. Our culture encourages innovation, collaboration, and diverse perspectives, recognizing that great ideas come from great teams. We invest in your ongoing career development, offering mentorship, training, and leadership opportunities to ensure your success and make a lasting impact. Employee wellbeing is a priority for us, and we provide competitive benefits plans, medical insurance options, an Employee Assistance Program, flexible time off, and other supportive resources to help you thrive.

Emerson is a global leader in automation technology and software, dedicated to helping customers in critical industries operate more sustainably and efficiently. Our commitment to our people, communities, and the planet drives us to create positive impacts through innovation, collaboration, and diversity. If you seek an environment where you can contribute to meaningful work and develop your skills while making a difference, join us at Emerson. Let's go together towards a brighter future.
Posted 1 week ago
8.0 - 15.0 years
0 Lacs
Karnataka
On-site
The position of Informatica Architect requires 8-15 years of experience in data profiling and data quality rules (preferably in the insurance domain) and strong experience in Informatica MDM. The role involves migration experience from Informatica MDM to Informatica Cloud MDM SaaS; designing, developing, and implementing MDM solutions; configuring and customizing Informatica MDM Hub on Cloud; managing data mapping, transformation, and cleansing rules; integrating data with enterprise systems and databases; and developing data quality rules and validation processes. Additional skills include knowledge of MDM concepts and insurance domain experience; Informatica MDM certification is preferred. Soft skills such as excellent written and verbal communication, experience with cross-functional teams, and strong stakeholder management are required.

The MDM Consultant / Data Analyst position, on the other hand, requires 8-10 years of experience in data profiling and identifying data quality rules (preferably in insurance), along with proficiency in SQL and data analysis. The role also demands an understanding of data warehouse concepts and strong analytical and problem-solving skills for trend analysis and quality checks. Additional skills include knowledge of MDM concepts and familiarity with the insurance domain. Soft skills such as excellent written and verbal communication, experience working with cross-functional teams, and a strong ability to work with client stakeholders are essential.

Both positions offer a collaborative work environment that involves interaction with cross-functional teams and client stakeholders. Key responsibilities include data quality assessment, governance, troubleshooting, and ensuring data integrity across the organization. For any specific changes or additional details, please reach out to sushma@metamorfs.com or contact +91-8971322318.
Posted 1 week ago
0.0 - 5.0 years
0 Lacs
Haryana
On-site
As a member of the Global Investment Operations team at KKR & Co. Inc., you will play a crucial role in supporting the firm's day-to-day middle office and back office operations to drive business decisions and long-term success. You will collaborate closely with internal teams, including finance and data team members, and with external agents to ensure accurate processing and reporting of investments.

Your primary responsibility will be to independently develop and execute procedures to track portfolio investments in multibillion-dollar funds. This includes troubleshooting technical and accounting issues, reconciling cash and positions across all funds, and processing upcoming activities and contract modifications per agent notices. You will work with third-party agent banks and custodians to improve processes, ensure quality deliverables, and resolve day-to-day queries. You will also interact with global counterparts to understand reporting needs, address bottlenecks, and act as a point of escalation, while managing process metrics, key performance indicators, and other dashboards at regular intervals to ensure operational excellence and efficiency.

To qualify for this position, you should have a Bachelor's Degree or equivalent work experience, with 0-5 years of experience in teams such as Asset Servicing or Loan Servicing. Knowledge of loans, bonds, loan syndication, and investment strategies such as Direct Lending, Mezzanine, CLOs, and Asset-backed Financing is preferred, and experience in a multinational Financial Services organization and/or Private Equity will be advantageous. Strong communication and email-writing skills and the ability to manage multiple requests and tasks efficiently are essential, as are advanced proficiency in Excel and familiarity with tools like Advent Geneva, ClearPar, Markit, and Allvue (Everest). Accounting experience, intellectual curiosity, integrity, and a collaborative mindset are highly valued.

This is a full-time role based in our Gurugram office, with a 4-day in-office and 1-day flexible work arrangement. If you are a proactive individual with a keen interest in investment operations and a drive for excellence, we encourage you to apply for this exciting opportunity at KKR & Co. Inc.
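Purely by way of illustration, a cash reconciliation of the kind described above might be sketched in pandas as follows; the file names and columns (fund_id, currency, balance) are hypothetical.

import pandas as pd

# Internal books vs. the agent bank statement, both with fund_id, currency, balance.
books = pd.read_csv("internal_cash.csv")
agent = pd.read_csv("agent_cash.csv")

recon = books.merge(
    agent, on=["fund_id", "currency"], how="outer",
    suffixes=("_books", "_agent"),
)
recon["break"] = (
    recon["balance_books"].fillna(0) - recon["balance_agent"].fillna(0)
).round(2)

# Surface only the breaks that need investigation.
print(recon[recon["break"] != 0])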
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
Bhubaneswar
On-site
As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. Your role will involve collaborating with the team to ensure project progress and providing solutions that align with business needs and application specifications. You are expected to be a subject matter expert (SME) and lead the team in implementing innovative solutions.

Key Responsibilities include:
- Collaborating with and managing the team to perform effectively
- Making team decisions and contributing to key decisions across multiple teams
- Providing solutions to problems within your team and across various teams
- Conducting regular team meetings to ensure project progress
- Staying updated on industry trends and technologies

Professional & Technical Skills Required:
- Proficiency in Stibo Product Master Data Management
- Strong understanding of data modeling and data architecture
- Experience in data integration and data migration
- Hands-on experience in application development and customization
- Knowledge of data governance and data quality management

A minimum of 7.5 years of experience in the field is required, along with 15 years of full-time education. This position is based in Bhubaneswar.
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a Senior ETL Developer in the Data Services Team, you will take a lead role in ETL design, data modeling, and ETL development. Your responsibilities will include facilitating best-practice guidelines, providing technical leadership, working with stakeholders to translate requirements into solutions, gaining approval for designs and effort estimates, and documenting work via Functional and Technical Specs. You will also analyze processes for gaps and weaknesses, prepare roadmaps and migration plans, and communicate progress using the Agile Methodology.

To excel in this role, you should have:
- At least 5 years of experience with Oracle, Data Warehousing, and Data Modeling
- 4 years of experience with ODI or Informatica IDMC
- 3 years of experience with Databricks Lakehouse and/or Delta tables
- 2 years of experience designing, implementing, and supporting a Kimball method data warehouse on SQL Server or Oracle
- Strong SQL skills and a background in Data Integration, Data Security, and Enterprise Data Warehouse development
- Experience with Change Management, Release Management, and Source Code control practices

A high school diploma or equivalent is required, with a preference for a Bachelor of Arts or Bachelor of Science degree in computer science, systems analysis, or a related area. If you are enthusiastic about leveraging your ETL expertise to drive digital modernization and enhance data services, we encourage you to apply for this role and be part of our dynamic team.
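As a hedged sketch of the Kimball-method work this role mentions, the following shows a simplified Type 2 slowly changing dimension load against Delta tables via Spark SQL. The dim_customer and stg_customer tables and their columns (customer_id, address, is_current, valid_from, valid_to) are invented; a real load would track more attributes and surrogate keys.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("scd2-sketch").getOrCreate()

# Step 1: expire the current row for customers whose tracked attribute changed.
spark.sql("""
    MERGE INTO dim_customer AS d
    USING stg_customer AS s
      ON d.customer_id = s.customer_id AND d.is_current = true
    WHEN MATCHED AND d.address <> s.address THEN
      UPDATE SET d.is_current = false, d.valid_to = current_date()
""")

# Step 2: insert a fresh current row for new and changed customers.
spark.sql("""
    INSERT INTO dim_customer
    SELECT s.customer_id, s.address, true, current_date(), NULL
    FROM stg_customer s
    LEFT JOIN dim_customer d
      ON d.customer_id = s.customer_id AND d.is_current = true
    WHERE d.customer_id IS NULL
""")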
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
Pune, Maharashtra
On-site
We are looking for an experienced and skilled Azure Data Engineer to join our team at Creant for a contract-based position in Pune. As an Azure Data Engineer, you will design, develop, and implement data analytics and data warehouse solutions on the Azure Data Platform, collaborating closely with business stakeholders, data architects, and technical teams to ensure efficient data integration, transformation, and availability.

Your key responsibilities will include designing, developing, and implementing data warehouse and data analytics solutions on the Azure Data Platform; creating and managing data pipelines using Azure Data Factory (ADF) and Azure Databricks; and working extensively with Azure AppInsights, Dataverse, and PowerCAT Tools to ensure efficient data processing and integration. You will implement and manage data storage solutions using Azure SQL Database and other Azure data services, and design and develop Logic Apps and Azure Function Apps for data processing, orchestration, and automation. You will also collaborate with cross-functional teams to understand business requirements and translate them into scalable data solutions; perform data validation and quality checks and ensure data consistency across systems; monitor, troubleshoot, and optimize data solutions for performance, scalability, and security; and prepare technical documentation and support project handover to operations teams.

The primary skills required for this role include:
- Strong experience as a Data Engineer, with 6 to 10 years of relevant experience
- Expertise in Azure data engineering services such as Azure AppInsights, Dataverse, PowerCAT Tools, Azure Data Factory (ADF), Azure Databricks, Azure SQL Database, Azure Function Apps, and Azure Logic Apps
- Proficiency in ETL/ELT processes, data integration, and data migration
- A solid understanding of data warehouse architecture and data modeling principles
- Experience working on large-scale data platforms and handling complex data workflows
- Familiarity with Azure Analytics Services and related data tools
- Strong knowledge of SQL, and of Python or Scala for data manipulation and processing

Preferred skills include knowledge of Azure Synapse Analytics, Cosmos DB, and Azure Monitor; a good understanding of data governance, security, and compliance; and strong problem-solving, troubleshooting, communication, and stakeholder management skills.
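Purely as an illustration of the ADF-orchestrated Databricks work described above, here is a minimal notebook-style step that reads raw JSON from ADLS and persists a curated Delta table; the storage account, container names, and columns are all invented for the example.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("adf-step-sketch").getOrCreate()

# Read raw events landed in the data lake by an upstream ADF copy activity.
raw = spark.read.json("abfss://raw@examplelake.dfs.core.windows.net/events/")

# Light transformation: typed timestamps, drop rows without an event type.
curated = (
    raw
    .withColumn("event_ts", F.to_timestamp("event_ts"))
    .filter(F.col("event_type").isNotNull())
)

(curated.write.format("delta")
    .mode("append")
    .partitionBy("event_type")
    .save("abfss://curated@examplelake.dfs.core.windows.net/events/"))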
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
You will be joining Wells Fargo as a Lead Data Product Management Consultant for the Home Lending (Servicing) team. In this role, you will enable data product design and delivery to drive business initiatives, strategies, and analytics while adhering to governance set by Data Management. Acting as a bridge between data analytics, data management, and technology, your focus will be on data integration, efficiency, and enablement across various data platforms and utilities.

Your key responsibilities will include leading complex data product initiatives, participating in large-scale planning to drive data enablement and capabilities, reviewing and analyzing multi-faceted data product initiatives, making decisions in complex situations, and collaborating with peers and senior managers to ensure optimal performance of data product solutions. You will provide strategic input on new use case intake, prioritization, product roadmap definition, and other critical business processes. Additionally, you will manage complex datasets, create and maintain data product roadmaps, design innovative data products, and serve as a liaison between data management, product teams, data engineering, and architecture teams.

To be successful in this role, you should have at least 5 years of data product or data management experience. Desired qualifications include experience in the Home Lending domain, strategic planning, effective teamwork, excellent communication skills, and the ability to work in a virtual environment across different time zones. The work timings for this role are 1:30 PM to 10:30 PM IST, and it is currently an in-office role for 3 full days per week, as mandated by the business.

If you are looking to join a team that values diversity and focuses on building strong customer relationships while maintaining a risk-mitigating and compliance-driven culture, this role at Wells Fargo might be the right fit for you.
Posted 1 week ago