3.0 - 7.0 years
11 - 15 Lacs
gurugram, coimbatore, bengaluru
Work from Office
Data engineers at Thoughtworks build, maintain, and test the software architecture and infrastructure for managing data applications. They develop core capabilities, including technical and functional data platforms, support functional streams of work, and are accountable for timely delivery. They work with the latest big data tools, frameworks, and offerings (data mesh, etc.) while enabling credible, collaborative problem solving to execute on a strategy.

Job responsibilities
- You will collaborate with team members to design intricate data processing pipelines, addressing clients' most challenging problems.
- You will collaborate with data scientists to design scalable implementations of their models.
- You will write clean, iterative code using TDD and leverage various continuous delivery practices to deploy, support, and operate data pipelines.
- You will apply different standard models for big data and create data models for at least one type of modeling technique.
- You will incorporate data quality into your day-to-day work.

Job qualifications
Technical skills
- You have hands-on experience with data modeling and modern data engineering tools and platforms.
- You have experience writing clean, high-quality code in your preferred programming language.
- You are familiar with building data pipelines and data-centric applications on distributed storage and distributed processing platforms in a production setting.
- You know data visualization techniques and can communicate insights appropriately for the audience.
- You are aware of data governance, security, and privacy strategies for solving business problems.
- You have experience with different types of databases (SQL, NoSQL, etc.).

Professional skills
- You understand the importance of stakeholder management and can easily liaise between clients and other key stakeholders throughout projects, ensuring buy-in and gaining trust along the way.
- You are resilient in ambiguous situations and can adapt your role to approach challenges from multiple perspectives.
- You don't shy away from risks or conflicts; instead, you take them on and skillfully manage them.
- You enjoy influencing others and always advocate for technical excellence while being open to change when needed.
Posted 2 weeks ago
18.0 - 22.0 years
25 - 30 Lacs
chennai
Work from Office
Architect and modernize enterprise data platforms. Lead initiatives on Data Lake, EDW, and Data as a Service. Implement governance, data quality, and security strategies. Guide migration and API integration for scalability. Required Candidate profile: Provide leadership and mentoring to data engineering teams. Location: Pan India.
Posted 2 weeks ago
18.0 - 22.0 years
25 - 30 Lacs
thiruvananthapuram
Work from Office
Architect and modernize enterprise data platforms. Lead initiatives on Data Lake, EDW, and Data as a Service. Implement governance, data quality, and security strategies. Guide migration and API integration for scalability. Required Candidate profile: Provide leadership and mentoring to data engineering teams. Location: Pan India.
Posted 2 weeks ago
18.0 - 22.0 years
25 - 30 Lacs
hyderabad
Work from Office
Architect and modernize enterprise data platforms. Lead initiatives on Data Lake, EDW, and Data as a Service. Implement governance, data quality, and security strategies. Guide migration and API integration for scalability. Required Candidate profile: Provide leadership and mentoring to data engineering teams. Location: Pan India.
Posted 2 weeks ago
18.0 - 22.0 years
25 - 30 Lacs
gurugram
Work from Office
Architect and modernize enterprise data platforms. Lead initiatives on Data Lake, EDW, and Data as a Service. Implement governance, data quality, and security strategies. Guide migration and API integration for scalability. Required Candidate profile: Provide leadership and mentoring to data engineering teams. Location: Pan India.
Posted 2 weeks ago
18.0 - 22.0 years
25 - 30 Lacs
pune
Work from Office
Architect and modernize enterprise data platforms. Lead initiatives on Data Lake, EDW, and Data as a Service. Implement governance, data quality, and security strategies. Guide migration and API integration for scalability. Required Candidate profile: Provide leadership and mentoring to data engineering teams. Location: Pan India.
Posted 2 weeks ago
18.0 - 22.0 years
25 - 30 Lacs
bengaluru
Work from Office
Architect and modernize enterprise data platforms. Lead initiatives on Data Lake, EDW, and Data as a Service. Implement governance, data quality, and security strategies. Guide migration and API integration for scalability. Required Candidate profile: Provide leadership and mentoring to data engineering teams. Location: Pan India.
Posted 2 weeks ago
18.0 - 22.0 years
25 - 30 Lacs
mumbai
Work from Office
Architect and modernize enterprise data platforms. Lead initiatives on Data Lake, EDW, and Data as a Service. Implement governance, data quality, and security strategies. Guide migration and API integration for scalability. Required Candidate profile: Provide leadership and mentoring to data engineering teams. Location: Pan India.
Posted 2 weeks ago
5.0 - 10.0 years
13 - 23 Lacs
pune
Hybrid
We are hiring: Data & Records Governance roles - Pune (UK captive bank).

Roles available:
- Assistant Vice President (Data & Records Governance Lead): Pune | Up to 23 LPA | 7+ years' experience
- Data Governance Analyst: Pune | Up to 13.50 LPA | 4+ years' experience

What you'll do:
- Build and maintain Data & Records Governance frameworks aligned with laws, regulations, and best practices.
- Drive data quality, compliance, and risk management.
- Support governance strategy across customer, financial, and internal records.
- Collaborate across functions to deliver trusted, secure data for the business.

What we're looking for:
- Proven expertise in data governance, data quality, and risk management.
- Knowledge of data architecture, integration, analytics, AI, or cloud computing.
- For AVP: hands-on experience with data governance frameworks, quality, and risk management, plus strong leadership skills to guide teams, influence stakeholders, and drive change.
- For Analyst: hands-on experience with data governance frameworks, quality, and risk management.

For details, call 9368820159 or email your resume to sakshi.n@manningconsulting.in
Posted 2 weeks ago
3.0 - 8.0 years
10 - 20 Lacs
hyderabad, bengaluru, mumbai (all areas)
Work from Office
Role: Data Governance - Cloud Data Quality (CDQ)
Experience: 3+ years
Mode of work: Remote
Notice period: 15-30 days only
Package: 10 LPA max
- Strong hands-on experience with Informatica Cloud Data Quality (CDQ)
- Data quality rule design and implementation
- Proficiency in data profiling, cleansing, standardization, and data verification
- Solid understanding of data governance and data management principles
- Experience with ETL processes and data integration
- Strong analytical and problem-solving skills
- Excellent communication and collaboration skills
- Knowledge of SQL and database management systems
- Familiarity with CDGC preferred; Azure experience required
- Strong communication skills, oral and written
Posted 2 weeks ago
4.0 - 8.0 years
12 - 15 Lacs
hyderabad, chennai, bengaluru
Work from Office
Role & responsibilities
We are seeking a hands-on Data Engineer to develop, optimize, and maintain automated data pipelines supporting data governance and analytics initiatives. The role focuses on building production-ready workflows for ingestion, transformation, quality checks, lineage capture, access auditing, cost usage analysis, retention tracking, and metadata integration, primarily using Azure Databricks, Azure Data Lake, and Microsoft Purview.

Experience: 4+ years in data engineering, with strong Azure and Databricks experience

Key Responsibilities
- Pipeline Development: Design, build, and deploy robust ETL/ELT pipelines in Databricks (PySpark, SQL, Delta Lake) to ingest, transform, and curate governance and operational metadata from multiple sources landed in Databricks.
- Granular Data Quality Capture: Implement profiling logic to capture issue-level metadata (source table, column, timestamp, severity, rule type) to support drill-down from dashboards into specific records and enable targeted remediation.
- Governance Metrics Automation: Develop data pipelines to generate metrics for dashboards covering data quality, lineage, job monitoring, access and permissions, query cost, usage and consumption, retention and lifecycle, policy enforcement, sensitive data mapping, and governance KPIs.
- Microsoft Purview Integration: Automate asset onboarding, metadata enrichment, classification tagging, and lineage extraction for integration into governance reporting.
- Data Retention & Policy Enforcement: Implement logic for retention tracking and policy compliance monitoring (masking, RLS, exceptions).
- Job & Query Monitoring: Build pipelines to track job performance, SLA adherence, and query costs for cost and performance optimization.
- Metadata Storage & Optimization: Maintain curated Delta tables for governance metrics, structured for efficient dashboard consumption.
- Testing & Troubleshooting: Monitor pipeline execution, optimize performance, and resolve issues quickly.
- Collaboration: Work closely with the lead engineer, QA, and reporting teams to validate metrics and resolve data quality issues.
- Security & Compliance: Ensure all pipelines meet organizational governance, privacy, and security standards.

Required Qualifications
- Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field
- 4+ years of hands-on data engineering experience with Azure Databricks and Azure Data Lake
- Proficiency in PySpark, SQL, and ETL/ELT pipeline design
- Demonstrated experience building granular data quality checks and integrating governance logic into pipelines
- Working knowledge of Microsoft Purview for metadata management, lineage capture, and classification
- Experience with Azure Data Factory or equivalent orchestration tools
- Understanding of data modeling, metadata structures, and data cataloging concepts
- Strong debugging, performance tuning, and problem-solving skills
- Ability to document pipeline logic and collaborate with cross-functional teams

Preferred Qualifications
- Microsoft certification in Azure Data Engineering
- Experience in governance-heavy or regulated environments (e.g., finance, healthcare, hospitality)
- Exposure to Power BI or other BI tools as a data source consumer
- Familiarity with DevOps/CI-CD for data pipelines in Azure
- Experience integrating both cloud and on-premises data sources into Azure
Posted 2 weeks ago
3.0 - 8.0 years
10 - 16 Lacs
hyderabad, bengaluru, mumbai (all areas)
Work from Office
Position: Data Governance - Cloud Data Governance and Catalog (CDGC)
Work mode: Remote
Experience: 3+ years
- Minimum 3+ years of hands-on experience with EDC and Axon; familiarity with CDGC is strongly preferred, as the tools are transitioning to the cloud at the end of the year
- Strong technical and analytical skills
- Leverage industry-standard practices to develop glossaries, policies, guidelines, tools, metrics, and standards for managing metadata
- Previous implementation exposure to the Axon data governance marketplace, with a clear understanding of data domains, the meta model, and the operating model
- Clear communication skills, oral and written, required
- Strong understanding of EDC and its integration with other systems and tools
- Strong experience configuring scanners (native and custom) and APIs
- Strong client-facing skills to work closely with business/SMEs
- Experience with the Azure cloud platform required
- Familiarity with Informatica Cloud Data Quality Management
Posted 2 weeks ago
3.0 - 8.0 years
15 - 30 Lacs
hyderabad
Remote
Role & responsibilities
Data governance and Informatica Cloud Data Quality.

Preferred candidate profile
- Strong hands-on experience with Informatica Cloud Data Quality (CDQ)
- Data quality rule design and implementation
- Proficiency in data profiling, cleansing, standardization, and data verification
- Solid understanding of data governance and data management principles
- Experience with ETL processes and data integration
- Strong analytical and problem-solving skills
- Excellent communication and collaboration skills
- Knowledge of SQL and database management systems
- Familiarity with CDGC preferred; Azure experience required
- Strong communication skills, oral and written
Posted 2 weeks ago
7.0 - 11.0 years
0 Lacs
lucknow, uttar pradesh
On-site
About Agoda: Agoda is an online travel booking platform that offers a wide range of accommodations, flights, and more to travelers worldwide. With a global network of 4.7M hotels and holiday properties, as well as flights and activities, Agoda is committed to providing innovative technology solutions. As part of Booking Holdings and based in Asia, Agoda's team of 7,100+ employees from 95+ nationalities across 27 markets fosters a diverse and collaborative work environment. Embracing a culture of experimentation and ownership, Agoda aims to enhance the travel experience for its customers.

Purpose: Bridging the World Through Travel. Agoda believes that travel enriches people's lives by allowing them to explore, learn, and connect with different cultures. By bringing individuals and communities closer together, travel promotes empathy, understanding, and happiness. The team at Agoda is united by a shared passion to make a positive impact in the world. Leveraging innovative technologies and strong partnerships, Agoda strives to make travel easy and rewarding for everyone.

The Opportunity: Agoda is looking for highly skilled engineers with experience in fintech to join their team. Whether you are a seasoned professional or new to the field, Agoda welcomes applications from intelligent and agile engineers with strong attention to detail. Ideal candidates are capable of working on both back-end and data engineering tasks. If you are passionate about fintech technology and eager to contribute to building and innovating, Agoda would like to hear from you. They value diversity and encourage qualified candidates from various backgrounds to apply.

In this role, you will get to:
- Take ownership of the full product life cycle, from business requirements to coding standards, testing, and monitoring
- Design, develop, and maintain platforms and data pipelines in the fintech domain
- Enhance the scalability, stability, and efficiency of existing systems
- Mentor team members and collaborate with other departments
- Participate in the recruitment of exceptional talent

What you'll need to succeed:
- Minimum 7 years of experience developing performance-critical applications using Scala, Java, C#, or Kotlin
- Proficiency in data tooling such as Spark, Kafka, and workflow orchestration tools
- Strong SQL knowledge and experience with relational databases
- Expertise in building and optimizing big data pipelines and data sets
- Experience in root cause analysis to improve business processes
- Familiarity with Scrum and Agile methodologies
- Excellent communication skills in English

It's great if you have:
- Deep experience with Spark-based distributed data pipelines and Spark Streaming
- Strong background in building finance stack applications
- Knowledge of financial data risk management and data governance
- Experience in leading projects and teams with full ownership of systems
- Ability to build data pipelines for integrating third-party systems

Equal Opportunity Employer: Agoda is an equal opportunity employer that values diversity and inclusion. They will keep your application on file for future opportunities and respect your privacy preferences. For more information, please refer to their privacy policy.
Posted 2 weeks ago
7.0 - 11.0 years
0 Lacs
maharashtra
On-site
Flexing It is a freelance consulting marketplace that connects freelancers and independent consultants with organizations seeking independent talent. Our client, a leading global specialist in energy management and automation, is looking to engage a Consultant Tableau Developer.

In this role, you will play an active part in accelerating the company's Big Data and Analytics environment, contributing to digital initiatives aimed at enhancing, automating, and accelerating the implementation of master data management, adoption of big data platforms, data excellence and data dictionary evolution, data security, and business intelligence and analytics. You will collaborate with different business units of the company and with team members distributed between Paris, Grenoble, Bangalore, and Barcelona.

Key responsibilities include:
- Designing, developing, and delivering analytics solutions integrated with the corporate data platform, both individually and through a team of developers
- Conducting data analysis and data modeling, designing analytics dashboard architecture, and delivering in alignment with global platform standards
- Interacting with customers to understand their business problems and provide analytics solutions
- Collaborating with global data platform leaders to integrate analytics with corporate platforms
- Working with UX/UI global functions to design visualizations for customers
- Building interactive, rich visualization dashboards showcasing KPIs
- Demonstrating strength in data modeling, ETL development, and data warehousing
- Utilizing SQL and query performance tuning skills
- Developing solutions using Tableau to meet enterprise-level requirements
- Operating large-scale data warehousing and analytics projects using AWS technologies

Duration: 3 to 4 months
Location: On-site, Bagmane, Bangalore (hybrid work mode)
Capacity: Full time

Skills required:
- B.E./B.Tech/Master's in Computer Science, Electronics, or a relevant technical certification
- 7 years of experience in analytics development, with data modeling experience on Tableau
- Tableau certification
- Strong presentation, communication, and interpersonal skills
- Ability to work effectively with globally dispersed stakeholders
- Ability to manage multiple priorities in a fast-paced environment
- Data-driven mindset and ability to communicate complex business problems and technical solutions
- Strong team player/leader with analytical and problem-solving abilities

In summary, this Consultant Tableau Developer role offers an opportunity to contribute significantly to enhancing the company's Big Data and Analytics environment in a global work culture, collaborating with different business units and team members across multiple locations. The position requires a strong technology and solution delivery focus, with responsibilities spanning data analysis, visualization design, and integration with corporate platforms using Tableau and other analytics tools.
Posted 2 weeks ago
8.0 - 12.0 years
0 Lacs
hyderabad, telangana
On-site
This position requires strong problem-solving skills, a solid understanding of business operations, and excellent client-facing abilities. The ideal candidate has over 8 years of professional experience, with at least 5 years in a leadership role managing client portfolios in the data engineering sector. A deep understanding of business technology, operational challenges, and the effective solutions used across business functions is essential. You should be adept at using both traditional and modern data engineering technologies and tools to address business issues and guide clients through their data journey. Knowledge of emerging data management technologies (data governance, data quality, security, integration, processing, and provisioning) is a must, along with strong soft skills to collaborate with and lead medium to large teams.

In this role, you will take on leadership responsibilities in client projects, presales/consulting activities, solutioning, business development discussions, and overseeing the execution of data engineering projects. Responsibilities include:

Client Engagement & Relationship Management:
- Act as the main point of contact for clients on data engineering projects, understanding their requirements, challenges, and objectives.
- Cultivate and nurture strong client relationships to ensure high satisfaction and repeat business.
- Convert client needs into actionable technical solutions and project plans.

Project Management & Delivery:
- Supervise the end-to-end delivery of data engineering projects, ensuring timely completion within scope and budget.
- Manage project resources, timelines, and risks for seamless project execution.
- Collaborate with cross-functional teams to provide comprehensive data solutions.

Technical Leadership & Innovation:
- Lead the design, development, and deployment of scalable data architectures tailored to client needs.
- Stay updated on industry trends and implement innovative practices in client projects.
- Provide technical oversight and guidance to the data engineering team to ensure high-quality output.

Team Leadership & Development:
- Mentor and manage a team of data engineers, fostering a high-performance culture.
- Provide professional growth opportunities and support to team members.
- Equip the team with the skills and tools needed to deliver top-notch consulting services.

Data Governance & Quality Assurance:
- Implement data governance frameworks to ensure data integrity, security, and compliance in client projects.
- Establish and enforce data quality standards for accurate and reliable data usage in client solutions.

Business Development & Consulting:
- Support business development efforts by contributing to proposals and presenting solutions to prospective clients.
- Showcase thought leadership in data engineering through various channels to enhance the company's industry reputation.

Essential skills:
- 8 to 12 years of data engineering experience, with 3 years in a managerial role in consulting or professional services.
- Proven track record of managing multiple complex data engineering projects simultaneously.
- Experience leading a team of 8 to 12 professionals.
- Strong problem-solving abilities and adeptness in handling complex situations.
- Proficiency in project management, particularly Agile methodologies.
- Client-focused mindset and willingness to tackle challenging projects.
- Excellent written and verbal communication skills.
- Ability to collaborate effectively across functions and levels, especially in virtual environments.

Background check requirement:
- Clean criminal record.

Other requirements:
- Bachelor's or master's degree in Computer Science, Engineering, or a related field, or equivalent practical experience.
- The interview process involves 2-3 rounds.
- This is a full-time, office-based role with no hybrid/remote options.
Posted 2 weeks ago
10.0 - 14.0 years
0 Lacs
pune, maharashtra
On-site
As a Data & Analytics professional in the logistics & freight forwarding industry, your primary responsibility will be to lead the modernization of legacy systems into scalable, cloud-based data frameworks and to integrate structured and unstructured data into unified insights platforms. Your expertise will be essential in implementing data governance standards such as GDPR and ISO 27001, ensuring data quality and managing privacy compliance effectively.

You will drive self-service business intelligence (BI), develop dashboards, and create predictive models using tools like Power BI, Azure, and Snowflake. Your ability to translate complex data into actionable business insights will be instrumental in shaping data-driven strategies within the organization. Leading ETL pipeline development, managing SQL/NoSQL data migrations, and optimizing performance will be key aspects of the role, as will collaborating cross-functionally with business, tech, and analytics teams to prioritize data needs and cultivate a data-driven culture.

Furthermore, you will build and mentor a high-performing data team of analysts, scientists, and architects, encouraging experimentation and fostering innovation to drive continuous improvement and growth.

To excel in this position, you should have 10-12 years of experience in Data & Analytics roles with a track record of successful modernization project delivery. Hands-on experience with Power BI, SQL, Python, data lakes, and cloud warehouses (AWS, Azure, Snowflake) is essential. Strong business acumen, stakeholder engagement skills, and exposure to machine learning and data governance tools are highly desirable.

Key deliverables for this position include establishing a centralized data platform with standardized pipelines and a reporting stack, developing KPI-driven dashboards and AI/ML-powered analytics, implementing a robust data governance and compliance framework, fostering a high-performance data team, and cultivating a company-wide data culture aligned with business strategies for growth, efficiency, and insights.
Posted 2 weeks ago
2.0 - 6.0 years
0 Lacs
pune, maharashtra
On-site
As a Data Governance Specialist (Collibra), you will be responsible for managing and maintaining the Collibra Data Catalog, Data Dictionary, and Business Glossaries. Your role will involve implementing and supporting data quality rules, custom lineage stitching, and metadata ingestion processes. You will also participate in workflow creation and configuration within Collibra, collaborating with business stakeholders to maintain high-quality metadata. It is essential to apply data governance frameworks to ensure regulatory compliance with standards such as GDPR and HIPAA. Supporting data lifecycle initiatives from creation to archival and working cross-functionally in a Scrum/Agile environment are also key responsibilities.

To excel in this role, you should have at least 2 years of hands-on experience with Collibra, preferably in a data steward role, and a minimum of 3 years in a data governance team or function. A strong understanding of data management, data lineage, and data quality best practices is crucial. Your experience should also include metadata management, lineage stitching, and automating data processes. Excellent communication skills, both technical and business-facing, are essential for effective collaboration.

While not mandatory, experience with or an understanding of Master Data Management (MDM) and Collibra certifications such as Ranger or Expert would be beneficial. Exposure to tools like Informatica, Azure Purview, Alation, or Tableau would also be advantageous.

If you are ready to take on this exciting opportunity, connect with us at shalini.v@saranshinc.com
Posted 2 weeks ago
7.0 - 11.0 years
0 Lacs
maharashtra
On-site
Do you want to make a global impact on patient health? Join Pfizer Digital's Artificial Intelligence, Data, and Advanced Analytics (AIDA) organization to leverage cutting-edge technology for critical business decisions and enhance customer experiences for colleagues, patients, and physicians. Our team is at the forefront of Pfizer's transformation into a digitally driven organization, using data science and AI to change patients' lives. The Data Science Industrialization team leads engineering efforts to advance AI and data science applications from POCs and prototypes to full production.

As a Senior Manager, AI and Analytics Data Engineer, you will be part of a global team responsible for designing, developing, and implementing robust data layers that support data scientists and key advanced analytics/AI/ML business solutions. You will partner with cross-functional data scientists and Digital leaders to ensure efficient and reliable data flow across the organization, and you will lead the development of data solutions that support our data science community and drive data-centric decision-making. Join our diverse team in making an impact on patient health through the application of cutting-edge technology and collaboration.

ROLE RESPONSIBILITIES
- Lead development of data engineering processes to support data scientists and analytics/AI solutions, ensuring data quality, reliability, and efficiency.
- Enforce best practices, standards, and documentation as a data engineering tech lead to ensure consistency and scalability, and facilitate related trainings.
- Provide strategic and technical input on the AI ecosystem, including platform evolution, vendor scans, and new capability development.
- Act as a subject matter expert for data engineering on cross-functional teams in bespoke organizational initiatives, providing thought leadership and execution support for data engineering needs.
- Train and guide junior developers on concepts such as data modeling, database architecture, data pipeline management, DataOps and automation, tools, and best practices.
- Stay updated on the latest advancements in data engineering technologies and tools, and evaluate their applicability for improving our data engineering capabilities.
- Direct data engineering research to advance design and development capabilities.
- Collaborate with stakeholders to understand data requirements and address them with data solutions.
- Partner with the AIDA Data and Platforms teams to enforce best practices for data engineering and data solutions.
- Demonstrate a proactive approach to identifying and resolving potential system issues.
- Communicate the value of reusable data components to end-user functions (e.g., Commercial, Research and Development, and Global Supply) and promote innovative, scalable data engineering approaches to accelerate data science and AI work.

BASIC QUALIFICATIONS
- Bachelor's degree in computer science, information technology, software engineering, or a related field (Data Science, Computer Engineering, Information Systems, or a related discipline).
- 7+ years of hands-on experience working with SQL, Python, and object-oriented languages (e.g., Java, C++) to build data pipelines and processes. Proficiency in SQL programming, including the ability to create and debug stored procedures, functions, and views.
- Recognized by peers as an expert in data engineering, with deep expertise in data modeling, data governance, and data pipeline management principles.
- In-depth knowledge of modern data engineering frameworks and tools such as Snowflake, Redshift, Spark, Airflow, Hadoop, Kafka, and related technologies.
- Experience working in a cloud-based analytics ecosystem (AWS, Snowflake, etc.).
- Familiarity with machine learning and AI technologies and their integration with data engineering pipelines.
- Demonstrated experience interfacing with internal and external teams to develop innovative data solutions.
- Strong understanding of the Software Development Life Cycle (SDLC) and the data science development lifecycle (CRISP).
- Highly self-motivated to deliver both independently and with strong team collaboration.
- Ability to creatively take on new challenges and work outside your comfort zone.
- Strong English communication skills (written and verbal).

PREFERRED QUALIFICATIONS
- Advanced degree in Data Science, Computer Engineering, Computer Science, Information Systems, or a related discipline (preferred, but not required).
- Experience in software/product engineering.
- Experience with data-science-enabling technology, such as Dataiku Data Science Studio, AWS SageMaker, or other data science platforms.
- Familiarity with containerization technologies like Docker and orchestration platforms like Kubernetes.
- Experience working effectively in a distributed remote team environment.
- Hands-on experience working in Agile teams, processes, and practices.
- Expertise in cloud platforms such as AWS, Azure, or GCP.
- Proficiency with version control systems like Git.
- Pharma & life science commercial functional knowledge.
- Pharma & life science commercial data literacy.
- Ability to work non-traditional hours interacting with global teams spanning different regions (e.g., North America, Europe, Asia).

Pfizer is an equal opportunity employer and complies with all applicable equal employment opportunity legislation in each jurisdiction in which it operates.
Posted 2 weeks ago
2.0 - 6.0 years
0 Lacs
chennai, tamil nadu
On-site
Epergne Solutions is seeking a Treasury Data Engineer to join the data squad under the Treasury Data & Analytics team. The primary focus of this role is to contribute to the team's strategic pillars:
- Developing a common finance data model for GCFO that serves Treasury purposes with centralized control.
- Building an optimized finance data architecture with straight-through processing (STP) to enable future initiatives such as a self-service portal for business products and analytics.
- Establishing a data governance model that encompasses policies, procedures, and business data ownership.
- Monitoring and managing data quality through the Data Quality Management System (DQMS) and the Issue Management Resolution (IMR) process.
- Implementing self-service data modeling capabilities enhanced by AI functionality.
- Providing a standardized, rationalized set of analytics on the Treasury landing page with a user-friendly interface.

Qualifications for this role include:
- Demonstrated experience in data management and/or executing data operating models in transformation projects.
- At least 6 years of overall work experience, with a minimum of 2 years in relevant data management.
- Proactive, independent work style with strong initiative.
- Excellent communication and presentation skills.
- Consistent high performance and alignment with the organization's core values.
- High energy, drive, and willingness to put in the effort.
- Detail-oriented team player with a pragmatic approach.
- Hands-on experience in data management and operating models within Tier 1 banks.
- Strong stakeholder management and communication skills across different levels.
- Proficiency in SAP products for planning and outlook, including SAP HANA Cloud, Datasphere, and SAP Analytics Cloud.
- Experience with SQL, Python, or Scala.
- Proficiency in ETL tools and data pipeline frameworks.
- Hands-on experience with cloud-based data solutions such as AWS, BigQuery, and Azure.
- Knowledge of API integrations, real-time data processing, and automation.
- Understanding of data governance, security, and compliance reporting.
- Familiarity with treasury operations, cash management, and investment tracking.
- Strong comprehension of financial risk management, including FX, interest rate, and liquidity risk.
- Ability to bridge business requirements with technical solutions.
- Understanding of financial reporting, regulatory compliance, and audit processes.
- Experience with real-time financial data processing and analytics.
- Knowledge of machine learning models for treasury forecasting.
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
maharashtra
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of yourself. EY is counting on your unique voice and perspective to help the organization become even better. Join us to build an exceptional experience for yourself and contribute to creating a better working world for all.

EY's Financial Services Office (FSO) is an industry-focused business unit that provides integrated services, leveraging deep industry experience with strong functional capability and product knowledge. The FSO practice offers advisory services to financial institutions and capital markets participants, including commercial banks, investment banks, broker-dealers, asset managers, and insurance functions of leading Fortune 500 companies. Within EY's FSO Advisory Practice, the Data and Analytics team addresses complex issues and opportunities to deliver better outcomes that help expand and safeguard businesses now and in the future. By embedding the right analytical practices at the core of clients' decision-making, we create a compelling business case.

Key responsibilities
The role requires solid data visualization development experience, and the candidate must be able to:
- Work as both a team player and an individual contributor through the design, development, and delivery phases, with a focus on quality deliverables.
- Collaborate directly with clients to understand requirements and provide input for building optimal solutions.
- Develop new capabilities for clients through visualization dashboards in tools like Power BI, QlikView, Qlik Sense, and Tableau.
- Support organization-level initiatives and operational activities.
- Ensure continuous knowledge management and participate in all internal training programs.

Qualifications
- BE/BTech/MCA/MBA with 3-6 years of industry experience

Technical skills
Must have:
- Excellent visualization design and development experience with tools like Tableau, QlikView, and Power BI.
- Experience designing and building dashboard automation processes and organizing analysis findings.
- Strong understanding of and hands-on experience with SQL; relational database experience with DB2, Oracle, SQL Server, and Teradata.
- Ability to interpret and present data effectively to communicate findings and insights.

Good to have:
- Understanding of data management concepts and data strategy.
- Experience with data preparation tools like Alteryx.
- Knowledge of data concepts such as data warehouses, data marts, data extraction and preparation processes, and data modeling.
- Understanding of the importance of data governance and data security.
- Experience in the banking and capital markets domains.

People responsibilities
- Willingness to travel to meet client needs.
- Excellent communication and interpersonal skills; a team player who maintains good professional relationships with colleagues.
- Ability to multitask, stay flexible, and change priorities quickly.
- Ability to quickly understand and learn new technologies and features, and to inspire learning among peers within the team.

EY | Building a better working world
EY aims to build a better working world by creating long-term value for clients, people, and society, and by building trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers to the complex issues facing the world today.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
The Data Architect role requires a professional with strong organizational and communication skills. You will collaborate with the client architect to drive data architecture-related client workshops, internal meetings, and proposals. The role requires a deep understanding of NiFi architecture and its components, experience with data formats such as JSON, XML, and Avro, and knowledge of data protocols like HTTP, TCP, and Kafka.

You will be expected to coach the larger team, develop a data strategy and vision for it, and provide subject matter training where necessary. An understanding of data governance principles, data quality, database design, data modeling, and cloud architecture is crucial, as is familiarity with data governance and security best practices. Knowledge of containerization and orchestration tools like Docker and Kubernetes is also beneficial.

In this role, you will be responsible for creating high-level designs, data architecture, and data pipelines for Apache NiFi and the AI-NEXT platform, and for ensuring database performance, data quality, integrity, and security. You will guide the team in solution implementation and collaborate with internal product architect, engineering, and security teams. Supporting the pre-sales team on data solutions and optimizing NiFi workflows for performance, scalability, and reliability will also be part of your responsibilities. Furthermore, you will collaborate with cross-functional teams to integrate NiFi with other systems, including databases, APIs, cloud services, and other backend applications.

Please note that EdgeVerve Systems does not engage with external manpower agencies or charge any fees from candidates for recruitment. If you encounter such scams, please report them immediately.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
We have an exciting opportunity for you to advance your career and embark on an adventure where you can push the boundaries of what is achievable. As a Lead Software Engineer at JPMorgan Chase in the Consumer & Community Banking Data Technology team, you play a crucial role in an agile team dedicated to enhancing, building, and delivering trusted, cutting-edge technology products in a secure, stable, and scalable manner.

As a key technical contributor, your responsibilities include implementing innovative software solutions and designing, developing, and troubleshooting technical issues, with a focus on devising unconventional solutions and dissecting complex technical problems. You will develop secure, high-quality production code, as well as review and debug code written by colleagues. You will also identify opportunities to streamline or automate the resolution of recurring issues to improve the overall operational stability of software applications and systems.

Additionally, you will lead evaluation sessions with external vendors, startups, and internal teams to drive outcome-driven assessments of architectural designs, technical viability, and integration possibilities within existing systems and information architecture. Your responsibilities extend to enabling the Gen AI platform, implementing Gen AI use cases, LLM fine-tuning, and multi-agent orchestration. Effective communication of technical concepts and solutions across all organizational levels is also a key aspect of the role. Furthermore, you will lead the design and development of our AI/ML platform, ensuring its robustness, scalability, and high performance.

The ideal candidate should possess formal training or certification in software engineering concepts and a minimum of 5 years of practical experience. Extensive hands-on experience with Python, AWS cloud services (including EKS, EMR, ECS, and DynamoDB), and Databricks ML lifecycle development is essential, along with advanced knowledge of software engineering, AI/ML, machine learning operations (MLOps), and data governance. Demonstrated experience leading complex projects, encompassing system design, testing, and operational stability, is a must. Expertise in computer science, computer engineering, mathematics, or a related technical field is preferred. Familiarity with large language model (LLM) approaches like Retrieval-Augmented Generation (RAG) and real-time model serving experience with Seldon and Ray are advantageous.

Join us on this journey as we continue to drive innovation and technological advancement within the industry.
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
coimbatore, tamil nadu
On-site
As a Master Data Management (MDM) specialist with 3-4 years of experience in SAP Public Cloud, focused in particular on Materials Management (MM) and Sales & Distribution (SD), your primary responsibility will be to ensure data accuracy, consistency, and governance within the SAP system. The role involves creating and managing master data on a regular basis as part of operational management, developing MDM strategies, defining data standards, and implementing data governance frameworks. You will collaborate with various departments to maintain data quality and provide expertise in SAP MDM, ensuring end users are properly trained and supported.

Key responsibilities:
- Master data entry and updates: Create and manage master data for SAP Public Cloud, including item creation, customer creation, vendor creation, tax code allocation, plant-related parameters, and GLs, among others.
- Data management strategy: Develop and implement MDM strategies and policies to ensure data integrity and consistency across the SAP Public Cloud environment.
- Data governance: Design and execute data governance frameworks to maintain high data quality, focusing on accuracy, completeness, and timeliness.
- Data standardization: Collaborate with business units to define master data requirements and standards, ensuring consistency across all master data domains.
- Data quality monitoring: Monitor and audit master data to identify and resolve discrepancies, ensuring the accuracy and reliability of business-critical data.
- Training and support: Train and support end users on MDM processes and best practices to promote a culture of data accuracy and consistency.
- Data consolidation: Ensure the creation of a single, trusted source of truth for master data by integrating data from various SAP and third-party sources.
- MM/SD expertise: Maintain a deep understanding of the Materials Management and Sales & Distribution modules within SAP, including their specific master data requirements and best practices.

Qualifications:
- A graduate of a reputed institute with extensive experience in SAP systems, particularly the MM and SD modules, and a strong understanding of SAP Public Cloud.
- Data management skills: Strong understanding of data management principles, including data governance, data quality, and data integration.
- Collaboration skills: Ability to collaborate effectively with business users and IT teams to define and implement MDM strategies and processes.
- Analytical skills: Strong analytical skills to identify data issues, evaluate data quality, and recommend solutions, including performing reconciliations.
- Communication skills: Excellent communication skills to train end users effectively and communicate MDM strategies and processes.
- Problem-solving skills: Strong problem-solving skills to identify and resolve data discrepancies, ensuring data accuracy.
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
About Us: Founded in 2006, Rite Software is a global IT consulting company headquartered in Houston, Texas. Rite Software delivers strategic IT solutions for clients facing complex challenges involving cloud applications, cloud infrastructure, analytics, and digital transformation.

Description: Mastech Digital provides digital and mainstream technology staff as well as digital transformation services for leading American corporations. We are currently seeking an Informatica MDM Developer for our client in the IT services domain. We value our professionals, providing comprehensive benefits, exciting challenges, and the opportunity for growth. This is a contract position, and the client is looking for someone to start immediately.

Responsibilities:
- Perform solution design and implement and support robust, complex MDM initiatives.
- Provide data architecture solutions.
- Interpret business requirements and convert them into technical requirements; define use cases and test scenarios.
- Collaborate with source-system data stewards, system owners, and technical personnel on data governance, and resolve any data quality or technical issues related to data ingestion.
Posted 2 weeks ago