10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Position Summary

Engineering Manager (Data & Analytics) – CL5

Role Overview: As an Engineering Manager, you will actively engage in your data engineering craft, taking a hands-on approach to multiple high-visibility projects. Your expertise will be pivotal in managing data engineering teams to deliver solutions that delight customers and users, while also driving tangible value for Deloitte's business investments. You will leverage your extensive engineering craftsmanship and advanced proficiency across multiple programming languages and modern frameworks, consistently demonstrating your exemplary track record in delivering high-quality, outcome-focused solutions. The ideal candidate will be a role model and engineering mentor, advocating engineering excellence and leading cross-functional teams to design, develop, test, deploy, and operate advanced software solutions.

Key Responsibilities: Embrace and drive a culture of accountability for customer and business outcomes. Lead data engineering teams to deliver solutions that solve complex problems with valuable outcomes, ensuring high-quality, lean designs and implementations. Serve as the technical advocate for data products and data engineering teams, promoting and supporting evolutionary releases (e.g., alpha, beta, MVP). Ensure proper planning, code integrity, quality, alignment with customer goals, architecture designs, and NFRs. Possess passion and experience as an individual contributor, responsible for fostering a culture of engineering excellence within the team, being hands-on with design, configuration, and/or code part of the time, and contributing to team velocity. Work daily with the data engineering teams to resolve any issues, blockers, or impediments, perform code reviews and optimizations, maintain coding standards compliance, and ensure that technical debt is addressed continuously within sprints to achieve comprehensive quality. Be self-driven to learn new technologies, experiment with engineers, and inspire the teams to learn and drive application of those new technologies. Mentor and coach the product engineering team to cultivate and nurture strong masters of craft with passion for product outcomes. Lead data engineering teams to develop lean solutions through rapid, inexpensive experimentation to solve customer needs. Engage with customers and product teams to deliver the right solution for the product in the right way at the right time. Exhibit a mindset that favors action and evidence over extensive planning. Utilize a leaning-forward approach to navigate complexity and uncertainty, delivering lean, supportable, and maintainable solutions. Work collaboratively with empowered, cross-functional teams including product management, experience, delivery, infrastructure, and security. Integrate diverse perspectives to make well-informed decisions that balance feasibility, viability, usability, and value. Foster a collaborative environment that enhances team synergy and innovation. Possess deep expertise in modern software engineering practices and principles, Agile methodologies, DevSecOps, Continuous Integration/Continuous Deployment, and deployment techniques like Blue-Green and Canary to minimize downtime and enable A/B testing approaches. Act as a role model, leveraging these techniques to optimize solutioning and product delivery, ensuring high-quality outcomes with minimal waste.
Demonstrate proficiency in product development, from conceptualization and design to implementation and scaling, with a focus on continuous improvement and learning. Quickly acquire domain-specific knowledge relevant to the business or product. Translate business and user needs into engineering plans (e.g., sprint plans, enablers, tasks, priorities). Navigate various enterprise functions such as business and enabling areas as well as product, experience, delivery, infrastructure, and security to drive product value and feasibility as well as alignment with organizational goals. Exhibit exceptional communication skills, capable of articulating complex technical concepts clearly and compellingly. Inspire and influence stakeholders at all levels through well-structured arguments and trade-offs supported by evidence, evaluations, and research. Create coherent narratives that align technical solutions with business objectives. Engage and collaborate with stakeholders at all organizational levels, from team members to senior executives. Build and maintain constructive relationships, fostering a culture of co-creation and shared momentum towards achieving product goals. Align diverse perspectives and drive consensus to create feasible solutions.

The Team: US Deloitte Technology Product Engineering has modernized software and product delivery, creating a scalable, cost-effective model that focuses on value and outcomes and leverages a progressive and responsive talent structure. As Deloitte’s primary internal development team, Product Engineering delivers innovative digital solutions to businesses, service lines, and internal operations with proven bottom-line results and outcomes. It helps power Deloitte’s success. It is the engine that drives Deloitte, serving many of the world’s largest, most respected companies. We develop and deploy cutting-edge internal and go-to-market solutions that help Deloitte operate effectively and lead in the market. Our reputation is built on a tradition of delivering with excellence.

Qualifications: A bachelor’s degree in computer science, software engineering, or a related discipline. An advanced degree (e.g., MS) is preferred but not required; experience is the most relevant factor. Excellent software engineering and product architecture/design foundation with a deep understanding of Business Context Diagrams (BCD), sequence/activity/state/ER/DFD diagrams, OOP/OOD, data structures, algorithms, code instrumentation, etc. 10+ years of proven experience with industry-leading business intelligence tools like Tableau, SAP BOE, Power BI, and Qlik (including add-on products for data and server management). 5+ years of experience with cloud-native data and analytics platforms on Azure, AWS, or GCP. 3+ years of experience with AI/ML and GenAI is preferred. Deep understanding of methodologies and tools like XP, Lean, SAFe, DevSecOps, SRE, ADO, GitHub, SonarQube, etc., to deliver high-quality products rapidly. Excellent interpersonal and organizational skills, with the ability to handle diverse situations, complex projects, and changing priorities, behaving with passion, empathy, and care. Strong SQL skills with experience in stored procedures, triggers, functions, and designing tables/views; proficiency with advanced MS Excel (including PivotTables), Word, and PowerPoint. Experience with ETL tools (e.g., Alteryx, Informatica, SAP SLT, SAP BODS) and working with both structured and unstructured data; familiarity with R/Python for data preparation and integration is a plus.
Ability to align technology solutions with business objectives, maintaining a constant awareness of project milestones, client needs, and long-term strategic goals.

How You Will Grow: At Deloitte, our professional development plans focus on helping people at every level of their career to identify and use their strengths to do their best work every day and excel in everything they do.

Location: Bengaluru. Shift timing: 11 AM to 8 PM. #CAL-BMT #CA-SK #CAP-PD #CA-SG1

Our purpose: Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Our people and culture: Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Professional development: At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.

Benefits To Help You Thrive: At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially, and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment, and/or other criteria. Learn more about what working at Deloitte can mean for you.

Recruiting tips: From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Requisition code: 307590
Posted 1 week ago
6.0 - 11.0 years
20 - 35 Lacs
Hyderabad
Work from Office
Role: Data Analyst. Exp: 6-11 Yrs. Location: Hyderabad. Primary Skills: ETL, Informatica, Python, SQL, BI tools, and Investment Domain. Please share your resumes to rajamahender.n@technogenindia.com.

Job Description:

The Minimum Qualifications. Education: Bachelor's or Master's degree in Data Science, Statistics, Mathematics, Computer Science, Actuarial Science, or a related field. Experience: 7-9 years of experience as a Data Analyst, with at least 5 years supporting Finance within the insurance industry. Hands-on experience with Vertica/Teradata for querying, performance optimization, and large-scale data analysis. Advanced SQL skills; proficiency in Python is a strong plus. Proven ability to write detailed source-to-target mapping documents and collaborate with technical teams on data integration. Experience working in hybrid onshore-offshore team environments. Deep understanding of data modelling concepts and experience working with relational and dimensional models. Strong communication skills with the ability to clearly explain technical concepts to non-technical audiences. A strong understanding of statistical concepts, probability and accounting standards, financial statements (balance sheet, income statement, cash flow statement), and financial ratios. Strong understanding of life insurance products and business processes across the policy lifecycle. Investment Principles: Knowledge of different asset classes, investment strategies, and financial markets. Quantitative Finance: Understanding of financial modelling, risk management, and derivatives. Regulatory Framework: Awareness of relevant financial regulations and compliance requirements.
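For context on the source-to-target mapping and validation work this role describes, here is a minimal, hedged reconciliation sketch. It uses sqlite3 purely so the example is self-contained and runnable; in practice the same queries would run through a Vertica or Teradata driver, and the table and column names (src_policy, tgt_policy, premium) are illustrative assumptions, not part of the posting.

```python
# Hypothetical source-to-target reconciliation sketch.
# sqlite3 stands in for Vertica/Teradata; table and column names are assumptions.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Toy "source" and "target" tables standing in for staging and warehouse layers.
cur.execute("CREATE TABLE src_policy (policy_id TEXT, premium REAL)")
cur.execute("CREATE TABLE tgt_policy (policy_id TEXT, premium REAL)")
cur.executemany("INSERT INTO src_policy VALUES (?, ?)",
                [("P1", 100.0), ("P2", 250.5), ("P3", 75.0)])
cur.executemany("INSERT INTO tgt_policy VALUES (?, ?)",
                [("P1", 100.0), ("P2", 250.5)])  # P3 deliberately missing

# 1. Row-count and control-total reconciliation between source and target.
checks = {
    "row_count": "SELECT (SELECT COUNT(*) FROM src_policy), (SELECT COUNT(*) FROM tgt_policy)",
    "premium_total": "SELECT (SELECT SUM(premium) FROM src_policy), (SELECT SUM(premium) FROM tgt_policy)",
}
for name, sql in checks.items():
    src_val, tgt_val = cur.execute(sql).fetchone()
    status = "OK" if src_val == tgt_val else "MISMATCH"
    print(f"{name}: source={src_val} target={tgt_val} -> {status}")

# 2. Keys present in source but missing from target (candidate load failures).
missing = cur.execute(
    "SELECT policy_id FROM src_policy EXCEPT SELECT policy_id FROM tgt_policy"
).fetchall()
print("missing in target:", [row[0] for row in missing])

conn.close()
```

In a real engagement the pass/fail results of these checks would feed the source-to-target mapping document rather than being printed.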
Posted 1 week ago
12.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Role Description

Hiring Locations: Chennai, Trivandrum, Kochi. Experience Range: 6–12 years.

As a Manager – IT Operations at UST HealthProof, you will lead and manage production support operations, ensuring high service quality and customer satisfaction. You will oversee a geographically distributed support team responsible for health plan technology solutions. This role includes managing SLAs, coordinating change and issue resolution, driving operational efficiency, and delivering continuous improvements aligned with business goals. You will report to the Director of Delivery.

Key Responsibilities: Ensure operational excellence for customer-facing technology delivery. Manage day-to-day production support including claims, enrollment, adjudication, and payment systems. Drive resolution of production incidents and root cause analysis. Generate SLA/operational reports for both internal stakeholders and customers. Manage incidents using ITSM tools like JIRA or ServiceNow. Coordinate with internal and external teams (network, middleware, OS, DB, vendors) for support and upgrades. Lead customer calls, prioritize daily support issues, and handle escalations. Identify value-added innovations and efficiency opportunities. Mentor and guide the support team; manage team development and performance evaluations. Participate in contract renewals, SOWs, and onboarding activities. Ensure knowledge management and upskilling through platforms like TICL, GAMA, etc. Strategically contribute to account growth via resource planning and new engagements.

Mandatory Skills: Minimum 6+ years managing production support in a mid to large-scale IT environment. Strong hands-on experience with ServiceNow/JIRA or other ITSM tools. Experience in SLA governance and operational reporting. Proven capability in SQL, Excel, and PowerPoint. Working knowledge of cloud platforms (AWS/GCP). Excellent understanding of ITIL standards and practices. Experience managing support for enterprise applications or healthcare systems.

Good To Have Skills: Informatica / Informatica Cloud experience (highly desirable). Knowledge of SOAP, EDI, and ETL processing. Familiarity with SaaS platforms and HealthEdge applications. PMP/Prince2/CSM certification or equivalent. Exposure to working with SOWs, SLAs, contract management, and change requests. Experience working in an onshore-offshore delivery model.

Soft Skills: Strong communication and presentation abilities. Customer-focused mindset and ability to foster strong relationships. High ownership, problem-solving attitude, and stakeholder management. Ability to manage critical escalations under pressure. Team mentoring, conflict resolution, and people development. Agility in multitasking across priorities and timelines.

Outputs & Success Metrics: Timely and quality SLA/operational reporting. Effective incident reduction and implementation of permanent fixes. Improved customer satisfaction (C-SAT/NPS). Seamless knowledge transfers and upskilling initiatives. Measurable team engagement, development, and performance. Achievement of project/account financial targets (EBITDA). Value additions and innovations introduced in the engagement.

Certifications (Preferred): PMP / Prince2 / CSM; ITIL v3 or v4 Foundation / Intermediate.

About UST HealthProof: UST HealthProof is reshaping the future of health insurance operations by building best-in-class cloud-based administrative ecosystems. Our solutions aim to reduce administrative costs and improve the healthcare experience.
With strong leadership and a startup culture, we nurture individual growth while driving meaningful industry transformation.

Skills: Healthcare, Production Support, Production Management
Posted 1 week ago
2.0 - 8.0 years
20 - 25 Lacs
Bengaluru
Work from Office
Join our dynamic team and contribute to the advancement of our data management capabilities, driving innovation and excellence in the financial industry. As a Lead Software Engineer at JPMorgan Chase within the Commercial and Investment Bank's Prime Finance Technology Team, you are an integral part of an agile team that works to enhance, build, and deliver trusted market-leading technology products in a secure, stable, and scalable way.

Job Responsibilities: Demonstrate proficiency in application, data, and infrastructure architecture disciplines, especially in managing high-volume data. Utilize strong Oracle database knowledge, including Oracle, PL/SQL, performance tuning, and ETL-Informatica. Handle complex SQL joins, correlated sub-queries, aggregate functions, analytic functions, materialized views, indexing, partitioning, and performance tuning using explain plans. Lead the analysis, design, development, testing, and implementation of ETL processes, preferably using Informatica. Provide maintenance and support for enterprise-level data integration solutions. Implement solutions using PostgreSQL and cloud-based technologies. Apply knowledge of Data Lake and Data Warehouse concepts and implementation within an agile software delivery lifecycle. Understand and contribute to architecture and design across systems. Work proficiently with development toolsets to enhance system capabilities. Manage the issue resolution process for application production problems, ensuring minimal disruption. Provide Level 3 support for production and business-as-usual activities, ensuring system reliability and performance.

Required qualifications, capabilities, and skills: Formal training or certification on software engineering concepts and 5+ years of applied experience. Over 10 years of proven experience in database solution implementation and ETL processes. Strong analytical skills and the ability to work with complex data sets. Excellent problem-solving abilities and attention to detail. Ability to work independently and lead projects with minimal supervision. Strong communication skills to effectively collaborate with cross-functional teams. Collaborate effectively within large teams to achieve organizational goals, demonstrating strong team player qualities and project ownership.

Preferred qualifications, capabilities, and skills: Knowledge of big data technologies, such as Snowflake, and integration experience is a plus. Experience with Python and Java is highly advantageous. Stay informed about industry-wide technology trends and best practices.
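The posting leans heavily on analytic (window) functions and explain-plan tuning, so a minimal, self-contained sketch of both ideas follows. It uses Python's built-in sqlite3 (which needs a SQLite build with window-function support, 3.25+) so it runs anywhere; Oracle's equivalents would be EXPLAIN PLAN FOR with DBMS_XPLAN, and the table and column names here are assumptions for illustration only.

```python
# Sketch of an analytic (window) function plus a query-plan check using sqlite3.
# On Oracle the same workflow would use EXPLAIN PLAN FOR ... and DBMS_XPLAN.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE trade (account_id TEXT, trade_dt TEXT, notional REAL)")
cur.executemany("INSERT INTO trade VALUES (?, ?, ?)", [
    ("A1", "2024-01-01", 1000.0),
    ("A1", "2024-01-02", 2500.0),
    ("A2", "2024-01-01", 400.0),
])

# Analytic function: running notional per account without collapsing rows.
rows = cur.execute("""
    SELECT account_id,
           trade_dt,
           notional,
           SUM(notional) OVER (PARTITION BY account_id ORDER BY trade_dt) AS running_notional
    FROM trade
    ORDER BY account_id, trade_dt
""").fetchall()
for row in rows:
    print(row)

# Query-plan inspection, the sqlite analogue of reading an Oracle explain plan.
for step in cur.execute("EXPLAIN QUERY PLAN SELECT * FROM trade WHERE account_id = 'A1'"):
    print(step)

conn.close()
```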
Posted 1 week ago
7.0 - 12.0 years
9 - 14 Lacs
Bengaluru
Work from Office
Test Analyst (ETL, Data Testing, and SQL Expertise). Bangalore, Chennai, Gurgaon - 3 days work from office.

Job Summary: We are seeking a highly skilled Test Automation Engineer with strong expertise in ETL testing, data validation, and SQL to join our dynamic team. The ideal candidate will be responsible for ensuring the quality and integrity of data through automated and manual testing processes and will work closely with data engineers, developers, and business stakeholders to deliver reliable data solutions.

Key Responsibilities: Design, develop, and maintain test automation frameworks for ETL workflows and data pipelines. Build automated tests to validate data extraction, transformation, and loading processes across different systems and to ensure data quality, accuracy, and integrity across source and target systems. Write and optimize SQL queries to validate large datasets and perform backend testing. Collaborate with data engineers and developers to resolve defects and improve data processes. Create and execute test plans, test cases, and test scripts for data validation and automation. Work with large volumes of structured and unstructured data to perform analysis and ensure compliance with business requirements. Perform root cause analysis on issues found in production and recommend solutions. Document test results, track defects, and provide regular reports to stakeholders.

Required Skills & Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field. Proven experience in test automation for data workflows and ETL pipelines. Strong ETL testing skills with hands-on experience in tools like Informatica, Talend, or similar. Proficiency in SQL for complex data validation and backend testing. Solid understanding of data warehousing concepts, data integration, and BI processes. Knowledge of test management tools (e.g., JIRA, TestRail) and automation tools (e.g., Selenium, Python, or equivalent). Experience with large-scale data testing in cloud platforms (AWS, Azure, or GCP) is a plus. Excellent analytical and problem-solving skills. Strong communication skills and ability to work collaboratively in a team environment.

Preferred Qualifications: Experience with scripting languages (Python, Shell, etc.) for automation. Knowledge of API testing and integration testing. Exposure to CI/CD pipelines and version control tools (e.g., Git).
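As a rough illustration of the automated data-validation work this role describes, here is a hedged pytest-style sketch. The DataFrames, loader functions, and column names are stand-ins for real warehouse extracts (which would normally come from SQL queries against source and target systems), not any particular team's framework.

```python
# Hypothetical pytest-style ETL validation sketch; loaders are placeholders for
# real extracts, e.g. pd.read_sql(...) against source and target systems.
import pandas as pd
import pytest

def load_source() -> pd.DataFrame:
    # Placeholder for the pre-ETL source extract.
    return pd.DataFrame({"order_id": [1, 2, 3], "amount": [10.0, 20.0, 30.0]})

def load_target() -> pd.DataFrame:
    # Placeholder for the post-ETL table in the target warehouse.
    return pd.DataFrame({"order_id": [1, 2, 3], "amount": [10.0, 20.0, 30.0]})

def test_row_counts_match():
    assert len(load_source()) == len(load_target())

def test_no_duplicate_keys_in_target():
    assert not load_target()["order_id"].duplicated().any()

def test_amount_totals_reconcile():
    # Control-total check; the tolerance guards against float rounding in the pipeline.
    assert load_source()["amount"].sum() == pytest.approx(load_target()["amount"].sum())
```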
Posted 1 week ago
6.0 - 8.0 years
8 - 10 Lacs
Hyderabad
Work from Office
Position: Sr. BI Developer. Work Location: Hyderabad. Mode of work: Hybrid. Experience: 6 to 8 years.

Summary: We are seeking a skilled Senior Spotfire Developer with 6-8 years of experience to join our analytics team. The ideal candidate will bring expertise in TIBCO Spotfire and a strong foundation in business intelligence and data visualization. This role involves developing, optimizing, and supporting interactive dashboards and reports that provide key insights to support data-driven decision-making. You will work closely with cross-functional teams, including data analysts, engineers, and business stakeholders, to deliver impactful solutions that meet business objectives.

Key Responsibilities: Spotfire Development and Customization: Design, develop, and deploy Spotfire applications, dashboards, and reports to support various business units in data-driven initiatives. Requirement Analysis: Collaborate with stakeholders to gather and understand requirements, translating them into technical solutions within Spotfire. Data Integration and Transformation: Use data blending and transformation techniques to prepare data for analysis in Spotfire, ensuring quality and integrity. Optimize Performance: Implement best practices for data loading, caching, and optimization to ensure responsive and efficient dashboards. Customization and Scripting: Enhance Spotfire functionality through scripting (IronPython, JavaScript) and integrate R or Python when needed for advanced analytics. Documentation and Support: Maintain documentation for dashboards and reports and provide support to users, addressing any technical or functional issues.

Qualifications: Education: Bachelor's degree in Computer Science, Data Analytics, Information Systems, or a related field. Experience: 4-6 years in BI development, with 4+ years specifically in TIBCO Spotfire. Technical Skills: Proficiency in TIBCO Spotfire, including visualization techniques and dashboard configuration. Strong SQL skills and experience with data modeling and data blending. Scripting experience in IronPython and/or JavaScript; knowledge of R or Python for advanced Spotfire functionalities. Familiarity with data integration tools such as Informatica, Alteryx, or equivalent. Analytical Skills: Ability to interpret complex data sets and create visually appealing, user-friendly dashboards. Soft Skills: Strong communication and interpersonal skills with the ability to work collaboratively in a team setting.

Preferred Skills: Experience with other BI tools (e.g., Power BI, Tableau) is a plus. Understanding of machine learning and predictive analytics in a BI context. Exposure to cloud platforms like AWS, Azure, or Google Cloud.
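As a small illustration of the IronPython scripting the role mentions, here is a minimal sketch of the kind of script attached to a Spotfire text-area button. It only runs inside the Spotfire script editor, where Document is provided by the environment and "vis" is assumed to be a script parameter of type Visualization; the property names are assumptions, not part of the posting.

```python
# Minimal IronPython sketch for a Spotfire script (a sketch, not a drop-in solution):
# reads a document property and retitles a visual passed in as the script parameter "vis".
from Spotfire.Dxp.Application.Visuals import VisualContent

region = Document.Properties["SelectedRegion"]          # document property set by a dropdown (assumed name)
Document.Properties["ReportSubtitle"] = "Region: " + str(region)

visual = vis.As[VisualContent]()                        # "vis" is a script parameter of type Visualization
visual.Title = "Sales Overview - " + str(region)
```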
Posted 1 week ago
9.0 - 15.0 years
6 - 10 Lacs
Hyderabad, Pune
Work from Office
Good understanding of RDBMS and management of structured data across cloud and on-premise repositories: Oracle, SQL Server, Sybase, MongoDB, and PostgreSQL databases. Good understanding of managing integration of databases, big data, and data warehouses with ETL tools like Informatica. Familiarity with business intelligence tools like Tableau, Qlik, Power BI, and Starburst. Familiarity with big data, Teradata, Snowflake, and Databricks. Familiarity with tools like SecuPi, Immuta, Okera, Voltage SecureData, Thales encryption, and Fortanix. Familiarity with Java, Spring Boot, Kafka, and Python. Proficiency in security technologies: data masking, tokenization, or encryption techniques. Strong analytical, problem-solving, and customer interaction skills. Excellent communication and teamwork abilities. Implemented multiple long-term projects. Client interfacing skills; able to understand and translate requirements back to the team.

Position Responsibilities: Implement and manage data masking, tokenization, and encryption solutions to protect sensitive information. Facilitate integration of applications and client tools like Rapid SQL, iSQL, and SQL Server Management Studio with security tools like SecuPi. Creation and management of APIs. Evaluate alternate tools such as Immuta, Okera, Voltage SecureData, and Fortanix when necessary. Collaborate with cross-functional teams to ensure data security measures are aligned with business requirements. Stay updated with the latest trends and best practices in data security. Ability to communicate effectively with other teams.

Skills: Data Masking and Encryption, Immuta, RDBMS, Technical, Tokenisation
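To make the masking and tokenization concepts in this posting concrete, here is a toy Python sketch of the two ideas: deterministic tokenization (so joins still work on the surrogate) and partial masking for display. It is explicitly not the SecuPi, Immuta, or Voltage SecureData API, and the key handling is an assumption; a real deployment would source the key from a KMS/HSM and likely use format-preserving encryption instead.

```python
# Illustrative-only sketch of tokenization and display masking (not a vendor API).
import hmac
import hashlib

SECRET_KEY = b"rotate-me-and-store-in-a-vault"   # assumption: in practice the key comes from a KMS/HSM

def tokenize(value: str) -> str:
    """Deterministic, non-reversible token: same input -> same token, so joins still line up."""
    return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()[:16]

def mask_account(value: str, visible: int = 4) -> str:
    """Display mask: keep only the trailing characters for UI and reports."""
    return "*" * max(len(value) - visible, 0) + value[-visible:]

account = "4111111111111111"
print(tokenize(account))       # stable surrogate usable in analytics joins
print(mask_account(account))   # '************1111' for display
```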
Posted 1 week ago
1.0 - 6.0 years
10 - 11 Lacs
Chennai
Work from Office
As a Data Engineer, you will leverage your technical expertise in data, analytics, cloud technologies, and analytic software tools to identify best designs, improve business processes, and generate measurable business outcomes. You will work with Data Engineering teams from within D&A, across the Pro Tech portfolio and additional Ford organizations such as GDI&A (Global Data Insight & Analytics), Enterprise Connectivity, Ford Customer Service Division, Ford Credit, etc. Develop EL/ELT/ETL pipelines to make data available in the BigQuery analytical data store from disparate batch and streaming data sources for the Business Intelligence and Analytics teams. Work with on-prem data sources (Hadoop, SQL Server), understand the data model and business rules behind the data, and build data pipelines (with GCP, Informatica) for one or more Ford Pro verticals; this data will be landed in GCP BigQuery. Build cloud-native services and APIs to support and expose data-driven solutions. Partner closely with our data scientists to ensure the right data is made available in a timely manner to deliver compelling and insightful solutions. Design, build, and launch shared data services to be leveraged by the internal and external partner developer community. Build out scalable data pipelines and choose the right tools for the right job. Manage, optimize, and monitor data pipelines. Provide extensive technical, strategic advice and guidance to key stakeholders around data transformation efforts. Understand how data is useful to the enterprise.

Experience with GCP cloud services including BigQuery, Cloud Composer, Dataflow, CloudSQL, GCS, Cloud Functions, and Pub/Sub. Inquisitive, proactive, and interested in learning new tools and techniques. Familiarity with big data and machine learning tools and platforms. Comfortable with open-source technologies including Apache Spark, Hadoop, and Kafka. 1+ year of experience with Hive, Spark, Scala, and JavaScript. Strong oral, written, and interpersonal communication skills. Comfortable working in a dynamic environment where problems are not always well-defined. M.S. in a science-based program and/or quantitative discipline with a technical emphasis. Bachelor's Degree. 3+ years of experience with SQL and Python. 2+ years of experience with GCP or AWS cloud services; strong candidates with 5+ years in a traditional data warehouse environment (ETL pipelines with Informatica) will be considered. 3+ years of experience building out data pipelines from scratch in a highly distributed and fault-tolerant manner. Comfortable with a broad array of relational and non-relational databases. Proven track record of building applications in a data-focused role (Cloud and Traditional Data Warehouse).
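As a hedged illustration of one step in the kind of BigQuery landing pipeline described above, the sketch below loads a GCS extract into a BigQuery table with the google-cloud-bigquery client. The project, bucket, dataset, and table names are placeholders, and credentials are assumed to come from the runtime environment; it is a sketch of the pattern, not Ford's actual implementation.

```python
# Minimal GCS -> BigQuery batch load sketch (placeholder names throughout).
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")   # hypothetical project

table_id = "my-analytics-project.pro_vertical.vehicle_telemetry"
gcs_uri = "gs://my-landing-bucket/telemetry/2024-06-01/*.csv"

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,                      # in production an explicit schema is safer
    write_disposition="WRITE_TRUNCATE",   # full refresh of the landing table
)

load_job = client.load_table_from_uri(gcs_uri, table_id, job_config=job_config)
load_job.result()   # block until the load completes, raising on failure

table = client.get_table(table_id)
print(f"Loaded {table.num_rows} rows into {table_id}")
```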
Posted 1 week ago
1.0 - 6.0 years
25 - 30 Lacs
Chennai
Work from Office
As a Data Engineer, you will leverage your technical expertise in data, analytics, cloud technologies, and analytic software tools to identify best designs, improve business processes, and generate measurable business outcomes. You will work with Data Engineering teams from within D&A, across the Pro Tech portfolio and additional Ford organizations such as GDI&A (Global Data Insight & Analytics), Enterprise Connectivity, Ford Customer Service Division, Ford Credit, etc. Develop EL/ELT/ETL pipelines to make data available in BigQuery analytical data store from disparate batch, streaming data sources for the Business Intelligence and Analytics teams. Work with on-prem data sources (Hadoop, SQL Server), understand the data model, business rules behind the data and build data pipelines (with GCP, Informatica) for one or more Ford Pro verticals. This data will be landed in GCP BigQuery. Build cloud-native services and APIs to support and expose data-driven solutions. Partner closely with our data scientists to ensure the right data is made available in a timely manner to deliver compelling and insightful solutions. Design, build and launch shared data services to be leveraged by the internal and external partner developer community. Building out scalable data pipelines and choosing the right tools for the right job. Manage, optimize and Monitor data pipelines. Provide extensive technical, strategic advice and guidance to key stakeholders around data transformation efforts. Understand how data is useful to the enterprise. Experience with GCP cloud services including BigQuery, Cloud Composer, Dataflow, CloudSQL, GCS, Cloud Functions and Pub/Sub. Inquisitive, proactive, and interested in learning new tools and techniques. Familiarity with big data and machine learning tools and platforms. Comfortable with open source technologies including Apache Spark, Hadoop, Kafka. 1+ year experience with Hive, Spark, Scala, JavaScript. Strong oral, written and interpersonal communication skills Comfortable working in a dynamic environment where problems are not always well-defined. Bachelors Degree 3+ years of experience with SQL and Python 2+ years of experience with GCP or AWS cloud services; Strong candidates with 5+ years in a traditional data warehouse environment (ETL pipelines with Informatica) will be considered 3+ years of experience building out data pipelines from scratch in a highly distributed and fault-tolerant manner. Comfortable with a broad array of relational and non-relational databases. Proven track record of building applications in a data-focused role (Cloud and Traditional Data Warehouse)
Posted 1 week ago
5.0 - 8.0 years
9 - 14 Lacs
Hyderabad
Work from Office
Job Title: Collibra Data Governance Specialist. Location: Hyderabad. Job Type: Full-Time.

About the Role: We are seeking a highly skilled and experienced Collibra Expert to lead and manage our enterprise-level data governance initiatives using Collibra. This role requires deep expertise in Collibra's platform, including configuration, integration, workflow development, and stakeholder engagement. The ideal candidate will be responsible for implementing and maintaining data governance frameworks, ensuring data quality, and enabling data stewardship across the organization.

Key Responsibilities: Lead the end-to-end implementation and administration of the Collibra Data Intelligence Platform. Design and configure the Collibra Operating Model, including domains, assets, workflows, and roles. Develop and maintain custom workflows using BPMN and Collibra Workflow Designer. Integrate Collibra with enterprise systems (e.g., Snowflake, Informatica, Tableau, Azure, SAP) using APIs and connectors. Collaborate with data stewards, data owners, and business users to define and enforce data governance policies. Implement and monitor data quality rules, lineage, and metadata management. Provide training and support to business and technical users on Collibra usage and best practices. Act as a Collibra SME and evangelist within the organization, promoting data governance maturity. Maintain documentation and ensure compliance with internal and external data governance standards.

Required Skills & Qualifications: 5+ years of experience in data governance, metadata management, or data quality. 3+ years of hands-on experience with Collibra, including configuration, workflow development, and integration. Strong understanding of data governance frameworks, data stewardship, and data lifecycle management. Proficiency in Collibra APIs, BPMN, and scripting languages (e.g., Groovy, JavaScript). Experience with data cataloging, lineage, and business glossary in Collibra. Familiarity with data platforms like Snowflake, Azure, AWS, Informatica, or similar. Excellent communication and stakeholder management skills. Collibra Ranger or Solution Architect certification is a plus.

Preferred Qualifications: Experience in enterprise-level deployments of Collibra. Knowledge of regulatory compliance (e.g., GDPR, HIPAA, CCPA). Background in data architecture or data engineering is a plus.
Posted 1 week ago
3.0 - 8.0 years
30 - 45 Lacs
Gurugram
Work from Office
Requirement: Data Architect & Business Intelligence. Experience: 5-12 Years. Work Type: Full-Time.

Job Summary: We are looking for a Data Architect & Business Intelligence Expert who will be responsible for designing and implementing enterprise-level data architecture solutions. The ideal candidate will have extensive experience in data warehousing, data modeling, and BI frameworks, with a strong focus on Salesforce, Informatica, DBT, IICS, and Snowflake.

Key Responsibilities: Design and implement scalable and efficient data architecture solutions for enterprise applications. Develop and maintain robust data models that support business intelligence and analytics. Build data warehouses to support structured and unstructured data storage needs. Optimize data pipelines, ETL processes, and real-time data processing. Work with business stakeholders to define data strategies that support analytics and reporting. Ensure seamless integration of Salesforce, Informatica, DBT, IICS, and Snowflake into the data ecosystem. Establish and enforce data governance, security policies, and best practices. Conduct performance tuning and optimization for large-scale databases and data processing systems. Provide technical leadership and mentorship to development teams.

Key Skills & Requirements: Strong experience in data architecture, data warehousing, and data modeling. Hands-on expertise with Salesforce, Informatica, DBT, IICS, and Snowflake. Deep understanding of ETL pipelines, real-time data streaming, and cloud-based data solutions. Experience in designing scalable, high-performance, and secure data environments. Ability to work with big data frameworks and BI tools for reporting and visualization. Strong analytical, problem-solving, and communication skills.
Posted 1 week ago
3.0 - 8.0 years
30 - 45 Lacs
Noida
Work from Office
Requirement: Data Architect & Business Intelligence. Experience: 5-12 Years. Work Type: Full-Time.

Job Summary: We are looking for a Data Architect & Business Intelligence Expert who will be responsible for designing and implementing enterprise-level data architecture solutions. The ideal candidate will have extensive experience in data warehousing, data modeling, and BI frameworks, with a strong focus on Salesforce, Informatica, DBT, IICS, and Snowflake.

Key Responsibilities: Design and implement scalable and efficient data architecture solutions for enterprise applications. Develop and maintain robust data models that support business intelligence and analytics. Build data warehouses to support structured and unstructured data storage needs. Optimize data pipelines, ETL processes, and real-time data processing. Work with business stakeholders to define data strategies that support analytics and reporting. Ensure seamless integration of Salesforce, Informatica, DBT, IICS, and Snowflake into the data ecosystem. Establish and enforce data governance, security policies, and best practices. Conduct performance tuning and optimization for large-scale databases and data processing systems. Provide technical leadership and mentorship to development teams.

Key Skills & Requirements: Strong experience in data architecture, data warehousing, and data modeling. Hands-on expertise with Salesforce, Informatica, DBT, IICS, and Snowflake. Deep understanding of ETL pipelines, real-time data streaming, and cloud-based data solutions. Experience in designing scalable, high-performance, and secure data environments. Ability to work with big data frameworks and BI tools for reporting and visualization. Strong analytical, problem-solving, and communication skills.
Posted 1 week ago
3.0 - 8.0 years
30 - 45 Lacs
Pune
Work from Office
Requirement: Data Architect & Business Intelligence. Experience: 5-12 Years. Work Type: Full-Time.

Job Summary: We are looking for a Data Architect & Business Intelligence Expert who will be responsible for designing and implementing enterprise-level data architecture solutions. The ideal candidate will have extensive experience in data warehousing, data modeling, and BI frameworks, with a strong focus on Salesforce, Informatica, DBT, IICS, and Snowflake.

Key Responsibilities: Design and implement scalable and efficient data architecture solutions for enterprise applications. Develop and maintain robust data models that support business intelligence and analytics. Build data warehouses to support structured and unstructured data storage needs. Optimize data pipelines, ETL processes, and real-time data processing. Work with business stakeholders to define data strategies that support analytics and reporting. Ensure seamless integration of Salesforce, Informatica, DBT, IICS, and Snowflake into the data ecosystem. Establish and enforce data governance, security policies, and best practices. Conduct performance tuning and optimization for large-scale databases and data processing systems. Provide technical leadership and mentorship to development teams.

Key Skills & Requirements: Strong experience in data architecture, data warehousing, and data modeling. Hands-on expertise with Salesforce, Informatica, DBT, IICS, and Snowflake. Deep understanding of ETL pipelines, real-time data streaming, and cloud-based data solutions. Experience in designing scalable, high-performance, and secure data environments. Ability to work with big data frameworks and BI tools for reporting and visualization. Strong analytical, problem-solving, and communication skills.
Posted 1 week ago
3.0 - 8.0 years
30 - 45 Lacs
Bengaluru
Work from Office
Requirement: Data Architect & Business Intelligence. Experience: 5-12 Years. Work Type: Full-Time.

Job Summary: We are looking for a Data Architect & Business Intelligence Expert who will be responsible for designing and implementing enterprise-level data architecture solutions. The ideal candidate will have extensive experience in data warehousing, data modeling, and BI frameworks, with a strong focus on Salesforce, Informatica, DBT, IICS, and Snowflake.

Key Responsibilities: Design and implement scalable and efficient data architecture solutions for enterprise applications. Develop and maintain robust data models that support business intelligence and analytics. Build data warehouses to support structured and unstructured data storage needs. Optimize data pipelines, ETL processes, and real-time data processing. Work with business stakeholders to define data strategies that support analytics and reporting. Ensure seamless integration of Salesforce, Informatica, DBT, IICS, and Snowflake into the data ecosystem. Establish and enforce data governance, security policies, and best practices. Conduct performance tuning and optimization for large-scale databases and data processing systems. Provide technical leadership and mentorship to development teams.

Key Skills & Requirements: Strong experience in data architecture, data warehousing, and data modeling. Hands-on expertise with Salesforce, Informatica, DBT, IICS, and Snowflake. Deep understanding of ETL pipelines, real-time data streaming, and cloud-based data solutions. Experience in designing scalable, high-performance, and secure data environments. Ability to work with big data frameworks and BI tools for reporting and visualization. Strong analytical, problem-solving, and communication skills.
Posted 1 week ago
3.0 - 7.0 years
42 - 72 Lacs
Dindigul
Work from Office
Responsibilities: * Design, develop, test & maintain Informatica CDGC solutions using ETL processes with SQL queries. * Collaborate with cross-functional teams on project requirements & deliverables.

Benefits: Provident fund, health insurance, office cab/shuttle, annual bonus, food allowance.
Posted 1 week ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title – Master Data Analyst Preferred Location - Hyderabad, India Full time/Part Time - Full Time Build a career with confidence Carrier Global Corporation, global leader in intelligent climate and energy solutions is committed to creating solutions that matter for people and our planet for generations to come. From the beginning, we've led in inventing new technologies and entirely new industries. Today, we continue to lead because we have a world-class, diverse workforce that puts the customer at the center of everything we do Job Summary We are seeking a detail-oriented and experienced Master Data Analyst to ensure the accuracy, consistency, and integrity of our critical master data across various enterprise systems. The Master Data Analyst will play a crucial role in data governance, data quality initiatives, and supporting business processes through reliable and well-managed master data. Key Responsibilities Develop, implement, and maintain master data management (MDM) policies, standards, and procedures. Ensure data quality, completeness, and consistency of master data (e.g., customer, product, vendor, material) across all relevant systems. Perform data profiling, cleansing, and validation to identify and resolve data quality issues. Collaborate with business units and IT teams to define data definitions, business rules, and data hierarchies. Act as a data steward, overseeing the creation, modification, and deletion of master data records. Support data integration efforts, ensuring master data is accurately and efficiently synchronized between systems. Document master data processes, data flows, and data lineage. Participate in projects related to data migration, system implementations, and data governance initiatives. Provide training and support to end-users on master data best practices and tools. Required Qualifications Bachelor's degree in Information Systems, Data Science, or a related quantitative field. 3+ years of experience in a Master Data Management (MDM), Data Quality, or Data Analyst role, specifically focused on master data. Strong understanding of master data concepts, data governance principles, and data lifecycle management. Proficiency with data analysis tools and techniques. Experience with enterprise resource planning (ERP) systems (e.g., SAP, Oracle, Microsoft Dynamics) and their master data structures. Experienced in cloud platforms (AWS, Azure) or relevant data technologies. Excellent analytical, problem-solving, and communication skills, with the ability to translate technical concepts to non-technical stakeholders. Proven ability to work independently and collaboratively in a fast-paced environment. Preferred Qualifications Experience with MDM software solutions (e.g., Informatica MDM, SAP MDG). Familiarity with SQL and experience querying relational databases. Knowledge of SAP modules (ECC, CRM, BW) and with data governance, metadata management, and data cataloging tools (e.g., Alation, Collibra). Familiarity handling MDM in SAP ECC and SAP S/4 versions Knowledge of data warehousing concepts and business intelligence tools (e.g., Power BI, Tableau). Experience with data governance frameworks and tools. Certifications in data management or related fields. Benefits We are committed to offering competitive benefits programs for all of our employees and enhancing our programs when necessary. 
Have peace of mind and body with our health insurance. Make yourself a priority with flexible schedules and leave policy. Drive forward your career through professional development opportunities. Achieve your personal goals with our Employee Assistance Program.

Our commitment to you: Our greatest assets are the expertise, creativity and passion of our employees. We strive to provide a great place to work that attracts, develops and retains the best talent, promotes employee engagement, fosters teamwork and ultimately drives innovation for the benefit of our customers. We strive to create an environment where you feel that you belong, with diversity and inclusion as the engine to growth and innovation. We develop and deploy best-in-class programs and practices, providing enriching career opportunities, listening to employee feedback and always challenging ourselves to do better. This is The Carrier Way. Join us and make a difference. Apply Now!

Carrier is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or veteran status, age or any other federally protected class.

Job Applicant's Privacy Notice: Click on this link to read the Job Applicant's Privacy Notice.
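As a small illustration of the data profiling and cleansing responsibilities listed in the Master Data Analyst posting above, here is a toy pandas sketch. The vendor-master columns are assumptions, and the in-memory DataFrame stands in for an extract from ERP master data tables.

```python
# Toy data-profiling pass over an illustrative vendor master extract.
import pandas as pd

vendors = pd.DataFrame({
    "vendor_id": ["V001", "V002", "V002", "V003"],
    "name": ["Acme Corp", "Globex", "Globex", None],
    "country": ["US", "DE", "DE", "us"],
})

# Basic profile: nulls and distinct counts per column.
profile = pd.DataFrame({
    "null_count": vendors.isna().sum(),
    "distinct_values": vendors.nunique(),
})
print(profile)

# Typical master-data findings: duplicate keys and non-standardized reference values.
print("duplicate vendor_ids:", vendors["vendor_id"].duplicated().sum())
print("non-standardized country codes:",
      (vendors["country"] != vendors["country"].str.upper()).sum())
```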
Posted 1 week ago
1.0 - 4.0 years
10 - 11 Lacs
Chennai
Work from Office
As a Data Engineer, you will leverage your technical expertise in data, analytics, cloud technologies, and analytic software tools to identify best designs, improve business processes, and generate measurable business outcomes. You will work with Data Engineering teams from within D&A, across the Pro Tech portfolio and additional Ford organizations such as GDI&A (Global Data Insight & Analytics), Enterprise Connectivity, Ford Customer Service Division, Ford Credit, etc. Develop EL/ELT/ETL pipelines to make data available in BigQuery analytical data store from disparate batch, streaming data sources for the Business Intelligence and Analytics teams. Work with on-prem data sources (Hadoop, SQL Server), understand the data model, business rules behind the data and build data pipelines (with GCP, Informatica) for one or more Ford Pro verticals. This data will be landed in GCP BigQuery. Build cloud-native services and APIs to support and expose data-driven solutions. Partner closely with our data scientists to ensure the right data is made available in a timely manner to deliver compelling and insightful solutions. Design, build and launch shared data services to be leveraged by the internal and external partner developer community. Building out scalable data pipelines and choosing the right tools for the right job. Manage, optimize and Monitor data pipelines. Provide extensive technical, strategic advice and guidance to key stakeholders around data transformation efforts. Understand how data is useful to the enterprise. Experience with GCP cloud services including BigQuery, Cloud Composer, Dataflow, CloudSQL, GCS, Cloud Functions and Pub/Sub. Inquisitive, proactive, and interested in learning new tools and techniques. Familiarity with big data and machine learning tools and platforms. Comfortable with open source technologies including Apache Spark, Hadoop, Kafka. 1+ year experience with Hive, Spark, Scala, JavaScript. Strong oral, written and interpersonal communication skills Comfortable working in a dynamic environment where problems are not always well-defined. M. S. in a science-based program and/or quantitative discipline with a technical emphasis. Bachelors Degree 3+ years of experience with SQL and Python 2+ years of experience with GCP or AWS cloud services; Strong candidates with 5+ years in a traditional data warehouse environment (ETL pipelines with Informatica) will be considered 3+ years of experience building out data pipelines from scratch in a highly distributed and fault-tolerant manner. Comfortable with a broad array of relational and non-relational databases. Proven track record of building applications in a data-focused role (Cloud and Traditional Data Warehouse)
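To illustrate the orchestration side of pipelines like the one above (the posting lists Cloud Composer among the GCP services), here is a minimal Apache Airflow DAG sketch. The DAG id, schedule, and task bodies are placeholders rather than an actual implementation; in practice the callables would hand off to the extract and BigQuery load logic.

```python
# Minimal Cloud Composer (Apache Airflow) DAG sketch with placeholder tasks.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_from_source(**context):
    # Placeholder: pull the daily batch from an on-prem source (Hadoop / SQL Server).
    print("extracting batch for", context["ds"])

def load_to_bigquery(**context):
    # Placeholder: load the staged files into the BigQuery analytical data store.
    print("loading into BigQuery for", context["ds"])

with DAG(
    dag_id="pro_vertical_daily_load",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_from_source", python_callable=extract_from_source)
    load = PythonOperator(task_id="load_to_bigquery", python_callable=load_to_bigquery)
    extract >> load
```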
Posted 1 week ago
7.0 - 9.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Role:-Data Analyst Exp:- 6-11 Yrs Location:-Hyderabad Primary Skills:- ETL,Informatica,Python, SQL,BI tools and Investment Domain Please share your resumes to rajamahender.n@technogenindia.com , Job Description:- The Minimum Qualifications Education: Bachelor’s or Master’s degree in Data Science, Statistics, Mathematics, Computer Science, Actuarial Science, or related field. Experience: 7-9 years of experience as a Data Analyst, with at least 5 years supporting Finance within the insurance industry. Hands-on experience with Vertica/Teradata for querying, performance optimization, and large-scale data analysis. Advanced SQL skills: proficiency in Python is a strong plus. Proven ability to write detailed source-to-target mapping documents and collaborate with technical teams on data integration. Experience working in hybrid onshore-offshore team environments. Deep understanding of data modelling concepts and experience working with relational and dimensional models. Strong communication skills with the ability to clearly explain technical concepts to non-technical audiences. A strong understanding of statistical concepts, probability and accounting standards, financial statements (balance sheet, income statement, cash flow statement), and financial ratios. Strong understanding of life insurance products and business processes across the policy lifecycle. Investment Principles: Knowledge of different asset classes, investment strategies, and financial markets. Quantitative Finance: Understanding of financial modelling, risk management, and derivatives. Regulatory Framework: Awareness of relevant financial regulations and compliance requirements. The Ideal Qualifications Technical Skills: Proven track record of Analytical and Problem-Solving skills. A solid understanding of Financial Accounting Systems and knowledge of accounting principles, reporting and budgeting Strong data analysis skills for extracting insights from financial data Proficiency in data visualization tools and reporting software is also important. Experience integrating financial systems with actuarial, policy administration, and claims platforms. Familiarity with actuarial processes, reinsurance, or regulatory reporting requirements. Experience with General Ledger systems such as SAP and forecasting tools like Anaplan. Soft Skills: Exceptional communication and interpersonal skills. Ability to influence and motivate teams without direct authority. Excellent time management and organizational skills, with the ability to prioritize multiple initiatives. What to Expect as Part of MassMutual and the Team Regular meetings with the Corporate Technology leadership team Focused one-on-one meetings with your manager Access to mentorship opportunities Access to learning content on Degreed and other informational platforms Your ethics and integrity will be valued by a company with a strong and stable ethical business with industry leading pay and benefits
Posted 1 week ago
0 years
0 Lacs
New Delhi, Delhi, India
On-site
The purpose of this role is to oversee the development of our database marketing solutions, using database technologies such as Microsoft SQL Server/Azure, Amazon Redshift, and Google BigQuery. The role will be involved in design, development, troubleshooting, and issue resolution. It involves upgrading, enhancing, and optimizing the technical solution, and continuous integration and continuous deployment of changes to the business logic implementation. The role requires interaction with internal stakeholders and/or clients to explain technology solutions, and a clear understanding of the client's business requirements in order to guide the optimal design/solution to meet their needs. The ability to communicate to both technical and non-technical audiences is key.

Job Description:

Must Have Skills: Database (SQL Server / Snowflake / Teradata / Redshift / Vertica / Oracle / BigQuery / Azure DW, etc.), ETL (Extract, Transform, Load) tools (Talend, Informatica, SSIS, DataStage, Matillion), Python, UNIX shell scripting, project & resource management, workflow orchestration (Tivoli, Tidal, Stonebranch), client-facing skills.

Good to Have Skills: Experience in cloud computing (one or more of AWS, Azure, GCP); AWS preferred.

Key responsibilities: Understanding and practical knowledge of data warehouses, data marts, data modelling, data structures, databases, and data ingestion and transformation. Strong understanding of ETL processes as well as database skills and common IT offerings, i.e., storage, backups, and operating systems. Strong understanding of SQL and database programming languages. Strong knowledge of development methodologies and tools. Contribute to design and oversee code reviews for compliance with development standards. Design and implement technical vision for existing clients. Able to convert documented requirements into technical solutions and implement them within the given timeline without quality issues. Able to quickly identify solutions for production failures and fix them. Document project architecture, explain detailed design to the team, and create low-level to high-level designs. Perform mid to complex level tasks independently. Support clients, data scientists, and analytical consultants working on marketing solutions. Work with cross-functional internal teams and external clients. Strong project management and organization skills. Ability to lead/work on 1-2 projects with a team size of 2-3 members. Work with code management systems, including code reviews and deployments.

Location: DGS India - Pune - Baner M- Agile. Brand: Merkle. Time Type: Full time. Contract Type: Permanent.
Posted 1 week ago
3.0 - 5.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
You desire impactful work. You’re RGA ready.

RGA is a purpose-driven organization working to solve today’s challenges through innovation and collaboration. A Fortune 500 company listed among Fortune's World’s Most Admired Companies, we’re the only global reinsurance company to focus primarily on life- and health-related solutions. Join our multinational team of intelligent, motivated, and collaborative people, and help us make financial protection accessible to all.

General Summary: Under limited supervision, participate in and support various GFS initiatives related to administration and data integration. This includes working with RGA Valuation, Finance, Pricing, and Risk Management to ensure consistent and quality data extracts are produced. Lead and support the development and implementation of processes for analyzing, mapping, and testing client data to be used in various downstream processes. Lead and support the analysis of client-reported inventories for new deals and review changes to existing deals.

Responsibilities: Serve as a technical resource for guiding the team in extracting, loading, and mapping client data files. Serve as a subject matter expert when dealing with the most complex issues related to data conversion. Write and execute data queries to get the results needed for analysis, validity, and accuracy testing. Interpret data, analyze results, and provide ongoing reporting and analysis of key metrics. Champion the future vision of the department and assist in the creation/maintenance of data repository documentation and data standards and guidelines. Solve business problems with a moderate level of complexity; analyze possible solutions using technical experience, judgment, and precedents. Explain data research and findings in a clear and straightforward manner to assist leadership in prioritizing business and information needs. Analyze, test, and debug system solutions. Consult with other Operations and Risk Management associates in the development of solutions for specific business needs. Perform other duties/projects as assigned.

Required Education and Experience: Bachelor’s degree in Information Technology, Computer Science, Data Science, Actuarial Science or a related degree, or equivalent experience. 3-5 years of experience in a data quality assurance and/or annuity/pension administration system testing role. Preferred: Progress toward FLMI, ALHC or another relevant professional accreditation.

Required Skills and Abilities: Intermediate Word, Excel, VBA, and SQL/query skills. Advanced level of investigative, analytical, and problem-solving skills. Detail oriented, passionate about completing the task correctly rather than quickly. Advanced oral and written communication skills demonstrating the ability to share and impart knowledge. Advanced ability to liaise with individuals across a wide variety of operational, functional, and technical disciplines. Familiarity with insurance administrative systems, ability to calculate benefits under multiple structures, and basic understanding of how the data affects liabilities and financial results. Ability to work effectively within a team environment and individually. Advanced ability to translate business needs and problems into viable/accepted solutions.
Advanced skills in customer relationship management and change management
Ability to interpret and understand various client data formats
Broad business knowledge, including knowledge of valuation, finance, and/or administrative systems
Advanced ability to set goals and handle multiple tasks, clients, and projects simultaneously; ability to appropriately balance priorities, deadlines, and deliverables
Willingness to learn new skills and software applications
Ability to customize a process for testing that can be repeated by others if needed
Ability to liaise with individuals across a wide variety of operational, functional, and technical disciplines

Preferred
Advanced-to-expert knowledge of database applications such as Access, SQL Server, or Oracle, as well as SQL
Complex analytical and problem-solving skills
Experience with data management and/or visualization tools such as Tableau, Alteryx, Informatica, Python, etc.
Demonstrated management experience
Insurance industry knowledge

This is a contractual role for 1 year.

What You Can Expect From RGA
Gain valuable knowledge from and experience with diverse, caring colleagues around the world.
Enjoy a respectful, welcoming environment that fosters individuality and encourages pioneering thought.
Join the bright and creative minds of RGA, and experience vast, endless career potential.
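As a rough illustration of the validity and accuracy testing described in the responsibilities above, the Python/pandas sketch below runs a few basic data-quality checks on a client inventory file. The file name and column names (policy_number, issue_date, reserve_amount) are assumptions rather than RGA's actual data model; in practice these checks would more likely be written as SQL queries against the data repository.

```python
import pandas as pd

# Hypothetical client inventory extract; column names are illustrative only.
inventory = pd.read_csv("client_inventory.csv")

checks = {
    # Every record should carry a policy number and an issue date.
    "missing_policy_number": int(inventory["policy_number"].isna().sum()),
    "missing_issue_date": int(inventory["issue_date"].isna().sum()),
    # Policy numbers should be unique within a single reporting period.
    "duplicate_policies": int(inventory.duplicated(subset=["policy_number"]).sum()),
    # Reserves should never be negative.
    "negative_reserves": int((inventory["reserve_amount"] < 0).sum()),
}

for name, count in checks.items():
    status = "OK" if count == 0 else "REVIEW"
    print(f"{name}: {count} ({status})")
```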
Posted 1 week ago
7.0 years
8 - 9 Lacs
Thiruvananthapuram
On-site
7 - 9 Years | 4 Openings | Trivandrum

Role description
Role Proficiency: This role requires proficiency in developing data pipelines, including coding and testing for ingesting, wrangling, transforming, and joining data from various sources. The ideal candidate should be adept with ETL tools like Informatica, Glue, Databricks, and DataProc, with strong coding skills in Python, PySpark, and SQL. This position demands independence and proficiency across various data domains. Expertise in data warehousing solutions such as Snowflake, BigQuery, Lakehouse, and Delta Lake is essential, including the ability to calculate processing costs and address performance issues. A solid understanding of DevOps and infrastructure needs is also required.

Outcomes:
Act creatively to develop pipelines/applications by selecting appropriate technical options, optimizing application development, maintenance, and performance through design patterns and reuse of proven solutions.
Support the Project Manager in day-to-day project execution and account for the development activities of others.
Interpret requirements and create optimal architecture and design solutions in accordance with specifications.
Document and communicate milestones/stages for end-to-end delivery.
Code using best standards; debug and test solutions to ensure best-in-class quality.
Tune code performance and align it with the appropriate infrastructure, understanding the cost implications of licenses and infrastructure.
Create data schemas and models effectively.
Develop and manage data storage solutions, including relational databases, NoSQL databases, Delta Lakes, and data lakes.
Validate results with user representatives, integrating the overall solution.
Influence and enhance customer satisfaction and employee engagement within project teams.

Measures of Outcomes:
Adherence to engineering processes and standards
Adherence to schedule/timelines
Adherence to SLAs where applicable
Number of defects post delivery
Number of non-compliance issues
Reduction in recurrence of known defects
Quick turnaround of production bugs
Completion of applicable technical/domain certifications
Completion of all mandatory training requirements
Efficiency improvements in data pipelines (e.g. reduced resource consumption, faster run times)
Average time to detect, respond to, and resolve pipeline failures or data issues
Number of data security incidents or compliance breaches

Outputs Expected:
Code: Develop data processing code with guidance, ensuring performance and scalability requirements are met. Define coding standards, templates, and checklists. Review code for the team and peers.
Documentation: Create/review templates, checklists, guidelines, and standards for design/process/development. Create/review deliverable documents including design documents, architecture documents, infra costing, business requirements, source-target mappings, and test cases and results.
Configure: Define and govern the configuration management plan. Ensure compliance from the team.
Test: Review/create unit test cases, scenarios, and execution. Review test plans and strategies created by the testing team. Provide clarifications to the testing team.
Domain Relevance: Advise data engineers on the design and development of features and components, leveraging a deeper understanding of business needs. Learn more about the customer domain and identify opportunities to add value. Complete relevant domain certifications.
Manage Project: Support the Project Manager with project inputs. Provide inputs on project plans or sprints as needed. Manage the delivery of modules.
Manage Defects: Perform defect root cause analysis (RCA) and mitigation. Identify defect trends and implement proactive measures to improve quality.
Estimate: Create and provide input for effort and size estimation, and plan resources for projects.
Manage Knowledge: Consume and contribute to project-related documents, SharePoint libraries, and client universities. Review reusable documents created by the team.
Release: Execute and monitor the release process.
Design: Contribute to the creation of designs (HLD, LLD, SAD) and architecture for applications, business components, and data models.
Interface with Customer: Clarify requirements and provide guidance to the development team. Present design options to customers. Conduct product demos. Collaborate closely with customer architects to finalize designs.
Manage Team: Set FAST goals and provide feedback. Understand team members' aspirations and provide guidance and opportunities. Ensure team members are upskilled. Engage the team in projects. Proactively identify attrition risks and collaborate with BSE on retention measures.
Certifications: Obtain relevant domain and technology certifications.

Skill Examples:
Proficiency in SQL, Python, or other programming languages used for data manipulation.
Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF.
Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g. AWS Glue, BigQuery).
Conduct tests on data pipelines and evaluate results against data quality and performance specifications.
Experience in performance tuning.
Experience in data warehouse design and cost improvements.
Apply and optimize data models for efficient storage, retrieval, and processing of large datasets.
Communicate and explain design/development aspects to customers.
Estimate time and resource requirements for developing/debugging features/components.
Participate in RFP responses and solutioning.
Mentor team members and guide them in relevant upskilling and certification.

Knowledge Examples:
Knowledge of various ETL services used by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/Dataflow, Azure ADF, and ADLS.
Proficiency in SQL for analytics and windowing functions.
Understanding of data schemas and models.
Familiarity with domain-related data.
Knowledge of data warehouse optimization techniques.
Understanding of data security concepts.
Awareness of patterns, frameworks, and automation practices.

Additional Comments:
We are seeking a highly experienced Senior Data Engineer to design, develop, and optimize scalable data pipelines in a cloud-based environment. The ideal candidate will have deep expertise in PySpark, SQL, and Azure Databricks, and experience with either AWS or GCP. A strong foundation in data warehousing, ELT/ETL processes, and dimensional modeling (Kimball/star schema) is essential for this role.

Must-Have Skills:
8+ years of hands-on experience in data engineering or big data development.
Strong proficiency in PySpark and SQL for data transformation and pipeline development (see the illustrative sketch after this posting).
Experience working in Azure Databricks or equivalent Spark-based cloud platforms.
Practical knowledge of cloud data environments – Azure, AWS, or GCP.
Solid understanding of data warehousing concepts, including the Kimball methodology and star/snowflake schema design.
Proven experience designing and maintaining ETL/ELT pipelines in production.
Familiarity with version control (e.g., Git), CI/CD practices, and data pipeline orchestration tools (e.g., Airflow, Azure Data Factory).

Skills: Azure Data Factory, Azure Databricks, PySpark, SQL

About UST
UST is a global digital transformation solutions provider. For more than 20 years, UST has worked side by side with the world's best companies to make a real impact through transformation. Powered by technology, inspired by people and led by purpose, UST partners with their clients from design to operation. With deep domain expertise and a future-proof philosophy, UST embeds innovation and agility into their clients' organizations. With over 30,000 employees in 30 countries, UST builds for boundless impact—touching billions of lives in the process.
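A minimal PySpark sketch of the ingest-wrangle-join pattern the role describes, building a star-schema style fact table from a raw feed and a conformed dimension. The paths, column names, and the customer_key surrogate key are assumptions for illustration only; a production pipeline would read from the configured lake/landing zone and would usually write Delta rather than plain Parquet.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("fact_orders_build").getOrCreate()

# Illustrative inputs: a raw order feed and a conformed customer dimension.
orders = spark.read.parquet("/landing/orders")
customers = spark.read.parquet("/warehouse/dim_customer")

fact_orders = (
    orders
    .filter(F.col("order_status") == "COMPLETED")      # wrangle: keep completed orders only
    .withColumn("order_date", F.to_date("order_ts"))   # derive the date grain
    .join(customers, on="customer_id", how="left")     # attach the dimension's surrogate key
    .groupBy("order_date", "customer_key")
    .agg(
        F.sum("order_amount").alias("total_amount"),
        F.count("*").alias("order_count"),
    )
)

# Write the fact table partitioned by date; Delta Lake would be a drop-in format here.
fact_orders.write.mode("overwrite").partitionBy("order_date").parquet("/warehouse/fact_orders")
```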
Posted 1 week ago
15.0 years
0 Lacs
Hyderābād
On-site
Project Role: Technology Account Lead
Project Role Description: Function as the primary contact for technology work at each account. Integrate technology contracts and engagements at the client. Leverage all technology offerings to expand the scope of technology work at the account (up-sell/cross-sell). Create the technology account plan and get the right people involved to maximize the opportunity and build the account.
Must have skills: SAP PP Production Planning & Control Discrete Industries, Informatica-BI Tools
Good to have skills: NA
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Technology Account Lead, you will serve as the primary contact for technology initiatives at each assigned account. Your typical day will involve integrating technology contracts and engagements, collaborating with various stakeholders, and leveraging technology offerings to enhance the scope of work. You will be responsible for creating a comprehensive technology account plan, ensuring that the right individuals are engaged to maximize opportunities and foster account growth. Your role will require strategic thinking and effective communication to align technology solutions with client needs, ultimately driving success for both the client and the organization.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate regular meetings to ensure alignment and progress across teams.
- Mentor junior professionals to enhance their skills and knowledge in technology account management.

Professional & Technical Skills:
- Must Have Skills: Proficiency in SAP PP Production Planning & Control Process Industries.
- Strong understanding of technology integration and account management strategies.
- Experience in developing and executing technology account plans.
- Ability to analyze client needs and propose tailored technology solutions.
- Excellent communication and interpersonal skills to engage with diverse teams and stakeholders.

Additional Information:
- The candidate should have a minimum of 12 years of experience in SAP PP Production Planning & Control Process Industries.
- This position is based at our Hyderabad office.
- A 15 years full time education is required.
Posted 1 week ago
130.0 years
4 - 7 Lacs
Hyderābād
On-site
Job Description
Manager, Data Visualization

The Opportunity
Based in Hyderabad, join a global healthcare biopharma company and be part of a 130-year legacy of success backed by ethical integrity, forward momentum, and an inspiring mission to achieve new milestones in global healthcare. Be part of an organisation driven by digital technology and data-backed approaches that support a diversified portfolio of prescription medicines, vaccines, and animal health products. Drive innovation and execution excellence. Be part of a team with a passion for using data, analytics, and insights to drive decision-making, and which creates custom software, allowing us to tackle some of the world's greatest health threats.

Our Technology centers focus on creating a space where teams can come together to deliver business solutions that save and improve lives. An integral part of the company's IT operating model, Tech centers are globally distributed locations where each IT division has employees to enable our digital transformation journey and drive business outcomes. These locations, in addition to the other sites, are essential to supporting our business and strategy. A focused group of leaders in each tech center helps to ensure we can manage and improve each location, from investing in the growth, success, and well-being of our people, to making sure colleagues from each IT division feel a sense of belonging, to managing critical emergencies. And together, we must leverage the strength of our team to collaborate globally to optimize connections and share best practices across the Tech Centers.

Role Overview:
A unique opportunity to be part of an Insight & Analytics Data hub for a leading biopharmaceutical company and define a culture that creates a compelling customer experience. Bring your entrepreneurial curiosity and learning spirit into a career of purpose, personal growth, and leadership. We are seeking those who have a passion for using data, analytics, and insights to drive decision-making that will allow us to tackle some of the world's greatest health threats.

As a Manager in Data Visualization, you will be focused on designing and developing compelling data visualization solutions to enable actionable insights and facilitate intuitive information consumption for internal business stakeholders. The ideal candidate will demonstrate competency in building user-centric visuals and dashboards that empower stakeholders with data-driven insights and decision-making capability.

Our Quantitative Sciences team uses big data to analyze the safety and efficacy claims of our potential medical breakthroughs. We review the quality and reliability of clinical studies using deep scientific knowledge, statistical analysis, and high-quality data to support decision-making in clinical trials.

What will you do in this role:
Design and develop user-centric data visualization solutions utilizing complex data sources.
Identify and define key business metrics and KPIs in partnership with business stakeholders (see the illustrative sketch after this posting).
Define and develop scalable data models in alignment with, and with support from, the data engineering and IT teams.
Lead UI/UX workshops to develop user stories, wireframes, and intuitive visualizations.
Collaborate with data engineering, data science, and IT teams to deliver business-friendly dashboard and reporting solutions.
Apply best practices in data visualization design and continuously improve the intuitive user experience for business stakeholders.
Provide thought leadership and data visualization best practices to the broader Data & Analytics organization.
Identify opportunities to apply data visualization technologies to streamline and enhance manual/legacy reporting deliveries.
Provide training and coaching to internal stakeholders to enable a self-service operating model.
Co-create information governance and apply data privacy best practices to solutions.
Continuously innovate on visualization best practices and technologies by reviewing external resources and the marketplace.

What should you have:
5 years' relevant experience in data visualization, infographics, and interactive visual storytelling
Working experience and knowledge in Power BI / Qlik / Spotfire / Tableau and other data visualization technologies
Working experience and knowledge in ETL processes, data modeling techniques, and platforms (Alteryx, Informatica, Dataiku, etc.)
Experience working with database technologies (Redshift, Oracle, Snowflake, etc.) and data processing languages (SQL, Python, R, etc.)
Experience in leveraging and managing third-party vendors and contractors
Self-motivation, proactivity, and the ability to work independently with minimal direction
Excellent interpersonal and communication skills
Excellent organizational skills, with the ability to navigate a complex matrix environment and organize/prioritize work efficiently and effectively
Demonstrated ability to collaborate and lead with diverse groups of work colleagues and positively manage ambiguity
Experience in the Pharma and/or Biotech industry is a plus

Our technology teams operate as business partners, proposing ideas and innovative solutions that enable new organizational capabilities. We collaborate internationally to deliver services and solutions that help everyone be more productive and enable innovation.

Who we are:
We are known as Merck & Co., Inc., Rahway, New Jersey, USA in the United States and Canada, and MSD everywhere else. For more than a century, we have been inventing for life, bringing forward medicines and vaccines for many of the world's most challenging diseases. Today, our company continues to be at the forefront of research to deliver innovative health solutions and advance the prevention and treatment of diseases that threaten people and animals around the world.

What we look for:
Imagine getting up in the morning for a job as important as helping to save and improve lives around the world. Here, you have that opportunity. You can put your empathy, creativity, digital mastery, or scientific genius to work in collaboration with a diverse group of colleagues who pursue and bring hope to countless people who are battling some of the most challenging diseases of our time. Our team is constantly evolving, so if you are among the intellectually curious, join us and start making your impact today.

#HYDIT2025

Current Employees apply HERE
Current Contingent Workers apply HERE

Search Firm Representatives, Please Read Carefully:
Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs/resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place.
Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails.

Employee Status: Regular
Relocation:
VISA Sponsorship:
Travel Requirements:
Flexible Work Arrangements: Hybrid
Shift:
Valid Driving License:
Hazardous Material(s):

Required Skills: Business Intelligence (BI), Clinical Decision Support (CDS), Clinical Testing, Communication, Create User Stories, Data Visualization, Digital Transformation, Healthcare Innovation, Information Technology Operations, IT Operation, Management Process, Marketing, Motivation Management, Requirements Management, Self Motivation, Statistical Analysis, Statistics, Thought Leadership, User Experience (UX) Design
Preferred Skills:
Job Posting End Date: 07/31/2025
A job posting is effective until 11:59:59 PM on the day BEFORE the listed job posting end date. Please ensure you apply to a job posting no later than the day BEFORE the job posting end date.
Requisition ID: R359276
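For illustration only, and assuming hypothetical file, column, and metric names, the short Python/pandas sketch below produces the kind of tidy, pre-aggregated KPI table that typically backs a user-centric dashboard in Power BI, Qlik, Spotfire, or Tableau; in this role the actual metric definitions would be agreed with business stakeholders and the scalable model built with the data engineering team.

```python
import pandas as pd

# Hypothetical clinical-operations extract; columns and metrics are stand-ins.
visits = pd.read_csv("site_visits.csv", parse_dates=["visit_date"])

kpis = (
    visits
    .assign(month=visits["visit_date"].dt.to_period("M").astype(str))
    .groupby(["study_id", "month"], as_index=False)
    .agg(
        total_visits=("visit_id", "count"),
        missed_visits=("missed_flag", "sum"),
    )
)
kpis["missed_rate"] = kpis["missed_visits"] / kpis["total_visits"]

# Export a tidy table that a Power BI / Tableau data model can consume directly.
kpis.to_csv("kpi_site_visits_monthly.csv", index=False)
```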
Posted 1 week ago
15.0 years
0 Lacs
Hyderābād
On-site
Project Role: Integration Engineer
Project Role Description: Provide consultative Business and System Integration services to help clients implement effective solutions. Understand and translate customer needs into business and technology solutions. Drive discussions and consult on transformation, the customer journey, and functional/application designs, and ensure technology and business solutions represent business requirements.
Must have skills: Infrastructure As Code (IaC)
Good to have skills: Google Cloud Storage, Microsoft Azure Databricks, Ansible on Microsoft Azure
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Integration Engineer, you will provide consultative Business and System Integration services to assist clients in implementing effective solutions. Your typical day will involve engaging with clients to understand their needs, facilitating discussions on transformation, and ensuring that the technology and business solutions align with their requirements. You will work collaboratively with various teams to translate customer needs into actionable plans, driving the customer journey and application designs to achieve optimal outcomes.

Roles & Responsibilities:
- Expected to be an SME; collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate workshops and meetings to gather requirements and feedback from stakeholders.
- Develop and maintain documentation related to integration processes and solutions.
- Infrastructure as Code (IaC): knowledge of tools like Terraform, Terraform linkage, Helm, and Ansible, including Ansible Tower dependency and package management (see the illustrative sketch after this posting).
- Broad knowledge of operating systems.
- Network management knowledge and an understanding of network protocols, configuration, and troubleshooting; proficiency in configuring and managing network settings within cloud platforms.
- Security: knowledge of cybersecurity principles and practices, implementing security frameworks that ensure secure workloads and data protection.
- Expert proficiency in the Linux CLI.
- Monitoring of the environment from a technical perspective.
- Monitoring the costs of the development environment.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Infrastructure As Code (IaC).
- Good To Have Skills: Experience with Hitachi Data Systems (HDS), Google Cloud Storage, Microsoft Azure Databricks.
- Strong understanding of cloud infrastructure and deployment strategies.
- Experience with automation tools and frameworks for infrastructure management.
- Familiarity with version control systems and CI/CD pipelines.
- Solid understanding of data modelling, data warehousing, and data platform design.
- Working knowledge of databases and SQL.
- Proficient with version control such as Git, GitHub, or GitLab.
- Experience supporting BAT teams and BAT test environments.
- Experience with workflow and batch scheduling; Control-M and Informatica experience is an added advantage.
- Good know-how of financial markets; know-how of clearing, trading, and risk business processes is an added advantage.
- Know-how of Java, Spark, and BI reporting is an added advantage.
- Know-how of cloud platforms and an affinity towards modern technology is an added advantage.
- Experience with CI/CD pipelines and exposure to DevOps methodologies will be considered an added advantage.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Infrastructure As Code (IaC).
- This position is based in Hyderabad.
- A 15 years full time education is required.
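The role centres on declarative tooling such as Terraform, Helm, and Ansible; purely to illustrate the infrastructure-as-code idea in Python, the sketch below uses Pulumi's Python SDK to declare a resource and export its identifier. The resource name and tags are assumptions, and the choice of Pulumi here is a substitution for illustration, not a requirement of the role.

```python
import pulumi
import pulumi_aws as aws

# Illustrative resource name and tags; a real engagement would follow the
# client's naming standards and would likely express this in Terraform/Helm/Ansible.
landing_bucket = aws.s3.Bucket(
    "data-landing",
    tags={"environment": "dev", "managed_by": "iac"},
)

# Exported outputs play the same role as Terraform outputs: downstream stacks
# and pipelines read them instead of hard-coding resource identifiers.
pulumi.export("landing_bucket_name", landing_bucket.id)
```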
Posted 1 week ago
7.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Description And Requirements

Position Summary
The engineer role supports external data transmissions, operations, scheduling, and middleware transmissions. It requires experience in Windows and Linux environments, knowledge of Informatica MFT and Data Exchange tools, and the ability to handle day-to-day customer transmissions and Informatica MFT/DX activities.

Job Responsibilities
Design and implement complex integration solutions through collaboration with engineers, application teams, and operations teams across the global enterprise.
Provide technical support to application developers when required, including promoting the use of best practices, ensuring standardization across applications, and troubleshooting.
Create new setups and support existing transmissions (a simplified illustration follows this posting).
Diagnose and troubleshoot transmission and connection issues.
Experience in Windows administration; expertise in IBM Workload Scheduler is good to have.
Hands-on experience with tools like IIS, Informatica MFT & DX console, Splunk, and IBM Workload Scheduler.
Plan, engineer, and implement new transmissions as well as migrate existing setups.
Participate in the evaluation and recommendation of new products and technologies.
Represent the domain in relevant automation and value innovation efforts.
Technical leadership, the ability to think strategically, and the ability to communicate solutions effectively to a variety of stakeholders.
Debug production issues by analyzing logs directly and using tools like Splunk.
Learn new technologies based on demand and help team members by coaching and assisting.
Willing to work in rotational shifts.
Good communication skills, with the ability to communicate clearly and effectively.

Knowledge, Skills And Abilities

Education
Bachelor's degree in Computer Science, Information Systems, or a related field

Experience
7+ years of total experience, with at least 4+ years of experience in designing and implementing complex integration solutions through collaboration with engineers, application, and operations teams
Create new setups and support existing transmissions
Experience with tools like IIS, Informatica MFT & DX console, Splunk, and IBM Workload Scheduler
SSH/SSL/Tectia
Microsoft IIS
IBM Connect:Direct
IBM Sterling
Informatica MFT
Operating system knowledge (Linux/Windows/AIX)
Troubleshooting
Azure DevOps pipeline knowledge
Mainframe z/OS knowledge
OpenShift and Kubernetes
Enterprise scheduling knowledge (Maestro)

Good to Have:
Python and/or PowerShell
Agile SAFe for Teams
Ansible (Automation)
Elastic

Other Requirements (licenses, certifications, specialized training – if required)

Working Relationships
Internal Contacts (and purpose of relationship): MetLife internal partners
External Contacts (and purpose of relationship) – if applicable: MetLife external partners

About MetLife
Recognized on Fortune magazine's list of the 2025 "World's Most Admired Companies" and Fortune World's 25 Best Workplaces™ for 2024, MetLife, through its subsidiaries and affiliates, is one of the world's leading financial services companies, providing insurance, annuities, employee benefits, and asset management to individual and institutional customers. With operations in more than 40 markets, we hold leading positions in the United States, Latin America, Asia, Europe, and the Middle East. Our purpose is simple: to help our colleagues, customers, communities, and the world at large create a more confident future.
United by purpose and guided by empathy, we’re inspired to transform the next century in financial services. At MetLife, it’s #AllTogetherPossible . Join us!
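As a simplified illustration of one leg of a secure file transmission and its delivery check (the host, service account, key, and paths are assumptions), the Python/paramiko sketch below pushes a file over SFTP and verifies the delivered size. In this role the equivalent step would normally be configured in Informatica MFT/DX, Connect:Direct, or Sterling and scheduled via IBM Workload Scheduler rather than scripted by hand.

```python
import os
import paramiko

# Hypothetical partner host, service account, and file paths.
HOST, USER, KEY_FILE = "partner.example.com", "mft_svc", "/etc/keys/mft_svc_rsa"
LOCAL_FILE = "/outbound/claims_20250131.csv"
REMOTE_FILE = "/inbound/claims_20250131.csv"

client = paramiko.SSHClient()
# AutoAddPolicy is a demo shortcut; production transmissions pin known host keys.
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(HOST, username=USER, key_filename=KEY_FILE)
try:
    sftp = client.open_sftp()
    sftp.put(LOCAL_FILE, REMOTE_FILE)
    # Basic delivery check: the remote size must match the local file size.
    if sftp.stat(REMOTE_FILE).st_size != os.path.getsize(LOCAL_FILE):
        raise RuntimeError("Size mismatch after transfer; flag for retransmission")
    sftp.close()
finally:
    client.close()
```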
Posted 1 week ago