5.0 years
0 Lacs
Hyderābād
On-site
Job Information
Date Opened: 06/19/2025
Job Type: Full time
Work Experience: 5+ years
Industry: IT Services
City: Hyderabad
State/Province: Telangana
Country: India
Zip/Postal Code: 500032

Job Description
As a Data Governance Lead at Kanerika, you will be responsible for defining, leading, and operationalizing the data governance framework, ensuring enterprise-wide alignment and regulatory compliance.

Key Responsibilities:
1. Governance Strategy & Stakeholder Alignment
- Develop and maintain enterprise data governance strategies, policies, and standards.
- Align governance with business goals: compliance, analytics, and decision-making.
- Collaborate across business, IT, legal, and compliance teams for role alignment.
- Drive governance training, awareness, and change management programs.
2. Microsoft Purview Administration & Implementation
- Manage Microsoft Purview accounts, collections, and RBAC aligned to the org structure.
- Optimize the Purview setup for large-scale environments (50 TB+).
- Integrate with Azure Data Lake, Synapse, SQL DB, Power BI, and Snowflake.
- Schedule scans, set up classification jobs, and maintain collection hierarchies.
3. Metadata & Lineage Management
- Design metadata repositories and maintain business glossaries and data dictionaries.
- Implement ingestion workflows via ADF, REST APIs, PowerShell, and Azure Functions.
- Ensure lineage mapping (ADF → Synapse → Power BI) and impact analysis.
4. Data Classification & Security Governance
- Define classification rules and sensitivity labels (PII, PCI, PHI).
- Integrate with MIP, DLP, Insider Risk Management, and Compliance Manager.
- Enforce records management, lifecycle policies, and information barriers.
5. Data Quality & Policy Management
- Define KPIs and dashboards to monitor data quality across domains.
- Collaborate on rule design, remediation workflows, and exception handling.
- Ensure policy compliance (GDPR, HIPAA, CCPA, etc.) and risk management.
6. Business Glossary & Stewardship
- Maintain the business glossary with domain owners and stewards in Purview.
- Enforce approval workflows, standard naming, and steward responsibilities.
- Conduct metadata audits for glossary and asset documentation quality.
7. Automation & Integration
- Automate governance processes using PowerShell, Azure Functions, and Logic Apps.
- Create pipelines for ingestion, lineage, glossary updates, and tagging.
- Integrate with Power BI, Azure Monitor, Synapse Link, Collibra, BigID, etc.
8. Monitoring, Auditing & Compliance
- Set up dashboards for audit logs, compliance reporting, and metadata coverage.
- Oversee data lifecycle management across its phases.
- Support internal and external audit readiness with proper documentation.

Requirements
- 7+ years of experience in data governance and data management.
- Proficient in Microsoft Purview and Informatica data governance tools.
- Strong in metadata management, lineage mapping, classification, and security.
- Experience with ADF, REST APIs, Talend, dbt, and automation via Azure tools.
- Knowledge of GDPR, CCPA, HIPAA, SOX, and related compliance needs.
- Skilled in bridging technical governance with business and compliance goals.

Benefits
1. Culture:
- Open Door Policy: Encourages open communication and accessibility to management.
- Open Office Floor Plan: Fosters a collaborative and interactive work environment.
- Flexible Working Hours: Allows employees to have flexibility in their work schedules.
- Employee Referral Bonus: Rewards employees for referring qualified candidates.
- Appraisal Process Twice a Year: Provides regular performance evaluations and feedback.
2. Inclusivity and Diversity:
- Hiring practices that promote diversity: Ensures a diverse and inclusive workforce.
- Mandatory POSH training: Promotes a safe and respectful work environment.
3. Health Insurance and Wellness Benefits:
- GMC and Term Insurance: Offers medical coverage and financial protection.
- Health Insurance: Provides coverage for medical expenses.
- Disability Insurance: Offers financial support in case of disability.
4. Child Care & Parental Leave Benefits:
- Company-sponsored family events: Creates opportunities for employees and their families to bond.
- Generous Parental Leave: Allows parents to take time off after the birth or adoption of a child.
- Family Medical Leave: Offers leave for employees to take care of family members' medical needs.
5. Perks and Time-Off Benefits:
- Company-sponsored outings: Organizes recreational activities for employees.
- Gratuity: Provides a monetary benefit as a token of appreciation.
- Provident Fund: Helps employees save for retirement.
- Generous PTO: Offers more than the industry standard for paid time off.
- Paid sick days: Allows employees to take paid time off when they are unwell.
- Paid holidays: Gives employees paid time off for designated holidays.
- Bereavement Leave: Provides time off for employees to grieve the loss of a loved one.
6. Professional Development Benefits:
- L&D with FLEX - Enterprise Learning Repository: Provides access to a learning repository for professional development.
- Mentorship Program: Offers guidance and support from experienced professionals.
- Job Training: Provides training to enhance job-related skills.
- Professional Certification Reimbursements: Assists employees in obtaining professional certifications.
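The classification responsibility above (defining rules and sensitivity labels for PII, PCI, PHI) can be illustrated with a minimal rule sketch. The rule names and regex patterns below are simplified assumptions for illustration only, not Microsoft Purview's built-in classifiers:

```python
import re

# Illustrative sensitivity-label rules; the labels and regexes are
# deliberately simplified assumptions, not a real classifier catalog.
RULES = {
    "Email (PII)": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "US SSN (PII)": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "Card number (PCI)": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def classify(value: str) -> list[str]:
    """Return every sensitivity label whose pattern matches the value."""
    return [label for label, pattern in RULES.items() if pattern.search(value)]

print(classify("contact: jane.doe@example.com"))  # → ['Email (PII)']
```

In a real Purview deployment, rules like these would be configured as custom classifications and applied by scheduled scan jobs rather than in application code.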
Posted 1 month ago
7.0 years
3 - 6 Lacs
Hyderābād
On-site
As a Data Governance Lead at Kanerika, you will be responsible for defining, leading, and operationalizing the data governance framework, ensuring enterprise-wide alignment and regulatory compliance.

Key Responsibilities:
1. Governance Strategy & Stakeholder Alignment
- Develop and maintain enterprise data governance strategies, policies, and standards.
- Align governance with business goals: compliance, analytics, and decision-making.
- Collaborate across business, IT, legal, and compliance teams for role alignment.
- Drive governance training, awareness, and change management programs.
2. Microsoft Purview Administration & Implementation
- Manage Microsoft Purview accounts, collections, and RBAC aligned to the org structure.
- Optimize the Purview setup for large-scale environments (50 TB+).
- Integrate with Azure Data Lake, Synapse, SQL DB, Power BI, and Snowflake.
- Schedule scans, set up classification jobs, and maintain collection hierarchies.
3. Metadata & Lineage Management
- Design metadata repositories and maintain business glossaries and data dictionaries.
- Implement ingestion workflows via ADF, REST APIs, PowerShell, and Azure Functions.
- Ensure lineage mapping (ADF → Synapse → Power BI) and impact analysis.
4. Data Classification & Security Governance
- Define classification rules and sensitivity labels (PII, PCI, PHI).
- Integrate with MIP, DLP, Insider Risk Management, and Compliance Manager.
- Enforce records management, lifecycle policies, and information barriers.
5. Data Quality & Policy Management
- Define KPIs and dashboards to monitor data quality across domains.
- Collaborate on rule design, remediation workflows, and exception handling.
- Ensure policy compliance (GDPR, HIPAA, CCPA, etc.) and risk management.
6. Business Glossary & Stewardship
- Maintain the business glossary with domain owners and stewards in Purview.
- Enforce approval workflows, standard naming, and steward responsibilities.
- Conduct metadata audits for glossary and asset documentation quality.
7. Automation & Integration
- Automate governance processes using PowerShell, Azure Functions, and Logic Apps.
- Create pipelines for ingestion, lineage, glossary updates, and tagging.
- Integrate with Power BI, Azure Monitor, Synapse Link, Collibra, BigID, etc.
8. Monitoring, Auditing & Compliance
- Set up dashboards for audit logs, compliance reporting, and metadata coverage.
- Oversee data lifecycle management across its phases.
- Support internal and external audit readiness with proper documentation.

Requirements
- 7+ years of experience in data governance and data management.
- Proficient in Microsoft Purview and Informatica data governance tools.
- Strong in metadata management, lineage mapping, classification, and security.
- Experience with ADF, REST APIs, Talend, dbt, and automation via Azure tools.
- Knowledge of GDPR, CCPA, HIPAA, SOX, and related compliance needs.
- Skilled in bridging technical governance with business and compliance goals.

Benefits
1. Culture:
- Open Door Policy: Encourages open communication and accessibility to management.
- Open Office Floor Plan: Fosters a collaborative and interactive work environment.
- Flexible Working Hours: Allows employees to have flexibility in their work schedules.
- Employee Referral Bonus: Rewards employees for referring qualified candidates.
- Appraisal Process Twice a Year: Provides regular performance evaluations and feedback.
2. Inclusivity and Diversity:
- Hiring practices that promote diversity: Ensures a diverse and inclusive workforce.
- Mandatory POSH training: Promotes a safe and respectful work environment.
3. Health Insurance and Wellness Benefits:
- GMC and Term Insurance: Offers medical coverage and financial protection.
- Health Insurance: Provides coverage for medical expenses.
- Disability Insurance: Offers financial support in case of disability.
4. Child Care & Parental Leave Benefits:
- Company-sponsored family events: Creates opportunities for employees and their families to bond.
- Generous Parental Leave: Allows parents to take time off after the birth or adoption of a child.
- Family Medical Leave: Offers leave for employees to take care of family members' medical needs.
5. Perks and Time-Off Benefits:
- Company-sponsored outings: Organizes recreational activities for employees.
- Gratuity: Provides a monetary benefit as a token of appreciation.
- Provident Fund: Helps employees save for retirement.
- Generous PTO: Offers more than the industry standard for paid time off.
- Paid sick days: Allows employees to take paid time off when they are unwell.
- Paid holidays: Gives employees paid time off for designated holidays.
- Bereavement Leave: Provides time off for employees to grieve the loss of a loved one.
6. Professional Development Benefits:
- L&D with FLEX - Enterprise Learning Repository: Provides access to a learning repository for professional development.
- Mentorship Program: Offers guidance and support from experienced professionals.
- Job Training: Provides training to enhance job-related skills.
- Professional Certification Reimbursements: Assists employees in obtaining professional certifications.
Posted 1 month ago
4.0 - 5.0 years
0 Lacs
Bengaluru
On-site
Job Title: ETL Tester
Experience: 4-5 years

We are looking for a highly skilled and detail-oriented ETL Tester with 4-5 years of hands-on experience in validating data pipelines, ETL processes, and data warehousing systems. The ideal candidate will have a strong understanding of data extraction, transformation, and loading processes, and will be responsible for ensuring data accuracy, completeness, and quality across various systems.

Key responsibilities:
- Review and understand ETL requirements and source-to-target mappings.
- Develop and execute comprehensive test cases, test plans, and test scripts for ETL processes.
- Validate data accuracy, transformations, and data flow between source and target systems.
- Perform data validation, data reconciliation, and back-end/database testing using SQL.
- Identify, document, and track defects using tools like JIRA or HP ALM.
- Work closely with developers, business analysts, and data engineers to resolve issues.
- Automate testing processes where applicable using scripting or ETL testing tools.

Required Skills:
- 4-5 years of hands-on experience in ETL testing.
- Strong SQL skills for writing complex queries and performing data validation.
- Experience with ETL tools (e.g., Informatica, SSIS, Talend).
- Familiarity with Data Warehousing concepts and Data Migration projects.
- Proficiency in defect tracking and test management tools (e.g., JIRA, ALM, TestRail).
- Knowledge of automation frameworks or scripting for ETL test automation is a plus.
- Good understanding of Agile/Scrum methodology.

Preferred Qualifications:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- Experience in cloud-based data platforms (AWS, Azure, GCP) is a plus.
- Exposure to reporting and BI tools (Tableau, Power BI) is an advantage.

Job Type: Full-time
Schedule: Day shift / Morning shift
Work Location: In person
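The data-reconciliation duty described above (comparing source and target via SQL) can be sketched in a few lines. This is a minimal illustration using an in-memory SQLite database; the table and column names are made up for the example:

```python
import sqlite3

# Build toy source and target tables; names and data are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (id INTEGER, amount REAL);
    CREATE TABLE tgt_orders (id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 25.5), (3, 7.25);
    INSERT INTO tgt_orders VALUES (1, 10.0), (2, 25.5), (3, 7.25);
""")

def reconcile(conn, src, tgt):
    """Compare row counts and a control total between source and target."""
    checks = {}
    for name, sql in {
        "row_count": "SELECT COUNT(*) FROM {t}",
        "amount_sum": "SELECT ROUND(SUM(amount), 2) FROM {t}",
    }.items():
        s = conn.execute(sql.format(t=src)).fetchone()[0]
        t = conn.execute(sql.format(t=tgt)).fetchone()[0]
        checks[name] = (s, t, s == t)
    return checks

for name, (s, t, ok) in reconcile(conn, "src_orders", "tgt_orders").items():
    print(f"{name}: source={s} target={t} {'PASS' if ok else 'FAIL'}")
```

Real ETL test suites extend the same idea with column-level checksums, transformation-rule checks against the source-to-target mapping, and defect logging in a tool such as JIRA.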
Posted 1 month ago
1.0 - 9.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Your potential, unleashed.
India's impact on the global economy has increased at an exponential rate, and Deloitte presents an opportunity to unleash and realize your potential amongst cutting-edge leaders and organizations shaping the future of the region, and indeed, the world beyond. At Deloitte, you can bring your whole self to work, every day. Combine that with our drive to propel with purpose and you have the perfect playground to collaborate, innovate, grow, and make an impact that matters.

The Team
Deloitte's Technology & Transformation practice can help you uncover and unlock the value buried deep inside vast amounts of data. Our global network provides strategic guidance and implementation services to help companies manage data from disparate sources and convert it into accurate, actionable information that can support fact-driven decision-making and generate an insight-driven advantage. Our practice addresses the continuum of opportunities in business intelligence & visualization, data management, performance management, and next-generation analytics and technologies, including big data, cloud, cognitive, and machine learning.

Your work profile:
As an Analyst/Consultant/Senior Consultant in our T&T Team, you'll build and nurture positive working relationships with teams and clients with the intention to exceed client expectations:
- Develop and execute automated test cases for ETL processes.
- Validate data transformation, extraction, and loading accuracy.
- Collaborate with data engineers and QA teams to understand ETL workflows.
- Identify and document defects and inconsistencies.
- Maintain test documentation and support manual testing efforts.
- Design and implement automated ETL test scripts and frameworks.
- Validate end-to-end data flows and transformation logic.
- Collaborate with data architects, developers, and QA teams.
- Integrate ETL testing into CI/CD pipelines where applicable.
- Analyze test results and troubleshoot data issues.
- Lead the architecture and development of advanced ETL automation frameworks.
- Drive best practices in ETL testing and data quality assurance.
- Mentor and guide junior consultants and analysts.
- Collaborate with stakeholders to align testing strategies with business goals.
- Integrate ETL testing within DevOps and CI/CD pipelines.

Desired Qualifications
- 1 to 9 years' experience in ETL testing and automation.
- Knowledge of ETL tools such as Informatica, Talend, or DataStage.
- Experience with SQL and database querying.
- Basic scripting or programming skills (Python, Shell, etc.).
- Good analytical and communication skills.
- Strong SQL skills and experience with ETL tools like Informatica, Talend, or DataStage.
- Proficiency in scripting languages for automation (Python, Shell, etc.).
- Knowledge of data warehousing concepts and best practices.
- Strong problem-solving and communication skills.
- Expert knowledge of ETL tools and strong SQL proficiency.
- Experience with automation scripting and data validation techniques.
- Strong leadership, communication, and stakeholder management skills.
- Familiarity with big data technologies and cloud platforms is a plus.

Location and way of working:
- Base location: Bangalore.
- This profile involves occasional travelling to client locations.
- Hybrid is our default way of working. Each domain has customized the hybrid approach to their unique needs.

How you'll grow
Connect for impact: Our exceptional team of professionals across the globe are solving some of the world's most complex business problems, as well as directly supporting our communities, the planet, and each other. Know more in our Global Impact Report and our India Impact Report.
Empower to lead: You can be a leader irrespective of your career level. Our colleagues are characterized by their ability to inspire, support, and provide opportunities for people to deliver their best and grow both as professionals and human beings. Know more about Deloitte and our One Young World partnership.
Inclusion for all: At Deloitte, people are valued and respected for who they are and are trusted to add value to their clients, teams, and communities in a way that reflects their own unique capabilities. Know more about everyday steps that you can take to be more inclusive. At Deloitte, we believe in the unique skills, attitude, and potential each and every one of us brings to the table to make an impact that matters.
Drive your career: At Deloitte, you are encouraged to take ownership of your career. We recognize there is no one-size-fits-all career path, and global, cross-business mobility and up-/re-skilling are all within the range of possibilities to shape a unique and fulfilling career. Know more about Life at Deloitte.

Everyone's welcome… entrust your happiness to us
Our workspaces and initiatives are geared towards your 360-degree happiness. This includes specific needs you may have in terms of accessibility, flexibility, safety and security, and caregiving. Here's a glimpse of things that are in store for you.

Interview tips
We want job seekers exploring opportunities at Deloitte to feel prepared, confident, and comfortable. To help you with your interview, we suggest that you do your research and know some background about the organization and the business area you're applying to. Check out recruiting tips from Deloitte professionals.
Posted 1 month ago
5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Responsibilities
The Sr. Integration Developer (Senior Software Engineer) will work in the Professional Services Team and play a significant role in designing and implementing complex integration solutions using the Adeptia platform. This role requires hands-on expertise in developing scalable and efficient solutions to meet customer requirements. The engineer will act as a key contributor to the team's deliverables while mentoring junior engineers. They will ensure high-quality deliverables by collaborating with cross-functional teams and adhering to industry standards and best practices.

Responsibilities include, but are not limited to:
- Collaborate with customers' Business and IT teams to understand integration requirements in the B2B/Cloud/API/Data/ETL/EAI Integration space and implement solutions using the Adeptia platform.
- Design, develop, and configure complex integration solutions, ensuring scalability, performance, and maintainability.
- Take ownership of assigned modules and lead the implementation lifecycle from requirement gathering to production deployment.
- Troubleshoot issues during implementation and deployment, ensuring smooth system performance.
- Guide team members in addressing complex integration challenges and promote best practices and performance practices.
- Collaborate with offshore/onshore and internal teams to ensure timely execution and coordination of project deliverables.
- Write efficient, well-documented, and maintainable code, adhering to established coding standards.
- Review code and designs of team members, providing constructive feedback to improve quality.
- Participate in Agile processes, including Sprint Planning, Daily Standups, and Retrospectives, ensuring effective task management and delivery.
- Stay updated with emerging technologies to continuously enhance technical expertise and team skills.

Essential Skills:
Technical
- 5-7 years of hands-on experience in designing and implementing integration solutions across B2B, ETL, EAI, Cloud, API & Data Integration environments using leading platforms such as Adeptia, Talend, MuleSoft, or equivalent enterprise-grade tools.
- Proficiency in designing and implementing integration solutions, including integration processes, data pipelines, and data mappings, to facilitate the movement of data between applications and platforms.
- Proficiency in applying data transformation and data cleansing as needed to ensure data quality and consistency across different data sources and destinations.
- Good experience in performing thorough testing and validation of data integration processes to ensure accuracy, reliability, and data integrity.
- Proficiency in working with SOA, RESTful APIs, and SOAP Web Services with all security policies.
- Good understanding and implementation experience with various security concepts, best practices, and security standards and protocols such as OAuth, SSL/TLS, SSO, SAML, and IDP (Identity Provider).
- Strong understanding of XML, XSD, XSLT, and JSON.
- Good understanding of RDBMS/NoSQL technologies (MSSQL, Oracle, MySQL).
- Proficiency with transport protocols (HTTPS, SFTP, JDBC) and experience with messaging systems such as Kafka, ASB (Azure Service Bus), or RabbitMQ.
- Hands-on experience in Core Java and exposure to commonly used Java frameworks.

Non-Technical
- 5-7 years of experience working in a Services Delivery Organization, reporting directly to the client.
- Strong communication skills.
- Ability to develop solutions and POCs based on customer and project needs.
- Excellent documentation and process diagramming skills.
- Excellent interpersonal skills for building and maintaining positive relationships.
- Exceptional collaboration skills with the ability to work effectively with customers and internal teams.
- Experienced in gathering business requirements, translating them into actionable technical plans, and aligning teams for successful execution.
- Strong analytical, troubleshooting, and problem-solving skills.
- Proven ability to lead and mentor junior team members.
- Self-motivated with a strong commitment to delivering high-quality results under tight deadlines.

Desired Skills:
Technical
- Familiarity with JavaScript frameworks like ReactJS, AngularJS, or NodeJS.
- Exposure to integration standards (EDI, EDIFACT, IDOC).
- Experience with modern web UI tools and frameworks.
- Exposure to DevOps tools such as Git, Jenkins, and CI/CD pipelines.

Non-Technical
- Onshore experience working directly with customers.
- Strong time management skills and the ability to handle multiple priorities.
- Detail-oriented and enthusiastic about learning new tools and technologies.
- Committed to delivering high-quality results.
- Flexible, responsible, and focused on quality work.
- Ability to prioritize tasks, work under pressure, and collaborate with cross-functional teams.

About Adeptia
Adeptia believes business users should be able to access information anywhere, anytime by creating data connections themselves, and its mission is to enable that self-service capability. Adeptia is a unique social network for digital business connectivity for "citizen integrators" to respond quickly to business opportunities and get to revenue faster. Adeptia helps Information Technology (IT) staff to manage this capability while retaining control and security. Adeptia's unified hybrid offering — with simple data connectivity in the cloud, and optional on-premises enterprise process-based integration — provides a competitive advantage to 450+ customers, ranging from Fortune 500 companies to small businesses. Headquartered in Chicago, Illinois, USA, and with an office in Noida, India, Adeptia provides world-class support to its customers around the clock.

For more, visit www.adeptia.com

Our Locations:
India R&D Centre: Office 56, Sixth floor, Tower-B, The Corenthum, Sector-62, Noida, U.P.
US Headquarters: 332 S Michigan Ave, Unit LL-A105, Chicago, IL 60604, USA
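The data-mapping and cleansing skills this posting asks for boil down to moving fields from a source schema to a target schema with light normalization on the way. A minimal sketch follows; the field names ("cust_nm", "customerName", etc.) are invented for illustration and are not from any Adeptia project:

```python
import json

# Hypothetical source-to-target field mapping of the kind an integration
# platform would configure declaratively rather than in hand-written code.
MAPPING = {
    "cust_nm": "customerName",
    "cust_email": "email",
    "ord_amt": "orderAmount",
}

def transform(record: dict) -> dict:
    """Rename fields per MAPPING and apply basic string cleansing."""
    out = {}
    for src_field, tgt_field in MAPPING.items():
        value = record.get(src_field)
        if isinstance(value, str):
            value = value.strip()  # trim stray whitespace from source data
        out[tgt_field] = value
    return out

source = json.loads(
    '{"cust_nm": "  Acme Corp ", "cust_email": "ops@acme.test", "ord_amt": 99.5}'
)
print(transform(source))
```

The same pattern scales to XSLT for XML payloads or to the platform's visual mapper; the essential idea, a declarative mapping plus per-field cleansing rules, is unchanged.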
Posted 1 month ago
7.0 years
0 Lacs
Gandhinagar, Gujarat, India
On-site
Key Responsibilities
• Lead and mentor a high-performing data pod composed of data engineers, data analysts, and BI developers.
• Design, implement, and maintain ETL pipelines and data workflows to support real-time and batch processing.
• Architect and optimize data warehouses for scale, performance, and security.
• Perform advanced data analysis and modeling to extract insights and support business decisions.
• Lead data science initiatives including predictive modeling, NLP, and statistical analysis.
• Manage and tune relational and non-relational databases (SQL, NoSQL) for availability and performance.
• Develop Power BI dashboards and reports for stakeholders across departments.
• Ensure data quality, integrity, and compliance with data governance and security standards.
• Work with cross-functional teams (product, marketing, ops) to turn data into strategy.

Qualifications
Required:
• PhD in Data Science, Computer Science, Engineering, Mathematics, or a related field.
• 7+ years of hands-on experience across data engineering, data science, analysis, and database administration.
• Strong experience with ETL tools (e.g., Airflow, Talend, SSIS) and data warehouses (e.g., Snowflake, Redshift, BigQuery).
• Proficient in SQL, Python, and Power BI.
• Familiarity with modern cloud data platforms (AWS/GCP/Azure).
• Strong understanding of data modeling, data governance, and MLOps practices.
• Exceptional ability to translate business needs into scalable data solutions.
Posted 1 month ago
4.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Company: Fives India Engineering & Projects Pvt. Ltd.
Job Title: Data Analyst (BI Developer)
Job Location: Chennai, Tamil Nadu, India
Job Department: IT
Educational Qualification: BE/B.Tech/MCA from a reputed institute in Computer Science or a related field.
Work Experience: 2-4 years

Job Description
Fives is a global industrial engineering group based in Paris, France, that designs and supplies machines, process equipment, and production lines for the world's largest industrial sectors, including aerospace, automotive, steel, aluminium, glass, cement, logistics, and energy. Headquartered in Paris, Fives is located in about 25 countries with more than 9,000 employees.

Fives is seeking a Data Analyst for their office located in Chennai, India. The position is an integral part of the Group IT development team working on custom software solutions for the Group IT requirements. We are looking for an analyst specialized in BI development.

Required Skills
Applicants should have skills/experience in the following areas:
- 2-4 years of experience in Power BI development
- Good understanding of data visualization concepts
- Proficiency in writing DAX expressions and Power Query
- Knowledge of SQL and database-related technologies
- Source control such as Git
- Proficiency in building REST APIs to interact with data sources
- Familiarity with ETL/ELT concepts and tools such as Talend is a plus
- Good knowledge of programming, algorithms, and data structures
- Ability to use Agile collaboration tools such as Jira
- Good communication skills, both verbal and written
- Willingness to learn new technologies and tools

Position Type: Full-Time/Regular
Posted 1 month ago
6.0 - 12.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Job Requirements
Role/Job Title: Senior Data Analyst
Function/Department: Data & Analytics

Job Purpose
The Senior Data Analyst (Data Governance) will work within the Data & Analytics Office to implement the data governance framework, with a focus on improving data quality, standards, metrics, and processes; aligning data management practices with regulatory requirements; and understanding lineage: how data is produced, managed, and consumed within the Bank's business processes and systems.

Roles & Responsibilities
- Demonstrate a strong understanding of data governance, data quality, data lineage, and metadata management concepts.
- Participate in the design and optimization of the data quality governance framework, including processes, standards, rules, etc.
- Design and implement data quality rules and monitoring mechanisms.
- Analyze data quality issues and collaborate with business stakeholders on issue resolution; build a recovery model across the enterprise.
- Knowledge of DG technologies for data quality and metadata management (OvalEdge, Talend, Collibra, etc.).
- Support the development of centralized metadata repositories (business glossary, technical metadata, etc.), capture business/data quality rules, and design DQ reports and dashboards.
- Improve data literacy among stakeholders.
- Minimum 6 to 12 years of experience in data governance; Banking domain preferable.

Education Qualification
Graduation: Bachelor of Science (B.Sc) / Bachelor of Technology (B.Tech) / Bachelor of Computer Applications (BCA)
Post-graduation: Master of Science (M.Sc) / Master of Technology (M.Tech) / Master of Computer Applications (MCA)
Experience: 5 to 10 years of relevant experience.
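The "design and implement data quality rules and monitoring" responsibility above can be sketched as a tiny rule engine that computes per-rule pass rates, the kind of KPI a DQ dashboard would chart. The rule names, fields, and sample records below are invented for illustration and do not come from any governance tool's API:

```python
# Illustrative data-quality rules over banking-style records; each rule is a
# (name, predicate) pair. These names and checks are assumptions, not a
# vendor catalog.
RULES = [
    ("account_id not null", lambda r: r.get("account_id") is not None),
    ("balance non-negative",
     lambda r: isinstance(r.get("balance"), (int, float)) and r["balance"] >= 0),
]

def dq_metrics(records):
    """Return the pass rate (0.0-1.0, rounded) for each rule."""
    total = len(records)
    return {
        name: round(sum(1 for r in records if check(r)) / total, 2)
        for name, check in RULES
    }

sample = [
    {"account_id": "A1", "balance": 120.0},
    {"account_id": None, "balance": 35.0},   # fails the not-null rule
    {"account_id": "A3", "balance": -5.0},   # fails the non-negative rule
]
print(dq_metrics(sample))  # → {'account_id not null': 0.67, 'balance non-negative': 0.67}
```

In practice, failing records would also be routed into a remediation workflow and the pass rates trended over time on a dashboard.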
Posted 1 month ago
8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Position Title: Product Manager – Healthcare Data & Analytics

About The Role
As Product Manager, you will lead the strategy, execution, and commercialization of innovative data and analytics products for the U.S. healthcare market. This is a highly collaborative role where you'll work cross-functionally with Engineering, Sales, Design, and Delivery teams to build scalable, interoperable solutions that address core challenges across payers and providers. You'll be responsible for partnering with the solution offering manager to deliver on the product vision and roadmap, conducting customer discovery, tracking success metrics, and ensuring timely delivery of high-impact features. This role carries revenue responsibilities and is key to EXL Health's broader growth agenda.

Core Responsibilities
Product Strategy & Leadership
- Develop and own the quarterly roadmap for the healthcare data and analytics solution.
- Manage the product backlog and ensure alignment with evolving client needs, compliance mandates (e.g., CMS, FHIR), and company objectives.
- Translate customer pain points and regulatory changes into innovative data-driven products and services.
- Champion a customer-first approach while ensuring technical feasibility and commercial viability.
- Stay ahead of technology and market trends, especially in AI, value-based care, and Care Management.
- Collaborate closely with Engineering and Design teams to define and prioritize product requirements.

Client Engagement & Sales Support
- Meet directly with clients to shape strategy, gather feedback, and build trusted relationships.
- Serve as the bridge between client expectations and solution capabilities, ensuring alignment and delivery excellence.

Experience Qualifications
- Minimum 5-8 years of experience in analytics, data platforms, or product management, preferably within the U.S. healthcare ecosystem.
- At least 3 years in a leadership or client-facing product role, including experience managing end-to-end product development and revenue accountability.
- Proven success in bringing data or analytics products to market, from ideation through launch and iteration.

Healthcare Domain Expertise
- Deep familiarity with U.S. payer or provider environments, including claims, payments, risk adjustment, population health, or care management.
- Working knowledge of regulatory and interoperability standards (e.g., CMS 0057, FHIR, TEFCA).
- Hands-on understanding of how data management, analytics, and AI/ML drive value in clinical or operational workflows.

Technical Skills
Practical experience with or exposure to:
- Cloud Platforms: Snowflake, AWS, Azure, GCP
- BI & Visualization Tools: Tableau, Power BI, Qlik
- ETL/Data Integration: Informatica, Talend, SSIS, Erwin
- Data Science/AI/ML: Experience collaborating with data science teams on AI initiatives
- Agile/Tools: Jira, Confluence, Asana, Agile/Scrum methodologies

Personal Attributes
- Strategic thinker who can dive deep into execution.
- Exceptional written and verbal communication, with the ability to translate technical concepts for non-technical audiences.
- Strong organizational, problem-solving, and analytical skills.
- Passion for innovation, continuous improvement, and cross-functional collaboration.
- A team-first leader with high emotional intelligence and the ability to mentor others.

Education
Bachelor's or Master's degree in Computer Science, Engineering, Data Science, Statistics, Business, or a related field from a top-tier institution.
Posted 1 month ago
13.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Team Lead - BI

About Medline:
Medline is America's largest privately held national manufacturer and distributor of health care supplies and services. Today, Medline manufactures and distributes more than 550,000 medical products, encompassing medical-surgical items and one of the largest textile lines in the industry. With 17 manufacturing facilities worldwide, over 25 joint-venture manufacturing plants, 45 distribution centers in North America, and 50 throughout the world, Medline posted $17 billion in revenue last year and is ranked #27 in Forbes' 2019 list of America's private companies. Medline Industries India Pvt. Ltd. provides offshore business support services to Medline Industries Inc. and its global associate companies in Information Services, Finance, and Business Process Management. Medline Industries India Private Limited was set up in 2010 in Pune, India, and today we are a proud team of 700-plus associates supporting Medline's healthcare vision across the USA, Europe, and other international regions.

Why join Medline:
- Direct, full-time employment in a large, stable, rapidly growing, and yet profitable company
- Privately owned company with no public debt
- No ill effects of the recent downturn/recession
- First-rate compensation and benefits package
- Genuine individual growth potential in this new establishment
- Open-door and highly ethical work culture, with due accountability

Location: Pune

Required skills:
- 13+ years of overall BI development experience (Azure/Fabric/Power BI)
- 2+ years of lead experience with excellent team-handling skills
- Extensive hands-on experience with Azure Data Factory, Azure Synapse Analytics, Azure Databricks, and Power BI
- Strong understanding of the SDLC and development best practices
- Strong in data analytics tools, following industry best practices for development
- Good understanding of data modeling, performance fine-tuning, and data warehousing concepts
- Capability to bring in process improvements and performance improvements
- Understanding of the Jira tool and Agile scrums
- Ability to listen to team members and communicate instructions effectively
- Ability to lead, direct, influence, and control team members
- Ability to handle multiple jobs at the same time
- Highly result-focused
- Acumen and willingness to learn and support new programming languages or reporting tools
- Excellent oral and written communication skills, with the ability to independently engage all stakeholders onsite and offshore
- Ability to always pay attention to details and encourage team members to do the same
- Ability to take proactive steps in managing problems
- Willingness to troubleshoot and support critical issues during non-working hours
- Innovative in providing solutions

Desired Skills (Good To Have):
- HANA modeling (native HANA skills), TDV, SQL scripting, BO, Tableau
- Knowledge of SAP ERP
- Experience with ETL tools, preferably Talend

Responsibilities:
- Provide analytical and technical leadership for the BI and reporting environment (Azure/Fabric/Power BI)
- Work planning and scrum execution
- Be the go-to person for all technical design queries; strong leadership qualities with delivery focus; stakeholder management for prioritization
- Monitor all team members and provide necessary advice and guidance
- Monitor team performance; strive to work on cutting-edge technology
- Review completed tasks
- Get involved in project intakes, discussions with project stakeholders, project meetings, and CAB reviews to ascertain compliance with standards
- Coach all team members and motivate them to produce the desired results
- Collaborate with other teams to identify and troubleshoot technical issues such as data synchronization and application issues
- Oversee hiring of new talent and team development; provide training on data governance expectations, standards, and processes
- Lead team meetings to evaluate progress, identify existing as well as potential issues, and coordinate the team's effort to develop solutions to ad hoc projects
- Innovate and adapt quickly to new technologies in the roadmap
- Spearhead and drive POCs for roadmap decisions
- Performance tuning and optimization (PTO) of queries, using native monitoring and troubleshooting tools
- Lead the development and continuous improvement of BI solutions
- Lead various BI team initiatives for increasing team effectiveness
- Discover new BI tools in the market and lead implementation of the defined BI strategies
- Explore opportunities to refine existing BI processes
- Work with extended teams and refine the handshake between teams

Education: IT graduate (BE, BTech, MCA) preferred. Experience of working in a captive unit is a plus.
Posted 1 month ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: Talend ETL
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team in implementing effective solutions. You will also engage in strategic planning sessions to align project goals with organizational objectives, ensuring that all stakeholders are informed and involved in the development process. Your role will be pivotal in driving the success of application projects and fostering a collaborative environment among team members and other departments.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate training and development opportunities for team members to enhance their skills.
- Monitor project progress and implement adjustments as necessary to meet deadlines.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Talend ETL.
- Good-To-Have Skills: Experience with data integration tools and methodologies.
- Strong understanding of data warehousing concepts and practices.
- Experience in performance tuning and optimization of ETL processes.
- Familiarity with cloud-based data solutions and architectures.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Talend ETL.
- This position is based at our Pune office.
- 15 years of full-time education is required.
Posted 1 month ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Role Description

Role Proficiency: Act creatively to develop applications and select appropriate technical options, optimizing application development, maintenance, and performance by employing design patterns and reusing proven solutions; account for others' developmental activities.

Outcomes:
- Interpret the application/feature/component design and develop it in accordance with specifications.
- Code, debug, test, document, and communicate product/component/feature development stages.
- Validate results with user representatives; integrate and commission the overall solution.
- Select appropriate technical options for development, such as reusing, improving, or reconfiguring existing components, or creating own solutions.
- Optimize efficiency, cost, and quality.
- Influence and improve customer satisfaction.
- Set FAST goals for self/team; provide feedback on FAST goals of team members.

Measures of Outcomes:
- Adherence to engineering processes and standards (coding standards)
- Adherence to project schedule/timelines
- Number of technical issues uncovered during project execution
- Number of defects in the code
- Number of defects post delivery
- Number of non-compliance issues
- On-time completion of mandatory compliance trainings

Outputs Expected:

Code: Code as per design; follow coding standards, templates, and checklists; review code for team and peers.

Documentation: Create/review templates, checklists, guidelines, and standards for design/process/development; create/review deliverable documents, design documentation, requirements, and test cases/results.

Configure: Define and govern the configuration management plan; ensure compliance from the team.

Test: Review and create unit test cases, scenarios, and execution; review the test plan created by the testing team; provide clarifications to the testing team.

Domain Relevance: Advise software developers on design and development of features and components with a deep understanding of the business problem being addressed for the client; learn more about the customer domain, identifying opportunities to provide valuable additions to customers; complete relevant domain certifications.

Manage Project: Manage delivery of modules and/or manage user stories.

Manage Defects: Perform defect RCA and mitigation; identify defect trends and take proactive measures to improve quality.

Estimate: Create and provide input for effort estimation for projects.

Manage Knowledge: Consume and contribute to project-related documents, SharePoint libraries, and client universities; review the reusable documents created by the team.

Release: Execute and monitor the release process.

Design: Contribute to creation of design (HLD, LLD, SAD)/architecture for applications/features/business components/data models.

Interface with Customer: Clarify requirements and provide guidance to the development team; present design options to customers; conduct product demos.

Manage Team: Set FAST goals and provide feedback; understand aspirations of team members and provide guidance, opportunities, etc.; ensure the team is engaged in the project.

Certifications: Take relevant domain/technology certifications.

Skill Examples:
- Explain and communicate the design/development to the customer
- Perform and evaluate test results against product specifications
- Break down complex problems into logical components
- Develop user interfaces and business software components
- Use data models
- Estimate time and effort required for developing/debugging features/components
- Perform and evaluate tests in the customer or target environment
- Make quick decisions on technical/project-related challenges
- Manage a team, mentor, and handle people-related issues in the team
- Maintain high motivation levels and positive dynamics in the team
- Interface with other teams, designers, and other parallel practices
- Set goals for self and team; provide feedback to team members
- Create and articulate impactful technical presentations
- Follow a high level of business etiquette in emails and other business communication
- Drive conference calls with customers, addressing customer questions
- Proactively ask for and offer help
- Ability to work under pressure, determine dependencies and risks, facilitate planning, and handle multiple tasks
- Build confidence with customers by meeting deliverables on time with quality
- Estimate time, effort, and resources required for developing/debugging features/components
- Make appropriate utilization of software/hardware
- Strong analytical and problem-solving abilities

Knowledge Examples:
- Appropriate software programs/modules
- Functional and technical design
- Programming languages (proficient in multiple skill clusters)
- DBMS
- Operating systems and software platforms
- Software Development Life Cycle
- Agile (Scrum or Kanban methods)
- Integrated development environments (IDE)
- Rapid application development (RAD)
- Modelling technology and languages
- Interface definition languages (IDL)
- Knowledge of the customer domain and a deep understanding of the sub-domain where the problem is solved

Additional Comments

Job Title: SDET – Data Integration & Transformation
Location: Pune, India
Job Type: Full-time
Experience Level: [Mid-Level / Senior]
Department: Quality Engineering / Data Engineering
Work Time: up to 8 PM IST

Mandatory: Java, Selenium, experience testing and automating data engineering pipelines, data pipeline testing, quality, and anomaly detection.

Job Summary: We are seeking a highly skilled and detail-oriented SDET (Software Development Engineer in Test) with expertise in test automation for data integration and transformation processes. The ideal candidate will work closely with data engineers and developers to build robust, automated testing frameworks ensuring data quality, consistency, and integrity across complex ETL and integration pipelines.
Key Responsibilities:
- Design, develop, and maintain automated test frameworks for validating data integration and transformation workflows.
- Collaborate with data engineers to understand data flow, business rules, and transformation logic.
- Create and execute test cases for ETL processes, data pipelines, and APIs.
- Validate data quality, schema, completeness, and correctness across multiple data sources and targets.
- Automate regression, integration, and end-to-end testing for data-driven applications.
- Implement tests for data accuracy, consistency, duplication, and loss.
- Work closely with DevOps teams to integrate test automation into CI/CD pipelines.
- Participate in requirement analysis, risk assessment, and test planning activities.
- Document defects clearly and collaborate on root cause analysis and resolutions.

Required Skills & Experience:
- Strong experience with test automation in data integration and transformation environments.
- Solid understanding of ETL/ELT pipelines, data validation, and transformation logic.
- Proficiency in writing SQL queries for test validation and data profiling.
- Hands-on experience with Python, Java, or similar scripting languages for test automation.
- Familiarity with data integration tools (e.g., Apache NiFi, Talend, Informatica) is a plus.
- Understanding of data formats like JSON, XML, Avro, and Parquet.
- Experience with test frameworks such as PyTest, JUnit, TestNG, or similar.
- Knowledge of CI/CD tools like Jenkins, GitLab CI, or CircleCI.
- Familiarity with big data platforms and distributed systems (e.g., Kafka, Spark, Hadoop) is a plus.

Preferred Qualifications:
- Exposure to cloud data ecosystems (e.g., AWS Glue, Redshift, S3, EMR, GCP BigQuery).
- Experience with data cataloging and data lineage tools.
- Understanding of data governance and security compliance.
- Strong communication and collaboration skills with both technical and non-technical stakeholders.

Mandatory Soft Skills:
- Good written and verbal communication
- Strong sense of ownership and ability to drive tasks independently
- Proactive about raising blockers and suggesting solutions
- Able to collaborate effectively across backend, frontend, and DevOps teams
- Comfortable working in a fast-paced, asynchronous environment

Skills: Java, Selenium, REST API, ETL Testing
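The data-validation responsibilities this posting describes (completeness, duplication, and transformation correctness between source and target) can be sketched as a small automated check. This is a minimal illustration only: sqlite3 stands in for the real source and target databases, and all table and column names (src_orders, tgt_orders, fx_rate) are hypothetical.

```python
import sqlite3

def validate_pipeline(conn):
    """Minimal ETL checks: completeness, duplication, and transformation
    correctness between a source (staging) table and a target table.
    Table and column names are illustrative."""
    cur = conn.cursor()
    checks = {}

    # Completeness: every source row should land in the target.
    src = cur.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
    tgt = cur.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]
    checks["completeness"] = (src == tgt)

    # Duplication: the business key must be unique in the target.
    dupes = cur.execute(
        "SELECT order_id FROM tgt_orders GROUP BY order_id HAVING COUNT(*) > 1"
    ).fetchall()
    checks["no_duplicates"] = (len(dupes) == 0)

    # Transformation: target USD amount should equal source amount * fx_rate.
    bad = cur.execute(
        """SELECT s.order_id
           FROM src_orders s JOIN tgt_orders t ON s.order_id = t.order_id
           WHERE ABS(s.amount * s.fx_rate - t.amount_usd) > 0.01"""
    ).fetchall()
    checks["transform_ok"] = (len(bad) == 0)
    return checks

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, amount REAL, fx_rate REAL);
    CREATE TABLE tgt_orders (order_id INTEGER, amount_usd REAL);
    INSERT INTO src_orders VALUES (1, 100.0, 1.1), (2, 50.0, 0.9);
    INSERT INTO tgt_orders VALUES (1, 110.0), (2, 45.0);
""")
print(validate_pipeline(conn))  # all three checks pass on this sample data
```

In a real framework each check would be a separate PyTest case so failures are reported individually and wired into the CI/CD pipeline.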
Posted 1 month ago
10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
The future is our choice

At Atos, as the global leader in secure and decarbonized digital, our purpose is to help design the future of the information space. Together we bring the diversity of our people's skills and backgrounds to make the right choices with our clients, for our company and for our own futures.

Roles & Responsibilities
- Lead the design and implementation of ETL solutions using tools like IICS, Talend, and other traditional ETL platforms.
- Architect and optimize data pipelines for performance, scalability, and reliability.
- Provide technical leadership and mentorship to a team of data engineers.
- Collaborate with stakeholders to define project requirements and ensure alignment with business goals.

Requirements
- 10+ years of experience in ETL development with Informatica PowerCenter or IICS
- Hands-on expertise in building and optimizing complex ETL pipelines
- Experience integrating with data warehouses, data lakes, or cloud platforms
- Proficiency in Informatica PowerCenter and IICS for ETL/ELT development
- Strong SQL skills for querying and transformation logic
- Familiarity with Python or Java for scripting and custom integrations
- Must be good with Python and PySpark for data pipeline building
- Must have experience working with streaming data sources and Kafka

Our Offering
- Global cutting-edge IT projects that shape the future of digital and have a positive impact on the environment
- Wellbeing programs and work-life balance, with integration and passion-sharing events
- Attractive salary and company initiative benefits
- Courses and conferences
- Hybrid work culture

Here at Atos, diversity and inclusion are embedded in our DNA. Read more about our commitment to a fair work environment for all. Atos is a recognized leader in its industry across Environment, Social and Governance (ESG) criteria. Find out more on our CSR commitment. Choose your future. Choose Atos.
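The streaming-data requirement above usually means windowed aggregation over an unbounded feed. As a rough, assumption-laden sketch: a tumbling-window count in plain Python, where an in-memory list of (timestamp, event) pairs stands in for a Kafka consumer and all names are illustrative (in PySpark this would be Structured Streaming with a window() grouping).

```python
from collections import Counter, defaultdict

def tumbling_window_counts(events, window_secs=60):
    """Group (epoch_seconds, key) events into fixed, non-overlapping
    windows and count occurrences of each key per window."""
    windows = defaultdict(Counter)
    for ts, key in events:
        window_start = (ts // window_secs) * window_secs  # align to window boundary
        windows[window_start][key] += 1
    return dict(windows)

# Stand-in for records consumed from a Kafka topic: (timestamp, event-type).
events = [(0, "click"), (10, "click"), (65, "view"), (70, "click"), (130, "view")]
print(tumbling_window_counts(events))
# window 0-59s: 2 clicks; 60-119s: 1 view + 1 click; 120-179s: 1 view
```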
Posted 1 month ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description: Seeking an ETL Test Engineer to design, develop, and execute test cases for data pipelines and ETL processes. Should have strong SQL skills and experience in validating data transformations, data integrity, and ETL workflows.

Key Skills:
- ETL Testing (Informatica/Talend/SSIS)
- Strong SQL (joins, aggregations, data validation)
- Data Warehousing Concepts
- Defect Management (JIRA/ALM)
- Test Automation (nice to have: Selenium/Python)
- Reporting tools knowledge (Power BI/Tableau – optional)

Good to Have:
- Cloud Data Platforms (AWS/GCP/Azure)
- Big Data Testing (Hive/Spark)
- Python/Automation frameworks
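The SQL data-validation skill this role calls for often comes down to a row-level source-to-target reconciliation query. A minimal sketch, using sqlite3 as a stand-in for the real warehouse and hypothetical table names: EXCEPT returns every source row that is missing from, or was altered in, the target.

```python
import sqlite3

# Row-level reconciliation: EXCEPT yields rows present in the source
# table but missing (or changed) in the target table.
RECON_SQL = """
SELECT customer_id, email FROM legacy_customers
EXCEPT
SELECT customer_id, email FROM migrated_customers
"""

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE legacy_customers (customer_id INTEGER, email TEXT);
    CREATE TABLE migrated_customers (customer_id INTEGER, email TEXT);
    INSERT INTO legacy_customers VALUES (1, 'a@x.com'), (2, 'b@x.com');
    INSERT INTO migrated_customers VALUES (1, 'a@x.com'), (2, 'B@X.COM');
""")
mismatched = conn.execute(RECON_SQL).fetchall()
print(mismatched)  # row 2 was altered during the load, so it surfaces here
```

Running the same query with the tables swapped catches rows that appear in the target with no source counterpart.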
Posted 1 month ago
7.0 years
10 - 18 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description

We are looking for a highly skilled Senior Data Engineer / BI Consultant with 7+ years of experience implementing commercial projects across data engineering, business intelligence, ETL, and data warehousing. The ideal candidate will have strong hands-on expertise in SQL, ETL tools, BI platforms, and modern database environments.

Responsibilities
- Lead end-to-end implementation of data projects in enterprise environments.
- Design and develop robust, scalable ETL pipelines using industry-leading ETL tools.
- Analyze business requirements and translate them into technical solutions using data models and visualizations.
- Develop and manage reporting dashboards using BI tools like SAP Analytics Cloud, Power BI, Tableau, or Qlik.
- Work with relational and columnar databases (e.g., Oracle, SAP BW, Teradata, Exasol) for efficient data processing.
- Write optimized SQL for data extraction, transformation, and reporting purposes.
- Collaborate with stakeholders to ensure timely delivery and effective business insights.
- Implement data governance, metadata management, and security protocols in reporting and data warehousing.
- Perform unit testing, peer reviews, and performance optimization of ETL and BI solutions.

Required Experience and Qualifications
- 7+ years of hands-on experience in data engineering, BI, or data science projects.
- Proficiency in SQL, including DML, DDL, DCL, and TCL operations.
- Experience in BI tools such as SAP Analytics Cloud, Tableau, Power BI, Qlik, or MicroStrategy.
- ETL/data integration tools expertise: Informatica, Dataiku, Oracle Data Integrator, Talend, Pentaho DI, or IBM DataStage.
- Strong knowledge of data warehousing platforms: Oracle, Teradata, SAP BW, or Exasol.
- Ability to work independently and manage multiple priorities in a fast-paced environment.
- Solid understanding of data modeling, performance tuning, and data lifecycle management.

Preferred Skills
- Knowledge of cloud platforms (AWS, Azure, GCP) and integration with data/BI solutions.
- Familiarity with big data technologies (Spark, Hive, HDFS) is a plus.
- Understanding of DevOps practices and CI/CD in data project delivery.
- Experience with version control tools like Git.

Skills: ETL, Qlik, Talend, Oracle Data Integrator, BI platforms, Exasol, Hive, DevOps practices, Pentaho DI, IBM DataStage, CI/CD, AWS, performance tuning, Informatica, Tableau, Power BI, data lifecycle management, Git, data modeling, HDFS, Teradata, SQL, ETL tools, Dataiku, Oracle, GCP, Azure, data warehousing, Spark, SAP Analytics Cloud, SAP BW
Posted 1 month ago
2.0 - 5.0 years
2 - 5 Lacs
Bengaluru
Work from Office
Job Description

Tietoevry Create is seeking a skilled Snowflake Developer to join our team in Bengaluru, India. In this role, you will be responsible for designing, implementing, and maintaining data solutions using Snowflake's cloud data platform. You will work closely with cross-functional teams to deliver high-quality, scalable data solutions that drive business value.

- 7+ years of experience in the design and development of data warehouse and data integration projects (SSE/TL level)
- Experience working in an Azure environment
- Developing ETL pipelines in and out of the data warehouse using a combination of Python and Snowflake's SnowSQL; writing SQL queries against Snowflake
- Good understanding of database design concepts: transactional, data mart, data warehouse, etc.
- Expertise in loading from disparate data sets and translating complex functional and technical requirements into detailed designs; will also perform analysis of vast data stores and uncover insights
- Snowflake data engineers will be responsible for architecting and implementing substantial-scale data intelligence solutions around Snowflake Data Warehouse
- A solid experience and understanding of architecting, designing, and operationalizing large-scale data and analytics solutions on Snowflake Cloud Data Warehouse is a must
- Very good articulation skills; flexible and ready to learn new skills

Additional Information

At Tietoevry, we believe in the power of diversity, equity, and inclusion. We encourage applicants of all backgrounds, genders (m/f/d), and walks of life to join our team, as we believe that this fosters an inspiring workplace and fuels innovation. Our commitment to openness, trust, and diversity is at the heart of our mission to create digital futures that benefit businesses, societies, and humanity. Diversity, equity and inclusion (tietoevry.com)
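A common pattern behind "ETL pipelines in and out of the warehouse with Python and SnowSQL" is composing an incremental-load MERGE statement from Python and submitting it to Snowflake. The sketch below only builds the statement (following Snowflake's standard MERGE syntax); connection handling via a Snowflake client is omitted, and all table and column names are hypothetical. Identifiers are assumed to come from trusted configuration, not user input.

```python
def build_merge_sql(target, staging, key_cols, update_cols):
    """Compose a Snowflake-style MERGE (upsert) statement that loads a
    staging table into a target table incrementally."""
    on_clause = " AND ".join(f"t.{c} = s.{c}" for c in key_cols)
    set_clause = ", ".join(f"t.{c} = s.{c}" for c in update_cols)
    all_cols = list(key_cols) + list(update_cols)
    insert_cols = ", ".join(all_cols)
    insert_vals = ", ".join(f"s.{c}" for c in all_cols)
    return (
        f"MERGE INTO {target} t USING {staging} s ON {on_clause} "
        f"WHEN MATCHED THEN UPDATE SET {set_clause} "
        f"WHEN NOT MATCHED THEN INSERT ({insert_cols}) VALUES ({insert_vals})"
    )

# Hypothetical target/staging tables for an orders feed.
sql = build_merge_sql("dw.orders", "stg.orders", ["order_id"], ["status", "amount"])
print(sql)
```

In practice the generated statement would be executed through the Snowflake Python connector inside the pipeline's load step.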
Posted 1 month ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
Remote
Job Description

Are you ready to make an impact at DTCC? Do you want to work on innovative projects, collaborate with a dynamic and supportive team, and receive investment in your professional development? At DTCC, we are at the forefront of innovation in the financial markets. We are committed to helping our employees grow and succeed. We believe that you have the skills and drive to make a real impact. We foster a thriving internal community and are committed to creating a workplace that looks like the world that we serve.

The Information Technology group delivers secure, reliable technology solutions that enable DTCC to be the trusted infrastructure of the global capital markets. The team delivers high-quality information through activities that include developing essential capabilities, building infrastructure to meet client needs, and implementing data standards and governance.

Pay and Benefits
- Competitive compensation, including base pay and annual incentive
- Comprehensive health and life insurance and well-being benefits, based on location
- Pension/retirement benefits
- Paid Time Off and Personal/Family Care, and other leaves of absence when needed to support your physical, financial, and emotional well-being
- DTCC offers a flexible/hybrid model of 3 days onsite and 2 days remote (onsite Tuesdays, Wednesdays, and a third day unique to each team or employee)

The Impact You Will Have in This Role

The Enterprise Application Support role specializes in maintaining and providing technical support for all applications that are beyond the development stage and are running in the daily operations of the firm. This role works closely with development teams, infrastructure partners, and internal clients to advance and resolve technical support incidents. 3 days onsite are mandatory, with 2 optional days of remote work (onsite Tuesdays, Wednesdays, and a third day of your choosing). You may be required to work Tuesday through Saturday or Sunday through Thursday on a rotational or permanent basis.

Your Primary Responsibilities
- Experience with using ITIL Change, Incident, and Problem management processes.
- Assist with Major Incident calls, engaging the proper parties needed and helping to determine root cause.
- Troubleshoot and debug system components to resolve technical issues in complex and highly regulated environments comprised of ground and cloud applications and services.
- Analyze proposed application designs and provide feedback on potential gaps or recommendations for optimization.
- Hands-on experience with monitoring and alerting processes in distributed, cloud, and mainframe environments.
- Knowledge and understanding of cybersecurity best practices and general security concepts like password rotation, access restriction, and malware detection.
- Take part in Monthly Service Reviews (MSR) with development partners to go over KPI metrics.
- Participate in Disaster Recovery / Loss of Region events (planned and unplanned), executing tasks and collecting evidence.
- Collaborate both within the team and across teams to resolve application issues and escalate as needed.
- Support audit requests in a timely fashion, providing needed documentation and evidence.
- Plan and execute certificate creation/renewals as needed.
- Monitor dashboards to better catch potential issues and aid in observability.
- Help gather and analyze project requirements and translate them into technical specifications.
- Basic understanding of all lifecycle components (code, test, deploy).
- Good verbal and written communication and interpersonal skills, communicating openly with team members and others.
- Contribute to a culture where honesty and transparency are expected.
- On-call support with flexible work arrangements.

**NOTE: The Primary Responsibilities of this role are not limited to the details above.**

Qualifications
- Minimum of 3 years of relevant production support experience.
- Bachelor's degree preferred, or equivalent experience.

Talents Needed for Success

Technical Qualifications (Distributed/Cloud):
- Hands-on experience in Unix, Linux, Windows, SQL/PLSQL
- Familiarity working with relational databases (DB2, Oracle, Snowflake)
- Monitoring and data tools experience (Splunk, Dynatrace, ThousandEyes, Grafana, Selenium, IBM Zolda)
- Cloud technologies (AWS services (S3, EC2, Lambda, SQS, IAM roles), Azure, OpenShift, RDS Aurora, Postgres)
- Scheduling tool experience (CA AutoSys, Control-M)
- Middleware experience (Solace, Tomcat, Liberty Server, WebSphere, WebLogic, JBoss)
- Messaging queue systems (IBM MQ, Oracle AQ, ActiveMQ, RabbitMQ, Kafka)
- Scripting languages (Bash, Python, Ruby, Shell, Perl, JavaScript)
- Hands-on experience with ETL tools (Informatica Data Hub/IDQ, Talend)

Technical Qualifications (Mainframe):
- Mainframe troubleshooting and support skills (COBOL, JCL, DB2, DB2 Stored Procedures, CICS, SPUFI, File-AID)
- Mainframe scheduling (job abends, predecessor/successor)

Actual salary is determined based on the role, location, individual experience, skills, and other considerations. We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

About Us

With over 50 years of experience, DTCC is the premier post-trade market infrastructure for the global financial services industry.
From 20 locations around the world, DTCC, through its subsidiaries, automates, centralizes, and standardizes the processing of financial transactions, mitigating risk, increasing transparency, enhancing performance and driving efficiency for thousands of broker/dealers, custodian banks and asset managers. Industry owned and governed, the firm innovates purposefully, simplifying the complexities of clearing, settlement, asset servicing, transaction processing, trade reporting and data services across asset classes, bringing enhanced resilience and soundness to existing financial markets while advancing the digital asset ecosystem. In 2024, DTCC’s subsidiaries processed securities transactions valued at U.S. $3.7 quadrillion and its depository subsidiary provided custody and asset servicing for securities issues from over 150 countries and territories valued at U.S. $99 trillion. DTCC’s Global Trade Repository service, through locally registered, licensed, or approved trade repositories, processes more than 25 billion messages annually. To learn more, please visit us at www.dtcc.com or connect with us on LinkedIn , X , YouTube , Facebook and Instagram . DTCC proudly supports Flexible Work Arrangements favoring openness and gives people freedom to do their jobs well, by encouraging diverse opinions and emphasizing teamwork. When you join our team, you’ll have an opportunity to make meaningful contributions at a company that is recognized as a thought leader in both the financial services and technology industries. A DTCC career is more than a good way to earn a living. It’s the chance to make a difference at a company that’s truly one of a kind. Learn more about Clearance and Settlement by clicking here .
Posted 1 month ago
12.0 years
0 Lacs
India
Remote
Position: Principal Data Architect (12 to 15 Years Exp.)
Location: REMOTE (India)
Full-time

Job Summary: We are seeking an experienced Principal Data Architect to lead our data architecture strategy with a strong emphasis on ETL development and data modeling. This role will define and implement scalable data frameworks, guide data integration and transformation efforts, and ensure the integrity, efficiency, and availability of data to support enterprise analytics and business intelligence initiatives.

Key Responsibilities:
- Design and maintain enterprise-level data architecture, ensuring alignment with business goals and IT strategy.
- Define standards and best practices for data integration, modeling, and governance.
- Provide thought leadership on data platform evolution (e.g., data lakes, warehouses, lakehouses).
- Architect robust ETL pipelines to extract data from various sources (structured and unstructured), transform it, and load it into target systems.
- Optimize ETL workflows for performance, scalability, and reliability.
- Lead logical and physical data modeling efforts across operational and analytical environments (OLTP, OLAP).

Required:
- 10+ years of experience in data architecture, data engineering, or related roles.
- Expert-level experience with ETL tools (e.g., Informatica, Talend, Azure Data Factory, AWS Glue).
- Deep expertise in data modeling (3NF, dimensional, data vault, etc.) and tools like Erwin, ER/Studio, or dbt.
- Strong SQL skills and experience with RDBMS and NoSQL technologies.
- Cloud data architecture experience with platforms such as AWS, Azure, or GCP.
- Excellent communication and stakeholder management skills.
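The extract-transform-load pattern this role architects can be reduced to three small stages. A deliberately minimal sketch: the CSV string stands in for a source extract, sqlite3 for the target system, and all field names are hypothetical; real pipelines in tools like Informatica or Azure Data Factory add orchestration, reject handling, and incremental logic around the same shape.

```python
import csv
import io
import sqlite3

def extract(csv_text):
    """Extract: parse raw CSV text into dict rows."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Transform: cast types, derive a total, and drop unparseable rows."""
    out = []
    for r in rows:
        try:
            qty, price = int(r["qty"]), float(r["price"])
        except ValueError:
            continue  # a real pipeline would route bad records to a reject queue
        out.append((r["sku"], qty, price, round(qty * price, 2)))
    return out

def load(conn, rows):
    """Load: bulk insert into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS sales (sku TEXT, qty INT, price REAL, total REAL)"
    )
    conn.executemany("INSERT INTO sales VALUES (?, ?, ?, ?)", rows)

raw = "sku,qty,price\nA1,2,9.50\nB2,oops,1.00\nC3,1,3.25\n"
conn = sqlite3.connect(":memory:")
load(conn, transform(extract(raw)))
print(conn.execute("SELECT COUNT(*), SUM(total) FROM sales").fetchone())  # (2, 22.25)
```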
Posted 1 month ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

T24 BA - Data Migration - Senior Manager

Key words: T24 Data Migration, Requirements Gathering, Stakeholder Management, Gap Analysis, As-Is / To-Be Analysis, Data Mapping, ETL Processes, Data Quality & Validation, SQL, Data Modelling, Data Transformation, Source-to-Target Mapping, Data Reconciliation, Data Cleansing, Migration Tools (e.g., Informatica, Talend, Microsoft SSIS, SAP Data Services)

Job Summary: The Data Migration Business Analyst will lead the end-to-end analysis and execution of data migration initiatives across complex enterprise systems. This role demands deep expertise in data migration strategies, strong analytical capabilities, and a proven ability to work with cross-functional teams, including IT, business stakeholders, and data architects. You will be responsible for defining migration requirements, leading data mapping and reconciliation efforts, ensuring data integrity, and supporting transformation programs from legacy systems to modern platforms. As a senior leader, you will also play a critical role in stakeholder engagement, risk mitigation, and aligning data migration efforts with broader business objectives.
Mandatory requirements: Selected candidates should be willing to work out of the client location in Chennai for 5 days a week.
Roles and Responsibilities:
T24 professionals with expertise and prior work experience on one or more T24 products: Retail, Corporate, Internet Banking, Mobile Banking, Wealth Management, and payment suites.
Well versed in the technical aspects of the product and experienced in data migration activities.
Good understanding of the T24 architecture, administration, configuration, and data structure.
Technical:
Design and development experience in Infobasic, Core Java, EJB, and J2EE Enterprise.
Working experience and/or knowledge of Informatica.
In-depth experience in end-to-end migration tasks, from migration strategy through ETL process and data reconciliation.
Experience in relational or hierarchical databases including Oracle, DB2, Postgres, MySQL, and MSSQL.
Working knowledge in one or more functional areas such as Core, Retail, Corporate, Securities Lending, Asset Management, Compliance, and AML, including product parametrisation and set-up.
In-depth knowledge of best banking practices and T24 modules like Private Banking, Securities, and Accounting, combined with a good understanding of GL.
Ability to handle a crisis and steer the team in the right direction.
Excellent documentation skills in the migration stream: data migration strategy, finalising data mapping, data profiling/cleansing, ETL process, and data reconciliation.
Excellent business communication skills.
Other skills include effort estimation, pre-sales support, engagement assessments, project planning, and conducting training for clients and internal staff.
Good leadership skills.
Excellent client-facing skills.
MBA/MCA/BE/B.Tech or equivalent, with sound industry experience of 9 to 12 years.
Your client responsibilities: Need to work as a team lead in one or more T24 projects.
Interface and communicate with the onsite coordinators.
Complete assigned tasks on time, with regular status reporting to the lead.
Regular status reporting to the Manager and onsite coordinators.
Interface with customer representatives as and when needed.
Should be ready to travel to customer locations on a need basis.
Your People Responsibilities:
Building a quality culture.
Manage performance management for direct reports, as per the organization's policies.
Foster teamwork and lead by example.
Training and mentoring of project resources.
Participating in organization-wide people initiatives.
Preferred skills: database administration, performance tuning, and prior client-facing experience.
EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
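The source-to-target mapping work this listing keeps returning to can be sketched as a small, testable spec: each target field names the legacy field it comes from plus an optional transform. The field names (CUS.NO, SHORT.NAME, SECTOR) are illustrative placeholders, not actual T24 record layouts.

```python
# Hypothetical mapping spec: target field -> (legacy field, transform).
# A None transform means the value is carried over unchanged.
MAPPING = {
    "customer_id": ("CUS.NO", str.strip),
    "short_name":  ("SHORT.NAME", str.upper),
    "sector":      ("SECTOR", None),
}

def apply_mapping(legacy_row: dict) -> dict:
    """Apply the source-to-target mapping spec to one legacy record."""
    target = {}
    for target_field, (source_field, fn) in MAPPING.items():
        value = legacy_row.get(source_field)
        # Only transform non-missing values; missing fields surface as None
        # so data-quality checks downstream can flag them.
        target[target_field] = fn(value) if (fn and value is not None) else value
    return target

legacy = {"CUS.NO": " 100234 ", "SHORT.NAME": "acme", "SECTOR": "1001"}
print(apply_mapping(legacy))
```

Keeping the mapping as data rather than code is what makes "finalising data mapping" a review artifact the business can sign off on.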
Posted 1 month ago
10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title: Data Architect
Location: Hyderabad, Chennai & Bangalore
Experience: 10+ Years
Job Summary: We are seeking a highly experienced and strategic Data Architect to lead the design and implementation of robust data architecture solutions. The ideal candidate will have a deep understanding of data modelling, governance, integration, and analytics platforms. As a Data Architect, you will play a crucial role in shaping the data landscape across the organization, ensuring data availability, consistency, security, and quality.
Mandatory Skills: enterprise data architecture and modelling; cloud data platforms (Azure, AWS, GCP); data warehousing and lakehouse architecture; data governance and compliance frameworks; ETL/ELT design and orchestration; Master Data Management (MDM); Databricks architecture and implementation
Key Responsibilities: Lead, define, and implement end-to-end modern data platforms on public cloud using Databricks. Design and manage scalable data models and storage solutions. Collaborate with enterprise architects, data architects, ETL developers and engineers, data scientists, and information designers to define required data structures, formats, pipelines, metadata, and workload orchestration capabilities. Establish data standards, governance policies, and best practices. Oversee the integration of new data technologies and tools. Lead the development of data pipelines, marts, and lakes. Ensure data solutions are compliant with security and regulatory standards. Address aspects such as data privacy and security, data ingestion and processing, data storage and compute, analytical and operational consumption, data modelling, data virtualization, self-service data preparation and analytics, AI enablement, and API integrations. Mentor data engineers and developers on best practices.
Qualifications: Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field; relevant certifications in cloud platforms, data architecture, or governance.
Technical Skills:
Data Modelling: conceptual, logical, and physical modelling (ERwin, PowerDesigner, etc.)
Cloud: Azure (ADF, Synapse, Databricks), AWS (Redshift, Glue), GCP (BigQuery)
Databases: SQL Server, Oracle, PostgreSQL, NoSQL (MongoDB, Cassandra)
Data Integration: Informatica, Talend, Apache NiFi
Big Data: Hadoop, Spark, Kafka
Governance Tools: Collibra, Alation, Azure Purview
Scripting: Python, SQL, Shell
DevOps/DataOps practices and CI/CD tools
Soft Skills: strong leadership and stakeholder management; excellent communication and documentation skills; strategic thinking with problem-solving ability; collaborative and adaptive in cross-functional teams.
Good to Have: experience in AI/ML data lifecycle support; exposure to industry data standards and frameworks (TOGAF, DAMA-DMBOK); experience with real-time analytics and streaming data solutions.
Work Experience: minimum 10 years in data engineering, architecture, or related roles; at least 5 years of hands-on experience designing data platforms on Azure; demonstrated knowledge of 2 full project cycles using Databricks as an architect; experience supporting and working with cross-functional teams in a dynamic environment; advanced working SQL knowledge and experience with relational databases and unstructured datasets; experience with stream-processing systems such as Storm and Spark Streaming.
Compensation & Benefits: competitive salary and annual performance-based bonuses; comprehensive health insurance and optional parental insurance; retirement savings plans and tax savings plans.
Work-Life Balance: Flexible work hours
Key Result Areas (KRAs): effective implementation of scalable and secure data architecture; governance and compliance adherence; standardization and optimization of data assets; enablement of self-service analytics and data democratization
Key Performance Indicators (KPIs): architecture scalability and reusability metrics; time-to-delivery for data initiatives; data quality and integrity benchmarks; compliance audit outcomes; satisfaction ratings from business stakeholders
Contact: hr@bigtappanalytics.com
Posted 1 month ago
12.0 years
0 Lacs
Chandigarh, India
On-site
Job Summary: As a key contributor to our ERP Transformation Services team, the Senior ETL Data Migration Analyst is responsible for owning the design, development, and execution of enterprise-wide data migration activities. This role is instrumental in the success of global ERP implementations—primarily Oracle EBS and SAP ECC—by ensuring consistent, auditable, and high-quality data migration processes using industry-standard tools and frameworks.
In This Role, Your Responsibilities Will Be:
Pre-Go-Live: Planning & Development
Design and implement global data migration strategies for Oracle and SAP ERP projects. Develop ETL processes using Syniti DSP / SKP or an equivalent tool to support end-to-end migration. Collaborate with legacy system teams to extract and analyze source data. Build workflows for data profiling, cleansing, enrichment, and transformation. Ensure auditability and traceability of migrated data, aligned with compliance and governance standards.
Go-Live & Cutover Execution
Support mock loads, cutover rehearsals, and production data loads. Monitor data load progress and resolve issues related to performance, mapping, or data quality. Maintain a clear log of data migration actions and reconcile with source systems.
Post-Go-Live: Support & Stewardship
Monitor data creation and updates to ensure business process integrity post go-live. Provide data extract/load services for ongoing master data maintenance. Contribute to legacy data archiving strategies, tools, and execution.
Tools, Documentation & Collaboration
Maintain documentation of ETL procedures, technical specifications, and data lineage. Partner with implementation teams to translate business requirements into technical solutions. Contribute to the development and refinement of ETL frameworks and reusable components.
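The "reconcile with source systems" step above is commonly implemented as a count-and-fingerprint comparison between the source extract and what landed in the target. This is a minimal stdlib sketch with invented field names, not the API of Syniti or any specific migration tool.

```python
import hashlib

def row_fingerprint(row: dict, key_fields: list) -> str:
    """Stable hash over the fields used for reconciliation."""
    payload = "|".join(str(row.get(f, "")) for f in key_fields)
    return hashlib.sha256(payload.encode()).hexdigest()

def reconcile(source_rows, target_rows, key_fields):
    """Compare two extracts by row count and by per-row fingerprint."""
    src = {row_fingerprint(r, key_fields) for r in source_rows}
    tgt = {row_fingerprint(r, key_fields) for r in target_rows}
    return {
        "source_count": len(source_rows),
        "target_count": len(target_rows),
        "missing_in_target": len(src - tgt),      # rows that failed to load
        "unexpected_in_target": len(tgt - src),   # rows with no source match
    }

source = [{"id": 1, "amount": 100}, {"id": 2, "amount": 250}]
target = [{"id": 1, "amount": 100}]
print(reconcile(source, target, ["id", "amount"]))
```

Writing the reconciliation result to a run log is what makes the cutover auditable: each mock load and the production load leave a comparable record.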
Travel Requirements: Willingness to travel up to 20% for project needs, primarily during key implementation phases.
Who You Are: You show a tremendous amount of initiative in tough situations, and you have strong analytical and problem-solving skills. You are self-motivated, accountable, and proactive in learning and applying new technologies. You possess superb communication and collaboration skills across global teams.
For This Role, You Will Need:
12+ years of IT experience with a focus on ETL, data management, and ERP data migration.
Strong hands-on experience with Oracle EBS or SAP ECC implementations.
Proficiency in Syniti DSP, Informatica, Talend, or similar enterprise ETL tools.
Proficient SQL skills; ability to write and optimize queries for large datasets.
A demonstrable track record in data profiling, cleansing, and audit trail maintenance.
Academic background of MCA / BE / BSc in Computer Science, Engineering, Information Systems, or Business Administration.
Proven application development experience in .NET, ABAP, or scripting languages.
Familiarity with data migration implementations and data modeling principles.
Knowledge of project management methodologies (Agile, PMP, etc.).
Performance Indicators:
Successful execution of data migration cutovers with minimal errors.
Complete data traceability and audit compliance from source to target.
Timely delivery of ETL solutions and reports per project phases.
Continuous improvement and reuse of ETL frameworks and standard processes.
Our Culture & Commitment to You: At Emerson, we prioritize a workplace where every employee is valued, respected, and empowered to grow. We foster an environment that encourages innovation, collaboration, and diverse perspectives—because we know that great ideas come from great teams. Our commitment to ongoing career development and growing an inclusive culture ensures you have the support to thrive.
Whether through mentorship, training, or leadership opportunities, we invest in your success so you can make a lasting impact. We believe diverse teams, working together, are key to driving growth and delivering business results. We recognize the importance of employee wellbeing. We prioritize providing competitive benefits plans, a variety of medical insurance plans, an Employee Assistance Program, employee resource groups, recognition, and much more. Our culture offers flexible time-off plans, including paid parental leave (maternal and paternal), vacation, and holiday leave.
About Us
WHY EMERSON
Our Commitment to Our People: At Emerson, we are motivated by a spirit of collaboration that helps our diverse, multicultural teams across the world drive innovation that makes the world healthier, safer, smarter, and more sustainable. And we want you to join us in our bold aspiration. We have built an engaged community of inquisitive, dedicated people who thrive knowing they are welcomed, trusted, celebrated, and empowered to solve the world’s most complex problems — for our customers, our communities, and the planet. You’ll contribute to this vital work while further developing your skills through our award-winning employee development programs. We are a proud corporate citizen in every city where we operate and are committed to our people, our communities, and the world at large. We take this responsibility seriously and strive to make a positive impact through every endeavor. At Emerson, you’ll see firsthand that our people are at the center of everything we do. So, let’s go. Let’s think differently. Learn, collaborate, and grow. Seek opportunity. Push boundaries. Be empowered to make things better. Speed up to break through. Let’s go, together.
Accessibility Assistance or Accommodation: If you have a disability and are having difficulty accessing or using this website to apply for a position, please contact: idisability.administrator@emerson.com.
About Emerson Emerson is a global leader in automation technology and software. Through our deep domain expertise and legacy of flawless execution, Emerson helps customers in critical industries like life sciences, energy, power and renewables, chemical and advanced factory automation operate more sustainably while improving productivity, energy security and reliability. With global operations and a comprehensive portfolio of software and technology, we are helping companies implement digital transformation to measurably improve their operations, conserve valuable resources and enhance their safety. We offer equitable opportunities, celebrate diversity, and embrace challenges with confidence that, together, we can make an impact across a broad spectrum of countries and industries. Whether you’re an established professional looking for a career change, an undergraduate student exploring possibilities, or a recent graduate with an advanced degree, you’ll find your chance to make a difference with Emerson. Join our team – let’s go! No calls or agencies please.
Posted 1 month ago
12.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Summary: As a key contributor to our ERP Transformation Services team, the Senior ETL Data Migration Analyst is responsible for owning the design, development, and execution of enterprise-wide data migration activities. This role is instrumental in the success of global ERP implementations—primarily Oracle EBS and SAP ECC—by ensuring consistent, auditable, and high-quality data migration processes using industry-standard tools and frameworks.
In This Role, Your Responsibilities Will Be:
Pre-Go-Live: Planning & Development
Design and implement global data migration strategies for Oracle and SAP ERP projects. Develop ETL processes using Syniti DSP / SKP or an equivalent tool to support end-to-end migration. Collaborate with legacy system teams to extract and analyze source data. Build workflows for data profiling, cleansing, enrichment, and transformation. Ensure auditability and traceability of migrated data, aligned with compliance and governance standards.
Go-Live & Cutover Execution
Support mock loads, cutover rehearsals, and production data loads. Monitor data load progress and resolve issues related to performance, mapping, or data quality. Maintain a clear log of data migration actions and reconcile with source systems.
Post-Go-Live: Support & Stewardship
Monitor data creation and updates to ensure business process integrity post go-live. Provide data extract/load services for ongoing master data maintenance. Contribute to legacy data archiving strategies, tools, and execution.
Tools, Documentation & Collaboration
Maintain documentation of ETL procedures, technical specifications, and data lineage. Partner with implementation teams to translate business requirements into technical solutions. Contribute to the development and refinement of ETL frameworks and reusable components.
Travel Requirements: Willingness to travel up to 20% for project needs, primarily during key implementation phases.
Who You Are: You show a tremendous amount of initiative in tough situations, and you have strong analytical and problem-solving skills. You are self-motivated, accountable, and proactive in learning and applying new technologies. You possess superb communication and collaboration skills across global teams.
For This Role, You Will Need:
12+ years of IT experience with a focus on ETL, data management, and ERP data migration.
Strong hands-on experience with Oracle EBS or SAP ECC implementations.
Proficiency in Syniti DSP, Informatica, Talend, or similar enterprise ETL tools.
Proficient SQL skills; ability to write and optimize queries for large datasets.
A demonstrable track record in data profiling, cleansing, and audit trail maintenance.
Academic background of MCA / BE / BSc in Computer Science, Engineering, Information Systems, or Business Administration.
Proven application development experience in .NET, ABAP, or scripting languages.
Familiarity with data migration implementations and data modeling principles.
Knowledge of project management methodologies (Agile, PMP, etc.).
Performance Indicators:
Successful execution of data migration cutovers with minimal errors.
Complete data traceability and audit compliance from source to target.
Timely delivery of ETL solutions and reports per project phases.
Continuous improvement and reuse of ETL frameworks and standard processes.
Our Culture & Commitment to You: At Emerson, we prioritize a workplace where every employee is valued, respected, and empowered to grow. We foster an environment that encourages innovation, collaboration, and diverse perspectives—because we know that great ideas come from great teams. Our commitment to ongoing career development and growing an inclusive culture ensures you have the support to thrive.
Whether through mentorship, training, or leadership opportunities, we invest in your success so you can make a lasting impact. We believe diverse teams, working together, are key to driving growth and delivering business results. We recognize the importance of employee wellbeing. We prioritize providing competitive benefits plans, a variety of medical insurance plans, an Employee Assistance Program, employee resource groups, recognition, and much more. Our culture offers flexible time-off plans, including paid parental leave (maternal and paternal), vacation, and holiday leave.
About Us
WHY EMERSON
Our Commitment to Our People: At Emerson, we are motivated by a spirit of collaboration that helps our diverse, multicultural teams across the world drive innovation that makes the world healthier, safer, smarter, and more sustainable. And we want you to join us in our bold aspiration. We have built an engaged community of inquisitive, dedicated people who thrive knowing they are welcomed, trusted, celebrated, and empowered to solve the world’s most complex problems — for our customers, our communities, and the planet. You’ll contribute to this vital work while further developing your skills through our award-winning employee development programs. We are a proud corporate citizen in every city where we operate and are committed to our people, our communities, and the world at large. We take this responsibility seriously and strive to make a positive impact through every endeavor. At Emerson, you’ll see firsthand that our people are at the center of everything we do. So, let’s go. Let’s think differently. Learn, collaborate, and grow. Seek opportunity. Push boundaries. Be empowered to make things better. Speed up to break through. Let’s go, together.
Accessibility Assistance or Accommodation: If you have a disability and are having difficulty accessing or using this website to apply for a position, please contact: idisability.administrator@emerson.com.
About Emerson Emerson is a global leader in automation technology and software. Through our deep domain expertise and legacy of flawless execution, Emerson helps customers in critical industries like life sciences, energy, power and renewables, chemical and advanced factory automation operate more sustainably while improving productivity, energy security and reliability. With global operations and a comprehensive portfolio of software and technology, we are helping companies implement digital transformation to measurably improve their operations, conserve valuable resources and enhance their safety. We offer equitable opportunities, celebrate diversity, and embrace challenges with confidence that, together, we can make an impact across a broad spectrum of countries and industries. Whether you’re an established professional looking for a career change, an undergraduate student exploring possibilities, or a recent graduate with an advanced degree, you’ll find your chance to make a difference with Emerson. Join our team – let’s go! No calls or agencies please.
Posted 1 month ago
15.0 years
0 Lacs
Hyderābād
On-site
Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-have skills: Talend ETL
Good-to-have skills: NA
Minimum 7.5 years of experience is required.
Educational Qualification: 15 years of full-time education
Summary: As a Data Engineer, you will be responsible for designing, developing, and maintaining data solutions for data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across systems.
Roles & Responsibilities:
- Expected to be an SME with deep knowledge and experience.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Create data pipelines to extract, transform, and load data across systems.
- Implement ETL processes to migrate and deploy data across systems.
- Ensure data quality and integrity throughout the data lifecycle.
Professional & Technical Skills:
- Required skill: expert proficiency in Talend Big Data.
- Strong understanding of data engineering principles and best practices.
- Experience with data integration and data warehousing concepts.
- Experience with data migration and deployment.
- Proficiency in SQL and database management.
- Knowledge of data modeling and optimization techniques.
Additional Information:
- The candidate should have a minimum of 5 years of experience in Talend Big Data.
Posted 1 month ago
0 years
0 Lacs
Bhopal
On-site
We are seeking a skilled Data Analyst intern to join our team. The ideal candidate will have experience in data analytics, business analysis, and working with various tools and technologies to extract insights from data.
Duties:
Utilize tools such as Talend and SQL to analyze data sets.
Collaborate with cross-functional teams to gather requirements for data analysis projects.
Develop and implement data collection systems and strategies.
Interpret data, analyze results using statistical techniques, and provide ongoing reports.
Identify patterns and trends in data sets.
Conduct full lifecycle activities, including requirements analysis and design.
Experience:
Proficiency in SQL for querying and analyzing large datasets
Experience with data visualization tools like Tableau for creating insightful reports
Knowledge of Agile methodologies for project management
Familiarity with SDLC processes
Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail
Nice-to-have skills: familiarity with Linked Data concepts, Business Analysis techniques, and experience using analytics tools like 'smash'
Interested candidates can share their resume at ritupokharal@katyayaniorganics.com
Job Type: Internship
Contract length: 6 months
Pay: ₹5,000.00 per year
Schedule: Day shift
Work Location: In person
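Identifying trends in a data set, as the duties above describe, can start as simply as a least-squares slope over a time-ordered series: a positive slope suggests growth, a negative one decline. The monthly figures below are invented for illustration.

```python
from statistics import mean

# Hypothetical monthly order counts, e.g. pulled via a SQL GROUP BY month.
monthly_orders = [120, 135, 150, 149, 170, 185]

def trend_slope(values):
    """Least-squares slope of values against their index (months)."""
    xs = range(len(values))
    x_bar, y_bar = mean(xs), mean(values)
    num = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, values))
    den = sum((x - x_bar) ** 2 for x in xs)
    return num / den

slope = trend_slope(monthly_orders)
print(f"orders trend: {slope:+.1f} per month")
```

A slope alone is not a significance test, but it is a cheap first pass before reaching for Tableau or a statistical model.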
Posted 1 month ago
7.0 years
0 Lacs
Gandhinagar, Gujarat, India
On-site
Key Responsibilities
• Lead and mentor a high-performing data pod composed of data engineers, data analysts, and BI developers.
• Design, implement, and maintain ETL pipelines and data workflows to support real-time and batch processing.
• Architect and optimize data warehouses for scale, performance, and security.
• Perform advanced data analysis and modeling to extract insights and support business decisions.
• Lead data science initiatives including predictive modeling, NLP, and statistical analysis.
• Manage and tune relational and non-relational databases (SQL, NoSQL) for availability and performance.
• Develop Power BI dashboards and reports for stakeholders across departments.
• Ensure data quality, integrity, and compliance with data governance and security standards.
• Work with cross-functional teams (product, marketing, ops) to turn data into strategy.
Qualifications Required:
• PhD in Data Science, Computer Science, Engineering, Mathematics, or related field.
• 7+ years of hands-on experience across data engineering, data science, analysis, and database administration.
• Strong experience with ETL tools (e.g., Airflow, Talend, SSIS) and data warehouses (e.g., Snowflake, Redshift, BigQuery).
• Proficient in SQL, Python, and Power BI.
• Familiarity with modern cloud data platforms (AWS/GCP/Azure).
• Strong understanding of data modeling, data governance, and MLOps practices.
• Exceptional ability to translate business needs into scalable data solutions.
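The batch side of the ETL responsibility above reduces to three composable stages: extract, transform (with data-quality quarantine), and load. This is a minimal sketch with invented records and a list standing in for a warehouse, not any specific team's workflow or tool.

```python
def extract():
    # In practice this would read from a database, an API, or a file drop.
    return [
        {"user_id": "1", "spend": "19.99"},
        {"user_id": "2", "spend": "bad-value"},   # deliberately malformed
        {"user_id": "3", "spend": "5.00"},
    ]

def transform(rows):
    """Type-cast rows; quarantine anything that fails validation."""
    clean, rejected = [], []
    for row in rows:
        try:
            clean.append({"user_id": int(row["user_id"]),
                          "spend": float(row["spend"])})
        except (KeyError, ValueError):
            rejected.append(row)   # kept for a data-quality report
    return clean, rejected

def load(rows, sink):
    sink.extend(rows)              # stand-in for a warehouse write
    return len(rows)

warehouse = []
clean, rejected = transform(extract())
loaded = load(clean, warehouse)
print(f"loaded={loaded}, rejected={len(rejected)}")
```

Keeping each stage a plain function is what lets an orchestrator such as Airflow schedule and retry them independently.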
Posted 1 month ago