2.0 years
4 - 6 Lacs
Noida
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Job Description – Business Intelligence Developer (OAC, Power BI, ETL, Data Modelling)
Competency: Oracle ERP Analytics

We are seeking an experienced Business Intelligence Developer with 2+ years of experience and expertise in Oracle Analytics Cloud (OAC), Power BI, ETL tools, and data modelling to join our dynamic team. The successful candidate will be responsible for developing and maintaining scalable data models, creating insightful analytics dashboards, and managing ETL workflows to support data-driven decision-making across the organization. They will work closely with customers, data architects, software developers, and business analysts on product development. The candidate will be a highly skilled individual, accountable for their own career development and growth at EY.

Responsibilities:
- Collaborate with stakeholders to understand data requirements and translate business needs into data models.
- Design and implement effective data models to support business intelligence activities.
- Develop and maintain ETL processes to ensure data accuracy and availability.
- Create interactive dashboards and reports using Oracle Analytics Cloud (OAC) and Power BI.
- Gather requirements from stakeholders and translate business needs into technical specifications.
- Optimize data retrieval and develop dashboard visualizations for performance efficiency.
- Ensure data integrity and compliance with data governance and security policies.
- Collaborate with IT and data teams to integrate BI solutions into the existing data infrastructure.
- Conduct data analysis to identify trends, patterns, and insights that can inform business strategies.
- Provide training and support to end users on BI tools and dashboards.
- Document all processes, models, and activities to maintain transparency and facilitate knowledge sharing.
- Stay up to date with the latest BI technologies and best practices to drive continuous improvement.

Qualifications:
- Bachelor’s degree in Computer Science, Information Systems, Business Analytics, or a related field.
- Proven experience with Oracle Analytics Cloud (OAC), Power BI, and other BI tools.
- Strong experience with ETL processes (SSIS, Informatica, Dell Boomi, etc.) and data warehousing solutions.
- Proficiency in data modelling techniques and best practices.
- Solid understanding of SQL and experience with relational databases.
- Familiarity with cloud platforms and services (e.g., AWS, Azure, Google Cloud).
- Excellent analytical, problem-solving, and project management skills.
- Ability to communicate complex data concepts to non-technical stakeholders.
- Well-developed business acumen and a strong problem-solving attitude, with the ability to visualize scenarios, possible outcomes, and operating constraints.
- Strong consulting skills, with proven experience in client and stakeholder management and collaboration.
- Good written and oral communication skills, the ability to make impactful presentations, and expertise with Excel and PowerPoint.
- Detail-oriented with a commitment to quality and accuracy.
Good to have: knowledge of data security and controls to address customers’ data privacy needs, in line with regional regulations such as GDPR, CCPA, etc.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
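A hedged, minimal sketch of the kind of ETL step the responsibilities above describe, using pandas and SQLAlchemy. The connection strings, table and column names are hypothetical placeholders; in practice this work would typically be built in the ETL tools named in the posting rather than hand-rolled scripts.

```python
import pandas as pd
from sqlalchemy import create_engine, text

# Hypothetical connections: an ERP source and a reporting warehouse.
source = create_engine("oracle+oracledb://user:pwd@erp-host:1521/?service_name=ERP")
target = create_engine("postgresql+psycopg2://user:pwd@dw-host:5432/reporting")

def load_daily_invoices(run_date: str) -> int:
    # Extract: pull one day of invoices from the source system.
    df = pd.read_sql(
        text("SELECT invoice_id, customer_id, amount, invoice_date "
             "FROM invoices WHERE invoice_date = :d"),
        source,
        params={"d": run_date},
    )
    # Transform: drop incomplete rows, deduplicate, round amounts.
    df = df.dropna(subset=["invoice_id"]).drop_duplicates("invoice_id")
    df["amount"] = df["amount"].round(2)
    # Load: append to the warehouse fact table and report rows moved.
    df.to_sql("fact_invoices", target, if_exists="append", index=False)
    return len(df)

print(load_daily_invoices("2025-01-31"), "rows loaded")
```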
Posted 3 weeks ago
3.0 - 7.0 years
3 - 5 Lacs
Calcutta
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS – Data and Analytics (D&A) – Senior – Collibra

As part of our EY-GDS D&A (Data and Analytics) team, we assist our clients in overcoming complex business challenges through the power of data and technology. We delve deep into data to extract maximum value and uncover opportunities across key sectors, including Banking, Insurance, Manufacturing, Healthcare, Retail, Supply Chain, and Finance.

The opportunity
We are seeking candidates with a robust understanding of technology and data in the Data Governance and Data Cataloguing space, along with a proven track record of successful delivery. This is an excellent opportunity to join a leading firm and be part of a dynamic Data and Analytics team.

Your key responsibilities
- Develop standardized practices for creating and deploying data cataloguing and metadata solutions using Collibra.
- Collaborate with client technology leaders to understand their business objectives and architect data governance solutions tailored to their needs.
- Define and implement best practices for metadata management specific to client requirements.
- Create and maintain data dictionaries and hierarchies within Collibra.
- Integrate Collibra into the broader Data Analytics ecosystem, ensuring seamless functionality with other tools.

Skills and attributes for success
- 3-7 years of total IT experience.
- Strong experience in designing and building solutions using Collibra (Data Catalog, metadata ingestion, data classification and tagging, data lineage, workflow creation).
- Extensive experience in developing and maintaining data cataloguing solutions.
- Proficient programming skills in Python, Java, or Groovy.
- Solid understanding of Data Governance and metadata management principles.
- Hands-on experience with API integration.
- Familiarity with at least one other data governance tool, such as MS Purview, Alation, or Informatica.
- Knowledge of data engineering pipelines built with Informatica PowerCenter and IDMC is preferred.
- Knowledge of data engineering pipelines in Azure/Databricks is preferred.
- Experience with BI and data analytics databases is a plus.
- Ability to translate business challenges into technical solutions, considering security, performance, and scalability.

To qualify for the role, you must:
- Be a computer science graduate or equivalent with more than 3 years of industry experience.
- Have working experience in an Agile-based delivery methodology.
- Have a flexible, proactive, self-motivated working style with strong personal ownership of problem resolution.
- Be an excellent communicator (written and verbal, formal and informal).
- Be able to multi-task under pressure and work independently with minimal supervision.
- Be a team player who enjoys working in a cooperative and collaborative team environment.
- Be adaptable to new technologies and standards.
- Participate in all aspects of the solution delivery life cycle, including analysis, design, development, testing, production deployment, and support.

Ideally, you’ll also have
- Client management skills
- Solutioning skills

What we look for
People with technical experience and enthusiasm to learn new things in this fast-moving environment.
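As an illustration of the Python and API-integration skills listed above, here is a hedged sketch that registers an asset through Collibra's REST API v2. The instance URL, credentials and UUIDs are placeholders; the /rest/2.0/assets endpoint follows Collibra's published API, but verify paths and payloads against your own environment.

```python
import requests

BASE = "https://your-instance.collibra.com/rest/2.0"   # placeholder instance URL
AUTH = ("svc_catalog", "s3cret")                       # placeholder credentials

def register_asset(name: str, domain_id: str, type_id: str) -> dict:
    """Create a single asset in a Collibra domain and return the API response."""
    resp = requests.post(
        f"{BASE}/assets",
        json={"name": name, "domainId": domain_id, "typeId": type_id},
        auth=AUTH,
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

# Hypothetical UUIDs for a target domain and the 'Table' asset type.
asset = register_asset("DW.PUBLIC.CUSTOMER", "domain-uuid", "table-type-uuid")
print(asset)
```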
What working at EY offers
At EY, we’re dedicated to helping our clients, from startups to Fortune 500 companies — and the work we do with them is as varied as they are. You will work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching and feedback from some of the most engaging colleagues around
- Opportunities to develop new skills and progress your career
- The freedom and flexibility to handle your role in a way that’s right for you

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 3 weeks ago
0 years
0 Lacs
Andhra Pradesh
On-site
Team Lead Ref #S0014: Data Quality Developer
- Analyze data quality requirements and associated business rules
- Create automated business-logic checks and data-cleansing routines using Informatica
- Perform code review, unit testing and bug fixing
- Provide required support during system and regression testing
- Prior experience in data quality / data profiling assignments, with good hands-on experience in Informatica Data Quality 8.6/9.0, is desirable

Special requirements: Informatica Data Quality 8.6/9.0

Any applicant who is interested in this position may apply by regular mail (include Reference Number S0014) to:
Human Resources
Solis Technologies, Inc.
#B8, Plot No PH1, Madhura Nagar, Hyderabad
Andhra Pradesh, India - 500038
Posted 3 weeks ago
6.0 years
2 - 4 Lacs
Durgāpura
On-site
Unlock yourself. Take your career to the next level. At Atrium, we live and deliver at the intersection of industry strategy, intelligent platforms, and data science — empowering our customers to maximize the power of their data to solve their most complex challenges. We have a unique understanding of the role data plays in the world today and serve as market leaders in intelligent solutions. Our data-driven, industry-specific approach to business transformation for our customers places us uniquely in the market.

Who are you?
You are smart, collaborative, and take ownership to get things done. You love to learn and are intellectually curious about business and technology tools, platforms, and languages. You are energized by solving complex problems and bored when you don’t have something to do. You love working in teams and are passionate about pulling your weight to make sure the team succeeds.

What will you be doing at Atrium?
In this role, you will join the best and brightest in the industry to skillfully push the boundaries of what’s possible. You will work with customers to make smarter decisions through innovative problem-solving using data engineering, analytics, and systems of intelligence. You will partner to advise, implement, and optimize solutions through industry expertise, leading cloud platforms, and data engineering. As a Snowflake Data Engineering Lead, you will be responsible for expanding and optimizing the data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. You will support the software developers, database architects, data analysts, and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects.

In this role, you will:
- Lead the design and architecture of end-to-end data warehousing and data lake solutions, focusing on the Snowflake platform, incorporating best practices for scalability, performance, security, and cost optimization
- Assemble large, complex data sets that meet functional and non-functional business requirements
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Lead and mentor both onshore and offshore development teams, creating a collaborative environment
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, DBT, Python, AWS, and Big Data tools
- Develop ELT processes to ensure timely delivery of required data for customers
- Implement data quality measures to ensure the accuracy, consistency, and integrity of data
- Design, implement, and maintain data models that can support the organization's data storage and analysis needs
- Deliver technical and functional specifications to support data governance and knowledge sharing

In this role, you will have:
- Bachelor's degree in Computer Science, Software Engineering, or an equivalent combination of relevant work experience and education
- 6+ years of experience delivering consulting services to medium and large enterprises.
Implementations must have included a combination of the following experiences:
- Data Warehousing or Big Data consulting for mid-to-large-sized organizations
- 3+ years of experience specifically with Snowflake, demonstrating deep expertise in its core features and advanced capabilities
- Strong analytical skills, with a thorough understanding of how to interpret customer business needs and translate them into a data architecture
- SnowPro Core certification is highly desired
- Hands-on experience with Python (pandas, DataFrames, functions)
- Strong proficiency in SQL (stored procedures, functions), including debugging, performance optimization, and database design
- Strong experience with Apache Airflow and API integrations
- Solid experience with at least one ETL/ELT tool (DBT, Coalesce, WhereScape, MuleSoft, Matillion, Talend, Informatica, SAP BODS, DataStage, Dell Boomi, etc.)

Nice to have:
- Experience with Docker, DBT, data replication tools (SLT, Fivetran, Airbyte, HVR, Qlik, etc.), shell scripting, Linux commands, AWS S3, or Big Data technologies
- Strong project management, problem-solving, and troubleshooting skills, with the ability to exercise mature judgment
- An enthusiastic, professional, and confident team player with a strong focus on customer success who can present effectively even under adverse conditions
- Strong presentation and communication skills

Next Steps
Our recruitment process is highly personalized. Some candidates complete the hiring process in one week; others may take longer, as it’s important we find the right position for you. It's all about timing and can be a journey as we continue to learn about one another. We want to get to know you and encourage you to be selective — after all, deciding to join a company is a big decision!

At Atrium, we believe a diverse workforce allows us to match our growth ambitions and drive inclusion across the business. We are an equal opportunity employer, and all qualified applicants will receive consideration for employment.
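A minimal sketch of the Airflow-orchestrated ELT this role describes: a nightly DAG that runs dbt models against the warehouse and then the corresponding tests. It assumes Airflow 2.x; the DAG id, schedule, and dbt project path are illustrative placeholders, not part of the posting.

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_elt",                    # placeholder name
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",                  # nightly at 02:00
    catchup=False,
) as dag:
    # Build/refresh the warehouse models.
    run_models = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics",   # placeholder path
    )
    # Run the data-quality tests defined alongside the models.
    test_models = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics",
    )
    run_models >> test_models
```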
Posted 3 weeks ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Summary
The Data Steward has accountability for the day-to-day management of data. Data Stewards are the subject matter experts who understand and communicate the meaning and use of information, and they work with the Data & Business Owners to implement data quality standards and processes.

About The Role
Key Responsibilities:
- Execute data stewardship tasks using common methods and tools.
- Collaborate with the business to define business rules for the data and document metadata for various data elements.
- Collaborate with the Data Governance team, providing input for data standards and processes based on insights gained from the data.
- Develop a good understanding of Finance business processes and end-to-end business and data functionality.
- Work closely with the Data Owners, Data Governance and Data Quality teams, and the Global Process Owner (GPO) to ensure execution of data stewardship tasks per the aligned stewardship process and standards.
- Liaise with the Functional Data Owners, Business Owners, and Data Maintainers to discuss and resolve data quality issues.
- Continuously monitor the progress of Data Quality KPIs and ensure adherence.
- Ensure continuous and effective communication with relevant team members, stakeholders and colleagues in relation to stewardship activities.
- Review and approve data exceptions for data created by the Data Owner/Maintenance team.
- Collaborate effectively with the data community to facilitate shared learning between business users and stewards and to promote active data quality governance through the Finance Master Data Team.
- Adhere to the Novartis Values & Behaviors.
- Ensure exemplary communication with all stakeholders, including internal associates, through regular updates focused on accomplishments, KPIs, best practices, change management, key events, etc.
- Implement continuous process improvement projects to improve data quality and productivity.
- Implement the Data Quality Strategy and framework; ensure the quality of master data is maintained throughout the business process.
- Provide guidance and set standards of functional excellence in methodologies, processes and SOPs to enable enhancement of global and local data operations.

Essential Requirements
- Chartered Accountant / MBA in Finance or equivalent; CA preferred.
- 5+ years of experience working as a data steward for key business functions such as Finance, Pharmaceutical, or Healthcare.
- Hands-on experience in Data Quality, Data Governance, master data, and the data management domain.
- Hands-on experience with Collibra, Informatica Data Quality, Informatica Analyst, Ataccama, Alation, or similar tools.
- Familiarity with process set-up, Data Quality KPIs, and operational issues/management.
- Exposure to tools like Power BI, ServiceNow, Jira, Confluence, Excel, PowerPoint, and SharePoint for analysis and documentation.
- Strong understanding of data models, the data lifecycle, and enterprise systems (e.g., SAP ECC/S4 HANA, SAP EDW).
- Proficiency in data stewardship processes, data quality monitoring, and issue remediation.
- Excellent analytical, communication, presentation, and stakeholder management skills.

Why Novartis: Helping people with disease and their families takes more than innovative science. It takes a community of smart, passionate people like you. Collaborating, supporting and inspiring each other. Combining to achieve breakthroughs that change patients’ lives. Ready to create a brighter future together? https://www.novartis.com/about/strategy/people-and-culture

Join our Novartis Network: Not the right Novartis role for you?
Sign up to our talent community to stay connected and learn about suitable career opportunities as soon as they come up: https://talentnetwork.novartis.com/network Benefits and Rewards: Read our handbook to learn about all the ways we’ll help you thrive personally and professionally: https://www.novartis.com/careers/benefits-rewards
Posted 3 weeks ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
As an employee at Thomson Reuters, you will play a role in shaping and leading the global knowledge economy. Our technology drives global markets and helps professionals around the world make decisions that matter. As the world’s leading provider of intelligent information, we want your unique perspective to create the solutions that advance our business and your career. Our Service Management function is transforming into a truly global, data and standards-driven organization, employing best-in-class tools and practices across all disciplines of Technology Operations. This will drive ever-greater stability and consistency of service across the technology estate as we drive towards optimal customer and employee experience.

About the role:
In this opportunity as Senior Application Support Analyst - Informatica, you will:
- Support Informatica development, extractions, and loading; fix data discrepancies and take care of performance monitoring.
- Collaborate with stakeholders such as business teams, product owners, and project management in defining roadmaps for applications and processes.
- Drive continual service improvement and innovation in productivity, software quality, and reliability, including meeting/exceeding SLAs.
- Apply a thorough understanding of ITIL processes related to incident management, problem management, application life cycle management, and operational health management.
- Draw on experience supporting applications built on modern application architecture and cloud infrastructure: Informatica PowerCenter/IDQ, JavaScript frameworks and libraries, HTML/CSS/JS, Node.js, TypeScript, jQuery, Docker, AWS/Azure.

About You:
You're a fit for the role of Senior Application Support Analyst - Informatica if your background includes:
- 3 to 8+ years of experience as an Informatica Developer responsible for implementing ETL methodology in data extraction, transformation, and loading.
- Knowledge of ETL design: designing new or changed mappings and workflows with the team and preparing technical specifications.
- Experience creating ETL mappings, mapplets, workflows, and worklets using Informatica PowerCenter 10.x, and preparing the corresponding documentation.
- Designs and builds integrations supporting standard data warehousing objects (type-2 dimensions, aggregations, star schema, etc.).
- Able to perform source system analysis as required.
- Works with DBAs and Data Architects to plan and implement an appropriate data partitioning strategy in the Enterprise Data Warehouse.
- Implements versioning of the ETL repository and supporting code as necessary.
- Develops stored procedures, database triggers, and SQL queries where needed; implements best practices and tunes SQL code for optimization.
- Loads data from SF Power Exchange to a relational database using Informatica.
- Works with XML, XML parsers, Java, and HTTP transformations within Informatica.
- Experience integrating various data sources such as Oracle, SQL Server, DB2, and flat files in formats like fixed width, CSV, Salesforce, and Excel.
- In-depth knowledge of and experience implementing best practices for the design and development of data warehouses using star schema and snowflake schema design concepts.
- Experience in performance tuning of sources, targets, mappings, transformations, and sessions.
- Has carried out support and development activities in a relational database environment; designed tables, procedures/functions, packages, triggers, and views in relational databases; and used SQL proficiently in database programming using SNFL.
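Illustrative of the "type-2 dimensions" requirement above, here is a hedged two-step SCD2 load expressed as SQL driven from Python. Table names, columns, and the single tracked attribute are hypothetical; in practice this logic would usually live in an Informatica mapping or a stored procedure, and dialect details (e.g., CURRENT_DATE) vary by database.

```python
# Step 1: close out the current row for customers whose tracked attribute changed.
EXPIRE_CHANGED = """
UPDATE dim_customer
   SET is_current = 0, valid_to = CURRENT_DATE
 WHERE is_current = 1
   AND EXISTS (SELECT 1 FROM stg_customer s
                WHERE s.customer_id = dim_customer.customer_id
                  AND s.address <> dim_customer.address)
"""

# Step 2: insert a new current version for new customers and for those just expired.
INSERT_VERSIONS = """
INSERT INTO dim_customer (customer_id, address, valid_from, valid_to, is_current)
SELECT s.customer_id, s.address, CURRENT_DATE, NULL, 1
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1 FROM dim_customer d
                    WHERE d.customer_id = s.customer_id
                      AND d.is_current = 1)
"""

def apply_scd2(conn) -> None:
    # conn: any DB-API connection (Oracle, SQL Server, etc.).
    cur = conn.cursor()
    cur.execute(EXPIRE_CHANGED)
    cur.execute(INSERT_VERSIONS)
    cur.close()
    conn.commit()
```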
What’s in it For You?
- Thousand Coffees: Thomson Reuters café networking.
- Hybrid Work Model: We’ve adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected.
- Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance.
- Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow’s challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future.
- Industry Competitive Benefits: We offer comprehensive benefit plans to include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing.
- Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.
- Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives.
- Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world.

About Us
Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting?
Join us and help shape the industries that move society forward. As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.
Posted 3 weeks ago
3.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Dentsu’s master data management (MDM) team utilizes Semarchy for mastering critical enterprise data domains such as client and customer, driving data governance, efficient operations, and improving the quality and trust of our data and insights. The Semarchy developer will work closely with business and technical teams to design, develop, test, deploy, and maintain data products meeting functional and non-functional requirements.

Job Description:
Core Requirements:
- Knowledge of Master Data Management (MDM)
- Experience of working with MDM systems (such as Informatica, IBM DataStage, Trillium, Semarchy, etc.)
- Has a Computer Science or numerate degree
- Minimum 3 to 5 years of SQL experience working in a data warehouse, analytics, or data migration environment
- Database design using normalisation techniques
- Experienced in designing Entity Relationship Diagrams
- Has worked in a technical team to deliver team goals
- Has worked in an Agile environment (using Jira, Azure DevOps, or other agile technologies)
- Worked on internal stakeholder or customer projects
- Understands the technical development lifecycle and the difference between good design and bad design
- Use of coding standards
- Has created test plans/scripts
- Must be a team player and a strong problem solver

The following are preferred requirements:
- Excellent communication skills, with the ability to document and present design patterns, code reviews, and runbooks
- Knowledge of record matching and/or data quality issues (see the sketch below)
- Experience of working with integration tools (such as Azure Data Factory, SnapLogic, BizTalk, etc.)
- Experience of programming languages (such as C, C++, C#, Python)
- Experience of reporting tools (Tableau or Power BI)
- Understands project management principles
- Has performed demonstrations to stakeholders
- Understanding of how to implement algorithms
- Technical leadership
- People management

Location: DGS India - Mumbai - Thane Ashar IT Park
Brand: Dentsu
Time Type: Full time
Contract Type: Permanent
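A small, hedged sketch of the record-matching concern noted in the preferred requirements: a naive weighted similarity score for deciding whether two records describe the same entity. The fields, weights, and threshold are illustrative only; MDM platforms such as Semarchy provide far richer match rules out of the box.

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    # Normalized string similarity in [0, 1], case- and whitespace-insensitive.
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def is_probable_match(rec_a: dict, rec_b: dict, threshold: float = 0.85) -> bool:
    # Weight the name more heavily than the city (illustrative weights).
    name_score = similarity(rec_a["name"], rec_b["name"])
    city_score = similarity(rec_a["city"], rec_b["city"])
    return (0.7 * name_score + 0.3 * city_score) >= threshold

print(is_probable_match(
    {"name": "Acme Corp Ltd", "city": "Mumbai"},
    {"name": "ACME Corporation Ltd.", "city": "Mumbai"},
))
```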
Posted 3 weeks ago
4.0 years
0 Lacs
Gurugram, Haryana, India
On-site
This job is with Moody's, an inclusive employer and a member of myGwork – the largest global platform for the LGBTQ+ business community. Please do not contact the recruiter directly.

At Moody's, we unite the brightest minds to turn today’s risks into tomorrow’s opportunities. We do this by striving to create an inclusive environment where everyone feels welcome to be who they are—with the freedom to exchange ideas, think innovatively, and listen to each other and customers in meaningful ways. If you are excited about this opportunity but do not meet every single requirement, please apply! You still may be a great fit for this role or other open roles. We are seeking candidates who model our values: invest in every relationship, lead with curiosity, champion diverse perspectives, turn inputs into actions, and uphold trust through integrity.

Job Title: Software Engineer – Salesforce
Location: Gurgaon, Haryana, India
Department: Customer, Operations & Risk (COR), Moody’s Analytics
Reporting Manager: Manav Vatsyayana
Employment Type: Full-Time

About The Role
We are looking to bring on board a skilled and motivated Software Engineer to join our Production Support team within Moody’s Analytics. This role is critical to ensuring the stability, performance, and continuous improvement of our Salesforce platform and its integrations with enterprise systems. You will be part of a dynamic team that supports global users and collaborates closely with cross-functional stakeholders, vendors, and agile teams. If you are passionate about Salesforce technologies, thrive in high-availability environments, and enjoy solving complex problems, we’d love to hear from you.

Key Responsibilities
- Provide daily production support for Salesforce applications, ensuring timely resolution of incidents and service requests.
- Lead and manage ticket inflow, task assignments, and daily reporting.
- Collaborate with L1 business leads to prioritize tasks and ensure alignment with business needs.
- Drive root cause analysis and resolution of integrated data issues across platforms.
- Oversee release management and operational support activities.
- Design and implement automation for build, release, and deployment processes.
- Support deployment of new features and configuration changes using DevOps tools.
- Communicate incident and request statuses to stakeholders, including senior leadership.
- Participate in project transitions, UAT, and knowledge transfer activities.
- Act as Duty Manager on a rotational basis, including weekends, for major incident management (if required).
- Participate in team meetings, document procedures, and ensure service level targets are met.

Required Skills And Competencies
- Salesforce certifications: Administrator, Platform App Builder, and Platform Developer I. Apttus CPQ and FinancialForce certifications are a plus.
- Strong understanding of ITIL disciplines: Event, Incident, Request, Problem, Release, and Knowledge Management.
- Experience with data quality tools and techniques (e.g., SQL/SOQL for profiling, validation, cleansing).
- Proficiency in DevOps tools: GitHub and Jira, or similar tools such as Bitbucket, AutoRabit, SVN, Aldon, TFS, Jenkins, UrbanCode, Nolio, and Puppet.
- Experience supporting Salesforce applications and ERP/data integration tools (e.g., SAP, MuleSoft, Informatica, IBM SPM).
- Strong analytical and problem-solving skills with attention to detail.
- Ability to manage competing priorities in a fast-paced, Agile environment.
- Excellent communication and interpersonal skills.
- Proficiency in reporting and analysis tools (e.g., Excel, PowerPoint).
- Familiarity with workload automation and monitoring tools such as BMC Remedy, Control-M, Tivoli, Nagios, and Splunk is advantageous.

Education And Experience
- Bachelor’s or Master’s degree in Computer Science, Software Engineering, or a related field.
- Minimum 4 years of experience in software development, DevOps, and production support, preferably within the financial services sector.

About The Team
You’ll be joining the Production Support Team under the Business Systems group in the Customer, Operations & Risk business unit. Our team supports Moody’s Analytics employees globally who rely on the Salesforce CRM platform. This is an exciting opportunity to work on cutting-edge Salesforce technologies and contribute to a high-impact support function.

Moody’s is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability, protected veteran status, sexual orientation, gender expression, gender identity or any other characteristic protected by law. Candidates for Moody's Corporation may be asked to disclose securities holdings pursuant to Moody’s Policy for Securities Trading and the requirements of the position. Employment is contingent upon compliance with the Policy, including remediation of positions in those holdings as necessary.
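A hedged sketch of the SOQL-based data-quality profiling mentioned in the requirements above: counting Accounts with a missing billing country, via the simple-salesforce client library. The credentials are placeholders; the library and the SOQL aggregate syntax are as documented, but verify against your org.

```python
from simple_salesforce import Salesforce

sf = Salesforce(
    username="svc_user@example.com",   # placeholder
    password="password",               # placeholder
    security_token="token",            # placeholder
)

# Profile one data-quality rule: Accounts lacking a BillingCountry value.
result = sf.query("SELECT COUNT(Id) cnt FROM Account WHERE BillingCountry = null")
print("Accounts missing BillingCountry:", result["records"][0]["cnt"])
```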
Posted 3 weeks ago
9.0 years
20 - 24 Lacs
Pune, Maharashtra, India
On-site
Role: Lead Software Quality Automation Engineer
Experience: 9 to 12 years, with a minimum of 6 years in ETL/data warehouse testing (DBT/Informatica, Snowflake, SQL)
Notice period: Candidates with an official notice period of at most 1 month
All key skills must be clearly mentioned in the project details section of the resume. Validate relocation cases thoroughly.
Work Mode: Hybrid (2-3 days work-from-office per week)
Ready to work flexible hours and collaborate with US/India/Colombia teams
Excellent communication skills (written, verbal, listening, and articulation)
Candidate should have team-leading experience (minimum 2 reportees).
Mandatory skills: Python, ETL/data warehouse testing (DBT/Informatica, Snowflake, SQL)

Responsibilities
- Perform a lead role in ETL testing, UI testing, DB testing, and team management.
- Understand the holistic requirements; review and analyse stories, specifications, and technical design documents; and develop detailed test cases and test data to ensure business functionality is thoroughly tested, both automated and manual.
- Validate ETL workflows, ensuring data integrity, accuracy, and transformation rules using complex Snowflake SQL queries. Working knowledge of DBT is a plus.
- Create, execute, and maintain automation scripts in BDD (Gherkin/Behave, Pytest).
- Write DB queries (preferably in Postgres, Snowflake, MySQL, or RDS).
- Prepare, review, and update test cases and relevant test data consistent with system requirements, including functional, integration, regression, and UAT testing.
- Coordinate with cross-team subject matter experts to develop, maintain, and validate test scenarios in the best interest of the POD.
- Take ownership of creating and maintaining artifacts: test strategy, BRD, defect count/leakage reports, and other quality issues.
- Collaborate with the DevOps/SRE team to integrate test automation into CI/CD pipelines (Jenkins, Rundeck, GitHub, etc.).
- Oversee and guide a team of at least 4 testers, leading by example and institutionalizing best practices in testing processes and automation within agile methodology.
- Meet with internal stakeholders to review current testing approaches, provide data-backed feedback on ways to improve, extend, and automate, and provide senior leadership with consolidated metrics.
- Maximize the opportunity to excel in an open and recognized work culture. Be a problem solver and a team player.

Requirements
- 8-11 years of strong expertise in STLC, defect management, and test strategy design, planning, and approach.
- Experience with test requirement understanding, test data, and test plan and test case design.
- Minimum 6+ years of strong work experience in UI, database, and ETL testing.
- Experience in ETL/data warehouse testing (DBT/Informatica, Snowflake, SQL). Any experience with AWS/cloud-hosted applications is an added advantage.
- Hands-on experience writing DB queries (preferably in Postgres, Snowflake, MySQL, or RDS).
- 3+ years of experience with automation script execution, maintenance, and enhancement with Selenium WebDriver (v3+)/Playwright, with programming experience in Python (MUST) with BDD (Gherkin and Behave, Pytest).
- Key competencies required: strong analytical, problem-solving, and communication skills, collaboration, accountability, stakeholder management, passion to drive initiatives, risk highlighting, and team-leading capabilities.
- Proven team leadership experience with a minimum of 2 direct reports.
- Experience working with Agile methodologies, such as Scrum and Kanban.
- MS Power BI reporting.
- Front-end vs back-end validation is good to have.

Advantageous:
- Healthcare/Life Sciences domain experience
- Working knowledge of manual, automation, and ETL testing

Professional Approach
- Ready to work flexible hours and collaborate with US/India/Colombia teams

Skills: BDD (Gherkin, Behave, Pytest), Snowflake SQL, Informatica, ETL testing, ETL/data warehouse testing (DBT/Informatica, Snowflake, SQL), MS Power BI, UI testing, database testing, software quality automation, automation testing, Selenium WebDriver, Python, DBT, SQL, DB queries (Postgres, MySQL, RDS), Postgres, MySQL, RDS, Snowflake, CI/CD (Jenkins, Rundeck, GitHub), Jenkins, Rundeck, GitHub, data warehouse testing
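A hedged pytest sketch of the source-to-target reconciliation checks described above. The get_connection helper and the schema/table names are hypothetical stand-ins for whatever fixtures the team's framework provides (for example, a snowflake-connector session).

```python
import pytest

@pytest.fixture
def conn():
    from my_test_framework import get_connection   # hypothetical helper
    c = get_connection("snowflake")
    yield c
    c.close()

def scalar(conn, sql: str):
    # Run a query and return the single value it produces.
    cur = conn.cursor()
    cur.execute(sql)
    return cur.fetchone()[0]

def test_row_counts_match(conn):
    # Staging and warehouse should agree after the load completes.
    src = scalar(conn, "SELECT COUNT(*) FROM staging.orders")
    tgt = scalar(conn, "SELECT COUNT(*) FROM warehouse.fact_orders")
    assert src == tgt, f"Row count mismatch: staging={src}, warehouse={tgt}"

def test_no_orphan_customers(conn):
    # Every fact row must resolve to a customer dimension row.
    orphans = scalar(conn, """
        SELECT COUNT(*) FROM warehouse.fact_orders f
        LEFT JOIN warehouse.dim_customer d ON f.customer_key = d.customer_key
        WHERE d.customer_key IS NULL
    """)
    assert orphans == 0
```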
Posted 3 weeks ago
5.0 - 8.0 years
19 - 20 Lacs
Hyderabad, Telangana, India
On-site
Job Title: Senior Software Engineer
Location: Hyderabad-36784
Job Type: Full-Time (Permanent)
Experience Required: 5 to 8 Years

Position Overview
We are seeking a skilled and experienced Senior Software Engineer with a strong background in Informatica administration, specifically in MDM-E360 or PIM-P360. The ideal candidate will have hands-on experience in Oracle database management and Unix/Linux systems. This role involves maintaining and optimizing data integration platforms to support Dell’s enterprise-wide data management initiatives.

Key Responsibilities
- Perform administration, configuration, and support of Informatica MDM-E360 or PIM-P360 environments
- Install, upgrade, and maintain Informatica platforms in production and non-production environments
- Monitor system performance, tune configurations, and ensure high availability of services
- Manage user roles, access control, and security settings within Informatica MDM
- Troubleshoot and resolve application and infrastructure issues within the Informatica ecosystem
- Collaborate with developers, data engineers, and DBAs to ensure seamless platform operation and data integrity
- Schedule and monitor ETL processes, performing incident analysis and root cause investigations
- Support deployment and version control of mappings, workflows, and services
- Create and maintain administrative documentation, SOPs, and maintenance procedures
- Provide support for Unix shell scripting tasks for automation and monitoring
- Work closely with Oracle DBAs on query tuning, backup/recovery, and data consistency checks

Mandatory Skills
- 5-8 years of total experience in IT, with at least 3-5 years in Informatica MDM administration (E360 or P360)
- Strong experience in Oracle database environments, including performance tuning and SQL optimization
- Proficiency with Unix/Linux command-line tools and shell scripting
- Experience with user administration, metadata management, data profiling, and operational monitoring in MDM systems
- Familiarity with Informatica services deployment, domain management, and high-availability configurations

Preferred Skills (Nice To Have)
- Experience working in enterprise environments with large-scale data and integration needs
- Exposure to DevOps tools for automation and monitoring (e.g., Jenkins, Ansible)
- Familiarity with cloud deployments (AWS, Azure) for Informatica or Oracle
- Strong communication skills and ability to work in cross-functional teams
- ITIL-based incident and change management experience

Education
Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field

Skills: automation, data, Unix, Informatica MDM-E360, Oracle
Posted 3 weeks ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title: Corporate Trainer – Informatica MDM (Freshers Batch)
Client: Capgemini
Compensation: ₹4,000 per training day
Location: Hyderabad (travel and accommodation will be arranged for outstation trainers)
Training Type: Classroom/Onsite
Training Start Date: 20th August 2025
Duration: As per project requirement

Job Description:
We are seeking a knowledgeable and enthusiastic Informatica MDM Trainer to deliver corporate training sessions for a fresher batch at Capgemini. The trainer will be responsible for covering all key concepts of Informatica Master Data Management, ensuring foundational understanding, and preparing trainees for real-world application.

Key Responsibilities:
- Deliver comprehensive training on Informatica MDM tailored for fresh graduates.
- Create and share training materials, practical exercises, and assessments.
- Engage learners through interactive sessions and real-life scenarios.
- Track trainee progress and provide feedback.
- Adapt training methodology based on batch understanding and learning pace.

Requirements:
- Proven expertise in Informatica MDM with training or project experience.
- Prior experience in delivering training (preferred, but not mandatory).
- Ability to simplify complex concepts for freshers.
- Strong communication and presentation skills.

Compensation & Benefits:
- ₹4,000 per training day.
- Travel and accommodation provided for trainers from outside the location.

Please send your updated resume to hr@bsh-technologies.com
Posted 3 weeks ago
7.0 - 11.0 years
0 Lacs
Maharashtra
On-site
As a Solutions Architect with over 7 years of experience, you will have the opportunity to leverage your expertise in cloud data solutions to architect scalable and modern solutions on AWS. In this role at Quantiphi, you will be a key member of our high-impact engineering teams, working closely with clients to solve complex data challenges and design cutting-edge data analytics solutions. Your responsibilities will include acting as a trusted advisor to clients, leading discovery/design workshops with global customers, collaborating with AWS subject matter experts to develop compelling proposals and Statements of Work (SOWs), representing Quantiphi in forums such as tech talks, webinars, and client presentations, and providing strategic insights and solutioning support during pre-sales activities.

To excel in this role, you should have a strong background in AWS data services, including DMS, SCT, Redshift, Glue, Lambda, EMR, and Kinesis. Your experience in data migration and modernization, particularly from Oracle, Teradata, and Netezza to AWS, will be crucial. Hands-on experience with ETL tools such as SSIS, Informatica, and Talend, as well as a solid understanding of OLTP/OLAP, star and snowflake schemas, and data modeling methodologies, are essential for success in this position. Additionally, familiarity with backend development using Python, APIs, and stream-processing technologies like Kafka, along with knowledge of distributed computing concepts including Hadoop and MapReduce, will be beneficial. A DevOps mindset with experience in CI/CD practices and Infrastructure as Code is also desired.

Joining Quantiphi as a Solutions Architect is more than just a job; it's an opportunity to shape digital transformation journeys and influence business strategies across various industries. If you are a cloud data enthusiast looking to make a significant impact in the field of data analytics, this role is perfect for you.
Posted 3 weeks ago
1.0 - 3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
At PwC, our people in managed services focus on a variety of outsourced solutions and support clients across numerous functions. These individuals help organisations streamline their operations, reduce costs, and improve efficiency by managing key processes and functions on their behalf. They are skilled in project management, technology, and process optimization to deliver high-quality services to clients. Those in managed service management and strategy at PwC will focus on transitioning and running services, along with managing delivery teams, programmes, commercials, performance and delivery risk. Your work will involve the process of continuous improvement and optimising of the managed services process, tools and services.

You are a reliable, contributing member of a team. In our fast-paced environment, you are expected to adapt, take ownership and consistently deliver quality work that drives value for our clients and success as a team.

Skills
Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to:
- Apply a learning mindset and take ownership for your own development.
- Appreciate diverse perspectives, needs, and feelings of others.
- Adopt habits to sustain high performance and develop your potential.
- Actively listen, ask questions to check understanding, and clearly express ideas.
- Seek, reflect, act on, and give feedback.
- Gather information from a range of sources to analyse facts and discern patterns.
- Commit to understanding how the business works and building commercial awareness.
- Learn and apply professional and technical standards (e.g. refer to specific PwC tax and audit guidance), and uphold the Firm's code of conduct and independence requirements.

Role: Specialist
Tower: Data Analytics & Insights Managed Service
Experience: 1 - 3 years
Key Skills: Data Engineering
Educational Qualification: Bachelor's degree in computer science/IT or a relevant field
Work Location: Bangalore, India

Job Description
As a Specialist, you'll work as part of a team of problem solvers, helping to solve complex business issues from strategy to execution using Data, Analytics & Insights skills. PwC Professional skills and responsibilities for this management level include but are not limited to:
- Use feedback and reflection to develop self-awareness, personal strengths, and address development areas.
- Be flexible to work in stretch opportunities/assignments.
- Demonstrate critical thinking and the ability to bring order to unstructured problems.
- Review ticket quality and deliverables, and provide status reporting for the project.
- Adhere to SLAs, with experience in incident management, change management, and problem management.
- Review your work and that of others for quality, accuracy, and relevance.
- Know how and when to use the tools available for a given situation, and explain the reasons for this choice.
- Seek and embrace opportunities that give exposure to different situations, environments, and perspectives.
- Use straightforward communication, in a structured way, when influencing and connecting with others.
- Read situations and modify behavior to build quality relationships.
- Uphold the firm's code of ethics and business conduct.
- Demonstrate leadership capabilities by working with clients directly and leading the engagement.
- Work in a team environment that includes client interactions, workstream management, and cross-team collaboration.
- Be a good team player; take up cross-competency work and contribute to COE activities.
- Handle escalation/risk management.
Position Requirements
Required Skills:
Primary skills: ETL/ELT, SQL, Informatica, Python
Secondary skills: Azure/AWS/GCP, Talend, DataStage, etc.

Data Engineer
- Should have a minimum of 1 year of Operate/Managed Services/Production Support experience.
- Should have extensive experience in developing scalable, repeatable, and secure data structures and pipelines to ingest, store, collect, standardize, and integrate data for downstream consumption in Business Intelligence systems, analytics modelling, data science, etc.
- Design and implement data pipelines to extract, transform, and load (ETL) data from various sources into data storage systems, such as data warehouses or data lakes.
- Should have experience building efficient ETL/ELT processes using industry-leading tools like Informatica, Talend, SSIS, SSRS, AWS, Azure, ADF, GCP, Snowflake, Spark, SQL, Python, etc.
- Should have hands-on experience with data analytics tools like Informatica, Collibra, Hadoop, Spark, Snowflake, etc.
- Monitor and troubleshoot data pipelines, resolving issues related to data processing, transformation, or storage (a sketch of such a quality gate follows this posting).
- Implement and maintain data security and privacy measures, including access controls and encryption, to protect sensitive data.
- Advanced working SQL knowledge and experience with relational databases and query authoring (SQL), as well as working familiarity with a variety of databases.
- Scale and optimize schemas, and performance-tune SQL and ETL pipelines in data lake and data warehouse environments.
- Should have experience with ITIL processes like incident management, problem management, knowledge management, release management, Data DevOps, etc.
- Should have strong communication, problem-solving, quantitative, and analytical abilities.

Nice To Have
- Certifications in cloud technology are an added advantage.
- Experience in visualization tools like Power BI, Tableau, Qlik, etc.

Managed Services – Data, Analytics & Insights
At PwC we relentlessly focus on working with our clients to bring the power of technology and humans together and create simple, yet powerful solutions. We imagine a day when our clients can simply focus on their business knowing that they have a trusted partner for their IT needs. Every day we are motivated and passionate about making our clients better. Within our Managed Services platform, PwC delivers integrated services and solutions that are grounded in deep industry experience and powered by the talent that you would expect from the PwC brand. The PwC Managed Services platform delivers scalable solutions that add greater value to our client’s enterprise through technology and human-enabled experiences. Our team of highly skilled and trained global professionals, combined with the use of the latest advancements in technology and process, allows us to provide effective and efficient outcomes. With PwC’s Managed Services our clients are able to focus on accelerating their priorities, including optimizing operations and accelerating outcomes. PwC brings a consultative-first approach to operations, leveraging our deep industry insights combined with world-class talent and assets to enable transformational journeys that drive sustained client outcomes. Our clients need flexible access to world-class business and technology capabilities that keep pace with today’s dynamic business environment.
Within our global Managed Services platform, we provide the Data, Analytics & Insights Managed Service, where we focus on the evolution of our clients’ data, analytics, insights, and cloud portfolio. Our focus is to empower our clients to navigate and capture the value of their application portfolio while cost-effectively operating and protecting their solutions. We do this so that our clients can focus on what matters most to their business: accelerating growth that is dynamic, efficient, and cost-effective. As a member of our Data, Analytics & Insights Managed Service team, we are looking for candidates who thrive working in a high-paced environment, capable of working on a mix of critical Application Evolution Service offerings and engagements, including help desk support, enhancement and optimization work, as well as strategic roadmap and advisory-level work. It will also be key to lend experience and effort to helping win and support customer engagements from not only a technical perspective, but also a relationship perspective.
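A minimal, hedged sketch of the kind of pre-load data-quality gate the requirements above describe: null-rate, duplicate, and emptiness checks on a batch before it moves downstream. The thresholds, key column, and sample data are illustrative only.

```python
import pandas as pd

def validate_batch(df: pd.DataFrame, key: str = "customer_id") -> list:
    """Return a list of data-quality problems; an empty list means safe to load."""
    issues = []
    if df.empty:
        issues.append("batch is empty")
        return issues
    if key not in df.columns:
        issues.append(f"missing key column {key!r}")
        return issues
    null_rate = df[key].isna().mean()
    if null_rate > 0.01:                       # illustrative 1% threshold
        issues.append(f"{key} null rate {null_rate:.1%} exceeds the 1% threshold")
    if df.duplicated(subset=[key]).any():
        issues.append(f"duplicate values found in {key}")
    return issues

batch = pd.DataFrame({"customer_id": [1, 2, 2, None], "amount": [10.0, 20.0, 20.0, 5.0]})
for problem in validate_batch(batch):
    print("load blocked:", problem)
```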
Posted 3 weeks ago
8.0 - 12.0 years
0 Lacs
pune, maharashtra
On-site
As a Tech Lead, Data Architecture at Fiserv, you will play a crucial role in our data warehousing strategy and implementation. Your responsibilities will include designing, developing, and leading the adoption of Snowflake-based solutions to ensure efficient and secure data systems that drive our business analytics and decision-making processes. Collaborating with cross-functional teams, you will define and implement best practices for data modeling, schema design, and query optimization in Snowflake. Additionally, you will develop and manage ETL/ELT workflows to ingest, transform, and load data from various sources into Snowflake, integrating data from diverse systems such as databases, APIs, flat files, and cloud storage.

Monitoring and tuning Snowflake performance, you will manage caching, clustering, and partitioning to enhance efficiency while analyzing and resolving query performance bottlenecks. You will work closely with data analysts, data engineers, and business users to understand reporting and analytics needs, ensuring seamless integration with BI tools like Power BI. Your role will also involve collaborating with the DevOps team on automation, deployment, and monitoring, as well as planning and executing strategies for scaling Snowflake environments as data volumes grow. Keeping up to date with emerging trends and technologies in data warehousing and data management is essential, along with providing technical support, troubleshooting, and guidance to users accessing the data warehouse.

To be successful in this role, you must have 8 to 10 years of experience with data management tools like Snowflake, StreamSets, and Informatica. Experience with monitoring tools like Dynatrace and Splunk, Kubernetes cluster management, and Linux OS is required. Additionally, familiarity with containerization technologies, cloud services, and CI/CD pipelines, as well as banking or financial services experience, would be advantageous.

Thank you for considering employment with Fiserv. To apply, please use your legal name, complete the step-by-step profile, and attach your resume. Fiserv is committed to diversity and inclusion and does not accept resume submissions from agencies outside of existing agreements. Beware of fraudulent job postings not affiliated with Fiserv, and protect your personal information and financial security.
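For illustration, here is a minimal sketch of the kind of Snowflake clustering and query-tuning work this role describes, using the snowflake-connector-python package. The account details, table name, and clustering column are hypothetical placeholders, not Fiserv's actual environment.

```python
# Hedged sketch: add a clustering key, check clustering health, and list
# slow queries as tuning candidates. All identifiers are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # hypothetical account identifier
    user="etl_user",
    password="***",
    warehouse="ANALYTICS_WH",
    database="EDW",
    schema="SALES",
)
cur = conn.cursor()

# A clustering key lets range scans on EVENT_DATE prune micro-partitions.
cur.execute("ALTER TABLE FACT_TRANSACTIONS CLUSTER BY (EVENT_DATE)")

# Inspect how well the table is clustered on that key.
cur.execute(
    "SELECT SYSTEM$CLUSTERING_INFORMATION('FACT_TRANSACTIONS', '(EVENT_DATE)')"
)
print(cur.fetchone()[0])

# Surface the slowest recent queries as candidates for optimization.
cur.execute("""
    SELECT query_id, total_elapsed_time, query_text
    FROM TABLE(INFORMATION_SCHEMA.QUERY_HISTORY())
    ORDER BY total_elapsed_time DESC
    LIMIT 10
""")
for row in cur.fetchall():
    print(row)

cur.close()
conn.close()
```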
Posted 3 weeks ago
3.0 - 7.0 years
0 Lacs
haryana
On-site
You will be joining KPMG in India, a professional services firm affiliated with KPMG International Limited since its establishment in August 1993. Leveraging a global network of firms, we provide services to national and international clients across various sectors, with offices in multiple cities across India. As part of the Financial Crimes specialist team, your role will involve providing solutions to BFSI clients through model validation testing for AML risk models and frameworks, sanctions screening, and transaction monitoring systems. We are seeking individuals with advanced data science and analytics skills to support our team in addressing the challenges associated with financial crime.

Your responsibilities will include, but are not limited to, supporting functional SME teams in building data-driven Financial Crimes solutions, conducting statistical testing of screening matching algorithms and risk rating models, validating data models of AML systems, and developing AML models to detect suspicious activities and transactions. You will collaborate with cross-functional teams to analyze data for model development and validation, prepare detailed documentation and reports, and assist in feature engineering for the automation of AML-related investigations.

To qualify for this role, you should have a Bachelor's degree from an accredited university, at least 3 years of hands-on experience in Python with knowledge of frameworks such as FastAPI, Django, Tornado, or Flask, experience with relational and NoSQL databases, proficiency in BI tools such as Power BI and Tableau, and an educational background in Data Science and Statistics. Additionally, expertise in machine learning algorithms and statistical analysis, as well as familiarity with regulatory guidelines for AML compliance, are essential. Preferred qualifications include experience in AML model validation, statistical testing of risk models, and familiarity with AML technology platforms. Hands-on experience with data analytics tools like Informatica and Kafka is also desirable.

If you are looking to contribute your advanced analytics skills to combat financial crime and support leading financial institutions in adhering to industry best practices, this role offers the opportunity to work on challenging projects and develop professionally in a dynamic environment.
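As a toy illustration of the transaction-monitoring modelling this role touches on, here is a hedged sketch using scikit-learn's IsolationForest on synthetic data. The features, contamination rate, and thresholds are hypothetical; real AML models are validated against regulatory guidelines and far richer feature sets.

```python
# Illustrative only: anomaly detection over synthetic transaction features.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Hypothetical features: amount, txns in last 24h, distinct counterparties.
normal = rng.normal(loc=[200, 3, 2], scale=[80, 1.5, 1], size=(1000, 3))
suspicious = rng.normal(loc=[9000, 40, 25], scale=[2000, 8, 5], size=(10, 3))
X = np.vstack([normal, suspicious])

model = IsolationForest(contamination=0.01, random_state=42)
model.fit(X)

flags = model.predict(X)  # -1 = anomalous, 1 = normal
print(f"Flagged {np.sum(flags == -1)} of {len(X)} transactions for review")
```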
Posted 3 weeks ago
1.0 - 5.0 years
0 Lacs
karnataka
On-site
As a Data Engineer/ETL Developer specializing in Talend and Power BI, your primary responsibility will be to study, analyze, and understand business requirements in the context of business intelligence, and to provide efficient end-to-end solutions for them. Your role will involve designing and implementing ETL pipelines that ensure data quality and integrity across platforms such as Talend Enterprise and Informatica. You will load data from diverse sources including Oracle, MSSQL, file systems, FTP services, and REST APIs. Additionally, you will design and map data models that transform raw data into actionable insights, and develop comprehensive data documentation covering algorithms, parameters, and models. Analyzing historical and current data will be crucial for facilitating informed decision-making.

To enhance the existing business intelligence systems, you will make necessary technical enhancements and optimize ETL processes for improved performance, as well as monitor ETL jobs and troubleshoot any arising issues. In a leadership capacity, you will oversee and guide the team's deliverables, ensuring adherence to best development practices, and will lead requirements gathering and analysis.

In terms of prerequisites, you should have up to 3 years of overall work experience with a focus on SQL and ETL, particularly Talend. A minimum of 1 year of experience with Talend Enterprise/Open Studio and related tools such as Talend API, Talend Data Catalog, TMC, and TAC is a must, along with proficiency in database design and data modeling and hands-on experience in a coding language such as Java or Python.

Desirable skills include familiarity with a BI tool like MS Power BI and the ability to use Power BI to create interactive and visually appealing dashboards and reports. Strong analytical skills, effective written and verbal communication, self-motivation, and a results-oriented approach are highly valued, as are advanced problem-solving skills, the capacity to work independently with a high level of accountability, and the ability to navigate complex distributed applications. Experience in multicultural environments is an added advantage.

This is a full-time, permanent position with a Monday-to-Friday schedule on the UK shift. The work location is in person.
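Talend handles the REST-to-database loading pattern above through visual jobs, but the equivalent logic in plain Python gives a feel for what such a pipeline does. This is a minimal sketch; the API endpoint, field names, and SQLite target are hypothetical stand-ins.

```python
# Hedged sketch of an extract-transform-load step from a REST API.
import sqlite3
import requests

API_URL = "https://api.example.com/v1/orders"  # hypothetical endpoint

def extract():
    resp = requests.get(API_URL, timeout=30)
    resp.raise_for_status()
    return resp.json()["orders"]

def transform(orders):
    # Keep only completed orders and normalise field names/types.
    return [
        (o["id"], o["customer_id"], float(o["total"]))
        for o in orders
        if o.get("status") == "COMPLETED"
    ]

def load(rows):
    with sqlite3.connect("warehouse.db") as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS orders "
            "(id TEXT PRIMARY KEY, customer_id TEXT, total REAL)"
        )
        conn.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?, ?)", rows)

if __name__ == "__main__":
    load(transform(extract()))
```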
Posted 3 weeks ago
4.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Snowflake Developer

We're looking for a highly skilled Snowflake Developer with strong expertise in the Snowflake Data Cloud to join our data engineering team. SnowPro certification is mandatory, and professionals with 4+ years of experience (including 3+ years on Snowflake) will be prioritized. If you're passionate about building scalable data solutions on modern cloud platforms, we'd love to connect.

Roles & Responsibilities:
Build and manage scalable data pipelines on the Snowflake Data Cloud.
Develop ETL/ELT workflows using tools like ADF, AWS Glue, Informatica, or Talend.
Orchestrate data workflows using Airflow, Control-M, or similar tools.
Write advanced SQL and Python scripts (Pandas, PySpark, Snowpark) for data transformation.
Optimize pipeline performance and ensure data quality and reliability.
Collaborate with cross-functional teams to deliver clean, structured data for analytics.
Work with data modeling tools (dbt, Erwin) and integration tools (Fivetran, Stitch) as needed.

Must-Have Skills:
Snowflake Cloud Platform - strong hands-on experience.
ETL/ELT tools - experience with one or more of Azure Data Factory, AWS Glue, Informatica, Talend, or Qlik.
Orchestration - proficiency with tools like Apache Airflow, Control-M, or Tidal.
Advanced SQL.
Python, including working with data frames using Pandas, PySpark, or Snowpark.
Data engineering concepts - strong knowledge of data pipelines, data wrangling, and optimization.

Good-to-Have Skills:
SQL scripting and procedural logic.
Data modeling tools (e.g., Erwin, dbt).
Integration tools like Fivetran, Stitch.

Note: Immediate and serving-notice-period candidates are preferred. (ref:hirist.tech)
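To make the orchestration responsibility concrete, here is a minimal Apache Airflow sketch of a two-task daily ELT DAG. The DAG id, task names, and stubbed bodies are hypothetical, and the `schedule` argument assumes Airflow 2.4 or later.

```python
# Hedged sketch: a daily extract-then-transform DAG in Apache Airflow.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_to_stage(**context):
    ...  # e.g. copy source files into a Snowflake stage

def run_elt_transform(**context):
    ...  # e.g. execute Snowpark/SQL transformations inside Snowflake

with DAG(
    dag_id="daily_snowflake_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(
        task_id="extract_to_stage", python_callable=extract_to_stage
    )
    transform = PythonOperator(
        task_id="run_elt_transform", python_callable=run_elt_transform
    )
    extract >> transform  # transform runs only after extract succeeds
```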
Posted 3 weeks ago
9.0 - 13.0 years
0 Lacs
hyderabad, telangana
On-site
You have a great opportunity to join our team as a Data Architect with 9+ years of experience. In this role, you will be responsible for designing, implementing, and managing cloud-based solutions on AWS and Snowflake. Your main tasks will include working with stakeholders to gather requirements, designing solutions, developing and executing test plans, and overseeing the information architecture for the data warehouse.

To excel in this role, you must have strong skills in Snowflake and dbt, and data architecture design experience in data warehousing. Informatica or other ETL knowledge or hands-on experience, as well as an understanding of Databricks, would be beneficial. You should have 9-11 years of IT experience, with 3+ years of data architecture experience in data warehousing and 4+ years in Snowflake.

As a Data Architect, you will optimize Snowflake configurations and data pipelines to improve performance, scalability, and overall efficiency. You should have a deep understanding of data warehousing, enterprise architectures, dimensional modeling, star and snowflake schema design, reference DW architectures, ETL architecture, ETL (Extract, Transform, Load), data analysis, data conversion and transformation, database design, data warehouse optimization, data mart development, and enterprise data warehouse maintenance and support.

In addition to your technical responsibilities, you will maintain detailed documentation for data solutions and processes, provide training and leadership to share expertise and best practices with the team, and collaborate with the data engineering team to ensure that data solutions are developed according to best practices.

If you have 10+ years of overall experience in architecting and building large-scale, distributed big data products, expertise in designing and implementing highly scalable, highly available cloud services and solutions, experience with AWS and Snowflake, and a strong understanding of data warehousing and data engineering principles, then this role is perfect for you.

This is a full-time position based in Hyderabad, Telangana, with a Monday-to-Friday work schedule, so you must be able to reliably commute or plan to relocate before starting work. As part of the application process, we would like to know your notice period, years of experience in Snowflake, data architecture experience in data warehousing, current location, willingness to work from the office in Hyderabad, current CTC, and expected CTC. A total of 9 years of work experience is required for this position. If you meet the requirements and are excited about this opportunity, we look forward to receiving your application.
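As a hedged sketch of the star-schema design work this role emphasizes, the snippet below expresses one fact table with two conforming dimensions as DDL that could be applied through any DB-API 2.0 connection (for example, snowflake.connector). Every table and column name is a hypothetical illustration.

```python
# Hedged sketch: a minimal star schema (one fact, two dimensions).
DDL_STATEMENTS = [
    """
    CREATE TABLE IF NOT EXISTS DIM_CUSTOMER (
        customer_key INTEGER PRIMARY KEY,
        customer_id  VARCHAR,
        segment      VARCHAR,
        region       VARCHAR
    )
    """,
    """
    CREATE TABLE IF NOT EXISTS DIM_DATE (
        date_key  INTEGER PRIMARY KEY,
        full_date DATE,
        month     INTEGER,
        year      INTEGER
    )
    """,
    """
    CREATE TABLE IF NOT EXISTS FACT_SALES (
        customer_key INTEGER REFERENCES DIM_CUSTOMER (customer_key),
        date_key     INTEGER REFERENCES DIM_DATE (date_key),
        quantity     INTEGER,
        net_amount   NUMBER(12, 2)
    )
    """,
]

def build_star_schema(conn):
    """Apply the schema through any DB-API 2.0 connection."""
    cur = conn.cursor()
    for ddl in DDL_STATEMENTS:
        cur.execute(ddl)
    cur.close()
```

The design choice worth noting: facts carry only surrogate keys and additive measures, so dashboards can slice by any dimension attribute without touching the fact table's grain.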
Posted 3 weeks ago
5.0 - 7.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title: Senior Data Engineer
Experience: 5-7 years
Location: Hyderabad
Mode: Hybrid

Primary Skills: Dimensional Modelling, Python, Data Warehouse Development, Oracle Cloud Infrastructure (OCI) expertise, ODI, ETL tools, Tableau Desktop, Informatica PowerCenter, Data Governance, ADW Management, Agile environments, Python frameworks, Oracle SQL, GitHub.

Job Description:
We are looking for a skilled Data Engineer with a strong background in Oracle Cloud Infrastructure (OCI) and Oracle data integration tools. The ideal candidate will have hands-on experience in building, automating, and managing robust data warehousing and ETL solutions.

Key Responsibilities:
Design and implement scalable data warehouse and data lake architectures using Oracle ADW, OCI Object Storage, and Data Catalog.
Build and maintain ETL pipelines using Oracle Data Integrator (ODI), OCI Data Integration, and Informatica services.
Apply a thorough understanding of data warehousing and dimensional modelling concepts.
Develop and automate data processing workflows using Python, SQL, and PL/SQL.
Support data validation, ingestion, transformation, and publishing processes.
Optimize the performance and cost of data infrastructure on OCI.
Collaborate with business and analytics teams to define data models, KPIs, and metrics.
Implement data governance, security policies, and compliance standards (GxP, HIPAA, etc.).
Work in an Agile/Scrum environment to support project deliverables and production support.

Qualifications & Skills:
Strong experience with Oracle Cloud Infrastructure (OCI): Object Storage, ADW, Data Catalog, Data Integration.
Hands-on experience with ODI, Oracle Autonomous Database, and OIC (Oracle Integration Cloud).
Strong skills in SQL, PL/SQL, Python, and data transformation logic.
Experience with shell scripting and scheduling tools.
Knowledge of data lineage, data quality frameworks, and master data management.
Familiarity with reporting/visualization tools like Tableau or APEX.
Exposure to CI/CD, Git, and DevOps practices in a cloud data engineering setup.

Preferred Qualifications:
Bachelor's degree in Computer Science, Engineering, or a related field.
Oracle Certified Professional (OCP) or OCI Data Engineer certification is a plus.
Experience in Life Sciences/Pharma, BFSI, or regulatory data domains. (ref:hirist.tech)
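One plausible shape for the "data validation" automation named above is a row-count reconciliation between a staging table and its ADW target, written with the python-oracledb driver. The DSN aliases, credentials, and table names below are hypothetical.

```python
# Hedged sketch: source-vs-target row-count reconciliation on Oracle/ADW.
import oracledb

def count_rows(dsn: str, user: str, password: str, table: str) -> int:
    with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
        with conn.cursor() as cur:
            cur.execute(f"SELECT COUNT(*) FROM {table}")
            return cur.fetchone()[0]

source_count = count_rows("source_db_high", "etl_user", "***", "SALES_STAGING")
target_count = count_rows("adw_high", "etl_user", "***", "DW.FACT_SALES")

if source_count != target_count:
    raise RuntimeError(
        f"Reconciliation failed: source={source_count}, target={target_count}"
    )
print(f"Reconciliation passed: {target_count} rows")
```

In practice a check like this would be one task in a scheduled workflow, failing the pipeline before bad counts reach published reports.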
Posted 3 weeks ago
2.0 - 6.0 years
0 Lacs
karnataka
On-site
At PwC, our managed services team focuses on providing outsourced solutions and support to clients across various functions. We help organizations streamline operations, reduce costs, and enhance efficiency by managing key processes and functions on their behalf. Our team is skilled in project management, technology, and process optimization to deliver high-quality services to clients. Those in managed service management and strategy at PwC are responsible for transitioning and running services; managing delivery teams, programs, commercials, performance, and delivery risk; and continuously improving and optimizing managed services processes, tools, and services.

As an Associate at PwC, you will work as part of a team of problem solvers, assisting in solving complex business issues from strategy to execution. Professional skills and responsibilities at this level include using feedback and reflection to develop self-awareness, demonstrating critical thinking, and bringing order to unstructured problems. You will be involved in ticket quality review, status reporting for projects, adherence to SLAs, incident management, change management, and problem management. Additionally, you will seek exposure to different situations, environments, and perspectives; uphold the firm's code of ethics; demonstrate leadership capabilities; and work in a team environment that includes client interactions and cross-team collaboration.

Required Skills:
AWS Cloud Engineer.
Minimum 2 years of hands-on experience in building advanced data warehousing solutions on leading cloud platforms.
Minimum 1-3 years of Operate/Managed Services/Production Support experience.
Extensive experience in developing scalable, repeatable, and secure data structures and pipelines.
Designing and implementing data pipelines for data ingestion, processing, and transformation in AWS.
Building efficient ETL/ELT processes using industry-leading tools like AWS, PySpark, SQL, Python, etc.
Implementing data validation and cleansing procedures.
Monitoring and troubleshooting data pipelines.
Implementing and maintaining data security and privacy measures.
Strong communication, problem-solving, quantitative, and analytical abilities.

Nice To Have:
AWS certification.

In our Managed Services platform, we deliver integrated services and solutions grounded in deep industry experience and powered by talent. Our team provides scalable solutions that add value to our clients' enterprises through technology and human-enabled experiences. We focus on empowering clients to navigate and capture the value of their Data & Analytics portfolio while cost-effectively operating and protecting their solutions. As a member of our Data, Analytics & Insights Managed Service team, you will work on critical offerings, help desk support, enhancement and optimization work, and strategic roadmap and advisory-level work. Your contribution will be crucial in supporting customer engagements both technically and relationally.
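For a concrete feel of the ingestion-and-cleansing pipeline pattern this role lists, here is a minimal PySpark sketch (the engine underlying AWS Glue jobs). The bucket paths and column names are hypothetical, and running against S3 assumes the appropriate Hadoop/AWS connectors are on the classpath.

```python
# Hedged sketch: read raw CSVs, de-duplicate and cleanse, write curated parquet.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("s3_cleansing_job").getOrCreate()

raw = spark.read.option("header", True).csv("s3://raw-bucket/events/")

cleaned = (
    raw.dropDuplicates(["event_id"])                 # de-duplicate on key
       .filter(F.col("event_ts").isNotNull())        # drop incomplete rows
       .withColumn("amount", F.col("amount").cast("double"))
)

cleaned.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://curated-bucket/events/"
)
```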
Posted 4 weeks ago
8.0 - 12.0 years
0 Lacs
chennai, tamil nadu
On-site
You will be responsible for one of the following roles:

For the Data Governance Lead:
You should have a minimum of 8-10 years of experience with the Informatica platform, with expertise in Informatica, AXON, and EDC. The job requires working from the Chennai office all 5 days, with an immediate to 15-day notice period. Your responsibilities will include sizing hardware, installing and configuring Informatica products on-premises and in the cloud, administering and configuring Informatica products, and handling EDC setup. You must have hands-on experience in EDC development activities such as data discovery, data domain creation, data profiling, data lineage, and data curation, and experience architecting data governance solutions using the Informatica tool AXON. Integration with other tools and with Informatica tools like EDC and IDQ is necessary, along with the ability to understand business context and translate it into AXON templates and facets.

For the MDM Lead:
We are seeking a resource with 8+ years of experience, including 5+ years of relevant experience in MDM. You will be responsible for developing and configuring Informatica MDM on-premises solutions, including Hub Console, IDD, Match & Merge, and Hierarchy Manager. The job requires working 5 days from the Chennai office, with an immediate to 20-day notice period expected. Experience with SQL, PL/SQL, and relational databases such as Oracle and Teradata is required, along with an understanding of data modelling, data integration, and ETL processes.

Both roles are full-time, permanent positions with benefits including health insurance, provident fund, and a yearly bonus. The work schedule includes day, evening, fixed, and morning shifts, Monday to Friday. To apply, you should have over 8 years of relevant experience and be able to join within 0-15 days, as the work location is in person at our Chennai office.
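To illustrate the match-and-merge idea at the heart of MDM, here is a toy sketch using Python's standard-library fuzzy matching. Real Informatica MDM match rules are far richer (token-level, phonetic, weighted); the records and the 0.85 threshold here are hypothetical.

```python
# Illustrative only: flag candidate duplicate master records for merging.
from difflib import SequenceMatcher

records = [
    {"id": 1, "name": "Acme Industries Ltd"},
    {"id": 2, "name": "ACME Industries Limited"},
    {"id": 3, "name": "Globex Corporation"},
]

def similarity(a: str, b: str) -> float:
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

MATCH_THRESHOLD = 0.85  # hypothetical tuning parameter

matches = [
    (r1["id"], r2["id"])
    for i, r1 in enumerate(records)
    for r2 in records[i + 1:]
    if similarity(r1["name"], r2["name"]) >= MATCH_THRESHOLD
]
print(matches)  # candidate pairs to merge into a golden record, e.g. [(1, 2)]
```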
Posted 4 weeks ago
7.0 - 11.0 years
0 Lacs
karnataka
On-site
You will be working as a Data Engineer with expertise in Python and PySpark programming. You should have a strong background in cloud services such as Snowflake, Databricks, Informatica, Azure, AWS, and GCP, as well as proficiency in reporting technologies like Power BI, Tableau, Spotfire, Alteryx, and MicroStrategy. Your responsibilities will include developing and maintaining data pipelines, optimizing data workflows, and ensuring the efficiency and reliability of data integration processes.

Strong programming skills in Python and PySpark, along with a deep understanding of SQL, are essential, as is experience with Snowflake, Databricks, Power BI, MicroStrategy, Tableau, and Spotfire. Familiarity with Informatica and Azure/AWS services would be advantageous. The interview process will be conducted virtually, and the work model for this position is remote. If you have 7-10 years of experience in this field and are available to start within 15 days, please apply by sending your resume to netra.s@twsol.com.
Posted 4 weeks ago
5.0 - 9.0 years
0 Lacs
jaipur, rajasthan
On-site
As an experienced data engineer specializing in dashboard story development and data engineering pipelines, you will play a crucial role in analyzing log data to extract actionable insights for product enhancements and feature optimization. With 5+ years of hands-on experience, you will collaborate with cross-functional teams to gather business requirements, translate them into technical specifications, and design interactive dashboards using tools like Tableau, Power BI, or ThoughtSpot AI.

You will be responsible for managing large volumes of application log data using Google BigQuery, ensuring data integrity, consistency, and accessibility for analytical purposes. Your expertise in identifying patterns, trends, and anomalies in log data will be instrumental in visualizing key metrics and insights, and in communicating findings effectively with customer success and leadership teams. In addition, you will work closely with product teams to understand log data generated by Python-based applications, define key performance indicators (KPIs), and optimize data pipelines and storage in BigQuery. Strong communication, teamwork, and problem-solving skills, along with the ability to learn quickly and adapt to new technologies, are essential in this role.

Preferred qualifications include knowledge of Generative AI (GenAI) and LLM-based solutions, and experience with ThoughtSpot AI, Google Cloud Platform (GCP), and modern data warehouse architectures. You will also have the opportunity to participate in proof-of-concepts (POCs) and pilot projects, articulate ideas clearly to the team, and take ownership of data analytics and engineering solutions. Additional nice-to-have qualifications include experience working with large datasets and distributed data processing tools like Apache Spark or Hadoop, and familiarity with Agile development methodologies and ETL tools such as Informatica or Azure Data Factory. This full-time position in the IT Services and IT Consulting industry offers a dynamic environment where you can leverage your skills to drive meaningful business outcomes.
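A hedged sketch of the BigQuery log-analysis work described above, using the official google-cloud-bigquery client: it aggregates a week of application events by feature and counts errors, the kind of query that would feed a dashboard KPI. The project, dataset, and field names are hypothetical, and the client assumes GCP credentials are configured in the environment.

```python
# Hedged sketch: weekly per-feature event and error counts from app logs.
from google.cloud import bigquery

client = bigquery.Client()  # assumes ambient GCP credentials

QUERY = """
    SELECT
      feature_name,
      COUNT(*)                    AS events,
      COUNTIF(severity = 'ERROR') AS errors
    FROM `my-project.app_logs.events`
    WHERE DATE(timestamp) >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
    GROUP BY feature_name
    ORDER BY events DESC
"""

for row in client.query(QUERY).result():
    print(row.feature_name, row.events, row.errors)
```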
Posted 4 weeks ago
1.0 - 5.0 years
0 Lacs
hyderabad, telangana
On-site
The ideal candidate for this role should have at least 4 years of experience as an ETL/Informatica developer, including a minimum of 1 year working with Snowflake and 1 year with IICS. You should have hands-on experience developing specifications, test scripts, and code coverage for all integrations, and be adept at supporting the migration of integration code from lower to higher environments, such as production.

In this role, you will be responsible for full and incremental ETL using Informatica PowerCenter. Your expertise in developing ETL/Informatica for data warehouse integration from various data sources will be valuable, as will experience supporting integration configurations with iPaaS through connected apps or web services. The ability to work within an Agile framework is a must, and the successful candidate should be willing to be on call for selected off-shift hours.

If you meet the requirements and are interested in this onsite position located in Hyderabad, please share your resume with bhavana@ketsoftware.com or call 91828 22519.
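One common shape for the "full and incremental ETL" this posting mentions is a high-watermark incremental load. Below is a hedged sketch in plain Python/SQL; sqlite3 stands in for any DB-API source and target, and the table, column, and feed names are hypothetical (the `orders_dw` and `etl_watermark` tables are assumed to exist).

```python
# Hedged sketch: incremental load driven by a recorded high watermark.
import sqlite3  # stand-in for any DB-API source/target connection

def incremental_load(src: sqlite3.Connection, tgt: sqlite3.Connection):
    # 1. Read the last watermark recorded for this feed.
    row = tgt.execute(
        "SELECT MAX(loaded_through) FROM etl_watermark WHERE feed = 'orders'"
    ).fetchone()
    watermark = row[0] or "1970-01-01 00:00:00"

    # 2. Pull only rows changed since the watermark.
    rows = src.execute(
        "SELECT id, status, updated_at FROM orders WHERE updated_at > ?",
        (watermark,),
    ).fetchall()

    # 3. Upsert into the target and advance the watermark in one transaction.
    with tgt:
        tgt.executemany(
            "INSERT OR REPLACE INTO orders_dw (id, status, updated_at) "
            "VALUES (?, ?, ?)",
            rows,
        )
        tgt.execute(
            "INSERT INTO etl_watermark (feed, loaded_through) "
            "VALUES ('orders', datetime('now'))"
        )
```

A full load is the degenerate case: reset the watermark to the epoch and the same routine reloads everything.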
Posted 4 weeks ago