2.0 years
0 Lacs
Gurugram, Haryana, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Job Description – Business Intelligence Developer (OAC, PowerBI, ETL, Data Modelling)
Competency: Oracle ERP Analytics

We are seeking an experienced Business Intelligence Developer with 2+ years of expertise in Oracle Analytics Cloud (OAC), PowerBI, ETL tools, and Data Modelling to join our dynamic team. The successful candidate will be responsible for developing and maintaining scalable data models, creating insightful analytics dashboards, and managing ETL workflows to support data-driven decision-making across the organization. They will work closely with customers, data architects, software developers, and business analysts on product development. The candidate will be a highly skilled individual and will be accountable for their career development and growth at EY.

Responsibilities:
- Collaborate with stakeholders to understand data requirements and translate business needs into data models.
- Design and implement effective data models to support business intelligence activities.
- Develop and maintain ETL processes to ensure data accuracy and availability.
- Create interactive dashboards and reports using Oracle Analytics Cloud (OAC) and PowerBI.
- Work with stakeholders to gather requirements and translate business needs into technical specifications.
- Optimize data retrieval and develop dashboard visualizations for performance efficiency.
- Ensure data integrity and compliance with data governance and security policies.
- Collaborate with IT and data teams to integrate BI solutions into the existing data infrastructure.
- Conduct data analysis to identify trends, patterns, and insights that can inform business strategies.
- Provide training and support to end users on BI tools and dashboards.
- Document all processes, models, and activities to maintain transparency and facilitate knowledge sharing.
- Stay up to date with the latest BI technologies and best practices to drive continuous improvement.

Qualifications:
- Bachelor’s degree in Computer Science, Information Systems, Business Analytics, or a related field.
- Proven experience with Oracle Analytics Cloud (OAC), PowerBI, and other BI tools.
- Strong experience in ETL processes (SSIS, Informatica, Dell Boomi, etc.) and data warehousing solutions.
- Proficiency in data modelling techniques and best practices.
- Solid understanding of SQL and experience with relational databases.
- Familiarity with cloud platforms and services (e.g., AWS, Azure, Google Cloud).
- Excellent analytical, problem-solving, and project management skills.
- Ability to communicate complex data concepts to non-technical stakeholders.
- Detail-oriented, with a strong focus on accuracy and a commitment to quality.
- Well-developed business acumen and a strong analytical, problem-solving attitude, with the ability to visualize scenarios, possible outcomes and operating constraints.
- Strong consulting skills, with proven experience in client and stakeholder management and strong collaboration abilities.
- Good written and oral communication skills, the ability to make impactful presentations, and expertise in Excel and PowerPoint.
- Good to have: knowledge of data security and controls to address customers’ data privacy needs in line with regional regulations such as GDPR, CCPA, etc.

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 4 weeks ago
3.0 - 5.0 years
4 - 8 Lacs
Chennai, Coimbatore, Bengaluru
Work from Office
Primary Skill: ETL Testing. Secondary Skill: Azure.

5+ years of data warehouse testing experience and 2+ years of Azure Cloud experience. Strong understanding of data marts and data warehouse concepts. Expert in SQL, with the ability to create source-to-target comparison test cases in SQL. Creation of test plans, test cases, traceability matrices, and closure reports. Proficient with PySpark, Python, Git, Jira, and JTM.

Location: Pune, Chennai, Coimbatore, Bangalore. Band: B2 and B3.

Mandatory Skills: ETL Testing. Experience: 3-5 Years.
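Source-to-target comparison tests of the kind this posting describes are typically written as set-difference queries run in both directions. A minimal sketch, using an in-memory SQLite database purely for illustration; the src_orders and tgt_orders tables are hypothetical stand-ins for real source and target systems:

```python
import sqlite3

# In-memory database standing in for the warehouse; in practice the
# source and target would live in separate systems (e.g. Azure SQL, Synapse).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, amount REAL);
    CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 100.0), (2, 250.0), (3, 75.5);
    INSERT INTO tgt_orders VALUES (1, 100.0), (2, 999.0);  -- row 2 corrupted, row 3 missing
""")

# Set difference in both directions: source rows missing or changed in the
# target, and target rows with no matching source row.
diff_sql = """
    SELECT 'missing_or_changed_in_target' AS issue, order_id, amount
    FROM (SELECT * FROM src_orders EXCEPT SELECT * FROM tgt_orders)
    UNION ALL
    SELECT 'unexpected_in_target', order_id, amount
    FROM (SELECT * FROM tgt_orders EXCEPT SELECT * FROM src_orders)
"""
for issue, order_id, amount in conn.execute(diff_sql):
    print(f"{issue}: order_id={order_id}, amount={amount}")
# A passing test expects this query to return zero rows.
```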
Posted 4 weeks ago
2.0 years
0 Lacs
Kanayannur, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Job Description – Business Intelligence Developer (OAC, PowerBI, ETL, Data Modelling)
Competency: Oracle ERP Analytics

We are seeking an experienced Business Intelligence Developer with 2+ years of expertise in Oracle Analytics Cloud (OAC), PowerBI, ETL tools, and Data Modelling to join our dynamic team. The successful candidate will be responsible for developing and maintaining scalable data models, creating insightful analytics dashboards, and managing ETL workflows to support data-driven decision-making across the organization. They will work closely with customers, data architects, software developers, and business analysts on product development. The candidate will be a highly skilled individual and will be accountable for their career development and growth at EY.

Responsibilities:
- Collaborate with stakeholders to understand data requirements and translate business needs into data models.
- Design and implement effective data models to support business intelligence activities.
- Develop and maintain ETL processes to ensure data accuracy and availability.
- Create interactive dashboards and reports using Oracle Analytics Cloud (OAC) and PowerBI.
- Work with stakeholders to gather requirements and translate business needs into technical specifications.
- Optimize data retrieval and develop dashboard visualizations for performance efficiency.
- Ensure data integrity and compliance with data governance and security policies.
- Collaborate with IT and data teams to integrate BI solutions into the existing data infrastructure.
- Conduct data analysis to identify trends, patterns, and insights that can inform business strategies.
- Provide training and support to end users on BI tools and dashboards.
- Document all processes, models, and activities to maintain transparency and facilitate knowledge sharing.
- Stay up to date with the latest BI technologies and best practices to drive continuous improvement.

Qualifications:
- Bachelor’s degree in Computer Science, Information Systems, Business Analytics, or a related field.
- Proven experience with Oracle Analytics Cloud (OAC), PowerBI, and other BI tools.
- Strong experience in ETL processes (SSIS, Informatica, Dell Boomi, etc.) and data warehousing solutions.
- Proficiency in data modelling techniques and best practices.
- Solid understanding of SQL and experience with relational databases.
- Familiarity with cloud platforms and services (e.g., AWS, Azure, Google Cloud).
- Excellent analytical, problem-solving, and project management skills.
- Ability to communicate complex data concepts to non-technical stakeholders.
- Detail-oriented, with a strong focus on accuracy and a commitment to quality.
- Well-developed business acumen and a strong analytical, problem-solving attitude, with the ability to visualize scenarios, possible outcomes and operating constraints.
- Strong consulting skills, with proven experience in client and stakeholder management and strong collaboration abilities.
- Good written and oral communication skills, the ability to make impactful presentations, and expertise in Excel and PowerPoint.
- Good to have: knowledge of data security and controls to address customers’ data privacy needs in line with regional regulations such as GDPR, CCPA, etc.

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 4 weeks ago
2.0 - 4.0 years
2 - 6 Lacs
Gurugram
Work from Office
About the Opportunity
Job Type / Application deadline: 29 July 2025
Title: Analyst Programmer
Department: WPFH
Location: Gurgaon
Level: 2

Intro: We're proud to have been helping our clients build better financial futures for over 50 years. How have we achieved this? By working together, and supporting each other, all over the world. So, join our [insert name of team/business area] team and feel like you're part of something bigger.

About your team: The successful candidate would join the Data team. The candidate would bring data integration and distribution experience to the Distribution Data and Reporting team and its consumers. The team is responsible for developing new, and supporting existing, middle-tier integration services and business services, and is committed to driving forward the development of leading-edge solutions.

About your role: This role is responsible for liaising with the technical leads, business analysts, and various product teams to design, develop and troubleshoot the ETL jobs for various operational data stores. The role will involve understanding the technical design, development and implementation of ETL and EAI architecture using Informatica / ETL tools. The successful candidate will be able to demonstrate an innovative and enthusiastic approach to technology and problem solving, will display good interpersonal skills, and will show the confidence and ability to interact professionally with people at all levels, exhibiting a high level of ownership within a demanding working environment.

Key Responsibilities:
- Work with technical leads, business analysts and other subject matter experts.
- Understand the data model/design and develop the ETL jobs.
- Sound technical knowledge of Informatica, taking ownership of allocated development activities and working independently.
- Working knowledge of Oracle databases, taking ownership of the underlying SQL for the ETL jobs (under guidance of the technical leads).
- Provide development estimates.
- Implement standards, procedures and best practices for data maintenance, reconciliation and exception management.
- Interact with cross-functional teams to coordinate dependencies and deliverables.

Essential Skills (Technical):
- Deep knowledge and experience of the Informatica PowerCenter tool set (minimum 3 years).
- Experience in Snowflake.
- Experience of source control tools.
- Experience of job scheduling tools such as Control-M.
- Experience in UNIX scripting.
- Strong SQL or PL/SQL experience (minimum 2 years).
- Experience in data warehouse, data mart and ODS concepts.
- Knowledge of data normalisation/OLAP and Oracle performance optimisation techniques.
- 3+ years' experience of either Oracle or SQL Server and its utilities, coupled with experience of UNIX/Windows.

Essential Skills (Functional):
- 3+ years' experience of working within financial organisations, with broad-based business process, application and technology architecture experience.
- Experience with data distribution and access concepts, with the ability to utilise these concepts in realising a proper physical model from a conceptual one.
- Business-facing, with the ability to work alongside data stewards in systems and the business.
- Strong interpersonal, communication and client-facing skills.
- Ability to work closely with cross-functional teams.

About you: B.E./B.Tech/MBA/M.C.A or any other bachelor's degree. At least 3+ years of experience in data integration and distribution. Experience in building web services and APIs. Knowledge of Agile software development life-cycle methodologies.
Posted 4 weeks ago
2.0 - 7.0 years
11 - 15 Lacs
Gurugram
Work from Office
Our world is transforming, and PTC is leading the way. Our software brings the physical and digital worlds together, enabling companies to improve operations, create better products, and empower people in all aspects of their business. Our people make all the difference in our success. Today, we are a global team of nearly 7,000, and our main objective is to create opportunities for our team members to explore, learn, and grow, all while seeing their ideas come to life and celebrating the differences that make us who we are and the work we do possible.

About PTC: PTC (NASDAQ: PTC) enables global manufacturers to drive digital transformation and achieve operational excellence through cutting-edge software solutions. Whether deployed on-premises, in the cloud, or via SaaS, PTC empowers customers to innovate faster, work smarter, and boost performance. At PTC, we don't just imagine a better world, we enable it.

Role Overview: As a Product Specialist, you will be part of a high-performing Technical Support team that helps customers resolve technical issues, understand our products, and maximize the value they receive from our solutions. This role is ideal for someone with a solid technical foundation who is eager to grow in the enterprise software support space. You will learn to work across teams, improve support processes, and grow into a trusted technical advisor for customers.

Key Responsibilities:
- Investigate and troubleshoot customer-reported technical issues.
- Provide timely resolutions or workarounds to ensure customer satisfaction.
- Escalate complex issues to senior engineers or product teams with detailed analysis.
- Document resolutions and contribute to knowledge base articles for customer self-help.
- Collaborate with peers and cross-functional teams to support issue resolution.
- Manage and track assigned cases using Salesforce.
- Participate in internal training, knowledge-sharing sessions, and workshops.
- Follow established processes and contribute to continuous improvement initiatives.
- Be available to work 24x7 on a rotational basis, with willingness to support weekend shifts when scheduled, ensuring readiness for global support needs.

Required Skills & Competencies:
- Basic to intermediate experience with SQL (Oracle or SQL Server preferred).
- Familiarity with application server environments (e.g., Apache Tomcat, web server setups).
- Exposure to Java-based enterprise applications (from a support or academic background).
- Ability to analyze logs, perform root cause analysis, and provide actionable insights.
- Experience with ETL tools (e.g., Informatica, Kettle or IICS).
- Good problem-solving skills with a focus on delivering customer value.
- Strong communication and documentation skills.

Preferred Qualifications (Nice to Have):
- Exposure to UNIX/Linux operating systems and basic command-line knowledge.
- Basic familiarity with cloud platforms like AWS.
- Interest in or foundational knowledge of machine learning concepts.
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 2+ years of relevant experience in technical support, application support, or a similar role.

Why Join PTC? Be part of a company that values learning, inclusion, and innovation. Work with a supportive team and opportunities for skill development and career growth. Benefits include best-in-class insurance policies, generous leave and PTO, flexible work hours and a casual dress code, birthday leave, no probation period, employee stock options, and support for higher education.
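The log-analysis responsibility above usually comes down to grouping error lines and surfacing the most frequent signatures. A minimal sketch in Python; the file name and the timestamp/level/message layout are assumptions to adjust to the actual log format:

```python
import re
from collections import Counter

# Count occurrences of each ERROR signature in an application log.
# The "timestamp timestamp ERROR message" layout is an assumption.
LOG_LINE = re.compile(r"^\S+ \S+ ERROR (?P<message>.*)$")

def top_errors(path: str, n: int = 10) -> list[tuple[str, int]]:
    counts: Counter[str] = Counter()
    with open(path, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            m = LOG_LINE.match(line)
            if m:
                # Collapse variable parts (ids, counters) so similar
                # errors group under one signature.
                signature = re.sub(r"\d+", "<n>", m.group("message").strip())
                counts[signature] += 1
    return counts.most_common(n)

if __name__ == "__main__":
    for message, count in top_errors("app.log"):  # hypothetical log file
        print(f"{count:6d}  {message}")
```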
Posted 4 weeks ago
5.0 - 10.0 years
15 - 16 Lacs
Bengaluru
Work from Office
Location: Bengaluru. Designation: Consultant. Entity: Deloitte Touche Tohmatsu India LLP.

Your potential, unleashed. India's impact on the global economy has increased at an exponential rate, and Deloitte presents an opportunity to unleash and realise your potential amongst cutting-edge leaders and organisations shaping the future of the region, and indeed, the world beyond. At Deloitte, bring your whole self to work, every day. Combine that with our drive to propel with purpose, and you have the perfect playground to collaborate, innovate, grow, and make an impact that matters.

The team: Enterprise technology has to do much more than keep the wheels turning; it is the engine that drives functional excellence and the enabler of innovation and long-term growth. Learn more about ET&P.

Your work profile: As Consultant / Senior Consultant in our Oracle team, you'll build and nurture positive working relationships with teams and clients with the intention to exceed client expectations. We are seeking a Senior Data Engineer with extensive experience in cloud platforms and data engineering tools, with a strong emphasis on Databricks. The ideal candidate will have deep expertise in designing and optimizing data pipelines, building scalable ETL workflows, and leveraging Databricks for advanced analytics and data processing. Experience with Google Cloud Platform is beneficial, particularly in integrating Databricks with cloud storage solutions and data warehouses such as BigQuery. The candidate should have a proven track record of working on data enablement projects across various data domains and be well versed in the Data as a Product approach, ensuring data solutions are scalable, reusable, and aligned with business needs.

Key Responsibilities (a minimal Delta Lake sketch follows this listing):
- Design, develop, and optimize scalable data pipelines using Databricks, ensuring efficient data ingestion, transformation, and processing.
- Implement and manage data storage solutions, including Delta Tables for structured storage and seamless data versioning.
- 5+ years of experience with cloud data services, with a strong focus on Databricks and its integration with Google Cloud Platform storage and analytics tools such as BigQuery.
- Leverage Databricks for advanced data processing, including the development and optimization of data workflows, Delta Live Tables, and ML-based data transformations.
- Monitor and optimize Databricks performance, focusing on cluster configurations, resource utilization, and Delta Table performance tuning.
- Collaborate with cross-functional teams to drive data enablement projects, ensuring scalable, reusable, and efficient solutions using Databricks.
- Apply the Data as a Product / Data as an Asset approach, ensuring high data quality, accessibility, and usability within Databricks environments.
- 5+ years of experience with analytical software and languages, including Spark (Databricks Runtime), Python, and SQL for data engineering and analytics.
- Strong expertise in Data Structures and Algorithms (DSA) and problem solving, enabling efficient design and optimization of data workflows.
- Experienced in CI/CD pipelines using GitHub for automated data pipeline deployments within Databricks.
- Experienced in Agile/Scrum environments, contributing to iterative development processes and collaboration within data engineering teams.
- Experience in data streaming is a plus, particularly leveraging Kafka or Spark Structured Streaming within Databricks.
- Familiarity with other ETL/ELT tools is a plus, such as Qlik Replicate, SAP Data Services, or Informatica, with a focus on integrating these with Databricks.

Qualifications:
- A Bachelor's or Master's degree in Computer Science, Engineering, or a related discipline.
- Over 5 years of hands-on experience in data engineering or a closely related field.
- Proven expertise in AWS and Databricks platforms.
- Advanced skills in data modeling and designing optimized data structures.
- Knowledge of Azure DevOps and proficiency in Scrum methodologies.
- Exceptional problem-solving abilities paired with a keen eye for detail.
- Strong interpersonal and communication skills for seamless collaboration.
- A minimum of one certification in AWS or Databricks, such as Cloud Engineering, Data Services, Cloud Practitioner, Certified Data Engineer, or an equivalent from reputable MOOCs.

Location and way of working: Base location: Bengaluru. This profile involves occasional travel to client locations, or does not involve extensive travel for work. Hybrid is our default way of working; each domain has customised the hybrid approach to its unique needs.

Your role as a Consultant / Senior Consultant: We expect our people to embrace and live our purpose by challenging themselves to identify issues that are most important for our clients, our people, and for society. In addition to living our purpose, Analysts across our organization must strive to be:
- Inspiring: leading with integrity to build inclusion and motivation.
- Committed to creating purpose: creating a sense of vision and purpose.
- Agile: achieving high-quality results through collaboration and team unity.
- Skilled at building diverse capability: developing diverse capabilities for the future.
- Persuasive / influencing: persuading and influencing stakeholders.
- Collaborating: partnering to build new solutions.
- Delivering value: showing commercial acumen.
- Committed to expanding business: leveraging new business opportunities.
- Analytical acumen: leveraging data to recommend impactful approaches and solutions through the power of analysis and visualization.
- Effective communication: holding well-structured and well-articulated conversations to achieve win-win possibilities.
- Engagement management / delivery excellence: effectively managing engagements to ensure timely and proactive execution, as well as course correction, for the success of the engagement.
- Managing change: responding to a changing environment with resilience.
- Managing quality & risk: delivering high-quality results and mitigating risks with utmost integrity and precision.
- Strategic thinking & problem solving: applying a strategic mindset to solve business issues and complex problems.
- Tech savvy: leveraging ethical technology practices to deliver high impact for clients and for Deloitte.
- Empathetic leadership and inclusivity: creating a safe and thriving environment where everyone is valued for who they are, using empathy to understand others and adapting our behaviours and attitudes to become more inclusive.

How you'll grow: Connect for impact. Our exceptional team of professionals across the globe are solving some of the world's most complex business problems, as well as directly supporting our communities, the planet, and each other. Know more in our Global Impact Report and our India Impact Report. Empower to lead: you can be a leader irrespective of your career level.
Our colleagues are characterised by their ability to inspire, support, and provide opportunities for people to deliver their best and grow both as professionals and human beings. Know more about Deloitte and our One Young World partnership.

Inclusion for all: At Deloitte, people are valued and respected for who they are and are trusted to add value to their clients, teams and communities in a way that reflects their own unique capabilities. Know more about the everyday steps you can take to be more inclusive. At Deloitte, we believe in the unique skills, attitude and potential each and every one of us brings to the table to make an impact that matters.

Drive your career: At Deloitte, you are encouraged to take ownership of your career. We recognise there is no one-size-fits-all career path, and global, cross-business mobility and up-/re-skilling are all within the range of possibilities to shape a unique and fulfilling career. Know more about Life at Deloitte.

Everyone's welcome, entrust your happiness to us: Our workspaces and initiatives are geared towards your 360-degree happiness. This includes specific needs you may have in terms of accessibility, flexibility, safety and security, and caregiving. Here's a glimpse of things that are in store for you.

Interview tips: We want job seekers exploring opportunities at Deloitte to feel prepared, confident and comfortable. To help you with your interview, we suggest that you do your research, and know some background about the organisation and the business area you're applying to. Check out recruiting tips from Deloitte professionals.
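As referenced in the responsibilities above, Delta Tables give structured storage with built-in data versioning in a few PySpark calls. A minimal sketch, assuming a Spark environment with the delta-spark package available (on Databricks a configured session already exists as spark); the paths and table contents are hypothetical:

```python
from pyspark.sql import SparkSession

# Local-session configuration for Delta Lake; unnecessary on Databricks.
spark = (
    SparkSession.builder.appName("delta-demo")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Write a batch as a versioned Delta table (creates version 0).
df = spark.createDataFrame([(1, "open"), (2, "closed")], ["ticket_id", "status"])
df.write.format("delta").mode("overwrite").save("/tmp/delta/tickets")

# Overwriting creates version 1; version 0 stays queryable ("time travel").
df2 = spark.createDataFrame([(1, "closed"), (3, "open")], ["ticket_id", "status"])
df2.write.format("delta").mode("overwrite").save("/tmp/delta/tickets")

v0 = spark.read.format("delta").option("versionAsOf", 0).load("/tmp/delta/tickets")
v0.show()  # original rows, despite the later overwrite
```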
Posted 4 weeks ago
8.0 - 13.0 years
13 - 18 Lacs
Kochi, Chennai, Thiruvananthapuram
Work from Office
Skills: Healthcare, Salesforce, Visualforce, Eclipse IDE

Description: Role Proficiency: Act creatively to develop applications and select appropriate technical options, optimizing application development, maintenance and performance by employing design patterns and reusing proven solutions; account for others' developmental activities.

Outcomes:
- Interpret the application/feature/component design and develop it in accordance with specifications.
- Code, debug, test, document and communicate product/component/feature development stages.
- Validate results with user representatives; integrate and commission the overall solution.
- Select appropriate technical options for development, such as reusing, improving or reconfiguring existing components, or creating own solutions.
- Optimise efficiency, cost and quality.
- Influence and improve customer satisfaction.
- Set FAST goals for self/team; provide feedback on FAST goals of team members.

Measures of Outcomes:
- Adherence to engineering process and standards (coding standards)
- Adherence to project schedule/timelines
- Number of technical issues uncovered during the execution of the project
- Number of defects in the code
- Number of defects post delivery
- Number of non-compliance issues
- On-time completion of mandatory compliance trainings

Outputs Expected:
- Code: code as per design; follow coding standards, templates and checklists; review code for team and peers.
- Documentation: create/review templates, checklists, guidelines and standards for design/process/development; create/review deliverable documents, design documentation, and requirements, test cases/results.
- Configure: define and govern the configuration management plan; ensure compliance from the team.
- Test: review and create unit test cases, scenarios and execution; review the test plan created by the testing team; provide clarifications to the testing team.
- Domain relevance: advise software developers on design and development of features and components with a deep understanding of the business problem being addressed for the client.
- Learn more about the customer domain, identifying opportunities to provide valuable additions to customers; complete relevant domain certifications.
- Manage project: manage delivery of modules and/or manage user stories.
- Manage defects: perform defect RCA and mitigation; identify defect trends and take proactive measures to improve quality.
- Estimate: create and provide input for effort estimation for projects.
- Manage knowledge: consume and contribute to project-related documents, SharePoint libraries and client universities; review the reusable documents created by the team.
- Release: execute and monitor the release process.
- Design: contribute to creation of design (HLD, LLD, SAD)/architecture for applications/features/business components/data models.
- Interface with customer: clarify requirements and provide guidance to the development team; present design options to customers; conduct product demos.
- Manage team: set FAST goals and provide feedback; understand aspirations of team members and provide guidance, opportunities, etc.; ensure the team is engaged in the project.
- Certifications: take relevant domain/technology certifications.

Skill Examples:
- Explain and communicate the design/development to the customer.
- Perform and evaluate test results against product specifications.
- Break down complex problems into logical components.
- Develop user interfaces and business software components.
- Use data models.
- Estimate time and effort required for developing/debugging features/components.
- Perform and evaluate tests in the customer or target environment.
- Make quick decisions on technical/project-related challenges.
- Manage a team, mentor, and handle people-related issues in the team.
- Maintain high motivation levels and positive dynamics in the team.
- Interface with other teams, designers and other parallel practices.
- Set goals for self and team; provide feedback to team members.
- Create and articulate impactful technical presentations.
- Follow a high level of business etiquette in emails and other business communication.
- Drive conference calls with customers, addressing customer questions.
- Proactively ask for and offer help.
- Work under pressure; determine dependencies and risks; facilitate planning; handle multiple tasks.
- Build confidence with customers by meeting deliverables on time and with quality.
- Estimate time, effort and resources required for developing/debugging features/components.
- Make appropriate utilization of software/hardware.
- Strong analytical and problem-solving abilities.

Knowledge Examples:
- Appropriate software programs/modules.
- Functional and technical designing.
- Programming languages: proficient in multiple skill clusters.
- DBMS.
- Operating systems and software platforms.
- Software Development Life Cycle.
- Agile methods: Scrum or Kanban.
- Integrated development environments (IDE).
- Rapid application development (RAD).
- Modelling technology and languages.
- Interface definition languages (IDL).
- Knowledge of the customer domain and deep understanding of the sub-domain where the problem is solved.

Additional Comments: Job Description

Overall experience:
- Tech Lead: overall at least 8 years of IT industry experience, with 6+ years in Salesforce.com.
- Any prior experience with other technologies like Java/.NET is highly preferred.
- Experience in working directly with clients is highly preferred.
- The candidate should have strong hands-on experience with administration, configuration, customization and development within Salesforce.com, and also in reports and dashboards.

Salesforce.com certifications:
- Must have the Salesforce.com Developer (DEV401) certification.
- Highly preferred: Advanced Developer Certification (DEV501).
- Admin or Consultant certifications are desirable.

Force.com knowledge:
- The candidate should have strong experience with Visualforce, Apex, Triggers, Eclipse IDE, Salesforce Object Query Language (SOQL) and JavaScript.
- Must possess coding best practices and understand the limitations of Apex coding.
- Candidates for Tech Lead / Senior Developer roles should have experience in integrating Salesforce with external systems using SOAP/REST services; this is highly preferred for the Developer role also (a minimal REST query sketch follows this listing).
- Experience in working with the Force.com Bulk API and the Metadata API.
- Must have sound implementation knowledge of workflow rules, validation rules, approval processes, reports and dashboards.
- Experience using Apex Data Loader and other ETL tools such as Informatica.
- Experience with database development, SQL or PL/SQL, database schemas, and stored procedures is essential.
- Follow unit testing and test class best practices; should be capable of coding for positive and negative scenarios in testing.
- Must have experience with production deployment using change sets/Eclipse/ANT migration tool, and working on resolving deployment errors.
- Good to have: knowledge of HTML/HTML5, CSS, XML, AJAX, web services, Google APIs, and jQuery or any other JavaScript framework.
- Prior working experience with some integration tool is an added advantage.
- The candidate should have a basic understanding of admin activities like user creation, role/profile setup, security setup, etc.
- Good to have: Salesforce mobile development using Salesforce1/VF mobile; experience with Heroku, ngForce.

Skills/Experience required:
- Should be a quick learner who can adapt to learning new technologies relevant to Salesforce.
- Needs to be able to work closely with BSAs to convert requirements to solutions and suggest options.
- Ability to create, analyse, and recommend multiple alternative design solutions.
- Ability to advise the team and the client on best practices and approaches.
- Experience in researching production system issues.
- The Technical Lead should be capable of leading/managing a team of developers, suggesting best practices, promoting reusable code, and constantly doing code reviews to improve the team's efficiency.
- Experience in creating and maintaining technical design documents and configuration steps.
- Ability to configure, test and debug software, and document programs according to standards, policies and procedures.
- Ability to prepare test data and steps for unit, integration and production testing.
- Must be able to work with business analysts and business counterparts to clarify and document requirements and configuration steps for current and future application requirements.
- Strong problem-solving/analytical skills.
- Ability to effectively balance and prioritize multiple projects concurrently.
- Strong and effective written and verbal communication skills.
- Excellent presentation skills and the ability to collaborate with technical and business stakeholders at all levels of the organization.
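As referenced above, querying Salesforce over its REST API needs only an access token and an HTTP client. A minimal sketch in Python using the requests library; the instance URL, token, and SOQL query are placeholders, and obtaining the token (e.g. via an OAuth 2.0 flow) is out of scope here:

```python
import requests

# Hypothetical values: in practice the instance URL and access token
# come from an OAuth 2.0 flow against the org's auth endpoint.
INSTANCE_URL = "https://example.my.salesforce.com"
ACCESS_TOKEN = "<access-token>"
API_VERSION = "v58.0"  # any supported REST API version

def soql_query(soql: str) -> list[dict]:
    """Run a SOQL query via the Salesforce REST API, following pagination."""
    url = f"{INSTANCE_URL}/services/data/{API_VERSION}/query"
    headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}
    records: list[dict] = []
    resp = requests.get(url, headers=headers, params={"q": soql}, timeout=30)
    resp.raise_for_status()
    payload = resp.json()
    records.extend(payload["records"])
    # Large result sets are paginated via nextRecordsUrl.
    while not payload["done"]:
        resp = requests.get(INSTANCE_URL + payload["nextRecordsUrl"],
                            headers=headers, timeout=30)
        resp.raise_for_status()
        payload = resp.json()
        records.extend(payload["records"])
    return records

accounts = soql_query("SELECT Id, Name FROM Account LIMIT 10")
```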
Posted 4 weeks ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Company Description: WNS (Holdings) Limited (NYSE: WNS) is a leading Business Process Management (BPM) company. We combine our deep industry knowledge with technology and analytics expertise to co-create innovative, digital-led transformational solutions with clients across 10 industries. We enable businesses in Travel, Insurance, Banking and Financial Services, Manufacturing, Retail and Consumer Packaged Goods, Shipping and Logistics, Healthcare, and Utilities to re-imagine their digital future and transform their outcomes with operational excellence. We deliver an entire spectrum of BPM services in finance and accounting, procurement, customer interaction services and human resources, leveraging collaborative models that are tailored to address the unique business challenges of each client. We co-create and execute the future vision of 400+ clients with the help of our 44,000+ employees.

Job Description: This position is responsible for the life insurance modelling data team and exposure management services for all insurance books of the company. These services will include, but are not limited to, end-to-end life modelling, data testing, and basis and model developments. The ideal candidate should be able to understand basic life insurance industry terminology and stay abreast of industry trends, emerging technologies, and changes in the regulatory landscape to drive continuous improvement in modelling practices. The person should also manage a group of technical resources to generate data files using ETL and Python.

Experience: At least 5 years of experience in data engineering; 2+ years of experience in ETL.

Role and Responsibilities:
- Understanding client requirements and carrying out analyses that will aid the creation of solutions to meet these requirements.
- Engaging with stakeholders on requirements.
- Participating in pre-specified modelling changes while learning to appreciate best-practice modelling guidelines.
- Participating in system testing activities and constructing results dashboards to aid in the walkthroughs of results analyses to stakeholders.
- Communicating progress on activities to relevant stakeholders and delivery managers.
- Preparing complete documentation and audit packs supporting developments and overall model releases.
- Ensuring process activities are consistent with defined process manuals, and updating procedural manuals when necessary.
- Managing and mentoring a team of developers; team handling is mandatory.

Must-have skills: ETL skills; strong in SQL queries; experience in using PuTTY and WinSCP; good knowledge of Unix commands; automation using Python. Preferred: experience in Prophet – DCS coding, ETL development using Informatica, and Advanced Excel – VBA coding.

Qualifications: Bachelor's degree in Actuarial Science; an Engineering-related discipline may also be considered.
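The Python automation this role asks for is often a matter of generating and validating extract files before they feed downstream models. A minimal sketch; the pipe-delimited layout and field names are hypothetical, not from the posting:

```python
import csv
from pathlib import Path

# Hypothetical extract: policy records destined for a modelling run.
FIELDS = ["policy_id", "sum_assured", "inception_date"]

def write_extract(rows: list[dict], path: Path) -> None:
    """Write a pipe-delimited extract file of the kind ETL jobs hand off."""
    with path.open("w", newline="", encoding="utf-8") as fh:
        writer = csv.DictWriter(fh, fieldnames=FIELDS, delimiter="|")
        writer.writeheader()
        writer.writerows(rows)

def validate_extract(path: Path) -> list[str]:
    """Basic data-quality checks: required fields present, amounts numeric."""
    problems = []
    with path.open(newline="", encoding="utf-8") as fh:
        for i, row in enumerate(csv.DictReader(fh, delimiter="|"), start=2):
            if not row["policy_id"]:
                problems.append(f"line {i}: missing policy_id")
            try:
                float(row["sum_assured"])
            except ValueError:
                problems.append(f"line {i}: non-numeric sum_assured")
    return problems

out = Path("policies.dat")
write_extract([{"policy_id": "P001", "sum_assured": "250000",
                "inception_date": "2021-04-01"}], out)
print(validate_extract(out) or "extract OK")
```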
Posted 4 weeks ago
3.0 - 8.0 years
5 - 15 Lacs
Navi Mumbai
Work from Office
Responsibilities:
- Perform installation and configuration of Informatica PowerCenter (PC), Informatica Data Quality (IDQ), and Master Data Management (MDM) solutions on various infrastructures, including on-premise and cloud platforms.
- Conduct debugging and troubleshooting of Informatica environments and processes to ensure optimal performance and resolve issues efficiently.
- Manage server patching and address Vulnerability Assessment (VA) findings to maintain secure and up-to-date systems.
- Execute job migrations between different environments (e.g., development, testing, production) and perform thorough validation to ensure data integrity and functionality.
- Collaborate with development and operations teams to support the full lifecycle of Informatica applications.

Qualifications:
- Proven experience as an Informatica Administrator.
- Strong understanding and practical experience with Informatica PowerCenter, IDQ, and MDM.
- Proficiency in scripting (e.g., Shell, Python) for automation and task management.
- Solid knowledge of database concepts, with hands-on experience in Oracle databases preferred.
- Familiarity with cloud platforms (e.g., AWS, Azure, GCP) is a plus.
- Excellent problem-solving and analytical skills.
- Strong communication and interpersonal abilities.

IMMEDIATE JOINERS ONLY (15 DAYS OR LESS)
Posted 4 weeks ago
3.0 - 7.0 years
8 - 15 Lacs
Pune
Hybrid
Job Description: We are hiring an ETL Engineer with GCP experience.
Location: India (Pune). Experience: 3-7 years.

Required Skills and Qualifications:
- 3+ years of experience in Data Engineering roles.
- Strong hands-on experience with Google Cloud Platform (GCP) data services, specifically BigQuery, Cloud Composer (Apache Airflow), and Cloud Storage.
- Mandatory expertise in Apache Airflow, including designing, developing, and deploying complex DAGs.
- Mandatory strong proficiency in SQL and PL/SQL for data manipulation, stored procedures, functions, and complex query writing.
- Experience with Informatica.
- Ability to optimize BigQuery queries for performance and cost.
- Familiarity with version control systems (e.g., Git).
- Excellent problem-solving, analytical, and communication skills.
- Ability to work independently and collaboratively in an agile environment.
- Bachelor's degree in Computer Science, Engineering, or a related field.
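Even a complex Airflow DAG of the kind this posting mentions starts from a small skeleton. A minimal sketch of a daily BigQuery transformation as scheduled through Cloud Composer, assuming Airflow 2.4+ with the apache-airflow-providers-google package; the project, dataset, table, and SQL are hypothetical:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_orders_load",          # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Incrementally append yesterday's rows into a reporting table.
    load_orders = BigQueryInsertJobOperator(
        task_id="load_orders",
        configuration={
            "query": {
                "query": """
                    INSERT INTO `project.mart.orders_daily`
                    SELECT order_id, amount, order_date
                    FROM `project.staging.orders`
                    WHERE order_date = DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
                """,
                "useLegacySql": False,
            }
        },
    )
```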
Posted 4 weeks ago
5.0 - 8.0 years
3 - 7 Lacs
Jaipur
Work from Office
- 2-3 years or more experience of successfully implementing Anaplan solutions, including serving as an architect on at least 2 Anaplan implementation projects.
- Total 5+ years of experience in related technologies.
- Domain experience in Telecom/Contract Management would be preferred.
- Anaplan L1, L2, L3 and Solution Architect certifications (mandatory).
- Understand the client's business planning & performance management processes and related business requirements.
- Provide meaningful observations on performance improvements, formula re-writing, and troubleshooting, analyzing problems and their related impacts within and across models.
- Hands-on in Anaplan New UX, ALM, Anaplan Connect, and APIs.
- Guide and mentor other team members throughout the implementation process.
- Serve as the architectural SME for large-scale connected planning solutions.
- Provide candid, meaningful feedback and progress updates in a timely manner to the Project Manager/Business Partner and team.
- Develop Anaplan model documentation.
- Participate in and/or lead data integration and migration solutions.
- Participate in and/or lead UAT testing and deployment.

Mandatory Skills: Anaplan. Experience: 5-8 Years.
Posted 4 weeks ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
- Must have demonstrable (5+ years) experience of Informatica (ideally Certified Cloud Developer, Advanced), including design, development, testing, deployment and administration of Informatica Cloud / Informatica PowerCenter and Scheduler.
- Must have developed and delivered SQL solutions (including developing queries, and designing and building databases, schemas and views).
- Must have experience in data modelling and data warehouse design.
- Must be aware of data protection/data security requirements in a regulated environment.
- Must have demonstrable experience of secure data design, including user, schema, query/view and row/column-level security (a minimal row-level security sketch follows this listing).
- Must have experience with a variety of data sources (CSV, XML, JSON, API, SQL).
- Must have experience in data cleansing and data quality.
- Must have experience of leading a data team that has delivered end-to-end data warehouse projects using Informatica PowerCenter / Informatica Cloud and BI / Power BI, from the design phase to go-live and ongoing support.
- Must have experience of GitHub (ideally Azure DevOps).
- Must have demonstrable experience of Linux/Unix shell scripting development and troubleshooting.
- Must be well versed in operating and delivering in an ITIL environment (i.e. incident, problem, change, release management, etc.).
- Good understanding of software development and the project life cycle, including Agile methodology and the Scrum framework.
- Desirable to have Microsoft Power BI experience (ideally designing and developing data models, data flows and reports).
- Desirable to have experience/knowledge of DAX.
- Beneficial to have experience of integration with Salesforce.
- Beneficial to have knowledge of Oracle databases and Microsoft Azure Data Lake.
- Beneficial to have experience of Azure Data Factory, Microsoft Fabric, Azure Synapse Analytics.
- Beneficial to have experience of OBIEE & OAS.
- Beneficial to have experience of the financial services industry.
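Row- and column-level security of the sort listed above is commonly implemented as filtered views granted in place of the base table. A minimal, engine-agnostic sketch using SQLite for illustration (SQLite has no GRANT statement, so the grant shown in the comment is indicative only; the table and column names are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE claims (
        claim_id INTEGER, region TEXT, amount REAL, patient_name TEXT
    );
    INSERT INTO claims VALUES
        (1, 'NORTH', 1200.0, 'A. Smith'),
        (2, 'SOUTH',  800.0, 'B. Jones');

    -- Row-level security: the view exposes only one region's rows.
    -- Column-level security: patient_name is masked out entirely.
    CREATE VIEW claims_north_v AS
        SELECT claim_id, region, amount
        FROM claims
        WHERE region = 'NORTH';

    -- In a full RDBMS, access is then restricted to the view, e.g.:
    --   GRANT SELECT ON claims_north_v TO north_analysts;
""")

for row in conn.execute("SELECT * FROM claims_north_v"):
    print(row)  # only NORTH rows, without the sensitive column
```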
Posted 4 weeks ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About The Role: You'll be at the heart of developing and maintaining our sophisticated in-house insurance products built on relational or document databases. You will have the opportunity to join one of our product teams and contribute to the development of functionality which generates real business impact.

About The Team: We are a team that believes in engineering excellence and that our leaders should also be engineers themselves. We build applications that are carefully designed, thoughtfully implemented, and surpass the expectations of our users by working together with product owners. Quality and stability are first-class deliverables in everything we do, and we lead by example by embedding high standards into our processes.

Your responsibilities include:
- Designing, developing, deploying, and supporting sustainable data/solution architectures, such as design patterns, reference data architecture, and conceptual, logical, and physical data models for both relational and NoSQL databases.
- Data migration/ingestion/transfer from and to heterogeneous databases and file types.
- Performance optimization (query fine-tuning, indexing strategy, etc.).
- Supporting the project team in conducting public cloud data growth and data service consumption assessment and forecasting.
- Collaborating effectively within a cross-functional team including requirements engineers, QA specialists, and other application engineers.
- Staying current with emerging technologies and Generative AI developments to continuously improve our solutions.

About You: You're a naturally curious and thoughtful professional who thrives in a high-performance engineering environment. Your passion for coding is matched by your commitment to delivering business value. You believe in continuous learning, through self-improvement or by absorbing knowledge from those around you, and you're excited to contribute to a team that values technical excellence.

You should bring the following skills and experiences:
- Proficiency in relational and NoSQL databases.
- Proficiency in PL/SQL programming.
- Strong data model and database design skills for both relational and NoSQL databases.
- Experience with seamless data integration using Informatica and Azure Data Factory.
- Must have previous public cloud experience, particularly with Microsoft Azure.

About Swiss Re: Swiss Re is one of the world’s leading providers of reinsurance, insurance and other forms of insurance-based risk transfer, working to make the world more resilient. We anticipate and manage a wide variety of risks, from natural catastrophes and climate change to cybercrime. We cover both Property & Casualty and Life & Health. Combining experience with creative thinking and cutting-edge expertise, we create new opportunities and solutions for our clients. This is possible thanks to the collaboration of more than 14,000 employees across the world. Our success depends on our ability to build an inclusive culture encouraging fresh perspectives and innovative thinking. We embrace a workplace where everyone has equal opportunities to thrive and develop professionally regardless of their age, gender, race, ethnicity, gender identity and/or expression, sexual orientation, physical or mental ability, skillset, thought or other characteristics. In our inclusive and flexible environment everyone can bring their authentic selves to work and their passion for sustainability. If you are an experienced professional returning to the workforce after a career break, we encourage you to apply for open positions that match your skills and experience.

Reference Code: 134122
Posted 4 weeks ago
4.0 years
0 Lacs
Telangana, India
On-site
Job Description:
- Bachelor’s degree in computer science or a similar field, or equivalent work experience.
- 4+ years of hands-on software/application development experience with Informatica PowerCenter and PowerBI (visualization and modeling).
- Extensive working knowledge of PowerBI large datasets and modelling.
- Extensive knowledge of DAX coding.
- Experience in performance analysis and tuning, and knowledge of troubleshooting tools like Tabular Editor and DAX Studio.
- Experience in incremental and hybrid data refreshing methods.
- Knowledge of the PowerBI service and capacity management.
- In-depth knowledge of data warehousing, having worked with star schema concepts like fact/dimension tables.
- Strong PowerBI modeling and data visualization experience in delivering projects.
- Strong in SQL and PL/SQL.
- Strong data warehousing and database fundamentals in MS SQL Server.
- Strong in performance testing and troubleshooting of application issues using Informatica logs.
Posted 4 weeks ago
4.0 - 9.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Role Purpose: The purpose of this role is to design, test and maintain software programs for operating systems or applications which need to be deployed at a client end, and to ensure they meet 100% quality assurance parameters.

Must-have technical skills:
- 4+ years on Snowflake: advanced SQL expertise.
- 4+ years of data warehouse experience: hands-on knowledge of the methods to identify, collect, manipulate, transform, normalize, clean, and validate data; star schema, normalization/denormalization, dimensions, aggregations, etc.
- 4+ years working in reporting and analytics environments: development, data profiling, metric development, CI/CD, production deployment, troubleshooting, query tuning, etc.
- 3+ years on Python: advanced Python expertise.
- 3+ years on any cloud platform (AWS preferred): hands-on experience on AWS with Lambda, S3, SNS/SQS, EC2 is the bare minimum.
- 3+ years on any ETL/ELT tool (Informatica, Pentaho, Fivetran, DBT, etc.).
- 3+ years developing functional metrics in any specific business vertical (finance, retail, telecom, etc.).

Must-have soft skills:
- Clear communication: written and verbal communication, especially around time off, delays in delivery, etc.
- Team player: works in the team and works with the team.
- Enterprise experience: understands and follows enterprise guidelines for CI/CD, security, change management, RCA, on-call rotation, etc.

Nice to have:
- Technical certifications from AWS, Microsoft, Azure, GCP or any other recognized software vendor.
- 4+ years on any ETL/ELT tool (Informatica, Pentaho, Fivetran, DBT, etc.).
- 4+ years developing functional metrics in any specific business vertical (finance, retail, telecom, etc.).
- 4+ years of team lead experience.
- 3+ years in a large-scale support organization supporting thousands of users.
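The warehouse skills above (aggregations and windowing over a star schema) come down to SQL like the following. A minimal sketch run through SQLite for portability (window functions need SQLite 3.25+); in practice the same query would target Snowflake, and the table and column names are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- A tiny fact table; dimension tables are omitted for brevity.
    CREATE TABLE fact_sales (region TEXT, sale_date TEXT, amount REAL);
    INSERT INTO fact_sales VALUES
        ('NORTH', '2024-01-01', 100), ('NORTH', '2024-01-02', 150),
        ('SOUTH', '2024-01-01', 80),  ('SOUTH', '2024-01-02', 120);
""")

# Running total per region: a typical analytics windowing pattern.
query = """
    SELECT region, sale_date, amount,
           SUM(amount) OVER (
               PARTITION BY region ORDER BY sale_date
           ) AS running_total
    FROM fact_sales
    ORDER BY region, sale_date
"""
for row in conn.execute(query):
    print(row)
```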
Posted 4 weeks ago
4.0 years
0 Lacs
Gurugram, Haryana, India
On-site
At Moody's, we unite the brightest minds to turn today’s risks into tomorrow’s opportunities. We do this by striving to create an inclusive environment where everyone feels welcome to be who they are—with the freedom to exchange ideas, think innovatively, and listen to each other and customers in meaningful ways. If you are excited about this opportunity but do not meet every single requirement, please apply! You still may be a great fit for this role or other open roles. We are seeking candidates who model our values: invest in every relationship, lead with curiosity, champion diverse perspectives, turn inputs into actions, and uphold trust through integrity.

Job Title: Software Engineer – Salesforce
Location: Gurgaon, Haryana, India
Department: Customer, Operations & Risk (COR), Moody’s Analytics
Reporting Manager: Manav Vatsyayana
Employment Type: Full-Time

About The Role: We are looking to bring on board a skilled and motivated Software Engineer to join our Production Support team within Moody’s Analytics. This role is critical to ensuring the stability, performance, and continuous improvement of our Salesforce platform and its integrations with enterprise systems. You will be part of a dynamic team that supports global users and collaborates closely with cross-functional stakeholders, vendors, and agile teams. If you are passionate about Salesforce technologies, thrive in high-availability environments, and enjoy solving complex problems, I’d love to hear from you.

Key Responsibilities:
- Provide daily production support for Salesforce applications, ensuring timely resolution of incidents and service requests.
- Lead and manage ticket inflow, task assignments, and daily reporting.
- Collaborate with L1 business leads to prioritize tasks and ensure alignment with business needs.
- Drive root cause analysis and resolution of integrated data issues across platforms.
- Oversee release management and operational support activities.
- Design and implement automation for build, release, and deployment processes.
- Support deployment of new features and configuration changes using DevOps tools.
- Communicate incident and request statuses to stakeholders, including senior leadership.
- Participate in project transitions, UAT, and knowledge transfer activities.
- Act as Duty Manager on a rotational basis, including weekends, for major incident management (if required).
- Participate in team meetings, document procedures, and ensure service level targets are met.

Required Skills and Competencies:
- Salesforce certifications: Administrator, Platform App Builder, and Platform Developer I. Apttus CPQ and FinancialForce certifications are a plus.
- Strong understanding of ITIL disciplines: Event, Incident, Request, Problem, Release, and Knowledge Management.
- Experience with data quality tools and techniques (e.g., SQL/SOQL for profiling, validation, cleansing).
- Proficiency in DevOps tools: GitHub and Jira, or other similar tools like Bitbucket, AutoRabit, SVN, Aldon, TFS, Jenkins, UrbanCode, Nolio and Puppet.
- Experience supporting Salesforce applications and ERP/data integration tools (e.g., SAP, MuleSoft, Informatica, IBM SPM).
- Strong analytical and problem-solving skills with attention to detail.
- Ability to manage competing priorities in a fast-paced, Agile environment.
- Excellent communication and interpersonal skills.
- Proficiency in reporting and analysis tools (e.g., Excel, PowerPoint).
- Familiarity with workload automation and monitoring tools such as BMC Remedy, Control-M, Tivoli, Nagios, and Splunk is advantageous.
Education and Experience: Bachelor’s or Master’s degree in Computer Science, Software Engineering, or a related field. Minimum 4 years of experience in software development, DevOps, and production support, preferably within the financial services sector.

About The Team: You’ll be joining the Production Support Team under the Business Systems group in the Customer, Operations & Risk business unit. Our team supports Moody’s Analytics employees globally who rely on the Salesforce CRM platform. This is an exciting opportunity to work on cutting-edge Salesforce technologies and contribute to a high-impact support function.

Moody’s is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability, protected veteran status, sexual orientation, gender expression, gender identity or any other characteristic protected by law. Candidates for Moody's Corporation may be asked to disclose securities holdings pursuant to Moody’s Policy for Securities Trading and the requirements of the position. Employment is contingent upon compliance with the Policy, including remediation of positions in those holdings as necessary.
Posted 4 weeks ago
6.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Role Description

Role Proficiency: This role requires proficiency in data pipeline development, including coding and testing data pipelines for ingesting, wrangling, transforming and joining data from various sources. Must be skilled in ETL tools such as Informatica, Glue, Databricks and DataProc, with coding expertise in Python, PySpark and SQL. Works independently and has a deep understanding of data warehousing solutions including Snowflake, BigQuery, Lakehouse and Delta Lake. Capable of calculating costs and understanding performance issues related to data solutions.

Outcomes:
- Act creatively to develop pipelines and applications by selecting appropriate technical options, optimizing application development, maintenance and performance using design patterns and reusing proven solutions.
- Interpret requirements to create optimal architecture and design, developing solutions in accordance with specifications.
- Document and communicate milestones/stages for end-to-end delivery.
- Code adhering to best coding standards; debug and test solutions to deliver best-in-class quality.
- Perform performance tuning of code and align it with the appropriate infrastructure to optimize efficiency.
- Validate results with user representatives, integrating the overall solution seamlessly.
- Develop and manage data storage solutions, including relational databases, NoSQL databases and data lakes.
- Stay updated on the latest trends and best practices in data engineering, cloud technologies and big data tools.
- Influence and improve customer satisfaction through effective data solutions.

Measures of Outcomes:
- Adherence to engineering processes and standards
- Adherence to schedule/timelines
- Adherence to SLAs where applicable
- Number of defects post delivery
- Number of non-compliance issues
- Reduction of reoccurrence of known defects
- Quick turnaround of production bugs
- Completion of applicable technical/domain certifications
- Completion of all mandatory training requirements
- Efficiency improvements in data pipelines (e.g. reduced resource consumption, faster run times)
- Average time to detect, respond to, and resolve pipeline failures or data issues
- Number of data security incidents or compliance breaches

Outputs Expected:
- Code development: develop data processing code independently, ensuring it meets performance and scalability requirements; define coding standards, templates and checklists; review code for team members and peers.
- Documentation: create and review templates, checklists, guidelines and standards for design processes and development; create and review deliverable documents, including design documents, architecture documents, infrastructure costing, business requirements, source-target mappings, test cases and results.
- Configuration: define and govern the configuration management plan; ensure compliance within the team.
- Testing: review and create unit test cases, scenarios and execution plans; review the test plan and test strategy developed by the testing team; provide clarifications and support to the testing team as needed.
- Domain relevance: advise data engineers on the design and development of features and components, demonstrating a deeper understanding of business needs; learn about customer domains to identify opportunities for value addition; complete relevant domain certifications to enhance expertise.
- Project management: manage the delivery of modules effectively.
- Defect management: perform root cause analysis (RCA) and mitigation of defects; identify defect trends and take proactive measures to improve quality.
Estimation: Create and provide input for effort and size estimation for projects.
Knowledge Management: Consume and contribute to project-related documents, SharePoint libraries and client universities. Review reusable documents created by the team.
Release Management: Execute and monitor the release process to ensure smooth transitions.
Design Contribution: Contribute to the creation of high-level design (HLD), low-level design (LLD) and system architecture for applications, business components and data models.
Customer Interface: Clarify requirements and provide guidance to the development team. Present design options to customers and conduct product demonstrations.
Team Management: Set FAST goals and provide constructive feedback. Understand team members' aspirations and provide guidance and opportunities for growth. Ensure team engagement in projects and initiatives.
Certifications: Obtain relevant domain and technology certifications to stay competitive and informed.
Skill Examples
Proficiency in SQL, Python or other programming languages used for data manipulation.
Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc and Azure ADF.
Hands-on experience with cloud platforms like AWS, Azure or Google Cloud, particularly with data-related services (e.g. AWS Glue, BigQuery).
Ability to conduct tests on data pipelines and evaluate results against data quality and performance specifications.
Experience in performance tuning of data processes.
Expertise in designing and optimizing data warehouses for cost efficiency.
Ability to apply and optimize data models for efficient storage, retrieval and processing of large datasets.
Capacity to clearly explain and communicate design and development aspects to customers.
Ability to estimate time and resource requirements for developing and debugging features or components.
Knowledge Examples
Knowledge of various ETL services offered by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/DataFlow, Azure ADF and ADLF.
Proficiency in SQL for analytics, including windowing functions.
Understanding of data schemas and models relevant to various business contexts.
Familiarity with domain-related data and its implications.
Expertise in data warehousing optimization techniques.
Knowledge of data security concepts and best practices.
Familiarity with design patterns and frameworks in data engineering.
Additional Comments
Data Engineering Role Summary: Skilled Data Engineer with strong Python programming skills and experience in building scalable data pipelines across cloud environments. The candidate should have a good understanding of ML pipelines and basic exposure to GenAI solutioning. This role will support large-scale AI/ML and GenAI initiatives by ensuring high-quality, contextual, and real-time data availability.
Key Responsibilities:
Design, build, and maintain robust, scalable ETL/ELT data pipelines in AWS/Azure environments (a minimal PySpark sketch follows this list).
Develop and optimize data workflows using PySpark, SQL, and Airflow.
Work closely with AI/ML teams to support training pipelines and GenAI solution deployments.
Integrate data with vector databases like ChromaDB or Pinecone for RAG-based pipelines.
Collaborate with solution architects and GenAI leads to ensure reliable, real-time data availability for agentic AI and automation solutions.
Support data quality, validation, and profiling processes.
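A minimal, hedged sketch of that pipeline work, assuming a Spark environment; every path, table and column name here is an illustrative placeholder, not a project specific:

# Minimal PySpark sketch: ingest, wrangle, join, write. Names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

orders = spark.read.parquet("s3://raw-bucket/orders/")            # ingest raw feeds
customers = spark.read.parquet("s3://raw-bucket/customers/")

cleaned = (orders
           .dropDuplicates(["order_id"])                          # dedupe
           .withColumn("order_ts", F.to_timestamp("order_ts"))    # normalize types
           .filter(F.col("amount") > 0)                           # drop bad rows
           .withColumn("order_date", F.to_date("order_ts")))

enriched = cleaned.join(customers, on="customer_id", how="left")  # join sources

(enriched.write
 .mode("overwrite")
 .partitionBy("order_date")
 .parquet("s3://curated-bucket/orders/"))                         # land curated data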
Key Skills & Technology Areas:
Programming & Data Processing: Python (4–6 years), PySpark, Pandas, NumPy
Data Engineering & Pipelines: Apache Airflow, AWS Glue, Azure Data Factory, Databricks
Cloud Platforms: AWS (S3, Lambda, Glue), Azure (ADF, Synapse), GCP (optional)
Databases: SQL/NoSQL, Postgres, DynamoDB, vector databases (ChromaDB, Pinecone) – preferred
ML/GenAI Exposure (basic): hands-on with Pandas and scikit-learn; knowledge of RAG pipelines and GenAI concepts
Data Modeling: star/snowflake schema, data normalization, dimensional modeling
Version Control & CI/CD: Git, Jenkins, or similar tools for pipeline deployment
Other Requirements:
Strong problem-solving and analytical skills
Flexibility to work on fast-paced and cross-functional priorities
Experience collaborating with AI/ML or GenAI teams is a plus
Good communication and a collaborative, team-first mindset
Experience in Telecom, E-Commerce, or Enterprise IT Operations is a plus
Skills: ETL, Big Data, PySpark, SQL
Posted 4 weeks ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.
Your Role And Responsibilities
As a Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.
In this role, your responsibilities may include:
Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements
Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours
Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results
Your primary responsibilities include:
Develop and maintain data pipelines for batch and stream processing using Informatica PowerCenter or cloud ETL/ELT tools.
Liaise with the business team and technical leads; gather requirements, identify data sources, identify data quality issues, design target data structures, develop pipelines and data processing routines, perform unit testing and support UAT (a small data-quality sketch follows this posting).
Work with data scientists and the business analytics team to assist in data ingestion and data-related technical issues.
Preferred Education
Master's Degree
Required Technical And Professional Expertise
Expertise in data warehousing/information management/data integration/business intelligence using the ETL tool Informatica PowerCenter
Knowledge of cloud, Power BI and data migration on cloud
Experience in Unix shell scripting and Python
Experience with relational SQL, Big Data, etc.
Preferred Technical And Professional Experience
Knowledge of MS Azure Cloud
Experience in Informatica PowerCenter
Experience in Unix shell scripting and Python
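To make the "identify data quality issues" responsibility concrete, here is a small, hedged Python sketch; the file name and key column are assumptions, not IBM specifics:

# Illustrative pre-load data-quality probe: flag duplicate keys and empty fields.
import csv
from collections import Counter

def profile_feed(path, key="order_id"):
    with open(path, newline="") as fh:
        rows = list(csv.DictReader(fh))
    key_counts = Counter(row[key] for row in rows)
    duplicates = [k for k, n in key_counts.items() if n > 1]
    empty_fields = sum(1 for row in rows for v in row.values() if not v)
    return {"rows": len(rows), "duplicate_keys": duplicates,
            "empty_fields": empty_fields}

# Example call (the path is a placeholder): print(profile_feed("daily_orders.csv"))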
Posted 4 weeks ago
5.0 - 8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
No. of years' experience: 5 to 8 years
Detailed job description:
– Computer science academic background (bachelor's or master's degree)
– Extensive experience as an ETL developer using Informatica IDMC
– PostgreSQL (PL/pgSQL) and Oracle (PL/SQL) database skills
– Comfortable in a Linux-based server environment
– Deep understanding of all aspects of the application development and support life cycle
– Strong skills in analysis, design, development, testing and support/troubleshooting, with deep specialist skills in database, Informatica, Unix scripting, reporting, data modelling and data warehousing technologies
– Fluent in development methods, tools and techniques
– Hands-on experience with a scheduler tool such as Autosys (a hedged wrapper sketch follows this posting)
– Well-developed business communication skills, both written and verbal
– A naturally facilitative approach to problem solving
– Analytical, problem-solving and synthesizing skills (you know how to figure things out)
– Solid personal prioritization and time management
Mandatory Skills: ETL development using Informatica IDMC (hands-on experience), PostgreSQL, PL/SQL, Unix scripting, hands-on experience with a scheduler tool (such as Autosys)
Good To Have: Linux, Mainframe
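As a hedged illustration of how a scheduler such as Autosys might drive the PL/pgSQL work above, this Python wrapper returns a nonzero exit code on failure so the scheduler can alert; the connection string and procedure name are assumptions:

# Hypothetical scheduler-invoked wrapper: run a PL/pgSQL procedure, exit nonzero on error.
import sys
import psycopg2

def run_nightly_load():
    try:
        with psycopg2.connect("dbname=dw user=etl") as conn:   # placeholder DSN
            with conn.cursor() as cur:
                cur.execute("CALL etl.load_daily_sales()")     # hypothetical procedure
        return 0
    except psycopg2.Error as exc:
        print(f"nightly load failed: {exc}", file=sys.stderr)
        return 1

if __name__ == "__main__":
    sys.exit(run_nightly_load())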
Posted 4 weeks ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
We are Hiring!
Data Engineer – Snowflake, Python, DBT & Airflow
Experience: 5+ years in data engineering; strong in Snowflake, Python, AWS, and DevOps tools
Work Location: Mumbai, Pune, Chennai, Bangalore
Work Mode: Hybrid
Notice Period: Immediate
Openings:
Role 1: Snowflake, Airflow, AWS, DBT (primary) | Informatica/IICS (secondary)
Role 2: IICS/Informatica, AWS, Python (primary) | Snowflake, Airflow, DBT (secondary)
Key Skills:
Python (including PySpark and Snowpark)
Snowflake, DBT, Airflow, AWS
Informatica/IICS
CI/CD, GitHub, Terraform, Artifactory
Responsibilities:
Design and optimize data pipelines and ETL processes
Develop using Snowflake, Python, DBT & Airflow (a minimal DAG sketch follows this posting)
Manage CI/CD and automation
Follow best practices for code, testing & deployment
Regards,
Banupriya Suresh
banupriya.s@kaaviansys.com
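A minimal Airflow DAG sketch for the Snowflake + dbt stack named above (Airflow 2.x style; the dbt project path is an assumption):

# Sketch: schedule a daily dbt run followed by dbt tests. Paths are placeholders.
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_dbt_build",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",   # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    dbt_run = BashOperator(task_id="dbt_run",
                           bash_command="dbt run --project-dir /opt/dbt/analytics")
    dbt_test = BashOperator(task_id="dbt_test",
                            bash_command="dbt test --project-dir /opt/dbt/analytics")
    dbt_run >> dbt_test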
Posted 4 weeks ago
0.0 - 1.0 years
0 Lacs
Kukatpally, Hyderabad, Telangana
On-site
Requirements:
4+ years of ETL/Informatica developer experience
1 year of Snowflake experience required
1 year of IICS experience required
Experience developing specifications, test scripts, and code coverage for all integrations
Experience supporting the migration of integration code from lower to higher environments (e.g. production)
Experience with full and incremental ETL using Informatica PowerCenter (a minimal incremental-extract sketch follows this posting)
Expertise in developing ETL/Informatica for data warehouse integration from various data sources
Experience supporting integration configurations with iPaaS through connected apps or web services
Able to be on call for selected off-shift hours
Experience working with an Agile framework
Job Types: Full-time, Permanent
Benefits: Health insurance, Provident Fund
Ability to commute/relocate: Kukatpally, Hyderabad, Telangana: Reliably commute or planning to relocate before starting work (Preferred)
Work Location: In person
Application Deadline: 21/07/2025
Expected Start Date: 25/07/2025
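The full-vs-incremental distinction above comes down to a high-watermark filter; here is a self-contained Python sketch of the idea (the table and column names are illustrative):

# Incremental extract: pull only rows changed since the stored high-watermark.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src_orders (id INTEGER, updated_at TEXT)")
conn.executemany("INSERT INTO src_orders VALUES (?, ?)",
                 [(1, "2024-12-31"), (2, "2025-01-02")])

def incremental_extract(conn, last_watermark):
    cur = conn.execute("SELECT id, updated_at FROM src_orders WHERE updated_at > ?",
                       (last_watermark,))
    return cur.fetchall()

print(incremental_extract(conn, "2025-01-01"))  # -> [(2, '2025-01-02')]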
Posted 4 weeks ago
6.0 - 8.0 years
1 - 4 Lacs
Chennai
Hybrid
Job Title: Snowflake Developer
Experience: 6-8 years
Location: Chennai - Hybrid
Job Description:
3+ years of experience as a Snowflake Developer or Data Engineer
Strong knowledge of SQL, SnowSQL, and Snowflake schema design (a hedged connector sketch follows below)
Experience with ETL tools and data pipeline automation
Basic understanding of US healthcare data (claims, eligibility, providers, payers)
Experience working with large-scale datasets and cloud platforms (AWS, Azure, GCP)
Familiarity with data governance, security, and compliance (HIPAA, HITECH)
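A hedged sketch of the SnowSQL-style query work implied above, using the snowflake-connector-python package; the account, credentials, and claims table are placeholders chosen to match the healthcare-data context:

# Sketch: aggregate claims by payer via the Snowflake Python connector.
import snowflake.connector

conn = snowflake.connector.connect(
    account="myorg-myaccount", user="etl_user", password="***",   # placeholders
    warehouse="ANALYTICS_WH", database="CLAIMS_DB", schema="PUBLIC",
)
cur = conn.cursor()
try:
    cur.execute("""
        SELECT payer_id, COUNT(*) AS claim_count
        FROM claims
        GROUP BY payer_id
        ORDER BY claim_count DESC
        LIMIT 10
    """)
    for payer_id, claim_count in cur:
        print(payer_id, claim_count)
finally:
    cur.close()
    conn.close()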
Posted 4 weeks ago
6.0 - 10.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Role: Senior Associate
Tower: Data, Analytics & Specialist Managed Service
Experience: 6 - 10 years
Key Skills: Azure
Educational Qualification: BE / B Tech / ME / M Tech / MBA
Work Location: Bangalore
Job Description
As a Senior Associate, you will work as part of a team of problem solvers, helping to solve complex business issues from strategy to execution. PwC Professional skills and responsibilities for this management level include but are not limited to:
Use feedback and reflection to develop self-awareness, personal strengths, and address development areas.
Be flexible to work in stretch opportunities/assignments.
Demonstrate critical thinking and the ability to bring order to unstructured problems.
Review ticket quality and deliverables; provide status reporting for the project.
Adhere to SLAs, with experience in incident management, change management and problem management.
Seek and embrace opportunities that give exposure to different situations, environments, and perspectives.
Use straightforward communication, in a structured way, when influencing and connecting with others.
Read situations and modify behavior to build quality relationships.
Uphold the firm's code of ethics and business conduct.
Demonstrate leadership capabilities by working with clients directly and leading the engagement.
Work in a team environment that includes client interactions, workstream management, and cross-team collaboration.
Be a good team player: take up cross-competency work and contribute to COE activities.
Handle escalation and risk management.
Position Requirements
Required Skills: Azure Cloud Engineer
Job description: The candidate is expected to demonstrate extensive knowledge and/or a proven record of success in the following areas:
Minimum 6 years of hands-on experience building advanced data warehousing solutions on leading cloud platforms.
Minimum 3-5 years of Operate/Managed Services/Production Support experience.
Extensive experience in developing scalable, repeatable, and secure data structures and pipelines to ingest, store, collect, standardize, and integrate data for downstream consumption by Business Intelligence systems, analytics modelling, data scientists, etc.
Designing and implementing data pipelines to extract, transform, and load (ETL) data from various sources into data storage systems, such as data warehouses or data lakes.
Experience in building efficient ETL/ELT processes using industry-leading tools like Informatica, Talend, SSIS, AWS, Azure, Spark, SQL, Python, etc.
Hands-on experience with data analytics tools like Informatica, Collibra, Hadoop, Spark, Snowflake, etc.
Design, implement, and maintain data pipelines for data ingestion, processing, and transformation in Azure.
Work together with data scientists and analysts to understand data needs and create effective data workflows.
Create and maintain data storage solutions, including Azure SQL Database, Azure Data Lake, and Azure Blob Storage.
Create and maintain ETL (extract, transform, load) operations using Azure Data Factory or comparable technologies.
Perform data transformation and processing tasks to prepare data for analysis and reporting, using Azure Databricks or Azure Synapse Analytics for large-scale transformations with tools like Apache Spark (a minimal PySpark cleansing sketch appears at the end of this posting).
Implement data validation and cleansing procedures to ensure the quality, integrity, and dependability of the data.
Improve the scalability, efficiency, and cost-effectiveness of data pipelines.
Monitor and troubleshoot data pipelines, resolving issues related to data processing, transformation, or storage.
Implement and maintain data security and privacy measures, including access controls and encryption, to protect sensitive data.
Advanced working SQL knowledge and experience with relational databases, query authoring (SQL), and working familiarity with a variety of databases.
Experience in building and maintaining data governance solutions (data quality, metadata management, lineage, master data management and data security) using industry-leading tools.
Scaling and optimizing schemas, and performance-tuning SQL and ETL pipelines in data lake and data warehouse environments.
Hands-on experience with data analytics tools like Databricks.
Experience with ITIL processes such as incident management, problem management, knowledge management, release management, and Data DevOps.
Strong communication, problem-solving, quantitative and analytical abilities.
Nice to have: Azure certification
Managed Services - Data, Analytics & Insights Managed Service
At PwC we relentlessly focus on working with our clients to bring the power of technology and humans together and create simple, yet powerful solutions. We imagine a day when our clients can simply focus on their business, knowing that they have a trusted partner for their IT needs. Every day we are motivated and passionate about making our clients better. Within our Managed Services platform, PwC delivers integrated services and solutions that are grounded in deep industry experience and powered by the talent that you would expect from the PwC brand. The PwC Managed Services platform delivers scalable solutions that add greater value to our clients' enterprises through technology and human-enabled experiences. Our team of highly skilled and trained global professionals, combined with the latest advancements in technology and process, allows us to provide effective and efficient outcomes. With PwC's Managed Services, our clients are able to focus on accelerating their priorities, including optimizing operations and accelerating outcomes. PwC brings a consultative-first approach to operations, leveraging our deep industry insights combined with world-class talent and assets to enable transformational journeys that drive sustained client outcomes. Our clients need flexible access to world-class business and technology capabilities that keep pace with today's dynamic business environment. Within our global Managed Services platform, we provide Data, Analytics & Insights services, focusing on the evolution of our clients' data and analytics ecosystems. Our focus is to empower our clients to navigate and capture the value of their Data & Analytics portfolio while cost-effectively operating and protecting their solutions. We do this so that our clients can focus on what matters most to their business: accelerating growth that is dynamic, efficient and cost-effective. As a member of our Data, Analytics & Insights Managed Service team, we are looking for candidates who thrive working in a high-paced environment, capable of working on a mix of critical Data, Analytics & Insights offerings and engagements, including help desk support, enhancement and optimization work, as well as strategic roadmap and advisory-level work.
You will also be expected to lend experience and effort to helping win and support customer engagements, from both a technical and a relationship perspective.
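For the data validation and cleansing responsibilities above, a minimal PySpark sketch of the kind of step run in Azure Databricks or Synapse Spark pools; the paths, columns, and rules are assumptions, not PwC specifics:

# Illustrative cleansing step: dedupe, reject incomplete rows, normalize types.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("cleanse_transactions").getOrCreate()

df = spark.read.format("delta").load("/mnt/lake/raw/transactions")   # placeholder path

cleansed = (df
            .dropDuplicates(["txn_id"])
            .na.drop(subset=["txn_id", "amount"])
            .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
            .filter(F.col("amount") >= 0))

(cleansed.write.format("delta").mode("overwrite")
 .save("/mnt/lake/curated/transactions"))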
Posted 4 weeks ago
5.0 - 10.0 years
12 - 17 Lacs
Hyderabad
Work from Office
Overview
We are seeking a skilled Associate Manager - AIOps & MLOps Operations to support and enhance the automation, scalability, and reliability of AI/ML operations across the enterprise. This role requires a solid understanding of AI-driven observability, machine learning pipeline automation, cloud-based AI/ML platforms, and operational excellence. The ideal candidate will assist in deploying AI/ML models, ensuring continuous monitoring, and implementing self-healing automation to improve system performance, minimize downtime, and enhance decision-making with real-time AI-driven insights.
Responsibilities
Support and maintain AIOps and MLOps programs, ensuring alignment with business objectives, data governance standards, and enterprise data strategy.
Assist in implementing real-time data observability, monitoring, and automation frameworks to enhance data reliability, quality, and operational efficiency.
Contribute to developing governance models and execution roadmaps to drive efficiency across data platforms, including Azure, AWS, GCP, and on-prem environments.
Ensure seamless integration of CI/CD pipelines, data pipeline automation, and self-healing capabilities across the enterprise.
Collaborate with cross-functional teams to support the development and enhancement of next-generation Data & Analytics (D&A) platforms.
Assist in managing the people, processes, and technology involved in sustaining Data & Analytics platforms, driving operational excellence and continuous improvement.
Support Data & Analytics technology transformations by ensuring proactive issue identification and the automation of self-healing capabilities across the PepsiCo Data Estate.
Support the implementation of AIOps strategies for automating IT operations using Azure Monitor, Azure Log Analytics, and AI-driven alerting.
Assist in deploying Azure-based observability solutions (Azure Monitor, Application Insights, Azure Synapse for log analytics, and Azure Data Explorer) to enhance real-time system performance monitoring.
Enable AI-driven anomaly detection and root cause analysis (RCA) by collaborating with data science teams using Azure Machine Learning (Azure ML) and AI-powered log analytics.
Contribute to developing self-healing and auto-remediation mechanisms using Azure Logic Apps, Azure Functions, and Power Automate to proactively resolve system issues.
Support ML lifecycle automation using Azure ML, Azure DevOps, and Azure Pipelines for CI/CD of ML models.
Assist in deploying scalable ML models with Azure Kubernetes Service (AKS), Azure Machine Learning Compute, and Azure Container Instances.
Automate feature engineering, model versioning, and drift detection using Azure ML Pipelines and MLflow (a minimal MLflow sketch appears at the end of this posting).
Optimize ML workflows with Azure Data Factory, Azure Databricks, and Azure Synapse Analytics for data preparation and ETL/ELT automation.
Implement basic monitoring and explainability for ML models using the Azure Responsible AI Dashboard and InterpretML.
Collaborate with Data Science, DevOps, CloudOps, and SRE teams to align AIOps/MLOps strategies with enterprise IT goals.
Work closely with business stakeholders and IT leadership to implement AI-driven insights and automation to enhance operational decision-making.
Track and report AI/ML operational KPIs, such as model accuracy, latency, and infrastructure efficiency.
Assist in coordinating with cross-functional teams to maintain system performance and ensure operational resilience.
Support the implementation of AI ethics, bias mitigation, and responsible AI practices using Azure Responsible AI toolkits.
Ensure adherence to Azure Information Protection (AIP), role-based access control (RBAC), and data security policies.
Assist in developing risk management strategies for AI-driven operational automation in Azure environments.
Prepare and present program updates, risk assessments, and AIOps/MLOps maturity progress to stakeholders as needed.
Support efforts to attract and build a diverse, high-performing team to meet current and future business objectives.
Help remove barriers to agility and enable the team to adapt quickly to shifting priorities without losing productivity.
Contribute to developing the appropriate organizational structure, resource plans, and culture to support business goals.
Leverage technical and operational expertise in cloud and high-performance computing to understand business requirements and earn trust with stakeholders.
Qualifications
5+ years of technology work experience in a global organization, preferably in CPG or a similar industry.
5+ years of experience in the Data & Analytics field, with exposure to AI/ML operations and cloud-based platforms.
5+ years of experience working within cross-functional IT or data operations teams.
2+ years of experience in a leadership or team coordination role within an operational or support environment.
Experience in AI/ML pipeline operations, observability, and automation across platforms such as Azure, AWS, and GCP.
Excellent communication: ability to convey technical concepts to diverse audiences and empathize with stakeholders while maintaining confidence.
Customer-centric approach: strong focus on delivering the right customer experience by advocating for customer needs and ensuring issue resolution.
Problem ownership and accountability: a proactive mindset to take ownership, drive outcomes, and ensure customer satisfaction.
Growth mindset: willingness and ability to adapt and learn new technologies and methodologies in a fast-paced, evolving environment.
Operational excellence: experience in managing and improving large-scale operational services with a focus on scalability and reliability.
Site reliability and automation: understanding of SRE principles, automated remediation, and operational efficiencies.
Cross-functional collaboration: ability to build strong relationships with internal and external stakeholders through trust and collaboration.
Familiarity with CI/CD processes, data pipeline management, and self-healing automation frameworks.
Strong understanding of data acquisition, data catalogs, data standards, and data management tools.
Knowledge of master data management concepts, data governance, and analytics.
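A minimal, hedged sketch of the MLflow-based model versioning mentioned in the responsibilities; the experiment name, metric, and toy model are illustrative, and Azure ML integration would layer on top of this:

# Sketch: train a toy model, log a metric, and store a versioned model artifact.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=500, random_state=0)
model = LogisticRegression(max_iter=500).fit(X, y)

mlflow.set_experiment("demand_model")   # hypothetical experiment
with mlflow.start_run():
    mlflow.log_metric("train_accuracy", accuracy_score(y, model.predict(X)))
    mlflow.sklearn.log_model(model, artifact_path="model")   # versioned model artifact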
Posted 4 weeks ago
1.0 - 5.0 years
2 - 4 Lacs
Chennai, Tamil Nadu
Work from Office
Overview
As Sales Sr. Mgr., ensure that exceptional leadership and operational direction is provided by your analyst team to sales employees across multiple teams and markets. Your Planogram Analysts deliver visually appealing planograms based on store clustering, space definitions and defined flow. Work closely with the Category Management and Space teams to ensure planograms meet approved parameters. Conduct planogram quality audits, ensuring all planograms meet assortment requirements, visual appeal, innovation opportunities and shelving metrics. Continuously identify opportunities and implement processes to improve quality, timeliness of output and process efficiency through automation.
Responsibilities
Head the DX Sector Planogram Analyst team and ensure efficient, effective and comprehensive support of sales employees across multiple teams and markets.
Lead and manage the Planogram Analyst work stream by working closely with the Sector Space & Planogram team.
Ensure accurate and timely delivery of tasks, including:
- deliver visually appealing, versioned planograms based on store clustering, space definitions and defined flow
- work closely with Category Management and Space teams to ensure planograms meet approved parameters
- conduct planogram quality control, ensuring all planograms meet assortment requirements, visual appeal, innovation opportunities and shelving metrics
- electronically deliver planograms to both internal teams and external customer-specific systems
- manage multiple project timelines simultaneously
- ensure timelines are met by tracking project progress, coordinating activities and resolving issues
- build and maintain relationships with internal project partners
- manage planogram version/store combinations and/or store planogram assignments, and provide reporting and data as needed
- maintain the planogram database with the most up-to-date planogram files
- retain planogram models and files for historical reference, as needed
Invest in and drive adoption of industry best practices across regions/sector, as required.
Partner with global teams to define the strategy for end-to-end execution ownership and accountability.
Lead workload forecasting and effectively drive prioritization conversations to support capacity management.
Build stronger business context and elevate the team's capability from execution-focused to end-to-end capability-focused.
Ensure delivery of accurate and timely planograms in accordance with agreed service level agreements (SLAs).
Work across multiple functions to aid in collecting insights for action-oriented cause-of-change analysis.
Focus on speed of execution and quality of service delivery rather than mere achievement of SLAs.
Recognize opportunities and take action to improve delivery of work.
Implement continued improvements and simplifications of processes and optimal use of technology.
Scale up the operation in line with business growth, both within existing scope and in new areas of opportunity.
Create an inclusive and collaborative environment.
People Leadership
Enable direct reports' capabilities and enforce consistency in execution of key capability areas: planogram QC, development and timely delivery.
Responsible for hiring, talent assessment, competency development, performance management, productivity improvement, talent retention, and career planning and development.
Provide and receive feedback about the global team and support effective partnership.
Qualifications
10+ years of retail/merchandising experience (including JDA)
2+ years of people leadership experience in a space planning/planogram environment
Bachelor's in commerce/business administration/marketing; a master's degree is a plus
Advanced skill in Microsoft Office, with demonstrated intermediate-to-advanced Excel skills
Experience with analyzing and reporting data to identify issues, trends, or exceptions to drive improvement of results and find solutions
Advanced knowledge of and experience with the space management technology platform JDA
Propensity to learn PepsiCo software systems
Ability to provide superior customer service
Best-in-class time management skills; ability to multitask, set priorities and plan
Posted 4 weeks ago