8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Company Description
IntegriChain is the data and application backbone for market access departments of Life Sciences manufacturers. We deliver the data, the applications, and the business process infrastructure for patient access and therapy commercialization. More than 250 manufacturers rely on our ICyte Platform to orchestrate their commercial and government payer contracting, patient services, and distribution channels. ICyte is the first and only platform that unites the financial, operational, and commercial data sets required to support therapy access in the era of specialty and precision medicine. With ICyte, Life Sciences innovators can digitalize their market access operations, freeing up resources to focus on more data-driven decision support, and are digitalizing labor-intensive processes – freeing up their best talent to identify and resolve coverage and availability hurdles and to manage pricing and forecasting complexity. We are headquartered in Philadelphia, PA (USA), with offices in Ambler, PA (USA); Pune, India; and Medellín, Colombia. For more information, visit www.integrichain.com, or follow us on Twitter @IntegriChain and LinkedIn.

Job Description
- Perform a lead role in ETL testing, UI testing, DB testing, and team management.
- Understand the holistic requirements; review and analyse stories, specifications, and technical design documents; and develop detailed test cases and test data to ensure business functionality is thoroughly tested – both automated and manual.
- Validate ETL workflows, ensuring data integrity, accuracy, and transformation rules using complex Snowflake SQL queries. Working knowledge of DBT is a plus.
- Create, execute, and maintain automation scripts in BDD (Gherkin/Behave, Pytest).
- Experience in writing DB queries (preferably in Postgres/Snowflake/MySQL/RDS).
- Prepare, review, and update test cases and relevant test data consistent with system requirements, including functional, integration, regression, and UAT testing.
- Coordinate with cross-team subject matter experts to develop, maintain, and validate test scenarios in the best interest of the POD.
- Take ownership of creating and maintaining artifacts: test strategy, BRD, defect count/leakage reports, and other quality issues.
- Collaborate with the DevOps/SRE team to integrate test automation into CI/CD pipelines (Jenkins, Rundeck, GitHub, etc.).
- Oversee and guide a team of at least 4 testers, leading by example and institutionalizing best practices in testing processes and automation in an agile methodology.
- Meet with internal stakeholders to review current testing approaches, provide data-backed feedback on ways to improve, extend, and automate, and provide senior leadership with consolidated metrics.
- Maximize the opportunity to excel in an open and recognized work culture. Be a problem solver and a team player.

Qualifications
- 8-11 years of strong expertise in STLC, defect management, and test strategy design, planning, and approach.
- Experience with test requirement understanding, test data, and test plan and test case design.
- Minimum 6+ years of strong work experience in UI, database, and ETL testing.
- Experience in ETL/data warehouse testing (DBT/Informatica, Snowflake, SQL).
- Any experience with AWS/cloud-hosted applications is an added advantage.
- Hands-on experience in writing DB queries (preferably in Postgres/Snowflake/MySQL/RDS).
- 3+ years of experience with automation script execution, maintenance, and enhancement with Selenium WebDriver (v3+)/Playwright, with programming experience in Python (must) and BDD – Gherkin and Behave, Pytest (a minimal sketch of this style of test appears below).
- Key competencies required: strong analytical, problem-solving, communication, collaboration, accountability, and stakeholder-management skills; passion to drive initiatives; risk highlighting; and team-leading capability.
- Proven team leadership experience with a minimum of 2 direct reports.
- Experience working with Agile methodologies such as Scrum and Kanban.
- MS Power BI reporting; front-end vs. back-end validation.

Professional Approach
- Ready to work flexible hours and collaborate with US/India/Colombia teams.
- Excellent communication skills (written, verbal, listening, and articulation).

Additional Information
What does IntegriChain have to offer?
- Mission driven: work with the purpose of helping to improve patients' lives!
- Excellent and affordable medical benefits + non-medical perks, including flexible paid time off and much more!
- Robust learning & development opportunities, including over 700 development courses free to all employees.

IntegriChain is committed to equal treatment and opportunity in all aspects of recruitment, selection, and employment without regard to race, color, religion, national origin, ethnicity, age, sex, marital status, physical or mental disability, gender identity, sexual orientation, veteran or military status, or any other category protected under the law. IntegriChain is an equal opportunity employer committed to creating a community of inclusion and an environment free from discrimination, harassment, and retaliation.
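Illustrative only, not part of the posting: a minimal sketch of the kind of BDD-style ETL check this role describes, using Behave step definitions backed by a Snowflake SQL row-count comparison. It assumes the behave and snowflake-connector-python packages; the connection parameters, warehouse/database names, and table names (stg_orders, dw_orders) are hypothetical.

```python
# features/etl_rowcount.feature (Gherkin, shown here as a comment):
#   Feature: Staging-to-warehouse load
#     Scenario: Row counts match after the nightly load
#       Given the nightly ETL run has completed
#       When I compare row counts between "stg_orders" and "dw_orders"
#       Then the counts should be equal

# features/steps/etl_steps.py
import os
import snowflake.connector
from behave import given, when, then

def row_count(cursor, table: str) -> int:
    cursor.execute(f"SELECT COUNT(*) FROM {table}")  # simple integrity check
    return cursor.fetchone()[0]

@given("the nightly ETL run has completed")
def step_connect(context):
    # Credentials come from the environment; never hard-code them in tests.
    context.conn = snowflake.connector.connect(
        user=os.environ["SF_USER"],
        password=os.environ["SF_PASSWORD"],
        account=os.environ["SF_ACCOUNT"],
        warehouse="QA_WH",      # hypothetical warehouse
        database="ANALYTICS",   # hypothetical database
    )

@when('I compare row counts between "{source}" and "{target}"')
def step_compare(context, source, target):
    cur = context.conn.cursor()
    context.source_count = row_count(cur, source)
    context.target_count = row_count(cur, target)

@then("the counts should be equal")
def step_assert(context):
    assert context.source_count == context.target_count, (
        f"Row count mismatch: {context.source_count} vs {context.target_count}"
    )
```

Running `behave` from the project root would execute the scenario; in practice such checks are extended to checksums and column-level comparisons.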
Posted 1 week ago
1.0 - 2.0 years
4 - 7 Lacs
Gurugram, Bengaluru, Mumbai (All Areas)
Hybrid
Role & responsibilities:
- Develop and maintain data workflows using Ab Initio tools.
- Analyze data, troubleshoot issues, and resolve defects within data pipelines.
- Participate in Agile ceremonies including daily stand-ups, sprint planning, and reviews.
- Apply CI/CD practices using tools like Jenkins and work within Unix/Linux environments.
- Maintain documentation including metadata definitions, onboarding materials, and SOPs.
- Contribute to modernization and cloud-readiness efforts, particularly leveraging Azure and Databricks.

Preferred candidate profile:
- 1-2 years of hands-on experience in data engineering, ETL development, or data analytics.
- Experience with SQL and working knowledge of Unix/Linux systems.
- Familiarity with ETL tools such as Ab Initio.
- Ability to analyze, debug, and optimize data workflows.
- Exposure to CI/CD pipelines and version control practices.
- Strong analytical thinking, attention to detail, and documentation skills.
- Comfortable working in Agile development environments.
Posted 1 week ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Key Responsibilities
- Develop high-performance data visualizations, reports, and dashboards, and provide ad-hoc reporting.
- Strong proficiency in Power BI, DAX, Power Query, and Power BI Service.
- Familiarity with Power BI Service (publishing, workspaces, security roles).
- Strong SQL work experience and understanding (must have).
- Familiarity with data warehousing concepts and ETL processes.
- Good at data modelling; able to design, develop, deploy, and maintain solutions for extracting, transforming, and loading data.
- Demonstrate a proven ability to work in a fast-paced, agile environment.
- Be comfortable with initial ambiguity and offer solutions to lead a client down the right path.
- Support rapid prototyping and coordinate business requests and specifications with a Business Analytics Designer and customers.
- Conduct thorough testing and validation of reports and dashboards to ensure accuracy and reliability of the data presented.
- Continuous improvement: stay updated with the latest features and updates in Power BI.
- Good technical skills and hands-on experience in Informatica, Snowflake, and SQL/PLSQL.
Posted 1 week ago
7.5 years
0 Lacs
Bhubaneswar, Odisha, India
On-site
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Informatica Product 360 (PIM), No Technology Specialization
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Key Responsibilities:
- Test source code and debug code
- Write high-quality code to complete application development on schedule
- Strong analytical and problem-solving skills
- Specialist knowledge of operating systems, devices, applications and software
- Manage all production systems, recommend ways to optimize performance, provide solutions to problems and prepare reports for all problems
- Administer and resolve application issues, provide updates and perform root cause analysis

Technical Experience:
- Experience with Informatica MDM and PIM implementation
- In-depth understanding of PIM features and capabilities; propose solutions based on Informatica P360 platform capabilities
- Create end-to-end specifications for PIM solutions; lead project team members through the activities required to implement an Informatica PIM solution; make specific recommendations to bring the PIM implementation into conformance with best practices
- Informatica P360 PIM certification is recommended

Professional Attributes:
- Excellent communication and presentation skills
- A good team player/lead
- Strong articulation skills are mandatory

Educational Qualification: 15 years of full-time education
Posted 1 week ago
6.0 - 10.0 years
3 - 8 Lacs
Noida
Work from Office
Position: Snowflake - Senior Technical Lead
Experience: 8-11 years
Location: Noida/Bangalore
Education: B.E./B.Tech./MCA
Primary Skills: Snowflake, Snowpipe, SQL, Data Modelling, DV 2.0, Data Quality, AWS, Snowflake Security
Good to have Skills: Snowpark, Data Build Tool, Finance Domain

Preferred Skills
- Experience with Snowflake-specific features: Snowpipe, Streams & Tasks, Secure Data Sharing.
- Experience in data warehousing, with at least 2 years focused on Snowflake.
- Hands-on expertise in SQL, Snowflake scripting (JavaScript UDFs), and Snowflake administration.
- Proven experience with ETL/ELT tools (e.g., dbt, Informatica, Talend, Matillion) and orchestration frameworks.
- Deep knowledge of data modeling techniques (star schema, data vault) and performance tuning.
- Familiarity with data security, compliance requirements, and governance best practices.
- Experience in Python, Scala, or Java for Snowpark development.
- Strong understanding of cloud platforms (AWS, Azure, or GCP) and related services (S3, ADLS, IAM).

Key Responsibilities
- Define data partitioning, clustering, and micro-partition strategies to optimize performance and cost.
- Lead the implementation of ETL/ELT processes using Snowflake features (Streams, Tasks, Snowpipe).
- Automate schema migrations, deployments, and pipeline orchestration (e.g., with dbt, Airflow, or Matillion).
- Monitor query performance and resource utilization; tune warehouses, caching, and clustering.
- Implement workload isolation (multi-cluster warehouses, resource monitors) for concurrent workloads.
- Define and enforce role-based access control (RBAC), masking policies, and object tagging (see the sketch after this posting).
- Ensure data encryption, compliance (e.g., GDPR, HIPAA), and audit logging are correctly configured.
- Establish best practices for dimensional modeling, data vault architecture, and data quality.
- Create and maintain data dictionaries, lineage documentation, and governance standards.
- Partner with business analysts and data scientists to understand requirements and deliver analytics-ready datasets.
- Stay current with Snowflake feature releases (e.g., Snowpark, Native Apps) and propose adoption strategies.
- Contribute to the long-term data platform roadmap and cloud cost-optimization initiatives.
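Illustrative only: a minimal sketch of the RBAC-plus-masking responsibility above, applying a Snowflake column masking policy and a role grant from Python. Masking policies require Snowflake Enterprise Edition or higher; the role, table, and column names here are hypothetical.

```python
# Sketch: column-level masking and an RBAC grant in Snowflake via the Python connector.
import os
import snowflake.connector

DDL = [
    # Masking policy: only PII_READER sees raw emails; everyone else sees a redacted value.
    """
    CREATE MASKING POLICY IF NOT EXISTS email_mask AS (val STRING)
    RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() IN ('PII_READER') THEN val
           ELSE '***MASKED***' END
    """,
    # Attach the policy to a column (table/column names are hypothetical).
    "ALTER TABLE customers MODIFY COLUMN email SET MASKING POLICY email_mask",
    # RBAC: analysts can query the table but still see masked emails.
    "GRANT SELECT ON TABLE customers TO ROLE analyst",
]

with snowflake.connector.connect(
    user=os.environ["SF_USER"],
    password=os.environ["SF_PASSWORD"],
    account=os.environ["SF_ACCOUNT"],
    role="SECURITYADMIN",   # a role privileged to manage policies and grants
    database="ANALYTICS",
    schema="PUBLIC",
) as conn:
    cur = conn.cursor()
    for stmt in DDL:
        cur.execute(stmt)
```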
Posted 1 week ago
8.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
About Analytix: Businesses of all sizes are faced with a rapidly changing competitive environment. Companies that possess both the ability to successfully navigate obstacles and the agility to react to market conditions are better positioned for long-term success. Analytix Solutions helps your company tackle these types of challenges. We empower business owners to confidently make informed decisions and positively impact profitability. We are a single-source provider of integrated solutions across multiple functional areas and disciplines. Through a combination of cross-disciplinary expertise, technological aptitude, and deep domain experience, we support our clients with efficient systems and processes, reliable data, and industry insights. We are your partner in strategically scaling your business for growth.
Website: analytix.com (small business accounting and bookkeeping, medical billing, audio visual, and IT outsourcing services)
LinkedIn: Analytix Solutions; Analytix Business Solutions (India) Pvt. Ltd.

Job Description:
- Design, build, and maintain scalable, high-performance data pipelines and ETL/ELT processes across diverse database platforms (see the sketch after this posting).
- Architect and optimize data storage solutions to ensure reliability, security, and scalability.
- Leverage generative AI tools and models to enhance data engineering workflows, drive automation, and improve insight generation.
- Create and maintain comprehensive documentation for data systems, workflows, and models.
- Implement data modeling best practices and optimize data retrieval processes for better performance.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 5-8 years of experience in data engineering, designing and managing large-scale data systems.
- Strong expertise in database technologies, including SQL databases (PostgreSQL, MySQL, SQL Server), NoSQL databases (MongoDB, Cassandra), and data warehouse/unified platforms (Snowflake, Redshift, BigQuery, Microsoft Fabric).
- Proficiency in Python and SQL, with experience in data processing frameworks (e.g., Pandas, PySpark).
- Experience with ETL tools (e.g., Apache Airflow, MS Fabric, Informatica, Talend) and data pipeline orchestration platforms.
- Strong understanding of data architecture, data modeling, and data governance principles.
- Experience with cloud platforms (preferably Azure) and associated data services.
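Illustrative only: a minimal PySpark ELT sketch of the pipeline work described above, ingesting a CSV, applying basic quality rules, and publishing partitioned Parquet. The landing/curated paths, column names, and quality rules are hypothetical.

```python
# Minimal PySpark ELT sketch: ingest a CSV, clean it, and publish partitioned Parquet.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_pipeline").getOrCreate()

raw = spark.read.csv(
    "/landing/orders/*.csv",   # hypothetical landing path
    header=True,
    inferSchema=True,
)

clean = (
    raw.dropDuplicates(["order_id"])        # de-duplicate on the business key
       .filter(F.col("order_total") >= 0)   # basic quality rule
       .withColumn("load_date", F.current_date())
)

(
    clean.write
    .mode("overwrite")
    .partitionBy("load_date")               # partition for cheap incremental reads
    .parquet("/curated/orders")             # hypothetical curated zone
)
```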
Posted 1 week ago
2.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Data Officer
Job #: req33197
Organization: World Bank
Sector: Information Technology
Grade: GF
Term Duration: 2 years 0 months
Recruitment Type: Local Recruitment
Location: Chennai, India
Required Language(s): English
Preferred Language(s):
Closing Date: 6/10/2025 (MM/DD/YYYY) at 11:59pm UTC

Description
Do you want to build a career that is truly worthwhile? Working at the World Bank Group provides a unique opportunity for you to help our clients solve their greatest development challenges. The World Bank Group is one of the largest sources of funding and knowledge for developing countries; a unique global partnership of five institutions dedicated to ending extreme poverty, increasing shared prosperity and promoting sustainable development. With 189 member countries and more than 130 offices worldwide, we work with public and private sector partners, investing in groundbreaking projects and using data, research, and technology to develop solutions to the most urgent global challenges. For more information, visit www.worldbank.org

ITS Vice Presidency Context
The Information and Technology Solutions (ITS) Vice Presidential Unit (VPU) enables the World Bank Group to achieve its mission of ending extreme poverty and boosting shared prosperity on a livable planet by delivering transformative information and technologies to its staff working in over 150 locations. For more information on ITS, see this video: https://www.youtube.com/watch?reload=9&v=VTFGffa1Y7w

ITS shapes its strategy in response to changing business priorities and leverages new technologies to achieve three high-level business outcomes: business enablement, by providing Bank Group units with innovative digital tools and technologies to transform how they deliver value for their clients; empowerment & effectiveness, by ensuring that all Bank Group staff are connected, able to find information, and productive to accelerate the delivery of development solutions globally; and resilience, by equipping the Bank Group to provide risk-based cybersecurity and robust data protection for a global network and a growing cloud platform. Implementation of the strategy is guided by three core principles. The first is to deliver solutions for business partners that are customer-centric, innovative, and transformative. The second is to provide the Bank Group with value for money with selective and standard technologies. The third principle is to excel at the basics by providing a high-performing, robust, and resilient IT environment for the organization.

As a unit within the WB Data, Operations and Technology office (ITSDO/ITSOC), the Data and Analytics unit (ITSDA) provides state-of-the-art information and technology applications – data, information management, and AI solutions – to support the operations of the World Bank Group, ensuring that systems meet the business needs of users and external clients who manage business processes for stakeholders across the World Bank. The current technology landscape encompasses cloud-based data platforms (Azure and AWS), Oracle, SQL Server, Business Objects, Tableau, Cisco Information Server (Composite), SAP BW/HANA, Informatica, .NET, HTML5, CSS frameworks, SharePoint, and many others. Our plan is to migrate our on-prem data repositories and re-engineer them based on new cloud architectures in the coming years.

Responsibilities
- Perform data analysis and create reusable assets for our Data & Analytics Portal, i.e., dashboards, data visualizations and reports, including ad-hoc requests from clients.
- Analyze large datasets to identify trends, patterns, and insights, utilizing tools and techniques to understand patterns; quickly grasp business insights and navigate the data structures to assess issues.
- Reverse engineer reports, dashboards, and applications through the medallion architecture to understand the business logic and document it.
- Work with cross-functional teams to understand data needs and provide analytical support.
- Develop solutions based on data analysis to address business challenges and identify opportunities for process improvements through data insights.
- Document data processes, methodologies, and findings for future reference and maintain clear records of data sources and analysis methods.
- Identify and categorize source data (where the data originates) and establish a clear mapping between source and target fields.
- Analyze how changes will affect existing processes or systems and identify stakeholders impacted by data migration or integration.
- Develop validation checks to ensure data integrity post-migration and conduct testing to confirm that the target meets requirements (a minimal sketch of such a check follows this posting).
- Maintain comprehensive documentation of the analysis process and record decisions made, issues encountered, and resolutions.
- Work closely with data engineers to understand the target structures and design a semantic layer conducive to analytics.
- Work closely with the data governance team and business stakeholders to document data element metadata and report metadata.
- Compare source and target data structures to identify discrepancies and assess data quality issues, such as duplicates, missing values, or inconsistencies.
- Develop test plans, test scripts, and automation procedures to test data and report quality.
- Contribute to, develop, and maintain the Enterprise Data Model.
- Actively seek knowledge needed to complete assignments and share knowledge with others, communicating and presenting information in a clear and organized manner.
- Develop, maintain, and support AI/ML models in support of various analytical and data science needs.

Selection Criteria
- Master's degree with 5 years' experience OR equivalent combination of education and experience in a relevant discipline such as Computer Science.
- Minimum 3 years of experience in each of the following areas: (i) SQL, Python, R, or any programming language; (ii) reports and dashboards; (iii) building analytics and troubleshooting issues; (iv) data analysis.
- Ability to understand business requirements, decode them into data needs, correlate them with business processes, and develop reporting, data requirements, data models, etc.
- Excellent and proven skills in data modelling and data integration, and understanding of different ways of designing data schemas.
- Hands-on experience with cloud platforms covering the Power BI platform and Tableau, and with on-prem Business Objects.
- Hands-on experience in building a semantic layer encompassing complex logic to accommodate reporting requirements.
- Good understanding of SAP BW/SAP HANA structures and decoding of business rules to migrate to modern cloud platforms.
- Knowledge of advanced SQL programming to perform complex operations on large amounts of data stored in data warehouses or lakes.
- Strong understanding of row- and column-level security in the semantic layer to facilitate smooth reporting.
- Work with application team leads to refine and tighten the security framework and access control for internal and external data access points.
- Ability and flexibility to learn and adapt to a spectrum of data technologies running on multiple platforms, primarily semantic layer modelling, report building, APIs, and dashboards.
- Knowledge of building data warehouse applications in a hybrid environment, both on-cloud and on-prem, and ability to keep up to date with cloud offerings and solutions in a global delivery environment.
- Ability to participate and collaborate within and across teams in developing options, roadmaps, evaluations, and decision frameworks for complex enterprise solutions.
- Deep experience in implementing and maintaining tools such as Informatica Intelligent Cloud Services (IICS), Tableau, Tibco Data Virtualization, Collibra, Informatica MDM, Databricks, NoSQL databases, PostgreSQL, and Azure technologies is preferable.
- Experience working on AI/ML and data science models is preferred.
- Proven experience in evaluating best-of-breed tools in Data & Analytics and working closely with the leadership team on pros and cons is preferred.
- Actively seeks knowledge needed to complete assignments and shares knowledge with others, communicating and presenting information in a clear and organized manner.
- Proven experience working and navigating in teams with an offshore/onsite model and collaborating across teams to build and maintain complex IT landscapes and diverse client bases.
- Experience in finance, human resources, resource management, loans, and travel is preferred.
- Experience in writing unit/integration tests, working in an agile, iterative approach towards building products, and documenting work.
- Ability to deliver information effectively in support of a team or workgroup.
- Excellent communication, writing/documentation, and facilitation skills.
- Ability to juggle multiple tasks in a fast-paced environment, and the maturity to participate in multiple complex programs at the same time in an agile environment.

World Bank Group Core Competencies
The World Bank Group offers comprehensive benefits, including a retirement plan; medical, life and disability insurance; and paid leave, including parental leave, as well as reasonable accommodations for individuals with disabilities. We are proud to be an equal opportunity and inclusive employer with a dedicated and committed workforce, and do not discriminate based on gender, gender identity, religion, race, ethnicity, sexual orientation, or disability. Learn more about working at the World Bank and IFC, including our values and inspiring stories.
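Illustrative only: a minimal pandas sketch of the post-migration validation checks named in the responsibilities above (row counts, duplicates, missing values, lost keys). The file names and key column are hypothetical.

```python
# Minimal post-migration validation sketch with pandas: compare a source extract
# with its target counterpart for row counts, duplicates, and missing values.
import pandas as pd

def reconcile(source: pd.DataFrame, target: pd.DataFrame, key: str) -> dict:
    """Return a small report of common migration data-quality issues."""
    return {
        "row_count_delta": len(target) - len(source),
        "duplicate_keys_in_target": int(target[key].duplicated().sum()),
        "null_keys_in_target": int(target[key].isna().sum()),
        # Keys present in the source but lost during migration:
        "missing_keys": sorted(set(source[key]) - set(target[key])),
    }

if __name__ == "__main__":
    src = pd.read_csv("source_customers.csv")   # hypothetical extracts
    tgt = pd.read_csv("target_customers.csv")
    report = reconcile(src, tgt, key="customer_id")
    assert not report["missing_keys"], f"Data lost in migration: {report}"
    print(report)
```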
Posted 1 week ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Summary
We are looking for an experienced SAP HANA / SQL Developer to design, develop, and implement high-performance data solutions for analytics and reporting. The ideal candidate will have strong expertise in SQL programming, SAP HANA development, and ETL tools such as Azure Data Factory, SAP BODS, Informatica, or SSIS. The role involves working on data warehousing, reporting applications, and end-to-end data integration projects.

Key Responsibilities
- Analyze, plan, design, and implement SAP HANA solutions based on business requirements.
- Develop complex SQL queries and stored procedures, and manage data loads effectively (see the sketch after this posting).
- Translate functional requirements into technical design documents.
- Troubleshoot and resolve issues related to SQL models, views, and indexes.
- Perform effort estimation for development and enhancement activities.
- Collaborate with cross-functional teams including QA, SMEs, and infrastructure teams to ensure successful delivery.
- Ensure high standards of usability, performance, reliability, and data security.
- Create internal documentation and end-user training materials as needed.
- Follow best practices and contribute to coding standards and continuous improvement.

Must-Have Skills
- Strong experience in SAP HANA development (views, stored procedures, indexes, performance tuning).
- Proficiency in SQL programming with complex data models and stored procedures.
- Hands-on experience with at least one ETL tool: Azure Data Factory, SAP BODS, Informatica, or SSIS.
- Solid understanding of data warehousing, analytics, and reporting applications.

Good To Have
- Experience with Azure SQL or other cloud technologies.
- Background in data security and access control within enterprise applications.

Soft Skills
- Strong written and verbal communication skills.
- Analytical thinker with excellent problem-solving abilities.
- Self-motivated, proactive, and able to take ownership of deliverables.
- Team-oriented and collaborative mindset.

Skills: SAP HANA, SAP HANA development, SQL programming, HANA, technical documentation, communication skills, ETL development, Informatica, SQL, performance tuning, Azure Data Factory, SSIS, data models, problem-solving, BODS, SAP, data warehousing, SAP BODS, analytical skills, ETL, Azure
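Illustrative only: a minimal sketch of querying SAP HANA from Python using SAP's hdbcli DB-API driver, the kind of parameterized SQL work this role centres on. The host, port, credentials, and SALES.ORDERS table are hypothetical; the SQL port depends on the HANA instance.

```python
# Minimal SAP HANA query sketch using SAP's hdbcli DB-API driver.
from hdbcli import dbapi

conn = dbapi.connect(
    address="hana-host.example.com",   # hypothetical host
    port=30015,                        # typical SQL port; instance-dependent
    user="REPORT_USER",
    password="secret",
)
try:
    cur = conn.cursor()
    # Parameterized query (hdbcli uses '?' placeholders) to avoid SQL injection.
    cur.execute(
        "SELECT region, SUM(net_amount) AS total "
        "FROM SALES.ORDERS WHERE order_date >= ? "
        "GROUP BY region ORDER BY total DESC",
        ("2025-01-01",),
    )
    for region, total in cur.fetchall():
        print(region, total)
finally:
    conn.close()
```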
Posted 1 week ago
6.0 - 11.0 years
18 - 25 Lacs
Hyderabad
Work from Office
Position: Informatica Developer
Location: India
Experience: 6+ Years

About the Role:
We are looking for an experienced Informatica Developer (IDMC) to design, build, and manage cloud-based data integration solutions. This role focuses on managing batch processes and ensuring seamless data exchange across systems.

Key Responsibilities:
- Design and implement data integration workflows using Informatica Intelligent Cloud Services (IDMC).
- Develop and maintain Salesforce-centric data pipelines, including data extraction, transformation, and loading (ETL); a rough sketch of the extraction step follows this posting.
- Manage and optimize batch processes to handle large volumes of Salesforce data.
- Ensure accurate and efficient data movement between Salesforce and other applications/databases.
- Identify and resolve integration-related issues and optimize performance.
- Collaborate with business and technical teams to gather integration requirements.
- Maintain detailed technical documentation for integration components and processes.

Key Skills & Experience:
- Hands-on experience in Informatica Cloud (IDMC) development.
- Mandatory experience in Salesforce data integration and understanding of Salesforce objects, the data model, API usage, and best practices.
- Proficiency in designing cloud data integration workflows and batch processes.
- Good understanding of various data sources and cloud databases, APIs, and flat files.
- Strong analytical, debugging, and troubleshooting abilities.
- Ability to work both independently and within cross-functional teams.
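Illustrative only: IDMC pipelines are configured in Informatica's UI rather than written as code, but the Salesforce extraction step described above looks roughly like this when sketched in plain Python with the simple-salesforce library (a swapped-in technique for illustration, not the posting's toolchain). Credentials and the SOQL query are hypothetical.

```python
# Rough sketch of a Salesforce extract step (normally configured in IDMC, not coded).
import os
from simple_salesforce import Salesforce

sf = Salesforce(
    username=os.environ["SF_USERNAME"],
    password=os.environ["SF_PASSWORD"],
    security_token=os.environ["SF_TOKEN"],
)

# SOQL query against the Account object; query_all pages through large result sets.
records = sf.query_all(
    "SELECT Id, Name, Industry, AnnualRevenue FROM Account "
    "WHERE LastModifiedDate = TODAY"
)["records"]

for rec in records:
    # Each record is a dict with an 'attributes' key plus the selected fields;
    # downstream, a transform/load step would map these to target tables.
    print(rec["Id"], rec["Name"], rec.get("Industry"))
```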
Posted 1 week ago
5.0 years
0 Lacs
Gurugram, Haryana, India
On-site
This role is for one of Weekday's clients
Salary range: Rs 2200000 - Rs 2400000 (i.e., INR 22-24 LPA)
Min Experience: 5 years
Location: Pune, Bengaluru, Chennai, Kolkata, Gurgaon
JobType: full-time

We are looking for an experienced Snowflake Developer to join our Data Engineering team. The ideal candidate will possess a deep understanding of Data Warehousing, SQL, ETL tools like Informatica, and visualization platforms such as Power BI. This role involves building scalable data pipelines, optimizing data architectures, and collaborating with cross-functional teams to deliver impactful data solutions.

Key Responsibilities
- Data Engineering & Warehousing: Leverage over 5 years of hands-on experience in Data Engineering with a focus on Data Warehousing and Business Intelligence.
- Pipeline Development: Design and maintain ELT pipelines using Snowflake, Fivetran, and DBT to ingest and transform data from multiple sources.
- SQL Development: Write and optimize complex SQL queries and stored procedures to support robust data transformations and analytics.
- Data Modeling & ELT: Implement advanced data modeling practices including SCD Type-2 (a minimal sketch follows this posting), and build high-performance ELT workflows using DBT.
- Requirement Analysis: Partner with business stakeholders to capture data needs and convert them into scalable technical solutions.
- Data Quality & Troubleshooting: Conduct root cause analysis on data issues, maintain high data integrity, and ensure reliability across systems.
- Collaboration & Documentation: Collaborate with engineering and business teams; develop and maintain thorough documentation for pipelines, data models, and processes.

Skills & Qualifications
- Expertise in Snowflake for large-scale data warehousing and ELT operations.
- Strong SQL skills with the ability to create and manage complex queries and procedures.
- Proven experience with Informatica PowerCenter for ETL development.
- Proficiency with Power BI for data visualization and reporting.
- Hands-on experience with Fivetran for automated data integration.
- Familiarity with DBT, Sigma Computing, Tableau, and Oracle.
- Solid understanding of data analysis, requirement gathering, and source-to-target mapping.
- Knowledge of cloud ecosystems such as Azure (including ADF, Databricks); experience with AWS or GCP is a plus.
- Experience with workflow orchestration tools like Airflow, Azkaban, or Luigi.
- Proficiency in Python for scripting and data processing (Java or Scala is a plus).
- Bachelor's or Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or a related field.

Key Tools & Technologies
Snowflake, SnowSQL, Snowpark, SQL, Informatica, Power BI, DBT, Python, Fivetran, Sigma Computing, Tableau, Airflow, Azkaban, Azure, Databricks, ADF
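Illustrative only: a minimal SCD Type-2 sketch against Snowflake, expiring changed dimension rows and inserting new current versions. The staging/dimension table and column names (stg_customers, dim_customers) are hypothetical; in a dbt workflow the same pattern is usually expressed as a snapshot.

```python
# Minimal SCD Type-2 sketch: expire changed rows, then insert new versions.
import os
import snowflake.connector

EXPIRE = """
UPDATE dim_customers d
SET    valid_to = CURRENT_TIMESTAMP(), is_current = FALSE
FROM   stg_customers s
WHERE  d.customer_id = s.customer_id
AND    d.is_current
AND    (d.name <> s.name OR d.segment <> s.segment)   -- tracked attributes changed
"""

INSERT = """
INSERT INTO dim_customers (customer_id, name, segment, valid_from, valid_to, is_current)
SELECT s.customer_id, s.name, s.segment, CURRENT_TIMESTAMP(), NULL, TRUE
FROM   stg_customers s
LEFT JOIN dim_customers d
       ON d.customer_id = s.customer_id AND d.is_current
WHERE  d.customer_id IS NULL                          -- brand-new or just-expired keys
"""

with snowflake.connector.connect(
    user=os.environ["SF_USER"],
    password=os.environ["SF_PASSWORD"],
    account=os.environ["SF_ACCOUNT"],
    database="ANALYTICS",
    schema="MARTS",
) as conn:
    cur = conn.cursor()
    cur.execute(EXPIRE)   # step 1: close out rows whose attributes changed
    cur.execute(INSERT)   # step 2: open a new current row for new/changed keys
    conn.commit()
```

After the expire step, changed keys have no current row, so the insert's anti-join opens a fresh version for both new and changed keys while leaving unchanged history intact.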
Posted 1 week ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
This listing repeats the Weekday client Snowflake Developer posting above verbatim; see that entry for the full description, responsibilities, and qualifications.
Posted 1 week ago
5.0 years
0 Lacs
Greater Kolkata Area
On-site
This listing repeats the Weekday client Snowflake Developer posting above verbatim; see that entry for the full description, responsibilities, and qualifications.
Posted 1 week ago
5.0 - 10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description:
- 5 to 10 years of work experience.
- Strong understanding of data warehousing principles and data modelling.
- Experience with AI/ML Ops – model build through the implementation lifecycle in an AWS Cloud environment and/or Snowflake.
- Expert in SQL, including knowledge of advanced query optimization techniques; build queries and data visualizations to support business use cases/analytics.
- Proven experience with software tools including PySpark and Python, Power BI, QuickSight, and core AWS tools such as Lambda, RDS, CloudWatch, CloudTrail, SNS, SQS, etc.
- Hands-on experience with Snowflake, Snowsight, Snowpark, Snowpipe, and SnowSQL (see the sketch after this posting).
- Experience building services/APIs in an AWS Cloud environment.
- Data ingestion and curation, as well as implementation of data pipelines.
- Experience working with and leading vendor partner (on- and off-shore) resources.
- Experience in Informatica/ETL technology.
- Experience in DevOps and microservices would be preferred.
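Illustrative only: a minimal Snowpark sketch showing a transformation pushed down into Snowflake's engine rather than pulling data out, the style of work the Snowpark requirement above implies. The connection values, warehouse, and table names are hypothetical.

```python
# Minimal Snowpark sketch: run an aggregation inside Snowflake via lazy dataframes.
import os
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_

session = Session.builder.configs({
    "account": os.environ["SF_ACCOUNT"],
    "user": os.environ["SF_USER"],
    "password": os.environ["SF_PASSWORD"],
    "warehouse": "ANALYTICS_WH",   # hypothetical warehouse
    "database": "RAW",
    "schema": "SALES",
}).create()

# Lazily-evaluated dataframe: the work executes inside Snowflake, not locally.
orders = session.table("ORDERS")
region_sales = (
    orders.filter(col("STATUS") == "SHIPPED")
          .group_by("REGION")
          .agg(sum_(col("AMOUNT")).alias("TOTAL_AMOUNT"))
)
region_sales.write.mode("overwrite").save_as_table("ANALYTICS.PUBLIC.REGION_SALES")
session.close()
```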
Posted 1 week ago
8.0 - 11.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Company Description
About Sopra Steria
Sopra Steria, a major Tech player in Europe with 50,000 employees in nearly 30 countries, is recognised for its consulting, digital services and solutions. It helps its clients drive their digital transformation and obtain tangible and sustainable benefits. The Group provides end-to-end solutions to make large companies and organisations more competitive by combining in-depth knowledge of a wide range of business sectors and innovative technologies with a collaborative approach. Sopra Steria places people at the heart of everything it does and is committed to putting digital to work for its clients in order to build a positive future for all. In 2024, the Group generated revenues of €5.8 billion.

Job Description
The world is how we shape it.
Position: Snowflake - Senior Technical Lead
Experience: 8-11 years
Location: Noida/Bangalore
Education: B.E./B.Tech./MCA
Primary Skills: Snowflake, Snowpipe, SQL, Data Modelling, DV 2.0, Data Quality, AWS, Snowflake Security
Good to have Skills: Snowpark, Data Build Tool, Finance Domain

Preferred Skills
- Experience with Snowflake-specific features: Snowpipe, Streams & Tasks, Secure Data Sharing.
- Experience in data warehousing, with at least 2 years focused on Snowflake.
- Hands-on expertise in SQL, Snowflake scripting (JavaScript UDFs), and Snowflake administration.
- Proven experience with ETL/ELT tools (e.g., dbt, Informatica, Talend, Matillion) and orchestration frameworks.
- Deep knowledge of data modeling techniques (star schema, data vault) and performance tuning.
- Familiarity with data security, compliance requirements, and governance best practices.
- Experience in Python, Scala, or Java for Snowpark development.
- Strong understanding of cloud platforms (AWS, Azure, or GCP) and related services (S3, ADLS, IAM).

Key Responsibilities
- Define data partitioning, clustering, and micro-partition strategies to optimize performance and cost.
- Lead the implementation of ETL/ELT processes using Snowflake features (Streams, Tasks, Snowpipe); see the sketch after this posting.
- Automate schema migrations, deployments, and pipeline orchestration (e.g., with dbt, Airflow, or Matillion).
- Monitor query performance and resource utilization; tune warehouses, caching, and clustering.
- Implement workload isolation (multi-cluster warehouses, resource monitors) for concurrent workloads.
- Define and enforce role-based access control (RBAC), masking policies, and object tagging.
- Ensure data encryption, compliance (e.g., GDPR, HIPAA), and audit logging are correctly configured.
- Establish best practices for dimensional modeling, data vault architecture, and data quality.
- Create and maintain data dictionaries, lineage documentation, and governance standards.
- Partner with business analysts and data scientists to understand requirements and deliver analytics-ready datasets.
- Stay current with Snowflake feature releases (e.g., Snowpark, Native Apps) and propose adoption strategies.
- Contribute to the long-term data platform roadmap and cloud cost-optimization initiatives.

Qualifications
BTech/MCA

Additional Information
At our organization, we are committed to fighting against all forms of discrimination. We foster a work environment that is inclusive and respectful of all differences. All of our positions are open to people with disabilities.
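Illustrative only: a minimal Streams & Tasks sketch of the change-data-capture responsibility above, capturing changes on a raw table and draining them into a curated table on a schedule. The warehouse and object names are hypothetical.

```python
# Minimal Snowflake Streams & Tasks sketch: CDC on a raw table, applied on a schedule.
import os
import snowflake.connector

STATEMENTS = [
    # A stream records inserts/updates/deletes on the raw table (change data capture).
    "CREATE STREAM IF NOT EXISTS raw_orders_stream ON TABLE raw_orders",
    # A task drains the stream every 5 minutes, but only when the stream has data.
    """
    CREATE TASK IF NOT EXISTS load_curated_orders
      WAREHOUSE = ETL_WH
      SCHEDULE = '5 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('RAW_ORDERS_STREAM')
    AS
      INSERT INTO curated_orders
      SELECT order_id, customer_id, amount, order_ts
      FROM raw_orders_stream
      WHERE METADATA$ACTION = 'INSERT'
    """,
    # Tasks are created suspended; resume to start the schedule.
    "ALTER TASK load_curated_orders RESUME",
]

with snowflake.connector.connect(
    user=os.environ["SF_USER"],
    password=os.environ["SF_PASSWORD"],
    account=os.environ["SF_ACCOUNT"],
    database="ANALYTICS",
    schema="PUBLIC",
) as conn:
    cur = conn.cursor()
    for stmt in STATEMENTS:
        cur.execute(stmt)
```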
Posted 1 week ago
4.0 years
0 Lacs
Bangalore Urban, Karnataka, India
Remote
Build Your Career at Informatica
We seek innovative thinkers who believe in the power of data to drive meaningful change. At Informatica, we welcome adventurous, work-from-anywhere minds eager to tackle the world's most complex challenges. Our employees are empowered to push their bold ideas forward, and we are united by a shared passion for using data to do the extraordinary for each other and the world.

Solution Architect - Bangalore
We're looking for a Solution Architect candidate with experience in pre-sales/technical sales or consulting with Data Management, MDM, or Data Governance to join our team in Bangalore (hybrid). You will report to the Technical Sales Manager.

As a Solution Architect, you will engage with internal collaborators, customers, and partners to develop technical account plans/strategy and advance sales engagements to technical closure. As an important member of the tech sales team, you will ensure success by capturing customer use cases and designing complex technical solutions on top of Informatica Data Management Cloud. You will educate customers and team members on the Informatica value proposition and participate in deep architectural discussions to ensure solutions are designed for successful deployment and use in the cloud. Within our technical sales community you will be an active, constant learner who stays up to date with industry trends. You will mentor others on the team and contribute to the development of shared selling assets, knowledge stores, and enablement materials. To ensure sustained revenue growth you will contribute to pipeline generation by delivering technical workshops, being active in social media promotions, developing content for external circulation, and participating in industry marketing events.

Your Role Responsibilities? Here's What You'll Do
- Manage customer engagements independently.
- Share best practices, content, and tips and tricks within primary responsibilities.
- Stay current on certification of services required for responsibilities.
- Perform activities leading up to the delivery of a customer demo with little assistance, including discovery, technical qualification/fit, customer presentations, standard demos, and related customer-facing communication.
- Lead on RFP responses and POCs.
- Create customized demos.
- Partner with the CSM team on nurture activities including technical advisory, workshops, etc.
- Provide customer feedback on product gaps.
- Support demos at marketing events.
- Conduct technical workshops with customers.

What We'd Like to See
- Basic certification on at least one cloud ecosystem and a data-related cloud technology.
- Intermediate knowledge of security for cloud computing.
- Advanced-level skills in Informatica services and product capabilities in the respective major or area of focus.
- Storytelling and presentation skills specific to use cases.
- Ability to engage and create relationships with influencers, coaches, decision makers, and partners.
- Intermediate technical knowledge of hybrid deployment of software solutions, data warehousing, database, or business intelligence software concepts and products.

Role Essentials
- 4+ years of relevant experience in Data Management, MDM, or Data Governance
- 5 years of prior presales/technical sales or consulting experience
- BA/BS or equivalent educational background
- This is a hybrid remote/in-office role.
Perks & Benefits
- Comprehensive health, vision, and wellness benefits (paid parental leave, adoption benefits, life insurance, disability insurance, and 401k plan or international pension/retirement plans)
- Flexible time-off policy and hybrid working practices
- Equity opportunities and an employee stock purchase program (ESPP)
- Comprehensive Mental Health and Employee Assistance Program (EAP) benefit

Our DATA values are our north star, and we are passionate about building and delivering solutions that accelerate data innovations. At Informatica, our employees are our greatest competitive advantage. So, if your experience aligns but doesn't exactly match every qualification, apply anyway. You may be exactly who we need to fuel our future with innovative ideas and a thriving culture.

Informatica (NYSE: INFA), a leader in enterprise AI-powered cloud data management, brings data and AI to life by empowering businesses to realize the transformative power of their most critical assets. We pioneered the Informatica Intelligent Data Management Cloud™ that manages data across any multi-cloud, hybrid system, democratizing data to advance business strategies. Customers in approximately 100 countries and more than 80 of the Fortune 100 rely on Informatica. www.informatica.com. Connect with LinkedIn, X, and Facebook. Informatica. Where data and AI come to life.™
Posted 1 week ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
About Company: Our client is a prominent Indian multinational corporation specializing in information technology (IT), consulting, and business process services, headquartered in Bengaluru, with gross revenue of ₹222.1 billion and a global workforce of 234,054. It is listed on NASDAQ and operates in over 60 countries, serving clients across various industries, including financial services, healthcare, manufacturing, retail, and telecommunications. The company consolidated its cloud, data, analytics, AI, and related businesses under the tech services business line. It has major delivery centers in India, including Chennai, Pune, Hyderabad, Bengaluru, Kochi, Kolkata, and Noida.

· Job Title: Informatica IDMC MDM
· Location: Pune
· Experience: 8+ yrs
· Job Type: Contract to hire
· Notice Period: Immediate joiners

Mandatory Skills: IDMC MDM

IDMC MDM Engineer JD:
• Configure and maintain the Informatica IDMC MDM 360 SaaS platform to support core Master Data Management (MDM) capabilities including business entity modeling, match/merge rules, survivorship, and data quality.
• Collaborate with business and data stakeholders to define master data structure and mapping, hierarchies, and data governance rules aligned with organizational needs.
• Implement and tune match & merge rules, leveraging the IDMC platform's native matching capabilities and survivorship strategies for optimal entity resolution.
• Design and configure Business Entity Services, workflows, and role-based user interfaces (UI) using Informatica's cloud-native application composer.
• Develop data onboarding strategies and configure source system integration pipelines using Informatica Cloud Data Integration (CDI) and API-based ingestion.
• Ensure platform scalability and performance through environment management, metadata configuration, and ongoing optimization in a multi-domain, cloud-native environment.
Posted 1 week ago
0.0 - 5.0 years
0 Lacs
Hyderabad, Telangana
On-site
Experience: 5-10 years

JD
Mandatory skillset: Snowflake, DBT, and data architecture design experience in a data warehouse. Informatica or other ETL knowledge or hands-on experience is good to have, as is an understanding of Databricks. 5-10 years of IT experience, with 2+ years of data architecture experience in a data warehouse and 3+ years in Snowflake.

Responsibilities
- Design, implement, and manage cloud-based solutions on AWS and Snowflake.
- Work with stakeholders to gather requirements and design solutions that meet their needs.
- Develop and execute test plans for new solutions.
- Oversee and design the information architecture for the data warehouse, including all information structures such as the staging area, data warehouse, data marts, and operational data stores.
- Optimize Snowflake configurations and data pipelines to improve performance, scalability, and overall efficiency.
- Deep understanding of data warehousing, enterprise architectures, dimensional modeling, star & snowflake schema design, reference DW architectures, ETL architecture, ETL (extract, transform, load), data analysis, data conversion, transformation, database design, data warehouse optimization, data mart development, and enterprise data warehouse maintenance and support.
- Significant experience working as a Data Architect with depth in data integration and data architecture for enterprise data warehouse implementations (conceptual, logical, physical & dimensional models).
- Maintain documentation: develop and maintain detailed documentation for data solutions and processes.
- Provide training: offer training and leadership to share expertise and best practices with the team.
- Collaborate with and provide leadership to the data engineering team, ensuring that data solutions are developed according to best practices.

Job Type: Full-time
Pay: From ₹1,500,000.00 per year
Location Type: In-person
Schedule: Monday to Friday
Ability to commute/relocate: Hyderabad, Telangana: Reliably commute or planning to relocate before starting work (Required)

Application Question(s):
- What is your notice period?
- How many years of experience do you have in Snowflake?
- How many years of data architecture experience do you have in a data warehouse?
- What is your current location?
- Are you OK with working from the office at the Hyderabad location?
- What is your current CTC?
- What is your expected CTC?

Experience: total work: 5 years (Required)
Work Location: In person
Posted 1 week ago
0.0 - 10.0 years
0 Lacs
Pune, Maharashtra
On-site
You deserve to do what you love, and love what you do – a career that works as hard for you as you do. At Fiserv, we are more than 40,000 #FiservProud innovators delivering superior value for our clients through leading technology, targeted innovation and excellence in everything we do. You have choices – if you strive to be a part of a team driven to create with purpose, now is your chance to Find your Forward with Fiserv.

Responsibilities
Requisition ID: R-10356383
Date posted: 06/10/2025
End Date: 06/20/2025
City: Pune
State/Region: Maharashtra
Country: India
Location Type: Onsite

Calling all innovators – find your future at Fiserv. We're Fiserv, a global leader in Fintech and payments, and we move money and information in a way that moves the world. We connect financial institutions, corporations, merchants, and consumers to one another millions of times a day – quickly, reliably, and securely. Any time you swipe your credit card, pay through a mobile app, or withdraw money from the bank, we're involved. If you want to make an impact on a global scale, come make a difference at Fiserv.

Job Title: Professional, Systems Engineering

What does a successful Snowflake Advisor do?
We are seeking a highly skilled and experienced Snowflake Advisor to take ownership of our data warehousing strategy, implementation, maintenance and support. In this role, you will design, develop, and lead the adoption of Snowflake-based solutions to ensure scalable, efficient, and secure data systems that empower our business analytics and decision-making processes. As a Snowflake Advisor, you will collaborate with cross-functional teams, lead data initiatives, and act as the subject matter expert for Snowflake across the organization.

What you will do:
- Define and implement best practices for data modelling, schema design, and query optimization in Snowflake.
- Develop and manage ETL/ELT workflows to ingest, transform and load data into Snowflake from various sources.
- Integrate data from diverse systems like databases, APIs, flat files, cloud storage, etc. into Snowflake, using tools like StreamSets, Informatica or dbt to streamline data transformation processes.
- Monitor and tune Snowflake performance, including warehouse sizing, query optimization and storage management.
- Manage Snowflake caching, clustering and partitioning to improve efficiency (see the sketch after this posting).
- Analyze and resolve query performance bottlenecks.
- Monitor and resolve data quality issues within the warehouse.
- Collaborate with data analysts, data engineers and business users to understand reporting and analytic needs.
- Work closely with the DevOps team on automation, deployment and monitoring.
- Plan and execute strategies for scaling Snowflake environments as data volume grows.
- Monitor system health, proactively identify and resolve issues, and implement automation for regular tasks.
- Enable seamless integration of Snowflake with BI tools like Power BI and create dashboards.
- Support ad hoc query requests while maintaining system performance.
- Create and maintain documentation related to data warehouse architecture, data flow, and processes.
- Provide technical support, troubleshooting, and guidance to users accessing the data warehouse.
- Optimize Snowflake queries and manage performance.
- Keep up to date with emerging trends and technologies in data warehousing and data management.
- Good working knowledge of the Linux operating system.
- Working experience with Git and other repository management solutions.
- Good knowledge of monitoring tools like Dynatrace and Splunk.
- Serve as a technical leader for Snowflake-based projects, ensuring alignment with business goals and timelines.
- Provide mentorship and guidance to team members in Snowflake implementation, performance tuning and data management.
- Collaborate with stakeholders to define and prioritize data warehousing initiatives and roadmaps.
- Act as the point of contact for Snowflake-related queries, issues and initiatives.

What you will need to have:
- 8 to 10 years of experience in data management tools like Snowflake, StreamSets, and Informatica.
- Experience with monitoring tools like Dynatrace and Splunk.
- Experience with Kubernetes cluster management, CloudWatch for monitoring and logging, and the Linux OS.
- Ability to track progress against assigned tasks, report status, and proactively identify issues.
- Ability to present information effectively in communications with peers and the project management team.
- Highly organized; works well in a fast-paced, fluid and dynamic environment.

What would be great to have:
- Experience with EKS for managing Kubernetes clusters.
- Containerization technologies such as Docker and Podman.
- AWS CLI for command-line interactions.
- CI/CD pipelines using Harness.
- S3 for storage solutions and IAM for access management.
- Banking and financial services experience.
- Knowledge of software development life cycle best practices.

Thank you for considering employment with Fiserv. Please:
- Apply using your legal name.
- Complete the step-by-step profile and attach your resume (either is acceptable, both are preferable).

Our commitment to Diversity and Inclusion: Fiserv is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, gender, gender identity, sexual orientation, age, disability, protected veteran status, or any other category protected by law.

Note to agencies: Fiserv does not accept resume submissions from agencies outside of existing agreements. Please do not send resumes to Fiserv associates. Fiserv is not responsible for any fees associated with unsolicited resume submissions.

Warning about fake job posts: Please be aware of fraudulent job postings that are not affiliated with Fiserv. Fraudulent job postings may be used by cyber criminals to target your personally identifiable information and/or to steal money or financial information. Any communications from a Fiserv representative will come from a legitimate Fiserv email address.
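Illustrative only: a minimal sketch of the clustering-management work named above, adding a clustering key to a large Snowflake table and inspecting clustering health. The table and column names are hypothetical.

```python
# Minimal sketch: add a clustering key and inspect how well-clustered a table is.
import os
import snowflake.connector

with snowflake.connector.connect(
    user=os.environ["SF_USER"],
    password=os.environ["SF_PASSWORD"],
    account=os.environ["SF_ACCOUNT"],
    database="DW",
    schema="PUBLIC",
) as conn:
    cur = conn.cursor()
    # Cluster a large fact table on the columns most queries filter by, so
    # Snowflake can prune micro-partitions instead of scanning everything.
    cur.execute("ALTER TABLE fact_payments CLUSTER BY (posting_date, region)")
    # Check clustering health: average depth close to 1 means good pruning.
    cur.execute(
        "SELECT SYSTEM$CLUSTERING_INFORMATION('fact_payments', '(posting_date, region)')"
    )
    print(cur.fetchone()[0])  # JSON report with depth/overlap statistics
```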
Posted 1 week ago
0.0 - 3.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Category: ERP/CRM/Tools
Main location: India, Karnataka, Bangalore
Position ID: J0125-2111
Employment Type: Full Time

Position Description:
At CGI, we're a team of builders. We call our employees members because all who join CGI are building their own company – one that has grown to 72,000 professionals located in 40 countries. Founded in 1976, CGI is a leading IT and business process services firm committed to helping clients succeed. We have the global resources, expertise, stability and dedicated professionals needed to achieve results for our clients – and for our members. Come grow with us. Learn more at www.cgi.com.

This is a great opportunity to join a winning team. CGI offers a competitive compensation package with opportunities for growth and professional development. Benefits for full-time, permanent members start on the first day of employment and include a paid time-off program and profit participation and stock purchase plans. We wish to thank all applicants for their interest and effort in applying for this position; however, only candidates selected for interviews will be contacted. No unsolicited agency referrals please.

Job Title: IDMC Developer
Position: IICS/IDMC
Experience: 4+ years
Category: IT Infrastructure
Main location: Gurgaon
Position ID: J0125-2111
Unique requirements: Shift timing: overlap with the onshore team till 11 AM EST, and be flexible to shift working hours from time to time if needed

Your future duties and responsibilities:
This is a position for an ETL Developer with 1 to 3 years of hands-on experience in Informatica PowerCenter/IICS; at least 1 year's experience in scheduling is nice to have.
- Work with the Subject Matter Experts and Business Analysts to gain the best knowledge of the existing/new SLF investment product architecture, and implement new features within the scope of the project and other project deliverables.
- Design, develop, test and support Informatica-based ETL applications, using existing and emerging technology platforms.
- Gather requirements from the client team and build interactive dashboards and stories in Tableau; be an adaptable individual who enjoys getting into details.
- Handle a variety of assignments: responding to new business service requests, problem solving, maintaining stability and performing technical implementations.
- Partner with SMEs, developers and other stakeholders.
- Understand the project scope; identify activities/tasks, task-level estimates, schedule, dependencies and risks; and provide inputs to the Program Lead for review.
- Develop robust ETL solutions which can best meet the business requirements for investment applications.
- Analyze the data sources, be it an RDBMS, CSV file, Salesforce, IMS or any other data source.
- Understanding of scheduling tools like ZEKE/Control-M.
- Prepare/modify design documents (high-level design and detailed design documents) based on business requirements; suggest changes in design on technical grounds.
- Work on functional and technical upgrades of the application.
- Coordinate delivery of assigned tasks with onshore partners and/or Business Analysts.
- Ensure timely notification and escalation of possible issues or problems, with options and recommendations for prompt resolution.
- Follow development standards to ensure that code is clear, efficient, logical and easily maintainable.
- Create and maintain application documentation such as network diagrams, the technical project handbook, technical data-mapping documents, unit test documents and the implementation plan.
- Ensure SLF Information Security Policies and General Computing Controls are complied with in all situations.
- Take complete ownership of work assignments and ensure the successful completion of assigned tasks.
- Ensure all written and verbal communication is clear, understandable and audience-appropriate.

Required qualifications to be successful in this role:
- A degree in Computer Science, a related technology degree, or equivalent experience.
- Minimum 1 to 3 years of overall IT experience.
- Strong knowledge of ETL (Informatica PowerCenter), IICS (Informatica Intelligent Cloud Services), relational databases (Microsoft SQL Server) and PL/SQL.
- Experience with scheduling tools such as ZEKE or Control-M is nice to have.

Skills: Informatica, SQL, Unix, Wealth Management, PostgreSQL

What you can expect from us:
Together, as owners, let's turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect and belonging. Here, you'll reach your full potential because you are invited to be an owner from day 1 as we work together to bring our Dream to life. That's why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company's strategy and direction. Your work creates value: you'll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise. You'll shape your career by joining a company built to grow and last, supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons. Come join our team - one of the largest IT and business consulting services firms in the world.
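As context for the ETL duties above, the sketch below shows a minimal source-to-target reconciliation check of the kind ETL developers commonly automate after a load. It is illustrative only: an in-memory sqlite3 database stands in for the real source and warehouse connections, and the table names are hypothetical, not part of any stack named in this posting.

```python
# Minimal source-to-target reconciliation sketch (hypothetical tables;
# sqlite3 used as a stand-in for real source/warehouse connections).
import sqlite3

def row_count(conn, table):
    # In a real job the two counts would come from the source system
    # and the target warehouse, typically over separate connections.
    return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_trades (id INTEGER, amount REAL);
    CREATE TABLE tgt_trades (id INTEGER, amount REAL);
    INSERT INTO src_trades VALUES (1, 100.0), (2, 250.5);
    INSERT INTO tgt_trades SELECT * FROM src_trades;  -- the 'load' step
""")

src, tgt = row_count(conn, "src_trades"), row_count(conn, "tgt_trades")
assert src == tgt, f"Row count mismatch: source={src}, target={tgt}"
print(f"Reconciliation passed: {src} rows in both source and target")
```

In practice the same pattern extends to checksum and column-level comparisons, but the shape - query both sides, compare, fail loudly - stays the same.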
Posted 1 week ago
5.0 - 8.0 years
12 - 16 Lacs
Pune
Work from Office
About The Role

Role: The purpose of this role is to provide strategic guidance and recommendations on the pricing of contracts being executed in the assigned SBU, while maintaining competitive advantage and profit margins, and to ensure SoW adherence to internal guidelines for all contracts in the SBU.

Do:
- Contract pricing review and advice
- Pricing strategy deployment
  - Drive the deployment of pricing strategy for the SBU/Vertical/Account in line with Wipro's overall pricing strategy
  - Partner with and educate Business Leaders on adherence to the pricing strategy, internal guidelines and SoW
- Business partnering for advice on contract commercials
  - Work closely with pre-sales and BU leadership to review contracts about to be finalized and provide inputs on their structuring, payment milestones and terms & conditions
  - Review the Resource Loading Sheet (RLS) submitted by the pre-sales/delivery team and work on the contract pricing
  - Collaborate with business leaders to propose competitive pricing based on the effort estimate, considering the cost of resources, skills availability and identified premium skills
- Review adherence of the contract's commercial terms and conditions
  - Review the commercial terms and conditions proposed in the SoW
  - Ensure they align with internal guidelines for credit period and existing MSAs, and recommend payment milestones
- Ensure accurate revenue recognition and provide forecasts
  - Implement and drive adherence to revenue recognition guidelines
  - Ensure revenue recognition by BFMs/Service Line Finance Managers is done per IFRS standards
  - Partner with Finance Managers and educate them on revenue recognition standards and Wipro's internal guidelines
  - Provide accurate and timely revenue forecasts for the assigned SBU/Vertical/Cluster/Accounts
- Validation of order booking
  - Ensure adherence to order booking guidelines: oversee that all documents, approvals and guidelines are adhered to before an order is confirmed in the books of accounts
  - Highlight any deviations from internal guidelines/standards and work with the concerned teams to address them
- Team management
  - Clearly define expectations for the team; assign goals, conduct timely performance reviews and provide constructive feedback to direct reports
  - Guide team members in acquiring relevant knowledge and developing their professional competence
  - Educate the team and build awareness of Wipro guidelines on revenue recognition, pricing strategy, contract terms and MSAs
  - Ensure Performance Nxt is followed for the entire team
- Employee satisfaction and engagement
  - Lead and drive engagement initiatives for the team
  - Track team satisfaction scores and identify initiatives to build engagement within the team

Deliver:
1. Financials - monetizing Wipro's efforts and value additions; comprehensiveness of pricing recommendations; accurate inputs to revenue forecasting per revenue recognition guidelines
2. Internal customer - completeness of the contracts checklist before order booking
3. Team management - team attrition %, employee satisfaction score, localization %, gender diversity %; training and skill building of the team on pricing operations

Mandatory Skills: Data Governance. Experience: 5-8 years.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention.
Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
Posted 1 week ago
5.0 - 8.0 years
9 - 14 Lacs
Chennai
Work from Office
About The Role

Role Purpose: The purpose of the role is to support process delivery by ensuring the daily performance of the Production Specialists, resolving technical escalations and developing technical capability within the Production Specialists.

Do:
- Oversee and support the process by reviewing daily transactions against performance parameters
- Review the performance dashboard and the scores for the team
- Support the team in improving performance parameters by providing technical support and process guidance
- Record, track and document all queries received, the problem-solving steps taken, and total successful and unsuccessful resolutions
- Ensure standard processes and procedures are followed to resolve all client queries
- Resolve client queries as per the SLAs defined in the contract
- Develop an understanding of the process/product to help team members improve client interaction and troubleshooting
- Document and analyze call logs to spot recurring trends and prevent future problems
- Identify red flags and escalate serious client issues to the Team Leader in cases of untimely resolution
- Ensure all product information and disclosures are given to clients before and after call/email requests
- Avoid legal challenges by monitoring compliance with service agreements

Handle technical escalations through effective diagnosis and troubleshooting of client queries:
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements; if unable to resolve an issue, escalate it to TA & SES in a timely manner
- Provide product support and resolution to clients by performing question diagnosis and guiding users through step-by-step solutions
- Troubleshoot all client queries in a user-friendly, courteous and professional manner
- Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business
- Organize ideas and communicate oral messages appropriate to listeners and situations
- Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs

Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client:
- Mentor and guide Production Specialists on improving technical knowledge
- Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with Production Specialists
- Develop and conduct trainings (triages) within products for Production Specialists as per target, and inform the client about the triages being conducted
- Undertake product trainings to stay current with product features, changes and updates; enroll in product-specific and any other trainings per client requirements/recommendations
- Identify and document the most common problems and recommend appropriate resolutions to the team
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Deliver:
1. Process - No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback, NSAT/ESAT
2. Team Management - Productivity, efficiency, absenteeism
3. Capability Development - Triages completed, technical test performance

Mandatory Skills: Informatica MDM. Experience: 5-8 years.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills.
We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
Posted 1 week ago
5.0 - 8.0 years
5 - 9 Lacs
Pune
Work from Office
About The Role

We are looking to fill a Project Lead position for a product migration assignment. The position is remote and requires no travel. The Project Lead acts as the primary contact for external customers (physicians and hospitals) while working closely with cross-functional teams to manage product migration tasks, and is responsible for planning, coordinating and managing Revenue Cycle solutions product migration activities.

Primary responsibilities of a Project Lead include:
- Manage the product implementation for existing clients adding new business or modifying current business
- Conduct client outreach to kick off the product implementation/migration process
- Present the project plan to the client, covering the product modules to be implemented per the signed contract
- Set client expectations and define the scope of product migration activities
- Educate the client on best practices and guide them through the product migration phase
- Manage all communication with the client, including conducting meetings and conference calls
- Coordinate, communicate and accurately report all migration-related activities under the supervision of the Program Manager
- Facilitate training needs, schedule client training and report on training completed
- Manage multiple projects concurrently with aggressive timeframes
- Complete projects within set timelines while mitigating risks that could delay them
- Manage detailed work plans, schedules and client status reports
- Track and manage client product migration deliverables
- Gather client requirements and collaborate with internal departments to resolve issues related to the client product migration
- Ensure a successful hand-off to the client services group for ongoing support after product migration
- Collaborate with internal departments to resolve client questions and issues during product migration
- Keep the project on track and in scope, anticipating and assessing project issues, and develop resolutions with the team and management to meet productivity, quality and client-satisfaction goals

Mandatory Skills: Sybase. Experience: 5-8 years.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
Posted 1 week ago
6.0 - 10.0 years
8 - 12 Lacs
Gurugram
Work from Office
About The Role

Role Purpose: Data Analyst with expertise in data modeling, data pipelines, ETL processes, Tableau, SQL and Snowflake.

Do:
- Strong expertise in data modeling, data warehousing and ETL processes
- Proficiency in SQL and experience with data warehousing tools (e.g., Snowflake, Redshift, BigQuery) and ETL tools (e.g., Talend, Informatica, SSIS)
- Demonstrated ability to lead and manage complex projects involving cross-functional teams
- Excellent analytical, problem-solving and organizational skills
- Strong communication and leadership abilities, with a track record of mentoring and developing team members
- Experience with data visualization tools (e.g., Tableau, Power BI) is a plus
- Preference for candidates with experience in ETL using Python, Airflow or DBT (a sketch of this pattern follows this posting)
- Build capability to ensure operational excellence and maintain superior customer service levels for the existing account/client
- Undertake product trainings to stay current with product features, changes and updates; enroll in product-specific and any other trainings per client requirements/recommendations
- Partner with team leaders to brainstorm and identify training themes and learning issues to better serve the client
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Deliver:
1. Process - No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback
2. Self-Management - Productivity, efficiency, absenteeism, training hours, number of technical trainings completed
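Since this role calls out ETL with Python and Airflow, the sketch below shows a minimal Airflow 2.x DAG implementing the extract-transform-load pattern. It is a hedged illustration, not a prescribed pipeline: the DAG id, task callables and the flat-rate transform are placeholders invented for this example.

```python
# Minimal Airflow 2.x DAG sketch: extract -> transform -> load.
# DAG id, task names and the callables are illustrative placeholders.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**_):
    return [{"id": 1, "amount": 100.0}]  # stand-in for a source query

def transform(ti, **_):
    rows = ti.xcom_pull(task_ids="extract")
    # Placeholder rule: apply a hypothetical 18% uplift to each amount.
    return [{**r, "amount": round(r["amount"] * 1.18, 2)} for r in rows]

def load(ti, **_):
    # Stand-in for a warehouse write (e.g., Snowflake COPY/INSERT).
    print("loading:", ti.xcom_pull(task_ids="transform"))

with DAG(
    dag_id="daily_sales_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # 'schedule_interval' on Airflow versions before 2.4
    catchup=False,
) as dag:
    (PythonOperator(task_id="extract", python_callable=extract)
     >> PythonOperator(task_id="transform", python_callable=transform)
     >> PythonOperator(task_id="load", python_callable=load))
```

Small payloads move between tasks via XCom, as here; production pipelines usually pass references (table or file paths) instead of the data itself.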
Posted 1 week ago
5.0 - 8.0 years
3 - 7 Lacs
Kolkata
Work from Office
About The Role

Role Purpose: The purpose of this role is to design, test and maintain software programs for operating systems or applications to be deployed at a client site, ensuring they meet 100% of quality assurance parameters.

Do:
1. Understand the requirements and design of the product/software
- Develop software solutions by studying information needs, systems flow, data usage and work processes
- Investigate problem areas across the software development life cycle
- Facilitate root cause analysis of system issues and the problem statement
- Identify ideas to improve system performance and availability
- Analyze client requirements and convert them into feasible designs
- Collaborate with functional teams or systems analysts who carry out the detailed investigation into software requirements
- Confer with project managers to obtain information on software capabilities

2. Perform coding and ensure optimal software/module development
- Determine operational feasibility by evaluating the analysis, problem definition, requirements, software development and proposed software
- Develop and automate processes for software validation by setting up and designing test cases/scenarios/usage cases and executing them
- Modify software to fix errors, adapt it to new hardware, improve its performance or upgrade interfaces
- Analyze information to recommend and plan the installation of new systems or modifications to existing ones
- Ensure code is error-free, with no bugs or test failures
- Prepare reports on programming project specifications, activities and status
- Ensure all code issues are raised per the norms defined for the project/program/account, with clear descriptions and replication patterns
- Compile timely, comprehensive and accurate documentation and reports as requested
- Coordinate with the team on daily project status and progress, and document it
- Provide feedback on usability and serviceability, trace results to quality risk, and report to the concerned stakeholders

3. Status reporting and customer focus on an ongoing basis with respect to the project and its execution
- Capture all requirements and clarifications from the client for better-quality work
- Take feedback regularly to ensure smooth and on-time delivery
- Participate in continuing education and training to remain current on best practices, learn new programming languages and better assist other team members
- Consult with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements
- Document and demonstrate solutions through documentation, flowcharts, layouts, diagrams, charts, code comments and clear code
- Document all necessary details and reports formally, for proper understanding of the software from client proposal to implementation
- Ensure good quality of interaction with the customer (e-mail content, fault report tracking, voice calls, business etiquette, etc.)
- Respond to customer requests in a timely manner, with no instances of internal or external complaints

Deliver:
1. Continuous integration, deployment & monitoring of software - 100% error-free onboarding and implementation, throughput %, adherence to the schedule/release plan
2. Quality & CSAT - On-time delivery, software management, query troubleshooting, customer experience, completion of assigned certifications for skill upgradation
3. MIS & reporting - 100% on-time MIS and report generation

Mandatory Skills: Snowflake. Experience: 5-8 years.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
Posted 1 week ago
5.0 - 8.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Starburst Data Engineer/Architect

- Expertise in Starburst and policy management tools such as Ranger or equivalent.
- In-depth knowledge of data modeling principles and techniques, including relational and dimensional.
- Excellent problem-solving skills and the ability to troubleshoot and debug complex data-related issues.
- Strong awareness of data tools and platforms such as Starburst, Snowflake and Databricks, and programming languages like SQL.
- In-depth knowledge of data management principles, methodologies and best practices, with excellent analytical, problem-solving and decision-making skills.
- Develop, implement and maintain database systems using SQL.
- Write complex SQL queries for integration with applications (see the sketch after this posting).
- Develop and maintain data models (conceptual, physical and logical) to meet organizational needs.

Do:
1. Manage the technical scope of the project in line with the requirements at all stages
a. Gather information from various sources (data warehouses, databases, data integration and modeling) and interpret patterns and trends
b. Develop record management processes and policies
c. Build and maintain relationships at all levels within the client base and understand their requirements
d. Provide sales data, proposals, data insights and account reviews to the client base
e. Identify areas to increase efficiency and automation of processes
f. Set up and maintain automated data processes
g. Identify, evaluate and implement external services and tools to support data validation and cleansing
h. Produce and track key performance indicators

2. Analyze data sets and provide adequate information
a. Liaise with internal and external clients to fully understand data content
b. Design and carry out surveys and analyze survey data per customer requirements
c. Analyze and interpret complex data sets relating to the customer's business, and prepare reports for internal and external audiences using business analytics reporting tools
d. Create data dashboards, graphs and visualizations to showcase business performance, and provide sector and competitor benchmarking
e. Mine and analyze large datasets, draw valid inferences and present them to management using a reporting tool
f. Develop predictive models and share insights with clients per their requirements

Deliver:
1. Analyze data sets and provide relevant information to the client - number of automations done, on-time delivery, CSAT score, zero customer escalations, data accuracy

Mandatory Skills: Starburst. Experience: 5-8 years.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
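As a taste of the day-to-day work, here is a hedged sketch of a cross-catalog (federated) query, the signature Starburst use case, using the open-source trino Python client (pip install trino), which Starburst clusters support. The host, catalogs, schemas and tables are invented for illustration.

```python
# Hypothetical federated query against a Starburst (Trino) cluster.
# Host, catalog, schema and table names are placeholders.
import trino

conn = trino.dbapi.connect(
    host="starburst.example.com",  # placeholder coordinator host
    port=8080,
    user="analyst",
    catalog="hive",
    schema="sales",
)
cur = conn.cursor()
# Joining tables from two different catalogs (here, a Hive data lake
# and a PostgreSQL CRM) in a single SQL statement is the typical
# Starburst pattern - no data movement required beforehand.
cur.execute("""
    SELECT c.region, SUM(o.amount) AS total
    FROM hive.sales.orders o
    JOIN postgresql.crm.customers c ON o.customer_id = c.id
    GROUP BY c.region
    ORDER BY total DESC
""")
for region, total in cur.fetchall():
    print(region, total)
```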
Posted 1 week ago
The Informatica job market in India is thriving, with numerous opportunities for skilled professionals. Companies across various industries are actively hiring Informatica experts to manage and optimize their data integration and data quality processes.

The average salary range for Informatica professionals in India varies by experience and expertise:
- Entry-level: INR 3-5 lakhs per annum
- Mid-level: INR 6-10 lakhs per annum
- Experienced: INR 12-20 lakhs per annum

A typical career progression in the Informatica field may include roles such as:
- Junior Developer
- Informatica Developer
- Senior Developer
- Informatica Tech Lead
- Informatica Architect

In addition to Informatica expertise, professionals in this field are often expected to have skills in:
- SQL
- Data warehousing
- ETL tools
- Data modeling
- Data analysis

As you prepare for Informatica job opportunities in India, enhance your skills, stay updated with the latest trends in data integration, and approach interviews with confidence. With the right knowledge and expertise, you can excel in the Informatica field and secure rewarding career opportunities. Good luck!