
15 EDW Jobs

JobPe aggregates results for easy application access, but you apply directly on the job portal.

1.0 - 4.0 years

15 - 22 Lacs

Hyderabad, Secunderabad

Work from Office

Job Summary
We are looking for a fresher MD/PhD with a specialization in Microbiology to join our team as a Clinical Outreach / Scientific Outreach professional. This position requires active field engagement in collaboration with the sales team, including visits to hospitals and clinical institutions to interact with physicians and other healthcare professionals. The candidate will be responsible for effectively communicating the scientific, microbiological, and clinical aspects of our products, ensuring a clear and thorough understanding of their clinical relevance, applications, and value. The candidate will also participate in Continuing Medical Education (CME) programs and Round Table Meetings (RTMs).

What we want you to do
- Work closely with the sales team during client visits, primarily engaging with doctors and healthcare providers.
- Explain the microbiological and clinical aspects of our products in a clear and professional manner.
- Bridge the gap between scientific knowledge and clinical application to support the adoption of our products.
- Provide technical support and medical guidance during client meetings and product demonstrations.
- Help doctors understand how the product integrates into patient care, infection control, and diagnostic workflows.
- Share relevant case studies, clinical experiences, or infection trends to highlight product effectiveness.
- Maintain a strong understanding of emerging microbiological trends and technologies, including Next-Generation Sequencing (NGS).
- Collaborate with internal teams such as R&D, Sales, and Operations to ensure accurate communication and feedback.
- Actively participate in Continuing Medical Education (CME) programs and Round Table Meetings (RTMs).

What we are looking for in you
- Fresher MD/PhD with a specialization in Microbiology.
- Proven track record of effective communication and collaboration with interdisciplinary healthcare teams.
- Demonstrated understanding of infection control protocols and antimicrobial stewardship principles.
- Familiarity with molecular and sequencing (NGS) technologies and their applications in clinical microbiology is advantageous.
- Strong knowledge of clinical microbiology, infectious diseases, and diagnostic methods.
- Excellent verbal communication and presentation skills.
- Ability to explain complex technical and medical concepts in simple, clinician-friendly language.
- Comfortable with on-field client interactions.
- Must be willing to travel PAN India for CME programs and RTMs.

What you will gain
- Dynamic and collaborative work environment dedicated to making a meaningful impact in healthcare.
- Experience working with advanced sequencing technologies in the diagnostic industry, i.e. NGS, WGS, Nanopore, and Illumina.
- Opportunities for professional development and continued education.
- Competitive salary commensurate with experience.
- Comprehensive health benefits package.

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You will be joining Papigen, a fast-growing global technology services company that focuses on delivering innovative digital solutions through deep industry experience and cutting-edge expertise. The company specializes in technology transformation, enterprise modernization, and dynamic areas such as Cloud, Big Data, Java, React, DevOps, and more. Papigen's client-centric approach combines consulting, engineering, and data science to help businesses evolve and scale efficiently.

Your role as a Senior Data QA Analyst will involve supporting data integration, transformation, and reporting validation for enterprise-scale systems. This position requires close collaboration with data engineers, business analysts, and stakeholders to ensure the quality, accuracy, and reliability of data workflows, particularly in Azure Databricks and ETL pipelines.

Key responsibilities include collaborating with Business Analysts and Data Engineers to understand requirements and translate them into test scenarios and test cases. You will need to develop and execute comprehensive test plans and scripts for data validation, as well as log and manage defects using tools like Azure DevOps. Your role will also involve supporting UAT and post-go-live smoke testing. You will be responsible for understanding data architecture and workflows, including ETL processes and data movement. Writing and executing complex SQL queries to validate data accuracy, completeness, and consistency will be crucial, as will ensuring the correctness of data transformations and mappings based on business logic. As a Senior Data QA Analyst, you will also validate the structure, metrics, and content of BI reports, perform cross-checks of report outputs against source systems, and ensure that reports reflect accurate calculations and align with business requirements.

To be successful in this role, you should have a Bachelor's degree in IT, Computer Science, MIS, or a related field, and 8+ years of experience in QA, especially in data validation or data warehouse testing. Strong hands-on experience with SQL and data analysis is required, along with proven experience working with Azure Databricks, Python, and PySpark (preferred). Familiarity with data models like Data Marts, EDW, and Operational Data Stores is also necessary. An excellent understanding of data transformation, mapping logic, and BI validation is crucial, as is experience with test case documentation, defect tracking, and Agile methodologies. Strong verbal and written communication skills are essential, along with the ability to work in a cross-functional environment.

Working at Papigen will provide you with the opportunity to work with leading global clients, exposure to modern technology stacks and tools, a supportive and collaborative team environment, and continuous learning and career development opportunities.
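The SQL-based validation this role describes, checking completeness and consistency between a source and its warehouse copy, can be sketched with Python's built-in sqlite3. This is an illustrative sketch only; the table and column names (`src_orders`, `dw_orders`) are invented, not from the posting.

```python
import sqlite3

# Hypothetical source and target tables standing in for an ETL pipeline's
# input and output; in practice these live in separate systems.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, amount REAL);
    CREATE TABLE dw_orders  (order_id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 100.0), (2, 250.5), (3, 75.0);
    INSERT INTO dw_orders  VALUES (1, 100.0), (2, 250.5);
""")

# Completeness check: row counts should match after the load.
src_count = conn.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
dw_count = conn.execute("SELECT COUNT(*) FROM dw_orders").fetchone()[0]

# Consistency check: find source rows missing from the target.
missing = conn.execute("""
    SELECT s.order_id FROM src_orders s
    LEFT JOIN dw_orders d ON s.order_id = d.order_id
    WHERE d.order_id IS NULL
""").fetchall()

print(src_count, dw_count, missing)  # 3 2 [(3,)]
```

The same count-and-anti-join pattern scales directly to Databricks SQL or any warehouse dialect.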

Posted 2 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You will be joining a fast-growing global technology services company, Papigen, that specializes in providing innovative digital solutions through industry expertise and cutting-edge technology. As a Senior Data QA Analyst, your primary responsibility will be to ensure the quality, accuracy, and reliability of data workflows in enterprise-scale systems, particularly focusing on Azure Databricks and ETL pipelines. This role will require close collaboration with data engineers, business analysts, and stakeholders to validate data integration, transformation, and reporting.

Your key responsibilities will include collaborating with Business Analysts and Data Engineers to understand requirements and translate them into test scenarios and test cases. You will develop and execute comprehensive test plans and scripts for data validation, log and manage defects using tools like Azure DevOps, and support UAT and post-go-live smoke testing. Additionally, you will be responsible for understanding data architecture, writing and executing complex SQL queries, validating data accuracy, completeness, and consistency, and ensuring correctness of data transformations based on business logic. In terms of report testing, you will validate the structure, metrics, and content of BI reports, perform cross-checks of report outputs against source systems, and ensure that reports reflect accurate calculations and align with business requirements.

To be successful in this role, you should have a Bachelor's degree in IT, Computer Science, MIS, or a related field, along with 8+ years of experience in QA, especially in data validation or data warehouse testing. Strong hands-on experience with SQL and data analysis is essential, and experience working with Azure Databricks, Python, and PySpark is preferred. Familiarity with data models like Data Marts, EDW, and Operational Data Stores, as well as knowledge of data transformation, mapping logic, and BI validation, will be beneficial. Experience with test case documentation, defect tracking, and Agile methodologies is also required, along with strong verbal and written communication skills to work effectively in a cross-functional environment.

Joining Papigen will provide you with the opportunity to work with leading global clients, exposure to modern technology stacks and tools, a supportive and collaborative team environment, and continuous learning and career development opportunities.

Posted 2 weeks ago

Apply

2.0 - 6.0 years

7 - 11 Lacs

Bengaluru

Work from Office

About The Role
This is an Internal document.

Job Title: Senior Data Engineer

As a Senior Data Engineer, you will play a key role in designing and implementing data solutions at Kotak811. You will be responsible for leading data engineering projects, mentoring junior team members, and collaborating with cross-functional teams to deliver high-quality and scalable data infrastructure. Your expertise in data architecture, performance optimization, and data integration will be instrumental in driving the success of our data initiatives.

Responsibilities
1. Data Architecture and Design
   a. Design and develop scalable, high-performance data architecture and data models.
   b. Collaborate with data scientists, architects, and business stakeholders to understand data requirements and design optimal data solutions.
   c. Evaluate and select appropriate technologies, tools, and frameworks for data engineering projects.
   d. Define and enforce data engineering best practices, standards, and guidelines.
2. Data Pipeline Development & Maintenance
   a. Develop and maintain robust and scalable data pipelines for data ingestion, transformation, and loading for real-time and batch use cases.
   b. Implement ETL processes to integrate data from various sources into data storage systems.
   c. Optimise data pipelines for performance, scalability, and reliability:
      i. Identify and resolve performance bottlenecks in data pipelines and analytical systems.
      ii. Monitor and analyse system performance metrics, identifying areas for improvement and implementing solutions.
      iii. Optimise database performance, including query tuning, indexing, and partitioning strategies.
   d. Implement real-time and batch data processing solutions.
3. Data Quality and Governance
   a. Implement data quality frameworks and processes to ensure high data integrity and consistency.
   b. Design and enforce data management policies and standards.
   c. Develop and maintain documentation, data dictionaries, and metadata repositories.
   d. Conduct data profiling and analysis to identify data quality issues and implement remediation strategies.
4. ML Models Deployment & Management (a plus)
   a. Design, develop, and maintain the infrastructure and processes necessary for deploying and managing machine learning models in production environments.
   b. Implement model deployment strategies, including containerization and orchestration using tools like Docker and Kubernetes.
   c. Optimise model performance and latency for real-time inference in consumer applications.
   d. Collaborate with DevOps teams to implement continuous integration and continuous deployment (CI/CD) processes for model deployment.
   e. Monitor and troubleshoot deployed models, proactively identifying and resolving performance or data-related issues.
   f. Implement monitoring and logging solutions to track model performance, data drift, and system health.
5. Team Leadership and Mentorship
   a. Lead data engineering projects, providing technical guidance and expertise to team members; conduct code reviews and ensure adherence to coding standards and best practices.
   b. Mentor and coach junior data engineers, fostering their professional growth and development.
   c. Collaborate with cross-functional teams, including data scientists, software engineers, and business analysts, to drive successful project outcomes.
   d. Stay abreast of emerging technologies, trends, and best practices in data engineering and share knowledge within the team; participate in the evaluation and selection of data engineering tools and technologies.

Qualifications
1. 3-5 years' experience with a Bachelor's Degree in Computer Science, Engineering, Technology or a related field required.
2. Good understanding of streaming technologies like Kafka and Spark Streaming.
3. Experience with Enterprise Business Intelligence Platform/Data platform sizing, tuning, optimization and system landscape integration in large-scale, enterprise deployments.
4. Proficiency in one programming language, preferably Java, Scala or Python.
5. Good knowledge of Agile and SDLC/CI-CD practices and tools.
6. Must have proven experience with Hadoop, MapReduce, Hive, Spark and Scala programming, with in-depth knowledge of performance tuning/optimizing data processing jobs and debugging time-consuming jobs.
7. Proven experience in development of conceptual, logical, and physical data models for Hadoop, relational, EDW (enterprise data warehouse) and OLAP database solutions.
8. Good understanding of distributed systems.
9. Experience working extensively in a multi-petabyte DW environment.
10. Experience in engineering large-scale systems in a product environment.
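The real-time versus batch processing distinction in the responsibilities above can be illustrated with a toy micro-batch aggregator, the core pattern that frameworks like Spark Streaming apply at scale. This is a deliberately minimal pure-Python sketch; the event names are invented.

```python
from collections import defaultdict

def micro_batch_counts(events, batch_size=3):
    """Fold an unbounded event stream into running counts, processing
    the stream one fixed-size micro-batch at a time."""
    totals = defaultdict(int)
    batch = []
    for event in events:
        batch.append(event)
        if len(batch) == batch_size:
            for key in batch:          # process one full micro-batch
                totals[key] += 1
            batch.clear()
    for key in batch:                  # flush the final partial batch
        totals[key] += 1
    return dict(totals)

stream = ["click", "view", "click", "click", "view"]
print(micro_batch_counts(stream))  # {'click': 3, 'view': 2}
```

In a real pipeline the batches would arrive on a schedule from a broker such as Kafka, and the totals would be checkpointed to fault-tolerant storage rather than held in memory.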

Posted 3 weeks ago

Apply

5.0 - 7.0 years

15 - 20 Lacs

Thiruvananthapuram

Work from Office

Role Proficiency: SAP BODS Data Engineer

Strong proficiency in designing, developing, and implementing robust ETL solutions using SAP BusinessObjects Data Services (BODS), with strong EDW experience.
- Strong proficiency in SAP BODS development, including job design, data flow creation, scripting, and debugging.
- Design and develop ETL processes using SAP BODS to extract, transform, and load data from various sources.
- Create and maintain data integration workflows, ensuring optimal performance and scalability.
- Solid understanding of data integration, ETL concepts, and data warehousing principles.
- Proficiency in SQL for data querying and manipulation.
- Familiarity with data modeling concepts and database systems.
- Excellent problem-solving skills and attention to detail.
- Strong communication and interpersonal skills for effective collaboration.
- Ability to work independently and manage multiple tasks simultaneously.
- 3+ years of relevant ETL development experience (SAP BODS).

Required Skills: Data Warehousing, SAP BODS, ETL, EDW
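The extract-transform-load flow that a BODS job implements can be summarized generically. The sketch below illustrates the concept in Python, not BODS scripting, and the field names are invented for illustration.

```python
def extract(rows):
    """Pull raw records from a source (here, an in-memory list)."""
    return list(rows)

def transform(rows):
    """Cleanse and reshape: trim and title-case names, cast amounts."""
    out = []
    for r in rows:
        out.append({"id": r["id"],
                    "name": r["name"].strip().title(),
                    "amount": float(r["amount"])})
    return out

def load(rows, target):
    """Append transformed records to the target store."""
    target.extend(rows)
    return len(rows)

source = [{"id": 1, "name": "  acme corp ", "amount": "99.50"}]
warehouse = []
loaded = load(transform(extract(source)), warehouse)
print(loaded, warehouse[0]["name"])  # 1 Acme Corp
```

In BODS the same three stages map to source datastores, data-flow transforms (Query, Validation, etc.), and target table loaders.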

Posted 1 month ago

Apply

6.0 - 10.0 years

12 - 16 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Job Title: Informatica Architect
Job Type: Full-time, Contractor
Location: Hybrid, Bengaluru | Pune

About Us: Our mission at micro1 is to match the most talented people in the world with their dream jobs. If you are looking to be at the forefront of AI innovation and work with some of the fastest-growing companies in Silicon Valley, we invite you to apply for a role. By joining the micro1 community, your resume will become visible to top industry leaders, unlocking access to the best career opportunities on the market.

Job Summary: Join our customer's team as an Informatica Architect and play a critical role in shaping data governance, data catalog, and data quality initiatives for enterprise-level products. As a key leader, you will collaborate closely with Data & Analytics leads, ensuring the integrity, accessibility, and quality of business-critical data assets across multiple domains.

Key Responsibilities
- Lead data governance, data catalog, and data quality efforts utilizing Informatica and other industry-leading tools.
- Design, develop, and manage data catalogs and enterprise data assets to support analytics and reporting across the organization.
- Configure and optimize Informatica CDQ and Data Quality modules, ensuring adherence to enterprise data standards and policies.
- Implement and maintain business glossaries, data domains, data lineage, and data stewardship resources for enterprise-wide use.
- Collaborate with cross-functional teams to define critical data elements, data governance rules, and quality policies for multiple data sources.
- Develop dashboards and visualizations to support data quality monitoring, compliance, and stewardship activities.
- Continuously review, assess, and enhance data definitions, catalog resources, and governance practices to stay ahead of evolving business needs.

Required Skills and Qualifications
- Minimum 7-8 years of enterprise data integration, management, and governance experience, with proven expertise in EDW technologies.
- At least 5 years of hands-on experience with Informatica CDQ and Data Quality solutions, having executed 2+ large-scale Data Governance and Quality projects from inception to production.
- Demonstrated proficiency configuring business glossaries, policies, dashboards, and search functions within Informatica or similar platforms.
- In-depth expertise in data quality, data cataloguing, and data governance frameworks and best practices.
- Strong background in Master Data Management (MDM), ensuring oversight and control of complex product catalogs.
- Exceptional written and verbal communication skills, able to effectively engage technical and business stakeholders.
- Experience collaborating with diverse teams to deliver robust data governance and analytics solutions.

Preferred Qualifications
- Administration and management experience with industry data catalog tools such as Collibra, Alation, or Atlan.
- Strong working knowledge of configuring user groups, permissions, data profiling, and lineage within catalog platforms.
- Hands-on experience implementing open-source data catalog tools in enterprise environments.
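The glossary-and-lineage resources this role maintains can be pictured as a small graph: governed business terms map to physical columns, and lineage edges record which upstream fields feed each one. The sketch below is tool-agnostic Python, not Informatica's API, and all table and column names are hypothetical.

```python
# Minimal data-catalog structures: a business glossary entry links a
# governed term to physical columns; lineage edges record upstream feeds.
glossary = {
    "Customer ID": {
        "columns": ["crm.customers.cust_id", "edw.dim_customer.customer_key"],
        "steward": "Data Governance Office",
    },
}
lineage = {
    "edw.dim_customer.customer_key": ["crm.customers.cust_id"],
}

def upstream(column, edges):
    """Walk lineage edges to find every source column feeding `column`."""
    sources, stack = set(), [column]
    while stack:
        for parent in edges.get(stack.pop(), []):
            if parent not in sources:
                sources.add(parent)
                stack.append(parent)
    return sources

print(upstream("edw.dim_customer.customer_key", lineage))
# {'crm.customers.cust_id'}
```

Catalog platforms automate exactly this traversal at enterprise scale, which is what makes impact analysis ("what breaks if this source column changes?") tractable.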

Posted 1 month ago

Apply

6.0 - 10.0 years

12 - 16 Lacs

Bengaluru

Work from Office

Job Title: Informatica Architect
Job Type: Full-time, Contractor
Location: Hybrid, Bengaluru | Pune

About Us: Our mission at micro1 is to match the most talented people in the world with their dream jobs. If you are looking to be at the forefront of AI innovation and work with some of the fastest-growing companies in Silicon Valley, we invite you to apply for a role. By joining the micro1 community, your resume will become visible to top industry leaders, unlocking access to the best career opportunities on the market.

Job Summary: Join our customer's team as an Informatica Architect and play a critical role in shaping data governance, data catalog, and data quality initiatives for enterprise-level products. As a key leader, you will collaborate closely with Data & Analytics leads, ensuring the integrity, accessibility, and quality of business-critical data assets across multiple domains.

Key Responsibilities
- Lead data governance, data catalog, and data quality efforts utilizing Informatica and other industry-leading tools.
- Design, develop, and manage data catalogs and enterprise data assets to support analytics and reporting across the organization.
- Configure and optimize Informatica CDQ and Data Quality modules, ensuring adherence to enterprise data standards and policies.
- Implement and maintain business glossaries, data domains, data lineage, and data stewardship resources for enterprise-wide use.
- Collaborate with cross-functional teams to define critical data elements, data governance rules, and quality policies for multiple data sources.
- Develop dashboards and visualizations to support data quality monitoring, compliance, and stewardship activities.
- Continuously review, assess, and enhance data definitions, catalog resources, and governance practices to stay ahead of evolving business needs.

Required Skills and Qualifications
- Minimum 7-8 years of enterprise data integration, management, and governance experience, with proven expertise in EDW technologies.
- At least 5 years of hands-on experience with Informatica CDQ and Data Quality solutions, having executed 2+ large-scale Data Governance and Quality projects from inception to production.
- Demonstrated proficiency configuring business glossaries, policies, dashboards, and search functions within Informatica or similar platforms.
- In-depth expertise in data quality, data cataloguing, and data governance frameworks and best practices.
- Strong background in Master Data Management (MDM), ensuring oversight and control of complex product catalogs.
- Exceptional written and verbal communication skills, able to effectively engage technical and business stakeholders.
- Experience collaborating with diverse teams to deliver robust data governance and analytics solutions.

Preferred Qualifications
- Administration and management experience with industry data catalog tools such as Collibra, Alation, or Atlan.
- Strong working knowledge of configuring user groups, permissions, data profiling, and lineage within catalog platforms.
- Hands-on experience implementing open-source data catalog tools in enterprise environments.

Posted 1 month ago

Apply

3.0 - 6.0 years

5 - 8 Lacs

Chennai

Work from Office

What's the role?
As an Associate Engineer (Reliability RBI), you will prepare and validate RBI (Risk-Based Inspection) data sheets, identify corrosion issues, and support inspections during maintenance. The role includes creating maintenance task lists, supporting projects, and ensuring high-quality data. You will also help technical specialists, maintain error-free data, participate in standardization efforts, and prepare inspection reports. Additionally, the role requires interacting with site engineers, assisting with site documents, responding to data inquiries, and engaging with stakeholders to improve processes and data quality.

What you'll be doing
- Preparation of RBI data sheets: converting process flow diagrams into material selection diagrams, identifying corrosion loops, identifying integrity operating windows and operating parameters from the corrosion monitoring framework, preparing corrosion loop descriptions and tags, and identifying dominant failure modes and damage mechanisms; knowledge of RBI procedures.
- Support inspection supervision of static equipment during asset turnarounds and other major maintenance activities.
- Ensure data validation of all templates; create automation tools to expedite the process and speed up implementation.
- Review the error log sheet, find solutions, and implement them to maintain high-quality data uploads in the system.
- Create equipment-specific task lists for preventive, corrective and shutdown maintenance of static equipment such as vessels, columns and heat exchangers.
- Support projects, maintenance and improvement activities performed at the operating units, manufacturing sites, chemical plants, supply chain and distribution locations.
- Enable the assigned asset operating units, manufacturing sites, chemical plants, supply chain and distribution locations and their on-site employees in producing and compiling high-value maintenance information and data, as measured by improvement trends in data quality.
- Support the Technical Specialists and other engineers performing primary functions, with scope covering technical data and documents of the operating units, manufacturing sites, chemical plants, supply chain and distribution locations.

This role is expected to help ensure:
- Data maintenance for the site is performed error-free and within assigned controls ("Clean the Stream").
- Assurance projects to extract, mine, clean, track, trend, analyze, or report are agreed with the site within local quality improvement plans.
- Provision, as a technical specialist, of technical support that enables the site to leverage the benefits of the Engineering Tools to the fullest extent.
- Participation in Group and Operational Excellence (OE) team efforts to standardize processes, tools, benchmarks, standards and best practices.

The role may also be assigned as part of the Site Team or Project Team working under an Engineer - Reliability Static.
- Primary responsibility for the preparation, cleansing and validation of inspection data for w-IMS deployment in the designated/assigned OUs, and for RBI data preparation.
- Support end-to-end RBI assessment and implementation for Upstream/IG/Downstream sites within Shell using the w-IMS tool.
- Prepare inspection reports, analyze the corrosion management framework, and populate w-IMS templates.
- Create the inspection, CAIR, RBI re-assessment and corrosion monitoring schedules based on the PEI methodology, and align them with maintenance plans and items in the CMMS.
- Interact and communicate with the site inspection/integrity engineers to deliver the inspection work orders in the CMMS.
- Assist the Team Leads/Technical Specialist to develop, maintain and manage the Site Reference Document along with the assets/operating units.
- Assist the Team Lead/Technical Specialist in responding to inquiries related to the data and its functionality or use in a technical support function, including training site staff in ERPs or use of Request Tools, running pilot tests and demonstrations, and increasing awareness of ERP functionality (e.g. using Measurement Points more effectively). This technical support function includes supporting team leads and other staff on migration efforts.

Other backup accountabilities:
- Engagement with site stakeholders such as Requestors and Approvers to communicate plans, obtain feedback, and solicit improvements.
- Assist the team leads/project focal to standardize processes, tools, methods and procedures as directed (e.g. improving data quality standards and best practices).
- Network with counterparts and other levels of the organization across the different data centers.

What you bring
- University degree in Engineering or an equivalent engineering discipline.
- At least 3 years of experience in Oil & Gas/Petrochemicals or a relevant field.
- Experience working with inspection management systems such as w-IMS and CMMS systems like SAP is desired.
- API 510 or API 570 certification (at least one) and experience working with those standards is desired.
- Field experience in inspection processes, corrosion monitoring and related technical know-how; knowledge of various NDT processes, inspection management plans, piping integrity and maintenance of static equipment such as pressure vessels and heat exchangers; knowledge of Risk-Based Inspection (RBI) methodology per API 570, 580, 510, 571 and 581.
- Field experience in conducting corrosion monitoring inspection, wall thickness measurements and CUI, and experience in fabric maintenance and different types of insulation and installation techniques; preparation of work packs and procedures related to MMI.
- Experience in maintaining Downstream/Upstream/IG asset hierarchy, EDW, AIM-EDMS, SPI Intools, IDMS and w-IMS is preferred.
- Professional engineering certification (API RBI 580/581/570/510 etc.) and ASNT Level II are desirable.
- Proficiency in SAP PM/MM modules, MS Excel, MS Access, AutoCAD, AutoPlant, and other design/drafting software is highly desirable.
- Effective communication and stakeholder management skills, and knowledge of Lean Six Sigma CI methodology, are necessary.
- Virtual working experience is highly desirable.

Other skills: Developed engagement and communication skills; knowledge of technical data processes, Shell's physical processing assets, and associated operating processes. Ability to handle complexity and detail, prioritize tasks, and ensure delivery. Flexible and adaptable, with initiative, analytical capabilities, and problem-solving skills.
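As background on the wall-thickness and inspection-interval concepts above: API 510-style assessments derive a corrosion rate from successive thickness readings, compute remaining life against the minimum required thickness, and commonly set the next inspection at no more than half the remaining life (subject to a cap). The sketch below is a simplified illustration with invented numbers; real assessments follow the applicable code in full.

```python
def corrosion_rate(t_prev, t_curr, years):
    """Corrosion rate in mm/year from two wall-thickness readings."""
    return (t_prev - t_curr) / years

def remaining_life(t_curr, t_required, rate):
    """Years until the wall thins to its minimum required thickness."""
    return (t_curr - t_required) / rate

def next_interval(rl, cap=10.0):
    """Half-remaining-life rule of thumb, capped (per API 510-style practice)."""
    return min(rl / 2.0, cap)

rate = corrosion_rate(t_prev=12.0, t_curr=11.2, years=4)
rl = remaining_life(t_curr=11.2, t_required=9.2, rate=rate)
print(round(rate, 3), round(rl, 2), round(next_interval(rl), 2))  # 0.2 10.0 5.0
```

Tools like w-IMS carry out this kind of calculation across every corrosion monitoring location and feed the resulting due dates into the CMMS schedules described above.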

Posted 1 month ago

Apply

2.0 - 6.0 years

7 - 11 Lacs

Bengaluru

Work from Office

About The Role
This is an Internal document.

Job Title: Senior Data Engineer

As a Senior Data Engineer, you will play a key role in designing and implementing data solutions at Kotak811. You will be responsible for leading data engineering projects, mentoring junior team members, and collaborating with cross-functional teams to deliver high-quality and scalable data infrastructure. Your expertise in data architecture, performance optimization, and data integration will be instrumental in driving the success of our data initiatives.

Responsibilities
1. Data Architecture and Design
   a. Design and develop scalable, high-performance data architecture and data models.
   b. Collaborate with data scientists, architects, and business stakeholders to understand data requirements and design optimal data solutions.
   c. Evaluate and select appropriate technologies, tools, and frameworks for data engineering projects.
   d. Define and enforce data engineering best practices, standards, and guidelines.
2. Data Pipeline Development & Maintenance
   a. Develop and maintain robust and scalable data pipelines for data ingestion, transformation, and loading for real-time and batch use cases.
   b. Implement ETL processes to integrate data from various sources into data storage systems.
   c. Optimise data pipelines for performance, scalability, and reliability:
      i. Identify and resolve performance bottlenecks in data pipelines and analytical systems.
      ii. Monitor and analyse system performance metrics, identifying areas for improvement and implementing solutions.
      iii. Optimise database performance, including query tuning, indexing, and partitioning strategies.
   d. Implement real-time and batch data processing solutions.
3. Data Quality and Governance
   a. Implement data quality frameworks and processes to ensure high data integrity and consistency.
   b. Design and enforce data management policies and standards.
   c. Develop and maintain documentation, data dictionaries, and metadata repositories.
   d. Conduct data profiling and analysis to identify data quality issues and implement remediation strategies.
4. ML Models Deployment & Management (a plus)
   a. Design, develop, and maintain the infrastructure and processes necessary for deploying and managing machine learning models in production environments.
   b. Implement model deployment strategies, including containerization and orchestration using tools like Docker and Kubernetes.
   c. Optimise model performance and latency for real-time inference in consumer applications.
   d. Collaborate with DevOps teams to implement continuous integration and continuous deployment (CI/CD) processes for model deployment.
   e. Monitor and troubleshoot deployed models, proactively identifying and resolving performance or data-related issues.
   f. Implement monitoring and logging solutions to track model performance, data drift, and system health.
5. Team Leadership and Mentorship
   a. Lead data engineering projects, providing technical guidance and expertise to team members; conduct code reviews and ensure adherence to coding standards and best practices.
   b. Mentor and coach junior data engineers, fostering their professional growth and development.
   c. Collaborate with cross-functional teams, including data scientists, software engineers, and business analysts, to drive successful project outcomes.
   d. Stay abreast of emerging technologies, trends, and best practices in data engineering and share knowledge within the team; participate in the evaluation and selection of data engineering tools and technologies.

Qualifications
1. 3-5 years' experience with a Bachelor's Degree in Computer Science, Engineering, Technology or a related field required.
2. Good understanding of streaming technologies like Kafka and Spark Streaming.
3. Experience with Enterprise Business Intelligence Platform/Data platform sizing, tuning, optimization and system landscape integration in large-scale, enterprise deployments.
4. Proficiency in one programming language, preferably Java, Scala or Python.
5. Good knowledge of Agile and SDLC/CI-CD practices and tools.
6. Must have proven experience with Hadoop, MapReduce, Hive, Spark and Scala programming, with in-depth knowledge of performance tuning/optimizing data processing jobs and debugging time-consuming jobs.
7. Proven experience in development of conceptual, logical, and physical data models for Hadoop, relational, EDW (enterprise data warehouse) and OLAP database solutions.
8. Good understanding of distributed systems.
9. Experience working extensively in a multi-petabyte DW environment.
10. Experience in engineering large-scale systems in a product environment.
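The data-drift monitoring mentioned under ML model management can be illustrated with a simple statistical check: compare the mean of recently observed feature values against a training-time baseline and flag large shifts. This is a deliberately minimal sketch with an invented threshold; production systems typically use tests such as the Population Stability Index or Kolmogorov-Smirnov.

```python
import statistics

def drift_alert(baseline, live, threshold=2.0):
    """Flag drift when the live mean moves more than `threshold`
    baseline standard deviations away from the baseline mean."""
    mu = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    z = abs(statistics.mean(live) - mu) / sigma
    return z > threshold

baseline = [10.0, 11.0, 9.5, 10.5, 10.0]   # feature values at training time
stable   = [10.2, 9.9, 10.4]               # similar distribution: no alert
shifted  = [14.0, 15.1, 14.6]              # clear upward shift: alert

print(drift_alert(baseline, stable), drift_alert(baseline, shifted))  # False True
```

Hooked into the logging stack described above, a check like this can page the on-call engineer before degraded inputs silently erode model performance.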

Posted 1 month ago

Apply

15.0 - 20.0 years

6 - 10 Lacs

Mumbai

Work from Office

A Brand Sales Specialist for IBM's Data & AI portfolio is responsible for working with clients/partners to create thought leadership around the Data & AI portfolio. He/she needs to develop expertise in the industry domains of key clients and the addressable market, and should demonstrate the aptitude to be seen as a Trusted Advisor/SME across all steps of the AI Ladder: Collect, Organize, Analyze, and Infuse. Should be proficient at working with line-of-business owners to quantify the value of the solution to the client, and able to collaborate effectively across IBM stakeholders and our business partners.
Revenue - Responsible for sales budgets and growth objectives for the portfolio across the country.
Channel Strategy - Help grow existing ecosystem capacity, identify new partners, and work with the Channel Managers to onboard them, while ensuring present capacity is utilized so that BPs don't lose focus on our core products and existing clients.
Develop Industry Use Cases & Sales Plays - Build and execute on industry-specific use cases and sales plays.
Demand Generation - Plan key demand-generation activities with the marketing team and design marketing programs to increase share of voice for the Data & AI SW portfolio through events and social media campaigns.
Thought Leadership - Engage with selected C-suite executives of Enterprise & Commercial organizations to share best practices around the Data & AI portfolio and build unique, repeatable use cases for each industry.
Client Success - Ensure higher client satisfaction (NPS score) and a 100% deployment rate.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
15 years of experience selling software or applications software.
Minimum 7+ years of experience selling Data & AI solutions such as ETL, EDW, Data Fabric, Data Governance, Data Science/ModelOps, and MDM.
Experience working with partners in complex implementation projects, including global system integrators and packaged software vendors.
People management experience preferred.
Ability to work with sales engineers and customers' technical leads to understand the existing software estate, identify business pain points, and build business cases for the proposed solution.
Experience with complex solution selling, plus commercial and legal negotiation skills working with procurement, legal, and business teams.
Ability to leverage C-level relationships with executives.
Preferred technical and professional experience: NA

Posted 1 month ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Hyderabad

Work from Office

We are seeking a highly experienced Senior Analyst to help guide our global, regional, and functional commercial policy implementation, reporting & governance projects. The successful candidate will contribute by building metrics and analyzing processes, workflows, and systems, with the objective of identifying opportunities for either improvement or automation. Our ideal candidate is comfortable working with all levels of management to gain an in-depth understanding of our strategy and improve the customer experience. This role requires close collaboration with product, segment partners, product marketing, customer-to-cash, sales, marketing, technology, and finance areas. The position resides in the Commercial Excellence organization and reports to the Manager of Commercial Policy Reporting & Governance.
About the Role
In this role as a Senior Analyst, Commercial Policy Reporting & Governance, you will:
Improve, execute, and effectively communicate significant analyses that identify meaningful trends and opportunities across the business.
Participate in regular meetings with stakeholders and management, assessing and addressing issues to identify and implement improvements toward efficient operations.
Provide strong and timely business analytic support to business partners and various organizational stakeholders.
Develop actionable road maps for improving workflows and processes.
Work effectively with partners across the business to develop processes for capturing project activity, creating metrics-driven dashboards for specific use cases and behaviors, and evaluating the data for process-improvement recommendations.
Collaborate with project leads, managers, and business partners to determine schedules and project timelines, ensuring alignment across all areas of the business.
Drive commercial strategy and policy alignment amid fast-changing attributes, while managing reporting, tracking, and governance best practices.
Identify, assess, manage, and communicate risks, laying out mitigation plans and course corrections where appropriate.
Provide insightful diagnostics and actionable insights to the leadership team proactively by spotting trends, questioning data, and asking questions to understand underlying drivers.
Proactively identify trends for future governance and reporting needs while presenting ideas to CE Leadership for new areas of opportunity to drive value.
Prepare, analyze, and summarize weekly, monthly, and periodic operational results for use by key stakeholders, creating reports, specifications, instructions, and flowcharts.
Conduct the full lifecycle of analytics projects, from requirements documentation to design and execution, including pulling, manipulating, and exporting data.
About You
You're a fit for the role of Senior Analyst, Commercial Policy Reporting & Governance, if your background includes:
Bachelor's degree required, preferably in Computer Science, Mathematics, Business Management, or Economics.
4 to 6+ years of professional experience in a similar role.
Willingness to work from 2 pm to 11 pm IST, in hybrid mode (work from office twice a week).
Proven project management skills related to planning and overseeing projects from initial ideation through to completion.
Proven ability to take complex and disparate data sets and create streamlined and efficient data lakes with a connected and routinized cadence.
Advanced-level skills in the following systems: Power BI, Snowflake, Redshift, Salesforce.com, EDW, Excel, MS PowerPoint, and Alteryx or similar middleware data-transformation tools.
Familiarity with contract lifecycle management tools like Conga CLM, HighQ CLM, etc.
Ability to quickly draw insights into trends in data and make recommendations to drive productivity and efficiency.
Exceptional verbal, written, and visual communication skills.
Experience managing multiple projects simultaneously within a matrix organization, adhering to deadlines in a fast-paced environment.
Ability to deploy influencing techniques to drive cross-functional alignment and change across a broad audience.
Flexibility with working hours to support the ever-changing demands of the business.
What's in it For You
Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office, depending on the role) for our office-based roles, delivering a seamless experience that is digitally and physically connected.
Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance.
Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensure you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future.
Industry Competitive Benefits: We offer comprehensive benefit plans, including flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing.
Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.
Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro bono consulting projects and Environmental, Social, and Governance (ESG) initiatives.
Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world.

Posted 2 months ago

Apply

4.0 - 9.0 years

13 - 23 Lacs

Pune

Work from Office

Job Title: Data Engineer - SAS DI
Location: Pune, Maharashtra, India
Experience: 4+ years
Job Type: Full-time, Hybrid
Shift: 11 AM - 8 PM, Monday-Friday
What We're Looking For: Design, develop, and maintain ETL pipelines using Informatica PowerCenter or Talend to extract, transform, and load data into EDW systems and the data lake. Optimize and troubleshoot complex SQL queries and ETL jobs to ensure efficient data processing and high performance. Technologies: SQL, SAS DI, Informatica PowerCenter, Talend, Big Data, Hive

Posted 2 months ago

Apply

12.0 - 17.0 years

25 - 35 Lacs

Pune, Chennai

Hybrid

• Should have led at least 3 large legacy EDW/data platform modernization and migration engagements to Snowflake/Databricks/data-on-cloud in the last 5+ years. • Experience leading all aspects of the project/program life cycle, including

Posted 2 months ago

Apply

5 - 7 years

0 - 0 Lacs

Thiruvananthapuram

Work from Office

Job Summary: We are seeking a highly skilled SAP BODS Data Engineer with strong expertise in ETL development and Enterprise Data Warehousing (EDW). The ideal candidate will have a deep understanding of SAP Business Objects Data Services (BODS) and will be responsible for designing, developing, and maintaining robust data integration solutions.
Key Responsibilities:
Design, develop, and implement efficient ETL solutions using SAP BODS.
Build and optimize SAP BODS jobs, including job design, data flows, scripting, and debugging.
Develop and maintain scalable data extraction, transformation, and loading (ETL) processes from diverse data sources.
Create and manage data integration workflows to ensure high performance and scalability.
Collaborate closely with data architects, analysts, and business stakeholders to deliver accurate and timely data solutions.
Ensure data quality and consistency across different systems and platforms.
Troubleshoot and resolve data-related issues in a timely manner.
Document all ETL processes and maintain technical documentation.
Required Skills & Qualifications:
3+ years of hands-on experience with ETL development using SAP BODS.
Strong proficiency in SAP BODS job design, data flow creation, scripting, and debugging.
Solid understanding of data integration, ETL concepts, and data warehousing principles.
Proficiency in SQL for data querying, data manipulation, and performance tuning.
Familiarity with data modeling concepts and major database systems (e.g., Oracle, SQL Server, SAP HANA).
Excellent problem-solving skills and keen attention to detail.
Strong communication and interpersonal skills to facilitate effective collaboration.
Ability to work independently, prioritize tasks, and manage multiple tasks in a dynamic environment.
Required Skills: SAP, EDW, ETL
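The extract-transform-load flow this role revolves around can be sketched in a few lines. This is plain Python, not SAP BODS, and the field names and quality rule are hypothetical; it only illustrates the validate-then-load pattern the responsibilities describe:

```python
# Hedged ETL sketch: pull rows from a source, normalise a field, and load
# only rows that pass a simple quality check -- mirroring the
# extract/transform/validate/load responsibilities listed above.
def extract(source):
    # In a real job this would read from a database, file, or API.
    return list(source)

def transform(rows):
    # Normalise the (hypothetical) email field: trim whitespace, lowercase.
    return [{"id": r["id"], "email": r["email"].strip().lower()} for r in rows]

def load(rows, target):
    # Quality gate: reject rows with an empty email before loading.
    loaded = [r for r in rows if r["email"]]
    target.extend(loaded)
    return len(loaded)

source = [{"id": 1, "email": " Alice@Example.COM "}, {"id": 2, "email": ""}]
warehouse = []
count = load(transform(extract(source)), warehouse)
```

In BODS the same stages map to data flows, transforms, and validation rules; the sketch just makes the stage boundaries explicit.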

Posted 2 months ago

Apply

5 - 8 years

14 - 18 Lacs

Hyderabad

Work from Office

Role & Responsibilities: Plan, develop, and coordinate test activities, including the creation and execution of test plans and test cases. Perform debugging, defect tracking, test analysis, and documentation. Understand the business functionality and application technology under test. Collaborate with on-site teams and other stream areas during release cycles. Utilize ESG QA tools, methodologies, and processes. Ensure low bug rates and high code quality during releases. Manage build deployments to QA and flag risks/issues proactively.
Skills Required: Experience with SQL and ETL testing, schema validation, and SCD types. Strong knowledge of data warehouse/BI testing and cloud-based services (Azure). Expertise in writing complex SQL queries and validating data during migration. Proficiency in UFT, TFS, Microsoft tools, and peripheral technologies (SAP, PeopleSoft, Aderant). Strong communication, estimation, and project delivery skills. Team leadership, remote collaboration, and quality focus.
Interested candidates can share their resumes at sarvani.j@ifinglobalgroup.com
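One concrete example of the "SCD types" validation this role mentions: in a Slowly Changing Dimension Type 2 table, each business key should have exactly one current row and no overlapping effective-date ranges. A minimal sketch of that check in plain Python (column names are hypothetical; in practice an ETL tester would express the same rules in SQL):

```python
# Illustrative SCD Type 2 validation: for each business key, exactly one
# row must be current (valid_to is None) and effective ranges must not
# overlap. Returns a list of human-readable error strings.
def validate_scd2(rows):
    """rows: dicts with business_key, valid_from, valid_to (None = current)."""
    errors = []
    by_key = {}
    for r in rows:
        by_key.setdefault(r["business_key"], []).append(r)
    for key, versions in by_key.items():
        current = [r for r in versions if r["valid_to"] is None]
        if len(current) != 1:
            errors.append(f"{key}: {len(current)} current rows")
        ordered = sorted(versions, key=lambda r: r["valid_from"])
        for prev, nxt in zip(ordered, ordered[1:]):
            # A non-closed or late-closing earlier version overlaps its successor.
            if prev["valid_to"] is None or prev["valid_to"] > nxt["valid_from"]:
                errors.append(f"{key}: overlapping ranges")
    return errors

rows = [
    {"business_key": "C1", "valid_from": "2023-01-01", "valid_to": "2024-01-01"},
    {"business_key": "C1", "valid_from": "2024-01-01", "valid_to": None},
    {"business_key": "C2", "valid_from": "2023-06-01", "valid_to": None},
]
issues = validate_scd2(rows)
```

The ISO date strings sort lexicographically, which is why plain string comparison works here.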

Posted 2 months ago

Apply