10.0 years
0 Lacs
Delhi
On-site
Where Data Does More. Join the Snowflake team.

At the forefront of the data revolution, Snowflake is building the world’s greatest data and applications platform. Our ‘get it done’ culture fosters innovation, impact, and collaboration. We are rapidly expanding our partner Go-To-Market initiatives with System Integrators, Cloud Service Providers, and Data Cloud Partners, who are crucial in helping customers leverage the Snowflake AI Data Cloud. We seek a self-driven individual with excellent English verbal and written communication skills to grow these partnerships, engaging both local and global teams.

One of the unique benefits of Snowflake’s architecture is the ability to securely share data, applications, and solutions with other Snowflake accounts without creating a copy of the data. The Snowflake Data Cloud builds on our secure data sharing functionality to be the ‘App Store’ for data, enabling providers and consumers to publish, discover, and monetize data, applications, and solutions. Providers to the Snowflake Marketplace use Data Sharing as the means to deliver their data or service, replacing traditional delivery methods such as files and APIs. Data Sharing and the Marketplace play a key strategic role in our Data Cloud vision and drive the network effect of the Data Cloud!

Success in this position requires the candidate to be a technical advisor by aligning with key programs and educating and upskilling partners on these key product features. The candidate will present to both technical and executive audiences, whether whiteboarding or using presentations and demos to build mind share among Snowflake Data Cloud and SI Partners in India. We are looking for a technical team member who understands the data and applications partner ecosystem as well as how to grow and manage content partnerships. In addition to technically onboarding and enabling partners, you will be an important guide in the creation of the Go-to-Market for new partners.
This position will be based in Mumbai, and occasional travel to partner sites or industry events within India may be required.

As a Partner Solution Engineer, you will: Technically onboard and enable partners to re-platform their Data and AI applications onto the Snowflake AI Data Cloud. Collaborate with partners to develop Snowflake solutions in customer engagements; you will work with our partners to create assets and demos, build hands-on POCs, and pitch Snowflake solutions. Help Solution Providers/Practice Leads with the technical strategies that enable them to sell their offerings on Snowflake. Keep partners up to date on key Snowflake product updates and future roadmaps so they can represent the latest technology solutions and benefits to their clients. Run technical enablement programs to share best practices, and solution design workshops to help partners create effective solutions. Drive strategic engagements by quickly grasping new concepts and articulating their business value. Showcase the impact of Snowflake through compelling customer success stories and case studies. Build a strong understanding of how partners generate revenue, the industry priorities and complexities they face, and where Snowflake products can have the most impact on their products and services. Hold conversations with other technologists and deliver presentations at the C-level.

Preferred skill sets and experiences: A total of 10+ years of relevant experience. Experience working with Tech Partners, ISVs, and System Integrators (SIs) in India. Ability to develop data domain thought leadership within the partner community. Ability to provide technical product and deep architectural expertise, and to share the latest product capabilities, with our Partner Solution Architect community based in India. Presales or hands-on experience with Data Warehouse, Data Lake, or Lakehouse platforms.
Presales or hands-on experience in designing and building highly scalable data pipelines using Spark and Kafka to ingest data from various systems. Experience with our partner integration ecosystem (e.g., Alation, Fivetran, Informatica, dbt Cloud) is a plus. Hands-on experience and strong knowledge of Docker and how to containerize Python-based applications. Knowledge of container networking and Kubernetes. Working knowledge of APIs and API integration. Proficiency in Agile development practices and Continuous Integration/Continuous Deployment (CI/CD), including DataOps and MLOps. Presales or hands-on experience using Big Data or Cloud integration technologies such as Azure Data Factory, AWS Glue, AWS Lambda, etc. Experience in the AI/ML domain is a plus.

Snowflake is growing fast, and we’re scaling our team to help enable and accelerate our growth. We are looking for people who share our values, challenge ordinary thinking, and push the pace of innovation while building a future for themselves and Snowflake. How do you want to make your impact? For jobs located in the United States, please visit the job posting on the Snowflake Careers Site for salary and benefits information: careers.snowflake.com
Posted 1 week ago
5.0 - 7.0 years
0 Lacs
Noida
On-site
5 - 7 Years. 2 Openings. Noida.

Role description

Role Proficiency: This role requires proficiency in data pipeline development, including coding and testing data pipelines for ingesting, wrangling, transforming, and joining data from various sources. Must be skilled in ETL tools such as Informatica, Glue, Databricks, and DataProc, with coding expertise in Python, PySpark, and SQL. Works independently and has a deep understanding of data warehousing solutions, including Snowflake, BigQuery, Lakehouse, and Delta Lake. Capable of calculating costs and understanding performance issues related to data solutions.

Outcomes: Act creatively to develop pipelines and applications by selecting appropriate technical options, optimizing application development, maintenance, and performance using design patterns and reusing proven solutions. Interpret requirements to create optimal architecture and design, developing solutions in accordance with specifications. Document and communicate milestones/stages for end-to-end delivery. Code adhering to best coding standards; debug and test solutions to deliver best-in-class quality. Perform performance tuning of code and align it with the appropriate infrastructure to optimize efficiency. Validate results with user representatives, integrating the overall solution seamlessly. Develop and manage data storage solutions, including relational databases, NoSQL databases, and data lakes. Stay updated on the latest trends and best practices in data engineering, cloud technologies, and big data tools. Influence and improve customer satisfaction through effective data solutions.
Measures of Outcomes: Adherence to engineering processes and standards. Adherence to schedule/timelines. Adherence to SLAs where applicable. Number of defects post delivery. Number of non-compliance issues. Reduction in recurrence of known defects. Quick turnaround of production bugs. Completion of applicable technical/domain certifications. Completion of all mandatory training requirements. Efficiency improvements in data pipelines (e.g., reduced resource consumption, faster run times). Average time to detect, respond to, and resolve pipeline failures or data issues. Number of data security incidents or compliance breaches.

Outputs Expected:

Code Development: Develop data processing code independently, ensuring it meets performance and scalability requirements. Define coding standards, templates, and checklists. Review code for team members and peers.

Documentation: Create and review templates, checklists, guidelines, and standards for design, processes, and development. Create and review deliverable documents, including design documents, architecture documents, infrastructure costing, business requirements, source-target mappings, and test cases and results.

Configuration: Define and govern the configuration management plan. Ensure compliance within the team.

Testing: Review and create unit test cases, scenarios, and execution plans. Review the test plan and test strategy developed by the testing team. Provide clarifications and support to the testing team as needed.

Domain Relevance: Advise data engineers on the design and development of features and components, demonstrating a deeper understanding of business needs. Learn about customer domains to identify opportunities for value addition. Complete relevant domain certifications to enhance expertise.

Project Management: Manage the delivery of modules effectively.

Defect Management: Perform root cause analysis (RCA) and mitigation of defects. Identify defect trends and take proactive measures to improve quality.
Estimation: Create and provide input for effort and size estimation for projects.

Knowledge Management: Consume and contribute to project-related documents, SharePoint libraries, and client universities. Review reusable documents created by the team.

Release Management: Execute and monitor the release process to ensure smooth transitions.

Design Contribution: Contribute to the creation of high-level design (HLD), low-level design (LLD), and system architecture for applications, business components, and data models.

Customer Interface: Clarify requirements and provide guidance to the development team. Present design options to customers and conduct product demonstrations.

Team Management: Set FAST goals and provide constructive feedback. Understand team members' aspirations and provide guidance and opportunities for growth. Ensure team engagement in projects and initiatives.

Certifications: Obtain relevant domain and technology certifications to stay competitive and informed.

Skill Examples: Proficiency in SQL, Python, or other programming languages used for data manipulation. Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF. Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g., AWS Glue, BigQuery). Ability to conduct tests on data pipelines and evaluate results against data quality and performance specifications. Experience in performance tuning of data processes. Expertise in designing and optimizing data warehouses for cost efficiency. Ability to apply and optimize data models for efficient storage, retrieval, and processing of large datasets. Capacity to clearly explain and communicate design and development aspects to customers. Ability to estimate time and resource requirements for developing and debugging features or components.
Knowledge Examples: Knowledge of various ETL services offered by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/DataFlow, Azure ADF, and ADLF. Proficiency in SQL for analytics, including windowing functions. Understanding of data schemas and models relevant to various business contexts. Familiarity with domain-related data and its implications. Expertise in data warehousing optimization techniques. Knowledge of data security concepts and best practices. Familiarity with design patterns and frameworks in data engineering.

Additional Comments: Skills: Cloud platforms (AWS, MS Azure, GCP, etc.). Containerization and orchestration (Docker, Kubernetes, etc.). API development. Data pipeline construction using languages like Python, PySpark, and SQL. Data streaming (Kafka, Azure Event Hub, etc.). Data parsing (Akka, MinIO, etc.). Database management (SQL and NoSQL, including ClickHouse, PostgreSQL, etc.). Agile methodology (Git, Jenkins, or Azure DevOps, etc.). JS connectors/frameworks for frontend/backend. Collaboration and communication skills. AWS Cloud, Azure Cloud, Docker, Kubernetes.

About UST: UST is a global digital transformation solutions provider. For more than 20 years, UST has worked side by side with the world’s best companies to make a real impact through transformation. Powered by technology, inspired by people, and led by purpose, UST partners with their clients from design to operation. With deep domain expertise and a future-proof philosophy, UST embeds innovation and agility into their clients’ organizations. With over 30,000 employees in 30 countries, UST builds for boundless impact, touching billions of lives in the process.
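The core pipeline skills this role describes (ingesting, transforming, and joining data with Python and SQL) can be illustrated with a minimal, self-contained sketch. The table names and sample records here are hypothetical, and an in-memory SQLite database stands in for a real warehouse such as Snowflake or BigQuery.

```python
import sqlite3

# Hypothetical sample records standing in for two source systems.
orders = [(1, "A-100", 250.0), (2, "A-101", 75.5), (3, "A-100", 120.0)]
customers = [("A-100", "Acme Corp"), ("A-101", "Globex")]

# Ingest: load raw records into staging tables.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_orders (order_id INT, customer_id TEXT, amount REAL)")
conn.execute("CREATE TABLE stg_customers (customer_id TEXT, name TEXT)")
conn.executemany("INSERT INTO stg_orders VALUES (?, ?, ?)", orders)
conn.executemany("INSERT INTO stg_customers VALUES (?, ?)", customers)

# Transform + join: aggregate order amounts per customer and join to names.
rows = conn.execute("""
    SELECT c.name, COUNT(*) AS n_orders, SUM(o.amount) AS total
    FROM stg_orders o
    JOIN stg_customers c ON c.customer_id = o.customer_id
    GROUP BY c.name
    ORDER BY total DESC
""").fetchall()

for name, n, total in rows:
    print(f"{name}: {n} orders, {total:.2f} total")
```

In a production pipeline the same ingest/transform/join steps would typically run in PySpark or an ETL tool against warehouse tables, with the SQL largely unchanged.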
Posted 1 week ago
3.0 - 7.0 years
5 - 9 Lacs
Hyderabad, Pune
Work from Office
Designing dashboards with the use of visualization tools like Tableau. Communicating with customers to analyze historical data and identify KPIs. Improving data processing speed by building SQL automations. Tweaking SQL queries for best performance. Analysing the data to identify trends and share insights. Recognizing areas for automation. Restricting data for particular users with the help of user filters. Producing support documentation and keeping existing documentation up to date. Carrying out root cause analysis investigations. Good knowledge of Tableau Server administration. Good knowledge of a Tableau 3-node cluster environment. Knowledge of other reporting tools like OBIEE, Power BI, etc. Good knowledge of SQL to build reports in Tableau. Knowledge of NOETIX Query Builder and NOETIX administration activities. Good knowledge of PowerShell scripts for automation of Tableau reports.
Posted 1 week ago
5.0 - 8.0 years
9 - 14 Lacs
Bengaluru
Work from Office
Role Purpose: The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Do:

Oversee and support the process by reviewing daily transactions on performance parameters. Review the performance dashboard and the scores for the team. Support the team in improving performance parameters by providing technical support and process guidance. Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions. Ensure standard processes and procedures are followed to resolve all client queries. Resolve client queries as per the SLAs defined in the contract. Develop an understanding of the process/product for the team members to facilitate better client interaction and troubleshooting. Document and analyze call logs to spot the most frequently occurring trends to prevent future problems. Identify red flags and escalate serious client issues to the team leader in cases of untimely resolution. Ensure all product information and disclosures are given to clients before and after the call/email requests. Avoid legal challenges by monitoring compliance with service agreements.

Handle technical escalations through effective diagnosis and troubleshooting of client queries. Manage and resolve technical roadblocks/escalations as per SLA and quality requirements. If unable to resolve an issue, escalate it to TA & SES in a timely manner. Provide product support and resolution to clients by performing question diagnosis while guiding users through step-by-step solutions. Troubleshoot all client queries in a user-friendly, courteous, and professional manner. Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business. Organize ideas and effectively communicate oral messages appropriate to listeners and situations. Follow up and make scheduled call-backs to customers to record feedback and ensure compliance with contract SLAs.

Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client. Mentor and guide Production Specialists on improving technical knowledge. Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists. Develop and conduct trainings (triages) within products for Production Specialists as per target. Inform the client about the triages being conducted. Undertake product trainings to stay current with product features, changes, and updates. Enroll in product-specific and any other trainings per client requirements/recommendations. Identify and document the most common problems and recommend appropriate resolutions to the team. Update job knowledge by participating in self-learning opportunities and maintaining personal networks.

Mandatory Skills: Informatica iPaaS. Experience: 5-8 Years.
Posted 1 week ago
10.0 - 15.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Title: Director – Data Presales Architect. Location: Greater Noida. Experience Required: 10-15 years.

Role Overview: We are seeking a highly skilled and experienced professional to lead and support our data warehousing and data center architecture initiatives. The ideal candidate will have deep expertise in Data Warehousing, Data Lakes, Data Integration, and Data Governance, with hands-on experience in ETL tools and cloud platforms such as AWS, Azure, GCP, and Snowflake. This role demands strong presales experience, technical leadership, and the ability to manage complex enterprise deals across multiple geographies.

Key Responsibilities: Architect and design scalable Data Warehousing and Data Lake solutions. Lead presales engagements, including RFP/RFI/RFQ lifecycle management. Create and present compelling proposals and solution designs to clients. Collaborate with cross-functional teams to deliver end-to-end solutions. Estimate efforts and resources for customer requirements. Drive Managed Services opportunities and enterprise deal closures. Engage with clients across MEA, APAC, US, and UK regions. Ensure alignment of solutions with business goals and technical requirements. Maintain high standards of documentation and presentation for client-facing materials.

Must-Have: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. Certifications in AWS, Azure, GCP, or Snowflake are a plus. Experience working in consulting or system integrator environments. Strong knowledge of Data Warehousing, Data Lakes, Data Integration, and Data Governance. Hands-on experience with ETL tools (e.g., Informatica, Talend, etc.).
Exposure to cloud environments: AWS, Azure, GCP, Snowflake. Minimum 2 years of presales experience, with an understanding of presales operating processes. Experience in enterprise-level deals and Managed Services. Proven ability to handle multi-geo engagements. Excellent presentation and communication skills. Strong understanding of effort-estimation techniques for customer requirements.
Posted 1 week ago
4.0 years
0 Lacs
Andhra Pradesh, India
On-site
A career within Salesforce Consulting services will provide you with the opportunity to help our clients leverage Salesforce technology to enhance their customer experiences, enable sustainable change, and drive results. We focus on understanding our clients' challenges and developing custom solutions powered by Salesforce to transform their sales, service, and marketing capabilities by exploring data and identifying trends, managing customer life cycles, strategically building and leveraging online communities, driving employee engagement and collaboration, and connecting directly with channel partners to share goals, objectives, and activities in a secure, branded location.

To really stand out and make us fit for the future in a constantly changing world, each and every one of us at PwC needs to be a purpose-led and values-driven leader at every level. To help us achieve this we have the PwC Professional, our global leadership development framework. It gives us a single set of expectations across our lines, geographies, and career paths, and provides transparency on the skills we need as individuals to be successful and progress in our careers, now and in the future.

As a Senior Associate, you'll work as part of a team of problem solvers, helping to solve complex business issues from strategy to execution. PwC Professional skills and responsibilities for this management level include but are not limited to: Use feedback and reflection to develop self-awareness, personal strengths, and address development areas. Delegate to others to provide stretch opportunities, coaching them to deliver results. Demonstrate critical thinking and the ability to bring order to unstructured problems. Use a broad range of tools and techniques to extract insights from current industry or sector trends. Review your work and that of others for quality, accuracy, and relevance. Know how and when to use tools available for a given situation and explain the reasons for this choice.
Seek and embrace opportunities which give exposure to different situations, environments, and perspectives. Use straightforward communication, in a structured way, when influencing and connecting with others. Able to read situations and modify behavior to build quality relationships. Uphold the firm's code of ethics and business conduct.

The Opportunity: When you join PwC Acceleration Centers (ACs), you step into a pivotal role focused on actively supporting various Acceleration Center services, from Advisory to Assurance, Tax, and Business Services. In our innovative hubs, you’ll engage in challenging projects and provide distinctive services to support client engagements through enhanced quality and innovation. You’ll also participate in dynamic and digitally enabled training that is designed to grow your technical and professional skills.

As part of the Business Application Consulting team, you translate customer requirements into functional configurations of Salesforce.com. As a Senior Associate, you analyze complex problems, mentor others, and maintain rigorous standards. You focus on building client relationships and developing a deeper understanding of the business context, while navigating increasingly complex situations and growing your personal brand and technical knowledge.
Responsibilities: Translate customer requirements into functional Salesforce configurations. Analyze and address complex issues within client projects. Mentor and support junior team members. Foster and strengthen client relationships. Gain a thorough understanding of business context. Manage and navigate intricate scenarios. Enhance personal brand and technical skills. Uphold exceptional standards and quality in deliverables.

What You Must Have: Bachelor's Degree. 4 years of experience. 2-3 years of experience in Salesforce CPQ & Billing projects. Experience with configuration and implementation of Salesforce CPQ Cloud. 1-3 successful completions of full-cycle CPQ and Billing implementations. Thorough understanding of the quote-to-cash process. Hands-on experience on the Force.com platform using Apex and Flows. Experience working with LWC (Lightning Web Components). Experience working with the Advanced Approvals process. Experience with SOAP/REST/Platform Events/Streaming APIs and 3rd-party integrations.

What Sets You Apart: Bachelor of Technology preferred. Proficient experience in Salesforce configuration, security, and mapping features to business requirements. Experience implementing integration solutions between CRM, ERP, and financial systems (for example, Zuora, NetSuite). Advanced RDBMS knowledge and building SQL queries. Proficient written and verbal communication skills. Proficient experience with handling large data volumes. Producing and delivering technical and integrated solutions involving different Salesforce clouds (including but not limited to Sales, Service, Revenue, and Platform) and a variety of middleware products (MuleSoft, Informatica, etc.), establishing quality and schedule.
Posted 1 week ago
2.0 years
0 Lacs
Navi Mumbai, Maharashtra, India
On-site
About QualityKiosk Technologies: QualityKiosk Technologies is a leading independent Quality Engineering (QE) and digital transformation provider, helping businesses deliver high-performing, user-friendly applications. Founded in 2000, the company offers services in QA automation, performance assurance, intelligent automation (IA), robotic process automation (RPA), customer experience management, SRE, cloud, data analytics, and more. With a global presence in 25+ countries and a team of 4,000+ professionals, QualityKiosk supports major players across banking, e-commerce, telecom, automotive, insurance, OTT, and pharmaceuticals. Recognized by Forrester, Gartner, and others, it serves 50 of India’s Fortune 100 and 18 of the global Fortune 500 companies. Focused on innovation and rapid execution, the company aims for 5X growth in revenue and workforce over the next five years.

Job Description: Location: Mahape. Experience: 2 years. Early joiners preferred!

Strategic Responsibilities: Support the development of the Data Strategy. Assist the DQ Lead in implementing data governance frameworks, policies, and standards. Help build the Data Governance Office as a center of excellence promoting a data-driven culture. Monitor industry and regulatory trends to guide data initiatives. Promote data quality and position data as a strategic asset.

Core Responsibilities: Support reporting on data quality performance and standards. Develop and maintain data quality rules and profiling tools using Informatica DQ. Contribute to data catalog artifacts (definitions, lineage, standards). Maintain organizational reference data. Provide expertise on systems and processes to improve data quality.

Additional Accountabilities: Create data quality dashboards and monitoring tools. Support RCA and business teams with data quality analysis.

Communication: Build strong relationships across business units and support functions. Engage with vendors, data partners, consultants, and regulators.
Key KPIs: Delivery of the DQ roadmap. Reduction in high-priority data issues. Improved stakeholder satisfaction. Measurable improvements in data quality.

Decision-Making Scope: Advise on data governance implementation and strategic priorities. Recommend budget allocations for data initiatives.

Qualifications & Experience: Bachelor's degree and 2+ years in data roles. Experience in data quality, governance, and financial services. Skilled in Informatica DQ, SQL, and dashboard tools. Strong analytical, communication, and stakeholder management skills.
Posted 1 week ago
0.0 - 5.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Category: Software Development/Engineering. Main location: India, Karnataka, Bangalore. Position ID: J0625-0922. Employment Type: Full Time.

Position Description: Founded in 1976, CGI is among the largest independent IT and business consulting services firms in the world. With 94,000 consultants and professionals across the globe, CGI delivers an end-to-end portfolio of capabilities, from strategic IT and business consulting to systems integration, managed IT and business process services, and intellectual property solutions. CGI works with clients through a local relationship model complemented by a global delivery network that helps clients digitally transform their organizations and accelerate results. CGI Fiscal 2024 reported revenue is CA$14.68 billion and CGI shares are listed on the TSX (GIB.A) and the NYSE (GIB). Learn more at cgi.com.

Job Title: OFSAA Developer. Position: Lead Analyst. Experience: 5 - 10 Years. Category: Software Development/Engineering. Shift: General. Main location: Bangalore/Chennai/Hyderabad/Pune. Position ID: J0625-0922. Employment Type: Full Time. Education Qualification: Bachelor's degree in Computer Science or a related field or higher, with a minimum of 5 years of relevant experience.

Position Description: Works independently under limited supervision and applies knowledge of subject matter in Applications Development. Possesses sufficient knowledge and skills to effectively deal with issues and challenges within the field of specialization and to develop simple application solutions. Second-level professional with direct impact on results and outcomes.

Your future duties and responsibilities: Designing, implementing, and maintaining OFSAA solutions for financial institutions. Working closely with business and IT teams, developing and documenting SQL queries, executing data sanity checks, and managing data movement and server administration.
Data Movement and Server Management: Understand complex SQL queries, analyze data-related issues, and identify the root cause of issues/defects. Develop and implement performance-optimal systems. Troubleshoot technical issues by analyzing OFSAA log files and fixing the issues.

Required qualifications to be successful in this role:

Must-Have Skills: Design, develop, and configure components within the OFSAA suite, including Data Foundation, DIH (Data Integration Hub), FSDF, and Metadata Manager. Implement business rules, data models, and mapping logic using OFSAA tools like Rule Framework, ERWIN, or T2T (Table to Table). Design and develop ETL workflows to extract, transform, and load financial and risk data from source systems into OFSAA staging and atomic layers. Work with ODI, Informatica, or PL/SQL scripts to ingest and transform large datasets. Build and maintain validation rules to ensure data accuracy, completeness, and consistency across ingestion and reporting layers. Optimize OFSAA components and SQL queries for performance and scalability. Collaborate with Business Analysts, Data Architects, and Reporting Teams to gather requirements and translate them into OFSAA configuration and code.

Good-to-Have Skills: Knowledge of finance and accounting principles.

CGI is an equal opportunity employer. In addition, CGI is committed to providing accommodation for people with disabilities in accordance with provincial legislation. Please let us know if you require reasonable accommodation due to a disability during any aspect of the recruitment process and we will work with you to address your needs. #LI-SN1

Skills: Data flow analysis, Data Modeling, Data Warehousing, DataStage, ETL, Oracle SQL Developer, Python.

What you can expect from us: Together, as owners, let’s turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect and belonging.
Here, you’ll reach your full potential because… You are invited to be an owner from day 1 as we work together to bring our Dream to life. That’s why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company’s strategy and direction. Your work creates value. You’ll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise. You’ll shape your career by joining a company built to grow and last. You’ll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons. Come join our team—one of the largest IT and business consulting services firms in the world.
Posted 1 week ago
0.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
Designation: Senior Analyst. Level: L2. Experience: 4 to 7 years. Location: Chennai.

Job Description: We are seeking a highly skilled and motivated Senior Data Quality Analyst (DQA) who is responsible for ensuring the accuracy, completeness, and reliability of the organization’s data, enabling informed decision-making. The ideal candidate works with various business stakeholders to understand business requirements and define data quality standards, developing and enforcing data validation procedures to ensure compliance with the company’s data standards.

Responsibilities:

Data Quality Monitoring & Validation (40% of time): Profile data to identify anomalies (missing values, duplicates, outliers). Run data quality checks to validate against business rules. Automate checks by scheduling scripts (SQL/Python) to flag issues in real time.

Issue Resolution & Root Cause Analysis (30% of time): Triage errors, working with IT/data engineers to fix corrupt data. Track defects by logging issues in Jira/Snowflake and prioritizing fixes. Perform root cause analysis to determine whether issues stem from ETL bugs, user input, or system failures.

Governance & Documentation (20% of time): Ensure compliance with data governance frameworks. Manage metadata and document data lineage. Support compliance audits, ensuring adherence to GDPR, HIPAA, or internal policies. Implement data quality standards and policies.

Stakeholder Collaboration (10% of time): Train teams, educating data citizens, data owners, and data stewards on data quality best practices. Monitor and report on data quality metrics, including reports to leadership.
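The automated checks described above (profiling for missing values, duplicates, and outliers, then flagging issues) can be sketched in a few lines of standard-library Python. The field names, sample records, and z-score threshold here are hypothetical, chosen only to illustrate the pattern.

```python
import statistics

# Hypothetical records from an upstream feed; None marks a missing value.
records = [
    {"id": 1, "amount": 100.0},
    {"id": 2, "amount": None},
    {"id": 2, "amount": 105.0},   # duplicate id
    {"id": 3, "amount": 98.0},
    {"id": 4, "amount": 5000.0},  # likely outlier
]

def profile(rows, key="amount"):
    """Flag missing values, duplicate ids, and z-score outliers."""
    issues = []
    seen = set()
    values = [r[key] for r in rows if r[key] is not None]
    mean, stdev = statistics.mean(values), statistics.stdev(values)
    for r in rows:
        if r[key] is None:
            issues.append(("missing", r["id"]))
        elif stdev and abs(r[key] - mean) / stdev > 1.0:  # hypothetical threshold
            issues.append(("outlier", r["id"]))
        if r["id"] in seen:
            issues.append(("duplicate", r["id"]))
        seen.add(r["id"])
    return issues

for kind, rid in profile(records):
    print(f"{kind}: id={rid}")
```

In practice the same checks would run on a schedule against warehouse tables, with the flagged issues logged to a defect tracker (Jira, in this posting's stack) for triage.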
Skills:
Technical skills:
- Knowledge of data quality tools and data profiling techniques (e.g., Talend, Informatica, Ataccama, DQOps, open-source tools)
- Familiarity with database management systems and data governance initiatives
- Proficiency in SQL and data management principles
- Experience with data integration and ETL tools
- Understanding of data visualization tools and techniques
- Knowledge of data governance and metadata management
- Familiarity with Python/R for automation and scripting
Analytical skills:
- Strong analytical and problem-solving skills
- Ability to identify data patterns and trends
- Understanding of statistical analysis and data quality metrics
- Experience with data cleansing and data validation techniques, including data remediation
- Ability to assess data quality and identify areas needing improvement
- Experience conducting data audits and implementing data quality processes
- Ability to document data quality rules and procedures

Job Snapshot
Updated Date: 25-07-2025
Job ID: J_3911
Location: Chennai, Tamil Nadu, India
Experience: 4 - 7 Years
Employee Type: Permanent
Posted 1 week ago
3.0 years
0 Lacs
Hyderabad, Telangana
On-site
Hyderabad, Telangana
Job ID: 30187591
Job Category: Digital Technology
Job Title: Master Data Analyst
Preferred Location: Hyderabad, India
Full Time/Part Time: Full Time

Build a career with confidence. Carrier Global Corporation, a global leader in intelligent climate and energy solutions, is committed to creating solutions that matter for people and our planet for generations to come. From the beginning, we've led in inventing new technologies and entirely new industries. Today, we continue to lead because we have a world-class, diverse workforce that puts the customer at the center of everything we do.

Job Summary
We are seeking a detail-oriented and experienced Master Data Analyst to ensure the accuracy, consistency, and integrity of our critical master data across various enterprise systems. The Master Data Analyst will play a crucial role in data governance, data quality initiatives, and supporting business processes through reliable and well-managed master data.

Key Responsibilities
- Develop, implement, and maintain master data management (MDM) policies, standards, and procedures.
- Ensure data quality, completeness, and consistency of master data (e.g., customer, product, vendor, material) across all relevant systems.
- Perform data profiling, cleansing, and validation to identify and resolve data quality issues.
- Collaborate with business units and IT teams to define data definitions, business rules, and data hierarchies.
- Act as a data steward, overseeing the creation, modification, and deletion of master data records.
- Support data integration efforts, ensuring master data is accurately and efficiently synchronized between systems.
- Document master data processes, data flows, and data lineage.
- Participate in projects related to data migration, system implementations, and data governance initiatives.
- Provide training and support to end-users on master data best practices and tools.
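The data profiling and cleansing work described above often starts with duplicate detection on normalized keys, for example vendor names. A minimal sketch, where the normalization rules and sample records are illustrative assumptions rather than any specific MDM product's behavior:

```python
# Normalize vendor names (lowercase, strip punctuation and common legal
# suffixes), then group records whose normalized names collide.
import re
from collections import defaultdict

LEGAL_SUFFIXES = {"inc", "llc", "ltd", "corp", "corporation", "co"}

def normalize_name(name):
    """Lowercase, strip punctuation, and drop common legal suffixes."""
    tokens = re.sub(r"[^a-z0-9 ]", "", name.lower()).split()
    return " ".join(t for t in tokens if t not in LEGAL_SUFFIXES)

def find_duplicates(records):
    """Return {normalized_name: [record ids]} for colliding records."""
    groups = defaultdict(list)
    for rec in records:
        groups[normalize_name(rec["name"])].append(rec["id"])
    return {k: v for k, v in groups.items() if len(v) > 1}

vendors = [
    {"id": "V001", "name": "Acme Corp."},
    {"id": "V002", "name": "ACME Corporation"},
    {"id": "V003", "name": "Globex Ltd"},
]
print(find_duplicates(vendors))  # {'acme': ['V001', 'V002']}
```

Real MDM tools add fuzzy matching, address standardization, and survivorship rules on top of this basic idea.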
Required Qualifications
- Bachelor's degree in Information Systems, Data Science, or a related quantitative field.
- 3+ years of experience in a Master Data Management (MDM), Data Quality, or Data Analyst role, specifically focused on master data.
- Strong understanding of master data concepts, data governance principles, and data lifecycle management.
- Proficiency with data analysis tools and techniques.
- Experience with enterprise resource planning (ERP) systems (e.g., SAP, Oracle, Microsoft Dynamics) and their master data structures.
- Experience with cloud platforms (AWS, Azure) or relevant data technologies.
- Excellent analytical, problem-solving, and communication skills, with the ability to translate technical concepts for non-technical stakeholders.
- Proven ability to work independently and collaboratively in a fast-paced environment.

Preferred Qualifications
- Experience with MDM software solutions (e.g., Informatica MDM, SAP MDG).
- Familiarity with SQL and experience querying relational databases.
- Knowledge of SAP modules (ECC, CRM, BW) and of data governance, metadata management, and data cataloging tools (e.g., Alation, Collibra).
- Familiarity with handling MDM in SAP ECC and SAP S/4 versions.
- Knowledge of data warehousing concepts and business intelligence tools (e.g., Power BI, Tableau).
- Experience with data governance frameworks and tools.
- Certifications in data management or related fields.

Benefits
We are committed to offering competitive benefits programs for all of our employees and enhancing our programs when necessary.
- Have peace of mind and body with our health insurance
- Make yourself a priority with flexible schedules and leave policy
- Drive forward your career through professional development opportunities
- Achieve your personal goals with our Employee Assistance Program

Our commitment to you
Our greatest assets are the expertise, creativity and passion of our employees.
We strive to provide a great place to work that attracts, develops and retains the best talent, promotes employee engagement, fosters teamwork and ultimately drives innovation for the benefit of our customers. We strive to create an environment where you feel that you belong, with diversity and inclusion as the engine to growth and innovation. We develop and deploy best-in-class programs and practices, providing enriching career opportunities, listening to employee feedback and always challenging ourselves to do better. This is The Carrier Way. Join us and make a difference. Now! Carrier is An Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or veteran status, age or any other federally protected class.
Posted 1 week ago
5.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Greetings from Synergy Resource Solutions, a leading recruitment consultancy. Our client is an ISO 27001:2013 and ISO 9001 certified company and a pioneer in web design and development from India. The company has also been voted one of the top 10 mobile app development companies in India. It is a leading IT consulting and web solutions provider for custom software, websites, games, custom web applications, enterprise mobility, mobile apps and cloud-based application design & development. The company is ranked among the fastest-growing web design and development companies in India, with 3900+ successfully delivered projects across the United States, UK, UAE, Canada and other countries. A client retention rate of over 95% demonstrates their level of service and client satisfaction.

Position: Senior Data Engineer
Experience: 5+ years relevant experience
Education Qualification: Bachelor's or Master’s degree in Computer Science, Information Technology, or a related field.
Job Location: Ahmedabad
Shift: 11 AM – 8.30 PM

Key Responsibilities: Our client is seeking an experienced and motivated Senior Data Engineer to join their AI & Automation team. The ideal candidate will have 5–8 years of experience in data engineering, with a proven track record of designing and implementing scalable data solutions. A strong background in database technologies, data modeling, and data pipeline orchestration is essential. Additionally, hands-on experience with generative AI technologies and their applications in data workflows will set you apart. In this role, you will lead data engineering efforts to enhance automation, drive efficiency, and deliver data-driven insights across the organization.

Job Description:
• Design, build, and maintain scalable, high-performance data pipelines and ETL/ELT processes across diverse database platforms.
• Architect and optimize data storage solutions to ensure reliability, security, and scalability.
• Leverage generative AI tools and models to enhance data engineering workflows, drive automation, and improve insight generation.
• Collaborate with cross-functional teams (Data Scientists, Analysts, and Engineers) to understand and deliver on data requirements.
• Develop and enforce data quality standards, governance policies, and monitoring systems to ensure data integrity.
• Create and maintain comprehensive documentation for data systems, workflows, and models.
• Implement data modeling best practices and optimize data retrieval processes for better performance.
• Stay up-to-date with emerging technologies and bring innovative solutions to the team.

Qualifications:
• Bachelor's or Master’s degree in Computer Science, Information Technology, or a related field.
• 5–8 years of experience in data engineering, designing and managing large-scale data systems.
• Strong expertise in database technologies. The mandatory skills are as follows:
  - SQL
  - NoSQL (MongoDB, Cassandra, or CosmosDB)
  - One of the following: Snowflake, Redshift, BigQuery, or Microsoft Fabric
  - Azure
• Hands-on experience implementing and working with generative AI tools and models in production workflows.
• Proficiency in Python and SQL, with experience in data processing frameworks (e.g., Pandas, PySpark).
• Experience with ETL tools (e.g., Apache Airflow, MS Fabric, Informatica, Talend) and data pipeline orchestration platforms.
• Strong understanding of data architecture, data modeling, and data governance principles.
• Experience with cloud platforms (preferably Azure) and associated data services.

Skills:
• Advanced knowledge of Database Management Systems and ETL/ELT processes.
• Expertise in data modeling, data quality, and data governance.
• Proficiency in Python programming, version control systems (Git), and data pipeline orchestration tools.
• Familiarity with AI/ML technologies and their application in data engineering.
• Strong problem-solving and analytical skills, with the ability to troubleshoot complex data issues.
• Excellent communication skills, with the ability to explain technical concepts to non-technical stakeholders.
• Ability to work independently, lead projects, and mentor junior team members.
• Commitment to staying current with emerging technologies, trends, and best practices in the data engineering domain.

If your profile matches the requirements and you are interested in this role, please share your updated resume along with details of your present salary, expected salary, and notice period.
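The ETL/ELT pipeline work described in this posting can be illustrated end to end with a tiny standard-library example. The table layout, CSV sample, and transformation rules below are assumptions for demonstration only, not a reference to any tool named above.

```python
# Minimal extract-transform-load sketch: parse CSV, clean and type the
# values, and load them into an in-memory SQLite table.
import csv
import io
import sqlite3

raw_csv = "order_id,amount,currency\n1, 100 ,usd\n2,250,USD\n"

def extract(text):
    """Read CSV text into a list of dict rows."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Trim whitespace, cast types, and standardize currency codes."""
    return [(int(r["order_id"]),
             float(r["amount"].strip()),
             r["currency"].strip().upper())
            for r in rows]

def load(conn, rows):
    """Create the target table if needed and bulk-insert the rows."""
    conn.execute("CREATE TABLE IF NOT EXISTS orders "
                 "(order_id INTEGER PRIMARY KEY, amount REAL, currency TEXT)")
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(conn, transform(extract(raw_csv)))
total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(total)  # 350.0
```

Production pipelines add the orchestration, retries, and monitoring that tools like Airflow provide, but the extract/transform/load separation is the same.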
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
Hyderabad, Telangana
On-site
About Forsys: Forsys Inc. is a leading company specializing in Lead-to-Revenue transformation, utilizing a combination of strategy, technology, and business transformation to foster growth. The company boasts a team of over 500 professionals dispersed across various locations such as the US, India, UK, Colombia, and Brazil, with its headquarters situated in the Bay Area. Forsys is renowned for its commitment to innovation and excellence. As an implementation partner for major vendors like Conga, Salesforce, and Oracle, as well as an incubator for groundbreaking ideas and solutions, Forsys holds a unique position within the consulting industry. The company is dedicated to empowering its clients by uncovering new revenue streams and cultivating a culture of innovation. To learn more about our vision and the impact we are making, visit forsysinc.com. Data Migration Technical Lead: Forsys is currently seeking a full-time Data Migration Technical Lead who is a proficient Salesforce Revenue Cloud Data Migration Specialist. In this role, you will be responsible for overseeing and executing data migration activities as part of Revenue Cloud implementation and transformation projects. As a key member of the Forsys Data Migration team, you will analyze data from multiple source systems, consult with clients on data transformation, and manage end-to-end data and document migration processes. Responsibilities: - Possessing over 8 years of experience as a data migration technical lead, with a proven track record in handling complex migration projects. - Developing and implementing data migration strategies for Salesforce Revenue Cloud, including CPQ (Configure Price Quote), Billing, and related modules. - Collaborating with clients to assess their data requirements, creating data models, and establishing data mappings. - Evaluating source data quality, devising data cleansing strategies, and executing data cleaning processes as needed. 
- Building ETL/ELT pipelines using tools like Informatica, Talend, or native Salesforce tools. - Adhering to best practices for data migration and following established standards and protocols. - Assessing different source systems to determine optimal data transfer methods and managing large volumes of data effectively. - Designing and conducting data validation procedures pre and post-migration, and generating Data Reconciliation reports. - Implementing testing protocols to ensure data accuracy and consistency with client specifications. - Providing technical support throughout the data migration process to ensure efficiency and smooth operation. - Creating comprehensive documentation of the migration process to guide future projects. - Mentoring team members and fostering collaboration to achieve project deliverables effectively. - Demonstrating the ability to perform effectively in high-pressure environments. Eligibility: - Minimum of 8 years of experience in data migration or ETL roles, with at least 2 years focusing on Salesforce Revenue Cloud (CPQ + Billing). - Proficiency in utilizing ETL Tools such as Pentaho, Mulesoft, Informatica, Data Stage, SSIS, etc. - Strong understanding of the Salesforce data model and experience in various phases of Data Migration. - Advanced SQL skills, familiarity with APIs, and integration patterns. - Experience in data/process mapping for Data Migrations involving Salesforce, Oracle, and Legacy systems is preferred. - Extensive experience working with different databases and SQL queries. - Knowledge of Supply Chain/CRM/Quote to Cash/Quote to Order business processes. - Proficiency in handling various data formats (XML, JSON, etc.). - Expertise in SOAP & REST Services, API implementation, and Cloud services. - Strong communication skills, ability to work effectively in teams, both onshore and offshore, and driven to achieve goals. - Self-motivated, goal-oriented individuals with strong analytical and problem-solving skills. 
- Prior experience with source systems such as NetSuite, SAP, Zuora, or Oracle for migration to Salesforce Revenue Cloud is advantageous.
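The pre/post-migration validation and Data Reconciliation reports mentioned above typically compare row counts, control totals, and missing keys between source and target. A minimal sqlite3 sketch; the table names, columns, and sample rows are assumptions for illustration:

```python
# Reconcile a source table against its migrated target: row-count delta,
# amount-sum delta, and the keys present in source but missing in target.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE source_quotes (quote_id TEXT, net_amount REAL);
CREATE TABLE target_quotes (quote_id TEXT, net_amount REAL);
INSERT INTO source_quotes VALUES ('Q1', 500.0), ('Q2', 750.0), ('Q3', 125.0);
INSERT INTO target_quotes VALUES ('Q1', 500.0), ('Q2', 750.0);
""")

def reconcile(conn, source, target, amount_col):
    """Return a small reconciliation report comparing source and target."""
    s_count, s_sum = conn.execute(
        f"SELECT COUNT(*), SUM({amount_col}) FROM {source}").fetchone()
    t_count, t_sum = conn.execute(
        f"SELECT COUNT(*), SUM({amount_col}) FROM {target}").fetchone()
    missing = conn.execute(
        f"SELECT quote_id FROM {source} EXCEPT SELECT quote_id FROM {target}"
    ).fetchall()
    return {"count_delta": s_count - t_count,
            "sum_delta": round(s_sum - t_sum, 2),
            "missing_ids": [r[0] for r in missing]}

report = reconcile(conn, "source_quotes", "target_quotes", "net_amount")
print(report)  # {'count_delta': 1, 'sum_delta': 125.0, 'missing_ids': ['Q3']}
```

A non-zero delta or a non-empty missing-ID list is exactly what a reconciliation report flags for remediation before sign-off.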
Posted 1 week ago
5.0 years
0 Lacs
Indore, Madhya Pradesh, India
On-site
Job Title: Informatica MDM Developer
Experience Required: 5+ years
Location: Mumbai, Hyderabad, Bengaluru, Pune, Chennai

Job Description
We are seeking a skilled Informatica MDM Developer with over 5 years of hands-on experience in implementing Informatica Master Data Management (MDM) solutions. The ideal candidate must possess relevant certifications and deep implementation experience, with the ability to work independently and contribute effectively to enterprise MDM initiatives.

Key Responsibilities
- Lead and participate in end-to-end Informatica MDM implementations.
- Design and develop data models, match & merge rules, hierarchies, and workflows in Informatica MDM.
- Collaborate with stakeholders to understand business requirements and translate them into MDM solutions.
- Perform unit testing, performance tuning, and deployment support.
- Provide post-deployment support and enhancements for existing MDM systems.

Primary Must-Have Skills (Non-Negotiable)
- Minimum 5+ years of hands-on Informatica MDM implementation experience.
- Mandatory certification in Informatica MDM (Developer / Implementation Specialist).
- Strong knowledge of Informatica Hub, IDD, SIF API, and User Exits.
- Proficient in data cleansing, matching, merging, and hierarchy management.
- Solid understanding of data quality and data governance concepts.
(ref:hirist.tech)
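Informatica MDM's match & merge engine is proprietary, but the underlying idea behind the match & merge rules mentioned above can be sketched generically. The record layout and the "latest non-empty value wins" survivorship rule here are assumptions, not Informatica's actual behavior:

```python
# Generic match-and-merge sketch: records already matched on a key are
# merged into one golden record, with the most recently updated
# non-empty value surviving per field.
def merge_records(records):
    """Survivorship: latest non-empty value per field wins."""
    merged = {}
    for rec in sorted(records, key=lambda r: r["updated"]):
        for field, value in rec.items():
            if field != "updated" and value:
                merged[field] = value
    return merged

matches = [
    {"name": "Asha Rao", "phone": "", "city": "Hyderabad", "updated": 1},
    {"name": "Asha Rao", "phone": "98450-00000", "city": "", "updated": 2},
]
print(merge_records(matches))
```

Real MDM hubs let you configure survivorship per field (by source-system trust, recency, or completeness); this sketch hard-codes one simple policy.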
Posted 1 week ago
130.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description
Manager, Data Visualization

The Opportunity
Based in Hyderabad, join a global healthcare biopharma company and be part of a 130-year legacy of success backed by ethical integrity, forward momentum, and an inspiring mission to achieve new milestones in global healthcare. Be part of an organisation driven by digital technology and data-backed approaches that support a diversified portfolio of prescription medicines, vaccines, and animal health products. Drive innovation and execution excellence. Be a part of a team with passion for using data, analytics, and insights to drive decision-making, and which creates custom software, allowing us to tackle some of the world's greatest health threats.

Our Technology centers focus on creating a space where teams can come together to deliver business solutions that save and improve lives. An integral part of the company IT operating model, Tech centers are globally distributed locations where each IT division has employees to enable our digital transformation journey and drive business outcomes. These locations, in addition to the other sites, are essential to supporting our business and strategy. A focused group of leaders in each tech center helps to ensure we can manage and improve each location, from investing in the growth, success, and well-being of our people, to making sure colleagues from each IT division feel a sense of belonging, to managing critical emergencies. And together, we must leverage the strength of our team to collaborate globally to optimize connections and share best practices across the Tech Centers.

Role Overview
A unique opportunity to be part of an Insight & Analytics Data hub for a leading biopharmaceutical company and define a culture that creates a compelling customer experience. Bring your entrepreneurial curiosity and learning spirit into a career of purpose, personal growth, and leadership.
We are seeking those who have a passion for using data, analytics, and insights to drive decision-making that will allow us to tackle some of the world's greatest health threats. As a Manager in Data Visualization, you will be focused on designing and developing compelling data visualization solutions to enable actionable insights and facilitate intuitive information consumption for internal business stakeholders. The ideal candidate will demonstrate competency in building user-centric visuals and dashboards that empower stakeholders with data-driven insights and decision-making capability. Our Quantitative Sciences team uses big data to analyze the safety and efficacy claims of our potential medical breakthroughs. We review the quality and reliability of clinical studies using deep scientific knowledge, statistical analysis, and high-quality data to support decision-making in clinical trials.

What Will You Do In This Role
- Design and develop user-centric data visualization solutions utilizing complex data sources.
- Identify and define key business metrics and KPIs in partnership with business stakeholders.
- Define and develop scalable data models in alignment with, and with support from, data engineering and IT teams.
- Lead UI/UX workshops to develop user stories and wireframes, and develop intuitive visualizations.
- Collaborate with data engineering, data science and IT teams to deliver business-friendly dashboard and reporting solutions.
- Apply best practices in data visualization design and continuously improve the user experience for business stakeholders.
- Provide thought leadership and data visualization best practices to the broader Data & Analytics organization.
- Identify opportunities to apply data visualization technologies to streamline and enhance manual/legacy reporting deliveries.
- Provide training and coaching to internal stakeholders to enable a self-service operating model.
- Co-create information governance and apply data privacy best practices to solutions.
Continuously innovate on visualization best practices and technologies by reviewing external resources and the marketplace.

What Should You Have
- 5 years’ relevant experience in data visualization, infographics, and interactive visual storytelling
- Working experience and knowledge of Power BI / Qlik / Spotfire / Tableau and other data visualization technologies
- Working experience and knowledge of ETL processes and data modeling techniques and platforms (Alteryx, Informatica, Dataiku, etc.)
- Experience working with database technologies (Redshift, Oracle, Snowflake, etc.) and data processing languages (SQL, Python, R, etc.)
- Experience in leveraging and managing third-party vendors and contractors
- Self-motivation, proactivity, and ability to work independently with minimum direction
- Excellent interpersonal and communication skills
- Excellent organizational skills, with the ability to navigate a complex matrix environment and organize/prioritize work efficiently and effectively
- Demonstrated ability to collaborate and lead with diverse groups of work colleagues and positively manage ambiguity
- Experience in the Pharma and/or Biotech industry is a plus

Our technology teams operate as business partners, proposing ideas and innovative solutions that enable new organizational capabilities. We collaborate internationally to deliver services and solutions that help everyone be more productive and enable innovation.

Who We Are
We are known as Merck & Co., Inc., Rahway, New Jersey, USA in the United States and Canada and MSD everywhere else. For more than a century, we have been inventing for life, bringing forward medicines and vaccines for many of the world's most challenging diseases. Today, our company continues to be at the forefront of research to deliver innovative health solutions and advance the prevention and treatment of diseases that threaten people and animals around the world.
What We Look For Imagine getting up in the morning for a job as important as helping to save and improve lives around the world. Here, you have that opportunity. You can put your empathy, creativity, digital mastery, or scientific genius to work in collaboration with a diverse group of colleagues who pursue and bring hope to countless people who are battling some of the most challenging diseases of our time. Our team is constantly evolving, so if you are among the intellectually curious, join us—and start making your impact today. #HYDIT2025 Current Employees apply HERE Current Contingent Workers apply HERE Search Firm Representatives Please Read Carefully Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs / resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails. 
Employee Status: Regular
Relocation:
VISA Sponsorship:
Travel Requirements:
Flexible Work Arrangements: Hybrid
Shift:
Valid Driving License:
Hazardous Material(s):
Required Skills: Business Intelligence (BI), Clinical Decision Support (CDS), Clinical Testing, Communication, Create User Stories, Data Visualization, Digital Transformation, Healthcare Innovation, Information Technology Operations, IT Operation, Management Process, Marketing, Motivation Management, Requirements Management, Self Motivation, Statistical Analysis, Statistics, Thought Leadership, User Experience (UX) Design
Preferred Skills:
Job Posting End Date: 07/31/2025
A job posting is effective until 11:59:59 PM on the day BEFORE the listed job posting end date. Please ensure you apply to a job posting no later than the day BEFORE the job posting end date.
Requisition ID: R359276
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Hyderabad, Telangana
On-site
You will be joining Cliff IT Solutions as a Sr. Data Modeler & Data Analyst located in Hyderabad. Your primary responsibility will be to design and implement data models, ensure data quality, and establish robust data governance frameworks. This full-time on-site role involves creating data architecture, managing Extract Transform Load (ETL) processes, and collaborating with stakeholders to enhance data systems. Your expertise in data management principles will be crucial in improving information systems effectively. To excel in this role, you should possess skills in Data Governance, Data Quality, Data Modeling, and Data Architecture. Experience with Extract Transform Load (ETL) processes is essential, along with proficiency in analytical problem-solving. Strong communication and teamwork abilities are required to engage with stakeholders effectively. A degree in Computer Science, Information Technology, or related fields is preferred. Additional experience in the Identity Management and Security Governance domain, as well as with tools like Informatica, Teradata, Axiom, SQL, and Databricks, will be advantageous.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Pune, Maharashtra
On-site
As a BI Development Lead at our Pune location, you will be the driving force behind the development of Business Intelligence solutions using the Power BI ecosystem. Your deep technical expertise and BI ecosystem knowledge will be pivotal in designing, developing, and delivering BI services through engaging dashboard solutions. With a minimum of 5 years of experience in BI and Analytics, you will bring a proven track record in BI consultancy and reporting roles, utilizing data query and reporting/analysis tools effectively. Your proficiency in building complex SQL queries against Microsoft SQL Server and/or Oracle, including CTEs, subqueries, and pivot queries, will be instrumental in data manipulation and cleansing for reporting and analysis purposes. Experience with SQL Server Integration Services (SSIS) or similar tools like Azure Data Factory, Azure Databricks, or Informatica would be advantageous. Your ability to conduct SQL profiling and analyze query execution plans to optimize performance will be essential in ensuring efficient data retrieval and processing. Your strong analytical skills, problem-solving abilities, and excellent communication skills in English, both written and verbal, will be key assets in this role. Collaboration and teamwork are essential, as you will work closely with others while also demonstrating the ability to work independently. Being action-oriented, self-motivated, and proactive with a continuous learning mindset will enable you to stay abreast of evolving technologies and contribute effectively to the team. A Bachelor's degree in Computer Science is preferred, while a Master's degree would be an added advantage. With 5-9 years of experience in DWBI development projects, including at least 2 years of hands-on experience with BI and Visualization technologies such as Power BI and Tableau, you will bring a wealth of expertise to our team. 
If you are detail-oriented, possess a high level of integrity, and thrive in a fast-paced, dynamic environment, we invite you to join us in this challenging and rewarding role.
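The CTE and pivot-query skills called out in the BI posting above can be shown in a small runnable example. SQLite has no PIVOT keyword (nor does standard SQL), so the pivot is done the portable way, with conditional aggregation; the sales data is illustrative:

```python
# A CTE pre-aggregates per region/quarter, then the outer query pivots
# quarters into columns using conditional aggregation.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales (region TEXT, quarter TEXT, amount REAL);
INSERT INTO sales VALUES
 ('North','Q1',100),('North','Q2',150),
 ('South','Q1', 80),('South','Q2',120);
""")
rows = conn.execute("""
WITH totals AS (                        -- CTE: aggregate per region/quarter
    SELECT region, quarter, SUM(amount) AS amt
    FROM sales GROUP BY region, quarter
)
SELECT region,                          -- pivot quarters into columns
       SUM(CASE WHEN quarter = 'Q1' THEN amt END) AS q1,
       SUM(CASE WHEN quarter = 'Q2' THEN amt END) AS q2
FROM totals GROUP BY region ORDER BY region
""").fetchall()
print(rows)  # [('North', 100.0, 150.0), ('South', 80.0, 120.0)]
```

On SQL Server the same pivot could use the vendor-specific PIVOT operator, but the CASE-based form works across SQL Server, Oracle, and SQLite alike.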
Posted 1 week ago
7.0 - 15.0 years
0 Lacs
Karnataka
On-site
As an expert in AI, Data Governance, and Metadata Management, you will play a key role in architecting, implementing, and maintaining enterprise data governance processes. Your responsibilities will include designing and implementing data quality management, metadata standardization, and stewardship processes across various data domains. You will be tasked with creating and enforcing data governance rules, setting up operational models, and managing stakeholder alignment across IT, legal, compliance, and business teams. Additionally, you will be responsible for supporting compliance with regulations such as GDPR, HIPAA, CCPA, and Indian regulations under the Digital Personal Data Protection Act. You will lead the technical team in delivering roadmap milestones, monitoring Data Governance Key Performance Indicators (KPIs), and guiding the adoption of governance tools and processes. To excel in this role, you should have 7-15 years of experience in Data Governance, Data Quality, or Metadata Management roles. Hands-on experience with key Data Governance platforms such as Collibra, Informatica, and Alation is essential. Strong technical skills in scripting (Python, SQL), metadata and configuration pipelines, automation, data engineering, and ETL/ELT flows are required. Experience in establishing and monitoring data quality metrics, and in conducting data profiling, audits, and root-cause analysis, is crucial. Moreover, familiarity with cloud environments and integration practices (AWS, Snowflake, databases), along with an understanding of the SDLC, Agile methods, Jira/Confluence, DevOps, and data platform operations, is necessary. Exceptional stakeholder management, communication, and leadership abilities are vital for coordinating cross-functional teams. Educating stakeholders through workshops, training, and governance councils, as well as managing governance awareness campaigns, will be part of your responsibilities.
Joining Narwal offers you the opportunity to shape the future of a rapidly growing company. You will enjoy a competitive salary and benefits package, a supportive and inclusive company culture, and a collaborative and innovative work environment. Access to professional development and growth opportunities is provided, and Narwal is certified as a Great Place to Work. Narwal is an equal opportunity employer that celebrates diversity and is dedicated to creating an inclusive environment for all employees. To learn more, please visit: [Narwal Website](https://www.narwalinc.com/).
Posted 1 week ago
3.0 - 8.0 years
0 Lacs
Karnataka
On-site
As an experienced professional with 3-8 years of expertise, you will be responsible for developing and maintaining data solutions in Azure environments using PL/SQL, MS SQL, and MS Access. Your role will involve utilizing Informatica or similar ETL tools for efficient data extraction, transformation, and loading processes. Applying strong programming skills and relational database concepts, you will be instrumental in designing and implementing effective data solutions. Additionally, you will collaborate with teams using Agile methodologies and project management tools such as Jira or Git Stash to ensure successful project delivery. Your responsibilities will also include working with Python or other ETL tools to enhance data processing capabilities and contribute to the design and optimization of data workflows and processes. It is essential to stay updated on technologies like Java and Angular to facilitate seamless integration with existing systems.
Posted 1 week ago
1.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Description
Want to participate in building the next generation of an online payment system that supports multiple countries and payment methods? Amazon Payment Services (APS) is a leading payment service provider in the MENA region with operations spanning 8 countries, offering online payment services to thousands of merchants. The APS team is building a robust payment solution for driving the best payment experience on & off Amazon. Over 100 million customers send tens of billions of dollars moving at light speed through our systems annually. We build systems that process payments at an unprecedented scale with accuracy, speed and mission-critical availability. We innovate to improve customer experience, with support for currency of choice, in-store payments, pay on delivery, credit and debit card payments, seller disbursements and gift cards. Many new exciting & challenging ideas are in the works.

Key job responsibilities
Data Engineers focus on managing data requests, maintaining operational excellence, and enhancing core infrastructure. You will collaborate closely with both technical and non-technical teams to design and execute roadmaps.

Basic Qualifications
- 1+ years of data engineering experience
- Experience with data modeling, warehousing and building ETL pipelines
- Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala)
- Experience with one or more scripting languages (e.g., Python, KornShell)

Preferred Qualifications
- Experience with big data technologies such as Hadoop, Hive, Spark, EMR
- Experience with any ETL tool, such as Informatica, ODI, SSIS, BODI, or DataStage

Our inclusive culture empowers Amazonians to deliver the best results for our customers.
If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Company - ADCI MAA 15 SEZ - K20 Job ID: A3042356
Posted 1 week ago
7.0 - 15.0 years
0 Lacs
karnataka
On-site
The Quality Assurance (QA) and Control Manager will oversee the planning, coordination, and execution of QA activities for a large-scale SAP ERP setup. This role ensures that SAP Center of Expertise (SAP-CoE) deliverables meet internal quality standards, industry best practices, and business requirements. The manager will also be responsible for designing and managing governance frameworks to monitor process improvements and maintain long-term operational excellence in ERP and ERP-enabled processes, aligned to the strategic objectives of the SAP-CoE. Define and implement a comprehensive quality assurance strategy and plan specific to service management, the specification and development of new functionality, project management, and operations. Develop and enforce quality standards, testing protocols, and documentation procedures across SAP modules. Conduct quality gate reviews on SAP-CoE projects. Monitor deliverables from SAP consultants, developers, and business stakeholders to ensure they meet agreed-upon quality criteria. Provide specialist input when reviewing testing procedures and in the development and execution of testing strategies, including Unit Testing, Integration Testing, User Acceptance Testing (UAT), and Regression Testing. Ensure a quality-driven process for defect management. Establish control mechanisms to ensure that implemented ERP processes comply with internal policies and external regulations. Work closely with BU/FU leads and business process owners to align SAP processes with organizational objectives and continuous improvement efforts. Define KPIs and dashboards to monitor process adherence and performance post-implementation. Implement and drive continuous improvements in the SAP-CoE. Maintain the quality document management system. Identify, document, and manage quality-related risks. Conduct root cause analysis for defects or process failures and ensure corrective/preventive actions are implemented. Conduct periodic process audits and implement corrective actions. 
Ensure process compliance through effective documentation and process traceability. Provide regular QA status reports to management/steering committees. Facilitate workshops and meetings with functional teams to ensure quality awareness and continuous engagement. Act as a point of contact for QA/QC-related issues and escalate critical quality risks appropriately. Responsible for ensuring compliance with applicable external and internal regulations, procedures, and guidelines. Living Hitachi Energy's core values of safety and integrity, which means taking responsibility for your own actions while caring for your colleagues and the business. Bachelor's or Master's degree in Information Technology, Engineering, or a related field. 15+ years of experience in large-scale SAP ERP implementation, with at least 7+ years in quality assurance/control on SAP/ERP projects. Strong understanding of SAP modules and implementation methodologies. Certification in Quality Management and SAP Quality Assurance. Knowledge of data tools (Syniti, Informatica, SAP Data Intelligence) and testing tools (Worksoft, Tricentis, Selenium, etc.). Proven experience in enterprise process design, process mapping, and control frameworks. Proficiency in both spoken and written English is required.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
You will be working with KPMG Global Services (KGS) India, a strategic global delivery organization collaborating with over 50 KPMG firms to offer progressive, scalable, and customized business solutions. The KGS India team consists of approximately 21,000 employees across eight locations in India, including Bengaluru, Gurugram, Hyderabad, Mumbai, Kochi, Noida, Pune, and Kolkata, providing a variety of Advisory and Tax-related services to KPMG firms worldwide. As a Salesforce Developer/Architect & OmniStudio professional, your primary responsibilities will include designing and developing tailored solutions within the Salesforce platform using technologies like Apex, Visualforce, and Lightning Components. You will work closely with functional teams to understand their requirements thoroughly and craft innovative solutions that align with their needs. Your role will involve writing efficient and well-designed code following coding conventions, conducting rigorous testing of solutions, collaborating with cross-functional teams, participating in code reviews, and staying updated on new Salesforce technologies in order to recommend suitable solutions. Key Requirements: - Hands-on experience as a Salesforce Developer with proficiency in Apex, Visualforce, and the Lightning Component Framework. - In-depth understanding of Salesforce architecture and design principles such as Apex, Visualforce, Lightning Components, and SOQL. - Proficiency in developing custom solutions on the Salesforce Lightning Platform. - Extensive knowledge of Salesforce products and services including Public Sector Solutions, Discovery Framework in PSS, Salesforce OmniStudio (OmniScripts/Integration Procedures), Conga, DocuSign, Salesforce Experience Cloud, and Force.com. - Familiarity with Git repos and DevOps processes, and experience in real-life Salesforce project implementations involving complex business logic and API integrations. 
- Strong grasp of Salesforce configuration, Apex, and declarative tools like Process Builder, Workflows, and Flows. - Exposure to integration technologies such as MuleSoft, Informatica, or Jitterbit would be advantageous. - Ability to analyze and troubleshoot complex technical issues related to Salesforce. - Excellent verbal and written communication skills to collaborate with non-technical stakeholders. - Mandatory certifications: Salesforce Platform App Builder and Platform Developer. Additional certifications like Advanced Administrator, Integration Architecture, or Solution Architecture are a plus. Education: - Technical Graduate (BE/BTech) You must be willing to go beyond your regular working hours when necessary to ensure project success and meet customer requirements across different time zones. Effective time management and task prioritization are essential for completing assigned duties within the scheduled timelines.
Posted 1 week ago
9.0 - 12.0 years
25 - 35 Lacs
Pune
Hybrid
We have an opening for a "Principal IT Engineer Applications - Data Engineer" role with one of the top US product-based MNCs, Pune. Total exp.: 9-12 years; NP: up to 60 days Shift timing: 2 PM to 11 PM Location: Pune, Hinjewadi (Phase 2) PFB must-have skills: Minimum of 8 years of hands-on experience in software or data engineering roles Deep understanding of data modeling and data architecture design; must be a well-rounded technical expert Strong expertise in SQL scripting and performance tuning within large-scale data environments Proven experience working with high-volume data systems Demonstrated ability to solve complex problems using Informatica Excellent communication skills with the ability to clearly articulate and explain technical scenarios Experience in building data products or solutions from the ground up Proficient in Python, shell scripting, and PL/SQL, with a strong focus on automation Must have strong hands-on experience; not seeking candidates focused on team management or leadership roles Good-to-have skills: Experience supporting Medicare, Medicaid, or other regulatory healthcare data platforms Nice to have certifications in Informatica Cloud, Oracle, Databricks, and cloud platforms (e.g., Azure, AWS)
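The SQL performance tuning and automation emphasis above can be illustrated with a minimal sketch, using Python's bundled sqlite3 module as a stand-in for a large-scale production database; the table, column, and index names are hypothetical:

```python
import sqlite3

# Illustrative only: compare the query plan before and after adding an index.
# SQLite stands in for a production database; names are made up.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (id INTEGER PRIMARY KEY, member_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO claims (member_id, amount) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

def plan(sql):
    """Return the query plan steps (the 'detail' column of EXPLAIN QUERY PLAN)."""
    return [row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql)]

query = "SELECT SUM(amount) FROM claims WHERE member_id = 42"
before = plan(query)  # full table scan: no index on member_id yet
conn.execute("CREATE INDEX idx_claims_member ON claims(member_id)")
after = plan(query)   # the optimizer now searches the index instead
print(before)
print(after)
```

The same before/after comparison scripted over a workload of slow queries is one simple way to automate tuning checks in a CI or support workflow.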
Posted 1 week ago
2.0 - 7.0 years
14 - 18 Lacs
Chennai, Bengaluru
Work from Office
Project Role: Product Support Advisor Experience: 4-8 Years Job Location: Bangalore/Chennai (Hybrid) Skills: SQL, MDM, ETL, L2/L3 Production Support Shifts: Rotational Key responsibilities: • Test, debug, and potentially automate ETL processes using PL/SQL and JSON. • Support production deliverables and investigate production issues. • Communicate with your immediate supervisor and other project team members on the status of work in progress, both verbally and in writing. • Work very closely on the production database/environment. • Update project, system, and other technical documents, as required. • Work in any of the shifts as demanded by the project. Education: Bachelor's/Master's degree in Engineering or equivalent experience Required Technical Skills: • Extensive experience in production L2/L3 support • Excellent debugging, analytical, and problem-solving skills • Very good understanding of production timelines • Preferred: experience in any MDM technologies • Extensive experience with SQL, Java, Python, JSON scripts, and Unix scripts • Hands-on experience in SQL and PL/SQL, with the ability to write and tune complex SQL queries against large volumes of data in a production database • Hands-on experience with any ETL tool such as Informatica, Talend, etc. • Experience with any production-issue ticketing tool • Preferred: Reltio MDM knowledge • Very good communication skills to interact with the client
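The kind of ETL debugging with JSON described above can be sketched with a small, hedged example: a validation pass over incoming JSON records before the ETL job loads them. The field names and sample payloads are hypothetical, not taken from the posting:

```python
import json

# Hypothetical required schema for an incoming record; illustrative only.
REQUIRED_FIELDS = {"record_id", "source_system", "amount"}

def validate_record(raw: str):
    """Return (ok, reason) for a single raw JSON record."""
    try:
        rec = json.loads(raw)
    except json.JSONDecodeError as exc:
        return False, f"invalid JSON: {exc.msg}"
    missing = REQUIRED_FIELDS - rec.keys()
    if missing:
        return False, "missing fields: " + ", ".join(sorted(missing))
    return True, "ok"

good = '{"record_id": 1, "source_system": "MDM", "amount": 10.5}'
bad = '{"record_id": 2}'
print(validate_record(good))  # (True, 'ok')
print(validate_record(bad))   # (False, 'missing fields: amount, source_system')
```

In an L2/L3 context a check like this, run against a failed batch, narrows an incident down to the offending records before anyone touches the production database.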
Posted 1 week ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description As a Data Engineer, you will leverage your technical expertise in data, analytics, cloud technologies, and analytic software tools to identify the best designs, improve business processes, and generate measurable business outcomes. You will work with Data Engineering teams from within D&A, across the Pro Tech portfolio, and additional Ford organizations such as GDI&A (Global Data Insight & Analytics), Enterprise Connectivity, Ford Customer Service Division, Ford Credit, etc. Develop EL/ELT/ETL pipelines to make data available in the BigQuery analytical data store from disparate batch and streaming data sources for the Business Intelligence and Analytics teams. Work with on-prem data sources (Hadoop, SQL Server), understand the data model and the business rules behind the data, and build data pipelines (with GCP, Informatica) for one or more Ford Pro verticals. This data will be landed in GCP BigQuery. Build cloud-native services and APIs to support and expose data-driven solutions. Partner closely with our data scientists to ensure the right data is made available in a timely manner to deliver compelling and insightful solutions. Design, build, and launch shared data services to be leveraged by the internal and external partner developer community. Build out scalable data pipelines and choose the right tools for the right job. Manage, optimize, and monitor data pipelines. Provide extensive technical and strategic advice and guidance to key stakeholders around data transformation efforts. Understand how data is useful to the enterprise. Responsibilities Bachelor's degree 3+ years of experience with SQL and Python 2+ years of experience with GCP or AWS cloud services; strong candidates with 5+ years in a traditional data warehouse environment (ETL pipelines with Informatica) will be considered 3+ years of experience building out data pipelines from scratch in a highly distributed and fault-tolerant manner. 
Comfortable with a broad array of relational and non-relational databases. Proven track record of building applications in a data-focused role (cloud and traditional data warehouse) Qualifications Experience with GCP cloud services including BigQuery, Cloud Composer, Dataflow, Cloud SQL, GCS, Cloud Functions, and Pub/Sub. Inquisitive, proactive, and interested in learning new tools and techniques. Familiarity with big data and machine learning tools and platforms. Comfortable with open-source technologies including Apache Spark, Hadoop, Kafka. 1+ years of experience with Hive, Spark, Scala, JavaScript. Strong oral, written, and interpersonal communication skills Comfortable working in a dynamic environment where problems are not always well-defined. M.S. in a science-based program and/or quantitative discipline with a technical emphasis.
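The extract, transform, load flow this posting describes can be sketched in miniature with plain Python. Everything here is a stand-in under stated assumptions: the source rows mimic on-prem batch extracts, the field names are invented, and a Python list plays the role of the BigQuery analytical store that a real pipeline would target via its client library:

```python
from datetime import date

def extract():
    # Stand-in for reading a batch from on-prem sources (Hadoop, SQL Server).
    # Rows and field names are hypothetical.
    return [
        {"vin": "VIN001", "sold_on": "2024-03-01", "price": "42000"},
        {"vin": "VIN002", "sold_on": "2024-03-02", "price": "not-a-number"},
    ]

def transform(rows):
    # Cast types and drop rows that fail basic data-quality rules.
    out = []
    for row in rows:
        try:
            out.append({
                "vin": row["vin"],
                "sold_on": date.fromisoformat(row["sold_on"]),
                "price": float(row["price"]),
            })
        except ValueError:
            continue  # a real pipeline would quarantine bad records instead
    return out

def load(rows, warehouse):
    # Stand-in for a BigQuery load job or streaming insert.
    warehouse.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
print(loaded)  # 1 of the 2 rows survives the quality rules
```

Keeping extract, transform, and load as separate functions is the property that matters at scale: each stage can be tested, retried, and monitored independently, which is what orchestration tools like Cloud Composer then schedule.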
Posted 1 week ago
3.0 - 5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Calling all innovators – find your future at Fiserv. We’re Fiserv, a global leader in Fintech and payments, and we move money and information in a way that moves the world. We connect financial institutions, corporations, merchants, and consumers to one another millions of times a day – quickly, reliably, and securely. Any time you swipe your credit card, pay through a mobile app, or withdraw money from the bank, we’re involved. If you want to make an impact on a global scale, come make a difference at Fiserv. Job Title Professional, Data Architecture Good hands-on experience with ETL and BI tools like SSIS, SSRS, Power BI, etc. Readiness to play an individual contributor role on the technical front Excellent communication skills Readiness to travel onsite for short terms, as required 3-5 years of ETL development experience, with hands-on experience in a migration or data warehousing project Should have strong database fundamentals and experience in writing unit test cases and test scenarios Expert knowledge of writing SQL commands, queries, and stored procedures Good knowledge of ETL tools like SSIS, Informatica, etc. and data warehousing concepts Should have good knowledge of writing macros Good client-handling skills, with onsite experience preferred Thank you for considering employment with Fiserv. Please apply using your legal name. Complete the step-by-step profile and attach your resume (either is acceptable, both are preferable). Our Commitment To Diversity And Inclusion Fiserv is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, gender, gender identity, sexual orientation, age, disability, protected veteran status, or any other category protected by law. Note To Agencies Fiserv does not accept resume submissions from agencies outside of existing agreements. Please do not send resumes to Fiserv associates. 
Fiserv is not responsible for any fees associated with unsolicited resume submissions. Warning About Fake Job Posts Please be aware of fraudulent job postings that are not affiliated with Fiserv. Fraudulent job postings may be used by cyber criminals to target your personally identifiable information and/or to steal money or financial information. Any communications from a Fiserv representative will come from a legitimate Fiserv email address.
Posted 1 week ago