3.0 - 8.0 years
30 - 45 Lacs
Pune
Work from Office
Requirement: Data Architect & Business Intelligence
Experience: 5-12 Years
Work Type: Full-Time

Job Summary: We are looking for a Data Architect & Business Intelligence Expert who will be responsible for designing and implementing enterprise-level data architecture solutions. The ideal candidate will have extensive experience in data warehousing, data modeling, and BI frameworks, with a strong focus on Salesforce, Informatica, DBT, IICS, and Snowflake.

Key Responsibilities:
- Design and implement scalable and efficient data architecture solutions for enterprise applications.
- Develop and maintain robust data models that support business intelligence and analytics.
- Build data warehouses to support structured and unstructured data storage needs.
- Optimize data pipelines, ETL processes, and real-time data processing.
- Work with business stakeholders to define data strategies that support analytics and reporting.
- Ensure seamless integration of Salesforce, Informatica, DBT, IICS, and Snowflake into the data ecosystem.
- Establish and enforce data governance, security policies, and best practices.
- Conduct performance tuning and optimization for large-scale databases and data processing systems.
- Provide technical leadership and mentorship to development teams.

Key Skills & Requirements:
- Strong experience in data architecture, data warehousing, and data modeling.
- Hands-on expertise with Salesforce, Informatica, DBT, IICS, and Snowflake.
- Deep understanding of ETL pipelines, real-time data streaming, and cloud-based data solutions.
- Experience in designing scalable, high-performance, and secure data environments.
- Ability to work with big data frameworks and BI tools for reporting and visualization.
- Strong analytical, problem-solving, and communication skills.
Posted 1 week ago
3.0 - 8.0 years
30 - 45 Lacs
Bengaluru
Work from Office
Requirement: Data Architect & Business Intelligence
Experience: 5-12 Years
Work Type: Full-Time

Job Summary: We are looking for a Data Architect & Business Intelligence Expert who will be responsible for designing and implementing enterprise-level data architecture solutions. The ideal candidate will have extensive experience in data warehousing, data modeling, and BI frameworks, with a strong focus on Salesforce, Informatica, DBT, IICS, and Snowflake.

Key Responsibilities:
- Design and implement scalable and efficient data architecture solutions for enterprise applications.
- Develop and maintain robust data models that support business intelligence and analytics.
- Build data warehouses to support structured and unstructured data storage needs.
- Optimize data pipelines, ETL processes, and real-time data processing.
- Work with business stakeholders to define data strategies that support analytics and reporting.
- Ensure seamless integration of Salesforce, Informatica, DBT, IICS, and Snowflake into the data ecosystem.
- Establish and enforce data governance, security policies, and best practices.
- Conduct performance tuning and optimization for large-scale databases and data processing systems.
- Provide technical leadership and mentorship to development teams.

Key Skills & Requirements:
- Strong experience in data architecture, data warehousing, and data modeling.
- Hands-on expertise with Salesforce, Informatica, DBT, IICS, and Snowflake.
- Deep understanding of ETL pipelines, real-time data streaming, and cloud-based data solutions.
- Experience in designing scalable, high-performance, and secure data environments.
- Ability to work with big data frameworks and BI tools for reporting and visualization.
- Strong analytical, problem-solving, and communication skills.
Posted 1 week ago
3.0 - 7.0 years
42 - 72 Lacs
Dindigul
Work from Office
Responsibilities:
* Design, develop, test & maintain Informatica CDGC solutions using ETL processes with SQL queries.
* Collaborate with cross-functional teams on project requirements & deliverables.

Benefits: Provident fund, health insurance, office cab/shuttle, annual bonus, food allowance.
Posted 1 week ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title: Master Data Analyst
Preferred Location: Hyderabad, India
Full Time/Part Time: Full Time

Build a career with confidence. Carrier Global Corporation, a global leader in intelligent climate and energy solutions, is committed to creating solutions that matter for people and our planet for generations to come. From the beginning, we've led in inventing new technologies and entirely new industries. Today, we continue to lead because we have a world-class, diverse workforce that puts the customer at the center of everything we do.

Job Summary
We are seeking a detail-oriented and experienced Master Data Analyst to ensure the accuracy, consistency, and integrity of our critical master data across various enterprise systems. The Master Data Analyst will play a crucial role in data governance, data quality initiatives, and supporting business processes through reliable and well-managed master data.

Key Responsibilities
- Develop, implement, and maintain master data management (MDM) policies, standards, and procedures.
- Ensure data quality, completeness, and consistency of master data (e.g., customer, product, vendor, material) across all relevant systems.
- Perform data profiling, cleansing, and validation to identify and resolve data quality issues (a minimal sketch follows this listing).
- Collaborate with business units and IT teams to define data definitions, business rules, and data hierarchies.
- Act as a data steward, overseeing the creation, modification, and deletion of master data records.
- Support data integration efforts, ensuring master data is accurately and efficiently synchronized between systems.
- Document master data processes, data flows, and data lineage.
- Participate in projects related to data migration, system implementations, and data governance initiatives.
- Provide training and support to end users on master data best practices and tools.

Required Qualifications
- Bachelor's degree in Information Systems, Data Science, or a related quantitative field.
- 3+ years of experience in a Master Data Management (MDM), Data Quality, or Data Analyst role, specifically focused on master data.
- Strong understanding of master data concepts, data governance principles, and data lifecycle management.
- Proficiency with data analysis tools and techniques.
- Experience with enterprise resource planning (ERP) systems (e.g., SAP, Oracle, Microsoft Dynamics) and their master data structures.
- Experience with cloud platforms (AWS, Azure) or relevant data technologies.
- Excellent analytical, problem-solving, and communication skills, with the ability to translate technical concepts for non-technical stakeholders.
- Proven ability to work independently and collaboratively in a fast-paced environment.

Preferred Qualifications
- Experience with MDM software solutions (e.g., Informatica MDM, SAP MDG).
- Familiarity with SQL and experience querying relational databases.
- Knowledge of SAP modules (ECC, CRM, BW) and of data governance, metadata management, and data cataloging tools (e.g., Alation, Collibra).
- Familiarity with handling MDM in SAP ECC and SAP S/4 versions.
- Knowledge of data warehousing concepts and business intelligence tools (e.g., Power BI, Tableau).
- Experience with data governance frameworks and tools.
- Certifications in data management or related fields.

Benefits
We are committed to offering competitive benefits programs for all of our employees and enhancing our programs when necessary.
- Have peace of mind and body with our health insurance.
- Make yourself a priority with flexible schedules and leave policy.
- Drive your career forward through professional development opportunities.
- Achieve your personal goals with our Employee Assistance Program.

Our Commitment to You
Our greatest assets are the expertise, creativity and passion of our employees. We strive to provide a great place to work that attracts, develops and retains the best talent, promotes employee engagement, fosters teamwork and ultimately drives innovation for the benefit of our customers. We strive to create an environment where you feel that you belong, with diversity and inclusion as the engine to growth and innovation. We develop and deploy best-in-class programs and practices, providing enriching career opportunities, listening to employee feedback and always challenging ourselves to do better. This is The Carrier Way.

Join us and make a difference. Apply Now!

Carrier is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or veteran status, age, or any other federally protected class.

Click on this link to read the Job Applicant's Privacy Notice.
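The profiling and validation work this role describes is usually scripted. Below is a minimal, hedged sketch in Python/pandas of what a first-pass master-data quality check could look like; the file name and column names (vendor_master.csv, vendor_id, vendor_name, country) are hypothetical placeholders, not Carrier's actual systems.

```python
# A minimal master-data profiling sketch with pandas; all names are hypothetical.
import pandas as pd

vendors = pd.read_csv("vendor_master.csv")  # hypothetical extract of vendor master data

# Completeness: share of missing values per attribute
completeness = vendors.isna().mean().sort_values(ascending=False)
print("Null ratio per column:\n", completeness)

# Consistency: duplicate business keys that would break downstream joins
dupes = vendors[vendors.duplicated(subset=["vendor_id"], keep=False)]
print(f"{len(dupes)} rows share a vendor_id with another record")

# Validity: a simple rule check, e.g. country codes must be two uppercase letters
invalid_country = vendors[~vendors["country"].str.fullmatch(r"[A-Z]{2}", na=False)]
print(f"{len(invalid_country)} rows have a non ISO-2 country code")
```

In practice, checks like these feed the cleansing and stewardship steps the posting lists, with the failing rows routed back to data owners for correction.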
Posted 1 week ago
1.0 - 4.0 years
10 - 11 Lacs
Chennai
Work from Office
As a Data Engineer, you will leverage your technical expertise in data, analytics, cloud technologies, and analytic software tools to identify best designs, improve business processes, and generate measurable business outcomes. You will work with Data Engineering teams from within D&A, across the Pro Tech portfolio, and additional Ford organizations such as GDI&A (Global Data Insight & Analytics), Enterprise Connectivity, Ford Customer Service Division, Ford Credit, etc.

Responsibilities:
- Develop EL/ELT/ETL pipelines to make data available in the BigQuery analytical data store from disparate batch and streaming data sources for the Business Intelligence and Analytics teams (a hedged sketch follows this listing).
- Work with on-prem data sources (Hadoop, SQL Server), understand the data model and the business rules behind the data, and build data pipelines (with GCP, Informatica) for one or more Ford Pro verticals. This data will be landed in GCP BigQuery.
- Build cloud-native services and APIs to support and expose data-driven solutions.
- Partner closely with our data scientists to ensure the right data is made available in a timely manner to deliver compelling and insightful solutions.
- Design, build, and launch shared data services to be leveraged by the internal and external partner developer community.
- Build out scalable data pipelines, choosing the right tools for the job.
- Manage, optimize, and monitor data pipelines.
- Provide extensive technical and strategic advice and guidance to key stakeholders around data transformation efforts, and understand how data is useful to the enterprise.

Qualifications:
- Experience with GCP cloud services including BigQuery, Cloud Composer, Dataflow, CloudSQL, GCS, Cloud Functions, and Pub/Sub.
- Inquisitive, proactive, and interested in learning new tools and techniques.
- Familiarity with big data and machine learning tools and platforms; comfortable with open-source technologies including Apache Spark, Hadoop, and Kafka.
- 1+ years of experience with Hive, Spark, Scala, JavaScript.
- Strong oral, written, and interpersonal communication skills; comfortable working in a dynamic environment where problems are not always well-defined.
- M.S. in a science-based program and/or quantitative discipline with a technical emphasis, or a Bachelor's degree.
- 3+ years of experience with SQL and Python.
- 2+ years of experience with GCP or AWS cloud services; strong candidates with 5+ years in a traditional data warehouse environment (ETL pipelines with Informatica) will be considered.
- 3+ years of experience building out data pipelines from scratch in a highly distributed and fault-tolerant manner.
- Comfortable with a broad array of relational and non-relational databases.
- Proven track record of building applications in a data-focused role (cloud and traditional data warehouse).
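For a sense of the BigQuery landing step this role describes, here is a hedged sketch using the google-cloud-bigquery client. The project, dataset, table, and bucket names are invented placeholders, not Ford's actual resources; credentials are assumed to come from the environment.

```python
# A minimal batch load into BigQuery; all resource names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()  # picks up project/credentials from the environment

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

# Load a Parquet extract from Cloud Storage into an analytical table
load_job = client.load_table_from_uri(
    "gs://example-landing-zone/vehicles/*.parquet",  # hypothetical source path
    "example-project.pro_tech.vehicles",             # hypothetical target table
    job_config=job_config,
)
load_job.result()  # block until the load completes

table = client.get_table("example-project.pro_tech.vehicles")
print(f"Loaded {table.num_rows} rows")
```

In a production pipeline, a step like this would typically be wrapped in a Cloud Composer (Airflow) DAG, which the posting names among the expected GCP services.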
Posted 1 week ago
7.0 - 9.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Role: Data Analyst
Experience: 6-11 Yrs
Location: Hyderabad
Primary Skills: ETL, Informatica, Python, SQL, BI tools, and the Investment Domain
Please share your resume at rajamahender.n@technogenindia.com.

Job Description:

The Minimum Qualifications
- Education: Bachelor's or Master's degree in Data Science, Statistics, Mathematics, Computer Science, Actuarial Science, or a related field.
- Experience: 7-9 years of experience as a Data Analyst, with at least 5 years supporting Finance within the insurance industry.
- Hands-on experience with Vertica/Teradata for querying, performance optimization, and large-scale data analysis.
- Advanced SQL skills; proficiency in Python is a strong plus.
- Proven ability to write detailed source-to-target mapping documents and collaborate with technical teams on data integration.
- Experience working in hybrid onshore-offshore team environments.
- Deep understanding of data modelling concepts and experience working with relational and dimensional models.
- Strong communication skills with the ability to clearly explain technical concepts to non-technical audiences.
- A strong understanding of statistical concepts, probability, accounting standards, financial statements (balance sheet, income statement, cash flow statement), and financial ratios.
- Strong understanding of life insurance products and business processes across the policy lifecycle.
- Investment Principles: knowledge of different asset classes, investment strategies, and financial markets.
- Quantitative Finance: understanding of financial modelling, risk management, and derivatives.
- Regulatory Framework: awareness of relevant financial regulations and compliance requirements.

The Ideal Qualifications
Technical Skills:
- Proven track record of analytical and problem-solving skills.
- A solid understanding of financial accounting systems and knowledge of accounting principles, reporting, and budgeting.
- Strong data analysis skills for extracting insights from financial data.
- Proficiency in data visualization tools and reporting software.
- Experience integrating financial systems with actuarial, policy administration, and claims platforms.
- Familiarity with actuarial processes, reinsurance, or regulatory reporting requirements.
- Experience with General Ledger systems such as SAP and forecasting tools like Anaplan.

Soft Skills:
- Exceptional communication and interpersonal skills.
- Ability to influence and motivate teams without direct authority.
- Excellent time management and organizational skills, with the ability to prioritize multiple initiatives.

What to Expect as Part of MassMutual and the Team
- Regular meetings with the Corporate Technology leadership team
- Focused one-on-one meetings with your manager
- Access to mentorship opportunities
- Access to learning content on Degreed and other informational platforms
- Your ethics and integrity will be valued by a company with a strong and stable ethical business and industry-leading pay and benefits
Posted 1 week ago
0 years
0 Lacs
New Delhi, Delhi, India
On-site
The purpose of this role is to oversee the development of our database marketing solutions, using database technologies such as Microsoft SQL Server/Azure, Amazon Redshift, and Google BigQuery. The role will be involved in design, development, troubleshooting, and issue resolution, as well as upgrading, enhancing, and optimizing the technical solution. It involves continuous integration and continuous deployment of changing requirements in the business-logic implementation, and interaction with internal stakeholders and/or clients to explain technology solutions, with a clear understanding of the client's business requirements through which to guide the optimal design/solution to meet their needs. The ability to communicate with both technical and non-technical audiences is key.

Job Description:

Must-Have Skills:
- Databases (SQL Server, Snowflake, Teradata, Redshift, Vertica, Oracle, BigQuery, Azure DW, etc.)
- ETL (Extract, Transform, Load) tools (Talend, Informatica, SSIS, DataStage, Matillion)
- Python, UNIX shell scripting
- Project and resource management
- Workflow orchestration (Tivoli, Tidal, Stonebranch)
- Client-facing skills

Good-to-Have Skills:
- Experience in cloud computing (one or more of AWS, Azure, GCP); AWS preferred.

Key Responsibilities:
- Understanding and practical knowledge of data warehouses, data marts, data modelling, data structures, databases, and data ingestion and transformation.
- Strong understanding of ETL processes (a hedged sketch of a minimal ETL step follows this listing), as well as database skills and common IT offerings, e.g. storage, backups, and operating systems.
- Strong understanding of SQL and database programming languages.
- Strong knowledge of development methodologies and tools.
- Contribute to design and oversee code reviews for compliance with development standards.
- Design and implement the technical vision for existing clients.
- Convert documented requirements into technical solutions and implement them within the given timeline and with quality.
- Quickly identify solutions for production failures and fix them.
- Document project architecture, explain the detailed design to the team, and create low-level to high-level designs.
- Perform mid- to complex-level tasks independently.
- Support clients, Data Scientists, and Analytical Consultants working on marketing solutions.
- Work with cross-functional internal teams and external clients.
- Strong project management and organization skills; ability to lead/work on 1-2 projects with a team size of 2-3 members.
- Code management, including code review and deployments.

Location: DGS India - Pune - Baner M-Agile
Brand: Merkle
Time Type: Full time
Contract Type: Permanent
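To make the ETL requirement above concrete, here is a minimal, hedged sketch of one batch ETL step in Python with pandas and SQLAlchemy. The connection strings, table and column names, and the 180-day lapse rule are hypothetical illustrations, not Merkle's actual pipeline.

```python
# A minimal extract-transform-load step; all connection details and names are hypothetical.
import pandas as pd
from sqlalchemy import create_engine

source = create_engine("postgresql://user:pass@source-host/crm")       # hypothetical source
warehouse = create_engine("postgresql://user:pass@dw-host/marketing")  # hypothetical target

# Extract
customers = pd.read_sql("SELECT customer_id, email, last_order_date FROM customers", source)

# Transform: normalize emails and flag lapsed customers for campaign targeting
customers["email"] = customers["email"].str.strip().str.lower()
customers["lapsed"] = (
    pd.Timestamp.now() - pd.to_datetime(customers["last_order_date"])
).dt.days > 180

# Load: replace the reporting table so reruns stay idempotent
customers.to_sql("customer_campaign_base", warehouse, if_exists="replace", index=False)
```

A scheduler such as the Tivoli, Tidal, or Stonebranch tools named above would typically run a step like this on a fixed cadence.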
Posted 1 week ago
3.0 - 5.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
You desire impactful work. You're RGA ready.

RGA is a purpose-driven organization working to solve today's challenges through innovation and collaboration. A Fortune 500 company, listed among Fortune's World's Most Admired Companies, we're the only global reinsurance company to focus primarily on life- and health-related solutions. Join our multinational team of intelligent, motivated, and collaborative people, and help us make financial protection accessible to all.

General Summary
Under limited supervision, participate in and support various GFS initiatives related to administration and data integration. This includes working with RGA Valuation, Finance, Pricing, and Risk Management to ensure consistent, quality data extracts are produced. Lead and support the development and implementation of processes for analyzing, mapping, and testing client data to be used in various downstream processes. Lead and support the analysis of client-reported inventories for new deals and review changes to existing deals.

Responsibilities
- Serve as a technical resource guiding the team in the extracting, loading, and mapping of client data files.
- Serve as a subject matter expert when dealing with the most complex issues related to data conversion.
- Write and execute data queries to get the results needed for analysis, validity, and accuracy testing.
- Interpret data, analyze results, and provide ongoing reporting and analysis of key metrics.
- Champion the future vision of the department and assist in the creation/maintenance of data repository documentation and data standards and guidelines.
- Solve business problems of moderate complexity; analyze possible solutions using technical experience, judgment, and precedents.
- Explain data research and findings in a clear and straightforward manner to assist leadership in prioritizing business and information needs.
- Analyze, test, and debug system solutions; consult with other Operations and Risk Management associates in the development of solutions for specific business needs.
- Perform other duties/projects as assigned.

Required Education and Experience
- Bachelor's degree in Information Technology, Computer Science, Data Science, Actuarial Science, or a related degree, or equivalent experience.
- 3-5 years of experience in a data quality assurance and/or annuity/pension administration system testing role.

Preferred
- Progress toward FLMI, ALHC, or another relevant professional accreditation.

Required Skills and Abilities
- Intermediate Word, Excel, VBA, and SQL/query skills.
- Advanced investigative, analytical, and problem-solving skills.
- Detail-oriented; passionate about completing the task correctly rather than quickly.
- Advanced oral and written communication skills, demonstrating the ability to share and impart knowledge.
- Advanced ability to liaise with individuals across a wide variety of operational, functional, and technical disciplines.
- Familiarity with insurance administrative systems, the ability to calculate benefits under multiple structures, and a basic understanding of how the data affects liabilities and financial results.
- Ability to work effectively within a team environment and individually.
- Advanced ability to translate business needs and problems into viable, accepted solutions.
- Advanced skills in customer relationship management and change management.
- Ability to interpret and understand various client data formats.
- Broad business knowledge, including knowledge of valuation, finance, and/or administrative systems.
- Advanced ability to set goals and handle multiple tasks, clients, and projects simultaneously; ability to appropriately balance priorities, deadlines, and deliverables.
- Willingness to learn new skills and software applications.
- Ability to customize a testing process that can be repeated by others if needed.

Preferred
- Advanced-to-expert knowledge of database applications such as Access, SQL Server, or Oracle, as well as SQL.
- Complex analytical and problem-solving skills.
- Experience with data management and/or visualization tools (e.g., Tableau, Alteryx, Informatica, Python).
- Demonstrated management experience.
- Insurance industry knowledge.

This is a contractual role for 1 year.

What You Can Expect From RGA
- Gain valuable knowledge from and experience with diverse, caring colleagues around the world.
- Enjoy a respectful, welcoming environment that fosters individuality and encourages pioneering thought.
- Join the bright and creative minds of RGA, and experience vast, endless career potential.
Posted 1 week ago
7.0 years
8 - 9 Lacs
Thiruvananthapuram
On-site
7 - 9 Years
4 Openings
Trivandrum

Role description

Role Proficiency: This role requires proficiency in developing data pipelines, including coding and testing for ingesting, wrangling, transforming, and joining data from various sources. The ideal candidate should be adept in ETL tools like Informatica, Glue, Databricks, and DataProc, with strong coding skills in Python, PySpark, and SQL. This position demands independence and proficiency across various data domains. Expertise in data warehousing solutions such as Snowflake, BigQuery, Lakehouse, and Delta Lake is essential, including the ability to calculate processing costs and address performance issues. A solid understanding of DevOps and infrastructure needs is also required.

Outcomes:
- Act creatively to develop pipelines/applications by selecting appropriate technical options, optimizing application development, maintenance, and performance through design patterns and reusing proven solutions.
- Support the Project Manager in day-to-day project execution and account for the developmental activities of others.
- Interpret requirements, create optimal architecture, and design solutions in accordance with specifications.
- Document and communicate milestones/stages for end-to-end delivery.
- Code using best standards; debug and test solutions to ensure best-in-class quality.
- Tune the performance of code and align it with the appropriate infrastructure, understanding the cost implications of licenses and infrastructure.
- Create data schemas and models effectively.
- Develop and manage data storage solutions, including relational databases, NoSQL databases, Delta Lakes, and data lakes.
- Validate results with user representatives, integrating the overall solution.
- Influence and enhance customer satisfaction and employee engagement within project teams.

Measures of Outcomes:
- Adherence to engineering processes and standards
- Adherence to schedule/timelines
- Adherence to SLAs where applicable
- Number of defects post delivery
- Number of non-compliance issues
- Reduction of recurrence of known defects
- Quick turnaround of production bugs
- Completion of applicable technical/domain certifications
- Completion of all mandatory training requirements
- Efficiency improvements in data pipelines (e.g., reduced resource consumption, faster run times)
- Average time to detect, respond to, and resolve pipeline failures or data issues
- Number of data security incidents or compliance breaches

Outputs Expected:
- Code: Develop data processing code with guidance, ensuring performance and scalability requirements are met. Define coding standards, templates, and checklists. Review code for the team and peers.
- Documentation: Create/review templates, checklists, guidelines, and standards for design/process/development. Create/review deliverable documents, including design documents, architecture documents, infra costing, business requirements, source-target mappings, and test cases and results.
- Configure: Define and govern the configuration management plan. Ensure compliance from the team.
- Test: Review/create unit test cases, scenarios, and execution. Review test plans and strategies created by the testing team. Provide clarifications to the testing team.
- Domain Relevance: Advise data engineers on the design and development of features and components, leveraging a deeper understanding of business needs. Learn more about the customer domain and identify opportunities to add value. Complete relevant domain certifications.
- Manage Project: Support the Project Manager with project inputs. Provide inputs on project plans or sprints as needed. Manage the delivery of modules.
- Manage Defects: Perform defect root cause analysis (RCA) and mitigation. Identify defect trends and implement proactive measures to improve quality.
- Estimate: Create and provide input for effort and size estimation, and plan resources for projects.
- Manage Knowledge: Consume and contribute to project-related documents, SharePoint libraries, and client universities. Review reusable documents created by the team.
- Release: Execute and monitor the release process.
- Design: Contribute to the creation of design (HLD, LLD, SAD)/architecture for applications, business components, and data models.
- Interface with Customer: Clarify requirements and provide guidance to the Development Team. Present design options to customers. Conduct product demos. Collaborate closely with customer architects to finalize designs.
- Manage Team: Set FAST goals and provide feedback. Understand team members' aspirations and provide guidance and opportunities. Ensure team members are upskilled. Engage the team in projects. Proactively identify attrition risks and collaborate with BSE on retention measures.
- Certifications: Obtain relevant domain and technology certifications.

Skill Examples:
- Proficiency in SQL, Python, or other programming languages used for data manipulation.
- Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF.
- Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g., AWS Glue, BigQuery).
- Conduct tests on data pipelines and evaluate results against data quality and performance specifications.
- Experience in performance tuning.
- Experience in data warehouse design and cost improvements.
- Apply and optimize data models for efficient storage, retrieval, and processing of large datasets.
- Communicate and explain design/development aspects to customers.
- Estimate time and resource requirements for developing/debugging features/components.
- Participate in RFP responses and solutioning.
- Mentor team members and guide them in relevant upskilling and certification.

Knowledge Examples:
- Knowledge of various ETL services used by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/Dataflow, and Azure ADF and ADLF.
- Proficiency in SQL for analytics and windowing functions.
- Understanding of data schemas and models.
- Familiarity with domain-related data.
- Knowledge of data warehouse optimization techniques.
- Understanding of data security concepts.
- Awareness of patterns, frameworks, and automation practices.

Additional Comments:
We are seeking a highly experienced Senior Data Engineer to design, develop, and optimize scalable data pipelines in a cloud-based environment. The ideal candidate will have deep expertise in PySpark, SQL, and Azure Databricks, and experience with either AWS or GCP. A strong foundation in data warehousing, ELT/ETL processes, and dimensional modeling (Kimball/star schema) is essential for this role (a minimal sketch of the star-schema loading pattern follows this listing).

Must-Have Skills:
- 8+ years of hands-on experience in data engineering or big data development.
- Strong proficiency in PySpark and SQL for data transformation and pipeline development.
- Experience working in Azure Databricks or equivalent Spark-based cloud platforms.
- Practical knowledge of cloud data environments: Azure, AWS, or GCP.
- Solid understanding of data warehousing concepts, including the Kimball methodology and star/snowflake schema design.
- Proven experience designing and maintaining ETL/ELT pipelines in production.
- Familiarity with version control (e.g., Git), CI/CD practices, and data pipeline orchestration tools (e.g., Airflow, Azure Data Factory).

Skills: Azure Data Factory, Azure Databricks, PySpark, SQL

About UST
UST is a global digital transformation solutions provider. For more than 20 years, UST has worked side by side with the world's best companies to make a real impact through transformation. Powered by technology, inspired by people, and led by purpose, UST partners with their clients from design to operation. With deep domain expertise and a future-proof philosophy, UST embeds innovation and agility into their clients' organizations. With over 30,000 employees in 30 countries, UST builds for boundless impact, touching billions of lives in the process.
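As referenced in the Additional Comments above, here is a minimal, hedged PySpark sketch of the Kimball-style star-schema load the posting calls for. The paths, column names, and surrogate-key convention (customer_sk) are assumptions for illustration, not a specific client pipeline.

```python
# A minimal star-schema fact load in PySpark; all paths and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_star_schema").getOrCreate()

orders = spark.read.parquet("/landing/orders")            # hypothetical staged extract
dim_customer = spark.read.parquet("/warehouse/dim_customer")

# Resolve the surrogate key from the customer dimension, then keep only
# measures and foreign keys in the fact table (Kimball style)
fact_orders = (
    orders.join(dim_customer, on="customer_id", how="left")
    .select(
        "customer_sk",                         # surrogate key from the dimension
        F.to_date("order_ts").alias("order_date"),
        "order_id",
        "quantity",
        "net_amount",
    )
)

fact_orders.write.mode("overwrite").partitionBy("order_date").parquet("/warehouse/fact_orders")
```

On Databricks, the same pattern would usually target Delta Lake tables rather than raw Parquet, which also enables the cost and performance tuning this role emphasizes.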
Posted 1 week ago
15.0 years
0 Lacs
Hyderabad
On-site
Project Role: Technology Account Lead
Project Role Description: Function as the primary contact for technology work at each account. Integrate technology contracts and engagements at the client. Leverage all technology offerings to expand the scope of technology work at the account (up-sell/cross-sell). Create the technology account plan and get the right people involved to maximize the opportunity and build the account.
Must-have skills: SAP PP Production Planning & Control Discrete Industries, Informatica-BI Tools
Good-to-have skills: NA
Minimum 12 years of experience is required
Educational Qualification: 15 years of full-time education

Summary: As a Technology Account Lead, you will serve as the primary contact for technology initiatives at each assigned account. Your typical day will involve integrating technology contracts and engagements, collaborating with various stakeholders, and leveraging technology offerings to enhance the scope of work. You will be responsible for creating a comprehensive technology account plan, ensuring that the right individuals are engaged to maximize opportunities and foster account growth. Your role will require strategic thinking and effective communication to align technology solutions with client needs, ultimately driving success for both the client and the organization.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate regular meetings to ensure alignment and progress across teams.
- Mentor junior professionals to enhance their skills and knowledge in technology account management.

Professional & Technical Skills:
- Must-have: Proficiency in SAP PP Production Planning & Control Process Industries.
- Strong understanding of technology integration and account management strategies.
- Experience in developing and executing technology account plans.
- Ability to analyze client needs and propose tailored technology solutions.
- Excellent communication and interpersonal skills to engage with diverse teams and stakeholders.

Additional Information:
- The candidate should have a minimum of 12 years of experience in SAP PP Production Planning & Control Process Industries.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.
Posted 1 week ago
5.0 years
4 - 7 Lacs
Hyderabad
On-site
Job Description

Manager, Data Visualization

The Opportunity
Based in Hyderabad, join a global healthcare biopharma company and be part of a 130-year legacy of success backed by ethical integrity, forward momentum, and an inspiring mission to achieve new milestones in global healthcare. Be part of an organization driven by digital technology and data-backed approaches that support a diversified portfolio of prescription medicines, vaccines, and animal health products. Drive innovation and execution excellence. Be part of a team with a passion for using data, analytics, and insights to drive decision-making, and which creates custom software, allowing us to tackle some of the world's greatest health threats.

Our Technology centers focus on creating a space where teams can come together to deliver business solutions that save and improve lives. An integral part of the company IT operating model, Tech centers are globally distributed locations where each IT division has employees to enable our digital transformation journey and drive business outcomes. These locations, in addition to the other sites, are essential to supporting our business and strategy. A focused group of leaders in each tech center helps to ensure we can manage and improve each location, from investing in the growth, success, and well-being of our people, to making sure colleagues from each IT division feel a sense of belonging, to managing critical emergencies. And together, we must leverage the strength of our team to collaborate globally to optimize connections and share best practices across the Tech Centers.

Role Overview
A unique opportunity to be part of an Insight & Analytics Data hub for a leading biopharmaceutical company and define a culture that creates a compelling customer experience. Bring your entrepreneurial curiosity and learning spirit into a career of purpose, personal growth, and leadership. We are seeking those who have a passion for using data, analytics, and insights to drive decision-making that will allow us to tackle some of the world's greatest health threats.

As a Manager in Data Visualization, you will be focused on designing and developing compelling data visualization solutions that enable actionable insights and facilitate intuitive information consumption for internal business stakeholders. The ideal candidate will demonstrate competency in building user-centric visuals and dashboards that empower stakeholders with data-driven insights and decision-making capability. Our Quantitative Sciences team uses big data to analyze the safety and efficacy claims of our potential medical breakthroughs. We review the quality and reliability of clinical studies using deep scientific knowledge, statistical analysis, and high-quality data to support decision-making in clinical trials.

What will you do in this role:
- Design and develop user-centric data visualization solutions utilizing complex data sources.
- Identify and define key business metrics and KPIs in partnership with business stakeholders.
- Define and develop scalable data models in alignment with, and with support from, data engineering and IT teams.
- Lead UI/UX workshops to develop user stories, wireframes, and intuitive visualizations.
- Collaborate with data engineering, data science, and IT teams to deliver business-friendly dashboard and reporting solutions.
- Apply best practices in data visualization design and continuously improve the intuitive user experience for business stakeholders.
- Provide thought leadership and data visualization best practices to the broader Data & Analytics organization.
- Identify opportunities to apply data visualization technologies to streamline and enhance manual/legacy reporting deliveries.
- Provide training and coaching to internal stakeholders to enable a self-service operating model.
- Co-create information governance and apply data privacy best practices to solutions.
- Continuously innovate on visualization best practices and technologies by reviewing external resources and the marketplace.

What should you have:
- 5 years' relevant experience in data visualization, infographics, and interactive visual storytelling.
- Working experience and knowledge of Power BI, QLIK, Spotfire, Tableau, and other data visualization technologies.
- Working experience and knowledge of ETL processes, data modeling techniques, and platforms (Alteryx, Informatica, Dataiku, etc.).
- Experience working with database technologies (Redshift, Oracle, Snowflake, etc.) and data processing languages (SQL, Python, R, etc.).
- Experience in leveraging and managing third-party vendors and contractors.
- Self-motivation, proactivity, and the ability to work independently with minimum direction.
- Excellent interpersonal and communication skills.
- Excellent organizational skills, with the ability to navigate a complex matrix environment and organize/prioritize work efficiently and effectively.
- Demonstrated ability to collaborate and lead with diverse groups of work colleagues and positively manage ambiguity.
- Experience in the Pharma and/or Biotech industry is a plus.

Our technology teams operate as business partners, proposing ideas and innovative solutions that enable new organizational capabilities. We collaborate internationally to deliver services and solutions that help everyone be more productive and enable innovation.

Who we are:
We are known as Merck & Co., Inc., Rahway, New Jersey, USA in the United States and Canada, and MSD everywhere else. For more than a century, we have been inventing for life, bringing forward medicines and vaccines for many of the world's most challenging diseases. Today, our company continues to be at the forefront of research to deliver innovative health solutions and advance the prevention and treatment of diseases that threaten people and animals around the world.

What we look for:
Imagine getting up in the morning for a job as important as helping to save and improve lives around the world. Here, you have that opportunity. You can put your empathy, creativity, digital mastery, or scientific genius to work in collaboration with a diverse group of colleagues who pursue and bring hope to countless people who are battling some of the most challenging diseases of our time. Our team is constantly evolving, so if you are among the intellectually curious, join us, and start making your impact today. #HYDIT2025

Search Firm Representatives, Please Read Carefully: Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs/resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails.

Employee Status: Regular
Relocation:
VISA Sponsorship:
Travel Requirements:
Flexible Work Arrangements: Hybrid
Shift:
Valid Driving License:
Hazardous Material(s):
Required Skills: Business Intelligence (BI), Clinical Decision Support (CDS), Clinical Testing, Communication, Create User Stories, Data Visualization, Digital Transformation, Healthcare Innovation, Information Technology Operations, IT Operation, Management Process, Marketing, Motivation Management, Requirements Management, Self Motivation, Statistical Analysis, Statistics, Thought Leadership, User Experience (UX) Design
Preferred Skills:
Job Posting End Date: 07/31/2025
A job posting is effective until 11:59:59 PM on the day BEFORE the listed job posting end date. Please ensure you apply to a job posting no later than the day BEFORE the job posting end date.
Requisition ID: R359276
Posted 1 week ago
15.0 years
0 Lacs
Hyderabad
On-site
Project Role: Integration Engineer
Project Role Description: Provide consultative Business and System Integration services to help clients implement effective solutions. Understand and translate customer needs into business and technology solutions. Drive discussions and consult on transformation, the customer journey, and functional/application designs, and ensure technology and business solutions represent business requirements.
Must-have skills: Infrastructure as Code (IaC)
Good-to-have skills: Google Cloud Storage, Microsoft Azure Databricks, Ansible on Microsoft Azure
Minimum 5 years of experience is required
Educational Qualification: 15 years of full-time education

Summary: As an Integration Engineer, you will provide consultative Business and System Integration services to assist clients in implementing effective solutions. Your typical day will involve engaging with clients to understand their needs, facilitating discussions on transformation, and ensuring that the technology and business solutions align with their requirements. You will work collaboratively with various teams to translate customer needs into actionable plans, driving the customer journey and application designs to achieve optimal outcomes.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate workshops and meetings to gather requirements and feedback from stakeholders.
- Develop and maintain documentation related to integration processes and solutions.
- Infrastructure as Code (IaC): knowledge of tools like Terraform, Helm, and Ansible (including Ansible Tower), along with dependency and package management.
- Broad knowledge of operating systems.
- Network management knowledge and an understanding of network protocols, configuration, and troubleshooting; proficiency in configuring and managing network settings within cloud platforms.
- Security: knowledge of cybersecurity principles and practices, implementing security frameworks that ensure secure workloads and data protection.
- Expert proficiency in the Linux CLI.
- Monitoring of the environment from a technical perspective.
- Monitoring the costs of the development environment.

Professional & Technical Skills:
- Must-have: Proficiency in Infrastructure as Code (IaC).
- Good-to-have: Experience with Hitachi Data Systems (HDS), Google Cloud Storage, Microsoft Azure Databricks.
- Strong understanding of cloud infrastructure and deployment strategies.
- Experience with automation tools and frameworks for infrastructure management.
- Familiarity with version control systems and CI/CD pipelines.
- Solid understanding of data modelling, data warehousing, and data platform design.
- Working knowledge of databases and SQL.
- Proficiency with version control such as Git, GitHub, or GitLab.
- Experience supporting BAT teams and BAT test environments.
- Experience with workflow and batch scheduling; Control-M and Informatica experience is an added advantage.
- Good know-how of financial markets; know-how of clearing, trading, and risk business processes is an added advantage.
- Know-how of Java, Spark, and BI reporting is an added advantage.
- Know-how of cloud platforms and an affinity for modern technology is an added advantage.
- Experience with CI/CD pipelines and exposure to DevOps methodologies is an added advantage.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Infrastructure as Code (IaC).
- This position is based in Hyderabad.
- 15 years of full-time education is required.
Posted 1 week ago
7.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Description And Requirements

Position Summary
This engineering role supports external data transmission, operations, scheduling, and middleware transmission. It requires experience in Windows and Linux environments and knowledge of the Informatica MFT & Data Exchange (DX) tools. The engineer should be able to handle day-to-day customer transmissions and Informatica MFT/DX activities.

Job Responsibilities
- Design and implement complex integration solutions through collaboration with engineers, application teams, and operations teams across the global enterprise.
- Provide technical support to application developers when required, including promoting the use of best practices, ensuring standardization across applications, and troubleshooting.
- Create new setups and support existing transmissions (a hedged sketch of a scripted transmission follows this listing).
- Diagnose and troubleshoot transmission and connection issues.
- Experience in Windows administration; expertise in IBM Workload Scheduler is good to have.
- Hands-on experience with tools like IIS, the Informatica MFT & DX console, Splunk, and IBM Workload Scheduler.
- Plan, engineer, and implement new transmissions as well as migrations of existing setups.
- Participate in the evaluation and recommendation of new products and technologies.
- Represent the domain in relevant automation and value innovation efforts.
- Technical leadership: the ability to think strategically and effectively communicate solutions to a variety of stakeholders.
- Debug production issues by analyzing logs directly and using tools like Splunk.
- Learn new technologies based on demand and help team members by coaching and assisting.
- Willingness to work in rotational shifts.
- Good communication skills, with the ability to communicate clearly and effectively.

Knowledge, Skills and Abilities

Education
- Bachelor's degree in Computer Science, Information Systems, or a related field.

Experience
- 7+ years of total experience, with at least 4+ years in designing and implementing complex integration solutions through collaboration with engineers and application and operations teams.
- Creating new setups and supporting existing transmissions.
- Experience with tools and platforms such as:
  - IIS, Informatica MFT & DX console, Splunk, and IBM Workload Scheduler
  - SSH/SSL/Tectia
  - Microsoft IIS
  - IBM Connect:Direct
  - IBM Sterling
  - Informatica MFT
- Operating system knowledge (Linux/Windows/AIX)
- Troubleshooting
- Azure DevOps pipeline knowledge
- Mainframe z/OS knowledge
- OpenShift and Kubernetes
- Enterprise scheduling knowledge (Maestro)

Good to Have
- Python and/or PowerShell
- Agile SAFe for Teams
- Ansible (automation)
- Elastic

Other Requirements (licenses, certifications, specialized training - if required)

Working Relationships
- Internal contacts (and purpose of relationship): MetLife internal partners
- External contacts (and purpose of relationship), if applicable: MetLife external partners

About MetLife
Recognized on Fortune magazine's list of the 2025 "World's Most Admired Companies" and Fortune World's 25 Best Workplaces for 2024, MetLife, through its subsidiaries and affiliates, is one of the world's leading financial services companies, providing insurance, annuities, employee benefits, and asset management to individual and institutional customers. With operations in more than 40 markets, we hold leading positions in the United States, Latin America, Asia, Europe, and the Middle East. Our purpose is simple: to help our colleagues, customers, communities, and the world at large create a more confident future.

United by purpose and guided by empathy, we're inspired to transform the next century in financial services. At MetLife, it's #AllTogetherPossible. Join us!
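Informatica MFT setups are configured through its console rather than in code, so as a stand-in, here is a hedged sketch of a scripted SFTP transmission check of the kind this role supports, using the paramiko library. The host, credentials, and file paths are hypothetical placeholders.

```python
# A minimal scripted file transmission over SFTP; all connection details are hypothetical.
import paramiko

transport = paramiko.Transport(("partner-gateway.example.com", 22))  # hypothetical host
transport.connect(username="mft_user", password="***")

sftp = paramiko.SFTPClient.from_transport(transport)
try:
    # Push today's extract, then read back the remote file size as a basic check
    sftp.put("/outbound/claims_20240101.csv", "/inbound/claims_20240101.csv")
    remote_size = sftp.stat("/inbound/claims_20240101.csv").st_size
    print(f"Transmitted {remote_size} bytes")
finally:
    sftp.close()
    transport.close()
```

In the environment the posting describes, a transfer like this would be defined and monitored in the Informatica MFT/DX console, scheduled by IBM Workload Scheduler, and its logs investigated in Splunk when it fails.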
Posted 1 week ago
10.0 years
0 Lacs
Delhi
On-site
Where Data Does More. Join the Snowflake team.

At the forefront of the data revolution, Snowflake is building the world's greatest data and applications platform. Our 'get it done' culture fosters innovation, impact, and collaboration. We are rapidly expanding our partner Go-To-Market initiatives with System Integrators, Cloud Service Providers, and Data Cloud Partners, who are crucial in helping customers leverage the Snowflake AI Data Cloud. We seek a self-driven individual with excellent English verbal and written communication skills to grow these partnerships, engaging both local and global teams.

One of the unique benefits of Snowflake's architecture is the ability to securely share data, applications, and solutions with other Snowflake accounts without creating a copy of the data (a hedged sketch of this flow appears after this listing). The Snowflake Data Cloud builds on our secure data sharing functionality to be the 'App Store' for data, enabling providers and consumers to publish, discover, and monetize data, applications, and solutions. Providers to the Snowflake Marketplace use Data Sharing as the means to deliver their data or service, replacing traditional delivery methods such as files and APIs. Data Sharing and the Marketplace play a key strategic role in our Data Cloud vision and drive the network effect of the Data Cloud!

Success in this position requires the candidate to be a technical advisor by aligning with key programs and educating/upskilling partners on these key product features. The candidate will present to both technical and executive audiences, whether whiteboarding or using presentations and demos, to build mindshare for the Snowflake Data Cloud among SI Partners in India. We are looking for a technical team member who understands the data and applications partner ecosystem as well as how to grow and manage content partnerships. In addition to technically onboarding and enabling partners, you will be an important guide in the creation of the go-to-market for new partners. This position will be based in Mumbai, and occasional travel to partner sites or industry events within India may be required.

As a Partner Solution Engineer, you will:
- Technically onboard and enable partners to re-platform their Data and AI applications onto the Snowflake AI Data Cloud.
- Collaborate with partners to develop Snowflake solutions in customer engagements. You will work with our partners to create assets and demos, build hands-on POCs, and pitch Snowflake solutions.
- Help Solution Providers/Practice Leads with the technical strategies that enable them to sell their offerings on Snowflake.
- Keep partners up to date on key Snowflake product updates and future roadmaps to help them represent the latest technology solutions and benefits to their clients.
- Run technical enablement programs to provide best practices and solution design workshops that help partners create effective solutions.
- Drive strategic engagements by quickly grasping new concepts and articulating their business value.
- Showcase the impact of Snowflake through compelling customer success stories and case studies.
- Demonstrate a strong understanding of how partners make revenue, the industry priorities and complexities they face, and where Snowflake products can have the most impact for their product and service offerings.
- Hold conversations with other technologists and provide presentations at the C-level.

Preferred skill sets and experiences:
- A total of 10+ years of relevant experience.
- Experience working with Tech Partners, ISVs, and System Integrators (SIs) in India.
- Developing data-domain thought leadership within the partner community.
- Providing technical product and deep architectural expertise, including the latest product capabilities, with our Partner Solution Architect community based in India.
- Presales or hands-on experience with Data Warehouse, Data Lake, or Lakehouse platforms.
- Presales or hands-on experience designing and building highly scalable data pipelines using Spark and Kafka to ingest data from various systems.
- Experience with our partner integration ecosystem, such as Alation, FiveTran, Informatica, dbtCloud, etc., is a plus.
- Hands-on experience and strong knowledge of Docker and how to containerize Python-based applications.
- Knowledge of container networking and Kubernetes.
- Working knowledge of, and integration with, APIs.
- Proficiency in Agile development practices and Continuous Integration/Continuous Deployment (CI/CD), including DataOps and MLOps.
- Presales or hands-on experience using big data or cloud integration technologies such as Azure Data Factory, AWS Glue, AWS Lambda, etc.
- Experience in the AI/ML domain is a plus.

Snowflake is growing fast, and we're scaling our team to help enable and accelerate our growth. We are looking for people who share our values, challenge ordinary thinking, and push the pace of innovation while building a future for themselves and Snowflake. How do you want to make your impact?

For jobs located in the United States, please visit the job posting on the Snowflake Careers Site for salary and benefits information: careers.snowflake.com
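The secure data sharing flow described above is driven by Snowflake SQL; below is a hedged sketch issuing those statements through the snowflake-connector-python driver. The account, database, schema, table, and share names are hypothetical placeholders.

```python
# A minimal provider-side data sharing setup; all names and credentials are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="myorg-provider",   # hypothetical provider account
    user="provider_admin",
    password="***",
    role="ACCOUNTADMIN",
)
cur = conn.cursor()

# Create a share and expose one table to it; no data is copied
cur.execute("CREATE SHARE IF NOT EXISTS sales_share")
cur.execute("GRANT USAGE ON DATABASE sales_db TO SHARE sales_share")
cur.execute("GRANT USAGE ON SCHEMA sales_db.public TO SHARE sales_share")
cur.execute("GRANT SELECT ON TABLE sales_db.public.daily_sales TO SHARE sales_share")

# Make the share visible to a consumer account
cur.execute("ALTER SHARE sales_share ADD ACCOUNTS = myorg.consumer_account")

cur.close()
conn.close()
```

Because the consumer queries the provider's tables in place, this is the mechanism the posting contrasts with delivering files or APIs, and it is the same foundation the Marketplace builds on.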
Posted 1 week ago
5.0 - 7.0 years
0 Lacs
Noida
On-site
5 - 7 Years 2 Openings Noida Role description Role Proficiency: This role requires proficiency in data pipeline development including coding and testing data pipelines for ingesting wrangling transforming and joining data from various sources. Must be skilled in ETL tools such as Informatica Glue Databricks and DataProc with coding expertise in Python PySpark and SQL. Works independently and has a deep understanding of data warehousing solutions including Snowflake BigQuery Lakehouse and Delta Lake. Capable of calculating costs and understanding performance issues related to data solutions. Outcomes: Act creatively to develop pipelines and applications by selecting appropriate technical options optimizing application development maintenance and performance using design patterns and reusing proven solutions.rnInterpret requirements to create optimal architecture and design developing solutions in accordance with specifications. Document and communicate milestones/stages for end-to-end delivery. Code adhering to best coding standards debug and test solutions to deliver best-in-class quality. Perform performance tuning of code and align it with the appropriate infrastructure to optimize efficiency. Validate results with user representatives integrating the overall solution seamlessly. Develop and manage data storage solutions including relational databases NoSQL databases and data lakes. Stay updated on the latest trends and best practices in data engineering cloud technologies and big data tools. Influence and improve customer satisfaction through effective data solutions. Measures of Outcomes: Adherence to engineering processes and standards Adherence to schedule / timelines Adhere to SLAs where applicable # of defects post delivery # of non-compliance issues Reduction of reoccurrence of known defects Quickly turnaround production bugs Completion of applicable technical/domain certifications Completion of all mandatory training requirements Efficiency improvements in data pipelines (e.g. reduced resource consumption faster run times). Average time to detect respond to and resolve pipeline failures or data issues. Number of data security incidents or compliance breaches. Outputs Expected: Code Development: Develop data processing code independently ensuring it meets performance and scalability requirements. Define coding standards templates and checklists. Review code for team members and peers. Documentation: Create and review templates checklists guidelines and standards for design processes and development. Create and review deliverable documents including design documents architecture documents infrastructure costing business requirements source-target mappings test cases and results. Configuration: Define and govern the configuration management plan. Ensure compliance within the team. Testing: Review and create unit test cases scenarios and execution plans. Review the test plan and test strategy developed by the testing team. Provide clarifications and support to the testing team as needed. Domain Relevance: Advise data engineers on the design and development of features and components demonstrating a deeper understanding of business needs. Learn about customer domains to identify opportunities for value addition. Complete relevant domain certifications to enhance expertise. Project Management: Manage the delivery of modules effectively. Defect Management: Perform root cause analysis (RCA) and mitigation of defects. Identify defect trends and take proactive measures to improve quality. 
Estimation: Create and provide input for effort and size estimation for projects. Knowledge Management: Consume and contribute to project-related documents, SharePoint libraries, and client universities. Review reusable documents created by the team. Release Management: Execute and monitor the release process to ensure smooth transitions. Design Contribution: Contribute to the creation of high-level design (HLD), low-level design (LLD), and system architecture for applications, business components, and data models. Customer Interface: Clarify requirements and provide guidance to the development team. Present design options to customers and conduct product demonstrations. Team Management: Set FAST goals and provide constructive feedback. Understand team members' aspirations and provide guidance and opportunities for growth. Ensure team engagement in projects and initiatives. Certifications: Obtain relevant domain and technology certifications to stay competitive and informed. Skill Examples: Proficiency in SQL, Python, or other programming languages used for data manipulation. Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF. Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g. AWS Glue, BigQuery). Conduct tests on data pipelines and evaluate results against data quality and performance specifications. Experience in performance tuning of data processes. Expertise in designing and optimizing data warehouses for cost efficiency. Ability to apply and optimize data models for efficient storage, retrieval, and processing of large datasets. Capacity to clearly explain and communicate design and development aspects to customers. Ability to estimate time and resource requirements for developing and debugging features or components. Knowledge Examples: Knowledge of various ETL services offered by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/DataFlow, Azure ADF, and ADLF. Proficiency in SQL for analytics, including windowing functions. Understanding of data schemas and models relevant to various business contexts. Familiarity with domain-related data and its implications. Expertise in data warehousing optimization techniques. Knowledge of data security concepts and best practices. Familiarity with design patterns and frameworks in data engineering. Additional Comments: Skills: Cloud Platforms (AWS, MS Azure, GCP, etc.) Containerization and Orchestration (Docker, Kubernetes, etc.) API development Data pipeline construction using languages like Python, PySpark, and SQL (a brief sketch follows this posting) Data Streaming (Kafka, Azure Event Hub, etc.) Data Parsing (Akka, MinIO, etc.) Database Management (SQL and NoSQL, including ClickHouse, PostgreSQL, etc.) Agile Methodology (Git, Jenkins, or Azure DevOps, etc.) JS-based connectors/frameworks for frontend/backend Collaboration and Communication Skills AWS Cloud, Azure Cloud, Docker, Kubernetes About UST UST is a global digital transformation solutions provider. For more than 20 years, UST has worked side by side with the world's best companies to make a real impact through transformation. Powered by technology, inspired by people and led by purpose, UST partners with its clients from design to operation. With deep domain expertise and a future-proof philosophy, UST embeds innovation and agility into its clients' organizations.
With over 30,000 employees in 30 countries, UST builds for boundless impact—touching billions of lives in the process.
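For illustration only, a minimal sketch of the ingest-wrangle-join-load pipeline work this role describes, written in PySpark; all paths, table names, and columns are hypothetical placeholders, not details from the posting:

```python
# Sketch: ingest -> wrangle -> join -> load with PySpark.
# Paths, schemas, and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_pipeline_sketch").getOrCreate()

# Ingest: read raw CSVs from a landing zone.
orders = spark.read.csv("/landing/orders.csv", header=True, inferSchema=True)
customers = spark.read.csv("/landing/customers.csv", header=True, inferSchema=True)

# Wrangle: de-duplicate, normalize types, filter invalid rows.
orders_clean = (
    orders.dropDuplicates(["order_id"])
          .withColumn("order_ts", F.to_timestamp("order_ts"))
          .withColumn("order_date", F.to_date("order_ts"))
          .filter(F.col("amount") > 0)
)

# Join: enrich orders with customer attributes.
enriched = orders_clean.join(customers, on="customer_id", how="left")

# Load: write partitioned Parquet for downstream analytics.
enriched.write.mode("overwrite").partitionBy("order_date").parquet("/curated/orders")
```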
Posted 1 week ago
3.0 - 7.0 years
5 - 9 Lacs
Hyderabad, Pune
Work from Office
Designing dashboards with the use of visualization tools like Tableau Communicating with customers to analyze historical data and identify KPIs Improving data processing speed by building SQL automations Tuning SQL queries for best performance (a brief sketch follows this posting) Analyzing the data to identify trends and share insights Recognizing areas for automation Restricting data for particular users with the help of user filters Producing support documentation and keeping existing documentation up to date Carrying out root cause analysis Good knowledge of Tableau Server administration Good knowledge of a Tableau 3-node cluster environment Knowledge of other reporting tools like OBIEE, Power BI, etc. Good knowledge of SQL to build reports in Tableau Knowledge of NOETIX Query Builder and NOETIX administration activities Good knowledge of PowerShell scripts for automation of Tableau reports
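As a hedged illustration of the SQL tuning mentioned above (not from the posting; the schema and data are hypothetical), a common rewrite that replaces a per-row correlated subquery with a single-pass window function:

```python
# Sketch: rewriting a correlated subquery as a window function. Uses sqlite3
# (window functions need SQLite >= 3.25) for portability; the same pattern
# applies to most warehouses feeding Tableau. Schema and data are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, order_id INTEGER, amount REAL);
    INSERT INTO sales VALUES
        ('North', 1, 120.0), ('North', 2, 80.0),
        ('South', 3, 200.0), ('South', 4, 50.0);
""")

# Slow pattern: the subquery re-scans the table for every row.
slow = """
    SELECT region, order_id, amount,
           (SELECT SUM(amount) FROM sales s2 WHERE s2.region = s1.region) AS region_total
    FROM sales s1 ORDER BY order_id;
"""

# Faster pattern: one pass computes the same per-region total.
fast = """
    SELECT region, order_id, amount,
           SUM(amount) OVER (PARTITION BY region) AS region_total
    FROM sales ORDER BY order_id;
"""

assert conn.execute(slow).fetchall() == conn.execute(fast).fetchall()
print(conn.execute(fast).fetchall())
```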
Posted 1 week ago
5.0 - 8.0 years
9 - 14 Lacs
Bengaluru
Work from Office
Role Purpose The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolve technical escalations and develop technical capability within the Production Specialists. Do Oversee and support process by reviewing daily transactions on performance parameters Review performance dashboard and the scores for the team Support the team in improving performance parameters by providing technical support and process guidance Record, track, and document all queries received, problem-solving steps taken and total successful and unsuccessful resolutions Ensure standard processes and procedures are followed to resolve all client queries Resolve client queries as per the SLAs defined in the contract Develop understanding of process/ product for the team members to facilitate better client interaction and troubleshooting Document and analyze call logs to spot most occurring trends to prevent future problems Identify red flags and escalate serious client issues to Team leader in cases of untimely resolution Ensure all product information and disclosures are given to clients before and after the call/email requests Avoid legal challenges by monitoring compliance with service agreements Handle technical escalations through effective diagnosis and troubleshooting of client queries Manage and resolve technical roadblocks/ escalations as per SLA and quality requirements If unable to resolve the issues, escalate them to TA & SES in a timely manner Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions Troubleshoot all client queries in a user-friendly, courteous and professional manner Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business Organize ideas and effectively communicate oral messages appropriate to listeners and situations Follow up and make scheduled call backs to customers to record feedback and ensure compliance to contract SLAs Build people capability to ensure operational excellence and maintain superior customer service levels of the existing account/client Mentor and guide Production Specialists on improving technical knowledge Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists Develop and conduct trainings (triages) within products for Production Specialists as per target Inform client about the triages being conducted Undertake product trainings to stay current with product features, changes and updates Enroll in product-specific and any other trainings per client requirements/recommendations Identify and document most common problems and recommend appropriate resolutions to the team Update job knowledge by participating in self-learning opportunities and maintaining personal networks Mandatory Skills: Informatica iPaaS. Experience: 5-8 Years.
Posted 1 week ago
10.0 - 15.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Title: Director – Data Presales Architect Location: Greater Noida Experience Required: 10-15 years Role Overview: We are seeking a highly skilled and experienced professional to lead and support our data warehousing and data center architecture initiatives. The ideal candidate will have deep expertise in Data Warehousing, Data Lakes, Data Integration, and Data Governance, with hands-on experience in ETL tools and cloud platforms such as AWS, Azure, GCP, and Snowflake. This role demands strong presales experience, technical leadership, and the ability to manage complex enterprise deals across multiple geographies. Key Responsibilities: Architect and design scalable Data Warehousing and Data Lake solutions Lead presales engagements, including RFP/RFI/RFQ lifecycle management Create and present compelling proposals and solution designs to clients Collaborate with cross-functional teams to deliver end-to-end solutions Estimate efforts and resources for customer requirements Drive Managed Services opportunities and enterprise deal closures Engage with clients across MEA, APAC, US, and UK regions Ensure alignment of solutions with business goals and technical requirements Maintain high standards of documentation and presentation for client-facing materials Must-Have: Bachelor’s or Master’s degree in Computer Science, Information Technology, or related field Certifications in AWS, Azure, GCP, or Snowflake are a plus Experience working in consulting or system integrator environments Strong knowledge of Data Warehousing, Data Lakes, Data Integration, and Data Governance Hands-on experience with ETL tools (e.g., Informatica, Talend, etc.) Exposure to cloud environments: AWS, Azure, GCP, Snowflake Minimum 2 years of presales experience with understanding of presales operating processes Experience in enterprise-level deals and Managed Services Proven ability to handle multi-geo engagements Excellent presentation and communication skills Strong understanding of effort estimation techniques for customer requirements
Posted 1 week ago
4.0 years
0 Lacs
Andhra Pradesh, India
On-site
A career within Salesforce Consulting services will provide you with the opportunity to help our clients leverage Salesforce technology to enhance their customer experiences, enable sustainable change, and drive results. We focus on understanding our client’s challenges and developing custom solutions powered by Salesforce to transform their sales, service and marketing capabilities by exploring data and identifying trends, managing customer life cycles, strategically building and leveraging online communities, driving employee engagement and collaboration, and connecting directly with channel partners to share goals, objectives, and activities in a secure, branded location. To really stand out and make us fit for the future in a constantly changing world, each and every one of us at PwC needs to be a purpose-led and values-driven leader at every level. To help us achieve this we have the PwC Professional; our global leadership development framework. It gives us a single set of expectations across our lines, geographies and career paths, and provides transparency on the skills we need as individuals to be successful and progress in our careers, now and in the future. As a Senior Associate, you'll work as part of a team of problem solvers, helping to solve complex business issues from strategy to execution. PwC Professional skills and responsibilities for this management level include but are not limited to: Use feedback and reflection to develop self-awareness, personal strengths and address development areas. Delegate to others to provide stretch opportunities, coaching them to deliver results. Demonstrate critical thinking and the ability to bring order to unstructured problems. Use a broad range of tools and techniques to extract insights from current industry or sector trends. Review your work and that of others for quality, accuracy and relevance. Know how and when to use tools available for a given situation and can explain the reasons for this choice. Seek and embrace opportunities which give exposure to different situations, environments and perspectives. Use straightforward communication, in a structured way, when influencing and connecting with others. Able to read situations and modify behavior to build quality relationships. Uphold the firm's code of ethics and business conduct. The Opportunity When you join PwC Acceleration Centers (ACs), you step into a pivotal role focused on actively supporting various Acceleration Center services, from Advisory to Assurance, Tax and Business Services. In our innovative hubs, you’ll engage in challenging projects and provide distinctive services to support client engagements through enhanced quality and innovation. You’ll also participate in dynamic and digitally enabled training that is designed to grow your technical and professional skills. As part of the Business Application Consulting team, you translate customer requirements into functional configurations of Salesforce.com. As a Senior Associate, you analyze complex problems, mentor others, and maintain rigorous standards. You focus on building client relationships and developing a deeper understanding of the business context, while navigating increasingly complex situations and growing your personal brand and technical knowledge.
Responsibilities Translate customer requirements into functional Salesforce configurations Analyze and address complex issues within client projects Mentor and support junior team members Foster and strengthen client relationships Gain a thorough understanding of business context Manage and navigate intricate scenarios Enhance personal brand and technical skills Uphold exceptional standards and quality in deliverables What You Must Have Bachelor's Degree 4 years of experience 2-3 years of experience in Salesforce CPQ & Billing projects Experience with configuration & implementation of Salesforce CPQ Cloud 1-3 successful end-to-end CPQ and Billing implementation cycles Thorough understanding of the Quote-to-Cash process Hands-on experience on the Force.com platform using Apex and Flows Experience in working with LWC - Lightning Web Components Experience in working with the Advanced Approvals process Experience with SOAP/REST/Platform Events/Streaming APIs and third-party integrations (a brief API sketch follows this posting) What Sets You Apart Bachelor of Technology preferred Proficient experience in Salesforce configuration, security and mapping features to the business requirements Experience in implementing integration solutions between CRM, ERP and Financial systems (example - Zuora, NetSuite) Advanced RDBMS knowledge and building SQL queries Proficient written and verbal communication skills Proficiency in handling large data volumes Producing and delivering technical solutions and integrated solutions involving different Salesforce clouds (including but not limited to Sales, Service, Revenue, Platform) and a variety of middleware products (Mulesoft, Informatica, etc) establishing quality and schedule
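As a hedged illustration of the REST API integration work listed above, a minimal sketch that queries CPQ quotes over the Salesforce REST API with plain requests; the instance URL, access token, and API version are placeholders, and SBQQ__Quote__c / SBQQ__Status__c assume the standard Salesforce CPQ managed-package namespace:

```python
# Sketch: reading Salesforce CPQ quotes via the REST API. The instance URL,
# token, and API version are hypothetical; obtain a real access token via
# OAuth before running anything like this.
import requests

INSTANCE_URL = "https://example.my.salesforce.com"  # hypothetical org
ACCESS_TOKEN = "<oauth-access-token>"               # hypothetical token
API_VERSION = "v57.0"

soql = (
    "SELECT Id, Name, SBQQ__Status__c "
    "FROM SBQQ__Quote__c WHERE SBQQ__Status__c = 'Draft' LIMIT 10"
)

resp = requests.get(
    f"{INSTANCE_URL}/services/data/{API_VERSION}/query",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    params={"q": soql},
)
resp.raise_for_status()

for record in resp.json()["records"]:
    print(record["Id"], record["Name"], record["SBQQ__Status__c"])
```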
Posted 1 week ago
2.0 years
0 Lacs
Navi Mumbai, Maharashtra, India
On-site
About QualityKiosk Technologies QualityKiosk Technologies is a leading independent Quality Engineering (QE) and digital transformation provider, helping businesses deliver high-performing, user-friendly applications. Founded in 2000, the company offers services in QA automation, performance assurance, intelligent automation (IA), robotic process automation (RPA), customer experience management, SRE, cloud, data analytics, and more. With a global presence in 25+ countries and a team of 4,000+ professionals, QualityKiosk supports major players across banking, e-commerce, telecom, automotive, insurance, OTT, and pharmaceuticals. Recognized by Forrester, Gartner, and others, it serves 50 of India’s Fortune 100 and 18 of the global Fortune 500 companies. Focused on innovation and rapid execution, the company aims for 5X growth in revenue and workforce over the next five years. Job Description Location: Mahape Experience: 2 years Early Joiners preferred! Strategic Responsibilities Support the development of the Data Strategy. Assist the DQ Lead in implementing data governance frameworks, policies, and standards. Help build the Data Governance Office as a center of excellence promoting a data-driven culture. Monitor industry and regulatory trends to guide data initiatives. Promote data quality and position data as a strategic asset. Core Responsibilities Support reporting on data quality performance and standards. Develop and maintain data quality rules and profiling tools using Informatica DQ. Contribute to data catalog artifacts (definitions, lineage, standards). Maintain organizational reference data. Provide expertise on systems and processes to improve data quality. Additional Accountabilities Create data quality dashboards and monitoring tools. Support RCA and business teams with data quality analysis. Communication Build strong relationships across business units and support functions. Engage with vendors, data partners, consultants, and regulators. Key KPIs Delivery of DQ roadmap Reduction in high-priority data issues Improved stakeholder satisfaction Measurable improvements in data quality Decision-Making Scope Advise on data governance implementation and strategic priorities. Recommend budget allocations for data initiatives. Qualifications & Experience Bachelor’s degree, 2+ years in data roles Experience in data quality, governance, and financial services. Skilled in Informatica DQ, SQL, and dashboard tools. Strong analytical, communication, and stakeholder management skills
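As a hedged illustration of the data quality rule development this role involves, a generic sketch in pandas rather than Informatica DQ; the columns, rules, and thresholds are hypothetical:

```python
# Sketch: declarative data-quality rules evaluated over a DataFrame, a
# generic stand-in for rules typically authored in Informatica DQ.
# Columns and thresholds are hypothetical.
import pandas as pd

df = pd.DataFrame({
    "customer_id": [1, 2, 2, 4],
    "email": ["a@x.com", None, "b@x.com", "not-an-email"],
    "balance": [100.0, -5.0, 250.0, 80.0],
})

# Each rule maps a name to a boolean Series marking rows that PASS.
rules = {
    "customer_id_unique": ~df["customer_id"].duplicated(keep=False),
    "email_present": df["email"].notna(),
    "email_has_at_sign": df["email"].str.contains("@", na=False),
    "balance_non_negative": df["balance"] >= 0,
}

# Report pass rates and failing-row counts, as a DQ dashboard feed might.
for name, passed in rules.items():
    print(f"{name}: {passed.mean():.0%} pass, {int((~passed).sum())} failing row(s)")
```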
Posted 1 week ago
0.0 - 5.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Category: Software Development/ Engineering Main location: India, Karnataka, Bangalore Position ID: J0625-0922 Employment Type: Full Time Position Description: Founded in 1976, CGI is among the largest independent IT and business consulting services firms in the world. With 94,000 consultants and professionals across the globe, CGI delivers an end-to-end portfolio of capabilities, from strategic IT and business consulting to systems integration, managed IT and business process services and intellectual property solutions. CGI works with clients through a local relationship model complemented by a global delivery network that helps clients digitally transform their organizations and accelerate results. CGI Fiscal 2024 reported revenue is CA$14.68 billion and CGI shares are listed on the TSX (GIB.A) and the NYSE (GIB). Learn more at cgi.com. Job Title: OFSAA Developer Position: Lead Analyst Experience: 5 - 10 Years Category: Software Development/ Engineering Shift: General Main location: Bangalore/Chennai/Hyderabad/Pune Position ID: J0625-0922 Employment Type: Full Time Education Qualification: Bachelor's degree in Computer Science or related field or higher with minimum 5 years of relevant experience. Position Description: Works independently under limited supervision and applies knowledge of subject matter in Applications Development. Possesses sufficient knowledge and skills to effectively deal with issues and challenges within the field of specialization to develop simple application solutions. Second-level professional with direct impact on results and outcome. Your future duties and responsibilities: Designing, implementing, and maintaining OFSAA solutions for financial institutions Work closely with business and IT teams, developing and documenting SQL queries, executing data sanity checks, and managing data movement and server administration. Data Movement and Server Management Understand complex SQL queries, analyze data-related issues, and identify the root cause of issues/defects. Develop and implement performance-optimal systems. Troubleshoot technical issues by analyzing OFSAA log files and fixing the issues. Required qualifications to be successful in this role: Must-Have Skills: Design, develop, and configure components within the OFSAA suite, including Data Foundation, DIH (Data Integration Hub), FSDF, and Metadata Manager. Implement business rules, data models, and mapping logic using OFSAA tools like Rule Framework, ERWIN, or T2T (Table to Table). Design and develop ETL workflows to extract, transform, and load financial and risk data from source systems into OFSAA staging and atomic layers (a brief reconciliation sketch follows this posting). Work with ODI, Informatica, or PL/SQL scripts to ingest and transform large datasets. Build and maintain validation rules to ensure data accuracy, completeness, and consistency across ingestion and reporting layers. Optimize OFSAA components and SQL queries for performance and scalability. Collaborate with Business Analysts, Data Architects, and Reporting Teams to gather requirements and translate them into OFSAA configuration and code. Good-to-Have Skills: Knowledge of finance and accounting principles. CGI is an equal opportunity employer. In addition, CGI is committed to providing accommodation for people with disabilities in accordance with provincial legislation.
Please let us know if you require reasonable accommodation due to a disability during any aspect of the recruitment process and we will work with you to address your needs. #LI-SN1 Skills: Data flow analysis Data Modeling Data Warehousing DataStage ETL Oracle SQL Developer Python What you can expect from us: Together, as owners, let’s turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect and belonging. Here, you’ll reach your full potential because… You are invited to be an owner from day 1 as we work together to bring our Dream to life. That’s why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company’s strategy and direction. Your work creates value. You’ll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise. You’ll shape your career by joining a company built to grow and last. You’ll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons. Come join our team—one of the largest IT and business consulting services firms in the world.
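As a hedged illustration of the staging-to-atomic validation referenced in the posting above, a minimal reconciliation sketch; sqlite3 stands in for Oracle purely so the snippet is self-contained, and all table and column names are hypothetical:

```python
# Sketch: row-count and aggregate reconciliation between a staging table and
# an atomic-layer table after a T2T-style load. sqlite3 is used only for
# portability; tables, columns, and checks are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_accounts (account_id INTEGER, balance REAL);
    CREATE TABLE fct_accounts (account_id INTEGER, balance REAL);
    INSERT INTO stg_accounts VALUES (1, 100.0), (2, 250.0), (3, 75.5);
    INSERT INTO fct_accounts VALUES (1, 100.0), (2, 250.0), (3, 75.5);
""")

checks = {
    "row_count": "SELECT COUNT(*) FROM {t}",
    "balance_sum": "SELECT ROUND(SUM(balance), 2) FROM {t}",
}

for name, sql in checks.items():
    src = conn.execute(sql.format(t="stg_accounts")).fetchone()[0]
    tgt = conn.execute(sql.format(t="fct_accounts")).fetchone()[0]
    status = "OK" if src == tgt else "MISMATCH"
    print(f"{name}: staging={src} atomic={tgt} -> {status}")
```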
Posted 1 week ago
0.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
Designation: Senior Analyst Level: L2 Experience: 4 to 7 years Location: Chennai Job Description: We are seeking a highly skilled and motivated Senior Data Quality Analyst (DQA) who is responsible for ensuring the accuracy, completeness, and reliability of an organization’s data, enabling informed decision-making. The ideal candidate works with various business stakeholders to understand business requirements and define data quality standards, developing and enforcing data validation procedures to ensure compliance with the company’s data standards. Responsibilities: Data Quality Monitoring & Validation (40% of Time): Profile Data: Identify anomalies (missing values, duplicates, outliers) Run Data Quality Checks: Validate against business rules. Automate Checks: Schedule scripts (SQL/Python) to flag issues in real time (a brief profiling sketch follows this posting). Issue Resolution & Root Cause Analysis (30% of Time): Triage Errors: Work with IT/data engineers to fix corrupt data Track Defects: Log issues in Jira/Snowflake and prioritize fixes. Root Cause Analysis: Determine if issues stem from ETL bugs, user input, or system failures. Governance & Documentation (20% of Time): Ensuring compliance with data governance frameworks Metadata Management: Document data lineage. Compliance Audits: Ensure adherence to GDPR, HIPAA, or internal policies. Implementing data quality standards and policies Stakeholder Collaboration (10% of Time): Train Teams: Educate data citizens, data owners, and data stewards on data quality best practices. Monitoring and reporting on data quality metrics, including reports to leadership. Skills: Technical Skills Knowledge of data quality tools and data profiling techniques (e.g., Talend, Informatica, Ataccama, DQOps, open-source tools) Familiarity with database management systems and data governance initiatives Proficiency in SQL and data management principles Experience with data integration and ETL tools Understanding of data visualization tools and techniques Knowledge of data governance and metadata management Familiarity with Python/R for automation and scripting Analytical Skills Strong analytical and problem-solving skills Ability to identify data patterns and trends Understanding of statistical analysis and data quality metrics Experience with data cleansing and data validation techniques, including data remediation Ability to assess data quality and identify areas needing improvement Experience with conducting data audits and implementing data quality processes Ability to document data quality rules and procedures Job Snapshot Updated Date 25-07-2025 Job ID J_3911 Location Chennai, Tamil Nadu, India Experience 4 - 7 Years Employee Type Permanent
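As a hedged illustration of the profiling step described above, a generic pandas sketch covering the three anomaly classes the posting names (missing values, duplicates, outliers); the data and thresholds are hypothetical:

```python
# Sketch: profiling a dataset for missing values, duplicates, and outliers.
# The sample data and the 1.5*IQR outlier rule are illustrative choices.
import pandas as pd

df = pd.DataFrame({
    "order_id": [1, 2, 2, 4, 5],
    "amount": [20.0, 22.5, 22.5, None, 900.0],
})

# Missing values per column.
print("missing values:\n", df.isna().sum())

# Fully duplicated rows.
print("duplicate rows:", int(df.duplicated().sum()))

# Outliers via the 1.5*IQR rule on a numeric column.
q1, q3 = df["amount"].quantile([0.25, 0.75])
iqr = q3 - q1
outliers = df[(df["amount"] < q1 - 1.5 * iqr) | (df["amount"] > q3 + 1.5 * iqr)]
print("outliers:\n", outliers)
```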
Posted 1 week ago
3.0 years
0 Lacs
Hyderabad, Telangana
On-site
Hyderabad, Telangana Job ID 30187591 Job Category Digital Technology Job Title – Master Data Analyst Preferred Location - Hyderabad, India Full time/Part Time - Full Time Build a career with confidence Carrier Global Corporation, a global leader in intelligent climate and energy solutions, is committed to creating solutions that matter for people and our planet for generations to come. From the beginning, we've led in inventing new technologies and entirely new industries. Today, we continue to lead because we have a world-class, diverse workforce that puts the customer at the center of everything we do. Job Summary We are seeking a detail-oriented and experienced Master Data Analyst to ensure the accuracy, consistency, and integrity of our critical master data across various enterprise systems. The Master Data Analyst will play a crucial role in data governance, data quality initiatives, and supporting business processes through reliable and well-managed master data. Key Responsibilities Develop, implement, and maintain master data management (MDM) policies, standards, and procedures. Ensure data quality, completeness, and consistency of master data (e.g., customer, product, vendor, material) across all relevant systems. Perform data profiling, cleansing, and validation to identify and resolve data quality issues. Collaborate with business units and IT teams to define data definitions, business rules, and data hierarchies. Act as a data steward, overseeing the creation, modification, and deletion of master data records (a brief de-duplication sketch follows this posting). Support data integration efforts, ensuring master data is accurately and efficiently synchronized between systems. Document master data processes, data flows, and data lineage. Participate in projects related to data migration, system implementations, and data governance initiatives. Provide training and support to end-users on master data best practices and tools. Required Qualifications Bachelor's degree in Information Systems, Data Science, or a related quantitative field. 3+ years of experience in a Master Data Management (MDM), Data Quality, or Data Analyst role, specifically focused on master data. Strong understanding of master data concepts, data governance principles, and data lifecycle management. Proficiency with data analysis tools and techniques. Experience with enterprise resource planning (ERP) systems (e.g., SAP, Oracle, Microsoft Dynamics) and their master data structures. Experience with cloud platforms (AWS, Azure) or relevant data technologies. Excellent analytical, problem-solving, and communication skills, with the ability to translate technical concepts to non-technical stakeholders. Proven ability to work independently and collaboratively in a fast-paced environment. Preferred Qualifications Experience with MDM software solutions (e.g., Informatica MDM, SAP MDG). Familiarity with SQL and experience querying relational databases. Knowledge of SAP modules (ECC, CRM, BW) and of data governance, metadata management, and data cataloging tools (e.g., Alation, Collibra). Familiarity with handling MDM in SAP ECC and SAP S/4 versions Knowledge of data warehousing concepts and business intelligence tools (e.g., Power BI, Tableau). Experience with data governance frameworks and tools. Certifications in data management or related fields. Benefits We are committed to offering competitive benefits programs for all of our employees and enhancing our programs when necessary.
Have peace of mind and body with our health insurance Make yourself a priority with flexible schedules and leave policy Drive forward your career through professional development opportunities Achieve your personal goals with our Employee Assistance Program. Our commitment to you Our greatest assets are the expertise, creativity and passion of our employees. We strive to provide a great place to work that attracts, develops and retains the best talent, promotes employee engagement, fosters teamwork and ultimately drives innovation for the benefit of our customers. We strive to create an environment where you feel that you belong, with diversity and inclusion as the engine to growth and innovation. We develop and deploy best-in-class programs and practices, providing enriching career opportunities, listening to employee feedback and always challenging ourselves to do better. This is The Carrier Way. Join us and make a difference. Now! Carrier is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or veteran status, age or any other federally protected class.
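As a hedged illustration of the de-duplication side of master data stewardship mentioned in this posting, a minimal fuzzy-matching sketch using only the Python standard library; the vendor records and the 0.85 similarity threshold are hypothetical:

```python
# Sketch: flagging likely duplicate vendor master records via fuzzy name
# similarity. Stdlib-only; records and threshold are hypothetical.
from difflib import SequenceMatcher
from itertools import combinations

vendors = [
    ("V001", "Acme Industrial Supplies"),
    ("V002", "ACME Industrial Supply"),
    ("V003", "Northwind Traders"),
]

def similarity(a: str, b: str) -> float:
    # Normalize case and whitespace before comparing.
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

THRESHOLD = 0.85  # illustrative cutoff; tune against known duplicates
for (id_a, name_a), (id_b, name_b) in combinations(vendors, 2):
    score = similarity(name_a, name_b)
    if score >= THRESHOLD:
        print(f"possible duplicate: {id_a} vs {id_b} (score={score:.2f})")
```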
Posted 1 week ago
5.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Greetings from Synergy Resource Solutions, a leading Recruitment Consultancy. Our client is an ISO 27001:2013 and ISO 9001 certified company and a pioneering web design and development company from India. The company has also been voted among the Top 10 mobile app development companies in India, and is a leading IT consulting and web solution provider for custom software, websites, games, custom web applications, enterprise mobility, mobile apps and cloud-based application design & development. It is ranked among the fastest growing web design and development companies in India, with 3900+ successfully delivered projects across the United States, UK, UAE, Canada and other countries. An over-95% client retention rate demonstrates their level of service and client satisfaction. Position: Senior Data Engineer Experience: 5+ Years relevant experience Education Qualification: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. Job Location: Ahmedabad Shift: 11 AM – 8:30 PM Key Responsibilities: Our client is seeking an experienced and motivated Senior Data Engineer to join their AI & Automation team. The ideal candidate will have 5–8 years of experience in data engineering, with a proven track record of designing and implementing scalable data solutions. A strong background in database technologies, data modeling, and data pipeline orchestration is essential. Additionally, hands-on experience with generative AI technologies and their applications in data workflows will set you apart. In this role, you will lead data engineering efforts to enhance automation, drive efficiency, and deliver data-driven insights across the organization. Job Description: • Design, build, and maintain scalable, high-performance data pipelines and ETL/ELT processes across diverse database platforms. • Architect and optimize data storage solutions to ensure reliability, security, and scalability. • Leverage generative AI tools and models to enhance data engineering workflows, drive automation, and improve insight generation. • Collaborate with cross-functional teams (Data Scientists, Analysts, and Engineers) to understand and deliver on data requirements. • Develop and enforce data quality standards, governance policies, and monitoring systems to ensure data integrity. • Create and maintain comprehensive documentation for data systems, workflows, and models. • Implement data modeling best practices and optimize data retrieval processes for better performance. • Stay up-to-date with emerging technologies and bring innovative solutions to the team. Qualifications: • Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. • 5–8 years of experience in data engineering, designing and managing large-scale data systems. Strong expertise in database technologies; the mandatory skills are as follows: SQL; NoSQL (MongoDB, Cassandra, or CosmosDB); one of Snowflake, Redshift, BigQuery, or Microsoft Fabric; Azure • Hands-on experience implementing and working with generative AI tools and models in production workflows. • Proficiency in Python and SQL, with experience in data processing frameworks (e.g., Pandas, PySpark). • Experience with ETL tools (e.g., Apache Airflow, MS Fabric, Informatica, Talend) and data pipeline orchestration platforms (a brief orchestration sketch follows this section). • Strong understanding of data architecture, data modeling, and data governance principles. • Experience with cloud platforms (preferably Azure) and associated data services.
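As a hedged illustration of the orchestration experience referenced above, a minimal Apache Airflow DAG sketch; the DAG ID and task bodies are hypothetical placeholders for real extract/transform/load logic:

```python
# Sketch: a minimal daily ETL DAG for Apache Airflow 2.x. Task bodies and
# names are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw data from source systems")      # placeholder

def transform():
    print("clean, validate, and reshape the data")  # placeholder

def load():
    print("write curated data to the warehouse")    # placeholder

with DAG(
    dag_id="daily_etl_sketch",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # use schedule_interval on Airflow < 2.4
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load
```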
Skills: • Advanced knowledge of Database Management Systems and ETL/ELT processes. • Expertise in data modeling, data quality, and data governance. • Proficiency in Python programming, version control systems (Git), and data pipeline orchestration tools. • Familiarity with AI/ML technologies and their application in data engineering. • Strong problem-solving and analytical skills, with the ability to troubleshoot complex data issues. • Excellent communication skills, with the ability to explain technical concepts to non-technical stakeholders. • Ability to work independently, lead projects, and mentor junior team members. • Commitment to staying current with emerging technologies, trends, and best practices in the data engineering domain. If your profile matches the requirements and you are interested in this role, please share your updated resume with details of your present salary, expected salary, and notice period.
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
hyderabad, telangana
On-site
About Forsys: Forsys Inc. is a leading company specializing in Lead-to-Revenue transformation, utilizing a combination of strategy, technology, and business transformation to foster growth. The company boasts a team of over 500 professionals dispersed across various locations such as the US, India, UK, Colombia, and Brazil, with its headquarters situated in the Bay Area. Forsys is renowned for its commitment to innovation and excellence. As an implementation partner for major vendors like Conga, Salesforce, and Oracle, as well as an incubator for groundbreaking ideas and solutions, Forsys holds a unique position within the consulting industry. The company is dedicated to empowering its clients by uncovering new revenue streams and cultivating a culture of innovation. To learn more about our vision and the impact we are making, visit forsysinc.com. Data Migration Technical Lead: Forsys is currently seeking a full-time Data Migration Technical Lead who is a proficient Salesforce Revenue Cloud Data Migration Specialist. In this role, you will be responsible for overseeing and executing data migration activities as part of Revenue Cloud implementation and transformation projects. As a key member of the Forsys Data Migration team, you will analyze data from multiple source systems, consult with clients on data transformation, and manage end-to-end data and document migration processes. Responsibilities: - Possessing over 8 years of experience as a data migration technical lead, with a proven track record in handling complex migration projects. - Developing and implementing data migration strategies for Salesforce Revenue Cloud, including CPQ (Configure Price Quote), Billing, and related modules. - Collaborating with clients to assess their data requirements, creating data models, and establishing data mappings. - Evaluating source data quality, devising data cleansing strategies, and executing data cleaning processes as needed. - Building ETL/ELT pipelines using tools like Informatica, Talend, or native Salesforce tools. - Adhering to best practices for data migration and following established standards and protocols. - Assessing different source systems to determine optimal data transfer methods and managing large volumes of data effectively. - Designing and conducting data validation procedures pre and post-migration, and generating Data Reconciliation reports. - Implementing testing protocols to ensure data accuracy and consistency with client specifications. - Providing technical support throughout the data migration process to ensure efficiency and smooth operation. - Creating comprehensive documentation of the migration process to guide future projects. - Mentoring team members and fostering collaboration to achieve project deliverables effectively. - Demonstrating the ability to perform effectively in high-pressure environments. Eligibility: - Minimum of 8 years of experience in data migration or ETL roles, with at least 2 years focusing on Salesforce Revenue Cloud (CPQ + Billing). - Proficiency in utilizing ETL Tools such as Pentaho, Mulesoft, Informatica, Data Stage, SSIS, etc. - Strong understanding of the Salesforce data model and experience in various phases of Data Migration. - Advanced SQL skills, familiarity with APIs, and integration patterns. - Experience in data/process mapping for Data Migrations involving Salesforce, Oracle, and Legacy systems is preferred. - Extensive experience working with different databases and SQL queries. 
- Knowledge of Supply Chain/CRM/Quote-to-Cash/Quote-to-Order business processes. - Proficiency in handling various data formats (XML, JSON, etc.). - Expertise in SOAP & REST services, API implementation, and cloud services. - Strong communication skills, the ability to work effectively in onshore and offshore teams, and drive to achieve goals. - Self-motivated, goal-oriented individuals with strong analytical and problem-solving skills. - Prior experience with source systems such as NetSuite, SAP, Zuora, or Oracle for migration to Salesforce Revenue Cloud is advantageous.
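As a hedged illustration of the pre/post-migration data validation this role describes, a minimal record-level checksum reconciliation between source and target extracts; the CSV layout, key field, and sample values are hypothetical:

```python
# Sketch: per-record checksum reconciliation between source and target
# extracts after a migration. CSV layout and field names are hypothetical;
# the embedded strings stand in for real extract files.
import csv
import hashlib
import io

SOURCE_CSV = "record_id,name,amount\n1,Acme,100\n2,Beta,250\n3,Gamma,75\n"
TARGET_CSV = "record_id,name,amount\n1,Acme,100\n2,Beta,999\n4,Delta,10\n"

def checksums(text, key):
    """Map each record's key to an MD5 of its remaining field values."""
    out = {}
    for row in csv.DictReader(io.StringIO(text)):
        payload = "|".join(v.strip() for k, v in sorted(row.items()) if k != key)
        out[row[key]] = hashlib.md5(payload.encode()).hexdigest()
    return out

src = checksums(SOURCE_CSV, key="record_id")
tgt = checksums(TARGET_CSV, key="record_id")

print("missing in target:", sorted(src.keys() - tgt.keys()))  # ['3']
print("extra in target:", sorted(tgt.keys() - src.keys()))    # ['4']
print("value mismatches:", sorted(k for k in src.keys() & tgt.keys()
                                  if src[k] != tgt[k]))        # ['2']
```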
Posted 1 week ago