
1049 Normalization Jobs - Page 39

JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

3.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Company Description
Malayaj Solutions is a leading information technology organisation headquartered in Bangalore. Since 2018, we have provided customised technology services to address specific client needs in areas such as development, automation, digital transformation, and managed services. Our skilled professionals are dedicated to delivering productive outcomes and exemplary customer service. Malayaj Solutions is trusted by companies and partners globally, offering innovative solutions across industries like healthcare, education, energy resources, and more.

Role Description
This is a full-time, on-site role located in Noida/Gurgaon for a Junior Database Administrator. The Junior Database Administrator will be responsible for managing and maintaining databases, designing database structures, and troubleshooting database issues. Daily tasks will include setting up database replication, ensuring data integrity and security, and optimising performance. Collaboration with other IT team members to support business applications is also required.

Qualifications
- Database administration and database skills
- Hands-on experience with PostgreSQL or Oracle DB is a must
- Proficiency in database design, including schema design and normalization
- Strong troubleshooting skills, specifically for database-related issues
- Experience with database replication and ensuring data consistency
- Excellent problem-solving and analytical skills
- Ability to work collaboratively and maintain effective communication with team members
- Bachelor's degree in Computer Science, Information Technology, or a related field is preferred
- Experience with specific database management systems (e.g., MySQL, PostgreSQL, SQL Server)

Key Responsibilities:
- Install, configure, and maintain MS SQL Server, PostgreSQL, and Oracle databases.
- Manage database security, backup & recovery, and high-availability setups.
- Proactively monitor database performance, identify bottlenecks, and implement tuning solutions.
- Plan and execute database migrations, upgrades, and patch management.
- Collaborate with development and infrastructure teams on schema design, query optimisation, and issue resolution.
- Ensure compliance with data protection regulations and internal data governance policies.
- Develop and maintain database documentation, standards, and procedures.

Must-Have Skills:
- 3+ years of proven experience as a database administrator.
- Strong expertise in PostgreSQL and Oracle DB.
- Experience in database clustering, replication, and disaster recovery setups.
- Proficiency in SQL scripting, query optimisation, and performance tuning.
- Familiarity with cloud-based database services (AWS RDS, Azure SQL, etc.) is a plus.

Good to Have:
- Exposure to NoSQL databases like Redis.
- Experience working in an Agile/DevOps environment.
- Certification in MS SQL/PostgreSQL is an added advantage.

Location: Noida/Gurgaon (preference will be given to candidates open to a hybrid working model).
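
Since this listing leans heavily on PostgreSQL schema design and normalization, here is a minimal, hypothetical sketch of what that work can look like in practice: splitting customer attributes out of an orders table, enforcing the relationship with a foreign key, and indexing it. The DSN, table names, and use of psycopg2 are illustrative assumptions, not part of the posting.

```python
# Hypothetical sketch: a small normalized schema (customers 1-to-many orders)
# in PostgreSQL, created via psycopg2. DSN and table names are placeholders.
import psycopg2

DDL = """
CREATE TABLE IF NOT EXISTS customers (
    customer_id SERIAL PRIMARY KEY,
    email       TEXT NOT NULL UNIQUE,
    full_name   TEXT NOT NULL
);

CREATE TABLE IF NOT EXISTS orders (
    order_id    SERIAL PRIMARY KEY,
    customer_id INTEGER NOT NULL REFERENCES customers (customer_id),
    ordered_at  TIMESTAMPTZ NOT NULL DEFAULT now(),
    amount      NUMERIC(12, 2) NOT NULL CHECK (amount >= 0)
);

-- Index the foreign key so joins and per-customer lookups stay fast.
CREATE INDEX IF NOT EXISTS idx_orders_customer_id ON orders (customer_id);
"""

def create_schema(dsn: str) -> None:
    """Create the normalized tables and supporting index."""
    with psycopg2.connect(dsn) as conn:      # the block commits on clean exit
        with conn.cursor() as cur:
            cur.execute(DDL)

if __name__ == "__main__":
    create_schema("dbname=scratch user=postgres host=localhost")
```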

Posted 1 month ago

Apply

7.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY - GDS Consulting – GRC Technology – Enterprise GRC – Manager
As EY GDS Consulting Manager, you’ll manage as well as contribute technically and functionally to GRC Technology client engagements and internal projects. You’ll also identify potential business opportunities for EY within existing engagements and escalate these as appropriate. Similarly, you’ll anticipate and identify risks within engagements and share any issues with senior members of the team.

The Opportunity
We’re looking for a seasoned platforms engineer with experience on GRC/IRM modules, who will be responsible for the evaluation, recommendation and build-out of customer requirements on GRC platforms, utilizing defined best practices for configuration and development and ensuring the scalability and performance of these platforms. This position requires a high level of functional GRC expertise with exposure to and understanding of various GRC platforms (such as Archer and ServiceNow), strategic thinking, and the ability to lead conversations and projects. This is a fantastic opportunity to be part of a leading firm whilst being instrumental in the growth of the service offering.

Your Key Responsibilities
- Lead requirement gathering and/or review clients' process requirements and how they map to GRC platforms, driving platform expansion and adoption.
- Scope solution requirements and configure solutions around the platform to meet customer needs and project deliverables.
- Design solutions that include integration of AI/Gen AI/microservices for document/data/access management, third-party integrations, and cloud environment management and monitoring.
- Understand solution architecture design patterns and present solution architectures to client CIOs/CTOs.
- Define and implement IT architecture and data pipelines tailored for risk and GRC technologies.
- Evaluate and select appropriate technologies and tools for data ingestion and normalization.
- Consider dependencies, relationships, and integration points to ensure proper solution integration with other systems when applicable.
- Leverage knowledge and experience to deliver end-to-end automated solutions, including technical implementation of IT Infrastructure Library (ITIL) processes, workflow customization, process automation, report development, dashboard creation, and system configurations.
- Identify new opportunities to provide additional value to clients and improve business performance.
- Create/review statements of work to help ensure an appropriate level of effort.
- Participate in growing and enhancing internal processes for successful client delivery.
- Stay updated with the latest trends and advancements in IT architecture, data engineering, and GRC technologies.
- Ensure that all implementation work is carried out in accordance with company policies and industry best practices.
- Lead business development activities including written proposals, presales activities, functional demonstrations and presentations.
- Ensure adherence to the quality processes specified for the project.
- Develop and maintain productive working relationships with client personnel.
- Plan and monitor the project deliverables from the team.
- Mentor the project team in executing the identified projects.

Skills And Attributes For Success
- Conduct performance reviews and contribute to performance feedback for staff and senior staff.
- Foster teamwork and a quality culture, and lead by example.
- Understand and follow workplace policies and procedures.
- Train and mentor project resources.
- Participate in organization-wide people initiatives.

To qualify for the role, you must have
- 7+ years of industry experience, with experience leading larger teams; should have led/completed at least 2 to 3 engagements/projects in a similar role.
- Demonstrated ability to map solutions to address client business issues and problem statements.
- Functional knowledge and implementation experience of GRC frameworks, working directly with customers and clients.
- Experience in IT solution architecture and understanding of enterprise architecture.
- Strong understanding of ITIL processes.
- Exceptional problem-solving capability and ability to think strategically.
- Excellent interpersonal and communication skills.
- Experience in strategy, business development, finance and budgeting is desirable.
- Good understanding of GRC technology platforms including Archer and ServiceNow.

Ideally, you should also have
- B.E/B.Tech (Computer Science, IT, Electronics, Electronics & Telecommunications)/MBA with a minimum of 7+ years of experience with other Big 3 or panelled SI/ITeS companies.
- Robust understanding of program and project management practices.
- Familiarity with a typical IT systems development life cycle.
- Knowledge of industry standards and regulations related to risk and compliance.
- Knowledge and experience of GRC/IRM modules.
- Good to have: experience in GRC roadmap review, vendor comparison and selection.
- Exposure to multiple GRC tools like MetricStream, Enablon, etc. would be an added advantage.

What We Look For
A team of people with commercial acumen, technical experience, and enthusiasm to learn new things in this fast-moving environment, with consulting skills. An opportunity to be part of a market-leading, multi-disciplinary team of 1400+ professionals, in the only integrated global transaction business worldwide. Opportunities to work with EY Consulting practices globally with leading businesses across a range of industries.

What Working At EY Offers
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching and feedback from some of the most engaging colleagues around
- Opportunities to develop new skills and progress your career
- The freedom and flexibility to handle your role in a way that’s right for you

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 month ago

Apply

0 years

0 Lacs

Haryana, India

On-site

About The Role
We are seeking a talented and detail-oriented Database Developer (Performance Tuning) to join our technology team. In this role, you will be responsible for designing, developing, optimizing, and maintaining SQL Server databases with a strong focus on performance tuning and T-SQL programming. You'll work closely with application developers, QA engineers, and system architects to ensure efficient and scalable database solutions. The ideal candidate is someone with a solid foundation in database design and performance optimization, who is also proactive in identifying potential issues and proposing innovative solutions.

Key Responsibilities
- Develop, enhance, and maintain database objects including stored procedures, functions, views, triggers, and scripts using T-SQL.
- Design and implement scalable database solutions that align with business and technical requirements.
- Collaborate with software development teams to integrate efficient database queries into applications.
- Analyze and troubleshoot SQL Server performance issues including slow-running queries, blocking, deadlocks, and index fragmentation.
- Perform index tuning, query optimization, and execution plan analysis.
- Monitor and fine-tune SQL Server environments for high performance and availability.
- Design and develop SSRS reports, dashboards, and data visualizations based on business requirements.
- Prepare technical documentation including data models, specifications, and functional flow diagrams.
- Maintain version control and change management using tools like Team Foundation Server (TFS) or Git.
- Perform routine database administration tasks including backups, security, patching, and schema migrations.
- Evaluate existing systems and recommend improvements for stability, scalability, and efficiency.
- Provide technical support to development and QA teams, and assist in debugging and fixing issues.

Technical Skills
- Proficient in Microsoft SQL Server and T-SQL for database development and performance tuning.
- Strong understanding of relational database design, normalization, and data integrity principles.
- Hands-on experience with SSRS (SQL Server Reporting Services) for report generation and customization.
- Experience with Team Foundation Server (TFS) or equivalent version control and issue tracking tools.
- Familiarity with cloud-based databases (Azure SQL, AWS RDS) is a plus but not mandatory.
- Understanding of application integration techniques and agile software development (Scrum) is desirable.
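
For context on the performance-tuning work this role describes, here is a hedged sketch of one common first step: pulling the most expensive cached statements from SQL Server's execution-statistics DMVs. The connection string and server name are placeholders; the DMVs (sys.dm_exec_query_stats, sys.dm_exec_sql_text) are standard SQL Server views, and pyodbc is just one convenient way to run the query.

```python
# Hypothetical sketch: list the top cached statements by average elapsed time,
# a typical starting point before index tuning or execution plan analysis.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=sqlserver.example.internal;DATABASE=AppDb;"   # placeholder server
    "Trusted_Connection=yes;TrustServerCertificate=yes;"
)

QUERY = """
SELECT TOP (10)
    qs.execution_count,
    qs.total_elapsed_time / qs.execution_count AS avg_elapsed_us,
    SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
              ((CASE qs.statement_end_offset WHEN -1 THEN DATALENGTH(st.text)
                ELSE qs.statement_end_offset END - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY avg_elapsed_us DESC;
"""

with pyodbc.connect(CONN_STR) as conn:
    for count, avg_us, text in conn.cursor().execute(QUERY):
        print(f"{avg_us:>12} us  x{count:<6} {(text or '')[:80]!r}")
```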

Posted 1 month ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

As Lead Splunk, your role and responsibilities would include:
- Hands-on experience in the SIEM domain.
- Deep understanding of Splunk backend operations (UF, HF, SH, and Indexer Cluster) and architecture.
- Strong knowledge of log management and Splunk SIEM; understanding of log collection, parsing, normalization, and retention practices.
- Expertise in optimizing logs and license usage.
- Solid understanding of designing, deploying, and implementing scalable SIEM architecture.
- Understanding of data parsimony as a concept, especially in terms of German data security standards.
- Working knowledge of integrating Splunk logging infrastructure with third-party observability tools like ELK and Datadog.
- Experience in identifying security and non-security logs and applying appropriate filters to route the logs correctly.
- Expertise in understanding network architecture and identifying the components of impact.
- Proficiency in Linux administration.
- Experience with Syslog.
- Proficiency in scripting languages like Python, PowerShell, or Bash for task automation.
- Expertise with OEM SIEM tools, preferably Splunk.
- Experience with open-source SIEM/log storage solutions like ELK or Datadog.
- Strong documentation skills for creating high-level design (HLD), low-level design (LLD), implementation guides, and operation manuals.

Skills: SIEM, Linux administration, team collaboration, communication skills, architecture design, Python, parsing, normalization, retention practices, PowerShell, data security, log management, Bash, Splunk, log collection, documentation, Syslog, incident response, data analysis
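
The listing's emphasis on optimizing logs and license usage, plus Python automation, can be illustrated with a small hedged sketch: querying Splunk's REST search endpoint for indexed volume per sourcetype over the last day. The host, credentials, and use of the requests library are assumptions; the license_usage.log search itself is a commonly used pattern and should be adapted to the actual environment.

```python
# Hypothetical sketch: report indexed GB per sourcetype for the last 24 hours
# via the Splunk REST API, a first step when tuning license consumption.
import json
import requests

SPLUNK = "https://splunk.example.com:8089"   # placeholder management endpoint
AUTH = ("admin", "changeme")                  # prefer a token in practice

SEARCH = (
    "search index=_internal source=*license_usage.log type=Usage earliest=-24h "
    "| stats sum(b) AS bytes BY st "
    "| eval gb=round(bytes/1024/1024/1024, 2) "
    "| sort - gb"
)

resp = requests.post(
    f"{SPLUNK}/services/search/jobs/export",
    auth=AUTH,
    data={"search": SEARCH, "output_mode": "json"},
    verify=False,     # self-signed certificates are common on port 8089
    stream=True,
    timeout=300,
)
resp.raise_for_status()

# The export endpoint streams one JSON object per line.
for line in resp.iter_lines():
    if line:
        result = json.loads(line).get("result", {})
        if result:
            print(result.get("st"), result.get("gb"), "GB")
```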

Posted 1 month ago

Apply

3.0 - 5.5 years

0 Lacs

Bengaluru, Karnataka, India

On-site

At PwC, our people in managed services focus on a variety of outsourced solutions and support clients across numerous functions. These individuals help organisations streamline their operations, reduce costs, and improve efficiency by managing key processes and functions on their behalf. They are skilled in project management, technology, and process optimization to deliver high-quality services to clients. Those in managed service management and strategy at PwC will focus on transitioning and running services, along with managing delivery teams, programmes, commercials, performance and delivery risk. Your work will involve the continuous improvement and optimisation of the managed services process, tools and services.

Role: Data, Analytics & Insights Managed Service - Testing Engineer - Associate
Tower: Testing Managed Service
Experience: 3 - 5.5 years
Key Skills: Data Analytics, SQL, Python, Statistical Analysis, ETL, Reporting
Educational Qualification: Bachelor's degree in computer science/IT or a relevant field
Work Location: Bangalore, India

Job Description
At the Associate level, you'll work as part of a team of problem solvers, helping to solve complex business issues from strategy to execution by using Data, Analytics & Insights skills. PwC Professional skills and responsibilities for this management level include, but are not limited to:
- Use feedback and reflection to develop self-awareness, personal strengths, and address development areas.
- Demonstrate critical thinking and the ability to bring order to unstructured problems.
- Ticket quality and deliverables review, and status reporting for the project.
- Adherence to SLAs; experience in incident management, change management and problem management.
- Review your work and that of others for quality, accuracy, and relevance.
- Know how and when to use the tools available for a given situation, and be able to explain the reasons for this choice.
- Seek and embrace opportunities that give exposure to different situations, environments, and perspectives.
- Flexible to work in stretch opportunities/assignments.
- Use straightforward communication, in a structured way, when influencing and connecting with others.
- Able to read situations and modify behavior to build quality relationships.
- Uphold the firm's code of ethics and business conduct.
- Work in a team environment that includes client interactions, workstream management, and cross-team collaboration.
- Good team player; take up cross-competency work and contribute to COE activities.
- Escalation/risk management.

Position Requirements
Required Skills
Primary skills: Data Analytics concepts, SQL, ETL concepts, Python, cloud platforms (AWS/Azure), Java core concepts, GitLab
Secondary skills: Reporting, API testing, Salesforce Workbench, Airflow/Prefect

Data Analyst
- Should have a minimum of 3 years' hands-on experience building advanced data analytics.
- Should have a minimum of 3 years' hands-on experience in ETL testing, with a strong emphasis on accuracy, reliability and data integrity.
- Should have a minimum of 3 years' hands-on experience in SQL for data handling, and 0-1 years of experience in manual testing.
- Should have extensive experience in developing scalable, repeatable, and secure data structures and pipelines to ingest, store, collect, standardize, and integrate data for downstream consumption, such as analytics modeling and testing data accuracy.
- Should have an understanding of core data concepts: data ingestion, transformation, enrichment, quality, governance, lineage and design.
- Validating data fields, formats and utility using quality control checklists.
- Should have sound knowledge of the Snowflake data warehouse and Salesforce Workbench for data management and reconciliation.
- Ability to transform requirements and user stories into comprehensive test cases.
- Strong foundation in data modelling principles, including database schemas, normalization and relationships.
- Ability to define and develop test strategies, test plans, KPIs, metrics, defect management processes and reporting mechanisms.
- Designing test cases and generating test data based on the requirements and project needs.
- Familiarity with data quality frameworks and tools to enhance testing efficiency.
- In-depth understanding of industry standards and best practices.
- Exposure to data quality platforms.
- Should have strong communication, problem-solving, quantitative and analytical abilities.

Nice To Have
- Certifications in ETL and other BI tools are an added advantage.
- Tools knowledge: Snowflake, Qlik, AppFlow, Datagaps, QuerySurge, iceDQ, Ataccama or equivalent.
- Cloud platforms: AWS, Azure.

Soft Skills
- Exceptional written and verbal communication skills to articulate technical details effectively.
- Strong problem-solving skills with keen attention to detail and precision.
- Ability to work autonomously as well as collaborate effectively within diverse teams.
- A proactive mindset focused on continuous improvement in quality assurance processes.
- Adaptability to manage multiple priorities and excel in dynamic, fast-paced environments.

Managed Services - Data, Analytics & Insights
At PwC we relentlessly focus on working with our clients to bring the power of technology and humans together and create simple, yet powerful solutions. We imagine a day when our clients can simply focus on their business knowing that they have a trusted partner for their IT needs. Every day we are motivated and passionate about making our clients' businesses better. Within our Managed Services platform, PwC delivers integrated services and solutions that are grounded in deep industry experience and powered by the talent that you would expect from the PwC brand. The PwC Managed Services platform delivers scalable solutions that add greater value to our clients' enterprises through technology and human-enabled experiences. Our team of highly skilled and trained global professionals, combined with the use of the latest advancements in technology and process, allows us to provide effective and efficient outcomes. With PwC's Managed Services our clients are able to focus on accelerating their priorities, including optimizing operations and accelerating outcomes. PwC brings a consultative-first approach to operations, leveraging our deep industry insights combined with world-class talent and assets to enable transformational journeys that drive sustained client outcomes. Our clients need flexible access to world-class business and technology capabilities that keep pace with today's dynamic business environment. Within our global Managed Services platform, we provide Data, Analytics & Insights Managed Services, where we focus on the evolution of our clients' data, analytics, insights and cloud portfolio. Our focus is to empower our clients to navigate and capture the value of their application portfolio while cost-effectively operating and protecting their solutions. We do this so that our clients can focus on what matters most to their business: accelerating growth that is dynamic, efficient and cost-effective.

As a member of our Data, Analytics & Insights Managed Service team, we are looking for candidates who thrive working in a high-paced environment, capable of working on a mix of critical Application Evolution Service offerings and engagements including help desk support, enhancement and optimization work, as well as strategic roadmap and advisory-level work. It will also be key to lend experience and effort in helping win and support customer engagements from not only a technical perspective, but also a relationship perspective.

Posted 1 month ago

Apply

4.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Title: Data Engineer
Location: Chennai (Hybrid)

Summary
Design, develop, and maintain scalable data pipelines and systems to support the collection, integration, and analysis of healthcare and enterprise data. The primary responsibilities of this role include designing and implementing efficient data pipelines, architecting robust data models, and adhering to data management best practices. In this position, you will play a crucial part in transforming raw data into meaningful insights through the development of semantic data layers, enabling data-driven decision-making across the organization. The ideal candidate will possess strong technical skills, a keen understanding of data architecture, and a passion for optimizing data processes.

Accountability
- Design and implement scalable and efficient data pipelines to acquire, transform, and integrate data from various sources, such as electronic health records (EHR), medical devices, claims data, and back-office enterprise data.
- Develop data ingestion processes, including data extraction, cleansing, and validation, ensuring data quality and integrity throughout the pipeline.
- Collaborate with cross-functional teams, including subject matter experts, analysts, and engineers, to define data requirements and ensure data pipelines meet the needs of data-driven initiatives.
- Design and implement data integration strategies to merge disparate datasets, enabling comprehensive and holistic analysis.
- Implement data governance practices and ensure compliance with healthcare data standards, regulations (e.g., HIPAA), and security protocols.
- Monitor and troubleshoot pipeline and data model performance, identifying and addressing bottlenecks and ensuring optimal system performance and data availability.
- Design and implement data models that align with domain requirements, ensuring efficient data storage, retrieval, and delivery.
- Apply data modeling best practices and standards to ensure consistency, scalability, and reusability of data models.
- Implement data quality checks and validation processes to ensure the accuracy, completeness, and consistency of healthcare data.
- Develop and enforce data governance policies and procedures, including data lineage, architecture, and metadata management.
- Collaborate with stakeholders to define data quality metrics and establish data quality improvement initiatives.
- Document data engineering processes, methodologies, and data flows for knowledge sharing and future reference.
- Stay up to date with emerging technologies, industry trends, and healthcare data standards to drive innovation and ensure compliance.

Skills
- 4+ years of strong programming skills in object-oriented languages such as Python.
- Proficiency in SQL.
- Hands-on experience with data integration tools, ETL/ELT frameworks, and data warehousing concepts.
- Hands-on experience with data modeling and schema design, including concepts such as star schema, snowflake schema and data normalization.
- Familiarity with healthcare data standards (e.g., HL7, FHIR), electronic health records (EHR), medical coding systems (e.g., ICD-10, SNOMED CT), and relevant healthcare regulations (e.g., HIPAA).
- Hands-on experience with big data processing frameworks such as Apache Hadoop, Apache Spark, etc.
- Working knowledge of cloud computing platforms (e.g., AWS, Azure, GCP) and related services (e.g., DMS, S3, Redshift, BigQuery).
- Experience integrating heterogeneous data sources, aligning data models and mapping between different data schemas.
- Understanding of metadata management principles and tools for capturing, storing, and managing metadata associated with data models and semantic data layers.
- Ability to track the flow of data and its transformations across data models, ensuring transparency and traceability.
- Understanding of data governance principles, data quality management, and data security best practices.
- Strong problem-solving and analytical skills with the ability to work with complex datasets and data integration challenges.
- Excellent communication and collaboration skills, with the ability to work effectively in cross-functional teams.

Education
Bachelor's or Master's degree in computer science, information systems, or a related field. Proven experience as a Data Engineer or similar role with a focus on healthcare data.
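
As a rough illustration of the cleansing-and-validation step this role describes, here is a hypothetical pandas sketch that standardizes a raw extract of encounter records before loading. The file name, column names, and validation rules are invented for illustration and are not taken from the posting.

```python
# Hypothetical sketch of one pipeline step: cleanse and validate a raw
# extract of healthcare encounter records before warehouse loading.
import pandas as pd

REQUIRED = ["patient_id", "encounter_id", "admit_date", "icd10_code", "charge_amount"]

def cleanse(raw: pd.DataFrame) -> pd.DataFrame:
    df = raw.copy()
    # Standardize column names and types.
    df.columns = [c.strip().lower() for c in df.columns]
    df["admit_date"] = pd.to_datetime(df["admit_date"], errors="coerce")
    df["charge_amount"] = pd.to_numeric(df["charge_amount"], errors="coerce")
    # Basic validation: required keys present, no duplicate encounters.
    df = df.dropna(subset=["patient_id", "encounter_id", "admit_date"])
    df = df.drop_duplicates(subset=["encounter_id"], keep="last")
    # Simple quality check that would feed a monitoring dashboard.
    bad_codes = ~df["icd10_code"].str.match(r"^[A-TV-Z][0-9]", na=False)
    print(f"rows failing ICD-10 format check: {bad_codes.sum()}")
    return df[REQUIRED]

if __name__ == "__main__":
    frame = pd.read_csv("encounters_raw.csv")        # placeholder source file
    cleanse(frame).to_parquet("encounters_clean.parquet", index=False)
```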

Posted 1 month ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

As Lead Splunk, your role and responsibilities would include:
- Hands-on experience in the SIEM domain.
- Expert knowledge of Splunk backend operations (UF, HF, SH and Indexer Cluster) and architecture.
- Expert knowledge of log management and Splunk SIEM; understanding of log collection, parsing, normalization, and retention practices.
- Expertise in log/license optimization techniques and strategy.
- Good understanding of the design, deployment and implementation of a scalable SIEM architecture.
- Understanding of data parsimony as a concept, especially in terms of German data security standards.
- Working knowledge of integrating Splunk logging infrastructure with third-party observability tools (e.g. ELK, Datadog).
- Experience in identifying security and non-security logs and applying adequate filters to re-route the logs accordingly.
- Expert in understanding network architecture and identifying the components of impact.
- Expert in Linux administration.
- Proficient in working with Syslog.
- Proficiency in scripting languages like Python, PowerShell, or Bash to automate tasks.
- Expertise with OEM SIEM tools, preferably Splunk.
- Experience with open-source SIEM/log storage solutions like ELK or Datadog.
- Very good with documentation of HLD, LLD, implementation guides and operation manuals.

Posted 1 month ago

Apply

0.0 years

0 Lacs

Bhopal, Madhya Pradesh

On-site

Position: Python Intern specializing in AI
Location: Bhopal, Madhya Pradesh (Work from Office)
Duration: 3 to 6 Months

✅ Must-Have Skills
Core Python Programming: Functions, loops, list comprehensions, classes; error handling (try-except), logging; file I/O operations; working with JSON/CSV.
Python Libraries for AI/ML: numpy, pandas – data manipulation & analysis; matplotlib, seaborn – data visualization; scikit-learn – classical machine learning models; basic familiarity with tensorflow or pytorch; working knowledge of OpenAI / Transformers (bonus).
AI/ML Fundamentals: Supervised and unsupervised learning (e.g., regression, classification, clustering); concepts of overfitting, underfitting, and the bias-variance tradeoff; train-test split, cross-validation; evaluation metrics: accuracy, precision, recall, F1-score, confusion matrix.
Data Preprocessing: Handling missing data and outliers; data normalization, encoding techniques; feature selection & dimensionality reduction (e.g., PCA).
Jupyter Notebook Proficiency: Writing clean, well-documented notebooks; using markdown for explanations and visualizing outputs.
Version Control: Git basics (clone, commit, push, pull); using GitHub/GitLab for code collaboration.

✅ Good-to-Have Skills
Deep Learning: Basic understanding of CNNs, RNNs, transformers; familiarity with keras or torch.nn for model building.
Generative AI: Prompt engineering, working with LLM APIs like OpenAI or Hugging Face; experience with vector databases (Qdrant, FAISS).
NLP: Tokenization, stemming, lemmatization; TF-IDF, Word2Vec, BERT basics; projects in sentiment analysis or text classification.
Tools & Platforms: VS Code, JupyterLab; Google Colab / Kaggle; Docker (basic understanding).
Math for AI: Linear algebra, probability & statistics; basic understanding of gradients and calculus.

✅ Soft Skills & Project Experience
Participation in mini-projects (e.g., spam detector, digit recognizer); Kaggle competition experience; ability to clearly explain model outputs and results; documenting findings and creating simple dashboards or reports.

Job Types: Full-time, Internship
Pay: From ₹5,000.00 per month
Schedule: Day shift
Work Location: In person
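
To ground several of the fundamentals listed above (train-test split, normalization, a classical model, and evaluation metrics), here is a minimal scikit-learn sketch. It uses the bundled breast-cancer dataset purely as a stand-in; nothing about it comes from the posting itself.

```python
# Hypothetical end-to-end sketch: split, scale, fit a classical model,
# and report the standard evaluation metrics.
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report, confusion_matrix
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)

# Hold out 20% of the data for evaluation.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

# Scaling inside a pipeline keeps normalization statistics out of the test set.
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
model.fit(X_train, y_train)

y_pred = model.predict(X_test)
print(confusion_matrix(y_test, y_pred))
print(classification_report(y_test, y_pred))  # precision, recall, F1-score
```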

Posted 1 month ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

Experience: 7 to 10 years
Notice Period: Immediate joiners
Work Timings: Normal working hours
Location: Gurgaon, work from office (hybrid mode, client location)

As Lead Splunk, your role and responsibilities would include:
- Hands-on experience in the SIEM domain.
- Expert knowledge of Splunk backend operations (UF, HF, SH and Indexer Cluster) and architecture.
- Expert knowledge of log management and Splunk SIEM; understanding of log collection, parsing, normalization, and retention practices.
- Expertise in log/license optimization techniques and strategy.
- Good understanding of the design, deployment and implementation of a scalable SIEM architecture.
- Understanding of data parsimony as a concept, especially in terms of German data security standards.
- Working knowledge of integrating Splunk logging infrastructure with third-party observability tools (e.g. ELK, Datadog).
- Experience in identifying security and non-security logs and applying adequate filters to re-route the logs accordingly.
- Expert in understanding network architecture and identifying the components of impact.
- Expert in Linux administration.
- Proficient in working with Syslog.
- Proficiency in scripting languages like Python, PowerShell, or Bash to automate tasks.
- Expertise with OEM SIEM tools, preferably Splunk.
- Experience with open-source SIEM/log storage solutions like ELK or Datadog.
- Very good with documentation of HLD, LLD, implementation guides and operation manuals.

Skills: integration with third-party tools, Python, log management, log optimization, documentation, security, SIEM architecture design, parsing, OEM SIEM tools, Linux administration, normalization, log collection, Syslog, PowerShell, Bash, security log identification, SIEM, retention practices, data parsimony, Splunk

Posted 1 month ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

Job Description
Senior Engineer, Data Modeling
Gurgaon/Bangalore, India

AXA XL recognizes data and information as critical business assets, both in terms of managing risk and enabling new business opportunities. This data should not only be high quality, but also actionable, enabling AXA XL's executive leadership team to maximize benefits and facilitate sustained competitive advantage. Our Chief Data Office, also known as our Innovation, Data Intelligence & Analytics team (IDA), is focused on driving innovation by optimizing how we leverage data to drive strategy and create a new business model, disrupting the insurance market.

As we develop an enterprise-wide data and digital strategy that moves us toward a greater focus on the use of data and data-driven insights, we are seeking a Data Engineer. The role will support the team's efforts towards creating, enhancing, and stabilizing the enterprise data lake through the development of data pipelines. This role requires a person who is a team player and can work well with team members from other disciplines to deliver data in an efficient and strategic manner.

What You'll Be Doing
What will your essential responsibilities include?
- Act as a data engineering expert and partner to Global Technology and data consumers in controlling complexity and cost of the data platform, whilst enabling performance, governance, and maintainability of the estate.
- Understand current and future data consumption patterns and architecture at a granular level, and partner with Architects to ensure optimal design of data layers.
- Apply best practices in data architecture: for example, the balance between materialization and virtualization, the optimal level of de-normalization, caching and partitioning strategies, choice of storage and querying technology, and performance tuning.
- Lead and execute hands-on research into new technologies, formulating frameworks for assessing new technology against business benefit and implications for data consumers.
- Act as a best-practice expert and blueprint creator for ways of working such as testing, logging, CI/CD, observability, and release, enabling rapid growth in data inventory and utilization of the Data Science Platform.
- Design prototypes and work in a fast-paced, iterative solution-delivery model.
- Design, develop and maintain ETL pipelines using PySpark in Azure Databricks using Delta tables; use Harness for the deployment pipeline.
- Monitor the performance of ETL jobs, resolve any issues that arise, and improve performance metrics as needed.
- Diagnose system performance issues related to data processing and implement solutions to address them.
- Collaborate with other teams to ensure successful integration of data pipelines into the larger system architecture.
- Maintain integrity and quality across all pipelines and environments.
- Understand and follow secure coding practices to make sure code is not vulnerable.

You will report to the Application Manager.

What You Will Bring
We're looking for someone who has these abilities and skills:

Required Skills And Abilities
- Effective communication skills.
- Bachelor's degree in computer science, mathematics, statistics, finance, a related technical field, or equivalent work experience.
- Relevant years of extensive work experience in various data engineering and modeling techniques (relational, data warehouse, semi-structured, etc.), application development, and advanced data querying skills.
- Relevant years of programming experience using Databricks.
- Relevant years of experience using the Microsoft Azure suite of products (ADF, Synapse and ADLS).
- Solid knowledge of network and firewall concepts.
- Solid experience writing, optimizing and analyzing SQL.
- Relevant years of experience with Python.
- Ability to break down complex data requirements and architect solutions into achievable targets.
- Robust familiarity with Software Development Life Cycle (SDLC) processes and workflow, especially Agile.
- Experience using Harness.
- Technical lead responsible for both individual and team deliveries.

Desired Skills And Abilities
- Worked on big data migration projects.
- Worked on performance tuning at both the database and big data platform level.
- Ability to interpret complex data requirements and architect solutions.
- Distinctive problem-solving and analytical skills combined with robust business acumen.
- Excellent basics on Parquet files and Delta files.
- Effective knowledge of the Azure cloud computing platform.
- Familiarity with reporting software; Power BI is a plus.
- Familiarity with DBT is a plus.
- Passion for data and experience working within a data-driven organization.
- You care about what you do, and what we do.

Who We Are
AXA XL, the P&C and specialty risk division of AXA, is known for solving complex risks. For mid-sized companies, multinationals and even some inspirational individuals we don't just provide re/insurance, we reinvent it. How? By combining a comprehensive and efficient capital platform, data-driven insights, leading technology, and the best talent in an agile and inclusive workspace, empowered to deliver top client service across all our lines of business: property, casualty, professional, financial lines and specialty. With an innovative and flexible approach to risk solutions, we partner with those who move the world forward. Learn more at axaxl.com.

What We Offer

Inclusion
AXA XL is committed to equal employment opportunity and will consider applicants regardless of gender, sexual orientation, age, ethnicity and origins, marital status, religion, disability, or any other protected characteristic. At AXA XL, we know that an inclusive culture and a diverse workforce enable business growth and are critical to our success. That's why we have made a strategic commitment to attract, develop, advance and retain the most diverse workforce possible, and create an inclusive culture where everyone can bring their full selves to work and can reach their highest potential. It's about helping one another, and our business, to move forward and succeed.
- Five Business Resource Groups focused on gender, LGBTQ+, ethnicity and origins, disability and inclusion, with 20 chapters around the globe
- Robust support for flexible working arrangements
- Enhanced family-friendly leave benefits
- Named to the Diversity Best Practices Index
- Signatory to the UK Women in Finance Charter
Learn more at axaxl.com/about-us/inclusion-and-diversity. AXA XL is an Equal Opportunity Employer.

Total Rewards
AXA XL's Reward program is designed to take care of what matters most to you, covering the full picture of your health, wellbeing, lifestyle and financial security. It provides dynamic compensation and personalized, inclusive benefits that evolve as you do. We're committed to rewarding your contribution for the long term, so you can be your best self today and look forward to the future with confidence.

Sustainability
At AXA XL, sustainability is integral to our business strategy. In an ever-changing world, AXA XL protects what matters most for our clients and communities. We know that sustainability is at the root of a more resilient future. Our 2023-26 sustainability strategy, called "Roots of resilience", focuses on protecting natural ecosystems, addressing climate change, and embedding sustainable practices across our operations.

Our Pillars
- Valuing nature: How we impact nature affects how nature impacts us. Resilient ecosystems, the foundation of a sustainable planet and society, are essential to our future. We're committed to protecting and restoring nature, from mangrove forests to the bees in our backyard, by increasing biodiversity awareness and inspiring clients and colleagues to put nature at the heart of their plans.
- Addressing climate change: The effects of a changing climate are far-reaching and significant. Unpredictable weather, increasing temperatures, and rising sea levels cause both social inequalities and environmental disruption. We're building a net-zero strategy, developing insurance products and services, and mobilizing to advance thought leadership and investment in societal-led solutions.
- Integrating ESG: All companies have a role to play in building a more resilient future. Incorporating ESG considerations into our internal processes and practices builds resilience from the roots of our business. We're training our colleagues, engaging our external partners, and evolving our sustainability governance and reporting.
- AXA Hearts in Action: We have established volunteering and charitable giving programs to help colleagues support causes that matter most to them, known as AXA XL's "Hearts in Action" programs. These include our Matching Gifts program, Volunteering Leave, and our annual volunteering day, the Global Day of Giving.
For more information, please see axaxl.com/sustainability.
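
Since this role centres on PySpark ETL into Delta tables on Azure Databricks, here is a hedged sketch of the pattern it describes: read a raw landing zone, apply light transformations, and upsert into a Delta table with a merge. The paths, table and column names are invented, and it assumes a Databricks runtime where the Delta Lake APIs are available.

```python
# Hypothetical sketch: idempotent upsert of raw records into a Delta table.
from pyspark.sql import SparkSession, functions as F
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()  # provided by Databricks at runtime

raw = (
    spark.read.format("json")
    .load("/mnt/landing/policies/2024/*.json")     # placeholder landing path
    .withColumn("ingested_at", F.current_timestamp())
    .dropDuplicates(["policy_id"])
)

target = DeltaTable.forName(spark, "silver.policies")  # placeholder table

# Merge (upsert) keeps the silver table idempotent across pipeline reruns.
(
    target.alias("t")
    .merge(raw.alias("s"), "t.policy_id = s.policy_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```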

Posted 1 month ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Flex is the diversified manufacturing partner of choice that helps market-leading brands design, build and deliver innovative products that improve the world. We believe in the power of diversity and inclusion and cultivate a workplace culture of belonging that views uniqueness as a competitive edge and builds a community that enables our people to push the limits of innovation to make great products that create value and improve people's lives. A career at Flex offers the opportunity to make a difference and invest in your growth in a respectful, inclusive, and collaborative environment. If you are excited about a role but don't meet every bullet point, we encourage you to apply and join us to create the extraordinary.

To support our extraordinary teams who build great products and contribute to our growth, we're looking to add a Specialist – Planning in Chennai, India: a professional who can quickly and accurately process purchase orders in a fast-paced environment, has excellent stakeholder service skills, and works well in a team to consistently meet challenging performance targets.

What a typical day looks like:
- Responsible for providing expertise and support to the Customer Focus Team (CFT), ensuring the ability of materials planning for a specific project or projects as required.
- Providing materials support to the weekly production planned orders, enabling kit-on-time drop to meet the customer schedule.
- Key assignments include providing timely materials status through use of available shortage reports, submission of excess and obsolete inventory to the customer, work order management, and inventory management to achieve the operating goals.
- Acting as a senior materials planner for new emerging NPI accounts to provide faster service to the NPI customer and to communicate effectively with the customer while protecting the business interests of Flex.
- Working on customer forecasts for activities like normalization, forecast comparison, etc.
- Working on customer forecast & shipment using the waterfall method.
- Responsible for analyzing availability of materials & capacity based on customer demand & coming up with an aggressive but achievable loading schedule.
- Responsible for running weekly system reports to determine material shortages & working on their closure with the buying team.
- Responsible for handling work order management based on the build plan.
- Responsible for identifying & taking various inventory management measures.

The experience we're looking to add to our team:
- Education: Bachelor's degree or Engineering graduates
- Experience: 3-5 years in planning/supply chain
- Mandatory: Knowledge of computer software applications, MS Excel, Word & PowerPoint
- Proficiency: ERP/P2P systems such as BAAN / SAP / Oracle / Kinaxis / Pulse
- Knowledge of engineering BOMs, product structure, EOL, and ECO management
- Knowledge of the complete planning cycle including MPS, MRP, demand planning, materials planning, and production planning
- Communication: Communication, both verbal and written, is an important part of this role. The job holder is required to exchange information, ideas and views on business-related matters concerning the Planning function, throughout the Company at all levels.
- Innovation: The jobholder is required to show a willingness to question traditional methodology and make recommendations on new ways of approaching problems and improving existing processes.

Here are a few examples of what you will get for the great work you provide:
- Health Insurance
- PTO

#RA01
Flex is an Equal Opportunity Employer and employment selection decisions are based on merit, qualifications, and abilities. We celebrate diversity and do not discriminate based on: age, race, religion, color, sex, national origin, marital status, sexual orientation, gender identity, veteran status, disability, pregnancy status, or any other status protected by law. We're happy to provide reasonable accommodations to those with a disability for assistance in the application process. Please email accessibility@flex.com and we'll discuss your specific situation and next steps (NOTE: this email does not accept or consider resumes or applications. This is only for disability assistance. To be considered for a position at Flex, you must complete the application process first).

Posted 1 month ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

TCS HIRING!!
Role: AWS Data Architect
Location: Hyderabad
Years of experience: 8+ years

Data Architect

Must have:
- Relational SQL / caching expertise: deep knowledge of Amazon Aurora PostgreSQL, ElastiCache, etc.
- Data modeling: experience in OLTP and OLAP schemas, normalization, denormalization, indexing, and partitioning.
- Schema design & migration: defining best practices for schema evolution when migrating from SQL Server to PostgreSQL.
- Data governance: designing data lifecycle policies, archival strategies, and regulatory compliance frameworks.
- AWS Glue & AWS DMS: leading data migration strategies to Aurora PostgreSQL.
- ETL & data pipelines: expertise in Extract, Transform, Load (ETL) workflows, Glue job features and event-driven architectures.
- Data transformation & mapping: PostgreSQL PL/pgSQL migration/transformation expertise while ensuring data integrity.
- Cross-platform data integration: connecting cloud and on-premises / other cloud data sources.
- AWS data services: strong experience in S3, Glue, Lambda, Redshift, Athena, and Kinesis.
- Infrastructure as Code (IaC): using Terraform, CloudFormation, or AWS CDK for database provisioning.
- Security & compliance: implementing IAM, encryption (AWS KMS), access control policies, and compliance frameworks (e.g. GDPR, PII).
- Query tuning & indexing strategies: optimizing queries for high performance.
- Capacity planning & scaling: ensuring high availability, failover mechanisms, and auto-scaling strategies.
- Data partitioning & storage optimization: designing cost-efficient hot/cold data storage policies.
- Experience with setting up the AWS architecture as per the project requirements.

Good to have:
- Data warehousing: expertise in Amazon Redshift, Snowflake, or BigQuery.
- Big data processing: familiarity with Apache Spark, EMR, Hadoop, or Kinesis.
- Data lakes & analytics: experience in AWS Lake Formation, Glue Catalog, and Athena.
- Machine learning pipelines: understanding of SageMaker, Bedrock, etc. for AI-driven analytics.
- CI/CD for data pipelines: knowledge of AWS CodePipeline, Jenkins, or GitHub Actions.
- Serverless data architectures: experience with event-driven systems (SNS, SQS, Step Functions).
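
As a small, hedged illustration of the Glue-driven pipeline and automation work this role touches on, the sketch below triggers an AWS Glue ETL job with boto3 and polls until it finishes. The job name, argument, and region are placeholders invented for the example; boto3's start_job_run and get_job_run are standard Glue API calls.

```python
# Hypothetical sketch: kick off a Glue ETL job and wait for a terminal state,
# the kind of step an architect might wire into a migration runbook.
import time
import boto3

glue = boto3.client("glue", region_name="ap-south-1")
JOB_NAME = "stage-orders-to-aurora"          # placeholder Glue job name

run_id = glue.start_job_run(
    JobName=JOB_NAME,
    Arguments={"--target_schema": "public"},  # placeholder job argument
)["JobRunId"]

while True:
    state = glue.get_job_run(JobName=JOB_NAME, RunId=run_id)["JobRun"]["JobRunState"]
    print("job state:", state)
    if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
        break
    time.sleep(30)
```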

Posted 1 month ago

Apply

1.0 - 3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

The impact that you will be making
The role would require you to develop sophisticated products that add value to the client and result in new projects and revenue streams.

What this role entails
- Design, develop, and optimize SQL databases, tables, views, and stored procedures to meet business requirements and performance goals.
- Write efficient and high-performing SQL queries to retrieve, manipulate, and analyze data.
- Ensure data integrity, accuracy, and security through regular monitoring, backups, and data cleansing activities.
- Identify and resolve database performance bottlenecks, optimizing queries and database configurations.
- Investigate and resolve database-related issues, including errors, connectivity problems, and data inconsistencies.
- Collaborate with cross-functional teams, including data analysts, software developers, and business analysts, to support data-driven decision-making.
- Maintain comprehensive documentation of database schemas, processes, and procedures.
- Implement and maintain security measures to protect sensitive data and ensure compliance with data protection regulations.
- Assist in planning and executing database upgrades and migrations.

What lands you in the role
- 1-3 years of relevant work experience as a SQL Developer or in a similar role.
- Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent work experience).
- Proficiency in SQL, including T-SQL for Microsoft SQL Server or PL/SQL for Oracle.
- Strong knowledge of database design principles, normalization, and indexing.
- Experience with database performance tuning and optimization techniques.
- Familiarity with data modeling tools and techniques.
- Understanding of data warehousing concepts is a plus.
- Excellent problem-solving skills and attention to detail.
- Strong communication and teamwork abilities.
- Ability to work independently and manage multiple tasks simultaneously.
- Certifications in database management (e.g., Microsoft Certified: Azure Database Administrator Associate) are a plus.

What we offer
- An opportunity to be part of some of the best enterprise SaaS products built out of India.
- Opportunities to quench your thirst for problem-solving, experimenting, learning, and implementing innovative solutions.
- A flat, collegial work environment, with a work hard, play hard attitude.
- A platform for rapid growth if you are willing to try new things without fear of failure.
- Remuneration at best-in-class industry standards with generous health insurance cover.

About Impact Analytics
Impact Analytics™ (Series D funded) delivers AI-native SaaS solutions and consulting services that help companies maximize profitability and customer satisfaction through deeper data insights and predictive analytics. With a fully integrated, end-to-end platform for planning, forecasting, merchandising, pricing, and promotions, Impact Analytics empowers companies to make smarter decisions based on real-time insights rather than relying on last year's inputs to forecast and plan this year's business. Powered by over one million machine learning models, Impact Analytics has been leading AI innovation for a decade, setting new benchmarks in forecasting, planning, and operational excellence across the retail, grocery, manufacturing, and CPG sectors. In 2025, Impact Analytics is at the forefront of the Agentic AI revolution, delivering autonomous solutions that enable businesses to adapt in real time, optimize operations, and drive profitability without manual intervention. Here's a link to our website: www.impactanalytics.co

Some of our accolades include:
- Ranked as one of America's Fastest-Growing Companies by the Financial Times for five consecutive years: 2020-2024.
- Ranked as one of America's Fastest-Growing Private Companies by Inc. 5000 for seven consecutive years: 2018-2024.
- Voted #1 by more than 300 retailers worldwide in the RIS Software LeaderBoard 2024 report.
- Ranked #72 in America's Most Innovative Companies list in 2023 by Fortune, alongside companies like Microsoft, Tesla, Apple, and IBM.
- Forged a strategic partnership with Google to equip retailers with cutting-edge generative AI tools.
- Recognized in multiple Gartner reports, including Market Guides and Hype Cycle, spanning assortments, merchandising, forecasting, algorithmic retailing, and Unified Price, Promotion, and Markdown Optimization Applications.

Posted 1 month ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

We are seeking an experienced MySQL Database Administrator (DBA) to join our tech team in Hyderabad. The ideal candidate will be responsible for managing high-performance MySQL environments, with a strong focus on schema design, clustering, replication, and high availability. Expertise in MySQL schema design is critical to ensure data integrity, optimal performance, and future scalability. The role requires a deep understanding of indexing strategies, normalization principles, and advanced query optimization techniques to maintain fast and efficient database responses. Additional responsibilities include implementing partitioning strategies and collaborating closely with our development and DevOps teams to ensure our systems are scalable, secure, and highly available.

Key Responsibilities
• Design, deploy, and maintain MySQL databases with clustering and replication.
• Configure and manage MySQL clusters (e.g., InnoDB Cluster, Galera Cluster) for high availability.
• Monitor performance and proactively resolve bottlenecks or issues.
• Set up and manage automated backups, failover, and disaster recovery plans.
• Optimize SQL queries, indexes, and schema designs for performance.
• Collaborate with developers to support efficient data structures and queries.
• Ensure database security, data integrity, and compliance with best practices.
• Maintain clear documentation of configurations, procedures, and topologies.

Required Qualifications
• Bachelor's degree in Computer Science, Information Technology, or a related field.
• 3+ years of hands-on experience as a MySQL DBA, including clustering (InnoDB or Galera).
• Deep understanding of MySQL replication (master-slave, master-master).
• Strong SQL skills and experience with stored procedures, triggers, and indexing.
• Experience with performance tuning and query optimization.
• Familiarity with Linux environments and scripting (Bash or Python).
• Experience with cloud-hosted MySQL (AWS RDS, Azure Database for MySQL, etc.).

Preferred Skills
• Experience with monitoring tools (e.g., Percona Toolkit, PMM, Nagios, Zabbix).
• Familiarity with Laravel or Node.js backend integrations.
• Exposure to DevOps tools like Ansible, Docker, or Kubernetes.
• Understanding of NoSQL or caching solutions (e.g., Redis, Memcached).

Why Join Us in Hyderabad?
• Work on exciting projects with modern, scalable infrastructure.
• Join a collaborative and innovative tech culture.
• Enjoy a flexible, growth-focused work environment.
• Competitive salary and growth opportunities in a fast-growing organization.
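
To illustrate the replication-monitoring side of this role, here is a hedged Python sketch that checks replica health with mysql-connector-python. The hostname and credentials are placeholders; SHOW REPLICA STATUS and its column names apply to MySQL 8.0.22 and later (older servers use SHOW SLAVE STATUS with the corresponding older column names).

```python
# Hypothetical sketch: report replication thread health and lag on a replica,
# the kind of check a DBA might run from cron or a monitoring agent.
import mysql.connector

conn = mysql.connector.connect(
    host="replica-1.example.internal",   # placeholder replica host
    user="monitor",
    password="change-me",
)
cur = conn.cursor(dictionary=True)

cur.execute("SHOW REPLICA STATUS")
row = cur.fetchone()

if row is None:
    print("not configured as a replica")
else:
    lag = row.get("Seconds_Behind_Source")
    io_ok = row.get("Replica_IO_Running") == "Yes"
    sql_ok = row.get("Replica_SQL_Running") == "Yes"
    print(f"io_thread={io_ok} sql_thread={sql_ok} lag_seconds={lag}")

cur.close()
conn.close()
```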

Posted 1 month ago

Apply

5.0 years

0 Lacs

Jaipur, Rajasthan, India

Remote

Project Role: Infra Tech Support Practitioner
Project Role Description: Provide ongoing technical support and maintenance of production and development systems and software products (both remote and onsite) and for configured services running on various platforms (operating within a defined operating model and processes). Provide hardware/software support and implement technology at the operating-system level across all server and network areas, and for particular software solutions/vendors/brands. Work includes L1 and L2 / basic and intermediate level troubleshooting.
Must have skills: Kubernetes
Good to have skills: Microsoft Azure Container Infrastructure, DevOps
Minimum 5 years of experience is required
Educational Qualification: 15 years of full-time education

Summary
As an Infra Tech Support Practitioner, you will provide ongoing technical support and maintenance for production and development systems and software products. You will be responsible for configuring services on various platforms and implementing technology at the operating-system level. Your role will involve troubleshooting at both basic and intermediate levels, ensuring smooth operations and resolving any issues that arise.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Ensure smooth operations and maintenance of production and development systems.
- Configure services on various platforms.
- Implement technology at the operating-system level.
- Troubleshoot issues at both basic and intermediate levels.

Professional & Technical Skills:
- Must-have skills: Proficiency in Kubernetes.
- Good-to-have skills: Experience with DevOps, Microsoft Azure Container Infrastructure.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Kubernetes.
- This position is based at our Gurugram office.
- 15 years of full-time education is required.
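
As a small illustration of the day-to-day Kubernetes troubleshooting the role mentions, here is a hedged Python sketch that lists pods not in a healthy phase. It assumes the official kubernetes Python client and a kubeconfig available to the machine running it; nothing here is prescribed by the posting.

```python
# Hypothetical sketch: flag pods that are not Running/Succeeded, a first
# triage step during L1/L2 troubleshooting.
from kubernetes import client, config

config.load_kube_config()   # or config.load_incluster_config() inside a pod
v1 = client.CoreV1Api()

HEALTHY = {"Running", "Succeeded"}

for pod in v1.list_pod_for_all_namespaces(watch=False).items:
    phase = pod.status.phase
    if phase not in HEALTHY:
        restarts = sum(cs.restart_count for cs in (pod.status.container_statuses or []))
        print(f"{pod.metadata.namespace}/{pod.metadata.name}: phase={phase} restarts={restarts}")
```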

Posted 1 month ago

Apply

2.0 - 4.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Job Description
The ideal candidate must possess strong communication skills, with an ability to listen and comprehend information and share it with all the key stakeholders, highlighting opportunities for improvement and concerns, if any. He/she must be able to work collaboratively with teams to execute tasks within defined timeframes while maintaining high-quality standards and superior service levels. The ability to take proactive actions and willingness to take up responsibility beyond the assigned work area is a plus.

Senior Analyst – Roles & Responsibilities
Responsible for managing multiple Middle Office and Regulatory support processes
Independently managing middle/back-office operations for investment banks
Relevant work experience in a similar domain (Swap Data Repository reconciliation) and team management skills are preferable
Build domain expertise across processes supported, assist in speedy remediation of issues, implement process improvements and build/enhance controls to prevent future escalations
Participate in Middle Office/group-level initiatives
Responsible for ensuring change management and process documentation are maintained in an updated fashion at all times
Review and analyze trade data between risk and finance systems
Investigate genuine breaks for root cause and facilitate resolution and decision support, wherever necessary
Provide support on change and new business requests received from various RFDAR/non-RFDAR teams by assessing the business requirements, performing testing and providing SME support
Apply data normalization methods such as filtering, standardization, enrichment and aggregation
Create reports/metrics/analysis to cover daily/weekly/monthly requests
Mailbox management / queue management
Build domain expertise

Functional & Technical Skills
Bachelor's degree (B.Com/BBM) or Master's degree (M.Com/MBA/PGDM)
2 to 4 years of experience in Confirmations, Portfolio Management, Settlements or Equity
Should have basic knowledge of finance, the trade life cycle, investment banking, and derivatives
High levels of energy, enthusiasm, commitment and productivity; proactive, effective influencer, result-oriented
Should be good with logical and quantitative abilities to derive information from data
Time management and ability to resolve issues speedily
Above average in planning, organizing and time management

About Us
At eClerx, we serve some of the largest global companies – 50 of the Fortune 500 clients. Our clients call upon us to solve their most complex problems and deliver transformative insights. Across roles and levels, you get the opportunity to build expertise, challenge the status quo, think bolder, and help our clients seize value.

About The Team
eClerx is a global leader in productized services, bringing together people, technology and domain expertise to amplify business results. Our mission is to set the benchmark for client service and success in our industry. Our vision is to be the innovation partner of choice for technology, data analytics and process management services. Since our inception in 2000, we've partnered with top companies across various industries, including financial services, telecommunications, retail, and high-tech. Our innovative solutions and domain expertise help businesses optimize operations, improve efficiency, and drive growth. With over 18,000 employees worldwide, eClerx is dedicated to delivering excellence through smart automation and data-driven insights.
At eClerx, we believe in nurturing talent and providing hands-on experience.
eClerx is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability or protected veteran status, or any other legally protected basis, per applicable law.
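
The Senior Analyst role above involves reviewing trade data between risk and finance systems, investigating breaks, and applying normalization steps such as filtering, standardization and aggregation. As a hedged, illustrative sketch only — the column names, tolerance, and file sources are assumptions, not eClerx's actual process — the pandas snippet below shows how such a reconciliation might surface breaks.

```python
# Illustrative trade reconciliation between two systems (assumed CSV extracts).
import pandas as pd

def reconcile(risk_path: str, finance_path: str, tolerance: float = 0.01) -> pd.DataFrame:
    risk = pd.read_csv(risk_path)
    finance = pd.read_csv(finance_path)

    # Standardization: normalize identifiers and currency codes before matching.
    for df in (risk, finance):
        df["trade_id"] = df["trade_id"].astype(str).str.strip().str.upper()
        df["currency"] = df["currency"].str.upper()

    merged = risk.merge(
        finance, on="trade_id", how="outer", suffixes=("_risk", "_fin"), indicator=True
    )

    # Breaks: trades present in only one system, or notional differences beyond tolerance.
    missing = merged[merged["_merge"] != "both"].copy()
    missing["break_type"] = "missing_in_" + missing["_merge"].map(
        {"left_only": "finance", "right_only": "risk"}
    )

    matched = merged[merged["_merge"] == "both"].copy()
    matched["notional_diff"] = (matched["notional_risk"] - matched["notional_fin"]).abs()
    mismatched = matched[matched["notional_diff"] > tolerance].copy()
    mismatched["break_type"] = "notional_mismatch"

    return pd.concat([missing, mismatched], ignore_index=True)

if __name__ == "__main__":
    breaks = reconcile("risk_trades.csv", "finance_trades.csv")
    print(breaks.groupby("break_type").size())  # simple daily break metric
```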

Posted 1 month ago

Apply

12.0 years

0 Lacs

Pune, Maharashtra, India

On-site

As a Staff Software Engineer – Business Intelligence with expertise in the business intelligence field, utilizing Tableau Desktop, Tableau Server, Tableau Prep and Tableau Cloud, you will be responsible for overseeing the development and utilization of data systems. You will report to the Sr. Manager – Data Engineering and join our dynamic team in the foreign exchange payments processing industry. This role focuses on data analytics and business intelligence and is responsible for architecting, designing, developing, and maintaining interactive and insightful data visualizations using Tableau. The role requires a deep understanding of business processes, technology, data management, and regulatory compliance. The successful candidate will work closely with business and IT leaders to ensure that the enterprise data architecture supports business goals and that data governance policies and standards are adhered to across the organization.

Your responsibilities will include working closely with data engineers, analysts, cross-functional teams, and other stakeholders to ensure that our data platform meets the needs of our organization and supports our data-driven initiatives. They also include building a new data platform, integrating data from various sources, and ensuring data availability for various application and reporting needs. Additionally, the candidate should have experience working with AI/ML technologies and collaborating with data scientists to meet their data requirements.

In your role as a Staff Software Engineer – Business Intelligence, you will:
Define and evolve the technical vision and architecture for analytics, BI, and reporting across different data models and visualizations that support use cases across different products or domains
Drive the roadmap for BI modernization with a focus on scalability, performance, and self-service capabilities
Collaborate with business stakeholders to understand their data visualization needs and business requirements and translate them into effective dashboards and reports
Architect, design, develop, and maintain visually appealing, interactive dashboards that provide meaningful and actionable insights
Design, build, and launch collections of sophisticated data models and visualizations that support use cases across different products or domains
Manage workbook publishing and data refresh schedules to ensure data is updated and relevant for stakeholders
Optimize Tableau dashboards for optimal performance, including improvements in load times and responsiveness, as well as effective handling of large datasets
Develop, monitor, and maintain ETL processes for automated, scheduled data refreshes, collaborating with team members to resolve issues related to source system extracts, mapping, and data loading in a timely and efficient manner
Cleanse and prepare data for analysis and reporting to ensure data accuracy, integrity, and relevance
Conduct training sessions and support end-users in the use and interpretation of data visualizations

A successful candidate for this position should have:
Bachelor's degree, preferably in Computer Science, Information Technology, Computer Engineering, Business Analytics or a related IT discipline, or equivalent experience
12+ years of experience in the business intelligence field, utilizing Tableau Desktop, Tableau Server, Tableau Prep and Tableau Cloud
Hands-on experience in data engineering, BI, or analytics, including at least 5+ years in a lead or principal role
Extensive experience as a Tableau developer
Extensive experience with MS SQL Server and relational database management systems, including data modeling and normalization
Proficiency in writing complex statements, scripts, stored procedures, triggers, and views to implement business intelligence and data warehousing solutions
Experience working with ETL
Ability to take ownership of projects and independently drive them from concept to completion, delivering high-quality results on time
Strong understanding of data visualization principles
Excellent analytical and problem-solving skills
Ability to communicate complex data insights to non-technical stakeholders
Experience in performance tuning and optimization of Tableau solutions
Expertise in cloud platforms (e.g., Google Cloud Platform, AWS, or Azure)
Experience with visualization tools such as Power BI
Professional certification in Tableau
Candidates with banking domain experience will be highly preferred.

About Convera
Convera is the largest non-bank B2B cross-border payments company in the world. Formerly Western Union Business Solutions, we leverage decades of industry expertise and technology-led payment solutions to deliver smarter money movements to our customers – helping them capture more value with every transaction. Convera serves more than 30,000 customers ranging from small business owners to enterprise treasurers to educational institutions to financial institutions to law firms to NGOs. Our teams care deeply about the value we bring to our customers, which makes Convera a rewarding place to work. This is an exciting time for our organization as we build our team with growth-minded, results-oriented people who are looking to move fast in an innovative environment. As a truly global company with employees in over 20 countries, we are passionate about diversity; we seek and celebrate people from different backgrounds, lifestyles, and unique points of view. We want to work with the best people and ensure we foster a culture of inclusion and belonging.

We offer an abundance of competitive perks and benefits including:
Competitive salary
Opportunity to earn an annual bonus
Great career growth and development opportunities in a global organization
A flexible approach to work

There are plenty of amazing opportunities at Convera for talented, creative problem solvers who never settle for good enough and are looking to transform Business to Business payments. Apply now if you’re ready to unleash your potential.
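
The BI role above includes managing workbook publishing and data refresh schedules on Tableau Server. As a hedged, illustrative sketch only — the server URL, credentials, site, and 24-hour freshness target are assumptions, and this is not presented as Convera's tooling — the snippet below uses the tableauserverclient library to flag workbooks that have not been refreshed recently.

```python
# Illustrative freshness check for published workbooks on Tableau Server.
# pip install tableauserverclient
from datetime import datetime, timedelta
import tableauserverclient as TSC

SERVER_URL = "https://tableau.example.com"  # hypothetical server
MAX_AGE = timedelta(hours=24)               # assumed freshness target for daily refreshes

def report_stale_workbooks(username: str, password: str) -> None:
    auth = TSC.TableauAuth(username, password, site_id="")  # default site assumed
    server = TSC.Server(SERVER_URL, use_server_version=True)
    with server.auth.sign_in(auth):
        workbooks, _ = server.workbooks.get()  # first page of results only, for brevity
        now = datetime.utcnow()
        for wb in workbooks:
            # updated_at is reported in UTC; drop tzinfo so naive datetimes compare cleanly.
            updated = wb.updated_at.replace(tzinfo=None)
            age = now - updated
            if age > MAX_AGE:
                print(f"{wb.name}: last refreshed {updated:%Y-%m-%d %H:%M} UTC, {age} ago")

if __name__ == "__main__":
    report_stale_workbooks("bi_admin", "***")
```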

Posted 1 month ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Manager

Job Description & Summary
A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities
Experience in designing and implementing the ELT architecture to build a data warehouse, including source-to-staging and staging-to-target mapping design
Experience in configuring the Master Repository, Work Repository, Projects, Models, Sources, Targets, Packages, Knowledge Modules, Mappings, Scenarios, Load Plans, and Metadata
Experience in creating database connections and physical and logical schemas using the Topology Manager
Experience in creation of packages, construction of data warehouses and data marts, and synchronization using ODI
Experience in architecting data-related solutions, developing data warehouses, developing ELT/ETL jobs, performance tuning and identifying bottlenecks in the process flow
Experience using Dimensional Data Modeling, Star Schema modeling and Snowflake modeling
Experience using Normalization, Fact and Dimension tables, and Physical and Logical Data Modeling
Good knowledge of Oracle Cloud services and database options
Strong Oracle SQL expertise using tools such as SQL Developer
Understanding of ERP modules is good to have

Mandatory Skill Sets: ODI, OAC
Preferred Skill Sets: ODI, OAC
Years of Experience Required: 8 - 10
Education Qualification: B.Tech / M.Tech / MBA / MCA

Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Master of Business Administration, Master of Engineering, Bachelor of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Oracle Data Integrator (ODI)
Optional Skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis {+ 11 more}
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date
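
The Manager role above covers ELT design with ODI and dimensional modeling (star and snowflake schemas, fact and dimension tables). As a hedged illustration only — the table and column names are invented for the example and do not come from the posting — the Python sketch below splits a flat sales extract into a dimension table with surrogate keys and a fact table, the basic move behind a star schema.

```python
# Illustrative star-schema split: one flat extract -> customer dimension + sales fact.
import pandas as pd

# Assumed flat staging extract (column names are hypothetical).
staging = pd.DataFrame({
    "order_id": [1001, 1002, 1003],
    "customer_name": ["Acme Ltd", "Globex", "Acme Ltd"],
    "customer_country": ["IN", "US", "IN"],
    "order_date": ["2024-04-01", "2024-04-02", "2024-04-03"],
    "amount": [2500.0, 4100.0, 990.0],
})

# Dimension: one row per distinct customer, with a surrogate key.
dim_customer = (
    staging[["customer_name", "customer_country"]]
    .drop_duplicates()
    .reset_index(drop=True)
)
dim_customer["customer_key"] = dim_customer.index + 1

# Fact: measures plus foreign keys to the dimension (descriptive attributes dropped).
fact_sales = staging.merge(dim_customer, on=["customer_name", "customer_country"])[
    ["order_id", "customer_key", "order_date", "amount"]
]

print(dim_customer)
print(fact_sales)
```

In an ODI implementation the same source-to-target logic would live in mappings and knowledge modules rather than pandas; the sketch only makes the modeling idea concrete.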

Posted 1 month ago

Apply

4.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Role Description
We are seeking a skilled SQL Developer with a strong background in writing and optimizing stored procedures (SPs) to join our data team. The ideal candidate will be responsible for developing, maintaining, and fine-tuning SQL queries and procedures to support business applications and analytics functions.

Key Responsibilities
Design, develop, and maintain complex SQL queries, stored procedures, functions, and triggers.
Optimize existing SQL code for performance, scalability, and reliability.
Collaborate with application developers, data analysts, and business stakeholders to gather and understand data requirements.
Ensure data integrity, consistency, and security across various systems.
Perform ETL (Extract, Transform, Load) tasks to support data integration and migration efforts.
Troubleshoot and resolve database-related issues, including performance bottlenecks.
Document database structures, procedures, and development processes.

Required Qualifications
4+ years of hands-on experience in SQL development.
Strong expertise in writing and optimizing stored procedures in SQL Server, Oracle, or MySQL.
Proficiency in database design, normalization, and indexing strategies.
Experience in performance tuning and query optimization techniques.
Familiarity with ETL tools and data integration best practices.

Skills: SQL, stored procedures
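
The role above centres on writing and calling stored procedures. Purely as a hedged, illustrative sketch — the schema, procedure name, and MySQL choice are assumptions for the example, not details from the posting — the snippet below creates a simple MySQL stored procedure from Python and invokes it with mysql-connector-python.

```python
# Illustrative creation and call of a simple MySQL stored procedure from Python.
import mysql.connector

DDL = """
CREATE PROCEDURE get_orders_by_status(IN p_status VARCHAR(20))
BEGIN
    SELECT order_id, customer_id, amount
    FROM orders
    WHERE status = p_status;   -- assumes an index on status for efficient lookups
END
"""

conn = mysql.connector.connect(host="localhost", user="dev", password="***", database="shop")
cur = conn.cursor()

cur.execute("DROP PROCEDURE IF EXISTS get_orders_by_status")
cur.execute(DDL)

# callproc runs the procedure; its result sets are exposed via stored_results().
cur.callproc("get_orders_by_status", ("SHIPPED",))
for result in cur.stored_results():
    for row in result.fetchall():
        print(row)

conn.close()
```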

Posted 1 month ago

Apply

4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Requirements
4 years of implementation/development experience in ServiceNow SAM/HAM/SAM Pro
Experience with SAM and HAM implementation and customization
Good understanding of SAM Pro architecture, with hands-on experience in software entitlement data loads, Intune integration, license reconciliation, direct integration profiles, and software catalog development
Experience with model normalization and with customizing HAM workflows such as hardware asset refresh orders, reclamation orders, loaner orders, transfer orders and disposal orders
Software asset management
Hardware asset management
ServiceNow ITAM
Hands-on experience with ITAM tools and platforms, preferably ServiceNow, having done SAM table installs, data analysis, entitlement-to-deployment mapping, consumption analysis and creation of the final entitlement license position
In-depth understanding of the end-to-end Software Asset Management process framework, comprising the SAM tool and its interaction with contracts, procurement and publisher owner teams
Understanding of CMDB and Discovery processes in relation to the SAM processes
SAM expertise: license compliance, software lifecycle management, software compliance audits, software optimization
Able to provide thought leadership to the client on SAM practices from an industry best practices perspective
(ref:hirist.tech)
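
The SAM/HAM role above revolves around entitlement and asset data held in ServiceNow tables. As an illustrative, hedged sketch only, the snippet below pulls rows over ServiceNow's standard Table API (/api/now/table/{table_name}); the instance URL, credentials, the choice of the alm_hardware table, and the install_status filter are assumptions for the example, not details from the posting.

```python
# Illustrative read of hardware asset records via the ServiceNow Table API.
# pip install requests
import requests

INSTANCE = "https://dev12345.service-now.com"   # hypothetical instance
TABLE = "alm_hardware"                           # hardware asset table (assumed for the example)

def fetch_assets(user: str, password: str, limit: int = 10) -> list[dict]:
    url = f"{INSTANCE}/api/now/table/{TABLE}"
    params = {
        "sysparm_limit": limit,
        "sysparm_query": "install_status=1",        # assumed filter for in-use assets
        "sysparm_fields": "asset_tag,display_name,install_status",
    }
    resp = requests.get(
        url,
        params=params,
        auth=(user, password),
        headers={"Accept": "application/json"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["result"]

if __name__ == "__main__":
    for asset in fetch_assets("sam.integration", "***"):
        print(asset["asset_tag"], asset["display_name"])
```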

Posted 1 month ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Description
We have an opportunity to impact your career and provide an adventure where you can push the limits of what's possible.

As a Lead Software Engineer at JPMorgan Chase within the Commercial and Investment Bank - Commercial Cards team, you are an integral part of an agile team that works to enhance, build, and deliver trusted market-leading technology products in a secure, stable, and scalable way. As a core technical contributor, you are responsible for conducting critical technology solutions across multiple technical areas within various business functions in support of the firm’s business objectives. You will be responsible for designing, developing, and maintaining data models that support the organization's data architecture and business intelligence initiatives. This role involves working closely with data architects, business analysts, and other stakeholders to ensure that data models meet business requirements and are aligned with industry best practices.

Job Responsibilities
Designs and develops conceptual, logical, and physical data models to support data integration, data warehousing, and business intelligence solutions.
Collaborates with business analysts and stakeholders to gather and understand data requirements and translate them into data models.
Ensures data models are optimized for performance, scalability, and maintainability.
Works with data architects to ensure data models align with the overall data architecture and strategy.
Develops and maintains data dictionaries, metadata repositories, and data lineage documentation.
Conducts data model reviews and provides recommendations for improvements.
Supports data governance initiatives by ensuring data models adhere to data quality and data management standards.
Assists in the development and implementation of data modeling best practices and standards.
Provides support and guidance to development teams during the implementation of data models.
Stays up-to-date with industry trends and advancements in data modeling techniques and tools.

Required Qualifications, Capabilities, And Skills
Formal training or certification on software engineering concepts and 5+ years applied experience
Proven experience as a Data Modeler or in a similar role for more than 5 years
Strong understanding of data modeling concepts, including normalization, denormalization, and dimensional modeling
Proficiency in data modeling tools such as ER/Studio, ERwin, or similar
Experience with relational databases (e.g., Oracle, SQL Server, MySQL) and data warehousing solutions
Knowledge of data integration and ETL processes
Excellent analytical and problem-solving skills
Strong communication and collaboration skills
Ability to work independently and as part of a team
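
The data modeler role above calls out normalization and denormalization as core concepts. Purely as an illustrative sketch — the schema is invented for the example, not taken from the posting — the snippet below uses Python's built-in sqlite3 module to show a small normalized design: repeating customer attributes are moved out of the orders table into their own table and referenced by a foreign key.

```python
# Illustrative normalization: customer attributes factored out of the orders table.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")

conn.executescript("""
CREATE TABLE customer (
    customer_id INTEGER PRIMARY KEY,
    name        TEXT NOT NULL,
    country     TEXT NOT NULL
);

CREATE TABLE orders (
    order_id    INTEGER PRIMARY KEY,
    customer_id INTEGER NOT NULL REFERENCES customer(customer_id),
    order_date  TEXT NOT NULL,
    amount      REAL NOT NULL
);
""")

conn.execute("INSERT INTO customer VALUES (1, 'Acme Ltd', 'IN')")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?, ?)",
    [(1001, 1, "2024-04-01", 2500.0), (1002, 1, "2024-04-03", 990.0)],
)

# A reporting query re-joins (effectively denormalizes) the data for consumption.
for row in conn.execute("""
    SELECT o.order_id, c.name, c.country, o.amount
    FROM orders o JOIN customer c ON c.customer_id = o.customer_id
"""):
    print(row)
```

A denormalized or dimensional model would pre-join such attributes for query speed, which is exactly the trade-off the posting expects candidates to reason about.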

Posted 1 month ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job title: R&D Data Steward Manager Associate
Location: Hyderabad

About The Job
Sanofi is a global life sciences company committed to improving access to healthcare and supporting the people we serve throughout the continuum of care. From prevention to treatment, Sanofi transforms scientific innovation into healthcare solutions, in human vaccines, rare diseases, multiple sclerosis, oncology, immunology, infectious diseases, diabetes and cardiovascular solutions and consumer healthcare. More than 110,000 people in over 100 countries at Sanofi are dedicated to making a difference in patients’ daily lives, wherever they live, and enabling them to enjoy a healthier life.

As a company with a global vision of drug development and a highly regarded corporate culture, Sanofi is recognized as one of the best pharmaceutical companies in the world and is pioneering the application of Artificial Intelligence (AI) with a strong commitment to developing advanced data standards to increase reusability and interoperability and thus accelerate impact on global health. The R&D Data Office serves as a cornerstone to this effort. Our team is responsible for cross-R&D data strategy, governance, and management. We sit in partnership with Business and Digital, and drive data needs across priority and transformative initiatives across R&D. Team members serve as advisors, leaders, and educators to colleagues and data professionals across the R&D value chain.

As an integral team member, you will be responsible for defining how R&D's structured, semi-structured and unstructured data will be stored, consumed, integrated/shared and reported by different end users such as scientists, clinicians, and more. You will also be pivotal in the development of sustainable mechanisms for ensuring data are FAIR (findable, accessible, interoperable, and reusable).

Position Summary
The R&D Data Steward plays a critical role at the intersection between business and data, where stewards guide business teams on how to unlock value from data. This role will drive definition and documentation of R&D data standards in line with enterprise standards. Data stewards play heavily cross-functional roles and must be comfortable with R&D data domains, data policies, and data cataloguing.

Main Responsibilities
Work in collaboration with R&D Data Office leadership (including the Data Capability and Strategy Leads), business, R&D Digital subject matter experts and other partners to:
Understand the data-related needs for various cross-R&D capabilities (e.g., data catalog, master data management) and associated initiatives
Influence, design, and document data governance policies, standards and procedures for R&D data
Drive data standard adoption across capabilities and initiatives; manage and maintain quality and integrity of data via data enrichment activities (e.g., cleansing, validating, enhancing)
Understand and adopt data management tools such as the R&D data catalogue
Develop effective data sharing artifacts for appropriate usage of data across R&D data domains
Ensure the seamless running of data-related activities and verify data standard application from ingest through access
Maintain documentation and act as an expert on data definitions, data flows, legacy data structures, access rights models, etc. for the assigned domain
Oversee data pipeline and availability and escalate issues where they surface; ensure on-schedule/on-time delivery and proactive management of risks/issues
Educate and guide R&D teams on standards and information management principles, methodologies, best practices, etc.
Oversee junior data stewards and/or business analysts based on the complexity or size of initiatives/functions supported

Deliverables
Defines data quality and communication metrics for assigned domains and 1-2 business functions
Implements continuous improvement opportunities such as functional training
Accountable for data quality and data management activities for the assigned domains; facilitates data issue resolution
Defines business terms and data elements (metadata) according to data standards, and ensures standardization/normalization of metadata
Leads working groups to identify data elements, perform root cause and impact analysis, and identify improvements for metadata and data quality
Regularly communicates with other data leads and the expert Data Steward, and escalates issues as appropriate

About You
Experience in Business Data Management, Information Architecture, Technology, or related fields
Experience in bioinformatics is preferred; experience in the life sciences industry or academia is mandatory
Demonstrated ability to understand end-to-end data use and needs
Knowledge of R&D data domains (e.g., across research, clinical, regulatory)
Solid grasp of data governance practices and a track record of implementation
Ability to understand data processes and requirements, particularly in R&D at an enterprise level
Demonstrated strong attention to detail, quality, time management and customer focus
Excellent written and oral communication skills
Strong networking, influencing and negotiating skills and superior problem-solving skills
Demonstrated willingness to make decisions and to take responsibility for them
Excellent interpersonal skills (team player)
People management skills, either in a matrix or direct line function
Familiarity with data management practices and technologies (e.g., Collibra, Informatica); hands-on experience not required
Knowledge of pharma R&D industry regulations and compliance requirements related to data governance
Education: Candidates should have an educational background in biological sciences (biology or chemistry)
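
The steward role above is accountable for defining data quality metrics and standardizing metadata. As a hedged, illustrative sketch only — the dataset, field names and the chosen checks are assumptions, not Sanofi requirements — the snippet below computes two common quality metrics (completeness and validity) for a small tabular extract with pandas.

```python
# Illustrative data quality metrics: completeness and validity per column.
import pandas as pd

# Hypothetical extract of study-sample records.
df = pd.DataFrame({
    "sample_id": ["S-001", "S-002", None, "S-004"],
    "collection_date": ["2024-01-10", "2024-02-31", "2024-03-05", None],  # one invalid date
    "site_code": ["IN01", "IN01", "US07", "XX"],
})

VALID_SITES = {"IN01", "US07"}  # assumed reference list, e.g. maintained in a data catalogue

def quality_report(frame: pd.DataFrame) -> pd.DataFrame:
    completeness = frame.notna().mean()                      # share of non-missing values
    parsed_dates = pd.to_datetime(frame["collection_date"], errors="coerce")
    validity = pd.Series({
        "sample_id": frame["sample_id"].str.match(r"^S-\d{3}$", na=False).mean(),
        "collection_date": parsed_dates.notna().mean(),
        "site_code": frame["site_code"].isin(VALID_SITES).mean(),
    })
    return pd.DataFrame({"completeness": completeness, "validity": validity})

print(quality_report(df).round(2))
```

Metrics like these would typically be tracked per domain and reported to the working groups the posting describes; the code is only meant to make the "data quality metrics" deliverable concrete.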

Posted 1 month ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job title: R&D Data Steward Manager Associate
Location: Hyderabad

About The Job
Sanofi is a global life sciences company committed to improving access to healthcare and supporting the people we serve throughout the continuum of care. From prevention to treatment, Sanofi transforms scientific innovation into healthcare solutions, in human vaccines, rare diseases, multiple sclerosis, oncology, immunology, infectious diseases, diabetes and cardiovascular solutions and consumer healthcare. More than 110,000 people in over 100 countries at Sanofi are dedicated to making a difference in patients’ daily lives, wherever they live, and enabling them to enjoy a healthier life.

As a company with a global vision of drug development and a highly regarded corporate culture, Sanofi is recognized as one of the best pharmaceutical companies in the world and is pioneering the application of Artificial Intelligence (AI) with a strong commitment to developing advanced data standards to increase reusability and interoperability and thus accelerate impact on global health. The R&D Data Office serves as a cornerstone to this effort. Our team is responsible for cross-R&D data strategy, governance, and management. We sit in partnership with Business and Digital, and drive data needs across priority and transformative initiatives across R&D. Team members serve as advisors, leaders, and educators to colleagues and data professionals across the R&D value chain.

As an integral team member, you will be responsible for defining how R&D's structured, semi-structured and unstructured data will be stored, consumed, integrated/shared and reported by different end users such as scientists, clinicians, and more. You will also be pivotal in the development of sustainable mechanisms for ensuring data are FAIR (findable, accessible, interoperable, and reusable).

Position Summary
The R&D Data Steward plays a critical role at the intersection between business and data, where stewards guide business teams on how to unlock value from data. This role will drive definition and documentation of R&D data standards in line with enterprise standards. Data stewards play heavily cross-functional roles and must be comfortable with R&D data domains, data policies, and data cataloguing.

Main Responsibilities
Work in collaboration with R&D Data Office leadership (including the Data Capability and Strategy Leads), business, R&D Digital subject matter experts and other partners to:
Understand the data-related needs for various cross-R&D capabilities (e.g., data catalog, master data management) and associated initiatives
Influence, design, and document data governance policies, standards and procedures for R&D data
Drive data standard adoption across capabilities and initiatives; manage and maintain quality and integrity of data via data enrichment activities (e.g., cleansing, validating, enhancing)
Understand and adopt data management tools such as the R&D data catalogue
Develop effective data sharing artifacts for appropriate usage of data across R&D data domains
Ensure the seamless running of data-related activities and verify data standard application from ingest through access
Maintain documentation and act as an expert on data definitions, data flows, legacy data structures, access rights models, etc. for the assigned domain
Oversee data pipeline and availability and escalate issues where they surface; ensure on-schedule/on-time delivery and proactive management of risks/issues
Educate and guide R&D teams on standards and information management principles, methodologies, best practices, etc.
Oversee junior data stewards and/or business analysts based on the complexity or size of initiatives/functions supported

Deliverables
Defines data quality and communication metrics for assigned domains and 1-2 business functions
Implements continuous improvement opportunities such as functional training
Accountable for data quality and data management activities for the assigned domains; facilitates data issue resolution
Defines business terms and data elements (metadata) according to data standards, and ensures standardization/normalization of metadata
Leads working groups to identify data elements, perform root cause and impact analysis, and identify improvements for metadata and data quality
Regularly communicates with other data leads and the expert Data Steward, and escalates issues as appropriate

About You
Experience in data wrangling, data programming, Business Data Management, Information Architecture, Technology, or related fields
Demonstrated ability to understand end-to-end data use and needs
Experience in CMC (Chemistry, Manufacturing and Controls) in R&D/CRO/pharma data domains (e.g., across research, clinical, regulatory)
Solid grasp of data governance practices and a track record of implementation
Ability to understand data processes and requirements, particularly in R&D at an enterprise level
Demonstrated strong attention to detail, quality, time management and customer focus
Excellent written and oral communication skills
Strong networking, influencing and negotiating skills and superior problem-solving skills
Demonstrated willingness to make decisions and to take responsibility for them
Excellent interpersonal skills (team player)
People management skills, either in a matrix or direct line function
Familiarity with data management practices and technologies (e.g., Collibra, Informatica); hands-on experience not required
Knowledge of pharma R&D industry regulations and compliance requirements related to data governance
Education: Scientific or life sciences background

Posted 1 month ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job title: Data Steward Manager Associate
Location: Hyderabad

About The Job
Sanofi is a global life sciences company committed to improving access to healthcare and supporting the people we serve throughout the continuum of care. From prevention to treatment, Sanofi transforms scientific innovation into healthcare solutions, in human vaccines, rare diseases, multiple sclerosis, oncology, immunology, infectious diseases, diabetes and cardiovascular solutions and consumer healthcare. More than 110,000 people in over 100 countries at Sanofi are dedicated to making a difference in patients’ daily lives, wherever they live, and enabling them to enjoy a healthier life.

As a company with a global vision of drug development and a highly regarded corporate culture, Sanofi is recognized as one of the best pharmaceutical companies in the world and is pioneering the application of Artificial Intelligence (AI) with a strong commitment to developing advanced data standards to increase reusability and interoperability and thus accelerate impact on global health. The R&D Data Office serves as a cornerstone to this effort. Our team is responsible for cross-R&D data strategy, governance, and management. We sit in partnership with Business and Digital, and drive data needs across priority and transformative initiatives across R&D. Team members serve as advisors, leaders, and educators to colleagues and data professionals across the R&D value chain.

As an integral team member, you will be responsible for defining how R&D's structured, semi-structured and unstructured data will be stored, consumed, integrated/shared and reported by different end users such as scientists, clinicians, and more. You will also be pivotal in the development of sustainable mechanisms for ensuring data are FAIR (findable, accessible, interoperable, and reusable).

Position Summary
The R&D Data Steward plays a critical role at the intersection between business and data, where stewards guide business teams on how to unlock value from data. This role will drive definition and documentation of R&D data standards in line with enterprise standards. Data stewards play heavily cross-functional roles and must be comfortable with R&D data domains, data policies, and data cataloguing.

Main Responsibilities
Work in collaboration with R&D Data Office leadership (including the Data Capability and Strategy Leads), business, R&D Digital subject matter experts and other partners to:
Understand the data-related needs for various cross-R&D capabilities (e.g., data catalog, master data management) and associated initiatives
Influence, design, and document data governance policies, standards and procedures for R&D data
Drive data standard adoption across capabilities and initiatives; manage and maintain quality and integrity of data via data enrichment activities (e.g., cleansing, validating, enhancing)
Understand and adopt data management tools such as the R&D data catalogue
Develop effective data sharing artifacts for appropriate usage of data across R&D data domains
Ensure the seamless running of data-related activities and verify data standard application from ingest through access
Maintain documentation and act as an expert on data definitions, data flows, legacy data structures, access rights models, etc. for the assigned domain
Oversee data pipeline and availability and escalate issues where they surface; ensure on-schedule/on-time delivery and proactive management of risks/issues
Educate and guide R&D teams on standards and information management principles, methodologies, best practices, etc.
Oversee junior data stewards and/or business analysts based on the complexity or size of initiatives/functions supported

Deliverables
Defines data quality and communication metrics for assigned domains and 1-2 business functions
Implements continuous improvement opportunities such as functional training
Accountable for data quality and data management activities for the assigned domains; facilitates data issue resolution
Defines business terms and data elements (metadata) according to data standards, and ensures standardization/normalization of metadata
Leads working groups to identify data elements, perform root cause and impact analysis, and identify improvements for metadata and data quality
Regularly communicates with other data leads and the expert Data Steward, and escalates issues as appropriate

About You
Experience in data cataloging, data governance, data analysis, data quality, metadata, Business Data Management, Information Architecture, Technology, or related fields
Demonstrated ability to understand end-to-end data use and needs
Knowledge of R&D data domains (e.g., across research, clinical, regulatory)
Solid grasp of data governance practices and a track record of implementation
Ability to understand data processes and requirements, particularly in R&D at an enterprise level
Demonstrated strong attention to detail, quality, time management and customer focus
Excellent written and oral communication skills
Strong networking, influencing and negotiating skills and superior problem-solving skills
Demonstrated willingness to make decisions and to take responsibility for them
Excellent interpersonal skills (team player)
People management skills, either in a matrix or direct line function
Familiarity with data management practices and technologies (e.g., Collibra, Informatica); hands-on experience not required
Knowledge of pharma R&D industry regulations and compliance requirements related to data governance
Education: Bachelor's in Computer Science, Business, Engineering, or Information Technology

Posted 1 month ago

Apply

4.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Essential Duties And Responsibilities

Accounting and Closing:
Coordinate with sites across India to ensure monthly closing is completed and validated within the timeline
Provide guidelines/coaching in relation to reporting requirements when sites have a new business module or key business changes; monitor the implementation as well as continuous improvements
Lead accounting/reporting consistency implementation with guidance from Corporate or regional finance leaders, or self-initiated per continuous improvement needs, including data validation, data mapping, training, developing canned reports, designing normalization solutions and monitoring the implementation

Finance Planning, Forecast, Reporting and Analysis:
Prepare and submit flash reports to the global leadership team and regional leadership team
Coordinate financial review meetings; work with each BU to deliver accurate reporting and financial forecasts
Prepare consolidated regular (monthly) financial reports, analysis and bridges
Provide management with insights into drivers of revenue, risks, and opportunities, as well as variance analysis of actuals, prior year and forecast
Establish monthly/quarterly business review material templates for sites and support material preparation for the regional leadership team
Establish monthly KPI tracker/dashboard reports for the regional and country teams
Improve the current process for reporting and analysis, adopt new tools, and automate/reduce manual processes
Coordinate strategic plans and lead the annual budget/quarterly forecast process, including but not limited to the following:
Design the budget and forecast schedule for the whole region, establish templates, publish key assumptions, determine and submit accounting change proposals, collect and consolidate sites' submissions, prepare overlay loading, and analyze and comment on the reporting of preliminary and final packs
Support functional leaders, coordinate functional cost forecasting, budgeting, actual reporting and analysis at the India level, design customized reports, and lead harmonization projects requested by functional leaders
Complete budget/forecast packs and key P&L component analysis (pricing/FX/mix/IC markup elimination/inflation/CI/SG&A) per the Corporate timeline
Coordinate and consolidate daily and weekly sales reports for India
Provide administration support in relation to BI to India Finance users and provide trainings as necessary
Coordinate and consolidate JV reporting requests for India and provide technical support to site finance teams in relation to GAAP conversion

Others:
Coordinate and consolidate risk and exposure reports
Participate in internal control and compliance activities, including SOX testing/review, Control Self-Assessment and Balance Sheet Review as necessary
Assist with ad hoc regional finance projects when required

Keys to Success
Requires in-depth knowledge and experience
Requires conceptual and practical expertise in own area and general knowledge of related areas
Has knowledge of best practices and how own area integrates with others; is aware of the competition and the factors that differentiate the company in the market

Supervisory Responsibilities

Requirements And Preferred Skills
QUALIFICATIONS: To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The requirements listed below are representative of the knowledge, skill, and/or ability required.
Business or Accounting degree
4+ years of experience in FP&A; regional exposure and accounting/costing experience a plus
Strong communication and organizational skills
Must be willing to travel
Ability to provide concurrent oversight to multiple resources and projects
Demonstrated proficiency managing analytically rigorous initiatives

Required Skills
Manufacturing background and knowledge of ERP & BI, with proven skills in the areas of planning, operational excellence and analytics, and strong business acumen
Ability to meet deadlines and demonstrate effective time management
Strong skills in HFM (consolidation and planning tool) and Microsoft applications (Excel, PowerPoint)
Strong communicator, both written and verbal, in technical and non-technical environments
Proactive and self-motivated; ability to work independently with minimal supervision and follow through to meet objectives
Works well under pressure; demonstrates flexibility in work style to accommodate changing priorities and fixed deadlines
Strong interpersonal skills and a team player; ability to build collaborative relationships across hierarchy, function and region
Excellent business ethics and integrity
Demonstrated ability to manage high-pressure situations, manage conflict and prioritize workloads
Self-directed and motivated
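
The FP&A role above includes variance analysis of actuals against prior year and forecast. Purely as an illustrative sketch — the figures and P&L line names are invented, and no specific tool from the posting (such as HFM) is being modelled — the snippet below computes a simple variance bridge in pandas.

```python
# Illustrative actual-vs-forecast variance bridge by P&L line.
import pandas as pd

pl = pd.DataFrame({
    "line": ["Revenue", "Material cost", "SG&A"],
    "actual": [1250.0, -640.0, -210.0],       # hypothetical figures
    "forecast": [1200.0, -600.0, -220.0],
    "prior_year": [1100.0, -580.0, -200.0],
})

pl["var_vs_forecast"] = pl["actual"] - pl["forecast"]
pl["var_vs_prior_year"] = pl["actual"] - pl["prior_year"]
pl["var_vs_forecast_pct"] = (pl["var_vs_forecast"] / pl["forecast"].abs() * 100).round(1)

totals = pl[["actual", "forecast", "prior_year", "var_vs_forecast", "var_vs_prior_year"]].sum()
print(pl)
print("Net variance vs forecast:", totals["var_vs_forecast"])
```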

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.


Featured Companies