Jobs
Interviews

1190 Normalization Jobs - Page 8

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

0 years

0 Lacs

Hyderabad, Telangana, India

Remote

Work Level : Individual Core : Responsible Leadership : Team Alignment Industry Type : Information Technology Function : Database Administrator Key Skills : mSQL, SQL Writing, PLSQL Education : Graduate Note: This is a requirement for one of the Workassist Hiring Partners. Requirements Currently pursuing or recently completed a degree in Computer Science, Information Technology, or a related field. Strong understanding of SQL and relational database concepts. Experience with databases such as MySQL, PostgreSQL, SQL Server, or Oracle. Ability to write efficient and optimized SQL queries. Basic knowledge of indexing, stored procedures, and triggers. Understanding of database normalization and design principles. Good analytical and problem-solving skills. Ability to work independently and in a team in a remote setting. Company Description Workassist is an online recruitment and employment solution platform based in Lucknow, India. We provide relevant profiles to employers and connect job seekers with the best opportunities across various industries. With a network of more than 10,000 recruiters, we help employers recruit talented individuals from sectors such as Banking & Finance, Consulting, Sales & Marketing, HR, IT, Operations, and Legal. We have adapted to the new normal and strive to provide a seamless job search experience for job seekers worldwide. Our goal is to enhance the job-seeking experience by leveraging technology and matching job seekers with the right employers. For a seamless job search experience, visit our website: https://bit.ly/3QBfBU2 (Note: There are many more opportunities on the portal apart from this one; depending on your skills, you can apply for them as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
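The requirement above centers on database normalization and design principles. As a minimal, hypothetical sketch of that idea (using Python's built-in sqlite3 in place of the MySQL/PostgreSQL/SQL Server/Oracle engines named in the posting, with invented table and column names), a "flat" table with repeating customer details can be split into two related tables and rejoined when needed:

```python
# Minimal normalization sketch; sqlite3 stands in for the engines named above.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Denormalized: customer details repeat on every order row (update anomalies).
cur.execute("""
    CREATE TABLE orders_flat (
        order_id      INTEGER PRIMARY KEY,
        customer_name TEXT,
        customer_city TEXT,
        amount        REAL
    )
""")
cur.executemany(
    "INSERT INTO orders_flat VALUES (?, ?, ?, ?)",
    [(1, "Asha", "Hyderabad", 120.0),
     (2, "Asha", "Hyderabad", 80.0),
     (3, "Ravi", "Chennai", 45.5)],
)

# Normalized (3NF-style): customer attributes live in one place and orders
# reference them by key, so a change of city touches a single row.
cur.executescript("""
    CREATE TABLE customers (
        customer_id INTEGER PRIMARY KEY,
        name TEXT,
        city TEXT
    );
    CREATE TABLE orders (
        order_id    INTEGER PRIMARY KEY,
        customer_id INTEGER REFERENCES customers(customer_id),
        amount      REAL
    );
    INSERT INTO customers VALUES (1, 'Asha', 'Hyderabad'), (2, 'Ravi', 'Chennai');
    INSERT INTO orders VALUES (1, 1, 120.0), (2, 1, 80.0), (3, 2, 45.5);
""")

# A join reconstructs the flat view whenever reporting needs it.
for row in cur.execute("""
    SELECT o.order_id, c.name, c.city, o.amount
    FROM orders o JOIN customers c ON c.customer_id = o.customer_id
"""):
    print(row)
conn.close()
```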

Posted 1 week ago

Apply

5.0 years

0 Lacs

Patna, Bihar, India

On-site

Job Description : Senior Software Engineer - SQL Server Position : Team Lead Education Qualification : Any Graduate Minimum Years of Experience : 5+ Years Key Skills : MS SQL Server Type of Employment : Permanent Requirement : Immediate or Max 15 days Location : Anywhere in India Job Responsibilities Responsible for working with the development team to develop, implement, and manage database models for core product development. Responsible for writing SQL database views, tables, and stored procedures to support engineering product development. Responsible for designing and maintaining SSIS, T-SQL, and SQL jobs. Responsible for developing and maintaining complex stored procedures for loading data into staging tables from OLTP and other intermediary systems. Responsible for analysis, design specifications, development, implementation, and maintenance of the DB. Responsible for designing partitioning of the DB for archive data. Responsible for ensuring that the best practices and standards established for the use of tools like SQL Server, SSIS, SSRS, and Excel Power Pivot/View/Map are incorporated in Data Analytics solutions design. Responsible for documenting complex processes, business requirements, and specifications. Requirements Technical Skills : Experience in database design, normalization, query design, and performance tuning. Proficient in writing complex Transact-SQL code. Proficient in MS SQL Server query tuning. Experience in writing stored procedures, functions, views, and triggers. Experience with indexes, columnstore indexes, SQL Server column storage, and query execution plans. Provide authentication and authorization for the database. Develop best practices for database design and development activities. Experience in database migration activities. Strong analytical, multi-tasking, and problem-solving skills. (ref:hirist.tech)
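As a rough sketch of the index and execution-plan work listed above (Python's sqlite3 and its EXPLAIN QUERY PLAN stand in for MS SQL Server's actual execution plans, and the table name is made up), the before/after effect of adding a covering index looks like this:

```python
# Query-tuning sketch: compare the plan before and after adding an index.
# In T-SQL you would instead inspect the actual execution plan around CREATE INDEX.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE staging_orders (id INTEGER PRIMARY KEY, status TEXT, amount REAL)")
cur.executemany(
    "INSERT INTO staging_orders (status, amount) VALUES (?, ?)",
    [("OPEN" if i % 10 else "CLOSED", float(i)) for i in range(10_000)],
)

query = "SELECT COUNT(*), SUM(amount) FROM staging_orders WHERE status = 'CLOSED'"

# Without an index the planner scans the whole table.
print("before:", cur.execute("EXPLAIN QUERY PLAN " + query).fetchall())

# A covering index on (status, amount) answers the same query from the index
# alone -- the core idea behind tuning a query with an appropriate index.
cur.execute("CREATE INDEX ix_orders_status_amount ON staging_orders(status, amount)")
print("after: ", cur.execute("EXPLAIN QUERY PLAN " + query).fetchall())
conn.close()
```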

Posted 1 week ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

About Us JOB DESCRIPTION SBI Card is a leading pure-play credit card issuer in India, offering a wide range of credit cards to cater to diverse customer needs. We are constantly innovating to meet the evolving financial needs of our customers, empowering them with digital currency for a seamless payment experience and rewarding benefits. At SBI Card, the motto 'Make Life Simple' inspires every initiative, ensuring that customer convenience is at the forefront of all that we do. We are committed to building an environment where people can thrive and create a better future for everyone. SBI Card is proud to be an equal opportunity and inclusive employer and welcomes employees without any discrimination on the grounds of race, color, gender, religion, creed, disability, sexual orientation, gender identity, marital status, caste, etc. SBI Card is committed to fostering an inclusive and diverse workplace where all employees are treated equally with dignity and respect, which makes it a promising place to work. Join us to shape the future of digital payment in India and unlock your full potential. What’s In It For YOU SBI Card truly lives by the work-life balance philosophy. We offer a robust wellness and wellbeing program to support the mental and physical health of our employees. Admirable work deserves to be rewarded; we have a well-curated bouquet of rewards and recognition programs for employees. Dynamic, Inclusive and Diverse team culture. Gender Neutral Policy. Inclusive Health Benefits for all - Medical Insurance, Personal Accident, Group Term Life Insurance, Annual Health Checkup, and Dental and OPD benefits. Commitment to the overall development of an employee through a comprehensive learning & development framework. Role Purpose Responsible for the management of all collections processes for the allocated portfolio in the assigned CD/Area, based on targets set for resolution, normalization, rollback/absolute recovery, and ROR.
Role Accountability Conduct timely allocation of portfolio to aligned vendors/NFTEs and conduct ongoing reviews to drive performance on the business targets through an extended team of field executives and callers Formulate tactical short term incentive plans for NFTEs to increase productivity and drive DRR Ensure various critical segments as defined by business are reviewed and performance is driven on them Ensure judicious use of hardship tools and adherence to the settlement waivers both on rate and value Conduct ongoing field visits on critical accounts and ensure proper documentation in Collect24 system of all field visits and telephone calls to customers Raise red flags in a timely manner basis deterioration in portfolio health indicators/frauds and raise timely alarms on critical incidents as per the compliance guidelines Ensure all guidelines mentioned in the SVCL are adhered to and that process hygiene is maintained at aligned agencies Ensure 100% data security using secured data transfer modes and data purging as per policy Ensure all customer complaints received are closed within time frame Conduct thorough due diligence while onboarding/offboarding/renewing a vendor and all necessary formalities are completed prior to allocating Ensure agencies raise invoices timely Monitor NFTE ACR CAPE as per the collection strategy Measures of Success Portfolio Coverage Resolution Rate Normalization/Roll back Rate Settlement waiver rate Absolute Recovery Rupee collected NFTE CAPE DRA certification of NFTEs Absolute Customer Complaints Absolute audit observations Process adherence as per MOU Technical Skills / Experience / Certifications Credit Card knowledge along with good understanding of Collection Processes Competencies critical to the role Analytical Ability Stakeholder Management Problem Solving Result Orientation Process Orientation Qualification Post-Graduate / Graduate in any discipline Preferred Industry FSI

Posted 1 week ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Join Barclays in the Analyst - Cost Utility role, where you will support the execution of the end-to-end monthly financial close, perform aged accrual analysis and vendor cost analysis, produce financials and flash reporting, provide support on commentaries, execute APE amendments and normalization at AE levels, and support the FC & FBP in relation to any queries from auditors. At Barclays, we don't just anticipate the future - we're creating it. To be successful in this role, you should have the below skills: Qualified CA / CMA / CPA / ACCA / CFA / MBA Finance from a premier institute with a minimum of one year of relevant experience, or CA Inter / Commerce Graduate with a few years of relevant experience. Take responsibility for embedding new policies/procedures adopted due to risk mitigation. Advise and influence decision-making within your own area of expertise. Some Other Highly Valued Skills May Include Knowledge of SAP and understanding of the ledger hierarchy, broad understanding of Finance Business Partnering, intermediate to advanced Excel and PowerPoint skills, and knowledge of automation tools like Alteryx. You may be assessed on the key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is based in our Noida office. Purpose of the role To provide financial expertise and support to specific business units or departments within the organisation, and act as a liaison between the finance function and various business units, helping to bridge the gap between financial data and business decisions. Accountabilities Development and implementation of business unit financial strategies, plans and budgets, using insights to evaluate the financial implications of strategic initiatives and recommend appropriate actions. Development of financial models to forecast future performance, assess investment opportunities, and evaluate financial risks for business units, and to analyse the impact of business decisions on financial performance and provide recommendations. Cross-functional collaboration to provide financial insights and guidance to business unit stakeholders. Identification of opportunities and implementation of financial process improvements that streamline financial operations. Support to business units in identification, assessment, and mitigation of financial risks, including provision of training and guidance to business units on financial risk management and compliance practices. Analysis and presentation of financial data to provide insights into business performance, identify trends, and support decision-making. Analyst Expectations To perform prescribed activities in a timely manner and to a high standard, consistently driving continuous improvement. Requires in-depth technical knowledge and experience in the assigned area of expertise, and a thorough understanding of the underlying principles and concepts within that area. They lead and supervise a team, guiding and supporting professional development, allocating work requirements and coordinating team resources. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L – Listen and be authentic, E – Energise and inspire, A – Align across the enterprise, D – Develop others.
OR for an individual contributor, they develop technical expertise in work area, acting as an advisor where appropriate. Will have an impact on the work of related teams within the area. Partner with other functions and business areas. Takes responsibility for end results of a team’s operational processing and activities. Escalate breaches of policies / procedure appropriately. Take responsibility for embedding new policies/ procedures adopted due to risk mitigation. Advise and influence decision making within own area of expertise. Take ownership for managing risk and strengthening controls in relation to the work you own or contribute to. Deliver your work and areas of responsibility in line with relevant rules, regulation and codes of conduct. Maintain and continually build an understanding of how own sub-function integrates with function, alongside knowledge of the organisations products, services and processes within the function. Demonstrate understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation sub-function. Make evaluative judgements based on the analysis of factual information, paying attention to detail. Resolve problems by identifying and selecting solutions through the application of acquired technical experience and will be guided by precedents. Guide and persuade team members and communicate complex / sensitive information. Act as contact point for stakeholders outside of the immediate function, while building a network of contacts outside team and external to the organisation. All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.

Posted 1 week ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

On-site

About Us VE3 is at the forefront of delivering cloud‑native data solutions to premier clients across finance, retail and healthcare. As a rapidly growing UK‑based consultancy, we pride ourselves on fostering a collaborative, inclusive environment where every voice is heard—and every idea can become tomorrow’s breakthrough. Role: Database Designer / Senior Data Engineer What You’ll Do Architect & Design Lead the design of modern, scalable data platforms on AWS and/or Azure, using best practices for security, cost‑optimisation and performance. Develop detailed data models (conceptual, logical, physical) and document data dictionaries and lineage. Build & Optimize Implement robust ETL/ELT pipelines using Python, SQL, Scala (as appropriate), leveraging services such as AWS Glue, Azure Data Factory, and open‑source frameworks (Spark, Airflow). Tune data stores (RDS, SQL Data Warehouse, NoSQL like Redis) for throughput, concurrency and cost. Establish real‑time data streaming solutions via AWS Kinesis, Azure Event Hubs or Kafka. Collaborate & Deliver Work closely with data analysts, BI teams and stakeholders to translate business requirements into data solutions and dashboards. Partner with DevOps/Cloud Ops to automate CI/CD for data code and infrastructure (Terraform, CloudFormation). Governance & Quality Define and enforce data governance, security and compliance standards (GDPR, ISO27001). Implement monitoring, alerting and data quality frameworks (Great Expectations, AWS CloudWatch). Mentor & Innovate Act as a technical mentor for junior engineers; run brown‑bag sessions on new cloud services or data‑engineering patterns. Proactively research emerging big‑data and streaming technologies to keep our toolset cutting‑edge. Who You Are Academic Background: Bachelor’s (or higher) in Computer Science, Engineering, IT or similar. Experience: ≥3 years in a hands‑on Database Designer / Data Engineer role, ideally within a cloud environment. Technical Skills: Languages: Expert in SQL; strong Python or Scala proficiency. Cloud Services: At least one of AWS (Glue, S3, Kinesis, RDS) or Azure (Data Factory, Data Lake Storage, SQL Database). Data Modelling: Solid understanding of OLTP vs OLAP, star/snowflake schemas, normalization & denormalization trade‑offs. Pipeline Tools: Familiarity with Apache Spark, Kafka, Airflow or equivalent. Soft Skills: Excellent communicator—able to present complex technical designs in clear, non‑technical terms. Strong analytical mindset; thrives on solving performance bottlenecks and scaling challenges. Team player—collaborative attitude in agile/scrum settings. Nice to Have Certifications: AWS Certified Data Analytics – Specialty, Azure Data Engineer Associate/Expert. Exposure to data‑science workflows (Jupyter, ML pipelines). Experience with containerized workloads (Docker, Kubernetes) for data processing. Familiarity with DataOps practices and tools (dbt, Great Expectations, Terraform). Our Commitment to Diversity We’re an equal‑opportunity employer committed to inclusive hiring. All qualified applicants—regardless of ethnicity, gender identity, sexual orientation, neurodiversity, disability status or veteran status—are encouraged to apply.
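The data-modelling requirement above calls out star/snowflake schemas and normalization versus denormalization trade-offs. As a small, hypothetical pandas sketch of the star-schema side of that trade-off (all column and table names are invented for illustration), a flat extract can be split into a dimension table and a fact table that references it by surrogate key:

```python
# Star-schema modelling sketch: one dimension table plus one fact table.
import pandas as pd

extract = pd.DataFrame({
    "order_id": [1, 2, 3, 4],
    "product":  ["laptop", "laptop", "mouse", "mouse"],
    "category": ["electronics", "electronics", "accessories", "accessories"],
    "amount":   [900.0, 950.0, 20.0, 22.0],
})

# Dimension: one row per distinct product, with a surrogate key.
dim_product = (
    extract[["product", "category"]]
    .drop_duplicates()
    .reset_index(drop=True)
    .rename_axis("product_key")
    .reset_index()
)

# Fact: measures plus foreign keys only (normalized relative to the extract).
fact_sales = extract.merge(dim_product, on=["product", "category"])[
    ["order_id", "product_key", "amount"]
]

print(dim_product)
print(fact_sales)
```

Keeping product attributes in one dimension avoids repeating them on every fact row; denormalizing back into a wide table is the opposite trade, favouring simpler BI queries over storage and update cost.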

Posted 2 weeks ago

Apply

6.0 years

0 Lacs

Pune, Maharashtra, India

On-site

About Gruve Gruve is an innovative software services startup dedicated to transforming enterprises to AI powerhouses. We specialize in cybersecurity, customer experience, cloud infrastructure, and advanced technologies such as Large Language Models (LLMs). Our mission is to assist our customers in their business strategies utilizing their data to make more intelligent decisions. As a well-funded early-stage startup, Gruve offers a dynamic environment with strong customer and partner networks. Position Summary We are looking for a highly skilled and detail-oriented Quality Engineer – Cybersecurity Migrations to support the validation of security policy migrations across major NGFW platforms. In this role, you will be responsible for analyzing the output of automated migration tools, resolving post-migration issues, and ensuring consistent, secure, and functional firewall configurations in customer environments. Your work will directly impact customer satisfaction and operational stability. Key Roles & Responsibilities Design and maintain test plans and test cases to validate the accuracy and completeness of automated firewall migration output. Review and verify firewall configurations migrated by automation tools to ensure they meet expected functionality and security posture. Identify logic gaps, configuration anomalies, or rule mismatches introduced during tool-based migrations. Collaborate with tool development teams to report defects, validate fixes, and improve transformation logic. Troubleshoot and resolve post-migration escalations, including policy behaviour mismatches, broken traffic flows, and unexpected security outcomes. Perform side-by-side comparisons of pre- and post-migration rules to verify functional equivalence. Raise, track, and close vendor support (TAC) cases where deep platform-level issues are involved. Contribute to documentation including SOPs, test coverage reports, known issues, and configuration validation guides. Interface directly with customers to understand post-migration challenges and ensure successful resolution. Basic Qualifications Bachelor’s degree in Computer Science, Information Technology, or a related field (BE / B.Tech / MCA or equivalent). 6+ years of hands-on experience in firewall configuration, validation, and troubleshooting. Strong technical knowledge of NGFW platforms including Palo Alto, Cisco FTD, FortiGate, and Check Point. Familiarity with policy and object modelling across different firewall platforms. Experience in writing test cases and test plans for automation tools or configuration transformation logic. Strong understanding of networking and security concepts, including routing, NAT, VPNs, zones, and application filtering. Excellent debugging and root cause analysis skills in post-deployment/migration environments. Ability to interpret logs, packet captures, and platform-specific diagnostics to isolate issues. Preferred Qualifications Certification: PCNSA, PCNSE, NSE-1, NSE-2, NSE-3, NSE-4 Certification: CCNA (R&S) / CCNP (R&S) Exposure to firewall migration automation tools Understanding policy normalization and risk scoring tools (e.g., Tufin, FireMon) Why Gruve At Gruve, we foster a culture of innovation, collaboration, and continuous learning. We are committed to building a diverse and inclusive workplace where everyone can thrive and contribute their best work. If you’re passionate about technology and eager to make an impact, we’d love to hear from you. Gruve is an equal opportunity employer. 
We welcome applicants from all backgrounds and thank all who apply; however, only those selected for an interview will be contacted.

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Project Role : Security Engineer Project Role Description : Apply security skills to design, build and protect enterprise systems, applications, data, assets, and people. Provide services to safeguard information, infrastructures, applications, and business processes against cyber threats. Must have skills : Security Platform Engineering Good to have skills : NA Minimum 5 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: We are seeking a skilled Security Engineer with expertise in Google Chronicle SIEM, parser development, and foundational knowledge of cybersecurity. The ideal candidate will be responsible for analyzing security data and logs, ensuring accurate aggregation, normalization, tagging, and classification. You will work closely with log sources, particularly security and networking devices, to enhance our security monitoring capabilities. Roles & Responsibilities: Conduct security and data/log analysis, focusing on the aggregation, normalization, tagging, and classification of logs. Research, analyze, and understand log sources for security monitoring, with a particular focus on security and networking devices such as firewalls, routers, antivirus products, proxies, IDS/IPS, and operating systems. Validate log sources and indexed data, optimizing search criteria to improve search efficiency. Utilize automation tools to build and validate log collectors for parsing aggregated logs. Professional & Technical Skills: Proficiency in log analysis and SIEM tools, including but not limited to Google Chronicle, Splunk, ArcSight, and QRadar. Experience in SIEM content creation and reporting is essential. Strong experience in manual security log review and analysis, such as Windows Event Log and Linux Syslog, including incident classification, investigation, and remediation. Solid understanding of multiple attack vectors, including malware, Trojans, exploit kits, ransomware, phishing techniques, and APTs, as well as familiarity with attack techniques outlined in the OWASP Top 10. Knowledge of security and networking devices, including firewalls, routers, antivirus products, proxies, IDS/IPS, and operating systems. TCP/IP networking skills for packet and log analysis. Experience working with Windows and Unix platforms. Familiarity with databases is an advantage. Experience in GCP, AWS and Azure environments is a plus. Additional Information: - The candidate should have minimum 5 years of experience in Security Platform Engineering. - This position is based at our Pune office. - A 15 years full time education is required.
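The role above revolves around parsing and normalizing security logs before they reach the SIEM. A toy Python sketch of that step is shown below; real Chronicle parsers are written in the platform's own configuration language and map into UDM, and the sample log line and field names here are invented:

```python
# Toy log-normalization sketch: parse a firewall-style syslog line into
# named fields, then tag/classify the resulting event.
import re

RAW = "Jul 18 10:42:01 fw01 action=deny src=10.0.0.5 dst=203.0.113.9 dpt=445 proto=tcp"

PATTERN = re.compile(
    r"^(?P<timestamp>\w{3} \d{1,2} [\d:]{8}) (?P<host>\S+) (?P<kv>.*)$"
)

def normalize(line: str) -> dict:
    m = PATTERN.match(line)
    if not m:
        return {"raw": line, "tags": ["parse_failure"]}
    # Flatten key=value pairs into a flat event dictionary.
    event = {"timestamp": m["timestamp"], "host": m["host"]}
    event.update(dict(pair.split("=", 1) for pair in m["kv"].split()))
    # Simple classification/tagging based on parsed fields.
    event["tags"] = ["firewall"]
    if event.get("action") == "deny":
        event["tags"].append("blocked_traffic")
    return event

print(normalize(RAW))
```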

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Gurgaon

On-site

Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Microsoft Azure Databricks Good to have skills : NA Minimum 3 year(s) of experience is required Educational Qualification : Any Btech Degree Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will be responsible for creating efficient and scalable applications using Microsoft Azure Databricks. Your typical day will involve collaborating with the team to understand business requirements, designing and developing applications, and ensuring the applications meet quality standards and performance expectations. Roles & Responsibilities: - Expected to perform independently and become an SME. - Required active participation/contribution in team discussions. - Contribute to providing solutions to work-related problems. - Collaborate with the team to understand business requirements and translate them into technical specifications. - Design, develop, and test applications using Microsoft Azure Databricks. - Ensure the applications meet quality standards and performance expectations. - Troubleshoot and debug applications to identify and resolve issues. - Provide technical guidance and support to junior developers. - Stay updated with the latest industry trends and technologies related to application development. Professional & Technical Skills: - Must-Have Skills: Proficiency in Microsoft Azure Databricks. - Strong understanding of statistical analysis and machine learning algorithms. - Experience with data visualization tools such as Tableau or Power BI. - Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms. - Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity. Additional Information: - The candidate should have a minimum of 3 years of experience in Microsoft Azure Databricks. - This position is based at our Hyderabad office. - Any Btech Degree is required.
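The last skill bullet above covers data munging: cleaning, transformation, and normalization. A hypothetical pandas sketch of those three steps follows (on Azure Databricks the same logic would usually be written with PySpark DataFrames; the column names and thresholds are invented):

```python
# Data-munging sketch: dedupe, impute, filter, then min-max normalize.
import pandas as pd

raw = pd.DataFrame({
    "customer_id": [101, 102, 102, 103, 104],
    "age":         [25, None, None, 47, 31],
    "spend":       [1200.0, 300.0, 300.0, -50.0, 800.0],
})

clean = (
    raw.drop_duplicates(subset="customer_id")                         # remove duplicate records
       .assign(age=lambda d: d["age"].fillna(d["age"].median()))      # impute missing ages
       .query("spend >= 0")                                           # drop invalid negative spend
       .copy()
)

# Min-max normalization so both features land on a comparable 0-1 scale,
# a common preprocessing step before the ML algorithms mentioned above.
for col in ["age", "spend"]:
    lo, hi = clean[col].min(), clean[col].max()
    clean[col + "_norm"] = (clean[col] - lo) / (hi - lo)

print(clean)
```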

Posted 2 weeks ago

Apply

6.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Role Description Role Proficiency: This role requires proficiency in data pipeline development, including coding and testing data pipelines for ingesting, wrangling, transforming, and joining data from various sources. Must be skilled in ETL tools such as Informatica, Glue, Databricks, and DataProc, with coding expertise in Python, PySpark, and SQL. Works independently and has a deep understanding of data warehousing solutions including Snowflake, BigQuery, Lakehouse, and Delta Lake. Capable of calculating costs and understanding performance issues related to data solutions. Outcomes Act creatively to develop pipelines and applications by selecting appropriate technical options, optimizing application development, maintenance, and performance using design patterns, and reusing proven solutions. Interpret requirements to create optimal architecture and design, developing solutions in accordance with specifications. Document and communicate milestones/stages for end-to-end delivery. Code adhering to best coding standards; debug and test solutions to deliver best-in-class quality. Perform performance tuning of code and align it with the appropriate infrastructure to optimize efficiency. Validate results with user representatives, integrating the overall solution seamlessly. Develop and manage data storage solutions including relational databases, NoSQL databases, and data lakes. Stay updated on the latest trends and best practices in data engineering, cloud technologies, and big data tools. Influence and improve customer satisfaction through effective data solutions. Measures Of Outcomes Adherence to engineering processes and standards. Adherence to schedule/timelines. Adherence to SLAs where applicable. Number of defects post delivery. Number of non-compliance issues. Reduction of recurrence of known defects. Quick turnaround of production bugs. Completion of applicable technical/domain certifications. Completion of all mandatory training requirements. Efficiency improvements in data pipelines (e.g., reduced resource consumption, faster run times). Average time to detect, respond to, and resolve pipeline failures or data issues. Number of data security incidents or compliance breaches. Outputs Expected Code Development: Develop data processing code independently, ensuring it meets performance and scalability requirements. Define coding standards, templates, and checklists. Review code for team members and peers. Documentation Create and review templates, checklists, guidelines, and standards for design, processes, and development. Create and review deliverable documents including design documents, architecture documents, infrastructure costing, business requirements, source-target mappings, and test cases and results. Configuration Define and govern the configuration management plan. Ensure compliance within the team. Testing Review and create unit test cases, scenarios, and execution plans. Review the test plan and test strategy developed by the testing team. Provide clarifications and support to the testing team as needed. Domain Relevance Advise data engineers on the design and development of features and components, demonstrating a deeper understanding of business needs. Learn about customer domains to identify opportunities for value addition. Complete relevant domain certifications to enhance expertise. Project Management Manage the delivery of modules effectively. Defect Management Perform root cause analysis (RCA) and mitigation of defects. Identify defect trends and take proactive measures to improve quality.
Estimation Create and provide input for effort and size estimation for projects. Knowledge Management Consume and contribute to project-related documents, SharePoint libraries, and client universities. Review reusable documents created by the team. Release Management Execute and monitor the release process to ensure smooth transitions. Design Contribution Contribute to the creation of high-level design (HLD), low-level design (LLD), and system architecture for applications, business components, and data models. Customer Interface Clarify requirements and provide guidance to the development team. Present design options to customers and conduct product demonstrations. Team Management Set FAST goals and provide constructive feedback. Understand team members' aspirations and provide guidance and opportunities for growth. Ensure team engagement in projects and initiatives. Certifications Obtain relevant domain and technology certifications to stay competitive and informed. Skill Examples Proficiency in SQL, Python, or other programming languages used for data manipulation. Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF. Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g., AWS Glue, BigQuery). Conduct tests on data pipelines and evaluate results against data quality and performance specifications. Experience in performance tuning of data processes. Expertise in designing and optimizing data warehouses for cost efficiency. Ability to apply and optimize data models for efficient storage, retrieval, and processing of large datasets. Capacity to clearly explain and communicate design and development aspects to customers. Ability to estimate time and resource requirements for developing and debugging features or components. Knowledge Examples Knowledge of various ETL services offered by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/DataFlow, Azure ADF, and ADLF. Proficiency in SQL for analytics, including windowing functions. Understanding of data schemas and models relevant to various business contexts. Familiarity with domain-related data and its implications. Expertise in data warehousing optimization techniques. Knowledge of data security concepts and best practices. Familiarity with design patterns and frameworks in data engineering. Additional Comments Data Engineering Role Summary: Skilled Data Engineer with strong Python programming skills and experience in building scalable data pipelines across cloud environments. The candidate should have a good understanding of ML pipelines and basic exposure to GenAI solutioning. This role will support large-scale AI/ML and GenAI initiatives by ensuring high-quality, contextual, and real-time data availability. ________________________________________ Key Responsibilities: Design, build, and maintain robust, scalable ETL/ELT data pipelines in AWS/Azure environments. Develop and optimize data workflows using PySpark, SQL, and Airflow. Work closely with AI/ML teams to support training pipelines and GenAI solution deployments. Integrate data with vector databases like ChromaDB or Pinecone for RAG-based pipelines. Collaborate with solution architects and GenAI leads to ensure reliable, real-time data availability for agentic AI and automation solutions. Support data quality, validation, and profiling processes.
________________________________________ Key Skills & Technology Areas:
Programming & Data Processing: Python (4–6 years), PySpark, Pandas, NumPy
Data Engineering & Pipelines: Apache Airflow, AWS Glue, Azure Data Factory, Databricks
Cloud Platforms: AWS (S3, Lambda, Glue), Azure (ADF, Synapse), GCP (optional)
Databases: SQL/NoSQL, Postgres, DynamoDB, Vector databases (ChromaDB, Pinecone) – preferred
ML/GenAI Exposure (basic): Hands-on with Pandas, scikit-learn; knowledge of RAG pipelines and GenAI concepts
Data Modeling: Star/Snowflake schema, data normalization, dimensional modeling
Version Control & CI/CD: Git, Jenkins, or similar tools for pipeline deployment
________________________________________ Other Requirements: Strong problem-solving and analytical skills. Flexible to work on fast-paced and cross-functional priorities. Experience collaborating with AI/ML or GenAI teams is a plus. Good communication and a collaborative, team-first mindset. Experience in Telecom, E-Commerce, or Enterprise IT Operations is a plus. Skills: ETL, Big Data, PySpark, SQL
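The responsibilities above describe building ETL/ELT pipelines with validation built in. A minimal, hypothetical extract-transform-load skeleton in plain Python is sketched below; in the actual role these stages would be PySpark jobs orchestrated by Airflow or AWS Glue, and the sample data and column names are invented:

```python
# Minimal ETL skeleton: extract raw rows, validate/transform them, load the result.
import csv
import io

def extract(raw_csv: str) -> list[dict]:
    """Read raw records from a CSV-shaped source."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[dict]:
    """Cast types, standardize casing, and drop rows that fail validation."""
    out = []
    for row in rows:
        try:
            out.append({
                "customer": row["customer"].strip().lower(),
                "amount": float(row["amount"]),
            })
        except (KeyError, ValueError):
            # In a real pipeline, bad rows would be routed to a quarantine
            # table and surfaced through data-quality alerting.
            continue
    return out

def load(rows: list[dict], target: list) -> None:
    """Append cleaned rows to the target store (a list stands in for a table)."""
    target.extend(rows)

raw = "customer,amount\n ASHA ,120.5\nravi,not_a_number\nMEERA,80\n"
warehouse: list[dict] = []
load(transform(extract(raw)), warehouse)
print(warehouse)  # [{'customer': 'asha', 'amount': 120.5}, {'customer': 'meera', 'amount': 80.0}]
```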

Posted 2 weeks ago

Apply

4.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Title : Data Engineer
Experience : 4-9 Years
Location : Noida, Chennai & Pune
Skills : Python, Pyspark, Snowflake & Redshift
Key Responsibilities
• Migration & Modernization
• Lead the migration of data pipelines, models, and workloads from Redshift to Snowflake/Yellowbrick.
• Design and implement landing, staging, and curated data zones to support scalable ingestion and consumption patterns.
• Evaluate and recommend tools and frameworks for migration, including file formats, ingestion tools, and orchestration.
• Design and build robust ETL/ELT pipelines using Python, PySpark, SQL, and orchestration tools (e.g., Airflow, dbt).
• Support both batch and streaming pipelines, with real-time processing via Kafka, Snowpipe, or Spark Structured Streaming.
• Build modular, reusable, and testable pipeline components that handle high volume and ensure data integrity.
• Define and implement data modeling strategies (star, snowflake, normalization/denormalization) for analytics and BI layers.
• Implement strategies for data versioning, late-arriving data, and slowly changing dimensions.
• Implement automated data validation and anomaly detection (using tools like dbt tests, Great Expectations, or custom checks).
• Build logging and alerting into pipelines to monitor SLA adherence, data freshness, and pipeline health.
• Contribute to data governance initiatives including metadata tracking, data lineage, and access control.
Required Skills & Experience
• 10+ years in data engineering roles with increasing responsibility.
• Proven experience leading data migration or re-platforming projects.
• Strong command of Python, SQL, and PySpark for data pipeline development.
• Hands-on experience with modern data platforms like Snowflake, Redshift, Yellowbrick, or BigQuery.
• Proficient in building streaming pipelines with tools like Kafka, Flink, or Snowpipe.
• Deep understanding of data modeling, partitioning, indexing, and query optimization.
• Expertise with ETL orchestration tools (e.g., Apache Airflow, Prefect, Dagster, or dbt).
• Comfortable working with large datasets and solving performance bottlenecks.
• Experience in designing data validation frameworks and implementing DQ rules.
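The posting above mentions automated data validation via dbt tests, Great Expectations, or custom checks. A simplified sketch of the "custom checks" style is shown below for a migration scenario: after moving a table, compare row counts and key null-rates between source and target extracts (function and field names are illustrative, not from any real framework):

```python
# Custom post-migration data-quality checks: row-count parity and key completeness.
def validate_migration(source_rows: list[dict], target_rows: list[dict], key: str) -> list[str]:
    failures = []
    if len(source_rows) != len(target_rows):
        failures.append(
            f"row count mismatch: source={len(source_rows)} target={len(target_rows)}"
        )
    null_keys = sum(1 for r in target_rows if r.get(key) in (None, ""))
    if null_keys:
        failures.append(f"{null_keys} target rows have a null '{key}'")
    return failures

source = [{"order_id": 1}, {"order_id": 2}, {"order_id": 3}]
target = [{"order_id": 1}, {"order_id": None}, {"order_id": 3}]

issues = validate_migration(source, target, key="order_id")
print(issues or "migration checks passed")
```

In practice such checks would run as a pipeline step after each load and feed the logging and alerting described above.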

Posted 2 weeks ago

Apply

0.0 - 10.0 years

0 Lacs

Saidapet, Chennai, Tamil Nadu

On-site

Job Information Date Opened 07/18/2025 Job Type Full time City Saidapet State/Province Tamil Nadu Country India Zip/Postal Code 600096 Industry Technology Job Description Job Title: Database Consultant Job Summary: The Database Consultant is responsible for evaluating, optimizing, and securing the organization’s database systems to ensure high performance, data integrity, and regulatory compliance. This role supports the classification, integration, and lifecycle management of data assets in alignment with national standards and organizational policies. The consultant plays a key role in enabling data-driven decision-making and maintaining robust data infrastructure. Key Responsibilities: Database Assessment & Optimization: Analyze existing database systems for performance, scalability, and reliability. Recommend and implement tuning strategies to improve query efficiency and resource utilization. Support database upgrades, migrations, and reconfigurations. Security & Compliance: Ensure databases comply with national cybersecurity and data protection regulations. Implement access controls, encryption, and backup strategies to safeguard data. Conduct regular audits and vulnerability assessments. Data Classification & Integration: Support the classification of data assets based on sensitivity, usage, and ownership. Facilitate integration of data across platforms and applications to ensure consistency and accessibility. Collaborate with data governance teams to maintain metadata and lineage documentation. Lifecycle Management: Develop and enforce policies for data retention, archival, and disposal. Monitor data growth and storage utilization to support capacity planning. Ensure databases are aligned with business continuity and disaster recovery plans. Collaboration & Advisory: Work closely with application developers, data analysts, and IT teams to understand data requirements. Provide expert guidance on database design, normalization, and indexing strategies. Assist in selecting and implementing database technologies that align with business goals. Innovation & Best Practices: Stay current with emerging database technologies and trends (e.g., cloud databases, NoSQL, data mesh). Promote best practices in database management and data governance. Contribute to the development of enterprise data strategies. Qualifications: Bachelor’s or Master’s degree in Computer Science, Information Systems, or related field. Proven experience in database administration, consulting, or data architecture. Proficiency in SQL and familiarity with major database platforms (e.g., Oracle, SQL Server, PostgreSQL, MySQL, MongoDB). Knowledge of data governance frameworks and compliance standards (e.g., GDPR, HIPAA, ISO 27001). Strong analytical, problem-solving, and communication skills. 6-10 years of relevant Experience in IT

Posted 2 weeks ago

Apply

7.0 years

0 Lacs

Itanagar, Arunachal Pradesh, India

On-site

Key Responsibilities Architect, develop, and optimize detection content across SIEM platforms such as Microsoft Sentinel, Splunk, and Google Chronicle. Normalize and structure diverse log sources using schemas like Splunk CIM, Microsoft Sentinel, OCSF, and Chronicle UDM to ensure consistent detection across the board. Collaborate with teams including Threat Labs and Data Engineering to improve parsing, data transformation, and use case configurations. Perform end-to-end development, customization, and onboarding of supported and custom data sources (EDR, firewall, antivirus, proxies, OS, databases). Repair events with missing or incorrect data, create parser extensions, and manage flow logic for log ingestion pipelines. Conduct log source analysis and maintain robust documentation of data structures, parsing rules, and detection logic. Build and maintain monitoring reports to ensure data pipeline availability and proactively identify performance issues or gaps in data coverage. Continuously evaluate and refine detection content and parsing logic for high fidelity and low false-positive rates. Required Qualifications 7+ years of experience in security engineering, detection content development, or SIEM management. Strong hands-on experience with SIEM platforms, particularly Microsoft Sentinel, Splunk, and Chronicle. Expertise with multiple data models including Splunk CIM, Sentinel schemas, Chronicle UDM, and OCSF. Experience working with diverse log sources (e.g., EDRs, firewalls, antivirus, proxies, databases, OS logs). Skilled in event parsing, field extraction, normalization, and enrichment for log data. Familiarity with scripting/query languages such as KQL, SPL, and UDM search syntax. Strong understanding of SOC operations, detection engineering workflows, and threat modeling frameworks (MITRE ATT&CK, etc.). Preferred Qualifications Experience working with cloud-native and hybrid security architectures. Familiarity with data transformation tools and stream processing pipelines. Previous collaboration with threat research or threat intelligence teams. Security certifications such as GCIA, GCTI, or similar are a plus. (ref:hirist.tech)
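The core of the role above is normalizing diverse log sources onto common models such as Splunk CIM, Chronicle UDM, and OCSF. A conceptual Python sketch of that field-mapping idea follows; the vendor names are invented and the "schema" is deliberately tiny, whereas real CIM/UDM/OCSF models are far richer:

```python
# Schema-normalization sketch: per-source field maps project vendor-specific
# events onto one common model so a single detection covers all sources.
FIELD_MAPS = {
    "vendor_a_fw": {"srcip": "src_ip", "dstip": "dst_ip", "act": "action"},
    "vendor_b_fw": {"source_address": "src_ip", "dest_address": "dst_ip", "disposition": "action"},
}

def normalize(source: str, event: dict) -> dict:
    mapping = FIELD_MAPS[source]
    return {common: event[raw] for raw, common in mapping.items() if raw in event}

events = [
    ("vendor_a_fw", {"srcip": "10.0.0.5", "dstip": "203.0.113.9", "act": "deny"}),
    ("vendor_b_fw", {"source_address": "10.0.0.7", "dest_address": "198.51.100.2", "disposition": "allow"}),
]

normalized = [normalize(src, evt) for src, evt in events]

# One detection expressed against the common schema now matches both sources.
blocked = [e for e in normalized if e["action"] == "deny"]
print(blocked)
```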

Posted 2 weeks ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

Remote

Working with Us Challenging. Meaningful. Life-changing. Those aren't words that are usually associated with a job. But working at Bristol Myers Squibb is anything but usual. Here, uniquely interesting work happens every day, in every department. From optimizing a production line to the latest breakthroughs in cell therapy, this is work that transforms the lives of patients, and the careers of those who do it. You'll get the chance to grow and thrive through opportunities uncommon in scale and scope, alongside high-achieving teams. Take your career farther than you thought possible. Bristol Myers Squibb recognizes the importance of balance and flexibility in our work environment. We offer a wide variety of competitive benefits, services and programs that provide our employees with the resources to pursue their goals, both at work and in their personal lives. Read more at careers.bms.com/working-with-us. Key Responsibilities Responsible for operational and strategic management of the software portfolio to optimize the value and cost and minimize the risk of Software and SaaS investments. Expertise reading US SW contractual agreements and managing the license entitlements and metrics throughout the lifecycle. Review Top Spend Publisher Software Products for software product spend, accuracy, and optimization. Assist in any Software Rationalization initiatives to reduce software spend. Track, maintain, and orchestrate license and maintenance agreement renewals with software owners. Manage software license information including licensing contracts, SW agreements, license metrics & SW models. Responsible for software asset management data quality. Provide data insights from various sources showing license entitlements, installations, usage, and renewals. Troubleshoots the ServiceNow SAMPro Module for up-to-date normalization, discovery, completeness, and entitlement assurance. Collect and maintain accurate Software Licensing information in repositories to address budgeting, software compliance & inventory, contracts, and cost. Partners with the Software Owners and IT Software Sourcing and Procurement team during software publisher renewals, true-ups, reconciliations, and audits, as well as assisting with dispute resolution and defense initiatives. Advisor to Software Owners to manage enterprise License true-ups & reconciliations. Provide support during software publisher audits. Assist with dispute resolution. Recommend audit defense initiatives. Develop and maintain SAM metrics and KPIs to measure the effectiveness of software asset management capabilities and identify areas for improvement. Actively assesses risk and cost reduction opportunities and makes recommendations to Software Owners and Software Asset Management leadership to optimize the software asset portfolio. Manage the ServiceNow Content library with publisher part number library requests. Expand SaaS software subscription usage visibility by integrating with SaaS provider portals. Perform reconciliations to prove the accuracy of the integrations and confirm with Software Product Owners. Partners with BMS Software Owners and IT Software Sourcing & Procurement teams to ensure proactive asset management of software procured in the US with US contractual agreements. Critically evaluates and interprets current trends. Contributes to vision for functional / regional / departmental strategy. Employs a broad knowledge base of technologies and approaches to solve complex and novel problems.
Recommends courses of action to achieve desired results. Create, update, and maintain Demand records for Software and SaaS (Software-as-a-Service) assets and licenses in ServiceNow in order to facilitate Budgeting and Projection exercises. Review Software & SaaS purchasing requests and contracting activities including contract processing and compliance, purchasing list management, requisition, purchase order, and invoice processing. Route Software & SaaS requests and contracts for appropriate processing and approvals, ensuring compliance with BMS policies and procedures. Coordinate and facilitate communications between stakeholders, Legal, Global Procurement, Finance and Service Providers. Provide subject matter expertise and guidance on the processes for Software & SaaS, contracting and orders. Perform administrative tasks necessary to support the Software/SaaS purchase request and contracting processes. Qualifications & Experience Strong understanding of Software License Management. Requires deep expertise in software licensing and software asset management functions. Demonstrated commitment to customer experience and success - ability to simplify experiences and deliver outcomes for the business and your customers. Partner with subject matter experts, including software owners, ServiceNow administrators, sourcing team members, project and program managers, financial managers, and engineers to obtain critical information required for the management of software. Demonstrated growth mindset with a willingness to learn, adapt, embrace feedback, and continuously improve. Partners with stakeholders & customers to shape the goals and objectives. Strong understanding of ITIL, ITSM processes and ServiceNow platform capabilities. SAM certifications such as IAITAM and Microsoft licensing certifications. Influences internal and external stakeholders to ensure operational decisions and business requirements have a positive impact on the function and BMS. Directs external vendors tactically, providing some strategic input to vendors on services delivered. Recommends pursuing actions based on impact on people, process, technology, structure, and/or workflow. Initiates challenging opportunities that build strong capabilities for self and team. Develops and implements proactive approaches to new technologies and processes. If you come across a role that intrigues you but doesn't perfectly line up with your resume, we encourage you to apply anyway. You could be one step away from work that will transform your life and career. Uniquely Interesting Work, Life-changing Careers With a single vision as inspiring as Transforming patients' lives through science™, every BMS employee plays an integral role in work that goes far beyond ordinary. Each of us is empowered to apply our individual talents and unique perspectives in a supportive culture, promoting global participation in clinical trials, while our shared values of passion, innovation, urgency, accountability, inclusion and integrity bring out the highest potential of each of our colleagues. On-site Protocol BMS has an occupancy structure that determines where an employee is required to conduct their work. This structure includes site-essential, site-by-design, field-based and remote-by-design jobs. The occupancy type that you are assigned is determined by the nature and responsibilities of your role. Site-essential roles require 100% of shifts onsite at your assigned facility.
Site-by-design roles may be eligible for a hybrid work model with at least 50% onsite at your assigned facility. For these roles, onsite presence is considered an essential job function and is critical to collaboration, innovation, productivity, and a positive Company culture. For field-based and remote-by-design roles, the ability to physically travel to visit customers, patients or business partners and to attend meetings on behalf of BMS as directed is an essential job function. BMS is dedicated to ensuring that people with disabilities can excel through a transparent recruitment process, reasonable workplace accommodations/adjustments and ongoing support in their roles. Applicants can request a reasonable workplace accommodation/adjustment prior to accepting a job offer. If you require reasonable accommodations/adjustments in completing this application, or in any part of the recruitment process, direct your inquiries to adastaffingsupport@bms.com. Visit careers.bms.com/eeo-accessibility to access our complete Equal Employment Opportunity statement. BMS cares about your well-being and the well-being of our staff, customers, patients, and communities. As a result, the Company strongly recommends that all employees be fully vaccinated for Covid-19 and keep up to date with Covid-19 boosters. BMS will consider for employment qualified applicants with arrest and conviction records, pursuant to applicable laws in your area. If you live in or expect to work from Los Angeles County if hired for this position, please visit this page for important additional information: https://careers.bms.com/california-residents/. Any data processed in connection with role applications will be treated in accordance with applicable data privacy policies and regulations.

Posted 2 weeks ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Greetings from Zensar Technologies, Hyderabad!!! Office Location : DLF City, Hyderabad Mode : 5 days work from office every week We are hiring for HRBP roles. What's this role about? This is a People Partner role for the Hyderabad location. Assist in monitoring/tracking employee relations issues, including resolution and follow-up. Respond to questions, requests, and concerns from employees and management regarding company and Human Resources programs, policies, and guidelines. Here's how you'll contribute: Manage and ensure the smooth functioning of the day-to-day operations of the HR function. Review and implement the HR policies, procedures, and programs across the associate life cycle. This includes Associate Onboarding and Induction, Performance Management, Reporting and local legislation as applicable, Learning and Development initiative implementation, Compensation, Benefits and Policy administration, Rewards and Recognition, implementation of the Associate engagement framework, Associate retention, Associate communication, monthly MIS and reporting to Business, and Associate safety, welfare, wellness, and health. Evaluate the challenges inherent in emerging business needs and propose alternatives/solutions to the unit/HR leadership. Drive the performance management system to align to business needs in addition to people needs; align goal setting to business outcomes and people drivers. Align normalization to reflect business performance, and guide rating/evaluation to assess potential for growth. Design an HR dashboard to capture and highlight key statistics on a periodic basis as required by the unit, in order to communicate the people health of the unit regularly to stakeholders. Guide and work with the business and associates in driving Associate engagement and development initiatives in alignment with business outcomes. Provide inputs to design Associate engagement practices which create a positive work environment and help associates perform. Monitor effectiveness and bring improvements; understand business requirements for processes and the implementation of policies and procedures. Design and drive extensive and continuous communication channels to drive organization values. Ensure a robust communication model to ensure maximum connect of associates with the management and dissemination of information.

Posted 2 weeks ago

Apply

3.0 - 5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Summary: A Business Intelligence Data Analyst II is responsible for performing analysis for any requests/questions that require research within the Data Domain at Verisys. A Business Intelligence Data Analyst supports short- and long-term operational/strategic business activities by performing analysis and making recommendations on data improvements. Duties/Responsibilities: Working collaboratively with all internal stakeholders to meet client needs. Implement and deliver on a broad range of projects and priorities, including identifying the most efficient solution and quality review. Work with other teams within the Data Domain to troubleshoot issues and answer questions relative to client needs. Document pertinent processes and research outcomes. Functions may include database architecture, engineering, design, optimization, security, and administration; as well as data modeling, big data development, Extract, Transform, and Load (ETL) development, storage engineering, data warehousing, data provisioning and other similar roles. Performs other duties as assigned. Leverage analytics and quantitative methods to inform and influence decision-making; foster a data-driven, test-and-learn culture through analytics, experimentation, and in-depth research into data quality and availability. Influence and build relationships with people across all levels of the organization. Responsible for gathering, structuring, and analyzing data and providing recommendations to management; presents the results of data analyses and recommendations to management and process owners. Monitor assigned client workflow deliveries including remediating errors, schedule changes, and escalation of issues. May design schemas, write SQL or other data markup scripting, and help to support the development of Analytics and Applications that build on top of data. Performs analysis/research/work within the DBTask Team. Required Skills/Abilities: Excellent verbal and written communication skills. Excellent organizational skills. Self-directed and highly motivated. Ability to quickly learn new technologies. Ability to solve complex challenges and operate in a fast-paced team environment. Experience with Data Cleansing and Normalization. Strong SQL skills and experience working with relational databases, as well as working familiarity with a variety of databases. Experience in Web-services/API. Education/Experience: Required: Undergraduate degree or a minimum of 3-5 years of related professional work experience. Preferred: Knowledge of the Healthcare domain. Strong understanding of and experience with ETL processes and ETL tools. Experience with cloud technologies such as Kubernetes, AWS, RDS. Experience with G-Suite. Verisys transforms provider data, workforce data, and relationship management. More than 400 healthcare, life science, and background screening organizations depend on us to credential providers, improve data quality, publish compliant provider directories, and conduct employment verifications. Our comprehensive solutions deliver accurate and secure information. As a result, we’re the largest outsourced credentials verification organization in the United States. Since we’ve partnered with the most complex institutions in healthcare for decades, we can help organizations of any size discover their true potential. At Verisys, you can have a rewarding career on every level.
In addition to challenging and meaningful work, you will have the chance to give back to your community, make a positive impact on the environment, participate in a range of diversity and inclusion initiatives, and find the support, coaching, and training it takes to advance your career. Our commitment to individual choice lets you customize aspects of your career path, your educational opportunities, and your benefits. And our culture of innovation means your ideas on how to improve our business and our clients will be heard.

Posted 2 weeks ago

Apply

0.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Ready to build the future with AI? At Genpact, we don't just keep up with technology; we set the pace. AI and digital innovation are redefining industries, and we're leading the charge. Our industry-first accelerator is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models onwards, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, innovation-driven environment, love building and deploying cutting-edge AI solutions, and want to push the boundaries of what's possible, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Inviting applications for the role of Lead Consultant - Netezza, SQL Server. Strong understanding of database design, normalization, and data modeling. Responsibilities Hands-on experience with Netezza and SQL Server. Proficient in writing complex queries, procedures, and other DB objects. Strong understanding of database design, normalization, and data modeling. Experience with query performance tuning, index optimization, and execution plan analysis. Qualifications we seek in you! Minimum Qualifications / Skills B.E / B.Tech / Any Graduation Equivalent Preferred Qualifications / Skills SQL Server, Netezza Why join Genpact Lead AI-first transformation - Build and scale AI solutions that redefine industries Make an impact - Drive change for global enterprises and solve business challenges that matter Accelerate your career - Gain hands-on experience, world-class training, mentorship, and AI certifications to advance your skills Grow with the best - Learn from top engineers, data scientists, and AI experts in a dynamic, fast-moving workplace Committed to ethical AI - Work in an environment where governance, transparency, and security are at the core of everything we build Thrive in a values-driven culture - Our courage, curiosity, and incisiveness - built on a foundation of integrity and inclusion - allow your ideas to fuel progress Come join the 140,000+ coders, tech shapers, and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together. Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Posted 2 weeks ago

Apply

2.0 - 5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

hackajob is collaborating with Comcast to connect them with exceptional tech professionals for this role. Comcast brings together the best in media and technology. We drive innovation to create the world's best entertainment and online experiences. As a Fortune 50 leader, we set the pace in a variety of innovative and fascinating businesses and create career opportunities across a wide range of locations and disciplines. We are at the forefront of change and move at an amazing pace, thanks to our remarkable people, who bring cutting-edge products and services to life for millions of customers every day. If you share in our passion for teamwork, our vision to revolutionize industries and our goal to lead the future in media and technology, we want you to fast-forward your career at Comcast. Job Summary Responsible for planning and designing new software and web applications. Edits new and existing applications. Implements, tests, and debugs defined software components. Documents all development activity. Works with moderate guidance in own area of knowledge. Job Description Key Skills: Advanced SQL (MySQL, Presto, Oracle, etc.) Data Modeling (Normalization and Denormalization) ETL Tools (Talend, Pentaho, Informatica, and creation of custom ETL scripts) Big Data Technologies (Hadoop, Spark, Hive, Kafka, etc.) Data Warehousing (AWS, BigQuery, etc.) Reporting (Tableau, Power BI) Core Responsibilities This data-focused role is expected to leverage these skills to design and implement robust data solutions, and also plays a key role in mentoring junior team members and ensuring the quality and efficiency of data processes. Skills in data visualization tools like Tableau and Power BI; knowledge of data quality principles is good to have. Analyzes and determines integration needs. Evaluates and plans software designs, test results and technical manuals. Reviews literature, patents and current practices relevant to the solution of assigned projects. Programs new software, web applications and supports new applications under development and the customization of current applications. Edits and reviews technical requirements documentation. Works with Quality Assurance team to determine if applications fit specification and technical requirements. Displays knowledge of engineering methodologies, concepts, skills and their application in the area of specified engineering specialty. Displays knowledge of, and ability to apply, process design and redesign skills. Displays in-depth knowledge of, and ability to apply, project management skills. Consistent exercise of independent judgment and discretion in matters of significance. Regular, consistent and punctual attendance. Must be able to work nights and weekends, variable schedule(s) and overtime as necessary. Other duties and responsibilities as assigned. Employees At All Levels Are Expected To Understand our Operating Principles; make them the guidelines for how you do your job. Own the customer experience - think and act in ways that put our customers first, give them seamless digital options at every touchpoint, and make them promoters of our products and services. Know your stuff - be enthusiastic learners, users and advocates of our game-changing technology, products and services, especially our digital tools and experiences. Win as a team - make big things happen by working together and being open to new ideas.
Be an active part of the Net Promoter System - a way of working that brings more employee and customer feedback into the company - by joining huddles, making call backs and helping us elevate opportunities to do better for our customers. Drive results and growth. Respect and promote inclusion & diversity. Do what's right for each other, our customers, investors and our communities. Disclaimer This information has been designed to indicate the general nature and level of work performed by employees in this role. It is not designed to contain or be interpreted as a comprehensive inventory of all duties, responsibilities and qualifications. Comcast is proud to be an equal opportunity workplace. We will consider all qualified applicants for employment without regard to race, color, religion, age, sex, sexual orientation, gender identity, national origin, disability, veteran status, genetic information, or any other basis protected by applicable law. Base pay is one part of the Total Rewards that Comcast provides to compensate and recognize employees for their work. Most sales positions are eligible for a Commission under the terms of an applicable plan, while most non-sales positions are eligible for a Bonus. Additionally, Comcast provides best-in-class Benefits to eligible employees. We believe that benefits should connect you to the support you need when it matters most, and should help you care for those who matter most. That’s why we provide an array of options, expert guidance and always-on tools, that are personalized to meet the needs of your reality - to help support you physically, financially and emotionally through the big milestones and in your everyday life. Education Bachelor's Degree While possessing the stated degree is preferred, Comcast also may consider applicants who hold some combination of coursework and experience, or who have extensive related professional experience. Relevant Work Experience 2-5 Years
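
A concrete, hedged illustration of the normalization versus denormalization aspect of data modeling listed in the posting above: the short Python sketch below (standard library sqlite3 only; schema and rows are invented) loads a denormalized feed in which customer details repeat on every order row, splits it into separate customers and orders tables, and then reproduces the original shape with a join.

```python
# Sketch: splitting a denormalized feed into normalized tables (hypothetical schema).
import sqlite3

denormalized_rows = [
    # (order_id, customer_name, customer_city, amount) - customer details repeat per order
    (1, "Asha", "Chennai", 250.0),
    (2, "Asha", "Chennai", 120.0),
    (3, "Vikram", "Mumbai", 980.0),
]

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE customers (customer_id INTEGER PRIMARY KEY, name TEXT UNIQUE, city TEXT)")
cur.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, "
            "customer_id INTEGER REFERENCES customers(customer_id), amount REAL)")

for order_id, name, city, amount in denormalized_rows:
    # Insert each customer once; later orders reuse the existing row.
    cur.execute("INSERT OR IGNORE INTO customers (name, city) VALUES (?, ?)", (name, city))
    customer_id = cur.execute("SELECT customer_id FROM customers WHERE name = ?", (name,)).fetchone()[0]
    cur.execute("INSERT INTO orders (order_id, customer_id, amount) VALUES (?, ?, ?)",
                (order_id, customer_id, amount))

# Joining the normalized tables reproduces the original denormalized view.
for row in cur.execute("SELECT o.order_id, c.name, c.city, o.amount "
                       "FROM orders o JOIN customers c USING (customer_id)"):
    print(row)
conn.close()
```

Denormalizing is the reverse trade-off: duplicating attributes back onto the fact rows to avoid joins when read performance matters more than storage and update consistency.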

Posted 2 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Key Responsibilities: Design, develop and maintain complex PL/SQL code and stored procedures to support business logic and data processing requirements. Develop and maintain data models for new and existing applications, ensuring scalability, flexibility and integrity. Perform database performance tuning and query optimization to enhance system efficiency. Ensure data quality standards are met through robust validation, transformation and cleansing processes. Administer and support Oracle databases, ensuring high availability, backup and recovery and proactive monitoring. Provide database maintenance and support including patching, upgrades and troubleshooting production issues. Monitor system health and implement proactive alerts and diagnostics for performance and capacity management. Support and implement solutions using NoSQL databases (e.g. MongoDB, Cassandra) where appropriate for specific use cases. Collaborate with development teams to ensure code quality and best practices in PL/SQL development. Develop and maintain documentation related to data architecture, data flow and database processes. Required Skills and Qualifications: 8-12 years of experience in Oracle and MongoDB database development and administration. Strong proficiency in PL/SQL, SQL, indexes, triggers, sequences, procedures, functions and Oracle database internals. Proven experience with data modeling (logical and physical), normalization and schema design. Expertise in performance tuning, query optimization and indexing strategies. Hands-on experience with relational databases (Oracle, PostgreSQL, MySQL) and NoSQL databases (e.g. MongoDB, Cassandra). Solid understanding of database monitoring tools and tuning methodologies. Experience in data quality management practices and tools. Strong exposure to code quality standards and tools for PL/SQL development (e.g. TOAD, SQL Developer, SonarQube). Knowledge of backup and recovery strategies, high availability and disaster recovery planning. Familiarity with DevOps practices, including database CI/CD integration. Familiarity with Liquibase for database versioning and deployment automation. Preferred Skills: Experience in cloud-based databases (e.g. Oracle Cloud, AWS RDS, Azure SQL). Exposure to ETL tools and data integration platforms. Familiarity with regulatory compliance standards (e.g. GDPR, HIPAA) related to data management. Soft Skills: Strong analytical and problem-solving skills. Excellent communication and documentation abilities. Ability to work independently and as part of a cross-functional team. Strong attention to detail and commitment to data integrity. Education: Bachelor’s degree/University degree or equivalent experience ------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Applications Development ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Most Relevant Skills Please see the requirements listed above. ------------------------------------------------------ Other Relevant Skills For complementary skills, please see above and/or contact the recruiter.
------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
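
The posting above is Oracle- and PL/SQL-centric; as a lightweight stand-in, the sketch below uses Python's built-in sqlite3 module to show the general trigger idea it mentions: a row-level trigger that writes an audit record whenever a balance changes. The table names and audit columns are hypothetical, and Oracle trigger syntax differs in detail.

```python
# Sketch of an audit trigger, using SQLite as a stand-in for Oracle PL/SQL.
# Table and column names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL NOT NULL)")
cur.execute("""
    CREATE TABLE accounts_audit (
        audit_id    INTEGER PRIMARY KEY AUTOINCREMENT,
        account_id  INTEGER,
        old_balance REAL,
        new_balance REAL,
        changed_at  TEXT DEFAULT CURRENT_TIMESTAMP
    )
""")
# Row-level trigger: every balance update writes an audit record.
cur.execute("""
    CREATE TRIGGER trg_accounts_audit
    AFTER UPDATE OF balance ON accounts
    BEGIN
        INSERT INTO accounts_audit (account_id, old_balance, new_balance)
        VALUES (OLD.id, OLD.balance, NEW.balance);
    END
""")

cur.execute("INSERT INTO accounts (id, balance) VALUES (1, 100.0)")
cur.execute("UPDATE accounts SET balance = 175.0 WHERE id = 1")

print(cur.execute("SELECT account_id, old_balance, new_balance FROM accounts_audit").fetchall())
conn.close()
```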

Posted 2 weeks ago

Apply

4.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

DXFactor is a US-based tech company working with customers across the globe. We are a Great Place to Work certified company. We are looking for candidates for Data Engineer (4 to 6 years of experience). We have our presence in: US, India (Ahmedabad, Bangalore). Location: Ahmedabad. Website: www.DXFactor.com. Designation: Data Engineer (Expertise in Snowflake, AWS & Python) Key Responsibilities Design, develop, and maintain scalable data pipelines for batch and streaming workflows Implement robust ETL/ELT processes to extract data from various sources and load into data warehouses Build and optimize database schemas following best practices in normalization and indexing Create and maintain documentation for data flows, pipelines, and processes Collaborate with cross-functional teams to translate business requirements into technical solutions Monitor and troubleshoot data pipelines to ensure optimal performance Implement data quality checks and validation processes Build and maintain CI/CD workflows for data engineering projects Stay current with emerging technologies and recommend improvements to existing systems Requirements Bachelor's degree in Computer Science, Information Technology, or related field Minimum 4+ years of experience in data engineering roles Strong proficiency in Python programming and SQL query writing Hands-on experience with relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra) Experience with data warehousing technologies (e.g., Snowflake, Redshift, BigQuery) Proven track record in building efficient and scalable data pipelines Practical knowledge of batch and streaming data processing approaches Experience implementing data validation, quality checks, and error handling mechanisms Working experience with cloud platforms, particularly AWS (S3, EMR, Glue, Lambda, Redshift) and/or Azure (Data Factory, Databricks, HDInsight) Understanding of different data architectures including data lakes, data warehouses, and data mesh Demonstrated ability to debug complex data flows and optimize underperforming pipelines Strong documentation skills and ability to communicate technical concepts effectively
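
Since the responsibilities above include implementing data quality checks and validation processes, here is a small, hedged Python sketch of the kind of pre-load validation that is often meant; the field names, rules, and sample batch are all hypothetical.

```python
# Sketch: simple data quality checks (missing values, duplicate keys, range rule)
# on a batch of records before loading to a warehouse. Names and rules are hypothetical.
from collections import Counter

def validate_batch(rows, required_fields=("order_id", "customer_id", "amount")):
    issues = []

    # Missing or empty required fields.
    for i, row in enumerate(rows):
        for field in required_fields:
            if row.get(field) in (None, ""):
                issues.append(f"row {i}: missing {field}")

    # Duplicate primary keys.
    key_counts = Counter(row.get("order_id") for row in rows)
    for key, count in key_counts.items():
        if key is not None and count > 1:
            issues.append(f"duplicate order_id {key} appears {count} times")

    # Simple range rule.
    for i, row in enumerate(rows):
        amount = row.get("amount")
        if isinstance(amount, (int, float)) and amount < 0:
            issues.append(f"row {i}: negative amount {amount}")

    return issues

batch = [
    {"order_id": 1, "customer_id": 10, "amount": 99.5},
    {"order_id": 1, "customer_id": 11, "amount": -5.0},    # duplicate key, negative amount
    {"order_id": 2, "customer_id": None, "amount": 20.0},  # missing customer_id
]
print(validate_batch(batch))
```

In a real pipeline these checks would typically run as an orchestrated step that either quarantines bad rows or fails the load, depending on policy.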

Posted 2 weeks ago

Apply

9.0 years

0 Lacs

Mundra, Gujarat, India

On-site

About Business JOB DESCRIPTION Adani Group: Adani Group is a diversified organisation in India comprising 10 publicly traded companies. It has created a world-class logistics and utility infrastructure portfolio that has a pan-India presence. Adani Group is headquartered in Ahmedabad, in the state of Gujarat, India. Over the years, Adani Group has positioned itself to be the market leader in its logistics and energy businesses, focusing on large-scale infrastructure development in India with O&M practices benchmarked to global standards. With four IG-rated businesses, it is the only Infrastructure Investment Grade issuer in India. Adani Power Limited (APL): Adani Power Limited (APL), a part of the diversified Adani Group, is the largest private thermal power producer in India. We have a power generation capacity of 15,250 MW comprising thermal power plants in Gujarat, Maharashtra, Karnataka, Rajasthan, Chhattisgarh, Madhya Pradesh, and Jharkhand, and a 40 MW solar power project in Gujarat. Job Purpose: This role is responsible for monitoring and operating plant systems, including the Boiler, Turbine, Generator, BOP, switchyard, and transformer areas, ensuring their optimal performance and executing necessary start-up and shut-down procedures. This role also performs and monitors Flue Gas Desulfurization (FGD) operations, maintaining equipment, promoting energy conservation, and ensuring environmental compliance. Responsibilities (Control Room Engineer / Lead Desk Engineer / Lead FGD / Switchyard Engineer): System Monitoring And Operations: Monitor Boiler, Turbine, Generator, BOP, and Electrical Systems to ensure optimal performance. Execute start-up and shut-down operations of the plant systems, as required. Maintain close observation of plant parameters and respond promptly to any deviations. Perform routine changeover of equipment and trial of emergency drives. Emergency Response And Equipment Handling: Handle plant emergencies related to Boiler-Turbine-Generator (BTG) and coordinate responses. Manage emergencies related to switchyards and grid problems to minimize impact on operations. Ensure safe isolation and normalization of equipment in response to operational needs or emergencies. Perform Flue Gas Desulfurization (FGD) operations and monitor FGD system parameters. Maintain all plant parameter logbooks accurately and up-to-date. Energy Conservation And Environmental Compliance: Promote energy conservation in all activities, focusing on Specific Oil Consumption (SOC), Auxiliary Power Consumption (APC), Demineralized (DM) Water usage, and Heat Rate. Monitor critical chemistry parameters to ensure environmental compliance and operational efficiency. Implement and oversee FGD operations to reduce emissions and comply with environmental regulations. Business Sustainability: Ensure adherence to IMS, AWMS, DISHA, CHETNA guidelines within the department. Maintain safety of personnel and equipment through proper training and adherence to safety protocols. Adhere to Permit to Work (PTW) systems and Standard Operating Procedures (SOPs). Notify and report defects and problems in the plant to the shift in-charge in a timely manner. Digitization And Automation: Execute comprehensive digitization strategies to optimize operational efficiency. Implement automation solutions to support overall organizational goals/strategy. Implement process and system improvements, adopting newer technologies and innovative ideas.
Key Stakeholders - Internal: Maintenance Engineers. Key Stakeholders - External: NA. Qualifications: Educational Qualification: BE/B.Tech in Mechanical or Electrical Engineering or a related field from a recognized institution. Work Experience (Range Of Years): Minimum of 9+ years of experience in power plant operations with a focus on thermal power plants. Preferred Industry: Experience in the power generation industry, specifically with thermal power plants, is highly preferred.

Posted 2 weeks ago

Apply

3.0 years

7 - 9 Lacs

Gurgaon

On-site

Who We Are Boston Consulting Group partners with leaders in business and society to tackle their most important challenges and capture their greatest opportunities. BCG was the pioneer in business strategy when it was founded in 1963. Today, we help clients with total transformation: inspiring complex change, enabling organizations to grow, building competitive advantage, and driving bottom-line impact. To succeed, organizations must blend digital and human capabilities. Our diverse, global teams bring deep industry and functional expertise and a range of perspectives to spark change. BCG delivers solutions through leading-edge management consulting along with technology and design, corporate and digital ventures, and business purpose. We work in a uniquely collaborative model across the firm and throughout all levels of the client organization, generating results that allow our clients to thrive. What You'll Do As a Data Engineer, you will play a crucial role in designing, building, and maintaining the data infrastructure and systems required for efficient and reliable data processing. You will collaborate with cross-functional teams, including data scientists and analysts, to ensure the availability, integrity, and accessibility of data for various business needs. This role requires a strong understanding of data management principles, database technologies, data integration, and data warehousing concepts. Key Responsibilities Develop and maintain data warehouse solutions, including data modeling, schema design, and indexing strategies Optimize data processing workflows for improved performance, reliability, and scalability Identify and integrate diverse data sources, both internal and external, into a centralized data platform Implement and manage data lakes, data marts, or other storage solutions as required Ensure data privacy and compliance with relevant data protection regulations Define and implement data governance policies, standards, and best practices Transform raw data into usable formats for analytics, reporting, and machine learning purposes Perform data cleansing, normalization, aggregation, and enrichment operations to enhance data quality and usability Collaborate with data analysts and data scientists to understand data requirements and implement appropriate data transformations What You'll Bring Bachelor's or Master's degree in Computer Science, Data Science, Information Systems, or a related field Proficiency in SQL and experience with relational databases (e.g., Snowflake, MySQL, PostgreSQL, Oracle) 3+ years of experience in data engineering or a similar role Hands-on programming skills in languages such as Python or Java are a plus Familiarity with cloud-based data platforms (e.g., AWS, Azure, GCP) and related services (e.g., S3, Redshift, BigQuery) is good to have Knowledge of data modeling and database design principles Familiarity with data visualization tools (e.g., Tableau, Power BI) is a plus Strong problem-solving and analytical skills with attention to detail Experience with HR data analysis and HR domain knowledge is preferred Who You'll Work With As part of the People Analytics team, you will modernize HR platforms, capabilities & engagement, automate/digitize core HR processes and operations and enable greater efficiency. You will collaborate with the global people team and colleagues across BCG to manage the life cycle of all BCG employees.
The People Management Team (PMT) is comprised of several centers of expertise including HR Operations, People Analytics, Career Development, Learning & Development, Talent Acquisition & Branding, Compensation, and Mobility. Our centers of expertise work together to build out new teams and capabilities by sourcing, acquiring and retaining the best, diverse talent for BCG’s Global Services Business. We develop talent and capabilities, while enhancing managers’ effectiveness, and building affiliation and engagement in our new global offices. The PMT also harmonizes process efficiencies, automation, and global standardization. Through analytics and digitalization, we are always looking to expand our PMT capabilities and coverage Boston Consulting Group is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, age, religion, sex, sexual orientation, gender identity / expression, national origin, disability, protected veteran status, or any other characteristic protected under national, provincial, or local law, where applicable, and those with criminal histories will be considered in a manner consistent with applicable state and local laws. BCG is an E - Verify Employer. Click here for more information on E-Verify.
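
As a hedged illustration of the data cleansing, normalization, and aggregation responsibilities described in the posting above, the sketch below assumes pandas is available; the column names and sample values are invented.

```python
# Sketch: cleanse and normalize a small HR-style dataset, then aggregate it.
# Column names and data are hypothetical; assumes pandas is installed.
import pandas as pd

raw = pd.DataFrame({
    "employee_id": [101, 101, 102, 103],
    "office":      [" london ", "London", "MUMBAI", "mumbai "],
    "hours":       [38, 38, 41, 36],
})

# Cleanse and normalize: trim whitespace, standardize casing, drop exact duplicates.
clean = raw.copy()
clean["office"] = clean["office"].str.strip().str.title()
clean = clean.drop_duplicates()

# Aggregate to a reporting-friendly shape.
summary = clean.groupby("office", as_index=False).agg(
    headcount=("employee_id", "nunique"),
    avg_hours=("hours", "mean"),
)
print(summary)
```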

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About The Role Grade Level (for internal use): 10 The Role: Senior Scrum Master The Team The team is focused on agile product development offering insights into global capital markets and the financial services industry. This is an opportunity to be a pivotal part of our fast-growing global organization during an exciting phase in our company's evolution. The Impact The Senior Scrum Master plays a crucial role in driving Agile transformation within the technology team. By facilitating efficient processes and fostering a culture of continuous improvement, this role directly contributes to the successful delivery of projects and enhances the overall team performance. What’s In It For You Opportunity to lead and drive Agile transformation within a leading global organization. Engage with a dynamic team committed to delivering high-quality solutions. Access to professional development and growth opportunities within S&P Global. Work in a collaborative and innovative environment that values continuous improvement. Responsibilities And Impact Facilitate Agile ceremonies such as sprint planning, daily stand-ups, retrospectives, and reviews. Act as a servant leader to the Agile team, guiding them towards continuous improvement and effective delivery. Manage scope changes, risks, and escalate issues as needed, coordinating testing efforts and assisting scrum teams with technical transitions. Support the team in defining and achieving sprint goals and objectives. Foster a culture of collaboration and transparency within the team and across stakeholders. Encourage and support the development of team members, mentoring them in Agile best practices. Conduct data analysis and create and interpret metrics for team performance tracking and improvement. Conduct business analysis and requirement gathering sessions to align database solutions with stakeholder needs. Collaborate with stakeholders to help translate business requirements into technical specifications. Ensure adherence to Agile best practices and participate in Scrum events. Lead initiatives to improve team efficiency and effectiveness in project delivery. What We’re Looking For Basic Required Qualifications: Bachelor's degree in a relevant field or equivalent work experience. Minimum of 5-9 years of experience in a Scrum Master role, preferably within a technology team. Strong understanding of Agile methodologies, particularly Scrum and Kanban. Excellent communication and interpersonal skills. Proficiency in business analysis: Experience in gathering and analyzing business requirements, translating them into technical specifications, and collaborating with stakeholders to ensure alignment between business needs and database solutions. Requirement gathering expertise: Ability to conduct stakeholder interviews, workshops, and requirements gathering sessions to elicit, prioritize, and document business requirements related to database functionality and performance. Basic understanding of SQL queries: Ability to comprehend and analyze existing SQL queries to identify areas for performance improvement. Fundamental understanding of database structure: Awareness of database concepts including normalization, indexing, and schema design to assess query performance. Additional Preferred Qualifications Certified Scrum Master (CSM) or similar Agile certification. Experience with Agile tools such as Azure DevOps, JIRA, or Trello. Proven ability to lead and influence teams in a dynamic environment. 
Familiarity with software development lifecycle (SDLC) and cloud platforms like AWS, Azure, or Google Cloud. Experience in project management and stakeholder engagement. Experience leveraging AI tools to support requirements elicitation, user story creation and refinement, agile event facilitation, and continuous improvement through data-driven insights. About S&P Global Market Intelligence At S&P Global Market Intelligence, a division of S&P Global we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence. What’s In It For You? Our Purpose Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our Benefits Include Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. 
Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring And Opportunity At S&P Global At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. Recruitment Fraud Alert If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, “pre-employment training” or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here. Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf IFTECH202.1 - Middle Professional Tier I (EEO Job Group) Job ID: 316176 Posted On: 2025-06-25 Location: Hyderabad, Telangana, India

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Mission: Within the Global Business Unit Renewables, we are seeking a technically proficient Digital Twin & SCADA Integration Engineer to lead the automation and integration of SCADA data into our digital twin platform. This role is responsible for extracting and validating SCADA tags of Renewable Energy sites (Wind, PV, BESS), and automating the creation of digital twin representations with Azure-based solutions. The ideal candidate will be skilled in industrial automation, cloud integrations, and asset mapping. This position is crucial for our ambition to enhance operational efficiency and improve data accuracy across our renewable energy portfolio. Responsibilities: The scope of the role includes, but is not limited to, the following functional areas: Data Standardization & Validation: Define and implement data collection standards based on established templates (e.g., Darwin’s RGBU governance fields). Develop and enforce validation routines to detect missing values, duplicates, and data inconsistencies. SCADA Data Extraction: Normalize vendor-specific naming conventions and automate the export of tags to structured databases (Azure SQL or Azure Storage). Digital Twin Creation: Extract asset hierarchies from validated data (Substations, Feeders, ITS, Inverters, Array Boxes, Strings, Modules). Deliver digital twin structured data in the agreed format to enable its creation in our digital platform. Maintain traceability by storing mapping data in a centralized repository. Collaborate with cross-functional teams to build end-to-end data pipelines that feed into the digital twin platform using Azure Cloud services (Data Factory, Azure Functions, and REST APIs). Monitoring & Troubleshooting: Implement robust error handling and logging mechanisms to monitor data flows and system performance. Troubleshoot integration issues, ensuring continuous operation and high data quality. Continuous Improvement: Research and stay up to date with emerging trends and technologies in site digitization and digital transformation. Propose and implement improvements to existing digitization processes. Interfaces: R-GBU HQ; countries' IT/OT teams; countries' Construction and O&M teams; OEM representatives; internal development and IT teams. Qualifications: Bachelor’s or Master’s degree in Computer Science, Electrical Engineering, Industrial Automation, or a related field. Experience: 3+ years of hands-on experience in industrial automation, SCADA systems, or digital twin technologies. Demonstrated expertise in OPC UA integration, data extraction, and cloud-based data pipelines. Experience with digital twin platforms and familiarity with Digital Twins Definition Language (DTDL) is a plus. Technical Skills: Proficiency in programming languages such as Python, C#, or JavaScript. Strong knowledge of Azure services including Data Factory, SQL Database, Azure Storage, and IoT Hub. Solid understanding of RESTful API integrations and data normalization techniques. Business Skills: Excellent communication and interpersonal skills, with the ability to convey complex technical information to non-technical stakeholders. Strong problem-solving skills and attention to detail. Ability to work independently and as part of a team in a fast-paced environment. Behavioral skills: Strategic thinking and attention to detail. Ability to adapt to new technologies and processes. Strong collaboration and teamwork mindset. Proven ability to manage multiple projects simultaneously. Commitment to continuous learning and process optimization.
Preferred qualifications: Experience in the renewable energy sector, particularly with PV site operations. Familiarity with industrial data security and compliance best practices. Languages: Excellent command of and fluency in English; other languages are a plus. Business Unit: T&G. Division: T&G AMEA - India. Legal Entity: ENGIE Energy India Private Limited. Professional Experience: Skilled (>3 years and <15 years of experience). Education Level: Technical Qualification.
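
The SCADA data extraction duties above call for normalizing vendor-specific tag naming conventions and validating the result before export; the following is a minimal Python sketch of that idea. The naming patterns, target convention, and sample tags are all hypothetical, and a production version would be driven by per-vendor configuration rather than a single regex.

```python
# Sketch: normalize vendor-specific SCADA tag names to a common "ASSET.SIGNAL"
# convention and flag unparseable or duplicate tags. Patterns are hypothetical.
import re

TAG_PATTERN = re.compile(r"^(?P<asset>[A-Z]+\d+)[_\-\.](?P<signal>.+)$")

def normalize_tag(raw_tag):
    match = TAG_PATTERN.match(raw_tag.strip())
    if not match:
        return None  # unrecognized naming convention
    asset = match.group("asset").upper()
    signal = re.sub(r"[\.\-\s]+", "_", match.group("signal")).upper()
    return f"{asset}.{signal}"

def validate_tags(raw_tags):
    normalized, errors, seen = [], [], set()
    for raw in raw_tags:
        tag = normalize_tag(raw) if raw else None
        if tag is None:
            errors.append(f"could not normalize: {raw!r}")
        elif tag in seen:
            errors.append(f"duplicate after normalization: {tag}")
        else:
            seen.add(tag)
            normalized.append(tag)
    return normalized, errors

tags, errors = validate_tags(["WTG01_GEN_SPD", "INV03.DC.Power", "WTG01-GEN.SPD", ""])
print(tags)    # ['WTG01.GEN_SPD', 'INV03.DC_POWER']
print(errors)  # reports the duplicate WTG01.GEN_SPD and the empty tag
```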

Posted 2 weeks ago

Apply

6.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Description Role: Team Leader - Service Desk. Location: Pune/Bangalore.

Job Summary: Candidates with a minimum of 6 years of Service Desk experience, including at least 2 years in a front-line leadership/management role. We are looking for candidates with domain expertise in End User Support Services, skilled in technical troubleshooting and delivery operations management. Passport mandatory; a US business visa (B1) is an advantage. Years of experience needed: 5-8 years.

Technical Skills: Analytical skills; Effective Business Communication; Coaching skills; Operations Management; SLA Management; MS Office; Operational knowledge of contact center platform and ITSM tool; Performance Management skills; Conflict management skills; Capacity management; Presentation skills; Training need identification.

Technical Skills (Client): Technical Service Awareness – Intermediate; Technical Troubleshooting - Account Management/password reset – Advanced; Technical Troubleshooting - OS – Advanced; Technical Troubleshooting - End Devices – Advanced; Ticketing Tool – Advanced; MS Office – Intermediate; Contact center platform operating skills – Intermediate; Contact center platform reports – Intermediate; Networking concepts – Intermediate; Client Process Knowledge – Advanced; DMAIC Methodology – Intermediate; Client Business Awareness – Advanced; Telephone etiquette – Expert; Email etiquette – Expert; Customer service skills – Expert; Knowledge Base Navigation Skills – Advanced; Analytical skills – Intermediate; Operations Management – Advanced; SLA Management – Intermediate; Effective Business Communication – Advanced; Decision Making Skills – Advanced; Measuring Performance/Performance Management Skills – Advanced; Coaching for Success – Advanced; Motivating Others – Advanced; Conflict Management Skills – Advanced; Patience – Advanced; Managing Stress – Advanced; Positive attitude to change – Advanced; Attitude to feedback/willingness to learn – Advanced; Relating to Others – Advanced; Influencing Others – Advanced; Team Player – Advanced; Insight into the Customer's Mindset – Advanced; Solution Based Approach – Advanced; Follow Through – Advanced; Personal Credibility – Advanced; Self-Development – Intermediate; Result Focus – Intermediate; Drive to Win – Intermediate; Recognize Efforts – Advanced; Approachability – Advanced; Dealing with Fairness – Expert; Fostering Teamwork – Advanced.

Management Skills: Supervise and review Service Desk activities. Review and ensure compliance with standards like PCI, ISO, ISMS, and BCMS by facilitating audits by internal and external teams. Place hiring requests and conduct interviews. Work with HR and support groups to improve employee retention and satisfaction. Give in-person feedback to reporting agents on a daily basis regarding ticket hygiene and operational/procedural hygiene. Perform root cause analysis, tracking, and reporting of escalations and SLA misses. Attend change meetings and analyze potential impact to Service Desk operations. Handle performance appraisal and normalization. Participate in calibration and collaboration meetings with support function leads. Conduct new hire technical and account-specific training based on requirements. Create, maintain, and update the account training plan. Provide hands-on assistance to team members in case of issues, both through direct intervention and mentoring. Prepare scorecards and discuss and share feedback around improvement areas. Identify top performers and nominate them for rewards, recognition, and appreciation. Monitor ticket ageing reports and drive team members to work on ageing tickets. Perform FCR analysis to find controllable resolution errors that could have been resolved at L1.

Behavioral Skills: Good communication; positive energy; positive attitude; self-learner.

Qualification: Any Graduate. Certification: ITIL certified.

About Mphasis: Mphasis applies next-generation technology to help enterprises transform businesses globally. Customer centricity is foundational to Mphasis and is reflected in the Mphasis Front2Back™ Transformation approach. Front2Back™ uses the exponential power of cloud and cognitive to provide a hyper-personalized (C=X2C2TM=1) digital experience to clients and their end customers. Mphasis’ Service Transformation approach helps ‘shrink the core’ through the application of digital technologies across legacy environments within an enterprise, enabling businesses to stay ahead in a changing world. Mphasis’ core reference architectures and tools, speed and innovation with domain expertise and specialization are key to building strong relationships with marquee clients.

Posted 2 weeks ago

Apply

2.0 years

0 Lacs

India

Remote

About Us: At Soul AI, data is the backbone of every product we build. Our engineering teams in SF and Hyderabad rely on accurate, performant databases to power AI applications. We’re hiring a Database Analyst to manage, analyze, and maintain data pipelines and structures. Key Responsibilities: Analyze data structures and schemas for optimization. Manage indexing, partitioning, and query performance. Monitor and maintain data quality and consistency. Provide analytical support to various teams. Required Qualifications: 2+ years in database analysis or administration. Proficient in SQL and relational databases. Familiarity with normalization, constraints, and performance tuning. Why Join Us? Competitive pay (₹1200/hour) Flexible hours. Remote opportunity.

Posted 2 weeks ago

Apply