Get alerts for new jobs matching your selected skills, preferred locations, and experience range. Manage Job Alerts
0.0 - 9.0 years
0 Lacs
Hyderabad, Telangana
On-site
About the Role: Grade Level (for internal use): 10 The Role: Senior Scrum Master The Team: The team is focused on agile product development offering insights into global capital markets and the financial services industry. This is an opportunity to be a pivotal part of our fast-growing global organization during an exciting phase in our company's evolution. The Impact: The Senior Scrum Master plays a crucial role in driving Agile transformation within the technology team. By facilitating efficient processes and fostering a culture of continuous improvement, this role directly contributes to the successful delivery of projects and enhances the overall team performance. What’s in it for you: Opportunity to lead and drive Agile transformation within a leading global organization. Engage with a dynamic team committed to delivering high-quality solutions. Access to professional development and growth opportunities within S&P Global. Work in a collaborative and innovative environment that values continuous improvement. Responsibilities and Impact: Facilitate Agile ceremonies such as sprint planning, daily stand-ups, retrospectives, and reviews. Act as a servant leader to the Agile team, guiding them towards continuous improvement and effective delivery. Manage scope changes, risks, and escalate issues as needed, coordinating testing efforts and assisting scrum teams with technical transitions. Support the team in defining and achieving sprint goals and objectives. Foster a culture of collaboration and transparency within the team and across stakeholders. Encourage and support the development of team members, mentoring them in Agile best practices. Conduct data analysis and create and interpret metrics for team performance tracking and improvement. Conduct business analysis and requirement gathering sessions to align database solutions with stakeholder needs. Collaborate with stakeholders to help translate business requirements into technical specifications. 
Ensure adherence to Agile best practices and participate in Scrum events. Lead initiatives to improve team efficiency and effectiveness in project delivery. What We’re Looking For: Basic Required Qualifications: Bachelor's degree in a relevant field or equivalent work experience. Minimum of 5-9 years of experience in a Scrum Master role, preferably within a technology team. Strong understanding of Agile methodologies, particularly Scrum and Kanban. Excellent communication and interpersonal skills. Proficiency in business analysis: Experience in gathering and analyzing business requirements, translating them into technical specifications, and collaborating with stakeholders to ensure alignment between business needs and database solutions. Requirement gathering expertise: Ability to conduct stakeholder interviews, workshops, and requirements gathering sessions to elicit, prioritize, and document business requirements related to database functionality and performance. Basic understanding of SQL queries: Ability to comprehend and analyze existing SQL queries to identify areas for performance improvement. Fundamental understanding of database structure: Awareness of database concepts including normalization, indexing, and schema design to assess query performance. Additional Preferred Qualifications: Certified Scrum Master (CSM) or similar Agile certification. Experience with Agile tools such as Azure DevOps, JIRA, or Trello. Proven ability to lead and influence teams in a dynamic environment. Familiarity with software development lifecycle (SDLC) and cloud platforms like AWS, Azure, or Google Cloud. Experience in project management and stakeholder engagement. Experience leveraging AI tools to support requirements elicitation, user story creation and refinement, agile event facilitation, and continuous improvement through data-driven insights. 
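The qualifications above ask for the ability to read existing SQL queries and understand how normalization and indexing affect performance. A minimal sketch of that scan-versus-seek distinction, using Python's built-in sqlite3 module for illustration only (the table name and data are invented; the posting does not name the team's actual RDBMS):

```python
import sqlite3

# Illustrative schema only; not from the posting.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (id INTEGER PRIMARY KEY, symbol TEXT, qty INTEGER)")
conn.executemany("INSERT INTO trades (symbol, qty) VALUES (?, ?)",
                 [("AAPL", 10), ("MSFT", 5), ("AAPL", 7)])

query = "SELECT * FROM trades WHERE symbol = 'AAPL'"

# Without an index, the planner has to scan every row.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

# After indexing the filtered column, it can seek directly.
conn.execute("CREATE INDEX idx_trades_symbol ON trades (symbol)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

print(plan_before[0][-1])  # a SCAN step
print(plan_after[0][-1])   # a SEARCH step using idx_trades_symbol
```

The same reading carries over to server databases, where EXPLAIN output is richer but the scan-versus-seek distinction is the same.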
About S&P Global Market Intelligence At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence. What’s In It For You? Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global, we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability, to analyzing energy transition across the supply chain, to building workflow solutions that make it easy to tap into insight and apply it, we are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets.
Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include: Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.
Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, “pre-employment training” or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here. ----------------------------------------------------------- Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf ----------------------------------------------------------- IFTECH202.1 - Middle Professional Tier I (EEO Job Group) Job ID: 316176 Posted On: 2025-06-25 Location: Hyderabad, Telangana, India
Posted 1 month ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description Role & Responsibilities: Database Development and Optimization: Design, develop, and optimize SQL databases, tables, views, and stored procedures to meet business requirements and performance goals. Data Retrieval and Analysis: Write efficient and high-performing SQL queries to retrieve, manipulate, and analyze data. Data Integrity and Security: Ensure data integrity, accuracy, and security through regular monitoring, backups, and data cleansing activities. Performance Tuning: Identify and resolve database performance bottlenecks, optimizing queries and database configurations. Error Resolution: Investigate and resolve database-related issues, including errors, connectivity problems, and data inconsistencies. Cross-Functional Collaboration: Collaborate with cross-functional teams, including Data Analysts, Software Developers, and Business Analysts, to support data-driven decision-making. Maintain comprehensive documentation of database schemas, processes, and procedures. Implement and maintain security measures to protect sensitive data and ensure compliance with data protection regulations. Assist in planning and executing database upgrades and migrations. To be considered for this role, you should have: Relevant work experience as a SQL Developer or in a similar role. Education: Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent work experience). Technical Skills: Proficiency in SQL, including T-SQL for Microsoft SQL Server or PL/SQL for Oracle. Strong knowledge of database design principles, normalization, and indexing. Experience with database performance tuning and optimization techniques. Excellent problem-solving skills and attention to detail. Strong communication and teamwork abilities. Ability to work independently and manage multiple tasks simultaneously. Desirable Skills: Database Management Certifications: Certifications in database management (e.g., Microsoft Certified: Azure Database Administrator Associate) are a plus. Data Warehousing Knowledge: Understanding of data warehousing concepts is a plus.
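The data-cleansing duties described above typically start with finding problem rows. A hedged sketch of the classic duplicate-detection pattern (GROUP BY with HAVING), using Python's sqlite3 with an invented table purely for illustration:

```python
import sqlite3

# Toy table standing in for any customer-style dataset; names are invented.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, email TEXT)")
conn.executemany("INSERT INTO customers (email) VALUES (?)",
                 [("a@x.com",), ("b@x.com",), ("a@x.com",)])

# GROUP BY / HAVING flags values that appear more than once, a common
# first step before deciding how to merge or delete duplicate records.
dupes = conn.execute("""
    SELECT email, COUNT(*) AS n
    FROM customers
    GROUP BY email
    HAVING COUNT(*) > 1
""").fetchall()
print(dupes)  # [('a@x.com', 2)]
```

The same query shape works unchanged in T-SQL and PL/SQL contexts.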
Posted 1 month ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title – Data Engineer (SQL Server, Python, AWS, ETL) Preferred Location: Hyderabad, India Full Time/Part Time: Full Time Build a career with confidence Carrier Global Corporation, a global leader in intelligent climate and energy solutions, is committed to creating solutions that matter for people and our planet for generations to come. From the beginning, we've led in inventing new technologies and entirely new industries. Today, we continue to lead because we have a world-class, diverse workforce that puts the customer at the center of everything we do. Role Description: You will work with high-performance software engineering and analytics teams that consistently deliver on commitments with continuous quality and efficiency improvements. In this role, you will develop technical capabilities for several of Carrier’s software development teams, supporting both current and next-generation technology initiatives. This position requires a proven, hands-on technical person with the ability to deliver technical tasks and own the development phase of software development, including coding, troubleshooting, deployment, and ongoing maintenance. Role Responsibilities Design, develop, and implement SQL Server databases based on business requirements and best practices. Create database schema, tables, views, stored procedures, and functions to support application functionality and data access. Ensure data integrity, security, and performance through proper database design and normalization techniques. Analyze query execution plans and performance metrics to identify and address performance bottlenecks. Implement indexing strategies and database optimizations to improve query performance. Design and implement ETL processes to extract, transform, and load data from various sources into SQL Server databases. Document database configurations, performance tuning activities, and Power BI solutions for knowledge sharing and future reference.
Provide training and support to end-users on SQL Server best practices, database performance optimization techniques, and Power BI usage. Minimum Requirements: BTech degree in Computer Science or related discipline, MTech degree preferred. Assertive communication, strong analytical, problem-solving, debugging, and leadership skills. Experience with source control tools like Bitbucket and/or Git. Good hands-on experience diagnosing performance bottlenecks, wait stats, SQL query monitoring, review and optimization strategies. Ability to create normalized and highly scalable logical and physical database designs and switch between different database technologies like Oracle, SQL Server, and Elastic databases. 5+ years of overall experience building and maintaining SQL Server and data engineering for the organization. 5+ years of SQL Server development experience with strong programming experience in writing stored procedures and functions. Excellent understanding of Snowflake and other data warehouses. Experience in designing and hands-on development of cloud-based analytics solutions. Understanding of AWS storage services and AWS Cloud Infrastructure offerings. Experience designing and building data pipelines using API ingestion and streaming ingestion methods. Knowledge of DevOps processes (including CI/CD) and Infrastructure as Code is essential. Benefits We are committed to offering competitive benefits programs for all of our employees, and enhancing our programs when necessary. Have peace of mind and body with our health insurance Make yourself a priority with flexible schedules and leave policy Drive forward your career through professional development opportunities Achieve your personal goals with our Employee Assistance Program. Our commitment to you Our greatest assets are the expertise, creativity and passion of our employees.
We strive to provide a great place to work that attracts, develops and retains the best talent, promotes employee engagement, fosters teamwork and ultimately drives innovation for the benefit of our customers. We strive to create an environment where you feel that you belong, with diversity and inclusion as the engine to growth and innovation. We develop and deploy best-in-class programs and practices, providing enriching career opportunities, listening to employee feedback and always challenging ourselves to do better. This is The Carrier Way. Join us and make a difference. Apply Now! Carrier is An Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or veteran status, age or any other federally protected class. Job Applicant's Privacy Notice Click on this link to read the Job Applicant's Privacy Notice
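The extract-transform-load responsibilities in the posting above can be sketched end to end. This is a minimal illustration only: the CSV source, field names, and staging schema are assumptions standing in for the API and AWS sources the posting implies.

```python
import csv
import io
import sqlite3

# Extract: a real pipeline would pull from an API or object storage;
# here a small inline CSV stands in for the source.
raw = "date,amount\n2024-01-01,100\n2024-01-02,250\n"
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: cast types and derive a flag field.
records = [(r["date"], int(r["amount"]), int(r["amount"]) > 200) for r in rows]

# Load: write into a staging table and verify with an aggregate.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (date TEXT, amount INTEGER, is_large INTEGER)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", records)
total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
print(total)  # 350
```

Production pipelines add the pieces the posting lists separately: scheduling, CI/CD, and infrastructure as code around exactly this extract/transform/load core.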
Posted 1 month ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Develop reusable, typed frontend components using hooks and modern state management patterns. Ensure responsive UI/UX and cross-browser compatibility. Design RESTful or GraphQL APIs using Express and TypeScript. Model relational schemas and write optimized SQL queries and stored procedures. Optimize database performance using indexes, partitions, and EXPLAIN plans. Write unit and integration tests using Jest and React Testing Library. Participate actively in code reviews and maintain coding standards. Required Skills: React.js with TypeScript (React 16+ with functional components and hooks) Node.js with TypeScript and Express MySQL (schema design, normalization, indexing, query optimization, stored procedures) HTML5, CSS3/Sass, ECMAScript 6+ Git, npm/yarn, Webpack/Vite, ESLint/Prettier, Swagger/OpenAPI Jest, React Testing Library
Posted 1 month ago
8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Position Overview Job Title: Senior Engineer – Data SQL Engineer, AVP Location: Pune, India Role Description: The Engineer is responsible for managing or performing work across multiple areas of the bank's overall IT Platform/Infrastructure including analysis, development, and administration. It may also involve taking functional oversight of engineering delivery for specific departments. Work includes: Planning and developing entire engineering solutions to accomplish business goals Building reliability and resiliency into solutions with appropriate testing and reviewing throughout the delivery lifecycle Ensuring maintainability and reusability of engineering solutions Ensuring solutions are well architected and can be integrated successfully into the end-to-end business process flow Reviewing engineering plans and quality to drive re-use and improve engineering capability Participating in industry forums to drive adoption of innovative technologies, tools and solutions in the Bank. What We’ll Offer You As part of our flexible scheme, here are just some of the benefits that you’ll enjoy: Best-in-class leave policy Gender-neutral parental leave 100% reimbursement under childcare assistance benefit (gender neutral) Sponsorship for industry-relevant certifications and education Employee Assistance Program for you and your family members Comprehensive hospitalization insurance for you and your dependents Accident and term life insurance Complimentary health screening for ages 35 and above Your Key Responsibilities - What You’ll Do: As a SQL Engineer, you will be responsible for the design, development and optimization of complex database systems. You will write efficient SQL queries and stored procedures, and bring expertise in data modelling, performance optimization and working with large-scale relational databases. Key Responsibilities: Design, develop and optimize complex SQL queries, stored procedures, views and functions.
Work with large datasets to perform data extraction, transformation and loading (ETL). Develop and maintain scalable database schemas and models. Troubleshoot and resolve database-related issues including performance bottlenecks and data quality concerns. Maintain data security and compliance with data governance policy. Your Skills and Experience - Skills You’ll Need: Must Have: 8+ years of hands-on experience with SQL in relational databases: SQL Server, Oracle, MySQL, PostgreSQL. Strong working experience with PL/SQL and T-SQL. Strong understanding of data modelling, normalization and relational DB design. Desirable skills that will help you excel: Ability to write highly performant, resilient queries in Oracle / PostgreSQL / MSSQL. Working knowledge of database modelling techniques like Star Schema, Fact-Dimension Models and Data Vault. Awareness of database tuning methods like AWR reports, indexing, partitioning of data sets, defining tablespace sizes and user roles etc. Hands-on experience with ETL tools - Pentaho/Informatica/StreamSets. Good experience in performance tuning, query optimization and indexing. Hands-on experience with object storage and scheduling tools. Experience with cloud-based data services like data lakes, data pipelines, and machine learning platforms. Educational Qualifications: Bachelor’s degree in Computer Science/Engineering or a relevant technology or science field. Technology certifications from any industry-leading cloud provider. How We’ll Support You Training and development to help you excel in your career. Coaching and support from experts in your team. A culture of continuous learning to aid progression. A range of flexible benefits that you can tailor to suit your needs. About Us and Our Teams Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day.
This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
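The star-schema and fact-dimension modelling named in the desirable skills above can be shown with a toy schema. Table names and data are invented for illustration; sqlite3 stands in for the Oracle/PostgreSQL/MSSQL engines the posting actually targets.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Dimension table: one row per product, keyed by a surrogate id.
conn.execute("CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT)")
# Fact table: one row per sale, referencing the dimension by key.
conn.execute("""CREATE TABLE fact_sales (
    sale_id INTEGER PRIMARY KEY, product_id INTEGER, qty INTEGER,
    FOREIGN KEY (product_id) REFERENCES dim_product (product_id))""")
conn.execute("INSERT INTO dim_product VALUES (1, 'widget'), (2, 'gadget')")
conn.executemany("INSERT INTO fact_sales (product_id, qty) VALUES (?, ?)",
                 [(1, 3), (1, 2), (2, 4)])

# A typical star-schema query: join the fact to a dimension and aggregate.
report = conn.execute("""
    SELECT p.name, SUM(f.qty)
    FROM fact_sales f JOIN dim_product p ON p.product_id = f.product_id
    GROUP BY p.name ORDER BY p.name
""").fetchall()
print(report)  # [('gadget', 4), ('widget', 5)]
```

The design choice is that facts stay narrow and numerous while descriptive attributes live once in the dimensions, which is what keeps such schemas scalable.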
Posted 1 month ago
3.0 years
0 Lacs
India
Remote
Job Title: Voice Processing Specialist Location: Remote /Jaipur Job Type: Full-time / Contract Experience: 3+ years expertise in voice cloning, transformation, and synthesis technologies Job Summary We are seeking a talented and motivated Voice Processing Specialist to join our team and lead the development of innovative voice technologies. The ideal candidate will have a deep understanding of speech synthesis, voice cloning, and transformation techniques. You will play a critical role in designing, implementing, and deploying state-of-the-art voice models that enhance naturalness, personalization, and flexibility of speech in AI-powered applications. This role is perfect for someone passionate about advancing human-computer voice interaction and creating lifelike, adaptive voice systems. Key Responsibilities Design, develop, and optimize advanced deep learning models for voice cloning, text-to-speech (TTS), voice conversion, and real-time voice transformation. Implement speaker embedding and voice identity preservation techniques to support accurate and high-fidelity voice replication. Work with large-scale and diverse audio datasets, including preprocessing, segmentation, normalization, and data augmentation to improve model generalization and robustness. Collaborate closely with data scientists, ML engineers, and product teams to integrate developed voice models into production pipelines. Fine-tune neural vocoders and synthesis architectures for better voice naturalness and emotional range. Stay current with the latest advancements in speech processing, AI voice synthesis, and deep generative models through academic literature and open-source projects. Contribute to the development of tools and APIs for deploying models on cloud and edge environments with high efficiency and low latency. Required Skills Strong understanding of speech signal processing, speech synthesis, and automatic speech recognition (ASR) systems. 
Hands-on experience with voice cloning frameworks such as Descript Overdub, Coqui TTS, SV2TTS, Tacotron, FastSpeech, or similar. Proficiency in Python and deep learning frameworks like PyTorch or TensorFlow. Experience working with speech libraries and toolkits such as ESPnet, Kaldi, Librosa, or SpeechBrain. In-depth knowledge of mel spectrograms, vocoder architectures (e.g., WaveNet, HiFi-GAN, WaveGlow), and their role in speech synthesis. Familiarity with REST APIs, model deployment, and cloud-based inference systems using platforms like AWS, Azure, or GCP. Ability to optimize models for performance in real-time or low-latency environments. Preferred Qualifications Experience in real-time voice transformation, including pitch shifting, timing modification, or emotion modulation. Exposure to emotion-aware speech synthesis, multilingual voice models, or prosody modeling. Background in audio DSP (Digital Signal Processing) and speech analysis techniques. Previous contributions to open-source speech AI projects or publications in relevant domains. Why Join Us You will be part of a fast-moving, collaborative team working at the forefront of voice AI innovation. This role offers the opportunity to make a significant impact on products that reach millions of users, helping to shape the future of interactive voice experiences. Skills: automatic speech recognition (asr),vocoder architectures,voice cloning,voice processing,data,real-time voice transformation,speech synthesis,pytorch,tensorflow,voice conversion,speech signal processing,audio dsp,rest apis,python,cloud deployment,transformation,mel spectrograms,deep learning
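The mel spectrograms listed among the required skills above are built on the mel frequency scale. A minimal stdlib-only sketch of the HTK-style conversion that toolkits such as Librosa implement (the band count and frequency range below are arbitrary examples, not values from the posting):

```python
import math

def hz_to_mel(f_hz: float) -> float:
    """HTK-style mel scale: mel = 2595 * log10(1 + f/700)."""
    return 2595.0 * math.log10(1.0 + f_hz / 700.0)

def mel_to_hz(mel: float) -> float:
    """Inverse mapping back to hertz."""
    return 700.0 * (10.0 ** (mel / 2595.0) - 1.0)

# Mel filter banks place band edges evenly in *mel* space, which packs
# more resolution into low frequencies, roughly matching human hearing.
n_bands = 4
lo, hi = hz_to_mel(0.0), hz_to_mel(8000.0)
edges_hz = [mel_to_hz(lo + i * (hi - lo) / (n_bands + 1))
            for i in range(n_bands + 2)]
print([round(e) for e in edges_hz])  # low bands are narrow, high bands wide
```

A mel spectrogram is then just a short-time power spectrum pooled through triangular filters centered on these edges.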
Posted 1 month ago
3.0 years
0 Lacs
India
Remote
Job Title: AI Image Processing Specialist Location: Remote /Jaipur Job Type: Full-time / Contract Experience: 3+ years in computer vision, with medical imaging a plus Job Summary We are seeking a highly skilled and detail-oriented AI Image Processing Specialist to join our team, with a strong focus on medical imaging , computer vision , and deep learning . In this role, you will be responsible for developing and optimizing scalable image processing pipelines tailored for diagnostic, radiological, and clinical applications. Your work will directly contribute to advancing AI capabilities in healthcare by enabling accurate, efficient, and compliant medical data analysis. You will collaborate with data scientists, software engineers, and healthcare professionals to build cutting-edge AI solutions with real-world impact. Key Responsibilities Design, develop, and maintain robust image preprocessing pipelines to handle various medical imaging formats such as DICOM, NIfTI, and JPEG2000. Build automated, containerized, and scalable computer vision workflows suitable for high-throughput medical imaging analysis. Implement and fine-tune models for core vision tasks, including image segmentation, classification, object detection, and landmark detection using deep learning techniques. Ensure that all data handling, processing, and model training pipelines adhere to regulatory guidelines such as HIPAA, GDPR, and FDA/CE requirements. Optimize performance across pipeline stages — including data augmentation, normalization, contrast adjustment, and image registration — to ensure consistent model accuracy. Integrate annotation workflows using tools such as CVAT, Labelbox, or SuperAnnotate and implement strategies for active learning and semi-supervised annotation. Manage reproducibility and version control across datasets and model artifacts using tools like DVC, MLFlow, and Airflow. 
Required Skills Strong experience with Python and image processing libraries such as OpenCV, scikit-image, and SimpleITK. Proficiency in deep learning frameworks like TensorFlow or PyTorch, including experience with model architectures like U-Net, ResNet, or YOLO adapted for medical applications. Deep understanding of medical imaging formats, preprocessing techniques (e.g., windowing, denoising, bias field correction), and challenges specific to healthcare datasets. Experience working with computer vision tasks such as semantic segmentation, instance segmentation, object localization, and detection. Familiarity with annotation platforms, data curation workflows, and techniques for managing large annotated datasets. Experience with pipeline orchestration, containerization (Docker), and reproducibility tools such as Airflow, DVC, or MLFlow. Preferred Qualifications Experience with domain-specific imaging datasets in radiology, pathology, dermatology, or ophthalmology. Understanding of clinical compliance frameworks such as FDA clearance for software as a medical device (SaMD) or CE marking in the EU. Exposure to multi-modal data fusion, combining imaging with EHR, genomics, or lab data for holistic model development. Why Join Us Be part of a forward-thinking team shaping the future of AI in healthcare. You’ll work on impactful projects that improve patient outcomes, streamline diagnostics, and enhance clinical decision-making. We offer a collaborative environment, opportunities for innovation, and a chance to work at the cutting edge of AI-driven healthcare.
Skills: docker,u-net,mlflow,containerization,image segmentation,simpleitk,yolo,image processing,computer vision,medical imaging,object detection,tensorflow,opencv,pytorch,image preprocessing,resnet,python,dvc,airflow,scikit-image,annotation workflows
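Among the preprocessing techniques named in the skills above, intensity windowing is the simplest to sketch: clip raw scanner intensities to a window of interest and rescale for display or model input. The window values and pixel data below are synthetic illustrations, not clinical guidance.

```python
def window_image(pixels, center: float, width: float):
    """Clip raw intensities (e.g. CT Hounsfield units) to a display window
    and rescale linearly to the 0-255 range."""
    lo, hi = center - width / 2.0, center + width / 2.0
    out = []
    for p in pixels:
        clipped = min(max(p, lo), hi)  # values outside the window saturate
        out.append(round(255.0 * (clipped - lo) / (hi - lo)))
    return out

# An illustrative soft-tissue-style window (center 40, width 400)
# maps the range -160..240 onto the full 0..255 display range.
scan_row = [-1000, -160, 40, 240, 3000]  # synthetic values, not real data
print(window_image(scan_row, center=40, width=400))  # [0, 0, 128, 255, 255]
```

Libraries such as SimpleITK provide vectorized equivalents of this; the point is that the window center/width choice, not the model, decides which tissue contrast survives preprocessing.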
Posted 1 month ago
2.0 years
3 - 4 Lacs
Noida
On-site
We are looking for a highly skilled Sr. Developer with 2+ years of experience in web-based project development. The successful candidate will be responsible for designing, developing, and implementing web applications using PHP and various open-source frameworks. Key Responsibilities: Collaborate with cross-functional teams to identify and prioritize project requirements Develop and maintain high-quality, efficient, and well-documented code Troubleshoot and resolve technical issues Implement Social Networks Integration, Payment Gateways Integration, and Web 2.0 in web-based projects Work with RDBMS design, normalization, Data modelling, Transactions, and distributed databases Develop and maintain database PL/SQL, stored procedures, and triggers Requirements: 2+ years of experience in web-based project development using PHP Experience with various open-source frameworks such as Laravel, WordPress, Drupal, Joomla, OsCommerce, OpenCart, TomatoCart, VirtueMart, Magento, Yii 2, CakePHP 2.6, Zend 1.10, and Kohana Strong knowledge of Object-Oriented PHP, Curl, Ajax, Prototype.js, jQuery, Web services, Design Patterns, MVC architecture, and Object-Oriented Methodologies Experience with RDBMS design, normalization, Data modelling, Transactions, and distributed databases Well-versed with RDBMS MySQL (can work with other SQL flavors too) Experience with Social Networks Integration, Payment Gateways Integration, and Web 2.0 in web-based projects Job Type: Full-time Pay: ₹25,000.00 - ₹40,000.00 per month Benefits: Health insurance Provident Fund Schedule: Day shift Morning shift Education: Bachelor's (Required) Experience: Total: 2 years (Required) WordPress: 2 years (Required) PHP: 2 years (Required) Laravel: 2 years (Required) Location: Noida, Uttar Pradesh (Required) Work Location: In person
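The RDBMS requirements above call out transactions alongside normalization. Although the stack described is PHP/MySQL, the commit-or-rollback guarantee transactions provide can be sketched concisely with Python's sqlite3 (schema, accounts, and amounts invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE accounts (
    id INTEGER PRIMARY KEY,
    balance INTEGER NOT NULL CHECK (balance >= 0))""")
conn.executemany("INSERT INTO accounts VALUES (?, ?)", [(1, 100), (2, 50)])
conn.commit()

def transfer(amount: int, src: int, dst: int) -> bool:
    """Move funds atomically: both updates commit, or neither does."""
    try:
        with conn:  # commits on success, rolls back on any exception
            conn.execute("UPDATE accounts SET balance = balance - ? WHERE id = ?",
                         (amount, src))
            conn.execute("UPDATE accounts SET balance = balance + ? WHERE id = ?",
                         (amount, dst))
        return True
    except sqlite3.IntegrityError:
        return False  # CHECK constraint fired; the partial update was rolled back

ok = transfer(30, 1, 2)       # succeeds: balances become 70 and 80
failed = transfer(500, 1, 2)  # would overdraw account 1, so it rolls back
balances = conn.execute("SELECT balance FROM accounts ORDER BY id").fetchall()
print(ok, failed, balances)   # True False [(70,), (80,)]
```

In MySQL the same pattern is BEGIN ... COMMIT with ROLLBACK on error, typically via PDO transactions in PHP.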
Posted 1 month ago
0 years
5 - 8 Lacs
Calcutta
On-site
Job requisition ID: 82238 Date: Jun 23, 2025 Location: Kolkata Designation: Associate Director Entity: Associate Director | SAP QM | Kolkata | SAP Your potential, unleashed. India’s impact on the global economy has increased at an exponential rate and Deloitte presents an opportunity to unleash and realize your potential amongst cutting edge leaders, and organizations shaping the future of the region, and indeed, the world beyond. At Deloitte, you can bring your whole self to work, every day. Combine that with our drive to propel with purpose and you have the perfect playground to collaborate, innovate, grow, and make an impact that matters. The team SAP is about much more than just the numbers. It’s about attesting to accomplishments and challenges and helping to assure strong foundations for future aspirations. Deloitte exemplifies the what, how, and why of change so you’re always ready to act ahead. Your work profile: As a Manager in our SAP Team, you’ll build and nurture positive working relationships with teams and clients with the intention to exceed client expectations. An SAP QM professional should have: End-to-end project implementation experience in SAP SD in at least 12-15 projects (excluding support projects). Must-Have Skills: Proficiency in SAP Quality Management (QM). Strong understanding of statistical analysis and machine learning algorithms. Experience with data visualization tools such as Tableau or Power BI. Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms. Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity. Qualifications: Graduate degree (Science or Engineering) from premier institutes. Strong communication skills (written & verbal). Willingness to travel for short- and long-term durations.
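Data normalization, one of the data-munging techniques listed in the skills above, can be illustrated with a small stdlib-only helper (the function name and sample values are invented for this sketch):

```python
def min_max_normalize(values):
    """Rescale a numeric column to [0, 1]; a constant column maps to all zeros."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

# e.g. defect counts per inspection lot, put on a common scale before modelling
print(min_max_normalize([10, 20, 15, 30]))  # [0.0, 0.5, 0.25, 1.0]
```

Putting columns on a common scale like this is what lets distance-based methods such as clustering weight features comparably.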
Your role as a leader At Deloitte India, we believe in the importance of leadership at all levels. We expect our people to embrace and live our purpose by challenging themselves to identify issues that are most important for our clients, our people, and for society and make an impact that matters. Actively focuses on developing effective communication and relationship-building skills Builds own understanding of our purpose and values; explores opportunities for impact Understands expectations and demonstrates personal accountability for keeping performance on track Understands how their daily work contributes to the priorities of the team and the business Demonstrates strong commitment to personal learning and development; acts as a brand ambassador to help attract top talent How you’ll grow At Deloitte, our professional development plan focuses on helping people at every level of their career to identify and use their strengths to do their best work every day. From entry-level employees to senior leaders, we believe there’s always room to learn. We offer opportunities to help build world-class skills in addition to hands-on experience in the global, fast-changing business world. From on-the-job learning experiences to formal development programs at Deloitte University, our professionals have a variety of opportunities to continue to grow throughout their career. Explore Deloitte University, The Leadership Centre Benefits At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you. Our purpose Deloitte is led by a purpose: To make an impact that matters . Every day, Deloitte people are making a real impact in the places they live and work. 
We pride ourselves on doing not only what is good for clients, but also what is good for our people and the Communities in which we live and work—always striving to be an organization that is held up as a role model of quality, integrity, and positive change. Learn more about Deloitte's impact on the world
Posted 1 month ago
0 years
15 - 21 Lacs
Noida, Uttar Pradesh, India
On-site
We are inviting applications for the role of Lead Consultant – Java Kafka. The ideal candidate will have strong hands-on experience in Java-based microservices development using Kafka and Postgres. You will also work on Microsoft Access-based database solutions, ensuring data normalization and process integrity.
Primary Skills (Must-Have):
Core Java (v1.8 or higher)
Spring Boot & Spring Framework (Core, AOP, Batch)
Apache Kafka
PostgreSQL
Secondary Skills (Good to Have):
Google Cloud Platform (GCP)
CI/CD Tools – CircleCI preferred
GitHub – for version control and collaboration
Monitoring Tools – Splunk, Grafana
Key Responsibilities:
Develop and maintain enterprise-level applications using Java, Spring Boot, Kafka, and Postgres.
Design, build, and maintain Microsoft Access databases with proper normalization and referential integrity.
Implement and maintain microservices architectures for high-volume applications.
Participate in code reviews, unit testing, and integration testing.
Manage version control with GitHub and contribute to DevOps pipelines with CI/CD tools like CircleCI.
Collaborate with cross-functional teams for application development and deployment on cloud-based infrastructure (preferably GCP).
Monitor system performance using Splunk and Grafana and recommend improvements.
Qualifications
Minimum Educational Qualifications: BE / B.Tech / M.Tech / MCA in Computer Science, Information Technology, or a related field
Preferred Qualifications:
Experience with Oracle PL/SQL and SOAP/REST web services
Familiarity with MVC frameworks such as Struts and JSF
Hands-on experience with cloud-based infrastructure, preferably GCP
Skills: Kafka, CI/CD tools (CircleCI), Spring Framework (Core, AOP, Batch), Splunk, Apache Kafka, Grafana, Spring Boot, Java, Google Cloud Platform (GCP), PostgreSQL, GitHub, Core Java (v1.8 or higher)
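One pattern a Kafka-based microservice like this typically needs is idempotent message handling, since Kafka delivers at-least-once. A minimal pure-Python sketch follows; the record shape and in-memory stores are hypothetical stand-ins (a real service would use a Kafka client library and a Postgres dedup table):

```python
# Sketch of an idempotent message-processing loop: duplicates of the
# same event_id are applied only once, even if Kafka redelivers them.
processed_keys = set()   # stands in for a dedup table in Postgres
balances = {}            # stands in for application state

def handle(record: dict) -> None:
    """Apply a payment event exactly once, keyed by event_id."""
    if record["event_id"] in processed_keys:
        return  # duplicate delivery: skip
    balances[record["account"]] = balances.get(record["account"], 0) + record["amount"]
    processed_keys.add(record["event_id"])

# At-least-once delivery can replay the same event; the handler stays safe.
events = [
    {"event_id": "e1", "account": "A", "amount": 100},
    {"event_id": "e2", "account": "A", "amount": 50},
    {"event_id": "e1", "account": "A", "amount": 100},  # redelivery
]
for e in events:
    handle(e)
print(balances)  # {'A': 150}
```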
Posted 1 month ago
5.0 - 12.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Title: ServiceNow CMDB Functional Consultant
Skills: CMDB, CSDM, ITOM, Discovery & Event Management
Experience: 5 - 12 years
Locations: Greater Noida, Pune & Bengaluru
Responsible for designing, implementing, and maintaining a CMDB system for an organization: collecting and organizing information about hardware, software, and other IT assets, as well as their relationships and dependencies. Must have a strong understanding of IT infrastructure and configuration management principles, as well as excellent communication and problem-solving skills.
Analyzing the organization's current IT infrastructure and identifying areas for improvement in terms of configuration management.
Developing a strategy and roadmap for implementing a CMDB system, including identifying the necessary data sources and integration points.
Collaborating with various IT teams, such as network, server, and application teams, to gather and validate configuration data.
Defining and documenting the data model and taxonomy for the CMDB, ensuring it aligns with industry best practices.
Configuring and customizing the CMDB tool to meet the specific needs of the organization.
Conducting data quality checks and implementing processes for ongoing data maintenance and governance.
Providing training and support to end-users on how to use the CMDB system effectively.
Collaborating with IT teams to ensure accurate and timely updates to the CMDB as changes are made to the IT infrastructure.
Conducting regular audits of the CMDB to ensure data accuracy and completeness.
Monitoring and reporting on key performance indicators (KPIs) related to the CMDB, such as data quality, compliance, and usage.
Staying updated on industry trends and best practices in CMDB management and making recommendations for improvement.
Working with external vendors and consultants as needed to support the CMDB system.
Preferred Qualifications:
Strong knowledge of ITOM modules and CMDB.
Should have experience with CMDB Class Manager, class hierarchy, and CMDB Manager policies.
Strong knowledge of identification, normalization, and reconciliation rules.
Configuration of CMDB classes and attributes, and providing guidance to clients and other team members on ITOM best practices.
Good knowledge of TBM taxonomy and its relationship with the CMDB.
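The identification and reconciliation rules this role asks for can be sketched in a few lines. This is a pure-Python illustration of the idea, not ServiceNow API code; the class attributes and source names are hypothetical:

```python
# Sketch of CMDB identification + reconciliation: identify configuration
# items by serial number, then let the higher-precedence discovery
# source win when two sources report conflicting attributes.
SOURCE_PRECEDENCE = {"ServiceNow Discovery": 2, "Manual Import": 1}

def reconcile(records: list[dict]) -> dict[str, dict]:
    cmdb: dict[str, dict] = {}
    for rec in records:
        key = rec["serial_number"]          # identification rule
        current = cmdb.get(key)
        if current is None or (
            SOURCE_PRECEDENCE[rec["source"]]
            >= SOURCE_PRECEDENCE[current["source"]]
        ):
            cmdb[key] = rec                  # reconciliation rule
    return cmdb

records = [
    {"serial_number": "SN1", "name": "srv01", "source": "Manual Import"},
    {"serial_number": "SN1", "name": "srv01.prod", "source": "ServiceNow Discovery"},
]
print(reconcile(records)["SN1"]["name"])  # srv01.prod
```

In ServiceNow itself, the equivalent behavior is configured through identification rules and reconciliation (source precedence) rules rather than written as code.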
Posted 1 month ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Apache Spark
Good-to-have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years of full-time education
Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will be responsible for ensuring the functionality and efficiency of the applications, as well as collaborating with the team to provide solutions to work-related problems. A typical day in this role involves designing and implementing application features, troubleshooting and debugging issues, and actively participating in team discussions to contribute to the development process.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Design and implement application features based on business requirements.
- Troubleshoot and debug issues to ensure the functionality and efficiency of applications.
- Collaborate with the team to provide solutions to work-related problems.
- Stay updated with the latest technologies and industry trends.
- Conduct code reviews and provide constructive feedback to improve code quality.
Professional & Technical Skills:
- Must-have skills: Proficiency in Apache Spark.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.
Additional Information:
- The candidate should have a minimum of 3 years of experience in Apache Spark.
- This position is based at our Indore office.
- 15 years of full-time education is required.
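The data munging skills this listing names (cleaning, transformation, normalization) can be illustrated with a small pure-Python sketch; the field names are made up, and a Spark job would express the same steps as DataFrame operations:

```python
# Hypothetical sketch: clean raw records, then min-max normalize a column.
raw = [
    {"city": " Hyderabad ", "temp_c": 31.0},
    {"city": "Chennai", "temp_c": None},      # missing value: drop
    {"city": "Bengaluru", "temp_c": 24.0},
    {"city": "Delhi", "temp_c": 38.0},
]

# Cleaning: strip whitespace, drop rows with missing temperature.
clean = [
    {"city": r["city"].strip(), "temp_c": r["temp_c"]}
    for r in raw
    if r["temp_c"] is not None
]

# Normalization: rescale temp_c to [0, 1] (min-max scaling).
lo = min(r["temp_c"] for r in clean)
hi = max(r["temp_c"] for r in clean)
for r in clean:
    r["temp_norm"] = (r["temp_c"] - lo) / (hi - lo)

print([round(r["temp_norm"], 2) for r in clean])  # [0.5, 0.0, 1.0]
```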
Posted 1 month ago
4.0 - 6.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Exp: 4 to 6 Years Location: Chennai( T Nagar) Work mode: 5 Days Work from office Notice period: Immediate or Max 30 Days Only Job Summary: We are seeking a highly skilled and motivated MSSQL Developer with strong Database Administrator (DBA) knowledge to join our dynamic team in Chennai. This role offers a unique opportunity to contribute to the development and maintenance of our critical database systems and applications. The ideal candidate will be proficient in designing, developing, and optimizing SQL Server databases. You will work closely with development teams to ensure efficient and reliable data solutions. Responsibilities: Database Development: Design, develop, and implement database schemas, tables, stored procedures, views, functions, and triggers using T-SQL. Write efficient and optimized SQL queries for data retrieval, manipulation, and reporting. Participate in the full software development lifecycle, from requirements gathering to deployment and maintenance. Develop and maintain ETL (Extract, Transform, Load) processes using tools like SSIS (SQL Server Integration Services). Ensure data integrity, accuracy, and consistency across the database systems. Database Administration: Monitor database performance, identify bottlenecks, and implement performance tuning measures (e.g., indexing, query optimization). Troubleshoot database issues and provide timely resolutions. Develop and maintain database documentation, including data models, schemas, and operational procedures. Collaboration and Communication: Collaborate effectively with application development teams to understand their data requirements and provide optimal database solutions. Communicate technical information clearly and concisely to both technical and non-technical stakeholders. Participate in code reviews and provide constructive feedback. Stay up-to-date with the latest SQL Server features, tools, and best practices. 
Required Skills and Experience:
Bachelor's degree in Computer Science, Information Technology, or a related field.
Minimum of 4 to 6 years of hands-on experience as an MSSQL Developer.
Proven experience in database design, development, and optimization using T-SQL.
Solid understanding of relational database concepts, normalization, and data modeling.
Demonstrable experience in SQL Server database administration tasks, performance tuning, and security.
Experience with ETL processes and tools, preferably SSIS.
Familiarity with database monitoring tools and techniques.
Understanding of high availability and disaster recovery concepts and implementations.
Excellent problem-solving and analytical skills.
Strong communication and collaboration skills.
Ability to work independently and as part of a team.
Preferred Skills:
Knowledge of other database technologies (e.g., NoSQL).
Experience with scripting languages like PowerShell.
Familiarity with agile development methodologies.
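The ETL responsibilities described in this listing follow the standard extract-transform-load shape. A minimal sketch is below, using SQLite for portability in place of SQL Server/SSIS; the table and column names are hypothetical:

```python
import sqlite3

# Hypothetical ETL sketch: extract raw rows from a staging table,
# transform them (validate, trim, standardize a code column), and
# load them into a reporting table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE staging_sales (region_code TEXT, amount REAL);
    CREATE TABLE fact_sales   (region_code TEXT NOT NULL, amount REAL NOT NULL);
""")
conn.executemany(
    "INSERT INTO staging_sales VALUES (?, ?)",
    [(" in-s ", 100.0), ("IN-N", 250.0), (None, 75.0)],  # None: bad row
)

# Extract + transform: skip rows failing validation, standardize codes.
rows = conn.execute("SELECT region_code, amount FROM staging_sales").fetchall()
loadable = [
    (code.strip().upper(), amount)
    for code, amount in rows
    if code is not None and code.strip()
]

# Load, then report.
conn.executemany("INSERT INTO fact_sales VALUES (?, ?)", loadable)
result = conn.execute(
    "SELECT region_code, SUM(amount) FROM fact_sales"
    " GROUP BY region_code ORDER BY region_code"
).fetchall()
print(result)  # [('IN-N', 250.0), ('IN-S', 100.0)]
```

In SSIS the same flow is expressed as a data-flow task with source, transformation, and destination components rather than procedural code.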
Posted 1 month ago
0.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Role Overview As a Senior Data Solutions Architect in the Business Analytics, Automation & AI team, you will be responsible for architecting and delivering comprehensive, end-to-end data solutions across cloud and on-premises platforms in Business Intelligence and Artificial Intelligence domains. Your focus will include leading strategic data migration automation initiatives that optimize and automate the transfer of ERP, CRM, and other enterprise data to modern data platforms, ensuring data cleansing and high-quality, reliable datasets. This hands-on role also involves establishing and managing a small, high-performing team of data engineers and analysts that thrives on streamlined processes and rapid innovation. Leveraging an IT consulting mindset, experience with global enterprises and complex data ecosystems, you will inspire and nurture technical talent, driving a culture of continuous learning and development. As a leader, you will foster ambition and accountability through goal-oriented frameworks and actively contribute to transformative organizational initiatives that push beyond business as usual, pioneering digitization and data-driven transformation within the company. Key Responsibilities Architect and deliver end-to-end data solutions across cloud and on-premises platforms, including AWS, Azure, Informatica, etc. Lead strategic data migration automation initiatives, optimizing and automating the movement of ERP, CRM, and other enterprise data to modern data platforms. Drive business intelligence transformation, ensuring robust data models, efficient ETL pipelines, and scalable analytics architectures for Enterprise BI needs. Build and manage AI data architectures that support AI workflows, including handling unstructured and semi-structured data, real-time data streams, and large-scale datasets for model training and inference. 
Implement advanced data preprocessing steps such as data cleaning, normalization, encoding categorical variables, feature engineering, and data enrichment to prepare data optimally for AI models. Manage and mentor a team of 10 data engineers and analysts, fostering skill development in BI and AI data technologies. Collaborate with business/function stakeholders to align data architecture with business goals, ensuring solutions meet both technical and operational requirements. Establish and enforce data governance, data quality, and data security frameworks, using tools like Collibra or similar. Participate in strategic project engagements, leveraging consulting expertise to define and propose best-fit solutions. Ensure compliance with regulatory and security standards, implementing access controls, encryption, and audit mechanisms.
Required Skills & Qualifications
Technical Expertise: Deep hands-on experience with Informatica, AWS (including S3, Redshift)/Azure, Databricks, and Big Data platforms. Strong proficiency in Python, SQL, and NoSQL for building scalable ETL/data pipelines and managing structured/unstructured data. Experience with data governance tools (e.g., Collibra), data modeling, and data warehouse design. Knowledge of Tableau/Power BI/Alteryx is a must. Knowledge of ERP and CRM data structures and integration patterns. Familiarity with AI/ML frameworks like TensorFlow and PyTorch and LLM orchestration tools (e.g., LangChain, LlamaIndex) to support AI model workflows. Proven skills in building modular, scalable, and automated ETL/AI pipelines with robust data quality and security controls.
Certifications: Certified Solutions Architect from AWS/Microsoft (Azure)/Google Cloud. Additional certifications in Databricks or Informatica are a plus.
Consulting Experience: Proven track record in an IT consulting environment, engaging with large enterprises and MNCs in strategic data solutioning projects.
Strong stakeholder management, business needs assessment, and change management skills. Leadership & Soft Skills: Experience managing and mentoring small teams, developing technical skills in BI and AI data domains. Ability to influence and align cross-functional teams and stakeholders. Excellent communication, documentation, and presentation skills. Strong problem-solving, analytical thinking, and strategic vision. Preferred Experience Leading large-scale data migration and transformation programs for ERP/CRM systems. Implementing data governance and security policies across multi-cloud environments. Working with global clients in regulated industries. Driving adoption of modern data platforms and BI/AI/automation solutions in enterprise settings. Certifications AWS Certified Solutions Architect – Professional/ Microsoft Certified: Azure Solutions Architect Expert AWS Certified Data Engineer – Professional/Databricks Certified Data Engineer Professional Educational Qualifications: Master’s/bachelor’s degree in engineering or Master of Computer Applications is required. A Masters in Business Administration (MBA) is a plus. Primary Location : IN-Karnataka-Bangalore Schedule : Full-time Unposting Date : Ongoing
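One preprocessing step this role calls out, encoding categorical variables, can be shown in a few lines of pure Python. In practice this would be `pandas.get_dummies` or a scikit-learn encoder; the category values below are made up:

```python
# Hypothetical sketch of one-hot encoding a categorical column:
# each distinct category becomes a 0/1 indicator column.
def one_hot(values: list[str]) -> tuple[list[str], list[list[int]]]:
    categories = sorted(set(values))
    rows = [[1 if v == c else 0 for c in categories] for v in values]
    return categories, rows

categories, rows = one_hot(["aws", "azure", "aws"])
print(categories)  # ['aws', 'azure']
print(rows)        # [[1, 0], [0, 1], [1, 0]]
```

One-hot encoding is preferred over integer labels for nominal categories because it avoids implying an ordering the model might learn.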
Posted 1 month ago
5.0 years
0 Lacs
Hyderabad, Telangana
On-site
Hyderabad, Telangana Job ID 30162733 Job Category Digital Technology
Job Title: Data Engineer (SQL Server, Python, AWS, ETL)
Preferred Location: Hyderabad, India
Full Time/Part Time: Full Time
Build a career with confidence
Carrier Global Corporation, a global leader in intelligent climate and energy solutions, is committed to creating solutions that matter for people and our planet for generations to come. From the beginning, we've led in inventing new technologies and entirely new industries. Today, we continue to lead because we have a world-class, diverse workforce that puts the customer at the center of everything we do.
Role Description: You will work with high-performance software engineering and analytics teams that consistently deliver on commitments with continuous quality and efficiency improvements. In this role, you will develop technical capabilities for several of Carrier's software development teams, supporting both current and next-generation technology initiatives. This position requires a demonstrated, hands-on technical person with the ability to deliver technical tasks and own the development phase of software delivery, including coding, troubleshooting, deployment, and ongoing maintenance.
Role Responsibilities:
Design, develop, and implement SQL Server databases based on business requirements and best practices.
Create database schemas, tables, views, stored procedures, and functions to support application functionality and data access.
Ensure data integrity, security, and performance through proper database design and normalization techniques.
Analyze query execution plans and performance metrics to identify and address performance bottlenecks.
Implement indexing strategies and database optimizations to improve query performance.
Design and implement ETL processes to extract, transform, and load data from various sources into SQL Server databases.
Document database configurations, performance tuning activities, and Power BI solutions for knowledge sharing and future reference.
Provide training and support to end-users on SQL Server best practices, database performance optimization techniques, and Power BI usage.
Minimum Requirements:
BTech degree in Computer Science or a related discipline; MTech degree preferred.
Assertive communication and strong analytical, problem-solving, debugging, and leadership skills.
Experience with source control tools like Bitbucket and/or Git.
Good hands-on experience diagnosing performance bottlenecks and wait stats, plus SQL query monitoring, review, and optimization strategies.
Ability to create normalized and highly scalable logical and physical database designs and to switch between database technologies such as Oracle, SQL Server, and Elastic databases.
5+ years of overall experience building and maintaining SQL Server and data engineering for the organization.
5+ years of SQL Server development experience with strong programming experience in writing stored procedures and functions.
Excellent understanding of Snowflake and other data warehouses.
Experience in designing and hands-on development of cloud-based analytics solutions.
Understanding of AWS storage services and AWS Cloud infrastructure offerings.
Experience designing and building data pipelines using API ingestion and streaming ingestion methods.
Knowledge of DevOps processes (including CI/CD) and infrastructure as code is essential.
Benefits
We are committed to offering competitive benefits programs for all of our employees and enhancing our programs when necessary.
Have peace of mind and body with our health insurance
Make yourself a priority with flexible schedules and leave policy
Drive forward your career through professional development opportunities
Achieve your personal goals with our Employee Assistance Program
Our commitment to you
Our greatest assets are the expertise, creativity, and passion of our employees.
We strive to provide a great place to work that attracts, develops and retains the best talent, promotes employee engagement, fosters teamwork and ultimately drives innovation for the benefit of our customers. We strive to create an environment where you feel that you belong, with diversity and inclusion as the engine to growth and innovation. We develop and deploy best-in-class programs and practices, providing enriching career opportunities, listening to employee feedback and always challenging ourselves to do better. This is The Carrier Way. Join us and make a difference. Now! Carrier is An Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or veteran status, age or any other federally protected class.
Posted 1 month ago
7.5 years
0 Lacs
Gurugram, Haryana, India
On-site
Project Role: AI/ML Engineer
Project Role Description: Develops applications and systems that utilize AI tools and Cloud AI services, with a proper cloud or on-prem application pipeline of production-ready quality. Able to apply GenAI models as part of the solution. Could also include, but is not limited to, deep learning, neural networks, chatbots, and image processing.
Must-have skills: Large Language Models
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years of full-time education
Summary: As an AI/ML Engineer, you will develop applications and systems utilizing AI tools, Cloud AI services, and GenAI models. Your role involves implementing deep learning, neural networks, chatbots, and image processing in production-ready solutions.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Lead the implementation of large language models in AI applications.
- Research and apply cutting-edge AI techniques to enhance system performance.
- Contribute to the development of innovative AI solutions for complex business challenges.
Professional & Technical Skills:
- Must-have skills: Proficiency in Large Language Models.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.
Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Large Language Models.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Posted 1 month ago
0 years
0 Lacs
Bhubaneswar, Odisha, India
On-site
About Us
SBI Card is a leading pure-play credit card issuer in India, offering a wide range of credit cards to cater to diverse customer needs. We are constantly innovating to meet the evolving financial needs of our customers, empowering them with digital currency for a seamless payment experience and rewarding benefits. At SBI Card, the motto 'Make Life Simple' inspires every initiative, ensuring that customer convenience is at the forefront of all that we do. We are committed to building an environment where people can thrive and create a better future for everyone. SBI Card is proud to be an equal opportunity and inclusive employer and welcomes employees without any discrimination on the grounds of race, colour, gender, religion, creed, disability, sexual orientation, gender identity, marital status, caste, etc. SBI Card is committed to fostering an inclusive and diverse workplace where all employees are treated equally with dignity and respect, which makes it a promising place to work. Join us to shape the future of digital payments in India and unlock your full potential.
What's In It For You
SBI Card truly lives by the work-life balance philosophy. We offer a robust wellness and wellbeing program to support the mental and physical health of our employees. Admirable work deserves to be rewarded; we have a well-curated bouquet of rewards and recognition programs for employees. Dynamic, inclusive, and diverse team culture. Gender-neutral policy. Inclusive health benefits for all: medical insurance, personal accident, group term life insurance, annual health checkup, and dental and OPD benefits. Commitment to the overall development of employees through a comprehensive learning & development framework.
Role Purpose
Responsible for the management of all collections processes for the allocated portfolio in the assigned CD/Area, based on targets set for resolution, normalization, rollback/absolute recovery, and ROR.
Role Accountability
Conduct timely allocation of portfolio to aligned vendors/NFTEs and conduct ongoing reviews to drive performance on the business targets through an extended team of field executives and callers
Formulate tactical short-term incentive plans for NFTEs to increase productivity and drive DRR
Ensure various critical segments as defined by business are reviewed and performance is driven on them
Ensure judicious use of hardship tools and adherence to the settlement waivers, both on rate and value
Conduct ongoing field visits on critical accounts and ensure proper documentation in the Collect24 system of all field visits and telephone calls to customers
Raise red flags in a timely manner based on deterioration in portfolio health indicators/frauds and raise timely alarms on critical incidents as per the compliance guidelines
Ensure all guidelines mentioned in the SVCL are adhered to and that process hygiene is maintained at aligned agencies
Ensure 100% data security using secured data transfer modes and data purging as per policy
Ensure all customer complaints received are closed within the defined time frame
Conduct thorough due diligence while onboarding/offboarding/renewing a vendor, and ensure all necessary formalities are completed prior to allocation
Ensure agencies raise invoices on time
Monitor NFTE ACR CAPE as per the collection strategy
Measures of Success
Portfolio coverage, resolution rate, normalization/rollback rate, settlement waiver rate, absolute recovery, rupee collected, NFTE CAPE, DRA certification of NFTEs, absolute customer complaints, absolute audit observations, process adherence as per MOU
Technical Skills / Experience / Certifications
Credit card knowledge along with a good understanding of collections processes
Competencies critical to the role
Analytical ability, stakeholder management, problem solving, result orientation, process orientation
Qualification
Post-Graduate / Graduate in any discipline
Preferred Industry
FSI
Posted 1 month ago
0 years
0 Lacs
Kochi, Kerala, India
On-site
About Us
SBI Card is a leading pure-play credit card issuer in India, offering a wide range of credit cards to cater to diverse customer needs. We are constantly innovating to meet the evolving financial needs of our customers, empowering them with digital currency for a seamless payment experience and rewarding benefits. At SBI Card, the motto 'Make Life Simple' inspires every initiative, ensuring that customer convenience is at the forefront of all that we do. We are committed to building an environment where people can thrive and create a better future for everyone. SBI Card is proud to be an equal opportunity and inclusive employer and welcomes employees without any discrimination on the grounds of race, color, gender, religion, creed, disability, sexual orientation, gender identity, marital status, caste, etc. SBI Card is committed to fostering an inclusive and diverse workplace where all employees are treated equally with dignity and respect, which makes it a promising place to work. Join us to shape the future of digital payments in India and unlock your full potential.
What's In It For You
SBI Card truly lives by the work-life balance philosophy. We offer a robust wellness and wellbeing program to support the mental and physical health of our employees. Admirable work deserves to be rewarded; we have a well-curated bouquet of rewards and recognition programs for employees. Dynamic, inclusive, and diverse team culture. Gender-neutral policy. Inclusive health benefits for all: medical insurance, personal accident, group term life insurance, annual health checkup, and dental and OPD benefits. Commitment to the overall development of employees through a comprehensive learning & development framework.
Role Purpose
Responsible for the management of all collections processes for the allocated portfolio in the assigned CD/Area, based on targets set for resolution, normalization, rollback/absolute recovery, and ROR.
Role Accountability
Conduct timely allocation of portfolio to aligned vendors/NFTEs and conduct ongoing reviews to drive performance on the business targets through an extended team of field executives and callers
Formulate tactical short-term incentive plans for NFTEs to increase productivity and drive DRR
Ensure various critical segments as defined by business are reviewed and performance is driven on them
Ensure judicious use of hardship tools and adherence to the settlement waivers, both on rate and value
Conduct ongoing field visits on critical accounts and ensure proper documentation in the Collect24 system of all field visits and telephone calls to customers
Raise red flags in a timely manner based on deterioration in portfolio health indicators/frauds and raise timely alarms on critical incidents as per the compliance guidelines
Ensure all guidelines mentioned in the SVCL are adhered to and that process hygiene is maintained at aligned agencies
Ensure 100% data security using secured data transfer modes and data purging as per policy
Ensure all customer complaints received are closed within the defined time frame
Conduct thorough due diligence while onboarding/offboarding/renewing a vendor, and ensure all necessary formalities are completed prior to allocation
Ensure agencies raise invoices on time
Monitor NFTE ACR CAPE as per the collection strategy
Measures of Success
Portfolio coverage, resolution rate, normalization/rollback rate, settlement waiver rate, absolute recovery, rupee collected, NFTE CAPE, DRA certification of NFTEs, absolute customer complaints, absolute audit observations, process adherence as per MOU
Technical Skills / Experience / Certifications
Credit card knowledge along with a good understanding of collections processes
Competencies critical to the role
Analytical ability, stakeholder management, problem solving, result orientation, process orientation
Qualification
Post-Graduate / Graduate in any discipline
Preferred Industry
FSI
Posted 1 month ago
0 years
0 Lacs
Lucknow, Uttar Pradesh, India
On-site
About Us
SBI Card is a leading pure-play credit card issuer in India, offering a wide range of credit cards to cater to diverse customer needs. We are constantly innovating to meet the evolving financial needs of our customers, empowering them with digital currency for a seamless payment experience and rewarding benefits. At SBI Card, the motto 'Make Life Simple' inspires every initiative, ensuring that customer convenience is at the forefront of all that we do. We are committed to building an environment where people can thrive and create a better future for everyone. SBI Card is proud to be an equal opportunity and inclusive employer and welcomes employees without any discrimination on the grounds of race, color, gender, religion, creed, disability, sexual orientation, gender identity, marital status, caste, etc. SBI Card is committed to fostering an inclusive and diverse workplace where all employees are treated equally with dignity and respect, which makes it a promising place to work. Join us to shape the future of digital payments in India and unlock your full potential.
What's In It For You
SBI Card truly lives by the work-life balance philosophy. We offer a robust wellness and wellbeing program to support the mental and physical health of our employees. Admirable work deserves to be rewarded; we have a well-curated bouquet of rewards and recognition programs for employees. Dynamic, inclusive, and diverse team culture. Gender-neutral policy. Inclusive health benefits for all: medical insurance, personal accident, group term life insurance, annual health checkup, and dental and OPD benefits. Commitment to the overall development of employees through a comprehensive learning & development framework.
Role Purpose
Responsible for the management of all collections processes for the allocated portfolio in the assigned CD/Area, based on targets set for resolution, normalization, rollback/absolute recovery, and ROR.
Role Accountability
- Conduct timely allocation of the portfolio to aligned vendors/NFTEs and conduct ongoing reviews to drive performance on business targets through an extended team of field executives and callers
- Formulate tactical short-term incentive plans for NFTEs to increase productivity and drive DRR
- Ensure the critical segments defined by the business are reviewed and performance is driven on them
- Ensure judicious use of hardship tools and adherence to settlement waivers, both on rate and value
- Conduct ongoing field visits on critical accounts and ensure proper documentation in the Collect24 system of all field visits and telephone calls to customers
- Raise red flags in a timely manner based on deterioration in portfolio health indicators/frauds, and raise timely alarms on critical incidents as per the compliance guidelines
- Ensure all guidelines mentioned in the SVCL are adhered to and that process hygiene is maintained at aligned agencies
- Ensure 100% data security using secured data transfer modes and data purging as per policy
- Ensure all customer complaints received are closed within the time frame
- Conduct thorough due diligence while onboarding/offboarding/renewing a vendor, completing all necessary formalities prior to allocation
- Ensure agencies raise invoices on time
- Monitor NFTE ACR and CAPE as per the collection strategy

Measures of Success
Portfolio coverage, resolution rate, normalization/rollback rate, settlement waiver rate, absolute recovery (rupees collected), NFTE CAPE, DRA certification of NFTEs, absolute customer complaints, absolute audit observations, and process adherence as per MOU.

Technical Skills / Experience / Certifications
Credit card knowledge along with a good understanding of collections processes

Competencies critical to the role
Analytical ability, stakeholder management, problem solving, result orientation, process orientation

Qualification
Post-Graduate / Graduate in any discipline

Preferred Industry
FSI
Posted 1 month ago
4.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
About Freshworks:
Organizations everywhere struggle under the crushing costs and complexities of "solutions" that promise to simplify their lives, to create better experiences for their customers and employees, and to help them grow. Software is a choice that can make or break a business, create better or worse experiences, propel or throttle growth. Business software has become a blocker instead of a way to get work done. There's another option: Freshworks, with a fresh vision for how the world works.

At Freshworks, we build uncomplicated service software that delivers exceptional customer and employee experiences. Our enterprise-grade solutions are powerful, yet easy to use, and quick to deliver results. Our people-first approach to AI eliminates friction, making employees more effective and organizations more productive. Over 72,000 companies, including Bridgestone, New Balance, Nucor, S&P Global, and Sony Music, trust Freshworks' customer experience (CX) and employee experience (EX) software to fuel customer loyalty and service efficiency. And over 4,500 Freshworks employees make this possible, all around the world. Fresh vision. Real impact. Come build it with us.

Job Description
- Write scripts for automating DevOps tasks such as configuration management, provisioning, and deployments using Python, Ruby, or Go.
- Integrate scripts with DevOps tools and pipelines.
- Manage user accounts, permissions, and file systems.
- Perform advanced Linux administration and shell scripting tasks.
- Automate system administration tasks using shell scripts.
- Design and implement CI/CD pipelines for automating deployments and testing.
- Utilize popular CI/CD tools such as Jenkins and GitLab CI/CD.
- Integrate CI/CD pipelines with version control systems and container orchestration platforms.
- Set up and manage monitoring and logging solutions.
- Use tools for collecting, analyzing, and visualizing application and infrastructure logs.
- Troubleshoot issues based on monitoring and logging data.
- Utilize Git for version control and collaboration. Perform branching, merging, and conflict resolution using Git. Set up and manage Git repositories.
- Work effectively with developers, operations teams, and other stakeholders. Document DevOps processes and procedures.
- Troubleshoot complex DevOps issues. Identify root causes of problems and implement solutions.

Qualifications
Experience: 4-7 years
- Advanced understanding of programming concepts (data structures, algorithms, object-oriented programming).
- In-depth knowledge of Linux administration and shell scripting. Proficiency in using common Linux commands and tools for system administration.
- Extensive experience with Git. Proficiency in Git commands for branching, merging, and conflict resolution.
- Expert knowledge of CI/CD principles and best practices. Proficiency in CI/CD tools (e.g., Jenkins, GitLab CI/CD, Azure DevOps Pipelines).
- Experience in setting up and managing monitoring and logging solutions.
- Advanced communication and collaboration skills. Advanced problem-solving and analytical skills.
- Knowledge of RDBMSs such as MySQL and PostgreSQL. Expertise in database design, normalization, and optimization. Strong understanding of SQL and proficiency in writing complex queries. Experience in database administration tasks such as backup, recovery, security, and performance tuning.
- Extensive experience with Kubernetes environments. Deep understanding of Kubernetes architecture and components. Proficiency in using kubectl commands and managing Kubernetes resources. Experience in setting up and managing Kubernetes clusters.
- Experience with major cloud platforms (e.g., AWS, Azure, GCP). Familiarity with cloud-specific DevOps tools and services for deployments, monitoring, and scaling.

Additional Information
At Freshworks, we are creating a global workplace that enables everyone to find their true potential, purpose, and passion irrespective of their background, gender, race, sexual orientation, religion and ethnicity.
We are committed to providing equal opportunity for all and believe that diversity in the workplace creates a more vibrant, richer work environment that advances the goals of our employees, communities and the business.
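The scripting side of the role above (automating configuration management and provisioning in Python) can be illustrated with a minimal, self-contained sketch. All names here (`render_config`, `app.json`, the config keys) are hypothetical, not part of any real Freshworks stack:

```python
import json
import os
import tempfile

def render_config(template: dict, overrides: dict) -> dict:
    """Merge environment-specific overrides into a base config template."""
    merged = dict(template)
    merged.update(overrides)
    return merged

def provision(path: str, config: dict) -> str:
    """Write the rendered config to disk, as a provisioning step might."""
    with open(path, "w") as f:
        json.dump(config, f, indent=2)
    return path

base = {"app": "web", "replicas": 1, "log_level": "info"}
prod = render_config(base, {"replicas": 3, "log_level": "warn"})

path = os.path.join(tempfile.mkdtemp(), "app.json")
provision(path, prod)

# Read the file back, as a deployment validation step might.
with open(path) as f:
    loaded = json.load(f)
print(loaded["replicas"])  # 3
```

In a real pipeline the same pattern would be wired into a CI/CD tool (Jenkins, GitLab CI/CD) rather than run ad hoc.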
Posted 1 month ago
2.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Must Haves
- Strong knowledge of SQL development and relational databases (2 years with MS SQL Server; additional experience with databases such as PostgreSQL or Oracle a plus).
- Experience with database design, indexing, normalization, and query optimization techniques (including performance profiling).
- Priority focus on Microsoft products (MS SQL, Azure DevOps, Azure Data Factory, Visual Studio).

Job Description
- Proven experience as an SQL Developer or in a similar role.
- Strong knowledge of SQL and relational databases (2 years working with MS SQL Server).
- Experience with database design, indexing, normalization, and query optimization techniques, including the differences between join types and the performance implications of SQL commands and the syntax used to invoke them.
- Familiarity with tools such as Synapse is desirable. Familiarity with data modeling tools and practices.
- Capable of understanding existing materialized views, writing new ones, and optimizing their performance/execution time.
- Ability to work with large datasets and handle the associated performance challenges.
- Understanding of database security and best practices.
- Experience with ETL processes and tools (optional, but a plus).
- Strong problem-solving skills and attention to detail.
- Excellent communication skills and ability to collaborate with cross-functional teams.
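The join-type distinction called out in the posting above can be sketched with Python's built-in sqlite3 module (table and column names are illustrative only):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL);
    INSERT INTO customers VALUES (1, 'Asha'), (2, 'Ravi');
    INSERT INTO orders VALUES (10, 1, 250.0);
""")

# INNER JOIN: only customers that have at least one matching order.
inner = conn.execute("""
    SELECT c.name, o.total FROM customers c
    JOIN orders o ON o.customer_id = c.id
""").fetchall()

# LEFT JOIN: every customer, with NULL where no order exists.
left = conn.execute("""
    SELECT c.name, o.total FROM customers c
    LEFT JOIN orders o ON o.customer_id = c.id
""").fetchall()

print(inner)  # [('Asha', 250.0)]
print(left)   # [('Asha', 250.0), ('Ravi', None)]
```

The performance implication is the other half of the requirement: an inner join lets the optimizer reorder tables freely, while an outer join constrains join order and can force larger intermediate results.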
Posted 1 month ago
4.0 - 7.0 years
0 Lacs
Hyderābād
Remote
Job Information
Industry: IT Services
Date Opened: 06/20/2025
Salary: Confidential
Job Type: Contract
Work Experience: 4-7 Years
City: Hyderabad, open to remote
State/Province: Telangana
Country: India
Zip/Postal Code: 500081

Job Description
Veltris is a Digital Product Engineering Services partner committed to driving technology-enabled transformation across enterprises, businesses, and industries. We specialize in delivering next-generation solutions for sectors including healthcare, technology, communications, manufacturing, and finance. With a focus on innovation and acceleration, Veltris empowers clients to build, modernize, and scale intelligent products that deliver connected, AI-powered experiences. Our experience-centric approach, agile methodologies, and exceptional talent enable us to streamline product development, maximize platform ROI, and drive meaningful business outcomes across both digital and physical ecosystems. In a strategic move to strengthen our healthcare offerings and expand industry capabilities, Veltris has acquired BPK Technologies. This acquisition enhances our domain expertise, broadens our go-to-market strategy, and positions us to deliver even greater value to enterprise and mid-market clients in healthcare and beyond.

Job Description for Business Analyst
Roles & Responsibilities:
- Gather requirements and translate them into user stories that can be engineered and developed.
- Create requirements in the Azure DevOps board.
- Document and communicate translated requirements to team members.
- Attend daily stand-ups and other meetings as needed.
- Work in the IST time zone, with a few hours of overlap with the US/Canada time zone.
- Participate in product architecture, design, and requirement discussions.
- Work with your product manager or senior Business Analyst.

Must Have skills:
- Must have a good understanding of relational databases.
- Should be able to understand client requirements and do the research to break them down into requirements that can be engineered and developed.
- Should have hands-on experience in writing SQL queries, joins, filtering, data normalization, etc.
- Should have good analytical skills and be able to analyse data in Excel sheets.
- Ability to multitask.
- Excellent verbal and written communication in English.

Good to Have skills:
- Working knowledge of Agile methodology
- Understanding of Azure DevOps
- Able to understand and create ER diagrams and DB schemas
- ETL, DWH, and BI knowledge will be an added advantage
- Dentistry and healthcare domain experience is preferred

Experience: 4 - 8 yrs.
Qualification: Bachelor's Degree in Computer Science, Management Information Sciences, Mathematics, Engineering, Business, or area of functional responsibility preferred, or a combination of equivalent education and experience.

Disclaimer: The information provided herein is for general informational purposes only and reflects the current strategic direction and service offerings of Veltris. While we strive for accuracy, Veltris makes no representations or warranties regarding the completeness, reliability, or suitability of the information for any specific purpose. Any statements related to business growth, acquisitions, or future plans, including the acquisition of BPK Technologies, are subject to change without notice and do not constitute a binding commitment. Veltris reserves the right to modify its strategies, services, or business relationships at its sole discretion. For the most up-to-date and detailed information, please contact Veltris directly.
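The data normalization skill named above amounts to splitting repeated attributes out into their own keyed table. A minimal sketch in plain Python, with entirely hypothetical data:

```python
# Denormalized rows: the customer's name and city repeat on every order line.
flat = [
    {"order_id": 10, "customer": "Asha", "city": "Chennai", "total": 250.0},
    {"order_id": 11, "customer": "Asha", "city": "Chennai", "total": 120.0},
    {"order_id": 12, "customer": "Ravi", "city": "Pune", "total": 90.0},
]

# Normalize: one customers table keyed by a surrogate id;
# orders reference customers by that key instead of repeating the name.
customers, orders = {}, []
for row in flat:
    key = row["customer"]
    if key not in customers:
        customers[key] = {"id": len(customers) + 1, "city": row["city"]}
    orders.append({"order_id": row["order_id"],
                   "customer_id": customers[key]["id"],
                   "total": row["total"]})

print(len(customers))  # 2 -- duplicated customer data collapsed
print(orders[0])       # order now carries only a customer_id reference
```

The same decomposition is what an ER diagram captures: a one-to-many relationship between customers and orders.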
Posted 1 month ago
0 years
3 - 4 Lacs
Mumbai
On-site
Company Description
Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. We believe in the power of entrepreneurial capitalism and use it on various platforms to ignite the conversations that drive systemic change in business, culture, and society. We celebrate success and are committed to using our megaphone to drive diversity, equity and inclusion. We are the world's biggest business media brand and we consistently place in the top 20 of the most popular sites in the United States, in good company with brands like Netflix, Apple and Google. In short, we have a big platform and we use it responsibly.

Job Description
The Data Research Engineering Team is a brand-new team whose purpose is to manage data from acquisition to presentation, collaborating with other teams while also operating independently. Its responsibilities include acquiring and integrating data, processing and transforming it, managing databases, ensuring data quality, visualizing data, automating processes, working with relevant technologies, and ensuring data governance and compliance. The team plays a crucial role in enabling data-driven decision-making and meeting the organization's data needs.

A typical day in the life of a Data Research Engineer - Team Lead will involve guiding team members through code standards, optimization techniques, and best practices in debugging and testing. They oversee the development and consistent application of testing protocols, including unit, integration, and performance testing, ensuring a high standard of code quality across the team. They work closely with engineers, offering technical mentorship in areas like Git version control, task tracking, and documentation processes, as well as advanced Python and database practices.
Responsibilities
- Technical Mentorship and Code Quality: Guide and mentor team members on coding standards, optimization techniques, and debugging. Conduct thorough code reviews, provide constructive feedback, and enforce code quality standards to ensure maintainable and efficient code.
- Testing and Quality Assurance Leadership: Develop, implement, and oversee rigorous testing protocols, including unit, integration, and performance testing, to guarantee the reliability and robustness of all projects. Advocate for automated testing and ensure comprehensive test coverage within the team.
- Process Improvement and Documentation: Establish and maintain high standards for version control, documentation, and task tracking across the team. Continuously refine these processes to enhance team productivity, streamline workflows, and ensure data quality.
- Hands-On Technical Support: Serve as the team's primary resource for troubleshooting complex issues, particularly in Python, MySQL, GitKraken, and Knime. Provide on-demand support to team members, helping them overcome technical challenges and improve their problem-solving skills.
- High-Level Technical Mentorship: Provide mentorship in advanced technical areas, including architecture design, data engineering best practices, and advanced Python programming. Guide the team in building scalable and reliable data solutions.
- Cross-Functional Collaboration: Work closely with data scientists, product managers, and quality assurance teams to align on data requirements, testing protocols, and process improvements. Foster open communication across teams to ensure seamless integration and delivery of data solutions.
- Continuous Learning and Improvement: Stay updated with emerging data engineering methodologies and best practices, sharing relevant insights with the team. Drive a culture of continuous improvement, ensuring the team's skills and processes evolve with industry standards.
- Data Pipelines: Design, implement, and maintain scalable data pipelines for efficient data transfer, cleaning, normalization, transformation, aggregation, and visualization to support production-level workloads.
- Big Data: Leverage distributed processing frameworks such as PySpark and Kafka to manage and process massive datasets efficiently.
- Cloud-Native Data Solutions: Develop and optimize workflows for cloud-native data solutions, including BigQuery, Databricks, Snowflake, Redshift, and tools like Airflow and AWS Glue.
- Regulations: Ensure compliance with regulatory frameworks like GDPR and implement robust data governance and security measures.

Skills and Experience
Experience: 8+ years

Technical Proficiency:
- Programming: Expert-level skills in Python, with a strong understanding of code optimization, debugging, and testing.
- Object-Oriented Programming (OOP) Expertise: Strong knowledge of OOP principles in Python, with the ability to design modular, reusable, and efficient code structures. Experience in implementing OOP best practices to enhance code organization and maintainability.
- Data Management: Proficient in MySQL and database design, with experience in creating efficient data pipelines and workflows.
- Tools: Advanced knowledge of Git and GitKraken for version control, with experience in task management, ideally on GitHub. Familiarity with Knime or similar data processing tools is a plus.
- Testing and QA Expertise: Proven experience in designing and implementing testing protocols, including unit, integration, and performance testing. Ability to embed automated testing within development workflows.
- Process-Driven Mindset: Strong experience with process improvement and documentation, particularly for coding standards, task tracking, and data management protocols.
- Leadership and Mentorship: Demonstrated ability to mentor and support junior and mid-level engineers, with a focus on fostering technical growth and improving team cohesion. Experience leading code reviews and guiding team members in problem-solving and troubleshooting.
- Problem-Solving Skills: Ability to handle complex technical issues and serve as a key resource for team troubleshooting. Expertise in guiding others through debugging and technical problem-solving.
- Strong Communication Skills: Effective communicator capable of aligning cross-functional teams on project requirements, technical standards, and data workflows.
- Adaptability and Continuous Learning: A commitment to staying updated with the latest in data engineering, coding practices, and tools, with a proactive approach to learning and sharing knowledge within the team.
- Data Pipelines: Comprehensive expertise in building and optimizing data pipelines, including data transfer, transformation, and visualization, for real-world applications.
- Distributed Systems: Strong knowledge of distributed systems and big data tools such as PySpark and Kafka.
- Data Warehousing: Proficiency with modern cloud data warehousing platforms (BigQuery, Databricks, Snowflake, Redshift) and orchestration tools (Airflow, AWS Glue).
- Regulations: Demonstrated understanding of regulatory compliance requirements (e.g., GDPR) and best practices for data governance and security in enterprise settings.

Perks:
- Day off on the 3rd Friday of every month (one long weekend each month)
- Monthly Wellness Reimbursement Program to promote health and well-being
- Monthly Office Commutation Reimbursement Program
- Paid paternity and maternity leaves

Qualifications
Educational Background: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field. Equivalent experience in data engineering roles will also be considered.

Additional Information
All your information will be kept confidential according to EEO guidelines.
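The pipeline shape the posting above describes (transfer, cleaning, normalization, aggregation) can be sketched in plain Python. The data and field names are invented for illustration; a production version would run the same stages in PySpark or an orchestrator like Airflow:

```python
from collections import defaultdict

# Raw, untrusted input: inconsistent labels, stringly-typed numbers, a bad record.
raw = [
    {"country": " india ", "revenue": "1200.50"},
    {"country": "India", "revenue": "800"},
    {"country": "US", "revenue": None},       # missing value: dropped in cleaning
    {"country": "us", "revenue": "300.25"},
]

def clean(rows):
    """Drop records missing required fields."""
    return [r for r in rows if r.get("revenue") is not None]

def normalize(rows):
    """Standardize country labels and coerce revenue to float."""
    return [{"country": r["country"].strip().lower(),
             "revenue": float(r["revenue"])} for r in rows]

def aggregate(rows):
    """Sum revenue per country."""
    totals = defaultdict(float)
    for r in rows:
        totals[r["country"]] += r["revenue"]
    return dict(totals)

result = aggregate(normalize(clean(raw)))
print(result)  # {'india': 2000.5, 'us': 300.25}
```

Each stage is a pure function over rows, which is what makes the unit and integration testing the role emphasizes straightforward: every stage can be asserted on in isolation.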
Posted 1 month ago
0 years
2 - 3 Lacs
Mumbai
On-site
eClerx is hiring a Product Data Management Analyst who will work within our Product Data Management team to help our customers enhance online product data quality for Electrical, Mechanical & Electronics products. The role also involves creating technical specifications and product descriptions for online presentation, and working on consultancy projects to redesign e-commerce customers' website taxonomy and navigation. The ideal candidate must possess strong communication skills, with an ability to listen to and comprehend information and share it with all key stakeholders, highlighting opportunities for improvement and any concerns. He/she must be able to work collaboratively with teams to execute tasks within defined timeframes while maintaining high-quality standards and superior service levels. The ability to take proactive action and a willingness to take up responsibility beyond the assigned work area is a plus.

Apprentice_Analyst

Roles and responsibilities:
- Data enrichment/gap fill, standardization, normalization, and categorization of online and offline product data via research through different sources such as the internet, specific websites, databases, etc.
- Data quality checks and correction
- Data profiling and reporting (basic)
- Email communication with the client on request acknowledgment, project status, and responses to queries
- Help customers enhance their product data quality (electrical, mechanical, electronics) from the technical specification and description perspective
- Provide technical consulting to customer category managers around industry best practices for product data enhancement

Technical and Functional Skills:
- Bachelor's Degree in Engineering from the Electrical, Mechanical, or Electronics stream
- Excellent technical knowledge of engineering products (pumps, motors, HVAC, plumbing, etc.) and technical specifications
- Intermediate knowledge of MS Office/Internet.
Posted 1 month ago
5.0 years
0 Lacs
Ahmedabad, Gujarat, India
Remote
Join us as a Senior Database Developer and drive high-performance data systems for financial services! MSBC is seeking a Senior Database Developer with expertise in Oracle and PL/SQL to design, develop, and optimize complex database solutions. This role offers an exciting opportunity to enhance data integrity, scalability, and performance while working on mission-critical applications. Collaborate with industry experts to deliver efficient and secure database solutions supporting financial services and enterprise applications. If you are passionate about database development and thrive in a fast-paced, technology-driven environment, join us in driving innovation and efficiency through data management.

Key Tools and Technologies:
- Database Management: Oracle Database, MySQL, PostgreSQL, NoSQL
- Development & Optimization: PL/SQL, SQL, Query Optimization, Index Tuning, Execution Plans
- Architecture & Data Modeling: Logical & Physical Data Modeling, Normalization, Data Governance
- Security & Performance: Data Security, Performance Tuning, Backup & Recovery, Disaster Recovery
- Version Control & Deployment: Git, Database Deployment Strategies
- Cloud & Automation: Oracle Cloud, AWS RDS, ETL Processes, BI Tools, DevOps Practices

Key Responsibilities:
- Develop and optimize database solutions ensuring integrity, security, and performance.
- Design and maintain database schemas, tables, indexes, views, and stored procedures.
- Implement data models and governance standards aligned with business requirements.
- Conduct performance tuning and troubleshooting to enhance efficiency.
- Manage backups, recovery, and disaster recovery strategies.
- Collaborate with architects, analysts, and development teams for seamless integration.
- Provide technical support and mentorship to junior developers.

Skills & Qualifications:
- 5+ years of experience in database development with expertise in PL/SQL and SQL.
- Strong grasp of database architecture, normalization, and design patterns.
- Hands-on experience with database security, performance tuning, and version control.
- Familiarity with cloud-based solutions, automation, and DevOps practices.
- Additional experience with MySQL, PostgreSQL, or NoSQL databases is a plus.
- Oracle Certified Professional (OCP) certification preferred.
- Strong problem-solving, attention to detail, and communication skills.

Note: Shift timings align with UK working hours. This role is based in Ahmedabad, but candidates from other cities or states are encouraged to apply, as remote or hybrid working options are available.

MSBC Group has been a trusted technology partner for over 20 years, delivering the latest systems and software solutions for financial services, manufacturing, logistics, construction, and startup ecosystems. Our expertise includes Accessible AI, Custom Software Solutions, Staff Augmentation, Managed Services, and Business Process Outsourcing. We are at the forefront of developing advanced AI-enabled services and supporting transformative projects, such as state-of-the-art trading platforms, seamless application migrations, and integrating real-time data analytics. With offices in London, California, and Ahmedabad, and operating in every time zone, MSBC Group is your AI and automation partner.
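The index tuning and execution plan skills listed above can be demonstrated in miniature with SQLite (via Python's sqlite3), whose `EXPLAIN QUERY PLAN` plays the same role as Oracle's `EXPLAIN PLAN`. The table and index names are invented for illustration, and the exact plan text varies slightly across SQLite versions:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (id INTEGER PRIMARY KEY, account TEXT, amount REAL)")
conn.executemany("INSERT INTO trades (account, amount) VALUES (?, ?)",
                 [(f"ACC{i % 100}", float(i)) for i in range(1000)])

def plan(sql):
    """Return the plan's detail text (4th column of EXPLAIN QUERY PLAN rows)."""
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT SUM(amount) FROM trades WHERE account = 'ACC7'"

before = plan(query)  # no usable index: the plan is a full table SCAN
conn.execute("CREATE INDEX idx_trades_account ON trades(account)")
after = plan(query)   # now a SEARCH using idx_trades_account

print(before)
print(after)
```

Reading the plan before and after adding the index makes the tuning effect concrete: the predicate on `account` switches from scanning every row to an index lookup.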
Posted 1 month ago