
1190 Normalization Jobs - Page 11

Set up a job alert
JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

6.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Title: Data Modeller – GCP & Cloud Databases
Location: Chennai (Work From Office)
Experience Required: 6 to 9 Years

Role Overview
We are looking for a hands-on Data Modeller with strong expertise in cloud-based databases, data architecture, and modeling for OLTP and OLAP systems. You will work closely with engineering and analytics teams to design and optimize conceptual, logical, and physical data models, supporting both operational systems and near real-time reporting pipelines.

Key Responsibilities
- Design conceptual, logical, and physical data models for OLTP and OLAP systems
- Develop and refine models that support performance-optimized cloud data pipelines
- Collaborate with data engineers to implement models in BigQuery, CloudSQL, and AlloyDB
- Design schemas and apply indexing, partitioning, and data sharding strategies
- Translate business requirements into scalable data architecture and schemas
- Optimize for near real-time ingestion, transformation, and query performance
- Use tools such as DBSchema or similar for collaborative modeling and documentation
- Create and maintain metadata and documentation around models

Must-Have Skills
- Hands-on experience with GCP databases: BigQuery, CloudSQL, AlloyDB
- Strong understanding of OLTP vs. OLAP systems and their respective design principles
- Experience in database performance tuning: indexing, sharding, and partitioning
- Skilled in modeling tools such as DBSchema, ERwin, or similar
- Understanding of the variables that impact performance in real-time/near real-time systems
- Proficiency in SQL, schema definition, and normalization/denormalization techniques

Preferred Skills
- Functional knowledge of the Mutual Fund or BFSI domain
- Experience integrating with cloud-native ETL and data orchestration pipelines
- Familiarity with schema version control and CI/CD in a data context

Soft Skills
- Strong analytical and communication skills
- Detail-oriented and documentation-focused
- Ability to collaborate across engineering, product, and analytics teams

Why Join
- Work on enterprise-scale cloud data architectures
- Drive performance-first data modeling for advanced analytics
- Collaborate with high-performing cloud-native data teams

Skills: olap, normalization, indexing, gcp databases, sharding, olap systems, modeling, schema definition, sql, data, oltp systems, alloydb, erwin, modeling tools, bigquery, database performance tuning, databases, partitioning, denormalization, dbschema, cloudsql
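The normalization/denormalization and indexing skills this role lists can be illustrated with a minimal, self-contained sketch using SQLite via Python's stdlib (the posting's actual stack is BigQuery/CloudSQL/AlloyDB, and all table and column names below are invented for illustration):

```python
import sqlite3

# Normalized OLTP-style schema: each fact lives in exactly one table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customers (
    customer_id INTEGER PRIMARY KEY,
    name        TEXT NOT NULL
);
CREATE TABLE orders (
    order_id    INTEGER PRIMARY KEY,
    customer_id INTEGER NOT NULL REFERENCES customers(customer_id),
    amount      REAL NOT NULL
);
-- Index the foreign key so joins and per-customer lookups avoid full scans.
CREATE INDEX idx_orders_customer ON orders(customer_id);
""")
conn.execute("INSERT INTO customers VALUES (1, 'Asha'), (2, 'Ravi')")
conn.execute("INSERT INTO orders VALUES (10, 1, 250.0), (11, 1, 100.0), (12, 2, 75.0)")

# OLAP-style reads often denormalize: pre-join into a flat reporting table
# so analytical queries pay the join cost once, at load time.
conn.executescript("""
CREATE TABLE order_report AS
SELECT o.order_id, c.name AS customer_name, o.amount
FROM orders o JOIN customers c ON c.customer_id = o.customer_id;
""")
total = conn.execute(
    "SELECT SUM(amount) FROM order_report WHERE customer_name = 'Asha'"
).fetchone()[0]
print(total)  # 350.0
```

The same normalize-for-writes, denormalize-for-reads trade-off carries over to the cloud warehouses named in the posting, where partitioning and clustering take the place of classic B-tree indexes.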

Posted 2 weeks ago

Apply

0.0 - 3.0 years

0 Lacs

Delhi, Delhi

Remote

Job Information
Date Opened: 07/15/2025
Job Type: Full time
Industry: Government & Public Sector
Work Experience: 3 years+
Salary: ₹12L - ₹15L per annum
City: Delhi
State/Province: Delhi
Country: India
Zip/Postal Code: 110019

Job Description
Position Name: Senior Backend Developer
Location: New Delhi

Who we are
At CivicDataLab, we work with the goal of using data, tech, design and social science to strengthen civic engagement in India. We work to harness the potential of the open-source movement to enable citizens to engage better with public reforms. Our work is centered around building data strategy, data platforms and data science applications to push data-driven decision-making at scale. Moreover, we work closely with governments, non-profits, think tanks, media houses, academia and more to build overall data and tech capacity.

What we are looking for
A Senior Backend Developer (Consultant) with a minimum of three years' experience to help support our various interventions through data platforms and tools, from development to infrastructure management. This position requires a developer who can help us scope, build and scale the backend stack that powers our data platforms. Need-based travel to other states within India may be required depending on project commitments.

About the role
We need a team player who can coordinate, in person and virtually, with internal and external stakeholders from diverse backgrounds to refine their requirements into user stories, pull them into the current project roadmap, and deliver on them.

Key Responsibilities
- Design, build, and maintain scalable backend systems that process large-scale data from diverse sectors and geographies.
- Develop and enhance monitoring, evaluation, and observability of backend infrastructure.
- Work with large-scale data inflows and outflows, ensuring efficient data handling and performance optimization.
- Develop and maintain data pipelines capable of processing both big and small datasets, with programmatic scheduling and monitoring.
- Design and implement scalable APIs that serve key sectors and integrate seamlessly with various open-source solutions.
- Build and manage open-source projects, contributing to the wider tech ecosystem.
- Ensure security, privacy, and best practices in distributed data systems.
- Maintain infrastructure scalability through efficient architecture, orchestration, and automation.

Requirements
- Minimum 3+ years of experience with backend web frameworks and RESTful service development. Our primary stack is Python (Django/Flask/FastAPI), but experience in Golang, Ruby, or JavaScript is acceptable if you're willing to work with Python.
- Deep understanding of databases (both relational and NoSQL), including best practices for indexing, querying, normalization, caching, and performance optimization.
- Solid understanding of Git workflows, CI/CD pipelines, and modern DevOps practices. We use GitHub for project management, so familiarity with its workflow is a plus.
- Experience with scalable infrastructure, including microservices, distributed systems, Infrastructure as Code (IaC), load balancing, and cloud-based deployment.
- Experience with cloud services, e.g. AWS, GCP, Azure.
- Strong communication skills, with the ability to translate complex technical requirements into actionable development plans. You should be comfortable keeping stakeholders informed and making data-driven decisions.

Good to have
- Prior experience working on open-source projects.
- Prior experience working with data/tech communities.
- Collaboration with government or research-based organizations on past projects.
- Prior experience working remotely.
- Familiarity with the Docker and Kubernetes ecosystems.
- Basic knowledge of queuing mechanisms with Redis/RabbitMQ/Celery.
- A good sense of humor.

How we work
CivicDataLab is based out of Delhi and has project teams located in Assam and Himachal Pradesh. We follow a hybrid model where our bandhus work out of the office for a minimum of 12 days per month (i.e., 3 days a week). We use open-source tools and agile methodologies in organising our work.

Benefits: Perks of Working with Us

Wellness Allowance
At CivicDataLab, we always emphasise the wellness of our bandhus. The allowance covers any expenditure on a wellness setup, excluding financial instruments, expenses claimable as deductions under Income Tax rules, and goods and services that attract a combined tax, cess or duty of more than 28%. If you're interested in taking classes that enhance your overall physical or mental well-being, you have an INR 60,000 annual stipend to do so. For some people, that might mean a monthly massage. Some take photography lessons, learn a musical instrument or buy a gym membership. It's up to you; the point is to learn something that you feel enriches you as a person.

Professional growth and development allowance
At CivicDataLab, we encourage everyone to take up things that help one grow professionally, and you get an annual kitty of INR 60,000 to do so. This includes attending or speaking at conferences and workshops, taking courses, acquiring hardware or software licenses, or even joining summer schools. We feel that learning a skill should never be a hurdle to solving important problems for the community.

Cost to Organisation (CTO) range: 12-15 LPA (incl. perks)
Please note: this figure includes fixed remuneration, perks, and incidental components such as salary, statutory benefits, professional development, wellness, travel, infrastructure, and other operational support costs incurred by the organisation.

Our Commitment to Diversity
We are committed to inclusive hiring and strongly encourage applicants from diverse and underrepresented gender and caste identities and/or sociocultural backgrounds to apply for this role. Our organisational policies are gender-neutral, including our POSH policy and leave policy. We provide 6 months of paid time off as parental leave for the primary caregiver and 6 weeks of paid time off for the secondary caregiver, including adoption.

Our Hiring Process
The entire hiring process averages between 3-4 weeks and consists of the following steps:
1. Submit your application with your detailed portfolio through our website career page: https://jobs.civicdatalab.in/jobs/Careers
2. If you are shortlisted, we will have an introductory discussion to get to know you better and check your fit and interest.
3. Based on how our discussion goes, we'll give you a take-home assignment, and you will have a week to follow up with your submission.
4. Assignment discussion: you'll share your screen and collaborate with our team to further develop your assignment outcomes. The discussion typically runs 60 to 90 minutes. Based on the team's inputs, we will reach a decision in 1-2 days.
5. If all goes well, we'll have a final 'Culture Discussion' round, and you get to meet the rest of the team.

Note: We appreciate your interest in joining CivicDataLab. Applications for this position will be reviewed on a rolling basis, so we strongly encourage you to apply at the earliest opportunity. Due to the volume of applications we receive, only shortlisted candidates will be contacted for the next stage of the selection process. If you are shortlisted, you can typically expect to hear from us within 5 to 7 working days from the date of your application.
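The RESTful service development this posting asks for can be sketched with nothing but the stdlib; the `/datasets` route and payload below are invented for illustration, and in practice the role's Django/Flask/FastAPI stack would replace this handler:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# In-memory "database" standing in for a real data platform backend.
DATASETS = {"1": {"id": "1", "name": "budget-2025"}}

class DatasetHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Route: GET /datasets/<id> -> JSON resource or 404.
        parts = self.path.strip("/").split("/")
        if len(parts) == 2 and parts[0] == "datasets" and parts[1] in DATASETS:
            body = json.dumps(DATASETS[parts[1]]).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        pass  # silence per-request logging

server = HTTPServer(("127.0.0.1", 0), DatasetHandler)  # port 0: pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_port}/datasets/1"
with urllib.request.urlopen(url) as resp:
    payload = json.loads(resp.read())
print(payload["name"])
server.shutdown()
```

The shape is the same in any of the listed frameworks: a route, a lookup, a JSON body, and explicit status codes for the error path.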

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana

On-site

Location: Gurugram, Haryana, India
Category: Corporate
Job Id: GGN00002121
Department: Tech Ops / Maintenance - Management & Administrative
Job Type: Full-Time
Posted Date: 07/15/2025

Achieving our goals starts with supporting yours. Grow your career, access top-tier health and wellness benefits, build lasting connections with your team and our customers, and travel the world using our extensive route network. Come join us to create what’s next. Let’s define tomorrow, together.

Description
At United, we have some of the best aircraft in the world. Our Technical Operations team is full of aircraft maintenance technicians, engineers, planners, ground equipment and facilities professionals, and supply chain teams that help make sure they’re well taken care of and ready to get our customers to their desired destinations. If you’re ready to work on our planes, join our Tech Ops experts and help keep our fleet in tip-top shape.

Job overview and responsibilities
Technical Operations includes the maintenance and overhaul of our aircraft. This includes aircraft maintenance technicians, engineers, planners, ground equipment, facilities teams, supply chain teams and more. The Technical Operations Reliability team ensures United operates safely and dependably by analyzing aircraft defects and operational disruptions. The team monitors trends to notify maintenance and engineering teams of emerging issues and probable corrective actions.

As a member of the Reliability Engineering team, the Senior Reliability Engineer serves as the subject matter expert for their assigned fleet and reports directly to the Manager of Reliability. The role provides accurate, high-quality insights and trend analysis, helping various divisions make informed, data-driven decisions. In this role, the Senior Reliability Engineer will identify root causes of significant aircraft issues through detailed reporting and analysis. This position balances both strategic and tactical responsibilities, from long-term fleet initiatives to the day-to-day identification of recurring system failures. The candidate must possess an analytical and engineering mindset with a proven ability to drive business results through collaboration with cross-divisional organizations.

Responsibilities include, but are not limited to:
- Conduct daily, weekly and monthly surveillance of mechanical reliability performance to identify fleet/system trends and emerging reliability issues
- Support the identification and analysis of fleet/aircraft system trends, performing data drilldowns and leveraging engineering expertise to highlight top drivers and emerging issues for Fleet Managers and Engineers
- Assist in executing ongoing reliability and fleet management initiatives while addressing ad-hoc requests as needed
- Recommend reliability and safety improvements through detailed analysis and insights
- Contribute to the management of fleet reliability and facilitate cross-collaboration with various teams on technical topics
- Communicate complex technical data to a wide variety of key stakeholders in a clear and actionable manner
- Extract actionable insights from complex data to guide decision-making and drive improved reliability outcomes

This position is offered on local terms and conditions. Expatriate assignments and sponsorship for employment visas, even on a time-limited visa status, will not be awarded. This position is for United Airlines Business Services Pvt. Ltd, a wholly owned subsidiary of United Airlines Inc.

Qualifications

What’s needed to succeed (Minimum Qualifications):
- Bachelor’s degree in engineering or a related STEM discipline
- At least 5 years of experience in aircraft reliability, performance analysis, or a related technical field within the aviation maintenance industry
- Extensive experience in analytical roles, with a strong focus on delivering high-quality, accurate analysis and actionable insights
- In-depth knowledge of aircraft systems and fleet health/reliability programs
- Exceptional attention to detail and accuracy in all aspects of analysis and reporting
- Proficiency in Microsoft Office tools, particularly Microsoft Excel, with the ability to manipulate and analyze complex, high-volume data
- Strong ability to conduct drill-down analysis to identify operational root causes and deliver insights that drive decision-making
- Strong interpersonal skills, with the ability to collaborate effectively across teams and communicate with senior leadership
- Effective communication skills, with the ability to clearly present complex data to a variety of stakeholders
- Ability to lead and mentor junior engineers and analysts in best practices for data analysis and reliability engineering
- Must be legally authorized to work in India for any employer without sponsorship
- Must be fluent in English (written and spoken)
- Reliable, punctual attendance is an essential function of the position

What will help you propel from the pack (Preferred Qualifications):
- Master’s degree in aeronautical/mechanical engineering and/or MBA
- Experience working with large datasets, with the ability to perform data cleansing, normalization, and advanced analytics
- Experience with data analysis software and programming languages (e.g., Python, R, SQL)
- Experience in Palantir Foundry, including Contour analysis and dashboarding
- FAA A&P License or DGCA-issued equivalent
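The trend-surveillance work described above can be sketched as a toy rate calculation; every number, system name, and the 50% alarm rule below are invented purely for illustration:

```python
from statistics import mean

# Toy surveillance sketch: monthly pilot-report counts per aircraft system
# (ATA chapter), plus fleet flight hours. All numbers are invented.
flight_hours = [11800, 12050, 11900, 12300]  # last four months
reports = {
    "ATA 21 (air conditioning)": [14, 15, 13, 27],
    "ATA 32 (landing gear)":     [9, 8, 10, 9],
}

def rate_per_1000h(counts, hours):
    """Reports per 1,000 flight hours, month by month."""
    return [1000 * c / h for c, h in zip(counts, hours)]

# Flag a system as an emerging issue when the latest month's rate exceeds
# the trailing average of prior months by more than 50% -- a crude alarm.
emerging = []
for system, counts in reports.items():
    rates = rate_per_1000h(counts, flight_hours)
    baseline = mean(rates[:-1])
    if rates[-1] > 1.5 * baseline:
        emerging.append(system)

print(emerging)  # only the air-conditioning trend trips the alarm
```

Real reliability programs normalize defect counts by utilization exactly like this before trending, so that a busy month does not masquerade as a reliability problem.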

Posted 2 weeks ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

We are building a machine learning-based anomaly detection system using structured and sequential sensor data. The goal is to identify unusual patterns or faults through data modeling and visualization. This internship offers a real-world opportunity to work on machine learning pipelines and to understand both supervised and unsupervised approaches to anomaly detection. This phase focuses on offline/static modeling using historical sensor data in tabular and time-series formats.

🎯 Internship Objectives
- Analyze sensor datasets representing various operational scenarios
- Apply and evaluate supervised classification models
- Transition into unsupervised anomaly detection approaches
- Visualize insights and document findings for technical and non-technical audiences

📘 Key Responsibilities
- Perform data preprocessing: cleaning, encoding, normalization, and feature engineering
- Train and evaluate classification models using:
  - Artificial Neural Networks (ANN)
  - Long Short-Term Memory (LSTM) models for sequence-based classification
- Explore and implement unsupervised anomaly detection techniques:
  - Isolation Forest
  - One-Class SVM
  - Z-score or IQR-based statistical methods
- Analyze and visualize model outputs using:
  - Confusion matrices
  - Anomaly heatmaps
  - Time-series plots
- Optional: Build a lightweight dashboard (e.g., Streamlit) to present findings

About the Company: TechnoExcel is the leading training and consulting company in Hyderabad offering data analytics solutions.
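The statistical baselines named above (Z-score and IQR rules) fit in a few lines of stdlib Python; the sensor readings below are invented:

```python
from statistics import mean, stdev, quantiles

# Toy sensor series with one obvious fault spike (all values invented).
readings = [20.1, 20.4, 19.8, 20.0, 20.3, 35.0, 20.2, 19.9, 20.1, 20.0]

# Z-score rule: flag points far from the mean in standard-deviation units.
# (A 2.5-sigma cut is used here; with only 10 samples the outlier itself
# inflates sigma, so the textbook 3-sigma cut would never fire.)
mu, sigma = mean(readings), stdev(readings)
z_anomalies = [x for x in readings if abs(x - mu) / sigma > 2.5]

# IQR rule: flag points outside [Q1 - 1.5*IQR, Q3 + 1.5*IQR].
q1, _, q3 = quantiles(readings, n=4)
iqr = q3 - q1
iqr_anomalies = [x for x in readings
                 if x < q1 - 1.5 * iqr or x > q3 + 1.5 * iqr]

print(z_anomalies, iqr_anomalies)  # both rules flag only 35.0
```

The learned detectors on the list (Isolation Forest, One-Class SVM, LSTM) are evaluated against exactly these kinds of simple baselines, which is why the internship starts with them.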

Posted 2 weeks ago

Apply

0 years

0 Lacs

Vijayawada, Andhra Pradesh, India

On-site

About Us
SBI Card is a leading pure-play credit card issuer in India, offering a wide range of credit cards to cater to diverse customer needs. We are constantly innovating to meet the evolving financial needs of our customers, empowering them with digital currency for a seamless payment experience and rewarding benefits. At SBI Card, the motto 'Make Life Simple' inspires every initiative, ensuring that customer convenience is at the forefront of all that we do. We are committed to building an environment where people can thrive and create a better future for everyone.

SBI Card is proud to be an equal opportunity and inclusive employer and welcomes employees without any discrimination on the grounds of race, color, gender, religion, creed, disability, sexual orientation, gender identity, marital status, caste, etc. SBI Card is committed to fostering an inclusive and diverse workplace where all employees are treated equally with dignity and respect, which makes it a promising place to work. Join us to shape the future of digital payment in India and unlock your full potential.

What’s In It For You
- SBI Card truly lives by the work-life balance philosophy. We offer a robust wellness and wellbeing program to support the mental and physical health of our employees
- Admirable work deserves to be rewarded. We have a well-curated bouquet of rewards and recognition programs for employees
- Dynamic, inclusive and diverse team culture
- Gender-neutral policy
- Inclusive health benefits for all: medical insurance, personal accident, group term life insurance, annual health checkup, dental and OPD benefits
- Commitment to the overall development of an employee through a comprehensive learning & development framework

Role Purpose
Responsible for the management of all collections processes for the allocated portfolio in the assigned CD/Area, based on targets set for resolution, normalization, rollback/absolute recovery and ROR.

Role Accountability
- Conduct timely allocation of the portfolio to aligned vendors/NFTEs and conduct ongoing reviews to drive performance on business targets through an extended team of field executives and callers
- Formulate tactical short-term incentive plans for NFTEs to increase productivity and drive DRR
- Ensure various critical segments as defined by the business are reviewed and performance is driven on them
- Ensure judicious use of hardship tools and adherence to settlement waivers, both on rate and value
- Conduct ongoing field visits on critical accounts and ensure proper documentation in the Collect24 system of all field visits and telephone calls to customers
- Raise red flags in a timely manner based on deterioration in portfolio health indicators/frauds, and raise timely alarms on critical incidents as per compliance guidelines
- Ensure all guidelines mentioned in the SVCL are adhered to and that process hygiene is maintained at aligned agencies
- Ensure 100% data security using secured data transfer modes and data purging as per policy
- Ensure all customer complaints received are closed within the time frame
- Conduct thorough due diligence while onboarding/offboarding/renewing a vendor, and ensure all necessary formalities are completed prior to allocation
- Ensure agencies raise invoices in a timely manner
- Monitor NFTE ACR CAPE as per the collection strategy

Measures of Success
- Portfolio coverage
- Resolution rate
- Normalization/rollback rate
- Settlement waiver rate
- Absolute recovery (rupees collected)
- NFTE CAPE
- DRA certification of NFTEs
- Absolute customer complaints
- Absolute audit observations
- Process adherence as per MOU

Technical Skills / Experience / Certifications
Credit card knowledge along with a good understanding of collection processes

Competencies critical to the role
Analytical ability, stakeholder management, problem solving, result orientation, process orientation

Qualification
Post-Graduate / Graduate in any discipline

Preferred Industry
FSI

Posted 2 weeks ago

Apply


3.0 years

0 Lacs

Greater Kolkata Area

On-site

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: ALIP Product Configuration
Good-to-have skills: NA
Minimum Experience: 3 years
Educational Qualification: 15 years of full-time education

Summary
As a Product Configurator, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. Your typical day will involve working with ALIP Product Configuration, Product Development Management, and ALIP Development to deliver impactful data-driven solutions.

Roles & Responsibilities
- Requirement analysis: design, build, and configure applications to meet business process and application requirements using ALIP Product Configuration.
- Collaborate with cross-functional teams to develop and deploy ALIP Development solutions.
- Manage product development using Product Development Management methodologies.
- Ensure the quality and integrity of the application by conducting detailed analysis and testing.

Professional & Technical Skills
- Must-have: Life Insurance or Annuity background; ALIP Product Configuration, Product Development Management, ALIP Development.
- Good-to-have: Experience with Java, SQL, Agile methodologies, JIRA, RTM, and LOMA certification.
- Strong understanding of software engineering principles and best practices.
- Experience with software development life cycle (SDLC) processes.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.

Additional Information
- The candidate should have a minimum of 3 years of experience in product configuration in any insurance policy administration system.
- The ideal candidate will possess a strong educational background in software engineering, computer science, or a related field, along with a proven track record of delivering impactful data-driven solutions.
- This position is based at our Bengaluru/Mumbai/flex office.
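The data-munging skills the posting lists (cleaning, transformation, normalization) can be sketched in a few lines of plain Python; the values and the mean-imputation choice are illustrative only:

```python
# Toy column with missing entries (None), e.g. a premium amount field.
raw = [4.0, None, 10.0, 6.0, None, 8.0]

# Cleaning: impute missing entries with the mean of the observed values.
observed = [x for x in raw if x is not None]
fill = sum(observed) / len(observed)
cleaned = [fill if x is None else x for x in raw]

# Normalization: min-max rescale to [0, 1] so fields measured in
# different units become comparable downstream.
lo, hi = min(cleaned), max(cleaned)
normalized = [(x - lo) / (hi - lo) for x in cleaned]
print(normalized)
```

Production pipelines would typically do this with a dataframe library and record the imputation rule as metadata, but the transformation itself is exactly this.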

Posted 2 weeks ago

Apply

3.0 - 8.0 years

6 - 16 Lacs

Bengaluru

Remote

Hiring for a large USA-based MNC. We are looking for a detail-oriented and experienced SQL Developer to join our team. The ideal candidate will be responsible for developing, maintaining, and optimizing SQL databases and writing complex queries to ensure data accessibility and integrity. You will work closely with data analysts, software engineers, and business teams to support various data-driven projects.

Responsibilities
- Design, create, and maintain scalable databases
- Write complex SQL queries, stored procedures, triggers, functions, and views
- Optimize existing queries for performance and maintainability
- Perform data extraction, transformation, and loading (ETL)
- Monitor database performance, implement changes, and apply new patches and versions when required
- Ensure database security, integrity, stability, and system availability
- Work with application developers to integrate database logic with applications
- Troubleshoot and resolve data issues in a timely manner
- Generate reports and data visualizations for stakeholders as needed

Requirements
- Strong proficiency in SQL and experience with relational database systems such as MySQL, SQL Server, PostgreSQL, or Oracle
- Experience in writing and debugging stored procedures, functions, and complex queries
- Understanding of data warehousing concepts and ETL processes
- Familiarity with database design, normalization, and indexing
- Knowledge of performance tuning and query optimization
- Experience with reporting tools such as SSRS, Power BI, or Tableau is a plus
- Good understanding of data governance, security, and compliance
- Excellent analytical and problem-solving skills
- Strong communication and teamwork abilities
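The trigger and view skills listed above can be sketched with SQLite through Python's stdlib (SQLite has no stored procedures, so a trigger stands in here; the sales schema is invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales (id INTEGER PRIMARY KEY, region TEXT, amount REAL);

-- View: a reusable aggregate query for reporting.
CREATE VIEW region_totals AS
SELECT region, SUM(amount) AS total
FROM sales GROUP BY region;

-- Trigger: keep an audit log of every insert automatically.
CREATE TABLE sales_audit (sale_id INTEGER, logged_at TEXT);
CREATE TRIGGER log_sale AFTER INSERT ON sales
BEGIN
    INSERT INTO sales_audit VALUES (NEW.id, datetime('now'));
END;
""")
conn.executemany("INSERT INTO sales(region, amount) VALUES (?, ?)",
                 [("north", 100.0), ("north", 50.0), ("south", 75.0)])
totals = dict(conn.execute("SELECT region, total FROM region_totals"))
audit_rows = conn.execute("SELECT COUNT(*) FROM sales_audit").fetchone()[0]
print(totals, audit_rows)
```

In MySQL, SQL Server, PostgreSQL, or Oracle the same pattern appears with richer procedural dialects (T-SQL, PL/pgSQL, PL/SQL) for the stored-procedure part.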

Posted 2 weeks ago

Apply

3.0 years

7 - 9 Lacs

Mohali

On-site

Responsibilities:
· Develop and deploy machine learning models to optimize HVAC setpoints, energy consumption, and operational performance.
· Design algorithms for predictive maintenance, fault detection, and dynamic thresholding specific to chillers, pumps, cooling towers, and air handling units.
· Work with domain experts and controls engineers to integrate data-driven solutions into existing BMS, SCADA, or edge computing platforms.
· Analyze historical and real-time sensor data (temperature, pressure, flow, energy) to identify patterns, anomalies, and optimization opportunities.
· Build scalable pipelines for data ingestion, cleaning, normalization, and feature engineering using time-series data from HVAC systems.
· Conduct what-if analyses and energy simulations to validate model outputs and savings estimates.
· Create visualizations, dashboards, and reports that clearly communicate insights and recommendations to technical and non-technical stakeholders.
· Collaborate with software engineers to productize algorithms within cloud or on-prem solutions.
Requirements:
· Bachelor’s or Master’s degree in Data Science, Computer Science, Mechanical Engineering, Energy Systems, or a related field.
· 3+ years of experience applying data science techniques to industrial systems, preferably on HVAC or energy optimization projects.
· Strong skills in Python/R/SQL, with libraries such as pandas, scikit-learn, TensorFlow/PyTorch, and XGBoost, plus neural network and RL model development.
· Proven experience with time-series analysis, forecasting, and anomaly detection.
· Knowledge of HVAC equipment, control strategies, and energy efficiency principles.
· Hands-on experience with BMS/SCADA/OPC/Modbus/OPC UA data integration.
· Familiarity with cloud platforms (AWS, GCP, Azure) and/or edge computing frameworks.
· Excellent problem-solving skills, with the ability to translate operational challenges into data science solutions.
Job Type: Full-time. Pay: ₹700,000.00 - ₹900,000.00 per year. Schedule: Day shift. Application Question(s): What is your expected CTC and notice period? Experience: Total: 3 years (Required); Python: 2 years (Required); SQL: 1 year (Required). Work Location: In person
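The normalization and anomaly-detection duties described above can be sketched at their simplest as z-score flagging over a sensor trace. A stdlib-only sketch; the temperature values and threshold are hypothetical:

```python
from statistics import mean, pstdev

def zscore_anomalies(readings, threshold=2.5):
    """Return indices of readings whose z-score magnitude exceeds threshold."""
    mu, sigma = mean(readings), pstdev(readings)
    if sigma == 0:
        return []
    return [i for i, x in enumerate(readings)
            if abs((x - mu) / sigma) > threshold]

# Hypothetical chiller supply-temperature trace (degrees C) with one fault spike.
temps = [6.1, 6.0, 6.2, 5.9, 6.1, 6.0, 14.8, 6.2, 6.1, 6.0]
anomalies = zscore_anomalies(temps, threshold=2.5)
```

A production pipeline would compute the statistics over a rolling window (or a seasonal baseline) rather than the whole series, so that slow drift is not masked by the global mean.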

Posted 2 weeks ago

Apply

4.0 years

4 - 4 Lacs

Chennai

On-site

As a Full Stack Developer (PHP/Python), you will craft scalable applications, optimize database structures, and seamlessly integrate APIs while adhering to agile methodologies. Your work will directly impact our clients' digital interventions, fueling departmental success and organizational growth. Collaborating with cross-functional teams and mentoring junior developers, you contribute to a quality-focused, continuous learning culture that aligns with industry trends and best practices in the full stack development community. Technical Responsibilities: Develop high-quality full stack applications using PHP (Laravel/Symfony) and Python (Django/Flask), ensuring they are scalable, maintainable, and responsive. Design and optimize database structures in MySQL, with a focus on normalization, query optimization, and data integrity management. Implement RESTful APIs and SOAP services for seamless integration with third-party applications or external systems, using industry best practices. Collaborate effectively with cross-functional teams to ensure the delivery of high-quality code that meets or exceeds industry standards. This includes writing clear, concise documentation and participating in regular code reviews. Qualifications: Bachelor’s degree in Computer Science, Engineering, or related field. 4+ years of experience in full-stack development with strong PHP and/or Python expertise. Strong understanding of software development lifecycle and best practices. Demonstrated experience delivering production-grade applications across domains. Job Type: Full-time Experience: Python: 4 years (Required) PHP: 4 years (Required) Java: 4 years (Required) Work Location: In person

Posted 2 weeks ago

Apply

6.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Minimum criteria to match the requirement. Detailed JD: Experience Range: 6 to 14 years
Implemented 5+ successful K2 projects in developer and lead roles.
• Experience in analysis, design, and development of web/Windows-based applications and n-tier applications in both client and server environments.
• Proficient in Relational Database Management Systems (RDBMS).
• Expertise in Transact-SQL (DDL, DML) and in design and normalization of database tables, including data migration experience.
• Experience in implementing business logic using triggers, indexes, views, and stored procedures.
• Experienced in the complete software development life cycle using the Agile framework.
• Able to quickly pick up new and leading technologies.
• Preferably knowledgeable in logistics, sales, and authorization workflows.
Required Tools Skills:
• BPM Tools: K2 Blackpearl and K2 Five; K2 Cloud would be an advantage.
• RDBMS: MS SQL Server 2008 R2, 2012, and 2016
• Internet Technologies: JavaScript, ASP.NET, MVC, MVC Web APIs, XML, and HTML
• Cloud Technologies: Azure, Worker Roles, Web Roles, Web App, Azure VM, Azure SQL Server DB

Posted 2 weeks ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

About Aezion Aezion is one of the premier custom software providers in the United States, and we live by the adage that our word is our bond. Our Promise is to get it right or make it right. We accomplish this by investing the effort to exceed client expectations from start to finish - architecting, designing, developing, hosting, deploying, maintaining, and supporting our clients throughout the project lifecycle. We believe that work is ministry – an expression of our values. Our goal is to honor our commitments to clients and the life energies of Aezion employees through results that transform clients into lifelong partners. Position Description Job Title: AI/ML Architect Job Summary: We are looking for an innovative AI/ML Engineer to join our team and work on cutting-edge machine learning and artificial intelligence projects. The ideal candidate will have experience in building, deploying, and optimizing AI/ML models, along with a strong foundation in data science, programming, and algorithms. You will help drive the development of intelligent systems that leverage machine learning to solve real-world problems and improve business outcomes. Key Responsibilities: Data Preparation and Analysis: ability to understand large datasets, preprocess them, and extract features. Data Preprocessing Techniques: knowledge of normalization, feature encoding, and handling missing values. Data Cleaning: identifying and rectifying errors, outliers, and missing values within datasets. Design, develop, and implement machine learning and deep learning (FNN, CNN, RNN) models, with a focus on LLMs, generative AI, and fraud detection systems. Deploy and maintain ML models in AWS or other cloud environments. Optimize model performance and scalability. Collaborate with cross-functional teams to integrate AI solutions into existing applications. Develop and maintain APIs (RESTful) for AI model integration. Implement MLOps best practices to streamline the ML lifecycle.
Stay up-to-date with the latest advancements in AI/ML and incorporate new techniques into our workflow. Develop and implement fraud detection models to identify and prevent fraudulent activities. Evaluate model performance using appropriate metrics and techniques, ensuring high accuracy and reliability. Experience with Machine Learning Libraries and Frameworks: familiarity with tools like TensorFlow, PyTorch, scikit-learn, and Keras.
GenAI Developer Job Description: We are seeking a talented Generative AI Developer to join our innovative team. The ideal candidate will have a strong background in AI and machine learning, with a focus on generative models and large language models (LLMs). You will work closely with cross-functional teams to conceptualize, design, test, and deploy AI projects that drive innovation and provide value in the rapidly evolving field of artificial intelligence. Join us and be part of a dynamic team that is shaping the future of AI. Advanced Programming Knowledge: Mastery in programming languages like Python and expertise in AI-specific libraries such as TensorFlow, PyTorch, and Keras. Proficiency in implementing and manipulating complex algorithms essential for generative AI development. Generative Models Expertise: In-depth experience with Generative Adversarial Networks (GANs) and Variational Autoencoders (VAEs). Ability to design, train, and optimize these models for generating high-quality and creative content. Natural Language Processing (NLP): Strong background in text generation techniques, including text parsing, sentiment analysis, and the use of transformers like GPT models for advanced text-based applications. Vector Databases: Hands-on experience with vector databases such as Pinecone, PgVector, and Qdrant for efficient retrieval and similarity search.
Embedding, Retrieval-Augmented Generation (RAG), and Indexing: Expertise in creating embeddings, implementing RAG workflows, and indexing vector databases to improve search, retrieval, and contextual generation. LangChain and LangGraph: Expertise in building advanced workflows and applications using LangChain for LLM-based solutions and LangGraph for structured workflows with conditional logic. FastAPI and Microservices: Proficiency in building scalable, RESTful FastAPI applications and microservices architectures for deploying AI solutions. LangSmith and LangFuse: Experience using LangSmith for debugging and evaluation of LLM chains, and LangFuse for logging and analytics in LLM applications. Cloud Computing and Deployment: Expertise in deploying and managing AI applications on cloud platforms such as AWS, Google Cloud, and Microsoft Azure. Familiarity with Docker for containerization and Kubernetes for scaling and orchestration.
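The retrieval step behind RAG and vector-database similarity search reduces, at its core, to ranking stored embeddings by cosine similarity against a query embedding. A toy, stdlib-only sketch; the 3-dimensional vectors and document IDs are made up, and a real deployment would obtain embeddings from a model and store them in a vector database such as Pinecone, PgVector, or Qdrant:

```python
from math import sqrt

def cosine(a, b):
    """Cosine similarity of two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sqrt(sum(x * x for x in a))
    nb = sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def retrieve(query_vec, index, top_k=2):
    """Return the top_k document ids ranked by similarity to the query."""
    ranked = sorted(index, key=lambda item: cosine(query_vec, item[1]),
                    reverse=True)
    return [doc_id for doc_id, _ in ranked[:top_k]]

# Toy "embeddings"; a production system would use model-generated vectors
# of hundreds of dimensions, plus an approximate-nearest-neighbor index.
index = [("doc_a", [1.0, 0.0, 0.0]),
         ("doc_b", [0.9, 0.1, 0.0]),
         ("doc_c", [0.0, 1.0, 0.0])]
hits = retrieve([1.0, 0.05, 0.0], index, top_k=2)
```

The retrieved documents are then injected into the LLM prompt as context, which is the "augmented generation" half of RAG.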

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Bhubaneswar, Odisha, India

On-site

Project Role: Application Lead Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have skills: SAP ABAP Development for HANA. Good to have skills: NA. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years full-time education. Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for managing the team and ensuring successful project delivery. Your typical day will involve collaborating with multiple teams, making key decisions, and providing solutions to problems for your immediate team and across multiple teams. Roles & Responsibilities: - Expected to be an SME - Collaborate and manage the team to perform - Responsible for team decisions - Engage with multiple teams and contribute to key decisions - Provide solutions to problems for their immediate team and across multiple teams - Lead the effort to design, build, and configure applications - Act as the primary point of contact - Manage the team and ensure successful project delivery Professional & Technical Skills: - Must Have Skills: Expertise in SAP ABAP Development for HANA. S/4 HANA implementation work experience.
CDS, RAP & AMDP experience. Experience in OData services. Life sciences experience. - Strong understanding of statistical analysis and machine learning algorithms - Experience with data visualization tools such as Tableau or Power BI - Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms - Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity Additional Information: - The candidate should have a minimum of 5 years of experience in SAP ABAP Development for HANA - This position is based at our Hyderabad office - A 15-year full-time education is required

Posted 2 weeks ago

Apply

0.0 - 10.0 years

0 Lacs

Navi Mumbai, Maharashtra

On-site

Job Title: Senior Oracle Database Developer Experience: 10+ Years Location: Navi Mumbai Job Type: Full-Time Job Summary: We are looking for a highly experienced Senior Oracle Database Developer with over 10 years of proven expertise in Oracle database architecture, development, and performance optimization. The ideal candidate should possess strong PL/SQL programming skills, data modeling proficiency, and deep understanding of Oracle features, tools, and best practices for enterprise-grade systems. Key Responsibilities: Design, develop, and maintain complex Oracle PL/SQL packages, procedures, functions, and triggers. Analyze business requirements and translate them into database solutions and enhancements. Optimize and tune SQL queries and database performance. Develop and maintain data models, ER diagrams, and schemas. Work closely with application developers to design efficient database structures. Ensure data integrity, security, and high availability of Oracle environments. Manage database deployments, migrations, and release cycles. Implement backup strategies, recovery processes, and disaster recovery plans. Monitor database health, logs, and system metrics regularly. Mentor junior developers and enforce best practices and coding standards. Required Skills & Qualifications: 10+ years of hands-on experience with Oracle Database (11g/12c/19c). Expert-level PL/SQL development skills. Strong background in data modeling, normalization, and schema design. Experience in performance tuning and query optimization. Knowledge of Oracle tools: SQL*Plus, SQL Developer, AWR, Statspack, TKPROF, etc. Proficient in writing complex stored procedures and scripts for ETL, data processing, and automation. Exposure to Oracle Apex, Forms/Reports (optional but preferred). Working knowledge of UNIX/Linux shell scripting. Strong problem-solving skills and analytical thinking. Bachelor's/Master's degree in Computer Science, Information Technology, or related field. 
Preferred: Experience with cloud-based Oracle databases (OCI, AWS RDS for Oracle, Azure). Knowledge of Agile methodologies and CI/CD tools. Oracle certifications (OCP/OCA) are a plus. Job Types: Full-time, Permanent. Benefits: Internet reimbursement, paid sick time, paid time off. Schedule: Monday to Friday. Ability to commute/relocate: Navi Mumbai, Maharashtra: reliably commute or planning to relocate before starting work (Preferred). Application Question(s): Current CTC; Expected CTC; Notice Period. Education: Bachelor's (Preferred). Experience: Oracle: 10 years (Required). Location: Navi Mumbai, Maharashtra (Preferred). Work Location: In person

Posted 2 weeks ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Exp level: 6-10 yrs. Skills Required:
Implemented 5+ successful K2 projects in developer and lead roles.
• Experience in analysis, design, and development of web/Windows-based applications and n-tier applications in both client and server environments.
• Proficient in Relational Database Management Systems (RDBMS).
• Expertise in Transact-SQL (DDL, DML) and in design and normalization of database tables, including data migration experience.
• Experience in implementing business logic using triggers, indexes, views, and stored procedures.
• Experienced in the complete software development life cycle using the Agile framework.
• Able to quickly pick up new and leading technologies.
• Preferably knowledgeable in logistics, sales, and authorization workflows.
Required Tools Skills:
• BPM Tools: K2 Blackpearl and K2 Five; K2 Cloud would be an advantage.
• RDBMS: MS SQL Server 2008 R2, 2012, and 2016
• Internet Technologies: JavaScript, ASP.NET, MVC, MVC Web APIs, XML, and HTML
• Cloud Technologies: Azure, Worker Roles, Web Roles, Web App, Azure VM, Azure SQL Server DB
Note: Notice period 20-30 days

Posted 2 weeks ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Exp level: 6-10 yrs. Skills Required:
Implemented 5+ successful K2 projects in developer and lead roles.
• Experience in analysis, design, and development of web/Windows-based applications and n-tier applications in both client and server environments.
• Proficient in Relational Database Management Systems (RDBMS).
• Expertise in Transact-SQL (DDL, DML) and in design and normalization of database tables, including data migration experience.
• Experience in implementing business logic using triggers, indexes, views, and stored procedures.
• Experienced in the complete software development life cycle using the Agile framework.
• Able to quickly pick up new and leading technologies.
• Preferably knowledgeable in logistics, sales, and authorization workflows.
Required Tools Skills:
• BPM Tools: K2 Blackpearl and K2 Five; K2 Cloud would be an advantage.
• RDBMS: MS SQL Server 2008 R2, 2012, and 2016
• Internet Technologies: JavaScript, ASP.NET, MVC, MVC Web APIs, XML, and HTML
• Cloud Technologies: Azure, Worker Roles, Web Roles, Web App, Azure VM, Azure SQL Server DB
Note: Notice period 20-30 days

Posted 2 weeks ago

Apply

10.0 years

0 Lacs

India

Remote

Job Title: Lead Data Engineer Experience: 8–10 Years Location: Remote Job Type: Full-Time Mandatory: Prior hands-on experience with Fivetran integrations About the Role: We are seeking a highly skilled Lead Data Engineer with 8–10 years of deep expertise in cloud-native data platforms, including Snowflake, Azure, DBT, and Fivetran. This role will drive the design, development, and optimization of scalable data pipelines, leading a cross-functional team and ensuring data engineering best practices are implemented and maintained. Key Responsibilities: Lead the design and development of data pipelines (batch and real-time) using Azure, Snowflake, DBT, Python, and Fivetran. Translate complex business and data requirements into scalable, efficient data engineering solutions. Architect multi-cluster Snowflake setups with an eye on performance and cost. Design and implement robust CI/CD pipelines for data workflows (Git-based). Collaborate closely with analysts, architects, and business teams to ensure data architecture aligns with organizational goals. Mentor and review the work of onshore/offshore data engineers. Define and enforce coding standards, testing frameworks, monitoring strategies, and data quality best practices. Handle real-time data processing scenarios where applicable. Own end-to-end delivery and documentation for data engineering projects. Must-Have Skills: Fivetran: proven experience integrating and managing Fivetran connectors and sync strategies. Snowflake expertise: warehouse management, cost optimization, and query tuning; internal vs. external stages and loading/unloading strategies; schema design, security model, and user access. Python (advanced): modular, production-ready code for ETL/ELT, APIs, and orchestration. DBT: strong command of DBT for transformation workflows and modular pipelines. Azure: Azure Data Factory (ADF) and Databricks; integration with Snowflake and other services. SQL: expert-level SQL for transformations, validations, and optimizations. Version Control: Git, branching, pull requests, and peer code reviews. CI/CD: DevOps/DataOps workflows for data pipelines. Data Modeling: star schema, Data Vault, and normalization/denormalization techniques. Strong documentation using Confluence, Word, Excel, etc. Excellent communication skills, verbal and written. Good to Have: Experience with real-time data streaming tools (Event Hub, Kafka). Exposure to monitoring/data observability tools. Experience with cost management strategies for cloud data platforms. Exposure to Agile/Scrum-based environments.
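The data-modeling skills this role names (star schema, denormalization) come down to fact tables referencing dimension tables by surrogate keys and aggregating across the join. A toy, stdlib-only sketch of that roll-up; the table names, keys, and values are hypothetical:

```python
# Toy star schema: a sales fact table referencing product and date
# dimensions through surrogate keys (all names and values hypothetical).
dim_product = {1: "Widget", 2: "Gadget"}
dim_date = {20240101: "2024-01-01", 20240102: "2024-01-02"}

fact_sales = [
    {"product_key": 1, "date_key": 20240101, "amount": 100.0},
    {"product_key": 2, "date_key": 20240101, "amount": 40.0},
    {"product_key": 1, "date_key": 20240102, "amount": 60.0},
]

# The classic OLAP roll-up: join the fact to a dimension, then aggregate.
revenue_by_product = {}
for row in fact_sales:
    name = dim_product[row["product_key"]]
    revenue_by_product[name] = revenue_by_product.get(name, 0.0) + row["amount"]
```

In Snowflake or DBT the same shape appears as a fact model joined to dimension models with a GROUP BY; keeping dimensions separate (rather than denormalizing into the fact) is what keeps attribute updates cheap.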

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Level AI was founded in 2019 and is a Series C startup headquartered in Mountain View, California. Level AI revolutionises customer engagement by transforming contact centres into strategic assets. Our AI-native platform leverages advanced technologies such as Large Language Models to extract deep insights from customer interactions. By providing actionable intelligence, Level AI empowers organisations to enhance customer experience and drive growth. Consistently updated with the latest AI innovations, Level AI stands as the most adaptive and forward-thinking solution in the industry. Position Overview: We seek an experienced Staff Software Engineer to lead the design and development of our data warehouse and analytics platform in addition to helping raise the engineering bar for the entire technology stack at Level AI, including applications, platform, and infrastructure. They will actively collaborate with team members and the wider Level AI engineering community to develop highly scalable and performant systems. They will be a technical thought leader who will help drive solving complex problems of today and the future by designing and building simple and elegant technical solutions. They will coach and mentor junior engineers and drive engineering best practices. They will actively collaborate with product managers and other stakeholders both inside and outside the team. What you’ll get to do at Level AI (and more as we grow together): Design, develop, and evolve data pipelines that ingest and process high-volume data from multiple external and internal sources. Build scalable, fault-tolerant architectures for both batch and real-time data workflows using tools like GCP Pub/Sub, Kafka, and Celery. Define and maintain robust data models with a focus on domain-oriented design, supporting both operational and analytical workloads. Architect and implement data lake/warehouse solutions using Postgres and Snowflake.
Lead the design and deployment of workflow orchestration using Apache Airflow for end-to-end pipeline automation. Ensure platform reliability with strong monitoring, alerting, and observability for all data services and pipelines. Collaborate closely with other internal product and engineering teams to align data platform capabilities with product and business needs. Own and enforce data quality, schema evolution, data contract practices, and governance standards. Provide technical leadership, mentor junior engineers, and contribute to cross-functional architectural decisions. We'd love to explore more about you if you have: 8+ years of experience building large-scale data systems, preferably in high-ingestion, multi-source environments. Strong system design, debugging, and performance tuning skills. Strong programming skills in Python and Java. Deep understanding of SQL (Postgres, MySQL) and data modeling (star/snowflake schema, normalization/denormalization). Hands-on experience with streaming platforms like Kafka and GCP Pub/Sub. Expertise with Airflow or similar orchestration frameworks. Solid experience with Snowflake, Postgres, and distributed storage design. Familiarity with Celery for asynchronous task processing. Comfortable working with ElasticSearch for data indexing and querying. Exposure to Redash, Metabase, or similar BI/analytics tools. Proven experience deploying solutions on cloud platforms like GCP or AWS. Compensation: We offer market-leading compensation, based on the skills and aptitude of the candidate. Preferred Attributes: Experience with data governance and lineage tools. Demonstrated ability to handle scale, reliability, and incident response in data systems. Excellent communication and stakeholder management skills. Passion for mentoring and growing engineering talent.
To learn more visit: https://thelevel.ai/ Funding: https://www.crunchbase.com/organization/level-ai LinkedIn: https://www.linkedin.com/company/level-ai/ Our AI platform: https://www.youtube.com/watch?v=g06q2V_kb-s
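The data-quality and data-contract ownership this role describes reduces, at its smallest, to composable validation and deduplication stages in the ingestion path. A stdlib-only sketch with made-up field names and records:

```python
def validate(records, required=("id", "value")):
    """Drop records missing required fields (a minimal data-contract check)."""
    for rec in records:
        if all(key in rec for key in required):
            yield rec

def deduplicate(records, key="id"):
    """Keep only the first record per key, a simple idempotent-ingest guard."""
    seen = set()
    for rec in records:
        if rec[key] not in seen:
            seen.add(rec[key])
            yield rec

# Hypothetical raw feed with a duplicate and a malformed record.
raw = [{"id": 1, "value": 10}, {"id": 1, "value": 10},
       {"value": 99}, {"id": 2, "value": 5}]
clean = list(deduplicate(validate(raw)))
```

Because both stages are generators, the same composition works over an unbounded stream (e.g. a Kafka or Pub/Sub consumer loop) as well as a batch file, which is the usual reason pipelines are written this way.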

Posted 2 weeks ago

Apply

0 years

0 Lacs

India

Remote

Job Title: SQL Developer Intern Company: Enerzcloud Solutions Location: Remote Job Type: Internship (Full-Time) Duration: 1–3 Months Stipend: ₹25,000/month Department: Data Engineering / Development About the Company: Enerzcloud Solutions is a forward-thinking technology company focused on delivering smart, scalable software and data solutions. We help businesses make better decisions through automation, data analysis, and cutting-edge development practices. Job Summary: We are seeking a dedicated and detail-oriented SQL Developer Intern to join our remote development team. This internship offers real-world exposure to writing SQL queries, managing databases, and supporting business intelligence and analytics processes. Key Responsibilities: Write and optimize SQL queries for data extraction and reporting Assist in designing, creating, and maintaining relational databases Perform data validation, transformation, and troubleshooting tasks Work on ETL processes and support data pipeline development Collaborate with data analysts and developers to fulfill data needs Document queries, schemas, and workflow processes Requirements: Pursuing or recently completed a degree in Computer Science, IT, or related field Strong foundational knowledge of SQL and relational databases Familiarity with MySQL, PostgreSQL, SQL Server, or similar platforms Understanding of normalization, joins, indexing, and query optimization Basic knowledge of Excel or BI tools is a plus Eager to learn and adapt in a remote work environment Perks & Benefits: ₹25,000/month stipend Real-world data and development project exposure Internship certificate upon successful completion Mentorship and learning support Flexible remote working

Posted 2 weeks ago

Apply

0 years

0 Lacs

India

Remote

Job Title: SQL Developer Intern Company: Enerzcloud Solutions Location: Remote Job Type: Internship (Full-Time) Duration: 1–3 Months Stipend: ₹25,000/month Department: Data Engineering / Development About the Company: Enerzcloud Solutions is a leading technology company providing smart, scalable digital solutions to clients across various sectors. We specialize in web and software development, automation, and data solutions to help businesses grow efficiently. Job Summary: We are seeking a detail-oriented and motivated SQL Developer Intern to join our data and development team. This role offers hands-on experience in writing SQL queries, managing databases, and supporting data-related projects in a real-time environment. Key Responsibilities: Write efficient and optimized SQL queries for data extraction and reporting Assist in designing, developing, and maintaining relational databases Support data validation, cleansing, and transformation activities Collaborate with analysts and developers to meet data requirements Document database changes, processes, and workflows Troubleshoot and resolve database-related issues Requirements: Pursuing or recently completed a degree in Computer Science, Information Technology, or related field Basic understanding of SQL and relational database concepts Familiarity with databases like MySQL, PostgreSQL, or SQL Server Knowledge of data normalization and query optimization Strong problem-solving and logical thinking skills Eagerness to learn and adapt in a remote work environment Perks & Benefits: ₹25,000/month stipend Real-world project experience Mentorship from experienced SQL developers Internship completion certificate Flexible remote work setup

Posted 2 weeks ago

Apply

0 years

0 Lacs

Secunderābād, Telangana, India

On-site

About Us JOB DESCRIPTION SBI Card is a leading pure-play credit card issuer in India, offering a wide range of credit cards to cater to diverse customer needs. We are constantly innovating to meet the evolving financial needs of our customers, empowering them with digital currency for a seamless payment experience and rewarding benefits. At SBI Card, the motto 'Make Life Simple' inspires every initiative, ensuring that customer convenience is at the forefront of all that we do. We are committed to building an environment where people can thrive and create a better future for everyone. SBI Card is proud to be an equal opportunity & inclusive employer and welcomes employees without any discrimination on the grounds of race, color, gender, religion, creed, disability, sexual orientation, gender identity, marital status, caste etc. SBI Card is committed to fostering an inclusive and diverse workplace where all employees are treated equally with dignity and respect, which makes it a promising place to work. Join us to shape the future of digital payment in India and unlock your full potential. What’s In It For YOU SBI Card truly lives by the work-life balance philosophy. We offer a robust wellness and wellbeing program to support mental and physical health of our employees Admirable work deserves to be rewarded. We have a well-curated bouquet of rewards and recognition programs for the employees Dynamic, Inclusive and Diverse team culture Gender Neutral Policy Inclusive Health Benefits for all - Medical Insurance, Personal Accidental, Group Term Life Insurance and Annual Health Checkup, Dental and OPD benefits Commitment to the overall development of an employee through comprehensive learning & development framework Role Purpose Responsible for the management of all collections processes for the allocated portfolio in the assigned CD/Area, basis targets set for resolution, normalization, rollback/absolute recovery and ROR.
Role Accountability Conduct timely allocation of portfolio to aligned vendors/NFTEs and conduct ongoing reviews to drive performance on the business targets through an extended team of field executives and callers Formulate tactical short term incentive plans for NFTEs to increase productivity and drive DRR Ensure various critical segments as defined by business are reviewed and performance is driven on them Ensure judicious use of hardship tools and adherence to the settlement waivers both on rate and value Conduct ongoing field visits on critical accounts and ensure proper documentation in Collect24 system of all field visits and telephone calls to customers Raise red flags in a timely manner basis deterioration in portfolio health indicators/frauds and raise timely alarms on critical incidents as per the compliance guidelines Ensure all guidelines mentioned in the SVCL are adhered to and that process hygiene is maintained at aligned agencies Ensure 100% data security using secured data transfer modes and data purging as per policy Ensure all customer complaints received are closed within time frame Conduct thorough due diligence while onboarding/offboarding/renewing a vendor and all necessary formalities are completed prior to allocating Ensure agencies raise invoices timely Monitor NFTE ACR CAPE as per the collection strategy Measures of Success Portfolio Coverage Resolution Rate Normalization/Roll back Rate Settlement waiver rate Absolute Recovery Rupee collected NFTE CAPE DRA certification of NFTEs Absolute Customer Complaints Absolute audit observations Process adherence as per MOU Technical Skills / Experience / Certifications Credit Card knowledge along with good understanding of Collection Processes Competencies critical to the role Analytical Ability Stakeholder Management Problem Solving Result Orientation Process Orientation Qualification Post-Graduate / Graduate in any discipline Preferred Industry FSI

Posted 2 weeks ago

Apply

0 years

0 Lacs

Mumbai Metropolitan Region

On-site

About Us JOB DESCRIPTION SBI Card is a leading pure-play credit card issuer in India, offering a wide range of credit cards to cater to diverse customer needs. We are constantly innovating to meet the evolving financial needs of our customers, empowering them with digital currency for a seamless payment experience and rewarding benefits. At SBI Card, the motto 'Make Life Simple' inspires every initiative, ensuring that customer convenience is at the forefront of all that we do. We are committed to building an environment where people can thrive and create a better future for everyone. SBI Card is proud to be an equal opportunity & inclusive employer and welcomes employees without any discrimination on the grounds of race, colour, gender, religion, creed, disability, sexual orientation, gender identity, marital status, caste etc. SBI Card is committed to fostering an inclusive and diverse workplace where all employees are treated equally with dignity and respect, which makes it a promising place to work. Join us to shape the future of digital payment in India and unlock your full potential. What’s In It For YOU SBI Card truly lives by the work-life balance philosophy. We offer a robust wellness and wellbeing program to support mental and physical health of our employees Admirable work deserves to be rewarded. We have a well-curated bouquet of rewards and recognition programs for the employees Dynamic, Inclusive and Diverse team culture Gender Neutral Policy Inclusive Health Benefits for all - Medical Insurance, Personal Accidental, Group Term Life Insurance and Annual Health Checkup, Dental and OPD benefits Commitment to the overall development of an employee through comprehensive learning & development framework Role Purpose Responsible for the management of all collections processes for the allocated portfolio in the assigned CD/Area, basis targets set for resolution, normalization, rollback/absolute recovery and ROR.
Role Accountability
Conduct timely allocation of the portfolio to aligned vendors/NFTEs and conduct ongoing reviews to drive performance on the business targets through an extended team of field executives and callers
Formulate tactical short-term incentive plans for NFTEs to increase productivity and drive DRR
Ensure various critical segments as defined by the business are reviewed and performance is driven on them
Ensure judicious use of hardship tools and adherence to settlement waivers, both on rate and value
Conduct ongoing field visits on critical accounts and ensure proper documentation in the Collect24 system of all field visits and telephone calls to customers
Raise red flags in a timely manner based on deterioration in portfolio health indicators/frauds, and raise timely alarms on critical incidents as per the compliance guidelines
Ensure all guidelines mentioned in the SVCL are adhered to and that process hygiene is maintained at aligned agencies
Ensure 100% data security using secured data transfer modes and data purging as per policy
Ensure all customer complaints received are closed within the time frame
Conduct thorough due diligence while onboarding/offboarding/renewing a vendor, and ensure all necessary formalities are completed prior to allocating
Ensure agencies raise invoices on time
Monitor NFTE ACR and CAPE as per the collection strategy

Measures of Success
Portfolio Coverage
Resolution Rate
Normalization/Rollback Rate
Settlement waiver rate
Absolute Recovery
Rupee collected
NFTE CAPE
DRA certification of NFTEs
Absolute Customer Complaints
Absolute audit observations
Process adherence as per MOU

Technical Skills / Experience / Certifications
Credit card knowledge along with a good understanding of collection processes.

Competencies critical to the role
Analytical Ability
Stakeholder Management
Problem Solving
Result Orientation
Process Orientation

Qualification
Post-Graduate / Graduate in any discipline

Preferred Industry
FSI
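Several of the measures of success above (Resolution Rate, Normalization/Rollback Rate) are ratio metrics over the allocated portfolio. The listing does not define the exact formulas, so the definitions in this sketch are assumptions, shown only to illustrate how such KPIs are typically computed:

```python
# Illustrative portfolio KPI calculations. What counts as "resolved" or
# "normalized" is an assumption for this sketch, not taken from the listing.

def resolution_rate(resolved_accounts: int, allocated_accounts: int) -> float:
    """Assumed: share of allocated delinquent accounts with any payment secured."""
    return resolved_accounts / allocated_accounts if allocated_accounts else 0.0

def normalization_rate(normalized_accounts: int, allocated_accounts: int) -> float:
    """Assumed: share of accounts rolled back to a current (non-delinquent) bucket."""
    return normalized_accounts / allocated_accounts if allocated_accounts else 0.0

portfolio = {"allocated": 1200, "resolved": 930, "normalized": 540}
print(f"Resolution rate:    {resolution_rate(portfolio['resolved'], portfolio['allocated']):.1%}")
print(f"Normalization rate: {normalization_rate(portfolio['normalized'], portfolio['allocated']):.1%}")
```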

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Position: Software Engineer / Sr. Software Engineer
Education Qualification: Any Graduate
Minimum Years of Experience: 5 Years
Key Skills: MS SQL Server
Type of Employment: Permanent
Requirement: Immediate or max 15 days
Location: Mumbai

Responsibilities
Work with the development team to develop, implement, and manage database models for core product development.
Write SQL database views, tables, and stored procedures to support engineering product development.
Design and maintain SSIS, T-SQL, and SQL jobs.
Develop and maintain complex stored procedures for loading data into staging tables from OLTP and other intermediary systems.
Handle analysis, design specifications, development, implementation, and maintenance of the database.
Design partitioning of the database for archive data.
Ensure that the best practices and standards established for tools such as SQL Server, SSIS, SSRS, and Excel Power Pivot/View/Map are incorporated in Data Analytics solution design.
Document complex processes, business requirements, and specifications.

Requirements
Technical Skills:
Experience in database design, normalization, query design, and performance tuning.
Proficient in writing complex Transact-SQL code.
Proficient in MS SQL Server query tuning.
Experience in writing stored procedures, functions, views, and triggers.
Experience with indexes, columnstore indexes, SQL Server column storage, and query execution plans.
Provide authentication and authorization for the database.
Develop best practices for database design and development activities.
Experience in database migration activities.
Strong analytical, multi-tasking, and problem-solving skills.
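The staging-table load pattern named in the responsibilities above (loading OLTP extracts into staging, then merging into a target table) can be sketched roughly as follows. The role itself calls for T-SQL/SSIS on MS SQL Server; SQLite is used here only so the sketch is self-contained, and all table and column names are invented:

```python
import sqlite3

# Minimal sketch of a staging-table load. On SQL Server this would be a
# T-SQL stored procedure (bulk insert + MERGE); SQLite stands in here only
# to keep the example runnable. Schema and data are invented.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_orders (order_id INTEGER, amount REAL);
    CREATE TABLE dim_orders (order_id INTEGER PRIMARY KEY, amount REAL);
""")

# 1. Bulk-load the staging table from the OLTP extract.
extract = [(1, 120.0), (2, 75.5), (1, 130.0)]   # note the duplicate key
conn.executemany("INSERT INTO stg_orders VALUES (?, ?)", extract)

# 2. Merge staging into the target, keeping the latest row per key
#    (an UPSERT, roughly what a T-SQL MERGE statement would do).
conn.execute("""
    INSERT INTO dim_orders (order_id, amount)
    SELECT order_id, amount FROM stg_orders WHERE rowid IN
        (SELECT MAX(rowid) FROM stg_orders GROUP BY order_id)
    ON CONFLICT(order_id) DO UPDATE SET amount = excluded.amount
""")
conn.commit()
print(conn.execute("SELECT order_id, amount FROM dim_orders ORDER BY order_id").fetchall())
# → [(1, 130.0), (2, 75.5)]
```

Running the merge inside one transaction, as above, keeps the target consistent if the load fails partway through.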

Posted 2 weeks ago

Apply

6.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Job Title: Talent Management Specialist (2–6 Years Experience) – Immediate Joiner
📍 Location: Kolkata
🕒 Experience: 2–6 Years
📅 Join: Immediate

About the Role:
We are looking for a passionate and detail-oriented Talent Management Specialist to join our HR team in Kolkata. The ideal candidate will have hands-on experience in driving performance management systems (PMS), succession planning, talent reviews, competency mapping, and engagement initiatives. This is a strategic and execution-focused role that contributes to building a high-performance culture across the organization.

Key Responsibilities:

🌟 Performance Management System (PMS):
Drive the end-to-end PMS cycle, including goal setting, mid-year and annual reviews, normalization, and performance calibration sessions.
Partner with line managers and department heads to define KRAs/KPIs aligned with organizational objectives.
Ensure timely completion of appraisal cycles, and provide support and training to employees and managers on the PMS process.
Prepare dashboards, analyses, and reports on performance trends, ratings distribution, and talent movements.

🧩 Talent Review & Succession Planning:
Identify critical roles and build succession pipelines for key positions.
Coordinate and support annual Talent Review processes in collaboration with business leaders and HRBPs.
Maintain and update talent matrices and 9-box grids for leadership planning.

🧠 Competency Mapping & Career Pathing:
Contribute to building and enhancing competency frameworks across levels and functions.
Work with department heads to define and document career paths and progression plans.
Conduct competency-based assessments and gap analyses.

🏅 High-Potential (HiPo) & Leadership Development:
Identify and implement frameworks for HiPo identification and development.
Plan and execute learning and development interventions for emerging leaders.
Collaborate with the L&D team to design Individual Development Plans (IDPs).
🤝 Employee Engagement & Retention:
Partner with HR and business to track engagement levels, analyze attrition trends, and recommend retention strategies.
Run pulse surveys, stay interviews, and exit interview analytics.
Design employee recognition frameworks and cultural reinforcement initiatives.

📊 HR Analytics & Reporting:
Create dashboards and reports related to performance, talent movement, and retention.
Present insights and recommendations to senior management.

Desired Candidate Profile:
Experience: 2–6 years in Talent Management, PMS, or Organizational Development roles.
Skills:
Strong understanding of PMS tools and frameworks.
Good facilitation and stakeholder management skills.
Analytical mindset with data interpretation and reporting ability.
Exposure to engagement, career development, and succession planning initiatives.
Familiarity with HRMS and PMS tools (SAP SuccessFactors, Darwinbox, Zoho People, etc.) is a plus.
Education: Graduate/Postgraduate in HR, Business Administration, Psychology, or related fields.
Location: Kolkata (mandatory)
Joining: Immediate or within 15 days preferred
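The PMS normalization mentioned in the responsibilities above is often (though not universally) implemented as a forced-distribution curve over raw appraisal scores: employees are ranked and then mapped onto target rating shares. A minimal sketch, where the 20/70/10 bucket shares are illustrative assumptions rather than anything stated in the listing:

```python
# One common way PMS "normalization" is implemented: rank raw scores and
# map them onto a forced rating distribution. The 20/70/10 split below is
# an illustrative assumption, not any particular company's policy.

def normalize_ratings(scores: dict[str, float]) -> dict[str, str]:
    """Assign ratings so roughly the top 20% land in 'Outstanding', the
    next 70% in 'Meets Expectations', and the bottom 10% in 'Below'."""
    ranked = sorted(scores, key=scores.get, reverse=True)
    n = len(ranked)
    ratings = {}
    for i, emp in enumerate(ranked):
        pct = (i + 1) / n          # cumulative share of headcount covered
        if pct <= 0.20:
            ratings[emp] = "Outstanding"
        elif pct <= 0.90:
            ratings[emp] = "Meets Expectations"
        else:
            ratings[emp] = "Below Expectations"
    return ratings

raw = {"A": 4.6, "B": 4.1, "C": 3.8, "D": 3.7, "E": 3.2,
       "F": 3.0, "G": 2.9, "H": 2.7, "I": 2.5, "J": 2.1}
print(normalize_ratings(raw))
```

Calibration sessions then review the borderline cases this mechanical curve produces, rather than applying it blindly.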

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job Summary:
We are seeking a highly skilled and motivated SQL Developer with strong expertise in either PostgreSQL or Oracle databases, coupled with proficiency in Java and/or Python. The ideal candidate will be responsible for designing, developing, and maintaining robust and scalable database solutions, as well as integrating these solutions with applications built using Java and/or Python. You will play a crucial role in optimizing database performance, ensuring data integrity, and collaborating with cross-functional teams to deliver high-quality software.

Responsibilities:
Design, develop, and implement complex SQL queries, stored procedures, functions, and triggers for PostgreSQL or Oracle databases.
Optimize database performance through indexing, query tuning, and database schema improvements.
Work closely with application developers to integrate database solutions with Java and/or Python applications.
Develop and maintain ETL (Extract, Transform, Load) processes for data migration and integration.
Collaborate with business analysts to understand data requirements and translate them into technical specifications.
Ensure data integrity, security, and availability.
Perform database performance monitoring, troubleshooting, and tuning.
Participate in database design reviews and provide recommendations for best practices.
Develop and maintain documentation for database designs, processes, and procedures.
Support existing database systems and applications, including on-call rotation as needed.
Stay up to date with the latest database technologies and best practices in PostgreSQL, Oracle, Java, and Python.

Required Skills and Qualifications:
Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent practical experience).
3+ years of experience as an SQL Developer.
Strong proficiency in either PostgreSQL or Oracle databases, including:
Expertise in writing complex SQL queries, stored procedures, functions, and triggers.
Experience with database design, normalization, and optimization techniques.
Understanding of database architecture and administration concepts.
Proficiency in at least one of the following programming languages:
Java: experience with JDBC, ORM frameworks (e.g., Hibernate, JPA), and Spring Boot is a plus.
Python: experience with database connectors (e.g., psycopg2, cx_Oracle), ORM frameworks (e.g., SQLAlchemy), and data manipulation libraries (e.g., Pandas) is a plus.
Experience with version control systems (e.g., Git).
Strong analytical and problem-solving skills.
Excellent communication and interpersonal skills, with the ability to collaborate effectively with technical and non-technical stakeholders.
Ability to work independently and as part of a team in a fast-paced environment.

Preferred Skills (Bonus Points):
Experience with both PostgreSQL and Oracle databases.
Familiarity with cloud platforms (AWS, Azure, GCP) and their database services.
Experience with CI/CD pipelines and DevOps practices.
Knowledge of data warehousing concepts and tools.
Experience with data visualization tools (e.g., Tableau, Power BI).
Familiarity with NoSQL databases (e.g., MongoDB, Cassandra).

About Finacle
Finacle is an industry leader in digital banking solutions. We partner with emerging and established financial institutions to inspire better banking. Our cloud-native solution suite and SaaS services help banks engage, innovate, operate, and transform better. We are a business unit of EdgeVerve Systems, a wholly owned product subsidiary of Infosys, a global technology leader with over USD 15 billion in annual revenues. We are differentiated by our functionally rich solution suite, composable architecture, culture, and the entrepreneurial spirit of a start-up. We are also known for an impeccable track record of helping financial institutions of all sizes drive digital transformation at speed and scale.
Today, financial institutions in more than 100 countries rely on Finacle to help more than a billion people and millions of businesses save, pay, borrow, and invest better. Finacle website: https://www.edgeverve.com/finacle/solutions/

Disclaimer: EdgeVerve Systems does not engage with external manpower agencies or charge any fees from candidates for recruitment. If you encounter such scams, please report them immediately.
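The indexing and query-tuning skills this listing calls for are usually demonstrated with execution-plan comparisons. The role targets PostgreSQL/Oracle (where the tools are EXPLAIN / EXPLAIN PLAN), but as a self-contained sketch the same idea can be shown with SQLite's EXPLAIN QUERY PLAN; the schema below is invented:

```python
import sqlite3

# Sketch of index-based query tuning: compare the planner's strategy for
# the same query before and after adding an index on the filtered column.
# SQLite stands in for PostgreSQL/Oracle here; the schema is invented.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, branch TEXT, balance REAL)")
conn.executemany("INSERT INTO accounts (branch, balance) VALUES (?, ?)",
                 [("MUM", 100.0), ("BLR", 250.0), ("MUM", 80.0)])

query = "SELECT * FROM accounts WHERE branch = ?"

# Without an index the planner falls back to a full table scan...
before = conn.execute(f"EXPLAIN QUERY PLAN {query}", ("MUM",)).fetchone()[3]
print("before:", before)   # e.g. "SCAN accounts"

# ...after an index on the filtered column it can seek instead of scanning.
conn.execute("CREATE INDEX idx_accounts_branch ON accounts (branch)")
after = conn.execute(f"EXPLAIN QUERY PLAN {query}", ("MUM",)).fetchone()[3]
print("after:", after)     # e.g. "SEARCH accounts USING INDEX idx_accounts_branch (branch=?)"
```

On PostgreSQL the equivalent check is `EXPLAIN ANALYZE`, where the plan node changes from a Seq Scan to an Index Scan once the index is both present and selective enough for the planner to prefer it.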

Posted 2 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies