
1190 Normalization Jobs - Page 7

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

0 years

0 Lacs

Telangana

On-site

Responsibilities:
- Design, develop, and maintain SQL databases and database objects such as tables, views, indexes, stored procedures, and functions.
- Write complex SQL queries to extract, manipulate, and analyze data.
- Optimize database performance by analyzing query execution plans and making necessary adjustments.
- Ensure data integrity and security by implementing appropriate measures and protocols.
- Collaborate with software developers, analysts, and other stakeholders to understand data requirements and provide solutions.
- Perform data migrations and transformations as needed.
- Monitor database performance and troubleshoot issues as they arise.
- Create and maintain documentation related to database design, configuration, and processes.
- Participate in code reviews and provide feedback to team members.
- Stay updated with the latest developments in SQL and database technologies.

Qualifications:
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- Proven experience as a SQL Developer, Database Administrator, or similar role.
- Strong proficiency in SQL and experience with database management systems such as MySQL, SQL Server, Oracle, or PostgreSQL.
- Familiarity with data warehousing concepts and tools.
- Experience with ETL (Extract, Transform, Load) processes and tools.
- Knowledge of programming languages such as Python, Java, or C# is a plus.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.

Skills:
- Advanced SQL querying and database design.
- Performance tuning and optimization.
- Data modeling and normalization.
- Understanding of database security and backup/recovery processes.
- Ability to work independently and as part of a team.
- Analytical mindset with the ability to interpret complex data sets.
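To make the execution-plan tuning above concrete, here is a minimal sketch using Python's bundled sqlite3 (chosen only so the example is self-contained; MySQL, SQL Server, Oracle, and PostgreSQL expose the same idea through their own EXPLAIN tools). The table and data are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INT, total REAL)")
conn.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                 [(i % 100, i * 1.5) for i in range(10_000)])

query = "SELECT SUM(total) FROM orders WHERE customer_id = 42"

# Before indexing: the plan shows a full table scan.
print(conn.execute("EXPLAIN QUERY PLAN " + query).fetchall())

# After indexing: the planner switches to an index search on customer_id.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
print(conn.execute("EXPLAIN QUERY PLAN " + query).fetchall())
```

Reading the plan before and after a change is the core loop of the "analyze query execution plans and make necessary adjustments" responsibility.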

Posted 1 week ago

Apply

8.0 years

0 Lacs

Gurgaon

On-site

AHEAD builds platforms for digital business. By weaving together advances in cloud infrastructure, automation and analytics, and software delivery, we help enterprises deliver on the promise of digital transformation.

At AHEAD, we prioritize creating a culture of belonging, where all perspectives and voices are represented, valued, respected, and heard. We create spaces to empower everyone to speak up, make change, and drive the culture at AHEAD. We are an equal opportunity employer, and do not discriminate based on an individual's race, national origin, color, gender, gender identity, gender expression, sexual orientation, religion, age, disability, marital status, or any other protected characteristic under applicable law, whether actual or perceived. We embrace all candidates that will contribute to the diversification and enrichment of ideas and perspectives at AHEAD.

The Senior Technical Consultant is a skilled cybersecurity professional with strong expertise in at least one core XSIAM technology domain and a good working knowledge of others. They will be responsible for the technical execution of XSIAM deployments, handling complex configurations, and mentoring junior team members.

Key Responsibilities
- Take a hands-on role in the end-to-end delivery of Palo Alto Networks XSIAM solutions, including deployment, configuration, and customization to meet specific client requirements.
- Develop and implement custom XSIAM content, such as tailored correlation rules, data models for unique log sources, and automation playbooks that streamline client SOC workflows.
- Integrate a variety of data sources into XSIAM, ensuring comprehensive visibility across endpoint, network, cloud, and identity layers.
- Configure and fine-tune XSIAM functionalities, including TIM for threat intelligence enrichment and ASM for external visibility.
- Collaborate with clients to optimize their XSIAM deployment, provide guidance on alert tuning, and assist in operationalizing the platform.
- Act as a technical resource for troubleshooting and resolving complex XSIAM-related issues during and post-implementation.
- Contribute to project documentation, ensuring clarity and completeness of Solution Designs and As-Built configurations.

Skills Required
- 8 years of dedicated experience in cybersecurity, with a strong practical background in SIEM, SOAR, EDR/XDR, or SOC operations.
- 5 years of demonstrated threat intelligence and incident response experience.
- A minimum of 2 years of direct experience implementing and configuring Palo Alto Networks XSIAM or similar advanced SecOps platforms.
- Demonstrated expertise in at least one of the following: SIEM administration, including log collection, parsing, and normalization (XDM); SOAR development, including creating playbooks and leveraging scripting (Python preferred); EDR/XDR deployment and management, particularly with Cortex and CrowdStrike.
- Proficiency with XQL for data analysis and rule creation.
- Solid understanding of network security concepts, cloud environments (AWS, Azure, GCP), and identity management.
- Strong analytical and troubleshooting capabilities.
- Effective communication skills, with the ability to engage with clients and team members.
- Palo Alto Networks certifications (e.g., PCNSE) or other relevant industry certifications are a plus.

Why AHEAD
Through our daily work and internal groups like Moving Women AHEAD and RISE AHEAD, we value and benefit from diversity of people, ideas, experience, and everything in between. We fuel growth by stocking our office with top-notch technologies in a multi-million-dollar lab, encouraging cross-department training and development, and sponsoring certifications and credentials for continued learning.

USA Employment Benefits include:
- Medical, Dental, and Vision Insurance
- 401(k)
- Paid company holidays
- Paid time off
- Paid parental and caregiver leave
- Plus more! See https://www.aheadbenefits.com/ for additional details.

The compensation range indicated in this posting reflects the On-Target Earnings (“OTE”) for this role, which includes a base salary and any applicable target bonus amount. This OTE range may vary based on the candidate’s relevant experience, qualifications, and geographic location.
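The log-source onboarding work above centers on normalizing vendor events into a common data model. As a toy illustration only (the field names below are invented, not the real XDM schema or any XSIAM API), a parsing and normalization step in Python might look like:

```python
from datetime import datetime, timezone

# Hypothetical raw firewall event; keys are vendor-specific and invented.
raw_event = {
    "ts": "2025-07-21T10:15:00Z",
    "src": "10.0.0.5",
    "dst": "203.0.113.9",
    "act": "deny",
}

def normalize(event: dict) -> dict:
    """Map vendor-specific keys onto a common, XDM-style schema."""
    return {
        "event_timestamp": datetime.fromisoformat(event["ts"].replace("Z", "+00:00")),
        "source_ip": event["src"],
        "dest_ip": event["dst"],
        "action": event["act"].upper(),
        "ingested_at": datetime.now(timezone.utc),
    }

print(normalize(raw_event))
```

Once events from every source land in one schema, correlation rules and XQL queries can be written once instead of per vendor.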

Posted 1 week ago

Apply

0 years

1 - 6 Lacs

Noida

On-site

As a Python Developer, you'll contribute to the core analytics engine that powers portfolio scoring and optimization. If you're strong in Python and love working with numbers, dataframes, and algorithms, this is the role for you.

Key Responsibilities
- Build and maintain internal Python libraries for scoring, data ingestion, normalization, and transformation.
- Collaborate on the core computation engine, dealing with Return, Safety, and Income scoring algorithms.
- Process broker portfolio data and validate datasets using clean, modular Python code.
- Contribute to writing tests, CI scripts, and engine documentation.
- Participate in internal design discussions and implement enhancements to existing libraries and engines.
- Work with financial datasets such as FactSet and Bloomberg to build quantitative models for the US market.

Must-Have Skills
- Proficiency in Python 3.x, with attention to clean, testable code.
- Hands-on experience with NumPy and pandas for numerical/data processing.
- Understanding of basic ETL concepts and working with structured datasets.
- Familiarity with Git, unit testing, logging, and code organization principles.
- Strong problem-solving ability and willingness to learn new technologies fast.

Good to Have (Bonus Points)
- Experience with SQL and writing efficient queries for analytics use cases.
- Exposure to ClickHouse or other columnar databases optimized for fast OLAP workloads.
- Familiarity with data validation tools like pydantic or type systems like mypy.
- Knowledge of Python packaging tools (setuptools, pyproject.toml).
- Experience with Apache Arrow, Polars, FastAPI, or SQLAlchemy.
- Exposure to async processing (Celery, asyncio), Docker, or Kubernetes.

What You’ll Gain
- Work on the core decision-making engine that directly impacts investor outcomes.
- Be part of a small, high-quality engineering team focused on clean, impactful systems.
- Mentorship and learning in data engineering, financial analytics, and production-quality backend systems.
- Growth path into specialized tracks: backend, data engineering, or system architecture.

About the Stack
While your main focus will be Python, you'll also interact with services that consume or publish to APIs, data stores like ClickHouse and PostgreSQL, and real-time processing queues.

Job Type: Full-time
Pay: ₹10,000.00 - ₹50,000.00 per month
Location Type: In-person
Work Location: In person
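A rough sketch of the score-normalization step such an engine needs (tickers, scores, and the equal weighting below are invented; the real scoring logic is proprietary):

```python
import pandas as pd

# Hypothetical holdings with raw component scores on different scales.
df = pd.DataFrame({
    "ticker": ["AAA", "BBB", "CCC", "DDD"],
    "return_score": [12.0, 7.5, 15.2, 3.1],
    "safety_score": [0.62, 0.88, 0.41, 0.95],
})

def min_max(col: pd.Series) -> pd.Series:
    """Rescale a score column to [0, 1] so components are comparable."""
    return (col - col.min()) / (col.max() - col.min())

for col in ["return_score", "safety_score"]:
    df[col + "_norm"] = min_max(df[col])

# Composite score as an equally weighted blend of normalized components.
df["composite"] = df[["return_score_norm", "safety_score_norm"]].mean(axis=1)
print(df)
```

Normalizing before blending keeps one component from dominating the composite purely because of its units.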

Posted 1 week ago

Apply

5.0 - 6.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

ETL Developer
- Overall experience: 5 to 6 years
- Relevant experience: 2 to 3 years
- Work from office mandatory, Monday to Friday
- Migration project: Oracle Cloud to on-premises MySQL (OS: Linux); no AWS or Azure required
- SQL-script and Python migration; standalone database connecting to multiple data sources
- Location: Sector 98, Noida

Python
- Strong knowledge of Python 3.x
- Experience with ETL libraries: pandas, SQLAlchemy, cx_Oracle, PyMySQL, pyodbc, or mysql-connector-python
- Exception handling, logging frameworks, scheduling via cron, Airflow, or custom scripts

Databases
- Strong SQL skills in Oracle (PL/SQL) and MySQL
- Understanding of data types, normalization/denormalization, indexing, and relational integrity
- Comfortable reading and analysing stored procedures, triggers, and constraints

Data Transformation
- Experience with ID mapping, data cleansing, type casting, and lookup-table joins
- Comfortable with large data files, incremental updates, and historical data loads

Documentation
- Maintain clear documentation of ETL logic, transformation rules, and exception cases
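A minimal sketch of the chunked extract-transform-load flow such a migration implies, using the pandas and SQLAlchemy stack listed above (connection strings, table, and column names are placeholders, not the actual project schema):

```python
import pandas as pd
from sqlalchemy import create_engine

# Placeholder connection strings; real hosts and credentials will differ.
oracle = create_engine("oracle+cx_oracle://user:pwd@source-host:1521/?service_name=ORCL")
mysql = create_engine("mysql+pymysql://user:pwd@target-host:3306/target_db")

# Stream the source table in chunks so large tables never sit fully in memory.
for chunk in pd.read_sql("SELECT * FROM customers", oracle, chunksize=10_000):
    # Example transformation: enforce a consistent ID type before loading.
    chunk["customer_id"] = chunk["customer_id"].astype("int64")
    chunk.to_sql("customers", mysql, if_exists="append", index=False)
```

Chunked reads also make incremental and historical loads restartable: a failed run can resume from the last committed chunk instead of starting over.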

Posted 1 week ago

Apply

1.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Job Summary: We are seeking a proactive and detail-oriented Data Scientist to join our team and contribute to the development of intelligent AI-driven production scheduling solutions. This role is ideal for candidates passionate about applying machine learning, optimization techniques, and operational data analysis to enhance decision-making and drive efficiency in manufacturing or process industries. You will play a key role in designing, developing, and deploying smart scheduling algorithms integrated with real-world constraints like machine availability, workforce planning, shift cycles, material flow, and due dates.

Experience: 1 year

Responsibilities:

1. AI-Based Scheduling Algorithm Development
- Develop and refine scheduling models using constraint programming, mixed-integer programming (MIP), metaheuristic algorithms (e.g., genetic algorithms, ant colony optimization, simulated annealing), and reinforcement learning or deep Q-learning.
- Translate shop-floor constraints (machines, manpower, sequence dependencies, changeovers) into mathematical models.
- Create simulation environments to test scheduling models under different scenarios.

2. Data Exploration & Feature Engineering
- Analyze structured and semi-structured production data from MES, SCADA, ERP, and other sources.
- Build pipelines for data preprocessing, normalization, and handling missing values.
- Perform feature engineering to capture important relationships like setup times, cycle duration, and bottlenecks.

3. Model Validation & Deployment
- Use statistical metrics and domain KPIs (e.g., throughput, utilization, makespan, WIP) to validate scheduling outcomes.
- Deploy solutions using APIs, dashboards (Streamlit, Dash), or via integration with existing production systems.
- Support ongoing maintenance, updates, and performance tuning of deployed models.

4. Collaboration & Stakeholder Engagement
- Work closely with production managers, planners, and domain experts to understand real-world constraints and validate model results.
- Document solution approaches and model assumptions, and provide technical training to stakeholders.

Qualifications:
- Bachelor’s or Master’s degree in Data Science, Computer Science, Industrial Engineering, Operations Research, Applied Mathematics, or equivalent.
- Minimum 1 year of experience in data science roles with exposure to AI/ML pipelines, predictive modelling, and optimization techniques or industrial scheduling.
- Proficiency in Python, especially with pandas, numpy, and scikit-learn; ortools, pulp, cvxpy, or other optimization libraries; and matplotlib or plotly for visualization.
- Solid understanding of production planning & control processes (dispatching rules, job-shop scheduling, etc.) and machine learning fundamentals (regression, classification, clustering).
- Familiarity with version control (Git), Jupyter/VSCode environments, and CI/CD principles.

Preferred (Nice-to-Have) Skills:
- Experience with time-series analysis, sensor data, or anomaly detection.
- Experience with manufacturing execution systems (MES), SCADA, PLC logs, or OPC UA data.
- Experience with simulation tools (SimPy, Arena, FlexSim) or digital twin technologies.
- Exposure to containerization (Docker) and model deployment (FastAPI, Flask).
- Understanding of lean manufacturing principles, Theory of Constraints, or Six Sigma.

Soft Skills:
- Strong problem-solving mindset with the ability to balance technical depth and business context.
- Excellent communication and storytelling skills to convey insights to both technical and non-technical stakeholders.
- Eagerness to learn new tools, technologies, and domain knowledge.
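As a minimal sketch of the constraint-programming approach named above, the following uses the OR-Tools CP-SAT solver (from the ortools library in the qualifications) to sequence three jobs with invented durations on one machine and minimize makespan:

```python
from ortools.sat.python import cp_model

# Three jobs on one machine with fixed durations (hypothetical data).
durations = [3, 2, 4]
horizon = sum(durations)

model = cp_model.CpModel()
intervals, ends = [], []
for i, d in enumerate(durations):
    start = model.NewIntVar(0, horizon, f"start_{i}")
    end = model.NewIntVar(0, horizon, f"end_{i}")
    intervals.append(model.NewIntervalVar(start, d, end, f"job_{i}"))
    ends.append(end)

model.AddNoOverlap(intervals)          # one machine: jobs cannot overlap
makespan = model.NewIntVar(0, horizon, "makespan")
model.AddMaxEquality(makespan, ends)   # makespan = latest job completion
model.Minimize(makespan)

solver = cp_model.CpSolver()
if solver.Solve(model) in (cp_model.OPTIMAL, cp_model.FEASIBLE):
    print("makespan:", solver.Value(makespan))
```

Real shop-floor models layer machine eligibility, sequence-dependent changeovers, shift calendars, and due-date objectives onto this skeleton.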

Posted 1 week ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job Title: Data Analyst

Purpose of the Position: This position involves performing feasibility and impact assessments, reviewing documentation to ensure conformity to methods, designs, and standards, and achieving economies of scale during the support phase. As a senior Business Analyst, you will also be responsible for stakeholder communication, conducting primary and secondary research based on solution and project needs, supporting new solution design and other organizational initiatives, and collaborating with all stakeholders in a multi-disciplinary team environment to build consensus on various data and analytics projects.

Work Location: Pune / Nagpur / Chennai / Bangalore
Type of Employment: Full-time

Key Result Areas and Activities:
- Data Analysis and Orchestration: Analyze data from various sources, create logical mappings using standard data dictionaries/business logic, and build SQL scripts to orchestrate data from source (Redshift) to target (Treasure Data - CDP).
- Stakeholder Engagement and Requirement Gathering: Engage with stakeholders to gather business requirements, particularly for data integration projects, ensuring clear communication and understanding.
- Communication and Team Collaboration: Demonstrate excellent communication skills and strong teamwork, contributing effectively to the team’s success.
- Stakeholder Management: Manage relationships with stakeholders across different levels, departments, and geographical locations, ensuring effective liaison and coordination.
- Data Modelling and ETL Processes: Utilize excellent data modelling skills (including RDBMS concepts, normalization, dimensional modelling, star/snowflake schema) and possess sound knowledge of ETL/data warehousing processes and data orchestration.

Must Have:
- Well versed/expert with data analysis using SQL.
- Experienced in building data orchestration SQL queries.
- Experience working with business process and data engineering teams to understand and build business logic, data mapping as per the business logic, and SQL orchestration scripts on top of logical data mapping.
- Ability to review systems and map business processes.
- Process and data modelling.

Good to Have:
- Experience with business and data analysis in the pharmaceutical/biotech industry.
- Excellent data modelling skills (RDBMS concepts, normalization, dimensional modelling, star/snowflake schema, etc.).
- Hands-on knowledge of data querying, data analysis, data mining, reporting, and analytics is a plus.

Qualifications:
- 3+ years of experience as a core Data Analyst or Business Analyst focused on data integration/orchestration.
- Bachelor’s degree in computer science, engineering, or a related field (Master’s degree is a plus).
- Demonstrated continued learning through one or more technical certifications or related methods.

Qualities:
- Self-motivated and focused on delivering outcomes for a fast-growing team and firm.
- Able to communicate persuasively through speaking, writing, and client presentations.
- Able to work with teams and clients in different time zones.
- Research-focused mindset.
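A toy sketch of "SQL orchestration scripts on top of logical data mapping": a mapping dictionary (column names invented) drives generation of the source-to-target SELECT, so the mapping stays the single source of truth:

```python
# Hypothetical logical mapping from source columns to target fields.
mapping = {
    "cust_id": "customer_id",
    "fst_nm": "first_name",
    "lst_nm": "last_name",
}

select_list = ",\n  ".join(f"{src} AS {tgt}" for src, tgt in mapping.items())
sql = f"SELECT\n  {select_list}\nFROM source_schema.customers;"
print(sql)
```

Keeping the mapping in data rather than hand-written SQL makes it reviewable with business stakeholders and reusable across source-to-target pipelines.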

Posted 1 week ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Project Role: AI/ML Engineer
Project Role Description: Develops applications and systems that utilize AI tools and Cloud AI services, with a proper cloud or on-prem application pipeline of production-ready quality. Must be able to apply GenAI models as part of the solution. May also include, but is not limited to, deep learning, neural networks, chatbots, and image processing.
Must-have skills: Large Language Models
Good-to-have skills: NA
Minimum experience required: 5 years
Educational qualification: 15 years of full-time education

Summary: As an AI/ML Engineer, you will develop applications and systems utilizing AI tools, Cloud AI services, and GenAI models. Your role involves implementing deep learning, neural networks, chatbots, and image processing in production-ready quality solutions.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Develop applications and systems using AI tools and Cloud AI services.
- Implement deep learning and neural networks in solutions.
- Create chatbots and work on image-processing tasks.
- Collaborate with team members to provide innovative solutions.
- Stay updated with the latest AI/ML trends and technologies.

Professional & Technical Skills:
- Must-have: proficiency in Large Language Models.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing various machine learning algorithms like linear regression, logistic regression, decision trees, and clustering algorithms.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Large Language Models.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
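As a small illustration of the normalization step in data munging (toy feature matrix; scikit-learn assumed available), two common rescalings look like this:

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler, StandardScaler

# Toy feature matrix: two features on very different scales.
X = np.array([[1.0, 200.0],
              [2.0, 150.0],
              [3.0, 400.0]])

# Z-score normalization: each feature gets mean 0 and unit variance.
print(StandardScaler().fit_transform(X))

# Min-max normalization: each feature rescaled to [0, 1].
print(MinMaxScaler().fit_transform(X))
```

Many algorithms in the list above (e.g., clustering, regularized regression) are sensitive to feature scale, which is why this step precedes model fitting.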

Posted 1 week ago

Apply

4.0 - 8.0 years

6 Lacs

Panchkula

On-site

Job Summary: We are seeking a highly skilled MS SQL Database Expert to join our team. The ideal candidate will have hands-on experience with T-SQL programming, performance tuning, and advanced database management on Microsoft SQL Server. You will play a key role in supporting development teams, optimizing database performance, and ensuring data security and integrity.

Key Responsibilities:
- Support software developers by optimizing complex SQL queries, tuning DML operations, and developing stored procedures.
- Design and develop high-quality database solutions using T-SQL.
- Create and implement robust procedures and functions to support business logic and application needs.
- Analyze, troubleshoot, and improve existing SQL queries and database structures for optimal performance.
- Collaborate closely with developers, architects, and stakeholders to align database development with project goals.
- Monitor and maintain database systems, resolving issues as they arise.
- Ensure data integrity and security by managing database roles, permissions, and access control.
- Develop and implement strategies for database backups, restorations, migrations, and replications.
- Manage database constraints, indexes, and normalization for performance and scalability.

Required Skills & Qualifications:
- 4–8 years of experience as an SQL Developer or in a similar database-focused role.
- Strong proficiency in Microsoft SQL Server and T-SQL programming.
- Proven experience in writing optimized queries, stored procedures, and functions.
- Experience with performance tuning and query optimization techniques.
- Solid understanding of database security, roles, and permission management.
- Experience with backup/recovery strategies and replication setups.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.

Nice to Have:
- Experience working with large datasets and high-availability environments.
- Familiarity with Azure SQL or cloud-based database systems.
- Exposure to Agile development methodologies.

Job Type: Full-time
Pay: From ₹600,000.00 per year
Schedule: Day shift, fixed shift, Monday to Friday
Ability to commute/relocate: Panchkula, Haryana: Reliably commute or planning to relocate before starting work (Required)
Experience: MS SQL: 3 years (Required)
Work Location: In person
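A minimal sketch of how application code might invoke one of the tuned stored procedures described above (the driver string, server, database, and procedure name are placeholders, not a real system):

```python
import pyodbc

# Placeholder connection details; a real deployment supplies its own.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=db-host;DATABASE=sales;Trusted_Connection=yes;"
)
cursor = conn.cursor()

# Parameterized call: avoids SQL injection and lets SQL Server reuse
# the procedure's cached execution plan across calls.
cursor.execute("EXEC dbo.usp_GetOrdersByCustomer @CustomerId = ?", 42)
for row in cursor.fetchall():
    print(row)
```

Routing data access through stored procedures like this is one way the role's security goals (roles, permissions, access control) are enforced at the database boundary.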

Posted 1 week ago

Apply

5.0 years

6 - 8 Lacs

Bengaluru

On-site

India - Bangalore | Professional Services | Full time / Hybrid

We are seeking a skilled Data Analytics and Reporting Specialist to join our global team. In this role, you will be responsible for creating and maintaining impactful reports and dashboards for our leadership and finance teams, and in support of greater data integrity efforts across the company. This is an exciting opportunity for a proactive and analytical professional who is passionate about transforming data into compelling visualizations that drive action. The ideal candidate will be comfortable working in a dynamic, data-driven environment and possess a natural curiosity for “digging” into the data. You will partner closely with business stakeholders to analyze current challenges and use cases, with the goal of enhancing the analytic experience. This position is based in our Guidewire Bangalore office, and reports to the PS Manager, Salesforce PSA & Analytics.

Responsibilities:
- Serve as the go-to expert for the PS organization on Salesforce Reports and Dashboards (including standard & CRMA) and ThoughtSpot for enterprise reporting needs.
- Continuously hone expertise on the PSA data model within Salesforce and underlying system processes.
- Ensure solutions are scalable, repeatable, effective, and meet the expectations of various stakeholders.
- Support top-tier executive analytic reporting, setting a high standard for best practices.
- Partner with IS technical teams and business partners to establish enterprise oversight of critical team-related data.
- Manage end-to-end evaluations and solutioning, including design, development, testing, deployment, and adoption.
- Map, clean, and organize data, ensuring accurate alignment of fields and values between CRM systems.
- Analyze data to identify gaps or inconsistencies and develop reports and dashboards to support management's information needs.
- Present findings, recommendations, and updates on key metrics to management and stakeholders through clear, impactful presentations.

Essential Skills and Experience:
- BA/BS degree or equivalent experience and a minimum of 5+ years of proven related experience in Analytics, Finance, Business Operations, or Management Consulting.
- Experience with Salesforce CRM Analytics (Salesforce Admin certification is a plus).
- Intermediate to advanced experience with SQL scripting and data visualization tools (e.g., Tableau, Power BI, ThoughtSpot) to turn data into insights.
- Ability to independently synthesize sophisticated data into simple, consumable deliverables with clear takeaways.
- Outstanding communication skills, both written and verbal, with both technical and non-technical colleagues, and strong stakeholder management ability.
- Detail-oriented with a desire to quickly learn new concepts, business models, and technologies.
- Ability to adapt to new demands and execute with urgency in a dynamic environment.
- Strong operational skills, including problem-solving, process analysis, and execution.
- Ability to design and interpret key performance metrics and deliver insights.
- Capability to automate processes and drive improvements.
- Ability to troubleshoot operational issues and propose system/process changes to fix root causes.

Preferred Skills and Experience:
- Core foundational FP&A experience: accounting, planning, forecasting, variance analysis.
- Experience in the Professional Services Consulting business of a SaaS vertical company is a plus.
- Knowledge of or background with the Certinia PSA (Professional Services Automation) business model preferred.
- Experience with Google Apps Script, ETL, data normalization, and advanced Sheets formulas (LET, LAMBDA, VSTACK/HSTACK, QUERY, IMPORTRANGE, FILTER, tables, named ranges and functions...).

About Guidewire
Guidewire is the platform P&C insurers trust to engage, innovate, and grow efficiently. We combine digital, core, analytics, and AI to deliver our platform as a cloud service. More than 540 insurers in 40 countries, from new ventures to the largest and most complex in the world, run on Guidewire. As a partner to our customers, we continually evolve to enable their success. We are proud of our unparalleled implementation track record, with 1600+ successful projects, supported by the largest R&D team and partner ecosystem in the industry. Our Marketplace provides hundreds of applications that accelerate integration, localization, and innovation. For more information, please visit www.guidewire.com and follow us on Twitter: @Guidewire_PandC.

Guidewire Software, Inc. is proud to be an equal opportunity and affirmative action employer. We are committed to an inclusive workplace, and believe that a diversity of perspectives, abilities, and cultures is a key to our success. Qualified applicants will receive consideration without regard to race, color, ancestry, religion, sex, national origin, citizenship, marital status, age, sexual orientation, gender identity, gender expression, veteran status, or disability. All offers are contingent upon passing a criminal history and other background checks where applicable to the position.

Posted 1 week ago

Apply

5.0 - 8.0 years

2 - 8 Lacs

Noida

On-site

Posted On: 21 Jul 2025
Location: Noida, UP, India
Company: Iris Software

Why Join Us?
Are you inspired to grow your career at one of India’s Top 25 Best Workplaces in the IT industry? Do you want to do the best work of your life at one of the fastest-growing IT services companies? Do you aspire to thrive in an award-winning work culture that values your talent and career aspirations? It’s happening right here at Iris Software.

About Iris Software
At Iris Software, our vision is to be our client’s most trusted technology partner, and the first choice for the industry’s top professionals to realize their full potential. With over 4,300 associates across India, U.S.A, and Canada, we help our enterprise clients thrive with technology-enabled transformation across financial services, healthcare, transportation & logistics, and professional services. Our work covers complex, mission-critical applications with the latest technologies, such as high-value complex Application & Product Engineering, Data & Analytics, Cloud, DevOps, Data & MLOps, Quality Engineering, and Business Automation.

Working at Iris
Be valued, be inspired, be your best. At Iris Software, we invest in and create a culture where colleagues feel valued, can explore their potential, and have opportunities to grow. Our employee value proposition (EVP) is about “Being Your Best” – as a professional and person. It is about being challenged by work that inspires us, being empowered to excel and grow in your career, and being part of a culture where talent is valued. We’re a place where everyone can discover and be their best version.

Job Description
Key Responsibilities:
- Data Testing Strategy & Execution: Design, develop, and execute comprehensive test plans and test cases for data-centric applications, ETL processes, data warehouses, data lakes, and reporting solutions.
- SQL-Driven Validation: Utilize advanced SQL queries to perform complex data validation, data reconciliation, data integrity checks, and data quality assurance across various financial data sources.
- ETL Testing: Conduct thorough testing of ETL (Extract, Transform, Load) processes, ensuring data is accurately extracted, transformed according to business rules, and loaded correctly into target systems.
- Data Quality Assurance: Implement and monitor data quality checks, identify data discrepancies, anomalies, and inconsistencies, and work with development and business teams to resolve issues.
- Performance Testing (Data Focus): Contribute to performance testing efforts for data pipelines and database operations, ensuring optimal query and data load performance.
- Test Data Management: Create and manage robust test data sets for various testing phases, including positive, negative, and edge-case scenarios.
- Defect Management: Identify, document, track, and re-test defects in data, collaborating closely with development and data engineering teams for timely resolution.
- Documentation & Reporting: Maintain clear and concise documentation of test plans, test cases, test results, and data quality reports. Provide regular status updates to stakeholders.
- Collaboration: Work effectively with business analysts, data architects, data engineers, and project managers to understand data flows and business requirements and to ensure data quality standards are met.
- Process Improvement: Proactively identify opportunities for process improvements in data testing methodologies and tools.
- Global Team Collaboration: Provide consistent overlap with EST working hours (until noon EST) to facilitate effective communication and collaboration with US-based teams.

Required Skills & Experience:
- Experience: 5-8 years of hands-on experience in Data Quality Assurance, Data Testing, or ETL Testing roles.
- SQL Expertise:
  - Advanced proficiency in SQL: ability to write complex queries, subqueries, analytical (window) functions, CTEs, and stored procedures for data validation, reconciliation, and analysis.
  - Experience with various SQL databases (e.g., SQL Server, Oracle, PostgreSQL, MySQL, Snowflake, BigQuery).
  - Strong understanding of database concepts: normalization, indexing, primary/foreign keys, and data types.
- Data Testing Methodologies: Solid understanding of data warehousing concepts, ETL processes, and various data testing strategies (e.g., source-to-target mapping validation, data transformation testing, data load testing, data completeness, data accuracy).
- Domain Expertise:
  - Strong understanding and proven experience in the Risk and Finance IT domain: familiarity with financial data (e.g., trading data, market data, risk metrics, accounting data, regulatory reporting).
  - Knowledge of financial products, regulations, and risk management concepts.
- Analytical & Problem-Solving Skills: Excellent ability to analyze complex data sets, identify root causes of data issues, and propose effective solutions.
- Communication: Strong verbal and written communication skills to articulate data issues and collaborate with diverse teams.

Mandatory Competencies:
- ETL - ETL - Tester
- Beh - Communication and collaboration
- Database - SQL
- QA/QE - QA Automation - ETL Testing
- Database - PostgreSQL - PostgreSQL

Perks and Benefits for Irisians
At Iris Software, we offer world-class benefits designed to support the financial, health and well-being needs of our associates to help achieve harmony between their professional and personal growth. From comprehensive health insurance and competitive salaries to flexible work arrangements and ongoing learning opportunities, we're committed to providing a supportive and rewarding work environment. Join us and experience the difference of working at a company that values its employees' success and happiness.
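A toy sketch of the source-to-target reconciliation this role describes, with invented trade data standing in for real extracts (in practice each frame would come from pd.read_sql against the source and target systems):

```python
import pandas as pd

# Hypothetical extracts; trade_id is the reconciliation key.
source = pd.DataFrame({"trade_id": [1, 2, 3], "notional": [100.0, 250.0, 75.0]})
target = pd.DataFrame({"trade_id": [1, 2, 3], "notional": [100.0, 250.0, 80.0]})

# Completeness checks: same row counts and no missing keys.
assert len(source) == len(target), "row count mismatch"
missing = set(source["trade_id"]) - set(target["trade_id"])
assert not missing, f"keys missing in target: {missing}"

# Accuracy check: field-level comparison on the join key.
merged = source.merge(target, on="trade_id", suffixes=("_src", "_tgt"))
mismatches = merged[merged["notional_src"] != merged["notional_tgt"]]
print(mismatches)  # flags trade_id 3, where the notional differs
```

The same pattern scales up as row-count, checksum, and field-level comparisons generated per source-to-target mapping document.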

Posted 1 week ago

Apply

7.0 years

0 Lacs

Pune, Maharashtra, India

On-site

We are seeking a technically skilled and analytically driven Solar Performance & Meteorological Data Specialist to support the development and optimization of our weather monitoring systems for solar power plants. This role combines in-depth solar domain expertise with applied meteorological knowledge and data analysis to enhance the performance monitoring of solar energy assets. As part of a cross-functional product and engineering team, the candidate will play a pivotal role in linking environmental measurements with solar plant performance analytics, enabling actionable insights that improve energy yield and asset efficiency.

Key Responsibilities:
- Serve as the domain expert for the design, deployment, and optimization of weather monitoring systems used in solar applications.
- Work closely with product managers and hardware engineers to define key meteorological parameters (e.g., irradiance, temperature, wind, humidity) critical for performance benchmarking.
- Analyze and interpret solar performance metrics (e.g., PR, CUF, energy yield) in conjunction with weather data to assess plant health and efficiency.
- Assist in the development of algorithms for performance normalization, sensor calibration, and fault detection using environmental data.
- Support internal teams and clients in understanding weather-influenced performance trends through detailed reports and dashboards.
- Contribute to data quality protocols and sensor accuracy standards as per industry guidelines (e.g., IEC 61724).
- Interact with solar developers, EPCs, and O&M teams to incorporate field insights into system design and analytics improvements.

Required Qualifications:
- Bachelor’s or Master’s degree in Renewable Energy, Electrical/Mechanical Engineering, Atmospheric Science, or a related field.
- 3–7 years of experience in the solar energy sector, with specific exposure to performance analysis and weather-related data interpretation.
- Hands-on experience with solar SCADA systems, weather stations, and performance monitoring platforms.
- Proficiency in tools for data analysis (e.g., Excel, Python, MATLAB) and visualization (e.g., Power BI, Tableau).
- Strong understanding of solar PV performance metrics and how environmental variables affect generation.
- Excellent problem-solving skills and the ability to translate complex data into actionable insights.
- Strong written and verbal communication skills for technical reporting and cross-functional collaboration.

Preferred Skills:
- Familiarity with industry standards like IEC 61724 for solar monitoring systems.
- Experience with IoT platforms, data acquisition systems, or real-time monitoring tools.
- Understanding of sensor calibration, instrumentation error analysis, or meteorological modeling.
- Exposure to machine learning or advanced statistical techniques for performance prediction (a plus).

Why Join Us?
- Work at the intersection of renewable energy and data-driven innovation.
- Contribute to building cutting-edge weather sensing systems that power the solar industry.
- Collaborate with a dynamic team of engineers, scientists, and energy professionals.
- Play a key role in improving energy efficiency and sustainability in solar plant operations.
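The PR and CUF metrics mentioned above reduce to simple ratios. A sketch with invented plant values (PR here follows the common IEC 61724-style definition of actual energy over reference yield):

```python
# Two standard solar KPIs; all example numbers are invented.
def performance_ratio(energy_kwh: float, capacity_kwp: float,
                      irradiation_kwh_m2: float, g_stc: float = 1.0) -> float:
    """PR = actual energy / reference yield (capacity x irradiation / G_STC)."""
    return energy_kwh / (capacity_kwp * irradiation_kwh_m2 / g_stc)

def cuf(energy_kwh: float, capacity_kwp: float, hours: float) -> float:
    """Capacity Utilisation Factor = energy / (capacity x elapsed hours)."""
    return energy_kwh / (capacity_kwp * hours)

# Example day: 5 MWp plant, 5.5 kWh/m2 plane-of-array irradiation, 22.4 MWh generated.
print(round(performance_ratio(22_400, 5_000, 5.5), 3))  # ~0.815
print(round(cuf(22_400, 5_000, 24), 3))                 # ~0.187
```

Because PR normalizes generation by measured irradiation, it isolates plant health from weather, which is exactly why accurate meteorological sensing underpins the performance analytics in this role.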

Posted 1 week ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Summary
We are seeking a skilled Data Analytics and Reporting Specialist to join our global team. In this role, you will be responsible for creating and maintaining impactful reports and dashboards for our leadership and finance teams, and in support of greater data integrity efforts across the company. This is an exciting opportunity for a proactive and analytical professional who is passionate about transforming data into compelling visualizations that drive action. The ideal candidate will be comfortable working in a dynamic, data-driven environment and possess a natural curiosity for “digging” into the data. You will partner closely with business stakeholders to analyze current challenges and use cases, with the goal of enhancing the analytic experience. This position is based in our Guidewire Bangalore office, and reports to the PS Manager, Salesforce PSA & Analytics.

Responsibilities:
- Serve as the go-to expert for the PS organization on Salesforce Reports and Dashboards (including standard & CRMA) and ThoughtSpot for enterprise reporting needs.
- Continuously hone expertise on the PSA data model within Salesforce and underlying system processes.
- Ensure solutions are scalable, repeatable, effective, and meet the expectations of various stakeholders.
- Support top-tier executive analytic reporting, setting a high standard for best practices.
- Partner with IS technical teams and business partners to establish enterprise oversight of critical team-related data.
- Manage end-to-end evaluations and solutioning, including design, development, testing, deployment, and adoption.
- Map, clean, and organize data, ensuring accurate alignment of fields and values between CRM systems.
- Analyze data to identify gaps or inconsistencies and develop reports and dashboards to support management's information needs.
- Present findings, recommendations, and updates on key metrics to management and stakeholders through clear, impactful presentations.

Essential Skills and Experience:
- BA/BS degree or equivalent experience and a minimum of 5+ years of proven related experience in Analytics, Finance, Business Operations, or Management Consulting.
- Experience with Salesforce CRM Analytics (Salesforce Admin certification is a plus).
- Intermediate to advanced experience with SQL scripting and data visualization tools (e.g., Tableau, Power BI, ThoughtSpot) to turn data into insights.
- Ability to independently synthesize sophisticated data into simple, consumable deliverables with clear takeaways.
- Outstanding communication skills, both written and verbal, with both technical and non-technical colleagues, and strong stakeholder management ability.
- Detail-oriented with a desire to quickly learn new concepts, business models, and technologies.
- Ability to adapt to new demands and execute with urgency in a dynamic environment.
- Strong operational skills, including problem-solving, process analysis, and execution.
- Ability to design and interpret key performance metrics and deliver insights.
- Capability to automate processes and drive improvements.
- Ability to troubleshoot operational issues and propose system/process changes to fix root causes.

Preferred Skills and Experience:
- Core foundational FP&A experience: accounting, planning, forecasting, variance analysis.
- Experience in the Professional Services Consulting business of a SaaS vertical company is a plus.
- Knowledge of or background with the Certinia PSA (Professional Services Automation) business model preferred.
- Experience with Google Apps Script, ETL, data normalization, and advanced Sheets formulas (LET, LAMBDA, VSTACK/HSTACK, QUERY, IMPORTRANGE, FILTER, tables, named ranges and functions...).

About Guidewire
Guidewire is the platform P&C insurers trust to engage, innovate, and grow efficiently. We combine digital, core, analytics, and AI to deliver our platform as a cloud service. More than 540 insurers in 40 countries, from new ventures to the largest and most complex in the world, run on Guidewire. As a partner to our customers, we continually evolve to enable their success. We are proud of our unparalleled implementation track record, with 1600+ successful projects, supported by the largest R&D team and partner ecosystem in the industry. Our Marketplace provides hundreds of applications that accelerate integration, localization, and innovation. For more information, please visit www.guidewire.com and follow us on Twitter: @Guidewire_PandC.

Guidewire Software, Inc. is proud to be an equal opportunity and affirmative action employer. We are committed to an inclusive workplace, and believe that a diversity of perspectives, abilities, and cultures is a key to our success. Qualified applicants will receive consideration without regard to race, color, ancestry, religion, sex, national origin, citizenship, marital status, age, sexual orientation, gender identity, gender expression, veteran status, or disability. All offers are contingent upon passing a criminal history and other background checks where applicable to the position.

Posted 1 week ago

Apply

0.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Basic qualifications:
- 1+ years of data engineering experience
- Experience with SQL
- Experience with data modeling, warehousing and building ETL pipelines
- Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala)
- Experience with one or more scripting languages (e.g., Python, KornShell)

Amazon IN Platform Development team is looking to hire a rock star Data/BI Engineer to build for pan Amazon India businesses. Amazon India is at the core of hustle @ Amazon WW today, and the team is charted with democratizing data access for the entire marketplace and adding productivity. That translates to owning the processing of every Amazon India transaction, for which the team is organized to have dedicated business owners and processes for each focus area. The BI Engineer will play a key role in contributing to the success of each focus area by partnering with respective business owners and leveraging data to identify areas of improvement and optimization. He/she will build deliverables like business process automation, payment behavior analysis, campaign analysis, fingertip metrics, and failure prediction that provide an edge to business decision-making and can scale with growth. The role sits in the sweet spot between the technology and business worlds and provides opportunity for growth, high business impact, and working with seasoned business leaders.

An ideal candidate will be someone with a sound technical background in the data domain (storage, processing, analytics), solid business acumen, and a strong automation/solution-oriented thought process. They will be a self-starter who can start with a business problem and work backwards to conceive and devise the best possible solution, a great communicator at ease partnering with business owners and other internal/external teams, able to explore newer technology options if need be, with a high sense of ownership over every deliverable by the team, and constantly obsessed with customer delight and business impact, who 'gets it done' in business time.

Key job responsibilities
- Design, implement and support a data infrastructure for the analytics needs of a large organization
- Interface with other technology teams to extract, transform, and load data from a wide variety of data sources using SQL and AWS big data technologies
- Be enthusiastic about building deep domain knowledge of Amazon’s business
- Possess strong verbal and written communication skills, be self-driven, and deliver high-quality results in a fast-paced environment
- Enjoy working closely with your peers in a group of very smart and talented engineers
- Help continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers
- Explore and learn the latest AWS technologies to provide new capabilities and increase efficiency

About the team
India Data Engineering and Analytics (IDEA) team is the central data engineering team for Amazon India. Our vision is to simplify and accelerate data-driven decision making for Amazon India by providing cost-effective, easy and timely access to high-quality data. We achieve this by building UDAI (Unified Data & Analytics Infrastructure for Amazon India), which serves as a central data platform and provides data engineering infrastructure, ready-to-use datasets and self-service reporting capabilities. Our core responsibilities towards the India marketplace include: a) providing systems (infrastructure) and workflows that allow ingestion, storage, processing and querying of data; b) building ready-to-use datasets for easy and faster access to the data; c) automating standard business analysis, reporting and dashboarding; d) empowering business with self-service tools for deep dives and insights seeking.

Preferred qualifications:
- Experience with big data technologies such as Hadoop, Hive, Spark, EMR
- Knowledge of AWS infrastructure
- Knowledge of the basics of designing and implementing a data schema, such as normalization and the relational model vs. the dimensional model

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
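A minimal sketch of the normalization-versus-dimensional-model contrast named in the preferred qualifications, using Python's bundled sqlite3 purely so the schemas are concrete (tables and columns are illustrative only):

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Normalized (relational/3NF): each fact lives once, referenced by key;
# optimized for transactional updates and integrity.
conn.execute("CREATE TABLE customer (customer_id INTEGER PRIMARY KEY, city TEXT)")
conn.execute("""CREATE TABLE orders (
    order_id INTEGER PRIMARY KEY,
    customer_id INTEGER REFERENCES customer(customer_id),
    amount REAL)""")

# Dimensional (star schema): a wide fact table keyed to denormalized
# dimensions; optimized for analytical scans and aggregation.
conn.execute("""CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY, city TEXT, segment TEXT)""")
conn.execute("""CREATE TABLE fact_sales (
    date_key INTEGER,
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    units INTEGER,
    revenue REAL)""")
```

Warehousing work like UDAI's typically lands normalized source data first, then reshapes it into dimensional models for the self-service reporting layer.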

Posted 1 week ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

About Us
SBI Card is a leading pure-play credit card issuer in India, offering a wide range of credit cards to cater to diverse customer needs. We are constantly innovating to meet the evolving financial needs of our customers, empowering them with digital currency for a seamless payment experience and rewarding benefits. At SBI Card, the motto 'Make Life Simple' inspires every initiative, ensuring that customer convenience is at the forefront of all that we do. We are committed to building an environment where people can thrive and create a better future for everyone. SBI Card is proud to be an equal opportunity and inclusive employer and welcomes employees without any discrimination on the grounds of race, color, gender, religion, creed, disability, sexual orientation, gender identity, marital status, caste, etc. SBI Card is committed to fostering an inclusive and diverse workplace where all employees are treated equally with dignity and respect, which makes it a promising place to work. Join us to shape the future of digital payment in India and unlock your full potential.

What’s In It For You
- SBI Card truly lives by the work-life balance philosophy. We offer a robust wellness and wellbeing program to support the mental and physical health of our employees.
- Admirable work deserves to be rewarded. We have a well-curated bouquet of rewards and recognition programs for employees.
- Dynamic, inclusive and diverse team culture.
- Gender-neutral policy.
- Inclusive health benefits for all: medical insurance, personal accident, group term life insurance, annual health checkup, and dental and OPD benefits.
- Commitment to the overall development of employees through a comprehensive learning & development framework.

Role Purpose
Responsible for the management of all collections processes for the allocated portfolio in the assigned CD/Area, basis targets set for resolution, normalization, rollback/absolute recovery and ROR.

Role Accountability
- Conduct timely allocation of portfolio to aligned vendors/NFTEs and conduct ongoing reviews to drive performance on the business targets through an extended team of field executives and callers.
- Formulate tactical short-term incentive plans for NFTEs to increase productivity and drive DRR.
- Ensure various critical segments as defined by business are reviewed and performance is driven on them.
- Ensure judicious use of hardship tools and adherence to the settlement waivers, both on rate and value.
- Conduct ongoing field visits on critical accounts and ensure proper documentation in the Collect24 system of all field visits and telephone calls to customers.
- Raise red flags in a timely manner basis deterioration in portfolio health indicators/frauds, and raise timely alarms on critical incidents as per the compliance guidelines.
- Ensure all guidelines mentioned in the SVCL are adhered to and that process hygiene is maintained at aligned agencies.
- Ensure 100% data security using secured data transfer modes and data purging as per policy.
- Ensure all customer complaints received are closed within the time frame.
- Conduct thorough due diligence while onboarding/offboarding/renewing a vendor, and ensure all necessary formalities are completed prior to allocating.
- Ensure agencies raise invoices in a timely manner.
- Monitor NFTE ACR CAPE as per the collection strategy.

Measures of Success
- Portfolio coverage
- Resolution rate
- Normalization/rollback rate
- Settlement waiver rate
- Absolute recovery rupee collected
- NFTE CAPE
- DRA certification of NFTEs
- Absolute customer complaints
- Absolute audit observations
- Process adherence as per MOU

Technical Skills / Experience / Certifications
Credit card knowledge along with a good understanding of collection processes.

Competencies Critical to the Role
Analytical ability, stakeholder management, problem solving, result orientation, and process orientation.

Qualification
Post-Graduate / Graduate in any discipline

Preferred Industry
FSI

Posted 1 week ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

About Us
SBI Card is a leading pure-play credit card issuer in India, offering a wide range of credit cards to cater to diverse customer needs. We are constantly innovating to meet the evolving financial needs of our customers, empowering them with digital currency for a seamless payment experience and rewarding benefits. At SBI Card, the motto 'Make Life Simple' inspires every initiative, ensuring that customer convenience is at the forefront of all that we do. We are committed to building an environment where people can thrive and create a better future for everyone. SBI Card is proud to be an equal opportunity and inclusive employer and welcomes employees without any discrimination on the grounds of race, color, gender, religion, creed, disability, sexual orientation, gender identity, marital status, caste, etc. SBI Card is committed to fostering an inclusive and diverse workplace where all employees are treated equally with dignity and respect, which makes it a promising place to work. Join us to shape the future of digital payment in India and unlock your full potential.

What’s In It For You
- SBI Card truly lives by the work-life balance philosophy. We offer a robust wellness and wellbeing program to support the mental and physical health of our employees.
- Admirable work deserves to be rewarded. We have a well-curated bouquet of rewards and recognition programs for employees.
- Dynamic, inclusive and diverse team culture.
- Gender-neutral policy.
- Inclusive health benefits for all: medical insurance, personal accident, group term life insurance, annual health checkup, and dental and OPD benefits.
- Commitment to the overall development of employees through a comprehensive learning & development framework.

Role Purpose
Responsible for the management of all collections processes for the allocated portfolio in the assigned CD/Area, basis targets set for resolution, normalization, rollback/absolute recovery and ROR.

Role Accountability
- Conduct timely allocation of portfolio to aligned vendors/NFTEs and conduct ongoing reviews to drive performance on the business targets through an extended team of field executives and callers.
- Formulate tactical short-term incentive plans for NFTEs to increase productivity and drive DRR.
- Ensure various critical segments as defined by business are reviewed and performance is driven on them.
- Ensure judicious use of hardship tools and adherence to the settlement waivers, both on rate and value.
- Conduct ongoing field visits on critical accounts and ensure proper documentation in the Collect24 system of all field visits and telephone calls to customers.
- Raise red flags in a timely manner basis deterioration in portfolio health indicators/frauds, and raise timely alarms on critical incidents as per the compliance guidelines.
- Ensure all guidelines mentioned in the SVCL are adhered to and that process hygiene is maintained at aligned agencies.
- Ensure 100% data security using secured data transfer modes and data purging as per policy.
- Ensure all customer complaints received are closed within the time frame.
- Conduct thorough due diligence while onboarding/offboarding/renewing a vendor, and ensure all necessary formalities are completed prior to allocating.
- Ensure agencies raise invoices in a timely manner.
- Monitor NFTE ACR CAPE as per the collection strategy.

Measures of Success
- Portfolio coverage
- Resolution rate
- Normalization/rollback rate
- Settlement waiver rate
- Absolute recovery rupee collected
- NFTE CAPE
- DRA certification of NFTEs
- Absolute customer complaints
- Absolute audit observations
- Process adherence as per MOU

Technical Skills / Experience / Certifications
Credit card knowledge along with a good understanding of collection processes.

Competencies Critical to the Role
Analytical ability, stakeholder management, problem solving, result orientation, and process orientation.

Qualification
Post-Graduate / Graduate in any discipline

Preferred Industry
FSI

Posted 1 week ago

Apply

1.0 - 5.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

Join Barclays as an Analyst in the Cost Utility role, where you will be responsible for supporting the execution of end-to-end monthly financial close processes. This includes performing aged-accrual analysis, vendor cost analysis, production of financial reports and flash reports, providing support with commentaries, executing APE amendments and normalization at AE levels, and supporting the Financial Controller and Financial Business Partner in addressing queries from auditors. At Barclays, we are not only anticipating the future but also actively creating it.

To excel in this role, you should possess the following skills:
- Qualified CA / CMA / CPA / ACCA / CFA / MBA Finance from a premier institute with a minimum of one year of relevant experience.
- CA Inter / Commerce Graduate with a few years of relevant experience.
- Take ownership of embedding new policies and procedures implemented for risk mitigation.
- Provide advice and influence decision-making within your area of expertise.

Some additional valued skills may include:
- Proficiency in SAP and understanding of ledger hierarchy.
- Comprehensive understanding of Finance Business Partnering.
- Intermediate to advanced proficiency in Excel and PowerPoint.
- Familiarity with automation tools like Alteryx.

You will be evaluated based on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, digital and technology, as well as job-specific technical skills. This position is based in our Noida office.

Purpose of the role: The purpose of this role is to provide financial expertise and support to specific business units or departments within the organization. You will act as a liaison between the finance function and various business units, helping bridge the gap between financial data and business decisions.

Accountabilities:
- Develop and implement business unit financial strategies, plans, and budgets, using insights to evaluate the financial implications of strategic initiatives and recommend appropriate actions.
- Create financial models to forecast future performance, assess investment opportunities, evaluate financial risks, and provide recommendations.
- Collaborate cross-functionally to provide financial insights and guidance to business unit stakeholders.
- Identify opportunities and implement financial process improvements to streamline financial operations.
- Support business units in identifying, assessing, and mitigating financial risks, including providing training and guidance on financial risk management and compliance practices.
- Analyze and present financial data to provide insights into business performance, identify trends, and support decision-making.

Analyst Expectations:
- Perform prescribed activities in a timely manner and to a high standard, consistently driving continuous improvement.
- Demonstrate in-depth technical knowledge and experience in your assigned area of expertise.
- Lead and supervise a team, guiding and supporting professional development, allocating work requirements, and coordinating team resources.
- Take responsibility for embedding new policies and procedures adopted for risk mitigation.
- Advise and influence decision-making within your area of expertise.
- Take ownership for managing risk and strengthening controls in relation to the work you own or contribute to.
- Maintain an understanding of how your sub-function integrates with the function and the organization's products, services, and processes.
- Resolve problems by applying acquired technical experience and precedents.
- Act as a contact point for stakeholders outside of the immediate function and build a network of contacts external to the organization.

All colleagues are expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, as well as the Barclays Mindset to Empower, Challenge, and Drive.

Posted 1 week ago

Apply

4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions. The Opportunity When you join PwC Acceleration Centers (ACs), you step into a pivotal role focused on actively supporting various Acceleration Center services, from Advisory to Assurance, Tax and Business Services. In our innovative hubs, you’ll engage in challenging projects and provide distinctive services to support client engagements through enhanced quality and innovation. You’ll also participate in dynamic and digitally enabled training that is designed to grow your technical and professional skills. As part of the Data and Analytics Engineering team you design, develop, and maintain responsive, scalable web applications. As a Senior Associate you analyze complex problems, mentor junior developers, and maintain professional standards while building meaningful client connections. This role offers the chance to collaborate with cross-functional teams, enhance your technical skills, and contribute to innovative solutions that drive success. Responsibilities Design and implement end-to-end solutions for various applications Collaborate with teams to secure project alignment Mentor junior engineers to enhance their skills Analyze user requirements to inform development processes Maintain exceptional standards of code quality and performance Navigate the complexities of software architecture Contribute to innovative solutions that drive business impact Uphold professional standards in every development activity What You Must Have Any Bachelor's Degree 4-9 years of experience Oral and written proficiency in English required What Sets You Apart 3-9 years of experience in web application development Familiarity with CI/CD pipelines and basic DevOps practices Working knowledge of Microsoft Azure is a plus Experience with charting libraries such as Google React Charts Exposure to Generative AI APIs or AI-integrated UI features Proven problem-solving ability and technical ownership Ability to mentor team members and provide technical leadership Strength in communication and collaboration skills Please reference skill categories for Senior Developer Key Responsibilities Design, develop, and maintain responsive, scalable web applications. Build intuitive and dynamic frontend components using ReactJS, JavaScript, HTML5, and CSS3. Develop and maintain RESTful APIs using .NET Core. Write optimized SQL queries, stored procedures, and perform backend data operations using SQL Server. Collaborate with cross-functional teams including product owners, designers, and QA engineers to deliver high-quality solutions. Ensure solutions are architecturally sound and aligned with best practices in performance, security, and maintainability. Lead and mentor junior developers and contribute to code reviews and technical discussions. Participate in Agile ceremonies and contribute to sprint planning and estimation. 
Total experience needed - 3-9 years Required Skills Frontend Development: Strong expertise in ReactJS, JavaScript (ES6+), HTML5, CSS3, and modern UI/UX principles. Familiarity with component libraries and frontend tooling (Webpack, Babel, etc.). Experience with State Management (Redux, Context API). Backend Development: Proficient in developing RESTful services and business logic using .NET Core (C#). Strong understanding of API design, dependency injection, and middleware pipelines. Database: Solid experience with SQL Server, including complex queries, indexing, and performance optimization. Knowledge of relational database design and normalization principles. DevOps & Tools: Familiar with Git for version control and branching strategies. Experience with CI/CD pipelines and basic DevOps practices (Azure DevOps preferred). Cloud & Deployment: Working knowledge of Microsoft Azure (App Services, Azure SQL, Key Vault, etc.) is a plus. Soft Skills: Strong communication and collaboration skills. Proven problem-solving ability and technical ownership. Ability to mentor team members and provide technical leadership. Preferred: Experience with charting libraries such as Google React Charts, Chart.js, or D3.js. Familiarity with metadata-driven UI components or low-code frameworks. Exposure to Generative AI APIs or AI-integrated UI features (a plus but not required).
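
To make the "normalization principles" requirement above concrete, here is a minimal sketch of splitting a denormalized orders feed into normalized tables. It uses Python's built-in sqlite3 as a stand-in for SQL Server, and the customers/orders schema and sample rows are hypothetical, chosen purely for illustration.

```python
import sqlite3

# Hypothetical denormalized rows: customer details repeat on every order.
flat_orders = [
    (1, "Asha Rao", "asha@example.com", "ORD-100", 2500.0),
    (2, "Asha Rao", "asha@example.com", "ORD-101", 990.0),
    (3, "Vikram Shah", "vikram@example.com", "ORD-102", 4310.0),
]

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Normalized design: customer attributes live in one table, and orders
# reference them by foreign key instead of repeating them per row.
cur.executescript("""
    CREATE TABLE customers (
        customer_id INTEGER PRIMARY KEY,
        name  TEXT NOT NULL,
        email TEXT NOT NULL UNIQUE
    );
    CREATE TABLE orders (
        order_id    TEXT PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customers(customer_id),
        amount      REAL NOT NULL
    );
""")

for _, name, email, order_id, amount in flat_orders:
    # INSERT OR IGNORE de-duplicates customers on the unique email.
    cur.execute("INSERT OR IGNORE INTO customers (name, email) VALUES (?, ?)", (name, email))
    cur.execute("SELECT customer_id FROM customers WHERE email = ?", (email,))
    (customer_id,) = cur.fetchone()
    cur.execute("INSERT INTO orders VALUES (?, ?, ?)", (order_id, customer_id, amount))

# A join reassembles the original view without the redundancy.
for row in cur.execute("""
    SELECT c.name, o.order_id, o.amount
    FROM orders o JOIN customers c USING (customer_id)
"""):
    print(row)
```

The design choice here is the usual normalization trade-off: updates to a customer's email now touch one row instead of many, at the cost of a join at read time.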

Posted 1 week ago

Apply

4.0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site

Job Summary We are looking for a skilled Database Engineer to design, build, and maintain reliable database systems that support our applications and data infrastructure. The ideal candidate will have strong technical expertise in database architecture, data modeling, and performance tuning, along with hands-on experience in both SQL and NoSQL systems. Location : Indore Job Type : Full-Time Experience : 4+ Years Key Responsibilities Design and implement scalable and high-performing database architectures Build and optimize complex queries, stored procedures, and indexing strategies Collaborate with backend engineers and data teams to model and structure databases that meet application requirements Perform data migrations, transformations, and integrations across environments Ensure data consistency, integrity, and availability across distributed systems Develop and maintain ETL pipelines and real-time data flows Monitor database performance and implement tuning improvements Automate repetitive database tasks and deploy schema changes Assist with database security practices and access control policies Support production databases and troubleshoot incidents or outages Required Skills And Qualifications Strong experience in relational databases like PostgreSQL, MySQL, MS SQL Server, or Oracle Proficiency in writing optimized SQL queries and performance tuning Experience with NoSQL databases like MongoDB, Cassandra, DynamoDB, or Redis Solid understanding of database design principles, normalization, and data warehousing Strong expertise in Oracle and GoldenGate. Experience with database platforms such as Vertica, Couchbase Capella, or CockroachDB. Hands-on experience with ETL pipelines, data transformation, and scripting (e.g., Python, Bash) Familiarity with version control systems (e.g., Git) and DevOps tools (e.g., Docker, Kubernetes, Jenkins) Knowledge of cloud database services (e.g., AWS RDS, Google Cloud SQL, Azure SQL Database) Experience with data backup, disaster recovery, and high availability setups (ref:hirist.tech)
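
As a quick illustration of the indexing and query-optimization work this role centers on, the sketch below uses Python's sqlite3 to show how the planner's strategy changes once an index exists on the filtered column; the events table and its contents are made up for the example.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, user_id INTEGER, payload TEXT)")
cur.executemany(
    "INSERT INTO events (user_id, payload) VALUES (?, ?)",
    ((i % 1000, f"event-{i}") for i in range(100_000)),
)

query = "SELECT COUNT(*) FROM events WHERE user_id = ?"

# Without an index, the planner falls back to a full table scan.
print(cur.execute(f"EXPLAIN QUERY PLAN {query}", (42,)).fetchall())

# An index on the filter column turns the scan into an index search.
cur.execute("CREATE INDEX idx_events_user_id ON events (user_id)")
print(cur.execute(f"EXPLAIN QUERY PLAN {query}", (42,)).fetchall())
```

The first plan reports a SCAN of events, the second a SEARCH using idx_events_user_id; the same before/after comparison applies to execution plans in the server databases named above.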

Posted 1 week ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

Remote

Role : Senior MS SQL Data Engineer Location : Remote Duration : Long Term Experience : 5+ yrs Interview Process : Test, 15-min HR screening, followed by a technical session with 1 or 2 clients Time Zone : 10 AM – 7:00 PM IST Dual employment : The client does not allow dual employment; any existing dual employment must be terminated as soon as possible. Note : There will be a BGV process for this requirement, including : Employment Check PCC – Police Clearance Certificate (Criminal Record Check) English fluency (all teams work internationally, and English is the standard language) Candidate should be an in-house bench resource Key Responsibilities Expert-level proficiency in MS SQL Server (2016 and above) Strong command over T-SQL, indexing, query optimization, and execution plans In-depth understanding of MS SQL Server architecture and internals Real-world experience with performance tuning and resolving complex database issues Experience in database design, normalization, and optimization Hands-on experience with monitoring and troubleshooting tools Experience in ETL development or integration with SSIS/SSRS is a plus Familiarity with cloud-based SQL services (Azure SQL, Amazon RDS) is an advantage (ref:hirist.tech)
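
One small example of the T-SQL and execution-plan discipline asked for here: parameterized queries let SQL Server cache and reuse a single execution plan instead of compiling a fresh ad-hoc plan per literal value. The sketch below uses the real pyodbc driver API, but the connection string, dbo.orders table, and values are placeholders, not a definitive setup.

```python
import pyodbc  # assumes the Microsoft ODBC Driver for SQL Server is installed

# Hypothetical connection string; substitute your own server and database.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};SERVER=myserver;DATABASE=mydb;"
    "UID=myuser;PWD=mypassword;Encrypt=yes"
)
cur = conn.cursor()

# Parameterized T-SQL: the server can reuse one cached plan for every
# customer_id value instead of compiling a new ad-hoc plan each time.
cur.execute(
    "SELECT TOP (10) order_id, amount FROM dbo.orders WHERE customer_id = ?",
    12345,
)
for order_id, amount in cur.fetchall():
    print(order_id, amount)
```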

Posted 1 week ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Calling all innovators – find your future at Fiserv. We’re Fiserv, a global leader in Fintech and payments, and we move money and information in a way that moves the world. We connect financial institutions, corporations, merchants, and consumers to one another millions of times a day – quickly, reliably, and securely. Any time you swipe your credit card, pay through a mobile app, or withdraw money from the bank, we’re involved. If you want to make an impact on a global scale, come make a difference at Fiserv. Job Title Tech Lead, Software Development Engineering What does a successful database developer – Professional do at Fiserv? A successful database developer – Professional at Fiserv designs databases, optimizes database performance, and mentors junior developers, demonstrating deep expertise in SQL, PL/SQL, and database design principles. What will you do: Develop, maintain, and optimize PL/SQL code for efficient database operations. Design and implement database solutions to support business requirements. Troubleshoot and debug complex database issues. Collaborate with cross-functional teams to gather requirements and provide technical guidance. Perform code reviews and ensure adherence to coding standards. Optimize database performance through query optimization and indexing strategies. Create and maintain technical documentation for developed solutions. What Would Be Great To Have PowerBuilder knowledge Oracle certification (e.g., Oracle PL/SQL Developer Certified Associate). Experience with performance tuning and optimization techniques. Knowledge of other programming languages such as Shell Scripting, Java, Python/Perl, or C/C++. Familiarity with Agile development methodologies. Expertise in version control systems (e.g., Git) and CI/CD (GitHub Actions, Harness, etc.) What Will You Need To Have Bachelor’s degree in Computer Science, Information Technology, or a related field. 5 years of hands-on experience in SQL and PL/SQL development. Proficiency in Oracle database technologies. Strong understanding of database design principles and normalization. Work on security scans and pipelines for code deployment. Excellent problem-solving and analytical skills. Ability to work independently and collaboratively in a team environment. Effective communication skills to interact with stakeholders at all levels. Thank You For Considering Employment With Fiserv. Please apply using your legal name, complete the step-by-step profile, and attach your resume (either is acceptable, both are preferable). Our Commitment To Diversity And Inclusion Fiserv is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, gender, gender identity, sexual orientation, age, disability, protected veteran status, or any other category protected by law. Note To Agencies Fiserv does not accept resume submissions from agencies outside of existing agreements. Please do not send resumes to Fiserv associates. Fiserv is not responsible for any fees associated with unsolicited resume submissions. Warning About Fake Job Posts Please be aware of fraudulent job postings that are not affiliated with Fiserv. Fraudulent job postings may be used by cyber criminals to target your personally identifiable information and/or to steal money or financial information. Any communications from a Fiserv representative will come from a legitimate Fiserv email address.
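
For flavor, here is a hedged sketch of the bind-variable and stored-procedure style of PL/SQL work described above, using the python-oracledb driver (the successor to cx_Oracle). The DSN, credentials, orders table, and refresh_daily_totals procedure are all hypothetical stand-ins for illustration.

```python
import oracledb  # the python-oracledb driver

# Hypothetical DSN and credentials; replace with your environment's values.
conn = oracledb.connect(user="app_user", password="app_pw", dsn="dbhost/ORCLPDB1")
cur = conn.cursor()

# Bind variables (:p_status, :p_limit) let Oracle reuse the parsed statement
# and its plan across calls, a core PL/SQL performance practice.
cur.execute(
    "SELECT order_id, amount FROM orders "
    "WHERE status = :p_status FETCH FIRST :p_limit ROWS ONLY",
    p_status="OPEN",
    p_limit=10,
)
print(cur.fetchall())

# Stored procedures are invoked with callproc; refresh_daily_totals is a
# made-up procedure name used purely for illustration.
cur.callproc("refresh_daily_totals", [20250718])
conn.commit()
```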

Posted 1 week ago

Apply

5.0 - 7.0 years

0 Lacs

Thiruvananthapuram

On-site

5 - 7 Years 2 Openings Trivandrum Role description Role Proficiency: This role requires proficiency in data pipeline development, including coding and testing data pipelines for ingesting, wrangling, transforming, and joining data from various sources. Must be skilled in ETL tools such as Informatica, Glue, Databricks, and DataProc, with coding expertise in Python, PySpark, and SQL. Works independently and has a deep understanding of data warehousing solutions including Snowflake, BigQuery, Lakehouse, and Delta Lake. Capable of calculating costs and understanding performance issues related to data solutions. Outcomes: Act creatively to develop pipelines and applications by selecting appropriate technical options, optimizing application development, maintenance, and performance using design patterns and reusing proven solutions. Interpret requirements to create optimal architecture and design, developing solutions in accordance with specifications. Document and communicate milestones/stages for end-to-end delivery. Code adhering to best coding standards; debug and test solutions to deliver best-in-class quality. Perform performance tuning of code and align it with the appropriate infrastructure to optimize efficiency. Validate results with user representatives, integrating the overall solution seamlessly. Develop and manage data storage solutions including relational databases, NoSQL databases, and data lakes. Stay updated on the latest trends and best practices in data engineering, cloud technologies, and big data tools. Influence and improve customer satisfaction through effective data solutions. Measures of Outcomes: Adherence to engineering processes and standards Adherence to schedule / timelines Adhere to SLAs where applicable # of defects post delivery # of non-compliance issues Reduction of recurrence of known defects Quick turnaround of production bugs Completion of applicable technical/domain certifications Completion of all mandatory training requirements Efficiency improvements in data pipelines (e.g., reduced resource consumption, faster run times). Average time to detect, respond to, and resolve pipeline failures or data issues. Number of data security incidents or compliance breaches. Outputs Expected: Code Development: Develop data processing code independently, ensuring it meets performance and scalability requirements. Define coding standards, templates, and checklists. Review code for team members and peers. Documentation: Create and review templates, checklists, guidelines, and standards for design processes and development. Create and review deliverable documents including design documents, architecture documents, infrastructure costing, business requirements, source-target mappings, test cases, and results. Configuration: Define and govern the configuration management plan. Ensure compliance within the team. Testing: Review and create unit test cases, scenarios, and execution plans. Review the test plan and test strategy developed by the testing team. Provide clarifications and support to the testing team as needed. Domain Relevance: Advise data engineers on the design and development of features and components, demonstrating a deeper understanding of business needs. Learn about customer domains to identify opportunities for value addition. Complete relevant domain certifications to enhance expertise. Project Management: Manage the delivery of modules effectively. Defect Management: Perform root cause analysis (RCA) and mitigation of defects. 
Identify defect trends and take proactive measures to improve quality. Estimation: Create and provide input for effort and size estimation for projects. Knowledge Management: Consume and contribute to project-related documents, SharePoint libraries, and client universities. Review reusable documents created by the team. Release Management: Execute and monitor the release process to ensure smooth transitions. Design Contribution: Contribute to the creation of high-level design (HLD), low-level design (LLD), and system architecture for applications, business components, and data models. Customer Interface: Clarify requirements and provide guidance to the development team. Present design options to customers and conduct product demonstrations. Team Management: Set FAST goals and provide constructive feedback. Understand team members' aspirations and provide guidance and opportunities for growth. Ensure team engagement in projects and initiatives. Certifications: Obtain relevant domain and technology certifications to stay competitive and informed. Skill Examples: Proficiency in SQL, Python, or other programming languages used for data manipulation. Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF. Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g., AWS Glue, BigQuery). Conduct tests on data pipelines and evaluate results against data quality and performance specifications. Experience in performance tuning of data processes. Expertise in designing and optimizing data warehouses for cost efficiency. Ability to apply and optimize data models for efficient storage, retrieval, and processing of large datasets. Capacity to clearly explain and communicate design and development aspects to customers. Ability to estimate time and resource requirements for developing and debugging features or components. Knowledge Examples: Knowledge of various ETL services offered by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/DataFlow, Azure ADF, and ADLF. Proficiency in SQL for analytics, including windowing functions. Understanding of data schemas and models relevant to various business contexts. Familiarity with domain-related data and its implications. Expertise in data warehousing optimization techniques. Knowledge of data security concepts and best practices. Familiarity with design patterns and frameworks in data engineering. Additional Comments: Data Engineering Role Summary: Skilled Data Engineer with strong Python programming skills and experience in building scalable data pipelines across cloud environments. The candidate should have a good understanding of ML pipelines and basic exposure to GenAI solutioning. This role will support large-scale AI/ML and GenAI initiatives by ensuring high-quality, contextual, and real-time data availability. ________________________________________ Key Responsibilities: • Design, build, and maintain robust, scalable ETL/ELT data pipelines in AWS/Azure environments. • Develop and optimize data workflows using PySpark, SQL, and Airflow. • Work closely with AI/ML teams to support training pipelines and GenAI solution deployments. • Integrate data with vector databases like ChromaDB or Pinecone for RAG-based pipelines. • Collaborate with solution architects and GenAI leads to ensure reliable, real-time data availability for agentic AI and automation solutions. • Support data quality, validation, and profiling processes. 
________________________________________ Key Skills & Technology Areas: • Programming & Data Processing: Python (4–6 years), PySpark, Pandas, NumPy • Data Engineering & Pipelines: Apache Airflow, AWS Glue, Azure Data Factory, Databricks • Cloud Platforms: AWS (S3, Lambda, Glue), Azure (ADF, Synapse), GCP (optional) • Databases: SQL/NoSQL, Postgres, DynamoDB, Vector databases (ChromaDB, Pinecone) – preferred • ML/GenAI Exposure (basic): Hands-on with Pandas, scikit-learn; knowledge of RAG pipelines and GenAI concepts • Data Modeling: Star/Snowflake schema, data normalization, dimensional modeling • Version Control & CI/CD: Git, Jenkins, or similar tools for pipeline deployment ________________________________________ Other Requirements: • Strong problem-solving and analytical skills • Flexible to work on fast-paced and cross-functional priorities • Experience collaborating with AI/ML or GenAI teams is a plus • Good communication and a collaborative, team-first mindset • Experience in Telecom, E-Commerce, or Enterprise IT Operations is a plus. Skills: ETL, Big Data, PySpark, SQL About UST UST is a global digital transformation solutions provider. For more than 20 years, UST has worked side by side with the world’s best companies to make a real impact through transformation. Powered by technology, inspired by people and led by purpose, UST partners with their clients from design to operation. With deep domain expertise and a future-proof philosophy, UST embeds innovation and agility into their clients’ organizations. With over 30,000 employees in 30 countries, UST builds for boundless impact—touching billions of lives in the process.
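
As a minimal sketch of the ETL/ELT pipeline work outlined above, the PySpark job below reads raw CSVs, applies basic cleansing, and writes partitioned Parquet. The S3 paths and column names are assumptions for illustration, not a production pipeline.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: the bucket paths are hypothetical placeholders.
raw = spark.read.csv("s3a://raw-bucket/orders/*.csv", header=True, inferSchema=True)

# Transform: de-duplicate, filter bad rows, and derive a partition column,
# the bread and butter of the wrangling/transforming work described above.
cleaned = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("amount") > 0)
       .withColumn("order_date", F.to_date("order_ts"))
)

# Load: write partitioned Parquet for downstream warehouse/lakehouse use.
(cleaned.write
        .mode("overwrite")
        .partitionBy("order_date")
        .parquet("s3a://curated-bucket/orders/"))

spark.stop()
```

In practice a job like this would be scheduled and parameterized by an orchestrator such as Airflow, with the quality checks listed above run against the curated output.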

Posted 1 week ago

Apply

6.0 - 10.0 years

0 Lacs

India

On-site

Job Information Date Opened 07/18/2025 Job Type Full time City Saidapet State/Province Tamil Nadu Country India Zip/Postal Code 600096 Industry Technology Job Description Job Title: Database Consultant Job Summary: The Database Consultant is responsible for evaluating, optimizing, and securing the organization’s database systems to ensure high performance, data integrity, and regulatory compliance. This role supports the classification, integration, and lifecycle management of data assets in alignment with national standards and organizational policies. The consultant plays a key role in enabling data-driven decision-making and maintaining robust data infrastructure. Key Responsibilities: Database Assessment & Optimization: Analyze existing database systems for performance, scalability, and reliability. Recommend and implement tuning strategies to improve query efficiency and resource utilization. Support database upgrades, migrations, and reconfigurations. Security & Compliance: Ensure databases comply with national cybersecurity and data protection regulations. Implement access controls, encryption, and backup strategies to safeguard data. Conduct regular audits and vulnerability assessments. Data Classification & Integration: Support the classification of data assets based on sensitivity, usage, and ownership. Facilitate integration of data across platforms and applications to ensure consistency and accessibility. Collaborate with data governance teams to maintain metadata and lineage documentation. Lifecycle Management: Develop and enforce policies for data retention, archival, and disposal. Monitor data growth and storage utilization to support capacity planning. Ensure databases are aligned with business continuity and disaster recovery plans. Collaboration & Advisory: Work closely with application developers, data analysts, and IT teams to understand data requirements. Provide expert guidance on database design, normalization, and indexing strategies. Assist in selecting and implementing database technologies that align with business goals. Innovation & Best Practices: Stay current with emerging database technologies and trends (e.g., cloud databases, NoSQL, data mesh). Promote best practices in database management and data governance. Contribute to the development of enterprise data strategies. Qualifications: Bachelor’s or Master’s degree in Computer Science, Information Systems, or related field. Proven experience in database administration, consulting, or data architecture. Proficiency in SQL and familiarity with major database platforms (e.g., Oracle, SQL Server, PostgreSQL, MySQL, MongoDB). Knowledge of data governance frameworks and compliance standards (e.g., GDPR, HIPAA, ISO 27001). Strong analytical, problem-solving, and communication skills. 6-10 years of relevant Experience in IT
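
To ground the lifecycle-management responsibilities above (retention, archival, disposal) in code, here is a small sketch using Python's sqlite3: expired rows move to an archive table and are deleted from the hot table in one transaction. The audit_log schema and the 365-day policy are invented for the example.

```python
import sqlite3

RETENTION_DAYS = 365  # hypothetical policy: keep one year of detail online

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE audit_log (id INTEGER PRIMARY KEY, event TEXT, created_at TEXT);
    CREATE TABLE audit_log_archive (id INTEGER PRIMARY KEY, event TEXT, created_at TEXT);
    INSERT INTO audit_log (event, created_at) VALUES
        ('login',  date('now', '-400 days')),
        ('logout', date('now', '-2 days'));
""")

# Copy expired rows to the archive, then delete them from the hot table,
# inside one transaction so a failure never leaves the data half-moved.
cutoff = (f"-{RETENTION_DAYS} days",)
with conn:
    conn.execute(
        "INSERT INTO audit_log_archive SELECT * FROM audit_log "
        "WHERE created_at < date('now', ?)",
        cutoff,
    )
    conn.execute("DELETE FROM audit_log WHERE created_at < date('now', ?)", cutoff)

print(cur.execute("SELECT COUNT(*) FROM audit_log").fetchone())          # (1,)
print(cur.execute("SELECT COUNT(*) FROM audit_log_archive").fetchone())  # (1,)
```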

Posted 1 week ago

Apply

5.0 - 8.0 years

2 - 8 Lacs

Noida

On-site

Posted On: 18 Jul 2025 Location: Noida, UP, India Company: Iris Software Why Join Us? Are you inspired to grow your career at one of India’s Top 25 Best Workplaces in the IT industry? Do you want to do the best work of your life at one of the fastest growing IT services companies? Do you aspire to thrive in an award-winning work culture that values your talent and career aspirations? It’s happening right here at Iris Software. About Iris Software At Iris Software, our vision is to be our client’s most trusted technology partner, and the first choice for the industry’s top professionals to realize their full potential. With over 4,300 associates across India, U.S.A., and Canada, we help our enterprise clients thrive with technology-enabled transformation across financial services, healthcare, transportation & logistics, and professional services. Our work covers complex, mission-critical applications with the latest technologies, such as high-value complex Application & Product Engineering, Data & Analytics, Cloud, DevOps, Data & MLOps, Quality Engineering, and Business Automation. Working at Iris Be valued, be inspired, be your best. At Iris Software, we invest in and create a culture where colleagues feel valued, can explore their potential, and have opportunities to grow. Our employee value proposition (EVP) is about “Being Your Best” – as a professional and person. It is about being challenged by work that inspires us, being empowered to excel and grow in your career, and being part of a culture where talent is valued. We’re a place where everyone can discover and be their best version. Job Description Key Responsibilities: Data Testing Strategy & Execution: Design, develop, and execute comprehensive test plans and test cases for data-centric applications, ETL processes, data warehouses, data lakes, and reporting solutions. SQL-Driven Validation: Utilize advanced SQL queries to perform complex data validation, data reconciliation, data integrity checks, and data quality assurance across various financial data sources. ETL Testing: Conduct thorough testing of ETL (Extract, Transform, Load) processes, ensuring data is accurately extracted, transformed according to business rules, and loaded correctly into target systems. Data Quality Assurance: Implement and monitor data quality checks, identify data discrepancies, anomalies, and inconsistencies, and work with development and business teams to resolve issues. Performance Testing (Data Focus): Contribute to performance testing efforts for data pipelines and database operations, ensuring optimal query and data load performance. Test Data Management: Create and manage robust test data sets for various testing phases, including positive, negative, and edge case scenarios. Defect Management: Identify, document, track, and re-test defects in data, collaborating closely with development and data engineering teams for timely resolution. Documentation & Reporting: Maintain clear and concise documentation of test plans, test cases, test results, and data quality reports. Provide regular status updates to stakeholders. Collaboration: Work effectively with business analysts, data architects, data engineers, and project managers to understand data flows, business requirements, and ensure data quality standards are met. Process Improvement: Proactively identify opportunities for process improvements in data testing methodologies and tools. 
Global Team Collaboration: Provide consistent overlap with EST working hours (until noon EST) to facilitate effective communication and collaboration with US-based teams. ________________________________________ Required Skills & Experience: Experience: 5-8 years of hands-on experience in Data Quality Assurance, Data Testing, or ETL Testing roles. SQL Expertise: o Advanced proficiency in SQL: Ability to write complex queries, subqueries, analytical functions (Window functions), CTEs, and stored procedures for data validation, reconciliation, and analysis. o Experience with various SQL databases (e.g., SQL Server, Oracle, PostgreSQL, MySQL, Snowflake, BigQuery). o Strong understanding of database concepts: normalization, indexing, primary/foreign keys, and data types. Data Testing Methodologies: Solid understanding of data warehousing concepts, ETL processes, and various data testing strategies (e.g., source-to-target mapping validation, data transformation testing, data load testing, data completeness, data accuracy). Domain Expertise: o Strong understanding and proven experience in Risk and Finance IT domain: Familiarity with financial data (e.g., trading data, market data, risk metrics, accounting data, regulatory reporting). o Knowledge of financial products, regulations, and risk management concepts. Analytical & Problem-Solving Skills: Excellent ability to analyze complex data sets, identify root causes of data issues, and propose effective solutions. Communication: Strong verbal and written communication skills to articulate data issues and collaborate with diverse teams. Mandatory Competencies QA/QE - QA Automation - ETL Testing ETL - ETL - Tester Beh - Communication and collaboration Database - Sql Server - SQL Packages Perks and Benefits for Irisians At Iris Software, we offer world-class benefits designed to support the financial, health and well-being needs of our associates to help achieve harmony between their professional and personal growth. From comprehensive health insurance and competitive salaries to flexible work arrangements and ongoing learning opportunities, we're committed to providing a supportive and rewarding work environment. Join us and experience the difference of working at a company that values its employees' success and happiness.
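
As an illustration of the SQL-driven validation this role emphasizes, the sketch below runs a completeness check (row counts) and an accuracy check (a set difference via EXCEPT) between a source and a target table. It uses Python's sqlite3 with made-up trade data as a stand-in for a real warehouse such as SQL Server or Snowflake.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE src_trades (trade_id INTEGER, notional REAL);
    CREATE TABLE tgt_trades (trade_id INTEGER, notional REAL);
    INSERT INTO src_trades VALUES (1, 100.0), (2, 250.0), (3, 75.0);
    INSERT INTO tgt_trades VALUES (1, 100.0), (2, 999.0);  -- drift + missing row
""")

# Completeness check: row counts must match between source and target.
src_n = cur.execute("SELECT COUNT(*) FROM src_trades").fetchone()[0]
tgt_n = cur.execute("SELECT COUNT(*) FROM tgt_trades").fetchone()[0]
print(f"source={src_n} target={tgt_n} complete={src_n == tgt_n}")

# Accuracy check: EXCEPT surfaces rows present in source but wrong or
# absent in target; here trade 2 (bad notional) and trade 3 (missing).
diffs = cur.execute(
    "SELECT trade_id, notional FROM src_trades "
    "EXCEPT "
    "SELECT trade_id, notional FROM tgt_trades"
).fetchall()
print("mismatched/missing:", diffs)
```

The same pattern scales up to source-to-target mapping validation: each transformation rule becomes a query pair whose set difference should be empty.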

Posted 1 week ago

Apply

5.0 - 8.0 years

4 - 9 Lacs

Noida

On-site

Posted On: 18 Jul 2025 Location: Noida, UP, India Company: Iris Software Why Join Us? Are you inspired to grow your career at one of India’s Top 25 Best Workplaces in the IT industry? Do you want to do the best work of your life at one of the fastest growing IT services companies? Do you aspire to thrive in an award-winning work culture that values your talent and career aspirations? It’s happening right here at Iris Software. About Iris Software At Iris Software, our vision is to be our client’s most trusted technology partner, and the first choice for the industry’s top professionals to realize their full potential. With over 4,300 associates across India, U.S.A., and Canada, we help our enterprise clients thrive with technology-enabled transformation across financial services, healthcare, transportation & logistics, and professional services. Our work covers complex, mission-critical applications with the latest technologies, such as high-value complex Application & Product Engineering, Data & Analytics, Cloud, DevOps, Data & MLOps, Quality Engineering, and Business Automation. Working at Iris Be valued, be inspired, be your best. At Iris Software, we invest in and create a culture where colleagues feel valued, can explore their potential, and have opportunities to grow. Our employee value proposition (EVP) is about “Being Your Best” – as a professional and person. It is about being challenged by work that inspires us, being empowered to excel and grow in your career, and being part of a culture where talent is valued. We’re a place where everyone can discover and be their best version. Job Description Key Responsibilities: Data Testing Strategy & Execution: Design, develop, and execute comprehensive test plans and test cases for data-centric applications, ETL processes, data warehouses, data lakes, and reporting solutions. SQL-Driven Validation: Utilize advanced SQL queries to perform complex data validation, data reconciliation, data integrity checks, and data quality assurance across various financial data sources. ETL Testing: Conduct thorough testing of ETL (Extract, Transform, Load) processes, ensuring data is accurately extracted, transformed according to business rules, and loaded correctly into target systems. Data Quality Assurance: Implement and monitor data quality checks, identify data discrepancies, anomalies, and inconsistencies, and work with development and business teams to resolve issues. Performance Testing (Data Focus): Contribute to performance testing efforts for data pipelines and database operations, ensuring optimal query and data load performance. Test Data Management: Create and manage robust test data sets for various testing phases, including positive, negative, and edge case scenarios. Defect Management: Identify, document, track, and re-test defects in data, collaborating closely with development and data engineering teams for timely resolution. Documentation & Reporting: Maintain clear and concise documentation of test plans, test cases, test results, and data quality reports. Provide regular status updates to stakeholders. Collaboration: Work effectively with business analysts, data architects, data engineers, and project managers to understand data flows, business requirements, and ensure data quality standards are met. Process Improvement: Proactively identify opportunities for process improvements in data testing methodologies and tools. 
Global Team Collaboration: Provide consistent overlap with EST working hours (until noon EST) to facilitate effective communication and collaboration with US-based teams. ________________________________________ Required Skills & Experience: Experience: 5-8 years of hands-on experience in Data Quality Assurance, Data Testing, or ETL Testing roles. SQL Expertise: o Advanced proficiency in SQL: Ability to write complex queries, subqueries, analytical functions (Window functions), CTEs, and stored procedures for data validation, reconciliation, and analysis. o Experience with various SQL databases (e.g., SQL Server, Oracle, PostgreSQL, MySQL, Snowflake, BigQuery). o Strong understanding of database concepts: normalization, indexing, primary/foreign keys, and data types. Data Testing Methodologies: Solid understanding of data warehousing concepts, ETL processes, and various data testing strategies (e.g., source-to-target mapping validation, data transformation testing, data load testing, data completeness, data accuracy). Domain Expertise: o Strong understanding and proven experience in Risk and Finance IT domain: Familiarity with financial data (e.g., trading data, market data, risk metrics, accounting data, regulatory reporting). o Knowledge of financial products, regulations, and risk management concepts. Analytical & Problem-Solving Skills: Excellent ability to analyze complex data sets, identify root causes of data issues, and propose effective solutions. Communication: Strong verbal and written communication skills to articulate data issues and collaborate with diverse teams. Mandatory Competencies Perks and Benefits for Irisians At Iris Software, we offer world-class benefits designed to support the financial, health and well-being needs of our associates to help achieve harmony between their professional and personal growth. From comprehensive health insurance and competitive salaries to flexible work arrangements and ongoing learning opportunities, we're committed to providing a supportive and rewarding work environment. Join us and experience the difference of working at a company that values its employees' success and happiness.

Posted 1 week ago

Apply

0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Full-time | Entry-Level | Freshers Welcome (B.Tech Required) Location: Ahmedabad, Gujarat, India ⸻ About the Role We are seeking a detail-oriented and passionate Junior Database Engineer to join our growing infrastructure team at our Hyderabad office. This is an excellent opportunity for fresh graduates who are eager to dive deep into relational database systems, query optimization, and data infrastructure engineering. You will be responsible for maintaining, optimizing, and scaling MySQL-based database systems that power our marketplace platform—supporting real-time, high-availability operations across global trade networks. ⸻ Core Responsibilities • Support the administration and performance tuning of MySQL databases in production and development environments. • Implement database design best practices including normalization, indexing strategies, and query optimization. • Assist with managing master-slave replication, backup & recovery processes, and disaster recovery planning. • Learn and support sharding strategies, data partitioning, and horizontal scaling for large datasets. • Write and optimize complex SQL queries, stored procedures, and triggers. • Monitor database health using monitoring tools and address bottlenecks, slow queries, or deadlocks. • Collaborate with backend engineers and DevOps to ensure database reliability, scalability, and high availability. ⸻ Technical Skills & Requirements • Fresh graduates (B.Tech in Computer Science, IT, or related fields) with academic or project experience in SQL and RDBMS. • Strong understanding of relational database design, ACID principles, and transaction management. • Hands-on experience with MySQL or compatible systems (MariaDB, Percona). • Familiarity with ER modeling, data migration, and schema versioning. • Exposure to concepts like: • Replication (master-slave/master-master) • Sharding & partitioning • Write/read splitting • Backup strategies (mysqldump, Percona XtraBackup) • Connection pooling and resource utilization • Comfortable working in Linux environments and using CLI tools. • Strong analytical skills and a curiosity to explore and solve data-layer challenges. Interview Process 1. Shortlisting – Based on resume and relevant experience 2. Technical Assessment – Practical web development test 3. Final Interview – With the client’s hiring team ⸻ Why Join Us? • Be part of a cutting-edge AI project with global exposure • Work in a professional environment with real growth opportunities • Gain valuable experience in client-facing, production-level development • Strong potential for contract extension or full-time conversion ⸻ Interested in working on impactful web products for the future of AI?
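
Since the posting highlights replication and write/read splitting, here is a deliberately naive sketch of routing statements between a primary and a replica using the mysql-connector-python driver. The hostnames, credentials, and listings table are hypothetical, and a real router must also handle replication lag, failover, and read-your-own-writes consistency.

```python
import mysql.connector  # mysql-connector-python

# Hypothetical hosts and credentials; stand-ins for a real primary/replica pair.
primary = mysql.connector.connect(
    host="db-primary.internal", user="app", password="secret", database="marketplace"
)
replica = mysql.connector.connect(
    host="db-replica.internal", user="app", password="secret", database="marketplace"
)

def run(sql, params=()):
    """Route writes to the primary and reads to the replica."""
    first_word = sql.lstrip().split(None, 1)[0].upper()
    is_write = first_word in {"INSERT", "UPDATE", "DELETE", "REPLACE"}
    conn = primary if is_write else replica
    cur = conn.cursor()
    cur.execute(sql, params)
    if is_write:
        conn.commit()
        return cur.rowcount
    return cur.fetchall()

run("INSERT INTO listings (title) VALUES (%s)", ("industrial pump",))  # -> primary
print(run("SELECT id, title FROM listings ORDER BY id DESC LIMIT 5"))  # -> replica
```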

Posted 1 week ago

Apply