1-3 Years | Gurgaon | Full-Time
Job Description | SDET

Who are we?
Falcon is a Series-A funded, cloud-native, AI-first banking technology and processing platform that helps banks, NBFCs, and PPIs quickly and affordably launch next-gen financial products such as credit cards, credit lines on UPI, prepaid cards, fixed deposits, and loans. Since our 2022 launch, we've processed USD 1 Bn+ in transactions, signed on 12 of India's top financial institutions, and clocked USD 15 Mn+ in revenue. Our company is backed by marquee investors from around the world, including heavyweight investors from Japan and the USA, as well as leading Indian ventures and banks. For more details, please visit https://falconfs.com/

Job Summary
We're looking for a passionate and detail-oriented SDET (Software Development Engineer in Test) with a strong understanding of penetration testing and familiarity with automation frameworks. In this hybrid role, you'll bridge the gap between development, testing, and security, ensuring our applications are not just functional but secure and resilient.

Key Responsibilities
- Conduct manual and automated penetration testing across web and mobile applications, APIs, and infrastructure
- Develop, maintain, and enhance automated test scripts within CI/CD pipelines
- Identify vulnerabilities using tools (e.g., Burp Suite, OWASP ZAP, Metasploit) and validate fixes through regression and retesting
- Collaborate with DevOps, Development, and Product teams to build security-first testing frameworks
- Assist in creating secure coding guidelines and performing code reviews with a security lens
- Contribute to the development of test strategies, test plans, and test cases
- Stay updated on the latest security vulnerabilities, attack vectors, and threat landscapes

Required Qualifications
- 1-3 years of experience in software testing, including security and functional test automation
- Strong understanding of the OWASP Top 10, threat modelling, and security best practices
- Experience with automated testing tools such as Selenium, TestNG, JUnit, or Cypress
- Hands-on experience with SAST/DAST tools, vulnerability scanners, and scripting languages (Python, Bash, JavaScript)
- Familiarity with CI/CD tools like Jenkins, GitLab, or CircleCI
- Understanding of RESTful APIs and experience in API testing (Postman, REST Assured, etc.)
- Certifications such as OSCP, CEH, or GWAPT are a plus

Other Specifics
- Location: Gurgaon (hybrid)
- Job Type: Full-Time
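To illustrate the kind of automated, security-minded API check this posting describes (test scripts that can run inside a CI/CD pipeline), here is a minimal sketch using pytest and the requests library. The base URL, endpoints, and header list are illustrative assumptions, not Falcon's actual services.

```python
# Minimal sketch of CI-friendly API security checks, assuming a hypothetical
# service at https://api.example.com. Endpoints and expected headers are
# illustrative placeholders, not an actual Falcon API.
import requests

BASE_URL = "https://api.example.com"  # hypothetical service under test

EXPECTED_SECURITY_HEADERS = [
    "Strict-Transport-Security",
    "X-Content-Type-Options",
    "Content-Security-Policy",
]


def test_security_headers_present():
    # Basic DAST-style assertion: responses should carry common security headers.
    response = requests.get(f"{BASE_URL}/health", timeout=10)
    assert response.status_code == 200
    for header in EXPECTED_SECURITY_HEADERS:
        assert header in response.headers, f"missing security header: {header}"


def test_unauthenticated_access_is_rejected():
    # A protected resource should not be readable without credentials.
    response = requests.get(f"{BASE_URL}/v1/accounts", timeout=10)
    assert response.status_code in (401, 403)
```

In practice such tests would be one stage of the pipeline, run alongside functional suites and scanner output (e.g., OWASP ZAP) so that regressions and reintroduced vulnerabilities are caught before release.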
You will be working at Paras Twin Tower, Gurgaon, as a full-time employee of Falcon, a Series-A funded, cloud-native, AI-first banking technology and processing platform. Falcon helps banks, NBFCs, and PPIs efficiently launch cutting-edge financial products such as credit cards, credit lines on UPI, prepaid cards, fixed deposits, and loans. Since its inception in 2022, Falcon has processed over USD 1 billion in transactions, collaborated with 12 of India's top financial institutions, and generated revenue exceeding USD 15 million. The company is backed by prominent investors from Japan, the USA, and leading Indian ventures and banks. To learn more about Falcon, visit https://falconfs.com/.

As an Intermediate Data Engineer with 5-7 years of experience, you will design, develop, and maintain scalable ETL processes using open-source tools and data frameworks such as AWS Glue, AWS Athena, Redshift, Apache Kafka, Apache Spark, Apache Airflow, and Pentaho Data Integration (PDI). You will also design, create, and manage data lakes and data warehouses on the AWS cloud; optimize the data pipeline architecture; write complex SQL queries for big data processing; collaborate with product and engineering teams to build a platform for data modeling and machine learning operations; implement data structures and algorithms that meet functional and non-functional requirements; ensure data privacy and compliance; develop processes for monitoring and alerting on data quality issues; and stay current with data engineering trends by evaluating new open-source technologies.

To qualify for this role, you must have a Bachelor's or Master's degree in Computer Science or an MCA from a reputable institute and at least 4 years of experience in a data engineering role. You should be proficient in Python, Java, or Scala for data processing (Python preferred), have a deep understanding of SQL and analytical data warehouses, and have experience with databases such as PostgreSQL, MySQL, and MongoDB. Knowledge of AWS technologies such as Lambda, Athena, Glue, and Redshift, experience implementing ETL or ELT best practices at scale, familiarity with data pipeline tools like Airflow, Luigi, Azkaban, and dbt, proficiency with Git for version control, familiarity with Linux-based systems and cloud services (preferably AWS), strong analytical skills, and the ability to work in an agile, collaborative team environment are all required.

Preferred skills include certification in any open-source big data technology, expertise in Apache Hadoop, Apache Hive, and other open-source big data technologies, familiarity with data visualization tools such as Apache Superset, Grafana, and Tableau, experience with CI/CD processes, and knowledge of containerization technologies like Docker or Kubernetes. If you have these skills and this experience, we encourage you to explore the opportunity further.
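As a rough illustration of the batch transformations this role works on (Spark jobs feeding a data lake or warehouse), here is a minimal PySpark sketch. The S3 paths, column names, and aggregation logic are hypothetical assumptions for illustration only, not Falcon's actual pipeline.

```python
# A minimal PySpark batch-transformation sketch. Paths and column names
# (created_at, program_id, amount) are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-txn-aggregation").getOrCreate()

# Read raw transaction events from a data lake location (path is illustrative).
transactions = spark.read.parquet("s3a://example-data-lake/raw/transactions/")

# Aggregate per card program and day, then write back to a curated zone.
daily_summary = (
    transactions
    .withColumn("txn_date", F.to_date("created_at"))
    .groupBy("program_id", "txn_date")
    .agg(
        F.count("*").alias("txn_count"),
        F.sum("amount").alias("total_amount"),
    )
)

daily_summary.write.mode("overwrite").partitionBy("txn_date").parquet(
    "s3a://example-data-lake/curated/daily_txn_summary/"
)
```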
As a Reconciliation Specialist at Falcon, a Series-A funded, cloud-native, AI-first banking technology and processing platform, your primary responsibility will be to manage daily operations, reconciliations, reporting, and dispute handling for Credit Card and Prepaid (PPI) programs. You will coordinate with internal teams, banking partners, processors, and external vendors to ensure seamless operations, compliance, and accurate reporting.

Your key responsibilities will include sharing daily reports with banks and FIs via SFTP and partner portals; reconciling all credit card, prepaid, and other program reports; investigating and correcting discrepancies; managing customer disputes and chargeback processing via networks such as the Visa VROL portal, tracking closure, and posting chargeback credits and fees as per approvals; reconciling card issuance and dispatch; handling Jira tickets and managing escalations for high-priority operational issues; downloading and sharing daily switch files; managing network reports and reconciling them with system and settlement reports; tallying card loads, authorizations, refunds, and reversals, and rectifying reconciliation issues; preparing and sharing regulatory reports and generating ad hoc reports; setting up operational guidelines for new client programs; reviewing and signing off reports for compliance; monitoring operational alerts; and ensuring data backup and archival.

To excel in this role, you should have a strong understanding of reconciliation processes, settlement cycles, chargeback handling, and dispute management. You should have working knowledge of SFTP, file management, and automated alerts, and be able to coordinate effectively across technical teams, partners, and banking stakeholders. Excellent communication and documentation skills, proficiency in MS Excel for report analysis, familiarity with Jira or similar ticketing tools, and a detail-oriented, process-driven approach are essential.

The preferred qualification for this position is a B.Tech. or a Bachelor's degree in Commerce, Finance, or a related field. This Gurgaon-based role is critical to maintaining the operational stability of live Credit Card and Prepaid programs, and the successful candidate is expected to be proactive in identifying and resolving operational issues. If you are ready to take on this challenging role and contribute to the success of Falcon's banking technology platform, we encourage you to apply and join our dynamic team.
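For context on what "reconciling network reports with system and settlement reports" typically involves, here is a minimal pandas sketch of matching two report files on a reference number and flagging mismatches. The file names, layouts, and column names (rrn, amount) are hypothetical; actual network and processor file formats differ.

```python
# A minimal reconciliation sketch, assuming hypothetical CSV layouts for a
# switch file and a settlement report, each with columns: rrn, amount.
import pandas as pd

switch = pd.read_csv("switch_file.csv")
settlement = pd.read_csv("settlement_report.csv")

# Outer-join on the retrieval reference number so unmatched rows surface on both sides.
merged = switch.merge(
    settlement, on="rrn", how="outer", suffixes=("_switch", "_settlement"), indicator=True
)

# Rows present in only one report, or matched rows with differing amounts, need investigation.
unmatched = merged[merged["_merge"] != "both"]
amount_mismatch = merged[
    (merged["_merge"] == "both")
    & (merged["amount_switch"] != merged["amount_settlement"])
]

print(f"unmatched rows: {len(unmatched)}, amount mismatches: {len(amount_mismatch)}")
```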
As a Platform Product Manager at Falcon, a Series-A funded, cloud-native, AI-first banking technology and processing platform, your primary responsibility will be to oversee the end-to-end product lifecycle of our fintech platform. In this role, you will collaborate closely with cross-functional teams to conceptualize, develop, and deliver innovative solutions that cater to the requirements of both businesses and consumers. Your pivotal role will involve ensuring that our platform maintains its robustness, scalability, and user-friendliness.

Your key responsibilities will include identifying opportunities for product innovation and differentiation through comprehensive market analysis, monitoring competitor products to evaluate strengths, weaknesses, and areas for differentiation, and staying informed about industry regulations to ensure product features meet compliance standards. Additionally, you will work with various teams to gather insights into customer needs, market trends, and the competitive landscape, synthesize user feedback from multiple sources, and translate these insights into actionable recommendations for refining and expanding product features.

You will also collaborate with stakeholders to document requirements for new features or enhancements, define and prioritize features for the product roadmap, and bridge the communication divide between technical and non-technical teams to ensure clarity in product development. Further, you will iterate on products based on user feedback, data analysis, and evolving market trends to enhance the product's value proposition, contribute to generating reports on key product metrics and performance, and participate in presenting findings and recommendations to foster collaboration across teams. Furthermore, you will play a crucial role in shaping the long-term product strategy by identifying growth avenues and fostering innovation.

To excel in this role, you must hold a bachelor's degree in a relevant field such as business, computer science, or engineering and possess 5-8 years of overall experience, with a minimum of 3 years of experience building platform products in a technology company. Proficiency in writing API specifications, excellent written and verbal communication skills, strong analytical and problem-solving abilities, empathy towards users, and the ability to translate feedback into actionable product enhancements are essential qualities. You should also demonstrate a deep understanding of Falcon's product offerings and the value they deliver to customers, meticulous attention to detail in data analysis and in defining precise product requirements, and familiarity with technical concepts along with effective collaboration with engineers. Curiosity to explore user behavior, knowledge of industry trends, and exceptional task management and prioritization skills are essential in this fast-paced environment.

A master's degree in business or engineering, experience with customer journey mapping and user experience design, knowledge of regulatory requirements and compliance in the financial services industry, and experience creating multi-country platform products are considered advantageous. In return, Falcon offers ample opportunities for professional growth, collaboration with top talent in the product and engineering domains, an annual learning and development budget, comprehensive medical insurance coverage for you and your family, and a hybrid work policy providing flexibility to work from the office and remotely.
This full-time position is based in Gurgaon (Delhi NCR).
2-4 Years | Chennai | Full-Time
Job Description | Manager/Senior Manager - SME - CC

Who are we?
Falcon is a Series-A funded, cloud-native, AI-first banking technology and processing platform that helps banks, NBFCs, and PPIs quickly and affordably launch next-gen financial products such as credit cards, credit lines on UPI, prepaid cards, fixed deposits, and loans. Since our 2022 launch, we've processed USD 1.5 Bn+ in transactions, signed on 12 of India's top financial institutions, and clocked USD 15 Mn+ in revenue. Our company is backed by marquee investors from around the world, including heavyweight investors from Japan and the USA, as well as leading Indian ventures and banks. For more details, please visit https://falconfs.com/

Job Summary
We are looking for a dynamic and experienced Delivery & Implementation Manager with subject matter expertise in credit cards to support the successful implementation of credit card products and services for our banking partner. This role is based on-site at the Karur Vysya Bank (KVB) office, and the ideal candidate will have hands-on experience working with banks, managing delivery pressure, and leading end-to-end implementation cycles.

Key Responsibilities
- Act as the subject matter expert (SME) for credit card implementation, servicing, and lifecycle management
- Lead end-to-end delivery of credit card projects, including coordination with cross-functional teams (tech, product, operations, and compliance)
- Serve as the single point of contact on-site at KVB for all delivery and implementation activities
- Drive UAT, deployment, go-live coordination, and post-go-live support
- Collaborate with bank stakeholders to gather and finalize business requirements
- Manage stakeholder expectations in high-pressure banking environments
- Identify potential risks and proactively implement mitigation strategies
- Track project timelines, ensure adherence to SLAs, and escalate blockers appropriately
- Regularly report project status, milestones, and issues to internal and client leadership

Required Skills & Experience
- 2-4 years of experience in the banking industry, with a focus on credit card systems and implementation
- Understanding of the credit card lifecycle, including onboarding, transactions, disputes, billing, and collections
- Experience in project coordination, delivery planning, and issue resolution
- Client-facing communication and stakeholder management skills
- Ability to work under tight deadlines and manage multiple priorities in a banking setup
- Knowledge of card management systems and integrations with LOS/LMS/CBS is a plus

Preferred Qualifications
- Bachelor's degree in Engineering, Finance, or a related field; MBA preferred
- Certifications in project management (PMP, PRINCE2) or Agile methodologies are a plus

Work Environment
- Full-time, on-site at KVB's offices in Chennai
- Fast-paced, collaborative, and structured working environment
- Candidates must be willing to work closely with banking leadership and operations teams on-site

Location: Chennai (primarily on-site at the KVB office; may extend to other partner banks as required)
As a Security Lead at Kite, you will play a crucial role in enhancing the security systems and policies at Kite to ensure the confidentiality and security of our users' information. Your responsibilities will include:
- Enhancing security team accomplishments through planning and mentorship of team members
- Determining security requirements through business strategy research, evaluation, vulnerability assessments, platform studies, and cost estimates
- Implementing, maintaining, and enhancing security systems through the specification of intrusion detection methodologies, installation and calibration of equipment and software, provision of technical support, and preparation of security reports and other documentation
- Driving end-to-end PCI and other required certifications

It is essential that you:
- Have at least 6 years of hands-on leadership experience in developing solutions with a focus on performance, scalability, and reliability, and in performing network and application security penetration testing and/or threat assessments
- Possess at least 6 years of experience with commercial and open-source security technologies and protocols, such as malware prevention, DLP, IDS/IDP, cryptography, vulnerability scanning, penetration testing, SSH, SSL/TLS, Snort, port scanners, etc.
- Hold an educational background in Computer Science, Information Security, or a related Engineering discipline
- Stay updated on emerging security standards and platforms by maintaining networks in security communities and participating in education opportunities

Experience in the following areas would be a plus:
- Identity management solutions such as SailPoint, Centrify, CyberArk, and Radiant Logic
- Application and infrastructure hardening techniques
- IT networking and heterogeneous computing environments (network routing/switching, UNIX, Windows, virtualized infrastructures)
- Cloud computing, specifically AWS and Microsoft Azure environments
- IT security-related audits such as NIST 800-53, DoD 8500.2, or other relevant frameworks

Skills required:
- Malware prevention
- Vulnerability scanning
- Penetration testing
- Computer security

Join Kite to be part of building a new-age financial system that prioritizes security and full financial inclusion in India.
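As a small illustration of the routine security automation this role touches (SSL/TLS hygiene, monitoring), here is a sketch that checks how many days remain before a host's TLS certificate expires, using only the Python standard library. The hostnames are illustrative placeholders, not Kite's infrastructure.

```python
# Minimal TLS certificate expiry check using only the standard library.
# Hostnames are hypothetical; in practice this would run on a schedule and
# feed an alerting channel.
import socket
import ssl
import time

HOSTS = ["example.com", "api.example.com"]  # hypothetical endpoints to monitor


def days_until_expiry(hostname: str, port: int = 443) -> int:
    """Return the number of days until the host's TLS certificate expires."""
    context = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            cert = tls.getpeercert()
    # 'notAfter' looks like 'Jun  1 12:00:00 2026 GMT'; convert to epoch seconds.
    expires_at = ssl.cert_time_to_seconds(cert["notAfter"])
    return int((expires_at - time.time()) // 86400)


if __name__ == "__main__":
    for host in HOSTS:
        print(f"{host}: certificate expires in {days_until_expiry(host)} days")
```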
2-4 Years | Paras Twin Tower, Gurgaon | Full-Time

Company Introduction
Who are we?
Falcon is a Series-A funded, cloud-native, AI-first banking technology and processing platform that helps banks, NBFCs, and PPIs quickly and affordably launch next-gen financial products such as credit cards, credit lines on UPI, prepaid cards, fixed deposits, and loans. Since our 2022 launch, we've processed USD 1 Bn+ in transactions, signed on 12 of India's top financial institutions, and clocked USD 15 Mn+ in revenue. Our company is backed by marquee investors from around the world, including heavyweight investors from Japan and the USA, as well as leading Indian ventures and banks. For more details, please visit https://falconfs.com/

Experience level: Intermediate (5-7 years)

Key Responsibilities
- Design, develop, and support scalable ETL processes using open-source tools and data frameworks such as AWS Glue, AWS Athena, Redshift, Apache Kafka, Apache Spark, Apache Airflow, and Pentaho Data Integration (PDI)
- Design, create, and maintain data lakes and data warehouses on the AWS cloud
- Maintain and optimise our data pipeline architecture, and formulate complex SQL queries for big data processing
- Collaborate with product and engineering teams to design and develop a platform for data modelling and machine learning operations
- Implement various data structures and algorithms to ensure we meet both functional and non-functional requirements
- Maintain data privacy and compliance according to industry standards
- Develop processes for monitoring and alerting on data quality issues
- Continually evaluate new open-source technologies and stay updated on the latest data engineering trends

Key Qualifications
- Bachelor's or Master's degree in Computer Science or MCA from a reputed institute
- Minimum of 4 years of experience in a data engineering role
- Experience using Python, Java, or Scala for data processing (Python preferred)
- Demonstrably deep understanding of SQL and analytical data warehouses
- Solid experience with popular databases such as PostgreSQL, MySQL, and MongoDB
- Knowledge of AWS technologies such as Lambda, Athena, Glue, and Redshift
- Hands-on experience implementing ETL (or ELT) best practices at scale
- Hands-on experience with data pipeline tools (Airflow, Luigi, Azkaban, dbt)
- Experience with version control tools like Git
- Familiarity with Linux-based systems and cloud services, preferably in environments like AWS
- Strong analytical skills and the ability to work in an agile and collaborative team environment

Preferred Skills
- Certification in any open-source big data technology
- Expertise in open-source big data technologies such as Apache Hadoop, Apache Hive, and others
- Familiarity with data visualisation tools like Apache Superset, Grafana, Tableau, etc.
- Experience in CI/CD processes and containerization technologies like Docker or Kubernetes
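Since the posting above centres on scheduled ETL with Apache Airflow, here is a minimal DAG sketch showing the extract-transform-load task chain. The DAG id, schedule, and task bodies are illustrative placeholders under the assumption of Airflow 2.x, not Falcon's actual pipeline.

```python
# Minimal Airflow 2.x DAG sketch for a daily extract-transform-load run.
# Task bodies are placeholders; a real pipeline would call Glue jobs,
# Spark jobs, or warehouse loads instead of printing.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Pull the day's raw files from the source system (placeholder).
    print("extracting raw data")


def transform():
    # Clean and aggregate the extracted data (placeholder).
    print("transforming data")


def load():
    # Write the curated output to the warehouse (placeholder).
    print("loading into warehouse")


with DAG(
    dag_id="example_daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task
```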
4-7 Years | Gurgaon | Full-Time

Company Introduction
Who are we?
Falcon is a Series-A funded, cloud-native, AI-first banking technology and processing platform that helps banks, NBFCs, and PPIs quickly and affordably launch next-gen financial products such as credit cards, credit lines on UPI, prepaid cards, fixed deposits, and loans. Since our 2022 launch, we've processed USD 1 Bn+ in transactions, signed on 12 of India's top financial institutions, and clocked USD 15 Mn+ in revenue. Our company is backed by marquee investors from around the world, including heavyweight investors from Japan and the USA, as well as leading Indian ventures and banks. For more details, please visit https://falconfs.com/

Experience level: Intermediate (6-10 years)

Key Responsibilities
- Design, develop, and support scalable ETL processes using open-source tools and data frameworks such as AWS Glue, AWS Athena, Redshift, Apache Kafka, Apache Spark, Apache Airflow, and Pentaho Data Integration (PDI)
- Design, create, and maintain data lakes and data warehouses on the AWS cloud
- Maintain and optimise our data pipeline architecture, and formulate complex SQL queries for big data processing
- Collaborate with product and engineering teams to design and develop a platform for data modelling and machine learning operations
- Implement various data structures and algorithms to ensure we meet both functional and non-functional requirements
- Maintain data privacy and compliance according to industry standards
- Develop processes for monitoring and alerting on data quality issues (see the sketch following this posting)
- Continually evaluate new open-source technologies and stay updated on the latest data engineering trends

Key Qualifications
- Bachelor's or Master's degree in Computer Science or MCA from a reputed institute
- Minimum of 3 years of experience in a data engineering role
- Experience using Python, Java, or Scala for data processing (Python preferred)
- Demonstrably deep understanding of SQL and analytical data warehouses
- Solid experience with popular databases such as PostgreSQL, MySQL, and MongoDB
- Knowledge of AWS technologies such as Lambda, Athena, Glue, and Redshift
- Hands-on experience implementing ETL (or ELT) best practices at scale
- Hands-on experience with data pipeline tools (Airflow, Luigi, Azkaban, dbt)
- Experience with version control tools like Git
- Familiarity with Linux-based systems and cloud services, preferably in environments like AWS
- Strong analytical skills and the ability to work in an agile and collaborative team environment

Preferred Skills
- Certification in any open-source big data technology
- Expertise in open-source big data technologies such as Apache Hadoop, Apache Hive, and others
- Familiarity with data visualisation tools like Apache Superset, Grafana, Tableau, etc.
- Experience in CI/CD processes and containerization technologies like Docker or Kubernetes
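To complement the Airflow sketch above, here is a minimal example of the "monitoring and alerting on data quality issues" responsibility: running a few SQL checks against a PostgreSQL database and flagging any that fail. The connection string, table, and column names are hypothetical assumptions for illustration.

```python
# Minimal data-quality check sketch against PostgreSQL via psycopg2.
# DSN, table, and column names (transactions, card_id, amount) are hypothetical.
import psycopg2

CHECKS = {
    "null_card_ids": "SELECT COUNT(*) FROM transactions WHERE card_id IS NULL",
    "negative_amounts": "SELECT COUNT(*) FROM transactions WHERE amount < 0",
}


def run_checks(dsn: str) -> dict:
    """Run each check query and return the offending row count per check."""
    results = {}
    with psycopg2.connect(dsn) as conn:
        with conn.cursor() as cur:
            for name, query in CHECKS.items():
                cur.execute(query)
                results[name] = cur.fetchone()[0]
    return results


if __name__ == "__main__":
    failures = {k: v for k, v in run_checks("dbname=analytics user=etl").items() if v > 0}
    if failures:
        # In a real pipeline this would raise an alert (Slack, PagerDuty, email).
        print(f"data quality checks failed: {failures}")
```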