8977 Relational Jobs - Page 50

JobPe aggregates listings for easy access, but applications are completed directly on the original job portal.

6.0 years

10 - 24 Lacs

India

Remote

Job Title: Business Analyst - Capital Markets
Location: Remote
Years of Experience: 6+ years
Mandatory Qualifications: SAFe/Agile Certification

Job Description
We are looking for a highly driven Business Analyst with strong Capital Markets expertise to join our growing team. In this role, you'll play a key part in analyzing, documenting, and validating business and functional requirements, while supporting system integration and user acceptance testing. The ideal candidate has a solid background in financial services or capital markets, a sharp analytical mindset, and the ability to collaborate with stakeholders across business and technology functions.

Responsibilities
- Engage with business stakeholders to understand, document, and validate business needs.
- Build stakeholder alignment and visualize proposed solutions through wireframes, workflows, or prototypes.
- Develop detailed business cases, requirements, user stories, test plans, and operational processes.
- Create and maintain process flows, workflows, and use case documentation using tools like Visio.
- Lead or contribute to the design and execution of test plans and test cases during SIT and UAT phases.
- Identify whether business needs can be met with existing solutions or require new design/technology.
- Collaborate closely with development and QA teams to ensure requirements are correctly implemented.
- Support training material preparation and conduct stakeholder training where needed.
- Maintain clear traceability between business requirements and delivered functionality.
- Actively support project change management, issue tracking, and continuous improvement activities.
- Prepare professional documentation and presentations for stakeholders and leadership.

Requirements
- Bachelor's degree in Computer Science, Information Systems, Finance, or a related field.
- 6+ years of experience as a Business Analyst, ideally within a Capital Markets or large financial institution.
- Hands-on experience with: capital markets products, trade life cycle, and risk management concepts; developing functional specifications, test plans, and business cases; SDLC methodologies including Agile, Waterfall, or hybrid models; SQL (basic proficiency in SELECT queries for data analysis); MS Excel, PowerPoint, and Visio for documentation and analysis.
- Familiarity with relational database concepts and data modeling.
- Strong communication skills and the ability to translate complex business processes into technical requirements.
- Ability to work independently and manage priorities in a dynamic, fast-paced environment.

Preferred
- Experience in cloud transformation or large-scale technology modernization projects.
- Familiarity with tools such as JIRA, Confluence, or Azure DevOps.
- Exposure to data governance, data lineage, or ETL/data warehouse projects in a capital markets context.

Skills: ms excel,sql,tableau,capital markets expertise,communication skills,stakeholder engagement,microsoft power bi,capital markets,business analysis,safe certification,azure devops,business intelligence,ms powerpoint,safe/agile certification,functional requirements documentation,process flow creation,test plan development,visio
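The SQL expectation above is modest: SELECT-level queries for data analysis. As a hedged illustration only (not part of the posting), the sketch below runs the kind of aggregate query a BA might use, against a hypothetical trades table in an in-memory SQLite database:

```python
import sqlite3

# Hypothetical in-memory trade table; real analysis would query the firm's RDBMS.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (asset_class TEXT, notional REAL, trade_date TEXT)")
conn.executemany(
    "INSERT INTO trades VALUES (?, ?, ?)",
    [("Equity", 1_000_000, "2025-06-01"),
     ("Fixed Income", 2_500_000, "2025-06-01"),
     ("Equity", 750_000, "2025-06-02")],
)

# The kind of SELECT the posting alludes to: aggregate notional by asset class.
for row in conn.execute(
    "SELECT asset_class, SUM(notional) AS total_notional "
    "FROM trades GROUP BY asset_class ORDER BY total_notional DESC"
):
    print(row)
```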

Posted 6 days ago

Apply

3.0 years

0 Lacs

Gurugram, Haryana, India

On-site

We seek a highly experienced Frontend Developer with 3-4 years of expertise in React.js and proficiency in frontend-related technologies, including database integration with PostgreSQL. As a Frontend Developer, you will be responsible for creating engaging and responsive user interfaces while collaborating closely with backend developers and database engineers to deliver seamless web applications.

Responsibilities:
- Design and develop user-friendly, responsive, high-performance web applications using React.js.
- Collaborate with backend and database teams to integrate frontend components seamlessly with PostgreSQL or other databases.
- Work closely with UX/UI designers to translate design mockups and wireframes into pixel-perfect user interfaces.
- Implement state management solutions (e.g., Redux, MobX) to ensure efficient data flow and UI updates.
- Identify and address frontend performance bottlenecks, optimizing for speed and responsiveness.
- Communicate with backend APIs to fetch and display data, ensuring smooth user experiences.
- Maintain high code quality standards, follow best practices, and participate in code reviews.
- Ensure compatibility and consistent user experiences across various web browsers.
- Implement the Web Content Accessibility Guidelines (WCAG) to create inclusive web applications.
- Create and maintain technical documentation related to frontend components and integrations.

Qualifications:
- 3-4 years of professional experience as a Frontend Developer.
- Proficiency in React.js for building modern web applications.
- Strong knowledge of frontend technologies such as HTML5, CSS3, JavaScript (ES6+), and responsive design.
- Experience with database integration, especially with PostgreSQL or other relational databases.
- Familiarity with state management libraries like Redux or MobX.
- Proficiency in architecting solutions on low-code/no-code platforms, such as the ServiceNow platform.
- Understanding of API integration and asynchronous programming.
- Attention to detail and a passion for creating visually appealing and user-friendly interfaces.
- Problem-solving skills and the ability to work independently or in a team.
- Excellent communication and collaboration skills.

Posted 6 days ago

Apply

4.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Description: Specialist, Technical Product - HxGN EAM

Are you passionate about leveraging technology to drive innovation and efficiency? Join our company as a System Administrator for the HxGN Enterprise Asset Management solution and be at the forefront of implementing cutting-edge solutions that enhance our operational capabilities. We are looking for a dedicated professional who thrives in a collaborative environment and is eager to contribute to our mission of advancing health solutions.

Responsibilities
- Collaborate with the Product Manager, Technical Product Manager, and stakeholders to implement new projects, ensuring adherence to project objectives, timelines, and quality standards.
- Develop and execute comprehensive testing strategies, including functional, integration, and regression testing, to ensure quality and reliability.
- Provide ongoing product line support, maintenance, and troubleshooting to address issues and implement enhancements.
- Effectively communicate with business partners, vendors, and internal ServiceNow teams.
- Participate in the design and architecture of the systems, ensuring scalability, performance, and security.
- Create and maintain technical documentation, including system designs, requirements specifications, test plans, and user guides.
- Actively participate in team meetings, contribute to knowledge review, and share knowledge and best practices with other team members.
- Work with end users to collect the right data, at the right time, with sufficient context to facilitate analytics, diagnostics, and insights.
- Create and support sandbox environments for proofs of concept and advanced data analytics.
- Act as Subject Matter Expert (SME) for data exchange APIs between maintenance systems.
- Assist with the Digital System Development Life Cycle (SDLC) as Author, Executor, and Reviewer/Approver, and with Change Management.
- Assist with product releases by coordinating with AMS, vendors, and the Asset Reliability Management (ARM) Center of Excellence (CoE).
- Work with business and product owners on SDLC, procedures, and Work Instruction authoring and reviewing/approving.

Qualifications

Required
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 4 years' experience in computer science and/or information systems.
- Experience with Hexagon, Maximo, SAP, or similar industry applications.
- Data migration experience preferred: Extract, Transform, Load (ETL).
- Deep understanding of SDLC methodologies and project management frameworks (e.g., Scrum, MS Project, JIRA).
- Familiarity with database management systems (e.g., SQL) and web technologies (e.g., HTML, CSS, Python, JavaScript).
- SQL and/or RDBMS database experience required (Oracle, SQL Server, PostgreSQL, MySQL, etc.).
- Experience with OSIsoft/AVEVA PI and PI AF.
- Proficiency in SDLC deliverables, including requirements, design specs, and unit testing.
- Exceptional experience with Windows Server required.

Preferred
- GMP experience a definite plus.
- Certification in project management (e.g., PMP, Agile Certified Practitioner).
- Knowledge of cloud platforms (e.g., AWS, Azure) and DevOps practices.
- Understanding of data analytics and visualization tools.
- Programming experience in C#, Python, VB.NET, HTML5, and JavaScript strongly preferred.
- Advanced degree in Computer Science, Information Systems, or a related field.
- Knowledge of machine learning, artificial intelligence, or IoT technologies.
Current Employees apply HERE
Current Contingent Workers apply HERE

Search Firm Representatives, Please Read Carefully
Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs/resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails.

Employee Status: Regular
Relocation:
VISA Sponsorship:
Travel Requirements:
Flexible Work Arrangements: Hybrid
Shift:
Valid Driving License:
Hazardous Material(s):

Required Skills: Applied Engineering, Asset Management, Benefits Management, Cloud DevOps, Computer Science, Database Management Systems (DBMS), DevOps, DevOps Architecture, Digital Development, Digital Project Management, Management Process, Management System Development, Oracle Database, Product Management, Quality Standards, Relational Database Management System (RDBMS), Requirements Management, Software Product Management, Stakeholder Relationship Management, Strategic Planning, System Designs, Systems Development Lifecycle (SDLC), Technical Writing Documentation

Preferred Skills:

Job Posting End Date: 08/31/2025
A job posting is effective until 11:59:59 PM on the day BEFORE the listed job posting end date. Please ensure you apply to a job posting no later than the day BEFORE the job posting end date.

Requisition ID: R353056

Posted 6 days ago

Apply

7.0 years

0 Lacs

India

On-site

Job Description
We are looking for a highly motivated Azure Full Stack Developer to design, develop, and deploy scalable web applications on Microsoft Azure. This role spans front-end, back-end, cloud, and database development, offering the opportunity to work with cutting-edge technologies in a fast-paced environment.

Key Responsibilities
- Develop full-stack applications using React, Angular, Vue.js, .NET Core, Node.js, or Python.
- Design responsive UIs and build/consume RESTful APIs and microservices.
- Deploy cloud-native apps using Azure services (App Service, Functions, SQL, Cosmos DB, AKS).
- Use ARM/Terraform for Infrastructure as Code (IaC); manage CI/CD pipelines with Azure DevOps.
- Build serverless solutions and manage Azure Active Directory (AD) for authentication.
- Design and optimize relational and NoSQL databases.
- Follow clean coding practices, write unit/integration tests, and work in Agile sprints.
- Collaborate with cross-functional teams and contribute to architectural decisions.
- Stay current with Azure and full-stack technology trends.

Required Skills & Experience
- Bachelor's degree in Computer Science or a related field.
- 7+ years of hands-on full-stack development experience.
- Proficiency in JavaScript, React/Angular/Vue, and .NET Core/Node.js/Python.
- In-depth knowledge of the Microsoft Azure cloud ecosystem.
- Experience in REST APIs, microservices, and CI/CD (Azure DevOps).
- Proficiency in Azure SQL, Cosmos DB, and IaC tools (ARM/Terraform).
- Familiarity with Azure AD, serverless computing, and DevOps practices.
- Strong communication, analytical thinking, and team collaboration skills.

Preferred Qualifications
- Microsoft Azure certifications (Developer Associate, Solutions Architect, etc.).
- Experience with Docker and Kubernetes.
- Familiarity with Azure Monitor and cloud security best practices.
- Exposure to serverless patterns and enterprise app modernization.

Skills: app service,javascript,aks,python,azure services,azure devops,kubernetes,functions,.net core,sql,.net,arm,restful apis,vue.js,azure active directory,azure,azure monitor,reactjs,nosql databases,react,angular,docker,cosmos db,ci/cd,terraform,microservices,relational databases,node.js
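To give a concrete flavor of the "build serverless solutions" line above, here is a minimal sketch of an HTTP-triggered Azure Function using the Python v2 programming model (Python is one of the stacks the posting lists). The route name and greeting logic are invented for illustration, and the code assumes the azure-functions package and a Functions host to run under:

```python
import azure.functions as func

# Function app using the Azure Functions Python v2 decorator model.
app = func.FunctionApp(http_auth_level=func.AuthLevel.ANONYMOUS)

@app.route(route="hello")  # exposed at /api/hello by the Functions host
def hello(req: func.HttpRequest) -> func.HttpResponse:
    # Read an optional query-string parameter and return a plain-text response.
    name = req.params.get("name", "world")
    return func.HttpResponse(f"Hello, {name}!", status_code=200)
```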

Posted 6 days ago

Apply

5.0 - 8.0 years

0 Lacs

Bangalore Urban, Karnataka, India

On-site

Job Description

Skill Set (Hands-on Coding)
- Proficiency in Python/Django and Django REST Framework.
- Proficiency in relational databases - any RDBMS (Oracle, HANA, MySQL, PostgreSQL).
- Proficiency in UI technologies - React, Backbone, Angular, HTML, CSS.
- Proficiency in Git, Jenkins, CI/CD pipelines, Docker, Kibana.
- Proficiency with Kubernetes.
- Proficiency in test automation.
- Proficiency in data engineering (SQL, NoSQL, Big Data, Kafka, Redis), data governance, and data privacy and security.
- Experience building AI applications, preferably Generative AI and LLM-based apps.
- Broad understanding of AI agents, agentic orchestration, and multi-agent workflow automation, along with hands-on experience in agent-builder frameworks such as LangChain and LangGraph.
- Experience with Generative AI development, embeddings, and fine-tuning of Generative AI models.

Role
- Should independently deliver scoped features and be responsible for top-notch end-to-end quality of deliverables through unit tests/automated tests.
- Closely collaborate with product managers, leads, and architects on the definition of new features and review of code and deliverables.
- Participate in and perform code reviews and regular Agile ceremonies like Scrum.
- Exposure to the Agile process, Scrum, and TDD/BDD desired.
- Must also be able to contribute to adherence to non-functional requirements of features (performance/security) and write automated tests.
- Should be a quick learner and keen on learning new technologies.
- Should be a fantastic team player and collaborator with good communication skills.

Role Requirements
- B.E./B.Tech/M.Tech/MCA without backlogs.
- 5-8 years of relevant hands-on experience in Python development.
- Exposure to multi-tenant SaaS web applications.
- Attention to detail.
- Exceptional verbal and written communication skills.
- Exposure to the Agile process and Scrum desired.
- Should be a quick learner, and a good team player and collaborator.
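Given how much this role stresses unit tests and automated tests, a minimal pytest sketch of that discipline follows; the discount function and its rules are hypothetical, chosen only to show the test pattern:

```python
import pytest

def apply_discount(price: float, percent: float) -> float:
    """Return price reduced by percent; reject out-of-range inputs."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

def test_apply_discount_happy_path():
    assert apply_discount(200.0, 25) == 150.0

def test_apply_discount_zero_percent_is_identity():
    assert apply_discount(99.99, 0) == 99.99

def test_apply_discount_rejects_invalid_percent():
    # Error paths are part of the contract and get their own test.
    with pytest.raises(ValueError):
        apply_discount(100.0, 120)
```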

Posted 6 days ago

Apply

5.0 years

0 Lacs

India

Remote

Job Title: Senior Power BI Developer
Location: US, remote from India
Department: IT / Data & Analytics
Reports To: Director of Data & Analytics
Employment Type: 12-month contract
Pay: $35-45 per hour, payrolled or self-employed

About the Role
We are seeking an experienced Senior Power BI Developer to lead the design and development of interactive dashboards, semantic data models, and self-service analytics tools as part of a new internal reporting platform. This role will work closely with stakeholders from production, quality, finance, and supply chain to turn raw data into strategic insights. You will be instrumental in shaping the data visualization layer of our platform, applying best practices in BI development, data storytelling, and performance optimization.

Key Responsibilities

Power BI Development & Reporting
- Design, develop, and deploy Power BI dashboards, reports, and paginated reports tailored to manufacturing KPIs and operational needs.
- Build and maintain Power BI datasets, dataflows, and semantic models that support reuse and scalability.
- Create intuitive and visually compelling reports using effective data storytelling principles.

Data Modeling & Optimization
- Collaborate with data engineers and DBAs to define and refine star/snowflake schemas for reporting.
- Develop optimized DAX measures and calculations for business logic implementation.
- Tune report and dataset performance for fast rendering and a smooth user experience.

Stakeholder Collaboration
- Gather requirements from stakeholders across departments (operations, quality, finance, supply chain) and translate them into reporting solutions.
- Lead discovery sessions and provide data visualization recommendations based on business goals.

Platform Enablement & Governance
- Implement and enforce Power BI development standards, naming conventions, and workspace organization.
- Contribute to Power BI governance, including dataset certification, version control, and access/security practices.
- Provide training, documentation, and support to business users for self-service reporting.

Required Qualifications
- Bachelor's degree in Computer Science, Information Systems, Data Analytics, or a related field.
- 5+ years of experience in Power BI development, with a strong portfolio of reports/dashboards.
- Expertise in DAX, Power Query (M), and data modeling best practices.
- Experience working with SQL Server, Azure SQL, or other relational databases.
- Strong understanding of ETL processes, data warehousing, and manufacturing data systems (ERP, MES, SCADA, etc.).
- Demonstrated ability to gather requirements and deliver user-focused reporting solutions.

Preferred Skills
- Experience working in or supporting manufacturing or industrial environments.
- Familiarity with Power BI Service administration, deployment pipelines, and Power Platform tools (Power Automate, Power Apps).
- Understanding of data governance, data security, and row-level security (RLS) in Power BI.
- Exposure to Azure Data Factory, Azure Synapse, or Databricks is a plus.

Soft Skills
- Strong communication and collaboration skills: able to engage both technical and non-technical stakeholders.
- Analytical thinking and a problem-solving mindset with attention to detail.
- Comfortable leading projects and mentoring junior developers or analysts.
- Passion for continuous improvement and modern BI technologies.
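To make the star-schema and DAX-measure work above concrete, the pandas sketch below joins a hypothetical fact table to a product dimension and computes a defect-rate KPI the way a DAX measure would aggregate it. The real role would express this in DAX over a Power BI model; all table and column names here are invented:

```python
import pandas as pd

# Hypothetical star schema: one fact table keyed to a product dimension.
fact_production = pd.DataFrame({
    "product_id": [1, 1, 2, 2, 3],
    "units_produced": [120, 80, 200, 150, 90],
    "units_defective": [3, 2, 10, 6, 1],
})
dim_product = pd.DataFrame({
    "product_id": [1, 2, 3],
    "product_line": ["Widgets", "Gears", "Widgets"],
})

# Rough equivalent of a DAX measure like
#   Defect Rate = DIVIDE(SUM(units_defective), SUM(units_produced))
# evaluated per product line.
kpi = (
    fact_production.merge(dim_product, on="product_id")
    .groupby("product_line")[["units_produced", "units_defective"]]
    .sum()
    .assign(defect_rate=lambda d: d.units_defective / d.units_produced)
)
print(kpi)
```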

Posted 6 days ago

Apply

0.0 - 2.0 years

0 - 0 Lacs

Calicut, Kerala

On-site

Job Title: PHP Full-Stack Developer

We are seeking a talented PHP Full-Stack Developer to design, build, and maintain high-performance web applications. You'll work with modern PHP frameworks such as Yii2, Laravel, or CodeIgniter, and apply object-oriented programming (OOP) best practices to write clean, reusable code. As part of our dynamic team, you'll collaborate with cross-functional teams to ensure our backend systems are secure, responsive, and maintainable.

Key Responsibilities:
1. Develop & Maintain: Build, test, and maintain scalable PHP applications using modern frameworks (Yii2, Laravel, Symfony, CodeIgniter, etc.). Write clean, secure, and maintainable code following industry best practices.
2. Cross-Functional Collaboration: Collaborate with designers, product managers, and QA teams to define, design, and deploy new features. Ensure that new features are responsive, efficient, and well-documented.
3. Production Support: Troubleshoot, debug, and resolve production issues promptly. Implement performance enhancements and contribute to the overall reliability of the system.
4. Quality Code: Participate in code reviews and design discussions. Follow coding standards to ensure the code is well-organized, test-driven, and maintainable.
5. Documentation: Create and maintain clear software documentation for applications and systems.
6. Database Design: Design and implement database schemas and migrations, and optimize SQL queries for performance.
7. API & Integrations: Integrate RESTful APIs and third-party services (e.g., payment gateways, external data sources).

Required Experience, Skills & Qualifications:
1. Core PHP & OOP: Proficient in PHP, object-oriented programming, and Composer dependency management.
2. ORM & Databases: Hands-on experience with ORM libraries (Eloquent for Laravel, Active Record for Yii2) and relational databases (MySQL, PostgreSQL).
3. Front-End Fundamentals: Solid knowledge of HTML5, CSS3, and JavaScript. Familiarity with responsive design and ensuring cross-browser compatibility.
4. Architecture & Templating: Deep understanding of the MVC architecture and templating engines (Blade, Twig, or Yii2's native views).
5. Version Control: Experience using Git workflows, including branching, pull requests, and merges.
6. API Design: Skilled in RESTful API design, consumption, and third-party integrations.
7. Testing & TDD: Experience with unit testing frameworks (PHPUnit, PestPHP, Codeception) and test-driven development (TDD).
8. Security Best Practices: Strong knowledge of web security practices (e.g., SQL injection, XSS, and CSRF prevention).

Nice to Have:
- CI/CD pipelines
- Docker/containerization
- Basic server administration knowledge
- Experience with cloud platforms

Soft Skills:
- Excellent problem-solving, communication, and teamwork abilities.
- Comfortable working in agile environments with fast-paced, iterative development.

Education:
Bachelor's degree in Computer Science, Software Engineering, or a related field (or equivalent practical experience).

How to Apply:
Please submit your resume, a brief cover letter highlighting relevant projects (especially those using Yii2 or Laravel), and links to any code samples or GitHub repositories. We look forward to building great software with you!

Job Type: Full-time
Pay: ₹40,000.00 - ₹60,000.00 per month

Ability to commute/relocate:
- Kozhikode, Kerala: Reliably commute or plan to relocate before starting work (Required)

Application Question(s):
- Are you able to independently create and manage a PHP project from start to finish?

Education: Bachelor's (Required)
Experience:
- PHP: 3 years (Required)
- Laravel / Yii2: 2 years (Required)
- SQL / MySQL / PostgreSQL: 2 years (Required)
Language: English & Malayalam (Required)
Location: Kozhikode, Kerala (Required)

Posted 6 days ago

Apply

5.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

WE ARE HIRING FOR THE ROLE OF Business Analyst with Capital Markets for the Gurgaon location.

Skill: Business Analyst with Capital Markets Risk Domain (Financial Products: Bonds / Equity / Fixed Income / Derivatives), SQL
Experience: 5 years
Location: Gurgaon (immediate joiners)
Budget: 25 LPA

Experience Required
- Experience in Capital Markets; knowledge of financial products (Bonds / Equity / Fixed Income / Derivatives), features of various asset classes, risk sensitivity and Greeks, and the risk domain.
- Familiarity with relational databases and hands-on experience with SQL.
- Experience as a Business Analyst; must have worked on specifications/user stories.
- Good documentation skills, gap analysis skills, and knowledge of the Agile framework.
- Solid experience in FRD and BRD.
- Global certifications like CFA, FRM, or CQF will be a plus. BA certifications are desirable but not mandatory.

Interview Procedure: Virtual interview.

Refer your friends - help us catch a rising star! To know more about job openings, please visit the Converse Job Portal.

All the best,
Converse Hiring Team

Key Skills: Business Analysis, Finance & Accounts
Functional Area: Finance / Accounts / Tax

Posted 6 days ago

Apply

3.0 years

12 - 20 Lacs

Greater Kolkata Area

On-site

Industry: Enterprise software quality assurance and digital product development. We empower global clients in banking, fintech, and e-commerce to ship robust, secure, and high-performance APIs through rigorous test automation and continuous validation.

Role & Responsibilities
- Design and execute automated functional, contract, and regression tests for REST and SOAP APIs.
- Develop test suites in Postman/Newman and REST Assured, integrating them with Jenkins pipelines.
- Create data-driven test cases using JSON/XML payloads and validate responses against Swagger/OpenAPI specs.
- Monitor API performance, capture logs, and raise detailed defects with reproducible steps and traces.
- Collaborate with developers and DevOps to triage issues, advise on root cause, and harden CI/CD gates.
- Document test strategy, coverage metrics, and best practices; mentor junior testers on automation standards.

Skills & Qualifications

Must-Have
- 3+ years focused on API testing in Agile/Scrum teams.
- Hands-on with Postman/Newman, REST Assured, or similar frameworks.
- Strong knowledge of HTTP, JSON, XML, OAuth, and microservices architecture.
- Proficiency in Java or JavaScript for scripting automated tests.
- Experience querying relational databases (SQL) for test data and verification.
- Working understanding of Jenkins/Git and defect-tracking tools like JIRA.

Preferred
- Exposure to performance testing using JMeter or Gatling.
- Knowledge of contract testing tools such as Pact.
- Experience testing GraphQL or gRPC services.

Benefits & Culture Highlights
- Engineer-driven culture valuing automation, quality, and continuous learning.
- Access to paid certifications and a dedicated upskilling budget.
- Collaborative on-site environment with modern tooling and zero-red-tape decision making.

Location: On-site role based in India.

Skills: http,test automation,gatling,jenkins,contract testing,restassured,swagger,jira,pact,rest assured,newman,javascript,soapui,jmeter,grpc,java,xml,oauth,selenium,git,postman,api testing,json,performance testing,sql,graphql,microservices architecture,bug tracking
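The posting scripts its tests in Java (REST Assured) or JavaScript; purely to illustrate the functional-test pattern it describes (call an endpoint, assert status, contract fields, and latency), here is a hedged sketch in Python using requests and pytest, with a placeholder base URL standing in for the real service:

```python
import pytest
import requests

BASE_URL = "https://api.example.com"  # placeholder; the real service URL is project-specific

def test_get_user_returns_expected_contract():
    resp = requests.get(f"{BASE_URL}/users/42", timeout=5)

    # Status and content-type checks, as in a typical functional API test.
    assert resp.status_code == 200
    assert resp.headers["Content-Type"].startswith("application/json")

    # Field-level assertions against the (assumed) response contract.
    body = resp.json()
    assert body["id"] == 42
    assert isinstance(body["email"], str)

    # Crude latency gate; real performance work would use JMeter/Gatling.
    assert resp.elapsed.total_seconds() < 1.0
```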

Posted 6 days ago

Apply

3.0 years

12 - 20 Lacs

Bhubaneswar, Odisha, India

On-site

Industry: Enterprise software quality assurance and digital product development. We empower global clients in banking, fintech, and e-commerce to ship robust, secure, and high-performance APIs through rigorous test automation and continuous validation.

Role & Responsibilities
- Design and execute automated functional, contract, and regression tests for REST and SOAP APIs.
- Develop test suites in Postman/Newman and REST Assured, integrating them with Jenkins pipelines.
- Create data-driven test cases using JSON/XML payloads and validate responses against Swagger/OpenAPI specs.
- Monitor API performance, capture logs, and raise detailed defects with reproducible steps and traces.
- Collaborate with developers and DevOps to triage issues, advise on root cause, and harden CI/CD gates.
- Document test strategy, coverage metrics, and best practices; mentor junior testers on automation standards.

Skills & Qualifications

Must-Have
- 3+ years focused on API testing in Agile/Scrum teams.
- Hands-on with Postman/Newman, REST Assured, or similar frameworks.
- Strong knowledge of HTTP, JSON, XML, OAuth, and microservices architecture.
- Proficiency in Java or JavaScript for scripting automated tests.
- Experience querying relational databases (SQL) for test data and verification.
- Working understanding of Jenkins/Git and defect-tracking tools like JIRA.

Preferred
- Exposure to performance testing using JMeter or Gatling.
- Knowledge of contract testing tools such as Pact.
- Experience testing GraphQL or gRPC services.

Benefits & Culture Highlights
- Engineer-driven culture valuing automation, quality, and continuous learning.
- Access to paid certifications and a dedicated upskilling budget.
- Collaborative on-site environment with modern tooling and zero-red-tape decision making.

Location: On-site role based in India.

Skills: http,test automation,gatling,jenkins,contract testing,restassured,swagger,jira,pact,rest assured,newman,javascript,soapui,jmeter,grpc,java,xml,oauth,selenium,git,postman,api testing,json,performance testing,sql,graphql,microservices architecture,bug tracking

Posted 6 days ago

Apply

6.0 years

10 - 20 Lacs

Hyderabad, Telangana, India

On-site

Senior Data Engineer (On-Site, India)

About The Opportunity
A high-growth innovator in the Analytics & Enterprise Data Management sector, we architect and deliver cloud-native data platforms that power real-time reporting, AI/ML workloads, and intelligent decisioning for global retail, fintech, and manufacturing leaders. Our engineering teams transform raw, high-velocity data into trusted, analytics-ready assets that drive revenue acceleration and operational excellence.

Role & Responsibilities
- Design, build, and optimise batch and streaming ETL/ELT pipelines on Apache Spark and Kafka, ensuring sub-minute latency and 99.9% uptime.
- Develop modular, test-driven Python code to ingest, cleanse, and enrich terabyte-scale datasets from relational, NoSQL, and API sources.
- Model data for analytics and AI, implementing star/snowflake schemas, partitioning, and clustering in BigQuery, Redshift, or Snowflake.
- Automate workflow orchestration with Apache Airflow, defining DAGs, dependency management, and robust alerting for SLA adherence.
- Collaborate with Data Scientists and BI teams to expose feature stores, curated marts, and self-service semantic layers.
- Enforce data-governance best practices (lineage, cataloguing, RBAC, encryption) in compliance with GDPR and SOC 2 standards.

Skills & Qualifications

Must-Have
- 3-6 years of hands-on experience engineering large-scale data pipelines in production.
- Expertise in Python and advanced SQL for ETL, optimisation, and performance tuning.
- Proven experience with Spark (PySpark or Scala) and streaming technologies such as Kafka or Kinesis.
- Deep knowledge of relational modelling, data-warehousing concepts, and at least one cloud DWH (BigQuery, Redshift, or Snowflake).
- Solid command of CI/CD, Git workflows, and containerisation (Docker).

Preferred
- Exposure to infrastructure-as-code (Terraform, CloudFormation) and Kubernetes.
- Experience integrating ML feature stores and monitoring data quality with Great Expectations or similar tools.
- Certification in AWS, GCP, or Azure data services.

Benefits & Culture Highlights
- On-site, engineer-first environment with dedicated lab space and the latest Mac/Linux gear.
- Rapid career progression through technical mentorship, sponsored certifications, and conference budgets.
- Inclusive, innovation-driven culture that rewards outcome ownership and creative problem-solving.

Ready to architect next-gen data pipelines that power AI at scale? Apply now and join a mission-focused team turning data into competitive advantage.

Skills: airflow,docker,snowflake,apache spark,data engineering,redshift,sql,pyspark,python,data modeling,bigquery,ci/cd,apache airflow,data warehousing,spark,etl,kafka,git
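As a minimal illustration of the Airflow orchestration described above (DAG definition, task dependencies, retries), consider the sketch below; the DAG id and the extract/load callables are invented stand-ins, and a production pipeline would trigger Spark/Kafka jobs and warehouse loads rather than print:

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw events from source systems")  # stand-in for a Spark/Kafka job

def load():
    print("write curated tables to the warehouse")  # stand-in for a BigQuery/Snowflake load

with DAG(
    dag_id="daily_sales_pipeline",           # hypothetical pipeline name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={
        "retries": 2,                        # basic retry hygiene for SLA adherence
        "retry_delay": timedelta(minutes=5),
    },
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task                # dependency: extract runs before load
```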

Posted 6 days ago

Apply

6.0 years

10 - 20 Lacs

Kochi, Kerala, India

On-site

Senior Data Engineer (On-Site, India)

About The Opportunity
A high-growth innovator in the Analytics & Enterprise Data Management sector, we architect and deliver cloud-native data platforms that power real-time reporting, AI/ML workloads, and intelligent decisioning for global retail, fintech, and manufacturing leaders. Our engineering teams transform raw, high-velocity data into trusted, analytics-ready assets that drive revenue acceleration and operational excellence.

Role & Responsibilities
- Design, build, and optimise batch and streaming ETL/ELT pipelines on Apache Spark and Kafka, ensuring sub-minute latency and 99.9% uptime.
- Develop modular, test-driven Python code to ingest, cleanse, and enrich terabyte-scale datasets from relational, NoSQL, and API sources.
- Model data for analytics and AI, implementing star/snowflake schemas, partitioning, and clustering in BigQuery, Redshift, or Snowflake.
- Automate workflow orchestration with Apache Airflow, defining DAGs, dependency management, and robust alerting for SLA adherence.
- Collaborate with Data Scientists and BI teams to expose feature stores, curated marts, and self-service semantic layers.
- Enforce data-governance best practices (lineage, cataloguing, RBAC, encryption) in compliance with GDPR and SOC 2 standards.

Skills & Qualifications

Must-Have
- 3-6 years of hands-on experience engineering large-scale data pipelines in production.
- Expertise in Python and advanced SQL for ETL, optimisation, and performance tuning.
- Proven experience with Spark (PySpark or Scala) and streaming technologies such as Kafka or Kinesis.
- Deep knowledge of relational modelling, data-warehousing concepts, and at least one cloud DWH (BigQuery, Redshift, or Snowflake).
- Solid command of CI/CD, Git workflows, and containerisation (Docker).

Preferred
- Exposure to infrastructure-as-code (Terraform, CloudFormation) and Kubernetes.
- Experience integrating ML feature stores and monitoring data quality with Great Expectations or similar tools.
- Certification in AWS, GCP, or Azure data services.

Benefits & Culture Highlights
- On-site, engineer-first environment with dedicated lab space and the latest Mac/Linux gear.
- Rapid career progression through technical mentorship, sponsored certifications, and conference budgets.
- Inclusive, innovation-driven culture that rewards outcome ownership and creative problem-solving.

Ready to architect next-gen data pipelines that power AI at scale? Apply now and join a mission-focused team turning data into competitive advantage.

Skills: airflow,docker,snowflake,apache spark,data engineering,redshift,sql,pyspark,python,data modeling,bigquery,ci/cd,apache airflow,data warehousing,spark,etl,kafka,git

Posted 6 days ago

Apply

6.0 years

10 - 20 Lacs

Pune, Maharashtra, India

On-site

Senior Data Engineer (On-Site, India)

About The Opportunity
A high-growth innovator in the Analytics & Enterprise Data Management sector, we architect and deliver cloud-native data platforms that power real-time reporting, AI/ML workloads, and intelligent decisioning for global retail, fintech, and manufacturing leaders. Our engineering teams transform raw, high-velocity data into trusted, analytics-ready assets that drive revenue acceleration and operational excellence.

Role & Responsibilities
- Design, build, and optimise batch and streaming ETL/ELT pipelines on Apache Spark and Kafka, ensuring sub-minute latency and 99.9% uptime.
- Develop modular, test-driven Python code to ingest, cleanse, and enrich terabyte-scale datasets from relational, NoSQL, and API sources.
- Model data for analytics and AI, implementing star/snowflake schemas, partitioning, and clustering in BigQuery, Redshift, or Snowflake.
- Automate workflow orchestration with Apache Airflow, defining DAGs, dependency management, and robust alerting for SLA adherence.
- Collaborate with Data Scientists and BI teams to expose feature stores, curated marts, and self-service semantic layers.
- Enforce data-governance best practices (lineage, cataloguing, RBAC, encryption) in compliance with GDPR and SOC 2 standards.

Skills & Qualifications

Must-Have
- 3-6 years of hands-on experience engineering large-scale data pipelines in production.
- Expertise in Python and advanced SQL for ETL, optimisation, and performance tuning.
- Proven experience with Spark (PySpark or Scala) and streaming technologies such as Kafka or Kinesis.
- Deep knowledge of relational modelling, data-warehousing concepts, and at least one cloud DWH (BigQuery, Redshift, or Snowflake).
- Solid command of CI/CD, Git workflows, and containerisation (Docker).

Preferred
- Exposure to infrastructure-as-code (Terraform, CloudFormation) and Kubernetes.
- Experience integrating ML feature stores and monitoring data quality with Great Expectations or similar tools.
- Certification in AWS, GCP, or Azure data services.

Benefits & Culture Highlights
- On-site, engineer-first environment with dedicated lab space and the latest Mac/Linux gear.
- Rapid career progression through technical mentorship, sponsored certifications, and conference budgets.
- Inclusive, innovation-driven culture that rewards outcome ownership and creative problem-solving.

Ready to architect next-gen data pipelines that power AI at scale? Apply now and join a mission-focused team turning data into competitive advantage.

Skills: airflow,docker,snowflake,apache spark,data engineering,redshift,sql,pyspark,python,data modeling,bigquery,ci/cd,apache airflow,data warehousing,spark,etl,kafka,git

Posted 6 days ago

Apply

3.0 years

12 - 20 Lacs

Pune, Maharashtra, India

On-site

Industry: Enterprise software quality assurance and digital product development. We empower global clients in banking, fintech, and e-commerce to ship robust, secure, and high-performance APIs through rigorous test automation and continuous validation.

Role & Responsibilities
- Design and execute automated functional, contract, and regression tests for REST and SOAP APIs.
- Develop test suites in Postman/Newman and REST Assured, integrating them with Jenkins pipelines.
- Create data-driven test cases using JSON/XML payloads and validate responses against Swagger/OpenAPI specs.
- Monitor API performance, capture logs, and raise detailed defects with reproducible steps and traces.
- Collaborate with developers and DevOps to triage issues, advise on root cause, and harden CI/CD gates.
- Document test strategy, coverage metrics, and best practices; mentor junior testers on automation standards.

Skills & Qualifications

Must-Have
- 3+ years focused on API testing in Agile/Scrum teams.
- Hands-on with Postman/Newman, REST Assured, or similar frameworks.
- Strong knowledge of HTTP, JSON, XML, OAuth, and microservices architecture.
- Proficiency in Java or JavaScript for scripting automated tests.
- Experience querying relational databases (SQL) for test data and verification.
- Working understanding of Jenkins/Git and defect-tracking tools like JIRA.

Preferred
- Exposure to performance testing using JMeter or Gatling.
- Knowledge of contract testing tools such as Pact.
- Experience testing GraphQL or gRPC services.

Benefits & Culture Highlights
- Engineer-driven culture valuing automation, quality, and continuous learning.
- Access to paid certifications and a dedicated upskilling budget.
- Collaborative on-site environment with modern tooling and zero-red-tape decision making.

Location: On-site role based in India.

Skills: http,test automation,gatling,jenkins,contract testing,restassured,swagger,jira,pact,rest assured,newman,javascript,soapui,jmeter,grpc,java,xml,oauth,selenium,git,postman,api testing,json,performance testing,sql,graphql,microservices architecture,bug tracking

Posted 6 days ago

Apply

6.0 years

10 - 20 Lacs

Mumbai Metropolitan Region

On-site

Senior Data Engineer (On-Site, India)

About The Opportunity
A high-growth innovator in the Analytics & Enterprise Data Management sector, we architect and deliver cloud-native data platforms that power real-time reporting, AI/ML workloads, and intelligent decisioning for global retail, fintech, and manufacturing leaders. Our engineering teams transform raw, high-velocity data into trusted, analytics-ready assets that drive revenue acceleration and operational excellence.

Role & Responsibilities
- Design, build, and optimise batch and streaming ETL/ELT pipelines on Apache Spark and Kafka, ensuring sub-minute latency and 99.9% uptime.
- Develop modular, test-driven Python code to ingest, cleanse, and enrich terabyte-scale datasets from relational, NoSQL, and API sources.
- Model data for analytics and AI, implementing star/snowflake schemas, partitioning, and clustering in BigQuery, Redshift, or Snowflake.
- Automate workflow orchestration with Apache Airflow, defining DAGs, dependency management, and robust alerting for SLA adherence.
- Collaborate with Data Scientists and BI teams to expose feature stores, curated marts, and self-service semantic layers.
- Enforce data-governance best practices (lineage, cataloguing, RBAC, encryption) in compliance with GDPR and SOC 2 standards.

Skills & Qualifications

Must-Have
- 3-6 years of hands-on experience engineering large-scale data pipelines in production.
- Expertise in Python and advanced SQL for ETL, optimisation, and performance tuning.
- Proven experience with Spark (PySpark or Scala) and streaming technologies such as Kafka or Kinesis.
- Deep knowledge of relational modelling, data-warehousing concepts, and at least one cloud DWH (BigQuery, Redshift, or Snowflake).
- Solid command of CI/CD, Git workflows, and containerisation (Docker).

Preferred
- Exposure to infrastructure-as-code (Terraform, CloudFormation) and Kubernetes.
- Experience integrating ML feature stores and monitoring data quality with Great Expectations or similar tools.
- Certification in AWS, GCP, or Azure data services.

Benefits & Culture Highlights
- On-site, engineer-first environment with dedicated lab space and the latest Mac/Linux gear.
- Rapid career progression through technical mentorship, sponsored certifications, and conference budgets.
- Inclusive, innovation-driven culture that rewards outcome ownership and creative problem-solving.

Ready to architect next-gen data pipelines that power AI at scale? Apply now and join a mission-focused team turning data into competitive advantage.

Skills: airflow,docker,snowflake,apache spark,data engineering,redshift,sql,pyspark,python,data modeling,bigquery,ci/cd,apache airflow,data warehousing,spark,etl,kafka,git

Posted 6 days ago

Apply

3.0 years

12 - 20 Lacs

Mumbai Metropolitan Region

On-site

Industry: Enterprise software quality assurance and digital product development. We empower global clients in banking, fintech, and e-commerce to ship robust, secure, and high-performance APIs through rigorous test automation and continuous validation.

Role & Responsibilities
- Design and execute automated functional, contract, and regression tests for REST and SOAP APIs.
- Develop test suites in Postman/Newman and REST Assured, integrating them with Jenkins pipelines.
- Create data-driven test cases using JSON/XML payloads and validate responses against Swagger/OpenAPI specs.
- Monitor API performance, capture logs, and raise detailed defects with reproducible steps and traces.
- Collaborate with developers and DevOps to triage issues, advise on root cause, and harden CI/CD gates.
- Document test strategy, coverage metrics, and best practices; mentor junior testers on automation standards.

Skills & Qualifications

Must-Have
- 3+ years focused on API testing in Agile/Scrum teams.
- Hands-on with Postman/Newman, REST Assured, or similar frameworks.
- Strong knowledge of HTTP, JSON, XML, OAuth, and microservices architecture.
- Proficiency in Java or JavaScript for scripting automated tests.
- Experience querying relational databases (SQL) for test data and verification.
- Working understanding of Jenkins/Git and defect-tracking tools like JIRA.

Preferred
- Exposure to performance testing using JMeter or Gatling.
- Knowledge of contract testing tools such as Pact.
- Experience testing GraphQL or gRPC services.

Benefits & Culture Highlights
- Engineer-driven culture valuing automation, quality, and continuous learning.
- Access to paid certifications and a dedicated upskilling budget.
- Collaborative on-site environment with modern tooling and zero-red-tape decision making.

Location: On-site role based in India.

Skills: http,test automation,gatling,jenkins,contract testing,restassured,swagger,jira,pact,rest assured,newman,javascript,soapui,jmeter,grpc,java,xml,oauth,selenium,git,postman,api testing,json,performance testing,sql,graphql,microservices architecture,bug tracking

Posted 6 days ago

Apply

6.0 years

10 - 20 Lacs

Greater Kolkata Area

On-site

Senior Data Engineer (On-Site, India)

About The Opportunity
A high-growth innovator in the Analytics & Enterprise Data Management sector, we architect and deliver cloud-native data platforms that power real-time reporting, AI/ML workloads, and intelligent decisioning for global retail, fintech, and manufacturing leaders. Our engineering teams transform raw, high-velocity data into trusted, analytics-ready assets that drive revenue acceleration and operational excellence.

Role & Responsibilities
- Design, build, and optimise batch and streaming ETL/ELT pipelines on Apache Spark and Kafka, ensuring sub-minute latency and 99.9% uptime.
- Develop modular, test-driven Python code to ingest, cleanse, and enrich terabyte-scale datasets from relational, NoSQL, and API sources.
- Model data for analytics and AI, implementing star/snowflake schemas, partitioning, and clustering in BigQuery, Redshift, or Snowflake.
- Automate workflow orchestration with Apache Airflow, defining DAGs, dependency management, and robust alerting for SLA adherence.
- Collaborate with Data Scientists and BI teams to expose feature stores, curated marts, and self-service semantic layers.
- Enforce data-governance best practices (lineage, cataloguing, RBAC, encryption) in compliance with GDPR and SOC 2 standards.

Skills & Qualifications

Must-Have
- 3-6 years of hands-on experience engineering large-scale data pipelines in production.
- Expertise in Python and advanced SQL for ETL, optimisation, and performance tuning.
- Proven experience with Spark (PySpark or Scala) and streaming technologies such as Kafka or Kinesis.
- Deep knowledge of relational modelling, data-warehousing concepts, and at least one cloud DWH (BigQuery, Redshift, or Snowflake).
- Solid command of CI/CD, Git workflows, and containerisation (Docker).

Preferred
- Exposure to infrastructure-as-code (Terraform, CloudFormation) and Kubernetes.
- Experience integrating ML feature stores and monitoring data quality with Great Expectations or similar tools.
- Certification in AWS, GCP, or Azure data services.

Benefits & Culture Highlights
- On-site, engineer-first environment with dedicated lab space and the latest Mac/Linux gear.
- Rapid career progression through technical mentorship, sponsored certifications, and conference budgets.
- Inclusive, innovation-driven culture that rewards outcome ownership and creative problem-solving.

Ready to architect next-gen data pipelines that power AI at scale? Apply now and join a mission-focused team turning data into competitive advantage.

Skills: airflow,docker,snowflake,apache spark,data engineering,redshift,sql,pyspark,python,data modeling,bigquery,ci/cd,apache airflow,data warehousing,spark,etl,kafka,git

Posted 6 days ago

Apply

6.0 years

10 - 20 Lacs

Bhubaneswar, Odisha, India

On-site

Senior Data Engineer (On-Site, India)

About The Opportunity
A high-growth innovator in the Analytics & Enterprise Data Management sector, we architect and deliver cloud-native data platforms that power real-time reporting, AI/ML workloads, and intelligent decisioning for global retail, fintech, and manufacturing leaders. Our engineering teams transform raw, high-velocity data into trusted, analytics-ready assets that drive revenue acceleration and operational excellence.

Role & Responsibilities
- Design, build, and optimise batch and streaming ETL/ELT pipelines on Apache Spark and Kafka, ensuring sub-minute latency and 99.9% uptime.
- Develop modular, test-driven Python code to ingest, cleanse, and enrich terabyte-scale datasets from relational, NoSQL, and API sources.
- Model data for analytics and AI, implementing star/snowflake schemas, partitioning, and clustering in BigQuery, Redshift, or Snowflake.
- Automate workflow orchestration with Apache Airflow, defining DAGs, dependency management, and robust alerting for SLA adherence.
- Collaborate with Data Scientists and BI teams to expose feature stores, curated marts, and self-service semantic layers.
- Enforce data-governance best practices (lineage, cataloguing, RBAC, encryption) in compliance with GDPR and SOC 2 standards.

Skills & Qualifications

Must-Have
- 3-6 years of hands-on experience engineering large-scale data pipelines in production.
- Expertise in Python and advanced SQL for ETL, optimisation, and performance tuning.
- Proven experience with Spark (PySpark or Scala) and streaming technologies such as Kafka or Kinesis.
- Deep knowledge of relational modelling, data-warehousing concepts, and at least one cloud DWH (BigQuery, Redshift, or Snowflake).
- Solid command of CI/CD, Git workflows, and containerisation (Docker).

Preferred
- Exposure to infrastructure-as-code (Terraform, CloudFormation) and Kubernetes.
- Experience integrating ML feature stores and monitoring data quality with Great Expectations or similar tools.
- Certification in AWS, GCP, or Azure data services.

Benefits & Culture Highlights
- On-site, engineer-first environment with dedicated lab space and the latest Mac/Linux gear.
- Rapid career progression through technical mentorship, sponsored certifications, and conference budgets.
- Inclusive, innovation-driven culture that rewards outcome ownership and creative problem-solving.

Ready to architect next-gen data pipelines that power AI at scale? Apply now and join a mission-focused team turning data into competitive advantage.

Skills: airflow,docker,snowflake,apache spark,data engineering,redshift,sql,pyspark,python,data modeling,bigquery,ci/cd,apache airflow,data warehousing,spark,etl,kafka,git

Posted 6 days ago

Apply

6.0 years

10 - 20 Lacs

Bengaluru, Karnataka, India

On-site

Senior Data Engineer (On-Site, India)

About The Opportunity
A high-growth innovator in the Analytics & Enterprise Data Management sector, we architect and deliver cloud-native data platforms that power real-time reporting, AI/ML workloads, and intelligent decisioning for global retail, fintech, and manufacturing leaders. Our engineering teams transform raw, high-velocity data into trusted, analytics-ready assets that drive revenue acceleration and operational excellence.

Role & Responsibilities
- Design, build, and optimise batch and streaming ETL/ELT pipelines on Apache Spark and Kafka, ensuring sub-minute latency and 99.9% uptime.
- Develop modular, test-driven Python code to ingest, cleanse, and enrich terabyte-scale datasets from relational, NoSQL, and API sources.
- Model data for analytics and AI, implementing star/snowflake schemas, partitioning, and clustering in BigQuery, Redshift, or Snowflake.
- Automate workflow orchestration with Apache Airflow, defining DAGs, dependency management, and robust alerting for SLA adherence.
- Collaborate with Data Scientists and BI teams to expose feature stores, curated marts, and self-service semantic layers.
- Enforce data-governance best practices (lineage, cataloguing, RBAC, encryption) in compliance with GDPR and SOC 2 standards.

Skills & Qualifications

Must-Have
- 3-6 years of hands-on experience engineering large-scale data pipelines in production.
- Expertise in Python and advanced SQL for ETL, optimisation, and performance tuning.
- Proven experience with Spark (PySpark or Scala) and streaming technologies such as Kafka or Kinesis.
- Deep knowledge of relational modelling, data-warehousing concepts, and at least one cloud DWH (BigQuery, Redshift, or Snowflake).
- Solid command of CI/CD, Git workflows, and containerisation (Docker).

Preferred
- Exposure to infrastructure-as-code (Terraform, CloudFormation) and Kubernetes.
- Experience integrating ML feature stores and monitoring data quality with Great Expectations or similar tools.
- Certification in AWS, GCP, or Azure data services.

Benefits & Culture Highlights
- On-site, engineer-first environment with dedicated lab space and the latest Mac/Linux gear.
- Rapid career progression through technical mentorship, sponsored certifications, and conference budgets.
- Inclusive, innovation-driven culture that rewards outcome ownership and creative problem-solving.

Ready to architect next-gen data pipelines that power AI at scale? Apply now and join a mission-focused team turning data into competitive advantage.

Skills: airflow,docker,snowflake,apache spark,data engineering,redshift,sql,pyspark,python,data modeling,bigquery,ci/cd,apache airflow,data warehousing,spark,etl,kafka,git

Posted 6 days ago

Apply

5.0 years

0 Lacs

Andhra Pradesh, India

On-site

P2-C2-TSTS

We are looking for a Senior Java Developer with 5+ years of hands-on experience in Java development, API integrations, and full-stack development. The ideal candidate will have relevant experience in the banking domain, particularly in lead-to-deal processes. This role involves building scalable backend services, integrating APIs, and contributing to end-to-end development within a collaborative Agile team.

Key Responsibilities
- Design, code, test, and maintain Java-based applications with high performance and scalability.
- Develop and integrate RESTful APIs, working with distributed systems and external interfaces.
- Implement clean, tested, and maintainable code using Java, JavaScript, HTML, and CSS.
- Contribute to front-end development using modern JavaScript frameworks such as Angular (preferred).
- Collaborate with business analysts, QA, and other developers to deliver high-quality features.
- Participate in sprint planning, code reviews, and regular Agile ceremonies.
- Support deployments and troubleshooting in development and production environments.

Required Skills & Experience
- 5+ years of experience in Java development (Java 17 preferred).
- Experience in the banking domain, particularly in lead-to-deal workflows.
- Strong skills in API development and integration (REST, SOAP).
- Proficiency in front-end fundamentals (HTML, CSS, JavaScript) and front-end frameworks (Angular preferred).
- Experience working with PostgreSQL and relational database systems.
- Familiarity with BPMN tools or rules engines like Camunda, Drools, or Activiti (nice to have).
- Hands-on experience with DevOps tools such as GitHub, Maven, and Kubernetes.
- Understanding of system layers including database, API, caching, and message queues.
- Excellent problem-solving, debugging, and communication skills.

Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Experience working in Agile development environments.
- Ability to collaborate in cross-functional teams and contribute to delivery goals.

Posted 6 days ago

Apply

3.0 - 8.0 years

10 - 20 Lacs

Pune, Maharashtra, India

On-site

About The Opportunity
We are a fast-scaling technology services provider to Fortune 500 banking and insurance clients, specializing in modernizing mission-critical mainframe applications and integrating them with digital channels. Our onsite engineering teams in India translate complex legacy workloads into high-performance, compliant, and future-ready solutions that keep global businesses running 24x7.

Role & Responsibilities
- Design, code, and unit-test high-volume COBOL, JCL, and DB2 programs for batch and online systems.
- Analyse functional specifications, perform impact analysis, and create technical design documents aligned with banking standards.
- Troubleshoot production incidents, execute root-cause analysis, and implement permanent fixes within SLA.
- Optimise performance of CICS transactions and batch jobs, leveraging utilities, SQL tuning, and indexing strategies.
- Migrate legacy VSAM/IMS data to relational stores and integrate mainframe APIs with distributed platforms via MQ or REST gateways.
- Collaborate with cross-functional scrum teams, provide code reviews, and mentor junior engineers on mainframe best practices.

Skills & Qualifications

Must-Have
- 3-8 years of hands-on COBOL, JCL, DB2, and CICS development.
- Experience with debugging tools (Xpediter/Abend-Aid) and batch scheduling (Control-M/CA-7).
- Solid knowledge of VSAM, IMS, and file-handling utilities.
- Proven ability to dissect complex business logic and convert it into technical designs.
- Comfortable working onsite with tight release cycles and change-management processes.

Preferred
- Exposure to Agile, Jira, and DevOps pipelines (Endevor/Git, Jenkins).
- Knowledge of re-hosting or re-engineering projects to cloud platforms.
- Banking or insurance domain experience.

Benefits & Culture Highlights
- Work on mission-critical, large-scale systems impacting millions of users.
- Continuous upskilling through sponsored Mainframe-to-Cloud training tracks.
- Collaborative, merit-driven environment with fast growth opportunities.

If you are a Mainframe Engineer eager to solve complex banking challenges and grow with a forward-looking team, apply now.

Skills: debugging tools (xpediter/abend-aid),jira,jcl,cloud,ims,vsam,devops pipelines (endevor/git, jenkins),cobol,agile,db2,file handling utilities,batch scheduling (control-m/ca-7),mainframe,cics

Posted 6 days ago

Apply

7.0 years

10 - 20 Lacs

Pune, Maharashtra, India

On-site

Lead Python Engineer

Industry: Information Technology & Digital Solutions
We are a fast-growing provider of enterprise-grade software engineering and digital transformation services, delivering secure, data-driven platforms for global finance, healthcare, and e-commerce customers from our on-site engineering hub in India.

Role & Responsibilities
- Lead a squad of 4-8 Python engineers through the full SDLC: requirements, design, coding, code review, testing, deployment, and support.
- Architect scalable, low-latency microservices using Django/FastAPI, RESTful APIs, and asynchronous task queues such as Celery/RabbitMQ.
- Implement cloud-native solutions on AWS, leveraging Lambda, ECS/EKS, S3, RDS, and Terraform for infrastructure as code.
- Drive engineering best practices: TDD, CI/CD pipelines with GitLab/Jenkins, automated static analysis, and performance profiling.
- Collaborate with Product, DevOps, and QA to translate business problems into technical deliverables and ensure on-time releases.
- Mentor developers, conduct technical workshops, and contribute to hiring to build a high-performing Python guild.

Skills & Qualifications

Must-Have
- 7+ years of hands-on Python backend development with Django, Flask, or FastAPI.
- Proven experience designing microservices and REST APIs serving high-concurrency workloads.
- Deep understanding of relational databases (PostgreSQL/MySQL) and NoSQL stores (MongoDB/Redis).
- Production exposure to AWS, containerisation with Docker, and orchestration via Kubernetes or ECS.
- Strong grasp of Git workflows, automated testing, and Agile/Scrum ceremonies.
- Excellent communication and people-leadership skills enabling cross-functional influence.

Preferred
- Exposure to event-driven patterns with Kafka or Kinesis.
- Experience implementing GraphQL, gRPC, or WebSocket streaming services.
- Knowledge of security, compliance, and observability (OpenTelemetry, Prometheus, ELK).

Benefits & Culture Highlights
- Collaborative on-site culture with hackathons, brown-bag sessions, and a dedicated innovation lab.
- Fast-track leadership path and fully funded certifications in AWS, Kubernetes, and data engineering.
- Comprehensive health coverage for family, an annual performance bonus up to 20%, and generous leave policies.

Skills: python,kubernetes,graphql,celery,sql,grpc,s3,mysql,eks,elk,jenkins,scrum,kafka,ecs,rest apis,prometheus,mongodb,opentelemetry,docker,redis,websocket,aws,terraform,postgresql,rabbitmq,rds,fastapi,tdd,django,lambda,kinesis,gitlab,flask,agile,python software foundation,ci/cd,microservices
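To ground the "asynchronous task queues such as Celery/RabbitMQ" requirement above, a minimal hedged Celery sketch follows; the broker URL, task name, and retry policy are illustrative assumptions rather than details from the posting:

```python
from celery import Celery

# Celery app wired to a (hypothetical) local RabbitMQ broker.
app = Celery("orders", broker="amqp://guest:guest@localhost:5672//")

@app.task(bind=True, max_retries=3, default_retry_delay=30)
def send_invoice(self, order_id: int) -> str:
    """Generate and email an invoice off the request path."""
    try:
        # Real work (render PDF, call the mail gateway) would go here.
        return f"invoice sent for order {order_id}"
    except Exception as exc:
        # Transient failures are retried by the worker, up to max_retries.
        raise self.retry(exc=exc)

# A web handler would enqueue the work instead of blocking on it:
#   send_invoice.delay(order_id=42)
```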

Posted 6 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

About the Role

We are looking for a Software Engineer Lead with deep technical expertise and a strategic mindset to lead the design, architecture, and development of secure, scalable, cloud-native systems. You’ll play a key role in guiding engineering teams, shaping architectural direction, and ensuring alignment across product and technology roadmaps. If you're passionate about cloud, data architecture, and DevOps, and enjoy mentoring others while delivering impact, we’d love to hear from you.

Key Responsibilities

Lead architecture and technical design efforts for complex software systems and cloud-native solutions.

Collaborate with cross-functional teams including engineering, DevOps, and product managers to deliver high-quality solutions aligned with business goals.

Guide and mentor developers, conduct code and architecture reviews, and ensure best practices across the SDLC.

Design and implement data-intensive solutions, including structured and unstructured data flows, with attention to performance, scalability, and security.

Drive adoption of CI/CD, DevOps, and containerization best practices for efficient and reliable deployments.

Define and evolve architecture roadmaps, ensuring technical coherence across systems and services.

Key Requirements

Must-Have Skills

Proven experience in software architecture and technical leadership roles.

Strong hands-on development expertise in Java, Python, or Node.js, with exposure to modern microservices frameworks (Flask, FastAPI, Celery).

Deep experience with cloud-native architectures (AWS, Azure, or GCP) and cloud certifications (Associate or Professional level preferred).

Expertise in relational and NoSQL databases (e.g., PostgreSQL, MongoDB, Redis), with advanced knowledge of data modelling and query optimisation (see the sketch below this listing).

Solid understanding of Kubernetes, Docker, and CI/CD pipelines, with practical experience in DevOps tooling and practices.

Proficiency in Infrastructure as Code tools such as Terraform or CloudFormation.

Excellent communication skills with the ability to articulate architectural decisions and mentor across teams.

Optional/Nice-to-Have

Experience with MLOps pipelines, monitoring tools (ELK, Prometheus/Grafana), and tools like MLflow and Langfuse.

Familiarity with GenAI frameworks (e.g., Langchain, LlamaIndex), vector databases (e.g., Milvus, ChromaDB), agentic AI, and multi-component pipelines (MCP).

Experience building event-driven and serverless architectures.

Strong grasp of security, compliance, and cost optimisation in enterprise-scale cloud environments.

Preferred Qualifications

Bachelor’s or Master’s degree in Computer Science, Engineering, or a related technical field.

Cloud certifications: AWS Certified Solutions Architect, Azure Solutions Architect Expert, or equivalent.

Why Join Us

Be part of a dynamic, fast-growing tech team building innovative, high-impact solutions.

Work on modern technologies including cloud, data pipelines, and AI frameworks.

Supportive, collaborative, and inclusive work culture.

Opportunities for professional development and career progression.

Competitive salary and benefits package.
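To make the query-optimisation expectation concrete, here is a minimal PostgreSQL sketch of a covering index enabling an index-only scan; the schema is hypothetical, and PostgreSQL 11+ is assumed for the INCLUDE clause.

```sql
-- Hypothetical PostgreSQL sketch: schema invented for illustration.
-- The INCLUDE column lets the planner answer the query from the index
-- alone (an index-only scan), avoiding per-row heap fetches.
CREATE INDEX ix_orders_customer_status
    ON orders (customer_id, status) INCLUDE (total_amount);

-- EXPLAIN (ANALYZE, BUFFERS) reports the chosen plan and buffer usage,
-- confirming whether the index-only scan is actually used.
EXPLAIN (ANALYZE, BUFFERS)
SELECT status, SUM(total_amount)
FROM   orders
WHERE  customer_id = 42
GROUP  BY status;
```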

Posted 6 days ago

Apply

10.0 years

0 Lacs

Greater Kolkata Area

On-site

Description

The ideal candidate will have 10+ years of experience in full-stack development, working on robust web applications and services.

Role

The ideal candidate will be a self-motivated individual with a passion for excellent-quality software development:

Design, develop, and maintain applications using C#.NET.

Analyze and understand existing software modules and take part in documenting the business knowledge gathered.

Write complex SQL queries, procedures, and scripts for data manipulation and retrieval (see the sketch below this listing).

Implement new features, improve existing functionality, and ensure code quality and performance.

Collaborate with front-end teams and work on minimal React integration (React 16/17 or higher).

Perform unit testing and debugging of the application code.

Ensure the application architecture is scalable and maintainable.

Troubleshoot production issues, perform root-cause analysis, and provide timely solutions.

Work in an Agile/Scrum environment, contributing to sprint planning, daily stand-ups, and sprint retrospectives.

Mentor junior developers and review their code to ensure best practices are followed.

Optimize database performance, including indexing, query optimization, and troubleshooting.

Stay up to date with emerging technologies and industry trends.

Qualifications

Requirements:

BE, BTech, or MCA as educational qualification.

10+ years of experience using C# (.NET Framework/Core); expert in object-oriented programming.

SQL: proficiency in SQL Server (or other relational databases), including writing complex SQL queries, stored procedures, and database optimization techniques.

Experience in React or Angular: familiarity with React/Angular for building front-end components, with a focus on integrating React/Angular into existing or new .NET applications.

Hands-on knowledge of Azure or AWS cloud services.

Web Development: solid understanding of HTTP, RESTful services, and web application architectures.

Version Control: experience using Git for version control and collaboration in multi-developer environments.

Unit Testing: experience with testing frameworks (e.g., xUnit) to ensure application reliability.

Additional Skills

Strong problem-solving skills and ability to troubleshoot complex issues.

Ability to collaborate effectively in a team environment.

Strong understanding of the software development lifecycle and Agile methodologies.

Good communication skills with both technical and non-technical team members.

Nice-to-Have

Experience in infrastructure management with Terraform scripts and YAML-based pipelines.

Experience with DevOps tools (preferably GitHub Actions) and continuous integration/continuous deployment (CI/CD) pipelines.
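As a minimal sketch of the "complex SQL queries and procedures" expectation, here is a hypothetical T-SQL stored procedure using a CTE and a window function; all object names are invented for illustration.

```sql
-- Hypothetical T-SQL sketch: schema and names invented for illustration.
-- Ranks customers in a region by total spend and returns the top N.
CREATE OR ALTER PROCEDURE dbo.GetTopCustomersByRegion
    @Region NVARCHAR(50),
    @TopN   INT = 10
AS
BEGIN
    SET NOCOUNT ON;

    WITH Ranked AS (
        SELECT c.CustomerId,
               c.Name,
               SUM(o.Amount) AS TotalSpend,
               ROW_NUMBER() OVER (ORDER BY SUM(o.Amount) DESC) AS rn
        FROM dbo.Customer c
        JOIN dbo.[Order] o ON o.CustomerId = c.CustomerId
        WHERE c.Region = @Region
        GROUP BY c.CustomerId, c.Name
    )
    SELECT CustomerId, Name, TotalSpend
    FROM Ranked
    WHERE rn <= @TopN
    ORDER BY rn;
END;
```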

Posted 6 days ago

Apply

10.0 years

0 Lacs

Greater Kolkata Area

On-site

Description

Role: The ideal candidate will be a self-motivated individual with a passion for excellent-quality software development:

Design, develop, and maintain complex MS-SQL / SQL Server databases, ensuring high performance, availability, and security.

Troubleshoot and resolve database-related issues and performance bottlenecks.

Collaborate with cross-functional teams to define, design, and ship new features.

Optimize and enhance existing data objects for performance and scalability.

Implement back-end solutions using C#.NET, ensuring seamless integration with the database.

Participate in code reviews, providing constructive feedback to peers.

Troubleshoot and resolve software defects and issues.

Stay updated with the latest industry trends and technologies to ensure our solutions remain cutting-edge.

Qualifications

Requirements:

BE, BTech, or MCA as educational qualification.

10+ years of experience in MS-SQL / SQL Server development and administration.

Strong expertise in database design, optimization, and query performance tuning.

Understanding of back-end development using C#.NET.

Proficiency in Azure SQL and other database technologies, including non-relational databases.

Good understanding of database unit testing using tools like tSQLt (see the sketch below this listing).

Familiarity with database migration tools like Flyway.

Hands-on experience with RESTful APIs and web services.

Solid understanding of software development principles, design patterns, and best practices.

Excellent problem-solving skills and attention to detail.

Strong communication and teamwork abilities.

Ability to work in a fast-paced, agile development environment.

Additional Skills

Strong problem-solving skills and ability to troubleshoot complex issues.

Ability to collaborate effectively in a team environment.

Strong understanding of the software development lifecycle and Agile methodologies.

Good communication skills with both technical and non-technical team members.

Nice-to-Have

Familiarity with Azure or AWS cloud services.

Experience with DevOps tools (preferably GitHub Actions) and continuous integration/continuous deployment (CI/CD) pipelines.
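Since the listing names tSQLt specifically, here is a minimal sketch of the standard tSQLt test pattern; the table and procedure under test are hypothetical.

```sql
-- Minimal tSQLt sketch: dbo.OrderLine and dbo.GetOrderTotal are invented.
EXEC tSQLt.NewTestClass 'testOrders';
GO
CREATE PROCEDURE testOrders.[test GetOrderTotal sums line amounts]
AS
BEGIN
    -- FakeTable isolates the test from real data and constraints.
    EXEC tSQLt.FakeTable 'dbo.OrderLine';
    INSERT INTO dbo.OrderLine (OrderId, Amount) VALUES (1, 10.00), (1, 15.50);

    DECLARE @actual MONEY;
    EXEC dbo.GetOrderTotal @OrderId = 1, @Total = @actual OUTPUT;

    EXEC tSQLt.AssertEquals @Expected = 25.50, @Actual = @actual;
END;
GO
EXEC tSQLt.Run 'testOrders';
```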

Posted 6 days ago

Apply