86 Workflow Orchestration Jobs

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

6.0 - 9.0 years

20 - 24 Lacs

Mumbai

Work from Office

Overview
The Production Shared Services (PSS) function is responsible for the seamless and timely delivery of MSCI products and services to clients across multiple business lines, including Index, Analytics, and Sustainability & Climate. The work is highly time-sensitive and business-critical, requiring exceptional accuracy, judgment, accountability, and collaboration. The team operates on a global follow-the-sun model across APAC, EMEA, and the Americas. Key responsibilities of the team include:
Enterprise Production Operations: Accurate and timely delivery of workflows across multiple business units.
Incident Management: Identifying and resolving production issues, coordinating with stakeholders, and contributing to root cause analysis and permanent remediation.
Client Case Handling: Managing production-related queries and tickets, ensuring prompt resolution in partnership with global teams.
Change and Release Readiness: Supporting testing, deployments, and governance for product enhancements and new releases.
Business Continuity & Resilience: Safeguarding uninterrupted operations and ensuring readiness for critical events.
Automation and Transformation: Driving initiatives to modernize, scale, and improve production processes across business units.

Responsibilities
As a Senior Specialist, Enterprise Data Operations, you will be accountable for safeguarding enterprise data integrity and driving delivery excellence across MSCI’s most critical workflows. You will:
Execute and oversee complex production workflows across Index, Analytics, and ESG/Climate with enterprise-level precision and timeliness.
Partner with stakeholders across Product, Technology, and Client Services to align priorities, resolve issues, and improve delivery efficiency.
Lead incident and problem management efforts, ensuring permanent remediation and embedding technology-driven solutions.
Champion automation and process transformation, leveraging scripting, cloud platforms, and AI/ML solutions to increase scalability and resilience.
Ensure production readiness for product launches, methodology updates, and system enhancements.
Represent Enterprise Data Operations in global forums, providing input on risks, dependencies, and readiness.
Drive documentation and best practices to strengthen governance and enterprise standards.

Qualifications
10+ years of experience in data operations, production services, or financial data/analytics workflows (experience with Index Data Operations strongly preferred).
Demonstrated experience in incident/problem management with the ability to engage senior stakeholders under pressure.
Strong leadership attributes demonstrated through initiative, problem ownership, and the ability to guide and influence peers, fostering a culture of excellence and continuous improvement.
Strong technical and operational mindset, with preferred experience in automation (Python, SQL, or similar), workflow orchestration, and cloud environments (AWS, Azure, Snowflake).
Familiarity with AI/ML tools, APIs, and automation platforms, and an interest in applying them to production workflows.
Strong analytical and problem-solving mindset, with the ability to innovate beyond BAU operations.
High attention to detail, accountability, and the ability to work independently in a fast-paced, global environment.
Excellent stakeholder management and communication skills, capable of representing enterprise operations in global forums.
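
For illustration only (not MSCI tooling): a minimal Python sketch of the kind of SLA-monitoring automation the qualifications above describe, flagging production workflows that have missed their delivery deadline. The workflow names and thresholds are hypothetical.

```python
# Minimal sketch: flag production workflow runs that breached their delivery SLA.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class WorkflowRun:
    name: str
    scheduled: datetime
    delivered: datetime | None  # None means not yet delivered

def late_runs(runs: list[WorkflowRun], sla: timedelta, now: datetime) -> list[str]:
    """Return names of runs that are past their SLA without an on-time delivery."""
    breaches = []
    for run in runs:
        deadline = run.scheduled + sla
        if run.delivered is None and now > deadline:
            breaches.append(run.name)
        elif run.delivered is not None and run.delivered > deadline:
            breaches.append(run.name)
    return breaches

if __name__ == "__main__":
    now = datetime(2024, 1, 1, 9, 0)
    runs = [
        WorkflowRun("index_close_eod", datetime(2024, 1, 1, 6, 0), None),
        WorkflowRun("esg_scores_daily", datetime(2024, 1, 1, 7, 30),
                    datetime(2024, 1, 1, 7, 45)),
    ]
    print(late_runs(runs, sla=timedelta(hours=1), now=now))  # ['index_close_eod']
```
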
What we offer you
Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing. Flexible working arrangements, advanced technology, and collaborative workspaces. A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results. A global network of talented colleagues, who inspire, support, and share their expertise to innovate and deliver for our clients. Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro and tailored learning opportunities for ongoing skills development. Multi-directional career paths that offer professional growth and development through new challenges, internal mobility and expanded roles. We actively nurture an environment that builds a sense of inclusion, belonging, and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women’s Leadership Forum.

At MSCI we are passionate about what we do, and we are inspired by our purpose – to power better investment decisions. You’ll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards and perform beyond expectations for yourself, our clients, and our industry. MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process.

MSCI Inc. is an equal opportunity employer. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries.

To all recruitment agencies: MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes.

Note on recruitment scams: We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try to elicit personal information from job seekers. Read our full note on careers.msci.com

Posted 6 days ago

Apply

6.0 - 11.0 years

7 - 12 Lacs

Kochi, Bengaluru, Thiruvananthapuram

Work from Office

We are seeking an experienced Data Engineer to design and implement scalable data solutions. The ideal candidate will have deep expertise in cloud data warehousing, ETL/ELT processes, data modeling, and business intelligence. Requirements: 6+ years of experience in data engineering. Deep expertise in AWS Redshift, including data modeling, query optimization, and cluster management. Good understanding of and experience in implementing data quality, lineage, and data governance. Strong experience in Apache Airflow for workflow orchestration and scheduling. Proficiency in dbt for data transformation and modeling. Good experience in the Azure data stack can also be considered. Experience creating dashboards and reports in Tableau. Excellent SQL skills and experience with Python, Java or Scala. Knowledge of data warehousing concepts and dimensional modeling. Strong communication skills and ability to work cross-functionally. Bachelor's or Master's degree in Computer Science, Engineering, or related field.
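
As a hedged illustration of the stack this posting lists (not the employer's actual pipeline), here is a minimal Airflow 2.x DAG that stages data into Redshift and then runs dbt models; the DAG id, script path, and dbt project directory are placeholders.

```python
# A minimal Airflow 2.x sketch: load raw data to Redshift, then run dbt models.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_warehouse_build",          # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                       # use schedule_interval on older 2.x
    catchup=False,
    tags=["redshift", "dbt"],
) as dag:
    # Stage raw files into Redshift; in practice this could be a COPY command
    # wrapped in a dedicated operator instead of a shell-invoked script.
    load_raw = BashOperator(
        task_id="load_raw_to_redshift",
        bash_command="python /opt/pipelines/load_redshift.py",  # hypothetical script
    )

    # Transform with dbt once the raw layer has landed.
    run_dbt = BashOperator(
        task_id="run_dbt_models",
        bash_command="dbt run --project-dir /opt/dbt --target prod",
    )

    load_raw >> run_dbt
```
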

Posted 6 days ago

Apply

5.0 - 10.0 years

18 - 33 Lacs

Japan, Chennai

Work from Office

C1X AdTech Pvt Ltd is a fast-growing product and engineering-driven AdTech company building next-generation advertising and marketing technology platforms. Our mission is to empower enterprise clients with the smartest marketing solutions, enabling seamless integration with personalization engines and delivering cross-channel marketing capabilities. We are dedicated to enhancing customer engagement and experiences while focusing on increasing Lifetime Value (LTV) through consistent messaging across all channels. Our engineering team spans front end (UI), back end (Java/Node.js APIs), Big Data, and DevOps, working together to deliver scalable, high-performance products for the digital advertising ecosystem.

Role Overview
As a Data Engineer, you will be a key member of our data engineering team, responsible for building and maintaining large-scale data products and infrastructure. You'll shape the next generation of the data analytics tech stack by leveraging modern big data technologies. This role involves working closely with business stakeholders, product managers, and engineering teams to meet diverse data requirements that drive business insights and product innovation.

Objectives
Design, build, and maintain scalable data infrastructure for collection, storage, and processing. Enable easy access to reliable data for data scientists, analysts, and business users. Support data-driven decision-making and improve organizational efficiency through high-quality data products.

Responsibilities
Build large-scale batch and real-time data pipelines using frameworks like Apache Spark on AWS or GCP. Design, manage, and automate data flows between multiple data sources. Implement best practices for continuous integration, testing, and data quality assurance. Maintain data documentation, definitions, and governance practices. Optimize performance, scalability, and cost-effectiveness of data systems. Collaborate with stakeholders to translate business needs into data-driven solutions.

Qualifications
Bachelor’s degree in Computer Science, Engineering, or related field (exceptional coding performance on platforms like LeetCode/HackerRank may substitute). 2+ years’ experience working on full lifecycle Big Data projects. Strong foundation in data structures, algorithms, and software design principles. Proficiency in at least two programming languages – Python or Scala preferred. Experience with AWS services such as EMR, Lambda, S3, DynamoDB (GCP equivalents also relevant). Hands-on experience with Databricks Notebooks and Jobs API. Strong expertise in big data frameworks: Spark, MapReduce, Hadoop, Sqoop, Hive, HDFS, Airflow, Zookeeper. Familiarity with containerization (Docker) and workflow management tools (Apache Airflow). Intermediate to advanced knowledge of SQL (relational and NoSQL databases like Postgres, MySQL, Redshift, Redis). Experience with SQL tuning, schema design, and analytical programming. Proficient in Git (version control) and collaborative workflows. Comfortable working across diverse technologies in a fast-paced, results-oriented environment.
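
To make the Spark-on-AWS batch work concrete, here is a minimal PySpark sketch, assuming hypothetical S3 buckets and event fields, that rolls up a day of ad events; it illustrates the kind of pipeline described, not C1X's actual code.

```python
# A minimal PySpark batch sketch: aggregate daily ad events and write Parquet.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_ad_event_rollup").getOrCreate()

# Read a day of raw impression/click events; on EMR or Databricks the s3://
# path and IAM permissions would come from the cluster configuration.
events = spark.read.json("s3://example-adtech-raw/events/dt=2024-01-01/")

daily_rollup = (
    events
    .filter(F.col("event_type").isin("impression", "click"))
    .groupBy("campaign_id", "event_type")
    .agg(
        F.count("*").alias("event_count"),
        F.countDistinct("user_id").alias("unique_users"),
    )
)

daily_rollup.write.mode("overwrite").partitionBy("event_type").parquet(
    "s3://example-adtech-curated/daily_rollup/dt=2024-01-01/"
)
spark.stop()
```
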

Posted 1 week ago

Apply

6.0 - 11.0 years

7 - 12 Lacs

Kochi, Bengaluru, Thiruvananthapuram

Work from Office

We are seeking an experienced Data Engineer to design and implement scalable data solutions. The ideal candidate will have deep expertise in cloud data warehousing, ETL/ELT processes, data modeling, and business intelligence. Requirements: 6+ years of experience in data engineering. Deep expertise with AWS Redshift, including data modeling, query optimization, and cluster management. Good understanding of and experience in implementing data quality, lineage, and data governance. Strong experience in Apache Airflow for workflow orchestration and scheduling. Proficiency with dbt for data transformation and modeling. Good experience in the Azure data stack can also be considered. Experience creating dashboards and reports in Tableau. Excellent SQL skills and experience with Python, Java or Scala. Knowledge of data warehousing concepts and dimensional modeling. Strong communication skills and ability to work cross-functionally. Bachelor's or Master's degree in Computer Science, Engineering, or related field.
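
As a small illustration of the data-quality emphasis in this posting, the following standard-library Python sketch validates row counts and null rates before a table is published; the thresholds and column names are hypothetical.

```python
# A library-free data-quality check: validate row counts and null rates.
from typing import Any

def check_batch(rows: list[dict[str, Any]],
                required_columns: tuple[str, ...],
                min_rows: int = 1,
                max_null_rate: float = 0.01) -> list[str]:
    """Return human-readable data-quality failures (empty list means pass)."""
    failures = []
    if len(rows) < min_rows:
        failures.append(f"expected at least {min_rows} rows, got {len(rows)}")
        return failures
    for col in required_columns:
        nulls = sum(1 for r in rows if r.get(col) is None)
        null_rate = nulls / len(rows)
        if null_rate > max_null_rate:
            failures.append(f"{col}: null rate {null_rate:.2%} exceeds {max_null_rate:.2%}")
    return failures

if __name__ == "__main__":
    batch = [{"order_id": 1, "amount": 10.0}, {"order_id": 2, "amount": None}]
    print(check_batch(batch, ("order_id", "amount")))
    # ['amount: null rate 50.00% exceeds 1.00%']
```
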

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Maharashtra

On-site

As part of our team, you will have the opportunity to work in a dynamic and inclusive environment that fosters diversity and collaboration to build extraordinary teams. With a strong focus on diversity, inclusion, and social responsibility, we are proud to be recognized as one of the world's best places to work. Our commitment to creating a supportive and empowering workplace has led us to be consistently ranked as a top employer by various reputable organizations. You will be working alongside our generalist consultants and the Advanced Analytics Group (AAG) at Bain, where we leverage data science, customer insights, statistics, machine learning, and data engineering to help clients solve complex problems across various industries. Our team members, with diverse educational backgrounds in computer science, engineering, AI, data science, and other quantitative disciplines, collaborate to deliver innovative solutions.

As a part of the team, your responsibilities will include designing, developing, and maintaining cloud-based AI applications using a full-stack technology stack to ensure high-quality, scalable, and secure solutions. You will collaborate with cross-functional teams to implement analytics features that meet business requirements and user needs. Additionally, you will deploy and manage analytics applications in cloud environments, develop APIs and microservices, and implement security measures to protect sensitive data. Your role will also involve monitoring application performance, participating in code reviews, staying updated on emerging technologies, and collaborating with DevOps teams to automate deployment processes. Furthermore, you will work with business consulting staff to develop analytics solutions for our clients and support the analytics application engineering capabilities.

To be successful in this role, you are required to have a Master's degree in Computer Science or related field, along with expertise in Python, cloud platforms, and server-side technologies. Experience with client-side and server-side technologies, DevOps, CI/CD, and workflow orchestration is essential. You should possess strong interpersonal and communication skills, curiosity, critical thinking, and a solid foundation in computer science fundamentals. If you are passionate about leveraging analytics to drive business solutions, enjoy working in a collaborative environment, and are committed to continuous learning and development, we encourage you to apply. Travel may be required as part of the role.
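
A brief, hedged sketch of the "APIs and microservices" part of this role: a minimal FastAPI service exposing a scoring endpoint. The endpoint, models, and heuristic are hypothetical placeholders, not Bain's implementation.

```python
# A minimal analytics microservice: expose a scoring function over HTTP.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="analytics-scoring-service")

class ScoreRequest(BaseModel):
    customer_id: str
    monthly_spend: float
    tenure_months: int

class ScoreResponse(BaseModel):
    customer_id: str
    churn_score: float

@app.post("/score", response_model=ScoreResponse)
def score(req: ScoreRequest) -> ScoreResponse:
    # Placeholder heuristic; a real service would call a trained model instead.
    churn = min(1.0, 0.5 / max(req.tenure_months, 1) + 0.3 * (req.monthly_spend < 10))
    return ScoreResponse(customer_id=req.customer_id, churn_score=round(churn, 3))

# Run locally with: uvicorn app:app --reload   (assuming this file is app.py)
```
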

Posted 1 week ago

Apply

6.0 - 10.0 years

13 - 16 Lacs

Chennai

Work from Office

Project description
We are seeking experienced AI Agents to support the design and deployment of an AI-driven Regulatory Horizon Scanning solution. These roles focus on leveraging Artificial Intelligence (AI), Natural Language Processing (NLP), and advanced analytics to automatically monitor, interpret, and analyse regulatory updates across multiple jurisdictions. Successful candidates will help the bank anticipate regulatory change, assess downstream reporting impacts, and provide actionable insights for Compliance, Risk, and Finance functions.

Responsibilities
Regulatory Horizon Scanning: Continuously monitor global and regional regulatory bodies for new and updated regulations, consultation papers, and guidance. Build and optimise AI-enabled pipelines to ingest and process structured and unstructured regulatory publications. Automate the classification of regulatory documents using NLP, LLMs, and machine learning to identify topics, obligations, timelines, and impacted regulations.
Regulatory Analysis & Impact Mapping: Extract obligations and key data requirements from regulatory texts, speeches, consultation papers, and technical standards. Map regulatory changes to the bank's reporting frameworks/reporting solution and internal policies. Support impact assessments for upcoming regulations such as Basel III/IV, EMIR Refit, MiFID II, SFTR, IFRS 9, IREF, ESG, and climate risk reporting.
AI Solution Development: Train, fine-tune, and validate NLP/GenAI models to improve regulatory document interpretation and summarisation. Build AI agents capable of summarising regulatory updates, highlighting key changes, and recommending next steps. Design automation workflows for regulatory monitoring, impact assessment, and stakeholder notifications. Integrate AI agents with data warehouses, data lakes/lakehouses, and regulatory reporting systems to ensure seamless consumption of regulatory intelligence.
Collaboration & Stakeholder Engagement: Work with Regulatory Reporting, Compliance, Risk, Legal, and Data Governance teams to contextualise regulatory changes. Present regulatory insights, thematic trends, and recommended actions to senior stakeholders and governance forums.
Continuous Improvement: Implement feedback loops to refine AI models and improve accuracy over time. Explore emerging AI/ML, GenAI, and knowledge graph techniques to enhance regulatory monitoring. Establish performance metrics (precision, recall, timeliness, false positive/negative rates) to ensure solution effectiveness.

Skills
Must have: 6-10+ years of experience in financial services, regulatory reporting, compliance, or RegTech projects. Prior experience implementing AI-driven regulatory change monitoring or regulatory technology platforms. Exposure to automation frameworks (RPA, workflow orchestration) to streamline regulatory processes.
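
As a hedged illustration of the document-classification step described above (toy data, not the bank's models), this scikit-learn sketch tags regulatory texts with a topic using TF-IDF features and a linear classifier.

```python
# Toy regulatory-document topic classification with TF-IDF + logistic regression.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_texts = [
    "Consultation paper on margin requirements for OTC derivatives reporting",
    "Final rules on climate-related financial disclosures for large issuers",
    "Updated technical standards for transaction reporting under MiFID II",
    "Guidance on ESG fund naming and sustainability disclosures",
]
train_labels = ["derivatives", "climate", "derivatives", "climate"]

classifier = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2), min_df=1),
    LogisticRegression(max_iter=1000),
)
classifier.fit(train_texts, train_labels)

new_doc = "Draft regulatory technical standards on EMIR derivative trade reporting"
print(classifier.predict([new_doc])[0])        # likely 'derivatives'
print(classifier.predict_proba([new_doc])[0])  # class probabilities for triage
```
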

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

Karnataka

On-site

As an Azure Data Engineer at our company, you will be responsible for designing, developing, and deploying scalable data pipelines using Azure Data Factory, Databricks, and PySpark. Your role will involve working with large datasets (500 GB+) to optimize data processing workflows. Collaborating with cross-functional teams to prioritize project requirements and ensuring data quality, security, and compliance with organizational standards will be essential tasks. Troubleshooting data pipeline issues, optimizing performance, and effectively communicating technical solutions to non-technical stakeholders are key aspects of the role. To excel in this position, you should have at least 4 years of experience in data engineering and hold an Azure certification. Proficiency in Azure, Databricks, PySpark, SQL, data modeling, data integration, and workflow orchestration is crucial. Strong communication and interpersonal skills are necessary, along with the ability to work a 2nd shift. Experience in product engineering would be a valuable asset. Additionally, having worked on at least 2-3 projects with large datasets is preferred.

Key Responsibilities:
- Design, develop, and deploy scalable data pipelines using Azure Data Factory, Databricks, and PySpark
- Work with large datasets (500 GB+) to develop and optimize data processing workflows
- Collaborate with cross-functional teams to identify and prioritize project requirements
- Develop and maintain data models, data integration, and workflow orchestration
- Ensure data quality, security, and compliance with organizational standards
- Troubleshoot data pipeline issues and optimize performance
- Communicate technical solutions to non-technical stakeholders

Requirements:
- 4+ years of experience in data engineering
- Azure certification is mandatory
- Strong proficiency in Azure, Databricks, PySpark, SQL, data modeling, data integration, and workflow orchestration
- Experience working with large datasets (500 GB+)
- Strong communication and interpersonal skills
- Ability to work a 2nd shift
- Experience in product engineering is a plus
- Minimum 2-3 projects with experience working with large datasets

Skills: Azure, SQL, data integration, workflow orchestration, communication skills, data security, Databricks, data modeling, data quality, advanced SQL, data compliance, PySpark
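
For illustration, a minimal PySpark sketch of the kind of large-dataset processing this posting describes: de-duplicating late-arriving records and writing a partitioned layer to Azure storage. The ADLS paths and column names are hypothetical.

```python
# De-duplicate late-arriving records and write a partitioned curated layer.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("dedupe_transactions").getOrCreate()

raw = spark.read.parquet("abfss://raw@examplelake.dfs.core.windows.net/transactions/")

# Keep only the latest version of each transaction_id.
latest_first = Window.partitionBy("transaction_id").orderBy(F.col("ingested_at").desc())
deduped = (
    raw.withColumn("row_num", F.row_number().over(latest_first))
       .filter(F.col("row_num") == 1)
       .drop("row_num")
)

(deduped
 .repartition("transaction_date")          # co-locate rows before the partitioned write
 .write.mode("overwrite")
 .partitionBy("transaction_date")
 .parquet("abfss://curated@examplelake.dfs.core.windows.net/transactions/"))

spark.stop()
```
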

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

You will be joining a renowned consulting firm that has consistently been recognized as one of the best places to work globally. As part of the Application Engineering team within the AI, Insights & Solutions group, you will collaborate with a diverse group of experts in analytics, engineering, product management, and design to address clients' most challenging problems. Your role will involve designing, developing, and maintaining cloud-based AI applications using a full-stack technology stack to deliver secure and scalable solutions. Working in a multidisciplinary environment, you will collaborate with cross-functional teams to define and implement analytics features that meet business requirements and user needs. Utilizing Kubernetes and containerization technologies, you will deploy and manage analytics applications in cloud environments to ensure optimal performance and availability. Your responsibilities will also include developing APIs and microservices, implementing security measures to protect sensitive data, and monitoring application performance to resolve issues promptly. As a Senior or Staff level professional with a Master's degree in Computer Science or related field, you are expected to have a minimum of 5 years of experience in application engineering. Proficiency in client-side and server-side technologies, cloud platforms, and programming languages such as Python is essential. Additionally, experience with DevOps, CI/CD practices, and version control tools like Git is required. Your role will involve staying updated on emerging technologies, collaborating with DevOps teams to automate deployment processes, and contributing to the development of high-quality, maintainable code. You will play a pivotal role in collaborating with business consulting teams to assess opportunities and develop analytics solutions for clients across various sectors. Strong interpersonal skills, curiosity, critical thinking, and a proactive mindset are key attributes for success in this role. Travel may be required up to 30% of the time, and a deep understanding of data architecture, database design, and agile development methodologies will be advantageous in meeting the job requirements.

Posted 2 weeks ago

Apply

10.0 - 12.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

About Ethos
Ethos was built to make it faster and easier to get life insurance for the next million families. Our approach blends industry expertise, technology, and the human touch to find you the right policy to protect your loved ones. We leverage deep technology and data science to streamline the life insurance process, making it more accessible and convenient. Using predictive analytics, we are able to transform a traditionally multi-week process into a modern digital experience for our users that can take just minutes! We've issued billions in coverage each month and eliminated the traditional barriers, ushering the industry into the modern age. Our full-stack technology platform is the backbone of family financial health. We make getting life insurance easier, faster and better for everyone. Our investors include General Catalyst, Sequoia Capital, Accel Partners, Google Ventures, SoftBank, and the investment vehicles of Jay-Z, Kevin Durant, Robert Downey Jr and others. This year, we were named on CB Insights' Global Insurtech 50 list and BuiltIn's Top 100 Midsize Companies in San Francisco. We are scaling quickly and looking for passionate people to protect the next million families!

Director of Software Engineering, Enterprise Systems

About The Role
At Ethos, technology is the foundation that powers our mission to protect One Million Families. As Director of Engineering within our Enterprise Systems group, you will lead a high-performing organization responsible for the platforms that underpin our business: policy lifecycle development, policy administration, and customer-facing portals. You will own the strategy, architecture, and delivery of these mission-critical systems, ensuring they are secure, resilient, and scalable to meet the needs of our customers, partners, and internal teams. Partnering with Product, Operations, and Executive leadership, you'll balance rapid delivery with long-term platform health while cultivating a culture of engineering excellence across distributed teams.

Our stack: Our backend runs on Node.js and Postgres, hosted on AWS. Our front end is built in React/Redux, with modern CI/CD pipelines, observability, and a growing investment in AI capabilities.

Key Responsibilities

Technology Strategy & Delivery
Define and execute the engineering strategy and roadmap for Enterprise Systems. Drive architectural decisions that ensure scalability, reliability, security, and extensibility across large distributed systems. Ensure predictable, high-quality delivery of enterprise features and enhancements. Champion integration and interoperability between platforms. Drive best practices in distributed system design, observability, DevSecOps, and regulatory compliance. Partner cross-functionally to align enterprise system investments with business outcomes.

People & Organizational Leadership
Recruit, develop, and retain world-class engineering talent, building diverse and inclusive distributed teams. Mentor and grow both engineering managers and ICs, establishing clear career paths and leadership development opportunities. Foster a culture of autonomy, accountability, psychological safety, and continuous learning. Strengthen engineering processes (e.g., technical design reviews, onboarding, knowledge sharing) to enable scale and consistent delivery. Establish Ethos as a destination for top engineering talent by shaping a strong internal and external technology brand.
Qualifications & Skills
5+ years of senior engineering leadership experience in high-growth or enterprise-scale organizations. 10+ years of software engineering experience with a track record of shipping large-scale, distributed, and mission-critical systems. Proven success managing teams of 15+ engineers, including both ICs and managers. Deep expertise in cloud-native architectures (AWS or equivalent), multi-region deployments, distributed databases, and event-driven systems. Experience designing and scaling secure, high-throughput APIs and real-time data pipelines. Hands-on knowledge of observability practices (metrics, logging, tracing, alerting) to ensure reliability and proactive incident response. Proven experience with data management at scale (transactional integrity, policy lifecycle data models, customer privacy, regulatory compliance). Familiarity with enterprise integration patterns (service mesh, messaging queues, workflow orchestration, API gateways). Demonstrated ability to implement DevSecOps practices across distributed teams. Skilled at balancing long-term platform investments and immediate business needs. Bonus: Experience with insurance-grade systems such as policy administration platforms or regulated customer portals.

Don't meet every single requirement? If you're excited about this role but your past experience doesn't align perfectly with every qualification in the job description, we encourage you to apply anyway. At Ethos we are dedicated to building a diverse, inclusive and authentic workplace. We are an equal opportunity employer who values diversity and inclusion and looks for applicants who understand, embrace and thrive in a multicultural world. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status. Pursuant to the SF Fair Chance Ordinance, we will consider employment for qualified applicants with arrests and conviction records. To learn more about what information we collect and how it may be used, please refer to our California Candidate Privacy Notice.

Posted 2 weeks ago

Apply

15.0 - 19.0 years

0 Lacs

Hyderabad, Telangana

On-site

About Mobius:
Mobius is an AI-native platform that surpasses current AI products by blending neural networks, symbolic reasoning, graph intelligence, and autonomous agent coordination into a cohesive digital ecosystem. It represents the next phase of cloud-native AI platforms, specifically engineered to construct, oversee, and enhance intelligent software automatically. Mobius is the convergence point of data and reasoning, automation and intelligence, where software learns to self-construct.

The Role:
As a key figure, you will steer the architectural strategy of Mobius's core orchestration and infrastructure layer. This layer is crucial for driving all automation, workflow execution, and backend intelligence, shaping how digital systems are established, tested, deployed, and refined over time. Your primary responsibility is to design an AI-centric orchestration layer that is modular, scalable, and autonomous.

What You'll Own:
- Design and oversee orchestration layers for various components including business process automation (BPMN, API orchestration), machine learning pipelines (MLFlow, Kubeflow), large language model workflows (LLMOps), DevSecOps automation (CI/CD, ArgoCD), data workflows (ETL pipelines using Airflow/SeaTunnel), distributed databases (NoSQL, Graph, Vector, RDBMS), and governance systems (identity, access, compliance workflows).
- Establish a unified abstraction layer for AI-driven workflow composition.
- Ensure runtime safety, observability, and dynamic scalability.
- Enable real-time graph-based reconfiguration of infrastructure.
- Collaborate with AI, data, and product teams to bolster intelligent automation.

What We're Looking For:
- A minimum of 15 years of experience in cloud-native architecture, infrastructure automation, or workflow orchestration.
- Proficiency in orchestrating large-scale systems involving ML, APIs, data, and software delivery.
- In-depth knowledge of Kubernetes, container systems, service mesh, and CI/CD frameworks.
- Familiarity with BPMN, workflow engines such as Camunda and Argo, and process modeling tools.
- Strong grasp of distributed systems, observability stacks, and runtime graph engines.
- Ability to model and execute dynamic workflows based on declarative specifications.

Bonus Points:
- Prior involvement in designing orchestration platforms for AI/ML agents.
- Publication of work or patents related to system design, process automation, or software-defined infrastructure.
- Understanding of decision modeling, agent behavior, or adaptive workflows.

Mobius is in search of architects for the future, not mere builders. If you have envisioned software that can think, adapt, and evolve, then you have imagined Mobius.
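
A compact, standard-library sketch of the "dynamic workflows from declarative specifications" idea mentioned above: resolve a dependency graph and run each step in topological order. The step names and callables are hypothetical, and a real orchestrator would add retries, state, and observability.

```python
# Execute a declarative workflow spec by topological order of its dependencies.
from graphlib import TopologicalSorter
from typing import Callable

def run_workflow(spec: dict[str, dict], registry: dict[str, Callable[[], None]]) -> None:
    """spec maps step name -> {'depends_on': [...]}; registry maps name -> callable."""
    graph = {name: set(cfg.get("depends_on", [])) for name, cfg in spec.items()}
    for step in TopologicalSorter(graph).static_order():
        print(f"running {step}")
        registry[step]()

if __name__ == "__main__":
    spec = {
        "ingest":    {"depends_on": []},
        "transform": {"depends_on": ["ingest"]},
        "validate":  {"depends_on": ["ingest"]},
        "publish":   {"depends_on": ["transform", "validate"]},
    }
    registry = {name: (lambda n=name: None) for name in spec}  # no-op steps for the demo
    run_workflow(spec, registry)   # ingest -> transform/validate -> publish
```
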

Posted 2 weeks ago

Apply

5.0 - 9.0 years

12 - 15 Lacs

Kolkata, Hyderabad, Pune

Hybrid

Job Description:
We are seeking an experienced and hands-on Data Engineer with strong expertise in AWS and Databricks to join our growing data engineering team. The ideal candidate will be passionate about building scalable data pipelines and delivering high-quality data solutions.

Responsibilities:
Design, develop, and maintain scalable data pipelines and ETL processes using Databricks and AWS services
Hands-on development using Python, Scala, and SQL
Orchestrate workflows and manage job scheduling
Optimize and monitor Databricks jobs for performance and reliability
Work with large-scale datasets and implement best practices for data ingestion and processing
Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions
Root cause analysis and performance tuning
Document data engineering processes and developer guides
Work in an Agile environment using tools like JIRA
Batch ETL experience is essential; real-time pipeline experience is a plus
Exposure to Google Cloud Platform (GCP) is a plus

Required Skills & Experience:
3+ years of hands-on experience with Databricks
5+ years of experience with AWS data engineering services
Strong proficiency in Python and Scala
Experience with ETL development and SQL
Knowledge of workflow orchestration tools
Experience with monitoring, tuning, and debugging Databricks jobs
Ability to perform root cause analysis and improve data quality/processes
Experience working in Agile teams and using JIRA
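
As a hedged example of orchestrating and monitoring Databricks jobs from Python, the sketch below triggers a job and polls it to completion over the Jobs REST API (the paths follow the commonly documented 2.1 Jobs API; the workspace URL, token, and job id are placeholders, so verify the endpoints against your workspace documentation).

```python
# Trigger a Databricks job run and poll until it reaches a terminal state.
import time
import requests

HOST = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder workspace URL
TOKEN = "dapiXXXXXXXXXXXXXXXX"                                # placeholder access token
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

def run_job_and_wait(job_id: int, poll_seconds: int = 30) -> str:
    """Start a job run, then poll the run status until it finishes."""
    run = requests.post(f"{HOST}/api/2.1/jobs/run-now",
                        headers=HEADERS, json={"job_id": job_id}, timeout=30)
    run.raise_for_status()
    run_id = run.json()["run_id"]

    while True:
        status = requests.get(f"{HOST}/api/2.1/jobs/runs/get",
                              headers=HEADERS, params={"run_id": run_id}, timeout=30)
        status.raise_for_status()
        state = status.json()["state"]
        if state.get("life_cycle_state") in {"TERMINATED", "SKIPPED", "INTERNAL_ERROR"}:
            return state.get("result_state", state["life_cycle_state"])
        time.sleep(poll_seconds)

# Example: print(run_job_and_wait(job_id=1042))   # hypothetical job id
```
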

Posted 2 weeks ago

Apply

3.0 - 5.0 years

5 - 9 Lacs

New Delhi, Ahmedabad, Bengaluru

Work from Office

We are seeking a skilled Big Data Developer with 3+ years of experience to develop, maintain, and optimize large-scale data pipelines using frameworks like Spark, PySpark, and Airflow. The role involves working with SQL, Impala, Hive, and PL/SQL for advanced data transformations and analytics, designing scalable data storage systems, and integrating structured and unstructured data using tools like Sqoop. The ideal candidate will collaborate with cross-functional teams to implement data warehousing strategies and leverage BI tools for insights. Proficiency in Python programming, workflow orchestration with Airflow, and Unix/Linux environments is essential. Location: Remote, Delhi/NCR, Bangalore/Bengaluru, Hyderabad/Secunderabad, Chennai, Pune, Kolkata, Ahmedabad, Mumbai
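
A minimal Spark-on-Hive sketch of the transformation work this posting describes: read a Hive table with Spark SQL, aggregate, and write the result back as a partitioned table. Database and table names are hypothetical.

```python
# Aggregate a Hive table with Spark SQL and save a partitioned summary table.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("orders_daily_summary")
         .enableHiveSupport()      # lets Spark SQL see the Hive metastore
         .getOrCreate())

daily_summary = spark.sql("""
    SELECT order_date,
           region,
           COUNT(*)          AS order_count,
           SUM(order_amount) AS total_amount
    FROM   sales_db.orders            -- hypothetical Hive table
    WHERE  order_date = '2024-01-01'
    GROUP  BY order_date, region
""")

(daily_summary.write
 .mode("overwrite")
 .partitionBy("order_date")
 .saveAsTable("sales_db.orders_daily_summary"))   # hypothetical target table

spark.stop()
```
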

Posted 2 weeks ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Mumbai, New Delhi, Bengaluru

Work from Office

We are seeking a skilled Big Data Developer with 3+ years of experience to develop, maintain, and optimize large-scale data pipelines using frameworks like Spark, PySpark, and Airflow. The role involves working with SQL, Impala, Hive, and PL/SQL for advanced data transformations and analytics, designing scalable data storage systems, and integrating structured and unstructured data using tools like Sqoop. The ideal candidate will collaborate with cross-functional teams to implement data warehousing strategies and leverage BI tools for insights. Proficiency in Python programming, workflow orchestration with Airflow, and Unix/Linux environments is essential. Locations: Mumbai, Delhi/NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote

Posted 2 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

Haryana

On-site

As a Lead Platform Engineer at Bain & Company, you will have the opportunity to design and build cloud-based distributed systems that address complex business challenges for some of the world's largest companies. Leveraging your expertise in software engineering, cloud engineering, and DevOps, you will play a crucial role in creating technology stacks and platform components that empower cross-functional AI Engineering teams to develop robust, observable, and scalable solutions. Joining a diverse and globally distributed engineering team, you will be involved in the complete engineering life cycle, encompassing design, development, optimization, and deployment of solutions and infrastructure at a scale that matches the requirements of leading global companies.

Your core responsibilities will include:
- Architecting cloud solutions and distributed systems for full-stack AI software and data solutions
- Implementing, testing, and managing Infrastructure as Code (IaC) for cloud-based solutions, covering CI/CD, data integrations, APIs, web and mobile apps, and AI solutions
- Defining and implementing scalable, observable, manageable, and self-healing cloud-based solutions across AWS, Google Cloud, and Azure
- Collaborating with diverse teams, including product managers, data scientists, and engineers, to implement analytics and AI features that align with business needs
- Utilizing Kubernetes and containerization technologies to deploy and scale analytics applications in cloud environments
- Developing APIs and microservices to expose analytics functionality to internal and external users while upholding best practices for API design and documentation
- Ensuring robust security measures are in place to safeguard sensitive data and uphold data privacy regulations
- Monitoring and troubleshooting application performance to maintain system reliability, latency, and user experience
- Contributing to code reviews and enforcing coding standards and best practices to ensure high-quality, maintainable code
- Staying updated on emerging trends in cloud computing, data analytics, and software engineering to enhance the analytics platform's capabilities
- Collaborating with business consulting teams to assess opportunities and develop analytics solutions for clients across various sectors
- Supporting and enhancing clients' analytics application engineering capabilities through education and direct assistance

To be successful in this role, you should possess:
- A Master's degree in Computer Science, Engineering, or a related technical field
- At least 6 years of experience with a minimum of 3 years at a Staff level or equivalent
- Proven experience as a cloud engineer and software engineer in product engineering or professional services organizations
- Experience in designing and delivering cloud-based distributed solutions, with certifications in GCP, AWS, or Azure being advantageous
- Proficiency in building infrastructure as code using tools like Terraform, CloudFormation, or others
- Familiarity with the software development lifecycle, configuration management tools, monitoring and analytics platforms, CI/CD deployment pipelines, backend APIs, Kubernetes, Git, and workflow orchestration
- Strong interpersonal and communication skills, along with critical thinking and a proactive mindset
- A solid foundation in computer science fundamentals, data architecture, database design, and agile development methodologies

By joining Bain & Company, a global consultancy dedicated to assisting ambitious change makers in shaping the future, you will collaborate with clients across the world to achieve extraordinary results and redefine industries. With a commitment to investing in pro bono services and a focus on environmental, social, and ethical performance, Bain & Company offers a unique opportunity to make a meaningful impact in today's rapidly evolving landscape.

Posted 2 weeks ago

Apply

6.0 - 10.0 years

15 - 20 Lacs

Bengaluru

Work from Office

(15 Days or Immediate Joiner)

About the Role
We are seeking a Python & SQL Software Engineer with strong hands-on expertise in developing and deploying automated tools. This role is ideal for someone who combines programming skills with problem-solving ability and can build scalable automation solutions to streamline business processes.

Key Responsibilities
Design, develop, and maintain automated tools using Python and SQL. Work across the end-to-end software engineering lifecycle: design, development, testing, and deployment. Collaborate with stakeholders to translate business requirements into technical specifications and automation workflows. Optimize existing processes through automation and workflow orchestration. Deploy solutions in production environments, ensuring performance, scalability, and maintainability. Participate in data transformation, migration, and integration projects. Ensure robust data quality and integrity in all automation pipelines. Stay updated with new tools, frameworks, and technologies in automation and analytics.

Requirements
Proven hands-on experience in Python and SQL development. Strong expertise in automation, tool creation, and deployment. Banking domain knowledge is strongly preferred. Familiarity with software development best practices (version control, testing, CI/CD). Strong analytical and problem-solving skills. Ability to work independently as well as in cross-functional teams. Excellent communication skills for stakeholder collaboration.

What's on Offer
Competitive salary and benefits. Opportunity to be part of a fast-paced, high-growth environment. Ownership of initiatives with clear responsibilities. Continuous coaching and mentoring from senior data and AI experts. A dynamic, collaborative, and respectful work environment.

Education
Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field. A Master's degree (M.Tech/MCA/M.Sc. in Computer Science/IT/Analytics) is a plus.
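
As a self-contained illustration of the Python + SQL automation this role centres on, the sketch below uses SQLite as a stand-in for the real database to reconcile transactions between a source system and a ledger; the table names are hypothetical.

```python
# Reconciliation check: find transactions present in the source but missing from the ledger.
import sqlite3

def missing_from_ledger(conn: sqlite3.Connection) -> list[tuple]:
    return conn.execute("""
        SELECT s.txn_id, s.amount
        FROM   source_txns s
        LEFT   JOIN ledger_txns l ON l.txn_id = s.txn_id
        WHERE  l.txn_id IS NULL
    """).fetchall()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE source_txns (txn_id TEXT, amount REAL);
        CREATE TABLE ledger_txns (txn_id TEXT, amount REAL);
        INSERT INTO source_txns VALUES ('T1', 100.0), ('T2', 250.0);
        INSERT INTO ledger_txns VALUES ('T1', 100.0);
    """)
    print(missing_from_ledger(conn))   # [('T2', 250.0)]
```
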

Posted 3 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

Karnataka

On-site

As a Founding Tech Lead, you will report directly to the CEO and be responsible for owning the technical vision and execution across the platform. Your daily tasks will involve writing code in Python for services, agent orchestration, and production-grade systems. It will be your role to drive architectural decisions, establish engineering standards, and oversee the growth of the engineering team from 3 engineers to 10 or more. Additionally, you will be expected to translate healthcare revenue cycle requirements into clear technical solutions and roadmap priorities. Leading the hiring process for critical technical roles, setting the engineering culture, and building scalable processes will also fall under your responsibilities. Taking end-to-end accountability for system reliability, performance, and compliance in revenue-critical environments will be key to your success.

You will face various technical challenges in this role, including building reliable agent orchestration using Temporal and agentic frameworks for healthcare billing, insurance claims, and optimization. Designing systems for high uptime, ensuring HIPAA compliance, and navigating regulated environments will be crucial. As the Founding Tech Lead, you will have full-stack ownership, encompassing browser/voice agents, Python backends, JS frontends, and AWS infrastructure with Terraform. Your role will involve shaping the entire stack from day one and architecting for scalability from dozens of clinics to potentially thousands, adapting to increasing complexity and volume.

To be successful in this position, you should have a minimum of 5-7 years of experience building production systems with real users. Prior experience as a founder or in a strong early-stage startup environment (pre-Series A) is preferred. You should have a track record of building at least one independent project in AI/automation and possess strong fundamentals in distributed systems, databases, and cloud infrastructure. Expertise in Python, the AWS ecosystem, and workflow orchestration tools is essential, as well as experience in hiring and managing top engineering talent. The ability to work comfortably in healthcare and fintech-level regulated environments is a must. Experience with Temporal, agentic AI frameworks, or healthcare technology would be considered a bonus for this role.
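
A hedged sketch of the Temporal-based orchestration this posting mentions, using the open-source Temporal Python SDK (temporalio): a durable claim-lifecycle workflow with two activities. The names are hypothetical, and a real deployment also needs a Worker and Client connected to a Temporal server.

```python
# Durable claim-lifecycle orchestration with the Temporal Python SDK.
from datetime import timedelta
from temporalio import activity, workflow

@activity.defn
async def submit_insurance_claim(claim_id: str) -> str:
    # Call the clearinghouse / payer API here; Temporal retries this on failure.
    return f"submitted:{claim_id}"

@activity.defn
async def post_payment(claim_id: str) -> str:
    return f"posted:{claim_id}"

@workflow.defn
class ClaimLifecycleWorkflow:
    @workflow.run
    async def run(self, claim_id: str) -> str:
        await workflow.execute_activity(
            submit_insurance_claim, claim_id,
            start_to_close_timeout=timedelta(minutes=5),
        )
        # Workflow state is durable in Temporal, so a crash here resumes safely.
        return await workflow.execute_activity(
            post_payment, claim_id,
            start_to_close_timeout=timedelta(minutes=5),
        )
```
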

Posted 3 weeks ago

Apply

3.0 - 12.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job Title: SDE-2/3
Location: Mumbai/Bangalore/Hyderabad
Experience range: 3 to 12 years

What we offer:
Our mission is simple - Building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to come join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

What we ask for:
Design, develop and implement software solutions. Solve business problems through innovation and engineering practices. Be involved in all aspects of the Software Development Lifecycle (SDLC), including analyzing requirements, incorporating architectural standards into application design specifications, documenting application specifications, translating technical requirements into programmed application modules, and developing or enhancing software application modules. Identify and troubleshoot application code-related issues. Take an active role in code reviews to ensure solutions are aligned to pre-defined architectural specifications. Assist with design reviews by recommending ways to incorporate requirements into designs and information or data flows. Participate in project planning sessions with project managers, business analysts, and team members to analyze business requirements and outline proposed solutions.

Qualifications:
Strong technical background in Java, J2EE or Python and the Spring stack. Well versed with OOP concepts and design patterns. Good understanding of data structures and algorithms. Strong experience with database systems, both RDBMS (PostgreSQL, Oracle, etc.) and NoSQL (Dynamo, MongoDB, etc.). Experience in building microservices and knowledge of workflow orchestration with Camunda or Temporal. Knowledge of Docker and containerization. Good experience using messaging platforms like Kafka, RabbitMQ, etc. Knowledge of CI/CD pipelines and DevOps tools. Knowledge of cloud services such as AWS or Azure. Familiarity with Domain Driven Design. Passionate about and deeply knowledgeable in agile and Kanban processes. Able to communicate effectively with stakeholders. Able to manage scope, timelines, quality, goals and deliverables that support the business. Good communication skills. Prior work experience in product engineering/development. Good to have: prior experience in the Indian banking segment and/or fintech.

Education background:
Bachelor's degree in Computer Science, Information Technology or a related field of study.
Good to have certifications: Java Certified Developer, AWS Developer or Solution Architect.

Experience range required: 3-12 years
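
For illustration, a minimal Python sketch of the Kafka messaging pattern listed in the qualifications, publishing a domain event with the kafka-python client; the broker address, topic, and event shape are placeholders.

```python
# Publish a domain event to Kafka from a Python microservice.
import json
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers=["localhost:9092"],                       # placeholder broker
    value_serializer=lambda event: json.dumps(event).encode(),  # dict -> JSON bytes
)

def publish_payment_initiated(payment_id: str, amount: float) -> None:
    event = {"type": "PaymentInitiated", "payment_id": payment_id, "amount": amount}
    producer.send("payments.events", value=event)   # placeholder topic name
    producer.flush()                                # block until the broker acks

if __name__ == "__main__":
    publish_payment_initiated("PAY-1001", 2500.0)
```
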

Posted 3 weeks ago

Apply

3.0 - 12.0 years

0 Lacs

bengaluru, karnataka, india

On-site

Job Title: SDE-2/3 Location: Mumbai/Bangalore/Hyderabad Experience range: 3 to 12 years What we offer: Our mission is simple - Building trust. Our customer's trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why, we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to come join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many. What we ask for: Design, develop and implement software solutions. Solve business problems through innovation and engineering practices. Involved in all aspects of the Software Development Lifecycle (SDLC) including analyzing requirements, incorporating architectural standards into application design specifications, documenting application specifications, translating technical requirements into programmed application modules, and developing or enhancing software application modules. Identify or troubleshoot application code-related issues. Take active role in code reviews to ensure solutions are aligned to pre- defined architectural specifications. Assist with design reviews by recommending ways to incorporate requirements into designs and information or data flows. Participate in project planning sessions with project managers, business analysts, and team members to analyze business requirements and outline proposed solutions. Qualifications: Should have strong technical background in JAVA, J2EE or Python, Spring stack Well versed with OOP's concept and design patterns Good understanding of data structure and algorithms Strong experience with Database systems like RDBMS (PostgreSQL, Oracle etc.) and NoSQL (Dynamo, MongoDB etc.) Experience in building Microservices and knowledge of workflow orchestration with Camunda or Temporal Knowledge of docker and containerization. Should have good experience in using messaging platforms like Kafka, RabbitMQ, etc. Knowledge in CI/CD Pipeline and Dev Ops tools Knowledge in Cloud Services such as AWS or Azure Should be familiar with Domain Driven Design Passionate and having depth knowledge in agile, Kanban process Should be able to communicate effectively with stakeholders Manage scope, timelines, quality, goals and deliverables that supports business Good communications skills Prior work experience in the product engineering/development. Good to have prior experience in Indian Banking segment and/or Fintech. Education background: Bachelor's degree in Computer Science, Information Technology or related field of study Good to have Certifications Java Certified Developer AWS Developer or Solution Architect Experience range required 3-12 Years

Posted 3 weeks ago

Apply

3.0 - 12.0 years

0 Lacs

bengaluru, karnataka, india

On-site

Job Title: SDE-2/3 Location: Mumbai/Bangalore/Hyderabad Experience range: 3 to 12 years What we offer: Our mission is simple - Building trust. Our customer's trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why, we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to come join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many. What we ask for: Design, develop and implement software solutions. Solve business problems through innovation and engineering practices. Involved in all aspects of the Software Development Lifecycle (SDLC) including analyzing requirements, incorporating architectural standards into application design specifications, documenting application specifications, translating technical requirements into programmed application modules, and developing or enhancing software application modules. Identify or troubleshoot application code-related issues. Take active role in code reviews to ensure solutions are aligned to pre- defined architectural specifications. Assist with design reviews by recommending ways to incorporate requirements into designs and information or data flows. Participate in project planning sessions with project managers, business analysts, and team members to analyze business requirements and outline proposed solutions. Qualifications: Should have strong technical background in JAVA, J2EE or Python, Spring stack Well versed with OOP's concept and design patterns Good understanding of data structure and algorithms Strong experience with Database systems like RDBMS (PostgreSQL, Oracle etc.) and NoSQL (Dynamo, MongoDB etc.) Experience in building Microservices and knowledge of workflow orchestration with Camunda or Temporal Knowledge of docker and containerization. Should have good experience in using messaging platforms like Kafka, RabbitMQ, etc. Knowledge in CI/CD Pipeline and Dev Ops tools Knowledge in Cloud Services such as AWS or Azure Should be familiar with Domain Driven Design Passionate and having depth knowledge in agile, Kanban process Should be able to communicate effectively with stakeholders Manage scope, timelines, quality, goals and deliverables that supports business Good communications skills Prior work experience in the product engineering/development. Good to have prior experience in Indian Banking segment and/or Fintech. Education background: Bachelor's degree in Computer Science, Information Technology or related field of study Good to have Certifications Java Certified Developer AWS Developer or Solution Architect Experience range required 3-12 Years

Posted 3 weeks ago

Apply

3.0 - 12.0 years

0 Lacs

bengaluru, karnataka, india

On-site

Job Title: SDE-2/3 Location: Mumbai/Bangalore/Hyderabad Experience range: 3 to 12 years What we offer: Our mission is simple - Building trust. Our customer's trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why, we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to come join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many. What we ask for: Design, develop and implement software solutions. Solve business problems through innovation and engineering practices. Involved in all aspects of the Software Development Lifecycle (SDLC) including analyzing requirements, incorporating architectural standards into application design specifications, documenting application specifications, translating technical requirements into programmed application modules, and developing or enhancing software application modules. Identify or troubleshoot application code-related issues. Take active role in code reviews to ensure solutions are aligned to pre- defined architectural specifications. Assist with design reviews by recommending ways to incorporate requirements into designs and information or data flows. Participate in project planning sessions with project managers, business analysts, and team members to analyze business requirements and outline proposed solutions. Qualifications: Should have strong technical background in JAVA, J2EE or Python, Spring stack Well versed with OOP's concept and design patterns Good understanding of data structure and algorithms Strong experience with Database systems like RDBMS (PostgreSQL, Oracle etc.) and NoSQL (Dynamo, MongoDB etc.) Experience in building Microservices and knowledge of workflow orchestration with Camunda or Temporal Knowledge of docker and containerization. Should have good experience in using messaging platforms like Kafka, RabbitMQ, etc. Knowledge in CI/CD Pipeline and Dev Ops tools Knowledge in Cloud Services such as AWS or Azure Should be familiar with Domain Driven Design Passionate and having depth knowledge in agile, Kanban process Should be able to communicate effectively with stakeholders Manage scope, timelines, quality, goals and deliverables that supports business Good communications skills Prior work experience in the product engineering/development. Good to have prior experience in Indian Banking segment and/or Fintech. Education background: Bachelor's degree in Computer Science, Information Technology or related field of study Good to have Certifications Java Certified Developer AWS Developer or Solution Architect Experience range required 3-12 Years

Posted 3 weeks ago

Apply

Page 1 of 4

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies