3.0 - 7.0 years
0 Lacs
Karnataka
On-site
As a Senior Software Engineer specializing in Debezium, Snowflake, Business Objects, Power BI, Java/Python, and SQL, with 3 to 6 years of experience in Software Development/Engineering, you will be a crucial member of our team in either Bangalore or Hyderabad (Position ID: J1124-1679). In this permanent role, your primary responsibility will be the development and maintenance of our applications to ensure they are robust, user-friendly, and scalable. Your key duties and responsibilities will include designing, developing, and maintaining web applications utilizing technologies such as Debezium, Snowflake, Business Objects, Power BI, and Pentaho. You will collaborate with cross-functional teams to define, design, and implement new features, ensuring clean, scalable, and efficient code. Additionally, you will conduct code reviews, perform unit testing and continuous integration, as well as troubleshoot and resolve technical issues promptly. Staying abreast of emerging technologies and industry trends will be essential, and active participation in Agile/Scrum development processes is expected. To excel in this role, you should hold a Bachelor's degree in Computer Science, Information Technology, or a related field, coupled with at least 3 years of experience in full-stack development. Possessing analytical and multitasking skills will be advantageous, along with familiarity with tools like JIRA, Gitlab, and Confluence. Proficiency in database technologies such as SQL, MySQL, PostgreSQL, or NoSQL databases, as well as experience with version control systems like Git, is preferred. Knowledge of cloud services like AWS, Azure, or Google Cloud, understanding CI/CD pipelines, and DevOps practices will be beneficial. Soft skills are paramount in this role, with a strong emphasis on problem-solving, communication, collaboration, and the ability to thrive in a fast-paced, agile environment. 
The successful candidate will exhibit a strong work ethic and a commitment to turning insights into actionable solutions. At CGI, we prioritize ownership, teamwork, respect, and belonging. As a CGI Partner, you will have the opportunity to contribute from day one, shaping our collective success and actively participating in the company's strategy and direction. Your work will be valued and impactful, allowing you to innovate, build relationships, and leverage global capabilities. CGI offers a supportive environment for career growth, health, and well-being, providing opportunities to enhance your skills and broaden your horizons. Join our team at CGI, one of the world's largest IT and business consulting services firms, and embark on a fulfilling career journey with us.
Posted 3 days ago
13.0 - 16.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About Evernorth Evernorth exists to elevate health for all, because we believe health is the starting point for human potential and progress. As champions for affordable, predictable and simple health care, we solve the problems others don't, won't or can't. Our innovation hub in India will allow us to work with the right talent, expand our global footprint, improve our competitive stance, and better deliver on our promises to stakeholders. We are passionate about making healthcare better by delivering world-class solutions that make a real difference. We are always looking upward. And that starts with finding the right talent to help us get there. Software Engineering Senior Manager Position Summary In this role, the Software Engineering Senior Manager will be responsible for building and leading a highly talented team that is focused on using technology, advanced analytics, embedded insights, and product design principles to innovate and deliver modern solutions aligned to strategic initiatives. In this role you will collaborate with business leadership and technology partners to define and execute on a shared vision. You will authentically engage with your team and matrix partners to ensure ongoing alignment and delivery success. You are dedicated to technical excellence for yourself, your team, and the software you deliver. The ideal candidate should have 13-16 years of experience in building and leading high-performing teams in software engineering, preferably in health care or a related industry. Job Description & Responsibilities Grow our engineering team: invest time in developing team members and mentoring; attract, hire, and retain top talent. Provide leadership and management of teams responsible for software development and the introduction of new technologies offshore. Partner with the business units, customers and stakeholders.
Experience leading development using modern software engineering and product development tools including Agile/SAFe, Continuous Integration, Continuous Delivery, etc. Demonstrate leadership in the context of software engineering and be an evangelist for engineering best practices. Stay abreast of leading-edge technologies in the industry. Evaluate emerging software technologies. Work collaboratively with all business areas to assess unmet/new business needs and solutions. Encourage the growth of direct and indirect reports through skills development, objectives, and goal setting. Hold direct reports accountable for meeting performance standards and departmental goals. Mentor staff, measure staff performance and complete regular performance reviews and ranking. Experience Desired 13-16 years of technology experience, with direct experience designing and implementing high-volume multi-tier transactional systems, including: web, large-scale database, workflow, enterprise-scale software, and service-oriented and cloud-based architectures. Proven experience leading/managing technical teams with a passion for developing the talent within the team. Experience with vendor management in an onshore/offshore model. Demonstrated success with delivering software using modern, cloud-based technologies, preferably on AWS. Strong experience with most of the following technologies: Qlik/Kafka, AWS (serverless-related services), Python, SQL stored procedures, Databricks, Java, Debezium. Expertise across relational and NoSQL database platforms including MS SQL, Postgres, DynamoDB and Redshift. Strong grasp of cloud-native architectures, APIs, microservices, and modern DevOps practices (CI/CD, IaC, monitoring). Experience with agile methodology including SCRUM team leadership. Experience with modern delivery practices such as continuous integration, behavior/test-driven development, and specification by example.
Proven experience with architecture, design, and development of large-scale enterprise application solutions. Strong written and verbal communication skills with the ability to interact with all levels of the organization. Strong influencing/negotiation and interpersonal/relationship management skills. Strong time and project management skills. Proven ability to resolve difficult and politically charged issues and mitigate risks that have the potential of undermining the delivery of critical initiatives. Demonstrated leadership in building high-performing teams including the hiring and developing of great people. Proven and accomplished individual with excellent leadership and strategic management skills. Primary Skills / Education Degree in Computer Science, Information Technology, Artificial Intelligence, or a related field. Strong analytical skills to challenge design recommendations. Strong understanding of best practices related to interface and interoperability. Strong understanding of SDLC in an Agile environment. Additional Skills Proven track record of delivering software engineering initiatives, IT-application initiatives, and cross IT/business initiatives. Demonstrated success with delivering products using modern software techniques. Applied expertise in software engineering methodologies including automation and CI/CD. About Evernorth Health Services Evernorth Health Services, a division of The Cigna Group, creates pharmacy, care and benefit solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention and treatment of illness and disease more accessible to millions of people. Join us in driving growth and improving lives.
Posted 3 days ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
We are seeking experienced and talented engineers to join our team. Your main responsibilities will include designing, building, and maintaining the software that drives the global logistics industry. WiseTech Global is a leading provider of software for the logistics sector, facilitating connectivity for major companies like DHL and FedEx within their supply chains. Our organization is product and engineer-focused, with a strong commitment to enhancing the functionality and quality of our software through continuous innovation. Our primary Research and Development center in Bangalore plays a pivotal role in our growth strategies and product development roadmap. As a Lead Software Engineer, you will serve as a mentor, a leader, and an expert in your field. You should be adept at effective communication with senior management while also being hands-on with the code to deliver effective solutions. The technical environment you will work in includes technologies such as C#, Java, C++, Python, Scala, Spring, Spring Boot, Apache Spark, Hadoop, Hive, Delta Lake, Kafka, Debezium, GKE (Kubernetes Engine), Composer (Airflow), DataProc, DataStreams, DataFlow, MySQL RDBMS, MongoDB NoSQL (Atlas), UIPath, Helm, Flyway, Sterling, EDI, Redis, Elastic Search, Grafana Dashboard, and Docker. Before applying, please note that WiseTech Global may engage external service providers to assess applications. By submitting your application and personal information, you agree to WiseTech Global sharing this data with external service providers who will handle it confidentially in compliance with privacy and data protection laws.
Posted 4 days ago
6.0 - 10.0 years
8 - 12 Lacs
Hyderabad
Work from Office
Responsibilities: Design, build, and optimize data pipelines to ingest, process, transform, and load data from various sources into our data platform Implement and maintain ETL workflows using tools like Debezium, Kafka, Airflow, and Jenkins to ensure reliable and timely data processing Develop and optimize SQL and NoSQL database schemas, queries, and stored procedures for efficient data retrieval and processing Work with both relational databases (MySQL, PostgreSQL) and NoSQL databases (MongoDB, DocumentDB) to build scalable data solutions Design and implement data warehouse solutions that support analytical needs and machine learning applications Collaborate with data scientists and ML engineers to prepare data for AI/ML models and implement data-driven features Implement data quality checks, monitoring, and alerting to ensure data accuracy and reliability Optimize query performance across various database systems through indexing, partitioning, and query refactoring Develop and maintain documentation for data models, pipelines, and processes Collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs Stay current with emerging technologies and best practices in data engineering Requirements: 5+ years of experience in data engineering or related roles with a proven track record of building data pipelines and infrastructure Strong proficiency in SQL and experience with relational databases like MySQL and PostgreSQL Hands-on experience with NoSQL databases such as MongoDB or AWS DocumentDB Expertise in designing, implementing, and optimizing ETL processes using tools like Kafka, Debezium, Airflow, or similar technologies Experience with data warehousing concepts and technologies Solid understanding of data modeling principles and best practices for both operational and analytical systems Proven ability to optimize database performance, including query optimization, indexing strategies, and database tuning Experience with AWS data services such as RDS, Redshift, S3, Glue, Kinesis, and ELK stack Proficiency in at least one programming language (Python, Node.js, Java) Experience with version control systems (Git) and CI/CD pipelines Preferred: Experience with graph databases (Neo4j, Amazon Neptune) Knowledge of big data technologies such as Hadoop, Spark, Hive, and data lake architectures Experience working with streaming data technologies and real-time data processing Familiarity with data governance and data security best practices Experience with containerization technologies (Docker, Kubernetes) Understanding of financial back-office operations and FinTech domain Experience working in a high-growth startup environment
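The CDC pipelines described above ultimately apply Debezium change events to a target store. A minimal sketch of that step, assuming the standard Debezium envelope fields (`op`, `before`, `after`); the in-memory target and the sample events are illustrative, not part of any real pipeline:

```python
import json

def apply_change_event(event_json: str, table: dict) -> None:
    """Apply one Debezium-style change event to an in-memory table keyed by 'id'.
    'c' (create), 'r' (snapshot read), and 'u' (update) upsert the 'after' image;
    'd' (delete) removes the row identified by the 'before' image."""
    event = json.loads(event_json)
    op = event["op"]
    if op in ("c", "r", "u"):
        row = event["after"]
        table[row["id"]] = row
    elif op == "d":
        table.pop(event["before"]["id"], None)

# Hypothetical events for illustration: insert, update, then delete of row 1.
target = {}
apply_change_event('{"op": "c", "before": null, "after": {"id": 1, "name": "a"}}', target)
apply_change_event('{"op": "u", "before": {"id": 1, "name": "a"}, "after": {"id": 1, "name": "b"}}', target)
apply_change_event('{"op": "d", "before": {"id": 1, "name": "b"}, "after": null}', target)
print(target)  # → {}
```

In a real deployment the events would arrive from Kafka topics and the target would be a warehouse table with idempotent upserts, but the op-code dispatch is the same.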
Posted 1 week ago
5.0 - 10.0 years
12 - 22 Lacs
Pune
Remote
Mandate skillsets: Debezium, Oracle connector (LogMiner), Kafka. Review the current Debezium deployment architecture, including Oracle connector configuration, Kafka integration, and downstream consumers. Analyze Oracle database setup for CDC compatibility (e.g., redo log configuration, supplemental logging, privileges). Evaluate connector performance, lag, and error handling mechanisms. Identify bottlenecks, misconfigurations, or anti-patterns in the current implementation. Provide a detailed report with findings, best practices, and actionable recommendations. Optionally, support implementation of recommended changes and performance tuning. Experience: Strong hands-on experience with Debezium, especially the Oracle connector (LogMiner). Deep understanding of Oracle internals relevant to CDC: redo logs, SCNs, archive log mode, supplemental logging. Proficiency with Apache Kafka and Kafka ecosystem tools. Experience with monitoring and debugging Debezium connectors in production environments. Ability to analyze logs, metrics, and connector configurations to identify root causes of issues. Strong documentation and communication skills for delivering technical assessments.
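An assessment like the one above can start from the connector configuration itself. The sketch below is illustrative, not a reference setup: property names follow recent Debezium Oracle connector releases, the host/topic values are placeholders, and the `review` checks are just examples of findings such an audit might automate:

```python
# Placeholder Debezium Oracle (LogMiner) connector configuration for illustration.
connector_config = {
    "connector.class": "io.debezium.connector.oracle.OracleConnector",
    "database.hostname": "oracle-host",          # placeholder host
    "database.port": "1521",
    "database.user": "c##dbzuser",               # common CDB-user convention
    "database.password": "<secret>",
    "database.dbname": "ORCLCDB",
    "topic.prefix": "oracle-src",
    "database.connection.adapter": "logminer",   # LogMiner-based capture
    "log.mining.strategy": "online_catalog",     # avoids per-DDL dictionary writes
    "table.include.list": "APP.ORDERS",
    "schema.history.internal.kafka.bootstrap.servers": "kafka:9092",
    "schema.history.internal.kafka.topic": "schema-history.oracle-src",
}

def review(config: dict) -> list:
    """Flag a few common misconfigurations seen in Oracle CDC assessments."""
    findings = []
    if config.get("database.connection.adapter") != "logminer":
        findings.append("not using the LogMiner adapter")
    if config.get("log.mining.strategy") != "online_catalog":
        findings.append("mining strategy may add DDL-related redo overhead")
    if not config.get("table.include.list"):
        findings.append("no table.include.list: connector will capture broadly")
    return findings

print(review(connector_config))  # → []
```

Database-side prerequisites (archive log mode, supplemental logging, user privileges) still have to be verified in Oracle itself; the config review only covers the connector half.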
Posted 3 weeks ago
7.0 - 12.0 years
12 - 18 Lacs
Pune, Chennai
Work from Office
Key Responsibilities: Implement Confluent Kafka-based CDC solutions to support real-time data movement across banking systems. Implement event-driven and microservices-based data solutions for enhanced scalability, resilience, and performance. Integrate CDC pipelines with core banking applications, databases, and enterprise systems. Ensure data consistency, integrity, and security, adhering to banking compliance standards (e.g., GDPR, PCI-DSS). Lead the adoption of Kafka Connect, Kafka Streams, and Schema Registry for real-time data processing. Optimize data replication, transformation, and enrichment using CDC tools like Debezium, GoldenGate, or Qlik Replicate. Collaborate with the infra team, data engineers, DevOps teams, and business stakeholders to align data streaming capabilities with business objectives. Provide technical leadership in troubleshooting, performance tuning, and capacity planning for CDC architectures. Stay updated with emerging technologies and drive innovation in real-time banking data solutions. Required Skills & Qualifications: Extensive experience in Confluent Kafka and Change Data Capture (CDC) solutions. Strong expertise in Kafka Connect, Kafka Streams, and Schema Registry. Hands-on experience with CDC tools such as Debezium, Oracle GoldenGate, or Qlik Replicate. Hands-on experience with IBM Analytics. Solid understanding of core banking systems, transactional databases, and financial data flows. Knowledge of cloud-based Kafka implementations (AWS MSK, Azure Event Hubs, or Confluent Cloud). Proficiency in SQL and NoSQL databases (e.g., Oracle, MySQL, PostgreSQL, MongoDB) with CDC configurations. Strong experience in event-driven architectures, microservices, and API integrations. Familiarity with security protocols, compliance, and data governance in banking environments. Excellent problem-solving, leadership, and stakeholder communication skills.
Posted 1 month ago
3.0 - 6.0 years
20 - 30 Lacs
Bengaluru
Work from Office
Job Title: Data Engineer II (Python, SQL) Experience: 3 to 6 years Location: Bangalore, Karnataka (Work from office, 5 days a week) Role: Data Engineer II (Python, SQL) As a Data Engineer II, you will work on designing, building, and maintaining scalable data pipelines. You'll collaborate across data analytics, marketing, data science, and product teams to drive insights and AI/ML integration using robust and efficient data infrastructure. Key Responsibilities: Design, develop and maintain end-to-end data pipelines (ETL/ELT). Ingest, clean, transform, and curate data for analytics and ML usage. Work with orchestration tools like Airflow to schedule and manage workflows. Implement data extraction using batch, CDC, and real-time tools (e.g., Debezium, Kafka Connect). Build data models and enable real-time and batch processing using Spark and AWS services. Collaborate with DevOps and architects for system scalability and performance. Optimize Redshift-based data solutions for performance and reliability. Must-Have Skills & Experience: 3+ years in Data Engineering or Data Science with strong ETL and pipeline experience. Expertise in Python and SQL. Strong experience in Data Warehousing, Data Lakes, Data Modeling, and Ingestion. Working knowledge of Airflow or similar orchestration tools. Hands-on with data extraction techniques like CDC and batch, using Debezium, Kafka Connect, AWS DMS. Experience with AWS services: Glue, Redshift, Lambda, EMR, Athena, MWAA, SQS, etc. Knowledge of Spark or similar distributed systems. Experience with queuing/messaging systems like SQS, Kinesis, RabbitMQ.
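The extract/transform/load responsibilities above can be sketched end-to-end in miniature. SQLite stands in for the warehouse here, and the source rows, table name, and cleaning rules are made-up examples, not any particular pipeline:

```python
import sqlite3

def extract(rows):
    """Extract: yield raw source records (an in-memory list stands in for a source system)."""
    yield from rows

def transform(records):
    """Transform: normalize names and drop rows with a missing amount."""
    for r in records:
        if r.get("amount") is None:
            continue
        yield (r["id"], r["name"].strip().lower(), float(r["amount"]))

def load(conn, rows):
    """Load: idempotent upsert into the target table, so reruns don't duplicate rows."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (id INTEGER PRIMARY KEY, name TEXT, amount REAL)"
    )
    conn.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

source = [
    {"id": 1, "name": " Acme ", "amount": "10.5"},
    {"id": 2, "name": "Beta", "amount": None},   # dropped by transform
]
conn = sqlite3.connect(":memory:")
load(conn, transform(extract(source)))
print(conn.execute("SELECT * FROM orders").fetchall())  # → [(1, 'acme', 10.5)]
```

In production the same shape appears with an orchestrator such as Airflow scheduling each stage and a warehouse like Redshift as the load target; the idempotent-upsert pattern is what makes safe retries possible.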
Posted 1 month ago
8.0 - 13.0 years
25 - 40 Lacs
Pune
Work from Office
What You'll Do Job Description: You will provide 24/7 administrative support (on-premises and Atlas Cloud) on MongoDB clusters, Postgres & Snowflake. Provide support for on-premises and Confluent Cloud Kafka clusters. You will review database designs to ensure all technical and our requirements are met. Perform database optimization and testing to ensure service level agreements are met. You will provide support during system implementation and in production. Provide support for Snowflake administrative tasks (data pipelines, object creation, access). Participate in weekday and weekend on-call rotation to support products running on Mongo, SQL, Kafka & Snowflake, and other RDBMS systems. This role does not have any managerial responsibilities; it is an individual contributor role. You will report to the Sr. Manager, Reliability Engineering. What Your Responsibilities Will Be 8+ years of experience in managing MongoDB on-premises and on Atlas Cloud. Be a part of the database team in developing next-generation database systems. Provide services in administration and performance monitoring of database-related systems. Develop system administration standards and procedures to maintain practices. Support backup and recovery strategies. Participate in the creative process of improving architectural designs and implementing new architectures. Expertise in delivering efficiency and cost effectiveness. Monitor and support capacity planning and analysis. Monitor performance, troubleshoot issues and proactively tune databases and workloads. Sound knowledge of Terraform and Grafana; manage infrastructure as code using Terraform & GitLab. Ability to work remotely. What You'll Need to be Successful Working knowledge of MongoDB (6.0 or above). Experience with sharding and replica sets. Working knowledge of database installation, setup, creation, and maintenance processes. Working knowledge of Change Streams and Mongo ETLs to replicate live changes to downstream analytics systems. Experience running MongoDB in a containerized environment (EKS clusters). Support reliability engineering tasks for all other database platforms (SQL, MySQL, Postgres, Snowflake, Kafka). Experience with Cloud or Ops Manager (a plus). Understanding of networking components on AWS and GCP cloud. Technical knowledge of backup/recovery, disaster recovery, and high availability techniques. Strong technical knowledge in writing shell scripts used to support database administration. Good understanding of Kafka and Snowflake administration. Good understanding of Debezium, Kafka, Zookeeper and Snowflake is a plus. Automate routine database tasks independently with shell, Python and other languages.
Posted 1 month ago
5.0 - 10.0 years
9 - 19 Lacs
Pune, Bengaluru, Delhi / NCR
Hybrid
Skills: Debezium, SQL, Apache Kafka. Exp: 5-11 yrs. Location: PAN India
Posted 1 month ago
5.0 - 8.0 years
2 - 3 Lacs
Delhi, India
On-site
Responsible for building and maintaining cloud-based database systems of high availability and quality, depending on client requirements. Responsible for data migration from various systems to RDBMS (PostgreSQL/MySQL). Responsible for writing ETL packages using Kafka/Debezium. Design and implement cloud-based databases in accordance with client requirements, information needs, and views. Define users and enable data distribution to the right user, in an appropriate format and in a timely manner. Use high-speed transaction recovery techniques and back up data. Minimize database downtime and manage parameters to provide fast query responses. Provide proactive and reactive data management support and training to users whenever required. Monitor database performance, implement changes, and apply new patches and versions when required. Determine, enforce, and document database policies, procedures, and standards. Perform tests and evaluations regularly to ensure data security, privacy, and integrity.
Posted 1 month ago
10.0 - 15.0 years
13 - 17 Lacs
Bengaluru
Work from Office
About the Role: We are looking for a Senior Engineering Manager with 10+ years of experience and 2 years of people management experience to help scale and modernize Myntra's data platform. The ideal candidate will have a strong background in building scalable data platforms using a combination of open-source technologies and enterprise solutions. The role demands deep technical expertise in data ingestion, processing, serving, and governance, with a strategic mindset to scale the platform 10x to meet the ever-growing data needs across the organization. This is a high-impact role requiring innovation, engineering excellence and system stability, with an opportunity to contribute to OSS projects and build data products leveraging available data assets. Key Responsibilities: Design and scale Myntra's data platform to support growing data needs across analytics, ML, and reporting. Architect and optimize streaming data ingestion pipelines using Debezium, Kafka (Confluent), Databricks Spark and Flink. Lead improvements in data processing and serving layers, leveraging Databricks Spark, Trino, and Superset. Good understanding of open table formats like Delta and Iceberg. Scale data quality frameworks to ensure data accuracy and reliability. Build data lineage tracking solutions for governance, access control, and compliance. Collaborate with engineering, analytics, and business teams to identify opportunities and build/enhance self-serve data platforms. Improve system stability, monitoring, and observability to ensure high availability of the platform. Work with open-source communities and facilitate contributing to OSS projects aligned with Myntra's tech stack. Implement cost-efficient, scalable architectures for handling 10B+ daily events in a cloud environment. Management Responsibilities: Technical Guidance: This role will play the engineering lead role for teams within Myntra Data Platform.
You will provide technical leadership to a team of excellent data engineers; this requires that you have the technical depth to make complex design decisions and the hands-on ability to lead by example. Execution and Delivery: You will be expected to instill and follow good software development practices and ensure timely delivery of high-quality products. You should be familiar with agile practices as well as be able to adapt these to the needs of the business, with a constant focus on product quality. Team management: You will be responsible for hiring and mentoring your team; helping individuals grow in their careers, having constant dialogue about their aspirations and sharing prompt, clear and actionable feedback about performance. Qualifications: Education: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field. Experience: 10+ years of experience in building large-scale data platforms. 2+ years of people management experience. Expertise in big data architectures using Databricks, Trino, and Debezium. Strong experience with streaming platforms, including Confluent Kafka. Experience in data ingestion, storage, processing, and serving in a cloud-based environment. Experience implementing data quality checks using Great Expectations. Deep understanding of data lineage, metadata management, and governance practices. Strong knowledge of query optimization, cost efficiency, and scaling architectures. Familiarity with OSS contributions and keeping up with industry trends in data engineering. Soft Skills: Strong analytical and problem-solving skills with a pragmatic approach to technical challenges. Excellent communication and collaboration skills to work effectively with cross-functional teams. Ability to lead large-scale projects in a fast-paced, dynamic environment. Passion for continuous learning, open-source collaboration, and building best-in-class data products.
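Scaling a data quality framework, as mentioned above, usually comes down to running declarative checks over each batch and reporting which rows fail. A minimal plain-Python sketch of that idea (this is not the Great Expectations API; the check names and sample batch are illustrative):

```python
def check_not_null(rows, column):
    """Return indices of rows where the column is missing or None."""
    return [i for i, r in enumerate(rows) if r.get(column) is None]

def check_unique(rows, column):
    """Return indices of rows whose column value duplicates an earlier row."""
    seen, dupes = set(), []
    for i, r in enumerate(rows):
        v = r.get(column)
        if v in seen:
            dupes.append(i)
        seen.add(v)
    return dupes

def run_checks(rows, checks):
    """Run (name, fn, column) checks; return {check_name: failing_row_indices}."""
    return {name: fn(rows, col) for name, fn, col in checks}

# Illustrative batch: row 2 has a null sku; all ids are unique.
batch = [{"id": 1, "sku": "A"}, {"id": 2, "sku": "A"}, {"id": 3, "sku": None}]
report = run_checks(batch, [
    ("sku_not_null", check_not_null, "sku"),
    ("id_unique", check_unique, "id"),
])
print(report)  # → {'sku_not_null': [2], 'id_unique': []}
```

A production framework adds scheduling, thresholds, and alerting on top, but the core contract, named checks producing per-batch failure reports, is the same.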
Posted 2 months ago