3.0 - 7.0 years
0 Lacs
punjab
On-site
As an Integration DevOps Engineer in Sydney, you will need to possess a skillset that includes expertise in the Red Hat OpenShift Kubernetes Container Platform, Infrastructure as Code using Terraform, and various DevOps concepts, tools, and languages. With 3+ years of experience, you will be responsible for developing, configuring, and maintaining OpenShift in lower environments and production settings. Your role will involve working with automation and CI/CD tools such as Ansible, Jenkins, Tekton, or Bamboo pipelines in conjunction with Kubernetes containers. A strong understanding of security policies including BasicAuth, OAuth, and WSSE tokens, and of configuring security policies for APIs using HashiCorp Vault, will be essential for this position. In addition, you will be expected to create environments, namespaces, virtual hosts, API proxies, and caches, as well as work with Apigee X, Istio service mesh, and Confluent Kafka setup and deployments. Your experience with cloud architectures such as AWS, Azure, private, on-prem, and multi-cloud will be valuable. Furthermore, you will play a key role in developing, managing, and supporting automation tools, processes, and runbooks. Your contribution to delivering services or features via an agile DevOps approach, ensuring information security for the cloud, and promoting good practices in coding and automation processes will be crucial. Effective communication skills are essential, as you will be required to provide thought leadership on cloud platforms, automation, coding patterns, and components. A self-rating matrix from the candidate on skills like OpenShift, Kubernetes, and Apigee X is mandatory for consideration. If you have any questions or need further clarification, please feel free to reach out.
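The security-policy requirements above (BasicAuth, OAuth bearer tokens) come down to classifying and decoding an API request's Authorization header. A minimal stdlib sketch, not tied to any gateway product, with the function name my own:

```python
import base64

def parse_authorization(header: str):
    """Classify an Authorization header and decode its credentials.

    Returns ("basic", (user, password)) for BasicAuth, or
    ("bearer", token) for an OAuth access token.
    """
    scheme, _, value = header.partition(" ")
    scheme = scheme.lower()
    if scheme == "basic":
        # BasicAuth credentials are base64("user:password") per RFC 7617
        decoded = base64.b64decode(value).decode("utf-8")
        user, _, password = decoded.partition(":")
        return ("basic", (user, password))
    if scheme == "bearer":
        # Bearer tokens are passed through opaque, for later introspection
        return ("bearer", value)
    raise ValueError(f"unsupported auth scheme: {scheme}")
```

A gateway policy would then route each credential type to the matching validator (password check, token introspection endpoint, etc.).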
Posted 2 days ago
8.0 - 12.0 years
0 Lacs
hyderabad, telangana
On-site
You are a Senior Cloud Application Developer (AWS to Azure Migration) with 8+ years of experience. Your role involves hands-on experience in developing applications for both AWS and Azure platforms. You should have a strong understanding of Azure services for application development and deployment, including Azure IaaS and PaaS services. Your responsibilities include proficiency in AWS to Azure cloud migration, which involves service mapping and SDK/API conversion. You will also be required to perform code refactoring and application remediation for cloud compatibility. You should have a minimum of 5 years of experience in application development using Java, Python, Node.js, or .NET. Additionally, you must possess a solid understanding of CI/CD pipelines, deployment automation, and Azure DevOps. Experience with containerized applications, AKS, Kubernetes, and Helm charts is also necessary. Your role will involve application troubleshooting, support, and testing in cloud environments. Experience with the following tech stack is highly preferred:
- Spring Boot REST API, NodeJS REST API
- Apigee config, Spring Server Config
- Confluent Kafka, AWS S3 Sync Connector
- Azure Blob Storage, Azure Files, Azure Functions
- Aurora PostgreSQL to Azure DB migration
- EKS to AKS migration, S3 to Azure Blob Storage
- AWS to Azure SDK Conversion
Location options for this role include Hyderabad, Bangalore, or Pune. You should have a notice period of 10-15 days.
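The service-mapping part of an S3-to-Blob migration like the one above can be illustrated with a small address-translation helper; the bucket names, mapping table, and storage account here are hypothetical:

```python
from urllib.parse import urlparse

# Hypothetical bucket-to-container mapping maintained during migration;
# by default a bucket keeps its name as the Azure container name.
BUCKET_TO_CONTAINER = {"payments-archive": "payments-archive"}

def s3_uri_to_blob_url(s3_uri: str, storage_account: str) -> str:
    """Translate an s3://bucket/key URI to the equivalent Azure Blob URL."""
    parsed = urlparse(s3_uri)
    if parsed.scheme != "s3":
        raise ValueError(f"not an S3 URI: {s3_uri}")
    bucket, key = parsed.netloc, parsed.path.lstrip("/")
    container = BUCKET_TO_CONTAINER.get(bucket, bucket)
    return f"https://{storage_account}.blob.core.windows.net/{container}/{key}"
```

In a real migration the same mapping table would also drive the SDK conversion (boto3 `get_object` calls rewritten to `BlobClient.download_blob`).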
Posted 3 days ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
Join us as a Senior ETL Test Engineer at Barclays, responsible for supporting the successful delivery of location strategy projects to plan, budget, agreed quality and governance standards. You'll spearhead the evolution of our digital landscape, driving innovation and excellence. You will harness cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences. To be successful as a Senior ETL Test Engineer, you should have expertise in an ETL tool, e.g. Informatica. Develop and execute ETL test cases, test scripts, and data validation scenarios. Validate data extraction, transformation, and loading along with data completeness. Test automation experience in developing and implementing ETL test automation scripts using Python, SQL scripting, QuerySurge, Unix, and shell scripts. Automate comparison, schema validation, and regression testing. Integrate test automation with the CI/CD pipeline (Jenkins, GitLab, DevOps). Optimize and maintain the automated test suite for scalability and performance. Understand requirements and user stories, and be able to relate them to the design document. Work closely with business analysts and the Dev team to define the test scope. Maintain test plans, test data, and automation in version control. Document best practices, lessons learned, and continuous improvement strategies. Identify and log defects via JIRA and defect management. Work with business analysts and developers to troubleshoot data issues and pipeline failures. Provide a detailed report on test execution, coverage, and defect analysis. Understand agile development/test methodology and practice it in day-to-day work. Unearth gaps between business requirements and user stories. Ensure the ETL process adheres to data privacy and compliance. Validate data masking, encryption, and access control. Perform audit and data reconciliation testing to track data modifications.
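The data-completeness validation described above (comparing extracted source rows against loaded target rows) can be sketched as a small pure-Python check; the function and field names are illustrative, not from any particular tool:

```python
def completeness_report(source_rows, target_rows, key):
    """Compare source and target result sets by primary key.

    Returns the keys missing from the target (rows lost in the load)
    and the keys present in the target but absent from the source
    (unexpected extras), the two core findings of a reconciliation test.
    """
    src_keys = {row[key] for row in source_rows}
    tgt_keys = {row[key] for row in target_rows}
    return {
        "missing": sorted(src_keys - tgt_keys),
        "unexpected": sorted(tgt_keys - src_keys),
    }
```

In practice the two row sets would come from SQL queries against the source and warehouse databases, and a non-empty report would be logged as a defect.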
Some other highly valued skills may include earlier experience in coding with an engineering background, a detailed understanding of cloud technology (viz. AWS, Confluent Kafka), and hands-on experience in BDD/TDD. You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is based out of Pune. Purpose of the role: To design, develop, and execute testing strategies to validate functionality, performance, and user experience, while collaborating with cross-functional teams to identify and resolve defects, and continuously improve testing processes and methodologies, to ensure software quality and reliability. Accountabilities: Development and implementation of comprehensive test plans and strategies to validate software functionality and ensure compliance with established quality standards. Creation and execution of automated test scripts, leveraging testing frameworks and tools to facilitate early detection of defects and quality issues. Collaboration with cross-functional teams to analyze requirements, participate in design discussions, and contribute to the development of acceptance criteria, ensuring a thorough understanding of the software being tested. Root cause analysis for identified defects, working closely with developers to provide detailed information and support defect resolution. Collaboration with peers, participation in code reviews, and promotion of a culture of code quality and knowledge sharing. Staying informed of industry technology trends and innovations, and actively contributing to the organization's technology communities to foster a culture of technical excellence and growth. Analyst Expectations: To perform prescribed activities in a timely manner and to a high standard, consistently driving continuous improvement.
Requires in-depth technical knowledge and experience in their assigned area of expertise. Thorough understanding of the underlying principles and concepts within the area of expertise. They lead and supervise a team, guiding and supporting professional development, allocating work requirements, and coordinating team resources. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviors to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviors are: L Listen and be authentic, E Energize and inspire, A Align across the enterprise, D Develop others. OR for an individual contributor, they develop technical expertise in the work area, acting as an advisor where appropriate. Will have an impact on the work of related teams within the area. Partner with other functions and business areas. Takes responsibility for the end results of a team's operational processing and activities. Escalate breaches of policies/procedure appropriately. Take responsibility for embedding new policies/procedures adopted due to risk mitigation. Advise and influence decision making within own area of expertise. Take ownership of managing risk and strengthening controls in relation to the work you own or contribute to. Deliver your work and areas of responsibility in line with relevant rules, regulation and codes of conduct. Maintain and continually build an understanding of how own sub-function integrates with the function, alongside knowledge of the organization's products, services, and processes within the function. Demonstrate understanding of how areas coordinate and contribute to the achievement of the objectives of the organization sub-function. Resolve problems by identifying and selecting solutions through the application of acquired technical experience and will be guided by precedents. 
Guide and persuade team members and communicate complex/sensitive information. Act as the contact point for stakeholders outside of the immediate function while building a network of contacts outside the team and external to the organization. All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset to Empower, Challenge, and Drive, the operating manual for how we behave.
Posted 4 days ago
5.0 - 10.0 years
15 - 25 Lacs
Navi Mumbai
Work from Office
We are hiring a RabbitMQ Admin with strong expertise in Kafka, messaging systems, and performance monitoring. This role involves managing and optimizing enterprise messaging infrastructure in a banking environment. Required candidate profile: Experienced messaging admin with hands-on Kafka & RabbitMQ skills, certified in Confluent Kafka, adept at ensuring high-performance message delivery, troubleshooting issues, and securing middleware systems.
Posted 1 week ago
5.0 - 10.0 years
10 - 20 Lacs
Bengaluru
Work from Office
Key Skills: Confluent Kafka, Kafka Connect, Schema Registry, Kafka Brokers, KSQL, KStreams, Java/J2EE, Troubleshooting, RCA, Production Support. Roles & Responsibilities: Design and develop Kafka pipelines. Perform unit testing of the code and prepare test plans as required. Analyze, design, and develop programs in a development environment. Support applications and jobs in the production environment for issues or failures. Develop operational documents for applications, including DFD, ICD, HLD, etc. Troubleshoot production issues and provide solutions within the defined SLA. Prepare RCA (Root Cause Analysis) documents for production issues. Provide permanent fixes to production issues. Experience Requirement: 5-10 years of experience working with Confluent Kafka. Hands-on experience with Kafka Connect using Schema Registry. Strong knowledge of Kafka brokers and KSQL. Familiarity with Confluent Control Center, ZooKeeper, and KStreams is good to have. Experience with Java/J2EE is a plus. Education: B.E., B.Tech.
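A KStreams-style stateful count over a keyed stream, the kind of logic the pipeline work above involves, can be modeled in a few lines of plain Python. This is a single-process simulation of the concept, not the Kafka Streams API itself:

```python
from collections import defaultdict

def count_by_key(events):
    """Consume (key, value) events in order and maintain a running count
    per key, the way a Kafka Streams KTable materializes count() over a
    KStream grouped by key. Also records the stream of updates that
    Kafka Streams would emit to a changelog topic."""
    table = defaultdict(int)   # materialized state store: key -> count
    changelog = []             # emitted updates: (key, new_count)
    for key, _value in events:
        table[key] += 1
        changelog.append((key, table[key]))
    return dict(table), changelog
```

The real API distributes this state across partitions and backs it with RocksDB plus a changelog topic; the key-to-state mapping is the same idea.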
Posted 2 weeks ago
7.0 - 11.0 years
0 Lacs
pune, maharashtra
On-site
You are a results-driven Data Project Manager (PM) responsible for leading data initiatives within a regulated banking environment, focusing on leveraging Databricks and Confluent Kafka. Your role involves overseeing the successful end-to-end delivery of complex data transformation projects aligned with business and regulatory requirements. In this position, you will be required to lead the planning, execution, and delivery of enterprise data projects using Databricks and Confluent. This includes developing detailed project plans, delivery roadmaps, and work breakdown structures, as well as ensuring resource allocation, budgeting, and adherence to timelines and quality standards. Collaboration with data engineers, architects, business analysts, and platform teams is essential to align on project goals. You will act as the primary liaison between business units, technology teams, and vendors, facilitating regular updates, steering committee meetings, and issue/risk escalations. Your technical oversight responsibilities include managing solution delivery on Databricks for data processing, ML pipelines, and analytics, as well as overseeing real-time data streaming pipelines via Confluent Kafka. Ensuring alignment with data governance, security, and regulatory frameworks such as GDPR, CBUAE, and BCBS 239 is crucial. Risk and compliance management are key aspects of your role, involving ensuring regulatory reporting data flows comply with local and international financial standards and managing controls and audit requirements in collaboration with Compliance and Risk teams. The required skills and experience for this role include 7+ years of Project Management experience within the banking or financial services sector, proven experience in leading data platform projects, a strong understanding of data architecture, pipelines, and streaming technologies, experience in managing cross-functional teams, and proficiency in Agile/Scrum and Waterfall methodologies. 
Technical exposure to Databricks (Delta Lake, MLflow, Spark), Confluent Kafka (Kafka Connect, KSQL, Schema Registry), Azure or AWS cloud platforms, integration tools, CI/CD pipelines, and Oracle ERP implementation is expected. Preferred qualifications include PMP/Prince2/Scrum Master certification, familiarity with regulatory frameworks, and a strong understanding of data governance principles. The ideal candidate will hold a Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field. Key performance indicators for this role include on-time, on-budget delivery of data initiatives, uptime and SLAs of data pipelines, user satisfaction, and compliance with regulatory milestones.
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
maharashtra
On-site
NTT DATA is looking for a Data Ingest Engineer to join the team in Pune, Maharashtra (IN-MH), India (IN). As a Data Ingest Engineer, you will be part of the Ingestion team of the DRIFT data ecosystem, focusing on ingesting data in a timely, complete, and comprehensive manner using the latest technology available to Citi. Your role will involve leveraging new and creative methods for repeatable data ingestion from various sources while ensuring the highest quality data is provided to downstream partners. Responsibilities include partnering with management teams to integrate functions effectively, identifying necessary system enhancements for new products and process improvements, and resolving high-impact problems/projects through evaluation of complex business processes and industry standards. You will provide expertise in applications programming, ensure application design aligns with the overall architecture blueprint, and develop standards for coding, testing, debugging, and implementation. Additionally, you will analyze issues, develop innovative solutions, and mentor mid-level developers and analysts. The ideal candidate should have 6-10 years of experience in apps development or systems analysis, with extensive experience in system analysis and programming of software applications. Proficiency in application development using Java, Scala, and Spark, familiarity with event-driven applications and streaming data, and experience with various schemas, data types, ELT methodologies, and formats are required. Experience working with Agile and version control toolsets, leadership skills, and clear communication abilities are also essential. NTT DATA is a trusted global innovator of business and technology services, serving 75% of the Fortune Global 100. With experts in more than 50 countries and a strong partner ecosystem, NTT DATA is committed to helping clients innovate, optimize, and transform for long-term success.
As a part of the NTT Group, NTT DATA invests significantly in R&D to support organizations and society in moving confidently into the digital future. For more information, visit us at us.nttdata.com.
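The ingestion role's emphasis on schemas and data types amounts to validating each incoming record before handing it downstream. A toy sketch with a hypothetical trade schema (field names and types invented for illustration):

```python
# Hypothetical minimal schema: field name -> required Python type.
# Real pipelines would use Avro/JSON Schema via a schema registry.
TRADE_SCHEMA = {"trade_id": str, "amount": float, "currency": str}

def validate_record(record: dict, schema: dict) -> list:
    """Return a list of validation errors for one ingested record;
    an empty list means the record conforms to the schema."""
    errors = []
    for field, expected in schema.items():
        if field not in record:
            errors.append(f"missing field: {field}")
        elif not isinstance(record[field], expected):
            errors.append(f"bad type for {field}: {type(record[field]).__name__}")
    return errors
```

Records that fail validation would typically be routed to a dead-letter topic rather than dropped, so completeness can still be audited.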
Posted 2 weeks ago
5.0 - 10.0 years
3 - 6 Lacs
Chennai, Tamil Nadu, India
On-site
Skills: 5+ years of experience as a developer in a .NET environment. Exposure to Azure/AWS cloud infra and services. Hands-on tech stack deployment and implementation. Client-facing experience. Good communication skills. Desirable: Angular/React exposure; TCP/IP socket programming; message queues using Apache/Confluent Kafka; SQL Server/MongoDB/Cosmos DB work experience; Redis cache usage experience. Ideally, it would be a solution architect who can do hands-on coding and deployment.
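The Redis cache usage this posting lists typically follows the cache-aside pattern: read through the cache, fall back to the database on a miss, and expire entries after a TTL. An in-memory stand-in (no Redis client involved, all names my own) shows the logic:

```python
import time

class CacheAside:
    """In-memory model of the cache-aside pattern."""

    def __init__(self, loader, ttl_seconds=60.0, clock=time.monotonic):
        self.loader = loader      # function that fetches from the source DB
        self.ttl = ttl_seconds
        self.clock = clock        # injectable for testing
        self._store = {}          # key -> (value, expires_at)
        self.misses = 0

    def get(self, key):
        entry = self._store.get(key)
        now = self.clock()
        if entry is not None and entry[1] > now:
            return entry[0]       # cache hit: serve without touching the DB
        # Cache miss or expired entry: load from source and repopulate
        self.misses += 1
        value = self.loader(key)
        self._store[key] = (value, now + self.ttl)
        return value
```

With Redis the `_store` operations become `GET`/`SETEX` calls; the control flow is identical.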
Posted 1 month ago
8.0 - 13.0 years
3 - 5 Lacs
Chennai, Tamil Nadu, India
On-site
8+ years of experience as a developer in a .NET environment. Exposure to Azure/AWS cloud infra and services. Hands-on tech stack deployment and implementation. Client-facing experience. Good communication skills. Desirable: Angular/React exposure; TCP/IP socket programming; message queues using Apache/Confluent Kafka; SQL Server/MongoDB/Cosmos DB work experience; Redis cache usage experience. Ideally, it would be a solution architect who can do hands-on coding and deployment.
Posted 1 month ago
7.0 - 12.0 years
12 - 18 Lacs
Pune, Chennai
Work from Office
Key Responsibilities: Implement Confluent Kafka-based CDC solutions to support real-time data movement across banking systems. Implement event-driven and microservices-based data solutions for enhanced scalability, resilience, and performance. Integrate CDC pipelines with core banking applications, databases, and enterprise systems. Ensure data consistency, integrity, and security, adhering to banking compliance standards (e.g., GDPR, PCI-DSS). Lead the adoption of Kafka Connect, Kafka Streams, and Schema Registry for real-time data processing. Optimize data replication, transformation, and enrichment using CDC tools like Debezium, GoldenGate, or Qlik Replicate. Collaborate with the infra team, data engineers, DevOps teams, and business stakeholders to align data streaming capabilities with business objectives. Provide technical leadership in troubleshooting, performance tuning, and capacity planning for CDC architectures. Stay updated with emerging technologies and drive innovation in real-time banking data solutions. Required Skills & Qualifications: Extensive experience in Confluent Kafka and Change Data Capture (CDC) solutions. Strong expertise in Kafka Connect, Kafka Streams, and Schema Registry. Hands-on experience with CDC tools such as Debezium, Oracle GoldenGate, or Qlik Replicate. Hands-on experience with IBM Analytics. Solid understanding of core banking systems, transactional databases, and financial data flows. Knowledge of cloud-based Kafka implementations (AWS MSK, Azure Event Hubs, or Confluent Cloud). Proficiency in SQL and NoSQL databases (e.g., Oracle, MySQL, PostgreSQL, MongoDB) with CDC configurations. Strong experience in event-driven architectures, microservices, and API integrations. Familiarity with security protocols, compliance, and data governance in banking environments. Excellent problem-solving, leadership, and stakeholder communication skills.
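The CDC pipelines described above consume change events shaped like Debezium's envelope: an `op` code plus `before`/`after` row images. A minimal sketch of folding such events into a materialized table (the table and field names are invented; real consumers would read these from a Kafka topic):

```python
def apply_change_event(table: dict, event: dict) -> dict:
    """Fold one Debezium-style change event into an in-memory table keyed
    by primary key. Ops 'c'/'u'/'r' (create/update/snapshot read) upsert
    the 'after' image; 'd' (delete) removes the row."""
    op = event["op"]
    if op in ("c", "u", "r"):
        row = event["after"]
        table[row["id"]] = row
    elif op == "d":
        # Deletes carry only the 'before' image of the removed row
        table.pop(event["before"]["id"], None)
    else:
        raise ValueError(f"unknown op: {op}")
    return table
```

Because each event is an upsert or delete keyed by primary key, replaying the stream is idempotent from any compacted offset, which is what makes CDC consumers recoverable.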
Posted 1 month ago
6.0 - 8.0 years
27 - 30 Lacs
Pune, Ahmedabad, Chennai
Work from Office
Technical Skills Must Have: 8+ years overall IT industry experience, with 5+ years in a solution or technical architect role using service and hosting solutions such as private/public cloud IaaS, PaaS and SaaS platforms. 5+ years of hands-on development experience with event-driven architecture-based implementation. Achieved one or more of the typical solution and technical architecture certifications, e.g. Microsoft, MS Azure Certification, TOGAF, AWS Cloud Certified, SAFe, PMI, SAP etc. Hands-on experience with:
o Claims-based authentication (SAML/OAuth/OIDC), MFA, JIT, and/or RBAC / Ping etc.
o Architecting mission-critical technology components with DR capabilities.
o Multi-geography, multi-tier service design and management.
o Project financial management, solution plan development and product cost estimation.
o Supporting peer teams and their responsibilities, such as infrastructure, operations, engineering, info-security.
o Configuration management and automation tools such as Azure DevOps, Ansible, Puppet, Octopus, Chef, Salt, etc.
o Software development full-lifecycle methodologies, patterns, frameworks, libraries and tools.
o Relational, graph and/or unstructured data technologies such as SQL Server, Azure SQL, Cosmos, Azure Data Lake, HDInsight, Hadoop, Neo4j etc.
o Data management and data governance technologies.
o Data movement and transformation technologies.
o AI and machine learning tools such as Azure ML etc.
o Architecting mobile applications that are either independent applications or supplementary add-ons (to intranet or extranet).
o Cloud security controls including tenant isolation, encryption at rest, encryption in transit, key management, vulnerability assessments, application firewalls, SIEM, etc.
o Apache Kafka, Confluent Kafka, Kafka Streams, and Kafka Connect.
o Proficiency in NodeJS, Java, Scala, or Python languages.
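The claims-based authentication and RBAC items above can be illustrated with a small role-to-permission check; the roles and permission strings are hypothetical, and in practice the claims would come from a validated SAML assertion or OIDC ID token:

```python
# Hypothetical role-to-permission mapping for an RBAC check
ROLE_PERMISSIONS = {
    "viewer": {"report:read"},
    "analyst": {"report:read", "report:export"},
    "admin": {"report:read", "report:export", "user:manage"},
}

def is_authorized(claims: dict, permission: str) -> bool:
    """Claims-based RBAC: the token's 'roles' claim grants the union of
    its roles' permissions; authorization checks membership in that set."""
    granted = set()
    for role in claims.get("roles", []):
        granted |= ROLE_PERMISSIONS.get(role, set())
    return permission in granted
```

Keeping the mapping server-side (rather than trusting permissions listed in the token) is what lets an administrator revoke a capability without reissuing tokens.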
Posted 1 month ago
2.0 - 7.0 years
4 - 9 Lacs
Bengaluru
Work from Office
We offer joint operations and digitalization services for Global Business Services and work closely alongside the entire Shared Services organization. We make efficient use of the possibilities of new technologies such as Business Process Management (BPM) and Robotics as enablers for efficient and effective implementations. We are looking for a Data Engineer (AWS, Confluent & SnapLogic).
Data Integration: Integrate data from various Siemens organizations into our data factory, ensuring seamless data flow and real-time data fetching.
Data Processing: Implement and manage large-scale data processing solutions using AWS Glue, ensuring efficient and reliable data transformation and loading.
Data Storage: Store and manage data in a large-scale data lake, utilizing Iceberg tables in Snowflake for optimized data storage and retrieval.
Data Transformation: Apply various data transformations to prepare data for analysis and reporting, ensuring data quality and consistency.
Data Products: Create and maintain data products that meet the needs of various stakeholders, providing actionable insights and supporting data-driven decision-making.
Workflow Management: Use Apache Airflow to orchestrate and automate data workflows, ensuring timely and accurate data processing.
Real-time Data Streaming: Utilize Confluent Kafka for real-time data streaming, ensuring low-latency data integration and processing.
ETL Processes: Design and implement ETL processes using SnapLogic, ensuring efficient data extraction, transformation, and loading.
Monitoring and Logging: Use Splunk for monitoring and logging data processes, ensuring system reliability and performance.
You'd describe yourself as:
Experience: 3+ relevant years of experience in data engineering, with a focus on AWS Glue, Iceberg tables, Confluent Kafka, SnapLogic, and Airflow.
Technical Skills: Proficiency in AWS services, particularly AWS Glue. Experience with Iceberg tables and Snowflake.
Knowledge of Confluent Kafka for real-time data streaming. Familiarity with SnapLogic for ETL processes. Experience with Apache Airflow for workflow management. Understanding of Splunk for monitoring and logging.
Programming Skills: Proficiency in Python, SQL, and other relevant programming languages.
Data Modeling: Experience with data modeling and database design.
Problem-Solving: Strong analytical and problem-solving skills, with the ability to troubleshoot and resolve data-related issues.
Preferred Qualities:
Attention to Detail: Meticulous attention to detail, ensuring data accuracy and quality.
Communication Skills: Excellent communication skills, with the ability to collaborate effectively with cross-functional teams.
Adaptability: Ability to adapt to changing technologies and work in a fast-paced environment.
Team Player: Strong team player with a collaborative mindset.
Continuous Learning: Eagerness to learn and stay updated with the latest trends and technologies in data engineering.
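The Airflow workflow management this posting describes is, at its core, topological ordering of task dependencies. Python's stdlib `graphlib` can sketch the idea; the pipeline task names are invented for illustration and this is not the Airflow API:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline dependencies: task -> set of upstream tasks,
# the same shape an Airflow DAG's upstream edges express.
PIPELINE = {
    "extract_kafka": set(),
    "extract_snaplogic": set(),
    "transform": {"extract_kafka", "extract_snaplogic"},
    "load_snowflake": {"transform"},
}

def run_order(dag: dict) -> list:
    """Return one valid execution order respecting every dependency."""
    return list(TopologicalSorter(dag).static_order())
```

Airflow's scheduler does the same ordering, but additionally runs independent tasks (here, the two extracts) in parallel and retries failures.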
Posted 1 month ago
8.0 - 10.0 years
0 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Cloud Solution Delivery Lead Consultant to join our team in Bangalore, Karnataka (IN-KA), India (IN). Data Engineer Lead: Robust hands-on experience with industry-standard tooling and techniques, including SQL, Git and CI/CD pipelines (mandatory). Management, administration, and maintenance of data streaming tools such as Kafka/Confluent Kafka, Flink. Experienced with software support for applications written in Python & SQL. Administration, configuration and maintenance of Snowflake & DBT. Experience with data product environments that use tools such as Kafka Connect, Snyk, Confluent Schema Registry, Atlan, IBM MQ, SonarQube, Apache Airflow, Apache Iceberg, DynamoDB, Terraform and GitHub. Debugging issues, root cause analysis, and applying fixes. Management and maintenance of ETL processes (bug fixing and batch job monitoring). Training & Certification: Apache Kafka Administration, Snowflake Fundamentals/Advanced Training. Experience: 8 years of experience in a technical role working with AWS, with at least 2 years in a leadership or management role. About NTT DATA: NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world.
NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. NTT DATA endeavors to make its website accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at . This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click . If you'd like more information on your EEO rights under the law, please click . For Pay Transparency information, please click .
Posted 1 month ago
0.0 years
0 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose - the relentless pursuit of a world that works better for people - we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI. Inviting applications for the role of Principal Consultant, Senior Java Developer. In this role, we are inviting applications for the role of Senior Java Developer with working experience in Microservices. Responsibilities: Must-have skills: Java, Spring/Spring Boot, REST API, Microservices - Java or Node.js, JBoss, SQL, MS Azure (Azure Event Hub, Confluent Kafka, Azure App Service Environment, ASP) or AWS equivalent. Working knowledge: Bitbucket, Git, Confluence, JIRA; strong experience in DevOps pipelines, CI/CD and related tools. Nice to have: Kubernetes, MQ, OAuth and event-driven messaging, Postman, O/S (Windows, Linux), JBoss scripting/CLI, prior FI experience. Readiness and motivation to work autonomously in a lead capacity on a diverse range of activities (e.g. design, support of technical business solutions), and can be relied on to coach, educate and monitor the work of others. Primary subject matter expertise in multiple areas; you're seasoned in counselling clients and project teams on all aspects of research, analysis, design, hardware and software support, development of technical solutions and testing. Involvement in coaching and advising clients, partners and project teams; capable of being an internal expert resource in technical information exchange. Commitment to and belief in the quality of your deliverables.
Qualifications we seek in you. Minimum Qualifications: BE/B Tech/MCA or equivalent; excellent communication skills. Preferred Qualifications/Skills: Experience of software development including architecting, designing and coding. Strong expertise in API and Microservices development and integration using Java/Spring Boot, Node.js. Expert knowledge of the business, broader organization, technical environment, standards, processes, tools, procedures, multiple programming languages, operating systems, solutions design and other relevant technology areas from a design/support/solutions perspective. Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. For more information, visit . Follow us on X, Facebook, LinkedIn, and YouTube. Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
Posted 1 month ago
0.0 years
0 Lacs
Hyderabad / Secunderabad, Telangana, Telangana, India
On-site
Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose - the relentless pursuit of a world that works better for people - we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI. Inviting applications for the role of Lead Consultant, Senior Java Developer. In this role, we are inviting applications for the role of Senior Java Developer with working experience in Microservices. Responsibilities: Must-have skills: Java, Spring/Spring Boot, REST API, Microservices - Java or Node.js, JBoss, SQL, MS Azure (Azure Event Hub, Confluent Kafka, Azure App Service Environment, ASP) or AWS equivalent. Working knowledge: Bitbucket, Git, Confluence, JIRA; strong experience in DevOps pipelines, CI/CD and related tools. Nice to have: Kubernetes, MQ, OAuth and event-driven messaging, Postman, O/S (Windows, Linux), JBoss scripting/CLI, prior FI experience. Readiness and motivation to work autonomously in a lead capacity on a diverse range of activities (e.g. design, support of technical business solutions), and can be relied on to coach, educate and monitor the work of others. Primary subject matter expertise in multiple areas; you're seasoned in counselling clients and project teams on all aspects of research, analysis, design, hardware and software support, development of technical solutions and testing. Involvement in coaching and advising clients, partners and project teams; capable of being an internal expert resource in technical information exchange. Commitment to and belief in the quality of your deliverables.
Qualifications we seek in you: Minimum Qualifications: BE/B Tech/MCA or equivalent; excellent communication skills. Preferred Qualifications/Skills: Experience in software development, including architecting, designing and coding; strong expertise in API and microservices development and integration using Java/Spring Boot and Node.js; expert knowledge of the business, broader organization, technical environment, standards, processes, tools, procedures, multiple programming languages, operating systems, solutions design and other relevant technology areas from a design/support/solutions perspective. Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. For more information, follow us on X, Facebook, LinkedIn, and YouTube. Please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
Posted 1 month ago
6.0 - 11.0 years
8 - 13 Lacs
Hyderabad
Work from Office
10+ years of software development experience building large-scale distributed data processing systems/applications, data engineering, or large-scale internet systems. At least 4 years of experience developing/leading Big Data solutions at enterprise scale, with at least one end-to-end implementation. Strong experience in programming languages: Java/J2EE/Scala. Good experience with Spark/Hadoop/HDFS architecture, YARN, Confluent Kafka, HBase, Hive, Impala and NoSQL databases. Experience with batch processing and AutoSys job scheduling and monitoring. Performance analysis, troubleshooting and resolution (including familiarity with and investigation of Cloudera/Hadoop logs). Work with Cloudera on open issues that would result in cluster configuration changes, and implement these as needed. Strong experience with databases such as SQL, Hive, Elasticsearch, HBase, etc. Knowledge of Hadoop security, data management and governance. Primary Skills: Java/Scala, ETL, Spark, Hadoop, Hive, Impala, Sqoop, HBase, Confluent Kafka, Oracle, Linux, Git, Jenkins CI/CD
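The stack above centers on Spark-style batch aggregation over keyed data. As a toy illustration of the reduce-by-key step such pipelines perform — plain Python standing in for Spark's `reduceByKey`, a sketch only, with none of the distribution or shuffle:

```python
def reduce_by_key(pairs, combine):
    """Combine the values of (key, value) pairs per key --
    the same shape as Spark's RDD.reduceByKey, minus the cluster."""
    acc = {}
    for key, value in pairs:
        # Fold each value into the running result for its key.
        acc[key] = combine(acc[key], value) if key in acc else value
    return acc


clicks = [("home", 1), ("search", 1), ("home", 1)]
print(reduce_by_key(clicks, lambda a, b: a + b))  # {'home': 2, 'search': 1}
```

In real Spark the `combine` function must be associative and commutative, because partial results are merged per partition before the final reduce.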
Posted 1 month ago
10.0 - 15.0 years
13 - 17 Lacs
Bengaluru
Work from Office
About the Role: We are looking for a Senior Engineering Manager with 10+ years of experience and 2 years of people management experience to help scale and modernize Myntra's data platform. The ideal candidate will have a strong background in building scalable data platforms using a combination of open-source technologies and enterprise solutions. The role demands deep technical expertise in data ingestion, processing, serving, and governance, with a strategic mindset to scale the platform 10x to meet the ever-growing data needs across the organization. This is a high-impact role requiring innovation, engineering excellence and system stability, with an opportunity to contribute to OSS projects and build data products leveraging available data assets. Key Responsibilities: Design and scale Myntra's data platform to support growing data needs across analytics, ML, and reporting. Architect and optimize streaming data ingestion pipelines using Debezium, Kafka (Confluent), Databricks Spark and Flink. Lead improvements in data processing and serving layers, leveraging Databricks Spark, Trino, and Superset. Good understanding of open table formats like Delta and Iceberg. Scale data quality frameworks to ensure data accuracy and reliability. Build data lineage tracking solutions for governance, access control, and compliance. Collaborate with engineering, analytics, and business teams to identify opportunities and build/enhance self-serve data platforms. Improve system stability, monitoring, and observability to ensure high availability of the platform. Work with open-source communities and facilitate contributing to OSS projects aligned with Myntra's tech stack. Implement cost-efficient, scalable architectures for handling 10B+ daily events in a cloud environment. Management Responsibilities: Technical Guidance: This role will play the engineering lead role for teams within Myntra Data Platform.
You will provide technical leadership to a team of excellent data engineers; this requires that you have the technical depth to make complex design decisions and the hands-on ability to lead by example. Execution and Delivery: You will be expected to instill and follow good software development practices and ensure timely delivery of high-quality products. You should be familiar with agile practices as well as be able to adapt these to the needs of the business, with a constant focus on product quality. Team Management: You will be responsible for hiring and mentoring your team; helping individuals grow in their careers, having constant dialogue about their aspirations and sharing prompt, clear and actionable feedback about performance. Qualifications: Education: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field. Experience: 10+ years of experience in building large-scale data platforms. 2+ years of people management experience. Expertise in big data architectures using Databricks, Trino, and Debezium. Strong experience with streaming platforms, including Confluent Kafka. Experience in data ingestion, storage, processing, and serving in a cloud-based environment. Experience implementing data quality checks using Great Expectations. Deep understanding of data lineage, metadata management, and governance practices. Strong knowledge of query optimization, cost efficiency, and scaling architectures. Familiarity with OSS contributions and keeping up with industry trends in data engineering. Soft Skills: Strong analytical and problem-solving skills with a pragmatic approach to technical challenges. Excellent communication and collaboration skills to work effectively with cross-functional teams. Ability to lead large-scale projects in a fast-paced, dynamic environment. Passion for continuous learning, open-source collaboration, and building best-in-class data products.
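The posting's "10B+ daily events" figure implies a concrete sustained throughput target. A back-of-the-envelope capacity check makes it tangible — the 3x peak-to-average factor below is an assumed illustration, not a number from the posting:

```python
DAILY_EVENTS = 10_000_000_000   # "10B+ daily events" from the role description
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

# Average rate the platform must sustain around the clock.
sustained_eps = DAILY_EVENTS / SECONDS_PER_DAY

# Traffic is rarely uniform; assume a 3x peak-to-average ratio
# (illustrative figure) when sizing ingestion capacity.
peak_eps = sustained_eps * 3

print(f"sustained: {sustained_eps:,.0f} events/sec")  # sustained: 115,741 events/sec
print(f"peak:      {peak_eps:,.0f} events/sec")
```

So "10B daily" already means well over 100k events/sec around the clock, which is why the posting pairs the figure with cost efficiency and horizontal scaling.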
Posted 2 months ago
7.0 - 10.0 years
15 - 30 Lacs
Hyderabad, Coimbatore, Bengaluru
Work from Office
We are looking for a highly skilled Senior Java Developer with strong hands-on experience in cloud platforms, specifically in migrating Java-based applications from AWS to Azure. The ideal candidate will have a deep understanding of Java, Spring Boot, Kubernetes, and both AWS and Azure services, with direct experience in infrastructure, refactoring code for Azure compatibility, and deploying applications in containerized environments. Key Responsibilities: Lead and execute the migration of Java applications from AWS to Azure environments. Analyze source architecture, source code, and AWS service dependencies to identify necessary code and infrastructure remediation. Perform hands-on refactoring and configuration changes to enable application deployment on Azure. Migrate applications and services including EKS to AKS, AWS S3 to Azure Blob Storage, and AWS SDK to Azure SDK. Implement and manage Azure services such as AKS, Azure Functions, Azure App Services, VMs, APIM, and others. Design and update CI/CD pipelines, deployment scripts, and manage Kubernetes environments using Helm charts. Provide application testing support, troubleshooting, and ensure smooth deployment on Azure. Configure and manage related services like Apigee, Kafka connectors, and Azure Blob Storage. Collaborate with DevOps and Infra teams to ensure readiness of IaaS and PaaS environments on Azure. Participate in PostgreSQL data migration (Aurora to Azure PostgreSQL; good to have). Required Qualifications & Skills: 8+ years of hands-on experience in Java/Spring Boot application development. Proven experience in cloud migration projects, specifically AWS to Azure. Hands-on experience with AKS, Azure Functions, and Azure DevOps. Experience with AWS and Azure services for application development and deployment. Deep understanding of Azure infrastructure services (IaaS/PaaS). Strong understanding of microservices architecture, containerization, Kubernetes, and Helm.
Experience with Apigee, Confluent Kafka, and Spring Boot REST APIs. Proven experience in: AWS to Azure SDK conversion (Must Have); EKS to AKS migration (Must Have); S3 to Azure Blob Storage migration; PostgreSQL (Aurora) data migration (Good to Have). Skills: Senior Java Developer, Java, Spring Boot, Kubernetes, AWS, Azure, AWS to Azure migration, cloud platforms, EKS to AKS migration, AWS S3 to Azure Blob Storage, AWS SDK to Azure SDK, Azure Functions, Azure App Services, Azure VMs, Azure APIM, CI/CD pipelines, deployment scripts, Helm charts, containerized environments, Apigee, Confluent Kafka, Kafka connectors, Azure Blob Storage, IaaS, PaaS, Aurora PostgreSQL, Azure PostgreSQL, application development, microservices, application deployment, cloud migration, Azure infrastructure, Azure DevOps, Spring Boot REST APIs, code remediation, source code analysis, application troubleshooting, application testing support, application refactoring, DevOps collaboration, cloud-native applications. Location: Hyderabad, Bengaluru, Coimbatore, Pune
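The migration scope above maps one-for-one between AWS and Azure services. A small lookup capturing exactly the pairs this posting names can drive a service-inventory check during the analysis phase — illustrative only; a real migration also remediates SDK calls, configuration, and IAM, which no table covers:

```python
# AWS -> Azure service equivalents taken from this role's stated scope.
AWS_TO_AZURE = {
    "EKS": "AKS",                              # managed Kubernetes
    "S3": "Azure Blob Storage",                # object storage
    "Aurora PostgreSQL": "Azure PostgreSQL",   # managed relational DB
    "AWS SDK": "Azure SDK",                    # application client libraries
}


def azure_equivalent(aws_service):
    """Return the Azure migration target for an in-scope AWS service,
    or None if the service is outside this posting's stated scope."""
    return AWS_TO_AZURE.get(aws_service)


print(azure_equivalent("EKS"))  # AKS
print(azure_equivalent("S3"))   # Azure Blob Storage
```

Running a dependency inventory through a table like this quickly surfaces which services already have a named target and which need a separate migration decision.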
Posted 2 months ago