0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Position Overview: We are seeking a talented Data Engineer with expertise in Apache Spark, Python/Java, and distributed systems. The ideal candidate will be skilled in creating and managing data pipelines using AWS.

Key Responsibilities:
- Design, develop, and implement data pipelines for ingesting, transforming, and loading data at scale.
- Utilise Apache Spark for data processing and analysis.
- Utilise AWS services (S3, Redshift, EMR, Glue) to build and manage efficient data pipelines.
- Optimise data pipelines for performance and scalability, considering factors like partitioning, bucketing, and caching.
- Write efficient and maintainable Python code.
- Implement and manage distributed systems for data processing.
- Collaborate with cross-functional teams to understand data requirements and deliver optimal solutions.
- Ensure data quality and integrity throughout the data lifecycle.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent experience).
- Proven experience designing and developing data pipelines using Apache Spark and Python/Java.
- Strong knowledge of distributed systems; experience with distributed systems concepts (Hadoop, YARN) is a plus.
- In-depth knowledge of AWS cloud services for data engineering (S3, Redshift, EMR, Glue); proficiency in creating data pipelines with AWS.
- Familiarity with data warehousing concepts (data modeling, ETL) is preferred.
- Strong programming skills in Python (Pandas, NumPy, Scikit-learn are a plus).
- Experience with data pipeline orchestration tools (Airflow, Luigi) is a plus.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration skills; ability to work independently and as part of a team.

Preferred Qualifications:
- Experience with additional AWS services (e.g., AWS Glue, AWS Lambda, Amazon Redshift).
- Familiarity with data warehousing and ETL processes.
- Knowledge of data governance and best practices.
- Good understanding of OOP concepts.
- Hands-on experience with SQL database design.
- Experience with Python, SQL, and data visualization/exploration tools.
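For flavour, a minimal PySpark sketch of the ingest-transform-load pattern this posting describes; the bucket paths, column names, and schema are illustrative assumptions, not details from the role.

```python
# Minimal PySpark batch ETL sketch; paths and columns are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Ingest raw CSV from S3 (hypothetical bucket/prefix)
raw = spark.read.option("header", True).csv("s3://example-raw-bucket/orders/")

# Transform: type casting, filtering, and a derived partition column
clean = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount") > 0)
       .withColumn("order_date", F.to_date("order_ts"))
)

# Cache only if the DataFrame is reused by multiple downstream actions
clean.cache()

# Load: write partitioned Parquet so downstream queries can prune by date
(clean.write.mode("overwrite")
      .partitionBy("order_date")
      .parquet("s3://example-curated-bucket/orders/"))
```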
Posted 1 week ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
This role is for one of Weekday's clients. Location: Chennai, Pune, Kochi. Job type: full-time.

Requirements: We are looking for a skilled and versatile Java FSD AWS Developer to join our client's Agile/SAFe development teams. In this role, you will participate in the design, development, integration, and deployment of enterprise-grade applications built on both modern cloud-native architectures (AWS) and established legacy systems. You will ensure high-quality, testable, secure, and compliant code while collaborating in a fast-paced Agile setup.

Key Responsibilities:
Agile Participation & Code Quality: Active involvement in Scrum and SAFe team events, including planning, daily stand-ups, reviews, and retrospectives. Create and validate testable features, ensuring coverage of both functional and non-functional requirements. Deliver high-quality code through practices like Pair Programming and Test-Driven Development (TDD). Maintain operability, deployability, and integration readiness of application increments. Ensure full compliance with internal frameworks such as PITT and established security protocols (SAST, DAST).
Development & Integration: Develop software solutions using a diverse tech stack: TypeScript, Java, SQL, Python, COBOL, Shell scripting; Spring Boot, Angular, Node.js, Hibernate. Work across multiple environments and technologies including Linux, Apache, Tomcat, Elasticsearch, IBM DB2. Build and maintain web applications, backend services, and APIs using modern and legacy technologies.
AWS & Cloud Infrastructure: Hands-on development and deployment with AWS services: EKS, ECR, IAM, SQS, SES, S3, CloudWatch. Develop Infrastructure as Code using Terraform. Ensure system reliability, monitoring, and traceability using tools like Splunk, UXMon, and AWS CloudWatch.
Systems & Batch Integration: Work with Kafka, particularly Streamzilla Kafka from PAG, for high-throughput messaging. Design and consume both REST and SOAP APIs for integration with third-party and internal systems. Manage and automate batch job scheduling via IBM Tivoli Workload Scheduler (TWS/OPC) and HostJobs.

Required Skills & Experience: 5+ years of experience in full stack development, DevOps, and mainframe integration. Strong programming experience in: Languages: TypeScript, Java, Python, COBOL, Shell scripting. Frameworks & Tools: Angular, Spring Boot, Hibernate, Node.js. Databases: SQL, IBM DB2, Elasticsearch. Proficient in AWS Cloud Services including container orchestration, IAM, S3, CloudWatch, SES, SQS, and Terraform. Strong understanding of API development and integration (REST & SOAP). Experience in secure software development using SAST/DAST, TDD, and compliance frameworks (e.g., PITT). Familiarity with Kafka messaging systems, particularly Streamzilla Kafka. Monitoring and observability experience using tools like Splunk, UXMon, or equivalents.

Preferred Qualifications: Experience with PCSS Toolbox or similar enterprise tooling. Prior exposure to highly regulated industries (e.g., automotive, banking, insurance). Bachelor's or Master's degree in Computer Science, Information Technology, or related fields. Certifications in AWS or DevOps tools are a plus.
Posted 1 week ago
4.0 - 9.0 years
4 - 8 Lacs
Pune
Work from Office
Experience: 4+ years.
Must-have skills:
- Expertise in Python.
- SQL (ability to write complex SQL queries).
- Hands-on experience in Apache Flink Streaming or Spark Streaming.
- Hands-on expertise with Apache Kafka.
- Data lake development experience.
- Orchestration (Apache Airflow is preferred).
- Spark and Hive: optimization of Spark/PySpark and Hive applications.
Good to have: Trino/AWS Athena, Snowflake, data quality tooling, file storage (S3).
Our offering:
- Global cutting-edge IT projects that shape the future of digital and have a positive impact on the environment.
- Wellbeing programs and work-life balance, with integration and passion-sharing events.
- Attractive salary and company initiative benefits.
- Courses and conferences.
- Hybrid work culture.
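For context, a hedged Spark Structured Streaming sketch consuming a Kafka topic, touching the streaming skills listed above; the broker, topic, and checkpoint path are placeholders, and the spark-sql-kafka connector package is assumed to be on the classpath.

```python
# Spark Structured Streaming from Kafka: a minimal, illustrative sketch.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("events-stream").getOrCreate()

events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker1:9092")  # placeholder broker
         .option("subscribe", "events")                       # placeholder topic
         .load()
)

# Kafka values arrive as bytes; cast, then window-count as a simple aggregation
counts = (
    events.select(F.col("value").cast("string").alias("body"),
                  F.col("timestamp"))
          .groupBy(F.window("timestamp", "1 minute"))
          .count()
)

query = (
    counts.writeStream.outputMode("update")
          .format("console")
          .option("checkpointLocation", "s3://example-bucket/checkpoints/events/")
          .start()
)
query.awaitTermination()
```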
Posted 1 week ago
10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Responsibilities: Design, develop, and maintain both front-end and back-end components of web applications. Write clean, efficient, and maintainable code using languages such as JavaScript, HTML5, jQuery, React, Python, or Node.js. Build front-end applications with appealing visual design. Develop and manage databases, ensuring data integrity and security. Create and maintain RESTful and GraphQL APIs. Implement JWT and OAuth for secure authentication and authorization. Implement automated testing frameworks and conduct thorough testing. Manage the deployment process, including CI/CD pipelines. Work with development teams and product managers to create efficient software solutions. Lead and mentor junior developers, providing guidance and support. Oversee the entire software development lifecycle from conception to deployment. Good to have: Bachelor's degree or higher in Computer Science or a related field. 10+ years of prior experience as a Full Stack Developer or in a similar role. Experience developing web and mobile applications. Experience with version control systems like Git. Proficiency in multiple front-end languages and libraries (e.g., HTML/CSS, JavaScript, XML, jQuery, ReactJS, Angular, ASP.NET). Proficiency in multiple back-end languages (e.g., C#, Python, .NET Core) and frameworks (e.g., Node.js, Django). Knowledge of databases (e.g., SQL, MySQL, MongoDB), web servers (e.g., Apache), and UI/UX design. Experience with cloud platforms such as AWS, Azure, or Google Cloud. Familiarity with containerization (Docker) and orchestration (Kubernetes). Understanding of software development principles and best practices. Conduct regular code reviews to ensure code quality and adherence to standards. Ability to work efficiently in a collaborative team environment. Excellent problem-solving and analytical skills. Experience with other JavaScript frameworks and libraries (e.g., Angular, Vue.js). Knowledge of DevOps practices and tools like Azure CI/CD, Jenkins, or GitLab CI. Familiarity with data warehousing and ETL processes. Experience with microservices architecture.
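Since the role calls out JWT-based authentication and lists Python among its back-end languages, here is a minimal PyJWT sketch of issuing and verifying a token; the secret and claim values are placeholder assumptions, not details from the posting.

```python
# Illustrative JWT issue/verify flow using PyJWT (pip install PyJWT).
# The secret and claims are placeholders; production systems should use
# short expiries, key rotation, and a secrets manager.
import datetime
import jwt

SECRET = "replace-with-a-real-secret"  # assumption for the sketch

def issue_token(user_id: str) -> str:
    claims = {
        "sub": user_id,
        "exp": datetime.datetime.utcnow() + datetime.timedelta(minutes=15),
    }
    return jwt.encode(claims, SECRET, algorithm="HS256")

def verify_token(token: str) -> dict:
    # Raises jwt.ExpiredSignatureError / jwt.InvalidTokenError on bad tokens
    return jwt.decode(token, SECRET, algorithms=["HS256"])

token = issue_token("user-42")
print(verify_token(token)["sub"])  # -> user-42
```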
Posted 1 week ago
4.0 - 7.0 years
0 Lacs
Pune, Maharashtra, India
On-site
SEAMEDU - ASSISTANT PROFESSOR – DATA SCIENCE & ARTIFICIAL INTELLIGENCE This position is for Seamedu - School of Pro Expressionism, a division of Seamless Education and Services Pvt. Ltd (SEAS) which has 2 divisions: Seamedu - School of Pro Expressionism is a Media, Technology and Management school producing world class talent for the industry. Seamedu is an institution that nurtures the creativity of the students. Seamedu has been awarded by the Government of Maharashtra in IT & IT related Fields- Multimedia/ Entertainment/ Gaming. Seamedu has campuses in Pune, Gurgaon and Bangalore. To know more about us, please visit – www.seamedu.com. Job Title / Designation: ASSISTANT PROFESSOR – DATA SCIENCE & ARTIFICIAL INTELLIGENCE Job Description: We seek a passionate and experienced Senior Faculty / Assistant Professor to join our dynamic Information Technology programs. You will play a key role in shaping the future generation of IT professionals by delivering engaging and challenging courses that bridge the gap between theory and practice. Responsibilities: · Develop and deliver a range of high-quality courses in Artificial Intelligence, Machine Learning and Data Science at intermediate and advanced levels. Subjects include but are not limited to: o Deep Learning and Neural Networks o Predictive Analytics o Time Series Analysis, Recommender Systems, Data Mining Techniques o AI/ML Algorithms o Data Analytics using SQL, Apache Spark, and Big Data Analytics · Utilize interactive teaching methods with hands-on projects and case studies to enhance critical thinking and real-world problem-solving. · Develop engaging syllabi and course materials, and provide hands-on support to students as they work on projects, guiding them through the creative process and problem-solving to achieve their goals. · Stay up-to-date with the latest advancements in AI, ML, and Data Science. Incorporate cutting-edge tools and frameworks such as TensorFlow, Keras, Scikit-Learn, R, and Cloud-based AI platforms (AWS, Azure AI). · Work with faculty to enhance the AI/ML/Data Science curriculum, reflecting emerging technologies and industry needs. · Participate in departmental and university-wide committees and activities. · Participate in recruitment activities and advise students on career opportunities in AI, machine learning, and data science. · Continuously improve your teaching skills through professional development and stay updated on the latest advancements to provide relevant education to your students. Minimum Work Experience Required: 4-7 years. Annual CTC: Not a constraint for the right person. Location(s) of Job: Ajeenkya D Y Patil University, Pune. Minimum Qualification Requirements: · Master's degree in Computer Science, Information Technology, or a related field. · Strong understanding of current trends and technologies in IT. · Excellent communication, interpersonal, and organizational skills. · Ability to develop and deliver engaging and effective lectures. · Commitment to continuous learning and professional development. Any Other Skillset: · Ph.D. in a relevant field. · Experience in curriculum development and program administration. · Demonstrated experience teaching courses within the areas covered by the Bachelor's degree programs.
· Familiarity with cutting-edge technologies like NLP, Computer Vision, and Reinforcement Learning · Published research in AI/ML or data science, or a strong research background · Experience with advanced data visualization tools such as Tableau, Power BI, or Data Studio Also looking for ASSISTANT PROFESSOR - INFORMATION SECURITY and ASSISTANT PROFESSOR - Cloud Technology.
Posted 1 week ago
5.0 years
0 Lacs
Pune, Maharashtra, India
Remote
Role Description We’re looking for a Senior Frontend Developer with more than 5 years of experience who loves to tackle challenging problems and has a firm grasp of browser technologies. You will take on a central role in developing our products using ReactJS, Ant Design, and other libraries with input from product management. Our teams are spread across several locations & serve customers in the US, Europe, and India (Pune, Bangalore, & NCR). Our team is at the forefront of technology, and loves working with others via Meetups and Hackathons. We are one of a couple of hundred companies who applied for the TiE Pune Nurture Accelerator Program for 2019/20 and 1 of 12 that actually graduated. We were also 1 of 4 companies accepted, out of 170 applicants, for the 2021 Brigade REAP Accelerator Program. Our Technologies Include: Python, Elasticsearch, ReactJS, React Native / Flutter / iOS / Android, Apache Cassandra, VoIP and related technologies (FreeSWITCH, Kamailio, etc.), Docker/K8s/Puppet, AWS/GCP/Azure. Responsibilities Develop user interface components that are robust and easy to maintain Build, test, document, and deploy at scale Implement and integrate RESTful APIs in ReactJS Work in a team-oriented environment, providing software development technical expertise and guidance to key stakeholders on a variety of enterprise-scale applications and projects Provide technical direction and guidance, as well as draft specifications, architect solutions, define timelines, advise on industry best practices and problems to be solved Work closely with Customers, Product Managers, and Architects to develop effective, high-quality enterprise software solutions Understand and apply a variety of project life-cycles, methods, and software development techniques Write code and review other people’s code. Ensure the technical feasibility of UI/UX designs. Optimize applications for maximum speed and scalability. About You 5+ years of overall software development experience Proficient understanding of the modern web tech stack including HTML, Less, jQuery, and ES6. Strong proficiency in JavaScript, including DOM manipulation and the JavaScript object model Good understanding of React.js and its core principles Experience with popular React.js workflows (such as Flux or Redux) Familiarity with integrating RESTful APIs and browser nuances Experience with front-end development tools such as Babel, Webpack, NPM, Yarn Attention to detail and a strong sense of ownership. The mindset to take up a project individually and meet the deadline BS/MS in Computer Science or a related stream is a must Bonus: Experience with unit testing using Jest or react-testing-library. Perks A great team culture Challenging work environment Open door policy Liberal work from home Conference and training support Amazing referral program PF & Health Insurance Team outings (Regular & Annual offsite) About Us We Are Engineers. We Are Innovators. We Are Creators. Inspired by real problems, driving real results, MetroGuild, a global B2B SaaS company, developed MetroLeads, a marketing, sales, and communications management platform. Rooted in the science of selling, MetroGuild evolved to offer a range of products and services to your Sales team. MetroGuild empowers organizations globally to own and grow their Marketing and Sales Teams and drive growth. MetroGuild provides CRM, digital asset building, and support to help organizations reach their true growth potential.
Posted 1 week ago
10.0 years
0 Lacs
Gurugram, Haryana, India
On-site
✅ Job Title: Data Engineer – Apache Spark, Scala, GCP & Azure 📍 Location: Gurugram (Hybrid – 3 days/week in office) 🕒 Experience: 5–10 Years 🧑💻 Type: Full-time 📩 Apply: Share your resume with the details listed below to vijay.s@xebia.com 🕐 Availability: Immediate joiners or max 2 weeks' notice period only 🚀 About the Role Xebia is looking for a skilled Data Engineer to join our fast-paced team in Gurugram. You will work on building and optimizing scalable data pipelines, processing large datasets using Apache Spark and Scala, and deploying on cloud platforms like GCP and Azure. If you're passionate about clean architecture, high-quality data flow, and performance tuning, this is the opportunity for you. 🔧 Key Responsibilities Design and develop robust ETL pipelines using Apache Spark Write clean and efficient data processing code in Scala Handle large-scale data movement, transformation, and storage Build solutions on Google Cloud Platform (GCP) and Microsoft Azure Collaborate with teams to define data strategies and ensure data quality Optimize jobs for performance and cost on distributed systems Document technical designs and ETL flows clearly for the team ✅ Must-Have Skills Apache Spark Scala ETL design & development Cloud platforms: GCP & Azure Strong understanding of Data Engineering best practices Solid communication and collaboration skills 🌟 Good-to-Have Skills Apache tools (Kafka, Beam, Airflow, etc.) Knowledge of data lake and data warehouse concepts CI/CD for data pipelines Exposure to modern data monitoring and observability tools 💼 Why Xebia? At Xebia, you’ll be part of a forward-thinking, tech-savvy team working on high-impact, global data projects. We prioritize clean code, scalable solutions, and continuous learning. Join us to build real-time, cloud-native data platforms that power business intelligence across industries. 📤 To Apply Please share your updated resume and include the following details in your email to vijay.s@xebia.com: Full Name: Total Experience: Current CTC: Expected CTC: Current Location: Preferred Xebia Location: Gurugram Notice Period / Last Working Day (if serving): Primary Skills: LinkedIn Profile URL: Note: Only candidates who can join immediately or within 2 weeks will be considered. Build intelligent, scalable data solutions with Xebia – let’s shape the future of data together. 📊🚀
Posted 1 week ago
0.0 - 3.0 years
0 - 0 Lacs
Surat, Gujarat
On-site
Job Overview: We are seeking a highly skilled and experienced Cloud Engineer with expertise in Java and Apache Flink to join our dynamic data engineering team. This role involves building scalable, real-time data pipelines and backend components while managing cloud infrastructure (AWS or GCP). The ideal candidate has a strong foundation in Java development, distributed stream processing, and hands-on experience with cloud-native data systems. Key Responsibilities: Java Backend Development: Write clean, efficient, and well-tested Java code Build POJOs and custom serializers for data processing Manage project dependencies using Maven or Gradle Apache Flink – Real-Time Stream Processing: Develop real-time data pipelines using Apache Flink Utilize Flink’s streaming and batch modes effectively Work with event time vs processing time concepts Implement Flink stateful operations (keyed and operator state) Set up checkpointing, fault-tolerance, and recovery via savepoints Optimize task execution using Flink parallelism, slots, and task chaining Data Integration & Connectors: Integrate Flink with Kafka (source/sink connectors) (Bonus) Experience with Kinesis or Google Pub/Sub Write data to various sinks such as Elasticsearch and MySQL Cloud Engineering: Design and manage scalable cloud-based infrastructure on AWS or GCP Ensure high availability, reliability, and performance of backend services Collaborate with DevOps teams on CI/CD and deployment strategies Required Skills & Qualifications: 3+ years of Java development experience 2+ years of hands-on experience with Apache Flink Strong understanding of distributed stream processing Experience with Kafka integration (source/sink) Familiarity with Elasticsearch, MySQL as data sinks Proficiency with Maven or Gradle build tools Solid grasp of event-driven architecture and real-time systems Experience working with cloud environments (AWS or GCP) Preferred Qualifications: Experience with Google Pub/Sub or Amazon Kinesis Prior experience building microservices and containerized apps Familiarity with CI/CD tools, monitoring, and logging frameworks Knowledge of other big data tools (e.g., Spark, Hive) is a plus Why Join Us? Work on cutting-edge data engineering and cloud projects Collaborate with a high-performing and passionate team 5-day work week and a strong focus on work-life balance Competitive salary and performance-based growth Learning opportunities with modern tools and cloud technologies Take the lead in transforming how real-time data powers decision-making across systems. Let’s build the future together. Job Type: Full-time Pay: ₹40,000.00 - ₹74,500.00 per month Benefits: Flexible schedule Health insurance Leave encashment Paid time off Ability to commute/relocate: Surat, Gujarat: Reliably commute or planning to relocate before starting work (Required) Education: Bachelor's (Required) Experience: Flink Java Developer: 2 years (Required) Cloud Engineer: 3 years (Required) Location: Surat, Gujarat (Required) Work Location: In person Speak with the employer +91 9904361666
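The posting targets Java, but the Flink concepts it names (checkpointing, parallelism, Kafka sources) carry over directly to Flink's Python API; a hedged PyFlink sketch follows, in which the broker, topic, and connector-jar path are assumptions.

```python
# PyFlink sketch of the Flink knobs named above (checkpointing, parallelism,
# Kafka source). Requires the Kafka connector jar on the Flink classpath.
from pyflink.common import WatermarkStrategy
from pyflink.common.serialization import SimpleStringSchema
from pyflink.datastream import StreamExecutionEnvironment
from pyflink.datastream.connectors.kafka import KafkaSource, KafkaOffsetsInitializer

env = StreamExecutionEnvironment.get_execution_environment()
env.set_parallelism(4)                 # task parallelism / slot usage
env.enable_checkpointing(10_000)       # checkpoint every 10s for fault tolerance
env.add_jars("file:///opt/flink/lib/flink-sql-connector-kafka.jar")  # assumed path

source = (
    KafkaSource.builder()
    .set_bootstrap_servers("broker1:9092")          # placeholder broker
    .set_topics("payments")                          # placeholder topic
    .set_group_id("payments-job")
    .set_starting_offsets(KafkaOffsetsInitializer.latest())
    .set_value_only_deserializer(SimpleStringSchema())
    .build()
)

stream = env.from_source(source, WatermarkStrategy.no_watermarks(), "kafka-source")
stream.map(lambda v: v.upper()).print()  # stand-in for real transforms and sinks
env.execute("flink-streaming-sketch")
```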
Posted 1 week ago
0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Job purpose: Design and implement the best-engineered technical solutions using the latest technologies and tools. Who you are: Lead our DevOps team and drive the adoption of DevOps best practices within the organization. Excellent communication and leadership skills, with the ability to lead and inspire a team. Strong problem-solving and troubleshooting skills. Knowledge of DevOps tools and of deploying and managing infrastructure with automation. Proficient knowledge of networking and firewalls. Strong proficiency in Linux, open-source, web-based, and cloud-based environments (ability to use open-source technologies and tools). Hands-on experience with Apache, Nginx, Git, and GNU tools. Must have exposure to AWS, Bitbucket Pipelines, Docker, npm, Maven, and Snyk. Experience working with shell scripting. Hands-on with SonarQube, Jenkins, and Kubernetes. Strong understanding of and experience with logging and logging frameworks. API-related skills (REST, and others such as Google, AWS, Atlassian). DevOps (Ansible, Apache, Python). Web services / REST (JSON). Good team player who can help and support the team. Good to have: Knowledge of all DevOps tools. Exposure to PCI DSS environments. Experience with any programming language. Knowledge of third-party integrations with automation. FinTech domain experience. What will excite us: Previous experience delivering solutions from idea to implementation. You communicate solutions and troubleshoot with ease. You keep a mature temperament when handling urgent or critical situations. What will excite you: Opportunity to build enterprise-grade applications. Complete ownership and independence of execution. Innovation is rewarded. Learn from accomplished UI tech architects. A great and rewarding work environment. Job location: Ahmedabad
Posted 1 week ago
8.0 years
0 Lacs
India
Remote
Avaya IVR and ChatBot Developer | Remote | 3-month contract, extendable. Job Purpose: Primary responsibility is to handle production IVR and chatbot applications. Understand business requirements and new projects raised for the IVR & chatbot. Coordinate with the development team so that requirements are developed and delivered as per user expectations. Plan and ensure smooth deployments and post-production support. Ensure applications are running in a healthy state with adequate monitoring. Work with stakeholders and collaborate with different teams for delivery and support of the applications. Key Result Areas: Ensure IVR and chatbot applications are stable and healthy in production; IVR and chatbot channels should be available 24x7 except during planned downtime activities. Conduct regular reviews of infra usage (AKS/CPU/memory/disk etc.) in both production and test environments. Responsible and accountable for patching (infra and application components). Automate BAU tasks and reduce manual intervention. Available to support applications during incidents/outages and ensure resolution. Work on RCA and ensure the problem management process is followed as per Mashreq policies. Work on documentation pertaining to DR and day-to-day activities as required by the management team. Ensure backups are scheduled as per the application recovery strategy and test the backups. Adhere to and keep updated SOPs, audit evidence gathering, infra, governance, and data policies. Comply with HR and organization policies. Work closely with business stakeholders on IVR and chatbot deliveries. Work with vendors on upgrades/incident fixes/new solution designs for vendor-based applications/products. Identify system improvements and simplify system complexity. Identify revenue opportunities and make constructive suggestions. Provide support of the IVR & chatbot systems to ensure expected system availability and stability as per business-criticality SLA/TAT. System monitoring and analysis: proactively report issues, analyze, and provide solutions. Service ticket support: close service tickets as per the agreed SLA, with root cause analysis and bug fixing if required. Participate in audit and other compliance activities as assigned and complete action items for mitigation and rectification. Ensure all processes adhere to internal standards set as part of Quality & Audit and Infrastructure Management Guidelines. Operating Environment, Framework and Boundaries, Working Relationships: End-to-end responsibility for delivery of IVR & chatbot requirements. Coordination with required stakeholders to meet project/URF timelines. Internal functions and work practices as per the policy, compliance, and procedures of the organization. Manage production support and maintenance of IVR & chatbot related systems. Problem Solving: Ability to break down user requirements and identify the most important and relevant information and options. Ability to present and discuss problems with others toward a permanent solution. Investigate and understand the cause of UAT defects raised. Interpret, understand, and resolve user support issues. Recommend solutions to system or process obstacles. Decision Making Authority & Responsibility: Responsible for designing functional solutions to business requirements. Use effective judgment to weigh different options for achieving better results within appropriate timeframes. Highlight any concerns affecting workplace deliverables.
Complete root cause analysis, propose alternative solutions, and/or escalate in line with the process/practices. Knowledge, Skills and Experience: Domain knowledge of Avaya IVR, Avaya Orchestration Designer, chatbot technologies, Azure, and microservices. Expertise in Java. Proficiency in SQL Server and other databases like MongoDB. Knowledge of Apache Tomcat and Kubernetes. Proficiency in integrating and consuming RESTful web service interfaces. 8-10 years of experience. Mandatory: Avaya IVR, Avaya OD, chatbot integrations knowledge, cloud technologies (Azure), RESTful APIs, SQL Server and above, Java, Spring Boot, Apache Tomcat, Kubernetes, microservices design. Desirable: Document/Statement systems’ domain knowledge; mobility development tools.
Posted 1 week ago
8.0 - 10.0 years
9 - 14 Lacs
Pune
Work from Office
MQ/IIB Administrator: 8-10+ years of administrative experience using the IBM MQ family of tools, including IBM Message Broker, IIB, MQ, and ACE 11.x. Must know MQ/IIB clusters and have hands-on experience with SSL certificate installation. Must have deep knowledge of client-server and server-server architecture configuration. Must have hands-on experience in MQ/IIB migration and troubleshooting. Must have hands-on experience in QM and queue-related error troubleshooting. Must have ITIL knowledge and deployment experience. Must have IIB product knowledge, SSL configuration, etc.
Posted 1 week ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
We are hiring a Java DevOps Engineer for the Hyderabad/Chennai location. Skills: Java, Spring Boot, RDBMS, DevOps. Experience: 5-9 years. Work Location: Chennai/Hyderabad. Key Responsibilities: We are currently seeking a skilled and experienced Java J2EE developer with a minimum of 5 years of hands-on experience. Capability to create design solutions independently for a given module. Develop and maintain web applications using Java Spring Boot and user interfaces using HTML, CSS, and JavaScript. Write and maintain unit tests using JUnit and Mockito. Deploy and manage applications on servers such as JBoss, WebLogic, Apache, and Nginx. Ensure application security. Familiarity with build tools such as Maven and Gradle. Experience with caching technologies like Redis and Coherence. Understanding of Spring Security. Knowledge of Groovy is a plus. Excellent problem-solving skills and attention to detail. Strong communication and teamwork abilities. Proven track record of delivering high-quality software solutions, working with cross-functional teams to define, design, and ship new features. Troubleshoot and resolve issues promptly. Stay updated with the latest industry trends and technologies. Required Skills: Strong experience with Java and Spring frameworks, Spring Boot, and familiarity with CI/CD. Interview Process: CBIE Assessment, then two technical rounds. Interested or know someone who fits? Send your resume to gautam@mounttalent.com.
Posted 1 week ago
10.0 - 14.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Role: Cloud Architect – Analytics & Data Products. We’re looking for a Cloud Architect / Lead to design, build, and manage scalable AWS infrastructure that powers our analytics and data product initiatives. This role focuses on automating infrastructure provisioning, application/API hosting, and enabling data and GenAI workloads through a modern, secure cloud environment. Key Responsibilities: Design and provision AWS infrastructure using Terraform or AWS CloudFormation to support evolving data product needs. Develop and manage CI/CD pipelines using Jenkins, AWS CodePipeline, CodeBuild, or GitHub Actions. Deploy and host internal tools, APIs, and applications using ECS, EKS, Lambda, API Gateway, and ELB. Provision and support analytics and data platforms using S3, Glue, Redshift, Athena, Lake Formation, and orchestration tools like Step Functions or Apache Airflow (MWAA). Implement cloud security, networking, and compliance using IAM, VPC, KMS, CloudWatch, CloudTrail, and AWS Config. Collaborate with data engineers, ML engineers, and analytics teams to align infrastructure with application and data product requirements. Support GenAI infrastructure, including Amazon Bedrock, SageMaker, or integrations with APIs like OpenAI. Requirements: 10-14 years of experience in cloud engineering, DevOps, or cloud architecture roles. Strong hands-on expertise with the AWS ecosystem and tools listed above. Proficiency in scripting (e.g., Python, Bash) and infrastructure automation. Experience deploying containerized workloads using Docker, ECS, EKS, or Fargate. Familiarity with data engineering and GenAI workflows is a plus. AWS certifications (e.g., Solutions Architect, DevOps Engineer) are preferred.
Posted 1 week ago
3.0 - 5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title: Data Engineer - AWS MSK Specialist Job Summary: We're looking for a skilled Data Engineer with expertise in AWS Managed Streaming for Apache Kafka (MSK) to design, build, and maintain high-performance data pipelines. You'll work closely with data architects and software engineers to ensure seamless data integration and streaming capabilities using AWS MSK. Responsibilities: - Design, develop, and manage real-time data pipelines using AWS MSK - Integrate AWS MSK with various data sources and consumers - Monitor and optimize AWS MSK cluster performance and costs - Collaborate with DevOps to ensure optimal performance and uptime - Implement security best practices for data streaming and storage - Document technical processes and best practices - Work with AWS services like S3, Glue, and Lambda for data processing and integration Requirements: - Bachelor’s degree in computer science, Engineering, or a related field - 3-5 years of experience working with AWS MSK or Apache Kafka in a production environment - Strong understanding of distributed systems, real-time data processing, and cloud-based data streaming - Proficiency in programming languages such as Python - Experience with AWS services like S3, Glue, and Lambda - Excellent problem-solving and debugging skills Nice-to-Have Skills: - Experience with containerization technologies like Docker and Kubernetes - Familiarity with other messaging systems like RabbitMQ or Apache Pulsar - Experience with data serialization formats like Avro, Protobuf, or JSON - Knowledge of Snowflake or other cloud-based data warehousing solutions Interested candidate can apply or share the updated resume on jahanavi.kaudiki@bpmlinks.com
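As a rough illustration of the Python-plus-Kafka skills listed above, a minimal kafka-python produce/consume sketch; the broker address and topic are placeholders, and a real MSK cluster would typically also require TLS or IAM authentication settings.

```python
# Minimal Kafka produce/consume sketch with kafka-python; names are placeholders.
import json
from kafka import KafkaConsumer, KafkaProducer

BROKERS = ["b-1.example.kafka.us-east-1.amazonaws.com:9092"]  # placeholder broker

producer = KafkaProducer(
    bootstrap_servers=BROKERS,
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("clickstream", {"user": "u1", "event": "page_view"})
producer.flush()

consumer = KafkaConsumer(
    "clickstream",
    bootstrap_servers=BROKERS,
    group_id="analytics",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
)
for record in consumer:
    print(record.value)  # hand off to downstream processing here
    break
```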
Posted 1 week ago
3.0 years
0 Lacs
Gurugram, Haryana, India
On-site
About Intellismith Intellismith, founded in 2019, is a dynamic HR service and technology startup. Our mission is to tackle India’s employability challenges head-on. We specialize in scaling talent acquisition and technology resource outsourcing. Also, as an IBM and Microsoft Business Partner, we leverage industry-leading solutions to enhance and diversify our offerings. As we chart our growth trajectory, we’re transitioning from a service-centric model to a product-focused company. Our journey involves building a cutting-edge skilling platform to empower Indian youth with domain-specific training, making them job-ready for the competitive market. Why Join Intellismith? Impactful Mission: Be part of a forward-thinking organisation committed to solving employability challenges. Your work directly contributes to bridging the skills gap and transforming lives. Innovation and Growth: Contribute to our exciting transition from services to products. Shape the future of our skilling platform and impact Indian youth positively. Collaborative Environment: Work alongside talented professionals across multiple locations. Our diverse teams foster creativity and learning. Entrepreneurial Spirit: Intellismith encourages fresh ideas and entrepreneurial thinking. Your voice matters here. As a leading outsourcing partners, we are hiring a Java & Apache Camel Specialist to work on a project for our client, which is the largest provider of telecoms and mobile money services in 14 countries spanning Sub-Saharan, Central, and Western Africa. Job Details: Experience: Min 3 years of relevant experience in Java & Apache Camel Qualification: BE / B Tech / MCA / BCA / MTech. Location: Gurugram (WFO - 5 days) CTC Bracket: 21.5 LPA Notice Period: Immediate to 15 days (Candidates with notice period of less than 30 days are preferred). Mandatory Skills: Java and Apache Camel, Kafka and Quarkus (good to have), Key Responsibilities: Design and develop scalable microservices using Quarkus framework. Build and maintain integration flows and APIs leveraging Red Hat Fuse (Apache Camel) for enterprise integration patterns. Develop and consume RESTful web services and APIs. Design, implement, and optimize Kafka producers and consumers for real-time data streaming and event-driven architecture. Write efficient, well-documented, and testable Java code adhering to best practices. Work with relational databases (e.g., PostgreSQL, MySQL, Oracle) including schema design, queries, and performance tuning. Collaborate with DevOps teams to build and maintain CI/CD pipelines for automated build, test, and deployment workflows. Deploy and manage applications on OpenShift Container Platform (OCP) including containerization best practices (Docker). Participate in code reviews, design discussions, and agile ceremonies. Troubleshoot and resolve production issues with a focus on stability and performance. Required Skills & Experience: Strong experience with Java (Java 8 or above) and the Quarkus framework. Expertise in Red Hat Fuse (or Apache Camel) for integration development. Proficient in designing and consuming REST APIs . Experience with Kafka for event-driven and streaming solutions. Solid understanding of relational databases and SQL. Experience in building and maintaining CI/CD pipelines (e.g., Jenkins, GitLab CI) and automated deployment. Hands-on experience deploying applications to OpenShift Container Platform (OCP) . 
Working knowledge of containerization tools like Docker #JAVA #Javadeveloper #Quarkus #Kafka #Apachecamel #Immediatejoiner #career #ITJobs
Posted 1 week ago
0 years
0 Lacs
Coimbatore, Tamil Nadu, India
On-site
Company: IT Services Organization. Key Skills: Spark, Azure Databricks, Azure, Python, PySpark. Roles and Responsibilities: Develop and maintain scalable data processing systems using Apache Spark and Azure Databricks. Implement data integration from various sources including RDBMS, ERP systems, and files. Design and optimize SQL queries, stored procedures, and relational schemas. Build stream-processing systems using technologies such as Apache Storm or Spark Streaming. Utilize messaging systems like Kafka or RabbitMQ for data ingestion. Ensure performance tuning of Spark jobs for optimal efficiency. Collaborate with cross-functional teams to deliver high-quality data solutions. Lead and mentor a team of data engineers, fostering a culture of continuous improvement and Agile practices. Skills Required: Proficient in Apache Spark and Azure Databricks. Strong experience with the Azure ecosystem and Python. Working knowledge of PySpark (nice to have). Experience in data integration from varied sources. Expertise in SQL optimization and stream-processing systems. Familiarity with Kafka or RabbitMQ. Ability to lead and mentor engineering teams. Strong understanding of distributed computing principles. Education: Bachelor's degree in Computer Science, Information Technology, or a related field.
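For illustration, a short PySpark sketch of two common Spark tuning moves implied by this posting: broadcasting a small dimension table into a join, and repartitioning before a partitioned write. Table paths and column names are assumptions.

```python
# Hedged Spark performance-tuning sketch; paths and columns are illustrative.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("tuning-demo").getOrCreate()

facts = spark.read.parquet("/mnt/lake/sales")   # large fact table (assumed path)
dims = spark.read.parquet("/mnt/lake/stores")   # small dimension table

# Broadcast the small side to avoid a shuffle-heavy sort-merge join
joined = facts.join(F.broadcast(dims), on="store_id")

# Repartition by the write key so output files align with downstream reads
(joined.repartition("region")
       .write.mode("overwrite")
       .partitionBy("region")
       .parquet("/mnt/lake/sales_by_region"))
```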
Posted 1 week ago
10.0 years
0 Lacs
India
Remote
Job Description EMPLOYMENT TYPE: Full-Time, Permanent LOCATION: Remote (Pan India) SHIFT TIMINGS: 2:00 pm-11:00 pm IST Budget: As per company standards REPORTING: This position will report to our CEO or any other Lead as assigned by Management. The Senior Data Engineer will be responsible for building and extending our data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys working with big data and building systems from the ground up. You will collaborate with our software engineers, database architects, data analysts, and data scientists to ensure our data delivery architecture is consistent throughout the platform. You must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products. The right candidate will be excited by the prospect of optimizing or even re-designing our company’s data architecture to support our next generation of products and data initiatives. What You’ll Be Doing: ● Design and build parts of our data pipeline architecture for extraction, transformation, and loading of data from a wide variety of data sources using the latest Big Data technologies. ● Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc. ● Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs. ● Work with machine learning, data, and analytics experts to drive innovation, accuracy and greater functionality in our data system. Qualifications: ● Bachelor's degree in Engineering, Computer Science, or relevant field. ● 10+ years of relevant and recent experience in a Data Engineer role. ● 5+ years recent experience with Apache Spark and solid understanding of the fundamentals. ● Deep understanding of Big Data concepts and distributed systems. ● Strong coding skills with Scala, Python, Java and/or other languages and the ability to quickly switch between them with ease. ● Advanced working SQL knowledge and experience working with a variety of relational databases such as Postgres and/or MySQL. ● Cloud experience with Databricks. ● Experience working with data stored in many formats including Delta Tables, Parquet, CSV and JSON. ● Comfortable working in a Linux shell environment and writing scripts as needed. ● Comfortable working in an Agile environment. ● Machine Learning knowledge is a plus. ● Must be capable of working independently and delivering stable, efficient and reliable software. ● Excellent written and verbal communication skills in English. ● Experience supporting and working with cross-functional teams in a dynamic environment.
Posted 1 week ago
6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Summary We are looking for a Data Engineer with strong experience in cloud platforms (AWS & Azure), Scala programming, and a solid understanding of data architecture and governance frameworks. You will play a key role in building, optimizing, and maintaining scalable data pipelines and systems while ensuring data quality, security, and compliance across the organization. Key Responsibilities Data Engineering & Development Design and develop reliable, scalable ETL/ELT data pipelines using Scala, SQL, and orchestration tools. Integrate and process structured, semi-structured, and unstructured data from various sources (APIs, databases, flat files, etc.). Develop solutions on AWS (e.g., S3, Glue, Redshift, EMR) and Azure (e.g., Data Factory, Synapse, Blob Storage). Cloud & Infrastructure Build cloud-native data solutions that align with enterprise architecture standards. Leverage IaC tools (Terraform, CloudFormation, ARM templates) to deploy and manage infrastructure. Monitor performance, cost, and security posture of data environments in both AWS and Azure. Data Architecture & Governance Collaborate with data architects to define and implement logical and physical data models. Apply data governance principles including data cataloging, lineage tracking, data privacy, and compliance (e.g., GDPR). Support the enforcement of data policies and data quality standards across data domains. Collaboration & Communication Work cross-functionally with data analysts, scientists, architects, and business stakeholders to support data needs. Participate in Agile ceremonies and contribute to sprint planning and reviews. Maintain clear documentation of pipelines, data models, and data flows. Required Qualifications Bachelor's degree in Computer Science, Engineering, or a related field. 3–6 years of experience in data engineering or data platform development. Hands-on experience with AWS and Azure data services. Proficient in Scala for data processing (e.g., Spark, Kafka Streams). Strong SQL skills and familiarity with distributed systems. Experience with orchestration tools (e.g., Apache Airflow, Azure Data Factory).
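As a sketch of the orchestration skills mentioned above, a minimal Apache Airflow DAG wiring an extract, transform, load sequence; the DAG id, schedule, and task bodies are illustrative stubs in Airflow 2.x style.

```python
# Illustrative Airflow DAG: extract -> transform -> load. Task bodies are stubs.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():    # pull from source APIs/DBs
    ...

def transform():  # apply business rules, validate quality
    ...

def load():       # write to the warehouse (e.g., Redshift/Synapse)
    ...

with DAG(
    dag_id="daily_sales_pipeline",   # hypothetical id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # Airflow 2.4+ keyword; older: schedule_interval
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3
```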
Posted 1 week ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Role: Snowflake Developer. Location: Pune (Kalyani Nagar). Experience: 6+ years. Key Responsibilities: • Design, develop, and maintain scalable Snowflake data warehouse solutions. • Write and optimize complex SQL queries for data extraction, transformation, and reporting. • Develop and manage Snowflake stored procedures using SQL and JavaScript. • Implement and manage data integration between Snowflake and external systems (e.g., using ETL tools, APIs, or Snowpipe). • Create and maintain data models and ensure data quality and consistency across environments. • Collaborate with data engineers, analysts, and business stakeholders to understand requirements and deliver reliable solutions. • Monitor performance, diagnose issues, and implement performance-tuning best practices. • Implement access controls and security policies aligned with enterprise standards. Required Skills & Qualifications: • Strong hands-on experience with the Snowflake platform and architecture. • Working knowledge of common Python libraries. • Advanced proficiency in SQL, including writing and optimizing complex queries. • Experience with stored procedures, user-defined functions (UDFs), and task scheduling in Snowflake. • Familiarity with data integration tools (e.g., Informatica, Talend, Apache Airflow, dbt, Fivetran, or custom Python scripts). • Experience with data modeling (star/snowflake schemas) and data warehouse design. • Knowledge of cloud platforms (AWS, Azure, or GCP) and how Snowflake integrates with them. • Experience working with large datasets and performance tuning of data loads/queries. • Strong problem-solving and communication skills. Please share your resume at hema@synapsetechservice.com
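For context, a minimal sketch using the Snowflake Python connector, matching the SQL-plus-Python emphasis above; the connection parameters and query are placeholder assumptions and would normally come from a secrets manager.

```python
# Minimal snowflake-connector-python sketch; credentials are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",
    user="example_user",
    password="example_password",
    warehouse="ANALYTICS_WH",
    database="SALES_DB",
    schema="PUBLIC",
)

try:
    cur = conn.cursor()
    # Parameterized query: %s placeholders avoid SQL injection
    cur.execute(
        "SELECT region, SUM(amount) FROM orders "
        "WHERE order_date >= %s GROUP BY region",
        ("2024-01-01",),
    )
    for region, total in cur:
        print(region, total)
finally:
    conn.close()
```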
Posted 1 week ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
Backend & MLOps Engineer – Integration, API, and Infrastructure Expert 1. Role Objective: Responsible for building robust backend infrastructure, managing ML operations, and creating scalable APIs for AI applications. Must excel in deploying and maintaining AI products in production environments with high availability and security standards. The engineer will be expected to build secure, scalable backend systems that integrate AI models into services (REST, gRPC), manage data pipelines, enable model versioning, and deploy containerized applications in secure (air-gapped) Naval infrastructure. 2. Key Responsibilities: 2.1. Create RESTful and/or gRPC APIs for model services. 2.2. Containerize AI applications and maintain Kubernetes-compatible Docker images. 2.3. Develop CI/CD pipelines for model training and deployment. 2.4. Integrate models as microservices using TorchServe, Triton, or FastAPI. 2.5. Implement observability (metrics, logs, alerts) for deployed AI pipelines. 2.6. Build secured data ingestion and processing workflows (ETL/ELT). 2.7. Optimize deployments for CPU/GPU performance, power efficiency, and memory usage. 3. Educational Qualifications - Essential Requirements: 3.1. B.Tech/M.Tech in Computer Science, Information Technology, or Software Engineering. 3.2. Strong foundation in distributed systems, databases, and cloud computing. 3.3. Minimum 70% marks or 7.5 CGPA in relevant disciplines. Professional Certifications: 3.4. AWS Solutions Architect/DevOps Engineer Professional. 3.5. Google Cloud Professional ML Engineer or DevOps Engineer. 3.6. Azure AI Engineer or DevOps Engineer Expert. 3.7. Kubernetes Administrator (CKA) or Developer (CKAD). 3.8. Docker Certified Associate. Core Skills & Tools - 4. Backend Development: 4.1. Languages: Python, Go, Java, Node.js, Rust (for performance-critical components). 4.2. Web Frameworks: FastAPI, Django, Flask, Spring Boot, Express.js. 4.3. API Development: RESTful APIs, GraphQL, gRPC, WebSocket connections. 4.4. Authentication & Security: OAuth 2.0, JWT, API rate limiting, encryption protocols. 5. MLOps & Model Management: 5.1. ML Platforms: MLflow, Kubeflow, Apache Airflow, Prefect. 5.2. Model Serving: TensorFlow Serving, TorchServe, ONNX Runtime, NVIDIA Triton, BentoML. 5.3. Experiment Tracking: Weights & Biases, Neptune, ClearML. 5.4. Feature Stores: Feast, Tecton, Amazon SageMaker Feature Store. 5.5. Model Monitoring: Evidently AI, Arize, Fiddler, custom monitoring solutions. 6. Infrastructure & DevOps: 6.1. Containerization: Docker, Podman, container optimization. 6.2. Orchestration: Kubernetes, Docker Swarm, OpenShift. 6.3. Cloud Platforms: AWS, Google Cloud, Azure (multi-cloud expertise preferred). 6.4. Infrastructure as Code: Terraform, CloudFormation, Pulumi, Ansible. 6.5. CI/CD: Jenkins, GitLab CI, GitHub Actions, ArgoCD. 6.6. DevOps & Infra: Docker, Kubernetes, NGINX, GitHub Actions, Jenkins. 7. Database & Storage: 7.1. Relational: PostgreSQL, MySQL, Oracle (for enterprise applications). 7.2. NoSQL: MongoDB, Cassandra, Redis, Elasticsearch. 7.3. Vector Databases: Pinecone, Weaviate, Chroma, Milvus. 7.4. Data Lakes: Apache Spark, Hadoop, Delta Lake, Apache Iceberg. 7.5. Object Storage: AWS S3, Google Cloud Storage, MinIO. 8. Secure Deployment: 8.1. Military-grade security protocols and compliance. 8.2. Air-gapped deployment capabilities. 8.3. Encrypted data transmission and storage. 8.4. Role-based access control (RBAC) & IDAM integration. 8.5. Audit logging and compliance reporting. 9. Edge Computing: 9.1. Deployment on naval vessels with air-gapped connectivity. 9.2. Optimization of applications for resource-constrained environments. 10. High Availability Systems: 10.1. Mission-critical system design with 99.9% uptime. 10.2. Disaster recovery and backup strategies. 10.3. Load balancing and auto-scaling. 10.4. Failover mechanisms for critical operations. 11. Cross-Compatibility Requirements: 11.1. Define and expose APIs in a documented, frontend-consumable format (Swagger/OpenAPI). 11.2. Develop model loaders for the AI Engineer's ONNX/serialized models. 11.3. Provide UI developers with test environments, mock data, and endpoints. 11.4. Support frontend debugging, edge deployment bundling, and user role enforcement. 12. Experience Requirements: 12.1. Production experience with cloud platforms and containerization. 12.2. Experience building and maintaining APIs serving millions of requests. 12.3. Knowledge of database optimization and performance tuning. 12.4. Experience with monitoring and alerting systems. 12.5. Architected and deployed large-scale distributed systems. 12.6. Led infrastructure migration or modernization projects. 12.7. Experience with multi-region deployments and disaster recovery. 12.8. Track record of optimizing system performance and cost.
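As a hedged illustration of responsibility 2.4 above (serving a model as a microservice with FastAPI), a minimal endpoint wrapping a serialized ONNX model; the model path, input shape, and route are assumptions.

```python
# Minimal FastAPI + ONNX Runtime model-serving sketch; names are placeholders.
import numpy as np
import onnxruntime as ort
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
session = ort.InferenceSession("model.onnx")   # assumed model artifact
INPUT_NAME = session.get_inputs()[0].name

class PredictRequest(BaseModel):
    features: list[float]

@app.post("/predict")
def predict(req: PredictRequest):
    # Shape (1, n_features) float32 batch; adjust to the real model's input
    batch = np.asarray([req.features], dtype=np.float32)
    outputs = session.run(None, {INPUT_NAME: batch})
    return {"prediction": outputs[0].tolist()}

# Run locally with: uvicorn app:app --host 0.0.0.0 --port 8000
```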
Posted 1 week ago
2.0 years
0 Lacs
India
Remote
Job description L1 Support – Data Engineering (Remote, South India) Location: Permanently based in South India (any city) – non-negotiable Work Mode: Remote | 6 days/week | 24x7x365 support (rotational shifts) Salary Range: INR 2.5 to 3 Lacs per annum Experience: 2 years Language: English proficiency mandatory; Hindi is a plus About the Role We're looking for an experienced and motivated L1 Support Engineer – Data Engineering to join our growing team. If you have solid exposure to AWS, SQL, and Python scripting, and you're ready to thrive in a 24x7 support environment, this role is for you! What You’ll Do Monitor and support AWS services (S3, EC2, CloudWatch, IAM) Handle SQL-based issue resolution and data analysis Run and maintain Python scripts; shell scripting is a plus Support ETL pipelines and data workflows Monitor Apache Airflow DAGs and resolve basic issues Collaborate with cross-functional and multicultural teams What We’re Looking For B.Tech or MCA preferred, but candidates with a Bachelor's degree in any field and the right skillset are welcome to apply. 2 years of Data Engineering Support or similar experience Strong skills in AWS, SQL, Python, and ETL processes Familiarity with data warehousing (Amazon Redshift or similar) Ability to work rotational shifts in a 6-day, 24x7 environment Excellent communication and problem-solving skills English fluency is required; Hindi is an advantage
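For flavour, a small boto3 sketch of the kind of routine AWS check this support role describes: confirming files landed in S3 and pulling a basic CloudWatch metric. The bucket, prefix, and instance id are placeholders.

```python
# Routine L1 checks with boto3; all resource names are placeholders.
import datetime
import boto3

s3 = boto3.client("s3")
cloudwatch = boto3.client("cloudwatch")

# 1) Verify today's landing files exist under an expected prefix
today = datetime.date.today().isoformat()
resp = s3.list_objects_v2(Bucket="example-raw-bucket", Prefix=f"landing/{today}/")
print("files landed:", resp.get("KeyCount", 0))

# 2) Check average CPU on a monitored EC2 instance over the last hour
now = datetime.datetime.utcnow()
stats = cloudwatch.get_metric_statistics(
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
    StartTime=now - datetime.timedelta(hours=1),
    EndTime=now,
    Period=300,
    Statistics=["Average"],
)
for point in stats["Datapoints"]:
    print(point["Timestamp"], point["Average"])
```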
Posted 1 week ago
5.0 - 10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
We are hiring Java Developers. We are holding a walk-in F2F weekend drive on 2nd August at our Pune office (Shivajinagar). If interested and looking for a change, please share your updated CV at pinki.patade@lntinfotech.com. Experience: 5 to 10 years. Notice Period: Open. JD: Job Location: Pune, ICC Tech Park. Skills Required: Java, Microservices, Spring Boot, Hibernate. Java 18; Hibernate; JPA; Spring; Spring Boot; MVC; REST/SOAP APIs and REST/SOAP web services; microservices; OOPS; in-depth knowledge of Collections; SQL/PLSQL/NoSQL; web servers (Apache Tomcat, WebLogic, WebSphere); test software (JUnit, Mockito, PowerMock, Mocha, Karma, Chai); troubleshoot, debug, and upgrade software; version control tools (e.g., GitHub, GitLab, Bitbucket), knowledge of minimum one; 5+ years with design patterns, API designing, and TDD (test-driven development); CI/CD tools (AWS, Jenkins, Docker, Kubernetes); create security and data protection settings; project management tools (Maven, Gradle, log4j, Jira, Sonar, PuTTY, Ant, Postman, Swagger); Agile development including daily scrums and weekly iteration reviews and planning; build features and applications with a mobile-responsive design; write technical documentation; excellent communication and teamwork skills; Kafka, Elasticsearch, MongoDB. Mandatory certificate: Oracle Certified Professional Java SE 11 Programmer. Please share the below info while applying: Total Experience; Experience in Java; Experience in Angular; Current Company; Current CTC; Expected CTC; Any Offer; Available for F2F on 2nd Aug; Notice Period. If the job role does not match, please ignore this.
Posted 1 week ago
6.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
🚀 We’re Hiring | AEM Digital Experience Roles (Forms, Front-End, Back-End, Project Management) 💼 Employment Type : Full-Time 📧 Apply/Refer : rizwaan@avivsha.com 🧠 Experience : 6 to 15+ Years (Multiple Levels Open) Are you passionate about Adobe Experience Manager and building cutting-edge digital solutions? We are hiring across multiple AEM skill areas including Front-End, Back-End, Adaptive Forms, and Project Management . Join a high-performing team delivering impactful digital transformation using AEM across industries. 💡 Open Roles : AEM Front-End Developer / Sr. Consultant AEM Back-End Developer / Sr. Consultant AEM Adaptive Forms Specialist AEM Project Manager / Program Manager 🎨 Front-End Developer Skills (MUST unless noted) : JavaScript (ES6), JSON, JSON Schema RESTful APIs (async/await, Promises) Design Systems, Style Guides, Pattern Libraries (debugging required) Git, Babel, Webpack, NPM Core Web Vitals, web performance optimization Micro Front-End & Headless CMS (Good to have) 🧩 Adaptive Forms Expertise : Adaptive Forms development Form Data Model (FDM) design & backend integration Lazy Loading and performance tuning Logical structuring & Fragments for reusability ⚙️ Back-End (AEM Java) Skills : Strong Java programming & Servlet development OSGi / FELIX framework Apache Sling, Maven, JCR/CRX RESTful web service development JUnit, code quality reviews, clean coding 👨💼 AEM Project Manager Responsibilities : Manage multiple consulting engagements across Adobe Marketing Cloud Interact with stakeholders: customers, partners, internal teams Agile/Scrum delivery & governance Experience in AEM, Adobe Campaign, Analytics preferred Project Management certification (PMP, CSM, PRINCE2) required Strong CXO-level communication & digital transformation experience ✅ Non-Technical Skills Across All Roles : Excellent problem-solving and analytical ability Team leadership and mentorship SDLC and Agile process knowledge Strong communication skills (written & verbal) Self-motivated, solution-oriented, and collaborative 📩 Interested in joining our AEM team or know someone who might be a great fit? Send your CV to rizwaan@avivsha.com
Posted 1 week ago
10.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Summary The Chapter Lead - Backend Development is a hands-on developer role focusing on back-end development and is accountable for people management and capability development of their Chapter members. Responsibilities in detail are: Responsibilities Oversees the execution of functional standards and best practices and provides technical assistance to the members of their Chapter. Responsible for the quality of the code repository where applicable. Maintain exemplary coding standards within the team, contributing to code base development and code repository management. Perform code reviews to guarantee quality and promote a culture of technical excellence in Java development. Function as a technical leader and active coder, setting and enforcing domain-specific best practices and technology standards. Allocate technical resources and personal coding time effectively, balancing leadership with hands-on development tasks. Maintain a dual focus on leadership and hands-on development, committing code while steering the chapter's technical direction. Oversee Java backend development standards within the chapter across squads, ensuring uniform excellence and adherence to best coding practices. Harmonize Java development methodologies across the squad, guiding the integration of innovative practices that align with the bank’s engineering strategies. Advocate for the adoption of cutting-edge Java technologies and frameworks, driving the evolution of backend practices to meet future challenges. Strategy Oversees the execution of functional standards and best practices and provides technical assistance to the members of their Chapter. Responsible for the quality of the code repository where applicable. Acts as a conduit for the wider domain strategy, for example technical standards. Prioritises and makes available capacity for technical debt. This role is about capability building; it does not own applications or delivery. Actively shapes and drives towards the Bank-Wide engineering strategy and programmes to uplift standards and steer the technological direction towards excellence. Act as a custodian for Java backend expertise, providing strategic leadership to enhance skill sets and ensure the delivery of high-performance banking solutions. Business Experienced practitioner and hands-on contribution to the squad delivery for their craft (e.g., Engineering). Responsible for balancing skills and capabilities across teams (squads) and hives in partnership with the Chief Product Owner & Hive Leadership, and in alignment with the fixed capacity model. Responsible for evolving the craft towards improving automation, simplification, and innovative use of the latest market trends. Collaborate with product owners and other tech leads to ensure applications meet functional requirements and strategic objectives. Processes Promote a feedback-rich environment, utilizing internal and external insights to continuously improve chapter operations. Adopt and embed the Change Delivery Standards throughout the lifecycle of the product / service. Ensure role, job descriptions and expectations are clearly set and periodic feedback provided to the entire team. Follows the chapter operating model to ensure a system exists to continue to build capability and performance of the chapter. The Chapter Lead's focus may vary based upon the specific chapter domain they are leading. People & Talent Accountable for people management and capability development of their Chapter members.
Reviews metrics on capabilities and performance across their area, has improvement backlog for their Chapters and drives continual improvement of their chapter. Focuses on the development of people and capabilities as the highest priority. Risk Management Responsible for effective capacity risk management across the Chapter with regards to attrition and leave plans. Ensures the chapter follows the standards with respect to risk management as applicable to their chapter domain. Adheres to common practices to mitigate risk in their respective domain. Design and uphold a robust risk management plan, with contingencies for succession and role continuity, especially in critical positions. Governance Ensure all artefacts and assurance deliverables are as per the required standards and policies (e.g., SCB Governance Standards, ESDLC etc.). Regulatory & Business Conduct Ensure a comprehensive understanding of and adherence to local banking laws, anti-money laundering regulations, and other compliance mandates. Conduct business activities with a commitment to legal and regulatory compliance, fostering an environment of trust and respect. Key Stakeholders Chapter Area Lead Sub-domain Tech Lead Domain Architect Business Leads / Product owners Other Responsibilities Champion the company's broader mission and values, integrating them into daily operations and team ethos. Undertake additional responsibilities as necessary, ensuring they contribute to the organisation's strategic aims and adhere to Group and other Relevant policies. \ Qualification Requirements & Skills Bachelor’s or Master’s degree in Computer Science, Computer Engineering, or related field, with preference given to advanced degrees. 10 years of professional Java development experience, including a proven record in backend system architecture and API design. At least 5 years in a leadership role managing diverse development teams and spearheading complex Java projects. Proficiency in a range of Java frameworks such as Spring, Spring Boot, and Hibernate, and an understanding of Apache Struts. Proficient in Java, with solid expertise in core concepts like object-oriented programming, data structures, and complex algorithms. Knowledgeable in web technologies, able to work with HTTP, RESTful APIs, JSON, and XML Expert knowledge of relational databases such as Oracle, MySQL, PostgreSQL, and experience with NoSQL databases like MongoDB, Cassandra is a plus Familiarity with DevOps tools and practices, including CI/CD pipeline deployment, containerisation technologies like Docker and Kubernetes, and cloud platforms such as AWS, Azure, or GCP. Solid grasp of front-end technologies (HTML, CSS, JavaScript) for seamless integration with backend systems. Strong version control skills using tools like Git / Bitbucket with a commitment to maintaining high standards of code quality through reviews and automated tests. Exceptional communication and team-building skills, with the capacity to mentor developers, facilitate technical skill growth, and align team efforts with strategic objectives. Strong problem-solving skills and attention to detail. Excellent communication and collaboration skills. Ability to work effectively in a fast-paced, dynamic environment. Role Specific Technical Competencies Hands-on Java Development Leadership in System Architecture Database Proficiency CI / CD Container Platforms – Kubernetes / OCP / Podman About Standard Chartered We're an international bank, nimble enough to act, big enough for impact. 
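As a concrete illustration of the hands-on Spring Boot REST work this posting describes, here is a minimal, hypothetical sketch. All names (the com.example package, AccountController, Account, the /api/accounts path) are assumptions for illustration only, not taken from the role, and the class presumes a standard Spring Boot application with spring-boot-starter-web on the classpath.

```java
package com.example.accounts;

import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicLong;

import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;

@RestController
@RequestMapping("/api/accounts")
public class AccountController {

    // In-memory store standing in for a relational database; illustration only.
    private final Map<Long, Account> store = new ConcurrentHashMap<>();
    private final AtomicLong idSequence = new AtomicLong();

    // GET /api/accounts — list all accounts as JSON.
    @GetMapping
    public List<Account> list() {
        return List.copyOf(store.values());
    }

    // GET /api/accounts/{id} — fetch one account, 404 if absent.
    @GetMapping("/{id}")
    public ResponseEntity<Account> get(@PathVariable long id) {
        Account account = store.get(id);
        return account == null ? ResponseEntity.notFound().build()
                               : ResponseEntity.ok(account);
    }

    // POST /api/accounts — create an account from a JSON request body.
    @PostMapping
    public Account create(@RequestBody Account request) {
        long id = idSequence.incrementAndGet();
        Account created = new Account(id, request.owner(), request.balance());
        store.put(id, created);
        return created;
    }

    // Immutable record; Spring serialises it to and from JSON automatically.
    public record Account(long id, String owner, double balance) {}
}
```

The in-memory map stands in for the relational databases the posting names; a production version would use a persistence layer such as Hibernate behind a service class.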
About Standard Chartered
We're an international bank, nimble enough to act, big enough for impact. For more than 170 years, we've worked to make a positive difference for our clients, communities, and each other. We question the status quo, love a challenge and enjoy finding new opportunities to grow and do better than before. If you're looking for a career with purpose and you want to work for a bank making a difference, we want to hear from you. You can count on us to celebrate your unique talents, and we can't wait to see the talents you can bring us.
Our purpose, to drive commerce and prosperity through our unique diversity, together with our brand promise, to be here for good, are achieved by how we each live our valued behaviours. When you work with us, you'll see how we value difference and advocate inclusion.
Together We:
Do the right thing and are assertive, challenge one another, and live with integrity, while putting the client at the heart of what we do.
Never settle, continuously striving to improve and innovate, keeping things simple and learning from doing well, and not so well.
Are better together: we can be ourselves, be inclusive, see more good in others, and work collectively to build for the long term.
What We Offer
In line with our Fair Pay Charter, we offer a competitive salary and benefits to support your mental, physical, financial and social wellbeing.
Core bank funding for retirement savings, medical and life insurance, with flexible and voluntary benefits available in some locations.
Time off including annual leave, parental/maternity leave (20 weeks), sabbatical (12 months maximum) and volunteering leave (3 days), along with minimum global standards for annual and public holidays, which combine to a minimum of 30 days.
Flexible working options based around home and office locations, with flexible working patterns.
Proactive wellbeing support through Unmind, a market-leading digital wellbeing platform, development courses for resilience and other human skills, a global Employee Assistance Programme, sick leave, mental health first-aiders and a range of self-help toolkits.
A continuous learning culture to support your growth, with opportunities to reskill and upskill and access to physical, virtual and digital learning.
Being part of an inclusive and values-driven organisation, one that embraces and celebrates our unique diversity across our teams, business functions and geographies - everyone feels respected and can realise their full potential.
Posted 1 week ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Skills: Technology->BPMI Others->Appian->Appian BPM, Technology->Cloud Platform->AWS Data Analytics->Amazon Managed Streaming for Apache Kafka, Technology->Cloud Platform->Google Cloud – Contact Center AI, Technology->Java->Apache->Kafka (a minimal Kafka consumer sketch follows at the end of this posting)
A day in the life of an Infoscion
As part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions and facilitate deployment, resulting in client delight.
You will develop a proposal by owning parts of the proposal document and by giving inputs to solution design based on your areas of expertise.
You will plan the activities of configuration, configure the product as per the design, conduct conference room pilots and assist in resolving any queries related to requirements and solution design.
You will conduct solution/product demonstrations and POC/Proof of Technology workshops, and prepare effort estimates that suit the customer's budgetary requirements and are in line with the organisation's financial guidelines.
You will actively lead small projects and contribute to unit-level and organisational initiatives with the objective of providing high-quality, value-adding solutions to customers.
If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Ability to develop value-creating strategies and models that enable clients to innovate, drive growth and increase their business profitability
Good knowledge of software configuration management systems
Awareness of the latest technologies and industry trends
Logical thinking and problem-solving skills, along with an ability to collaborate
Understanding of the financial processes for various types of projects and the various pricing models available
Ability to assess current processes, identify improvement areas and suggest technology solutions
Knowledge of one or two industry domains
Client interfacing skills
Project and team management
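Since the skill taxonomy above names Apache Kafka and Amazon Managed Streaming for Apache Kafka, here is a minimal, hypothetical Java consumer sketch using the standard kafka-clients API. The broker address, group id, and order-events topic are placeholder assumptions, not values from the posting.

```java
package com.example.streaming;

import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class OrderEventsConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Placeholder broker; an Amazon MSK cluster would supply its own bootstrap string.
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "order-events-readers");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("order-events")); // hypothetical topic name
            while (true) {
                // Poll the broker and print each record's position and payload.
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("partition=%d offset=%d value=%s%n",
                            record.partition(), record.offset(), record.value());
                }
            }
        }
    }
}
```

Against a managed cluster such as Amazon MSK, only the bootstrap servers string and the security configuration would typically change; the consumption loop itself stays the same.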
Posted 1 week ago