Get alerts for new jobs matching your selected skills, preferred locations, and experience range.
5.0 - 10.0 years
7 - 12 Lacs
Kochi
Work from Office
Develop user-friendly web applications using Java and React.js while ensuring high performance.
- Design, develop, test, and deploy robust, scalable applications.
- Build and consume RESTful APIs.
- Collaborate with the design and development teams to translate UI/UX design wireframes into functional components.
- Optimize applications for maximum speed and scalability.
- Stay up to date with the latest Java and React.js trends, techniques, and best practices.
- Participate in code reviews to maintain code quality and ensure alignment with coding standards.
- Identify and address performance bottlenecks and other issues as they arise.
- Help us shape the future of event-driven technologies, including contributing to Apache Kafka, Strimzi, Apache Flink, Vert.x, and other relevant open-source projects.
- Collaborate within a dynamic team environment to understand and break down intricate requirements for event processing solutions.
- Translate architectural blueprints into working code, applying your technical expertise to implement innovative and effective solutions.
- Conduct comprehensive testing of the developed solutions, ensuring their reliability, efficiency, and seamless integration.
- Provide ongoing support for the implemented applications, responding promptly to customer inquiries, resolving issues, and optimizing performance.
- Serve as a subject matter expert, sharing insights and best practices related to product development and fostering knowledge sharing within the team.
- Continuously monitor the evolving landscape of event-driven technologies, staying current on the latest trends and advancements.
- Collaborate closely with cross-functional teams, including product managers, designers, and developers, to ensure a holistic and harmonious product development process.
- Take ownership of technical challenges and lead your team to ensure successful delivery, using your problem-solving skills to overcome obstacles.
- Mentor and guide junior developers, nurturing their growth and development by providing guidance, knowledge transfer, and hands-on training.
- Engage in agile practices, contributing to backlog grooming, sprint planning, stand-ups, and retrospectives to facilitate effective project management and iteration.
- Foster a culture of innovation and collaboration, contributing to brainstorming sessions and offering creative ideas to push the boundaries of event processing solutions.
- Maintain documentation for the developed solutions, ensuring comprehensive and up-to-date records for future reference and knowledge sharing.
- Be involved in building and orchestrating containerized services.
Required education: Bachelor's Degree
Preferred education: Bachelor's Degree
Required technical and professional expertise:
- Proven 5+ years of experience as a full stack developer (Java and React.js) with a strong portfolio of previous projects.
- Proficiency in Java, JavaScript, HTML, CSS, and related web technologies.
- Familiarity with RESTful APIs and their integration into applications.
- Knowledge of modern CI/CD pipelines and tools such as Jenkins and Travis.
- Strong understanding of version control systems, particularly Git.
- Good communication skills and the ability to articulate technical concepts to both technical and non-technical team members.
- Familiarity with containerization and orchestration technologies like Docker and Kubernetes for deploying event processing applications.
- Proficiency in troubleshooting and debugging.
- Exceptional problem-solving and analytical abilities, with a knack for addressing technical challenges.
- Ability to work collaboratively in an agile, fast-paced development environment.
- Leadership skills to guide and mentor junior developers, fostering their growth and skill development.
- Strong organizational and time management skills to manage multiple tasks and priorities effectively.
- Adaptability to stay current with evolving event-driven technologies and industry trends.
- Customer-focused mindset, with a dedication to delivering solutions that meet or exceed customer expectations.
- Creative thinking and an innovation mindset to drive continuous improvement and explore new possibilities.
- Collaborative, team-oriented approach to work, valuing open communication and diverse perspectives.
Preferred technical and professional experience
Posted 21 hours ago
4.0 - 6.0 years
6 - 8 Lacs
Pune
Work from Office
The developer leads cloud application development and deployment for the client based on AWS development methodology, tools, and best practices. The developer's responsibility is to lead the execution of a project by working with a senior-level resource on assigned development/deployment activities, and to design, build, and maintain cloud environments focusing on uptime, access, control, and network security using automation and configuration management tools.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- Strong proficiency in Java, the Spring Framework, Spring Boot, and RESTful APIs; excellent understanding of OOP and design patterns; strong knowledge of ORM tools such as Hibernate or JPA.
- Java-based microservices frameworks; hands-on experience with Spring Boot microservices.
- Strong knowledge of microservice logging, monitoring, debugging, and testing.
- In-depth knowledge of relational databases (e.g., MySQL).
- Experience with container platforms such as Docker and Kubernetes.
- Experience with messaging platforms such as Kafka or IBM MQ.
- Good understanding of test-driven development; familiarity with Ant, Maven, or other build automation frameworks; good knowledge of basic UNIX commands.
Preferred technical and professional experience:
- Significant software development experience, including 4-6+ years of experience in web UI application development.
Posted 21 hours ago
5.0 - 8.0 years
7 - 10 Lacs
Bengaluru
Work from Office
As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in developing data solutions using the Spark framework with Python or Scala on Hadoop and the Azure Cloud Data Platform.
- Experienced in building data pipelines to ingest, process, and transform data from files, streams, and databases.
- Process data with Spark, Python, PySpark, and Hive, HBase, or other NoSQL databases on the Azure Cloud Data Platform or HDFS.
- Experienced in developing efficient software code for multiple use cases leveraging the Spark framework with Python or Scala and big data technologies for various use cases built on the platform.
- Experience in developing streaming pipelines.
- Experience working with Hadoop/Azure ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data/cloud technologies such as Apache Spark, Kafka, and cloud computing.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- Total 5-8 years of experience in data management (DW, DL, data platform, lakehouse) and data engineering skills.
- Minimum 4+ years of experience in big data technologies, with extensive data engineering experience in Spark with Python or Scala.
- Minimum 3 years of experience on cloud data platforms on Azure.
- Experience in Databricks, Azure HDInsight, Azure Data Factory, Synapse, and SQL Server.
- Good to excellent SQL skills.
Preferred technical and professional experience:
- Certification in Azure and Databricks, or Cloudera Spark certified developers.
- Knowledge of or experience with Snowflake is an added advantage.
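The ingest, transform, and aggregate steps this posting describes can be sketched in plain Python; in practice they would be expressed as PySpark DataFrame operations on Databricks or HDInsight. The data and field names below are hypothetical, purely to make the pipeline shape visible:

```python
# Illustrative ingest -> parse -> filter -> aggregate flow.
# A real pipeline would do the same with PySpark on a cluster.
raw_events = [
    "2024-01-01,click,3",
    "2024-01-01,view,7",
    "2024-01-02,click,5",
    "bad-record",            # malformed rows are dropped during ingest
]

def parse(line):
    """Turn a CSV line into a record dict, or None if malformed."""
    parts = line.split(",")
    if len(parts) != 3:
        return None
    date, event, count = parts
    return {"date": date, "event": event, "count": int(count)}

def clicks_per_day(lines):
    """Total click counts per day, skipping bad rows and non-clicks."""
    parsed = (parse(line) for line in lines)
    clicks = (r for r in parsed if r and r["event"] == "click")
    totals = {}
    for r in clicks:
        totals[r["date"]] = totals.get(r["date"], 0) + r["count"]
    return totals

print(clicks_per_day(raw_events))
```

In PySpark the same steps would become `spark.read.csv(...)`, a `filter`, and a `groupBy(...).sum(...)`, but the logical flow is identical.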
Posted 21 hours ago
5.0 - 7.0 years
7 - 9 Lacs
Bengaluru
Work from Office
Work with the broader team to build, analyze, and improve AI solutions. You will also work with our software developers on consuming different enterprise applications.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- 5-7 years of experience.
- Sound knowledge of Python and how to use ML-related services.
- Proficient in Python with a focus on data analytics packages.
- Analyze large, complex data sets and provide actionable insights to inform business decisions.
- Design and implement data models that help identify patterns and trends.
- Work with data engineers to optimize and maintain data pipelines.
- Perform quantitative analyses that translate data into actionable insights and support analytical, data-driven decision-making.
- Identify and recommend process improvements to enhance the efficiency of the data platform.
- Develop and maintain data models, algorithms, and statistical models.
Preferred technical and professional experience:
- Experience with conversation analytics.
- Experience with cloud technologies.
- Experience with data exploration tools such as Tableau.
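The "analyze data sets and surface actionable insights" responsibility above boils down to grouping and summarizing records. A minimal standard-library sketch (the records and field names are invented for illustration; a real analysis would use pandas or similar data analytics packages):

```python
from collections import defaultdict
from statistics import mean

# Hypothetical daily-sales records; in practice these would come
# from the data platform, not an inline list.
records = [
    {"region": "south", "sales": 120},
    {"region": "south", "sales": 180},
    {"region": "north", "sales": 90},
    {"region": "north", "sales": 110},
]

def average_by_region(rows):
    """Group rows by region and return mean sales per region."""
    buckets = defaultdict(list)
    for row in rows:
        buckets[row["region"]].append(row["sales"])
    return {region: mean(values) for region, values in buckets.items()}

print(average_by_region(records))
```

With pandas the whole function collapses to a one-line `groupby("region")["sales"].mean()`, which is why proficiency with analytics packages is called out.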
Posted 21 hours ago
3.0 - 6.0 years
0 Lacs
Mohali district, India
On-site
About Antier Solutions
Antier Solutions is a leading technology solutions provider offering high-quality software development, blockchain development, and consulting services to businesses globally. With a strong emphasis on innovation and problem-solving, we help our clients achieve their digital transformation goals by creating cutting-edge solutions across industries.
Job Overview
Antier Solutions is looking for an experienced and dynamic Python AI Developer to join our development team. The ideal candidate will be responsible for developing, testing, and maintaining Python-based applications and solutions. You will work closely with other developers, designers, and stakeholders to create efficient, scalable, and high-performing systems.
Key Responsibilities
- Design, develop, and maintain Python applications and services.
- Write reusable, testable, and efficient code.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Optimize applications for maximum speed and scalability.
- Implement automated testing (unit tests, integration tests) to ensure the reliability of code.
- Troubleshoot, debug, and upgrade existing systems.
- Work with databases (SQL and NoSQL) and integrate APIs.
- Stay up to date with the latest industry trends and best practices.
- Collaborate in agile development processes and participate in sprint planning, stand-ups, and code reviews.
- Ensure compliance with security best practices and data protection laws.
- Mentor junior developers and provide technical guidance where necessary.
Required Skills and Qualifications
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 3-6 years of proven experience in Python development.
- Strong knowledge of Python frameworks such as Django, Flask, or FastAPI.
- Hands-on experience with RESTful API development and integration.
- Proficiency in working with relational and non-relational databases (e.g., MySQL, PostgreSQL, MongoDB).
- Solid understanding of data structures, algorithms, and software design principles.
- Knowledge of version control systems (Git, SVN).
- Familiarity with front-end technologies like HTML, CSS, and JavaScript is a plus.
- Experience with cloud services (AWS, Azure, GCP) is a plus.
- Understanding of containerization technologies (Docker, Kubernetes) is a plus.
- Strong problem-solving and analytical skills.
- Ability to work both independently and as part of a team in a fast-paced environment.
- Excellent communication and collaboration skills.
Preferred Skills
- Experience with microservices architecture.
- Knowledge of Agile methodologies and version control systems like Git.
- Familiarity with CI/CD pipelines and DevOps practices.
- Experience with message brokers like RabbitMQ or Kafka.
- Exposure to machine learning, data science, or artificial intelligence is a plus.
Interested candidates can also share their resume at shikha.rana@antiersolutions.com
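The "reusable, testable, and efficient code" responsibility above is usually demonstrated with small, pure functions plus automated tests. A tiny sketch with a hypothetical helper (the function and its behavior are invented for illustration, not part of the posting):

```python
def slugify(title: str) -> str:
    """Hypothetical helper: normalize a page title into a URL slug."""
    return "-".join(title.lower().split())

# Pytest-style unit tests: plain functions with assertions that a
# CI pipeline can run automatically on every commit.
def test_basic():
    assert slugify("Hello World") == "hello-world"

def test_extra_whitespace():
    # split() with no argument collapses runs of whitespace.
    assert slugify("  Python   AI  ") == "python-ai"

# Run the checks directly when executed as a script.
test_basic()
test_extra_whitespace()
print("all tests passed")
```

Keeping logic in small pure functions like this is what makes the "testable" requirement cheap to satisfy.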
Posted 23 hours ago
3.0 years
0 Lacs
Hyderābād
Remote
Your opportunity
At New Relic, we provide businesses with a state-of-the-art observability platform, leveraging advanced technologies to deliver real-time insights into the performance of software applications and infrastructure. We enable organizations to monitor, analyze, and optimize their systems to achieve enhanced reliability, performance, and user experience. New Relic is an industry leader and has been at the forefront of developing cutting-edge AI/ML solutions to revolutionise observability.
What you'll do
- Drive the design, development, and enhancement of core features and functionalities of our AI platform with a microservices architecture, and deliver scalable, secure, and reliable solutions.
- Be proactive in identifying and addressing performance bottlenecks, applying optimizations, and maintaining the stability and availability of our platform.
- Build thoughtful, high-quality code that is easy to read and maintain.
- Collaborate with your team, external contributors, and others to help solve problems.
- Write and share proposals to improve team processes and approaches.
This role requires
- Bachelor's degree in a Computer Science discipline or related field.
- 3+ years of experience as a Software Engineer working with Python, developing production-grade applications.
- Demonstrated experience in designing, developing, and maintaining large-scale cloud platforms, with a strong understanding of scalable distributed systems and microservices architecture.
- Proficiency in back-end frameworks such as Flask/FastAPI; Pydantic for robust models; the asyncio and aiohttp libraries for asynchronous request handling; decorators for abstraction; Pytest for testing.
- Competency in using Python's threading and multiprocessing modules for parallel task execution; knowledge of coroutines.
- Understanding of the GIL and its implications for concurrency.
- Experience in building secure infrastructure, having simulated race-condition and injection attacks; experience leading teams through real incident management situations, with strong debugging skills.
- Demonstrated experience working with both relational and NoSQL databases, and with message queueing systems (SQS/Kafka/RabbitMQ).
- Up to date with cloud technologies: AWS/Azure/GCP, serverless, Docker, Kubernetes, and CI/CD pipelines, among others.
Bonus points if you have
- A Master's degree in a Computer Science discipline.
- Exposure to Machine Learning and GenAI technologies.
- Experience with authentication/authorization services.
- Knowledge of the gRPC communication protocol.
- GraphQL API working knowledge.
Please note that visa sponsorship is not available for this position.
Fostering a diverse, welcoming and inclusive environment is important to us. We work hard to make everyone feel comfortable bringing their best, most authentic selves to work every day. We celebrate our talented Relics' different backgrounds and abilities, and recognize the different paths they took to reach us, including nontraditional ones. Their experiences and perspectives inspire us to make our products and company the best they can be. We're looking for people who feel connected to our mission and values, not just candidates who check off all the boxes. If you require a reasonable accommodation to complete any part of the application or recruiting process, please reach out to resume@newrelic.com.
We believe in empowering all Relics to achieve professional and business success through a flexible workforce model. This model allows us to work in a variety of workplaces that best support our success, including fully office-based, fully remote, or hybrid.
Our hiring process
In compliance with applicable law, all persons hired will be required to verify identity and eligibility to work and to complete employment eligibility verification.
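The asyncio and GIL requirements in this posting fit together: the GIL prevents Python threads from running bytecode in parallel, but I/O-bound waits can still overlap via coroutines on a single thread. A minimal, self-contained sketch (the coroutine names and delays are illustrative only):

```python
import asyncio

async def fetch(name: str, delay: float) -> str:
    # await suspends this coroutine and lets the event loop run the
    # others, so the waits overlap on one thread despite the GIL.
    await asyncio.sleep(delay)
    return f"{name}: done"

async def main() -> list:
    # gather() schedules all three coroutines concurrently, so total
    # wall time is roughly one delay (~0.1 s), not the sum (~0.3 s),
    # and results come back in submission order.
    return await asyncio.gather(
        fetch("a", 0.1), fetch("b", 0.1), fetch("c", 0.1)
    )

if __name__ == "__main__":
    print(asyncio.run(main()))
```

For CPU-bound work the same overlap does not happen under the GIL, which is why the posting also asks for the multiprocessing module: separate processes each get their own interpreter and GIL.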
Note: Our stewardship of the data of thousands of customers means that a criminal background check is required to join New Relic. We will consider qualified applicants with arrest and conviction records based on individual circumstances and in accordance with applicable law, including, but not limited to, the San Francisco Fair Chance Ordinance.
Headhunters and recruitment agencies may not submit resumes/CVs through this website or directly to managers. New Relic does not accept unsolicited headhunter and agency resumes, and will not pay fees to any third-party agency or company that does not have a signed agreement with New Relic.
Candidates are evaluated based on qualifications, regardless of race, religion, ethnicity, national origin, sex, sexual orientation, gender expression or identity, age, disability, neurodiversity, veteran or marital status, political viewpoint, or other legally protected characteristics.
Review our Applicant Privacy Notice at https://newrelic.com/termsandconditions/applicant-privacy-policy
Posted 23 hours ago
12.0 years
0 Lacs
Telangana
On-site
Underwriting Frameworks Platform Architect
Chubb is the world's largest publicly traded property and casualty insurer. With operations in 54 countries, Chubb provides commercial and personal property and casualty insurance, personal accident and supplemental health insurance, reinsurance, and life insurance to a diverse group of clients. The company is distinguished by its extensive product and service offerings, broad distribution capabilities, exceptional financial strength, underwriting excellence, superior claims-handling expertise, and local operations globally.
The Underwriting Frameworks Solution Architect (this role) will be responsible for the architecture, design, performance, build, deployment, and support of core insurance capabilities for Chubb, built on a modern, open-source, configuration-driven integration platform. A suitable candidate is a successful, experienced, and accomplished technology leader with significant hands-on experience in cross-platform design and a deep passion for technology and architecture. This role will be required to plan, manage, and monitor, as well as influence, differentiate, and guide, both business and technology teams across multiple divisions of Chubb through the various phases of the SDLC. The candidate is expected to actively collaborate with the business, business architects, and BAs, along with development teams, test teams, Enterprise Architects, Information Security, Data & Analytics, Digital Technology, and other shared services teams to build a cutting-edge, highly reusable platform with a cloud-first mentality.
Scope/Responsibilities
- Responsible for the architecture and design of rating models in Radar Base software, including model versioning and segmentation, technical keys, integrations, private rating details for forms/print needs, and both Actuary and Underwriter needs.
- Infrastructure and model service integration request/response design, along with model service deployment design within Radar Live.
- Stays on top of industry trends, including Digital Transformation, InsureTech, AI/Machine Learning, etc.
- Owns the underwriting frameworks solution and designs/develops technology patterns and frameworks to deliver the solution effectively and efficiently while maximizing reuse.
- Acts as the primary design owner of all underwriting frameworks solutions and associated infrastructure while collaborating with business and project teams to assist with capacity planning, discovery, strategy, design, and prototyping of new features and capabilities.
- Acts as a broker between divergent business and IT teams to bring them together, share knowledge, solve problems, design common reusable assets, and maximize the use of existing capabilities and frameworks.
- Works closely with project management and delivery leads to identify key activities and owners while ensuring alignment and feasibility of all planned deliverables and related dependencies, incorporating all optimization feedback.
- Builds and maintains relationships with our business and IT stakeholders across the enterprise while providing support, updates, and issue resolution for their respective platform instances.
- Supports infrastructure teams in build-out, availability, performance, and reliability, and ensures that all non-functional system requirements will be met.
Skills, Knowledge, and Experience:
- Bachelor's or master's degree in Computer Science, Information Systems, or another related field.
- At least 12 years of experience as a Solution Architect, leading technical teams, with a strong software engineering background.
- Minimum 10 years of industry experience in system development and implementation in the insurance industry.
- At least 7 years of experience in insurance technology projects/programs.
- At least 7 years of experience with the Radar platform.
- At least 7 years of experience in engineering, designing, and implementing large, complex system and application architectures.
- At least 5 years of experience in performance engineering and optimization of distributed systems in API-based architectures.
- At least 5 years of experience leading systems integration projects with many internal and external interfaces.
- At least 5-10 years of hands-on experience in technology stacks such as Microsoft .NET, Java, Quarkus, ASP.NET, IIS, Kafka, SQL Server, XML, XPath, JSON, CSS, XSLT, MongoDB, etc.
- At least 5-10 years of hands-on experience in technology, including the IBM BAMOE rules engine.
- Knowledge of message bus technologies (Kafka preferred; or MQ, Amazon SQS, etc.).
- Knowledge of various cloud design patterns and common security patterns (Azure preferred).
- Superior communication skills and ability to interact with business and IT resources at all levels.
- Experience with systems integration to third parties, internal systems, and downstream systems, as well as exposure of products and services to external partners or risk aggregators.
- Good knowledge of Data & Analytics technologies such as data lakes, business intelligence tools, predictive models, and third-party data.
- Good knowledge of Microservices & API Mgmt.; JavaScript frameworks such as Angular; cloud platforms such as Microsoft Azure preferred.
- Experience with the Agile methodology.
Critical Skills
- Adaptability/Agility: Responds well to change. Handles multiple demands and priorities effectively. Adapts to best fit the situation at hand. Handles conflict effectively. Collaborates. Develops new skills quickly. Willing to accept new responsibilities. Takes initiative and accountability.
- IT Methodologies: Good organizational skills. Has experience managing projects and staff. Advanced technical skills or specialized knowledge. Analyzes tasks, dependencies, and resource needs. Manages budgets and performs financial activities effectively.
- Understands the system development life cycle (SDLC), including the Agile methodology; uses project methodologies.
- Business Knowledge: Understands fundamentals of insurance, IT best practices, and technology. Understands some insurance business processes. Connects business problems to technical solutions. Validates that solutions achieve the desired business result.
- Deliverables Management: Matches business requirements to deliverables. Understands and meets deadlines. Proactively identifies and manages risk. Manages multiple work streams. Works effectively across multiple groups (both internal and external). Has experience managing scope and change control.
- Communication: Communicates effectively, both written and oral. Responds to customers appropriately, timely, and accurately. Manages relationships effectively. Influences others. Transfers knowledge to others. Has experience working with multi-vendor projects and the related communications challenges.
Posted 23 hours ago
1.0 years
11 - 13 Lacs
Hyderābād
Remote
Experience: 1+ years
Work location: Bangalore, Chennai, Hyderabad, Pune (hybrid)
Job Description: GCP Cloud Engineer
Shift time: 2 to 11 PM IST
Budget: max 13 LPA
Primary skills and weightage: GCP 50%, Kubernetes 25%, Node.js 25%
Technical Skills
- Cloud: Experience working with Google Cloud Platform (GCP) services.
- Containers & Orchestration: Practical experience deploying and managing applications on Kubernetes.
- Programming: Proficiency in Node.js development, including building and maintaining RESTful APIs or backend services.
- Messaging: Familiarity with Apache Kafka for producing and consuming messages.
- Databases: Experience with PostgreSQL or similar relational databases (writing queries, basic schema design).
- Version Control: Proficient with Git and GitHub workflows (branching, pull requests, code reviews).
- Development Tools: Comfortable using Visual Studio Code (VS Code) or similar IDEs.
Additional Requirements
- Communication: Ability to communicate clearly in English (written and verbal).
- Collaboration: Experience working in distributed or remote teams.
- Problem Solving: Demonstrated ability to troubleshoot and debug issues independently.
- Learning: Willingness to learn new technologies and adapt to changing requirements.
Preferred but not required:
- Experience with CI/CD pipelines.
- Familiarity with Agile methodologies.
- Exposure to monitoring/logging tools (e.g., Prometheus, Grafana, ELK stack).
Job Type: Full-time
Pay: ₹1,100,000.00 - ₹1,300,000.00 per year
Schedule: UK shift
Work Location: In person
Posted 23 hours ago
0 years
0 Lacs
Hyderābād
On-site
Global Technology Solutions (GTS) at ResMed is a division dedicated to creating innovative, scalable, and secure platforms and services for patients, providers, and people across ResMed. The primary goal of GTS is to accelerate well-being and growth by transforming the core, enabling patient, people, and partner outcomes, and building future-ready operations.
The strategy of GTS focuses on aligning goals and promoting collaboration across all organizational areas. This includes fostering shared ownership, developing flexible platforms that can easily scale to meet global demands, and implementing global standards for key processes to ensure efficiency and consistency.
Role Overview
As a Data Engineering Lead, you will be responsible for overseeing and guiding the data engineering team in developing, optimizing, and maintaining our data infrastructure. You will play a critical role in ensuring the seamless integration and flow of data across the organization, enabling data-driven decision-making and analytics.
Key Responsibilities
- Data Integration: Coordinate with various teams to ensure seamless data integration across the organization's systems.
- ETL Processes: Develop and implement efficient data transformation and ETL (Extract, Transform, Load) processes.
- Performance Optimization: Optimize data flow and system performance for enhanced functionality and efficiency.
- Data Security: Ensure adherence to data security protocols and compliance standards to protect sensitive information.
- Infrastructure Management: Oversee the development and maintenance of the data infrastructure, ensuring scalability and reliability.
- Collaboration: Work closely with data scientists, analysts, and other stakeholders to support data-driven initiatives.
- Innovation: Stay updated on the latest trends and technologies in data engineering and implement best practices.
Qualifications
- Experience: Proven experience in data engineering, with a strong background in leading and managing teams.
- Technical Skills: Proficiency in programming languages such as Python, Java, and SQL, along with experience in big data technologies like Hadoop, Spark, and Kafka.
- Data Management: In-depth understanding of data warehousing, data modeling, and database management systems.
- Analytical Skills: Strong analytical and problem-solving skills with the ability to handle complex data challenges.
- Communication: Excellent communication and interpersonal skills, capable of working effectively with cross-functional teams.
- Education: Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
Why Join Us?
- Work on cutting-edge data projects and contribute to the organization's data strategy.
- Collaborative and innovative work environment that values creativity and continuous learning.
If you are a strategic thinker with a passion for data engineering and leadership, we would love to hear from you. Apply now to join our team and make a significant impact on our data-driven journey.
Joining us is more than saying "yes" to making the world a healthier place. It's discovering a career that's challenging, supportive and inspiring. Where a culture driven by excellence helps you not only meet your goals, but also create new ones. We focus on creating a diverse and inclusive culture, encouraging individual expression in the workplace, and we thrive on the innovative ideas this generates. If this sounds like the workplace for you, apply now! We commit to respond to every applicant.
Posted 23 hours ago
6.0 - 10.0 years
3 - 6 Lacs
Hyderābād
On-site
Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients' most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com.
Job Description
Mandatory skills: Java, Spring Boot, microservices, Kafka
Responsibilities include designing and developing high-volume, low-latency applications for mission-critical business systems, application services, and modules, delivering high availability and performance. We expect developers to contribute to all phases of the development lifecycle, including writing well-designed, testable, efficient code. Must be capable of working independently and collaboratively.
Developer responsibilities include, but are not limited to, the following:
- Experience as a Sun Certified Java Developer with proven hands-on software development experience. We use Java 8.
- 6-10 years of Java development experience with JSE/JEE, Java-based microservices frameworks and implementation, the Spring framework, the Hibernate framework, SQL, etc.
- Hands-on experience with Spring Boot and Spark microservices and OSGi specifications.
- Hands-on experience with Kafka.
- Strong knowledge of microservice logging, monitoring, debugging, and testing.
- Implementation experience of microservice integration, packaging, build automation, and deployment.
- At least two years of experience in SOA and microservices-based process applications using BPM (Activiti/jBPM/Camunda).
- Object-oriented analysis and design using common design patterns.
Insight into Java and JEE internals (class loading, memory management, transaction management, etc.).
Excellent knowledge of relational databases, SQL, and ORM technologies (JPA2, Hibernate).
Experience developing web applications using at least one popular web framework (JSF, Wicket, GWT, Spring MVC, Spring Boot).
Hands-on experience with relational and NoSQL databases (either MongoDB or Cassandra is a must).
Hands-on experience with at least one cloud platform: AWS, Google Cloud, or Azure.
Hands-on experience with REST-based web services.
Work experience with any cloud (AWS, Azure, or GCP) will be an advantage.

Performance parameters and measures:
1. Continuous integration, deployment & monitoring of software: 100% error-free onboarding and implementation, throughput %, adherence to the schedule/release plan.
2. Quality & CSAT: on-time delivery, manage software, troubleshoot queries, customer experience, completion of assigned certifications for skill upgradation.
3. MIS & reporting: 100% on-time MIS and report generation.

Mandatory Skills: Full-stack Java Enterprise. Experience: 3-5 Years.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
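This posting centers on Kafka-backed microservices. As a purely illustrative sketch (written in Python for brevity, though the role uses Java/Spring Boot), the snippet below shows idempotent event handling, a core concern under Kafka's at-least-once delivery. All class and field names are invented for the example:

```python
# Minimal sketch of idempotent event consumption, as commonly needed in
# Kafka-based microservices. Names here are illustrative assumptions.

class IdempotentConsumer:
    """Processes each event at most once, keyed by its event ID."""

    def __init__(self):
        self._seen = set()   # in production this would be a durable store
        self.processed = []

    def handle(self, event: dict) -> bool:
        """Apply the event if unseen; return True when it was processed."""
        event_id = event["id"]
        if event_id in self._seen:
            return False     # duplicate delivery under at-least-once semantics
        self._seen.add(event_id)
        self.processed.append(event["payload"])
        return True

consumer = IdempotentConsumer()
consumer.handle({"id": "evt-1", "payload": "created"})
consumer.handle({"id": "evt-1", "payload": "created"})  # redelivery is a no-op
consumer.handle({"id": "evt-2", "payload": "updated"})
```

In a real Spring Boot service the dedup set would live in a database or cache shared across consumer instances, since partition rebalances can move an event's redelivery to a different process.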
Posted 23 hours ago
3.0 years
0 Lacs
Gurgaon
On-site
JOB DESCRIPTION

About KPMG in India: KPMG entities in India are professional services firm(s). These Indian member firms are affiliated with KPMG International Limited. KPMG was established in India in August 1993. Our professionals leverage the global network of firms, and are conversant with local laws, regulations, markets and competition. KPMG has offices across India in Ahmedabad, Bengaluru, Chandigarh, Chennai, Gurugram, Hyderabad, Jaipur, Kochi, Kolkata, Mumbai, Noida, Pune, Vadodara and Vijayawada. KPMG entities in India offer services to national and international clients in India across sectors. We strive to provide rapid, performance-based, industry-focused and technology-enabled services, which reflect a shared knowledge of global and local industries and our experience of the Indian business environment.

About: Our Financial Crimes specialist teams provide solutions to BFSI clients by conducting model validation testing for AML risk models and frameworks, sanctions screening, and transaction monitoring systems to ensure the efficiency and efficacy of the underlying frameworks, both functionally and statistically.
We are looking to hire colleagues with advanced data science and analytics skills to support our financial crimes team. You will play a crucial role in helping clients tackle the multifaceted challenges of financial crime. By utilizing advanced analytics and deep technical knowledge, our team aids top clients in reducing risks associated with financial crime, terrorist financing, and sanctions violations. We also work to enhance their screening and transaction monitoring systems. Our team of specialized analysts ensures that leading financial institutions adhere to industry best practices for robust programs and controls. Through a variety of project experiences, you will develop your professional skills, assisting clients in understanding and addressing complex issues, and implementing top-tier solutions to resolve identified problems.

Minimum work experience: 3+ years of advanced analytics. Preferred experience: 1+ years in AML model validation.

Responsibilities:
· Support functional SME teams to build data-driven Financial Crimes solutions
· Conduct statistical testing of the screening matching algorithms, risk rating models, and thresholds configured for detection rules
· Validate data models of AML systems built on platforms such as SAS Viya, Actimize, LexisNexis, Napier, etc.
· Develop, validate, and maintain AML models to detect suspicious activities and transactions
· Conduct Above-the-Line and Below-the-Line testing
· Conduct thorough model validation processes, including performance monitoring, tuning, and calibration
· Ensure compliance with regulatory requirements and internal policies related to AML model risk management
· Collaborate with cross-functional teams to gather and analyze data for model development and validation
· Perform data analysis and statistical modeling to identify trends and patterns in financial transactions
· Prepare detailed documentation and reports on model validation findings and recommendations
· Assist in feature engineering to improve Gen AI prompts for the automation of AML/screening-related investigations
· Use advanced machine learning deployment (e.g. XGBoost) and GenAI approaches

Criteria:
· Bachelor’s degree from an accredited university
· 3+ years of hands-on experience in Python, with experience in Java and the FastAPI, Django, Tornado, or Flask frameworks
· Working experience with relational and NoSQL databases such as Oracle, MS SQL, MongoDB, or Elasticsearch
· Proficiency in BI tools such as Power BI, Tableau, etc.
· Proven experience in data model development and testing
· Education background in Data Science and Statistics
· Strong proficiency in programming languages such as Python, R, and SQL
· Expertise in machine learning algorithms, statistical analysis, and data visualization tools
· Familiarity with regulatory guidelines and standards for AML
· Experience in AML-related model validation and testing
· Expertise in techniques and algorithms including sampling, optimization, logistic regression, cluster analysis, neural networks, decision trees, and supervised and unsupervised machine learning

Preferred experience:
Validation of AML compliance models, such as statistical testing of customer/transaction risk models, screening algorithm testing, etc.
Experience with developing proposals (especially new solutions)
Experience working with AML technology platforms, e.g. Norkom, SAS, LexisNexis, etc.
Hands-on experience with data analytics tools such as Informatica, Kafka, etc.

Equal employment opportunity information: KPMG India has a policy of providing equal opportunity for all applicants and employees regardless of their color, caste, religion, age, sex/gender, national origin, citizenship, sexual orientation, gender identity or expression, disability or other legally protected status. KPMG India values diversity and we request you to submit the details below to support us in our endeavor for diversity.
Providing the below information is voluntary, and refusal to submit such information will not be prejudicial to you.

QUALIFICATIONS
· Bachelor’s degree from an accredited university
· Education background in Data Science and Statistics
· 3+ years of hands-on experience in data science and data analytics
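The validation duties above include Above-the-Line/Below-the-Line testing of detection-rule thresholds. The hedged sketch below illustrates the idea on synthetic risk scores: alerts at or above the threshold are the ATL population, while a band just below it is sampled for BTL testing. The function name and band width are assumptions, not KPMG's methodology:

```python
# Illustrative ATL/BTL split for a detection-rule threshold.
# Synthetic scores only; real validation runs on production alert data.

def atl_btl_split(scores, threshold, band=0.1):
    """Bucket scores into above-the-line hits and a below-the-line test band.

    The BTL band covers [threshold * (1 - band), threshold), i.e. the
    near-misses that should be sampled to check for missed true positives.
    """
    atl = [s for s in scores if s >= threshold]
    btl = [s for s in scores if threshold * (1 - band) <= s < threshold]
    return atl, btl

scores = [0.95, 0.91, 0.88, 0.72, 0.40]
atl, btl = atl_btl_split(scores, threshold=0.90)
# 0.95 and 0.91 alert; 0.88 falls in the [0.81, 0.90) test band
```

In practice the BTL sample is then manually reviewed: a high rate of genuinely suspicious cases just below the line is evidence the threshold is set too high.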
Posted 23 hours ago
0 years
12 - 20 Lacs
Gurgaon
Remote
Position: GCP Data Engineer Company Info: Prama (HQ : Chandler, AZ, USA) Prama specializes in AI-powered and Generative AI solutions for Data, Cloud, and APIs. We collaborate with businesses worldwide to develop platforms and AI-powered products that offer valuable insights and drive business growth. Our comprehensive services include architectural assessment, strategy development, and execution to create secure, reliable, and scalable systems. We are experts in creating innovative platforms for various industries. We help clients to overcome complex business challenges. Our team is dedicated to delivering cutting-edge solutions that elevate the digital experience for corporations. Prama is headquartered in Phoenix with offices in USA, Canada, Mexico, Brazil and India. Location: Bengaluru | Gurugram | Hybrid Benefits: 5 Day Working | Career Growth | Flexible working | Potential On-site Opportunity Kindly send your CV or Resume to careers@prama.ai Primary skills: GCP, PySpark, Python, SQL, ETL Job Description: We are seeking a highly skilled and motivated GCP Data Engineer to join our team. As a GCP Data Engineer, you will play a crucial role in designing, developing, and maintaining robust data pipelines and data warehousing solutions on the Google Cloud Platform (GCP). You will work closely with data analysts, data scientists, and other stakeholders to ensure the efficient collection, transformation, and analysis of large datasets. Responsibilities: · Design, develop, and maintain scalable data pipelines using GCP tools such as Dataflow, Dataproc, and Cloud Functions. · Implement ETL processes to extract, transform, and load data from various sources into BigQuery. · Optimize data pipelines for performance, cost-efficiency, and reliability. · Collaborate with data analysts and data scientists to understand their data needs and translate them into technical solutions. · Design and implement data warehouses and data marts using BigQuery. 
· Model and structure data for optimal performance and query efficiency.
· Develop and maintain data quality checks and monitoring processes.
· Use SQL and Python (PySpark) to analyze large datasets and generate insights.
· Create visualizations using tools like Data Studio or Looker to communicate data findings effectively.
· Manage and maintain GCP resources, including virtual machines, storage, and networking.
· Implement best practices for security, cost optimization, and scalability.
· Automate infrastructure provisioning and management using tools like Terraform.

Qualifications:
· Strong proficiency in SQL, Python, and PySpark.
· Hands-on experience with GCP services, including BigQuery, Dataflow, Dataproc, Cloud Storage, and Cloud Functions.
· Experience with data warehousing concepts and methodologies.
· Understanding of data modeling techniques and best practices.
· Strong analytical and problem-solving skills.
· Excellent communication and collaboration skills.
· Experience with data quality assurance and monitoring.
· Knowledge of cloud security best practices.
· A passion for data and a desire to learn new technologies.

Preferred Qualifications:
· Google Cloud Platform certification.
· Experience with machine learning and AI.
· Knowledge of data streaming technologies (Kafka, Pub/Sub).
· Experience with data visualization tools (Looker, Tableau, Data Studio).

Job Type: Full-time
Pay: ₹1,200,000.00 - ₹2,000,000.00 per year
Benefits: Flexible schedule, health insurance, leave encashment, paid sick time, Provident Fund, work from home
Ability to commute/relocate: Gurugram, Haryana: Reliably commute or planning to relocate before starting work (Required)
Application Question(s): CTC; Expected CTC; Notice Period (days); Experience in GCP; Total Experience
Work Location: Hybrid remote in Gurugram, Haryana
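As a rough illustration of the ETL work this role describes, the snippet below shows a toy transform step of the kind a Dataflow or PySpark job might apply before loading rows into BigQuery. The field names and target schema are invented for the example:

```python
# Hedged sketch of an ETL "transform" step: normalise raw source records
# into a target schema before loading. Schema and field names are assumed.

def transform(row: dict) -> dict:
    """Cast types, default missing fields, and tidy string values."""
    return {
        "user_id": int(row["id"]),                                # cast to int
        "country": row.get("country", "unknown").strip().upper(), # normalise
        "amount_usd": round(float(row["amount"]), 2),             # fix precision
    }

raw = [{"id": "7", "country": " in ", "amount": "19.999"}]
clean = [transform(r) for r in raw]
```

In a real pipeline the same function would be applied per element (e.g. inside a Beam `Map` or a PySpark UDF), with bad rows routed to a dead-letter table rather than raising.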
Posted 23 hours ago
5.0 years
0 Lacs
India
Remote
Note: Please do not apply if your experience is less than 5 years. We will hire ONLY people with Travel Industry experience who have worked on Hotel, Car Rental or Ferry Booking API integrations.

Company Description: Our company has been promoting Greece for the last 25 years through travel sites visited from all around the world, with 10 million visitors per year, such as www.greeka.com, www.ferriesingreece.com etc. Through the websites, we provide a range of travel services for a seamless holiday experience, such as online car rental reservations, ferry tickets, transfers, tours etc.

Role Description: This is a full-time remote role for a Sr Node.js Backend Developer. As a Sr Backend Developer, you will be responsible for developing and maintaining the server-side logic of our website and web applications, and for reviewing the work of junior developers and taking responsibility for their deliverables. We currently have 15 developers. Your day-to-day tasks will include designing and implementing our own APIs or integrating third-party APIs, optimizing the performance of our backend systems, and collaborating with the frontend/backend development teams to ensure seamless integration.

Qualifications:
• Strong proficiency in Node.js and server-side JavaScript development
• Experience with Nest.js, Fastify or similar web frameworks
• Must have knowledge of Travel APIs
• Strong experience with the microservices approach
• Strong knowledge of message brokers such as RabbitMQ, Kafka, NATS or similar tools
• Knowledge of database technologies such as MongoDB or PostgreSQL
• Experience with API design and development (REST/SOAP)
• Familiarity with front-end technologies like HTML, CSS, and JavaScript, or frameworks like React.js, Angular and Next.js
• Understanding of version control systems, such as Git
• Ability to work independently and in a team environment
• Excellent problem-solving skills and attention to detail

If you have a passion for learning modern techniques that will enhance your skills, want to be part of super exciting projects, and want to contribute to making travel experiences around the globe unforgettable, we would love to hear from you.
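Integrating third-party booking APIs usually means coping with flaky suppliers. The sketch below (shown in Python for brevity, though this role is Node.js, where the same pattern applies with promises) wraps a simulated supplier call in retry with exponential backoff. All names are illustrative:

```python
# Language-agnostic sketch of retry-with-backoff around a third-party
# booking API call (hotel, car rental, ferry). Names are assumptions.

import time

def call_with_retry(fn, attempts=3, base_delay=0.01):
    """Retry fn() on failure, doubling the delay each time; re-raise when exhausted."""
    for attempt in range(attempts):
        try:
            return fn()
        except RuntimeError:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))

# Simulated supplier that times out twice before answering.
calls = {"n": 0}
def flaky_supplier():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("supplier timeout")
    return {"available": True}

result = call_with_retry(flaky_supplier)
```

Real integrations add jitter to the delay and retry only idempotent operations: retrying a failed availability lookup is safe, but blindly retrying a booking creation risks double-charging.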
Posted 23 hours ago
3.0 years
4 - 6 Lacs
Pune
Remote
R020800 Pune, Maharashtra, India Engineering Regular

Location Details: Pune (Maharashtra) - Hybrid

At GoDaddy the future of work looks different for each team. Some teams work in the office full-time; others have a hybrid arrangement (they work remotely some days and in the office some days) and some work entirely remotely. Hybrid: This is a hybrid position. You’ll divide your time between working remotely from your home and an office, so you should live within commuting distance. Hybrid teams may work in-office as much as a few times a week or as little as once a month or quarter, as decided by leadership. The hiring manager can share more about what hybrid work might look like for this team.

Join Our Team... The Terminal Management team in the GoDaddy Commerce Division is responsible for the maintenance and development of the Poynt Smart Terminal Management System (aka Mothership), which monitors, provisions, and updates all Smart Terminals, mobile apps and third-party PoyntOS-enabled hardware. It plays a vital role in the entire lifecycle of a payment terminal - from the factory, through fulfillment, at the customer site (e.g. retail stores) and our repair centers. This system enables real-time control of and insight into each terminal, enabling customer service, fulfillment providers, resellers, banking partners and ISOs to remotely manage their customers’ devices, manage various device settings and fine-tune the device for a customer’s in-store environment (network settings, timeouts, EMV parameters, etc.). If you love building impactful products that set the path for future growth and expansion immediately, join our team! What you'll get to do...
As a member of the Terminal Management team, you will play a pivotal role in enhancing various critical aspects of our Terminal Management platform, contributing to a seamless and efficient ecosystem for managing payment terminals globally: Comprehensive API Development: Build and maintain a robust set of APIs to manage payment terminals, supporting both GoDaddy’s proprietary devices and third-party OEM hardware across diverse markets worldwide. Over-the-Air (OTA) Management System: Design and improve a scalable OTA solution to efficiently distribute OS and application updates to hundreds of thousands of terminals globally, ensuring devices are always up-to-date with the latest features and security enhancements. Real-Time Data Collection and Diagnostics: Implement advanced systems for collecting and managing logs, crash reports, and critical system diagnostics in real-time, enabling rapid troubleshooting and proactive issue resolution. Data Integration and Analytics: Work closely with data ingestion pipelines to streamline the collection and processing of vital telemetry data, feeding into business analytics, product insights, and machine learning models that inform strategic decision-making. Global Device Provisioning and Configuration: Develop systems for seamless provisioning and configuration of devices at scale, allowing easy customization for diverse customer environments, including network settings, EMV parameters, and region-specific requirements. Enhanced Lifecycle Management: Contribute to features that support every phase of a terminal's lifecycle—from factory production and fulfillment to deployment at retail locations and service centers, ensuring operational excellence and customer satisfaction. Partner and Reseller Enablement: Enable customer service teams, resellers, banking partners, and ISOs with tools to remotely manage and fine-tune devices, driving efficiency and reducing customer downtime. Your experience should include... 
3+ years of experience in server-side programming preferably with Java / Golang. Proficient in developing secure, high-performance cloud applications on AWS (ECS, EC2). Expertise in designing and implementing external-facing, highly organized APIs. Skilled in building large-scale cloud services, distributed systems, and event-driven architectures. Strong knowledge of databases (SQL, NoSQL) and scalable data management solutions. You might also have... At least 2 years experience with Java / Golang backend development. Knowledge of integrating messaging systems like Kafka, RabbitMQ, or AWS SNS/SQS. Familiarity with AWS Lambda or similar platforms for building lightweight, event-driven applications. We've got your back... We offer a range of total rewards that may include paid time off, retirement savings (e.g., 401k, pension schemes), bonus/incentive eligibility, equity grants, participation in our employee stock purchase plan, competitive health benefits, and other family-friendly benefits including parental leave. GoDaddy’s benefits vary based on individual role and location and can be reviewed in more detail during the interview process. We also embrace our diverse culture and offer a range of Employee Resource Groups (Culture). Have a side hustle? No problem. We love entrepreneurs! Most importantly, come as you are and make your own way. About us... GoDaddy is empowering everyday entrepreneurs around the world by providing the help and tools to succeed online, making opportunity more inclusive for all. GoDaddy is the place people come to name their idea, build a professional website, attract customers, sell their products and services, and manage their work. Our mission is to give our customers the tools, insights, and people to transform their ideas and personal initiative into success. To learn more about the company, visit About Us. At GoDaddy, we know diverse teams build better products—period. 
Our people and culture reflect and celebrate that sense of diversity and inclusion in ideas, experiences and perspectives. But we also know that’s not enough to build true equity and belonging in our communities. That’s why we prioritize integrating diversity, equity, inclusion and belonging principles into the core of how we work every day—focusing not only on our employee experience, but also our customer experience and operations. It’s the best way to serve our mission of empowering entrepreneurs everywhere, and making opportunity more inclusive for all. To read more about these commitments, as well as our representation and pay equity data, check out our Diversity and Pay Parity annual report which can be found on our Diversity Careers page. GoDaddy is proud to be an equal opportunity employer . GoDaddy will consider for employment qualified applicants with criminal histories in a manner consistent with local and federal requirements. Refer to our full EEO policy. Our recruiting team is available to assist you in completing your application. If they could be helpful, please reach out to myrecruiter@godaddy.com. GoDaddy doesn’t accept unsolicited resumes from recruiters or employment agencies.
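The OTA management work described in this posting typically rolls updates out to the fleet in percentage waves. As a hedged illustration (not GoDaddy's actual mechanism), the snippet below buckets terminals deterministically by hashing their serial numbers, so a given device stays in or out of a wave across runs:

```python
# Illustrative staged-rollout bucketing for over-the-air updates.
# Hashing the serial gives each terminal a stable bucket in [0, 100),
# so widening the rollout percentage only ever adds devices to the wave.

import hashlib

def in_rollout(serial: str, percent: int) -> bool:
    """Stable yes/no for whether this terminal is in the current wave."""
    digest = hashlib.sha256(serial.encode()).hexdigest()
    bucket = int(digest, 16) % 100
    return bucket < percent

# At a 10% wave, roughly one in ten serials qualifies; membership is
# deterministic, so re-running the rollout job targets the same devices.
wave = [s for s in ("T-001", "T-002", "T-003") if in_rollout(s, 10)]
```

The stability property is what makes canarying safe: if the 10% wave surfaces a bad build, the remaining 90% of terminals were never offered it, and the wave can be widened gradually once crash reports stay clean.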
Posted 23 hours ago
1.0 years
11 - 13 Lacs
Pune
Remote
Experience: 1+ Years
Work location: Bangalore, Chennai, Hyderabad, Pune - Hybrid
Job Description: GCP Cloud Engineer
Shift Time: 2 to 11 PM IST
Budget: Max 13 LPA

Primary Skills & Weightage: GCP - 50%, Kubernetes - 25%, Node.js - 25%

Technical Skills:
Cloud: Experience working with Google Cloud Platform (GCP) services.
Containers & Orchestration: Practical experience deploying and managing applications on Kubernetes.
Programming: Proficiency in Node.js development, including building and maintaining RESTful APIs or backend services.
Messaging: Familiarity with Apache Kafka for producing and consuming messages.
Databases: Experience with PostgreSQL or similar relational databases (writing queries, basic schema design).
Version Control: Proficient with Git and GitHub workflows (branching, pull requests, code reviews).
Development Tools: Comfortable using Visual Studio Code (VSCode) or similar IDEs.

Additional Requirements:
Communication: Ability to communicate clearly in English (written and verbal).
Collaboration: Experience working in distributed or remote teams.
Problem Solving: Demonstrated ability to troubleshoot and debug issues independently.
Learning: Willingness to learn new technologies and adapt to changing requirements.

Preferred but not required: Experience with CI/CD pipelines. Familiarity with Agile methodologies. Exposure to monitoring/logging tools (e.g., Prometheus, Grafana, ELK stack).

Job Type: Full-time
Pay: ₹1,100,000.00 - ₹1,300,000.00 per year
Schedule: UK shift
Work Location: In person
Posted 23 hours ago
8.0 - 12.0 years
15 - 29 Lacs
India
On-site
Job Title: Technical Lead Experience: 8 to 12 Years Location: Chennai Domain : BFSI Job Summary: We are seeking a versatile and highly skilled Senior Software Engineer with expertise in full stack development, mobile application development using Flutter, and backend systems using Java/Spring Boot. The ideal candidate will have strong experience across modern development stacks, cloud platforms (AWS), containerization, and CI/CD pipelines. Key Responsibilities: Design and develop scalable web, mobile, and backend applications. Build high-quality, performant cross-platform mobile apps using Flutter and Dart. Develop RESTful APIs and services using Node.js/Express and Java/Spring Boot. Integrate frontend components with backend logic and databases (Oracle, PostgreSQL, MongoDB). Work with containerization tools like Docker and orchestration platforms like Kubernetes or ROSA. Leverage AWS cloud services for deployment, scalability, and monitoring (e.g., EC2, S3, RDS, Lambda). Collaborate with cross-functional teams including UI/UX, QA, DevOps, and product managers. Participate in Agile ceremonies, code reviews, unit/integration testing, and performance tuning. Maintain secure coding practices and ensure compliance with security standards. Required Skills & Qualifications: Strong programming in Java (Spring Boot), Node.js, and React.js. Proficiency in Flutter & Dart for mobile development. Experience with REST APIs, JSON, and third-party integrations. Hands-on experience with cloud platforms (preferably AWS). Strong skills in databases such as Oracle, PostgreSQL, MongoDB. Experience with Git, CI/CD tools (Jenkins, GitLab CI, GitHub Actions). Familiarity with containerization using Docker and orchestration via Kubernetes. Knowledge of secure application development (OAuth, JWT, encryption). Solid understanding of Agile/Scrum methodologies. Preferred Qualifications: Experience with Firebase, messaging queues (Kafka/RabbitMQ), and server-side rendering (Next.js). 
Familiarity with DevOps practices, infrastructure as code (Terraform/CloudFormation), and observability tools (Prometheus, ELK). Exposure to platform-specific integrations for Android/iOS through native channels. Understanding of App Store / Play Store deployment. Education: Bachelor’s or Master’s degree in Computer Science, Engineering, or related field, or equivalent practical experience. Job Types: Full-time, Permanent Pay: ₹1,575,371.85 - ₹2,989,972.99 per year Benefits: Health insurance Provident Fund Schedule: Morning shift Supplemental Pay: Performance bonus Yearly bonus Application Question(s): How many years of experience in Java? How many years of experience in Flutter? How many years of experience in CRM Model ? What's your notice period ? Work Location: In person
Posted 23 hours ago
1.0 years
11 - 13 Lacs
Chennai
Remote
Experience: 1+ Years
Work location: Bangalore, Chennai, Hyderabad, Pune - Hybrid
Job Description: GCP Cloud Engineer
Shift Time: 2 to 11 PM IST
Budget: Max 13 LPA

Primary Skills & Weightage: GCP - 50%, Kubernetes - 25%, Node.js - 25%

Technical Skills:
Cloud: Experience working with Google Cloud Platform (GCP) services.
Containers & Orchestration: Practical experience deploying and managing applications on Kubernetes.
Programming: Proficiency in Node.js development, including building and maintaining RESTful APIs or backend services.
Messaging: Familiarity with Apache Kafka for producing and consuming messages.
Databases: Experience with PostgreSQL or similar relational databases (writing queries, basic schema design).
Version Control: Proficient with Git and GitHub workflows (branching, pull requests, code reviews).
Development Tools: Comfortable using Visual Studio Code (VSCode) or similar IDEs.

Additional Requirements:
Communication: Ability to communicate clearly in English (written and verbal).
Collaboration: Experience working in distributed or remote teams.
Problem Solving: Demonstrated ability to troubleshoot and debug issues independently.
Learning: Willingness to learn new technologies and adapt to changing requirements.

Preferred but not required: Experience with CI/CD pipelines. Familiarity with Agile methodologies. Exposure to monitoring/logging tools (e.g., Prometheus, Grafana, ELK stack).

Job Type: Full-time
Pay: ₹1,100,000.00 - ₹1,300,000.00 per year
Schedule: UK shift
Work Location: In person
Posted 23 hours ago
6.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Req ID: 321843 NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Sr Java Full Stack Developer to join our team in Chennai, Tamil Nādu (IN-TN), India (IN). Lead Java Developer How You’ll Help Us: Our clients need digital solutions that will transform their business so they can succeed in today’s hypercompetitive marketplace. As a team member you will routinely deliver elite solutions to clients that will impact their products, customers, and services. Using your development, design and leadership skills and experience, you will design and implement solutions based on client needs. You will collaborate with customers on future system enhancements, thus resulting to continued engagements. How We Will Help You: Joining our Java practice is not only a job, but a chance to grow your career. We will make sure to equip you with the skills you need to produce robust applications that you can be proud of. Whether it is providing you with training on a new programming language or helping you get certified in a new technology, we will help you grow your skills so you can continue to deliver increasingly valuable work. Once You Are Here, You Will: The Lead Applications Developer provides leadership in full systems life cycle management (e.g., analyses, technical requirements, design, coding, testing, implementation of systems and applications software, etc.) to ensure delivery is on time and within budget. You will direct component and data architecture design, technology planning, and testing for Applications Development (AD) initiatives to meet business requirements and ensure compliance. This position develops and leads AD project activities and integrations. The Lead Applications Developer guides teams to ensure effective communication and achievement of objectives. 
This position provides knowledge and support for applications’ development, integration, and maintenance. The Lead Applications Developer will lead junior team members with project related activities and tasks. You will guide and influence department and project teams. This position facilitates collaboration with stakeholders. Apply Disaster Recovery Knowledge Apply Foundation Architecture Knowledge Apply Information Analysis and Solution Generation Knowledge Apply Information Systems Knowledge Apply Internal Systems Knowledge Assess Business Needs IT – Design/Develop Application Solutions IT – Knowledge of Emerging Technology IT – Process, Methods, and Tools IT – Stakeholder Relationship Management Project Risk Management Problem Management and Project Planning Technical Problem Solving and Analytical Processes Technical Writing Job Requirements: Lead IS Projects; delegate work assignments to complete the deliverables for small projects or components of larger projects to meet project plan requirements Lead System Analysis and Design; Translates business and functional requirements into technical design to meet stated business needs. Leads Design and Development of Applications; Identify new areas for process improvements to enhance performance results. Deliver application solutions to meet business and non-functional requirements. Develop and Ensure Creation of Application Documentation; determines documentation needs to deliver applications Define and Produce Integration Builds; lead build processes for target environments to create software. Verifies integration test specifications to ensure proper testing. Monitor Emerging Technology Trends; monitor the industry to gain knowledge and understanding of emerging technologies. Lead Maintenance and Support; drives problem resolution to identify, recommend, and implement process improvements. 
Lead other team members; provide input to people processes (e.g., Quality Performance Review, Career Development, Training, Staffing, etc.) to provide detailed performance-level information to managers.

Basic qualifications:
6+ years of experience with Java, leading the development of highly scalable and resilient applications.
6+ years of deep architectural experience with Spring Boot, including experience mentoring others in its best practices and advanced features.
4+ years of Angular.
4+ years of GCP or a similar platform such as Azure or AWS.
4+ years of experience with Couchbase, including leading performance tuning, data modeling, and scalability efforts.
4+ years of experience with Kafka, AMQ, and WMQ, and the strategic implementation of messaging and event-driven architectures.
4+ years of experience in Apache Camel, including designing and implementing complex integration solutions.
4+ years of leadership experience in adopting new technologies and frameworks, guiding best practices in development methodologies, and overseeing technical project management.

Ideal Mindset:
Lifelong Learner. You are always seeking to improve your technical and nontechnical skills.
Team Player. You are someone who wants to see everyone on the team succeed and is willing to go the extra mile to help a teammate in need.
Communicator. You know how to communicate your design ideas to both technical and nontechnical stakeholders, prioritizing critical information and leaving out extraneous details.

Please note the shift timing requirement: 1:30 pm IST - 10:30 pm IST. #Launchjobs #LaunchEngineering

About NTT DATA: NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies.
Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us . This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here . If you'd like more information on your EEO rights under the law, please click here . For Pay Transparency information, please click here . Show more Show less
Posted 23 hours ago
10.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Key Responsibilities
- Lead and architect end-to-end data migrations from on-premise and legacy systems to Snowflake, ensuring optimal performance, scalability, and cost-efficiency.
- Design and develop reusable data ingestion and transformation frameworks using Python.
- Build and optimize real-time ingestion pipelines using Kafka, Snowpipe, and the COPY command.
- Use SnowConvert to migrate and optimize legacy ETL and SQL logic for Snowflake.
- Design and implement high-performance Snowflake data models, including materialized views, clustering keys, and result caching strategies.
- Monitor resource usage and implement auto-suspend/auto-resume, query profiling, and cost-control measures to manage compute and storage effectively.
- Drive cost governance initiatives, providing insight into credit usage and optimizing workload distribution.
- Integrate Snowflake with AWS services such as S3, Lambda, Glue, and Step Functions to ensure a robust data ecosystem.
- Mentor junior engineers, enforce best practices in development and code quality, and champion agile data engineering practices.

Required Skills and Experience
- 10+ years of experience in data engineering with a focus on enterprise ETL and cloud data platforms.
- 4+ years of hands-on experience in Snowflake development and architecture.
- Expertise in advanced Snowflake features such as Snowpark, Streams & Tasks, Secure Data Sharing, Data Masking, and Time Travel.
- Proven ability to architect enterprise-grade Snowflake solutions optimized for performance, governance, and scalability.
- Proficiency in Python for building orchestration tools, automation, and reusable data pipelines.
- Solid knowledge of AWS services, including S3, IAM, Lambda, Glue, and Step Functions.
- Hands-on experience with SnowConvert or similar tools for legacy code conversion.
- Familiarity with real-time data streaming technologies such as Kafka, Kinesis, or other event-based systems.
- Strong SQL skills with proven experience in query tuning, profiling, and performance optimization.
- Deep understanding of legacy ETL tools, preferably with experience in Ab Initio.
- Exposure to CI/CD pipelines, version control systems (e.g., Git), and automated deployment practices.

Preferred Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Experience migrating on-premises or mainframe data warehouses to Snowflake.
- Familiarity with BI/analytics tools such as Tableau, Power BI, or Looker.
- Knowledge of data security and compliance best practices, including data masking, RBAC, and OAuth integration.
- Snowflake certifications (Developer, Architect) are a strong plus.
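The Snowpipe/COPY-based ingestion this role centers on ultimately comes down to Snowflake's COPY INTO statement. As a minimal sketch of the idea (the table, stage, and file names below are hypothetical, not taken from this posting), a small Python helper might assemble such a statement for files landed in a named stage:

```python
def build_copy_statement(table, stage_path, file_type="PARQUET",
                         pattern=None, on_error="ABORT_STATEMENT"):
    """Assemble a Snowflake COPY INTO statement for files in a named stage.

    The same statement body is what a Snowpipe definition would wrap in
    CREATE PIPE ... AS <copy_statement> for continuous ingestion.
    """
    parts = [
        f"COPY INTO {table}",
        f"FROM @{stage_path}",
        f"FILE_FORMAT = (TYPE = '{file_type}')",
    ]
    if pattern:
        # Restrict the load to file names matching a regex on the stage listing.
        parts.append(f"PATTERN = '{pattern}'")
    parts.append(f"ON_ERROR = '{on_error}'")
    return "\n".join(parts)

# Hypothetical table and stage names, for illustration only.
sql = build_copy_statement(
    "analytics.raw_events",
    "raw_stage/events/",
    pattern=r".*[.]parquet",
)
print(sql)
```

In practice the generated statement would be executed through a Snowflake connection (or registered as a pipe), and the defaults here are illustrative starting points rather than recommended settings.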
Posted 23 hours ago
5.0 years
15 Lacs
India
On-site
Key Responsibilities:
- Architect, design, and optimize enterprise-grade NiFi data flows for large-scale ingestion, transformation, and routing.
- Manage Kafka clusters at scale (multi-node, multi-datacenter setups), ensuring high availability, fault tolerance, and maximum throughput.
- Create custom NiFi processors and develop advanced flow templates and best practices.
- Handle advanced Kafka configurations: partitioning, replication, producer tuning, consumer optimization, rebalancing, etc.
- Implement stream processing using Kafka Streams and manage Kafka Connect integrations with external systems (databases, APIs, cloud storage).
- Design secure pipelines with end-to-end encryption, authentication (SSL/SASL), and RBAC for both NiFi and Kafka.
- Proactively monitor and troubleshoot performance bottlenecks in real-time streaming environments.
- Collaborate with infrastructure teams on scaling, backup, and disaster recovery planning for NiFi/Kafka.
- Mentor junior engineers and enforce best practices for data flow and streaming architectures.

Required Skills and Qualifications:
- 5+ years of hands-on production experience with Apache NiFi and Apache Kafka.
- Deep understanding of NiFi architecture (flow file repository, provenance, state management, backpressure handling).
- Mastery of Kafka internals: brokers, producers/consumers, ZooKeeper (or KRaft mode), offsets, ISR, topic configurations.
- Strong experience with Kafka Connect, Kafka Streams, Schema Registry, and data serialization formats (Avro, Protobuf, JSON).
- Expertise in tuning NiFi and Kafka for ultra-low latency and high throughput.
- Strong scripting and automation skills (Shell, Python, Groovy, etc.).
- Experience with monitoring tools: Prometheus, Grafana, Confluent Control Center, NiFi Registry, NiFi monitoring dashboards.
- Solid knowledge of security best practices in data streaming (encryption, access control, secret management).
- Hands-on experience deploying on cloud platforms (AWS MSK, Azure Event Hubs, GCP Pub/Sub with Kafka connectors).
- Bachelor's or Master's degree in Computer Science, Data Engineering, or an equivalent field.

Preferred (Bonus) Skills:
- Experience with containerization and orchestration (Docker, Kubernetes, Helm).
- Knowledge of stream processing frameworks such as Apache Flink or Spark Streaming.
- Contributions to open-source NiFi/Kafka projects (a huge plus).

Soft Skills:
- Analytical thinker with exceptional troubleshooting skills.
- Ability to architect solutions under tight deadlines.
- Leadership qualities for guiding and mentoring engineering teams.
- Excellent communication and documentation skills.

Please send your resume to hr@rrmgt.in or call 9081819473.

Job Type: Full-time
Pay: From ₹1,500,000.00 per year
Work Location: In person
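The producer-tuning work described in this role usually reduces to a handful of client settings that trade latency against throughput and durability. A sketch of a throughput-leaning configuration follows; the keys are standard Kafka producer configuration names (shared by the Java client and librdkafka-based clients such as confluent-kafka), while the broker hosts and specific values are illustrative assumptions, not recommendations from the posting:

```python
# Throughput-leaning Kafka producer settings. Keys follow the standard
# producer configuration names; hosts and values are illustrative.
producer_config = {
    "bootstrap.servers": "broker1:9092,broker2:9092",  # hypothetical brokers
    "acks": "all",               # wait for all in-sync replicas (durability)
    "enable.idempotence": True,  # deduplicated, in-order delivery per partition
    "compression.type": "lz4",   # cheap compression improves effective batching
    "linger.ms": 20,             # wait up to 20 ms to accumulate fuller batches
    "batch.size": 131072,        # up to 128 KiB per batch before forcing a send
}

def latency_leaning(config):
    """Return a copy of the config tuned for low latency instead of throughput."""
    tuned = dict(config)
    tuned["linger.ms"] = 0            # send immediately, no batching delay
    tuned["compression.type"] = "none"  # skip compression CPU cost on the hot path
    return tuned
```

In a real deployment this dictionary would be passed to the producer constructor, and values tuned from observed batch sizes and end-to-end latency rather than set once.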
Posted 23 hours ago
6.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Req ID: 321853. NTT DATA strives to hire exceptional, innovative, and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Sr. Java Full Stack Developer (Lead Java Developer) to join our team in Chennai, Tamil Nadu (IN-TN), India (IN).

How You'll Help Us: Our clients need digital solutions that will transform their business so they can succeed in today's hypercompetitive marketplace. As a team member you will routinely deliver elite solutions to clients that impact their products, customers, and services. Using your development, design, and leadership skills and experience, you will design and implement solutions based on client needs. You will collaborate with customers on future system enhancements, resulting in continued engagements.

How We Will Help You: Joining our Java practice is not only a job but a chance to grow your career. We will equip you with the skills you need to produce robust applications you can be proud of. Whether it is training on a new programming language or certification in a new technology, we will help you grow your skills so you can continue to deliver increasingly valuable work.

Once You Are Here, You Will: As Lead Applications Developer, provide leadership in full systems life cycle management (e.g., analysis, technical requirements, design, coding, testing, and implementation of systems and applications software) to ensure delivery is on time and within budget. You will direct component and data architecture design, technology planning, and testing for Applications Development (AD) initiatives to meet business requirements and ensure compliance, develop and lead AD project activities and integrations, and guide teams to ensure effective communication and achievement of objectives.
Posted 23 hours ago
8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Required Skills and Experience
- 8+ years in IT operations, scheduling, and workflow automation using Control-M.
- Strong experience integrating Control-M with AWS cloud services.
- Hands-on experience with enterprise ETL tools such as Ab Initio or Informatica.
- Experience supporting data migration and orchestration involving modern cloud data platforms such as Snowflake.
- Proficiency in Python scripting for automation and custom tooling around Control-M.
- Familiarity with real-time data streaming platforms such as Kafka or Kinesis.
- Solid understanding of job scheduling concepts, batch processing, and event-driven automation.
- Experience with CI/CD pipelines, Git, and automation of deployment workflows.
- Strong troubleshooting, root cause analysis, and incident resolution skills.

Preferred Qualifications
- Bachelor's degree in Computer Science, IT, or a related field.
- Experience managing large-scale Control-M environments in enterprise settings.
- Knowledge of cloud data architecture and modern data engineering practices.
- Familiarity with Snowflake features and cloud data warehousing concepts.
- Certification in Control-M Administration or related scheduling tools is a plus.
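Much of the custom Python tooling around a scheduler like Control-M is glue of one shape: poll a job's status until it reaches a terminal state, backing off between checks. A generic sketch follows, with the status lookup injected as a callable; the status strings and the `fetch_status` interface are assumptions for illustration, not the Control-M Automation API:

```python
import time

# Hypothetical terminal status names, for illustration only.
TERMINAL = {"Ended OK", "Ended Not OK"}

def wait_for_job(fetch_status, job_id, timeout=600.0, initial_delay=1.0,
                 max_delay=30.0, sleep=time.sleep):
    """Poll fetch_status(job_id) with capped exponential backoff until the
    job reaches a terminal state or the timeout elapses.

    fetch_status wraps whatever interface is actually in use (a REST call,
    a CLI invocation, ...); injecting it keeps this loop testable offline.
    """
    waited, delay = 0.0, initial_delay
    while True:
        status = fetch_status(job_id)
        if status in TERMINAL:
            return status
        if waited >= timeout:
            raise TimeoutError(f"job {job_id} still '{status}' after {timeout}s")
        sleep(delay)
        waited += delay
        delay = min(delay * 2, max_delay)  # exponential backoff, capped
```

Injecting `sleep` as well lets tests run the loop without real delays, which is the kind of design that makes automation around a scheduler maintainable.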
Posted 23 hours ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
We are seeking a detail-oriented Data Test Engineer to join our data migration and cloud modernization team. The ideal candidate will have hands-on experience testing complex ETL pipelines, data migration workflows, and cloud data platforms such as Snowflake, with exposure to legacy ETL tools such as Ab Initio or Informatica. Experience in automating data validation, performance testing, and supporting real-time ingestion using Kafka or similar technologies is essential.

Key Responsibilities
- Design, develop, and execute test plans for data migration projects moving data from legacy systems to Snowflake.
- Validate data pipelines developed using ETL tools such as Ab Initio and Informatica, ensuring data quality, accuracy, and integrity.
- Develop automated test scripts and frameworks using Python for data validation, reconciliation, and regression testing.
- Perform end-to-end data validation, including schema validation, volume checks, transformation logic verification, and performance benchmarking.
- Test real-time data ingestion workflows integrating Kafka, Snowpipe, and Snowflake COPY commands.
- Collaborate closely with development, data engineering, and DevOps teams to identify defects, track issues, and ensure timely resolution.
- Participate in designing reusable test automation frameworks tailored for cloud data platforms.
- Ensure compliance with data governance, security, and regulatory requirements during testing.
- Document test cases and results, and provide clear reporting to stakeholders.
- Support CI/CD pipelines by integrating automated testing into the deployment workflow.

Required Skills and Experience
- 5+ years in data testing or quality assurance, with strong experience in data validation and ETL testing.
- Hands-on experience testing data migrations to Snowflake or other cloud data warehouses.
- Familiarity with legacy ETL tools such as Ab Initio or Informatica and their testing methodologies.
- Proficiency in scripting languages such as Python for test automation and data validation.
- Knowledge of real-time data streaming platforms such as Kafka, Kinesis, or equivalents.
- Strong SQL skills for writing complex queries to validate data integrity and transformations.
- Experience with automated testing tools and frameworks for data quality checks.
- Understanding of cloud environments, particularly AWS services (S3, Lambda, Glue).
- Familiarity with CI/CD tools and practices to integrate automated testing.

Preferred Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Experience with performance and load testing of data pipelines.
- Knowledge of data governance and compliance frameworks.
- Exposure to BI tools such as Tableau or Power BI for validating data consumption layers.
- Certifications in data quality or cloud platforms (Snowflake, AWS) are a plus.
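The source-to-target reconciliation work described in this role often starts with two cheap checks before any row-by-row comparison: matching row counts and a matching order-insensitive fingerprint of the data. A small pure-Python sketch of that idea, with rows modeled as tuples (in practice the same checks are usually pushed into SQL against both systems):

```python
import hashlib

def table_fingerprint(rows):
    """Order-insensitive fingerprint: hash each row, XOR the digests.

    XOR-aggregating per-row hashes means two tables containing the same
    rows in any order produce the same fingerprint. Caveat: a row that
    appears an even number of times cancels itself out, so this sketch
    suits keyed (duplicate-free) data.
    """
    acc = 0
    for row in rows:
        digest = hashlib.sha256(repr(row).encode()).digest()
        acc ^= int.from_bytes(digest, "big")
    return acc

def reconcile(source_rows, target_rows):
    """Return basic check results for a source/target table pair."""
    source, target = list(source_rows), list(target_rows)
    return {
        "row_count_match": len(source) == len(target),
        "fingerprint_match": table_fingerprint(source) == table_fingerprint(target),
    }
```

Only when these coarse checks fail does it pay to drill down to per-column or per-row diffs, which keeps large-volume migration testing fast.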
Posted 23 hours ago
6.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Req ID: 321843. NTT DATA strives to hire exceptional, innovative, and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Sr. Java Full Stack Developer (Lead Java Developer) to join our team in Chennai, Tamil Nadu (IN-TN), India (IN).

How You'll Help Us: Our clients need digital solutions that will transform their business so they can succeed in today's hypercompetitive marketplace. As a team member you will routinely deliver elite solutions to clients that impact their products, customers, and services. Using your development, design, and leadership skills and experience, you will design and implement solutions based on client needs. You will collaborate with customers on future system enhancements, resulting in continued engagements.

How We Will Help You: Joining our Java practice is not only a job but a chance to grow your career. We will equip you with the skills you need to produce robust applications you can be proud of. Whether it is training on a new programming language or certification in a new technology, we will help you grow your skills so you can continue to deliver increasingly valuable work.

Once You Are Here, You Will: As Lead Applications Developer, provide leadership in full systems life cycle management (e.g., analysis, technical requirements, design, coding, testing, and implementation of systems and applications software) to ensure delivery is on time and within budget. You will direct component and data architecture design, technology planning, and testing for Applications Development (AD) initiatives to meet business requirements and ensure compliance, develop and lead AD project activities and integrations, and guide teams to ensure effective communication and achievement of objectives.
Posted 23 hours ago