
8532 Kafka Jobs - Page 48

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

7.0 years

0 Lacs

India

On-site


Company Description
👋🏼 We're Nagarro. We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale — across all devices and digital mediums, and our people exist everywhere in the world (18,000+ experts across 38 countries, to be exact). Our work culture is dynamic and non-hierarchical. We're looking for great new colleagues. That's where you come in!

Job Description
Requirements:
- Total experience of 7+ years.
- Extensive experience in back-end development using Java 8 or higher, Spring Framework (Core/Boot/MVC), Hibernate/JPA, and microservices architecture.
- Experience with messaging systems such as Kafka.
- Hands-on experience with REST APIs and caching systems (e.g., Redis).
- Proficiency in Service-Oriented Architecture (SOA) and web services (Apache CXF, JAX-WS, JAX-RS, SOAP, REST).
- Experience with modern testing frameworks (Jest, Mocha, Chai).
- Hands-on experience with multithreading and cloud development.
- Strong working experience in data structures and algorithms, unit testing, and object-oriented programming (OOP) principles.
- Hands-on experience with relational databases such as SQL Server, Oracle, MySQL, and PostgreSQL.
- Experience with DevOps tools and technologies such as Ansible, Docker, Kubernetes, Puppet, Jenkins, and Chef.
- Proficiency in build automation tools such as Maven, Ant, and Gradle.
- Hands-on experience with cloud technologies such as AWS/Azure.
- Strong understanding of UML and design patterns.
- Ability to simplify solutions, optimize processes, and efficiently resolve escalated issues.
- Strong problem-solving skills and a passion for continuous improvement.
- Excellent communication skills and the ability to collaborate effectively with cross-functional teams.
Responsibilities:
- Writing and reviewing great-quality code.
- Understanding functional requirements thoroughly and analyzing the client's needs in the context of the project.
- Envisioning the overall solution for defined functional and non-functional requirements, and defining the technologies, patterns, and frameworks to realize it.
- Determining and implementing design methodologies and tool sets.
- Enabling application development by coordinating requirements, schedules, and activities.
- Leading/supporting UAT and production rollouts.
- Creating, understanding, and validating the WBS and estimated effort for a given module/task, and being able to justify it.
- Addressing issues promptly, responding positively to setbacks and challenges with a mindset of continuous improvement.
- Giving constructive feedback to team members and setting clear expectations.
- Helping the team troubleshoot and resolve complex bugs.
- Coming up with solutions to any issue raised during code/design review and being able to justify the decision taken.
- Carrying out POCs to make sure the suggested design/technologies meet the requirements.

Qualifications: Bachelor's or master's degree in Computer Science, Information Technology, or a related field.
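Several listings on this page ask for hands-on experience with Kafka consumers. The core of that work is a consume-process-commit loop with at-least-once semantics: commit only after a message is handled, so a crash re-delivers rather than loses work. A broker-free Python sketch; the message shapes are illustrative and a test double stands in for a real client library:

```python
def consume_loop(consumer, handler, max_messages):
    # At-least-once processing: commit only after the handler succeeds,
    # so a crash mid-batch re-delivers rather than loses messages.
    processed = 0
    while processed < max_messages:
        msg = consumer.poll()
        if msg is None:          # no more messages available
            break
        handler(msg)             # may raise; nothing is committed then
        consumer.commit()
        processed += 1
    return processed

class FakeConsumer:
    # Stand-in for a real Kafka client (e.g. confluent-kafka's Consumer).
    def __init__(self, messages):
        self._queue = list(messages)
        self.commits = 0
    def poll(self):
        return self._queue.pop(0) if self._queue else None
    def commit(self):
        self.commits += 1

consumer = FakeConsumer(["order-1", "order-2", "order-3"])
seen = []
handled = consume_loop(consumer, seen.append, max_messages=10)
```

Committing before the handler runs would flip this to at-most-once delivery, which is the usual trade-off interviewers probe for.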

Posted 5 days ago


2.0 - 4.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Job Description
About KPMG in India
KPMG entities in India are professional services firms. These Indian member firms are affiliated with KPMG International Limited. KPMG was established in India in August 1993. Our professionals leverage the global network of firms and are conversant with local laws, regulations, markets, and competition. KPMG has offices across India in Ahmedabad, Bengaluru, Chandigarh, Chennai, Gurugram, Jaipur, Hyderabad, Kochi, Kolkata, Mumbai, Noida, Pune, Vadodara and Vijayawada. KPMG entities in India offer services to national and international clients in India across sectors. We strive to provide rapid, performance-based, industry-focused and technology-enabled services, which reflect a shared knowledge of global and local industries and our experience of the Indian business environment.

Qualifications
- Experience: 2-4 years
- Education: B.E/B.Tech/MCA/M.Tech

Minimum Qualifications
- Bachelor's degree in Computer Science, CIS, or a related field (or equivalent work experience in a related field)
- 2 years of experience in software development or a related field
- 2 years of experience in database technologies
- 2 years of experience on projects implementing solutions across the software development life cycle (SDLC)

Skillset
- 2 years of experience in Java/JEE application development; proficient in core Java 8 or higher
- Spring Boot / microservices development experience is a must
- Experience with JPA and Hibernate
- Experience creating APIs / Kafka consumers
- Experience working in Agile methodology
- Strong in DSA and problem solving
- Collaboration tools: experience with Jira, Confluence, or Slack
- Problem-solving: strong analytical and problem-solving skills
- Communication: excellent verbal and written communication skills
- Version control: proficiency with Git and version-control workflows
- Must have solved 100+ problems on HackerRank/LeetCode/GFG

Equal employment opportunity information
KPMG India has a policy of providing equal opportunity for all applicants and employees regardless of their color, caste, religion, age, sex/gender, national origin, citizenship, sexual orientation, gender identity or expression, disability or other legally protected status. KPMG India values diversity and we request you to submit the details below to support us in our endeavor for diversity. Providing the below information is voluntary and refusal to submit such information will not be prejudicial to you.

Posted 5 days ago


0 years

0 Lacs

Pune, Maharashtra, India

On-site


Join us as a Software Engineer - Sales Tech at Barclays, responsible for supporting the successful delivery of Location Strategy projects to plan, budget, and agreed quality and governance standards. You'll spearhead the evolution of our digital landscape, driving innovation and excellence. You will harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences.

To be successful as a Software Engineer - Sales Tech you should have experience with:
- Proficiency in Java development, with extensive experience in Java (version 8 or higher)
- Container technologies, including microservices, messaging protocols like Kafka, and caching technologies such as Apache Ignite
- Spring Boot, JUnit, GitLab/Maven, and JIRA
- A solid understanding of Test-Driven Development (TDD) and Continuous Integration (CI) processes
- A Bachelor's degree in Computer Science or equivalent experience

Some other highly valued skills may include:
- Financial industry experience and knowledge of Fixed Income products (prior experience with structuring / pre-trade / post-trade capture, workflow, processing, and associated life-cycling is a plus)
- Proficiency in requirements analysis and software design
- Experience investigating production incidents with priority, with a view to restoring services as soon as possible
- Basic knowledge of user interface (UI) and user experience (UX) design principles, to collaborate effectively with the UI team
- Knowledge of microservices orchestration and BPMN tools, preferably Camunda
- Demonstrated ability to collaborate with diverse individuals and global teams

You may be assessed on the key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is based in Pune.

Purpose of the role
To design, develop and improve software, utilising various engineering methodologies, that provides business, platform, and technology capabilities for our customers and colleagues.

Accountabilities
- Development and delivery of high-quality software solutions using industry-aligned programming languages, frameworks, and tools, ensuring that code is scalable, maintainable, and optimized for performance.
- Cross-functional collaboration with product managers, designers, and other engineers to define software requirements, devise solution strategies, and ensure seamless integration and alignment with business objectives.
- Collaboration with peers, participation in code reviews, and promotion of a culture of code quality and knowledge sharing.
- Staying informed of industry technology trends and innovations, and actively contributing to the organisation's technology communities to foster a culture of technical excellence and growth.
- Adherence to secure coding practices to mitigate vulnerabilities, protect sensitive data, and ensure secure software solutions.
- Implementation of effective unit testing practices to ensure proper code design, readability, and reliability.

Assistant Vice President Expectations
To advise and influence decision-making, contribute to policy development, and take responsibility for operational effectiveness. Collaborate closely with other functions/business divisions. Lead a team performing complex tasks, using well-developed professional knowledge and skills to deliver work that impacts the whole business function. Set objectives and coach employees in pursuit of those objectives, appraise performance relative to objectives, and determine reward outcomes. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L – Listen and be authentic, E – Energise and inspire, A – Align across the enterprise, D – Develop others.

For an individual contributor, they will lead collaborative assignments and guide team members through structured assignments, identifying the need to include other areas of specialisation to complete assignments. They will identify new directions for assignments and/or projects, identifying combinations of cross-functional methodologies or practices to meet required outcomes. Consult on complex issues, providing advice to People Leaders to support the resolution of escalated issues. Identify ways to mitigate risk and develop new policies/procedures in support of the control and governance agenda. Take ownership for managing risk and strengthening controls in relation to the work done. Perform work that is closely related to that of other areas, which requires understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation sub-function. Collaborate with other areas of work, for business-aligned support areas, to keep up to speed with business activity and the business strategy. Engage in complex analysis of data from multiple sources of information, internal and external (such as procedures and practices in other areas, teams, companies, etc.), to solve problems creatively and effectively. Communicate complex information; 'complex' information could include sensitive information or information that is difficult to communicate because of its content or its audience. Influence or convince stakeholders to achieve outcomes.

All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.

Posted 5 days ago


4.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibilities
- Develop systems to liberate data from RDBMS platforms into a streaming architecture, on-premise and in the cloud
- Work with automation and orchestration tools such as Kubernetes, GitHub Actions, Terraform, Ansible, or other commercial PaaS offerings
- Continually monitor industry developments in cloud infrastructure, tools, and products used in the cloud delivery model
- Work with Software, Platform Engineering, and Operations teams on the development and delivery of operationally ready platforms
- Work with log aggregation tools like Splunk and ELK
- Design automated, resilient, scalable platform solutions
- Participate in the development of automated delivery workflows using cloud automation and orchestration tools, Unix shell scripting, and other deployment tools
- Assess and interpret customer needs and requirements
- Solve moderately complex problems and/or conduct moderately complex analysis
- Deliver the process and standards for developing new and exciting services for our customers
- Develop integrations between vendor tool APIs and automation frameworks
- Provide explanation and information to others on difficult issues
- Coach, provide feedback, and guide others
- Identify/quantify the scope and impact of business changes on systems
- Support, design, and improve monitoring, alerting, and tooling efforts
- Maintain awareness of current technology assets, and the applicability and capability of each
- Augment current expertise with new open-source or targeted vendor technologies
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications
- Graduate degree or equivalent experience
- 4+ years of Python or Go development experience
- 3+ years of cloud technology experience (GCP/Azure preferred)
- 2+ years of platform engineering experience
- 2+ years of experience with Agile methodology and the Agile DevOps delivery model
- 1+ years of experience working across fully automated stacks in a CI/CD ecosystem

Preferred Qualifications
- Kafka, Kubernetes, Terraform, Go, or Python development experience
- Health care industry experience
- Experience with distributed data stores at scale
- Experience working with emerging technologies
- Experience working effectively across multiple functional areas in a matrixed environment
- Experience with Kafka, Prometheus, Grafana, or any CDC tools

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
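The first responsibility in this listing, moving RDBMS data into a streaming architecture, is often prototyped as watermark-based polling before adopting a log-based CDC tool (e.g. Debezium, one of the "CDC tools" the preferred qualifications mention). A minimal Python sketch against an in-memory SQLite table; the schema and names are invented for illustration:

```python
import sqlite3

def extract_since(conn, watermark):
    # Polling-based change capture: pull only rows newer than the last
    # watermark. Log-based CDC tools read the database log instead and
    # avoid missing updates to already-seen rows; this is a sketch.
    cur = conn.execute(
        "SELECT id, payload FROM orders WHERE id > ? ORDER BY id",
        (watermark,),
    )
    return cur.fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(1, "created"), (2, "paid"), (3, "shipped")])
batch = extract_since(conn, watermark=1)
# each batch would then be produced to a stream (e.g. a Kafka topic),
# and the new watermark becomes the highest id seen
```

The watermark itself must be persisted atomically with the produce step, or restarts will re-emit or skip rows.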

Posted 5 days ago


10.0 years

0 Lacs

Vadodara, Gujarat, India

On-site


Company Description
Wiser Solutions is a suite of in-store and eCommerce intelligence and execution tools. We're on a mission to enable brands, retailers, and retail channel partners to gather intelligence and automate actions to optimize in-store and online pricing, marketing, and operations initiatives. Our Commerce Execution Suite is available globally.

Job Description
When looking to buy a product, whether in a brick-and-mortar store or online, it can be hard enough to find one that not only has the characteristics you are looking for but is also at a price you are willing to pay. It can be especially frustrating when you finally find one, but it is out of stock. Likewise, brands and retailers can have a difficult time getting the visibility they need to ensure you have as seamless an experience as possible in selecting their product. We at Wiser believe that shoppers should have this seamless experience, and we want to do that by providing brands and retailers the visibility they need to make that belief a reality.

Our goal is to solve a messy problem elegantly and cost-effectively. Our job is to collect, categorize, and analyze lots of structured and semi-structured data from lots of different places every day (whether it's 20 million+ products from 500+ websites or data collected from over 300,000 brick-and-mortar stores across the country). We help our customers be more competitive by discovering interesting patterns in this data they can use to their advantage, while being uniquely positioned to do this across both online and in-store. We are looking for a lead-level software engineer to lead a team of like-minded individuals responsible for developing the data architecture that powers our data collection process and analytics platform. If you have a passion for optimization, scaling, and integration challenges, this may be the role for you.

What You Will Do
- Think like our customers: work with product and engineering leaders to define data solutions that support customers' business practices.
- Design, develop, and extend our data pipeline services and architecture to implement your solutions: you will collaborate on some of the most important and complex parts of our system, which form the foundation for the business value our organization provides.
- Foster team growth: provide mentorship to junior team members and share expertise with those on other teams.
- Improve the quality of our solutions: help build enduring trust within our organization and among our customers by ensuring high quality standards for the data we manage.
- Own your work: take responsibility for shepherding your projects from idea through delivery into production.
- Bring new ideas to the table: some of our best innovations originate within the team.

Technologies We Use
- Languages: SQL, Python
- Infrastructure: AWS, Docker, Kubernetes, Apache Airflow, Apache Spark, Apache Kafka, Terraform
- Databases: Snowflake, Trino/Starburst, Redshift, MongoDB, Postgres, MySQL
- Others: Tableau (as a business intelligence solution)

Qualifications
- Bachelor's/Master's degree in Computer Science or a relevant technical degree
- 10+ years of professional software engineering experience
- Strong proficiency with data languages such as Python and SQL
- Strong proficiency with data processing technologies such as Spark, Flink, and Airflow
- Strong proficiency with RDBMS/NoSQL/Big Data solutions (Postgres, MongoDB, Snowflake, etc.)
- Solid understanding of streaming solutions such as Kafka, Pulsar, Kinesis/Firehose, etc.
- Hands-on experience with Docker, Kubernetes, infrastructure as code using Terraform, and Kubernetes package management with Helm charts
- Solid understanding of ETL/ELT and OLTP/OLAP concepts
- Solid understanding of columnar/row-oriented data formats (e.g. Parquet, ORC, Avro)
- Solid understanding of Apache Iceberg or other open table formats
- Proven ability to transform raw unstructured/semi-structured data into structured data in accordance with business requirements
- Solid understanding of AWS, Linux, and infrastructure concepts
- Proven ability to diagnose and address data abnormalities in systems
- Proven ability to learn quickly, make pragmatic decisions, and adapt to changing business needs
- Experience building data warehouses using conformed dimensional models
- Experience building data lakes and/or leveraging data lake solutions (e.g. Trino, Dremio, Druid)
- Experience working with business intelligence solutions (e.g. Tableau)
- Experience working with ML/agentic AI pipelines (e.g. LangChain, LlamaIndex)
- Understanding of Domain-Driven Design concepts and the accompanying microservice architecture
- Passion for data, analytics, or machine learning
- Focus on value: shipping software that matters to the company and the customer

Bonus Points
- Experience working with vector databases
- Experience working within a retail or ecommerce environment
- Proficiency in other programming languages such as Scala, Java, Golang, etc.
- Experience working with Apache Arrow and/or other in-memory columnar data technologies

Supervisory Responsibility
- Provide mentorship to team members on adopted patterns and best practices.
- Organize and lead agile ceremonies such as daily stand-ups, planning, etc.

Additional Information
EEO STATEMENT
Wiser Solutions, Inc. is an Equal Opportunity Employer and prohibits discrimination, harassment, and retaliation of any kind. Wiser Solutions, Inc. is committed to the principle of equal employment opportunity for all employees and applicants, providing a work environment free of discrimination, harassment, and retaliation. All employment decisions at Wiser Solutions, Inc. are based on business needs, job requirements, and individual qualifications, without regard to race, color, religion, sex, national origin, family or parental status, disability, genetics, age, sexual orientation, veteran status, or any other status protected by state, federal, or local law. Wiser Solutions, Inc. will not tolerate discrimination, harassment, or retaliation based on any of these characteristics.
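The qualification about transforming raw semi-structured data into structured data is the day-to-day core of this kind of pipeline work: every scraped record gets coerced into a fixed schema before it can be analyzed. A toy Python sketch of the idea; the field names and coercion rules are invented for illustration, not Wiser's actual schema:

```python
def normalize_product(raw):
    # Coerce one scraped, semi-structured record into a fixed schema.
    # Field names and coercion rules here are purely illustrative.
    price = str(raw.get("price", "0")).replace("$", "").replace(",", "")
    return {
        "name": (raw.get("name") or "").strip().lower(),
        "price": float(price or 0),
        "in_stock": str(raw.get("availability", "")).strip().lower()
                    in {"in stock", "yes", "true"},
    }

record = normalize_product(
    {"name": "  Widget Pro ", "price": "$1,299.99", "availability": "In Stock"}
)
```

In a real pipeline this step runs per record inside Spark or Airflow tasks, with rejected records routed to a dead-letter store rather than silently defaulted.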

Posted 5 days ago


7.5 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site


Project Role: Full Stack Engineer
Project Role Description: Responsible for developing and/or engineering the end-to-end features of a system, from user experience to backend code. Use development skills to deliver innovative solutions that help our clients improve the services they provide. Leverage new technologies that can be applied to solve challenging business problems with a cloud-first and agile mindset.
Must-have skills: Java Full Stack Development, Node.js
Good-to-have skills: NA
Minimum 7.5 years of experience is required
Educational Qualification: BE

Summary: As a Full Stack Engineer, you will be responsible for developing and/or engineering the end-to-end features of a system, from user experience to backend code. You will use your development skills to deliver innovative solutions that help our clients improve the services they provide. Additionally, you will leverage new technologies to solve challenging business problems with a cloud-first and agile mindset.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Develop and engineer end-to-end features of a system.
- Deliver innovative solutions to improve client services.
- Utilize development skills to solve challenging business problems.
- Stay updated with new technologies and apply them to projects.

Professional & Technical Skills:
- Must-have skills: proficiency in Java Full Stack Development, Apache Kafka.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Java Full Stack Development.
- This position is based at our Bengaluru office.
- A BE degree is required.

Posted 5 days ago


4.0 years

0 Lacs

Gurgaon, Haryana, India

Remote


About About this role When BlackRock started in 1988, its founders envisioned a company that combined the best of financial services with cutting edge technology. They imagined a business that would provide financial services to clients as well as technology services to other financial firms. The result of their vision is Aladdin, our industry leading, end-to-end investment management platform. With assets valued over USD $10 trillion managed on Aladdin, our technology empowers millions of investors to save for retirement, pay for college, buy a home and improve their financial wellbeing. Are you interested in building innovative technology that crafts the financial markets? Do you like working at the speed of a startup, and solving some of the world’s most exciting challenges? Do you want to work with, and learn from, hands-on leaders in technology and finance? At BlackRock, we are looking for Software Engineers who like to innovate and solve sophisticated problems. We recognize that strength comes from diversity, and will embrace your outstanding skills, curiosity, and passion while giving you the opportunity to grow technically and as an individual. We invest and protect over $9 trillion (USD) of assets and have an extraordinary responsibility to our clients all over the world. Our technology empowers millions of investors to save for retirement, pay for college, buy a home, and improve their financial well-being. Being a technologist at BlackRock means you get the best of both worlds: working for one of the most sophisticated financial companies and being part of a software development team responsible for next generation technology and solutions. What are Aladdin and Aladdin Engineering? You will be working on BlackRock's investment operating system called Aladdin. Aladdin is used both internally within BlackRock and externally by many financial institutions. 
Aladdin combines sophisticated risk analytics with comprehensive portfolio management, trading, and operations tools on a single platform to power informed decision-making and create a connective tissue for thousands of users investing worldwide. Our development teams reside inside the Aladdin Engineering group. We collaboratively build the next generation of technology that changes the way information, people, and technology intersect for global investment firms. We build and package tools that manage trillions in assets and supports millions of financial instruments. We perform risk calculations and process millions of transactions for thousands of users every day worldwide! Being a Member Of Aladdin Engineering, You Will Be Tenacious: Work in a fast paced and highly complex environment Creative thinker: Analyses multiple solutions and deploy technologies in a flexible way. Great teammate: Think and work collaboratively and communicate effectively. Fast learner: Pick up new concepts and apply them quickly. Responsibilities Include Collaborate with team members in a multi-office, multi-country environment. Deliver high efficiency, high availability, concurrent and fault tolerant software systems. Significantly contribute to development of Aladdin’s global, multi-asset trading platform. Work with product management and business users to define the roadmap for the product. Design and develop innovative solutions to complex problems, identifying issues and roadblocks. Apply validated quality software engineering practices through all phases of development. Ensure resilience and stability through quality code reviews, unit, regression and user acceptance testing, dev ops and level two production support. Be a leader with vision and a partner in brainstorming solutions for team productivity, efficiency, guiding and motivating others. 
Drive a strong culture by bringing principles of inclusion and diversity to the team and setting the tone through specific recruiting, management actions and employee engagement. Qualifications B.E./ B.TECH./ MCA or any other relevant engineering degree from a reputed university. Skills And Experience 4 + years of experience A proven foundation in core Java and related technologies, with OO skills and design patterns. Track record building high quality software with design-focused and test-driven approaches. Hands-on experience in Java/ Spring Framework/Sprint Boot/Hibernate In depth understanding of concurrent programming and experience in designing high throughput, high availability, fault tolerant distributed applications. Prior experience in message brokers Understanding of relational databases is a must. Demonstrable experience building modern software using engineering tools such as git, maven, unit testing and integration testing tools, mocking frameworks. Strong analytical and software architecture design skills with an emphasis on test driven development Great analytical, problem-solving and communication skills Some experience or a real interest in finance, investment processes, and/or an ability to translate business problems into technical solutions. Nice To Have And Opportunities To Learn Expertise in building distributed applications using SQL and/or NOSQL technologies like MS SQL, Sybase, Cassandra or Redis A real-world practitioner of applying cloud-native design patterns to event-driven microservice architectures. Exposure to high scale distributed technology like Kafka, Mongo, Ignite, Redis Exposure to building microservices and APIs ideally with REST, Kafka or gRPC Experience working in an agile development team or on open-source development projects. Experience with optimization, algorithms or related quantitative processes. 
Experience with Cloud platforms like Microsoft Azure, AWS, Google Cloud Experience with cloud deployment technology (Docker, Ansible, Terraform, etc.) is also a plus. Experience with DevOps and tools like Azure DevOps Experience with AI-related projects/products or experience working in an AI research environment. Knowledge of modern front-end frameworks such as React, Vue.js or Angular is a plus. Exposure to Docker, Kubernetes, and cloud services is beneficial. A degree, certifications or opensource track record that shows you have a mastery of software engineering principles. Our Benefits To help you stay energized, engaged and inspired, we offer a wide range of benefits including a strong retirement plan, tuition reimbursement, comprehensive healthcare, support for working parents and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about. Our hybrid work model BlackRock’s hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person – aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock. About BlackRock At BlackRock, we are all connected by one mission: to help more and more people experience financial well-being. Our clients, and the people they serve, are saving for retirement, paying for their children’s educations, buying homes and starting businesses. 
Their investments also help to strengthen the global economy: support businesses small and large; finance infrastructure projects that connect and power cities; and facilitate innovations that drive progress. This mission would not be possible without our smartest investment – the one we make in our employees. It’s why we’re dedicated to creating an environment where our colleagues feel welcomed, valued and supported with networks, benefits and development opportunities to help them thrive. For additional information on BlackRock, please visit @blackrock | Twitter: @blackrock | LinkedIn: www.linkedin.com/company/blackrock BlackRock is proud to be an Equal Opportunity Employer. We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation and other protected attributes at law.

Posted 5 days ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Apache Kafka Good to have skills : NA Minimum 5 Year(s) Of Experience Is Required Educational Qualification : 15 years of full-time education Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that the solutions align with business objectives. You will also engage in problem-solving discussions and contribute to the overall success of the projects by implementing effective strategies and solutions. Roles & Responsibilities: - Expected to be an SME. - Collaborate with and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Facilitate knowledge-sharing sessions to enhance team capabilities. - Monitor project progress and ensure timely delivery of milestones. Professional & Technical Skills: - Must-Have Skills: Proficiency in Apache Kafka. - Strong understanding of distributed systems and messaging protocols. - Experience with stream processing frameworks and data integration tools. - Familiarity with cloud platforms and containerization technologies. - Ability to troubleshoot and optimize application performance. Additional Information: - The candidate should have a minimum of 5 years of experience in Apache Kafka. - This position is based at our Pune office. - 15 years of full-time education is required.
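The producer/consumer pattern behind the Kafka skills listed above can be illustrated without a broker. Below is a minimal in-memory sketch that uses a `BlockingQueue` as a stand-in for a single topic partition; `MiniPartition` and its methods are invented for this illustration and are not the real Kafka client API.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class MiniPartition {
    // A bounded buffer stands in for one topic partition; ordering is
    // preserved within it, just as Kafka guarantees ordering per partition.
    private final BlockingQueue<String> records = new ArrayBlockingQueue<>(16);

    public void produce(String record) throws InterruptedException {
        records.put(record); // blocks when the buffer is full (back-pressure)
    }

    public String consume() throws InterruptedException {
        return records.take(); // blocks until a record arrives
    }

    public static void main(String[] args) throws Exception {
        MiniPartition p = new MiniPartition();
        // Producer thread publishes three records.
        Thread producer = new Thread(() -> {
            try {
                for (String r : new String[] {"r1", "r2", "r3"}) p.produce(r);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        producer.start();
        // Consumer drains them in the order they were produced.
        for (int i = 0; i < 3; i++) System.out.println(p.consume());
        producer.join();
    }
}
```

A real Kafka application would use `KafkaProducer`/`KafkaConsumer` against a broker cluster; the sketch only shows the blocking hand-off and per-partition ordering that those skills build on.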

Posted 5 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Join us as a Software Engineer - Sales Tech at Barclays, responsible for supporting the successful delivery of Location Strategy projects to plan, budget, agreed quality and governance standards. You'll spearhead the evolution of our digital landscape, driving innovation and excellence. You will harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences. To be successful as a Software Engineer - Sales Tech you should have experience with: Proficiency in Java development, with extensive experience in Java (version 8 or higher) Experience with container technologies, including microservices, messaging protocols like Kafka, and caching technologies such as Apache Ignite Experience with Spring Boot, JUnit, GitLab/Maven, and JIRA A solid understanding of Test-Driven Development (TDD) and Continuous Integration (CI) processes A Bachelor's degree in Computer Science or equivalent experience Some Other Highly Valued Skills May Include Financial industry experience, knowledge of Fixed Income products (prior experience with Structuring / Pre-Trade / Post-Trade capture, workflow, processing and associated life-cycling will be a plus) Proficiency in requirements analysis and software design; experience investigating production incidents with priority, with a view to restoring services ASAP Basic knowledge of user interface (UI) and user experience (UX) design principles to collaborate effectively with the UI team Knowledge of microservices orchestration and BPMN tools, preferably Camunda Demonstrated ability to collaborate with diverse individuals and global teams You may be assessed on the key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is based in Pune. 
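Test-Driven Development, one of the skills this posting names, starts from tests written before the implementation. A minimal plain-Java illustration follows; a real project would use JUnit as the posting notes, and `formatBps` and `PriceFormatterTest` are hypothetical names invented for this sketch.

```java
public class PriceFormatterTest {
    // The behaviour under test: format basis points as a percentage string.
    // (Invented for this sketch; not from any posting above.)
    static String formatBps(int bps) {
        return String.format("%.2f%%", bps / 100.0);
    }

    public static void main(String[] args) {
        // In TDD these checks are written first and fail until the
        // implementation above is made to pass them.
        if (!formatBps(25).equals("0.25%")) throw new AssertionError("25 bps");
        if (!formatBps(150).equals("1.50%")) throw new AssertionError("150 bps");
        System.out.println("all tests pass");
    }
}
```

The same red-green loop scales up with JUnit assertions and a CI pipeline running them on every commit.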
Purpose of the role To design, develop and improve software, utilising various engineering methodologies, that provides business, platform, and technology capabilities for our customers and colleagues. Accountabilities Development and delivery of high-quality software solutions by using industry aligned programming languages, frameworks, and tools. Ensuring that code is scalable, maintainable, and optimized for performance. Cross-functional collaboration with product managers, designers, and other engineers to define software requirements, devise solution strategies, and ensure seamless integration and alignment with business objectives. Collaboration with peers, participate in code reviews, and promote a culture of code quality and knowledge sharing. Stay informed of industry technology trends and innovations and actively contribute to the organization’s technology communities to foster a culture of technical excellence and growth. Adherence to secure coding practices to mitigate vulnerabilities, protect sensitive data, and ensure secure software solutions. Implementation of effective unit testing practices to ensure proper code design, readability, and reliability. Assistant Vice President Expectations To advise and influence decision making, contribute to policy development and take responsibility for operational effectiveness. Collaborate closely with other functions/ business divisions. Lead a team performing complex tasks, using well developed professional knowledge and skills to deliver on work that impacts the whole business function. Set objectives and coach employees in pursuit of those objectives, appraisal of performance relative to objectives and determination of reward outcomes If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. 
The four LEAD behaviours are: L – Listen and be authentic, E – Energise and inspire, A – Align across the enterprise, D – Develop others. OR for an individual contributor, they will lead collaborative assignments and guide team members through structured assignments, identify the need for the inclusion of other areas of specialisation to complete assignments. They will identify new directions for assignments and/ or projects, identifying a combination of cross functional methodologies or practices to meet required outcomes. Consult on complex issues; providing advice to People Leaders to support the resolution of escalated issues. Identify ways to mitigate risk and developing new policies/procedures in support of the control and governance agenda. Take ownership for managing risk and strengthening controls in relation to the work done. Perform work that is closely related to that of other areas, which requires understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation sub-function. Collaborate with other areas of work, for business aligned support areas to keep up to speed with business activity and the business strategy. Engage in complex analysis of data from multiple sources of information, internal and external sources such as procedures and practises (in other areas, teams, companies, etc).to solve problems creatively and effectively. Communicate complex information. 'Complex' information could include sensitive information or information that is difficult to communicate because of its content or its audience. Influence or convince stakeholders to achieve outcomes. All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave. 

Posted 5 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Description Java + Microservices, Cloud (AWS) + MongoDB + Kafka. Must have: Java, Spring, Spring Boot, Maven, JUnit, Spring Scheduler, Spring Kafka, design patterns, microservices, microservices design patterns, AWS. Good to have: Drools rules, Okta/Keycloak, Docker, Kubernetes, Service Mesh/Istio. Requirements Same as above Job responsibilities Java + Microservices, Cloud (AWS) + MongoDB + Kafka. Must have: Java, Spring, Spring Boot, Maven, JUnit, Spring Scheduler, Spring Kafka, design patterns, microservices, microservices design patterns, AWS. Good to have: Drools rules, Okta/Keycloak, Docker, Kubernetes, Service Mesh/Istio. What we offer Culture of caring. At GlobalLogic, we prioritize a culture of caring. Across every region and department, at every level, we consistently put people first. From day one, you’ll experience an inclusive culture of acceptance and belonging, where you’ll have the chance to build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders. Learning and development. We are committed to your continuous learning and development. You’ll learn and grow daily in an environment with many opportunities to try new things, sharpen your skills, and advance your career at GlobalLogic. With our Career Navigator tool as just one example, GlobalLogic offers a rich array of programs, training curricula, and hands-on opportunities to grow personally and professionally. Interesting & meaningful work. GlobalLogic is known for engineering impact for and with clients around the world. As part of our team, you’ll have the chance to work on projects that matter. Each is a unique opportunity to engage your curiosity and creative problem-solving skills as you help clients reimagine what’s possible and bring new solutions to market. In the process, you’ll have the privilege of working on some of the most cutting-edge and impactful solutions shaping the world today. Balance and flexibility. 
We believe in the importance of balance and flexibility. With many functional career areas, roles, and work arrangements, you can explore ways of achieving the perfect balance between your work and life. Your life extends beyond the office, and we always do our best to help you integrate and balance the best of work and life, having fun along the way! High-trust organization. We are a high-trust organization where integrity is key. By joining GlobalLogic, you’re placing your trust in a safe, reliable, and ethical global company. Integrity and trust are a cornerstone of our value proposition to our employees and clients. You will find truthfulness, candor, and integrity in everything we do. About GlobalLogic GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner to the world’s largest and most forward-thinking companies. Since 2000, we’ve been at the forefront of the digital revolution – helping create some of the most innovative and widely used digital products and experiences. Today we continue to collaborate with clients in transforming businesses and redefining industries through intelligent products, platforms, and services.

Posted 5 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Description Java + Microservices, Cloud (AWS) + MongoDB + Kafka. Must have: Java, Spring, Spring Boot, Maven, JUnit, Spring Scheduler, Spring Kafka, design patterns, microservices, microservices design patterns, AWS. Good to have: Drools rules, Okta/Keycloak, Docker, Kubernetes, Service Mesh/Istio. Requirements Same as above Job responsibilities Java + Microservices, Cloud (AWS) + MongoDB + Kafka. Must have: Java, Spring, Spring Boot, Maven, JUnit, Spring Scheduler, Spring Kafka, design patterns, microservices, microservices design patterns, AWS. Good to have: Drools rules, Okta/Keycloak, Docker, Kubernetes, Service Mesh/Istio. What we offer Culture of caring. At GlobalLogic, we prioritize a culture of caring. Across every region and department, at every level, we consistently put people first. From day one, you’ll experience an inclusive culture of acceptance and belonging, where you’ll have the chance to build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders. Learning and development. We are committed to your continuous learning and development. You’ll learn and grow daily in an environment with many opportunities to try new things, sharpen your skills, and advance your career at GlobalLogic. With our Career Navigator tool as just one example, GlobalLogic offers a rich array of programs, training curricula, and hands-on opportunities to grow personally and professionally. Interesting & meaningful work. GlobalLogic is known for engineering impact for and with clients around the world. As part of our team, you’ll have the chance to work on projects that matter. Each is a unique opportunity to engage your curiosity and creative problem-solving skills as you help clients reimagine what’s possible and bring new solutions to market. In the process, you’ll have the privilege of working on some of the most cutting-edge and impactful solutions shaping the world today. Balance and flexibility. 
We believe in the importance of balance and flexibility. With many functional career areas, roles, and work arrangements, you can explore ways of achieving the perfect balance between your work and life. Your life extends beyond the office, and we always do our best to help you integrate and balance the best of work and life, having fun along the way! High-trust organization. We are a high-trust organization where integrity is key. By joining GlobalLogic, you’re placing your trust in a safe, reliable, and ethical global company. Integrity and trust are a cornerstone of our value proposition to our employees and clients. You will find truthfulness, candor, and integrity in everything we do. About GlobalLogic GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner to the world’s largest and most forward-thinking companies. Since 2000, we’ve been at the forefront of the digital revolution – helping create some of the most innovative and widely used digital products and experiences. Today we continue to collaborate with clients in transforming businesses and redefining industries through intelligent products, platforms, and services.

Posted 5 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Description Java + Microservices, Cloud (AWS) + MongoDB + Kafka Requirements Must have: Java, Spring, Spring Boot, Maven, JUnit, Spring Scheduler, Spring Kafka, design patterns, microservices, microservices design patterns, AWS. Good to have: Drools rules, Okta/Keycloak, Docker, Kubernetes, Service Mesh/Istio. Job responsibilities Must have: Java, Spring, Spring Boot, Maven, JUnit, Spring Scheduler, Spring Kafka, design patterns, microservices, microservices design patterns, AWS. Good to have: Drools rules, Okta/Keycloak, Docker, Kubernetes, Service Mesh/Istio. What we offer Culture of caring. At GlobalLogic, we prioritize a culture of caring. Across every region and department, at every level, we consistently put people first. From day one, you’ll experience an inclusive culture of acceptance and belonging, where you’ll have the chance to build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders. Learning and development. We are committed to your continuous learning and development. You’ll learn and grow daily in an environment with many opportunities to try new things, sharpen your skills, and advance your career at GlobalLogic. With our Career Navigator tool as just one example, GlobalLogic offers a rich array of programs, training curricula, and hands-on opportunities to grow personally and professionally. Interesting & meaningful work. GlobalLogic is known for engineering impact for and with clients around the world. As part of our team, you’ll have the chance to work on projects that matter. Each is a unique opportunity to engage your curiosity and creative problem-solving skills as you help clients reimagine what’s possible and bring new solutions to market. In the process, you’ll have the privilege of working on some of the most cutting-edge and impactful solutions shaping the world today. Balance and flexibility. We believe in the importance of balance and flexibility. 
With many functional career areas, roles, and work arrangements, you can explore ways of achieving the perfect balance between your work and life. Your life extends beyond the office, and we always do our best to help you integrate and balance the best of work and life, having fun along the way! High-trust organization. We are a high-trust organization where integrity is key. By joining GlobalLogic, you’re placing your trust in a safe, reliable, and ethical global company. Integrity and trust are a cornerstone of our value proposition to our employees and clients. You will find truthfulness, candor, and integrity in everything we do. About GlobalLogic GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner to the world’s largest and most forward-thinking companies. Since 2000, we’ve been at the forefront of the digital revolution – helping create some of the most innovative and widely used digital products and experiences. Today we continue to collaborate with clients in transforming businesses and redefining industries through intelligent products, platforms, and services.

Posted 5 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Description Java, Spring Boot, Microservices, AWS, MongoDB, Kafka Requirements Java, Spring Boot, Microservices, AWS, MongoDB, Kafka Job responsibilities Java, Spring Boot, Microservices, AWS, MongoDB, Kafka What we offer Culture of caring. At GlobalLogic, we prioritize a culture of caring. Across every region and department, at every level, we consistently put people first. From day one, you’ll experience an inclusive culture of acceptance and belonging, where you’ll have the chance to build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders. Learning and development. We are committed to your continuous learning and development. You’ll learn and grow daily in an environment with many opportunities to try new things, sharpen your skills, and advance your career at GlobalLogic. With our Career Navigator tool as just one example, GlobalLogic offers a rich array of programs, training curricula, and hands-on opportunities to grow personally and professionally. Interesting & meaningful work. GlobalLogic is known for engineering impact for and with clients around the world. As part of our team, you’ll have the chance to work on projects that matter. Each is a unique opportunity to engage your curiosity and creative problem-solving skills as you help clients reimagine what’s possible and bring new solutions to market. In the process, you’ll have the privilege of working on some of the most cutting-edge and impactful solutions shaping the world today. Balance and flexibility. We believe in the importance of balance and flexibility. With many functional career areas, roles, and work arrangements, you can explore ways of achieving the perfect balance between your work and life. Your life extends beyond the office, and we always do our best to help you integrate and balance the best of work and life, having fun along the way! High-trust organization. We are a high-trust organization where integrity is key. 
By joining GlobalLogic, you’re placing your trust in a safe, reliable, and ethical global company. Integrity and trust are a cornerstone of our value proposition to our employees and clients. You will find truthfulness, candor, and integrity in everything we do. About GlobalLogic GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner to the world’s largest and most forward-thinking companies. Since 2000, we’ve been at the forefront of the digital revolution – helping create some of the most innovative and widely used digital products and experiences. Today we continue to collaborate with clients in transforming businesses and redefining industries through intelligent products, platforms, and services.

Posted 5 days ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Linkedin logo

Description We are looking for an experienced Java Developer with strong foundations in Data Structures, Algorithms, and System Design. The ideal candidate has a product mindset, writes clean and optimized code, and enjoys solving complex backend challenges at scale. Requirements Required Skills: Strong programming skills in Core Java, including OOP, concurrency, collections, and memory optimization Deep understanding of DSA with the ability to solve complex problems efficiently Experience with Apache Kafka – producer/consumer architecture, message serialization, stream processing, and Kafka internals Proven experience with Spring Boot and Spring Cloud Strong experience with microservices architecture, including service discovery, load balancing, and fault tolerance Hands-on with SQL and NoSQL databases (PostgreSQL, MongoDB, Redis, etc.) Familiarity with CI/CD, Git, Docker, and container orchestration (e.g., Kubernetes) Solid understanding of REST APIs, HTTP, and API gateway patterns Job responsibilities Key Responsibilities: Develop and maintain high-performance backend services using Java (preferably Java 11+) Design efficient and scalable systems leveraging Data Structures and Algorithms Implement real-time, event-driven systems using Apache Kafka Design and develop microservices and RESTful APIs Collaborate with cross-functional teams on architecture, design, and product strategy Participate in code reviews, mentor junior engineers, and enforce engineering best practices Troubleshoot production issues and optimize performance What we offer Culture of caring. At GlobalLogic, we prioritize a culture of caring. Across every region and department, at every level, we consistently put people first. From day one, you’ll experience an inclusive culture of acceptance and belonging, where you’ll have the chance to build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders. Learning and development. 
We are committed to your continuous learning and development. You’ll learn and grow daily in an environment with many opportunities to try new things, sharpen your skills, and advance your career at GlobalLogic. With our Career Navigator tool as just one example, GlobalLogic offers a rich array of programs, training curricula, and hands-on opportunities to grow personally and professionally. Interesting & meaningful work. GlobalLogic is known for engineering impact for and with clients around the world. As part of our team, you’ll have the chance to work on projects that matter. Each is a unique opportunity to engage your curiosity and creative problem-solving skills as you help clients reimagine what’s possible and bring new solutions to market. In the process, you’ll have the privilege of working on some of the most cutting-edge and impactful solutions shaping the world today. Balance and flexibility. We believe in the importance of balance and flexibility. With many functional career areas, roles, and work arrangements, you can explore ways of achieving the perfect balance between your work and life. Your life extends beyond the office, and we always do our best to help you integrate and balance the best of work and life, having fun along the way! High-trust organization. We are a high-trust organization where integrity is key. By joining GlobalLogic, you’re placing your trust in a safe, reliable, and ethical global company. Integrity and trust are a cornerstone of our value proposition to our employees and clients. You will find truthfulness, candor, and integrity in everything we do. About GlobalLogic GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner to the world’s largest and most forward-thinking companies. Since 2000, we’ve been at the forefront of the digital revolution – helping create some of the most innovative and widely used digital products and experiences. 
Today we continue to collaborate with clients in transforming businesses and redefining industries through intelligent products, platforms, and services.
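The Kafka producer/consumer architecture named in the Java Developer posting above relies on key-based partitioning: records with the same key always land on the same partition, which preserves per-key ordering. A toy illustration follows; Kafka's real default partitioner hashes the serialized key with murmur2, so the `hashCode`-based `ToyPartitioner` here is an invented stand-in, not Kafka's implementation.

```java
public class ToyPartitioner {
    // Same key -> same partition, so per-key ordering is preserved.
    static int partitionFor(String key, int numPartitions) {
        // Mask the sign bit so the modulo result is always non-negative.
        return (key.hashCode() & 0x7fffffff) % numPartitions;
    }

    public static void main(String[] args) {
        int partitions = 6;
        for (String key : new String[] {"order-1", "order-2", "order-1"}) {
            System.out.println(key + " -> partition " + partitionFor(key, partitions));
        }
        // Both "order-1" records map to the same partition, whatever it is.
    }
}
```

This is why choosing a good record key matters in the stream-processing work these roles describe: the key determines both ordering guarantees and how evenly load spreads across partitions.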

Posted 5 days ago

Apply

2.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Linkedin logo

Description Looking for a highly skilled Microservices Engineer (.NET) to design, develop, and maintain scalable microservices-based architectures. You will play a crucial role in building high-performance, distributed, and cloud-native applications using .NET Core, Docker, Kubernetes, and modern DevOps practices. Requirements About the Role: Looking for a highly skilled Microservices Engineer (.NET) to design, develop, and maintain scalable microservices-based architectures. You will play a crucial role in building high-performance, distributed, and cloud-native applications using .NET Core, Docker, Kubernetes, and modern DevOps practices. Required Skills & Experience: 2-4 years of .NET Core/.NET 6+ development experience. Experience with microservices architecture & event-driven systems. Strong expertise in SQL/NoSQL databases (SQL Server, MongoDB, Redis, etc.). Hands-on experience with Docker & Kubernetes in cloud environments. Proficiency in RESTful APIs, gRPC, and API Gateways. Experience with message brokers (Kafka, RabbitMQ, Azure Service Bus). Strong understanding of authentication & authorization (OAuth2, JWT, Identity Server). Familiarity with DevOps, CI/CD tools, and Infrastructure as Code (Terraform, Helm). Preferred Skills: Experience with GraphQL. Familiarity with Domain-Driven Design (DDD) & Clean Architecture. Knowledge of serverless computing (AWS Lambda/Azure Functions). Exposure to monitoring & logging tools (Prometheus, ELK, Grafana). Job responsibilities Key Responsibilities: Design & Development: Develop scalable, secure, and high-performance microservices using .NET Core/.NET 6+. Build RESTful APIs and integrate with various third-party services. Implement event-driven architectures using Kafka, RabbitMQ, or Azure Service Bus. Cloud & DevOps: Deploy microservices on Azure/AWS/GCP using Docker & Kubernetes. Implement CI/CD pipelines with GitHub Actions, Azure DevOps, or Jenkins. 
Performance & Security: Ensure high availability and fault tolerance of microservices. Apply best security practices (OAuth, JWT, API Gateway security). Testing & Maintenance: Write unit, integration, and performance tests (xUnit, NUnit, Postman). Optimize services for latency, performance, and scalability. Collaboration & Documentation: Work closely with frontend, DevOps, and data engineers. Document microservice design and APIs using Swagger/OpenAPI. What we offer Culture of caring. At GlobalLogic, we prioritize a culture of caring. Across every region and department, at every level, we consistently put people first. From day one, you’ll experience an inclusive culture of acceptance and belonging, where you’ll have the chance to build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders. Learning and development. We are committed to your continuous learning and development. You’ll learn and grow daily in an environment with many opportunities to try new things, sharpen your skills, and advance your career at GlobalLogic. With our Career Navigator tool as just one example, GlobalLogic offers a rich array of programs, training curricula, and hands-on opportunities to grow personally and professionally. Interesting & meaningful work. GlobalLogic is known for engineering impact for and with clients around the world. As part of our team, you’ll have the chance to work on projects that matter. Each is a unique opportunity to engage your curiosity and creative problem-solving skills as you help clients reimagine what’s possible and bring new solutions to market. In the process, you’ll have the privilege of working on some of the most cutting-edge and impactful solutions shaping the world today. Balance and flexibility. We believe in the importance of balance and flexibility. With many functional career areas, roles, and work arrangements, you can explore ways of achieving the perfect balance between your work and life. 
Your life extends beyond the office, and we always do our best to help you integrate and balance the best of work and life, having fun along the way! High-trust organization. We are a high-trust organization where integrity is key. By joining GlobalLogic, you’re placing your trust in a safe, reliable, and ethical global company. Integrity and trust are a cornerstone of our value proposition to our employees and clients. You will find truthfulness, candor, and integrity in everything we do. About GlobalLogic GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner to the world’s largest and most forward-thinking companies. Since 2000, we’ve been at the forefront of the digital revolution – helping create some of the most innovative and widely used digital products and experiences. Today we continue to collaborate with clients in transforming businesses and redefining industries through intelligent products, platforms, and services.

Posted 5 days ago

Apply

7.0 years

0 Lacs

India

On-site

Linkedin logo

Company Description 👋🏼 We're Nagarro. We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale across all devices and digital mediums, and our people exist everywhere in the world (18000 experts across 38 countries, to be exact). Our work culture is dynamic and non-hierarchical. We are looking for great new colleagues. That is where you come in! Job Description REQUIREMENTS: Total experience of 7+ years Extensive experience in back-end development utilizing Java 8 or higher, Spring Framework (Core/Boot/MVC), Hibernate/JPA, and Microservices Architecture. Experience with messaging systems like Kafka. Hands-on experience with REST APIs, caching systems (e.g., Redis), etc. Proficiency in Service-Oriented Architecture (SOA) and Web Services (Apache CXF, JAX-WS, JAX-RS, SOAP, REST). Hands-on experience with multithreading and cloud development. Strong working experience in Data Structures and Algorithms, Unit Testing, and Object-Oriented Programming (OOP) principles. Hands-on experience with relational databases such as SQL Server, Oracle, MySQL, and PostgreSQL. Experience with DevOps tools and technologies such as Ansible, Docker, Kubernetes, Puppet, Jenkins, and Chef. Proficiency in build automation tools like Maven, Ant, and Gradle. Hands-on experience with cloud technologies such as AWS/Azure. Strong understanding of UML and design patterns. Ability to simplify solutions, optimize processes, and efficiently resolve escalated issues. Strong problem-solving skills and a passion for continuous improvement. Excellent communication skills and the ability to collaborate effectively with cross-functional teams. 
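Caching requirements like the Redis item in the list above usually come down to an eviction policy. Below is a minimal in-process LRU sketch built on `LinkedHashMap`'s access-order mode; Redis is a networked store with much richer semantics, so this only illustrates the least-recently-used idea, and `LruCache` is a name invented for the sketch.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class LruCache<K, V> extends LinkedHashMap<K, V> {
    private final int capacity;

    public LruCache(int capacity) {
        // accessOrder = true: iteration order follows recency of access.
        super(16, 0.75f, true);
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        // Evict the least-recently-used entry once capacity is exceeded.
        return size() > capacity;
    }

    public static void main(String[] args) {
        LruCache<String, Integer> cache = new LruCache<>(2);
        cache.put("a", 1);
        cache.put("b", 2);
        cache.get("a");    // touch "a" so "b" becomes least recently used
        cache.put("c", 3); // exceeds capacity, evicting "b"
        System.out.println(cache.keySet()); // [a, c]
    }
}
```

Production code would typically reach for Redis or a library like Caffeine rather than hand-rolling this, but the eviction behaviour being configured is the same.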
RESPONSIBILITIES:
• Writing and reviewing great-quality code.
• Understanding functional requirements thoroughly and analyzing the client's needs in the context of the project.
• Envisioning the overall solution for defined functional and non-functional requirements, and defining the technologies, patterns, and frameworks to realize it.
• Determining and implementing design methodologies and toolsets.
• Enabling application development by coordinating requirements, schedules, and activities.
• Leading/supporting UAT and production rollouts.
• Creating, understanding, and validating the WBS and estimated effort for a given module/task, and being able to justify it.
• Addressing issues promptly and responding positively to setbacks and challenges with a mindset of continuous improvement.
• Giving constructive feedback to team members and setting clear expectations.
• Helping the team troubleshoot and resolve complex bugs.
• Coming up with solutions to any issue raised during code/design review and being able to justify the decision taken.
• Carrying out POCs to make sure the suggested design/technologies meet the requirements.
Qualifications
Bachelor's or master's degree in Computer Science, Information Technology, or a related field.
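The caching requirement above (a Redis-style cache in front of REST services) can be illustrated with a small in-process sketch. This is a hypothetical toy, not part of any framework the posting names; `TTLCache` and `loader` are invented names, and a production system would call a Redis client instead of a dict:

```python
import time

class TTLCache:
    """Toy read-through cache with per-entry expiry, standing in for the
    Redis-style caching layer mentioned in the requirements."""

    def __init__(self, ttl_seconds=60.0):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (value, expires_at)

    def get(self, key, loader):
        """Return the cached value, or call `loader` on a miss or expiry."""
        now = time.monotonic()
        hit = self._store.get(key)
        if hit is not None and hit[1] > now:
            return hit[0]                       # cache hit
        value = loader(key)                     # miss: fetch from the source of truth
        self._store[key] = (value, now + self.ttl)
        return value
```

The read-through shape is the same whether the backing store is an in-process dict, Redis, or Memcached; only the client calls change.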

Posted 5 days ago


8.0 years

0 Lacs

India

Remote


Looking for: Data/ML Engineer
Job Type: Contract
Location: Remote (India)
Work Time: Remote; time coverage up to 12 AM IST
Job Description
Required Skills & Experience:
• Hands-on, code-first mindset with a deep understanding of the technologies/skillset and an ability to see the larger picture.
• Sound knowledge of architectural patterns, best practices, and non-functional requirements.
• Overall, 8-10 years of experience in heavy-volume data processing, data platform, data lake, big data, data warehouse, or equivalent.
• 5+ years of experience with strong proficiency in Python and Spark (must-have).
• 3+ years of hands-on experience in ETL workflows using Spark and Python.
• 4+ years of experience with large-scale data loads, feature extraction, and data processing pipelines in different modes: near real-time, batch, and real-time.
• Solid understanding of data quality and data accuracy concepts and practices.
• 2+ years of solid experience in building and deploying ML models in a production setup. Ability to quickly adapt and take care of data preprocessing, feature engineering, and model engineering as needed.
• 2+ years of experience working with one or more Python deep learning libraries such as PyTorch, TensorFlow, Keras, or equivalent.
• Prior experience working with LLMs and transformers. Must be able to work through all phases of model development as needed.
• Experience integrating with various data stores, including: SQL/NoSQL databases; in-memory stores like Redis; data lakes (e.g., Delta Lake).
• Experience with Kafka streams, producers, and consumers.
• Required: Experience with Databricks or a similar data lake / data platform.
• Required: Java and Spring Boot experience with respect to data processing, both near real-time and batch-based.
• Familiarity with notebook-based environments such as Jupyter Notebook.
• Adaptability: Must be open to learning new technologies and approaches.
• Initiative: Ability to take ownership of tasks, learn independently, and innovate.
• With the technology landscape changing rapidly, the ability and willingness to learn new technologies as needed and produce results on the job.
Preferred Skills:
• Ability to pivot from conventional approaches and develop creative solutions.
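As a rough illustration of the batch ETL and data-quality work described above, here is a minimal, hypothetical transform step in plain Python. The field names (`user_id`, `amount`) are invented for the example, and at the scale this posting describes the same logic would live in a Spark job rather than a plain loop:

```python
def clean_records(rows):
    """Validate, normalize, and derive a simple feature from raw rows,
    mirroring a single ETL / data-quality stage."""
    out = []
    for row in rows:
        # data-quality gate: drop rows missing required fields
        if not row.get("user_id") or row.get("amount") is None:
            continue
        amount = round(float(row["amount"]), 2)
        out.append({
            "user_id": str(row["user_id"]).strip(),
            "amount": amount,
            "is_large": amount >= 100.0,  # derived feature for downstream models
        })
    return out
```

The same validate-normalize-derive shape applies whether the stage runs per-row in a loop, per-partition in Spark, or per-message on a Kafka consumer.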

Posted 5 days ago


5.0 - 9.0 years

0 Lacs

Delhi, India

On-site


What is Findem:
Findem is the only talent data platform that combines 3D data with AI. It automates and consolidates top-of-funnel activities across your entire talent ecosystem, bringing together sourcing, CRM, and analytics into one place. Only 3D data connects people and company data over time, making an individual's entire career instantly accessible in a single click, removing the guesswork, and unlocking insights about the market and your competition no one else can. Powered by 3D data, Findem's automated workflows across the talent lifecycle are the ultimate competitive advantage. Enabling talent teams to deliver continuous pipelines of top, diverse candidates while creating better talent experiences, Findem transforms the way companies plan, hire, and manage talent. Learn more at www.findem.ai
Experience: 5-9 years
We are looking for an experienced Big Data Engineer who will be responsible for building, deploying, and managing various data pipelines, data lakes, and big data processing solutions using big data and ETL technologies.
Location: Delhi, India (hybrid; 3 days onsite)
Responsibilities
• Build data pipelines, big data processing solutions, and data lake infrastructure using various big data and ETL technologies.
• Assemble and process large, complex data sets that meet functional and non-functional business requirements.
• ETL from a wide variety of sources like MongoDB, S3, server-to-server, Kafka, etc., and processing using SQL and big data technologies.
• Build analytical tools to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
• Build interactive and ad-hoc query self-serve tools for analytics use cases.
• Build data models and data schemas from a performance, scalability, and functional-requirement perspective.
• Build processes supporting data transformation, metadata, dependency, and workflow management.
• Research, experiment with, and prototype new tools/technologies and make them successful.
Skill Requirements
• Must have: strong in Python/Scala.
• Must have experience in big data technologies like Spark, Hadoop, Athena/Presto, Redshift, Kafka, etc.
• Experience with various file formats like Parquet, JSON, Avro, ORC, etc.
• Experience with workflow management tools like Airflow.
• Experience with batch processing, streaming, and message queues.
• Any of the visualization tools like Redash, Tableau, Kibana, etc.
• Experience working with structured and unstructured data sets.
• Strong problem-solving skills.
Good to have
• Exposure to NoSQL like MongoDB.
• Exposure to cloud platforms like AWS, GCP, etc.
• Exposure to microservices architecture.
• Exposure to machine learning techniques.
The role is full-time and comes with full benefits. We are globally headquartered in the San Francisco Bay Area with our India headquarters in Bengaluru.
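A toy version of the kind of aggregation these pipelines compute, shown in plain Python for brevity; a real pipeline would express this as a Spark groupBy over Parquet/Avro inputs, and the field names here are invented for the sketch:

```python
from collections import defaultdict

def totals_by_key(events, key_fields=("day", "country")):
    """Sum `amount` per composite key: the shape of a groupBy/agg stage
    in a batch or streaming pipeline."""
    totals = defaultdict(float)
    for event in events:
        key = tuple(event[f] for f in key_fields)
        totals[key] += event["amount"]
    return dict(totals)
```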
Equal Opportunity As an equal opportunity employer, we do not discriminate on the basis of race, color, religion, national origin, age, sex (including pregnancy), physical or mental disability, medical condition, genetic information, gender identity or expression, sexual orientation, marital status, protected veteran status or any other legally-protected characteristic.

Posted 5 days ago


0 years

0 Lacs

Greater Kolkata Area

On-site


Join our Team
About this opportunity:
We are looking for an experienced Java Developer or Architect with strong technical expertise to design and lead the development of scalable, high-performance Java applications. The ideal candidate should have an in-depth understanding of Java/J2EE technologies, design patterns, microservice architecture, Docker and Kubernetes, and integration frameworks. This role requires design skills, excellent problem-solving skills, and the ability to collaborate with cross-functional teams, including DevOps and front-end developers.
What you will do:
• Architect, design, and implement back-end solutions using Java/J2EE, Spring MVC, Spring Boot, and related frameworks.
• Design, develop, and maintain scalable Java components using REST- or SOAP-based web services.
• Design and develop enterprise solutions with messaging or streaming frameworks like ActiveMQ, HornetQ, and Kafka.
• Work with integration frameworks like Apache Camel/JBoss Fuse/Mule ESB/EAI/Spring Integration.
• Make effective use of caching technologies (like Hazelcast/Redis/Infinispan/EHCache/Memcached) in the application to handle large data sets.
• Deploy the application in a middleware or app server (like JBoss/WebLogic/Tomcat).
• Collaborate with the DevOps team to manage builds and CI/CD pipelines using Jira, GitLab, Sonar, and other tools.
The skills you bring:
• Strong expertise in Java/J2EE, Spring Boot, and microservices.
• Good understanding of core Java concepts (like the Collections Framework and object-oriented design).
• Experience working with multithreading concepts (like thread pools, ExecutorService, FutureTask, the concurrent API, and CountDownLatch).
• Detailed working exposure to Java 8 with the Stream API, lambdas, interfaces, and functional interfaces.
• Proficiency in Java web application development using Spring MVC and Spring Boot.
• Good knowledge of data access frameworks using ORM (Hibernate and JPA).
• Familiarity with database concepts, with knowledge of RDBMS/SQL.
• Good understanding of monolithic and microservice architecture.
What happens once you apply?
Click Here to find all you need to know about what our typical hiring process looks like.
We encourage you to consider applying to jobs where you might not meet all the criteria. We recognize that we all have transferable skills, and we can support you with the skills that you need to develop.
Encouraging a diverse and inclusive organization is core to our values at Ericsson; that's why we champion it in everything we do. We truly believe that by collaborating with people with different experiences we drive innovation, which is essential for our future growth. We encourage people from all backgrounds to apply and realize their full potential as part of our Ericsson team. Ericsson is proud to be an Equal Opportunity Employer.
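The multithreading and messaging concepts listed above (thread pools, bounded queues, poison-pill shutdown) are language-agnostic; this hypothetical sketch shows them in Python, with ThreadPoolExecutor standing in for Java's ExecutorService and an in-process queue.Queue standing in for a broker destination such as an ActiveMQ or Kafka topic:

```python
import queue
import threading
from concurrent.futures import ThreadPoolExecutor

def fan_out(messages, workers=4):
    """Feed a bounded queue into a pool of consumer threads and collect
    the processed results (here, simply uppercased strings)."""
    q = queue.Queue(maxsize=64)          # bounded, so producers back-pressure
    results, lock = [], threading.Lock()

    def consume():
        while True:
            msg = q.get()
            if msg is None:              # poison pill: shut this worker down
                q.task_done()
                return
            with lock:
                results.append(msg.upper())
            q.task_done()

    with ThreadPoolExecutor(max_workers=workers) as pool:
        for _ in range(workers):
            pool.submit(consume)
        for m in messages:
            q.put(m)
        for _ in range(workers):
            q.put(None)                  # one pill per worker
        q.join()                         # wait until every message is processed
    return sorted(results)
```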

Posted 5 days ago


0 years

0 Lacs

Hyderabad, Telangana, India

On-site


As a Senior/Expert Quality Engineer, you must be able to provide among these:
• Ability to work in an autonomous, self-responsible, and self-organised way.
• Strong experience working with modern test automation frameworks and providing an end-to-end testing strategy to integrate third-party platforms using tools (e.g., Selenium, JUnit, TestNG, Cucumber, etc.).
• Strong experience in different testing practices (from unit to load to endurance to cross-platform), specifically integrated within CI/CD.
• Experience in continuous testing practices in production by leveraging bots and virtual users.
• Experience working with CI/CD pipelines and monitoring tools (e.g., Jenkins, Kibana, Grafana, etc.).
• Strong experience with API testing, the REST protocol, and microservice architecture concepts.
• Knowledge and experience with testing event-driven architectures (EDA) using AWS SQS and Kafka.
• Knowledge of SQL for relational databases and object-relational mapping tools (e.g., Hibernate, JPA).
As our Expert Quality Engineer, you embrace the following responsibilities:
• Take ownership of and responsibility for the design and development of all aspects of end-to-end testing.
• Work on acceptance criteria and test scenarios with the Product Owner and development team, ensuring integrations of third-party platforms (Tealium and Braze) are smooth and functional as per requirements.
• Design, execute, and maintain test scenarios and automation capabilities for all test levels and types (e.g., automated, regression, exploratory, etc.).
• Create and optimize test frameworks and integrate them into deployment pipelines.
• Participate in the code review process for both production and test code to ensure all critical cases are covered.
• Monitor test runs, application errors, data integrity between these platforms, and performance.
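As a sketch of the API-level checks described above, here is a minimal, hypothetical contract assertion against a stubbed REST handler. The endpoint shape and field names are invented for the example; a real suite would run REST Assured, requests, or similar against a live service:

```python
import json

def get_order(order_id, orders_db):
    """Stub handler behaving like GET /orders/{id}: returns
    (status_code, json_body) the way an HTTP client would see it."""
    if order_id not in orders_db:
        return 404, json.dumps({"error": "not found"})
    return 200, json.dumps(orders_db[order_id])

def assert_contract(status, body, expected_status, required_fields=()):
    """Minimal contract check of the kind an automated API suite performs:
    status code plus presence of required response fields."""
    assert status == expected_status, f"expected {expected_status}, got {status}"
    payload = json.loads(body)
    for field in required_fields:
        assert field in payload, f"missing field: {field}"
    return payload
```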

Posted 5 days ago


0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Transform data into a format that can be easily analyzed by developing, maintaining, and testing infrastructures for data generation. You will work closely with data scientists and be largely in charge of architecting solutions that enable them to do their jobs. The role involves creating data pipelines and integrating, transforming, and enabling data for wider enterprise use.
Job Description
Duties for this role include but are not limited to: supporting the design, build, test, and maintenance of data pipelines at big data scale. Assists with updating data from multiple data sources. Works on batch processing of collected data and matches its format to the stored data, making sure that the data is ready to be processed and analyzed. Assists with keeping the ecosystem and the pipeline optimized and efficient, troubleshooting standard performance and data-related problems, and providing L3 support. Implements parsers, validators, transformers, and correlators to reformat, update, and enhance the data. Provides recommendations for highly complex problems. Provides guidance to those in less senior positions.
Additional Job Description
Data Engineers play a pivotal role within Dataworks, focused on creating and driving engineering innovation and facilitating the delivery of key business initiatives. Acting as a “universal translator” between IT, business, software engineers, and data scientists, data engineers collaborate across multi-disciplinary teams to deliver value. Data Engineers will work on those aspects of the Dataworks platform that govern the ingestion, transformation, and pipelining of data assets, both to end users within FedEx and into data products and services that may be externally facing. Day-to-day, they will be deeply involved in code reviews and large-scale deployments.
Essential Job Duties & Responsibilities Understanding in depth both the business and technical problems Dataworks aims to solve Building tools, platforms and pipelines to enable teams to clearly and cleanly analyze data, build models and drive decisions Scaling up from “laptop-scale” to “cluster scale” problems, in terms of both infrastructure and problem structure and technique Collaborating across teams to drive the generation of data driven operational insights that translate to high value optimized solutions. Delivering tangible value very rapidly, collaborating with diverse teams of varying backgrounds and disciplines Codifying best practices for future reuse in the form of accessible, reusable patterns, templates, and code bases Interacting with senior technologists from the broader enterprise and outside of FedEx (partner ecosystems and customers) to create synergies and ensure smooth deployments to downstream operational systems Skill/Knowledge Considered a Plus Technical background in computer science, software engineering, database systems, distributed systems Fluency with distributed and cloud environments and a deep understanding of optimizing computational considerations with theoretical properties Experience in building robust cloud-based data engineering and curation solutions to create data products useful for numerous applications Detailed knowledge of the Microsoft Azure tooling for large-scale data engineering efforts and deployments is highly preferred. Experience with any combination of the following azure tools: Azure Databricks, Azure Data Factory, Azure SQL D, Azure Synapse Analytics Developing and operationalizing capabilities and solutions including under near real-time high-volume streaming conditions. Hands-on development skills with the ability to work at the code level and help debug hard to resolve issues. 
A compelling track record of designing and deploying large scale technical solutions, which deliver tangible, ongoing value Direct experience having built and deployed robust, complex production systems that implement modern, data processing methods at scale Ability to context-switch, to provide support to dispersed teams which may need an “expert hacker” to unblock an especially challenging technical obstacle, and to work through problems as they are still being defined Demonstrated ability to deliver technical projects with a team, often working under tight time constraints to deliver value An ‘engineering’ mindset, willing to make rapid, pragmatic decisions to improve performance, accelerate progress or magnify impact Comfort with working with distributed teams on code-based deliverables, using version control systems and code reviews Ability to conduct data analysis, investigation, and lineage studies to document and enhance data quality and access Use of agile and devops practices for project and software management including continuous integration and continuous delivery Demonstrated expertise working with some of the following common languages and tools: Spark (Scala and PySpark), Kafka and other high-volume data tools SQL and NoSQL storage tools, such as MySQL, Postgres, MongoDB/CosmosDB Java, Python data tools Azure DevOps experience to track work, develop using git-integrated version control patterns, and build and utilize CI/CD pipelines Working knowledge and experience implementing data architecture patterns to support varying business needs Experience with different data types (json, xml, parquet, avro, unstructured) for both batch and streaming ingestions Use of Azure Kubernetes Services, Eventhubs, or other related technologies to implement streaming ingestions Experience developing and implementing alerting and monitoring frameworks Working knowledge of Infrastructure as Code (IaC) through Terraform to create and deploy resources Implementation 
experience across different data stores, messaging systems, and data processing engines Data integration through APIs and/or REST service PowerPlatform (PowerBI, PowerApp, PowerAutomate) development experience a plus Minimum Qualifications Data Engineer I: Bachelor’s Degree in Information Systems, Computer Science or a quantitative discipline such as Mathematics or Engineering and/or One (1) year equivalent formal training or work experience. Basic knowledge in data engineering and machine learning frameworks including design, development and implementation of highly complex systems and data pipelines. Basic knowledge in Information Systems including design, development and implementation of large batch or online transaction-based systems. Experience as a junior member of multi-functional project teams. Strong oral and written communication skills. A related advanced degree may offset the related experience requirements. Data Engineer II Bachelor's Degree in Computer Science, Information Systems, a related quantitative field such as Engineering or Mathematics or equivalent formal training or work experience. Two (2) years equivalent work experience in measurement and analysis, quantitative business problem solving, simulation development and/or predictive analytics. Strong knowledge in data engineering and machine learning frameworks including design, development and implementation of highly complex systems and data pipelines. Strong knowledge in Information Systems including design, development and implementation of large batch or online transaction-based systems. Strong understanding of the transportation industry, competitors, and evolving technologies. Experience as a member of multi-functional project teams. Strong oral and written communication skills. A related advanced degree may offset the related experience requirements. 
Data Engineer III Bachelor’s Degree in Information Systems, Computer Science or a quantitative discipline such as Mathematics or Engineering and/or equivalent formal training or work experience. Three to Four (3 - 4) years equivalent work experience in measurement and analysis, quantitative business problem solving, simulation development and/or predictive analytics. Extensive knowledge in data engineering and machine learning frameworks including design, development and implementation of highly complex systems and data pipelines. Extensive knowledge in Information Systems including design, development and implementation of large batch or online transaction-based systems. Strong understanding of the transportation industry, competitors, and evolving technologies. Experience providing leadership in a general planning or consulting setting. Experience as a senior member of multi-functional project teams. Strong oral and written communication skills. A related advanced degree may offset the related experience requirements. Data Engineer Lead Bachelor’s Degree in Information Systems, Computer Science, or a quantitative discipline such as Mathematics or Engineering and/or equivalent formal training or work experience. Five to Seven (5 - 7) years equivalent work experience in measurement and analysis, quantitative business problem solving, simulation development and/or predictive analytics. Extensive knowledge in data engineering and machine learning frameworks including design, development and implementation of highly complex systems and data pipelines. Extensive knowledge in Information Systems including design, development and implementation of large batch or online transaction-based systems. Strong understanding of the transportation industry, competitors, and evolving technologies. Experience providing leadership in a general planning or consulting setting. Experience as a leader or a senior member of multi-function project teams. 
Strong oral and written communication skills. A related advanced degree may offset the related experience requirements. Analytical Skills, Accuracy & Attention to Detail, Planning & Organizing Skills, Influencing & Persuasion Skills, Presentation Skills FedEx was built on a philosophy that puts people first, one we take seriously. We are an equal opportunity/affirmative action employer and we are committed to a diverse, equitable, and inclusive workforce in which we enforce fair treatment, and provide growth opportunities for everyone. All qualified applicants will receive consideration for employment regardless of age, race, color, national origin, genetics, religion, gender, marital status, pregnancy (including childbirth or a related medical condition), physical or mental disability, or any other characteristic protected by applicable laws, regulations, and ordinances. Our Company FedEx is one of the world's largest express transportation companies and has consistently been selected as one of the top 10 World’s Most Admired Companies by "Fortune" magazine. Every day FedEx delivers for its customers with transportation and business solutions, serving more than 220 countries and territories around the globe. We can serve this global network due to our outstanding team of FedEx team members, who are tasked with making every FedEx experience outstanding. Our Philosophy The People-Service-Profit philosophy (P-S-P) describes the principles that govern every FedEx decision, policy, or activity. FedEx takes care of our people; they, in turn, deliver the impeccable service demanded by our customers, who reward us with the profitability necessary to secure our future. The essential element in making the People-Service-Profit philosophy such a positive force for the company is where we close the circle, and return these profits back into the business, and invest back in our people. Our success in the industry is attributed to our people. 
Through our P-S-P philosophy, we have a work environment that encourages team members to be innovative in delivering the highest possible quality of service to our customers. We care for their well-being, and value their contributions to the company. Our Culture Our culture is important for many reasons, and we intentionally bring it to life through our behaviors, actions, and activities in every part of the world. The FedEx culture and values have been a cornerstone of our success and growth since we began in the early 1970’s. While other companies can copy our systems, infrastructure, and processes, our culture makes us unique and is often a differentiating factor as we compete and grow in today’s global marketplace.

Posted 5 days ago


12.0 - 15.0 years

35 - 50 Lacs

Hyderabad

Work from Office


Skill: Java, Spark, Kafka
Experience: 10 to 16 years
Location: Hyderabad
As a Data Engineer, you will:
• Support the design and rollout of the data architecture and infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources.
• Identify data sources, design and implement data schemas/models, and integrate data that meets the requirements of the business stakeholders.
• Play an active role in the end-to-end delivery of AI solutions, from ideation and feasibility assessment to data preparation and industrialization.
• Work with business, IT, and data stakeholders to support with data-related technical issues and their data infrastructure needs, as well as to build the most flexible and scalable data platform.
• With a strong focus on DataOps, design, develop, and deploy scalable batch and/or real-time data pipelines.
• Design, document, test, and deploy ETL/ELT processes.
• Find the right trade-offs between the performance, reliability, scalability, and cost of the data pipelines you implement.
• Monitor data processing efficiency and propose solutions for improvements.
• Have the discipline to create and maintain comprehensive project documentation.
• Build and share knowledge with colleagues and coach junior profiles.

Posted 5 days ago


0 years

0 Lacs

Pune, Maharashtra, India

On-site


Job Description
Some careers shine brighter than others. If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further.
HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.
We are currently seeking an experienced professional to join our team in the role of a Software Engineer.
In this role, you will:
• Proactively communicate with product owners, developers, QA, and database engineers to ensure requirements are implemented correctly.
• Bring experience with stress/performance testing and knowledge of security testing and scalability testing.
• Bring a good understanding of web technologies and of emerging trends and technologies, i.e. knowledge of MQ, Power Apps, Kafka, integration patterns, MongoDB, Maven, Jenkins, microservice architecture, BPM (Pega or Appian), and WebSphere Application Server.
Requirements
To be successful in this role, you should meet the following requirements:
• Expert in Selenium, WebDriver, TestNG, REST Assured, Cucumber BDD, SOAP UI, API testing, and Karate.
• Expert in DevOps tooling like Git, Jenkins, Nexus, and G3.
• Experience in SQL/MongoDB.
• Experience in JMeter.
• Experience in JIRA and Zephyr.
• Preferred to have a technical understanding of Java, APIs, MuleSoft, microservices, etc.
• Hands-on experience implementing a BDD framework.
• Preferred to have experience with performance testing and stress testing tools.
• Experience in Agile methodologies and preparing test strategies for medium- to high-range projects.
• Familiarity with SQL skills and database operations.
• Strong English communication and analytical skills are a must for working in a complex global team environment.
• Strong risk management skills.
• Excellent communication skills.
• Project management tool: Clarity. Project status reporting (from a testing perspective).
The successful candidate will also meet the following requirements:
• Strong technical aptitude.
• Willing to work in shifts based on project need.
• Maintain a good rapport with stakeholders and delivery teams.
• Knowledge of the Credit Risk domain would be preferred.
• Nice to have exposure to Power Apps.
• Minimum of 3 years' experience.
You’ll achieve more when you join HSBC. www.hsbc.com/careers
HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.
Issued by – HSBC Software Development India

Posted 5 days ago


3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Description
We have an exciting and rewarding opportunity for you to take your software engineering career to the next level. As a Software Engineer III at JPMorgan Chase within Consumer and Community Banking, you serve as a seasoned member of an agile team to design and deliver trusted market-leading technology products in a secure, stable, and scalable way. You are responsible for carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm’s business objectives.
Job Responsibilities
• Executes software solutions, design, development, and technical troubleshooting with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems.
• Creates secure and high-quality production code and maintains algorithms that run synchronously with appropriate systems.
• Produces architecture and design artifacts for complex applications while being accountable for ensuring design constraints are met by software code development.
• Gathers, analyzes, synthesizes, and develops visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems.
• Proactively identifies hidden problems and patterns in data and uses these insights to drive improvements to coding hygiene and system architecture.
• Contributes to software engineering communities of practice and events that explore new and emerging technologies.
• Adds to a team culture of diversity, equity, inclusion, and respect.
Required Qualifications, Capabilities, And Skills
• Formal training or certification on Java/J2EE concepts, with 3+ years of applied experience.
• Hands-on practical experience in system design, application development, testing, and operational stability.
• Proficient in coding in one or more languages.
• Hands-on UI experience with React JS.
• Familiarity with Spring frameworks, including Spring Boot, Spring Batch, and Spring JPA.
• Knowledge of AWS services such as ECS, EKS, EC2, S3, Lambda, and Redis cache.
• Expertise in Kafka and MQ event programming.
• Experience with databases like Oracle, AWS Aurora PostgreSQL, Cassandra, and DynamoDB.
• Understanding of Kubernetes.
• Skills in microservices development.
• Testing proficiency with tools such as the Cucumber framework (functional testing), PACT/SpringContractTest (contract testing), JUnit v5.x, JMeter (performance testing), and Gremlin (resiliency testing).
Preferred Qualifications, Capabilities, And Skills
• Familiarity with modern front-end technologies.
• Exposure to cloud technologies.

Posted 5 days ago


4.0 years

0 Lacs

Kochi, Kerala, India

On-site


Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.
Your Role And Responsibilities
As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the AWS Cloud Data Platform.
Responsibilities:
• Experienced in building data pipelines to ingest, process, and transform data from files, streams, and databases.
• Process the data with Spark, Python, PySpark, Scala, and Hive, HBase, or other NoSQL databases on cloud data platforms (AWS) or HDFS.
• Experienced in developing efficient software code for multiple use cases leveraging the Spark framework, Python or Scala, and big data technologies for various use cases built on the platform.
• Experience in developing streaming pipelines.
• Experience working with Hadoop/AWS ecosystem components to implement scalable solutions to meet ever-increasing data volumes, using big data/cloud technologies such as Apache Spark, Kafka, cloud computing, etc.
Preferred Education
Master's Degree
Required Technical And Professional Expertise
• Minimum 4+ years of experience in big data technologies with extensive data engineering experience in Spark with Python or Scala.
• Minimum 3 years of experience on cloud data platforms on AWS.
• Experience in AWS EMR/AWS Glue/Databricks, AWS Redshift, and DynamoDB.
• Good to excellent SQL skills.
• Exposure to streaming solutions and message brokers like Kafka.
Preferred Technical And Professional Experience
• Certification in AWS and Databricks, or Cloudera-certified Spark developers.
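A toy stateful transform of the kind the streaming pipelines above implement, sketched in plain Python in place of Spark Structured Streaming or Kafka consumers; the running mean here stands in for any per-event incremental aggregate:

```python
def running_average(stream):
    """Emit the running mean after each event: incremental state is
    maintained per element instead of recomputing over the full batch."""
    total, count = 0.0, 0
    for value in stream:
        total += value
        count += 1
        yield total / count
```

Because the generator holds only `total` and `count`, it processes an unbounded stream in constant memory, which is the core property streaming jobs rely on.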

Posted 5 days ago


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies