15.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Position Description JOB DESCRIPTION We are seeking aspirational candidates who are interested in a career in Consulting to join our niche Banking Domain and Practice. The position will support Territory Heads, Delivery Managers, Portfolio and Project Managers and teams of talented, professional business and technology consultants in the delivery of business focused solutions for our clients using Oracle applications, tools and technology. Utilizing sound product skills and experience, the successful applicant will work on value consulting, solutioning and transforming and addressing complex business requirements into sound and optimal solutions to achieve successful outcomes for our customers, partners and associates and drive towards client and customer reference ability. Longer term you will grow, with the help of extensive training and experience of the team around you, into a seasoned employee and become a Subject Matter experts in Business domain and or Solution Architecture with full accountability and responsibility of the delivered solution for your own projects, programs and territory and larger region and organization. Job Responsibilities Partnering with and acting as a trusted advisor to stakeholders in both Consulting Sales and Delivery to assist in defining and delivering high quality enterprise capable solutions Working closely with stakeholders to develop practical roadmaps to move the enterprise towards the future state vision, while taking into account business, technical and delivery constraints Analyzing stakeholder requirements, current state architecture, and gaps to create a future state architecture vision for one or more parts of the enterprise with a focus on reduced complexity, cost efficiencies, reuse, convergence, reduced risk and/or improved business capabilities Participating in defining and operating the architecture governance process to ensure change initiatives align to the vision and roadmaps Working closely with Domain Architects across key initiatives and projects to apply architecture principles and standards, and develop reference architectures and design patterns Communicating the principles, standards, vision and roadmaps to stakeholders and proactively addressing any questions / concerns identified Providing thought leadership on architectural or other topics, developing a forward looking view of current and emerging technologies and their impact on the Enterprise Architecture Embedding Platform Thinking in everything Owning and enhancing workflows and processes, and delegates with clear accountabilities across the teams to meet objectives / outcomes Promoting an environment of learning and development. 
Understand and develop team members and others to achieve their professional growth Responsibilities Job Requirements Bachelor's Degree in Engineering, Computer Science or equivalent; Master's degree in Business or Technology is an advantage Formal architecture certification (TOGAF or equivalent) At least 15 years' experience in the IT industry, preferably in large, complex enterprises At least 7 years' experience in Enterprise Architecture in a large, complex, multi-location, multi-national environment Deep experience delivering mission critical, enterprise scale IT solutions in a heterogeneous technology environment Demonstrate deep domain expertise in Application Architecture in EAI, Microservices and Cloud native technologies Experience in Domain driven and Event driven architecture and in technologies such as Kafka and Spark Experience architecting, designing and developing large scale high performance retail business banking solutions utilizing a mixture of Open systems and messaging and high performance DB solutions. Experience in log analysis and log based monitoring (e.g. ELK) and metrics driven monitoring (Grafana, Prometheus) Direct experience with highly scalable enterprise applications, designing high performance, low latency solutions with high availability and near or no data loss. Familiarity with: best practice methodologies and tools for the entire solution lifecycle from ideation to requirements, design, development, testing, deployment and operations one or more formal Architecture frameworks / methodologies (TOGAF, Zachman, BIAN, etc.) architecture governance frameworks heterogeneous technology platforms such as AS400, Unix/Linux, Windows A deep understanding of all domains of Enterprise Architecture, including the business, data, application, infrastructure and security domains Possess strong understanding of business strategies and able to translate them into concrete achievable action plans Practical experience in: data modelling, object modelling, design patterns and Enterprise Architecture tool or other software modelling tools. business capabilities model Advanced Relational Database Experience (RDBMS) in Oracle Multi-tenant database is an advantage Functional Expertise preferred (but not a mandatory requirement) in at least two of the below domains: Branch Banking CRM and e-CIF Transaction Banking - Cash Management and Payments Lending Origination and Servicing Trade Finance & Supply Chain Finance e-Channels, eco-system partnerships and API (Open Banking) Proven experience leading teams resulting in the successful deployment of applications built on Cloud or on-prem enterprise environments for large Tier-1 Banks and Financial institutions Candidates having experience with migrating from legacy applications to a solution utilizing methodologies and Platforms that will ensure least down-time, reduced risk and excellent customer experience both the customer and end-users of those services will be preferred. IT Strategy consulting experience will be an added advantage Comfortable in working in an environment which is mix of several parties and teams from customer as well from Oracle where collaboration is key. 
Excellent verbal, written and presentation skills to stakeholders at all levels Ability to communicate complex topics in an understandable way using a level of detail and terms appropriate to the situation Capability to think conceptually and identify patterns across seemingly unrelated situations Must be a good team player and able to drive consensus amongst stakeholders with conflicting viewpoints and objectives People and team management in a transversal function Able to collaborate and drive motivation across a diverse slate within and across teams and can deal with difficult conversations effectively Diversity and Inclusion: An Oracle career can span industries, roles, Countries, and cultures, allowing you to flourish in new roles and innovate while blending work life in. Oracle has thrived through 40+ years of change by innovating and operating with integrity while delivering for the top companies in almost every industry. To nurture the talent that makes this happen, we are committed to an inclusive culture that celebrates and values diverse insights and perspectives, a workforce that inspires thought leadership and innovation. Oracle offers a highly competitive suite of Employee Benefits designed on the principles of parity, consistency, and affordability. The overall package includes certain core elements such as Medical, Life Insurance, access to Retirement Planning, and much more. We also encourage our employees to engage in the culture of giving back to the communities where we live and do business. At Oracle, we believe that innovation starts with diversity and inclusion and to create the future we need talent from various backgrounds, perspectives, and abilities. We ensure that individuals with disabilities are provided reasonable accommodation to successfully participate in the job application, and interview process, and in potential roles. to perform crucial job functions. That’s why we’re committed to creating a workforce where all individuals can do their best work. It’s when everyone’s voice is heard and valued that we’re inspired to go beyond what’s been done before. Qualifications Career Level - IC4 About Us As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry-leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. 
Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
Posted 4 days ago
7.0 years
0 Lacs
Chennai, Tamil Nadu, India
Remote
Job Description Are you ready to make an impact at DTCC? Do you want to work on innovative projects, collaborate with a dynamic and supportive team, and receive investment in your professional development? At DTCC, we are at the forefront of innovation in the financial markets. We are committed to helping our employees grow and succeed. We believe that you have the skills and drive to make a real impact. We foster a thriving internal community and are committed to creating a workplace that looks like the world that we serve. The Information Technology group delivers secure, reliable technology solutions that enable DTCC to be the trusted infrastructure of the global capital markets. The team delivers high-quality information through activities that include development of essential, building infrastructure capabilities to meet client needs and implementing data standards and governance. Pay And Benefits Competitive compensation, including base pay and annual incentive Comprehensive health and life insurance and well-being benefits, based on location Pension / Retirement benefits Paid Time Off and Personal/Family Care, and other leaves of absence when needed to support your physical, financial, and emotional well-being. DTCC offers a flexible/hybrid model of 3 days onsite and 2 days remote (onsite Tuesdays, Wednesdays and a third day unique to each team or employee). The Impact You Will Have In This Role The Development family is responsible for creating, designing, deploying, and supporting applications, programs, and software solutions. May include research, new development, prototyping, modification, reuse, re-engineering, maintenance, or any other activities related to software products used internally or externally on product platforms supported by the firm. The software development process requires in-depth subject matter expertise in existing and emerging development methodologies, tools, and programming languages. Software Developers work closely with business partners and / or external clients in defining requirements and implementing solutions. The Software Engineering role specializes in planning, documenting technical requirements, designing, developing, and testing all software systems and applications for the firm. Works closely with architects, product managers, project management, and end-users in the development and enhancement of existing software systems and applications, proposing and recommending solutions that solve complex business problems. 
Your Primary Responsibilities Act as a technical expert on one or more applications utilized by DTCC Work with the Business System Analyst to ensure designs satisfy functional requirements Partner with Infrastructure to identify and deploy optimal hosting environments Participate in code development and code deploys, working individually or on team projects Tune application performance to eliminate or reduce issues Research and evaluate technical solutions consistent with DTCC technology standards Align risk and control processes into day-to-day responsibilities to monitor and mitigate risk; escalate appropriately Apply different software development methodologies dependent on project needs Contribute expertise to the design of components or individual programs, and participate in the construction and functional testing Support development teams, testing, troubleshooting, and production support Create applications and construct unit test cases that ensure compliance with functional and non-functional requirements Work with peers to mature ways of working, continuous integration, and continuous delivery Qualifications Minimum of 7 years of related experience Bachelor's degree preferred or equivalent experience Talents Needed For Success Hands-on experience in software development using Design Patterns, Java, Java EE, Spring Boot, Spring 6, JMS, REST APIs, and middleware like IBM MQ, Tomcat, Liberty, WebSphere Demonstrated capability working with middleware like IBM MQ, Apache Kafka, Amazon EventBridge and other messaging frameworks Familiarity with the Oracle relational database, with experience developing stored procedures and managing database schemas and tables. Familiarity with UI frameworks like Angular or other JavaScript frameworks is a plus. Familiarity with developing and running applications in Windows and Linux environments; container technologies like Docker, Kubernetes, and OpenShift will be a plus. Demonstrable experience in software development using CI/CD tools, especially Git, Bitbucket, Maven, Jenkins, Jira Experience using the following development tools: Visual Studio, IntelliJ, or Eclipse. Familiarity with different software development methodologies (Waterfall, Agile, Scrum, Kanban) The salary range is indicative for roles at the same level within DTCC across all US locations. Actual salary is determined based on the role, location, individual experience, skills, and other considerations. We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation. About Us With over 50 years of experience, DTCC is the premier post-trade market infrastructure for the global financial services industry. From 20 locations around the world, DTCC, through its subsidiaries, automates, centralizes, and standardizes the processing of financial transactions, mitigating risk, increasing transparency, enhancing performance and driving efficiency for thousands of broker/dealers, custodian banks and asset managers.
Industry owned and governed, the firm innovates purposefully, simplifying the complexities of clearing, settlement, asset servicing, transaction processing, trade reporting and data services across asset classes, bringing enhanced resilience and soundness to existing financial markets while advancing the digital asset ecosystem. In 2024, DTCC’s subsidiaries processed securities transactions valued at U.S. $3.7 quadrillion and its depository subsidiary provided custody and asset servicing for securities issues from over 150 countries and territories valued at U.S. $99 trillion. DTCC’s Global Trade Repository service, through locally registered, licensed, or approved trade repositories, processes more than 25 billion messages annually. To learn more, please visit us at www.dtcc.com or connect with us on LinkedIn , X , YouTube , Facebook and Instagram . DTCC proudly supports Flexible Work Arrangements favoring openness and gives people freedom to do their jobs well, by encouraging diverse opinions and emphasizing teamwork. When you join our team, you’ll have an opportunity to make meaningful contributions at a company that is recognized as a thought leader in both the financial services and technology industries. A DTCC career is more than a good way to earn a living. It’s the chance to make a difference at a company that’s truly one of a kind. Learn more about Clearance and Settlement by clicking here . About The Team The IT SIFMU Delivery Department supports core Clearing and Settlement application delivery for DTC, NSCC and FICC. The department also develops and supports Asset Services, Wealth Management & Insurance Services and Master Reference Data applications. Show more Show less
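The DTCC role above centres on Java/Spring Boot services that expose REST APIs and integrate with messaging middleware such as IBM MQ or Apache Kafka. A minimal sketch of that pattern, assuming Spring Boot with spring-kafka on the classpath; the topic name, endpoint path, and payload shape are illustrative assumptions, not part of the posting:

```java
// Minimal Spring Boot service exposing a REST endpoint that publishes an event to Kafka.
// Assumes spring-boot-starter-web and spring-kafka on the classpath.
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.http.ResponseEntity;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@SpringBootApplication
public class TradeEventApplication {
    public static void main(String[] args) {
        SpringApplication.run(TradeEventApplication.class, args);
    }
}

@RestController
class TradeEventController {

    private final KafkaTemplate<String, String> kafkaTemplate;

    TradeEventController(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    @PostMapping("/trade-events")
    ResponseEntity<String> publish(@RequestParam String tradeId, @RequestBody String tradeJson) {
        // Key by trade ID so all events for the same trade land on the same partition,
        // preserving per-trade ordering for downstream consumers.
        kafkaTemplate.send("trade-events", tradeId, tradeJson);
        return ResponseEntity.accepted().body(tradeId);
    }
}
```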
Posted 4 days ago
7.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description The Applications Development Technology Lead Analyst is a senior level position responsible for establishing and implementing new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to lead applications systems analysis and programming activities. At least two years of experience building and leading highly complex, technical data engineering teams (10+ years of hands-on data engineering experience overall). Lead the data engineering team, from sourcing to closing. Drive the strategic vision for the team and product. Responsibilities: Partner with multiple management teams to ensure appropriate integration of functions to meet goals, as well as identify and define necessary system enhancements to deploy new products and process improvements Experience managing a data-focused product or ML platform Hands-on experience designing, developing, and optimizing scalable distributed data processing pipelines using Apache Spark and Scala. Resolve a variety of high-impact problems/projects through in-depth evaluation of complex business processes, system processes, and industry standards Provide expertise in the area and advanced knowledge of applications programming, and ensure application design adheres to the overall architecture blueprint Utilize advanced knowledge of system flow and develop standards for coding, testing, debugging, and implementation Experience managing, hiring and coaching software engineering teams. Experience with large-scale distributed web services and the processes around testing, monitoring, and SLAs to ensure high product quality. Provide in-depth analysis with interpretive thinking to define issues and develop innovative solutions Serve as advisor or coach to mid-level developers and analysts, allocating work as necessary Required Skills: Experience: 7 to 10+ years of hands-on experience in big data development, focusing on Apache Spark, Scala, and distributed systems. Proficiency in Functional Programming: High proficiency in Scala-based functional programming for developing robust and efficient data processing pipelines. Proficiency in Big Data Technologies: Strong experience with Apache Spark and Hadoop ecosystem tools such as Hive, HDFS, and YARN; Airflow, DataOps, and data management. Programming and Scripting: Advanced knowledge of Scala and a good understanding of Python for data engineering tasks. Data Modeling and ETL Processes: Solid understanding of data modeling principles and ETL processes in big data environments. Analytical and Problem-Solving Skills: Strong ability to analyze and solve performance issues in Spark jobs and distributed systems. Version Control and CI/CD: Familiarity with Git, Jenkins, and other CI/CD tools for automating the deployment of big data applications. Desirable Experience: Real-Time Data Streaming: Experience with streaming platforms such as Apache Kafka or Spark Streaming. Python data engineering experience is a plus. Financial Services Context: Familiarity with financial data processing, ensuring scalability, security, and compliance requirements. Leadership in Data Engineering: Proven ability to work collaboratively with teams to develop robust data pipelines and architectures.
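The pipeline work described above (Spark-based batch processing over large datasets) follows a common read-transform-write shape. A minimal sketch using Spark's Java Dataset API, for consistency with the other examples in this document even though the role emphasizes Scala; the input path, column names, and output location are hypothetical:

```java
// Minimal Spark batch job: read raw records, aggregate, write partitioned output.
// Assumes spark-sql on the classpath; paths and column names are illustrative only.
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import static org.apache.spark.sql.functions.col;
import static org.apache.spark.sql.functions.sum;

public class DailyTradeAggregation {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("daily-trade-aggregation")
                .getOrCreate();

        // Read raw trade records; schema comes from the Parquet metadata.
        Dataset<Row> trades = spark.read().parquet("/data/raw/trades");

        // Aggregate notional per desk per day.
        Dataset<Row> daily = trades
                .groupBy(col("trade_date"), col("desk"))
                .agg(sum("notional").alias("total_notional"));

        // Write partitioned output for downstream consumers.
        daily.write()
                .mode("overwrite")
                .partitionBy("trade_date")
                .parquet("/data/curated/daily_notional");

        spark.stop();
    }
}
```

Partitioning the output by trade date keeps downstream reads selective, which is the usual lever for performance tuning in jobs like this.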
Job Family Group: Technology. Job Family: Applications Development. Time Type: Full time. Most Relevant Skills: Please see the requirements listed above. Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter. Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
Posted 4 days ago
8.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Description About Oracle APAC ISV Business Oracle APAC ISV team is one of the fastest-growing and high-performing business units in APAC. We are a prime team that operates to serve a broad range of customers across the APAC region. ISVs are at the forefront of today's fastest-growing industries. Much of this growth stems from enterprises shifting toward adopting cloud-native ISV SaaS solutions. This transformation drives ISVs to evolve from traditional software vendors to SaaS service providers. Industry analysts predict exponential growth in the ISV market over the coming years, making it a key growth pillar for every hyperscaler. Our Cloud engineering team works on pitch-to-production scenarios of bringing ISVs’ solutions on the Oracle cloud (#oci) with an aim to provide a cloud platform for running their business which is better performant, more flexible, more secure, compliant to open-source technologies and offers multiple innovation options yet being most cost effective. The team walks along the path with our customers and are being regarded as a trusted techno-business advisors by them. Required Skills/Experience Your versatility and hands-on expertise will be your greatest asset as you deliver on time bound implementation work items and empower our customers to harness the full power of OCI. We also look for: Bachelor's degree in Computer Science, Information Technology, or a related field. Relevant certifications in database management, OCI or other cloud platforms (AWS, Azure, Google Cloud), or NoSQL databases 8+ years of professional work experience Proven experience in migrating databases and data to OCI or other cloud environments (AWS, Azure, Google Cloud, etc.). Expertise on Oracle DB and related technologies like RMAN, DataGuard, Advanced Security Options, MAA Hands on experience with NoSQL databases (MongoDB / Cassandra / DynamoDB, etc.) and other DBs like MySQL/PostgreSQL Demonstrable expertise in Data Management systems, caching systems and search engines such as MongoDB, Redshift, Snowflake, Spanner, Redis, ElasticSearch, as well as Graph databases like Neo4J An understanding of complex data integration, data pipelines and stream analytics using products like Apache Kafka, Oracle GoldenGate, Oracle Stream Analytics, Spark etc. Knowledge of how to deploy data management within a Kubernetes/docker environment as well as the corresponding management of state in microservice applications is a plus Ability to work independently and handle multiple tasks in a fast-paced environment. Solid experience managing multiple implementation projects simultaneously while maintaining high-quality standards. Ability to develop and manage project timelines, resources, and budgets. Career Level - IC4 Responsibilities What You’ll Do As a solution specialist, you will work closely with our cloud architects and key stakeholders of ISVs to propagate awareness and drive implementation of OCI native as well as open-source technologies by ISV customers. Lead and execute end-to-end data platforms migrations (including heterogeneous data platforms) to OCI. Design and implement database solutions within OCI, ensuring scalability, availability, and performance. Set up, configure, and secure production environments for data platforms in OCI Migrate databases from legacy systems or other Clouds to OCI while ensuring minimal downtime and data integrity. Implement and manage CDC solutions to track and capture changes in databases in real-time. 
Configure and manage CDC tools, ensuring low-latency, fault-tolerant data replication for high-volume environments. Assist with the creation of ETL/data pipelines for the migration of large datasets into data warehouse on OCI Configure and manage complex database deployment topologies, including clustering, replication, and failover configurations. Perform database tuning, monitoring, and optimization to ensure high performance in production environments. Implement automation scripts and tools to streamline database administration and migration processes. Develop and effectively present your proposed solution and execution plan to both internal and external stakeholders. Clearly explain the technical advantages of OCI based database management systems About Us As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry-leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law. Show more Show less
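The CDC responsibilities in the Oracle OCI role above (capturing and replicating database changes with low latency) typically end in a consumer that applies change events to a target store. A minimal sketch of such a consumer using the plain Apache Kafka client; the topic name, group id, and event handling are hypothetical, and in practice the events would be produced by a tool such as GoldenGate:

```java
// Minimal Kafka consumer polling CDC-style change events and applying them.
// Assumes kafka-clients on the classpath; topic and group id are illustrative only.
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

import java.time.Duration;
import java.util.List;
import java.util.Properties;

public class ChangeEventApplier {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "cdc-applier");
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());
        props.put("enable.auto.commit", "false"); // commit only after a successful apply

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("orders.cdc"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    // In a real pipeline this would upsert the change into the target database.
                    System.out.printf("apply key=%s offset=%d%n", record.key(), record.offset());
                }
                consumer.commitSync(); // commit after apply: at-least-once delivery into the target
            }
        }
    }
}
```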
Posted 4 days ago
10.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
As part of a highly motivated global Cloud Engineering team, the right candidate will play a critical hands-on functional leadership role as Engineering Manager and drive architecture design, implementation, and deployment of the Cloud Data Platform to build data operations/orchestration solutions in Calix Cloud. Responsibilities and Duties: Technical leadership in all phases of software design and development in meeting requirements of service stability, reliability, scalability, and security. Hiring, training, provide technical direction and lead discussions and coordinate deliverables across multiple engineering teams globally. Work closely with Cloud and Systems product owners to understand, analyze product requirements, provide feedback, coordinate resources, and deliver a complete solution. Drive evaluation and selection of best fit, efficient, cost-effective frameworks for the Calix Cloud platform. Drive development of scalable and distributed data systems, related automation platform and network data ingestion infrastructure and services for enabling Calix Cloud suite of products. Drive continuous scale and optimization of the platform with an automation and metrics driven approach. Participate and drive technical discussions within engineering group in all phases of the SDLC: review requirements, produce design documents, participate in peer reviews, support QA team, provide internal training and support TAC team. Have a Test first mindset and use modern DevSecOps practices for Agile development. Collaborate with senior leadership to translate platform opportunities into an actionable roadmap, track progress, and deliver new platform capabilities on-time and on-budget. Triage and resolve customer escalations and technical issues. Qualifications: 10+ years of highly technical, hands-on software engineering experience and cloud based solution development 3+ experience leading and mentoring engineering team with strong technical direction and delivering high quality software on schedule, including delivery for large, cross-functional projects and working with geographically distributed teams Strong, creative problem-solving skills and ability to abstract and share details to create meaningful articulation. Passionate about delivering high quality software solutions and enabling automation in all phases. Solid data engineering background, good understanding of ETL technologies and experience with building large scale cloud solutions. Experience in designing and developing event-based / pub-sub workflows & data ingestion solutions. Proficiency and hands-on experience with Kafka at scale (or similar) desired. Strong background in transactional databases and good understanding and experience with no-SQL datastores. Experience with microservices-based, API/Endpoint architectures and orchestration. Practical understanding and usage of AWS (or GCP) Cloud platform and services. Hands on expert level on one or more of the following programming languages - Java, Go, Python Organized and goal-focused, ability to deliver in a fast-paced environment. BS degree in Computer Science, Engineering, Mathematics, or relevant industry standard experience to match. Location: India – (Flexible hybrid work model - work from Bangalore office for 20 days in a quarter) Show more Show less
Posted 4 days ago
8.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About the Company Radware is revolutionizing cloud-based management and protection solutions as we expand our cloud business footprint. We're seeking a passionate and driven Team Leader to spearhead the development of our next-generation SaaS platforms that will define the future of cloud security. About the Role As Team Leader – Software Engineering, you will lead a talented team of engineers while remaining hands-on with cutting-edge technologies. You'll architect, develop, and deploy scalable SaaS solutions using modern tech stacks including Java, Spring Boot, Kubernetes, Docker, Go, Kafka, Elasticsearch, Postgres, and Redis. Responsibilities Lead and inspire a team of software engineers to develop scalable, high-performance cloud-based products. Take full ownership of design, development, and deployment of microservices-based SaaS solutions. Demonstrate technical excellence and mentor team members to ensure best coding practices. Drive accountability within the team for deliverables, quality, and timelines. Partner with Product Managers, Architects, and stakeholders to shape product requirements and roadmaps. Champion timely, high-quality feature delivery in an Agile environment. Foster innovation and continuous improvement in development processes. Nurture team growth and cultivate a culture of learning, teamwork, and excellence. Demonstrate a "can-do" attitude when facing challenges and finding solutions. Qualifications BE/B.Tech in Computer Science or equivalent. Required Skills 8+ years of backend software development experience using Java. 2+ years successfully leading or mentoring a software development team. Strong expertise in OOP, design patterns, and software development best practices. Hands-on experience with microservices architecture and cloud-based solutions. Demonstrated success in Agile environments with CI/CD methodologies. Exceptional leadership, communication, and problem-solving skills. Ability to make sound technical decisions and collaborate effectively across functions. Accountability for team deliverables and commitment to achieving results. Positive, can-do attitude with the ability to overcome obstacles creatively. Preferred Skills Proven team leadership experience. Deep expertise in Java, Spring Boot, and Go. Hands-on microservices development experience. Extensive work with cloud environments. Proficiency in Agile methodology, CI/CD, and DevOps practices. What We Value: We're looking for someone who takes ownership, demonstrates accountability, and approaches challenges with a positive, can-do attitude. You should be passionate about creating exceptional software and motivated to drive your team to excellence. Join Radware and help shape the future of cloud security solutions! Equal Opportunity Statement Radware is committed to diversity and inclusivity. ``` Show more Show less
Posted 4 days ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About the Company Radware is looking for an experienced Senior Software Engineer who will have a significant impact on the company's cloud business expansion. We are building the next generation of the company's cloud-based management and protection solution. About the Role As a Software Engineer, you are part of a team responsible for the design, development and deployment of a SaaS solution. You will be using modern technologies such as Java, Spring, Kubernetes, Docker, Go, Kafka, Elasticsearch, Postgres, Redis and other advanced tools running in scalable cloud infrastructure. Responsibilities Develop cutting-edge, scalable and high-performance products based on a microservices architecture. Collaborate with Software Engineers and Product Managers to deliver a trendsetting product. Craft clean, maintainable and resilient code. Work side by side with your peers to overcome complex challenges. Continuously improve and learn new technological skills. Qualifications Educational Qualification: BE / B.Tech in Computer Science or an equivalent. Location: Bangalore. Required Skills At least 5 years of backend application development experience using Java. Experience in OOP and design patterns. Experience with solutions based on a microservices architecture. Experience in cloud solutions development. Experience working in an agile ecosystem with a CI/CD development methodology. Team player as well as a great independent owner and contributor. Preferred Skills Familiarity with Kubernetes. Experience working in a SaaS ecosystem. Familiarity with AWS. Primary Location: IN-IN-Bengaluru Work Locations: Radware Shield Square India, Bengaluru Job: Software Equal Opportunity Statement Radware is committed to diversity and inclusivity.
Posted 4 days ago
3.0 - 5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Title Data Engineer Job Description Data Engineer!! Hello, we’re IG Group. We’re a global, FTSE 250-listed company made up of a collection of progressive fintech brands in the world of online trading and investing. The best part? We’ve snapped up many awards for our top-class platforms, forward-thinking products, and incredible employee experiences. Who We’re Looking For You’re curious about things like the client experience, the rapid developments in tech, and the complex world of fintech regulation. You’re also a confident, creative thinker with a knack for innovating. We know that you know every problem has a solution. Here, you can try new ideas, and lead the way in creating inspiring experiences for our clients and everyone around you. We don’t fit the corporate stereotype. If you want to work for a traditional, suit-and-tie corporate that just gives you a pay cheque at the end of the month, we might not be for you. But, if you have that IG Group energy and you can stand behind what we believe in, let’s raise the bar together. About The Team We are looking for a Data Engineer for our team in our Bangalore office. The role, as well as the projects you will participate in, is crucial for the entire IG. Data Engineering is responsible for collecting data from various sources and generating insights for our business stakeholders. As a Data Engineer you will be responsible for the delivery of our projects, participate in the whole project life cycle (development and delivery) applying Agile best practices, and ensure good engineering quality. You will work with other technical team members to build ingestion pipelines and a shared company-wide data platform in GCP, as well as supporting and evolving our wide range of services in the cloud. You will own the development and support of our applications, which also includes our out-of-hours support rota. The Skills You'll Need You will be someone who can demonstrate: Good understanding of the IT development life cycle with a focus on quality and continuous delivery and integration 3-5 years of experience in Python, data processing (pandas/PySpark), and SQL Good experience with cloud, specifically GCP Good communication skills, able to communicate technical concepts to a non-technical audience. Proven experience working in Agile environments. Experience working on data-related projects, from data ingestion to analytics and reporting. Good understanding of Big Data and distributed compute frameworks such as Spark for both batch and streaming workloads Familiarity with Kafka and different data formats: Avro/Parquet/ORC/JSON. It Would Be Great If You Have Experience With GitLab and containerisation (Nomad or Kubernetes). How You’ll Grow When you join IG Group, we want you to have more than a job – we want you to have a career. And you can. If you spot an opportunity, we want you to chase it. Stretch yourself, challenge your self-beliefs and go for the things you dream of. With internal and external learning opportunities and the tools to help you skyrocket to success, we’ll support you all the way. And these opportunities truly are endless because we have some bold targets. We plan to expand our global presence, increase revenue growth, and ultimately deliver the world’s best trading experience. We’d love to have you along for the ride. The Perks It really is more than a job. We’ll recognise your talent and make sure that you can still have a life – at work, and outside of it.
Networks, committees, awards, sports and social clubs, mentorships, volunteering opportunities, extra time off… the list goes on. Matched giving for your fundraising activity. Flexible working hours and work-from-home opportunities. Performance-related bonuses. Insurance and medical plans. Career-focused technical and leadership training. Contribution to gym memberships and more. A day off on your birthday. Two days’ volunteering leave per year. Where You’ll Work We follow a hybrid working model; we reckon it’s the best of both worlds. This model also feeds into our secret ingredients for innovation: diversity, flexibility, and close connection. Plus, you’ll be welcomed into a diverse and inclusive workforce with a lot of creative energy. Ask our employees what their favourite thing is about working at IG, and you’ll hear an echo of ‘our culture’! That’s because you can come to work as your authentic self. The things that make you, you – like your ethnicity, sexual orientation, faith, age, gender identity/expression or physical capacity – can bring a fresh perspective or new skill to our business. That’s why we welcome people from various walks of life; and anyone who wants to help us realise our vision and strategy. So, if you’re keen to connect with our values, and lead the charge on innovation, you know what to do. Apply! Number of openings 1
Posted 4 days ago
1.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Description Who We Are At Goldman Sachs, we connect people, capital and ideas to help solve problems for our clients. We are a leading global financial services firm providing investment banking, securities and investment management services to a substantial and diversified client base that includes corporations, financial institutions, governments and individuals. Equities Risk Technology is looking for a developer to work on a strategic platform for calculating real-time risk for our division. This platform will be used to drive real-time decision making by traders across a range of business lines on a daily basis. The ideal candidate would have experience and interest in building a large scale distributed system that is fast, accurate and highly reliable. They should also have commercial experience in Java. The successful candidate will join a team of talented developers working on this highly visible and prestigious project. The opportunity will appeal to motivated individuals who want to work directly with the trading desks. How Will You Fulfill Your Potential Build a unified world-class risk management framework supporting all aspects of trading in various Equities products. Work directly with Trading, Sales and Strats Great opportunity to learn the business while working with a talented group of individuals. Basic Qualifications 1+ years of professional Java development experience Familiarity with testing tools in Java Excellent object oriented analysis and design skills Strong knowledge of data structures, algorithms, and designing for performance Preferred Qualifications Experience in NoSQL data stores, e.g. Cassandra or MongoDB Experience in modern message oriented middleware, e.g. Kafka or RabbitMQ Experience in debugging distributed systems. Experience in developing software for Akka or Vertx Goldman Sachs Engineering Culture At Goldman Sachs, our Engineers don’t just make things – we make things possible. Change the world by connecting people and capital with ideas. Solve the most challenging and pressing engineering problems for our clients. Join our engineering teams that build massively scalable software and systems, architect low latency infrastructure solutions, proactively guard against cyber threats, and leverage machine learning alongside financial engineering to continuously turn data into action. Create new businesses, transform finance, and explore a world of opportunity at the speed of markets. Engineering is at the critical center of our business, and our dynamic environment requires innovative strategic thinking and immediate, real solutions. Want to push the limit of digital possibilities? Start here! © The Goldman Sachs Group, Inc., 2025. All rights reserved. Goldman Sachs is an equal employment/affirmative action employer Female/Minority/Disability/Veteran/Sexual Orientation/Gender Identity. Show more Show less
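The Goldman Sachs platform described above computes real-time risk from streams of trading events. At its simplest, that means maintaining low-latency, thread-safe aggregates keyed by book or instrument. A minimal sketch using only the JDK; the Risk fields and the notion of "delta" are illustrative assumptions, not the firm's actual model:

```java
// Minimal thread-safe, in-memory risk aggregation keyed by trading book.
// The Risk fields are illustrative only.
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class RiskAggregator {

    /** Immutable snapshot of aggregated risk for one book. */
    public record Risk(double delta, double notional) {
        Risk add(Risk other) {
            return new Risk(delta + other.delta, notional + other.notional);
        }
    }

    private final Map<String, Risk> riskByBook = new ConcurrentHashMap<>();

    /** Apply one trade event; merge() keeps each per-key update atomic. */
    public void onTrade(String book, double delta, double notional) {
        riskByBook.merge(book, new Risk(delta, notional), Risk::add);
    }

    public Risk snapshot(String book) {
        return riskByBook.getOrDefault(book, new Risk(0, 0));
    }

    public static void main(String[] args) {
        RiskAggregator agg = new RiskAggregator();
        agg.onTrade("EQ-DESK-1", 0.45, 1_000_000);
        agg.onTrade("EQ-DESK-1", -0.10, 250_000);
        System.out.println(agg.snapshot("EQ-DESK-1")); // delta ~0.35, notional 1,250,000
    }
}
```

A production system would shard this state and publish snapshots to consumers, but the per-key atomic merge is the core idea behind keeping the aggregates consistent under concurrent updates.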
Posted 4 days ago
7.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description Oracle, the world leader in Enterprise Cloud, is hiring the best and brightest technologists in the industry as we continue to add customer-centric, world-class, leading edge, secure, hyper-scale based solutions throughout all levels of the cloud stack. Oracle’s cloud eco-system is the only complete business cloud platform on the planet, with market leading and business transforming solutions spanning SaaS, DaaS, PaaS and IaaS. Oracle’s Cloud applications, such as Enterprise Resource Management, Customer Relationship Management, Human Capital Management, and Supply Chain Management are used by thousands of customers across the globe and are the broadest, most innovative in the industry, providing businesses with adaptive intelligence, standardized business processes and competitive advantage at low cost. As part of market leading ERP Cloud, Oracle ERP Cloud Integration & Functional Architecture team offers a broad suite of modules and capabilities designed to empower modern finance and deliver customer success with streamlined processes, increased productivity, and improved business decisions. The ERP Cloud Integration & Functional Architecture is looking for passionate, innovative, high caliber, team oriented super stars that seek being a major part of a transformative revolution in the development of modern business cloud based applications. We are seeking highly capable, best in the world developers, architects and technical leaders at the very top of the industry in terms of skills, capabilities and proven delivery; who seek out and implement imaginative and strategic, yet practical, solutions; people who calmly take measured and necessary risks while putting customers first. What You’ll Do You would work as a Principal Applications Engineer on Oracle Next Gen solutions developed and running on Oracle Cloud. Design and build distributed, scalable, fault-tolerant software systems Build cloud services on top of modern Oracle Cloud Infrastructure Collaborate with product managers and other stakeholders in understanding the requirements and work on delivering user stories/backlog items with highest levels of quality and consistency across the product. Work with geographically dispersed team of engineers by taking complete ownership and accountability to see the project through for completion. Skills And Qualifications 7+ years in building and architecting enterprise and consumer grade applications You have prior experience working on distributed systems in the Cloud world with full stack experience. Build and Delivery of a high-quality cloud service with the capabilities, scalability, and performance needed to match the needs of enterprise teams Take the initiative and be responsible for delivering complex software by working effectively with the team and other stakeholders You feel at home communicating technical ideas verbally and in writing (technical proposals, design specs, architecture diagrams, and presentations) You are ideally proficient in Java, J2EE, SQL and server-side programming. Proficiency in other languages like Javascript is preferred. You have experience with Cloud Computing, System Design, and Object-Oriented Design preferably, production experience with Cloud. Experience working in the Apache Hadoop Community and more broadly the Big Data ecosystem communities (e.g. Apache Spark, Kafka, Flink etc.) 
Experience on Cloud native technologies such as AWS/Oracle Cloud/Azure/Google Cloud etc Experience building microservices and RESTful services and deep understanding of building cloud-based services Experienced at building highly available services, possessing knowledge of common service-oriented design patterns and service-to-service communication protocols Knowledge of Docker/Kubernetes is preferred Ability to work creatively and analytically using data-driven decision-making to improve customer experience. Strong organizational, interpersonal, written and oral communication skills, with proven success in contributing in a collaborative, team-oriented environment. Self–motivated and self-driven, continuously learning and capable of working independently. BS/MS (MS preferred) in Computer Science. Analyze, design develop, troubleshoot and debug software programs for commercial or end user applications. Writes code, completes programming and performs testing and debugging of applications. As a member of the software engineering division, you will analyze and integrate external customer specifications. Specify, design and implement modest changes to existing software architecture. Build new products and development tools. Build and execute unit tests and unit test plans. Review integration and regression test plans created by QA. Communicate with QA and porting engineering to discuss major changes to functionality. Work is non-routine and very complex, involving the application of advanced technical/business skills in area of specialization. Leading contributor individually and as a team member, providing direction and mentoring to others. BS or MS degree or equivalent experience relevant to functional area. 7 years of software engineering or related experience. Career Level - IC4 About Us As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry-leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law. Show more Show less
Posted 4 days ago
6.0 years
0 Lacs
Kondapur, Telangana, India
On-site
What You'll Do Design and Development: Architect, design, and develop scalable backend systems and APIs using Java and related frameworks. Microservices: Build and maintain microservices-based architectures to ensure modularity and flexibility in system design. Database Management: Develop and optimize database schemas and queries for SQL and NoSQL databases. Code Quality: Write clean, maintainable, and well-documented code adhering to best practices and coding standards. Performance Optimization: Identify and resolve performance bottlenecks to ensure high availability and responsiveness. Integration: Develop and maintain integrations with third-party APIs and services. Collaboration: Work closely with product managers, frontend teams, and QA engineers to deliver end-to-end solutions. Testing: Write and maintain unit, integration, and performance tests to ensure software reliability. Security: Implement best practices for securing backend services and data. Mentorship: Mentor junior developers and participate in code reviews to promote a culture of excellence. Troubleshooting: Debug and resolve production issues in a timely manner. What You Know A minimum of 6+ years of Java/J2EE development experience Java Frameworks: Proficiency in frameworks like Spring Boot, Hibernate, or Quarkus. Database Skills: Strong experience with any of the relational databases (e.g., MySQL, PostgreSQL) and any of the NoSQL databases (e.g., MongoDB, Cassandra). RESTful APIs: Expertise in designing and implementing RESTful APIs and working with JSON/XML data formats. Microservices: Hands-on experience with microservices architecture and tools like Docker and Kubernetes. Messaging Systems: Familiarity with any of the messaging systems like Kafka, RabbitMQ, or ActiveMQ. Version Control: Proficiency with Git and branching strategies. Testing: Strong knowledge of testing frameworks like JUnit, TestNG, or Mockito. Cloud: Experience with any of the cloud platforms like AWS, Azure, or GCP. CI/CD: Familiarity with CI/CD pipelines and tools like Jenkins, GitLab CI/CD, or GitHub Actions. Scripting: Knowledge of scripting languages like Python or Shell scripting for automation tasks. Problem-Solving: Excellent analytical and debugging skills; an independent thinker who can identify problems and provide creative solutions. Communication: Strong written and verbal communication skills for technical discussions and documentation. Good to have Knowledge of reactive programming (e.g., Spring WebFlux, RxJava). Experience with GraphQL APIs. Familiarity with serverless architectures and functions. Education Bachelor’s degree in Computer Science, Information Systems, Engineering, Computer Applications, or related field. Benefits In addition to competitive salaries and benefits packages, Nisum India offers its employees some unique and fun extras: Continuous Learning - Year-round training sessions are offered as part of skill enhancement certifications sponsored by the company on an as-needed basis. We support our team to excel in their field. Parental Medical Insurance - Nisum believes our team is the heart of our business and we want to make sure to take care of the heart of theirs. We offer opt-in parental medical insurance in addition to our medical benefits. Activities - From the Nisum Premier League's cricket tournaments to hosted hack-a-thons, Nisum employees can participate in a variety of team-building activities such as skits and dance performances, in addition to festival celebrations.
Free Meals - Free snacks and dinner are provided on a daily basis, in addition to subsidized lunch.
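The Nisum role above lists JUnit, TestNG, and Mockito among the expected testing frameworks. A minimal sketch of a service-layer unit test in that style, assuming JUnit 5 and Mockito on the classpath; the OrderRepository and OrderService types are hypothetical and defined inline purely for the example:

```java
// Minimal JUnit 5 + Mockito unit test for a hypothetical service.
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.mockito.Mockito.verify;
import static org.mockito.Mockito.when;

import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.ExtendWith;
import org.mockito.InjectMocks;
import org.mockito.Mock;
import org.mockito.junit.jupiter.MockitoExtension;

@ExtendWith(MockitoExtension.class)
class OrderServiceTest {

    // Hypothetical collaborators, declared here only so the example is self-contained.
    interface OrderRepository { double totalFor(String customerId); }
    static class OrderService {
        private final OrderRepository repo;
        OrderService(OrderRepository repo) { this.repo = repo; }
        double totalWithTax(String customerId) { return repo.totalFor(customerId) * 1.18; }
    }

    @Mock OrderRepository repository;
    @InjectMocks OrderService service;

    @Test
    void appliesTaxOnTopOfRepositoryTotal() {
        when(repository.totalFor("c-42")).thenReturn(100.0);

        assertEquals(118.0, service.totalWithTax("c-42"), 1e-9);
        verify(repository).totalFor("c-42");
    }
}
```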
Posted 4 days ago
5.0 years
0 Lacs
India
Remote
Ready to be pushed beyond what you think you’re capable of? At Coinbase, our mission is to increase economic freedom in the world. It’s a massive, ambitious opportunity that demands the best of us, every day, as we build the emerging onchain platform — and with it, the future global financial system. To achieve our mission, we’re seeking a very specific candidate. We want someone who is passionate about our mission and who believes in the power of crypto and blockchain technology to update the financial system. We want someone who is eager to leave their mark on the world, who relishes the pressure and privilege of working with high caliber colleagues, and who actively seeks feedback to keep leveling up. We want someone who will run towards, not away from, solving the company’s hardest problems. Our work culture is intense and isn’t for everyone. But if you want to build the future alongside others who excel in their disciplines and expect the same from you, there’s no better place to be. While many roles at Coinbase are remote-first, we are not remote-only. In-person participation is required throughout the year. Team and company-wide offsites are held multiple times annually to foster collaboration, connection, and alignment. Attendance is expected and fully supported. Team/ Role Paragraph: Team - As part of Coinbase’s Enterprise Applications and Architecture organization, the CXCMSS (Customer Experience Channel Management and Shared Services) team is on the front lines of revolutionizing how we deliver exceptional support to millions of users globally. We are responsible for managing the delivery of work to agents across chat, phone, email, and social channels, empowering customer service agents across compliance and retail business units to operate at peak efficiency. We design, build, and own platforms that prioritize the customer-first approach while driving agent productivity and maximizing automation. Our team thrives on innovation, creating scalable solutions that not only meet today’s needs but also anticipate the future of customer engagement. Role - In this role, you'll be at the forefront of building complex, scalable applications that directly impact the Coinbase customer experience. You'll focus on building and optimizing Go-based backend APIs and React-powered UIs at scale, while integrating with third-party platforms to streamline operations for our customer service teams What you’ll be doing: Deliver cross-functional outcomes to complex problems in collaboration with product, design, security, data or other engineering teams Lead assessment and implementation of third-party AI/ML tools. Demonstrate a keen awareness of Coinbase’s platform, development practices, and various technical domains and build upon them to efficiently deliver improvements across multiple teams. Participate in an environment where innovative ideas are regularly generated, vetted, and transformed into action. Foresee potential issues before they arise and adapt their own approach, working within constraints to avoid adverse impact. Communicate across the company to technical and non-technical leaders with ease. Quickly distill complex technical themes so that an entry level, non-tech team member can understand them. Mentor team members in design techniques and best practices in coding, testing, release/deploy process, documentation, metrics/logging and scaling Working with teams and teammates across multiple time zones. 
What we look for in you:
You have at least 5 years of experience in software engineering.
You've designed, built, scaled and maintained production services, and know how to compose a service-oriented architecture.
You have experience in authoring and contributing to technical architecture and implementing it.
You write high-quality, well-tested code to meet the needs of your customers.
You're passionate about building an open financial system that brings the world together.
You possess strong technical skills for system design and coding.
Excellent written and verbal communication skills, and a bias toward open, transparent cultural practices.
Experience with third-party vendor integrations.
You enjoy and have experience with large-scale, high-traffic platforms and implementing scalable, robust services in the real world.
Experience in AWS, Kubernetes, Terraform, BuildKite or similar.
Experience in rate limiters, caching, load balancing, circuit breakers, metrics, logging, tracing, debugging etc.
Experience in event-driven architectures (Kafka, MQ etc.), proficiency in either SQL or NoSQL DBs, and an understanding of concepts like gRPC, GraphQL, ETL.
Proficiency in Go.

Nice to haves:
Familiarity with JavaScript and modern, component-based JS frameworks like React.
Familiarity with working in rapid growth environments.

Job #: G2768
*Answers to crypto-related questions may be used to evaluate your onchain experience.

Pay Transparency Notice: The target annual salary for this position can range as detailed below. Full time offers from Coinbase also include target bonus + target equity + benefits (including medical, dental, and vision).
Pay Range: ₹6,612,600 INR - ₹6,612,600 INR

Please be advised that each candidate may submit a maximum of four applications within any 30-day period. We encourage you to carefully evaluate how your skills and interests align with Coinbase's roles before applying.

Commitment to Equal Opportunity
Coinbase is committed to diversity in its workforce and is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, creed, gender, national origin, age, disability, veteran status, sex, gender expression or identity, sexual orientation or any other basis protected by applicable law. Coinbase will also consider for employment qualified applicants with criminal histories in a manner consistent with applicable federal, state and local law. For US applicants, you may view the Know Your Rights notice here. Additionally, Coinbase participates in the E-Verify program in certain locations, as required by law. Coinbase is also committed to providing reasonable accommodations to individuals with disabilities. If you need a reasonable accommodation because of a disability for any part of the employment process, please contact us at accommodations[at]coinbase.com to let us know the nature of your request and your contact information. For quick access to screen reading technology compatible with this site, click here to download a free compatible screen reader (a free step-by-step tutorial can be found here).

Global Data Privacy Notice for Job Candidates and Applicants
Depending on your location, the General Data Protection Regulation (GDPR) and California Consumer Privacy Act (CCPA) may regulate the way we manage the data of job applicants. Our full notice outlining how data will be processed as part of the application procedure for applicable locations is available here.
By submitting your application, you are agreeing to our use and processing of your data as required. For US applicants only, by submitting your application you are agreeing to arbitration of disputes as outlined here.
Posted 4 days ago
5.0 - 10.0 years
15 - 25 Lacs
Hyderabad/Secunderabad, Bangalore/Bengaluru, Delhi / NCR
Hybrid
Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose the relentless pursuit of a world that works better for people – we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI. Inviting applications for the role of Lead Consultant-Data Engineer, AWS+Python, Spark, Kafka for ETL! Responsibilities Develop, deploy, and manage ETL pipelines using AWS services, Python, Spark, and Kafka. Integrate structured and unstructured data from various data sources into data lakes and data warehouses. Design and deploy scalable, highly available, and fault-tolerant AWS data processes using AWS data services (Glue, Lambda, Step, Redshift) Monitor and optimize the performance of cloud resources to ensure efficient utilization and cost-effectiveness. Implement and maintain security measures to protect data and systems within the AWS environment, including IAM policies, security groups, and encryption mechanisms. Migrate the application data from legacy databases to Cloud based solutions (Redshift, DynamoDB, etc) for high availability with low cost Develop application programs using Big Data technologies like Apache Hadoop, Apache Spark, etc with appropriate cloud-based services like Amazon AWS, etc. Build data pipelines by building ETL processes (Extract-Transform-Load) Implement backup, disaster recovery, and business continuity strategies for cloud-based applications and data. Responsible for analysing business and functional requirements which involves a review of existing system configurations and operating methodologies as well as understanding evolving business needs Analyse requirements/User stories at the business meetings and strategize the impact of requirements on different platforms/applications, convert the business requirements into technical requirements Participating in design reviews to provide input on functional requirements, product designs, schedules and/or potential problems Understand current application infrastructure and suggest Cloud based solutions which reduces operational cost, requires minimal maintenance but provides high availability with improved security Perform unit testing on the modified software to ensure that the new functionality is working as expected while existing functionalities continue to work in the same way Coordinate with release management, other supporting teams to deploy changes in production environment Qualifications we seek in you! Minimum Qualifications Experience in designing, implementing data pipelines, build data applications, data migration on AWS Strong experience of implementing data lake using AWS services like Glue, Lambda, Step, Redshift Experience of Databricks will be added advantage Strong experience in Python and SQL Proven expertise in AWS services such as S3, Lambda, Glue, EMR, and Redshift. Advanced programming skills in Python for data processing and automation. Hands-on experience with Apache Spark for large-scale data processing. Experience with Apache Kafka for real-time data streaming and event processing. Proficiency in SQL for data querying and transformation. Strong understanding of security principles and best practices for cloud-based environments. 
Experience with monitoring tools and implementing proactive measures to ensure system availability and performance.
Excellent problem-solving skills and ability to troubleshoot complex issues in a distributed, cloud-based environment.
Strong communication and collaboration skills to work effectively with cross-functional teams.

Preferred Qualifications/Skills
Master's degree in Computer Science, Electronics, or Electrical Engineering.
AWS Data Engineering & Cloud certifications, Databricks certifications.
Experience with multiple data integration technologies and cloud platforms.
Knowledge of Change & Incident Management process.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way; examples of such recruitment scams include being asked to purchase a 'starter kit', pay to apply, or purchase equipment or training.
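To make the pipeline-building responsibilities above more concrete, here is a minimal, non-authoritative PySpark batch ETL sketch: it reads raw CSV from S3, applies basic cleansing, and writes partitioned Parquet to a curated data-lake zone. The bucket paths, column names, and transformations are illustrative assumptions rather than details from the posting; an AWS Glue job would typically wrap comparable logic in its own job script.

```python
# Minimal PySpark batch ETL sketch: S3 CSV -> cleaned, partitioned Parquet.
# Bucket names, paths, and columns are illustrative placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: read raw CSV files from the landing zone
raw = spark.read.option("header", True).csv("s3://example-landing-zone/orders/")

# Transform: basic cleansing, typing, and derivation of a partition column
cleaned = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("order_id").isNotNull())
       .withColumn("amount", F.col("amount").cast("double"))
       .withColumn("order_date", F.to_date("order_ts"))
)

# Load: write partitioned Parquet to the curated zone of the data lake
(cleaned.write
        .mode("overwrite")
        .partitionBy("order_date")
        .parquet("s3://example-curated-zone/orders/"))

spark.stop()
```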
Posted 4 days ago
6.0 - 11.0 years
17 - 30 Lacs
Hyderabad/Secunderabad, Bangalore/Bengaluru, Delhi / NCR
Hybrid
Inviting applications for the role of Lead Consultant-Data Engineer, Azure+Python!

Responsibilities
Hands-on experience with Azure, PySpark, and Python with Kafka.
Monitor and optimize the performance of cloud resources to ensure efficient utilization and cost-effectiveness.
Implement and maintain security measures to protect data and systems within the Azure environment, including IAM policies, security groups, and encryption mechanisms.
Develop application programs using Big Data technologies like Apache Hadoop, Apache Spark, etc.
Build data pipelines by building ETL processes (Extract-Transform-Load).
Implement backup, disaster recovery, and business continuity strategies for cloud-based applications and data.
Responsible for analysing business and functional requirements, which involves a review of existing system configurations and operating methodologies as well as understanding evolving business needs.
Analyse requirements/user stories at business meetings and strategize the impact of requirements on different platforms/applications; convert the business requirements into technical requirements.
Participate in design reviews to provide input on functional requirements, product designs, schedules and/or potential problems.
Understand current application infrastructure and suggest cloud-based solutions that reduce operational cost, require minimal maintenance, and provide high availability with improved security.
Perform unit testing on the modified software to ensure that the new functionality works as expected while existing functionalities continue to work in the same way.
Coordinate with release management and other supporting teams to deploy changes in the production environment.

Qualifications we seek in you!
Minimum Qualifications
Experience in designing and implementing data pipelines, building data applications, and performing data migration on Azure.
Experience with Databricks will be an added advantage.
Strong experience in Python and SQL.
Strong understanding of security principles and best practices for cloud-based environments.
Experience with monitoring tools and implementing proactive measures to ensure system availability and performance.
Excellent problem-solving skills and ability to troubleshoot complex issues in a distributed, cloud-based environment.
Strong communication and collaboration skills to work effectively with cross-functional teams.

Preferred Qualifications/Skills
Master's degree in Computer Science, Electronics, or Electrical Engineering.
Azure Data Engineering & Cloud certifications, Databricks certifications.
Experience of working with Oracle ERP.
Experience with multiple data integration technologies and cloud platforms.
Knowledge of Change & Incident Management process.
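As a rough illustration of the Azure/PySpark/Kafka combination called for above, the sketch below uses Spark Structured Streaming to consume JSON events from a Kafka topic and append them to Parquet storage with checkpointing. The broker address, topic, schema, and storage paths are hypothetical, and running it requires the Spark Kafka connector (spark-sql-kafka) on the cluster, for example a Databricks Spark environment.

```python
# Minimal Spark Structured Streaming sketch: consume JSON events from Kafka
# and append them to a Parquet location. Broker, topic, paths, and the event
# schema are illustrative placeholders, not values from the job posting.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("kafka-events-stream").getOrCreate()

event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("user_id", StringType()),
    StructField("amount", DoubleType()),
])

# Read the raw Kafka stream; 'value' arrives as bytes
stream = (spark.readStream.format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "events")
          .load())

# Parse the JSON payload into typed columns
events = (stream.selectExpr("CAST(value AS STRING) AS json")
                .select(F.from_json("json", event_schema).alias("e"))
                .select("e.*"))

# Append to storage with checkpointing for fault tolerance
query = (events.writeStream.format("parquet")
               .option("path", "/mnt/datalake/events/")
               .option("checkpointLocation", "/mnt/datalake/_checkpoints/events/")
               .outputMode("append")
               .start())

query.awaitTermination()
```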
Posted 4 days ago
0 years
0 Lacs
Mumbai, Maharashtra, India
Remote
About This Role About this role Are you interested in building innovative technology that shapes the financial markets? Do you like working at the speed of a startup, but want to solve some of the world’s most complex problems? Do you want to work with, and learn from, hands-on leaders in technology and finance? At BlackRock, we are looking for Software Engineers who like to innovate and solve complex problems. We recognize that strength comes from diversity, and will embrace your unique skills, curiosity, drive, and passion while giving you the opportunity to grow technically and as an individual. With over USD $11+ trillion of assets we have an exceptional responsibility: our technology empowers millions of investors to save for retirement, pay for college, buy a home, and improve their financial wellbeing. Being a developer at BlackRock means you get the best of both worlds: working for one of the most advanced financial companies and being part of a software development team responsible for next generation technology and solutions. What is Aladdin and Aladdin Engineering (AE)? You will be working on BlackRock's investment operating system Aladdin. Aladdin is used both internally and externally by many financial institutions. Aladdin combines sophisticated risk analytics with comprehensive portfolio management, trading and operations tools on a single platform to power informed decision-making and create a connective tissue for thousands of users investing worldwide. Our development team sits inside Aladdin Engineering. We collaboratively build the next generation of technology that changes the way information, people, and technology intersect for global investment firms. We build and package tools that manage trillions in assets and supports millions of financial instruments. We perform risk calculations and process millions of transactions for thousands of users every day worldwide! The open position is in the Investment and Trading Engineering team within Aladdin Engineering. The team is on a transformational journey from a mature set of applications to an integrated persona-based platform with streamlined user workflows and a high degree of automation and scale. The team works closely with BlackRock portfolio managers, traders and investment compliance officers and delivers to external clients. They also partner closely with world class AI research and engineering teams, product managers, UX designers, quality assurance engineers, and client support teams to deliver high quality, scalable and resilient capabilities. Being a member of investment and trading engineering you will be: Tenacious: Work in a fast paced and highly complex environment Creative thinker: Analyze multiple solutions and deploy technologies in a flexible way. Great teammate: Think and work collaboratively and communicate effectively. Quick learner: Pick up new concepts and apply them quickly. Responsibilities Take ownership of individual project priorities, deadlines and deliverables using AGILE methodologies. Deliver high efficiency, high availability, concurrent and fault tolerant software systems. Contribute to development of Aladdin’s global, multi-asset trading platform. Provide impact and expertise as an individual contributor to greenfield work developing the streaming capabilities of the portfolio management system. Work with product management, business users and QA to deliver the roadmap. Design and develop innovative solutions to complex problems, identifying issues and roadblocks. 
Demonstrate vision when brainstorming solutions for team productivity, efficiency, guiding and motivating developers. Qualifications A degree in Computer Science, or Computer Engineering Years of hands-on experience in Golang Years of hands-on experience implementing large scale distributed systems Good understanding of concurrent programming and design of high throughput, high availability, fault-tolerant distributed applications and databases Strong interest in distributed systems, infrastructure services, cloud technology and Kubernetes Prior experience in building distributed applications using Golang Prior experience with Redis is a plus Prior experience with stream processors is a plus Prior experience with message broker technology such as Kafka Excellent analytical and software architecture design skills, with an emphasis on test-driven development. Effective communication and presentation skills, both written and verbal. Our Benefits To help you stay energized, engaged and inspired, we offer a wide range of benefits including a strong retirement plan, tuition reimbursement, comprehensive healthcare, support for working parents and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about. Our hybrid work model BlackRock’s hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person – aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock. About BlackRock At BlackRock, we are all connected by one mission: to help more and more people experience financial well-being. Our clients, and the people they serve, are saving for retirement, paying for their children’s educations, buying homes and starting businesses. Their investments also help to strengthen the global economy: support businesses small and large; finance infrastructure projects that connect and power cities; and facilitate innovations that drive progress. This mission would not be possible without our smartest investment – the one we make in our employees. It’s why we’re dedicated to creating an environment where our colleagues feel welcomed, valued and supported with networks, benefits and development opportunities to help them thrive. For additional information on BlackRock, please visit @blackrock | Twitter: @blackrock | LinkedIn: www.linkedin.com/company/blackrock BlackRock is proud to be an Equal Opportunity Employer. We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation and other protected attributes at law. Show more Show less
Posted 4 days ago
5.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
TransUnion's Job Applicant Privacy Notice

What We'll Bring
We are India's leading credit information company with one of the largest collections of consumer information. We aim to be more than just a credit reporting agency. We are a sophisticated, global risk information provider striving to use information for good. We take immense pride in playing a pivotal role in catalyzing the BFSI industry in the country. We got here by tapping into our excitement and passion of wanting to make a difference in the lives of our clients and consumers. What is excitement and passion for us? We define it as a blend of curiosity, the ability to unlearn and yet continuously learn, the ability to connect with meaning and, finally, the drive to execute ideas till the last mile is achieved. This passion helps us focus on continuous improvement, creative problem solving and collaboration, which ensures delivery excellence.

What You'll Bring
The incumbent will play a vital role in setting up the data analytics technology practice in TransUnion UK. This practice will have functions like Data Engineering, Data Research, Data Visualization, Analytics Systems and Innovation Culture. This role will be a bridge between GT and other verticals like ISG and DSA.
Accelerating and transforming data analytics technology by helping TransUnion build more trust using advanced data analytics.
Set up the practice, which will include functions like Data Engineering, Data Analytics Research, Data Visualization, Analytics Systems and Innovation Culture.
Work closely with the CIO to plan the data analytics technology roadmap for the next 5 years.
Work with key stakeholders to ensure alignment of the objectives across business units and the action plans required to achieve the same.
Work with GT and other leadership and align functions and roadmap to achieve business goals.
Ensure cloud readiness of data and systems and plan the roadmap for analytics in the cloud.
The incumbent will lead the development of big data capabilities and utilization as well as the coordination.
Set up and manage a team of Data Engineers, ML Engineers, Data Analysts, Data Admins and Architects.
Managing budgets, projects, people, stakeholders, and vendors is an integral part of this role.
Prepare the machine learning roadmap with AutoML, ML pipelines and advanced analytical tools.
Local analytical system (capacity planning, costing, setup, tuning, management, user management, operations, cloud readiness planning).

Impact You'll Make
5+ years of IT experience with 4+ years of relevant experience in Data Analytics Technology.
Experience with the Big Data technology stack, such as Hive, Spark, Pig, Sqoop, Kafka, Flume, etc.
Experience in Data Architecture and Data Governance.
Experience in managing a team of Data Engineers, ML Engineers, Data Analysts.
Experience in costing, capacity planning, and architecture of an advanced data analytics ecosystem.
Experience in working across geographies, regions, countries.
Experience in the Data Engineering, Data Visualization and Data Analytics tech stack.

This is a hybrid position and involves regular performance of job responsibilities virtually as well as in-person at an assigned TU office location for a minimum of two days a week.

TransUnion Job Title
Sr Analyst, Data Science and Analytics
Posted 4 days ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Description Summary
Designs, develops, tests, debugs and implements more complex operating systems components, software tools, and utilities with full competency. Coordinates with users to determine requirements. Reviews systems under development and related documentation. Makes more complex modifications to existing software to fit specialized needs and configurations, and maintains program libraries and technical documentation. May coordinate activities of the project team and assist in monitoring project schedules and costs.

Essential Duties And Responsibilities
Lead and manage configuration, maintenance, and support of a portfolio of AI models and related products.
Manage model delivery to the production deployment team and coordinate model production deployments.
Ability to analyze complex data requirements, understand exploratory data analysis, and design solutions that meet business needs.
Work on analyzing data profiles, transformation, quality and security with the dev team to build and enhance data pipelines while maintaining proper quality and control around the data sets.
Work closely with cross-functional teams, including business analysts, data engineers, and domain experts.
Understand business requirements and translate them into technical solutions.
Understand and review the business use cases for data pipelines for the Data Lake, including ingestion, transformation and storing in the Lakehouse.
Present architecture and solutions to executive-level stakeholders.

Minimum Qualifications
Bachelor's or master's degree in Computer Science, Engineering, or a related technical field.
Minimum of 5 years' experience in building data pipelines for both structured and unstructured data.
At least 2 years' experience in Azure data pipeline development.
Preferably 3 or more years' experience with Hadoop, Azure Databricks, Stream Analytics, Event Hubs, Kafka, and Flink.
Strong proficiency in Python and SQL.
Experience with big data technologies (Spark, Hadoop, Kafka).
Familiarity with ML frameworks (TensorFlow, PyTorch, scikit-learn).
Knowledge of model serving technologies (TensorFlow Serving, MLflow, KubeFlow) will be a plus.
Experience with one of the cloud platforms (Azure preferred) and their data services. An understanding of ML services is preferred.
Understanding of containerization and orchestration (Docker, Kubernetes).
Experience with data versioning and ML experiment tracking will be a great addition.
Knowledge of distributed computing principles.
Familiarity with DevOps practices and CI/CD pipelines.

Preferred Qualifications
Bachelor's degree in Computer Science or equivalent work experience.
Experience with Agile/Scrum methodology.
Experience in the tax and accounting domain a plus.
Azure Data Scientist certification a plus.

Applicants may be required to appear onsite at a Wolters Kluwer office as part of the recruitment process.
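Since the role involves managing a portfolio of AI models and MLflow is named as a plus, here is a minimal, assumed sketch of MLflow experiment tracking: training a small scikit-learn model and logging its parameters, metric, and model artifact so deliveries can be versioned and compared. The dataset and hyperparameters are placeholders, not anything specified by the posting.

```python
# Minimal MLflow experiment-tracking sketch: train a small scikit-learn model
# and log its parameters, metrics, and artifact. Dataset and parameter values
# are illustrative only.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=500, n_features=10, noise=0.2, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

with mlflow.start_run(run_name="rf-baseline"):
    params = {"n_estimators": 100, "max_depth": 8}
    model = RandomForestRegressor(**params, random_state=42).fit(X_train, y_train)

    mse = mean_squared_error(y_test, model.predict(X_test))

    mlflow.log_params(params)                 # hyperparameters used for this run
    mlflow.log_metric("mse", mse)             # evaluation metric for comparison
    mlflow.sklearn.log_model(model, "model")  # versioned model artifact
```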
Posted 4 days ago
6.0 - 12.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Description We are in need of a driven Scala Developer to join our dynamic team at GlobalLogic. In this role, you will have the groundbreaking opportunity to work on outstanding projects that innovate the future of technology. You will collaborate with our world-class engineers to deliver flawless solutions and compete in a crafty, innovative environment. Requirements Minimum 6-12 years of software development experience Scala Language Mastery Strong understanding of both functional and object-oriented programming paradigms. Deep knowledge of: Immutability, lazy evaluation Traits, case classes, companion objects Pattern matching Advanced type system: generics, type bounds, implicits, context bounds 🔹 Functional Programming (FP) Hands-on experience with: Pure functions, monads, functors, higher-kinded types FP libraries: Cats, Scalaz, or ZIO Understanding of effect systems and referential transparency 📦 Frameworks & Libraries 🔹 Backend / API Development RESTful API development using: Play Framework Akka HTTP Experience with GraphQL is a plus 🔹 Concurrency & Asynchronous Programming Deep understanding of: Futures, Promises Akka actors, Akka Streams ZIO or Cats Effect 🛠️ Build, Tooling & DevOps SBT for project building and dependency management Familiarity with Git, Docker, and Kubernetes CI/CD experience with Jenkins, GitHub Actions, or similar tools Comfortable with Linux command line and shell scripting 🗄️ Database & Data Systems Strong experience with: SQL databases: PostgreSQL, MySQL NoSQL databases: Cassandra, MongoDB Streaming/data pipelines: Kafka, Spark (with Scala) ORM / FP database libraries: Slick, Doobie 🧱 Architecture & System Design Microservices architecture design and deployment Event-driven architecture Familiarity with Domain-Driven Design (DDD) Designing for scalability, fault tolerance, and observability 🧪 Testing & Quality Experience with testing libraries: ScalaTest, Specs2, MUnit ScalaCheck for property-based testing Test-driven development (TDD) and behavior-driven development (BDD) 🌐 Cloud & Infrastructure (Desirable) Deploying Scala apps on: AWS (e.g., EC2, Lambda, ECS, RDS) GCP or Azure Experience with infrastructure-as-code (Terraform, CloudFormation) is a plus 🧠 Soft Skills & Leadership Mentorship: Ability to coach junior developers Code reviews: Ensure code quality and consistency Communication: Work cross-functionally with product managers, DevOps, QA Agile development: Experience with Scrum/Kanban Ownership: Capable of taking features from design to production ⚡ Optional (but Valuable) Scala.js / Scala Native experience Machine Learning with Scala (e.g., Spark MLlib) Exposure to Kotlin, Java, or Python Job responsibilities As a Scala Developer/ Big Data Engineer, you will: – Develop, test, and deploy high-quality Scala applications. – Implement functional and object-oriented programming paradigms. – Ensure code quality through immutability, lazy evaluation, and pattern matching. – Craft and build scalable systems using traits, case classes, and companion objects. – Collaborate with cross-functional teams to determine project requirements and deliver solutions successfully. – Troubleshoot and resolve complex technical issues. – Participate in code reviews to maintain our high standards of quality What we offer Culture of caring. At GlobalLogic, we prioritize a culture of caring. Across every region and department, at every level, we consistently put people first. 
From day one, you’ll experience an inclusive culture of acceptance and belonging, where you’ll have the chance to build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders. Learning and development. We are committed to your continuous learning and development. You’ll learn and grow daily in an environment with many opportunities to try new things, sharpen your skills, and advance your career at GlobalLogic. With our Career Navigator tool as just one example, GlobalLogic offers a rich array of programs, training curricula, and hands-on opportunities to grow personally and professionally. Interesting & meaningful work. GlobalLogic is known for engineering impact for and with clients around the world. As part of our team, you’ll have the chance to work on projects that matter. Each is a unique opportunity to engage your curiosity and creative problem-solving skills as you help clients reimagine what’s possible and bring new solutions to market. In the process, you’ll have the privilege of working on some of the most cutting-edge and impactful solutions shaping the world today. Balance and flexibility. We believe in the importance of balance and flexibility. With many functional career areas, roles, and work arrangements, you can explore ways of achieving the perfect balance between your work and life. Your life extends beyond the office, and we always do our best to help you integrate and balance the best of work and life, having fun along the way! High-trust organization. We are a high-trust organization where integrity is key. By joining GlobalLogic, you’re placing your trust in a safe, reliable, and ethical global company. Integrity and trust are a cornerstone of our value proposition to our employees and clients. You will find truthfulness, candor, and integrity in everything we do. About GlobalLogic GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner to the world’s largest and most forward-thinking companies. Since 2000, we’ve been at the forefront of the digital revolution – helping create some of the most innovative and widely used digital products and experiences. Today we continue to collaborate with clients in transforming businesses and redefining industries through intelligent products, platforms, and services. Show more Show less
Posted 4 days ago
5.0 - 7.0 years
16 - 27 Lacs
Bengaluru
Work from Office
We're Nagarro. We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale across all devices and digital mediums, and our people exist everywhere in the world (18000+ experts across 38 countries, to be exact). Our work culture is dynamic and non-hierarchical. We're looking for great new colleagues. That's where you come in! REQUIREMENTS: Total experience 5+ years Extensive experience in back-end development utilizing Java 8 or higher, Spring Framework (Core/Boot/MVC), Hibernate/JPA, and Microservices Architecture. Experience with messaging systems like Kafka. Hands-on experience with REST APIs, Caching system (e.g Redis) etc. Proficiency in Service-Oriented Architecture (SOA) and Web Services (Apache CXF, JAX-WS, JAX-RS, SOAP, REST). Hands-on experience with multithreading, and cloud development. Strong working experience in Data Structures and Algorithms, Unit Testing, and Object-Oriented Programming (OOP) principles. Hands-on experience with relational databases such as SQL Server, Oracle, MySQL, and PostgreSQL. Experience with DevOps tools and technologies such as Ansible, Docker, Kubernetes, Puppet, Jenkins, and Chef. Proficiency in build automation tools like Maven, Ant, and Gradle. Hands on experience on cloud technologies such as AWS/ Azure. Strong understanding of UML and design patterns. Ability to simplify solutions, optimize processes, and efficiently resolve escalated issues. Strong problem-solving skills and a passion for continuous improvement. Excellent communication skills and the ability to collaborate effectively with cross-functional teams. Enthusiasm for learning new technologies and staying updated on industry trends. RESPONSIBILITIES: Writing and reviewing great quality code Understanding functional requirements thoroughly and analyzing the client’s needs in the context of the project Envisioning the overall solution for defined functional and non-functional requirements, and being able to define technologies, patterns and frameworks to realize it Determining and implementing design methodologies and tool sets Enabling application development by coordinating requirements, schedules, and activities. Being able to lead/support UAT and production roll outs Creating, understanding and validating WBS and estimated effort for given module/task, and being able to justify it Addressing issues promptly, responding positively to setbacks and challenges with a mindset of continuous improvement Giving constructive feedback to the team members and setting clear expectations. Helping the team in troubleshooting and resolving of complex bugs Coming up with solutions to any issue that is raised during code/design review and being able to justify the decision taken Carrying out POCs to make sure that suggested design/technologies meet the requirements
Posted 4 days ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
We are looking for an immediate joiner and experienced Big Data Developer with a strong background in Kafka, PySpark, Python/Scala, Spark, SQL, and the Hadoop ecosystem. The ideal candidate should have over 5 years of experience and be ready to join immediately. This role requires hands-on expertise in big data technologies and the ability to design and implement robust data processing solutions.

Responsibilities
Design, develop, and maintain scalable data processing pipelines using Kafka, PySpark, Python/Scala, and Spark.
Work extensively with the Kafka and Hadoop ecosystem, including HDFS, Hive, and other related technologies.
Write efficient SQL queries for data extraction, transformation, and analysis.
Implement and manage Kafka streams for real-time data processing.
Utilize scheduling tools to automate data workflows and processes.
Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions.
Ensure data quality and integrity by implementing robust data validation processes.
Optimize existing data processes for performance and scalability.

Requirements
Experience with GCP.
Knowledge of data warehousing concepts and best practices.
Familiarity with machine learning and data analysis tools.
Understanding of data governance and compliance standards.

This job was posted by Arun Kumar K from krtrimaIQ Cognitive Solutions.
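To illustrate the kind of Kafka-based validation work described above, here is a minimal sketch using the kafka-python client: it consumes JSON order events and applies a simple data-quality check before handing records downstream. The topic, broker address, and field names are illustrative assumptions only.

```python
# Minimal Kafka consumer sketch (kafka-python client): read JSON order events
# and apply a basic validation step before further processing. Topic, broker,
# and field names are illustrative placeholders.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "orders",
    bootstrap_servers="localhost:9092",
    group_id="order-validators",
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

REQUIRED_FIELDS = {"order_id", "customer_id", "amount"}

def is_valid(event: dict) -> bool:
    """Basic data-quality check: required fields present and amount positive."""
    return (REQUIRED_FIELDS.issubset(event)
            and isinstance(event["amount"], (int, float))
            and event["amount"] > 0)

for message in consumer:
    event = message.value
    if is_valid(event):
        # Hand off to the downstream pipeline (e.g. write to HDFS/Hive)
        print(f"valid order {event['order_id']}")
    else:
        # Route bad records to a quarantine area for inspection
        print(f"invalid record at offset {message.offset}")
```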
Posted 4 days ago
5.0 - 8.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
At Nielsen, we are passionate about our work to power a better media future for all people by providing powerful insights that drive client decisions and deliver extraordinary results. Our talented, global workforce is dedicated to capturing audience engagement with content - wherever and whenever it’s consumed. Together, we are proudly rooted in our deep legacy as we stand at the forefront of the media revolution. When you join Nielsen, you will join a dynamic team committed to excellence, perseverance, and the ambition to make an impact together. We champion you, because when you succeed, we do too. We enable your best to power our future. About the Role As a Senior Software Engineer, you will be working within an international group of teams that span multiple time zones This group is responsible for creating and managing reference data from all the broadcast and streaming stations across the continental US for Television audience measurement Your primary objective is to ensure project goals are achieved and are aligned with business objectives. You will also work closely with your Scrum team and program team to test, develop, refine and implement quality software in production via standard Agile methodologies Responsibilities:- Design, code, and test iteratively to support microservices and container based applications on AWS Plan, develop, execute and maintain automated unit, functional test cases; develop modular, robust, and maintainable automation scripts to integrate with the CICD process. Leverage modern design patterns and architectural principles to build platform reusable code and components that can be used across projects and teams Must have strong analytical and technical skills with passion to deep dive on data in troubleshooting, devise techniques for problem resolution Support product owner in defining future stories and tech lead in defining technical requirements for new initiatives Build platform reusable code and components that could be used by multiple project teams Promote a culture of best practices with peer code reviews and extreme ownership for continuous incremental delivery. Collaborate with cross-functional teams and stakeholders to align development objectives with broader business goals Support any Production issues that may arise and collaborate with the Product owners to prioritize any enhancements to fix failure modes Key Skills:- 5-8 years of hands-on software development with a Bachelor’s degree in computer science Must have very good knowledge of microservices and event based architectural principles Must have the ability to provide solutions utilizing best practices for resilience, scalability, cloud optimization and security 3-5 years of experience in any of the following languages: Java, Go, Python Experience developing cloud-hosted (AWS) containerized applications and services on Kubernetes Knowledge of streaming based applications using Apache Kafka Hands-on experience with the following AWS Components: Managed Streaming for Apache Kafka (MSK), EKS, EC2, S3 storage, Lambda, Relational Database Service, Simple Notification Service (SNS) Demonstrates knowledge of CI/CD processes, testing frameworks, practices and tools (GitLab, jUnit, Terraform, JFrog, Jacoco, SonarQube, etc.) 
Knowledge of Infrastructure creation in the Cloud using Terraform or Cloud formation Familiarity of Linux platforms with knowledge of shell scripting Sound problem-solving skills with the ability to process complex information, articulate and present it clearly Passion to research and conduct POCs to optimize solutions Please be aware that job-seekers may be at risk of targeting by scammers seeking personal data or money. Nielsen recruiters will only contact you through official job boards, LinkedIn, or email with a nielsen.com domain. Be cautious of any outreach claiming to be from Nielsen via other messaging platforms or personal email addresses. Always verify that email communications come from an @ nielsen.com address. If you're unsure about the authenticity of a job offer or communication, please contact Nielsen directly through our official website or verified social media channels. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, protected veteran status or other characteristics protected by law. Show more Show less
Posted 4 days ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
The Opportunity "We are seeking a senior software engineer to undertake a range of feature development tasks that continue the evolution of our DMP Streaming product. You will demonstrate the required potential and technical curiosity to work on software that utilizes a range of leading edge technologies and integration frameworks. Given your depth of experience, we also want you to technically guide more junior members of the team, instilling both good engineering practices and inspiring them to grow" What You'll Contribute Implement product changes, undertaking detailed design, programming, unit testing and deployment as required by our SDLC process Investigate and resolve reported software defects across supported platforms Work in conjunction with product management to understand business requirements and convert them into effective software designs that will enhance the current product offering Produce component specifications and prototypes as necessary Provide realistic and achievable project estimates for the creation and development of solutions. This information will form part of a larger release delivery plan Develop and test software components of varying size and complexity Design and execute unit, link and integration test plans, and document test results. Create test data and environments as necessary to support the required level of validation Work closely with the quality assurance team and assist with integration testing, system testing, acceptance testing, and implementation Produce relevant system documentation Participate in peer review sessions to ensure ongoing quality of deliverables. Validate other team members' software changes, test plans and results Maintain and develop industry knowledge, skills and competencies in software development What We're Seeking A Bachelor’s or Master’s degree in Computer Science, Engineering, or related field 10+ Java software development experience within an industry setting Ability to work in both Windows and UNIX/Linux operating systems Detailed understanding of software and testing methods Strong foundation and grasp of design models and database structures Proficient in Kubernetes, Docker, and Kustomize Exposure to the following technologies: Apache Storm, MySQL or Oracle, Kafka, Cassandra, OpenSearch, and API (REST) development Familiarity with Eclipse, Subversion and Maven Ability to lead and manage others independently on major feature changes Excellent communication skills with the ability to articulate information clearly with architects, and discuss strategy/requirements with team members and the product manager Quality-driven work ethic with meticulous attention to detail Ability to function effectively in a geographically-diverse team Ability to work within a hybrid Agile methodology Understand the design and development approaches required to build a scalable infrastructure/platform for large amounts of data ingestion, aggregation, integration and advanced analytics Experience of developing and deploying applications into AWS or a private cloud Exposure to any of the following: Hadoop, JMS, Zookeeper, Spring, JavaScript, Angular, UI Development Our Offer to You An inclusive culture strongly reflecting our core values: Act Like an Owner, Delight Our Customers and Earn the Respect of Others. The opportunity to make an impact and develop professionally by leveraging your unique strengths and participating in valuable learning experiences. 
Highly competitive compensation, benefits and rewards programs that encourage you to bring your best every day and be recognized for doing so. An engaging, people-first work environment offering work/life balance, employee resource groups, and social events to promote interaction and camaraderie.
Posted 4 days ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Introduction A career in IBM Consulting is rooted by long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio; including Software and Red Hat. Curiosity and a constant quest for knowledge serve as the foundation to success in IBM Consulting. In your role, you'll be encouraged to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in ground breaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and development opportunities in an environment that embraces your unique skills and experience Your Role And Responsibilities As an Associate Software Developer at IBM, you'll work with clients to co-create solutions to major real-world challenges by using best practice technologies, tools, techniques, and products to translate system requirements into the design and development of customized systems Preferred Education Master's Degree Required Technical And Professional Expertise Strong proficiency in Java, Spring Framework, Spring boot, RESTful APIs. Excellent understanding of OOP, Design Patterns. Strong knowledge of ORM tools like Hibernate or JPA Java based Micro-services framework. Hands on experience on Spring boot Microservices Strong knowledge of micro-service logging, monitoring, debugging and testing Preferred Technical And Professional Experience In-depth knowledge of relational databases (e.g., MySQL) experience in container platforms such as Docker and Kubernetes Experience in messaging platforms such as Kafka or IBM MQ Good understanding of Test-Driven-Development Show more Show less
Posted 4 days ago
6.0 - 10.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About Us Acceldata is the market leader in Enterprise Data Observability. Founded in 2018, Silicon Valley-based Acceldata has developed the world's first Enterprise Data Observability Platform to help build and operate great data products. Enterprise Data Observability is at the intersection of today’s hottest and most crucial technologies such as AI, LLMs, Analytics, and DataOps. Acceldata provides mission-critical capabilities that deliver highly trusted and reliable data to power enterprise data products. Delivered as a SaaS product, Acceldata's solutions have been embraced by global customers, such as HPE, HSBC, Visa, Freddie Mac, Manulife, Workday, Oracle, PubMatic, PhonePe (Walmart), Hersheys, Dun & Bradstreet, and many more. Acceldata is a Series-C funded company whose investors include Insight Partners, March Capital, Lightspeed, Sorenson Ventures, Industry Ventures, and Emergent Ventures. About the Role: We are looking for an experienced Lead SDET for our ODP, specializing in ensuring the quality and performance of large-scale data systems. In this role, you will work closely with development and operations teams to design and execute comprehensive test strategies for Open Source Data Platform (ODP) , including Hadoop, Spark, Hive, Kafka, and other related technologies. You will focus on test automation, performance tuning, and identifying bottlenecks in distributed data systems. Your key responsibilities will include writing test plans, creating automated test scripts, and conducting functional, regression, and performance testing. You will be responsible for identifying and resolving defects, ensuring data integrity, and improving testing processes. Strong collaboration skills are essential as you will be interacting with cross-functional teams and driving quality initiatives. Your work will directly contribute to maintaining high-quality standards for big data solutions and enhancing their reliability at scale. You are a great fit for this role if you have Proven expertise in Quality Engineering, with a strong background in test automation, performance testing, and defect management across multiple data platforms. A proactive mindset to define and implement comprehensive test strategies that ensure the highest quality standards are met. Experience in working with both functional and non-functional testing, with a particular focus on automated test development. A collaborative team player with the ability to effectively work cross-functionally with development teams to resolve issues and deliver timely fixes. Strong communication skills with the ability to mentor junior engineers and share knowledge to improve testing practices across the team. A commitment to continuous improvement, with the ability to analyze testing processes and recommend enhancements to align with industry best practices. Ability to quickly learn new technologies What We Look For 6-10 years of hands-on experience in quality engineering and quality assurance, focusing on test automation, performance testing, and defect management across multiple data platforms Proficiency in programming languages such as Java, Python, or Scala for writing test scripts and automating test cases with hands-on experience in developing automated tests using other test automation frameworks, ensuring robust and scalable test suites. 
Proven ability to define and execute comprehensive test strategies, including writing test plans, test cases, and scripts for both functional and non-functional testing to ensure predictable delivery of high-quality products and solutions.
Experience with version control systems like Git and CI/CD tools such as Jenkins or GitLab CI to manage code changes and automate test execution within the development pipeline.
Expertise in identifying, tracking, and resolving defects and issues, collaborating closely with developers and product teams to ensure timely fixes.
Strong communication skills with the ability to work cross-functionally with development teams and mentor junior team members to improve testing practices and tools.
Ability to analyze testing processes, recommend improvements, and ensure the testing environment aligns with industry best practices, contributing to the overall quality of software.

Acceldata is an equal-opportunity employer
At Acceldata, we are committed to providing equal employment opportunities regardless of job history, disability, gender identity, religion, race, color, caste, marital/parental status, veteran status, or any other special status. We stand against the discrimination of employees and individuals and are proud to be an equitable workplace that welcomes individuals from all walks of life if they fit the designated roles and responsibilities.

Working at Acceldata is all about working with some of the best minds in the industry and experiencing a culture that values an 'out-of-the-box' mindset. If you want to push boundaries, learn continuously, and grow to be the best version of yourself, Acceldata is the place to be!
Posted 4 days ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
React Developer Location - Bangalore (4 Days WFO) Experience Level - 5+ yrs Notice Period - Immediate to 15 days Role & responsibilities Minimum 5 years of coding experience in ReactJS (TypeScript), HTML, CSS-Pre-processors, or CSS-in-JS in creating Enterprise Applications with high performance for Responsive Web Applications. Developing and implementing highly responsive user interface components using React concepts. (self-contained, reusable, and testable modules and components) Architecting and automating the build process for production, using task runners or scripts Knowledge of Data Structures for TypeScript. Monitoring and improving front-end performance. Banking or Retail domains knowledge is good to have. Hands on experience in performance tuning, debugging, monitoring. Technical Skills Excellent knowledge in development and testing scalable and highly available Restful APIs / Microservices using Javascript technologies Able to create end to end Automation test suites using Playwright / Selenium preferably using BDD approach. Practical experience with GraphQL. Well versed with CI/CD principles, and actively involved in solving, troubleshooting issues in distributed services ecosystem Understanding of containerization, experienced in Dockers , Kubernetes. Exposed to API gateway integrations like 3Scale. Understanding of Single-Sign-on or token based authentication (Rest, JWT, oAuth) Possess expert knowledge of task/message queues including but not limited to: AWS, Microsoft Azure, Pushpin and Kafka Show more Show less
Posted 4 days ago
Kafka, a popular distributed streaming platform, has gained significant traction in the tech industry in recent years. Job opportunities for Kafka professionals in India have been on the rise, with many companies looking to leverage Kafka for real-time data processing and analytics. If you are a job seeker interested in Kafka roles, here is a comprehensive guide to help you navigate the job market in India.
Tech hubs such as Bengaluru, Hyderabad, Pune, Mumbai, and the Delhi NCR region are known for their thriving tech industries and have a high demand for Kafka professionals.
The average salary range for Kafka professionals in India varies based on experience levels. Entry-level positions may start at around INR 6-8 lakhs per annum, while experienced professionals can earn between INR 12-20 lakhs per annum.
Career progression in Kafka typically follows a path from Junior Developer to Senior Developer, and then to a Tech Lead role. As you gain more experience and expertise in Kafka, you may also explore roles such as Kafka Architect or Kafka Consultant.
In addition to Kafka expertise, employers often look for professionals with skills in:
- Apache Spark
- Apache Flink
- Hadoop
- Java/Scala programming
- Data engineering and data architecture
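For a concrete flavour of everyday Kafka work, here is a minimal producer sketch using the kafka-python client; the broker address, topic name, and event fields are placeholders and not tied to any specific employer's stack.

```python
# Minimal Kafka producer sketch (kafka-python client): publish JSON events to
# a topic for downstream real-time consumers. Broker and topic names are
# illustrative placeholders.
import json
import time
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda obj: json.dumps(obj).encode("utf-8"),
)

for i in range(5):
    event = {"event_id": i, "ts": time.time(), "status": "ok"}
    producer.send("clickstream", value=event)  # asynchronous publish

producer.flush()  # ensure buffered records reach the broker
producer.close()
```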
As you explore Kafka job opportunities in India, remember to showcase your expertise in Kafka and related skills during interviews. Prepare thoroughly, demonstrate your knowledge confidently, and stay updated with the latest trends in Kafka to excel in your career as a Kafka professional. Good luck with your job search!