
315 Cassandra Jobs - Page 10

JobPe aggregates these listings for easy access; applications are submitted directly on the original job portal.

2.0 - 5.0 years

12 - 17 Lacs

Gurugram

Work from Office


Learn more about our culture and how we make our employees happier through The Sprinklr Way.

Job Description
- Lead Design and Architecture: Architect, design, and develop highly scalable and robust software solutions, leveraging your deep technical expertise in backend systems.
- Ownership of Full Project Lifecycle: Drive projects from conception through implementation, testing, deployment, and maintenance.
- Mentor and Develop Talent: Guide and mentor junior engineers, promoting best practices in architecture, design, coding standards, and agile methodologies.
- Collaboration Across Teams: Work with geographically distributed teams in a highly collaborative, ever-evolving environment; engage with product managers, engineers, and business leaders to deliver impactful products.
- Adhere to Engineering Standards: Ensure adherence to standard software engineering practices and advocate for coding excellence and continuous improvement.
- Drive Innovation: Push boundaries with creative solutions and adopt emerging technologies that can solve complex problems or enhance platform performance.
- Communicate Technical Decisions: Clearly articulate architectural challenges, solutions, and opportunities to technical and non-technical stakeholders, including senior leadership.

Skills Required
- Educational Background: B.Tech or M.Tech in Computer Science or equivalent, with an excellent academic record.
- Extensive Development Experience: 2-5+ years of experience in software development, primarily in backend systems.
- Proficiency in Key Technologies: Expertise in Java, JavaScript, and relevant frameworks; experience working on complex software projects and microservice architecture.
- Data Expertise: Experience working with MySQL, Redis, Cassandra, DynamoDB, MongoDB, Kafka, and similar technologies (see the sketch below).
- Scalable Systems Knowledge: Proven experience in designing, building, and evolving large-scale, complex software projects.
- Cloud Expertise: Hands-on experience in developing scalable cloud-based services on AWS or Azure.
- Problem-Solving Prowess: Strong programming skills with a focus on solving technical challenges and delivering elegant solutions.
- Adaptability and Curiosity: Ability and interest in quickly learning new technologies and adapting to new environments.
- Leadership and Mentorship: A self-starter with a proven ability to lead, mentor, and inspire junior team members.
- Collaboration and Communication: Strong team player with excellent verbal and written communication skills, able to work across geo-distributed teams.

Why You'll Love Sprinklr: We're committed to creating a culture where you feel like you belong, are happier today than you were yesterday, and your contributions matter. At Sprinklr, we passionately, genuinely care. For full-time employees, we provide a range of comprehensive health plans, leading well-being programs, and financial protection for you and your family through global and localized plans throughout the world.
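
As a rough illustration of the data-layer skills this listing asks for, here is a minimal Java sketch that reads one row from Cassandra with the DataStax driver. The keyspace, table, and columns are hypothetical and not part of the posting.

```java
import com.datastax.oss.driver.api.core.CqlSession;
import com.datastax.oss.driver.api.core.cql.ResultSet;
import com.datastax.oss.driver.api.core.cql.Row;

import java.util.UUID;

public class CassandraReadExample {
    public static void main(String[] args) {
        // Contact points and local datacenter come from the driver's configuration file,
        // so the same code runs against a local node or a production cluster.
        try (CqlSession session = CqlSession.builder().withKeyspace("demo_ks").build()) {
            // Parameterised read; "users" and its columns are illustrative only.
            ResultSet rs = session.execute(
                    "SELECT user_id, email FROM users WHERE user_id = ?",
                    UUID.fromString("11111111-1111-1111-1111-111111111111"));
            Row row = rs.one();
            if (row != null) {
                System.out.println(row.getUuid("user_id") + " -> " + row.getString("email"));
            }
        }
    }
}
```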

Posted 3 weeks ago

Apply

2.0 - 4.0 years

7 - 11 Lacs

Bengaluru

Work from Office


Lowe's Companies, Inc. (NYSE: LOW) is a FORTUNE 50 home improvement company serving approximately 16 million customer transactions a week in the United States. With total fiscal year 2024 sales of more than $83 billion, Lowe's operates over 1,700 home improvement stores and employs approximately 300,000 associates. Based in Mooresville, N.C., Lowe's supports the communities it serves through programs focused on creating safe, affordable housing, improving community spaces, helping to develop the next generation of skilled trade experts and providing disaster relief to communities in need. For more information, visit Lowes.com.

About the Team: This role is on the Personalization team, a mix of software engineers, data engineers and data scientists. The team builds signals and segments from customers' real-time and in-session features and generates predictions that help personalize the customer's digital experience, reducing friction and helping customers find the right product more quickly.

Job Summary: The primary purpose of this role is to translate business requirements and functional specifications into logical program designs and to deliver code modules, stable application systems, and software solutions. This includes developing, configuring, or modifying integrated business and/or enterprise application solutions within various computing environments. This role works closely with stakeholders and cross-functional departments to communicate project statuses and proposals.

Core Responsibilities:
- Translates business requirements and specifications into logical program designs, code modules, stable application systems, and software solutions with occasional guidance from senior colleagues; partners with the product team to understand business needs and functional specifications.
- Develops, configures, or modifies integrated business and/or enterprise application solutions within various computing environments by designing and coding component-based applications using various programming languages.
- Tests applications using test-driven development and behavior-driven development frameworks to ensure the integrity of the application.
- Conducts root cause analysis of issues and participates in the code review process to identify gaps.
- Implements continuous integration/continuous delivery processes to ensure quality and efficiency in the development cycle using DevOps automation processes and tools.
- Ideates, builds, and publishes reusable libraries to improve productivity across teams.
- Conducts the implementation and maintenance of complex business and enterprise software solutions to ensure successful deployment of released applications.
- Solves difficult technical problems to ensure solutions are testable, maintainable, and efficient.

Years of Experience: 2-4 years
Education, Qualification and Certifications: B.Tech in computer science or an equivalent stream.
Primary Skills (must have): Java, Spring Boot, Kafka, Cassandra, Redis, GCP (a short Kafka consumer sketch follows below)
Secondary Skills (desired): Apache Beam, React
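
The signal-building work described above typically starts with consuming customer session events from Kafka. A minimal, hypothetical Spring Kafka sketch follows; the topic, group, and class names are illustrative and not taken from the posting.

```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
public class CustomerSignalListener {

    // Topic and consumer group are hypothetical placeholders.
    @KafkaListener(topics = "customer-session-events", groupId = "personalization-signals")
    public void onEvent(String event) {
        // A real pipeline would parse the event and update a signal store,
        // for example a Cassandra or Redis table keyed by customer id.
        System.out.println("Received session event: " + event);
    }
}
```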

Posted 3 weeks ago

Apply

6.0 - 11.0 years

12 - 17 Lacs

Hyderabad, Bengaluru

Work from Office


Senior .NET Developer

Job description:
- 6+ years of experience in designing and developing applications using C#, .NET Core, Web API, and SQL.
- Implement and maintain Azure cloud solutions: Logic Apps, Storage Accounts, Log Analytics, creating and debugging VMs, Terraform, Azure Boards, Azure Container Registry, and Cassandra.
- Knowledge of Azure DevOps, CI/CD pipelines, and PowerShell scripting.
- Good knowledge of Git for version control.

Posted 3 weeks ago

Apply

7.0 - 12.0 years

6 - 10 Lacs

Bengaluru

Work from Office


Bureau is an all-in-one platform for identity decisioning, fraud prevention and compliance requirements. Trusted for enhancing security and compliance, Bureau simplifies identity management for businesses. This is a place where we celebrate homegrown leaders and have an open-door policy where your voice matters, your ideas flourish, and your potential knows no bounds. We are driven to put our best foot forward every day with confidence, growth, customer obsession and speed as our core values here at Bureau. Think of us as a launching pad for your growth. Come join us and help restore trust in online interactions!

How will your day look like?
- Design, build and maintain APIs, services, and systems with low latency, high availability, and performance efficiency
- Create a developer-first risk and fraud API ecosystem
- Produce documentation to support other team members
- Design and implement security and data protection features
- Integrate data storage solutions
- Improve engineering standards, tooling, and processes

What does it take to be in this role?
- 7+ years of work experience and enjoys building RESTful APIs
- Strong Golang skills with experience in databases or data stores such as MySQL, Elasticsearch, Redis, Couchbase, MongoDB, Cassandra, DynamoDB, etc.
- Experience with AWS or other public clouds
- Bachelor's degree in Computer Science or equivalent alternative education, skills and/or practical experience

Why should you choose us?
Your growth is our responsibility. We emphasise learning and development over material perks and are happier to nourish your mind. If there's a book, course, or program that enhances your work at Bureau, feel free to pursue it; we'll take care of the financial aspect.
We believe in flat structures. While we do have designations and reporting managers, our structure fosters a lot more freedom. You can collaborate with anyone, explore job rotations, transition between different projects, and express your opinions openly to whomever you choose.
Homegrown Leaders: Our nurturing environment and specialized programs, like ElevateEngg, have led to success stories where even interns grow into impactful leadership roles over time.

FAQs:
What is our hiring process like? We start with a friendly chat to get to know each other and align goals. Then, we'll have 2-3 discussions where we'll dive into real-world examples to explore your skills. Finally, we'll make sure you're a great fit with our culture and values.
How can I improve my chances of getting hired? Get to know Bureau's mission and what we're all about. Understand the role, and think about how your past work connects with it. Keep your resume simple, clear, and to the point (2 pages or less) to highlight your skills and experience.
What is Bureau's approach to diversity and inclusion? We believe in a diverse and inclusive culture where everyone's voice matters. We focus on diverse referrals, inclusive hiring, and offer special leaves to support our team. Our goal is for everyone to feel valued and empowered to grow with us.
What learning and growth opportunities can I expect at Bureau? At Bureau, we're all about growth. You'll have access to learning resources, mentorship, and exciting projects that help you level up in your career. We're committed to helping you grow and encourage continuous learning along the way.

Posted 3 weeks ago

Apply

13.0 - 18.0 years

12 - 16 Lacs

Pune

Work from Office


About Enlyft: Data and AI are at the core of the Enlyft platform. We are looking for creative, customer- and detail-obsessed data engineers who can contribute to our strong engineering culture. Our big data engine indexes billions of structured/unstructured documents and leverages data science to accurately infer the footprint of thousands of technologies and products across millions of businesses worldwide. The complex and evolving relationships between products and companies form a technological graph that is core to our predictive modeling solutions. Our machine learning based models work by combining data from our customers' CRM with our proprietary technological graph and firmographic data, and reliably predict an account's propensity to buy.

About the Role: As a key member of our data platform team, you'll lead the development of our next-gen, cutting-edge data platform. Your responsibilities will include building robust data pipelines for data acquisition and processing, implementing optimized data models, and creating APIs and data products to support our machine learning models, insights engine, and customer-facing applications. Additionally, you'll harness the power of GenAI throughout the data platform lifecycle, while maintaining a strong focus on data governance to uphold timely data availability with high accuracy.

What we're looking for:
- Bachelor's degree or higher in Computer Science, Engineering or a related field, with 13+ years of experience in data engineering and a strong focus on designing and building scalable data platforms and products
- Proven expertise in data modeling, ETL/ELT processes, and data warehousing with distributed computing (Hadoop, Spark and Kafka)
- Proficient in programming languages such as Python, Java and SQL
- Experience with clouds such as AWS, Azure or GCP and related services (S3, Redshift, BigQuery, Dataflow)
- Strong understanding of SQL/NoSQL databases (e.g. Postgres, MySQL, Cassandra)
- Proven expertise in data quality checks to ensure data accuracy, completeness, consistency, and timeliness
- Excellent problem-solving in a fast-paced, collaborative environment, coupled with strong communication for effective interaction with technical and non-technical stakeholders

Why join Enlyft?
- A top-notch culture that is customer-obsessed, transparent and constantly strives for excellence
- A top-notch team with colleagues that will help you learn and grow in a collaborative environment
- Competitive pay and great benefits

Enlyft is an equal opportunity employer and values diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

0 - 0 Lacs

Thane

Work from Office


Job Title: Senior DevOps Engineer
Hiring Location: Mumbai
Experience Range: 4 to 10 years
Mandatory skills: DevOps, Linux, Apache Cassandra, AWS, Docker, Java/Python/Bash

Must-Have Skills:
- Minimum 4+ years of experience with Apache Cassandra 4.1 administration/architecture
- Minimum 4+ years of experience in provisioning, operating, and managing AWS environments
- Strong experience in Linux environments
- Hands-on experience with Infrastructure as Code (IaC) tools: Terraform, CloudFormation
- Proficiency in CI/CD pipeline development and maintenance
- Expertise in Kubernetes, Docker, and container orchestration
- Experience in programming/scripting languages: Python, Bash, or Java
- Proven ability to troubleshoot complex system issues and implement effective solutions
- Familiarity with AWS APIs
- Exposure to AWS disaster recovery design and multi-region deployments
- DevOps experience with a strong grasp of automation, deployment, and monitoring

Good-to-Have Skills:
- Experience with multi-tier architectures: load balancers, caching, app/web servers, databases
- Scripting and automation using PowerShell, Jenkins, Ruby
- Exposure to HashiCorp Vault or other secrets management tools
- Experience working with globally distributed teams
- Professional certifications: AWS DevOps Engineer, Kubernetes Administrator, etc.
- Experience leading DevOps teams or projects
- Active participation in open-source communities, personal GitHub projects, or technical blogs

Required Skills: DevOps, Linux, Apache Cassandra, AWS, Docker, Java/Python/Bash

Posted 3 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office


ECMS #: 527079
Number of openings: 1
Job Title: Scala Engineer
Work Location: Chennai / Bangalore
Vendor Rate: INR 10,000 - 10,500 / day
Contract duration: 12 months
BGV Check: Pre-onboarding
Skill Set: Scala Engineer

Job Description / Required Skills:
- Minimum 5+ years of experience in Scala, Akka, HTTP4 framework
- Experience working in Agile and DevOps environments
- Minimum 5+ years of hands-on experience with CI/CD tools such as Maven, Jenkins, Ansible
- Minimum 5+ years with Panthon, Akka framework, JUnit, mock frameworks
- Minimum 5+ years of hands-on experience with Cassandra and Redis databases
- Strong understanding and experience using GitHub and Gerrit for version control and code reviews
- Familiarity with Linux/Unix
- Familiarity with networking concepts and VoIP

Experience: 13-14 Years

Posted 3 weeks ago

Apply

5.0 - 10.0 years

15 - 16 Lacs

Bengaluru

Work from Office


As a Lead Software Engineer at JPMorgan Chase within the Commercial and Investment Bank's Equities Tech Group, you are an integral part of an agile team that works to enhance, build and deliver trusted market-leading technology products in a secure, stable and scalable way. As a core technical contributor, you are responsible for conducting critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.

Job responsibilities:
- Lead and contribute to the development of server-side applications using Java and Spring.
- Design and develop scalable architectures, ensuring performance efficiency.
- Implement messaging frameworks, with a preference for Kafka.
- Develop and execute automated testing strategies, including end-to-end testing.
- Collaborate effectively within an agile team setting and communicate with key stakeholders.
- Provide DevOps production support to ensure application stability and address business queries.
- Mentor junior developers and drive design and code review sessions.
- Add to the team culture of diversity, equity, inclusion and respect.

Required qualifications, capabilities, and skills:
- Formal training or certification on software engineering concepts and 5+ years of applied experience
- Strong hands-on experience with Java and Spring server-side development
- Proficiency with NoSQL databases, specifically Cassandra (see the sketch below)
- Expertise in designing scalable architectures and developing performance-efficient applications
- Experience with messaging frameworks, particularly ActiveMQ and Tibco
- Proficiency in automated testing, including end-to-end processes
- Excellent teamwork and communication skills
- Ability to provide DevOps support and mentor junior team members
- Capability to lead design and code review sessions
- In-depth knowledge of the financial services industry and its IT systems
- Practical cloud-native experience

Preferred qualifications, capabilities, and skills:
- Experience with AWS and cloud services
- Good understanding of UI development using React JS/.NET
- Familiarity with container technologies
- Understanding of DevOps practices, including CI/CD
- Knowledge of SRE concepts, such as monitoring and log tracing
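
For a concrete flavour of the Java, Spring, and Cassandra stack this role names, here is a small hypothetical Spring Data Cassandra sketch; the entity and repository are invented for illustration only.

```java
import org.springframework.data.cassandra.core.mapping.PrimaryKey;
import org.springframework.data.cassandra.core.mapping.Table;
import org.springframework.data.cassandra.repository.CassandraRepository;

// Hypothetical trade-order entity mapped to a Cassandra table.
@Table("orders")
class Order {
    @PrimaryKey
    private String orderId;
    private String symbol;
    private double quantity;
    // getters and setters omitted for brevity
}

// Spring Data derives the basic CRUD queries at runtime, keeping the persistence layer thin.
interface OrderRepository extends CassandraRepository<Order, String> {
}
```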

Posted 3 weeks ago

Apply

4.0 - 7.0 years

45 - 50 Lacs

Bengaluru

Work from Office


Member of a software engineering team involved in the development and design of the AI Data Platform built on NetApp's flagship storage operating system, ONTAP. ONTAP is a feature-rich stack whose data management capabilities bring tremendous value to our customers and are used in mission-critical applications across the world. You will work as part of a team responsible for the development, testing and debugging of distributed software that drives NetApp cloud, hybrid-cloud, and on-premises AI/ML solutions. As part of the Research and Development function, the overall focus of the group is on competitive market and customer requirements, supportability, technology advances, product quality, product cost and time-to-market. Software engineers focus on enhancements to existing products as well as new product development. This is a mid-level technical position that requires an individual to be broad-thinking, systems-focused, creative, team-oriented, technologically savvy, able to work in small and large cross-functional teams, willing to learn and driven to produce results.

Job Requirements:
- Proficiency in programming languages like Go/Golang
- Experience with machine learning libraries and frameworks: PyTorch, TensorFlow, Keras, OpenAI, LLMs (open source), LangChain, etc.
- Hands-on experience working with REST APIs and microservices - Flask, API frameworks
- Experience working in Linux, AWS/Azure/GCP, Kubernetes (control plane, auto scaling, orchestration, containerization) is a must
- Experience with NoSQL document databases, e.g., MongoDB, Cassandra, Cosmos DB, DocumentDB
- Experience building microservices, REST APIs and related API frameworks
- Experience with big data technologies: understanding of big data technologies and platforms like Spark, Hadoop and distributed storage systems for handling large-scale datasets and parallel processing
- Proven track record of working on mid to large sized projects
- Responsible for providing support in the development and testing activities of other engineers that involve several inter-dependencies
- Participate in technical discussions within the team and across cross-functional teams
- Willing to work on additional tasks and responsibilities that will contribute towards team, department and company goals
- A strong understanding of and experience with concepts related to computer architecture, data structures and programming practices
- Experience with AI/ML frameworks like PyTorch or TensorFlow is a plus

Education: IC - typically requires a minimum of 4-7 years of related experience with a bachelor's or master's degree, or a PhD with relevant experience.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

50 - 55 Lacs

Bengaluru

Work from Office


We have an opportunity to impact your career and provide an adventure where you can push the limits of what's possible. As a Lead Software Engineer at JPMorgan Chase within Consumer and Community Banking - Banking and Wealth Management, you are an integral part of an agile team that works to enhance, build, and deliver trusted market-leading technology products in a secure, stable, and scalable way. As a core technical contributor, you are responsible for conducting critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.

Job responsibilities:
- Executes creative software solutions, design, development, and technical troubleshooting with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems
- Builds and drives adoption of standardized telemetry, monitoring, alerting, and analysis tools and practices, while identifying and supporting observability and anomaly detection metrics tailored to the needs of specific applications and products
- Identifies production incident themes, drives improvements across applications and product groups, and works with teams to drive, support, and measure adoption
- Owns standardized deployment and recertification processes and documentation; ensures and supports cross-application adoption
- Defines and ensures adherence to service level objectives for capacity and response time, specific to user journeys and applications
- Collaborates with architects and product teams to enhance application resiliency and implement idempotency principles across transaction lifecycles
- Conducts knowledge transfer sessions on operational best practices and establishes support channels for developers to enhance their operational capabilities
- Achieves AI/ML certifications for team members to enhance technical capabilities and expertise
- Adds to the team culture of diversity, equity, inclusion, and respect
- Identifies opportunities to eliminate or automate remediation of recurring issues to improve overall operational stability of software applications and systems
- Leads communities of practice across Software Engineering to drive awareness and use of new and leading-edge technologies

Required qualifications, capabilities, and skills:
- Formal training or certification on software engineering concepts and 5+ years of applied experience
- Hands-on experience with cloud-based applications, technologies and tools, deployment, monitoring and operations, such as Kubernetes, Prometheus, FluentD, Slack, Elasticsearch, Grafana, Kibana, etc.
- Relational and NoSQL databases; developing and managing operations leveraging key event streaming, messaging and DB services such as Cassandra, MQ/JMS/Kafka, Aurora, RDS, Cloud SQL, BigTable, DynamoDB, MongoDB, Cloud Spanner, Kinesis, Cloud Pub/Sub, etc.
- Networking (security, load balancing, network routing protocols, etc.)
- Demonstrated experience in the fields of production engineering and automation
- Strong understanding of cloud technology standards and practices
- Proficiency in utilizing tools for monitoring, analysis, and troubleshooting, including Splunk, Dynatrace, Datadog, or equivalent

Preferred qualifications, capabilities, and skills:
- Ability to conduct detailed analysis of incidents to identify patterns and trends, thereby enhancing operational stability and efficiency
- Familiarity with digital certificate management and automation tools
- Knowledge of frameworks such as CI/CD pipelines
- Excellent communication and collaboration skills

Posted 3 weeks ago

Apply

5.0 - 10.0 years

40 - 45 Lacs

Bengaluru

Work from Office


The people here at Apple don't just build products; they build the kind of wonder that's revolutionised entire industries. It's the diversity of those people and their ideas that inspires the innovation that runs through everything we do, from amazing technology to industry-leading environmental efforts. Join Apple, and help us leave the world better than we found it. Imagine what you could do here. Are you passionate about handling large, complex data problems, want to make an impact and have the desire to work on groundbreaking big data technologies? Then we are looking for you. At Apple, great ideas have a way of becoming great products, services, and customer experiences very quickly. Bring passion and dedication to your job and there's no telling what you could accomplish. Would you like to work in a fast-paced environment where your technical abilities will be challenged on a day-to-day basis? If so, Apple's Global Business Intelligence team is looking for passionate, meticulous, technically savvy, energetic engineers who like to think creatively. Apple's Enterprise Data Warehouse team deals with petabytes of data catering to a wide variety of real-time, near real-time and batch analytical solutions. These solutions are an integral part of business functions like Retail, Sales, Operations, Finance, AppleCare, Marketing and Internet Services, enabling business drivers to make critical decisions. We use a diverse technology stack such as Snowflake, Spark, Flink, Trino, Kafka, Iceberg, Cassandra and beyond. Designing, developing and scaling these big data solutions is a core part of our daily job.

Description: We are seeking an experienced Senior Data Engineer with a strong background in designing and building scalable data architectures. You will play a key role in creating and optimising our data pipelines, improving data flow across our organisation, and working closely with cross-functional teams to ensure data accessibility and quality. This role requires deep knowledge of the big data ecosystem and data lake concepts, as well as hands-on expertise in modern big data technologies like advanced SQL, Spark, Flink, Trino, Iceberg and Snowflake.
- Data Pipeline Development: Design, build, and maintain scalable ELT processes using Spark, Flink, Snowflake and other big data frameworks. Implement robust, high-performance data pipelines in cloud environments. Deep, hands-on knowledge of at least one programming language like Python, Java or Scala. Expertise with advanced SQL skills and knowledge of BI/analytics platforms.
- Data Lake and Data Warehouse Architecture: Develop and maintain efficient data lake solutions. Ensure data lake reliability, consistency, and cost-effectiveness. Develop data models and schemas optimised for performance and scalability. Experience with modern data warehouses like Iceberg, Snowflake, etc.
- Orchestration and CI/CD: Comfortable with basic DevOps principles and tools for CI/CD (Jenkins, GitLab CI, or GitHub Actions). Familiar with containerisation and orchestration tools (Docker, Kubernetes). Familiarity with Infrastructure as Code (Terraform, CloudFormation) is a plus.
- Performance Tuning and Optimisation: Identify bottlenecks, optimise processes, and improve overall system performance. Monitor job performance, troubleshoot issues, and refine long-term solutions for system efficiency.
- Collaboration and Leadership: Work closely with data scientists, analysts, and stakeholders to understand data needs and deliver solutions. Mentor and guide junior data engineers on best practices and cutting-edge technologies.

Minimum Qualifications:
- At least 5+ years of hands-on experience in developing and building data pipelines on cloud/hybrid infrastructure for analytical needs
- Experience working with any cloud-based data warehouse solutions (Snowflake, SingleStore, etc.) along with expertise in SQL and advanced SQL
- Experience in designing and building dimensional data models to improve accessibility, efficiency and quality of data
- Bachelor's degree or equivalent in data engineering, computer science or a similar field

Preferred Qualifications:
- High expertise in modern cloud warehouses and data lakes, and implementation experience on any of the cloud platforms (preferably AWS)
- Expertise working with data at scale (petabytes) with a big data tech stack and advanced programming languages, e.g., Python, Java or Scala
- Database development experience with relational or MPP/distributed systems such as Snowflake, SingleStore
- Hands-on experience with distributed computing in large-scale data environments
- Excellent problem solving and critical thinking, with the ability to evaluate and apply new technologies in a short time
- Experience working with global collaborators, with the ability to influence decision making

Posted 3 weeks ago

Apply

5.0 - 8.0 years

9 - 13 Lacs

Hyderabad

Work from Office


Role Purpose: The purpose of this role is to provide significant technical expertise in architecture planning and design of the concerned tower (platform, database, middleware, backup, etc.) as well as managing its day-to-day operations.

Do:
- Provide adequate support in architecture planning, migration and installation for new projects in own tower (platform/database/middleware/backup)
- Lead the structural/architectural design of a platform/middleware/database/backup etc. according to various system requirements to ensure a highly scalable and extensible solution
- Conduct technology capacity planning by reviewing the current and future requirements
- Utilize and leverage the new features of all underlying technologies to ensure smooth functioning of the installed databases and applications/platforms, as applicable
- Strategize and implement disaster recovery plans, and create and implement backup and recovery plans
- Manage the day-to-day operations of the tower: troubleshoot issues, conduct root cause analysis (RCA) and develop fixes to avoid similar issues
- Plan for and manage upgrades, migration, maintenance, backup, installation and configuration functions for own tower
- Review the technical performance of own tower and deploy ways to improve efficiency, fine-tune performance and reduce performance challenges
- Develop a shift roster for the team to ensure no disruption in the tower
- Create and update SOPs, Data Responsibility Matrices, operations manuals, daily test plans, data architecture guidance, etc.
- Provide weekly status reports to the client leadership team and internal stakeholders on database activities w.r.t. progress, updates, status, and next steps
- Leverage technology to develop a Service Improvement Plan (SIP) through automation and other initiatives for higher efficiency and effectiveness

Team Management
- Resourcing: Forecast talent requirements as per the current and future business needs; hire adequate and right resources for the team; train direct reportees to make right recruitment and selection decisions
- Talent Management: Ensure 100% compliance to Wipro's standards of adequate onboarding and training for team members to enhance capability and effectiveness; build an internal talent pool of HiPos and ensure their career progression within the organization; promote diversity in leadership positions
- Performance Management: Set goals for direct reportees, conduct timely performance reviews and appraisals, and give constructive feedback to direct reports; ensure that organizational programs like Performance Nxt are well understood and that the team is taking the opportunities presented by such programs for themselves and their levels below
- Employee Satisfaction and Engagement: Lead and drive engagement initiatives for the team; track team satisfaction scores and identify initiatives to build engagement within the team; proactively challenge the team with larger and enriching projects/initiatives for the organization or team; exercise employee recognition and appreciation

Deliver:
1. Operations of the tower - Measures: SLA adherence; knowledge management; CSAT/customer experience; identification of risk issues and mitigation plans
2. New projects - Measures: timely delivery; avoid unauthorised changes; no formal escalations

Mandatory Skills: Oracle Database Admin
Experience: 5-8 Years

Posted 3 weeks ago

Apply

6.0 - 11.0 years

1 - 5 Lacs

Bengaluru

Work from Office


Job Title: Java AWS Developer
Experience: 6-12 Years
Location: Bangalore

- Experience in Java, J2EE, Spring Boot.
- Experience in design, Kubernetes, and AWS (EKS, EC2) is needed.
- Experience in AWS cloud monitoring tools like Datadog, CloudWatch, and Lambda is needed (a short Lambda handler sketch follows below).
- Experience with XACML authorization policies.
- Experience in NoSQL and SQL databases such as Cassandra, Aurora, Oracle.
- Experience with web services and SOA (SOAP as well as RESTful with JSON formats), and with messaging (Kafka).
- Hands-on with development and test automation tools/frameworks (e.g. BDD and Cucumber).
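
Since the posting pairs Java with AWS Lambda, here is a minimal hypothetical handler sketch using the standard aws-lambda-java-core interface; the input shape and lookup logic are placeholders, not part of the listing.

```java
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;

import java.util.Map;

public class OrderLookupHandler implements RequestHandler<Map<String, String>, String> {

    @Override
    public String handleRequest(Map<String, String> input, Context context) {
        // Hypothetical key passed by the caller (for example via an API Gateway mapping).
        String orderId = input.getOrDefault("orderId", "unknown");
        context.getLogger().log("Looking up order " + orderId);
        // A real implementation would query Cassandra or Aurora here.
        return "{\"orderId\":\"" + orderId + "\",\"status\":\"FOUND\"}";
    }
}
```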

Posted 3 weeks ago

Apply

6.0 - 11.0 years

2 - 5 Lacs

Hyderabad, Bengaluru

Work from Office


Job Title: Java AWS
Experience: 6-12 Years
Location: Hyderabad / Bangalore

- Experience in Java, J2EE, Spring Boot.
- Experience in design, Kubernetes, and AWS (Lambda, EKS, EC2) is needed.
- Experience in AWS cloud monitoring tools like Datadog, CloudWatch, and Lambda is needed.
- Experience with XACML authorization policies.
- Experience in NoSQL and SQL databases such as Cassandra, Aurora, Oracle.
- Experience with web services and SOA (SOAP as well as RESTful with JSON formats), and with messaging (Kafka).
- Hands-on with development and test automation tools/frameworks (e.g. BDD and Cucumber).

Posted 3 weeks ago

Apply

6.0 - 10.0 years

2 - 5 Lacs

Pune, Chennai

Work from Office


Job Title: Java Developer
Experience: 6-10 Years
Location: Pune, Chennai

You will develop a deep understanding of appropriate data and software components including conceptual data models, data flows, entity relationships, specifications, audit controls, exception and error handling, security, etc. You will create and execute test plans/cases based on requirements and design specifications.
- Support the development of high-quality software within a scrum team
- Contribute to code quality by maintaining high-quality software development standards
- Support technical design decisions through validation of technical approaches and prototyping of new concepts
- Support team members to ensure successful sprint outcomes
- Raise technical, security, delivery, and performance risks
- Be part of and help promote our DevOps culture
- Identify and implement continuous improvements to the development practice
- Understand the existing processing architecture and provide high-level design and technical solutions for new change requests and projects
- Coordinate and work with the offshore team for all deliverables, including peer review, proper documentation, and installation

What you will need to have:
- Strong client-facing skills and experience
- Good understanding of software development patterns, object-oriented design principles, and coding techniques
- 5+ years of practical experience in Java/J2EE, Linux, Spring, Kafka, Redis, Cassandra, EJB
- Experience working with cloud technology: Azure, OpenShift, AWS, Docker, etc.
- Bachelor's degree from an accredited institution in Computer Science, Information Technology, or a related technical discipline, or equivalent work experience
- 5+ years of experience with full-stack technologies like Java/J2EE, React or Angular, Spring Boot, OpenShift, producing and consuming REST APIs
- 3-10 years of experience designing, developing, and implementing software applications, with hands-on experience with large-scale applications at large institutions

What would be great to have:
- Experience in Agile software development methodologies, including Scrum
- Snowflake development experience
- Can-do attitude with a delivery focus
- Practical experience with the end-to-end DevOps CI/CD pipeline including one or more of the following: Azure DevOps, Jenkins, Maven, GitLab, SonarQube
- Knowledge of Test-Driven Development (TDD) and/or Behavior-Driven Development (BDD) (see the sketch below)
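
The TDD item above can be pictured with a tiny JUnit 5 sketch; PriceCalculator is a hypothetical class included only to show the test-first style, not anything from the posting.

```java
import static org.junit.jupiter.api.Assertions.assertEquals;

import org.junit.jupiter.api.Test;

// Hypothetical production class; in a TDD flow the test below is written first
// and drives this implementation.
class PriceCalculator {
    double applyDiscount(double price, double rate) {
        return price * (1.0 - rate);
    }
}

class PriceCalculatorTest {
    @Test
    void appliesTenPercentDiscount() {
        assertEquals(90.0, new PriceCalculator().applyDiscount(100.0, 0.10), 0.001);
    }
}
```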

Posted 3 weeks ago

Apply

1.0 - 3.0 years

2 - 6 Lacs

Pune, Gurugram

Work from Office


Job Title: Software Developer
Experience: Years
Location: Pune/Gurgaon

Job Profile:
- Work on the Amdocs OSS product, with ample opportunity to grow professionally, functionally and technically.
- Develop software of high complexity in Java, J2EE and PL/SQL.
- Excellent debugging and analytical skills.
- Adhere to the organization's software development standards in an agile environment.
- Lead a task, development, and unit testing.
- Work closely on all levels with internal and external teams.
- Fast, energetic, diligent, highly motivated, with an open mind and a self-learner.

Technical Requirements (must have):
- Excellent knowledge of Java / J2EE / JUnit / Eclipse / Web Services (REST/SOAP) / XML / XSL / Oracle / PL/SQL / Spring / Maven
- Knowledge of Docker and Kubernetes (6+ months)
- Knowledge of Spring Boot (6+ months)
- Knowledge of PostgreSQL
- Knowledge of AWS, Tomcat
- Ability to work in a challenging mode; zero-defect mindset
- Experience in an Agile environment
- Functional knowledge in the OSS domain
- Should be able to perform the following tasks: create microservices using Spring Boot (see the sketch below); package microservices as Docker images; create Kubernetes objects such as ingress, service, pod, deployment, etc.

Good to have:
- Prior development experience in one of the OSS applications (inventory COTS product: ARM / Netcracker / Granite)
- Knowledge of Cassandra, Kafka
- Knowledge of Cucumber, Elasticsearch/Kibana
- Should be able to perform the following tasks: deploy and support applications in a Kubernetes cluster; create and deploy Helm charts.
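
A minimal sketch of the "create a microservice using Spring Boot" task listed above; the application and endpoint names are hypothetical. Packaging this as a Docker image and describing it with Kubernetes deployment, service, and ingress objects would be the follow-on steps the listing mentions.

```java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@SpringBootApplication
@RestController
public class InventoryServiceApplication {

    public static void main(String[] args) {
        SpringApplication.run(InventoryServiceApplication.class, args);
    }

    // Hypothetical status endpoint; a real OSS inventory microservice would expose
    // resource and topology endpoints instead.
    @GetMapping("/status")
    public String status() {
        return "inventory-service: OK";
    }
}
```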

Posted 3 weeks ago

Apply

5.0 - 7.0 years

2 - 5 Lacs

Pune

Work from Office


Job Title: Data Engineer
Experience: 5-7 Years
Location: Pune

Roles & Responsibilities:
- Create and maintain optimal data pipeline architecture
- Build data pipelines that transform raw, unstructured data into formats that data analysts can use for analysis (see the sketch below)
- Assemble large, complex data sets that meet functional / non-functional business requirements
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and delivery of data from a wide variety of data sources using SQL and AWS 'Big Data' technologies
- Work with stakeholders including the Executive, Product, and program teams to assist with data-related technical issues and support their data infrastructure needs
- Work with data and analytics experts to strive for greater functionality in our data systems
- Develop and maintain scalable data pipelines and build out new integrations and processes required for optimal extraction, transformation, and loading of data from a wide variety of data sources using HQL and 'Big Data' technologies
- Implement processes and systems to validate data and monitor data quality, ensuring production data is always accurate and available for key stakeholders and business processes that depend on it
- Write unit/integration tests, contribute to the engineering wiki, and document work
- Perform root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement

Who You Are:
- You're passionate about data and building efficient data pipelines
- You have excellent listening skills and are empathetic to others
- You believe in simple and elegant solutions and give paramount importance to quality
- You have a track record of building fast, reliable, and high-quality data pipelines
- Passionate, with a good understanding of data and a focus on having fun while delivering incredible business results

Must-have skills:
- A Data Engineer with 5+ years of relevant experience who is excited to apply their current skills and to grow their knowledge base
- A Data Engineer who has attained a degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field
- Experience with big data tools: Hadoop, Spark, Kafka, Hive, etc.
- Experience with relational SQL and NoSQL databases, including Postgres and Cassandra
- Experience with data pipeline and workflow management tools
- Experience with AWS cloud services: EC2, EMR, RDS, Redshift
- Experience with object-oriented/object function scripting languages: Python, Java, Scala, etc.
- Experience with Airflow/Oozie
- Experience in AWS/Spark/Python development
- Experience in Git, JIRA, Jenkins, shell scripting
- Familiar with Agile methodology, test-driven development, source control management and automated testing
- Build processes supporting data transformation, data structures, metadata, dependencies and workload management
- Experience supporting and working with cross-functional teams in a dynamic environment

Nice-to-have skills:
- Experience with stream-processing systems: Storm, Spark Streaming, etc.
- Experience with Snowflake
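
To make the raw-data-to-analyst-ready bullet concrete, here is a small batch-pipeline sketch. It uses Spark's Java API purely for illustration (the posting itself lists Python, Java, or Scala), and the bucket paths and column names are hypothetical.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class RawEventsToParquet {
    public static void main(String[] args) {
        // master() is normally supplied by spark-submit; local[*] lets the sketch run standalone.
        SparkSession spark = SparkSession.builder()
                .appName("raw-events-to-parquet")
                .master("local[*]")
                .getOrCreate();

        // Read raw JSON events, keep well-formed records, and write a curated Parquet copy.
        Dataset<Row> raw = spark.read().json("s3a://raw-bucket/events/");
        Dataset<Row> cleaned = raw
                .filter("event_type IS NOT NULL")
                .select("event_id", "event_type", "event_ts");

        cleaned.write().mode("overwrite").parquet("s3a://curated-bucket/events/");
        spark.stop();
    }
}
```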

Posted 3 weeks ago

Apply

1.0 - 5.0 years

1 - 5 Lacs

Pune

Work from Office


Roles and Responsibilities
Mindstix is looking for a proficient .NET Developer who is passionate about their work and committed to delivering exceptional results. You are a collaborative person who takes pleasure in finding solutions to issues. You are a team player with a positive attitude and a desire to make a difference. You appreciate hands-on technical work and feel a sense of ownership. The role requires strong hands-on, extensive knowledge of C# and the .NET Framework.
- Write clean, scalable code using .NET programming languages (C#, VB.NET, ASP.NET, etc.).
- Actively participate in the code review process to ensure development work adheres to standards and specifications.
- Collaborate with cross-functional teams to analyze, design, and develop new software applications.
- Identify and solve technical problems that arise during the development process, such as debugging code and optimizing application performance.
- Integrate databases (like SQL Server) with .NET applications, handling data storage, retrieval, and manipulation.
- Participate in requirements analysis and provide input on technical feasibility.
- Troubleshoot and debug applications to optimize performance and ensure functionality.
- Unit test all code and programs before releasing them to the quality assurance team; quickly resolve all unit test issues.
- Assist quality assurance in identifying test cases and creating/mining test data to enable thorough testing of all development deliverables.
- Stay updated with the latest advancements in the .NET framework and trends in the software development industry.

Qualifications and Skills
- Bachelor's or Master's degree in Computer Science or Information Technology.
- 2+ years of hands-on relevant work experience.
- Strong foundations in computer science, data structures, algorithms, and programming logic.
- Experience in the full software development lifecycle and agile methodologies.
- Strong foundations in the .NET framework.
- Strong understanding of object-oriented programming (OOP), MVC frameworks, common design patterns, and multi-tiered application architecture.
- Hands-on experience in the .NET framework with C#; utilize ASP.NET MVC, WCF and Web API for creating scalable web applications.
- Hands-on experience in web development technologies (JavaScript, HTML/HTML5, CSS, etc.).
- Knowledge of relational databases (MySQL / PostgreSQL / MSSQL) and NoSQL databases (MongoDB, Cassandra, CouchDB).
- Experience with any cloud platform (AWS, Azure, or Google Cloud) would be an advantage.
- Understanding of containerization.
- Excellent logical reasoning and data interpretation capability.
- Ability to interpret business requirements accurately.

Who Fits Best?
You are a passionate programmer with a flair for solving complex engineering problems. You are a self-motivated and fast learner with a strong sense of ownership and drive. You enjoy working in a fast-paced creative environment. You have an appreciation for great design, a strong sense of aesthetics, and a keen eye for detail. You thrive in a customer-centric environment with the ability to actively listen, empathize, and collaborate with globally distributed teams. You are a team player with a desire to mentor and inspire others to do their best. You love to express ideas and articulate well, with strong written and verbal English communication skills. You are detail-oriented with an appreciation for craftsmanship at work.

Benefits
- Flexible working environment
- Health insurance coverage
- Accelerated career paths
- Global customers
- Competitive compensation and perks
- Rewards and recognition
- Sponsored certifications
- Mentorship by industry leaders

Posted 3 weeks ago

Apply

10.0 - 13.0 years

35 - 50 Lacs

Chennai

Work from Office


Cognizant is hiring a Payments BA!
Location: Chennai, Bangalore, Hyderabad

Job Summary:
- At least 10 years of experience in the BA role, including a couple of years of experience in a BA lead role
- Good domain knowledge of SWIFT/ISO 20022, a payments background, and stakeholder management
- Java, Microservices and Spring Boot

Technical Knowledge: Java / Spring Boot, Kafka Streams, REST, JSON, Netflix microservices suite (Zuul, Eureka, Hystrix, etc.), 12-Factor Apps, Oracle, PostgreSQL, Cassandra and ELK. Ability to work with geographically dispersed and highly varied stakeholders.

Responsibilities
Strategy:
- Develop the strategic direction and roadmap for our flagship payments platform, aligning with business strategy, tech and ops strategy, and investment priorities
- Tap into the latest industry trends and innovative products and solutions to deliver effective and faster product capabilities
- Support Cash Management Operations, leveraging technology to streamline processes, enhance productivity, reduce risk and improve controls

Business:
- Work hand in hand with the Payments business, taking product programs from investment decisions into design specifications, solutioning, development, implementation and hand-over to operations, securing support and collaboration from other teams
- Ensure delivery to the business meets time, cost and high-quality constraints
- Support respective businesses in growing return on investment, commercialization of capabilities, bid teams, monitoring of usage, improving client experience, enhancing operations, addressing defects, and continuous improvement of systems
- Foster an ecosystem of innovation and enable business through technology

Processes:
- Responsible for the end-to-end deliveries of the technology portfolio comprising key business product areas such as Payments, Clearing, etc.
- Own technology delivery of projects and programs across global markets that (a) develop/enhance core product capabilities, (b) ensure compliance with regulatory mandates, (c) support operational improvements, process efficiencies and the zero-touch agenda, and (d) build the payments platform to align with the latest technology and architecture trends, improved stability and scale
- Interface with business and technology leaders of other systems for collaborative delivery

Posted 3 weeks ago

Apply

7.0 - 9.0 years

1 - 3 Lacs

Pune

Work from Office


About the Role: We are looking for a highly skilled and experienced Lead Gen AI Engineer to spearhead AI/ML initiatives and oversee the development of advanced machine learning models, deep learning architectures, and generative AI systems. The ideal candidate will have 7-8 years of hands-on experience in data science, machine learning, and data engineering, with a strong focus on leadership, innovation, and generative AI technologies. You will be responsible for guiding a team, delivering AI solutions, and collaborating with cross-functional stakeholders to meet business goals. The desired candidate should be well versed in AI/ML solutioning and should have worked on end-to-end product deliveries.

Key Responsibilities:
- Lead the development and deployment of machine learning models, deep learning frameworks, and AI-driven solutions across the organization.
- Work closely with stakeholders to define data-driven strategies and drive innovation using AI and machine learning.
- Design and implement robust data pipelines and workflows in collaboration with data engineers and software developers.
- Develop and deploy APIs using web frameworks for seamless integration of AI/ML models into production environments.
- Mentor and lead a team of data scientists and engineers, providing technical guidance and fostering professional growth.
- Leverage LangChain or LlamaIndex to enhance model integration, document management, and data retrieval capabilities.
- Lead projects in generative AI technologies, such as Large Language Models (LLMs), Retrieval-Augmented Generation (RAG), and AI agents, to create innovative AI-driven products and services.
- Stay updated on the latest AI/ML trends, ensuring that cutting-edge methodologies are adopted across projects.
- Collaborate with cross-functional teams to translate business problems into technical solutions and communicate findings effectively to both technical and non-technical stakeholders.

Required Skills and Qualifications:
- Experience: 7-8 years of experience in data science and AI/ML, with a strong foundation in machine learning, deep learning, generative AI and data engineering.
- Generative AI Expertise: Minimum 2+ years of experience with generative AI; hands-on experience with LLMs, RAG, and AI agents.
- AI Agents & Frameworks: Hands-on experience with AI agent frameworks/libraries (e.g., AutoGen, CrewAI, OpenAI's Function Calling, Semantic Kernel, etc.).
- Programming: Strong proficiency in Python, with experience using TensorFlow, PyTorch, and Scikit-learn.
- LangChain & LlamaIndex: Experience integrating LLMs with structured and unstructured data.
- Knowledge Graphs: Expertise in building and utilizing knowledge graphs for AI-driven applications.
- SQL & NoSQL Databases: Hands-on experience with SQL (PostgreSQL, MySQL) and NoSQL (MongoDB, Cassandra, etc.).
- API Development: Experience in developing APIs using Flask, FastAPI, or Django.
- Cloud & MLOps: Experience working with AWS, GCP, Azure and MLOps best practices.
- Excellent communication, leadership, and project management skills.
- Strong problem-solving ability with a focus on delivering scalable, impactful solutions.

Preferred Skills:
- Experience with computer vision applications.
- Chain-of-Thought Reasoning: Familiarity with CoT prompting and reasoning techniques.
- Ontology: Understanding of ontologies for knowledge representation in AI systems.
- Data Engineering: Experience with ETL pipelines and data engineering workflows.
- Familiarity with big data tools like Spark, Hadoop, or distributed computing.

Posted 3 weeks ago

Apply

7.0 - 10.0 years

10 - 15 Lacs

Bengaluru

Work from Office


Role Overview: We are seeking an experienced Data Engineer with 7-10 years of experience to design, develop, and optimize data pipelines while integrating machine learning (ML) capabilities into production workflows. The ideal candidate will have a strong background in data engineering, big data technologies, cloud platforms, and ML model deployment. This role requires expertise in building scalable data architectures, processing large datasets, and supporting machine learning operations (MLOps) to enable data-driven decision-making.

Key Responsibilities

Data Engineering & Pipeline Development:
- Design, develop, and maintain scalable, robust, and efficient data pipelines for batch and real-time data processing.
- Build and optimize ETL/ELT workflows to extract, transform, and load structured and unstructured data from multiple sources.
- Work with distributed data processing frameworks like Apache Spark, Hadoop, or Dask for large-scale data processing.
- Ensure data integrity, quality, and security across the data pipelines.
- Implement data governance, cataloging, and lineage tracking using appropriate tools.

Machine Learning Integration:
- Collaborate with data scientists to deploy, monitor, and optimize ML models in production.
- Design and implement feature engineering pipelines to improve model performance.
- Build and maintain MLOps workflows, including model versioning, retraining, and performance tracking.
- Optimize ML model inference for low-latency and high-throughput applications.
- Work with ML frameworks such as TensorFlow, PyTorch, Scikit-learn, and deployment tools like Kubeflow, MLflow, or SageMaker.

Cloud & Big Data Technologies:
- Architect and manage cloud-based data solutions using AWS, Azure, or GCP.
- Utilize serverless computing (AWS Lambda, Azure Functions) and containerization (Docker, Kubernetes) for scalable deployment.
- Work with data lakehouses (Delta Lake, Iceberg, Hudi) for efficient storage and retrieval.

Database & Storage Management:
- Design and optimize relational (PostgreSQL, MySQL, SQL Server) and NoSQL (MongoDB, Cassandra, DynamoDB) databases.
- Manage and optimize data warehouses (Snowflake, BigQuery, Redshift, Databricks) for analytical workloads.
- Implement data partitioning, indexing, and query optimizations for performance improvements.

Collaboration & Best Practices:
- Work closely with data scientists, software engineers, and DevOps teams to develop scalable and reusable data solutions.
- Implement CI/CD pipelines for automated testing, deployment, and monitoring of data workflows.
- Follow best practices in software engineering, data modeling, and documentation.
- Continuously improve the data infrastructure by researching and adopting new technologies.

Required Skills & Qualifications

Technical Skills:
- Programming Languages: Python, SQL, Scala, Java
- Big Data Technologies: Apache Spark, Hadoop, Dask, Kafka
- Cloud Platforms: AWS (Glue, S3, EMR, Lambda), Azure (Data Factory, Synapse), GCP (BigQuery, Dataflow)
- Data Warehousing: Snowflake, Redshift, BigQuery, Databricks
- Databases: PostgreSQL, MySQL, MongoDB, Cassandra
- ETL/ELT Tools: Airflow, dbt, Talend, Informatica
- Machine Learning Tools: MLflow, Kubeflow, TensorFlow, PyTorch, Scikit-learn
- MLOps & Model Deployment: Docker, Kubernetes, SageMaker, Vertex AI
- DevOps & CI/CD: Git, Jenkins, Terraform, CloudFormation

Soft Skills:
- Strong analytical and problem-solving abilities.
- Excellent collaboration and communication skills.
- Ability to work in an agile and cross-functional team environment.
- Strong documentation and technical writing skills.

Preferred Qualifications:
- Experience with real-time streaming solutions like Apache Flink or Spark Streaming.
- Hands-on experience with vector databases and embeddings for ML-powered applications.
- Knowledge of data security, privacy, and compliance frameworks (GDPR, HIPAA).
- Experience with GraphQL and REST API development for data services.
- Understanding of LLMs and AI-driven data analytics.

Posted 3 weeks ago

Apply

6.0 - 7.0 years

8 - 12 Lacs

Bengaluru

Work from Office


About The Role
As a Node.js Developer, you will be responsible for building and maintaining the backbone of our applications. You will utilize your deep understanding of JavaScript, the Node.js ecosystem, and asynchronous programming principles to create efficient and reliable server-side code. You will participate in architectural discussions, contribute to technical designs, and ensure the delivery of robust and well-tested backend services.

Responsibilities
- Design, develop, and maintain robust and scalable server-side applications and APIs using Node.js and related frameworks (e.g., Express.js, NestJS, Koa.js).
- Architect and implement well-documented and secure RESTful APIs and potentially other web service protocols (e.g., GraphQL).
- Follow best practices for API design, versioning, and error handling.
- Integrate applications with various types of databases, including relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra, Redis).
- Design and implement efficient database schemas and data models.
- Write optimized database queries and utilize ORM/ODM libraries (e.g., Sequelize, Mongoose) effectively.
- Leverage Node.js's non-blocking, event-driven architecture to build high-performance and scalable applications.
- Implement efficient asynchronous patterns using Promises, async/await, and the event loop.
- Identify and address performance bottlenecks through profiling and optimization techniques.
- Write comprehensive unit, integration, and end-to-end tests using appropriate testing frameworks (e.g., Jest, Mocha, Chai, Supertest).
- Participate actively in code reviews to ensure code quality, maintainability, and adherence to coding standards.
- Implement and maintain CI/CD pipelines for automated testing and deployment.
- Collaborate effectively with frontend developers to define API contracts and ensure seamless integration.
- Communicate technical concepts clearly and concisely to both technical and non-technical stakeholders.
- Participate in team meetings, sprint planning, and other agile ceremonies.
- Participate in the deployment process of Node.js applications to various environments (e.g., cloud platforms like AWS, Azure, GCP; on-premise servers).
- Understand and utilize containerization technologies like Docker and orchestration tools like Kubernetes (preferred).
- Implement monitoring and logging solutions to ensure application health and facilitate troubleshooting.
- Implement security best practices to protect applications from common vulnerabilities (e.g., OWASP Top 10).
- Implement authentication and authorization mechanisms.
- Ensure data security and compliance.
- Stay up-to-date with the latest trends and advancements in Node.js, JavaScript, and backend development.
- Proactively explore and evaluate new technologies and tools to improve our development processes and application quality.

Skills Required
- Deep understanding of JavaScript fundamentals, including asynchronous programming, closures, and prototypal inheritance.
- Comprehensive knowledge of the Node.js runtime environment, event loop, and core modules.
- Strong experience with at least one popular Node.js framework (Express.js is highly preferred; experience with NestJS or Koa.js is a plus).
- Proven ability to design, develop, and document RESTful APIs.
- Hands-on experience integrating with various types of databases (both SQL and NoSQL).
- Testing: Proficiency in writing unit, integration, and end-to-end tests with relevant testing frameworks.

Posted 3 weeks ago

Apply

3.0 - 5.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Back-End Developer

Your Role: As a Back-End Developer, you'll collaborate with the development team to build and maintain scalable, secure, and high-performing back-end systems for our SaaS products. You will play a key role in designing and implementing microservices architectures, integrating databases, and ensuring the seamless operation of cloud-based applications.

Responsibilities:
- Design, develop, and maintain robust and scalable back-end solutions using modern frameworks and tools.
- Create, manage, and optimize microservices architectures, ensuring efficient communication between services.
- Develop and integrate RESTful APIs to support front-end and third-party systems.
- Design and implement database schemas and optimize performance for SQL and NoSQL databases.
- Support deployment processes by aligning back-end development with CI/CD pipeline requirements.
- Implement security best practices, including authentication, authorization, and data protection.
- Collaborate with front-end developers to ensure seamless integration of back-end services.
- Monitor and enhance application performance, scalability, and reliability.
- Keep up to date with emerging technologies and industry trends to improve back-end practices.

Your Qualifications:

Must-Have Skills
- Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field.
- Proven experience as a Back-End Developer with expertise in modern frameworks such as Node.js, Express.js, or Django.
- Expertise in .NET frameworks, including C++ and C# development for high-performance database work.
- Strong proficiency in building and consuming RESTful APIs.
- Expertise in database design and management with both SQL (e.g., PostgreSQL, MS SQL Server) and NoSQL (e.g., MongoDB, Cassandra) databases.
- Hands-on experience with microservices architecture and containerization tools like Docker and Kubernetes.
- Strong understanding of cloud platforms like Microsoft Azure, AWS, or Google Cloud for deployment, monitoring, and management.
- Proficiency in implementing security best practices (e.g., OAuth, JWT, encryption techniques).
- Experience with CI/CD pipelines and tools such as Jenkins, GitHub Actions, or Azure DevOps.
- Familiarity with Agile methodologies and participation in sprint planning and reviews.

Good-to-Have Skills
- Experience with time-series databases like TimescaleDB or InfluxDB.
- Experience with monitoring solutions like Datadog or Splunk.
- Experience with real-time data processing frameworks like Kafka or RabbitMQ.
- Familiarity with serverless architectures and tools like Azure Functions or AWS Lambda.
- Expertise in Java backend services and microservices.
- Hands-on experience with visualization and monitoring tools like Grafana or Kibana.
- Knowledge of API management platforms like Kong or Apigee.
- Experience with integrating AI/ML models into back-end systems.
- Familiarity with MLOps pipelines and managing AI/ML workloads.
- Understanding of iPaaS (Integration Platform as a Service) and related technologies.

Key Competencies & Attributes:
- Strong problem-solving and analytical skills.
- Exceptional organizational skills with the ability to manage multiple priorities.
- Adaptability to evolving technologies and industry trends.
- Excellent collaboration and communication skills to work effectively in cross-functional teams.
- Ability to thrive in self-organizing teams with a focus on transparency and trust.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

14 - 18 Lacs

Bengaluru

Work from Office

Experience with:
- Leading Java development (Spring Boot, Java 8+, Spring, microservices)
- REST APIs: designing, building, and maintaining microservices, and applying best practices including security and performance tuning
- AWS (Amazon Web Services) services
- Relational and NoSQL databases, e.g., MySQL, MongoDB, Redis, Cassandra
- Architecting cloud solutions with a focus on scalability and high availability in a microservice architecture (Spring Boot)
- Designing, enhancing, updating, and programming changes for portions and subsystems of systems software, including utilities, databases, and CI/CD tools
- Basic knowledge of Kubernetes deployments for microservices and tools like kOps, Helm, etc.
- Strong core Java skills: design patterns, collections, garbage collection, multithreading
- Extensive experience with Java frameworks and technologies, including Spring Boot, Hibernate, and Maven
- Writing and managing JUnit tests and mock frameworks such as Mockito, JMock, or equivalent
- Relational databases and SQL (e.g., Oracle, SQL Server), as well as NoSQL databases such as MongoDB or Cassandra
- RESTful web services and API design

Required Skills:
- 5+ years of Java and microservices experience
- Excellent analytical and problem-solving skills
- Strong ability to work independently, propose architectural solutions, create prototypes, and deliver the necessary technical documentation
- Ability to provide technical guidance on full stack, design, coding, and delivery
- Skilled at extracting and writing requirements and specifications; extensive experience with multiple software application design tools and languages
- Excellent written and verbal communication skills; proficiency in English and the local language
- Ability to effectively communicate product architectures and design proposals, and to negotiate
- Ability to work independently in a fast-paced environment and deliver results under pressure
- Passion for quality and attention to detail
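For candidates brushing up on the stack above, the sketch below shows roughly what a minimal Spring Boot REST controller looks like. It is illustrative only: the OrderController class, the /api/v1/orders path, and the in-memory map are hypothetical stand-ins rather than anything from the posting.

    // Illustrative only: class, endpoint, and field names are hypothetical.
    import org.springframework.http.ResponseEntity;
    import org.springframework.web.bind.annotation.*;

    import java.net.URI;
    import java.util.Map;
    import java.util.Optional;
    import java.util.UUID;
    import java.util.concurrent.ConcurrentHashMap;

    @RestController
    @RequestMapping("/api/v1/orders")
    public class OrderController {

        // In-memory store standing in for a real repository (MySQL, MongoDB, Cassandra, ...).
        private final Map<String, String> orders = new ConcurrentHashMap<>();

        @GetMapping("/{id}")
        public ResponseEntity<String> getOrder(@PathVariable String id) {
            // Return 404 instead of null so the API contract stays explicit.
            return Optional.ofNullable(orders.get(id))
                    .map(ResponseEntity::ok)
                    .orElseGet(() -> ResponseEntity.notFound().build());
        }

        @PostMapping
        public ResponseEntity<String> createOrder(@RequestBody String payload) {
            String id = UUID.randomUUID().toString();
            orders.put(id, payload);
            // 201 Created plus a Location header is the conventional response for resource creation.
            return ResponseEntity.created(URI.create("/api/v1/orders/" + id)).body(id);
        }
    }

In a real service the map would be replaced by a repository backed by MySQL, MongoDB, or Cassandra, and the endpoints would add validation, error handling, and security.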

Posted 3 weeks ago

Apply

7.0 - 8.0 years

17 - 22 Lacs

Mumbai

Work from Office

About the Role

We are seeking a highly skilled and experienced Senior Data Architect to join our growing data engineering team. As a Senior Data Architect, you will play a critical role in designing, developing, and implementing robust and scalable data solutions to support our business needs. You will be responsible for defining data architectures, ensuring data quality and integrity, and driving data-driven decision making across the organization.

Key Responsibilities
- Design and implement data architectures for various data initiatives, including data warehouses, data lakes, and data marts.
- Define data models, data schemas, and data flows for complex data integration projects.
- Develop and maintain data dictionaries and metadata repositories.
- Ensure data quality and consistency across all data sources.
- Design and implement data warehousing solutions, including ETL/ELT processes, data transformations, and data aggregations.
- Support the development and implementation of business intelligence and reporting solutions.
- Optimize data warehouse performance and scalability.
- Define and implement data governance policies and procedures.
- Ensure data security and compliance with relevant regulations (e.g., GDPR, CCPA).
- Develop and implement data access controls and data masking strategies.
- Design and implement data solutions on cloud platforms (AWS, Azure, GCP), leveraging cloud-native data services.
- Implement data pipelines and data lakes on cloud platforms.
- Collaborate effectively with data engineers, data scientists, business analysts, and other stakeholders.
- Communicate complex technical information clearly and concisely to both technical and non-technical audiences.
- Present data architecture designs and solutions to stakeholders.

Qualifications (Essential)
- 7+ years of experience in data architecture, data modeling, and data warehousing.
- Strong understanding of data warehousing concepts, including dimensional modeling, ETL/ELT processes, and data quality.
- Experience with relational databases (e.g., SQL Server, Oracle, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra).
- Experience with data integration tools and technologies.
- Excellent analytical and problem-solving skills.
- Strong communication and interpersonal skills.
- Bachelor's degree in Computer Science, Computer Engineering, or a related field.

Posted 3 weeks ago

Apply

Exploring Cassandra Jobs in India

Apache Cassandra is a popular open-source, distributed NoSQL database management system designed to handle large volumes of data across many servers, and it is widely used in the tech industry. In India, demand for professionals with Cassandra skills is on the rise, and many companies are actively hiring for roles built around this technology.

Top Hiring Locations in India

Here are 5 major cities in India where there is a high demand for Cassandra professionals:

  1. Bangalore
  2. Hyderabad
  3. Pune
  4. Chennai
  5. Mumbai

Average Salary Range

The salary range for Cassandra professionals in India varies based on experience level. Entry-level positions can expect to earn around INR 5-8 lakhs per annum, while experienced professionals can earn upwards of INR 15 lakhs per annum.

Career Path

Typically, a career in Cassandra progresses from roles such as Junior Developer or Database Administrator to Senior Developer, Tech Lead, and eventually Architect or Data Engineer.

Related Skills

In addition to proficiency in Cassandra, employers often look for candidates with the following skills:

  • Strong knowledge of SQL and NoSQL databases
  • Experience with data modeling and database design
  • Proficiency in programming languages like Java or Python
  • Understanding of distributed systems and cloud technologies
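To practice combining the skills above, the short Java program below connects to Cassandra with the DataStax driver and runs a trivial query. It is a minimal sketch, assuming a single local node on the default port 9042 and the default datacenter name "datacenter1"; both values are assumptions and would change for a real cluster.

    import com.datastax.oss.driver.api.core.CqlSession;
    import com.datastax.oss.driver.api.core.cql.Row;

    import java.net.InetSocketAddress;

    public class CassandraQuickstart {
        public static void main(String[] args) {
            // Contact point and datacenter name are assumptions for a local test node.
            try (CqlSession session = CqlSession.builder()
                    .addContactPoint(new InetSocketAddress("127.0.0.1", 9042))
                    .withLocalDatacenter("datacenter1")
                    .build()) {
                // system.local is a built-in table, so this query works on any cluster.
                Row row = session.execute("SELECT release_version FROM system.local").one();
                System.out.println("Connected to Cassandra "
                        + (row != null ? row.getString("release_version") : "(unknown version)"));
            }
        }
    }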

Interview Questions

  • What is Cassandra and why is it used? (basic)
  • Explain the CAP theorem and how it relates to Cassandra. (medium)
  • How does Cassandra ensure fault tolerance and high availability? (medium)
  • What are the different types of consistency levels in Cassandra? (medium)
  • How does compaction work in Cassandra? (medium)
  • Explain the role of the partition key in Cassandra. (basic)
  • What is a secondary index in Cassandra? (basic)
  • How does Cassandra handle data replication? (medium)
  • What is a tombstone in Cassandra? (medium)
  • Explain the concept of tunable consistency in Cassandra. (advanced)
  • How does read repair work in Cassandra? (medium)
  • What is a batch statement in Cassandra? (basic)
  • How does compaction help in managing disk space in Cassandra? (medium)
  • What is a token in Cassandra? (basic)
  • Explain the role of a coordinator node in Cassandra. (medium)
  • How does the gossip protocol work in Cassandra? (medium)
  • What is the role of a snitch in Cassandra? (basic)
  • How does hinted handoff help in ensuring eventual consistency in Cassandra? (medium)
  • Explain the process of anti-entropy repair in Cassandra. (advanced)
  • How does Cassandra handle data distribution across nodes? (medium)
  • What is the role of a commit log in Cassandra? (basic)
  • How do lightweight transactions work in Cassandra? (advanced)
  • Explain the concept of compaction strategies in Cassandra. (medium)
  • What is the role of a Bloom filter in Cassandra? (basic)
  • How do range tombstones help in deleting data in Cassandra? (advanced)
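Several of the questions above (partition keys, tunable consistency, lightweight transactions) are easier to internalize with a small hands-on experiment. The Java sketch below is illustrative only: the demo keyspace, tables, and values are invented for the example, and it assumes a local single-node cluster with the default "datacenter1" name.

    import com.datastax.oss.driver.api.core.ConsistencyLevel;
    import com.datastax.oss.driver.api.core.CqlSession;
    import com.datastax.oss.driver.api.core.cql.SimpleStatement;

    public class CassandraConceptsDemo {
        public static void main(String[] args) {
            // Assumes a local node reachable on the defaults; datacenter name is an assumption.
            try (CqlSession session = CqlSession.builder()
                    .withLocalDatacenter("datacenter1")
                    .build()) {

                session.execute("CREATE KEYSPACE IF NOT EXISTS demo WITH replication = "
                        + "{'class': 'SimpleStrategy', 'replication_factor': 1}");

                // Partition key (user_id) decides which node owns the data; the clustering
                // column (event_time) orders rows inside each partition.
                session.execute("CREATE TABLE IF NOT EXISTS demo.user_events ("
                        + "user_id uuid, event_time timestamp, event_type text, "
                        + "PRIMARY KEY (user_id, event_time))");

                // Tunable consistency: QUORUM waits for a majority of replicas to acknowledge the write.
                session.execute(SimpleStatement.builder(
                                "INSERT INTO demo.user_events (user_id, event_time, event_type) "
                                        + "VALUES (uuid(), toTimestamp(now()), 'login')")
                        .setConsistencyLevel(ConsistencyLevel.QUORUM)
                        .build());

                // Lightweight transaction: IF NOT EXISTS runs a Paxos round so the row is created at most once.
                session.execute("CREATE TABLE IF NOT EXISTS demo.users (user_id uuid PRIMARY KEY, email text)");
                boolean applied = session.execute(
                        "INSERT INTO demo.users (user_id, email) "
                                + "VALUES (123e4567-e89b-12d3-a456-426614174000, 'a@example.com') IF NOT EXISTS")
                        .wasApplied();
                System.out.println("Lightweight transaction applied: " + applied);
            }
        }
    }

With replication_factor 1 a QUORUM write is trivially satisfied by the single replica; on a multi-node cluster the same statement would wait for a majority of replicas, which is exactly what tunable consistency means.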

Closing Remark

As you explore job opportunities in the Cassandra domain, make sure to brush up on your skills and be well-prepared for interviews. With the right preparation and confidence, you can land a rewarding career in this growing field. Good luck!
