
864 Cassandra Jobs - Page 10

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

4.0 - 6.0 years

6 - 10 Lacs

Bengaluru

Work from Office

We're looking for talented software engineers with a passion for a systems-oriented view of software engineering. You'll be challenged with untangling complex knots in code-bases and concurrent systems, with expertise in Java or other C-lineage languages (Scala, Kotlin, C#, C++, Rust, etc.). A genuine passion for quality, elegance, performance, and simplicity in solutions and code is critical in this role. If you're comfortable navigating multi-threaded, large distributed systems at scale, this will be a great fit. This role is part of the engineering teams that develop our database products: Astra (our multi-cloud database-as-a-service) and on-premises DSE/HCD, which are all based on Apache Cassandra. Our products are used by many major organizations across the world!

What you will do:
- Develop new features, enhancements, and bug fixes for our highly scalable, multi-tenant database products.
- Collaborate extensively with internal teams to coordinate releases, support existing customers through defect fixes and improvements, and review/advise on documentation for the project.
- Potentially contribute to the Apache Cassandra NoSQL database management system, which handles large sets of data.
- Model solid engineering practices around architectural design, testability, scalability, and maintainability.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- 4-6 years of relevant experience in software engineering
- Substantial experience programming distributed and high-throughput applications
- Deep knowledge of Java and the JVM ecosystem of open-source libraries and projects
- Deep understanding of algorithms, data structures, and software design
- Strong coding skills in Java, Python, C/C++
- Comfortable handling problems related to concurrency and distributed computing
- Familiarity with bug tracking, version control, build automation, and test automation tools
- Experience across the SDLC, having contributed at each step: Plan, Track, Code, Build, Test, Deploy, and Monitor
- Experience with concurrency, memory management, and I/O
- Experience with Linux or other Unix-like systems
- An open-minded and collaborative attitude

Preferred technical and professional experience:
- Experience with database internals (preferably NoSQL)
- Experience with at least one major public cloud provider
- Experience with Kubernetes
- Experience with Apache Cassandra is a plus
- Experience with cloud-scale SaaS applications is a plus
- Prior experience contributing to open-source projects is a plus
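For a concrete feel of the Java-plus-Cassandra surface this role describes, here is a minimal, hedged sketch that connects to a Cassandra node and runs a query using the open-source DataStax Java driver. The driver choice, contact point, and query are illustrative assumptions, not details from the posting.

```java
import com.datastax.oss.driver.api.core.CqlSession;
import com.datastax.oss.driver.api.core.cql.ResultSet;
import com.datastax.oss.driver.api.core.cql.Row;

public class CassandraQuickstart {
    public static void main(String[] args) {
        // Connects to a Cassandra node on localhost:9042 (the driver's default);
        // the datacenter name "datacenter1" is an assumption for a stock local install.
        try (CqlSession session = CqlSession.builder()
                .withLocalDatacenter("datacenter1")
                .build()) {
            ResultSet rs = session.execute("SELECT release_version FROM system.local");
            Row row = rs.one();
            System.out.println("Connected, Cassandra version: " + row.getString("release_version"));
        }
    }
}
```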

Posted 2 weeks ago

Apply

9.0 - 14.0 years

4 - 8 Lacs

Bengaluru

Work from Office

We're looking for talented software engineers with a passion for learning and a systems-oriented view of software engineering to join our team working full-time on the core of our proprietary database based on Apache Cassandra. In this position you will play an important role on a complex infrastructure project used by many major organizations across the world, collaborating with fellow engineers to improve the project. If you want to work on the most interesting problems of your career with the most collaborative and skilled peers you've ever worked with, this might be the role for you! You'll take on a critical role on the core of our platform, working on enhancements and bug fixes for our multi-model distributed database. Engineers on this team collaborate extensively with internal teams to coordinate releases, support existing customers through defect fixes and improvements, and review and advise on documentation for the project. We're looking for engineers who have a knack for untangling complex knots in code-bases and concurrent systems, with expertise in a C-lineage language (Java, Scala, Kotlin, C#, C++, Rust, etc.). A genuine passion for quality, elegance, performance, and simplicity in solutions and code is critical in this role. If you're comfortable navigating multi-threaded, large distributed systems at scale, this will be a great fit.

What you will do:
- Author, debug, and improve code in the core of DataStax Enterprise Cassandra
- Collaborate actively, and in a self-driven way, with other engineers, the field team, and support members
- Work on maintenance, bug fixes, new feature development, and improvements to the platform
- Help prepare different teams for DSE releases (documentation, field, etc.)

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- 7-9+ years of relevant work experience
- Expertise in at least one C-lineage language that supports OOP and FP (Java, Scala, Kotlin, C#, C++, Rust, etc.)
- Ability to work autonomously, self-manage your time, and, to an extent, self-direct when given high-level strategic priorities
- Ability to communicate clearly with peers and stakeholders verbally and via text (video calls, JIRA, Slack, email)
- Demonstrated ability to focus on analytical tasks such as finding issues in a huge, distributed system
- A desire to learn and grow daily, both technically and interpersonally
- An open-minded and collaborative attitude

Preferred technical and professional experience:
- Expertise in Java and Scala programming on the JVM
- Experience with concurrency, memory management, and I/O
- Experience with Linux or other Unix-like systems
- Experience with distributed databases, DataStax Enterprise or Apache Cassandra in particular
- Experience with distributed computing platforms, Apache Spark in particular

Posted 2 weeks ago

Apply

4.0 - 6.0 years

3 - 7 Lacs

Bengaluru

Work from Office

As a Hybrid Cloud Support Engineer, you will utilize your passion for helping others to ensure that our Users and Enterprises are successful in their use of DataStax products and solutions. This is a continuous learning and teaching role where you will develop and share your knowledge of troubleshooting, configuration, and exciting new technologies inclusive of and complementary to Apache Cassandra, DataStax Enterprise, and Astra.

What you will do:
- Research, reproduce, troubleshoot, and solve highly challenging technical issues
- Provide thoughtful direction and support for technical inquiries
- Ensure that customer issues are resolved as expediently as possible
- Diagnose and reproduce customer-reported issues and log JIRA tickets
- Participate in on-call rotation for after-hours, holiday, and weekend support coverage
- Create code samples, tutorials, and articles for the DataStax Knowledge Base
- Collaborate and contribute to Support Team infrastructure tools and processes
- Fulfill the on-call rotation requirements of this role

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- 4-6 years of experience supporting large enterprise customers in a customer-facing support role
- Experience supporting a Software-as-a-Service cloud product
- Experience with Grafana, Prometheus, Splunk, Datadog, and other monitoring solutions
- Experience supporting Kubernetes-based distributed applications, or an understanding of Kubernetes fundamentals
- Experience with pub-sub, messaging, and streaming solutions like Pulsar and Kafka
- Experience using APIs and understanding the app development lifecycle with a language or framework based on Java, Python, or Go would be preferred
- Experience/certifications with AWS/GCP/Azure deployments and associated cloud-based monitoring tools would be preferred
- Experience with Linux operating systems, including command line, performance, and network troubleshooting
- Excellent verbal and written communication skills
- Lifetime learner, self-motivated, with the ability to multi-task during high-pressure situations

Preferred technical and professional experience:
- Supporting Apache Cassandra environments or other relational and/or alternative database technologies is a plus
- Understanding of Java, Python, Go, and/or another language (troubleshooting skills)
- Experience with escalation management and customer success or premium support
- Experience working in a fast-moving, high-pressure environment

Posted 2 weeks ago

Apply

4.0 - 6.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Databases and event logs are complementary infrastructure in modern software architecture. You will be a core member of the engineering team, enhancing and optimizing Pulsar to work with Cassandra in DataStax's Astra cloud service. You will be highly involved in the design, implementation, and operation of solutions that solve problems for the world's leading enterprises as we scale up and deliver an amazing developer experience.

What you will do:
- Author, debug, and improve code in the core of Apache Pulsar
- Contribute to open-source and proprietary projects that interface with Pulsar
- Aid production support teams in debugging and root-causing user-facing issues
- Work in a fast-moving environment to rapidly prototype, iterate, and evolve solutions for real-world developer needs
- Perform regular code reviews among peers

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- 4-6 years of relevant experience
- Systems-level proficiency in Java or another popular language
- Experience working on large-scale distributed systems
- Experience with multi-threaded programming and concurrency primitives
- Knowledge of distributed data stores (NoSQL systems) to achieve massive scalability and availability of the data made available by your data pipelines
- Familiarity with bug tracking, version control, build automation, and test automation tools
- Experience across the SDLC, having contributed at each step: Plan, Track, Code, Build, Test, Deploy, and Monitor

Preferred technical and professional experience:
- Experience with Apache Pulsar or Kafka is a plus
- Experience with Apache Cassandra is a plus
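As a rough illustration of the kind of code that sits around this role, below is a minimal sketch that publishes one message with the Apache Pulsar Java client. The broker URL and topic name are assumptions for illustration only, not details from the posting.

```java
import org.apache.pulsar.client.api.Producer;
import org.apache.pulsar.client.api.PulsarClient;
import org.apache.pulsar.client.api.PulsarClientException;

public class PulsarProducerSketch {
    public static void main(String[] args) throws PulsarClientException {
        // Assumed local broker URL and topic name.
        PulsarClient client = PulsarClient.builder()
                .serviceUrl("pulsar://localhost:6650")
                .build();

        Producer<byte[]> producer = client.newProducer()
                .topic("persistent://public/default/events")
                .create();

        // Publish a single message and wait for the broker's acknowledgement.
        producer.send("hello from the sketch".getBytes());

        producer.close();
        client.close();
    }
}
```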

Posted 2 weeks ago

Apply

2.0 - 6.0 years

30 - 35 Lacs

Bengaluru

Work from Office

Function: Software Engineering, Backend Development

Responsibilities:
- You will work on building the biggest neo-banking app of India
- You will own the design process and the implementation of standard software engineering methodologies while improving performance, scalability, and maintainability
- You will be translating functional and technical requirements into detailed design and architecture
- You will be collaborating with UX designers and product owners on detailed product requirements
- You will be part of a fast-growing engineering group
- You will be responsible for mentoring other engineers, defining our tech culture, and helping build a fast-growing team

Requirements:
- 2-6 years of experience in product development, design, and architecture
- Hands-on expertise in at least one of the following programming languages: Java, Python, NodeJS, and Go
- Hands-on expertise in SQL and NoSQL databases
- Expertise in problem solving, data structures, and algorithms
- Deep understanding of, and experience in, object-oriented design
- Ability to design and architect horizontally scalable software systems
- Drive to constantly learn and improve yourself and the processes surrounding you
- Mentoring, collaborating, and knowledge sharing with other engineers in the team
- Self-starter
- Strive to write the optimal code possible, day in, day out

What you will get:

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

Changing the world through digital experiences is what Adobe is all about. We give everyone - from emerging artists to global brands - everything they need to design and deliver exceptional digital experiences! We are passionate about empowering people to create beautiful and powerful images, videos, and apps, transforming how companies interact with customers across every screen. We are on a mission to hire the very best and are committed to creating exceptional employee experiences where everyone is respected and has access to equal opportunity. We realize that new ideas can come from everywhere in the organization, and we know the next big idea could be yours! Role Summary: Digital Experience (DX) is a USD 4B+ business serving the needs of enterprise businesses, including 95%+ of Fortune 500 organizations. Adobe Marketo Engage, within Adobe DX, the leading marketing automation platform, helps businesses engage customers effectively through various surfaces and touchpoints. We are looking for strong and passionate engineers to join our team as we scale the business by building next-gen products and contributing to our existing offerings. If you're passionate about innovative technology, then we would be excited to talk to you! What You'll Do: - Collaborate with architects, product management, and engineering teams to build solutions that increase the product's value. - Develop technical specifications, prototypes, and presentations to communicate your ideas. - Stay proficient in emerging industry technologies and trends, communicating that knowledge to the team and using it to influence product direction. - Demonstrate exceptional coding skills by writing unit tests, ensuring code quality, and code coverage. - Ensure code is always checked in and that source control standards are followed. What You Need to Succeed: - 5+ years of experience in software development. - Expertise in Java, Spring Boot, Rest Services, MySQL or Postgres, MongoDB. - Good working knowledge of Azure ecosystem and Azure data factory. - Good understanding of working with Cassandra, Solr, ElasticSearch, Snowflake. - Ambitious and not afraid to tackle unknowns, demonstrating a strong bias to action. - Knowledge in Apache Spark and Scala is an added advantage. - Strong interpersonal, analytical, problem-solving, and conflict resolution skills. - Excellent speaking, writing, and presentation skills, as well as the ability to persuade, encourage, and empower others. - Bachelors/Masters in Computer Science or a related field. Adobe aims to make Adobe.com accessible to any and all users. If you have a disability or special need that requires accommodation to navigate our website or complete the application process, email accommodations@adobe.com or call (408) 536-3015.,
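Purely to ground the Java/Spring Boot/REST portion of the stack listed above - this is a hedged sketch, not Adobe or Marketo Engage code - a minimal endpoint might look like the following; the resource name and JSON shape are hypothetical.

```java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

@SpringBootApplication
public class DemoApplication {
    public static void main(String[] args) {
        SpringApplication.run(DemoApplication.class, args);
    }
}

// Hypothetical read-only endpoint; a real service would back this with MySQL/Postgres, MongoDB, or Cassandra.
@RestController
class CampaignController {
    @GetMapping("/campaigns/{id}")
    public String getCampaign(@PathVariable String id) {
        return "{\"id\": \"" + id + "\", \"status\": \"active\"}";
    }
}
```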

Posted 2 weeks ago

Apply

10.0 - 15.0 years

0 Lacs

delhi

On-site

Wingify is looking for a Senior Data Architect to join the team in Delhi. As a Senior Data Architect, you will be responsible for leading and mentoring a team of data engineers, optimizing scalable data infrastructure, driving data governance frameworks, collaborating with cross-functional teams, and ensuring data security, compliance, and quality. Your role will involve optimizing data processing workflows, fostering a culture of innovation and technical excellence, and aligning technical strategy with business objectives.

To be successful in this role, you should have at least 10 years of experience in software/data engineering, with a minimum of 3 years in a leadership position. You should possess expertise in backend development using programming languages like Java, PHP, Python, Node.JS, GoLang, JavaScript, HTML, and CSS. Proficiency in SQL, Python, and Scala for data processing and analytics is essential, along with a strong understanding of cloud platforms such as AWS, GCP, or Azure and their data services. Additionally, you should have experience with big data technologies like Spark, Hadoop, Kafka, and distributed computing frameworks, as well as hands-on experience with data warehousing solutions like Snowflake, Redshift, or BigQuery. Deep knowledge of data governance, security, and compliance, along with familiarity with NoSQL databases and automation/DevOps tools, is required. Strong leadership, communication, and stakeholder management skills are crucial for this role.

Preferred qualifications include experience in machine learning infrastructure or MLOps, exposure to real-time data processing and analytics, and interest in data structures, algorithm analysis and design, multicore programming, and scalable architecture. Prior experience in a SaaS or high-growth tech company would be advantageous.

Please note that candidates must have a minimum of 10 years of experience to be eligible for this role. Graduation from Tier-1 colleges, such as IIT, is preferred. Candidates from B2B product companies with high data traffic are encouraged to apply; those who do not meet these criteria are kindly requested not to apply.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

hyderabad, telangana

On-site

As an Engineer, IT Data at American Airlines, you will be part of a diverse, high-performing team dedicated to technical excellence. Your primary focus will be on delivering unrivaled digital products that drive a more reliable and profitable airline. The Data Domain you will work in encompasses managing and leveraging data as a strategic asset, including data management, storage, integration, and governance. This domain also involves Machine Learning, AI, Data Science, and Business Intelligence. In this role, you will collaborate closely with source data application teams and product owners to design, implement, and support analytics solutions that provide insights to make better decisions. You will be responsible for implementing data migration and data engineering solutions using Azure products and services such as Azure Data Lake Storage, Azure Data Factory, Azure Functions, Event Hub, Azure Stream Analytics, Azure Databricks, among others, as well as traditional data warehouse tools. Your tasks will span multiple aspects of the development lifecycle, including design, cloud engineering (Infrastructure, network, security, and administration), ingestion, preparation, data modeling, testing, CICD pipelines, performance tuning, deployments, consumption, BI, alerting, and prod support. Furthermore, you will provide technical leadership within a team environment and work independently. As part of a DevOps team, you will completely own and support the product, implementing batch and streaming data pipelines using cloud technologies. Your responsibilities will also include leading the development of coding standards, best practices, and privacy and security guidelines, as well as mentoring others on technical and domain skills to create multi-functional teams. For success in this role, you will need a Bachelor's degree in Computer Science, Computer Engineering, Technology, Information Systems (CIS/MIS), Engineering, or a related technical discipline, or equivalent experience/training. You should have at least 3 years of software solution development experience using agile, DevOps, operating in a product model, as well as 3 years of data analytics experience using SQL. Additionally, a minimum of 3 years of cloud development and data lake experience, preferably in Microsoft Azure, is required. Preferred qualifications include 5+ years of software solution development experience using agile, dev ops, a product model, and 5+ years of data analytics experience using SQL. Experience in full-stack development, preferably in Azure, and familiarity with Teradata Vantage development and administration are also preferred. Airline industry experience is a plus. In terms of skills, licenses, and certifications, you should have expertise with the Azure Technology stack for data management, data ingestion, capture, processing, curation, and creating consumption layers. An Azure Development Track Certification and Spark Certification are preferred. Proficiency in several tools/platforms such as Python, Spark, Unix, SQL, Teradata, Cassandra, MongoDB, Oracle, SQL Server, ADLS, Snowflake, and more is required. Additionally, experience with Azure Cloud Technologies, CI/CD tools, BI Analytics Tool Stack, and Data Governance and Privacy tools is beneficial for this role.,

Posted 2 weeks ago

Apply

14.0 - 18.0 years

0 Lacs

karnataka

On-site

Changing the world through digital experiences is what Adobe is all about. At Adobe, we provide everyone - from emerging artists to global brands - with everything they need to design and deliver exceptional digital experiences. We are passionate about empowering people to create beautiful and powerful images, videos, and apps, and transform how companies interact with customers across every screen. We are on a mission to hire the very best and are committed to creating exceptional employee experiences where everyone is respected and has access to equal opportunity. We believe that new ideas can come from anywhere in the organization, and we value the contribution of every individual. Digital Experience (DX) is a USD 4B+ business that serves the needs of enterprise businesses, including 95%+ of Fortune 500 organizations. Adobe Marketo Engage, within Adobe DX, is the world's largest marketing automation platform. It is a comprehensive solution that enables enterprises to attract, segment, and nurture customers from discovery to becoming their biggest fans. It helps enterprises effectively engage with customers through various surfaces and touchpoints. We are looking for dedicated and enthusiastic engineers to join our team as we expand the business by developing next-generation products and enhancing our current offerings. If you are passionate about innovative technology, we would be thrilled to have a conversation with you! **What you'll do:** - Be an inspiring leader in building next-generation Multi-cloud services. - Deliver high-performance services that are adaptable to multifaceted business needs and influence ideation & outstanding problem-solving. - Build secure cloud services that provide very high availability, reliability, and security to our customers and their assets. - Lead the technical design, vision, and implementation strategy. - Define and apply best practices to build maintainable and modular solutions with high quality. - Partner with global product management, UX, engineering, and operations teams to help shape technical product architecture & practices, roadmap, and release plans. - Develop and evolve engineering processes and teamwork models, applying creative problem-solving to optimize team efficiency. - Create technical specifications, prototypes, and presentations to communicate your ideas. - Mentor and guide a high-performing engineering team. - Craft a positive winning culture built on collaboration and shared accomplishments. - Lead discussions on emerging industry technologies & trends and work with the team & leadership to use this knowledge to influence product direction. **What you need to succeed:** - Passion and love for what you do! - 14+ years of experience in software development with 5+ years as an Architect. - Strong design, coding, and architectural skills along with problem-solving and analytical abilities. - Expertise in architecting, designing, and building scalable and performant frontend applications. - Expertise in Java, Spring Boot, Rest Services, MySQL or Postgres, MongoDB, Kafka. - Experience in developing and building solutions with cloud technologies (AWS and/or Azure). - Good understanding of working with Cassandra, Solr, ElasticSearch, Snowflake. - Experience with API Design, the ability to architect and implement an intuitive customer and third-party integration story. - Exceptional coding skills, including an excellent understanding of optimization, performance ramifications of coding decisions, and object-oriented design. 
- Proven track record of working, coaching, and mentoring software engineers. - Ambitious and not afraid to tackle unknowns, demonstrates a strong bias for action. - Self-motivated with a desire to mentor a team and the ability to drive the team to accomplish high-quality work. - Strong interpersonal, analytical, problem-solving, and conflict resolution skills. - Excellent speaking, writing, and presentation skills, as well as the ability to persuade, encourage, and empower others. - BS or MS or equivalent experience in Computer Science or a related field. Adobe aims to make Adobe.com accessible to all users. If you have a disability or special need that requires accommodation to navigate our website or complete the application process, please email accommodations@adobe.com or call (408) 536-3015.,

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

pune, maharashtra

On-site

As a Backend Engineer with 4 to 8 years of experience, you will be responsible for developing enterprise-grade systems with a focus on backend development using Scala and Akka. You will work on building microservices, high-performance APIs, and real-time event processing engines. Your key responsibilities will include building microservices using Akka HTTP and Akka Streams, designing backend architecture and data processing flows, developing and maintaining RESTful APIs, optimizing application performance, and ensuring system reliability. You will also collaborate with DevOps for CI/CD pipelines and production deployments. In terms of technical skills, you should have proficiency in Scala and the Akka ecosystem, experience in REST API design and implementation, familiarity with Akka Streams and reactive systems, and knowledge of databases like PostgreSQL, MongoDB, or Cassandra. Experience with containerization tools like Docker and Kubernetes is also required. Preferred skills for this role include an understanding of data pipelines and message queues such as Kafka or RabbitMQ, as well as knowledge of authentication protocols like OAuth2 and JWT. In addition to technical skills, soft skills such as being detail-oriented with strong debugging abilities and the ability to work independently and in teams are essential for success in this role. This position offers you the opportunity to work on complex backend engineering projects, gain exposure to cloud-native development, and grow into tech lead roles. If you are looking to further develop your skills in Scala, Akka, and backend development in a collaborative environment, this role is perfect for you.,

Posted 2 weeks ago

Apply

1.0 - 5.0 years

0 Lacs

bharuch, gujarat

On-site

As a seasoned Full Stack Lead Developer with 12-14 years of experience, you will have the opportunity to join a prestigious product-based technology company located in Bangalore. Known for its innovation, stability, and exceptional work-life balance, this company is seeking a talented individual to take on a leadership role within their development team. Your role will involve leading and mentoring a team of Full Stack Developers, collaborating closely with product managers, designers, and stakeholders, and designing scalable, maintainable, and high-performing web applications. You will be expected to demonstrate expertise in front-end technologies such as HTML, CSS, JavaScript, and modern frameworks like React, Angular, or Vue.js, as well as proficiency in back-end technologies like Node.js, Python, Ruby, or Java. Strong knowledge of both relational (MySQL, PostgreSQL) and NoSQL (MongoDB, Cassandra) databases will be essential for success in this position. The ideal candidate will hold a Bachelor's or Master's degree in Computer Science, Engineering, or a related field, and possess proven experience as a Full Stack Developer with a robust portfolio. You must be capable of writing clean, efficient, and well-documented code, conducting code reviews, and promoting best practices within the team. Joining a 175-year-old product development company in the climate-tech domain, you will become part of a workplace recognized among the best in the tech industry for its stability and commitment to promoting a healthy work-life balance. Don't miss this opportunity to contribute to innovation and career growth in a supportive and forward-thinking environment. If you are passionate about Full Stack development, leadership, and working with cutting-edge technologies, apply now to be a part of this exciting journey in Bangalore. #Hiring #FullStackDeveloper #LeadDeveloper #BangaloreJobs #TechJobs #ClimateTech #ProductDevelopment #WorkLifeBalance #Innovation #CareerOpportunity,

Posted 2 weeks ago

Apply

11.0 - 15.0 years

0 Lacs

karnataka

On-site

As an AI Research Scientist, your role will involve developing the overarching technical vision for AI systems that cater to both current and future business needs. You will be responsible for architecting end-to-end AI applications, ensuring seamless integration with legacy systems, enterprise data platforms, and microservices. Collaborating closely with business analysts and domain experts, you will translate business objectives into technical requirements and AI-driven solutions. Working in partnership with product management, you will design agile project roadmaps that align technical strategy with market needs. Additionally, you will coordinate with data engineering teams to guarantee smooth data flows, quality, and governance across various data sources. Your responsibilities will also include leading the design of reference architectures, roadmaps, and best practices for AI applications. You will evaluate emerging technologies and methodologies, recommending innovations that can be integrated into the organizational strategy. Identifying and defining system components such as data ingestion pipelines, model training environments, CI/CD frameworks, and monitoring systems will be crucial aspects of your role. Leveraging containerization (Docker, Kubernetes) and cloud services, you will streamline the deployment and scaling of AI systems. Implementing robust versioning, rollback, and monitoring mechanisms to ensure system stability, reliability, and performance will also be part of your duties. Project management will be a key component of your role, overseeing the planning, execution, and delivery of AI and ML applications within budget and timeline constraints. You will be responsible for the entire lifecycle of AI application development, from conceptualization and design to development, testing, deployment, and post-production optimization. Enforcing security best practices throughout each phase of development, with a focus on data privacy, user security, and risk mitigation, will be essential. Furthermore, providing mentorship to engineering teams and fostering a culture of continuous learning will play a significant role in your responsibilities. In terms of mandatory technical and functional skills, you should possess a strong background in working with or developing agents using langgraph, autogen, and CrewAI. Proficiency in Python, along with robust knowledge of machine learning libraries such as TensorFlow, PyTorch, and Keras, is required. You should also have proven experience with cloud computing platforms (AWS, Azure, Google Cloud Platform) for building and deploying scalable AI solutions. Hands-on skills with containerization (Docker), orchestration frameworks (Kubernetes), and related DevOps tools like Jenkins and GitLab CI/CD are necessary. Experience using Infrastructure as Code (IaC) tools such as Terraform or CloudFormation to automate cloud deployments is essential. Additionally, proficiency in SQL and NoSQL databases (e.g., PostgreSQL, MongoDB, Cassandra) and expertise in designing distributed systems, RESTful APIs, GraphQL integrations, and microservices architecture are vital for this role. Knowledge of event-driven architectures and message brokers (e.g., RabbitMQ, Apache Kafka) is also required to support robust inter-system communications. Preferred technical and functional skills include experience with monitoring and logging tools (e.g., Prometheus, Grafana, ELK Stack) to ensure system reliability and operational performance. 
Familiarity with cutting-edge libraries such as Hugging Face Transformers, OpenAI's API integrations, and other domain-specific tools is advantageous. Experience in large-scale deployment of ML projects, along with a good understanding of DevOps/MLOps/LLM Ops and training and fine-tuning of Large Language Models (SLMs) like PALM2, GPT4, LLAMA, etc., is beneficial. Key behavioral attributes for this role include the ability to mentor junior developers, take ownership of project deliverables, contribute to risk mitigation, and understand business objectives and functions to support data needs. If you have a Bachelor's or Master's degree in Computer Science, certifications in cloud technologies (AWS, Azure, GCP), and TOGAF certification (good to have), along with 11 to 14 years of relevant work experience, this role might be the perfect fit for you.,

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

chennai, tamil nadu

On-site

Qualcomm India Private Limited is a leading technology innovator pushing the boundaries of what's possible to enable next-generation experiences and drive digital transformation for a smarter, connected future. As a Qualcomm Software Engineer, you will design, develop, create, modify, and validate embedded and cloud edge software, applications, and specialized utility programs to launch cutting-edge, world-class products exceeding customer needs. Collaborate with various teams to design system-level software solutions and gather performance requirements and interfaces. Minimum Qualifications: - Possess a Bachelor's degree in Engineering, Information Systems, Computer Science, or a related field. Senior Machine Learning & Data Engineer: Join our team as a Senior Machine Learning & Data Engineer with expertise in Python development. Design scalable data pipelines, build and deploy ML/NLP models, and enable data-driven decision-making within the organization. Key Responsibilities: - Data Engineering & Infrastructure: Design and implement robust ETL pipelines and data integration workflows using SQL, NoSQL, and big data technologies. - Machine Learning & NLP: Build, fine-tune, and deploy ML/NLP models using frameworks like TensorFlow, PyTorch, and Scikit-learn. - Python Development: Develop scalable backend services using Python frameworks such as FastAPI, Flask, or Django. - Collaboration & Communication: Work closely with cross-functional teams to integrate ML solutions into production systems. Required Qualifications: - Hold a Bachelors or Masters degree in Computer Science, Engineering, or a related field. - Possess strong Python programming skills and experience with modern libraries and frameworks. - Deep understanding of ML/NLP concepts and practical experience with LLMs and RAG architectures. Automation Engineer: As an Automation Engineer proficient in C#/Python development, you will play a crucial role in developing advanced solutions for Product Test automation. Collaborate with stakeholders to ensure successful implementation and operation of automation solutions. Responsibilities: - Design, develop, and maintain core APIs using C#. - Identify, troubleshoot, and optimize API development and testing. - Stay updated with industry trends in API development. Requirements: - Hold a Bachelor's degree in Computer Science, Engineering, or a related field. - Proven experience in developing APIs using C# and Python. - Strong understanding of software testing principles and methodologies. Qualcomm is an equal opportunity employer committed to providing accessible processes for individuals with disabilities. For accommodations, contact disability-accommodations@qualcomm.com.,

Posted 2 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

noida, uttar pradesh

On-site

Changing the world through digital experiences is what Adobe is all about. Adobe gives everyone - from emerging artists to global brands - everything they need to design and deliver exceptional digital experiences. The company is passionate about empowering individuals to create beautiful and powerful images, videos, and apps, and to revolutionize how companies engage with customers across all screens. Adobe is on a mission to hire the very best talent and is dedicated to creating exceptional employee experiences where everyone is respected and has equal opportunities. The company values new ideas from all levels within the organization, recognizing that the next big idea could come from anyone.

At Adobe, employees are immersed in a work environment that is globally recognized. For 20 consecutive years, Adobe has been listed as one of Fortune's "100 Best Companies to Work For." Employees at Adobe are surrounded by colleagues who are committed to each other's growth and success. If you are looking to make a meaningful impact, Adobe is the place for you. Learn more about the career experiences of Adobe employees on the Adobe Life blog and explore the comprehensive benefits offered by the company.

Role Summary: Digital Experience (DX) is a USD 4B+ business catering to enterprise needs, including 95%+ of Fortune 500 organizations. Adobe Marketo Engage, part of Adobe DX, is the world's largest marketing automation platform, offering enterprises a comprehensive solution to attract, segment, and nurture customers from initial discovery to becoming loyal advocates. The platform enables effective customer engagement across various touchpoints and surfaces. We are seeking talented and passionate engineers to join our team as we expand the business by developing next-generation products and enhancing our current offerings. If you have a passion for innovative technology, we would love to speak with you!

What you'll do: This is an individual contributor role with responsibilities including:
- Developing new services
- Working in full DevOps mode, overseeing multiple engineering phases from early specs to deployment
- Collaborating with architects, product management, and other engineering teams to enhance product features
- Participating in the resolution of production issues and creating preventive solutions for future incidents

Requirements:
- B.Tech / M.Tech degree in Computer Science from a premier institute
- 2+ years of relevant experience in software development
- Strong computer science fundamentals and understanding of algorithm design and performance
- Passionate about delivering quality work with persistence and high standards
- Proficiency in Java, Spring Boot, and REST services; good knowledge of MySQL, Postgres, Cassandra, Kafka, Redis, MongoDB, Solr, Elasticsearch, and Spark is an added advantage

Adobe is committed to ensuring accessibility for all users on Adobe.com. If you require accommodation due to a disability or special need during the website navigation or application process, please contact accommodations@adobe.com or call (408) 536-3015.
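As a small, hedged illustration of the Kafka item in the stack above (the broker address and topic name are assumptions, not details from the posting), a plain Java producer might look like this:

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class KafkaProducerSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed broker address
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        // try-with-resources closes the producer and flushes buffered records on exit.
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Publish one record to a hypothetical topic.
            producer.send(new ProducerRecord<>("demo-events", "key-1", "hello"));
            producer.flush();
        }
    }
}
```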

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

As a Lead Software Engineer (Java Full Stack Developer) at our company, you will have the opportunity to work with the industry's most passionate and engaged global team. Our employees are empowered to drive innovation daily to support a more connected world - A World Beyond Cash. The MasterCard B&MI Technology development team is dedicated to leveraging new and innovative technologies to create business solutions that maintain Mastercard's position as a leader in delivering value-added business analytics and reporting solutions to our diverse customer base. Collaborating with product partners and technical teams, you will play a key role in enhancing existing products and introducing new offerings to the global market. We are currently looking for candidates who can propose and design scalable solutions for our products. Your responsibilities will include designing secure, reliable, and scalable solutions for globally distributed customer-facing products. You will also lead and mentor junior team members, support agile development practices, and contribute to the technology strategy for Operational insights products. Engaging with various technical teams, you will research, create, and evaluate technical solutions using current and upcoming technologies and frameworks. In this role, you will work with technologies such as Java, J2EE, microservices, RESTful APIs, Angular, Web Services, JavaScript, Docker, and frameworks like Spring and Hibernate. Additionally, you will utilize UI/UX frameworks, API tooling, Cloud and DevOps architecture, and various databases and tools to support product development. The ideal candidate for this position holds a Bachelor's degree in Information Technology, Computer Science, or Management Information Systems, along with relevant work experience. You should have a solid understanding of software engineering concepts and methodologies, coupled with a proactive approach to learning and problem-solving. Strong communication skills and experience in designing and deploying Java webservices are essential for success in this role. Beneficial experiences include familiarity with standard and regulatory compliance, agile development practices, and exposure to trending technologies like AI/ML, IOT, Bot, and Quantum Computing. By joining our team, you will have the opportunity to contribute to impactful projects and drive innovation in a collaborative and dynamic work environment. As a responsible member of our organization, you are expected to adhere to Mastercard's security policies, maintain the confidentiality and integrity of information, report any security violations, and participate in mandatory security training sessions. Your commitment to information security is crucial in safeguarding Mastercard's assets, information, and networks.,

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

The Applications Development Senior Programmer Analyst position is an intermediate level role where you will be responsible for participating in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. Your main objective will be to contribute to applications systems analysis and programming activities. Your responsibilities will include conducting tasks related to feasibility studies, time and cost estimates, IT planning, risk technology, applications development, model development, and establishing and implementing new or revised applications systems and programs to meet specific business needs or user areas. You will also be monitoring and controlling all phases of the development process, providing user and operational support on applications to business users, and recommending and developing security measures post-implementation to ensure successful system design and functionality. Furthermore, you will be utilizing in-depth specialty knowledge of applications development to analyze complex problems/issues, provide evaluation of business process, system process, and industry standards, and make evaluative judgments. You will consult with users/clients and other technology groups on issues, recommend advanced programming solutions, and ensure essential procedures are followed while defining operating standards and processes. As an Applications Development Senior Programmer Analyst, you will also serve as an advisor or coach to new or lower-level analysts, operate with a limited level of direct supervision, and exercise independence of judgment and autonomy. You will act as a subject matter expert to senior stakeholders and/or other team members and appropriately assess risk when making business decisions. Qualifications: - Must Have: - 8+ years of application/software development/maintenance - 5+ years of experience on Big Data Technologies like Apache Spark, Hive, Hadoop - Knowledge of Python, Java, or Scala programming language - Experience with JAVA, Web services, XML, Java Script, Microservices, etc. - Strong technical knowledge of Apache Spark, Hive, SQL, and Hadoop ecosystem - Experience with developing frameworks and utility services, code quality tools - Ability to work independently, multi-task, and take ownership of various analyses - Strong analytical and communication skills - Banking domain experience is a must - Good to Have: - Work experience in Citi or Regulatory Reporting applications - Hands-on experience on cloud technologies, AI/ML integration, and creation of data pipelines - Experience with vendor products like Tableau, Arcadia, Paxata, KNIME - Experience with API development and data formats Education: Bachelors degree/University degree or equivalent experience This job description provides a high-level overview of the work performed. Other job-related duties may be assigned as required.,
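To ground the Apache Spark requirement above, here is a minimal sketch of a Spark SQL job written with Spark's Java API. The input path and column name are hypothetical; this is illustrative only, not code from the role.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class SparkJobSketch {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("trade-count-sketch")
                .getOrCreate();

        // Hypothetical input: a Parquet dataset with a "desk" column.
        Dataset<Row> trades = spark.read().parquet("/data/trades");
        trades.groupBy("desk").count().show();

        spark.stop();
    }
}
```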

Posted 2 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

indore, madhya pradesh

On-site

You should have expert-level proficiency in Python and Python frameworks or Java. You must have hands-on experience with AWS development, PySpark, Lambdas, CloudWatch (alerts), SNS, SQS, CloudFormation, Docker, ECS, Fargate, and ECR. Deep experience with key AWS services is required, including Compute (PySpark, Lambda, ECS), Storage (S3), Databases (DynamoDB, Snowflake), Networking (VPC, Route 53, CloudFront, API Gateway), DevOps/CI-CD (CloudFormation, CDK), Security (IAM, KMS, Secrets Manager), Monitoring (CloudWatch, X-Ray, CloudTrail), and NoSQL databases like Cassandra and PostgreSQL. You should have very strong hands-on knowledge of using Python for integrations between systems through different data formats. Expertise in deploying and maintaining applications in AWS, along with hands-on experience with Kinesis streams and auto-scaling, is essential. Designing and implementing distributed systems and microservices, and following best practices for scalability, high availability, and fault tolerance, are key responsibilities.

Strong problem-solving and debugging skills are necessary for this role. You should also have the ability to lead technical discussions and mentor junior engineers. Excellent written and verbal communication skills are a must. Comfort working in agile teams with modern development practices, and collaborating with business and other teams to understand business requirements and work on project deliverables, is expected. Participation in requirements gathering and understanding, designing a solution based on the available framework and code, and experience with data engineering tools or ML platforms (e.g., Pandas, Airflow, SageMaker) are required. An AWS certification such as AWS Certified Solutions Architect or Developer is preferred. This position is based in multiple locations in India, including Indore, Mumbai, Noida, Bangalore, and Chennai.

Qualifications:
- Bachelor's degree or foreign equivalent required from an accredited institution. Consideration will be given to three years of progressive experience in the specialty in lieu of every year of education.
- At least 8+ years of Information Technology experience.
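For the Lambda portion of this stack, and since Java is one of the languages the role accepts, a minimal handler built on the aws-lambda-java-core library might look like the hedged sketch below; the event shape and return value are assumptions for illustration.

```java
import java.util.Map;

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;

// Hypothetical handler: receives a simple key/value event and returns a status string.
public class EchoHandler implements RequestHandler<Map<String, String>, String> {
    @Override
    public String handleRequest(Map<String, String> event, Context context) {
        context.getLogger().log("Received event: " + event);
        return "processed " + event.size() + " fields";
    }
}
```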

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

hyderabad, telangana

On-site

As an American Airlines team member, you will be a part of a diverse, high-performing team dedicated to technical excellence, focusing on delivering unrivaled digital products that drive a more reliable and profitable airline. The Infrastructure Domain in Information Technology manages and maintains the client/server hardware and software that underpins an organization's computing environment, covering responsibilities such as IT logistics, virtualization, storage, containers, integration technologies, and data center management. In the role of a Redis Engineer within the Data Movement team, you will be responsible for installing, configuring, maintaining, and providing production support for Redis instances, including standalone and clustered setups. Your duties will involve automating operational tasks to ensure high availability, performance, and scalability of Redis deployments. You will collaborate closely with development and operations teams to optimize data storage, caching, and real-time messaging performance. Key Responsibilities: - Install, configure, and maintain Redis in development, testing, and production environments. - Monitor and troubleshoot Redis-related production issues to ensure system reliability and performance. - Design and implement high-availability and disaster recovery strategies for Redis deployments. - Develop automation scripts and tools to streamline Redis infrastructure provisioning and monitoring. - Work with other IT professionals to optimize data storage and ensure seamless integration of IT systems. - Implement security measures to protect IT infrastructure and ensure compliance with security policies. - Participate in on-call rotations to support Redis infrastructure during off-hours. - Maintain documentation of configurations, standard operating procedures, and best practices. Qualifications: - Bachelor's degree in Computer Science or related technical discipline. - 3+ years of hands-on experience with Redis in production environments. - Proficiency in scripting languages like Bash, Python, or Go. - Experience with containerized environments and Infrastructure as Code tools. - Familiarity with CI/CD tools, cloud platforms, and networking concepts. - Strong Linux system administration skills. - Excellent verbal and written communication skills. - Ability to work independently and in a collaborative, team-based environment. If you are looking for a challenging role that offers opportunities for growth and technical innovation in a dynamic environment, consider joining American Airlines as a Redis Engineer. Feel free to be yourself and make a significant impact on the world-class customer experience we deliver.,
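As a small illustration of working with Redis from application code (the role above is infrastructure-focused, so this is context rather than a job requirement), here is a sketch using the Jedis Java client; the host, port, key, and TTL are assumptions.

```java
import redis.clients.jedis.Jedis;

public class RedisSketch {
    public static void main(String[] args) {
        // Connect to an assumed local Redis instance on the default port.
        try (Jedis jedis = new Jedis("localhost", 6379)) {
            jedis.set("greeting", "hello");
            jedis.expire("greeting", 60); // cache entry expires after 60 seconds
            System.out.println(jedis.get("greeting"));
        }
    }
}
```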

Posted 2 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

hyderabad, telangana

On-site

As a Product Director of Global Customer Platforms in Consumer & Community Banking, you will ignite your passion for product innovation by leading customer-centric development, inspiring solutions, and shaping the future with your strategic vision and influence. Your role will involve leading innovation through the development of products and features that delight customers. You will leverage your advanced capabilities to challenge traditional approaches, remove barriers to success, and foster a culture of continuous innovation that helps inspire cross-functional teams to create groundbreaking solutions that address customer needs. Your responsibilities will include acting as a regional leader obsessed with spreading organizational values, collaborating with other local site leaders, coordinating regional changes, and fostering cohesiveness across geographic locations for product teams. You will oversee the product roadmap, vision, development, execution, risk management, and business growth targets. Leading the entire product life cycle through planning, execution, and future development by continuously adapting, developing new products and methodologies, managing risks, and achieving business targets like cost, features, reusability, and reliability to support growth will be a key aspect of your role. Furthermore, you will coach and mentor the product team on best practices such as solution generation, market research, storyboarding, mind-mapping, prototyping methods, product adoption strategies, and product delivery, enabling them to effectively deliver on objectives. You will own product performance and be accountable for investing in enhancements to achieve business objectives. Monitoring market trends, conducting competitive analysis, and identifying opportunities for product differentiation will also be part of your responsibilities. To be successful in this role, you should have 8+ years of experience delivering products, projects, technology applications with experience managing technical platforms and/or data-focused capabilities. You should have a good understanding of technologies including API, Microservices, Cassandra, Kafka, AWS, etc. A customer obsessed leader with the ability to build and maintain strong, productive relationships with engineers and technical partners, and an ability to translate customer needs into clear technical requirements is essential. Moreover, you should be a strong leader who can drive change through influence and collaboration across a matrix organization in a highly dynamic environment. Extensive knowledge of the product development life cycle, technical design, and data analytics is required. You should have proven ability to influence the adoption of key product life cycle activities including discovery, ideation, strategic development, requirements definition, and value management. Experience driving change within organizations and managing stakeholders across multiple functions is crucial. Exceptional written and presentation skills are a must for this role. Preferred qualifications include being recognized as a thought leader within a related field, having team skills and the ability to cross-functionally drive/influence work through others, ability to mentor and lead teams to achieve results for complex, ambiguous projects, skills in cost-efficient solution building, financial performance metric creation and analysis, business acumen, and knowledge of root cause analysis and problem detection.,

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

hyderabad, telangana

On-site

As an Engineer, IT Data at American Airlines, you will be part of a diverse and high-performing team dedicated to technical excellence. Your main focus will be on delivering unrivaled digital products that drive a more reliable and profitable airline. The Data Domain you will be working in refers to the area within Information Technology that focuses on managing and leveraging data as a strategic asset. This includes data management, storage, integration, and governance, leaning into Machine Learning, AI, Data Science, and Business Intelligence. In this role, you will work closely with source data application teams and product owners to design, implement, and support analytics solutions that provide insights to make better decisions. You will be responsible for implementing data migration and data engineering solutions using Azure products and services such as Azure Data Lake Storage, Azure Data Factory, Azure Functions, Event Hub, Azure Stream Analytics, Azure Databricks, etc., as well as traditional data warehouse tools. Your responsibilities will involve multiple aspects of the development lifecycle including design, cloud engineering, ingestion, preparation, data modeling, testing, CICD pipelines, performance tuning, deployments, consumption, BI, alerting, and production support. You will provide technical leadership, collaborate within a team environment, and work independently. Additionally, you will be part of a DevOps team that completely owns and supports the product, implementing batch and streaming data pipelines using cloud technologies. As an essential member of the team, you will lead the development of coding standards, best practices, privacy, and security guidelines. You will also mentor others on technical and domain skills to create multi-functional teams. Your success in this role will require a Bachelor's degree in Computer Science, Computer Engineering, Technology, Information Systems (CIS/MIS), Engineering, or a related technical discipline, or equivalent experience/training. To excel in this position, you should have at least 3 years of software solution development experience using agile, DevOps, and operating in a product model. Moreover, you should have 3+ years of data analytics experience using SQL and cloud development and data lake experience, preferably with Microsoft Azure. Preferred qualifications include 5+ years of software solution development experience, 5+ years of data analytics experience using SQL, 3+ years of full-stack development experience, and familiarity with Azure technologies. Additionally, skills, licenses, and certifications required for success in this role include expertise with the Azure Technology stack, practical direction within Azure Native cloud services, Azure Development Track Certification, Spark Certification, and a combination of Development, Administration & Support experience with various tools/platforms such as Scripting (Python, Spark, Unix, SQL), Data Platforms (Teradata, Cassandra, MongoDB, Oracle, SQL Server, ADLS, Snowflake), Azure Cloud Technologies, CI/CD tools (GitHub, Jenkins, Azure DevOps, Terraform), BI Analytics Tool Stack (Cognos, Tableau, Power BI, Alteryx, Denodo, Grafana), and Data Governance and Privacy tools (Alation, Monte Carlo, Informatica, BigID). 
Join us at American Airlines, where you can explore a world of possibilities, travel the world, grow your expertise, and become the best version of yourself while contributing to the transformation of technology delivery for our customers and team members worldwide.,

Posted 2 weeks ago

Apply

2.0 - 7.0 years

15 - 22 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

We are pleased to inform you of openings with our client INFOSYS on their direct and permanent payroll. Please find the job details below for your perusal.

Client: INFOSYS
Skill: Java + UI Full Stack
Experience: 2 - 15 Yrs
Job location: Chennai / Hyderabad / Bangalore / Pune / TRVM / Chandigarh

Interested candidates, please forward your updated profile in Word format with the below mandatory details for further processing:
Candidate Name (as per passport):
Date of Birth (as per passport):
Contact Number:
Email ID:
Current Company:
Overall Experience:
Relevant Experience:
Current CTC:
Expected CTC:
Holding Offers in Hand (if yes, please share the details):
Notice Period:
Currently Serving Notice (if yes, please mention the last working day in your current company and your DOJ):
Current Location:
Job Location:

Posted 2 weeks ago

Apply

5.0 - 10.0 years

15 - 22 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

We are pleased to inform you of openings with our client INFOSYS on their direct and permanent payroll. Please find the job details below for your perusal.

Client: INFOSYS
Skill: Java + Cassandra
Job location: Chennai / Hyderabad / Bangalore / Pune / TRVM / Chandigarh

Interested candidates, please forward your updated profile in Word format with the below mandatory details for further processing:
Candidate Name (as per passport):
Date of Birth (as per passport):
Contact Number:
Email ID:
Current Company:
Overall Experience:
Relevant Experience:
Current CTC:
Expected CTC:
Holding Offers in Hand (if yes, please share the details):
Notice Period:
Currently Serving Notice (if yes, please mention the last working day in your current company and your DOJ):
Current Location:
Job Location:

Posted 2 weeks ago

Apply

5.0 - 10.0 years

25 - 35 Lacs

Bengaluru

Hybrid

We're Hiring: Senior Java Developer | Microservices + Cloud | Xebia

Location: Whitefield, Bangalore (Hybrid, 3 days office/week)
Availability: Immediate joiners preferred (max 2 weeks notice)

Xebia is looking for passionate, skilled, and driven Senior Java Developers who thrive in building scalable, cloud-native microservices and possess strong architectural fundamentals. If you're empathetic, exhibit Xebia's values, and actively engage in tech communities, we want you on board!

Key Skills & Technologies:
Core Java, Spring Boot, Spring WebFlux
Microservices architecture
Design patterns & SOLID principles
Kubernetes, Docker, AKS
SQL (PostgreSQL) and NoSQL (MongoDB, Couchbase, Cassandra)
Cloud: AWS / Azure (cloud-native development)
Strong communication & organizational skills
Empathetic, personable, and collaborative
Active participant in meetups, conferences, or webinars
Technical and cultural fit with Xebia values

Must-Haves:
Immediate to max 2 weeks notice
Willing to work in hybrid mode (3 days/week in the Whitefield office)
Experience range: as per role fitment and demonstrated technical capability
Strong awareness of Xebia: our business, values, Glassdoor rating, and industry presence

How to Apply: Please email your profile to vijay.s@xebia.com with the following details:
Full Name
Total Experience
Current CTC
Expected CTC
Current Location
Preferred Xebia Location
Notice Period / Last Working Day (if serving)
Primary Skills
LinkedIn Profile URL

Join us to shape the future of tech while staying true to craftsmanship, passion, and innovation.

#Java #SpringBoot #Microservices #Kubernetes #Docker #CloudNative #AWS #Azure #NoSQL #TechJobs #Xebia #HiringNow #ImmediateJoiners #BangaloreJobs

Posted 2 weeks ago

Apply

9.0 - 17.0 years

8 - 12 Lacs

Bengaluru

Work from Office

Join our Team

About this opportunity: The Data Scientist will support software developers, database architects, data analysts, and AI/ML architects on various internal and external data science and AI/ML initiatives and projects that run in cloud-based environments. You are self-directed and comfortable supporting the data analytics and AI/ML needs of multiple teams, systems, and products. You will also be responsible for integrating the solutions you create with the architecture used across the company and its customers.

What you will bring:
9 - 17 years of Telecom/IT experience in a data-related role, with AI/ML and Data Science experience
At least 5 years of experience using the following software/tools:
Python development for AI/ML and automation
Data pipeline development in the Elastic (ELK) Stack, Hadoop, Spark, Kafka, etc.
Relational SQL and NoSQL databases, for example Postgres and Cassandra
MLOps experience, including model deployment and monitoring
AI/ML solutions for prediction, classification, and Natural Language Processing
Generative AI technologies such as Large Language Models, Agentic AI, RAG, and other developing technologies
Proficiency in Python programming, with experience in Python libraries commonly used in data engineering and machine learning (e.g., pandas, numpy, scikit-learn)
Hands-on experience with Kubernetes and containerization technologies (e.g., Docker)
Proficiency in Linux and shell scripting

What you will do:
Work hand in hand with business representatives to design, architect, and deliver AI/ML and/or Generative AI solutions with business ROI in perspective
Create and maintain optimal data pipeline architecture
Identify data-intensive and AI/ML use cases for existing managed service accounts
Prepare technical presentations, design documents, and demonstrations for customer presentations on data strategy and AI/ML solutions
Prepare design and solution documents for AI/ML solutions, including classic AI/ML and Generative AI based solutions
Assemble large, complex data sets that meet functional and non-functional business requirements
Design solutions that keep data separated and secure across national boundaries, through multiple data centers and strategic customers/partners, keeping international security standards and organization and customer security requirements in mind
Apply working knowledge of Generative AI technologies, Large Language Models (LLMs), and Agentic AI architecture and implementation tools

Why join Ericsson?

What happens once you apply?

Primary country and city: India (IN) || Bangalore
Req ID: 769881
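As a hedged illustration of the kind of Python prediction/classification work the posting describes (not Ericsson's actual code or data), a minimal pandas/scikit-learn workflow might look like the sketch below; the dataset, file path, and column names are assumptions made only for the example.

# Minimal sketch of a classification workflow with pandas and scikit-learn.
# The CSV path and column names are hypothetical placeholders.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report

# Load a (hypothetical) table of network KPIs with a binary "degraded" label.
df = pd.read_csv("cell_kpis.csv")
X = df[["throughput_mbps", "latency_ms", "packet_loss_pct", "active_users"]]
y = df["degraded"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y)

# A simple pipeline: scale the features, then fit a random forest classifier.
model = Pipeline([
    ("scale", StandardScaler()),
    ("clf", RandomForestClassifier(n_estimators=200, random_state=42)),
])
model.fit(X_train, y_train)

# Evaluate on the held-out split.
print(classification_report(y_test, model.predict(X_test)))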

Posted 2 weeks ago

Apply

6.0 - 11.0 years

19 - 27 Lacs

Haryana

Work from Office

About Company
Founded in 2011, ReNew is one of the largest renewable energy companies globally, with a leadership position in India. Listed on Nasdaq under the ticker RNW, ReNew develops, builds, owns, and operates utility-scale wind energy projects, utility-scale solar energy projects, utility-scale firm power projects, and distributed solar energy projects. In addition to being a major independent power producer in India, ReNew is evolving into an end-to-end decarbonization partner, providing solutions in a just and inclusive manner in the areas of clean energy, green hydrogen, value-added energy offerings through digitalisation, storage, and carbon markets that are increasingly integral to addressing climate change. With a total capacity of more than 13.4 GW (including projects in the pipeline), ReNew's solar and wind energy projects are spread across 150+ sites, with a presence spanning 18 states in India, contributing to 1.9% of India's power capacity. Consequently, this has helped avoid 0.5% of India's total carbon emissions and 1.1% of India's total power sector emissions. In over 10 years of operation, ReNew has generated almost 1.3 lakh jobs, directly and indirectly. ReNew has achieved market leadership in the Indian renewable energy industry against the backdrop of the Government of India's policies to promote growth of this sector. ReNew's current group of stockholders includes several marquee investors such as CPP Investments, Abu Dhabi Investment Authority, Goldman Sachs, GEF SACEF and JERA. Its mission is to play a pivotal role in meeting India's growing energy needs in an efficient, sustainable, and socially responsible manner. ReNew stands committed to providing clean, safe, affordable, and sustainable energy for all and has been at the forefront of leading climate action in India.

Job Description
Key responsibilities:
1. Understand, implement, and automate ETL pipelines in line with industry standards (see the sketch after this list)
2. Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, designing infrastructure for greater scalability, etc.
3. Develop, integrate, test, and maintain existing and new applications
4. Design and create data pipelines (data lake / data warehouses) for real-world energy analytics solutions
5. Expert-level proficiency in Python (preferred) for automating everyday tasks
6. Strong understanding of, and experience in, distributed computing frameworks, particularly Spark, Spark-SQL, Kafka, Spark Streaming, Hive, Azure Databricks, etc.
7. Limited experience using other leading cloud platforms, preferably Azure
8. Hands-on experience with Azure Data Factory, Logic Apps, Analysis Services, Azure Blob Storage, etc.
9. Ability to work in a team in an agile setting, familiarity with JIRA, and a clear understanding of how Git works
10. Must have 5-7 years of experience
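As a hedged illustration of the ETL and Spark pipeline work listed above (not ReNew's actual pipeline), a minimal PySpark batch job might look like the following; the input path, columns, and output layout are assumptions chosen only for the example.

# Minimal sketch of a PySpark ETL step for energy telemetry data.
# The input path, schema, and output layout are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("site-generation-etl").getOrCreate()

# Extract: read raw daily generation readings (assumed CSV layout).
raw = (spark.read
       .option("header", True)
       .option("inferSchema", True)
       .csv("abfss://raw@datalake.dfs.core.windows.net/generation/daily/"))

# Transform: drop obviously bad rows and aggregate generation per site and day.
clean = (raw
         .where(F.col("energy_kwh") >= 0)
         .withColumn("reading_date", F.to_date("reading_ts")))

daily = (clean.groupBy("site_id", "reading_date")
         .agg(F.sum("energy_kwh").alias("total_kwh"),
              F.avg("capacity_factor").alias("avg_capacity_factor")))

# Load: write a partitioned Parquet table for downstream analytics.
(daily.write
 .mode("overwrite")
 .partitionBy("reading_date")
 .parquet("abfss://curated@datalake.dfs.core.windows.net/generation/daily_agg/"))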

Posted 2 weeks ago

Apply
