
566 Solr Jobs - Page 12

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Apply if you are:
• Passionate about finding pesky bugs and seeing them through to a fix, so the product ships flawless and bug-free.
• Comfortable in a fast-paced, distributed organization, with a strong sense of urgency.
• Highly productive under pressure, and you pride yourself on producing timely, accurate issue diagnostics and test reports.
• Committed to extending your technical knowledge and maintaining expertise in a wide range of testing and automation.
• An excellent communicator: detail oriented, mission driven, and a team player.
• Enthusiastic to contribute and lead, and excited to be in a startup environment.

Your Role:
• Own the test infrastructure, allowing the company to exceed its quality standards.
• Work closely with the Engineering teams to reproduce customer issues, define test strategies, build frameworks, automate customer use cases, and assist with customer escalations.
• Design and implement robust test cases that integrate well with modern CI/CD deployment technologies.
• Own the end-to-end test cycle for a product feature, from test design to test automation, including issue reproduction from customer escalations.
• Maintain existing automated test cases for the functional, regression, and performance test suites.
• Work side-by-side with engineers on testing and automating test cases in sprint cycles.
• Identify, document, and debug software defects.
• Maintain and enhance in-house test frameworks.
• Create test data sources and validate data ingestion and connectivity.
• Indexing and search testing.
• Permission and access control testing, including role-based testing.
• AI chat and AI agent testing.
• UI/UX and usability testing.

Requirements (Qualifications / Experience / Technical Skills):
• Bachelor's degree in Computer Science or equivalent education.
• 3+ years of software quality engineering.
• Expertise in distributed systems, container orchestration (Docker, Kubernetes), and cloud technologies (AWS, Azure, GCP).
• Knowledge of at least one programming language (Java, C++, etc.) and one scripting language (Ruby, JavaScript, etc.).
• Knowledge of common web application vulnerabilities such as the OWASP Top 10.
• Understanding of authentication protocols (OAuth, SAML), role-based access control, and general security best practices.
• Experience with test automation frameworks (Playwright, RSpec, etc.).
• Proficiency in testing REST APIs and data ingestion pipelines, and familiarity with indexing/search technologies (e.g., Elasticsearch, Solr); a minimal API-test sketch follows this listing.
• Experience testing AI-driven applications or chatbots.
• Understanding of various test methodologies: regression, functional, performance.
• Ability to analyze and triage automated test runs and take appropriate action on failure classification.
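For illustration: a minimal sketch of the kind of automated REST API check this role describes, using JUnit 5 and Java's built-in HTTP client rather than the frameworks named in the posting. The endpoint URL, query parameter, and expected response field are hypothetical.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertTrue;

class SearchApiSmokeTest {

    // Hypothetical endpoint of the service under test.
    private static final String BASE_URL = "http://localhost:8080/api/search";

    private final HttpClient client = HttpClient.newHttpClient();

    @Test
    void searchEndpointReturnsOkAndJsonBody() throws Exception {
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(BASE_URL + "?q=laptop"))
                .header("Accept", "application/json")
                .GET()
                .build();

        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());

        // Basic functional assertions: status code and a field expected in the payload.
        assertEquals(200, response.statusCode());
        assertTrue(response.body().contains("\"results\""),
                "response should contain a results array");
    }
}
```

In practice such checks would be grouped into functional and regression suites and run from the CI/CD pipeline on every build.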

Posted 1 month ago

Apply

0 years

0 Lacs

Bengaluru East, Karnataka, India

On-site

Visa is a world leader in payments and technology, with over 259 billion payments transactions flowing safely between consumers, merchants, financial institutions, and government entities in more than 200 countries and territories each year. Our mission is to connect the world through the most innovative, convenient, reliable, and secure payments network, enabling individuals, businesses, and economies to thrive while driven by a common purpose – to uplift everyone, everywhere by being the best way to pay and be paid. Make an impact with a purpose-driven industry leader. Join us today and experience Life at Visa.

Job Description
As a Software Engineer, you will be responsible for the design and implementation of the platform for consumer-facing mobile and digital products. You must be able to coordinate and manage input from business and technology stakeholders across the enterprise and be responsible for the design and execution of large-scale technology initiatives. You will drive development of multiple applications, prototype new applications and feature ideas, and explore new technologies at the forefront of backend development. We are looking for someone with serious development skills, a strong interest in all things backend, and a passion for delivering high-quality, rock-solid code. You will be part of teams tasked with multiple projects, ranging from full-stack development of real-time transactional services, REST service APIs, and container-based services to highly functional, secure, scalable, and resilient platform libraries, services, frameworks, and infrastructure. You will be providing the best platform for PPT Platform engineers so they can succeed. If this sounds exciting, we would love to chat and tell you more about our work culture and environment.

Key responsibilities include:
Design the appropriate solution based on the business requirements.
Perform proofs of concept (PoCs) and other technical evaluations of technologies, designs, and solutions.
Work with engineering professionals, architects, and others within Visa to ensure that the solution will scale appropriately while ensuring that what is implemented is cost-effective.
Continue to learn about the payments industry and the factors impacting it.

· Experience in commercial software development in a Unix/Linux environment.
· Experience in consumer-facing application development.
· Amazing work ethics that will help us all work extremely well together.
· Knowledge of the MVC design pattern.
· Experience programming in Java and J2EE.
· Experience with web services standards and related technologies (HTTP, Spring, XML, JSON, REST).
· Experience with Big Data clusters (Solr, Kafka, etc.).
· Solid understanding of database technologies and SQL.
· Knowledge of NoSQL databases is a plus.
· Strong foundation in computer science, with strong competencies in data structures and algorithms.
· Proven problem-solving skills and an ability to respond resourcefully to new demands, priorities, and challenges.
· Solid coding practices, including good design documentation, unit testing, and source control (Git, SVN, etc.).
· Experience with build tools (Maven, Gradle, etc.).
· Ability and desire to learn new skills and take on new initiatives.

This is a hybrid position. Expectation of days in office will be confirmed by your Hiring Manager.
Qualifications
Basic Qualifications
Associate: Minimum of 6 months of work experience or a Bachelor's Degree
Preferred Qualifications
Associate: 2 or more years of work experience

Additional Information
Visa is an EEO Employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability or protected veteran status. Visa will also consider for employment qualified applicants with criminal histories in a manner consistent with EEOC guidelines and applicable local law.

Posted 1 month ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job ID: Sen-ETP-Pun-1073
Location: Pune

Role
Support customer project deployments in AWS, Google Cloud (GCP), and Microsoft Azure
Collaborate with development teams to design and implement robust, scalable, and efficient solutions
Identify areas for improvement in system performance, reliability, and security
Conduct regular performance testing, capacity planning, and optimization activities
Maintain comprehensive documentation of system configurations, processes, and procedures
Collaborate with development teams to deploy new code to the production environment
Manage incident response with engineers and clients
Work in rotating 24/7 shifts

Skills
Familiarity with the following technologies: Linux, Git, Ruby, Bash, AWS / Azure / Google Cloud, Kubernetes, MySQL, Solr, Apache Tomcat, Java, Graylog, Kibana, Zabbix, and Datadog
Proven troubleshooting and problem-solving skills in a cloud-based application environment
Outstanding communication skills with the ability to work in a client-facing role

Posted 1 month ago

Apply

1.0 - 3.0 years

6 - 10 Lacs

Mumbai, New Delhi, Bengaluru

Work from Office

Technically hands-on, with deep-dive expertise in designing and developing real-time internet applications or high-scale SaaS applications
1 - 3 years of experience in developing products
Excellent command over data structures and algorithms
Exceptional coding skills in an object-oriented programming language (Java + Golang)
Strong problem-solving and analytical skills
Experience with web technologies: Java, Spring, Python, Linux, Apache, MySQL, Solr, Memcache, Redis

Locations: Mumbai, Delhi NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote

Posted 1 month ago

Apply

3.0 - 8.0 years

14 - 18 Lacs

Hyderabad

Work from Office

Job Area: Information Technology Group, Information Technology Group > IT Software Developer

General Summary:
Qualcomm IT is seeking an experienced full-stack lead with experience in the IP space. We are looking for someone who architects, designs, develops, and supports custom IP analysis applications for one of the world's largest patent portfolios. You should be able to build custom applications using the latest web technologies to deliver data faster and more efficiently to meet the dynamic needs of our legal partners, and you should be comfortable working in an extremely agile environment.

Job Responsibilities:
Understand and analyze functional requirements and translate them into well-engineered code that complies with accepted industry standards.
Take ownership of assigned user stories with minimal supervision.
Guide the junior team on development standards, good design practices, etc., and ensure compliance as needed (frequent code reviews, spot checks, etc.).
Identify opportunities for process improvement and make constructive suggestions for change.
Manage priorities and work in a fast-paced environment.
Communicate effectively with business partners, analysts, project managers, and team members on functional and technical aspects as needed.

The job will involve:
Full-stack development in Java web applications, with experience in UI development using web technologies.
Building custom applications using the latest web technologies (Angular / React, Java 8+, Python, relational and NoSQL databases, Spring, Hibernate, Apache Solr, messaging systems, HTML5, CSS3, JavaScript, ES6, and TypeScript) to deliver data faster and more efficiently as our legal partners' strategies change.
Development experience in a BDD or TDD approach.
Knowledge of AWS, Cloud, and microservices is a plus.
Requirement analysis, design, code reviews, and deployments, along with development.
Product development and unit testing frameworks.
Familiarity with agile methodologies and experience working in an Agile/Scrum development environment.

Minimum Qualifications:
3+ years of IT-relevant work experience with a Bachelor's degree in a technical field (e.g., Computer Engineering, Computer Science, Information Systems), OR 5+ years of IT-relevant work experience without a Bachelor's degree.
3+ years of any combination of academic or work experience with full-stack application development (e.g., Java, Python, JavaScript, etc.).
1+ year of any combination of academic or work experience with data structures, algorithms, and data stores.
5+ years of experience in software development with highly reputed organizations.
Proficiency in Java 8 or higher, Python, Groovy, Spring, Hibernate.
Excellent problem-solving skills.
Good understanding of data structures and algorithms.
Hands-on experience in full-stack development, including building user interfaces using technologies like Angular / React JS and other necessary web technologies.
Exposure to building cloud-native software, preferably with AWS.
Exposure to designing and developing data models using RDBMS (Oracle, MySQL, etc.) and NoSQL (MongoDB).
Exposure to building solutions using Apache Solr or Elasticsearch.
2+ years of experience with more than one operating system (e.g., Linux, OSX, Windows).
Programming certifications such as Java, Scrum, etc.
Bachelor's / Master's degree in any stream.

Posted 1 month ago

Apply

5.0 - 10.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Our Company
Changing the world through digital experiences is what Adobe’s all about. We give everyone—from emerging artists to global brands—everything they need to design and deliver exceptional digital experiences! We’re passionate about empowering people to create beautiful and powerful images, videos, and apps, and transform how companies interact with customers across every screen. We’re on a mission to hire the very best and are committed to creating exceptional employee experiences where everyone is respected and has access to equal opportunity. We realize that new ideas can come from everywhere in the organization, and we know the next big idea could be yours!

The challenge
Search, Discovery, and Content AI (SDC) is a cornerstone of Adobe’s ecosystem, enabling creative professionals and everyday users to access, discover, and work with a wide array of digital assets and creative content, including images, videos, documents, vector graphics, and more. With increasing demand for intuitive search, contextual discovery, and seamless content interactions across Adobe products like Express, Lightroom, and Adobe Stock, SDC is evolving into a generative AI powerhouse. This team develops innovative solutions for intent understanding, personalized recommendations, and action orchestration to transform how users interact with content. Working with extensive datasets and pioneering technologies, you will help redefine the discovery experience and drive user success.

The Opportunity
How can you participate? We’re looking for top-notch search engineering leadership in the areas of information retrieval, search indexing, Elasticsearch, Lucene, algorithms, relevance and ranking, data mining, machine learning, data analysis and metrics, query processing, multi-lingual search, and search UX. This is an opportunity to make a huge impact in a fast-paced, startup-like environment in a great company. Join us!

Responsibilities
Work on big data, data ingestion, search indexing, Hadoop, distributed systems, deep learning, recommendations, and performance by developing a search platform at Adobe that would power Adobe product lines such as Express, Creative Cloud, Acrobat, Marketing Cloud, and Stock.
Apply machine learning to improve ranking and recommendations as part of the search workflow.
Build a platform to index billions of images, documents, and other assets in real time.
Maintain and optimize the search engine, identify new ideas to evolve it, and develop new features and benchmark possible solutions, in terms of search relevance and recommendations but also user experience, performance, and feasibility.
Build these products using technologies such as Elasticsearch, REST web services, SQS/Kafka, machine learning, and more (a minimal indexing sketch follows this listing).

What You Need To Succeed
B.Tech or M.Tech in Computer Science
Minimum 5-10 years of relevant industry experience
Experience in engineering SaaS-based software development
Hands-on experience with Java and Python
Hands-on experience in big data processing, Hadoop, and Spark
Experience with web services and REST
Experience with RDBMS and NoSQL databases
Experience with AWS resources
Experience with Elasticsearch/Solr
Experience with search engine technology and inverted indexes
Hands-on experience in building indexing pipelines

Adobe is proud to be an Equal Employment Opportunity employer.
We do not discriminate based on gender, race or color, ethnicity or national origin, age, disability, religion, sexual orientation, gender identity or expression, veteran status, or any other applicable characteristics protected by law. Learn more about our vision here. Adobe aims to make Adobe.com accessible to any and all users. If you have a disability or special need that requires accommodation to navigate our website or complete the application process, email accommodations@adobe.com or call (408) 536-3015.
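For illustration: a minimal sketch of a single indexing-pipeline step of the kind the listing above describes, written against the Apache SolrJ client (the posting mentions both Elasticsearch and Solr). The Solr URL, collection name, and document fields are hypothetical.

```java
import java.util.List;

import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.impl.Http2SolrClient;
import org.apache.solr.common.SolrInputDocument;

public class AssetIndexer {

    public static void main(String[] args) throws Exception {
        // Hypothetical Solr endpoint and collection name.
        try (SolrClient solr = new Http2SolrClient.Builder("http://localhost:8983/solr").build()) {
            SolrInputDocument doc = new SolrInputDocument();
            doc.addField("id", "asset-42");
            doc.addField("title", "Sunset over the bay");
            doc.addField("type", "image");
            doc.addField("tags", List.of("sunset", "bay", "landscape"));

            // Send the document to the collection and make it searchable.
            solr.add("assets", doc);
            solr.commit("assets");
        }
    }
}
```

A real pipeline would batch documents, consume them from a queue such as SQS or Kafka, and rely on soft commits rather than an explicit commit per document.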

Posted 1 month ago

Apply

6.0 years

8 - 9 Lacs

Hyderābād

On-site

Project description
Are you an enthusiastic technology professional? Are you excited about an enriching career working for a large Tier One bank? We are seeking a Scala engineer to join our development team within the bank for an exciting opportunity, building on top of existing technology to be implemented across other APAC regions. You will work on a new and challenging project to implement banking systems, helping to shape the future of the bank's business.

Responsibilities
Responsibilities would relate to the following aspects:
work closely with senior engineers to find the best possible technical solution for the project and available requirements
Scala development to provide banking solutions
experience using Nexus repository software
working against a ticketing system with different priorities
reporting key metrics post go-live
development of continuous improvement themes such as automation and whitelisting
improve developer experience and make it easy to do the right thing
challenge the team to follow best practices and eliminate process waste
troubleshoot production/infrastructure issues
be keen to expand current Scala/Java skills

Must have
6+ years building back-end systems
4+ years developing in Scala or another functional programming language
Java, JS and React
TDD
distributed version control: Git or Mercurial
strong written and verbal communication skills in English
banking experience
able to work in a multicultural work environment

Nice to have
actor system: Akka
HTTP stack and building REST APIs
functional programming with Cats or ScalaZ
ScalaTest and BDD
continuous integration and deployment practices
open minded and able to quickly learn new technologies and paradigms
Kafka or other distributed messaging systems
distributed environments and multi-threading
profiling and application tuning
build tools: Gradle
experience with YAML, JSON and XML (XSD)
experience with Unix shell and CLI tools
search engines, e.g. Solr, Elasticsearch
"search" topic issues, e.g. building queries, indexing, etc.

Other
Languages
English: C2 Proficient
Seniority
Senior
Hyderabad, IN, India
Req. VR-113727
Scala
BCM Industry
23/06/2025

Posted 1 month ago

Apply

5.0 - 10.0 years

18 - 33 Lacs

Hyderabad, Pune, Delhi / NCR

Hybrid

Role: Sitecore Developer
Location: Greater Noida, Hyderabad, Gurugram, Pune
Experience: 5 to 10 years

Technical Requirements:
Sitecore Headless Services – API-driven content delivery.
Sitecore JSS (JavaScript Services) – Next.js.
GraphQL API – efficient querying of Sitecore content.
Rendering Host – Next.js or ASP.NET Core for a decoupled frontend.
Security & Authentication – OAuth, JWT, or Sitecore Identity Server.
Experienced in Sitecore ContentSearch API, indexing strategies, and Solr integration.
Skilled in Apache Solr configuration, schema design, and query optimization.
Proficient in C#/.NET, the Sitecore API, and Solr query handling.
Indexing Best Practices – ensuring efficient indexing with proper schema definitions.
Query Performance Tuning – using boosting, faceted search, and fuzzy search for better results (see the sketch after this listing).

Nice to have: handling CI/CD pipelines.
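For illustration: a rough sketch of the query-tuning techniques named above (field boosting, faceting, and fuzzy matching) expressed directly as a Solr query via the SolrJ client. The collection name, field names, and boost values are hypothetical, and in a Sitecore solution such queries would normally go through the ContentSearch API rather than raw SolrJ.

```java
import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.impl.Http2SolrClient;
import org.apache.solr.client.solrj.response.QueryResponse;

public class TunedSearchExample {

    public static void main(String[] args) throws Exception {
        try (SolrClient solr = new Http2SolrClient.Builder("http://localhost:8983/solr").build()) {
            SolrQuery query = new SolrQuery();

            // eDisMax allows field boosting: title matches weighted 5x over body (hypothetical weights).
            query.set("defType", "edismax");
            query.set("qf", "title_t^5 body_t");

            // Fuzzy match: "sitecoer~1" tolerates one edit, catching a misspelling of "sitecore".
            query.setQuery("sitecoer~1");

            // Facet on the content type so the UI can show counts per category.
            query.setFacet(true);
            query.addFacetField("contenttype_s");

            query.setRows(10);

            QueryResponse response = solr.query("sitecore_web_index", query);
            response.getResults().forEach(doc ->
                    System.out.println(doc.getFieldValue("title_t")));
            System.out.println("Facets: " + response.getFacetField("contenttype_s").getValues());
        }
    }
}
```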

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Role - Backend Developer
Experience - 3-5 yrs
Location - Bangalore

● Bachelor's/Master's in Computer Science from a reputed institute/university
● 3-7 years of strong experience in building Java/Golang/Python based server-side solutions
● Strong in data structures, algorithms, and software design
● Experience in designing and building RESTful microservices
● Experience with server-side frameworks such as JPA (Hibernate/Spring Data), Spring, Vert.x, Spring Boot, Redis, Kafka, Lucene/Solr/Elasticsearch, etc. (see the sketch after this listing)
● Experience in data modeling and design, and database query tuning
● Experience in MySQL and a strong understanding of relational databases
● Comfortable with agile, iterative development practices
● Excellent communication (verbal & written), interpersonal, and leadership skills
● Previous experience as part of a start-up or a product company
● Experience with AWS technologies would be a plus
● Experience with reactive programming frameworks would be a plus
● Contributions to open source are a plus
● Familiarity with deployment architecture principles and prior experience with container orchestration platforms, particularly Kubernetes, would be a significant advantage
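For illustration: a minimal sketch of the JPA / Spring Data style of persistence the listing mentions. The entity, its fields, and the repository method are hypothetical; the jakarta.persistence imports assume Spring Boot 3.x / Hibernate 6 (older stacks use javax.persistence).

```java
import jakarta.persistence.Entity;
import jakarta.persistence.GeneratedValue;
import jakarta.persistence.GenerationType;
import jakarta.persistence.Id;

import java.util.List;
import org.springframework.data.jpa.repository.JpaRepository;

// A hypothetical domain entity mapped to a relational table by JPA/Hibernate.
@Entity
public class Product {

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;

    private String name;
    private double price;

    protected Product() { } // no-arg constructor required by JPA

    public Product(String name, double price) {
        this.name = name;
        this.price = price;
    }

    public Long getId() { return id; }
    public String getName() { return name; }
    public double getPrice() { return price; }
}

// Spring Data derives the SQL from the method name; no query needs to be written here.
interface ProductRepository extends JpaRepository<Product, Long> {
    List<Product> findByNameContainingIgnoreCase(String fragment);
}
```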

Posted 1 month ago

Apply

10.0 - 15.0 years

20 - 35 Lacs

Maharashtra

Work from Office

#LI-hybrid While we have our offices in Bangalore, Chennai, Hyderabad, Nagpur and Pune, this position is hybrid, with you being able to report to the location nearest to your current location if the need arises.

HCL Commerce (v9.x or v8.x)
Core HCL Commerce Expertise:
Good hands-on experience in HCL Commerce (v8/v9) (formerly IBM WebSphere Commerce).
Deep knowledge of B2B and/or B2C store models.
Strong with HCL Commerce subsystems: Catalog, Order, Member, Marketing, Pricing, Promotions.
Solr search integration (search runtime, indexing, customization).
Experience with DynaCache, Redis cache, and caching strategies.
Proficient in HCL Commerce customization frameworks (Command, Task Command, Data Beans, Data Services Layer).
REST microservices customization and extension.
Familiarity with HCL Commerce Docker/Kubernetes deployment.
Experience with WebSphere Application Server (for v8) or Liberty (for v9).
Knowledge of designing scalable, extensible, and secure commerce solutions.
Strong grasp of enterprise integration: payment, tax, OMS, ERP, and loyalty systems.
Experience with Git, Jenkins, Maven/Gradle.
Experience with CI/CD pipelines.
Exposure to Kubernetes, Docker, and container orchestration.
Understanding of SEO, analytics, and digital marketing integrations.
Experience in performance tuning and high-traffic commerce site management.

Posted 1 month ago

Apply

0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Title: Drupal Developer
Experience: 4+ Yrs
Notice Period: 15 Days or Less
Location: Navi Mumbai

Job Description
Custom module development (Hooks, Plugin system, Form API, Entity API)
REST / JSON:API / GraphQL in Drupal
Drupal theming using Twig and preprocess functions
Experience with user roles and access control
Best practices in securing Drupal applications
Familiarity with APIs and third-party integration
Preferred: experience with Rocket.Chat integration or other messaging tools
Preferred: exposure to Solr/Elasticsearch using the Drupal Search API

Please DO NOT apply if your profile does not meet the job description or required qualifications. Irrelevant applications will not be considered. Share this opportunity to help it reach more job seekers!

© Allime Tech Solutions Pvt. Ltd. All rights reserved.

About Us
At Allime Tech Solutions, we believe in empowering innovation through technology. Our mission is to connect talent with opportunity, creating a future where everyone can thrive. Driven by integrity and excellence, we are committed to providing tailored solutions for our clients.

Posted 1 month ago

Apply

0.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

Location: Chennai, Tamil Nadu, India
Job ID: R-229964
Date posted: 24/06/2025

Job Title: Senior Consultant - Data & Analytics
Career Level: D2

Introduction to Role
Are you ready to make a significant impact in the world of data and analytics? We are looking for a highly skilled Senior Data & Analytics Engineer to join our offshore Knowledge Engineering Team. In this pivotal role, you will support various projects and initiatives across AstraZeneca by developing and managing the data ingestion, clean-up, and enrichment processes. Your expertise in data engineering and analytics, coupled with hands-on experience with AWS DevOps tools, will be crucial in driving our projects forward.

Accountabilities
Collaborate with project teams across diverse domains to understand their data needs and provide expertise in data ingestion and enrichment processes.
Design, develop, and maintain scalable data pipelines and ETL workflows for the Knowledge Graph Team.
Implement advanced data engineering techniques to ensure optimal performance and reliability of data systems.
Work closely with data scientists and analysts to ensure high-quality data for knowledge graph construction and advanced analytics.
Troubleshoot and resolve complex issues related to data pipelines, ensuring efficient data flow.
Optimize data storage and processing for performance, scalability, and cost-efficiency.
Stay updated with the latest trends in data engineering, analytics, and AWS DevOps to drive innovation.
Provide DevOps/CloudOps support for the Knowledge Graph Team as needed.

Essential Skills/Experience
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
Strong expertise in data engineering, ETL workflows, and data pipeline development.
Proficiency in programming languages such as Python, Java, or Scala.
Experience with advanced data engineering techniques and best practices.
Proven experience in DevOps/CloudOps support, particularly in an AWS environment.
Excellent troubleshooting and problem-solving skills.
Web-services development and consumption (e.g., RESTful, GraphQL).
Strong communication and collaboration skills, with the ability to lead and work effectively with cross-functional teams, present findings, and influence decision-making.

Desirable Skills/Experience
Experience in a senior or leadership role, guiding and mentoring junior data and analytics engineers.
Familiarity with knowledge graph construction and advanced analytics is a plus.
Familiarity with Object Graph Mapping libraries and/or Aspect-Oriented Programming.
Expertise in data engineering using modern data platforms to build, deploy, and maintain data applications.
Application deployment technologies (e.g., containerised workflows, Kubernetes) and cloud provisioning tools (e.g., CloudFormation, Terraform).
Experience with full-text search and indexing (e.g., Solr/Lucene, Elasticsearch).
Knowledge of testing frameworks.
AWS certification(s) is highly desirable.

When we put unexpected teams in the same room, we unleash bold thinking with the power to inspire life-changing medicines. In-person working gives us the platform we need to connect, work at pace, and challenge perceptions. That's why we work, on average, a minimum of three days per week from the office. But that doesn't mean we're not flexible. We balance the expectation of being in the office while respecting individual flexibility. Join us in our unique and ambitious world.
At AstraZeneca, your work directly impacts patients by transforming our ability to develop life-changing medicines. We empower our business to perform at its peak by combining cutting-edge science with leading digital technology platforms. Join us at a crucial stage of our journey as we become a digital and data-led enterprise. Here, you will have the opportunity to innovate, take ownership, and explore new solutions in a dynamic environment that values diverse minds working inclusively together. Are you ready to shape the future of the pharmaceutical industry? Apply now to join our team!

Date Posted: 25-Jun-2025
Closing Date:

AstraZeneca embraces diversity and equality of opportunity. We are committed to building an inclusive and diverse team representing all backgrounds, with as wide a range of perspectives as possible, and harnessing industry-leading skills. We believe that the more inclusive we are, the better our work will be. We welcome and consider applications to join our team from all qualified candidates, regardless of their characteristics. We comply with all applicable laws and regulations on non-discrimination in employment (and recruitment), as well as work authorization and employment eligibility verification requirements.

Posted 1 month ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

At Roche you can show up as yourself, embraced for the unique qualities you bring. Our culture encourages personal expression, open dialogue, and genuine connections, where you are valued, accepted and respected for who you are, allowing you to thrive both personally and professionally. This is how we aim to prevent, stop and cure diseases and ensure everyone has access to healthcare today and for generations to come. Join Roche, where every voice matters.

The Position
Search Specialist - Job profile
The Search Specialist within the Enterprise Search Platform Team is responsible for maintaining and optimising the platform, powered by Sinequa technology. This role focuses on platform lifecycle and data source indexing to ensure the platform operates efficiently and effectively and properly supports the use cases. This role requires close collaboration with the Product Owner and Solution Architect to translate business requirements into technical solutions to drive excellence in our search capabilities.

Key Responsibilities:

Search Implementation:
Develop and integrate advanced search functionalities, including indexing, query processing, and relevancy tuning.
Work on the design, implementation, and optimization of Sinequa search solutions.
Configure, customize, and maintain Sinequa's search platform to meet organizational requirements.
Implement data ingestion procedures leveraging Sinequa connectors and APIs.

Development of Search-Based Applications:
Design, develop, and deploy search-based applications that leverage the Sinequa platform.
Integrate search functionalities into existing and new applications to enhance user experience and information retrieval.
Collaborate with UI/UX designers to create engaging and efficient search interfaces.
Implement security best practices to protect sensitive data within search-based applications.
Test, debug, and document search-based applications to ensure high-quality deliverables.

Optimization and Performance:
Monitor and optimize search performance, ensuring low latency and high accuracy of search results.
Troubleshoot and resolve issues related to search indexing, relevance, and data retrieval.
Conduct regular performance analyses and make necessary adjustments to enhance search efficiency.

Collaboration and Communication:
Work closely with Product Owners, Solution Architects, and development teams to translate business needs into technical solutions.
Collaborate with data and content experts to ensure data quality and search index integrity.
Provide clear and effective communication on search-related topics to both technical and non-technical stakeholders.

User Experience Enhancement:
Design and implement user-friendly search interfaces and experiences.
Gather and analyze user feedback to continually improve search functionalities.
Ensure the search platform provides intuitive and relevant results to end-users.

Data Management:
Oversee data taxonomy, metadata, and tagging processes to ensure consistency and accuracy in search results.
Implement best practices for data governance and quality management in the context of search.

Innovation and Continuous Improvement:
Keep up-to-date with the latest trends and best practices in search technologies and apply this knowledge to improve the Sinequa platform.
Identify opportunities for leveraging new features and functionalities within the Sinequa platform to enhance organizational search capabilities.

Qualifications:
Solid understanding of search algorithms, information retrieval, and relevancy tuning.
Programming/scripting skills; familiarity with Java, Angular, Python, .NET or similar.
Experience with the Sinequa platform or similar search solutions (Elasticsearch, Solr, etc.).
Strong analytical and problem-solving skills.
Experience with data integration tools and techniques.
Excellent communication and collaboration skills, with the ability to work effectively in a team environment.
Background in natural language processing (NLP) or machine learning as applied to search is a plus.
Familiarity with cloud platforms and services, particularly AWS, is a plus.
Knowledge of data governance and data quality best practices is a plus.

Who we are
A healthier future drives us to innovate. Together, more than 100’000 employees across the globe are dedicated to advance science, ensuring everyone has access to healthcare today and for generations to come. Our efforts result in more than 26 million people treated with our medicines and over 30 billion tests conducted using our Diagnostics products. We empower each other to explore new possibilities, foster creativity, and keep our ambitions high, so we can deliver life-changing healthcare solutions that make a global impact. Let's build a healthier future, together.

Roche is an Equal Opportunity Employer.

Posted 1 month ago

Apply

5.0 - 10.0 years

20 - 25 Lacs

Kochi, Bengaluru, Thiruvananthapuram

Work from Office

Knowledge of Angular, Python, or C#. Familiarity with CI/CD pipelines and version control systems (e.g., Azure DevOps or Git), and with web technologies like PHP or MySQL. Willingness to learn Salesforce, Azure cloud solutions, SharePoint Online, and modern web frameworks.

Required Candidate Profile
Support the development of IoT frontend applications using Angular, Capacitor, and PrimeNG. Assist with IoT backend tasks in C#, .NET 8.0, and the Azure platform. Work with CI/CD pipelines using Azure DevOps and Bicep.

Posted 1 month ago

Apply

7.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Requirements
Description and Requirements

Summary:
A Big Data (Hadoop) Administrator responsible for supporting the installation, configuration, and maintenance of Cloudera Data Platform (CDP) and Cloudera Flow Management (CFM) streaming clusters on RedHat Linux. Strong expertise in DevOps practices, automation, and scripting (e.g., Ansible, Azure DevOps, Shell, Python) to streamline operations and improve efficiency is highly valued.

Job Responsibilities:
Assist in the installation, configuration, and maintenance of Cloudera Data Platform (CDP) and Cloudera Flow Management (CFM) streaming clusters on RedHat Linux.
Perform routine monitoring, troubleshooting, and issue resolution to ensure the stability and performance of Hadoop clusters.
Develop and maintain scripts (e.g., Python, Bash, Ansible) to automate operational tasks and improve system efficiency.
Collaborate with cross-functional teams, including application development, infrastructure, and operations, to support business requirements and implement new features.
Implement and follow best practices for cluster security, including user access management and integration with tools like Apache Ranger and Kerberos.
Support backup, recovery, and disaster recovery processes to ensure data availability and business continuity.
Conduct performance tuning and optimization of Hadoop clusters to enhance system efficiency and reduce latency.
Analyze logs and use tools like Splunk to debug and resolve production issues.
Document operational processes, maintenance procedures, and troubleshooting steps to ensure knowledge sharing and consistency.
Stay updated on emerging technologies and contribute to the adoption of new tools and practices to improve cluster management.

Education:
Bachelor's degree in Computer Science, Information Systems, or another related field, with 7+ years of IT and infrastructure engineering work experience.

Experience:
7+ years total IT experience and 4+ years relevant experience in Big Data databases.
Big Data Platform Management: knowledge in managing and optimizing the Cloudera Data Platform, including components such as Apache Hadoop (YARN and HDFS), Apache HBase, Apache Solr, Apache Hive, Apache Kafka, Apache NiFi, Apache Ranger, Apache Spark, as well as JanusGraph and IBM BigSQL.
Automation and Scripting: expertise in automation tools and scripting languages such as Ansible, Python, and Bash to streamline operational tasks and improve efficiency.
DevOps Practices: proficiency in DevOps tools and methodologies, including CI/CD pipelines, version control systems (e.g., Git), and infrastructure-as-code practices.
Monitoring and Troubleshooting: experience with monitoring and observability tools such as Splunk, Elastic Stack, or Prometheus to identify and resolve system issues.
Linux Administration: solid knowledge of Linux operating systems, including system administration, troubleshooting, and performance tuning.
Backup and Recovery: familiarity with implementing and managing backup and recovery processes to ensure data availability and business continuity.
Security and Access Management: understanding of security best practices, including user access management and integration with tools like Kerberos.
Agile Methodologies: knowledge of Agile practices and frameworks, such as SAFe, with experience working in Agile environments.
ITSM Tools: familiarity with ITSM processes and tools like ServiceNow for incident and change management.

Other Critical Requirements:
Excellent analytical and problem-solving skills.
Ability to work in a 24x7 rotational shift to support Hadoop platforms and ensure high availability.
Excellent written and oral communication skills, including the ability to clearly communicate/articulate technical and functional issues with conclusions and recommendations to stakeholders.
Prior experience in handling stateside and offshore stakeholders.
Experience in creating and delivering business presentations.
Demonstrated ability to work independently and in a team environment.
Demonstrated willingness to learn and adopt new technologies and tools to improve operational efficiency.

About MetLife
Recognized on Fortune magazine's list of the 2025 "World's Most Admired Companies" and Fortune World's 25 Best Workplaces™ for 2024, MetLife, through its subsidiaries and affiliates, is one of the world's leading financial services companies, providing insurance, annuities, employee benefits and asset management to individual and institutional customers. With operations in more than 40 markets, we hold leading positions in the United States, Latin America, Asia, Europe, and the Middle East. Our purpose is simple - to help our colleagues, customers, communities, and the world at large create a more confident future. United by purpose and guided by empathy, we're inspired to transform the next century in financial services. At MetLife, it's #AllTogetherPossible. Join us!

Posted 1 month ago

Apply

7.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Country: India
Working Schedule: Full-Time
Work Arrangement: Hybrid
Relocation Assistance Available: No
Posted Date: 23-Jun-2025
Job ID: 10076

Summary - Description and Requirements
A Big Data (Hadoop) Administrator responsible for supporting the installation, configuration, and maintenance of Cloudera Data Platform (CDP) and Cloudera Flow Management (CFM) streaming clusters on RedHat Linux. Strong expertise in DevOps practices, automation, and scripting (e.g. Ansible, Azure DevOps, Shell, Python) to streamline operations and improve efficiency is highly valued.

Job Responsibilities
Assist in the installation, configuration, and maintenance of Cloudera Data Platform (CDP) and Cloudera Flow Management (CFM) streaming clusters on RedHat Linux.
Perform routine monitoring, troubleshooting, and issue resolution to ensure the stability and performance of Hadoop clusters.
Develop and maintain scripts (e.g., Python, Bash, Ansible) to automate operational tasks and improve system efficiency.
Collaborate with cross-functional teams, including application development, infrastructure, and operations, to support business requirements and implement new features.
Implement and follow best practices for cluster security, including user access management and integration with tools like Apache Ranger and Kerberos.
Support backup, recovery, and disaster recovery processes to ensure data availability and business continuity.
Conduct performance tuning and optimization of Hadoop clusters to enhance system efficiency and reduce latency.
Analyze logs and use tools like Splunk to debug and resolve production issues.
Document operational processes, maintenance procedures, and troubleshooting steps to ensure knowledge sharing and consistency.
Stay updated on emerging technologies and contribute to the adoption of new tools and practices to improve cluster management.

Education
Bachelor's degree in computer science, Information Systems, or another related field with 7+ years of IT and infrastructure engineering work experience.

Experience
7+ years total IT experience and 4+ years relevant experience in Big Data databases.
Big Data Platform Management: Knowledge in managing and optimizing the Cloudera Data Platform, including components such as Apache Hadoop (YARN and HDFS), Apache HBase, Apache Solr, Apache Hive, Apache Kafka, Apache NiFi, Apache Ranger, Apache Spark, as well as JanusGraph and IBM BigSQL.
Automation and Scripting: Expertise in automation tools and scripting languages such as Ansible, Python, and Bash to streamline operational tasks and improve efficiency.
DevOps Practices: Proficiency in DevOps tools and methodologies, including CI/CD pipelines, version control systems (e.g., Git), and infrastructure-as-code practices.
Monitoring and Troubleshooting: Experience with monitoring and observability tools such as Splunk, Elastic Stack, or Prometheus to identify and resolve system issues.
Linux Administration: Solid knowledge of Linux operating systems, including system administration, troubleshooting, and performance tuning.
Backup and Recovery: Familiarity with implementing and managing backup and recovery processes to ensure data availability and business continuity.
Security and Access Management: Understanding of security best practices, including user access management and integration with tools like Kerberos.
Agile Methodologies: Knowledge of Agile practices and frameworks, such as SAFe, with experience working in Agile environments.
ITSM Tools: Familiarity with ITSM processes and tools like ServiceNow for incident and change management.

Other Critical Requirements
Excellent analytical and problem-solving skills.
Ability to work in a 24x7 rotational shift to support Hadoop platforms and ensure high availability.
Excellent written and oral communication skills, including the ability to clearly communicate/articulate technical and functional issues with conclusions and recommendations to stakeholders.
Prior experience in handling stateside and offshore stakeholders.
Experience in creating and delivering business presentations.
Demonstrated ability to work independently and in a team environment.
Demonstrated willingness to learn and adopt new technologies and tools to improve operational efficiency.

About MetLife
Recognized on Fortune magazine's list of the 2025 "World's Most Admired Companies" and Fortune World's 25 Best Workplaces™ for 2024, MetLife, through its subsidiaries and affiliates, is one of the world's leading financial services companies, providing insurance, annuities, employee benefits and asset management to individual and institutional customers. With operations in more than 40 markets, we hold leading positions in the United States, Latin America, Asia, Europe, and the Middle East. Our purpose is simple - to help our colleagues, customers, communities, and the world at large create a more confident future. United by purpose and guided by empathy, we're inspired to transform the next century in financial services. At MetLife, it's #AllTogetherPossible. Join us!

Posted 1 month ago

Apply

10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Come work at a place where innovation and teamwork come together to support the most exciting missions in the world!

Job Description:
We are seeking a talented Lead Big Data Engineer to deliver roadmap features of Unified Asset Inventory. This is a great opportunity to be an integral part of a team building Qualys' next-generation micro-services based platform that processes over 100 million transactions and terabytes of data per day, leverages open-source technologies, and works on challenging and business-impacting projects.

Responsibilities:
You will be building the Unified Asset Management product in the cloud.
You will be building highly scalable micro-services that interact with the Qualys Cloud Platform.
Research, evaluate, and adopt next-generation technologies.
Produce high-quality software following good architecture and design principles that you and your team will find easy to work with in the future.
This is a fantastic opportunity to be an integral part of a team building Qualys' next-generation platform using Big Data and micro-services based technology to process billions of transactions per day, leverage open-source technologies, and work on challenging and business-impacting initiatives.

Qualifications:
Bachelor's degree in computer science or equivalent.
10+ years of total experience.
4+ years of relevant experience in designing and architecting Big Data solutions using Spark.
3+ years of experience in working with engineering resources for innovation.
4+ years of experience in understanding Big Data event-flow pipelines (see the sketch after this listing).
3+ years of experience in performance testing for large infrastructure.
3+ years of in-depth experience with various search solutions (Solr/Elastic).
3+ years of experience in Kafka.
In-depth experience with data lakes and related ecosystems.
In-depth experience with messaging queues.
In-depth experience in providing requirements to build a scalable architecture for Big Data and micro-services environments.
In-depth experience in understanding caching components or services.
Knowledge of Presto technology.
Knowledge of Airflow.
Hands-on experience in scripting and automation.
In-depth understanding of RDBMS/NoSQL, Oracle, Cassandra, Kafka, Redis, Hadoop, lambda architecture, and kappa and kappa++ architectures with Flink data streaming and rule engines.
Experience in working with ML model engineering and related deployment.
Design and implement secure Big Data clusters to meet many compliance and regulatory requirements.
Experience in leading the delivery of large-scale systems focused on managing the infrastructure layer of the technology stack.
Strong experience in performance benchmarking and testing for Big Data technologies.
Strong troubleshooting skills.
Experience leading development life cycle processes and best practices.
Experience in Big Data services administration would be added value.
Experience with Agile management (Scrum, RUP, XP), OO modeling, and working on internet, UNIX, middleware, and database related projects.
Experience mentoring/training the engineering community on complex technical issues.
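For illustration: a minimal sketch of one stage of a Big Data event-flow pipeline of the kind described above, publishing an event to Kafka with the plain Java producer API. The broker address, topic name, and payload are hypothetical.

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class AssetEventPublisher {

    public static void main(String[] args) {
        Properties props = new Properties();
        // Hypothetical broker address; serializers for simple string key/value messages.
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());
        props.put("acks", "all"); // wait for full acknowledgment before the send is considered complete

        try (Producer<String, String> producer = new KafkaProducer<>(props)) {
            String key = "asset-42";
            String value = "{\"assetId\":\"asset-42\",\"event\":\"DISCOVERED\",\"source\":\"scanner-7\"}";

            producer.send(new ProducerRecord<>("asset-events", key, value),
                    (metadata, exception) -> {
                        if (exception != null) {
                            exception.printStackTrace();
                        } else {
                            System.out.printf("sent to %s-%d@%d%n",
                                    metadata.topic(), metadata.partition(), metadata.offset());
                        }
                    });
        } // close() flushes any pending records
    }
}
```

Setting acks=all trades a little latency for stronger delivery guarantees, which matters for inventory-style events that downstream consumers index and aggregate.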

Posted 1 month ago

Apply

3.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Job Title: Java Full Stack Developer (Java + Angular)
Experience: 3+ Years
Location: Gurgaon

Summary:
We are looking for a Java Full Stack Developer with 3+ years of experience to join our dynamic team. The ideal candidate will play a crucial role in developing, optimizing, and maintaining scalable web applications using Java, Spring Boot, and modern frontend frameworks like Angular or React.js. The candidate will collaborate with backend, frontend, and DevOps teams to build end-to-end solutions, optimize performance, and ensure seamless user experiences. Strong expertise in Agile methodologies and modern software development practices is essential.

Core Responsibilities:
· Develop, test, and maintain RESTful and GraphQL APIs using Java 8/11/17 and Spring Boot (see the sketch after this listing).
· Implement microservices architecture and event-driven systems.
· Utilize Spring Security, JWT, and OAuth2 for authentication and authorization.
· Optimize application performance through caching (Redis, Memcached) and query optimization.
· Develop and manage database integrations using JPA, Hibernate, MySQL, and PostgreSQL.
· Design and optimize database schemas, indexing, and query performance.
· Write unit and integration tests using JUnit, Mockito, and TestNG.
· Build responsive UI components using Angular, TypeScript, and JavaScript.
· Ensure cross-browser compatibility and mobile-first designs.
· Optimize UI performance using lazy loading, caching, and component reusability.
· Utilize CSS frameworks like Tailwind CSS, Bootstrap, and Material UI for styling.
· Optimize repository performance and metadata indexing using Solr and Elasticsearch.

Preferred Skills:
· Strong expertise in Java, Spring Boot, Hibernate (JPA), and PostgreSQL.
· Proficiency in Angular, TypeScript, JavaScript, and modern frontend frameworks.
· Experience with Solr and Elasticsearch for metadata indexing.
· Strong knowledge of authentication systems (LDAP, OAuth2, Shibboleth, SAML).
· Familiarity with metadata standards (Dublin Core, MODS, MARC, METS).
· Hands-on experience with version control tools (Git, GitHub, GitLab, Bitbucket).
· Excellent analytical and troubleshooting skills with strong attention to detail.
· Ability to translate business requirements into technical solutions.

Educational Background:
· BE/BTech in Computer Science or IT, or MCA, from a reputed institution.

Work Experience:
3+ years of experience in Java full-stack development, working on enterprise-grade applications across web and cloud environments.
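For illustration: a minimal sketch of the kind of Spring Boot REST endpoint this role describes. Class names, the route, and the data are hypothetical, and cross-cutting concerns such as Spring Security/JWT, persistence, and error handling are omitted.

```java
import java.util.List;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@SpringBootApplication
public class CatalogApplication {
    public static void main(String[] args) {
        SpringApplication.run(CatalogApplication.class, args);
    }
}

// A hypothetical read-only endpoint: GET /api/products and GET /api/products/{id}.
@RestController
@RequestMapping("/api/products")
class ProductController {

    // In a real service this data would come from a repository/service layer.
    private final List<String> products = List.of("keyboard", "monitor", "laptop");

    @GetMapping
    public List<String> all() {
        return products;
    }

    @GetMapping("/{id}")
    public String byId(@PathVariable int id) {
        return products.get(id); // throws if the id is out of range; a real API would return 404
    }
}
```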

Posted 1 month ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Desired Competencies (Technical/Behavioral)
Experience Range: 4-7 yrs
Hiring Location: Bangalore

Must-Have: The person will be responsible for Big Data administration and engineering activities on multiple Hadoop, HBase and Spark clusters.
• Work on performance tuning and increase operational efficiency on a continuous basis
• Monitor the health of the platforms, generate performance reports, and provide continuous improvements
• Work closely with development, engineering and operations teams on key deliverables, ensuring production scalability and stability
• Develop and enhance platform best practices
• Ensure the Hadoop platform can effectively meet performance and SLA requirements
• Be responsible for the Big Data production environment, which includes Hadoop (HDFS and YARN), Hive, Spark, Livy, Solr, Oozie, Kafka, Airflow, NiFi, HBase, etc.
• Perform optimization, debugging and capacity planning of Big Data clusters
• Perform security remediation, automation and self-healing as per requirements
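Administration work like this is often backed by small programmatic health and capacity checks. Below is a minimal sketch using the Hadoop FileSystem API; the 80% threshold and the report format are assumptions for illustration, not part of the posting.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.FsStatus;

public class HdfsCapacityCheck {
    public static void main(String[] args) throws Exception {
        // Picks up the cluster settings (core-site.xml / hdfs-site.xml) from the classpath.
        Configuration conf = new Configuration();
        try (FileSystem fs = FileSystem.get(conf)) {
            FsStatus status = fs.getStatus();
            double usedPct = 100.0 * status.getUsed() / status.getCapacity();
            System.out.printf("HDFS capacity: %d bytes, used: %.1f%%%n",
                    status.getCapacity(), usedPct);
            // A monitoring job might alert when usage crosses a threshold (illustrative value).
            if (usedPct > 80.0) {
                System.out.println("WARNING: HDFS usage above 80% - review capacity planning or cleanup.");
            }
        }
    }
}
```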

Posted 1 month ago

Apply

9.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Our Company
Changing the world through digital experiences is what Adobe's all about. We give everyone—from emerging artists to global brands—everything they need to design and deliver exceptional digital experiences! We're passionate about empowering people to create beautiful and powerful images, videos, and apps, and about transforming how companies interact with customers across every screen. We're on a mission to hire the very best and are committed to creating exceptional employee experiences where everyone is respected and has access to equal opportunity. We realize that new ideas can come from everywhere in the organization, and we know the next big idea could be yours!

The Challenge
Beyond the usual responsibility of designing, developing, documenting, and thoroughly testing code, Engineering Managers @ Adobe own features of varying complexity, which may require understanding interactions with other parts of the system, highly complex algorithms, and good design judgment. We are looking for passionate, enthusiastic and highly motivated engineers with the zeal and ability to understand new product features and technologies.

Job Description
What you'll do:
· Lead the design, development and testing of large-scale, low-latency, high-volume services that serve millions of requests every day.
· Collaborate with architects, product management and other engineering teams to create the technical vision and roadmap for the team.
· Mentor people, helping them grow and develop their careers.
· Provide oversight, accountability, and leadership for technical decisions with a bias for action.
· Build new functionality/services and APIs for Adobe Campaign.
· Build reliable, scalable and secure cloud services.
· Be responsible for the full lifecycle of the project, from user story to design, development, testing, documentation and maintenance.
· Create technical specifications, prototypes and presentations to communicate your ideas.

What you need to succeed:
· Passion and love for what you do!
· 9+ years of experience in software development, with at least 2 years in a management role
· Strong technical skills with experience designing and developing full-stack enterprise applications
· Strong working knowledge of software development methodologies and software design patterns, and experience leading significant software projects/products
· Proven experience running or leading sophisticated projects through full product delivery cycles
· Proven track record developing, leading, coaching, and mentoring software engineers and engineering managers towards delivering timely, high-quality software as a team
· Ambitious and not afraid to tackle unknowns; demonstrates a strong bias to action
· Strong interpersonal, analytical, problem-solving and conflict-resolution skills
· Proven track record in recruiting and building robust teams
· Strong React/Angular (frontend) and Java/JEE (backend) design, coding and architectural skills, along with problem-solving and analytical abilities
· Good knowledge of Kafka, MySQL/Postgres, Solr, Elasticsearch (a minimal illustrative Kafka producer sketch follows this listing)
· Experience developing and leading teams building solutions with cloud technologies (AWS and/or Azure)
· Self-motivated, with a desire to mentor a team and the ability to drive the team to accomplish high-quality work
· Excellent speaking, writing, and presentation skills, as well as the ability to persuade, encourage, and empower others
· A Bachelor's or Master's degree in Computer Science or a related field is preferred.

Adobe's Experience Cloud & Adobe Campaign
Adobe's Digital Experience Business Unit provides solutions that empower businesses to make, manage, measure and monetise online, offline and multichannel business initiatives. Through the Adobe Experience Cloud, which includes our Advertising, Analytics and Marketing Clouds, companies have everything they need to deliver a well-designed, personal and consistent experience to their customers at the right time, regardless of the channel. The Adobe Marketing Cloud offers integrated solutions that help brands connect with customers on a personal level; the Adobe Analytics Cloud is a customer intelligence engine that helps brands move from insights to action; and the Adobe Advertising Cloud offers the industry's first end-to-end platform that manages advertising for TV and all digital formats. Adobe Campaign provides a platform for designing cross-channel customer experiences and an environment for visual campaign orchestration, real-time interaction management and cross-channel execution. Adobe Campaign, formerly Neolane, provides best-in-class campaign, offer, and personalization management capabilities for sophisticated automation and execution of marketing programs across all channels, digital and traditional. Adobe Campaign addresses a key challenge for marketers: how to build and extend relationships with their customer base to drive top-line revenue growth and ROI. Adobe Experience Cloud complements the solution by giving its customers everything they need to get deep insight into their customers, build personalized campaigns and manage their content and assets. Marketers now have an intuitive, automated way to deliver one-to-one messages across online and offline marketing channels. Adobe Campaign lets you orchestrate personalized experiences determined by the customer's habits and preferences. Finally, a solution that helps you know what customers want even before they do.

Adobe is proud to be an Equal Employment Opportunity employer. 
We do not discriminate based on gender, race or color, ethnicity or national origin, age, disability, religion, sexual orientation, gender identity or expression, veteran status, or any other applicable characteristics protected by law. Learn more about our vision here. Adobe aims to make Adobe.com accessible to any and all users. If you have a disability or special need that requires accommodation to navigate our website or complete the application process, email accommodations@adobe.com or call (408) 536-3015.
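For a flavour of the event-driven backend work this role involves, here is a minimal Java Kafka producer. It is purely illustrative: the topic name, key, and payload are invented for the example and are not Adobe Campaign's actual interfaces.

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class CampaignEventProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Publish one synthetic event; a real service would batch sends and handle delivery callbacks.
            producer.send(new ProducerRecord<>("campaign-events", "user-123", "{\"event\":\"email_open\"}"));
            producer.flush();
        }
    }
}
```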

Posted 1 month ago

Apply

3.0 - 6.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Designation: Developer – Magento (B.E.)
Experience: 3 to 6 Years
Company Website: https://www.growisto.com/

About Us: We are a group of techies, marketers, data analysts, and operations professionals who have come together to provide next-generation e-commerce solutions. We help brands and D2C brands grow their businesses on platforms such as Amazon and their own websites. We love brainstorming ideas on marketing businesses through digital media and utilizing technology to execute them to achieve the desired goals. However, nothing gives us a bigger kick than boosting sales!

Key Responsibilities:
Technical Expertise
· Work hands-on on complex customizations of the latest versions of Magento
· Regularly review code before it goes for UAT and optimize the system for speed and scalability
· Identify productization and automation opportunities as per internal team requirements
· Stay up to date with the latest Magento technology
Project Delivery
· Understand the business requirements of clients to design and deliver the best possible solutions
· Work hands-on on projects to ensure timely and quality delivery
· Guide and train the team to help them execute their projects
· Identify and implement process improvements related to effective coding practices
· Be responsible for the flawless delivery of every project
Team Building & Management
· Build a high-performing team with the right technical talent and outstanding project execution skills
· Get involved in team building by actively participating in hiring, training and people management of team members

Requirements:
· Minimum 4 years of experience in an e-commerce CMS such as Magento
· Experience developing/editing third-party extensions
· Extensive experience with PHP and MySQL
· Experience with JavaScript frameworks such as jQuery
· Must be able to handle multiple projects simultaneously
· Developer with strong PHP, Zend, and object-oriented programming experience
· Good experience with Magento core development (PHP) and module development involving database-level operations (creating new tables in the database and developing models for them)
· Website development projects covering both back-end customization, APIs, plugins, and modules
· Performance optimization using caching and Solr (or any NoSQL search)

Good to Have:
· Experience with Magento Enterprise is a bonus
· Contributions to the open-source community
· Experience building scalable Magento systems

Why should you consider joining Growisto?
· It will be a challenging role and you will get complete ownership to solve challenging problems. If you like challenges and think from a first-principles basis, you should definitely take this up.
· If you aspire to grow and develop as a leader in parallel with the multifold growth rate of a startup, then you should join us.
· If your thought process and personality resonate with our cultural values, then you should join us.

Why should you not consider joining Growisto?
· If the role description does not excite you, then you should not join us.
· We are a startup and things will move fast. If you are not comfortable in a fast-paced environment, then you should not join us.
· If we have to choose between culture/team and profits, our obvious choice is culture. If you don't believe in the same philosophy, then you should not join us.

Posted 1 month ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

AWS Cloud Developer – Serverless (Lambda, OpenSearch)

Job Description:
· Strong programming skills in Python
· Hands-on experience with AWS Lambda and other serverless services
· Proficiency in Amazon OpenSearch, with working knowledge of Solr and Elasticsearch query languages and indexing strategies
· Experience with API Gateway, S3, DynamoDB, CloudWatch, IAM
· Familiarity with DevOps tools such as Git, Jenkins, and CI/CD workflows
· Experience with monitoring and logging using CloudWatch, X-Ray, or third-party tools
· Knowledge of RESTful API development and integration
· Good understanding of data modeling and search optimization
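As a minimal sketch of the serverless pattern this posting describes (shown in Java for consistency with the other examples on this page, although the role itself emphasizes Python; the event shape and the stubbed OpenSearch call are assumptions for illustration):

```java
import java.util.Map;

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;

// Minimal Lambda handler; the OpenSearch call is stubbed out so the example stays self-contained.
public class SearchHandler implements RequestHandler<Map<String, Object>, String> {

    @Override
    public String handleRequest(Map<String, Object> event, Context context) {
        // Read a hypothetical "q" field from the incoming event (e.g. mapped from API Gateway).
        String query = String.valueOf(event.getOrDefault("q", ""));
        context.getLogger().log("Received search query: " + query);
        // A real handler would sign a request to the OpenSearch domain and return matching documents.
        return "{\"query\":\"" + query + "\",\"hits\":[]}";
    }
}
```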

Posted 1 month ago

Apply

0 years

3 - 5 Lacs

Chennai

On-site

AWS Cloud Developer – Serverless (Lambda, OpenSearch)

Job Description:
· Strong programming skills in Python
· Hands-on experience with AWS Lambda and other serverless services
· Proficiency in Amazon OpenSearch, with working knowledge of Solr and Elasticsearch query languages and indexing strategies
· Experience with API Gateway, S3, DynamoDB, CloudWatch, IAM
· Familiarity with DevOps tools such as Git, Jenkins, and CI/CD workflows
· Experience with monitoring and logging using CloudWatch, X-Ray, or third-party tools
· Knowledge of RESTful API development and integration
· Good understanding of data modeling and search optimization

About Virtusa
Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally who care about your growth — one that seeks to provide you with exciting projects, opportunities and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.

Posted 1 month ago

Apply