
4657 Apache Jobs - Page 40

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

4.0 years

0 Lacs

Ghaziabad, Uttar Pradesh, India

On-site

Source: LinkedIn

Responsibilities
As a Data Engineer, you will design, develop, and support data pipelines and related data products and platforms. Your primary responsibilities include designing and building data extraction, loading, and transformation pipelines across on-prem and cloud platforms. You will perform application impact assessments, requirements reviews, and develop work estimates. Additionally, you will develop test strategies and site reliability engineering measures for data products and solutions, participate in agile development and solution reviews, mentor junior Data Engineering Specialists, lead the resolution of critical operations issues, and perform technical data stewardship tasks, including metadata management, security, and privacy by design.

Required Skills:
● Design, develop, and support data pipelines and related data products and platforms.
● Design and build data extraction, loading, and transformation pipelines and data products across on-prem and cloud platforms.
● Perform application impact assessments, requirements reviews, and develop work estimates.
● Develop test strategies and site reliability engineering measures for data products and solutions.
● Participate in agile development and solution reviews.
● Mentor junior Data Engineers.
● Lead the resolution of critical operations issues, including post-implementation reviews.
● Perform technical data stewardship tasks, including metadata management, security, and privacy by design.
● Design and build data extraction, loading, and transformation pipelines using Python and other GCP data technologies.
● Demonstrate SQL and database proficiency in various data engineering tasks.
● Automate data workflows by setting up DAGs in tools like Control-M, Apache Airflow, and Prefect (a minimal Airflow sketch follows this listing).
● Develop Unix scripts to support various data operations.
● Model data to support business intelligence and analytics initiatives.
● Utilize infrastructure-as-code tools such as Terraform, Puppet, and Ansible for deployment automation.
● Expertise in GCP data warehousing technologies, including BigQuery, Cloud SQL, Dataflow, Data Catalog, Cloud Composer, Google Cloud Storage, IAM, Compute Engine, Cloud Data Fusion, and Dataproc (good to have).

Qualifications:
● Bachelor's degree in Software Engineering, Computer Science, Business, Mathematics, or a related field.
● 4+ years of data engineering experience.
● 2 years of data solution architecture and design experience.
● GCP Certified Data Engineer (preferred).

Interested candidates can send their resumes to riyanshi@etelligens.in
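To illustrate the Airflow item above: below is a minimal sketch of a two-step extract-and-load DAG, assuming Apache Airflow 2.4+. The DAG id, task names, and callables are hypothetical placeholders, not details from the posting.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_to_staging():
    # Pull data from the source system into a staging area.
    # (The real source and staging location depend on the project.)
    ...

def load_to_warehouse():
    # Load the staged data into the warehouse, e.g. a BigQuery table.
    ...

with DAG(
    dag_id="daily_extract_load",      # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # run once per day
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_to_staging)
    load = PythonOperator(task_id="load", python_callable=load_to_warehouse)
    extract >> load                   # load runs only after extract succeeds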

Posted 6 days ago

Apply

0 years

0 Lacs

Kochi, Kerala, India

On-site

Source: LinkedIn

Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your Role And Responsibilities
- Create Solution Outline and Macro Design to describe end-to-end product implementation in Data Platforms, including system integration, data ingestion, data processing, serving layer, design patterns, and platform architecture principles.
- Contribute to pre-sales and sales support through RfP responses, solution architecture, planning, and estimation.
- Contribute to reusable component / asset / accelerator development to support capability development.
- Participate in customer presentations as Platform Architect / Subject Matter Expert on Big Data, Azure Cloud, and related technologies.
- Participate in customer PoCs to deliver the outcomes.
- Participate in delivery reviews / product reviews and quality assurance, and work as design authority.

Preferred Education
Non-Degree Program

Required Technical And Professional Expertise
- Experience in designing data products providing descriptive, prescriptive, and predictive analytics to end users or other systems.
- Experience in data engineering and architecting data platforms.
- Experience in architecting and implementing data platforms on the Azure Cloud Platform.
- Experience on Azure cloud is mandatory (ADLS Gen1/Gen2, Data Factory, Databricks, Synapse Analytics, Azure SQL, Cosmos DB, Event Hub, Snowflake), plus Azure Purview, Microsoft Fabric, Kubernetes, Terraform, and Airflow.
- Experience in the Big Data stack (Hadoop ecosystem: Hive, HBase, Kafka, Spark, Scala, PySpark, Python, etc.) with Cloudera or Hortonworks.

Preferred Technical And Professional Experience
- Experience in architecting complex data platforms on the Azure Cloud Platform and on-prem.
- Experience and exposure to implementation of Data Fabric and Data Mesh concepts and solutions such as Microsoft Fabric, Starburst, Denodo, IBM Data Virtualisation, Talend, or Tibco Data Fabric.
- Exposure to data cataloging and governance solutions like Collibra, Alation, Watson Knowledge Catalog, Databricks Unity Catalog, Apache Atlas, Snowflake Data Glossary, etc.

Posted 6 days ago

Apply

0 years

0 Lacs

Andhra Pradesh, India

On-site

Source: LinkedIn

Develop, test, and maintain backend services and microservices using Java and Spring Boot. Design and build RESTful APIs for internal and external integration. Implement event-driven architectures using Apache Kafka (a minimal consumer sketch follows this listing). Collaborate with DevOps teams to deploy and manage services on Kubernetes. Write clean, scalable, and well-documented code following industry best practices. Participate in code reviews, sprint planning, and agile development processes. Troubleshoot issues and contribute to performance tuning and optimization. Work closely with QA, product, and business teams to deliver high-quality software.
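The event-driven piece of this role centers on Kafka consumers and producers. The role's stack is Java/Spring Boot, but the consume loop is easiest to show compactly in Python; below is a minimal sketch using the confluent-kafka client, with the broker address, group id, and topic name as hypothetical placeholders.

from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",  # hypothetical broker
    "group.id": "order-service",            # hypothetical consumer group
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["orders"])              # hypothetical topic

try:
    while True:
        msg = consumer.poll(timeout=1.0)    # block up to 1s for a record
        if msg is None:
            continue
        if msg.error():
            print(f"consumer error: {msg.error()}")
            continue
        # Hand the event to business logic; in a Spring Boot service this
        # would be the body of a @KafkaListener method instead.
        print(msg.value().decode("utf-8"))
finally:
    consumer.close()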

Posted 6 days ago

Apply


0 years

0 Lacs

Andhra Pradesh, India

On-site

Source: LinkedIn

We are seeking a talented and experienced Java AWS Developer to join our engineering team. The ideal candidate will have a strong background in Java, Spring Boot, and microservices architecture, with practical experience in developing scalable, cloud-native applications using AWS. Proficiency in Kafka, REST APIs, and ElastiCache is essential for this role.

Design, develop, and maintain microservices-based applications using Java and Spring Boot. Build and expose RESTful APIs for internal and external integration. Implement message-driven solutions using Apache Kafka. Develop and deploy applications on AWS Cloud, ensuring scalability, performance, and security. Utilize ElastiCache to enhance application responsiveness and performance (a cache-aside sketch follows this listing). Collaborate with cross-functional teams including product owners, architects, and QA engineers. Participate in code reviews, technical discussions, and continuous improvement initiatives. Troubleshoot and resolve production issues in a timely manner.
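On the ElastiCache point: the usual pattern is cache-aside, where the service checks Redis before hitting the database. A minimal Python sketch with the redis client follows; the endpoint, key format, TTL, and the stubbed database call are all hypothetical.

import json

import redis

# ElastiCache for Redis exposes a normal Redis endpoint; this host is a
# hypothetical placeholder for a real cluster endpoint.
cache = redis.Redis(host="my-cache.example.cache.amazonaws.com", port=6379)

def fetch_product_from_db(product_id: str) -> dict:
    # Stand-in for the real database lookup.
    return {"id": product_id, "name": "example"}

def get_product(product_id: str) -> dict:
    key = f"product:{product_id}"
    cached = cache.get(key)
    if cached is not None:
        return json.loads(cached)                 # cache hit: skip the DB
    product = fetch_product_from_db(product_id)   # cache miss: load from DB
    cache.setex(key, 300, json.dumps(product))    # keep it warm for 5 minutes
    return product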

Posted 6 days ago

Apply

5.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Source: LinkedIn

Thornton Tomasetti applies engineering and scientific principles to solve the world's challenges. An independent organization of creative thinkers and innovative doers collaborating from offices worldwide, our mission is to bring our clients' ideas to life and, in the process, lay the groundwork for a better, more resilient future. We provide support and opportunities to our employees to achieve their full potential and cultivate a rewarding career.

Thornton Tomasetti has a rich history of developing world-leading software for applications internal to our business and in some cases for external use as well. We have a challenging combination of global exposure, business opportunity, a rich variety of technology in use, and sensitive client requirements. This position is an unusual opportunity for a cyber specialist who enjoys thinking creatively, working in a people-oriented dynamic environment, and producing clever solutions that balance risk and benefit.

Thornton Tomasetti's custom-developed tools integrate existing backend systems, expose their data to broad staff use, and automate certain activities. These tools are of two key types:
- "Business" systems that help the company communicate, manage its operations, and measure performance. An example of this is a master database of our projects, current and historical, that allows staff to find relevant work that might help them and allows management to analyze trends and better sell the next generation of work.
- "Design/engineering" systems that allow us to conduct our technical work of designing, evaluating, and improving buildings and other structures more efficiently around the world. This includes, for example, harvesting data from Autodesk design modeling tools to use in other analytical tools.

The Role
We are seeking an analyst to work with our software development teams to advise on key risks and options for mitigating them, as well as provide hands-on help to improve the code. This team member will work with other developers, internal customers, and specialists (including information technology, software vendors, and consultants). The primary responsibility of the Cyber Specialist is to evaluate options for improving our cyber security, help build and get consensus on implementation plans, and then lead execution of those improvements in collaboration with other IT and Cyber specialists. This team member will be tasked with identifying potential enhancements, helping to make a business case for implementing them, and leading implementation of improvements in cooperation with other experts and stakeholders.

The role would also include:
- Develop and maintain best practices.
- Broad understanding of the big picture, software architecture, and networking.
- Help during planning and execution phases.
- Standardization around authentication mechanisms.
- Privacy awareness (e.g., PII/GDPR/CCPA).
- Backend security: website/server, database, container.
- Frontend security: client applications, web pages, APIs.
- Coordinate vulnerability testing.
- Create and maintain review checklists.

Responsibilities

Strategic
- Evaluate the "current state" to identify strengths, weaknesses, and opportunities associated with our systems and their fit with our culture and business needs.
- Understand the available commercial software/service options and configuration possibilities and identify opportunities that will benefit the firm.
- Monitor changes in external requirements including security certification programs, regulation, and industry best practices.

Tactical
- Develop plans for changes, upgrades, maintenance, training, and communication.
- Hands-on management of improvement programs, re-configurations, and the effectiveness of our evolving tools, including issue tracking, user support, and vendor relations.
- Develop training materials and other ways to evangelize good cyber security practices in the company.
- Participate in proactive problem avoidance as well as incident investigation/lessons-learned activities.

Requirements
- 5+ years in software development with significant experience in planning and design for custom software development.
- Deep awareness of cyber issues and options for software development, with a CISSP or another credential preferred.
- Awareness of evolving US Government cybersecurity requirements and international standards in major markets where TT operates (especially North America, UK, India).
- Strong knowledge of full-stack JavaScript development and cloud services architecture (AWS and Azure services) is essential.
- Familiarity with web and desktop application development, cloud services, and Linux and Windows system administration.

Tools And Technologies
- Node.js (most important)
- JavaScript
- React, Vue, Angular front-end frameworks
- MongoDB
- AWS and Microsoft Azure cloud platforms
- C# .NET
- Python
- PowerShell
- Docker
- Linux
- Apache
- Nginx
- Git
- Windows
- Flexibility to learn other languages, tools, techniques, etc. as needed.

Thornton Tomasetti is proud to be an equal employment workplace. Individuals seeking employment at Thornton Tomasetti are considered without regard to age, ancestry, color, gender (including pregnancy, childbirth, or related medical conditions), gender identity or expression, genetic information, marital status, medical condition, mental or physical disability, national origin, protected family care or medical leave status, race, religion (including beliefs and practices or the absence thereof), sexual orientation, military or veteran status, or any other characteristic protected by federal, state, or local laws.

Thornton Tomasetti Global Terms of Use and Privacy Statement
Carefully read these Terms of Use before using this website. Your access to and use of this website and application for a job at Thornton Tomasetti are conditioned on your acceptance and compliance with these terms. Please access the linked document by clicking here, select the country where you are applying for employment, and review. Before submitting your application you will be asked to confirm your agreement with the terms.

Beware Of Recruitment Fraud: Scammers may attempt to impersonate Thornton Tomasetti. Messages from our firm come only from the ThorntonTomasetti.com domain; Thornton Tomasetti does not use any third-party recruiters. When in doubt, please contact us through our web form here and see how you can protect yourself online here.

Posted 6 days ago

Apply

89.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

Full-time

Company Description
GfK - Growth from Knowledge. For over 89 years, we have earned the trust of our clients around the world by solving critical questions in their decision-making process. We fuel their growth by providing a complete understanding of their consumers' buying behavior, and the dynamics impacting their markets, brands, and media trends. In 2023, GfK combined with NIQ, bringing together two industry leaders with unparalleled global reach. With a holistic retail read and the most comprehensive consumer insights - delivered with advanced analytics through state-of-the-art platforms - GfK drives "Growth from Knowledge".

Job Description
It's an exciting time to be a builder. Constant technological advances are creating an exciting new world for those who understand the value of data. The mission of NIQ's Media Division is to turn NIQ into the global leader that transforms how consumer brands plan, activate, and measure their media activities. Recombine is the delivery area focused on maximising the value of data assets in our NIQ Media Division. We apply advanced statistical and machine learning techniques to unlock deeper insights, whilst integrating data from multiple internal and external sources. Our teams develop data integration products across various markets and product areas, delivering enriched datasets that power client decision-making.

Role Overview
We are looking for a Principal Software Engineer for our Recombine delivery area to provide technical leadership within our development teams, ensuring best practices, architectural coherence, and effective collaboration across projects. This role is ideal for a highly experienced engineer who can bridge the gap between data engineering, data science, and software engineering, helping teams build scalable, maintainable, and well-structured data solutions. As a Principal Software Engineer, you will play a hands-on role in designing and implementing solutions while mentoring developers, influencing technical direction, and driving best practices in software and data engineering. This role includes line management responsibilities, ensuring the growth and development of team members. The role will be working within an AWS environment, leveraging the power of cloud-native technologies and modern data platforms.

Key Responsibilities

Technical Leadership & Architecture
- Act as a technical architect, ensuring alignment between the work of multiple development teams in data engineering and data science.
- Design scalable, high-performance data processing solutions within AWS, considering factors such as governance, security, and maintainability.
- Drive the adoption of best practices in software development, including CI/CD, testing strategies, and cloud-native architecture.
- Work closely with Product Owners to translate business needs into technical solutions.

Hands-on Development & Technical Excellence
- Lead by example through high-quality coding, code reviews, and proof-of-concept development.
- Solve complex engineering problems and contribute to critical design decisions.
- Ensure effective use of AWS services, including AWS Glue, AWS Lambda, Amazon S3, Redshift, and EMR.
- Develop and optimise data pipelines, data transformations, and ML workflows in a cloud environment.

Line Management & Team Development
- Provide line management to engineers, ensuring their professional growth and development.
- Conduct performance reviews, set development goals, and mentor team members to enhance their skills.
- Foster a collaborative and high-performing engineering culture, promoting knowledge sharing and continuous improvement beyond team boundaries.
- Support hiring, onboarding, and career development initiatives within the engineering team.

Collaboration & Cross-Team Coordination
- Act as the technical glue between data engineers, data scientists, and software developers, ensuring smooth integration of different components.
- Provide mentorship and guidance to developers, helping them level up their skills and technical understanding.
- Work with DevOps teams to improve deployment pipelines, observability, and infrastructure as code.
- Engage with stakeholders across the business, translating technical concepts into business-relevant insights.

Governance, Security & Data Best Practices
- Champion data governance, lineage, and security across the platform.
- Advocate for and implement scalable data architecture patterns, such as Data Mesh, Lakehouse, or event-driven pipelines.
- Ensure compliance with industry standards, internal policies, and regulatory requirements.

Qualifications

Requirements & Experience
- Strong software engineering background with experience in designing and building production-grade applications in Python, Scala, Java, or similar languages.
- Proven experience with AWS-based data platforms, specifically AWS Glue, Redshift, Athena, S3, Lambda, and EMR.
- Expertise in Apache Spark and AWS Lake Formation, with experience building large-scale distributed data pipelines.
- Experience with workflow orchestration tools like Apache Airflow or AWS Step Functions.
- Cloud experience in AWS, including containerisation (Docker, Kubernetes, ECS, EKS) and infrastructure as code (Terraform, CloudFormation).
- Strong knowledge of modern software architecture, including microservices, event-driven systems, and distributed computing.
- Experience leading teams in an agile environment, with a strong understanding of CI/CD pipelines, automated testing, and DevOps practices.
- Excellent problem-solving and communication skills, with the ability to engage with both technical and non-technical stakeholders.
- Proven line management experience, including mentoring, career development, and performance management of engineering teams.

Additional Information

Our Benefits
- Flexible working environment
- Volunteer time off
- LinkedIn Learning
- Employee-Assistance-Program (EAP)

About NIQ
NIQ is the world's leading consumer intelligence company, delivering the most complete understanding of consumer buying behavior and revealing new pathways to growth. In 2023, NIQ combined with GfK, bringing together the two industry leaders with unparalleled global reach. With a holistic retail read and the most comprehensive consumer insights - delivered with advanced analytics through state-of-the-art platforms - NIQ delivers the Full View™. NIQ is an Advent International portfolio company with operations in 100+ markets, covering more than 90% of the world's population. For more information, visit NIQ.com.

Want to keep up with our latest updates? Follow us on: LinkedIn | Instagram | Twitter | Facebook

Our commitment to Diversity, Equity, and Inclusion
NIQ is committed to reflecting the diversity of the clients, communities, and markets we measure within our own workforce. We exist to count everyone and are on a mission to systematically embed inclusion and diversity into all aspects of our workforce, measurement, and products. We enthusiastically invite candidates who share that mission to join us.

We are proud to be an Equal Opportunity/Affirmative Action Employer, making decisions without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability status, age, marital status, protected veteran status, or any other protected class. Our global non-discrimination policy covers these protected classes in every market in which we do business worldwide. Learn more about how we are driving diversity and inclusion in everything we do by visiting the NIQ News Center: https://nielseniq.com/global/en/news-center/diversity-inclusion

Posted 6 days ago

Apply

8.0 - 12.0 years

30 - 35 Lacs

Gurugram

Work from Office

Source: Naukri

Program management in web app development with the Microsoft tech stack.

Managed below projects:
- Integration with Microsoft - Azure Integration Layer, cross-platform business applications & payment gateways
- Middleware support related to installation and configuration of SSL certificates, application servers, and web servers

Collaboration with COE Teams:
- Leveraging Oracle COE for solutioning, design, code reviews & effort estimation
- Leveraging Salesforce COE for solutioning, design, code reviews & effort estimation
- Leveraging middleware and DevOps COE for supporting version upgrades, middleware support, SSL certificate renewals & back-end support

Resource Management:
- Resource allocation, attrition, and transition management
- Coordinate activities across cross-functional teams, including developers, designers, and QA engineers
- Foster a collaborative and productive team environment
- Coordinating with multiple internal (vendor) teams for technical & commercial proposals
- Facilitate problem-solving and troubleshooting sessions as needed

Managed below type of projects:

Website Development:
- Set up Drupal environments (local, staging, production) along with theme development or customization
- Installation and configuration of CMS modules
- 6+ years of experience in enhancement & operations support in Drupal-based projects
- Exposure to PHP, Python, XHTML, CSS, JavaScript, jQuery, Apache, MySQL, AngularJS, Azure Services, etc.
- Shall be comfortable with Linux and Windows OS

Content Migration:
- Hands-on experience in version upgrades of Drupal along with content migration from existing CMS to alternative tools
- Exposure to PHP frameworks for architectural development
- Experience with CI/CD pipelines (DevOps) and version control systems like Git and Azure TFS

User Roles, Permissions & SEO Optimization:
- Define and implement user roles and permissions
- Implement basic SEO best practices

Integration:
- Integration with Microsoft - Azure Integration Layer, cross-platform business applications & payment gateways
- Middleware support related to installation and configuration of SSL certificates

Testing, Documentation & Support:
- Perform unit and integration testing along with documentation for site architecture, module usage, and maintenance
- Understanding of the ITSM framework, especially Incident and Problem management

Posted 6 days ago

Apply

7.0 - 9.0 years

15 - 18 Lacs

Pune

Work from Office

Source: Naukri

We are looking for a highly skilled Senior Databricks Developer to join our data engineering team. You will be responsible for building scalable and efficient data pipelines using Databricks, Apache Spark, Delta Lake, and cloud-native services (Azure/AWS/GCP). You will work closely with data architects, data scientists, and business stakeholders to deliver high-performance, production-grade solutions.

Key Responsibilities:
- Design, build, and maintain scalable and efficient data pipelines on Databricks using PySpark, Spark SQL, and optionally Scala.
- Work with Databricks components including Workspace, Jobs, DLT (Delta Live Tables), Repos, and Unity Catalog.
- Implement and optimize Delta Lake solutions aligned with Lakehouse and Medallion architecture best practices (see the sketch after this listing).
- Collaborate with data architects, engineers, and business teams to understand requirements and deliver production-grade solutions.
- Integrate CI/CD pipelines using tools such as Azure DevOps, GitHub Actions, or similar for Databricks deployments.
- Ensure data quality, consistency, governance, and security by using tools like Unity Catalog or Azure Purview.
- Use orchestration tools such as Apache Airflow, Azure Data Factory, or Databricks Workflows to schedule and monitor pipelines.
- Apply strong SQL skills and data warehousing concepts in data modeling and transformation logic.
- Communicate effectively with technical and non-technical stakeholders to translate business requirements into technical solutions.

Required Skills and Qualifications:
- Hands-on experience in data engineering, specifically in Databricks.
- Deep expertise in Databricks Workspace, Jobs, DLT, Repos, and Unity Catalog.
- Strong programming skills in PySpark and Spark SQL; Scala experience is a plus.
- Proficient in working with one or more cloud platforms: Azure, AWS, or GCP.
- Experience with Delta Lake, Lakehouse architecture, and medallion architecture patterns.
- Proficient in building CI/CD pipelines for Databricks using DevOps tools.
- Familiarity with orchestration and ETL/ELT tools such as Airflow, ADF, or Databricks Workflows.
- Strong understanding of data governance, metadata management, and lineage tracking.
- Excellent analytical, communication, and stakeholder management skills.
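For the medallion bullet above, a minimal PySpark sketch of bronze/silver/gold Delta tables is shown below, assuming a Databricks runtime (or local Spark with the delta-spark package); the paths and column names are hypothetical.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: raw data landed as-is.
bronze = spark.read.json("/mnt/raw/orders/")   # hypothetical source path
bronze.write.format("delta").mode("append").save("/mnt/bronze/orders")

# Silver: deduplicated and typed.
silver = (
    spark.read.format("delta").load("/mnt/bronze/orders")
    .dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
)
silver.write.format("delta").mode("overwrite").save("/mnt/silver/orders")

# Gold: business-level aggregate for reporting.
gold = silver.groupBy("customer_id").agg(F.sum("amount").alias("lifetime_value"))
gold.write.format("delta").mode("overwrite").save("/mnt/gold/customer_ltv")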

Posted 6 days ago

Apply

8.0 years

0 Lacs

Navi Mumbai, Maharashtra, India

Remote

Source: LinkedIn

Mizuho Global Services Pvt Ltd (MGS) is a subsidiary company of Mizuho Bank, Ltd, which is one of the largest banks, or so-called 'Mega Banks', of Japan. MGS was established in the year 2020 as part of Mizuho's long-term strategy of creating a captive global processing center for remotely handling banking and IT related operations of Mizuho Bank's domestic and overseas offices and Mizuho's group companies across the globe.

At Mizuho we are committed to a culture that is driven by ethical values and supports diversity in all its forms for its talent pool. The direction of MGS's development is paved by its three key pillars - Mutual Respect, Discipline, and Transparency - which are set as the baseline of every process and operation carried out at MGS.

What's in it for you?
- Immense exposure and learning
- Excellent career growth
- Company of highly passionate leaders and mentors
- Ability to build things from scratch

Know more about MGS: https://www.mizuhogroup.com/asia-pacific/mizuho-global-services

Job Description for IT - Application Support with SQL Knowledge

We are looking for a passionate individual with a computer science/IT Graduate/PG background, having 8+ years of work experience and expertise in application support, hands-on SQL, IIS, and SWIFT knowledge.

Job Location: Navi Mumbai

Other Roles and Responsibilities
· Must accept rotational shifts (24*7); banking working days.
· Application support profile. Knowledge of the banking domain and products.
· IBM MQ Support, JBoss, Apache Tomcat, and Java knowledge is desirable.
· In-depth knowledge of SQL & PL/SQL.
· Well versed with shell scripting, Linux, and Windows platforms.
· Hands-on in SWIFT, SFMS, NEFT/RTGS, Export, Import.
· ITIL framework knowledge.
· Adherence to ISO 9001:2008, ISO 27001, policies & procedures.
· Proven experience troubleshooting security issues across various technologies.
· Customer-centric career experience and excellent time management skills.
· Ability to work within a customer-focused team and excellent communication skills.
· Take ownership of customer issues reported and see problems through to resolution.
· Troubleshoot and resolve issues through sharing best practices and direct resolution.
· Excellent written and verbal communication and effective organizational and multi-tasking skills. Proven ability to quickly learn new technical domains and train others.
· Should be flexible to work in an operational environment, rotational shifts, and an on-call schedule.
· Other general responsibilities as instructed by management.

Qualification: Graduate/MBA

Interested candidates can send their resume to mgs.rec@mizuho-cb.com along with the below details:
- Available for F2F? Y/N
- NP, Can start immediately? Y/N
- Current residential location
- Exp in SWIFT
- Exp in SQL
- Exp in BFSI domain

Address: Mizuho Global Services India Pvt., 11th Floor, Q2 Building, Aurum Q Park, Gen 4/1, TTC, Thane Belapur Road, MIDC Industrial Area, Ghansoli, Navi Mumbai - 400710.

Posted 6 days ago

Apply

5.0 - 7.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Source: Naukri

What you'll do:
- Utilize advanced mathematical, statistical, and analytical expertise to research, collect, analyze, and interpret large datasets from internal and external sources to provide insight and develop data-driven solutions across the company
- Build and test predictive models including but not limited to credit risk, fraud, response, and offer acceptance propensity
- Take responsibility for the development, testing, validation, tracking, and performance enhancement of statistical models and other BI reporting tools, leading to new innovative origination strategies within marketing, sales, finance, and underwriting
- Leverage advanced analytics to develop innovative portfolio surveillance solutions to track and forecast loan losses that influence key business decisions related to pricing optimization, credit policy, and overall profitability strategy
- Use decision science methodologies and advanced data visualization techniques to implement creative automation solutions within the organization
- Initiate and lead analysis to bring actionable insights to all areas of the business including marketing, sales, collections, and credit decisioning
- Develop and refine unit economics models to enable marketing and credit decisions

What you'll need:
- 5 to 8 years of experience in data science or a related role with a focus on Python programming and ML models
- Strong in Python, experience with Jupyter notebooks, and Python packages like polars, pandas, numpy, scikit-learn, matplotlib, etc.
- Experience with the ML lifecycle: data preparation, training, evaluation, and deployment
- Hands-on experience with GCP services for ML & data science
- Experience with Vector Search and Hybrid Search techniques (a retrieval sketch follows this listing)
- Embeddings generation using BERT, Sentence Transformers, or custom models
- Embedding indexing and retrieval (Elastic, FAISS, ScaNN, Annoy)
- Experience with LLMs and use cases like RAG
- Understanding of semantic vs lexical search paradigms
- Experience with Learning to Rank (LTR) and libraries like XGBoost and LightGBM with LTR support
- Proficient in SQL and BigQuery
- Experience with Dataproc clusters for distributed data processing using Apache Spark or PySpark
- Model deployment using Vertex AI, Cloud Run, or Cloud Functions
- Familiarity with BM25 ranking (Elasticsearch or OpenSearch) and vector blending
- Awareness of search relevance evaluation metrics (precision@k, recall, nDCG, MRR)

Life at Next:
At our core, we're driven by the mission of tailoring growth for our customers by enabling them to transform their aspirations into tangible outcomes. We're dedicated to empowering them to shape their futures and achieve ambitious goals. To fulfil this commitment, we foster a culture defined by agility, innovation, and an unwavering commitment to progress. Our organizational framework is both streamlined and vibrant, characterized by a hands-on leadership style that prioritizes results and fosters growth.

Perks of working with us:
- Clear objectives to ensure alignment with our mission, fostering your meaningful contribution
- Abundant opportunities for engagement with customers, product managers, and leadership
- Guidance along progressive paths and insightful feedback from managers through ongoing feedforward sessions
- Cultivate and leverage robust connections within diverse communities of interest
- Choose your mentor to navigate your current endeavors and steer your future trajectory
- Embrace continuous learning and upskilling opportunities through Nexversity
- Enjoy the flexibility to explore various functions, develop new skills, and adapt to emerging technologies
- Embrace a hybrid work model promoting work-life balance
- Access comprehensive family health insurance coverage, prioritizing the well-being of your loved ones
- Embark on accelerated career paths to actualize your professional aspirations

Who we are?
We enable high-growth enterprises to build hyper-personalized solutions that transform their vision into reality. With a keen eye for detail, we apply creativity, embrace new technology, and harness the power of data and AI to co-create solutions tailor-made to meet unique needs for our customers. Join our passionate team and tailor your growth with us!
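For the vector search items above, here is a minimal semantic-retrieval sketch using sentence-transformers and FAISS; the documents and query are hypothetical, and the model name is simply the common small sentence-transformers checkpoint.

import faiss
import numpy as np
from sentence_transformers import SentenceTransformer

docs = [
    "low interest personal loan",
    "credit card cashback offer",
    "home loan refinancing rates",
]
model = SentenceTransformer("all-MiniLM-L6-v2")
emb = model.encode(docs, normalize_embeddings=True)

# Inner product over normalized vectors equals cosine similarity.
index = faiss.IndexFlatIP(emb.shape[1])
index.add(np.asarray(emb, dtype="float32"))

query = model.encode(["cheap loans"], normalize_embeddings=True)
scores, ids = index.search(np.asarray(query, dtype="float32"), 2)  # top-2 hits
print([docs[i] for i in ids[0]], scores[0])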

Posted 6 days ago

Apply

2.0 - 4.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

Data Scientist II
BANGALORE, KARNATAKA / TECH - DATA SCIENCE / FULL TIME EMPLOYEE

About the Team
Our Data Science team is the Avengers to our S.H.I.E.L.D 🛡️. And why not? We are the ones who assemble during the toughest challenges and devise creative solutions, building intelligent systems for millions of our users looking at a thousand different categories of products. We've barely scratched the surface, and have amazing challenges in charting the future of commerce for Bharat. Our typical day involves dealing with fraud detection, inventory optimization, and platform vernacularization. As Data Scientist, you will navigate uncharted territories with us, discovering new paths to creating solutions for our users. 🔍 You will be at the forefront of interesting challenges and solve unique customer problems in an untapped market.

But wait - there's more to us. Our team is huge on having a well-rounded personal and professional life. When we aren't nose-deep in data, you will most likely find us belting "Summer of 69" at the nearest karaoke bar, or debating who the best Spider-Man is: Maguire, Garfield, or Holland? You tell us ☺️

About the Role
Love deep data? Love discussing solutions instead of problems? Then you could be our next Data Scientist. In a nutshell, your primary responsibility will be enhancing the productivity and utilization of the generated data. Other things you will do are:
- Work closely with the business stakeholders
- Transform scattered pieces of information into valuable data
- Share and present your valuable insights with peers

What You Will Do
- Develop models and run experiments to infer insights from hard data
- Improve our product usability and identify new growth opportunities
- Understand reseller preferences to provide them with the most relevant products
- Design discount programs to help our resellers sell more
- Help resellers better recognize end-customer preferences to improve their revenue
- Use data to identify bottlenecks that will help our suppliers meet their SLA requirements
- Model seasonal demand to predict key organizational metrics
- Mentor junior data scientists in the team

What You Will Need
- Bachelor's/Master's degree in computer science (or similar degrees)
- 2-4 years of experience as a Data Scientist in a fast-paced organization, preferably B2C
- Familiarity with Neural Networks, Machine Learning, etc.
- Familiarity with tools like SQL, R, Python, etc.
- Strong understanding of Statistics and Linear Algebra
- Strong understanding of hypothesis/model testing and ability to identify common model testing errors
- Experience designing and running A/B tests and drawing insights from them (a worked example follows this listing)
- Proficiency in machine learning algorithms
- Excellent analytical skills to fetch data from reliable sources to generate accurate insights
- Experience in tech and product teams is a plus

Bonus points for:
- Experience in working on personalization or other ML problems
- Familiarity with Big Data tech stacks like Apache Spark, Hadoop, Redshift

About
It is India's fastest-growing e-commerce company. We started in 2015 with the idea of helping mom & pop stores to sell online. Today, 5% of Indian households shop with us on any given day. We've helped over 15 million individual entrepreneurs start online businesses with zero investment. We're democratizing internet commerce by offering a 0% commission model for sellers on our platform - a first for India. We aim to become the e-commerce destination for Bharat. We're currently valued at $4.9 billion with marquee investors supporting our vision. Some of them include Sequoia Capital, SoftBank, Fidelity, Prosus Ventures, Facebook, and Elevation Capital. We were also featured in Y Combinator's 2021 Top Companies List and were the only Indian startup to make it to Fast Company's The World's 50 Most Innovative Companies in 2020. We ranked 6th in LinkedIn's Top Startups List 2021. Our strongest asset is our people. We have gender-neutral and inclusive policies to promote our people-first culture.

Our Mission
Democratize internet commerce for everyone

Our Vision
Enable 100M small businesses in India to succeed online
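On the A/B testing requirement above: the standard significance check for a conversion-rate experiment is a two-proportion z-test. A minimal sketch with statsmodels follows; the counts are made-up numbers for illustration only.

from statsmodels.stats.proportion import proportions_ztest

# Hypothetical experiment: 1,200 of 10,000 control users converted,
# vs 1,320 of 10,000 users who saw the variant.
conversions = [1200, 1320]
samples = [10000, 10000]

stat, p_value = proportions_ztest(conversions, samples)
print(f"z = {stat:.2f}, p = {p_value:.4f}")  # small p -> difference unlikely by chance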

Posted 6 days ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Experience: 5+ Years

Role Overview:
Responsible for designing, building, and maintaining scalable data pipelines and architectures. This role requires expertise in SQL, ETL frameworks, big data technologies, cloud services, and programming languages to ensure efficient data processing, storage, and integration across systems.

Requirements:
• Minimum 5+ years of experience as a Data Engineer or a similar data-related role.
• Strong proficiency in SQL for querying databases and performing data transformations.
• Experience with data pipeline frameworks (e.g., Apache Airflow, Luigi, or custom-built solutions).
• Proficiency in at least one programming language such as Python, Java, or Scala for data processing tasks.
• Experience with cloud-based data services and data lakes (e.g., Snowflake, Databricks, AWS S3, GCP BigQuery, or Azure Data Lake).
• Familiarity with big data technologies (e.g., Hadoop, Spark, Kafka).
• Experience with ETL tools (e.g., Talend, Apache NiFi, SSIS, etc.) and data integration techniques.
• Knowledge of data warehousing concepts and database design principles.
• Good understanding of NoSQL and big data technologies like MongoDB, Cassandra, Spark, Hadoop, and Hive.
• Experience with data modeling and schema design for OLAP and OLTP systems.
• Familiarity with containerization and orchestration tools (e.g., Docker, Kubernetes).

Educational Qualification: Bachelor's/Master's degree in Computer Science, Information Technology, or a related field.

Posted 6 days ago

Apply

130.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Job Description

Senior Specialist, Data and Analytics Architect

THE OPPORTUNITY
Based in Hyderabad, join a global healthcare biopharma company and be part of a 130-year legacy of success backed by ethical integrity, forward momentum, and an inspiring mission to achieve new milestones in global healthcare. Lead an organization driven by digital technology and data-backed approaches that supports a diversified portfolio of prescription medicines, vaccines, and animal health products. Drive innovation and execution excellence. Be the leaders who have a passion for using data, analytics, and insights to drive decision-making, which will allow us to tackle some of the world's greatest health threats.

Our Technology centers focus on creating a space where teams can come together to deliver business solutions that save and improve lives. An integral part of our company IT operating model, Tech centers are globally distributed locations where each IT division has employees to enable our digital transformation journey and drive business outcomes. These locations, in addition to the other sites, are essential to supporting our business and strategy. A focused group of leaders in each tech center helps to ensure we can manage and improve each location, from investing in the growth, success, and well-being of our people, to making sure colleagues from each IT division feel a sense of belonging, to managing critical emergencies. And together, we must leverage the strength of our team to collaborate globally to optimize connections and share best practices across the Tech Centers.

Role Overview
We are seeking a talented and motivated Technical Architect to join our Data and Analytics Strategy & Architecture team. Reporting to the Lead Architect, this mid-level Technical Architect role is critical in shaping the technical foundation of our cross-product architecture. The ideal candidate will focus on reference architecture, driving proofs of concept (POCs) and points of view (POVs), staying updated on industry trends, solving technical architecture issues, and enabling a robust data observability framework. The role will also emphasize enterprise data marketplaces and data catalogs to ensure data accessibility, governance, and usability. This position will also focus on creating a customer-centric development environment that is resilient and easily adoptable by various user personas. The outcome of the cross-product integration will be improved efficiency and productivity through accelerated provisioning times and a seamless user experience, eliminating the need for interacting with multiple platforms and teams.

What Will You Do In The Role
- Collaborate with product line teams to design and implement cohesive architecture solutions that enable cross-product integration, spanning ingestion, governance, analytics, and visualization.
- Develop, maintain, and advocate for reusable reference architectures that align with organizational goals and industry standards.
- Lead technical POCs and POVs to evaluate new technologies, tools, and methodologies, providing actionable recommendations.
- Diagnose and resolve complex technical architecture issues, ensuring stability, scalability, and performance across platforms.
- Implement and maintain frameworks to monitor data quality, lineage, and reliability across data pipelines.
- Contribute to the design and implementation of an enterprise data marketplace to facilitate self-service data discovery, analytics, and consumption.
- Oversee and extend the use of Collibra or similar tools to enhance metadata management, data governance, and cataloging across the enterprise.
- Monitor emerging industry trends in data and analytics (e.g., AI/ML, data engineering, cloud platforms) and identify opportunities to incorporate them into our ecosystem.
- Work closely with data engineers, data scientists, and other architects to ensure alignment with the enterprise architecture strategy.
- Create and maintain technical documentation, including architecture diagrams, decision records, and POC/POV results.

What Should You Have
- Strong experience with Databricks, Dataiku, Starburst, and related data engineering/analytics platforms.
- Proficiency in AWS cloud platforms and AWS data and analytics technologies.
- Knowledge of modern data architecture patterns like data Lakehouse, data mesh, or data fabric.
- Hands-on experience with Collibra or similar data catalog tools for metadata management and governance.
- Familiarity with data observability tools and frameworks to monitor data quality and reliability.
- Experience contributing to or implementing enterprise data marketplaces, including facilitating self-service data access and analytics.
- Exposure to designing and implementing scalable, distributed architectures.
- Proven experience in diagnosing and resolving technical issues in complex systems.
- Passion for exploring and implementing innovative tools and technologies in data and analytics.
- 3-5 years of total experience in data engineering, analytics, or architecture roles.
- Hands-on experience developing ETL pipelines with DBT, Matillion, and Airflow.
- Experience with data modeling and data visualization tools (e.g., ThoughtSpot, Power BI).
- Strong communication and collaboration skills.
- Ability to work in a fast-paced, cross-functional environment.
- Focus on continuous learning and professional growth.

Preferred Skills
- Certification in Databricks, Dataiku, or a major cloud platform.
- Experience with orchestration tools like Airflow or Prefect.
- Understanding of AI/ML workflows and platforms.
- Exposure to frameworks like Apache Spark or Kubernetes.

Our technology teams operate as business partners, proposing ideas and innovative solutions that enable new organizational capabilities. We collaborate internationally to deliver services and solutions that help everyone be more productive and enable innovation.

Who We Are
We are known as Merck & Co., Inc., Rahway, New Jersey, USA in the United States and Canada and MSD everywhere else. For more than a century, we have been inventing for life, bringing forward medicines and vaccines for many of the world's most challenging diseases. Today, our company continues to be at the forefront of research to deliver innovative health solutions and advance the prevention and treatment of diseases that threaten people and animals around the world.

What We Look For
Imagine getting up in the morning for a job as important as helping to save and improve lives around the world. Here, you have that opportunity. You can put your empathy, creativity, digital mastery, or scientific genius to work in collaboration with a diverse group of colleagues who pursue and bring hope to countless people who are battling some of the most challenging diseases of our time. Our team is constantly evolving, so if you are intellectually curious, join us - and start making your impact today.

Current Employees apply HERE
Current Contingent Workers apply HERE

Search Firm Representatives Please Read Carefully
Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs / resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails.

Employee Status: Regular
Relocation:
VISA Sponsorship:
Travel Requirements:
Flexible Work Arrangements: Hybrid
Shift:
Valid Driving License:
Hazardous Material(s):
Required Skills: Business Enterprise Architecture (BEA), Business Process Modeling, Data Modeling, Emerging Technologies, Requirements Management, Solution Architecture, Stakeholder Relationship Management, Strategic Planning, System Designs
Preferred Skills:
Job Posting End Date: 07/3/2025
A job posting is effective until 11:59:59 PM on the day BEFORE the listed job posting end date. Please ensure you apply to a job posting no later than the day BEFORE the job posting end date.
Requisition ID: R345601

Posted 6 days ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

Transport is at the core of modern society. Imagine using your expertise to shape sustainable transport and infrastructure solutions for the future. If you seek to make a difference on a global scale, working with next-gen technologies and the sharpest collaborative teams, then we could be a perfect match.

Data Analyst - Quality & Customer Satisfaction

Are you an experienced data analyst with a passion and strong desire to use data to identify and drive decisions related to the quality of Volvo trucks?

Who We Are
If you want to be a part of an exciting Quality and Customer Satisfaction (Q&CS) team where you can drive and impact the quality of the Volvo Trucks brand, come join us! You will be engaged within the advanced analytics and machine learning initiatives focused on emerging issue detection, problem definition, and resolution support within our team. This is done in parallel with other data and product quality leaders located across our sites.

Q&CS is part of the Volvo Group Truck Technology team. Our team's mission is to ensure the product quality performance for the complete product line through investigating and prioritizing market product quality issues. Our data team secures the voice of the customer in the problem-solving processes through clear and concise data storytelling. This role supports achieving product quality goals and metrics that directly pertain to customer satisfaction. In this position you will use fundamental analytics approaches as well as advanced analytics techniques to accelerate the field quality issue solving process. This process includes conventional, electric, and fuel cell powertrains in our trucks and coaches. Volvo is a global organization; this position involves collaboration with sites located in Sweden, France, Brazil, and the US.

What You Will Be Doing
You will be part of the Q&CS product quality Bangalore organization. Your creative thinking and problem-solving skills will contribute to identifying and solving complex quality challenges through the areas of data mining and optimization. You will be actively working with quality issue management and ways to increase uptime for our customers.

The Q&CS Data Analyst will be responsible for:
- All steps of an analytics assignment, ranging from fundamental to advanced statistical techniques, in the areas of data & business value exploration, data structuring, modelling, problem solving & recommendation within the Quality team.
- Developing and implementing methodologies to assist Product Quality Leaders in emerging issue detection, forecasting, and predicting quality and uptime performance (a small anomaly-detection sketch follows this listing).
- Working with various data sources, including but not limited to warranty, logged vehicle, customer, manufacturing, and high-resolution truck data.
- Delivering all steps independently while also establishing a network, working closely with colleagues in Europe, India, and Brazil on various initiatives.
- Collaborating closely with other Data Analysts, Data Engineers, Data Architects, and Data Scientists during the implementation and deployment of advanced analytics solutions.

Who You Are
You are a person with a passion for data-driven decisions to solve important technical issues related to quality. A team player with eagerness to try and develop through collaboration, co-operation, and sharing insights, solutions, and ideas.

Position Requirements
- Minimum bachelor's degree in Engineering, Computer Science, Mathematics, Statistics, or equivalent required; graduate degree preferred.
- Experience with problem solving in a quality organization that is technically focused.
- Very skilled in Power BI.
- Demonstrated experience with relevant technical tools such as SQL, Python, R, Azure Analytics Cloud, Jupyter Notebook, and Apache Spark.
- Knowledge of unsupervised and supervised machine learning techniques (algorithms for classification, regression, clustering, or anomaly detection, including but not limited to K-means, Random Forest, etc.).
- Excellent communication skills in English.

To be the perfect match for this role, your critical competencies will include:
- Good energy, a positive attitude, and a thriving creative mindset.
- A real problem solver, confident in making data-driven decisions.
- Excellent written and verbal communication skills for coordinating across teams.
- A drive to learn and master new technologies and techniques.
- A self-starter with the ability to prioritize critical tasks, as well as deliver on assignments within deadlines.

Preferred qualifications (or willingness to learn!)
- Exposure to software development practices, including version control, unit testing, and code review.
- A process-oriented mindset with a continuous improvement focus to drive the DA product.

Ready to join our team and shape tomorrow's society together with us? We value your data privacy and therefore do not accept applications via mail.

Who We Are And What We Believe In
We are committed to shaping the future landscape of efficient, safe, and sustainable transport solutions. Fulfilling our mission creates countless career opportunities for talents across the group's leading brands and entities. Applying to this job offers you the opportunity to join Volvo Group. Every day, you will be working with some of the sharpest and most creative brains in our field to be able to leave our society in better shape for the next generation. We are passionate about what we do, and we thrive on teamwork. We are almost 100,000 people united around the world by a culture of care, inclusiveness, and empowerment.

Group Trucks Technology is seeking talents to help design sustainable transportation solutions for the future. As part of our team, you'll help us by engineering exciting next-gen technologies and contribute to projects that determine new, sustainable solutions. Bring your love of developing systems, working collaboratively, and your advanced skills to a place where you can make an impact. Join our design shift that leaves society in good shape for the next generation.
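For the emerging-issue detection bullet, one common approach is unsupervised anomaly detection over vehicle features. A minimal scikit-learn sketch follows; the synthetic two-feature data stands in for real warranty or telematics signals, and IsolationForest is just one of several suitable algorithms.

import numpy as np
from sklearn.ensemble import IsolationForest

# Synthetic stand-in for logged-vehicle features (e.g. engine temperature,
# fault-code count); real data would come from warranty/telematics sources.
rng = np.random.default_rng(0)
normal = rng.normal(loc=[90.0, 2.0], scale=[5.0, 1.0], size=(500, 2))
outliers = rng.normal(loc=[130.0, 10.0], scale=[5.0, 1.0], size=(5, 2))
X = np.vstack([normal, outliers])

model = IsolationForest(contamination=0.01, random_state=0).fit(X)
labels = model.predict(X)  # -1 = flagged as anomalous, 1 = normal
print(f"{(labels == -1).sum()} vehicles flagged for review")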

Posted 6 days ago

Apply

2.0 - 4.0 years

0 Lacs

Thiruvananthapuram, Kerala, India

On-site

Source: LinkedIn

Location: Trivandrum About us At Arbor, we're on a mission to transform the way schools work for the better. We believe in a future of work in schools where being challenged doesn't mean being burnt out and overworked. Where data guides progress without overwhelming staff. And where everyone working in a school is reminded why they got into education every day. Our MIS and school management tools are already making a difference in over 7,000 schools and trusts. Giving time and power back to staff, turning data into clear, actionable insights, and supporting happier working days. At the heart of our brand is a recognition that the challenges schools face today aren't just about efficiency, outputs and productivity - but about creating happier working lives for the people who drive education everyday: the staff. We want to make schools more joyful places to work, as well as learn. About the role We're seeking a PHP Backend Developer (Platform) with 2-4 years of hands-on experience in developing and maintaining scalable backend systems. The ideal candidate is well-versed in PHP and modern frameworks such as Symfony /Laravel , with a solid understanding of OOPs, writing unit test cases, RESTful APIs, MySQL database management, and performance optimization techniques. You'll work closely with product managers, and engineering teams to deliver reliable, high-quality features. Familiarity with cloud platforms like AWS is a strong advantage. A strong emphasis on clean, maintainable code and the ability to troubleshoot production issues is essential. We value a passion for continuous learning and a collaborative approach to cross-functional teamwork. Core responsibilities Develop core platform components to aid reusability and stability of the system Work with Head of Platform Engineering/SRE to identify and progress platform improvements related to stability, scalability, and performance Work with the QA automation framework to ensure functionality is delivered to a high quality Work with DevOps Engineers to understand application impacts and system performance and stability, and work with engineering teams to rectify Assist in incident response and resolution, and subsequent post-mortems and retrospectives Contribute to the platform code base and framework which is used by Product Engineers across Engineering Requirements About you Experience of PHP at scale through frameworks such as Symfony /Laravel Experience of distributed cloud systems, and specifically Amazon Web Services Enterprise Software design patterns and their implementation in real-world enterprise systems Experience of message queuing and/or streaming systems such as SQS, ActiveMQ, Apache Kafka, AWS Kinesis, AWS Firehose Understanding of relational database technologies and their cloud versions (e.g. AWS MySQL Aurora) Experience with DataDog, Prometheus or similar observability tools A positive and proactive attitude to problem solving A team player, willing to muck in and help others when needed, driven personality who asks questions and actively participates in discussions Bonus skills Past experience with enterprise solutions running at scale Familiarity with Scrum methodology or other agile development processes Experience with Docker and containerization Experience with AWS or other Cloud Infrastructure Familiarity with software best practices such as Refactoring, Clean Code, Domain-Driven Design, Test-Driven Development, etc. 
Benefits

Interview process
1. Phone screen
2. 1st stage
3. 2nd stage

We are committed to a fair and comfortable recruitment process, so if you require any reasonable adjustments during your application or interview process, please reach out to a member of the team at careers@arbor-education.com.

What we offer
The chance to work alongside a team of hard-working, passionate people in a role where you'll see the impact of your work every day. We also offer:
● Flexible work environment (3 days work from office)
● Group Term Life Insurance paid out at 3x Annual CTC (Arbor India)
● 32 days holiday (plus Arbor Holidays). This is made up of 25 days annual leave plus 7 extra companywide days given over Easter, Summer & Christmas
● Work time: 9.30 am to 6 pm (8.5 hours only)
● Compensation - 100% fixed salary disbursement and no variable component

Arbor Education is an equal opportunities organisation
Our goal is for Arbor to be a workplace which represents, celebrates and supports people from all backgrounds, and which gives them the tools they need to thrive - whatever their ambitions may be - so we support and promote diversity and equality, and actively encourage applications from people of all backgrounds.

Refer a friend
Know someone else who would be good for this role? You can refer a friend, family member or colleague; if they are offered a role with Arbor, we will say thank you with a voucher valued up to £200! Simply email: careers@arbor-education.com

Posted 6 days ago


0 years

0 Lacs

Pune, Maharashtra, India

On-site


Job description:

Role Purpose
Topics / Areas of discussion:
● WebLogic L3 support / lead experience
● Total experience and relevant experience
● Communication
● Team member or team lead
● WebSphere installation and configuration
● Deployment and patching
● Certificates and security vulnerabilities
● Troubleshooting
● Apache installation and app server integration
● JBoss installation and configuration

The purpose of this role is to provide significant technical expertise in architecture planning and design of the concerned tower (platform, database, middleware, backup etc.) as well as managing its day-to-day operations.

Do
● Provide adequate support in architecture planning, migration & installation for new projects in own tower (platform/database/middleware/backup)
● Lead the structural/architectural design of a platform/middleware/database/backup etc. according to various system requirements to ensure a highly scalable and extensible solution
● Conduct technology capacity planning by reviewing the current and future requirements
● Utilize and leverage the new features of all underlying technologies to ensure smooth functioning of the installed databases and applications/platforms, as applicable
● Strategize & implement disaster recovery plans and create and implement backup and recovery plans
● Manage the day-to-day operations of the tower, troubleshooting any issues, conducting root cause analysis (RCA) and developing fixes to avoid similar issues
● Plan for and manage upgrades, migration, maintenance, backup, installation and configuration functions for own tower
● Review the technical performance of own tower and deploy ways to improve efficiency, fine-tune performance and reduce performance challenges
● Develop a shift roster for the team to ensure no disruption in the tower
● Create and update SOPs, Data Responsibility Matrices, operations manuals, daily test plans, data architecture guidance etc.
● Provide weekly status reports to the client leadership team and internal stakeholders on database activities w.r.t. progress, updates, status, and next steps
● Leverage technology to develop a Service Improvement Plan (SIP) through automation and other initiatives for higher efficiency and effectiveness

Team Management
Resourcing
● Forecast talent requirements as per the current and future business needs
● Hire adequate and right resources for the team
● Train direct reportees to make right recruitment and selection decisions
Talent Management
● Ensure 100% compliance to Wipro's standards of adequate onboarding and training for team members to enhance capability & effectiveness
● Build an internal talent pool of HiPos and ensure their career progression within the organization
● Promote diversity in leadership positions
Performance Management
● Set goals for direct reportees, conduct timely performance reviews and appraisals, and give constructive feedback to direct reports
● Ensure that organizational programs like Performance Nxt are well understood and that the team is taking the opportunities presented by such programs, for themselves and the levels below them
Employee Satisfaction and Engagement
● Lead and drive engagement initiatives for the team
● Track team satisfaction scores and identify initiatives to build engagement within the team
● Proactively challenge the team with larger and enriching projects/initiatives for the organization or team
● Exercise employee recognition and appreciation

Stakeholder Interaction
Internal
● Technology Solutions Group, BU teams, different infrastructure teams - understanding requirements, planning and status updates, maintenance and backup, issue resolution etc.
● IRMC, QA - guidance on risk mitigation and quality standards
External
● Clients - understanding requirements, planning and status updates, maintenance and backup, issue resolution etc.
● Vendors/manufacturers - development and deployment of platforms, applications, databases etc.

Display
Lists the competencies required to perform this role effectively:
Functional Competencies/Skill
● Technical Knowledge - knowledge of own tower (platform, application, database etc.) - Expert
● Domain Knowledge - understanding of the IT industry and its trends - Competent to Expert

Competency Levels
● Foundation - Knowledgeable about the competency requirements. Demonstrates (in parts) frequently with minimal support and guidance.
● Competent - Consistently demonstrates the full range of the competency without guidance. Extends the competency to difficult and unknown situations as well.
● Expert - Applies the competency in all situations and serves as a guide to others as well.
● Master - Coaches others and builds organizational capability in the competency area. Serves as a key resource for that competency and is recognised within the entire organization.

Behavioral Competencies
● Managing Complexity
● Client centricity
● Execution Excellence
● Passion for Results
● Team Management
● Stakeholder Management

Deliver
1. Operations of the tower - measured by SLA adherence, knowledge management, CSAT/customer experience, and identification of risk issues and mitigation plans
2. New projects - measured by timely delivery, avoidance of unauthorised changes, and no formal escalations

Posted 6 days ago


12.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Summary
We are looking for a technically adept Virtualization Engineer specializing in Linux and KVM-based virtualization, and proficient in managing infrastructure using Apache CloudStack as an orchestration layer. The ideal candidate should excel in automation, leveraging Ansible and Python scripting to streamline operations and enhance scalability in virtualized environments. Knowledge of VMware and experience with AWS would be an added advantage.

Job Responsibilities
● Design, deploy, and manage KVM-based virtualization environments as per organizational needs.
● Utilize Apache CloudStack for orchestration, configuration management, and provisioning of virtual resources.
● Develop and maintain automation scripts using Ansible and Python to automate routine tasks such as provisioning, monitoring, and configuration management (see the sketch below).
● Implement Infrastructure-as-Code principles to ensure consistency and repeatability in deployments.
● Monitor virtualization infrastructure performance, identifying bottlenecks and optimizing resource allocation.
● Implement proactive measures to ensure high availability, scalability, and disaster recovery readiness.
● Implement security best practices and compliance measures in virtualized environments.
● Conduct regular security audits and vulnerability assessments, addressing findings promptly.
● Provide technical support and troubleshooting expertise for complex virtualization issues.
● Collaborate with cross-functional teams to resolve escalated incidents and ensure smooth operation of virtualized systems.

Job Qualifications
● Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent practical experience.
● Minimum 12+ years of experience in designing and managing KVM-based virtualization environments.
● Hands-on expertise with Apache CloudStack for orchestration and management of virtualized infrastructure.
● Strong scripting skills with proficiency in Ansible and Python for automation of infrastructure tasks.
● Familiarity with VMware virtualization technologies is a plus.
● Experience deploying and managing virtualized instances on the AWS cloud would be a plus.
● Solid understanding of networking concepts, storage technologies, and virtualization architectures.
● Ability to troubleshoot complex technical issues independently and provide effective solutions.
● Excellent communication skills and ability to collaborate effectively within a team environment.
● Relevant certifications such as RHCE, AWS Certified Solutions Architect, or similar credentials preferred.

More information about NXP in India...
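As a minimal sketch of the kind of Python automation this role describes, the snippet below polls KVM guests through the libvirt Python bindings (pip install libvirt-python). The qemu:///system URI assumes a local libvirtd; adjust the URI for remote hosts.

```python
# List all KVM domains on the local hypervisor and report whether each runs.
import libvirt

conn = libvirt.open("qemu:///system")  # assumes local libvirtd access
for dom in conn.listAllDomains():
    state, _reason = dom.state()
    running = state == libvirt.VIR_DOMAIN_RUNNING
    print(f"{dom.name()}: {'running' if running else 'not running'}")
conn.close()
```
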

Posted 6 days ago


0 years

0 Lacs

Mohali district, India

On-site


We are looking for a Server Admin Intern (DevOps) with an IT background who loves to work with an amazing team and can handle services including servers, operating systems, storage and supporting systems. The Server Admin Intern role is a technical position that requires hard IT skills and knowledge. Server administrators must have a working knowledge of TCP/IP, DNS, DHCP, LAN and WAN to troubleshoot and address server issues. They must know about firewalls, proxies and databases to keep the network and user information secure. Server Admin Interns also have to be good communicators, as part of their job involves helping non-IT users with common hardware and software issues.

Required Skills:
● Knowledge of Amazon Web Services (AWS) and firewalls.
● Knowledge of PHP, Linux and related applications.
● Knowledge of Linux OS, including deployment, configuration & troubleshooting (a small connectivity-check sketch follows this posting).
● Knowledge of configuring and installing Windows & Linux.
● Knowledge of installation, configuration, and maintenance of networking services, equipment, and devices.
● Advanced Linux/Unix system administration skills.
● Basic knowledge of Windows is required.
● Knowledge of Git.
● Knowledge of Docker will be an added advantage.
● Knowledge of Azure, load balancing, and Apache.
● Decent communication skills.

Job Types: Full-time (work from office), permanent; freshers with training in CCNA or Red Hat certification welcome
Pay: β‚Ή12,000.00 - β‚Ή15,000.00 per month
Schedule: Day shift / morning shift
Interview mode is face-to-face only
Work Location: In person (Mohali)
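A minimal sketch of the TCP/IP and DNS troubleshooting basics this posting lists, using only Python's standard library; the target host and ports are examples.

```python
# Resolve a hostname, then probe common service ports for connectivity.
import socket

host = "example.com"  # placeholder target

# DNS: resolve the hostname to an IP address (what nslookup/dig would show)
ip = socket.gethostbyname(host)
print(f"{host} resolves to {ip}")

# TCP: check whether common service ports accept connections
for port in (80, 443):
    try:
        with socket.create_connection((host, port), timeout=3):
            print(f"port {port}: open")
    except OSError as exc:
        print(f"port {port}: unreachable ({exc})")
```
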

Posted 6 days ago


3.0 - 13.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Hi Connections, we are hiring: Java Developer
Location: Anywhere in India
Experience: 3 to 13 years

Requirements:
● Java Full Stack Developer / Java Backend Developer
● Strong knowledge of Java, Spring Boot, Hibernate
● Hands-on experience with REST APIs and microservices
● Database: MongoDB
● Cloud: AWS/Azure/GCP
● Version control experience using Git
● CI/CD pipelines and containerization such as Docker/Kubernetes
● Problem-solving skills

Bonus points for: experience with Apache Kafka, Oracle, and frontend skills for full stack.

Apply now: if you are ready to take the next step in your Java career, send your resume to divya.rghav@nagarro.com. Let's build something great together.

Posted 6 days ago


2.0 - 4.0 years

0 Lacs

Surat, Gujarat, India

On-site


Job Title: DevOps Engineer
Location: Surat (on-site)
Experience: 2-4 years

Job Summary:
We are looking for a DevOps Engineer to help us build functional systems that improve customer experience. DevOps Engineer responsibilities include deploying product updates, identifying production issues, and implementing integrations that meet customer needs. If you have a solid background in software engineering and are familiar with Ruby or Python, we'd like to meet you. Ultimately, you will execute and automate operational processes quickly, accurately, and securely.

Roles & Responsibilities:
● Strong experience with essential DevOps tools and technologies including Kubernetes, Terraform, Azure DevOps, Jenkins, Maven, Git, GitHub, and Docker.
● Hands-on experience in Azure cloud services (a small automation sketch follows this posting), including:
  - Virtual Machines (VMs)
  - Blob Storage
  - Virtual Network (VNet)
  - Load Balancer & Application Gateway
  - Azure Resource Manager (ARM)
  - Azure Key Vault
  - Azure Functions
  - Azure Kubernetes Service (AKS)
  - Azure Monitor, Log Analytics, and Application Insights
  - Azure Container Registry (ACR) and Azure Container Instances (ACI)
  - Azure Active Directory (AAD) and RBAC
● Creative in automating, configuring, and deploying infrastructure and applications across Azure environments and hybrid cloud data centers.
● Build and maintain CI/CD pipelines using Azure DevOps, Jenkins, and scripting for scalable SaaS deployments.
● Develop automation and infrastructure-as-code (IaC) using Terraform, ARM Templates, or Bicep for managing and provisioning cloud resources.
● Expert in managing containerized applications using Docker and orchestrating them via Kubernetes (AKS).
● Proficient in setting up monitoring, logging, and alerting systems using Azure-native tools and integrating with third-party observability stacks.
● Experience implementing auto-scaling, load balancing, and high-availability strategies for cloud-native SaaS applications.
● Configure and maintain CI/CD pipelines and integrate with quality and security tools for automated testing, compliance, and secure deployments.
● Deep knowledge of writing Ansible playbooks and ad hoc commands for automating provisioning and deployment tasks across environments.
● Experience integrating Ansible with Azure DevOps/Jenkins for configuration management and workflow automation.
● Proficient in using Maven and Artifactory for build management and writing pom.xml scripts for Java-based applications.
● Skilled in GitHub repository management, including setting up project-specific access, enforcing code quality standards, and managing pull requests.
● Experience with web and application servers such as Apache Tomcat for deploying and troubleshooting enterprise-grade Java applications.
● Ability to design and maintain scalable, resilient, and secure infrastructure to support rapid growth of SaaS applications.

Qualifications & Requirements:
● Proven experience as a DevOps Engineer, Site Reliability Engineer, or in a similar software engineering role.
● Strong experience working in SaaS environments with a focus on scalability, availability, and performance.
● Proficiency in Python or Ruby for scripting and automation.
● Working knowledge of SQL and database management tools.
● Strong analytical and problem-solving skills with a collaborative and proactive mindset.
● Familiarity with Agile methodologies and ability to work in cross-functional teams.
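A minimal sketch of the Python automation this posting asks for: listing Azure VMs and their power state via the azure-identity and azure-mgmt-compute SDKs. The subscription ID is a placeholder, and credentials are assumed to come from the environment or az login.

```python
# Enumerate all VMs in a subscription and print each VM's power state.
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

credential = DefaultAzureCredential()  # env vars, managed identity, or az login
client = ComputeManagementClient(
    credential, subscription_id="00000000-0000-0000-0000-000000000000"  # placeholder
)

for vm in client.virtual_machines.list_all():
    rg = vm.id.split("/")[4]  # resource group name from the resource ID
    view = client.virtual_machines.instance_view(rg, vm.name)
    # instance_view statuses include codes like "PowerState/running"
    power = next(
        (s.display_status for s in view.statuses if s.code.startswith("PowerState")),
        "unknown",
    )
    print(f"{vm.name} ({rg}): {power}")
```
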

Posted 6 days ago


8.0 years

0 Lacs

India

On-site


Job Description
Do you have what it takes to build the technology of tomorrow? Work with an amazing global team of innovators!

Join the Akamai Cloud Technology Group
The Akamai Cloud Technology Engineering team owns, develops and manages the solutions used by our engineers globally. These solutions help to run one of the largest distributed systems in the world. We collaborate with internal teams to create innovative, powerful, scalable, highly reliable and secure systems at scale.

Partner with the best
As Senior Software Engineer II, you'll design, build, and deploy scalable, reliable, user-friendly solutions on our infrastructure. You'll collaborate with teams across Akamai to gather requirements, architect solutions, implement features, and perform unit testing. You'll also stay updated on emerging technologies and integrate them to enhance innovation and efficiency.

As a Senior Software Engineer II, you will be responsible for:
● Creating new features, or enhancing existing functionality, from design through testing and deployment
● Working in an agile sprint environment to deliver quality software on a regular and consistent basis
● Measuring, maintaining and optimizing distributed system performance and performing bug fixes of existing applications
● Mentoring junior software engineers on technology aspects
● Working with a diverse range of programming languages as needed

Do What You Love
To be successful in this role you will:
● Have 8+ years of relevant experience and a Bachelor's degree in Computer Science or a related field
● Have demonstrated experience in developing full stack applications in Python/Golang and Angular/Bootstrap/ReactJS
● Have exposure to container technologies like Docker and Kubernetes
● Have experience with Django, Celery, Redis, MongoDB, Postgres, Grafana/InfluxDB, Flask, Apache Spark, Jenkins, Selenium, Appium etc. (a small Celery sketch follows this posting)
● Have experience using Git for version control; knowledge about internet technologies like TCP/IP, HTTP, DNS etc.
● Be passionate about solving large-scale, secure distributed systems problems using data structures and complex algorithms

Work in a way that works for you
FlexBase, Akamai's Global Flexible Working Program, is based on the principles that are helping us create the best workplace in the world. When our colleagues said that flexible working was important to them, we listened. We also know flexible working is important to many of the incredible people considering joining Akamai. FlexBase gives 95% of employees the choice to work from their home, their office, or both (in the country advertised). This permanent workplace flexibility program is consistent and fair globally, to help us find incredible talent, virtually anywhere. We are happy to discuss working options for this role and encourage you to speak with your recruiter in more detail when you apply.

Learn what makes Akamai a great place to work
Connect with us on social and see what life at Akamai is like! We power and protect life online, by solving the toughest challenges, together. At Akamai, we're curious, innovative, collaborative and tenacious. We celebrate diversity of thought and we hold an unwavering belief that we can make a meaningful difference. Our teams use their global perspectives to put customers at the forefront of everything they do, so if you are people-centric, you'll thrive here.

Working for you
Benefits
At Akamai, we will provide you with opportunities to grow, flourish, and achieve great things.
Our benefit options are designed to meet your individual needs for today and in the future. We provide benefits surrounding all aspects of your life:
● Your health
● Your finances
● Your family
● Your time at work
● Your time pursuing other endeavors
Our benefit plan options are designed to meet your individual needs and budget, both today and in the future.

About Us
Akamai powers and protects life online. Leading companies worldwide choose Akamai to build, deliver, and secure their digital experiences, helping billions of people live, work, and play every day. With the world's most distributed compute platform, from cloud to edge, we make it easy for customers to develop and run applications, while we keep experiences closer to users and threats farther away.

Join us
Are you seeking an opportunity to make a real difference in a company with a global reach and exciting services and clients? Come join us and grow with a team of people who will energize and inspire you!

Akamai Technologies is an Affirmative Action, Equal Opportunity Employer that values the strength that diversity brings to the workplace. All qualified applicants will receive consideration for employment and will not be discriminated against on the basis of gender, gender identity, sexual orientation, race/ethnicity, protected veteran status, disability, or other protected group status.
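A minimal sketch of the Celery + Redis stack this posting names: a task queue where web code enqueues work and a separate worker process executes it. The broker/backend URLs assume a local Redis, and the module name (tasks.py) is illustrative.

```python
# tasks.py - define a Celery app backed by Redis and one example task.
from celery import Celery

app = Celery(
    "tasks",
    broker="redis://localhost:6379/0",   # where tasks are queued
    backend="redis://localhost:6379/1",  # where results are stored
)

@app.task
def add(x, y):
    # Runs in a worker started with: celery -A tasks worker --loglevel=info
    return x + y

if __name__ == "__main__":
    # Enqueue asynchronously, then block briefly for the result
    result = add.delay(2, 3)
    print(result.get(timeout=10))  # -> 5
```
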

Posted 6 days ago


4.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site


This role is for one of Weekday's clients
Min Experience: 4 years
Location: Ahmedabad
Job Type: Full-time

We are seeking a highly skilled Senior Database Administrator with 5-8 years of experience in data engineering and database management. The ideal candidate will have a strong foundation in data architecture, modeling, and pipeline orchestration. Hands-on experience with modern database technologies and exposure to generative AI tools in production environments will be a significant advantage. This role involves leading efforts to streamline data workflows, improve automation, and deliver high-impact insights across the organization.

Requirements

Key Responsibilities:
● Design, develop, and manage scalable and efficient data pipelines (ETL/ELT) across multiple database systems (a small orchestration sketch follows this posting).
● Architect and maintain high-availability, secure, and scalable data storage solutions.
● Utilize generative AI tools to automate data workflows and enhance system capabilities.
● Collaborate with engineering, analytics, and data science teams to fulfill data requirements and optimize data delivery.
● Implement and monitor data quality standards, governance practices, and compliance protocols.
● Document data architectures, systems, and processes for transparency and maintainability.
● Apply data modeling best practices to support optimal storage and querying performance.
● Continuously research and integrate emerging technologies to advance the data infrastructure.

Qualifications:
● Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
● 5-8 years of experience in database administration and data engineering for large-scale systems.
● Proven experience in designing and managing relational and non-relational databases.

Mandatory Skills:
● SQL - proficient in advanced queries, performance tuning, and database management.
● NoSQL - experience with at least one NoSQL database such as MongoDB, Cassandra, or CosmosDB.
● Hands-on experience with at least one of the following cloud data warehouses: Snowflake, Redshift, BigQuery, or Microsoft Fabric.
● Cloud expertise - strong experience with Azure and its data services.
● Working knowledge of Python for scripting and data processing (e.g., Pandas, PySpark).
● Experience with ETL tools such as Apache Airflow, Microsoft Fabric, Informatica, or Talend.
● Familiarity with generative AI tools and their integration into data pipelines.

Preferred Skills & Competencies:
● Deep understanding of database performance, tuning, backup, recovery, and security.
● Strong knowledge of data governance, data quality management, and metadata handling.
● Experience with Git or other version control systems.
● Familiarity with AI/ML-driven data solutions is a plus.
● Excellent problem-solving skills and the ability to resolve complex database issues.
● Strong communication skills to collaborate with cross-functional teams and stakeholders.
● Demonstrated ability to manage projects and mentor junior team members.
● Passion for staying updated with the latest trends and best practices in database and data engineering technologies.
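A minimal sketch of the Airflow orchestration this posting lists: a daily two-step ETL DAG. The DAG id, schedule, and task bodies are illustrative (schedule= is the Airflow 2.4+ spelling; older versions use schedule_interval=).

```python
# A two-task DAG: extract runs first, then load, once per day.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull rows from the source database")

def load():
    print("write transformed rows to the warehouse")

with DAG(
    dag_id="example_daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # run extract before load
```
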

Posted 6 days ago


0 years

0 Lacs

Kolkata, West Bengal, India

On-site


Roles and Responsibilities:
● Data Pipeline Development: design, develop, and maintain scalable data pipelines to support ETL (Extract, Transform, Load) processes using tools like Apache Airflow, AWS Glue, or similar.
● Database Management: design, optimize, and manage relational and NoSQL databases (such as MySQL, PostgreSQL, MongoDB, or Cassandra) to ensure high performance and scalability.
● SQL Development: write advanced SQL queries, stored procedures, and functions to extract, transform, and analyze large datasets efficiently (a small example follows this posting).
● Cloud Integration: implement and manage data solutions on cloud platforms such as AWS, Azure, or Google Cloud, utilizing services like Redshift, BigQuery, or Snowflake.
● Data Warehousing: contribute to the design and maintenance of data warehouses and data lakes to support analytics and BI requirements.
● Programming and Automation: develop scripts and applications in Python or other programming languages to automate data processing tasks.
● Data Governance: implement data quality checks, monitoring, and governance policies to ensure data accuracy, consistency, and security.
● Collaboration: work closely with data scientists, analysts, and business stakeholders to understand data needs and translate them into technical solutions.
● Performance Optimization: identify and resolve performance bottlenecks in data systems and optimize data storage and retrieval.
● Documentation: maintain comprehensive documentation for data processes, pipelines, and infrastructure.
● Stay Current: keep up-to-date with the latest trends and advancements in data engineering, big data technologies, and cloud services.

Required Skills and Qualifications:
● Education: Bachelor's or Master's degree in Computer Science, Information Technology, Data Engineering, or a related field.
● Technical Skills:
  - Proficiency in SQL and relational databases (PostgreSQL, MySQL, etc.).
  - Experience with NoSQL databases (MongoDB, Cassandra, etc.).
  - Strong programming skills in Python; familiarity with Java or Scala is a plus.
  - Experience with data pipeline tools (Apache Airflow, Luigi, or similar).
  - Expertise in cloud platforms (AWS, Azure, or Google Cloud) and data services (Redshift, BigQuery, Snowflake).
  - Knowledge of big data tools like Apache Spark, Hadoop, or Kafka is a plus.
● Data Modeling: experience in designing and maintaining data models for relational and non-relational databases.
● Analytical Skills: strong analytical and problem-solving abilities with a focus on performance optimization and scalability.
● Soft Skills: excellent verbal and written communication skills to convey technical concepts to non-technical stakeholders; ability to work collaboratively in cross-functional teams.
● Certifications (Preferred): AWS Certified Data Analytics, Google Professional Data Engineer, or similar.
● Mindset: eagerness to learn new technologies and adapt quickly in a fast-paced environment.
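A small, self-contained illustration of the SQL development bullet above: a window-function query run from Python's built-in sqlite3 module (window functions require SQLite 3.25+, which modern Python builds bundle). The table and data are made up for the example.

```python
# Rank each customer's orders by amount using a SQL window function.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER, customer TEXT, amount REAL);
    INSERT INTO orders VALUES (1, 'a', 10.0), (2, 'a', 25.0), (3, 'b', 7.5);
""")

rows = conn.execute("""
    SELECT customer, amount,
           RANK() OVER (PARTITION BY customer ORDER BY amount DESC) AS rnk
    FROM orders
""").fetchall()
for customer, amount, rnk in rows:
    print(customer, amount, rnk)
conn.close()
```
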

Posted 6 days ago


5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


As a Software Developer you will work in a constantly evolving environment, due to technological advances and the strategic direction of the organization you work for. You will create, maintain, audit, and improve systems to meet particular needs, often as advised by a systems analyst or architect, testing both hardware and software systems to diagnose and resolve system faults. The role also covers writing diagnostic programs and designing and writing code for operating systems and software to ensure efficiency. When required, you will make recommendations for future developments.

Benefits of Joining Us
● Challenging Projects: work on cutting-edge projects and solve complex technical problems.
● Career Growth: advance your career quickly and take on leadership roles.
● Mentorship: learn from experienced mentors and industry experts.
● Global Opportunities: work with clients from around the world and gain international experience.
● Competitive Compensation: receive attractive compensation packages and benefits.
If you're passionate about technology and want to work on challenging projects with a talented team, becoming an Infosys Power Programmer could be a great career choice.

Mandatory Skills
● AWS Glue, AWS Redshift/Spectrum, S3, API Gateway, Athena, Step and Lambda functions
● Experience in Extract, Transform, Load (ETL) and Extract, Load & Transform (ELT) data integration patterns
● Experience in designing and building data pipelines
● Development experience in one or more object-oriented programming languages, preferably Python

Job Specs
● 5+ years of in-depth, hands-on experience in developing, testing, deploying and debugging Spark jobs using Scala on the Hadoop platform (a PySpark sketch follows this posting)
● In-depth knowledge of Spark Core, working with RDDs, and Spark SQL
● In-depth knowledge of Spark optimization techniques and best practices
● Good knowledge of Scala functional programming: Try, Option, Future, Collections
● Good knowledge of Scala OOP: classes, traits and objects (singleton and companion), case classes
● Good understanding of Scala language features: type system, implicits/givens
● Hands-on experience working in a Hadoop environment (HDFS/Hive), AWS S3, EMR
● Python programming skills
● Working experience with workflow orchestration tools like Airflow and Oozie
● Working with API calls in Scala
● Understanding of and exposure to file formats such as Apache Avro, Parquet, and JSON
● Good to have: knowledge of Protocol Buffers and geospatial data analytics
● Writing test cases using frameworks such as ScalaTest
● Good knowledge of build tools such as Gradle & SBT
● Experience using Git, resolving conflicts, and working with branches
● Good to have: experience with a workflow system such as Airflow
● Strong programming skills using data structures and algorithms
● Excellent analytical skills
● Good communication skills

Qualification
● 7-10 years in the industry
● BE/B.Tech in CS or equivalent
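A minimal sketch of the Spark work this role centers on. The posting is Scala-first, but since it also asks for Python skills this uses PySpark; the input path and column names are placeholders.

```python
# Read a columnar dataset and compute a daily event count with Spark SQL functions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-job").getOrCreate()

df = spark.read.parquet("s3://example-bucket/events/")  # hypothetical path
daily = (
    df.groupBy(F.to_date("event_ts").alias("day"))  # assumes an event_ts column
      .agg(F.count("*").alias("events"))
      .orderBy("day")
)
daily.show()
spark.stop()
```
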

Posted 6 days ago


Exploring Apache Jobs in India

Apache refers to the Apache Software Foundation, which maintains a wide range of widely used open-source software projects, from the Apache HTTP Server to big-data tools such as Hadoop, Spark, and Kafka. In India, the demand for professionals with expertise in Apache tools and technologies is on the rise. Job seekers looking to pursue a career in Apache-related roles have a plethora of opportunities in various industries. Let's delve into the Apache job market in India to gain a better understanding of the landscape.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Chennai
  5. Mumbai

These cities are known for their thriving IT sectors and see a high demand for Apache professionals across different organizations.

Average Salary Range

The salary range for Apache professionals in India varies based on experience and skill level:

  • Entry-level: INR 3-5 lakhs per annum
  • Mid-level: INR 6-10 lakhs per annum
  • Experienced: INR 12-20 lakhs per annum

Career Path

In the Apache job market in India, a typical career path may progress as follows:

  1. Junior Developer
  2. Developer
  3. Senior Developer
  4. Tech Lead
  5. Architect

Related Skills

Besides expertise in Apache tools and technologies, professionals in this field are often expected to have skills in:

  • Linux
  • Networking
  • Database Management
  • Cloud Computing

Interview Questions

  • What is Apache HTTP Server and how does it differ from Apache Tomcat? (medium)
  • Explain the difference between Apache Hadoop and Apache Spark. (medium)
  • What is mod_rewrite in Apache and how is it used? (medium)
  • How do you troubleshoot common Apache server errors? (medium)
  • What is the purpose of .htaccess file in Apache? (basic)
  • Explain the role of Apache Kafka in real-time data processing. (medium) (see the sketch after this list)
  • How do you secure an Apache web server? (medium)
  • What is the significance of Apache Maven in software development? (basic)
  • Explain the concept of virtual hosts in Apache. (basic)
  • How do you optimize Apache web server performance? (medium)
  • Describe the functionality of Apache Solr. (medium)
  • What is the purpose of Apache Camel? (medium)
  • How do you monitor Apache server logs? (medium)
  • Explain the role of Apache ZooKeeper in distributed applications. (advanced)
  • How do you configure SSL/TLS on an Apache web server? (medium)
  • Discuss the advantages of using Apache Cassandra for data management. (medium)
  • What is the Apache Lucene library used for? (basic)
  • How do you handle high traffic on an Apache server? (medium)
  • Explain the concept of .htpasswd in Apache. (basic)
  • What is the role of Apache Thrift in software development? (advanced)
  • How do you troubleshoot Apache server performance issues? (medium)
  • Discuss the importance of Apache Flume in data ingestion. (medium)
  • What is the significance of Apache Storm in real-time data processing? (medium)
  • How do you deploy applications on Apache Tomcat? (medium)
  • Explain the concept of .htaccess directives in Apache. (basic)
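For the Kafka question above, here is a minimal producer/consumer sketch using the kafka-python package (pip install kafka-python). It assumes a broker on localhost:9092; the topic name and payload are examples.

```python
# Publish one event to a topic, then read events back from the beginning.
from kafka import KafkaConsumer, KafkaProducer

# Producer: publish an event to a topic
producer = KafkaProducer(bootstrap_servers="localhost:9092")
producer.send("page-views", b'{"user": 42, "path": "/jobs"}')
producer.flush()

# Consumer: read events from the earliest offset as they arrive
consumer = KafkaConsumer(
    "page-views",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    consumer_timeout_ms=5000,  # stop iterating if idle for 5 seconds
)
for record in consumer:
    print(record.value)
```

Kafka's role in real-time processing follows from this split: producers append events to a durable, partitioned log, and any number of consumers read that log independently at their own pace.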

Conclusion

As you embark on your journey to explore Apache jobs in India, it is essential to stay updated on the latest trends and technologies in the field. By honing your skills and preparing thoroughly for interviews, you can position yourself as a competitive candidate in the Apache job market. Stay motivated, keep learning, and pursue your dream career with confidence!
