4657 Apache Jobs - Page 39

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

1.0 years

0 - 1 Lacs

Taliparamba

On-site

Source: Glassdoor

HAZERCLOUD™ is a DevOps as a Service company that delivers robust cloud solutions with a focus on automation and simplifying web application development. Our expert team of DevOps engineers enables businesses and developers to focus on delivering what matters without being held back by technology.

Role Description
This is a full-time on-site position for a DevOps Specialist located in Kannur, Kerala. The DevOps Engineer will be in charge of implementing, automating, and maintaining web application deployments on various platforms such as AWS, and will additionally assist with CI/CD automation and scripting.

Requirements:
- Subject matter expert in administering and deploying CI/CD tools such as Git, Jenkins, AWS CodePipeline, etc.
- Expertise in deploying Python Django, Node, and PHP applications.
- Expertise in Linux system administration.
- Specialized in containerisation, EKS, and Kubernetes.
- Experience in CI/CD pipeline design and implementation for Python, Node, and PHP applications.
- Working experience with Apache, Nginx, and MySQL.
- Expertise in troubleshooting and resolving issues in dev, test, and production environments.
- Knowledge of databases including MySQL, MongoDB, Elasticsearch, and DB clusters.
- Jenkins automation server, with plugins, for building CI/CD pipelines.
- Hands-on experience in the AWS environment (EC2, RDS, S3, EBS, ALB, Route 53, VPC, IAM, etc.).
- Solid understanding of DNS, CDN, SSL, and WAF.
- Infrastructure as Code (IaC) skills.
- Critical thinking, time-management, and problem-solving skills.
- Experience with the software development process and continuous integration tools.
- Excellent verbal and written communication skills.
- Ability to work well in a team environment with limited guidance.

Qualifications:
- Minimum 1-5 years of DevOps / Linux system administration experience.
- AWS Certified Solutions Architect / SysOps / CloudOps Associate-level certification is mandatory.
- RHCSA / RHCE certification will be a plus.
- B.Tech / Diploma / Bachelor's degree in Computer Science or a related field.

Job Type: Fresher
Pay: ₹30,000.00 - ₹100,000.00 per month
Schedule: Day shift / UK shift / US shift
Work Location: In person
Speak with the employer: +91 9207670011
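Roles like this revolve around CI/CD automation for web application deployments. As a rough illustration of the kind of scripting involved (not this employer's actual tooling), here is a minimal post-deploy smoke test in Python that a Jenkins pipeline stage could run; the health endpoint URL and retry settings are hypothetical placeholders.

```python
#!/usr/bin/env python3
"""Minimal post-deploy smoke test: fail the CI stage if the app is unhealthy."""
import sys
import time
import urllib.error
import urllib.request

HEALTH_URL = "https://app.example.com/healthz"  # hypothetical health endpoint
RETRIES = 5
DELAY_SECONDS = 10


def check_health(url: str) -> bool:
    """Return True if the endpoint answers HTTP 200 within the timeout."""
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            return resp.status == 200
    except (urllib.error.URLError, TimeoutError):
        return False


def main() -> int:
    for attempt in range(1, RETRIES + 1):
        if check_health(HEALTH_URL):
            print(f"healthy after attempt {attempt}")
            return 0
        print(f"attempt {attempt}/{RETRIES} failed; retrying in {DELAY_SECONDS}s")
        time.sleep(DELAY_SECONDS)
    return 1  # a non-zero exit code fails the pipeline stage


if __name__ == "__main__":
    sys.exit(main())
```

A Jenkins stage would simply invoke this script after deployment and let the exit code gate the rollout.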

Posted 6 days ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

Job Description
- Assemble large, complex sets of data that meet non-functional and functional business requirements.
- Identify, design, and implement internal process improvements, including re-designing infrastructure for greater scalability, optimizing data delivery, and automating manual processes.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from various data sources using Azure, Databricks, and SQL technologies.
- Transform conceptual algorithms from R&D into efficient, production-ready code; the data developer must have a strong mathematical background in order to document and maintain the code.
- Integrate finished models into larger data processes using UNIX scripting (e.g., ksh) and languages such as Python, Spark, and Scala.
- Produce and maintain documentation for released data sets, new programs, shared utilities, or static data, within department standards.
- Ensure quality deliverables to clients by following existing quality processes, manually calculating comparison data, developing statistical pass/fail testing, and visually inspecting data for reasonableness; the requirement is on-time with zero defects.

Qualifications

Education/Training:
- B.E./B.Tech. with a major in Computer Science, BIS, CIS, Electrical Engineering, Operations Research, or another technical field. Course work or experience in Numerical Analysis, Mathematics, or Statistics is a plus.

Hard Skills:
- Proven experience working as a data engineer.
- Highly proficient in the Spark framework (Python and/or Scala).
- Extensive knowledge of data warehousing concepts, strategies, and methodologies.
- Programming experience in Python, SQL, and Scala.
- Direct experience building data pipelines using Apache Spark (preferably in Databricks) and Airflow.
- Hands-on experience designing and delivering solutions using Azure, including Azure Storage, Azure SQL Data Warehouse, and Azure Data Lake.
- Experience with big data technologies (Hadoop).
- Databricks & Azure Big Data Architecture certification would be a plus.
- Team oriented, with the strong collaboration, prioritization, and adaptability skills required.
- Ability to write highly efficient code in terms of performance and memory utilization.
- Basic knowledge of SQL; capable of handling common functions.

Experience:
- Minimum 5-8 years of experience as a data engineer.
- Experience modeling or manipulating large amounts of data is a plus.
- Experience with demographic or retail business is a plus.

Additional Information

Our Benefits:
- Flexible working environment
- Volunteer time off
- LinkedIn Learning
- Employee Assistance Program (EAP)

About NIQ
NIQ is the world’s leading consumer intelligence company, delivering the most complete understanding of consumer buying behavior and revealing new pathways to growth. In 2023, NIQ combined with GfK, bringing together the two industry leaders with unparalleled global reach. With a holistic retail read and the most comprehensive consumer insights—delivered with advanced analytics through state-of-the-art platforms—NIQ delivers the Full View™. NIQ is an Advent International portfolio company with operations in 100+ markets, covering more than 90% of the world’s population. For more information, visit NIQ.com.

Want to keep up with our latest updates?
Follow us on: LinkedIn | Instagram | Twitter | Facebook

Our commitment to Diversity, Equity, and Inclusion
NIQ is committed to reflecting the diversity of the clients, communities, and markets we measure within our own workforce. We exist to count everyone and are on a mission to systematically embed inclusion and diversity into all aspects of our workforce, measurement, and products. We enthusiastically invite candidates who share that mission to join us. We are proud to be an Equal Opportunity/Affirmative Action Employer, making decisions without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability status, age, marital status, protected veteran status, or any other protected class. Our global non-discrimination policy covers these protected classes in every market in which we do business worldwide. Learn more about how we are driving diversity and inclusion in everything we do by visiting the NIQ News Center: https://nielseniq.com/global/en/news-center/diversity-inclusion
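The role above centers on building ETL pipelines with Spark on Databricks and Azure storage. As a loose, minimal sketch of that workflow (not NIQ's actual code), the following PySpark job reads a raw file, cleans it, and writes a Delta table; the paths, column names, and schema are hypothetical, and writing Delta assumes a delta-enabled Spark environment such as Databricks.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# On Databricks a SparkSession already exists as `spark`; this builder is for local runs.
spark = SparkSession.builder.appName("sales_etl").getOrCreate()

# Extract: raw CSV landed in cloud storage (hypothetical path).
raw = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("abfss://landing@example.dfs.core.windows.net/sales/2025/06/")
)

# Transform: basic cleansing and a derived date column.
clean = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("amount") > 0)
       .withColumn("order_date", F.to_date("order_ts"))
)

# Load: write as Delta, partitioned by date (requires delta-enabled Spark).
(
    clean.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .save("abfss://curated@example.dfs.core.windows.net/sales_delta/")
)
```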

Posted 6 days ago

Apply

3.0 years

2 - 4 Lacs

Cochin

On-site

Source: Glassdoor

Linux System Administrator

We are looking for a Linux System Administrator who specializes in migrating websites, domains, and email boxes between Linux-based systems. This role is critical to ensuring the seamless transfer of our clients' online assets while maintaining security, performance, and data integrity.

Basic Qualifications
- Master's/Bachelor's degree in Computer Science or a related field.
- 6 months to 3 years of experience in the related field.

Preferred Skills
- Proven experience as a Linux System Administrator with a focus on website, domain, and email migrations.
- In-depth knowledge of Linux operating systems, including CentOS, Ubuntu, or similar distributions.
- Proficiency in web server technologies such as Apache and Nginx.
- Configure and maintain Linux operating systems across various environments.
- Manage and support Docker containers for application deployment and maintenance.
- Implement and maintain AWS cloud services, including EC2, S3, and RDS.
- Utilize Git for version control, including repository management and branch strategies.
- Implement Continuous Integration/Continuous Deployment (CI/CD) pipelines for automated software releases and deployments.
- Familiarity with DNS management and domain registration processes.
- Experience with email servers and protocols (SMTP, IMAP, POP3).
- Problem-solving skills and the ability to troubleshoot migration-related issues.
- Excellent communication and customer service skills.
- Attention to detail and a commitment to data security and integrity.
- Ensure system security through access controls, backups, and regular updates.

Key Responsibilities:

Website Migration
- Plan and execute the migration of websites, including HTML, PHP, and database-driven sites, between Linux servers.
- Perform site backups, ensuring data integrity during the migration process.
- Troubleshoot and resolve any migration-related issues, such as DNS changes, database connectivity, and permissions.

Domain Migration
- Transfer domain registrations and DNS records between registrars or hosting providers.
- Ensure minimal downtime during domain migration and DNS propagation.
- Collaborate with clients to update DNS settings and ensure domain functionality.

Email Box Migration
- Migrate email accounts, including emails, contacts, and configurations, between email servers (e.g., Postfix, Dovecot).
- Test email functionality post-migration to confirm the successful transfer of data.
- Provide guidance and support to clients during email configuration updates.

Security and Data Integrity
- Implement security best practices during migrations to protect sensitive data.
- Verify data integrity before and after migrations, ensuring no data loss or corruption.
- Monitor server logs and conduct post-migration audits to address any anomalies.

Documentation and Reporting
- Create comprehensive migration plans and document procedures for future reference.
- Maintain accurate records of migration activities, including timelines and issues.
- Provide regular status updates and reports to clients and management.

What do you receive in return?
- Good work-life balance.
- Opportunity to grow and visibility for your work.
- 13th-month salary.
- Attractive employee incentives and bonuses.
- Career learning and development opportunities.
- Pleasant work environment.
- Fun events.

Techrish Solutions is an equal opportunity employer.
All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, pregnancy, age, marital status, disability, or status as a protected veteran.

Job Type: Full-time
Pay: ₹200,000.00 - ₹450,000.00 per year
Schedule: Day shift
Education: Bachelor's (Preferred)
Work Location: In person
Application Deadline: 12/07/2025
Expected Start Date: 12/07/2025
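As a loose illustration of the migration work described above (not this employer's actual procedure), here is a minimal Python sketch that mirrors a site's document root to a new server with rsync over SSH and spot-checks one file's integrity afterwards; the hosts, paths, and user names are hypothetical.

```python
#!/usr/bin/env python3
"""Sketch: mirror a web root to a new host and verify a sample file's checksum."""
import hashlib
import subprocess

SRC_ROOT = "/var/www/example.com/"                           # hypothetical source root
DEST = "deploy@new-host.example.com:/var/www/example.com/"   # hypothetical target


def sync_site() -> None:
    # -a: preserve permissions/ownership/times; -z: compress; --delete: exact mirror.
    subprocess.run(["rsync", "-az", "--delete", "-e", "ssh", SRC_ROOT, DEST], check=True)


def local_sha256(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()


def remote_sha256(remote_path: str) -> str:
    out = subprocess.run(
        ["ssh", "deploy@new-host.example.com", "sha256sum", remote_path],
        check=True, capture_output=True, text=True,
    )
    return out.stdout.split()[0]


if __name__ == "__main__":
    sync_site()
    sample = "/var/www/example.com/index.php"  # hypothetical sample file
    if local_sha256(sample) == remote_sha256(sample):
        print("integrity OK; safe to proceed to DNS cutover")
    else:
        print("MISMATCH: investigate before DNS cutover")
```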

Posted 6 days ago

Apply

9.0 years

0 Lacs

Gurgaon

Remote

Source: Glassdoor

Job description

About this role
Are you interested in building innovative technology that shapes the financial markets? Do you like working at the speed of a startup, and solving some of the world’s most exciting challenges? Do you want to work with, and learn from, hands-on leaders in technology and finance?

At BlackRock, we are looking for Software Engineers who like to innovate and solve sophisticated problems. We recognize that strength comes from diversity, and will embrace your outstanding skills, curiosity, and passion while giving you the opportunity to grow technically and as an individual. We invest and protect over $11.6 trillion (USD) of assets and have an extraordinary responsibility to our clients all over the world. Our technology empowers millions of investors to save for retirement, pay for college, buy a home, and improve their financial well-being. Being a technologist at BlackRock means you get the best of both worlds: working for one of the most sophisticated financial companies and being part of a software development team responsible for next generation technology and solutions.

What are Aladdin and Aladdin Engineering?
You will be working on BlackRock's investment operating system, Aladdin. Aladdin is used both internally within BlackRock and externally by many financial institutions. Aladdin combines sophisticated risk analytics with comprehensive portfolio management, trading, and operations tools on a single platform to power informed decision-making and create a connective tissue for thousands of users investing worldwide. Our development teams reside inside the Aladdin Engineering group. We collaboratively build the next generation of technology that changes the way information, people, and technology intersect for global investment firms. We build and package tools that manage trillions in assets and support millions of financial instruments. We perform risk calculations and process millions of transactions for thousands of users every day worldwide!

Being a member of Aladdin Engineering, you will be:
- Tenacious: Work in a fast-paced and highly complex environment.
- Creative thinker: Analyse multiple solutions and deploy technologies in a flexible way.
- Great teammate: Think and work collaboratively and communicate effectively.
- Fast learner: Pick up new concepts and apply them quickly.

Responsibilities include:
- Collaborate with team members in a multi-office, multi-country environment.
- Deliver high-efficiency, high-availability, concurrent, and fault-tolerant software systems.
- Significantly contribute to the development of Aladdin’s global, multi-asset trading platform.
- Work with product management and business users to define the roadmap for the product.
- Design and develop innovative solutions to complex problems, identifying issues and roadblocks.
- Apply validated quality software engineering practices through all phases of development.
- Ensure resilience and stability through quality code reviews; unit, regression, and user acceptance testing; DevOps; and level-two production support.
- Be a leader with vision and a partner in brainstorming solutions for team productivity and efficiency, guiding and motivating others.
- Drive a strong culture by bringing principles of inclusion and diversity to the team and setting the tone through specific recruiting and management actions and employee engagement.
- Lead individual projects' priorities, deadlines, and deliverables using Agile methodologies.

Qualifications:
- B.E./B.Tech./MCA or another relevant engineering degree from a reputed university.
- 9+ years of proven experience.

Skills and Experience:
- A proven foundation in core Java and related technologies, with OO skills and design patterns.
- Hands-on experience designing and writing code with object-oriented programming knowledge in Java, Spring, TypeScript, JavaScript, microservices, Angular, and React.
- Strong knowledge of the open-source technology stack (Spring, Hibernate, Maven, JUnit, etc.).
- Exposure to building microservices and APIs, ideally with REST, Kafka, or gRPC.
- Experience with relational databases and/or NoSQL databases (e.g., Apache Cassandra).
- Exposure to high-scale distributed technology like Kafka, Mongo, Ignite, and Redis.
- Track record of building high-quality software with design-focused and test-driven approaches.
- Great analytical, problem-solving, and communication skills.
- Some experience or a real interest in finance, investment processes, and/or an ability to translate business problems into technical solutions.
- Experience leading development teams or projects, or being responsible for the design and technical quality of a significant application, system, or component.
- Ability to form positive relationships with partnering teams, sponsors, and user groups.

Nice to have and opportunities to learn:
- Experience working in an agile development team or on open-source development projects.
- Experience with optimization, algorithms, or related quantitative processes.
- Experience with cloud platforms like Microsoft Azure, AWS, or Google Cloud.
- Experience with DevOps and tools like Azure DevOps.
- Experience with AI-related projects/products or experience working in an AI research environment.
- A degree, certifications, or an open-source track record that shows mastery of software engineering principles.

Our benefits
To help you stay energized, engaged and inspired, we offer a wide range of benefits including a strong retirement plan, tuition reimbursement, comprehensive healthcare, support for working parents and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about.

Our hybrid work model
BlackRock’s hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person, aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock.

About BlackRock
At BlackRock, we are all connected by one mission: to help more and more people experience financial well-being. Our clients, and the people they serve, are saving for retirement, paying for their children’s educations, buying homes and starting businesses. Their investments also help to strengthen the global economy: support businesses small and large; finance infrastructure projects that connect and power cities; and facilitate innovations that drive progress. This mission would not be possible without our smartest investment, the one we make in our employees. It’s why we’re dedicated to creating an environment where our colleagues feel welcomed, valued and supported with networks, benefits and development opportunities to help them thrive.

For additional information on BlackRock, please visit @blackrock | Twitter: @blackrock | LinkedIn: www.linkedin.com/company/blackrock

BlackRock is proud to be an Equal Opportunity Employer. We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation and other protected attributes at law.

Job Requisition # R251092

Posted 6 days ago

Apply

6.0 years

2 - 10 Lacs

Gurgaon

On-site

Source: Glassdoor

You Lead the Way. We’ve Got Your Back.

With the right backing, people and businesses have the power to progress in incredible ways. When you join Team Amex, you become part of a global and diverse community of colleagues with an unwavering commitment to back our customers, communities and each other. Here, you’ll learn and grow as we help you create a career journey that’s unique and meaningful to you with benefits, programs, and flexibility that support you personally and professionally. At American Express, you’ll be recognized for your contributions, leadership, and impact—every colleague has the opportunity to share in the company’s success. Together, we’ll win as a team, striving to uphold our company values and powerful backing promise to provide the world’s best customer experience every day. And we’ll do it with the utmost integrity, and in an environment where everyone is seen, heard and feels like they belong. Join Team Amex and let's lead the way together.

About Enterprise Architecture:
Enterprise Architecture is an organization within the Chief Technology Office at American Express and is a key enabler of the company’s technology strategy. The four pillars of Enterprise Architecture are:
- Architecture as Code: owns and operates foundational technologies that are leveraged by engineering teams across the enterprise.
- Architecture as Design: includes the solution and technical design for transformation programs and business-critical projects which need architectural guidance and support.
- Governance: responsible for defining technical standards and developing innovative tools that automate controls to ensure compliance.
- Colleague Enablement: focused on colleague development, recognition, training, and enterprise outreach.

Responsibilities:
- Design and develop scalable, secure, and resilient applications and data pipelines.
- Support regulatory audits by providing architectural guidance and documentation as needed.
- Contribute to enterprise architecture initiatives, domain reviews, and solution architecture.
- Foster innovation by exploring new tools, frameworks, and design methodologies.

Qualifications:
- Preferably a BS or MS degree in computer science, computer engineering, or another technical discipline.
- 6+ years of software engineering experience with strong proficiency in Java and Node.js.
- Experience with Python and workflow orchestration tools like Apache Airflow is highly desirable (see the sketch after this listing).
- Proven experience in designing and implementing distributed systems and APIs.
- Familiarity with cloud platforms (e.g., GCP, AWS) and modern CI/CD pipelines.
- Ability to write clear architectural documentation and present ideas concisely.
- Demonstrated success working collaboratively in a cross-functional, matrixed environment.
- Passion for innovation, problem-solving, and driving technology modernization.
- Experience with microservices architectures and event-driven architecture is preferred.

We back you with benefits that support your holistic well-being so you can be and deliver your best.
This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally:
- Competitive base salaries
- Bonus incentives
- Support for financial well-being and retirement
- Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
- Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need
- Generous paid parental leave policies (depending on your location)
- Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
- Free and confidential counseling support through our Healthy Minds program
- Career development and training opportunities

American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
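The qualifications above call out workflow orchestration with Apache Airflow. For orientation, here is a minimal Airflow 2.x DAG sketch; the DAG id, schedule, and task bodies are hypothetical placeholders, not an American Express pipeline.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract() -> None:
    print("pull records from the source system")  # placeholder task body


def transform() -> None:
    print("clean and enrich the records")  # placeholder task body


# `schedule` is the Airflow 2.4+ spelling; older 2.x versions use `schedule_interval`.
with DAG(
    dag_id="example_daily_pipeline",  # hypothetical name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t1 >> t2  # run transform only after extract succeeds
```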

Posted 6 days ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

Company Description
JMAN Group is a growing technology-enabled management consultancy that empowers organizations to create value through data. Founded in 2010, we are a team of 450+ consultants based in London, UK, and a team of 300+ engineers in Chennai, India. Having delivered multiple projects in the US, we are now opening a new office in New York to help us support and grow our US client base. We approach business problems with the mindset of a management consultancy and the capabilities of a tech company. We work across all sectors, and have in-depth experience in private equity, pharmaceuticals, government departments and high-street chains. Our team is as cutting edge as our work. We pride ourselves on being great to work with: no jargon or corporate-speak, flexible to change and receptive to feedback. We have a huge focus on investing in the training and professional development of our team, to ensure they can deliver high-quality work and shape our journey to becoming a globally recognised brand. The business has grown quickly in the last 3 years with no signs of slowing down.

Technical specifications
- 5+ years of experience in data platform builds.
- Familiarity with multi-cloud data warehousing solutions (Snowflake, Redshift, Databricks, Fabric, AWS Glue, Azure Data Factory, Synapse, Matillion, dbt).
- Proficient in SQL and Apache Spark / Python programming languages.
- Good-to-have skills include data visualization using Power BI, Tableau, or Looker, and familiarity with full-stack technologies.
- Experience with containerization technologies (e.g., Docker, Kubernetes).
- Experience with CI/CD pipelines and DevOps methodologies.
- Ability to work independently, adapt to changing priorities, and learn new technologies quickly.
- Experience implementing or working with data governance frameworks and practices to ensure data integrity and regulatory compliance.
- Knowledge of data quality tools and practices.

Responsibilities
- Design and implement data pipelines using ETL/ELT tools and techniques.
- Configure and manage data storage solutions, including relational databases, data warehouses, and data lakes.
- Develop and implement data quality checks and monitoring processes (a sketch follows this listing).
- Automate data platform deployments and operations using scripting and DevOps tools (e.g., Git, CI/CD pipelines).
- Ensure compliance with data governance and security standards throughout the data platform development process.
- Troubleshoot and resolve data platform issues promptly and effectively.
- Collaborate with the Data Architect to understand data platform requirements and design specifications.
- Assist with data modelling and optimization tasks.
- Work with business stakeholders to translate their needs into technical solutions.
- Document the data platform architecture, processes, and best practices.
- Stay up to date with the latest trends and technologies in full-stack development, data engineering, and DevOps.
- Proactively suggest improvements and innovations for the data platform.

Required Skillset:
- ETL or ELT: AWS Glue / Azure Data Factory / Synapse / Matillion / dbt.
- Data Warehousing: Azure SQL Server / Redshift / BigQuery / Databricks / Snowflake / Fabric (any one - mandatory).
- Data Visualization: Looker, Power BI, Tableau.
- SQL and Apache Spark / Python programming languages.
- Containerization technologies (e.g., Docker, Kubernetes).
- Cloud experience: AWS / Azure / GCP.
- Scripting and DevOps tools (e.g., Git, CI/CD pipelines).
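As promised above, here is a minimal PySpark sketch of the data-quality-check idea from the responsibilities list; the dataset path, column names, and rules are hypothetical examples, not JMAN's actual checks.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq_checks").getOrCreate()

df = spark.read.parquet("/data/curated/orders/")  # hypothetical dataset

# Rule 1: the primary key must be unique.
dupes = df.groupBy("order_id").count().filter(F.col("count") > 1).count()

# Rule 2: critical columns must not be null.
nulls = df.filter(F.col("customer_id").isNull() | F.col("amount").isNull()).count()

# Rule 3: amounts must be non-negative.
negatives = df.filter(F.col("amount") < 0).count()

failures = {"duplicate_keys": dupes, "null_criticals": nulls, "negative_amounts": negatives}
failed = {name: n for name, n in failures.items() if n > 0}
if failed:
    # Failing loudly lets an orchestrator (Airflow, ADF, etc.) halt the pipeline.
    raise ValueError(f"data quality checks failed: {failed}")
print("all data quality checks passed")
```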

Posted 6 days ago

Apply

8.0 years

0 Lacs

India

On-site

Source: LinkedIn

Coursera was launched in 2012 by Andrew Ng and Daphne Koller, with a mission to provide universal access to world-class learning. It is now one of the largest online learning platforms in the world, with 175 million registered learners as of March 31, 2025. Coursera partners with over 350 leading universities and industry leaders to offer a broad catalog of content and credentials, including courses, Specializations, Professional Certificates, and degrees. Coursera’s platform innovations enable instructors to deliver scalable, personalized, and verified learning experiences to their learners. Institutions worldwide rely on Coursera to upskill and reskill their employees, citizens, and students in high-demand fields such as GenAI, data science, technology, and business. Coursera is a Delaware public benefit corporation and a B Corp.

Join us in our mission to create a world where anyone, anywhere can transform their life through access to education. We're seeking talented individuals who share our passion and drive to revolutionize the way the world learns.

At Coursera, we are committed to building a globally diverse team and are thrilled to extend employment opportunities to individuals in any country where we have a legal entity. We require candidates to possess eligible working rights and a compatible timezone overlap with their team to facilitate seamless collaboration. Coursera has a commitment to enabling flexibility and workspace choices for employees. Our interviews and onboarding are entirely virtual, providing a smooth and efficient experience for our candidates. As an employee, we enable you to select your main way of working, whether it's from home, one of our offices or hubs, or a co-working space near you.

About The Role
Coursera is seeking a highly skilled and motivated Senior AI Specialist to join our team. This individual will play a pivotal role in developing and deploying advanced AI solutions that enhance our platform and transform the online learning experience. The ideal candidate has 5-8 years of experience, combining deep technical expertise with strong leadership and collaboration skills. This is a unique opportunity to work on cutting-edge projects in AI/ML, including recommendation systems, predictive analytics, and content optimization. We’re looking for someone who is not only a strong individual contributor but also capable of mentoring others and influencing technical direction across teams.

Key Responsibilities
- Deploy and customize AI/ML solutions using platforms such as Google AI, AWS SageMaker, and other cloud-based tools.
- Design, implement, and optimize models for predictive analytics, semantic parsing, topic modeling, and information extraction.
- Enhance customer journey analytics to identify actionable insights and improve user experience across Coursera’s platform.
- Build and maintain AI pipelines for data ingestion, curation, training, evaluation, and model monitoring.
- Conduct advanced data preprocessing and cleaning to ensure high-quality model inputs.
- Analyze large-scale datasets (e.g., customer reviews, usage logs) to improve recommendation systems and platform features.
- Evaluate and improve the quality of video and audio content using AI-based techniques.
- Collaborate cross-functionally with product, engineering, and data teams to integrate AI solutions into user-facing applications.
- Support and mentor team members in AI/ML best practices and tools.
- Document workflows, architectures, and troubleshooting steps to support long-term scalability and knowledge sharing.
- Stay current with emerging AI/ML trends and technologies, advocating for their adoption where applicable.

Qualifications

Education:
- Bachelor's degree in Computer Science, Machine Learning, or a related technical field (required). Master's or PhD preferred.

Experience:
- 5-8 years of experience in AI/ML development with a strong focus on building production-grade models and pipelines.
- Proven track record in deploying scalable AI solutions using platforms like Google Vertex AI, AWS SageMaker, Microsoft Azure, or Databricks.
- Strong experience with backend integration, API development, and cloud-native services.

Technical Skills:
- Programming: Advanced proficiency in Python (including libraries like TensorFlow, PyTorch, Scikit-learn). Familiarity with Java or similar languages is a plus.
- Data Engineering: Expertise in handling large datasets using PySpark, AWS Glue, Apache Airflow, and S3.
- Databases: Solid experience with both SQL (PostgreSQL, MySQL) and NoSQL (MongoDB, DynamoDB) systems.
- Cloud: Hands-on experience with cloud platforms (AWS, GCP) and tools like Vertex AI, SageMaker, BigQuery, Lambda, etc.

Soft Skills & Leadership Attributes (Senior Engineer Level):
- Technical leadership: Ability to drive end-to-end ownership of AI/ML projects, from design through deployment and monitoring.
- Collaboration: Skilled at working cross-functionally with product managers, engineers, and stakeholders to align on priorities and deliver impactful solutions.
- Mentorship: Experience mentoring junior engineers and fostering a culture of learning and growth within the team.
- Communication: Clear communicator who can explain complex technical concepts to non-technical stakeholders.
- Problem-solving: Proactive in identifying challenges and proposing scalable, maintainable solutions.
- Adaptability: Comfortable working in a fast-paced, evolving environment with changing priorities and goals.

Coursera is an Equal Employment Opportunity Employer and considers all qualified applicants without regard to race, color, religion, sex, sexual orientation, gender identity, age, marital status, national origin, protected veteran status, disability, or any other legally protected class. If you are an individual with a disability and require a reasonable accommodation to complete any part of the application process, please contact us at accommodations@coursera.org. For California candidates, please review our CCPA Applicant Notice. For our global candidates, please review our GDPR Recruitment Notice.
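The role above mentions topic modeling over large text datasets such as customer reviews. As a small, generic illustration (not Coursera's actual stack), here is a TF-IDF plus NMF topic model over a toy corpus using scikit-learn; the documents and topic count are hypothetical.

```python
from sklearn.decomposition import NMF
from sklearn.feature_extraction.text import TfidfVectorizer

# Toy corpus standing in for learner reviews (hypothetical data).
docs = [
    "great course, the instructor explains machine learning clearly",
    "video quality was poor and the audio kept cutting out",
    "loved the hands-on labs and the deep learning assignments",
    "audio issues made several lectures hard to follow",
]

vec = TfidfVectorizer(stop_words="english")
X = vec.fit_transform(docs)

nmf = NMF(n_components=2, random_state=0)  # two latent topics, chosen arbitrarily
nmf.fit(X)

# Print the top terms per topic to inspect what each latent factor captured.
terms = vec.get_feature_names_out()
for topic_idx, weights in enumerate(nmf.components_):
    top = [terms[i] for i in weights.argsort()[::-1][:4]]
    print(f"topic {topic_idx}: {', '.join(top)}")
```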

Posted 6 days ago

Apply

1.0 years

0 - 0 Lacs

Gurgaon

On-site

Source: Glassdoor

We are looking to hire a skilled WordPress developer with PHP experience to design and implement attractive, functional, customized websites for our clients using the WordPress platform. You will be responsible for both back-end and front-end development, including the implementation of WordPress themes and plugins as well as site integration and security updates. You should have in-depth knowledge of front-end programming languages, a good eye for aesthetics, and strong content management skills. You will be responsible for creating attractive, user-friendly websites that perfectly meet the design and functionality specifications of the client.

Job Role:
- Work with development teams and product managers to ideate software solutions.
- Meet with clients to discuss website design and function.
- Design and build the website front-end.
- Create the website architecture.
- Design and manage the website back-end, including database and server integration.
- Generate WordPress themes and plugins.
- Conduct website performance tests.
- Run SEO on websites to improve ranking.
- Troubleshoot content issues.
- Conduct WordPress training with the client.
- Monitor the performance of the live website.

Skills Requirements:
- At least 1 year of proven work experience as a WordPress developer with PHP, WordPress customization, and SEO.
- Experience with front-end technologies including CSS3, JavaScript, HTML5, and jQuery.
- Knowledge of code versioning tools including Git, Mercurial, and SVN.
- Experience working with debugging tools such as Chrome Inspector and Firebug.
- Experience in on-page and off-page SEO and SEO keyword research.
- Good understanding of website architecture and aesthetics.
- Ability to manage projects.
- Excellent communication and teamwork skills.
- Great attention to detail.
- Organizational skills.
- An analytical mind.
- Problem-solving skills.

Good to Have:
- Familiarity with databases (e.g., MySQL, MongoDB), web servers (e.g., Apache), and UI/UX design.
- Certification in the website development field.

Education: Associate's or Bachelor's degree in Computer Science or a similar field

Job Types: Full-time, Permanent
Pay: ₹15,000.00 - ₹25,000.00 per month
Benefits: Leave encashment
Schedule: Day shift
Supplemental Pay: Yearly bonus
Education: Bachelor's (Required)
Experience: total work: 1 year (Required); WordPress with PHP: 1 year (Required)
Work Location: In person

Posted 6 days ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Gurgaon

On-site

Source: Glassdoor

Ahom Technologies Pvt Ltd is looking for Python Developers.

Who we are
AHOM Technologies Private Limited is a specialized web development company based in Gurugram, India. We provide high-quality, professional software services to clients across the globe, with a proven track record of catering to clients in the USA, UK, and Australia as well as India. Our team of experts brings extensive experience in providing top-notch solutions to a diverse clientele, ensuring excellence in every project.

What you’ll be doing
We are seeking an experienced Python Developer with a strong background in Databricks to join our data engineering and analytics team. The ideal candidate will play a key role in building and maintaining scalable data pipelines and analytical platforms using Python and Databricks, with an emphasis on performance and cloud integration.

You will be responsible for:
· Designing, developing, and maintaining scalable Python applications for data processing and analytics.
· Building and managing ETL pipelines using Databricks on Azure/AWS cloud platforms.
· Collaborating with analysts and other developers to understand business requirements and implement data-driven solutions.
· Optimizing and monitoring existing data workflows to improve performance and scalability.
· Writing clean, maintainable, and testable code following industry best practices.
· Participating in code reviews and providing constructive feedback.
· Maintaining documentation and contributing to project planning and reporting.

What skills & experience you’ll bring to us
· Bachelor's degree in Computer Science, Engineering, or a related field.
· Prior experience as a Python Developer or in a similar role, with a strong portfolio showcasing past projects.
· 4-6 years of Python experience.
· Strong proficiency in Python programming.
· Hands-on experience with the Databricks platform (notebooks, Delta Lake, Spark jobs, cluster configuration, etc.).
· Good knowledge of Apache Spark and its Python API (PySpark).
· Experience with cloud platforms (preferably Azure or AWS) and working with Databricks on cloud.
· Familiarity with data pipeline orchestration tools (e.g., Airflow, Azure Data Factory, etc.).
· Strong understanding of database systems (SQL/NoSQL) and data modeling.
· Strong communication skills and the ability to collaborate effectively with cross-functional teams.

Want to apply? Get in touch today. We're always excited to hear from passionate individuals ready to make a difference and join our team. Reach out to us by email: shubhangi.chandani@ahomtech.com and hr@ahomtech.com, and let's start the conversation.

*Only immediate joiners need apply.
*Candidates from Delhi NCR are preferred.

Job Type: Full-time
Pay: ₹600,000.00 - ₹800,000.00 per year
Benefits: Provident Fund
Schedule: Day shift
Application Question(s):
- We want to fill this position urgently. Are you an immediate joiner?
- Do you have hands-on experience with the Databricks platform (notebooks, Delta Lake, Spark jobs, cluster configuration, etc.)?
- Do you have experience with cloud platforms (preferably Azure or AWS) and working with Databricks on cloud?
Work Location: In person
Application Deadline: 15/06/2025
Expected Start Date: 18/06/2025
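Databricks pipelines of the kind described above often need to keep only the latest record per key. A short PySpark sketch of that pattern follows; the table paths and column names are hypothetical.

```python
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("latest_per_key").getOrCreate()

# Hypothetical Delta table of change events (requires a delta-enabled environment).
events = spark.read.format("delta").load("/mnt/curated/customer_events")

# Rank rows per customer by recency, then keep only the newest row.
w = Window.partitionBy("customer_id").orderBy(F.col("updated_at").desc())
latest = (
    events.withColumn("rn", F.row_number().over(w))
          .filter(F.col("rn") == 1)
          .drop("rn")
)

latest.write.format("delta").mode("overwrite").save("/mnt/serving/customer_current")
```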

Posted 6 days ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

Full-time

Job Description
- Assemble large, complex sets of data that meet non-functional and functional business requirements.
- Identify, design, and implement internal process improvements, including re-designing infrastructure for greater scalability, optimizing data delivery, and automating manual processes.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from various data sources using Azure, Databricks, and SQL technologies.
- Transform conceptual algorithms from R&D into efficient, production-ready code; the data developer must have a strong mathematical background in order to document and maintain the code.
- Integrate finished models into larger data processes using UNIX scripting (e.g., ksh) and languages such as Python, Spark, and Scala.
- Produce and maintain documentation for released data sets, new programs, shared utilities, or static data, within department standards.
- Ensure quality deliverables to clients by following existing quality processes, manually calculating comparison data, developing statistical pass/fail testing, and visually inspecting data for reasonableness; the requirement is on-time with zero defects.

Qualifications

Education/Training:
- B.E./B.Tech. with a major in Computer Science, BIS, CIS, Electrical Engineering, Operations Research, or another technical field. Course work or experience in Numerical Analysis, Mathematics, or Statistics is a plus.

Hard Skills:
- Proven experience working as a data engineer.
- Highly proficient in the Spark framework (Python and/or Scala).
- Extensive knowledge of data warehousing concepts, strategies, and methodologies.
- Programming experience in Python, SQL, and Scala.
- Direct experience building data pipelines using Apache Spark (preferably in Databricks) and Airflow.
- Hands-on experience designing and delivering solutions using Azure, including Azure Storage, Azure SQL Data Warehouse, and Azure Data Lake.
- Experience with big data technologies (Hadoop).
- Databricks & Azure Big Data Architecture certification would be a plus.
- Team oriented, with the strong collaboration, prioritization, and adaptability skills required.
- Ability to write highly efficient code in terms of performance and memory utilization.
- Basic knowledge of SQL; capable of handling common functions.

Experience:
- Minimum 5-8 years of experience as a data engineer.
- Experience modeling or manipulating large amounts of data is a plus.
- Experience with demographic or retail business is a plus.

Additional Information

Our Benefits:
- Flexible working environment
- Volunteer time off
- LinkedIn Learning
- Employee Assistance Program (EAP)

About NIQ
NIQ is the world’s leading consumer intelligence company, delivering the most complete understanding of consumer buying behavior and revealing new pathways to growth. In 2023, NIQ combined with GfK, bringing together the two industry leaders with unparalleled global reach. With a holistic retail read and the most comprehensive consumer insights—delivered with advanced analytics through state-of-the-art platforms—NIQ delivers the Full View™. NIQ is an Advent International portfolio company with operations in 100+ markets, covering more than 90% of the world’s population. For more information, visit NIQ.com.

Want to keep up with our latest updates?
Follow us on: LinkedIn | Instagram | Twitter | Facebook

Our commitment to Diversity, Equity, and Inclusion
NIQ is committed to reflecting the diversity of the clients, communities, and markets we measure within our own workforce. We exist to count everyone and are on a mission to systematically embed inclusion and diversity into all aspects of our workforce, measurement, and products. We enthusiastically invite candidates who share that mission to join us. We are proud to be an Equal Opportunity/Affirmative Action Employer, making decisions without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability status, age, marital status, protected veteran status, or any other protected class. Our global non-discrimination policy covers these protected classes in every market in which we do business worldwide. Learn more about how we are driving diversity and inclusion in everything we do by visiting the NIQ News Center: https://nielseniq.com/global/en/news-center/diversity-inclusion

Posted 6 days ago

Apply

0 years

4 - 6 Lacs

India

On-site

Source: Glassdoor

Responsibilities:
- Develop and maintain server-side applications using Node.js.
- Design, implement, and optimize APIs, ensuring seamless integration with third-party services.
- Work with message brokers like RabbitMQ, Kafka, or similar technologies for asynchronous communication.
- Integrate and manage payment gateway systems (e.g., Stripe, PayPal, Razorpay) to facilitate secure transactions.
- Containerize applications using Docker for consistent environments across development, testing, and production.
- Collaborate with front-end developers to ensure smooth integration between front-end and back-end.
- Troubleshoot, optimize, and refactor code for performance, scalability, and maintainability.
- Work on deployment pipelines, automate testing, and assist in continuous integration (CI) and continuous delivery (CD).
- Ensure application security, stability, and performance in production environments.

Required Skills:
- Node.js and experience with frameworks like Express.js.
- Proficiency in JavaScript and asynchronous programming.
- Experience with RESTful APIs and integration of third-party APIs.
- Docker: experience with containerization, Docker images, and managing containerized applications.
- Experience with message brokers like RabbitMQ, Apache Kafka, or similar.
- Integration of payment gateways such as Stripe, PayPal, Razorpay, or other platforms.
- Experience with databases like MongoDB, MySQL, or PostgreSQL.
- Familiarity with front-end technologies such as HTML, CSS, and JavaScript frameworks.
- Understanding of version control systems like Git.
- Experience with unit testing and debugging Node.js applications.

Desired Skills:
- Experience with TypeScript.
- Familiarity with cloud platforms like AWS, Azure, or Google Cloud.
- Knowledge of GraphQL and real-time communication technologies (e.g., WebSockets).
- Familiarity with orchestration tools like Kubernetes.
- Experience with task runners and bundlers such as Gulp, Grunt, or Webpack.
- Basic knowledge of CI/CD pipelines and automated testing.

Job Type: Full-time
Pay: ₹400,000.00 - ₹600,000.00 per year
Schedule: Day shift, Monday to Friday, weekend availability
Work Location: In person
Application Deadline: 30/06/2025
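Payment gateway integrations like those listed above usually involve verifying webhook signatures with a shared secret. The role is Node.js-focused, but the pattern is language-agnostic; here is a minimal sketch in Python using an HMAC, loosely modeled on schemes such as Stripe's (the secret and payload are hypothetical).

```python
import hashlib
import hmac

WEBHOOK_SECRET = b"whsec_example_secret"  # hypothetical shared secret from the gateway


def verify_webhook(payload: bytes, signature_header: str) -> bool:
    """Recompute the HMAC of the raw body and compare in constant time."""
    expected = hmac.new(WEBHOOK_SECRET, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature_header)


# Usage inside a request handler (framework-agnostic demo):
body = b'{"event":"payment.succeeded","amount":499}'
header = hmac.new(WEBHOOK_SECRET, body, hashlib.sha256).hexdigest()  # simulate sender
assert verify_webhook(body, header)
print("signature verified; safe to process the event")
```

The constant-time comparison (`hmac.compare_digest`) matters here: naive string equality can leak timing information to an attacker probing the endpoint.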

Posted 6 days ago

Apply

2.0 years

5 Lacs

Mohali

On-site

Source: Glassdoor

Responsibilities:
● Lead and mentor a group of System Support Engineers.
● Ensure the team's adherence to documented procedures, quality standards, and SLAs for managed systems.
● Coordinate daily operational tasks, including ticket assignments and workload distribution for system support.
● Act as a senior technical resource and escalation point for the nearshore team regarding system-level issues.
● Assist the ScaleSec Sr. TAM with operational reporting, performance analysis, and service improvement initiatives for managed systems.
● Foster a collaborative and high-performing team environment.
● Operate during US business hours to align with Customer support needs for its systems.

Job Activities:
● Perform hands-on advanced technical troubleshooting and resolution for complex incidents related to managed systems such as SQL database systems, Apache Tomcat systems, IIS systems, Tivoli LDAP systems, AWS OS (Linux/Windows), AWS networking, and cloud data backup systems.
● Oversee and participate in change management activities for managed systems, including patching and upgrades.
● Review and approve technical documentation and knowledge base articles related to system support created by the team.
● Monitor team performance and provide regular feedback and coaching.
● Participate in incident reviews and contribute to problem management for system-related issues.
● Collaborate with the Alternate Team Lead to ensure consistent team coverage and leadership for system support.
● Generate and contribute to operational reports for the Sr. TAM concerning managed system performance.

Required Education:
● Bachelor's degree in Computer Science, Information Technology, or a related technical field, or equivalent professional experience.

Required Qualifications and Experience:
● Typically 2+ years of hands-on experience in IT system support, infrastructure management, or a similar technical role.
● Proven experience in a technical lead or senior engineer capacity, guiding or mentoring other engineers in system support.
● Strong technical proficiency in several of the core technologies and systems managed under this SOW:
○ Database system administration and support (e.g., SQL Server).
○ Application server system support (e.g., Apache Tomcat, IIS).
○ Web server system management.
○ LDAP system services (e.g., Tivoli).
○ AWS cloud infrastructure (EC2, VPC, S3, AWS Backup, Linux and Windows OS management).
● Excellent problem-solving, analytical, and critical thinking skills for system-level issues.
● Strong communication and interpersonal skills, with the ability to explain technical system issues clearly.
● Experience with IT service management tools for ticketing and monitoring of systems.

Job Type: Full-time
Pay: From ₹500,000.00 per year
Shift: Night shift
Work Days: Monday to Friday
Work Location: In person
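Day-to-day oversight of AWS-hosted systems like those above often starts with quick inventory and health scripts. A minimal boto3 sketch follows; the region and tag filter are hypothetical, and credentials are assumed to come from the environment or an instance role.

```python
import boto3

# Hypothetical region; credentials come from the environment or an IAM role.
ec2 = boto3.client("ec2", region_name="ap-south-1")

resp = ec2.describe_instances(
    Filters=[{"Name": "tag:ManagedBy", "Values": ["ScaleSec"]}]  # hypothetical tag
)

# Print a one-line status summary per managed instance.
for reservation in resp["Reservations"]:
    for inst in reservation["Instances"]:
        name = next(
            (t["Value"] for t in inst.get("Tags", []) if t["Key"] == "Name"),
            "<unnamed>",
        )
        print(f'{inst["InstanceId"]}  {name}  {inst["State"]["Name"]}')
```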

Posted 6 days ago

Apply

0 years

0 - 0 Lacs

Mohali

On-site

Source: Glassdoor

Server Admin Intern (DevOps)

Job Type: Full-Time | Permanent
Preferred Certifications/Trainings: Red Hat Certified System Administrator (RHCSA), Cisco Certified Network Associate (CCNA), or AWS Certified Cloud Practitioner (beginner level, highly recommended)
Salary: ₹12,000 - ₹15,000 per month
Location: On-site (in-person interview only)
Schedule: Day shift / morning shift

About the Role:
We are seeking a Server Admin Intern (DevOps), certified as a Red Hat Certified System Administrator (RHCSA), Cisco Certified Network Associate (CCNA), or AWS Certified Cloud Practitioner (beginner level, highly recommended), who is passionate about system administration and eager to work with a dynamic, collaborative team. This is a technical position focused on managing and supporting servers, operating systems, storage, and networking services in a real-time environment.

Key Responsibilities:
- Manage and maintain servers and networking services.
- Assist with deployment, configuration, and troubleshooting of Linux/Windows operating systems.
- Monitor and maintain system security, including firewalls, proxies, and database protections.
- Provide technical support to team members and assist non-IT users with basic issues.
- Collaborate with the DevOps and IT teams to improve system reliability and scalability.

Required Skills & Knowledge:
- Fundamentals of networking: TCP/IP, DNS, DHCP, LAN/WAN.
- Operating systems: hands-on with Linux (preferred).
- Cloud services: familiarity with Amazon Web Services (AWS); knowledge of Azure is a plus.
- Security: basic understanding of firewalls, proxies, and system security.
- Scripting & tools: PHP and Linux-based applications; Git (version control); Docker (added advantage); Apache and load-balancing configuration.

Preferred Qualities:
- Strong communication skills to support both technical and non-technical users.
- Eagerness to learn and grow in a fast-paced IT environment.
- Proactive approach to problem-solving and system optimization.

This role requires a face-to-face interview and is on-site; therefore, candidates from South India, Delhi, and East India are requested not to apply.

Job Types: Full-time, Permanent, Fresher
Pay: ₹12,000.00 - ₹15,000.00 per month
Benefits: Flexible schedule; health insurance; leave encashment; life insurance; paid sick time; paid time off; Provident Fund
Schedule: Day shift, Monday to Friday, morning shift
Supplemental Pay: Performance bonus; yearly bonus
Location: Mohali, Punjab (Required)
Work Location: In person
Speak with the employer: +91 9041792888
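Basic service monitoring of the kind this internship involves can be sketched in a few lines of Python using only the standard library; the hosts and ports below are hypothetical examples.

```python
import socket

# Hypothetical services to watch: (host, port, label).
SERVICES = [
    ("127.0.0.1", 80, "Apache HTTP"),
    ("127.0.0.1", 22, "SSH"),
    ("127.0.0.1", 3306, "MySQL"),
]


def is_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Attempt a TCP connection; an open port implies the service is listening."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


for host, port, label in SERVICES:
    state = "UP" if is_open(host, port) else "DOWN"
    print(f"{label:12} {host}:{port}  {state}")
```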

Posted 6 days ago

Apply

5.0 - 6.0 years

5 - 10 Lacs

India

On-site

Source: Glassdoor

Job Summary:
We are seeking a highly skilled Python Developer to join our team.

Key Responsibilities:
- Design, develop, and deploy Python applications.
- Work independently on machine learning model development, evaluation, and optimization.
- Implement scalable and efficient algorithms for predictive analytics and automation.
- Optimize code for performance, scalability, and maintainability.
- Collaborate with stakeholders to understand business requirements and translate them into technical solutions.
- Integrate APIs and third-party tools to enhance functionality.
- Document processes, code, and best practices for maintainability.

Required Skills & Qualifications:
- 5-6 years of professional experience in Python application development.
- Proficiency in Python libraries such as Pandas, NumPy, SciPy, and Matplotlib.
- Experience with SQL and NoSQL databases (PostgreSQL, MongoDB, etc.).
- Hands-on experience with big data technologies (Apache Spark, Delta Lake, Hadoop, etc.).
- Strong experience in developing APIs and microservices using FastAPI, Flask, or Django.
- Good understanding of data structures, algorithms, and software development best practices.
- Strong problem-solving and debugging skills.
- Ability to work independently and handle multiple projects simultaneously.
- Good to have: working knowledge of cloud platforms (Azure/AWS/GCP) for deploying ML models and data applications.

Job Type: Full-time
Pay: ₹500,000.00 - ₹1,000,000.00 per year
Schedule: Fixed shift
Work Location: In person
Application Deadline: 30/06/2025
Expected Start Date: 01/07/2025
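Since the role pairs Pandas with API frameworks like FastAPI, here is a minimal sketch of a FastAPI service exposing a summary endpoint over a small DataFrame; the dataset and field names are hypothetical.

```python
import pandas as pd
from fastapi import FastAPI

app = FastAPI()

# Hypothetical in-memory dataset; a real service would load from a database.
df = pd.DataFrame({
    "region": ["north", "south", "north", "east"],
    "sales": [120.0, 95.5, 210.0, 60.25],
})


@app.get("/sales/summary")
def sales_summary() -> dict:
    """Return total and mean sales per region as a JSON object."""
    summary = df.groupby("region")["sales"].agg(["sum", "mean"])
    return summary.to_dict(orient="index")

# Run locally with: uvicorn app:app --reload   (assuming this file is app.py)
```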

Posted 6 days ago

Apply

8.0 years

6 - 8 Lacs

Chennai

On-site

Source: Glassdoor

Responsibilities:
- Develop, test, and deploy data processing applications using Apache Spark and Scala.
- Optimize and tune Spark applications for better performance on large-scale data sets.
- Work with the Cloudera Hadoop ecosystem (e.g., HDFS, Hive, Impala, HBase, Kafka) to build data pipelines and storage solutions.
- Collaborate with data scientists, business analysts, and other developers to understand data requirements and deliver solutions.
- Design and implement high-performance data processing and analytics solutions.
- Ensure data integrity, accuracy, and security across all processing tasks.
- Troubleshoot and resolve performance issues in Spark, Cloudera, and related technologies.
- Implement version control and CI/CD pipelines for Spark applications.

Required Skills & Experience:
- Minimum 8 years of experience in application development.
- Strong hands-on experience in Apache Spark, Scala, and Spark SQL for distributed data processing.
- Hands-on experience with Cloudera Hadoop (CDH) components such as HDFS, Hive, Impala, HBase, Kafka, and Sqoop.
- Familiarity with other big data technologies, including Apache Kafka, Flume, Oozie, and NiFi.
- Experience building and optimizing ETL pipelines using Spark and working with structured and unstructured data.
- Experience with SQL and NoSQL databases such as HBase, Hive, and PostgreSQL.
- Knowledge of data warehousing concepts, dimensional modeling, and data lakes.
- Ability to troubleshoot and optimize Spark and Cloudera platform performance.
- Familiarity with version control tools like Git and CI/CD tools (e.g., Jenkins, GitLab).
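Spark tuning of the sort this role describes usually starts with shuffle parallelism and join strategy. A small PySpark sketch of those two knobs follows; the config values and table paths are illustrative assumptions, not recommendations for any particular cluster.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("tuned_join")
    # Shuffle parallelism: sized to the cluster and data volume (illustrative value).
    .config("spark.sql.shuffle.partitions", "400")
    # Let Spark auto-broadcast tables up to 64 MiB (illustrative threshold).
    .config("spark.sql.autoBroadcastJoinThreshold", str(64 * 1024 * 1024))
    .getOrCreate()
)

orders = spark.read.parquet("/data/orders")      # hypothetical large fact table
products = spark.read.parquet("/data/products")  # hypothetical small dimension

# Broadcast the small side explicitly to avoid a shuffle join on the large table.
joined = orders.join(F.broadcast(products), on="product_id", how="left")

joined.groupBy("category").agg(F.sum("amount").alias("revenue")).show()
```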

Posted 6 days ago

Apply

5.0 years

8 - 10 Lacs

Chennai

On-site

Source: Glassdoor

Job Description
- Assemble large, complex sets of data that meet non-functional and functional business requirements.
- Identify, design, and implement internal process improvements, including re-designing infrastructure for greater scalability, optimizing data delivery, and automating manual processes.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from various data sources using Azure, Databricks, and SQL technologies.
- Transform conceptual algorithms from R&D into efficient, production-ready code; the data developer must have a strong mathematical background in order to document and maintain the code.
- Integrate finished models into larger data processes using UNIX scripting (e.g., ksh) and languages such as Python, Spark, and Scala.
- Produce and maintain documentation for released data sets, new programs, shared utilities, or static data, within department standards.
- Ensure quality deliverables to clients by following existing quality processes, manually calculating comparison data, developing statistical pass/fail testing, and visually inspecting data for reasonableness; the requirement is on-time with zero defects.

Qualifications

Education/Training:
- B.E./B.Tech. with a major in Computer Science, BIS, CIS, Electrical Engineering, Operations Research, or another technical field. Course work or experience in Numerical Analysis, Mathematics, or Statistics is a plus.

Hard Skills:
- Proven experience working as a data engineer.
- Highly proficient in the Spark framework (Python and/or Scala).
- Extensive knowledge of data warehousing concepts, strategies, and methodologies.
- Programming experience in Python, SQL, and Scala.
- Direct experience building data pipelines using Apache Spark (preferably in Databricks) and Airflow.
- Hands-on experience designing and delivering solutions using Azure, including Azure Storage, Azure SQL Data Warehouse, and Azure Data Lake.
- Experience with big data technologies (Hadoop).
- Databricks & Azure Big Data Architecture certification would be a plus.
- Team oriented, with the strong collaboration, prioritization, and adaptability skills required.
- Ability to write highly efficient code in terms of performance and memory utilization.
- Basic knowledge of SQL; capable of handling common functions.

Experience:
- Minimum 5-8 years of experience as a data engineer.
- Experience modeling or manipulating large amounts of data is a plus.
- Experience with demographic or retail business is a plus.

Additional Information

Our Benefits:
- Flexible working environment
- Volunteer time off
- LinkedIn Learning
- Employee Assistance Program (EAP)

About NIQ
NIQ is the world’s leading consumer intelligence company, delivering the most complete understanding of consumer buying behavior and revealing new pathways to growth. In 2023, NIQ combined with GfK, bringing together the two industry leaders with unparalleled global reach. With a holistic retail read and the most comprehensive consumer insights—delivered with advanced analytics through state-of-the-art platforms—NIQ delivers the Full View™. NIQ is an Advent International portfolio company with operations in 100+ markets, covering more than 90% of the world’s population. For more information, visit NIQ.com.

Want to keep up with our latest updates?
Follow us on: LinkedIn | Instagram | Twitter | Facebook Our commitment to Diversity, Equity, and Inclusion NIQ is committed to reflecting the diversity of the clients, communities, and markets we measure within our own workforce. We exist to count everyone and are on a mission to systematically embed inclusion and diversity into all aspects of our workforce, measurement, and products. We enthusiastically invite candidates who share that mission to join us. We are proud to be an Equal Opportunity/Affirmative Action-Employer, making decisions without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability status, age, marital status, protected veteran status or any other protected class. Our global non-discrimination policy covers these protected classes in every market in which we do business worldwide. Learn more about how we are driving diversity and inclusion in everything we do by visiting the NIQ News Center: https://nielseniq.com/global/en/news-center/diversity-inclusion

Posted 6 days ago

Apply

3.0 years

4 - 7 Lacs

Chennai

On-site

Basic Qualifications:
- 3+ years of data engineering experience
- Experience with data modeling, warehousing and building ETL pipelines
- Experience with SQL

Amazon Retail Financial Intelligence Systems is seeking a seasoned and talented Senior Data Engineer to join the Fortune Platform team. Fortune is a fast-growing team with a mandate to build tools to automate profit-and-loss forecasting and planning for the Physical Consumer business. We are building the next generation of Business Intelligence solutions using big data technologies such as Apache Spark, Hive/Hadoop, and distributed query engines. As a Data Engineer at Amazon, you will be working in a large, extremely complex, and dynamic data environment. You should be passionate about working with big data and be able to learn new technologies rapidly and evaluate them critically. You should have excellent communication skills and be able to work with business owners to translate business requirements into system solutions. You are a self-starter, comfortable with ambiguity, and used to working in a fast-paced, ever-changing environment. Ideally, you are also experienced with at least one programming language such as Java, C++, Spark/Scala, or Python.

Major Responsibilities:
- Work with a team of product and program managers, engineering leaders, and business leaders to build data architectures and platforms to support the business
- Design, develop, and operate highly scalable, high-performance, low-cost, and accurate data pipelines in distributed data processing platforms
- Recognize and adopt best practices in data processing, reporting, and analysis: data integrity, test design, analysis, validation, and documentation
- Keep up to date with big data technologies; evaluate and make decisions around the use of new or existing software products to design the data architecture
- Design, build, and own all the components of a high-volume data warehouse end to end
- Provide end-to-end data engineering support for project lifecycle execution (design, execution, and risk assessment)
- Continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers
- Interface with other technology teams to extract, transform, and load (ETL) data from a wide variety of data sources
- Own the functional and nonfunctional scaling of software systems in your ownership area
- Implement big data solutions for distributed computing

Key job responsibilities
As a DE on our team, you will be responsible for leading the data modeling, database design, and launch of some of the core data pipelines. You will have significant influence on our overall strategy by helping define the data model, drive the database design, and spearhead the best practices that deliver high-quality products.

About the team
Profit intelligence systems measure and predict true profit (or loss) for each item as a result of a specific shipment to an Amazon customer. Profit Intelligence is all about providing intelligent ways for Amazon to understand profitability across the retail business. What are the hidden factors driving growth or profitability across millions of shipments each day? We compute the profitability of each and every shipment that gets shipped out of Amazon, and we predict the profitability of future possible shipments too. We are a team of agile, can-do engineers who believe that not only are moon shots possible, but that they can be done before lunch. All it takes is finding new ideas that challenge our preconceived notions of how things should be done. Process and procedure matter less than ideas and the practical work of getting stuff done. This is a place for exploring the new and taking risks. We push the envelope in using cloud services in AWS as well as the latest in distributed systems, forecasting algorithms, and data mining.

Preferred Qualifications:
- Experience with AWS technologies like Redshift, S3, AWS Glue, EMR, Kinesis, Firehose, Lambda, and IAM roles and permissions
- Experience with non-relational databases / data stores (object storage, document or key-value stores, graph databases, column-family databases)

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
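One common pattern with the AWS stack named in the preferred qualifications is bulk-loading S3 data into Redshift with the COPY command. Below is a minimal sketch using psycopg2; the cluster endpoint, table, bucket, and IAM role are hypothetical placeholders, not details from the posting.

```python
# Sketch: bulk-load S3 data into Redshift via COPY -- illustrative only.
# Connection details, table, bucket, and IAM role below are hypothetical.
import psycopg2

conn = psycopg2.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439, dbname="analytics", user="etl_user", password="***",
)
copy_sql = """
    COPY sales.shipments
    FROM 's3://example-bucket/shipments/2024-06-01/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftCopyRole'
    FORMAT AS PARQUET;
"""
with conn, conn.cursor() as cur:
    cur.execute(copy_sql)   # COPY runs in parallel across Redshift slices
```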

Posted 6 days ago

Apply

7.0 - 12.0 years

0 Lacs

Mysore, Karnataka, India

On-site

Job Responsibilities
Develop teaching materials including exercises & assignments. Conduct classroom training / virtual training. Design assessments for various proficiency levels in a given competency. Enhance course material & course delivery based on feedback to improve training effectiveness. Gather feedback from stakeholders, identify actions based on feedback, and implement changes.

Location: Mysore, Mangalore, Bangalore, Chennai, Pune, Hyderabad, Chandigarh

Description of the Profile
We are looking for trainers with 7 to 12 years of teaching experience and technology know-how in one or more of the following areas:
Java – Java programming, Spring, Angular / React, Bootstrap
Microsoft – C# programming, SQL Server, ADO.NET, ASP.NET, MVC design pattern, Azure, MS Power platforms, MS Dynamics 365 CRM, MS Dynamics 365 ERP, SharePoint
Testing – Selenium, Micro Focus UFT, Micro Focus ALM tools, SOA testing, SOAPUI, REST Assured, Appium
Big Data – Python programming, Hadoop, Spark, Scala, MongoDB, NoSQL
SAP – SAP ABAP programming / SAP MM / SAP SD / SAP BI / SAP S4 HANA
Oracle – Oracle E-Business Suite (EBS) / PeopleSoft / Siebel CRM / Oracle Cloud / OBIEE / Fusion Middleware
API and integration – API, Microservices, TIBCO, Apigee, Mule
Digital Commerce – Salesforce, Adobe Experience Manager
Digital Process Automation – PEGA, Appian, Camunda, Unqork, UiPath
MEAN / MERN stacks
Business Intelligence – SQL Server, ETL using SQL Server, Analysis using SQL Server, Enterprise reporting using SQL, Visualization
Data Science – Python for data science, Machine learning, Exploratory data analysis, Statistics & Probability
Cloud & Infrastructure Management – Network administration / Database administration / Windows administration / Linux administration / Middleware administration / End User Computing / ServiceNow; Cloud platforms like AWS / GCP / Azure / Oracle Cloud; Virtualization
Cybersecurity – Infra Security / Identity & Access Management / Application Security / Governance & Risk Compliance / Network Security
Mainframe – COBOL, DB2, CICS, JCL
Open source – Python, PHP, Unix / Linux, MySQL, Apache, HTML5, CSS3, JavaScript

Posted 6 days ago

Apply

12.0 years

5 - 6 Lacs

Noida

On-site

Description
Job Title: Solution Architect
Designation: Senior
Company: Hitachi Rail GTS India
Location: Noida, UP, India
Salary: As per industry

Company Overview: Hitachi Rail is right at the forefront of the global mobility sector following the acquisition. The closing strengthens the company's strategic focus on helping current and potential Hitachi Rail and GTS customers through the sustainable mobility transition – the shift of people from private to sustainable public transport, driven by digitalization.

Position Overview: We are looking for a Solution Architect who will be responsible for translating business requirements into technical solutions, ensuring the architecture is scalable, secure, and aligned with enterprise standards. The Solution Architect will play a crucial role in defining the architecture and technical direction of the existing system. You will be responsible for the design, implementation, and deployment of solutions that integrate with transit infrastructure, ensuring seamless fare collection, real-time transaction processing, and enhanced user experiences. You will collaborate with development teams, stakeholders, and external partners to create scalable, secure, and highly available software solutions.

Job Roles & Responsibilities:
• Architectural Design: Develop architectural documentation such as solution blueprints, high-level designs, and integration diagrams. Lead the design of the system's architecture, ensuring scalability, security, and high availability. Ensure the architecture aligns with the company's strategic goals and future vision for public transit technologies.
• Technology Strategy: Select the appropriate technology stack and tools to meet both functional and non-functional requirements, considering performance, cost, and long-term sustainability.
• System Integration: Work closely with teams to design and implement the integration of the AFC system with various third-party systems (e.g., payment gateways, backend services, cloud infrastructure).
• API Design & Management: Define standards for APIs to ensure easy integration with external systems, such as mobile applications, ticketing systems, and payment providers.
• Security & Compliance: Ensure that the AFC system meets the highest standards of data security, particularly for payment information, and complies with industry regulations (e.g., PCI-DSS, GDPR).
• Stakeholder Collaboration: Act as the technical lead during project planning and discussions, ensuring the design meets customer and business needs.
• Technical Leadership: Mentor and guide development teams through best practices in software development and architectural principles.
• Performance Optimization: Monitor and optimize system performance to ensure the AFC system can handle high volumes of transactions without compromise.
• Documentation & Quality Assurance: Maintain detailed architecture documentation, including design patterns, data flow, and integration points. Ensure the implementation follows best practices and quality standards.
• Research & Innovation: Stay up to date with the latest advancements in technology and propose innovative solutions to enhance the AFC system.

Skills:
1. Equipment Programming Languages: .NET (C#), C/C++, Java, Python
2. Web Development Frameworks: ASP.NET Core (C#), Angular
3. Microservices & Architecture: Spring Cloud, Docker, Kubernetes, Istio, Apache Kafka, RabbitMQ, Consul, GraphQL
4. Cloud Platforms: Amazon Web Services (AWS), Google Cloud Platform (GCP), Microsoft Azure, Kubernetes on cloud (e.g., AWS EKS, GCP GKE), Terraform (Infrastructure as Code)
5. Databases: Relational databases (SQL), NoSQL databases, data warehousing
6. API Technologies: SOAP/RESTful API design, GraphQL, gRPC, OpenAPI / Swagger (API documentation)
7. Security Technologies: OAuth2 / OpenID Connect (authentication & authorization), JWT (JSON Web Tokens), SSL/TLS encryption, OWASP Top 10 (security best practices), Vault (secret management), Keycloak (identity & access management)
8. Design & Architecture Tools: UML (Unified Modeling Language), Lucidchart / Draw.io (diagramming), PlantUML (text-based UML generation), C4 Model (software architecture model), Enterprise Architect (modeling)
9. Miscellaneous Tools & Frameworks: Apache Hadoop / Spark (big data), Elasticsearch (search engine), Apache Kafka (stream processing), TensorFlow / PyTorch (machine learning/AI), Redis (caching & pub/sub), DevOps & CI/CD tools

Education: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.

Experience Required: 12+ years of experience in solution architecture or software design. Proven experience with enterprise architecture frameworks (e.g., TOGAF, Zachman). Strong understanding of cloud platforms (AWS, Azure, or Google Cloud). Experience in system integration, API design, microservices, and SOA. Familiarity with data modeling and database technologies (SQL, NoSQL). Strong communication and stakeholder management skills.

Preferred: Certification in cloud architecture (e.g., AWS Certified Solutions Architect, Azure Solutions Architect Expert). Experience with DevOps tools and CI/CD pipelines. Knowledge of security frameworks and compliance standards (e.g., ISO 27001, GDPR). Experience in Agile/Scrum environments. Domain knowledge in [insert industry: e.g., finance, transportation, healthcare].

Soft Skills: Analytical and strategic thinking. Excellent problem-solving abilities. Ability to lead and mentor cross-functional teams. Strong verbal and written communication.
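To illustrate one of the listed skills, here is a minimal sketch of publishing an event to Apache Kafka. It assumes the kafka-python client (the posting names Kafka but no specific client), and the broker address, topic, and payload are hypothetical.

```python
# Minimal Kafka producer sketch using the kafka-python client -- illustrative only.
# Broker address, topic, and payload schema are hypothetical.
import json
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Publish a fare-transaction event; downstream services consume it asynchronously.
producer.send("fare-transactions", {"card_id": "A123", "amount": 25.0, "station": "Central"})
producer.flush()   # block until the buffered record is delivered
```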

Posted 6 days ago

Apply

0 years

3 - 6 Lacs

Ghaziabad

On-site

We're not your average tech company! Rightcrowd is a global leader in keeping people safe and organizations secure. We build smart solutions that manage who's on-site, what access they have, and ensure everything is compliant. Think big names – Fortune 50 and ASX 10 companies rely on us to tackle their toughest security challenges. We're a passionate, global team with offices in places like Australia, the USA, Belgium, India, and the Philippines, and we're on a mission to make the world a safer place, one clever line of code at a time.

Your Quest, Should You Choose to Accept It (aka What You'll Do):
• Become a Data Alchemist: Dive deep into our data troves (especially those rich MS SQL databases), transforming raw information into actionable insights.
• Craft Visual Masterpieces: Use your Apache Superset skills to build stunning, intuitive, and delightful dashboards and reports that tell compelling data stories.
• Be the SQL Sorcerer: Write, optimize, and troubleshoot complex MS SQL queries with a focus on performance and efficiency.
• Collaborate with a Fellowship of Innovators: Work alongside product managers, engineers, and stakeholders to understand their data needs and bring them to life.
• Champion Data-Driven Decisions: Create reports and dashboards that guide strategic decisions and unlock new levels of understanding for our clients.
• Uncover Hidden Trends & Patterns: Discover insights that others might miss to help continuously improve our cutting-edge security solutions.
• Keep the Data Flowing: Contribute to the design and maintenance of reporting data structures, ensuring data quality and integrity.
• Embrace the Adventure: Explore new analytics techniques and push the boundaries of data in the physical security world.

The Tools & Talents You'll Bring to the Table (aka Skills & Qualifications):
• MS SQL Mastery: Expertise in MS SQL, including designing schemas, writing complex queries, stored procedures, and optimizing performance.
• Apache Superset Savvy: Proven experience building, customizing, and deploying insightful dashboards and reports using Apache Superset.
• A Love for Data Storytelling: Ability to craft narratives that make data easy to understand and inspire action.
• Analytical Prowess & Problem-Solving Grit: Strong skills in dissecting complex problems and identifying key data points.
• Excellent Communicator: Ability to explain complex data concepts to both technical and non-technical audiences.
• Team Player Spirit: Thrive in collaborative environments, sharing knowledge and learning from others.

Bonus Points (Nice-to-Haves):
• Experience with other BI tools like Tableau or Power BI.
• Knowledge of ETL processes and tools.
• Familiarity with data warehousing concepts.
• Python scripting skills for data manipulation or automation.
• An understanding of the security industry (physical or cyber).

Why Join Us?
• Make a Real-World Impact: Your work directly contributes to enhancing safety and security for major organizations worldwide.
• Join a League of Extraordinary People: Work with a passionate, global team dedicated to innovation and excellence.
• Level Up Your Skills: Sharpen your existing talents and learn new ones in a supportive environment.
• Unearth Epic Rewards: Competitive salary, great benefits, and the chance to work with a company that leads its field.

Ready to unleash your data powers for good? Don't just send a resume—send us a sign you're the Data Wizard we've been searching for! Apply now and let's build a safer, smarter future together.
#DataAnalytics #Reporting #MSSQL #ApacheSuperset #BIdeveloper #TechJobs #Hiring
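As a taste of the MS SQL plus Superset work described above, here is a minimal sketch that pulls a 30-day aggregate from SQL Server with pyodbc, the kind of query a Superset dataset might sit on. The DSN, table, and column names are invented for illustration.

```python
# Sketch: pull an aggregate from MS SQL for a Superset dataset -- illustrative only.
# Server, database, table, and column names are hypothetical.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=example-host;DATABASE=access_control;UID=report_user;PWD=***"
)
sql = """
    SELECT site_id, CAST(entry_time AS date) AS entry_date, COUNT(*) AS visits
    FROM dbo.AccessEvents
    WHERE entry_time >= DATEADD(day, -30, GETDATE())
    GROUP BY site_id, CAST(entry_time AS date)
    ORDER BY entry_date;
"""
rows = conn.cursor().execute(sql).fetchall()   # feed this result into a Superset chart
```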

Posted 6 days ago

Apply

0 years

0 Lacs

Calcutta

On-site

Join our Team

About this opportunity:
We are looking for an experienced Java Developer or Architect with strong technical expertise to design and lead the development of scalable, high-performance Java applications. The ideal candidate should have an in-depth understanding of Java/J2EE technologies, design patterns, microservice architecture, Docker & Kubernetes, and integration frameworks. This role requires design skills, excellent problem-solving skills, and the ability to collaborate with cross-functional teams, including DevOps and front-end developers.

What you will do:
• Architect, design, and implement back-end solutions using Java/J2EE, Spring MVC, Spring Boot, and related frameworks.
• Design, develop, and maintain scalable Java components using REST or SOAP based web services.
• Design and develop enterprise solutions with messaging or streaming frameworks like ActiveMQ, HornetQ, and Kafka.
• Work with integration frameworks like Apache Camel / JBoss Fuse / Mule ESB / EAI / Spring Integration.
• Make effective use of caching technologies (like Hazelcast / Redis / Infinispan / EHCache / MemCache) in the application to handle large volumes of data.
• Deploy the application in a middleware or app server (like JBoss / WebLogic / Tomcat).
• Collaborate with the DevOps team to manage builds and CI/CD pipelines using Jira, GitLab, Sonar, and other tools.

The skills you bring:
• Strong expertise in Java/J2EE, Spring Boot & microservices.
• Good understanding of core Java concepts (like the Collections Framework and object-oriented design).
• Experience working with multithreading concepts (like thread pools, ExecutorService, FutureTask, the concurrent API, CountDownLatch).
• Detailed working exposure to Java 8 with the Stream API, lambdas, interfaces, and functional interfaces.
• Proficiency in Java web application development using Spring MVC & Spring Boot.
• Good knowledge of data access frameworks using ORM (Hibernate & JPA).
• Familiarity with database concepts and knowledge of RDBMS/SQL.
• Good understanding of monolithic & microservice architecture.

What happens once you apply? Click Here to find all you need to know about what our typical hiring process looks like. We encourage you to consider applying to jobs where you might not meet all the criteria. We recognize that we all have transferrable skills, and we can support you with the skills that you need to develop. Encouraging a diverse and inclusive organization is core to our values at Ericsson; that's why we champion it in everything we do. We truly believe that by collaborating with people with different experiences we drive innovation, which is essential for our future growth. We encourage people from all backgrounds to apply and realize their full potential as part of our Ericsson team. Ericsson is proud to be an Equal Opportunity Employer. Learn more.

Posted 6 days ago

Apply

3.0 years

12 - 15 Lacs

Jaipur

On-site

Role Overview
We are looking for a detail-oriented and business-savvy Data Scientist with a strong domain understanding of smart metering and utility data. This role is central to transforming raw metering data into actionable insights and delivering high-quality dashboards and analytics to support operational and strategic decision-making across the organization. The ideal candidate is not only technically proficient in data analysis and visualization but also able to interpret metering patterns, consumption behavior, and system anomalies that impact customer experience, revenue, and operational efficiency.

Key Responsibilities
• Analyze large volumes of smart metering data (interval consumption, events, read quality, exceptions) from MDM and HES systems.
• Identify and interpret consumption patterns, anomalies, and trends that drive actionable insights for business and product teams.
• Design and build dashboards, visualizations, and reports.
• Collaborate with product managers, operations, and engineering teams to define data requirements and design meaningful analytics views.
• Develop rule-based or statistical models for event analytics, billing exceptions, load profiling, and customer segmentation.
• Translate complex findings into easy-to-understand business insights and recommendations.
• Ensure data consistency, accuracy, and integrity across different systems and reporting layers.
• Build automated pipelines to support near real-time and periodic reporting needs.

Skills and Qualifications
Must Have:
• 3+ years of experience working with utilities and energy data, particularly from smart meters.
• Strong SQL skills and experience working with relational databases and data marts.
• Proficiency in data visualization tools.
• Solid understanding of smart meter data structure (interval reads, TOU, events, consumption patterns).
• Ability to independently explore data, validate assumptions, and present clear narratives.

Preferred:
• Familiarity with MDM (Meter Data Management), HES, and utility billing systems.
• Exposure to AMI events analysis, load curves, and customer behavior analytics.
• Knowledge of regulatory requirements, data retention, and data privacy in the energy sector.
• Experience working with large-scale datasets and data platforms (e.g., Delta Lake, Apache Airflow, Apache Spark).

Job Type: Full-time
Pay: ₹1,200,000.00 - ₹1,500,000.00 per year
Schedule: Day shift
Work Location: In person
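To make the anomaly-analysis responsibility concrete, here is a small pandas sketch of rule-based flagging on interval reads. The column names, input file, and the 3-sigma threshold are assumptions for illustration, not details from the posting.

```python
# Sketch: rule-based anomaly flagging on smart-meter interval reads -- illustrative only.
# Column names, file, and the 3-sigma threshold are assumptions.
import pandas as pd

reads = pd.read_csv("interval_reads.csv", parse_dates=["read_ts"])  # hypothetical extract

# Per-meter consumption statistics over the period.
stats = (
    reads.groupby("meter_id")["kwh"]
    .agg(["mean", "std"])
    .rename(columns={"mean": "kwh_mean", "std": "kwh_std"})
    .reset_index()
)
reads = reads.merge(stats, on="meter_id")

# Flag intervals more than 3 standard deviations from that meter's mean,
# plus zero reads that may indicate outages or tampering.
reads["anomaly"] = (
    (reads["kwh"] - reads["kwh_mean"]).abs() > 3 * reads["kwh_std"]
) | (reads["kwh"] == 0)

print(reads.loc[reads["anomaly"], ["meter_id", "read_ts", "kwh"]].head())
```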

Posted 6 days ago

Apply

3.0 years

0 - 0 Lacs

Satna

On-site

Responsibilities:
• Develop and maintain scalable and robust backend services using languages such as Java, Python, Node.js, etc.
• Design and implement frontend components and user interfaces using JavaScript, HTML, and CSS.
• Integrate with third-party APIs and services.
• Participate in code reviews and provide constructive feedback to other developers.
• Troubleshoot and debug applications.
• Collaborate with cross-functional teams to define, design, and ship new features.
• Stay updated with emerging technologies and industry trends.

Requirements:
• Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).
• Proven experience as a Full Stack Developer or in a similar role.
• Strong proficiency with fundamental frontend languages such as HTML, CSS, and JavaScript.
• Experience with frontend frameworks such as React, Angular, or Vue.js.
• Solid understanding of backend technologies such as Node.js, Python/Django, Ruby on Rails, etc.
• Familiarity with databases (e.g., MySQL, MongoDB), web servers (e.g., Apache, Nginx), and UI/UX design principles.
• Knowledge of DevOps practices such as CI/CD pipelines, containerization, and cloud platforms (AWS, Azure, Google Cloud).
• Excellent communication and collaboration skills.
• Ability to work independently and in a team environment.

Preferred Qualifications:
• Experience with mobile application development (iOS/Android).
• Familiarity with Agile development methodologies.
• Contributions to open-source projects or personal GitHub repositories.
• Relevant certifications.

Job Types: Full-time, Permanent
Pay: ₹20,000.00 - ₹50,000.00 per month
Benefits: Cell phone reimbursement; Health insurance
Location Type: In-person
Schedule: Day shift
Education: Master's (Preferred)
Experience: Angular: 3 years (Preferred); Java: 4 years (Preferred); total work: 5 years (Preferred)
Work Location: In person
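As a minimal illustration of the backend work described above, here is a sketch of a small REST endpoint in Flask, one of several stacks the posting allows; the route and in-memory store are hypothetical stand-ins.

```python
# Minimal Flask REST endpoint sketch -- illustrative only; route and payload are hypothetical.
from flask import Flask, jsonify, request

app = Flask(__name__)
ITEMS = []  # in-memory stand-in for a real database such as MySQL or MongoDB

@app.route("/api/items", methods=["GET", "POST"])
def items():
    if request.method == "POST":
        ITEMS.append(request.get_json())   # no validation here; a real service would add it
        return jsonify({"count": len(ITEMS)}), 201
    return jsonify(ITEMS)

if __name__ == "__main__":
    app.run(debug=True)
```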

Posted 6 days ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Primary Responsibilities
• Cloud Expertise: Familiarity or hands-on experience with AWS and Google Cloud Platform (GCP) technologies to support data transformation, data structures, metadata management, dependency tracking, and workload orchestration.
• Collaboration & Independence: Self-motivated and capable of supporting the data needs of multiple teams, systems, and products within Amway’s data ecosystem.
• Big Data & Distributed Systems: Strong understanding of distributed systems for large-scale data processing and analytics, with a proven track record of manipulating, processing, and deriving insights from large, complex, and disconnected datasets.
• Database Proficiency: Advanced knowledge of relational databases and SQL, with working experience across a variety of platforms, including Microsoft SQL Server and Oracle, to enhance analytics capabilities.
• Software Back-End and Front-End: Familiarity with programming languages that support back-end and front-end development, especially Node.js and React.js.
• Growth Mindset: A passion for continuous learning and a desire to help evolve our capabilities to support machine learning and advanced analytics initiatives.

Required skills and competencies
• 8+ years of IT experience, with at least 5+ years on cloud-based technology, plus working knowledge of Node.js and React.js programming.
• Expertise in SQL (PL/SQL, BigQuery, Redshift, GraphQL).
• Familiarity with Python programming.
• Knowledge of MongoDB, Apache Kafka, PySpark.
• Must be competent with Confluence, Jira, GitHub, and other AWS DevOps tools.

Posted 6 days ago

Apply

1.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Title: Data Analyst – AdTech (1+ Years Experience)
Location: Hyderabad
Experience Level: 2–3 Years
Employment Type: Full-time
Shift Timings: 5 PM - 2 AM IST

About The Role
We are looking for a highly motivated and detail-oriented Data Analyst with 1+ years of experience to join our AdTech analytics team. In this role, you will be responsible for working with large-scale advertising and digital media datasets, building robust data pipelines, querying and transforming data using GCP tools, and delivering insights through visualization platforms such as Looker Studio, Looker, and Tableau.

Key Responsibilities
• Analyze AdTech data (e.g., ads.txt, programmatic delivery, campaign performance, revenue metrics) to support business decisions.
• Design, develop, and maintain scalable data pipelines using GCP-native tools (e.g., Cloud Functions, Dataflow, Composer).
• Write and optimize complex SQL queries in BigQuery for data extraction and transformation.
• Build and maintain dashboards and reports in Looker Studio to visualize KPIs and campaign performance.
• Collaborate with cross-functional teams including engineering, operations, product, and client teams to gather requirements and deliver analytics solutions.
• Monitor data integrity, identify anomalies, and work on data quality improvements.
• Provide actionable insights and recommendations based on data analysis and trends.

Required Qualifications
• 1+ years of experience in a data analytics or business intelligence role.
• Hands-on experience with AdTech datasets and an understanding of digital advertising concepts.
• Strong proficiency in SQL, particularly with Google BigQuery.
• Experience building and managing data pipelines using Google Cloud Platform (GCP) tools.
• Proficiency in Looker Studio.
• Strong problem-solving skills and attention to detail.
• Excellent communication skills with the ability to explain technical topics to non-technical stakeholders.

Preferred Qualifications
• Experience with additional visualization tools such as Tableau, Power BI, or Looker (BI).
• Exposure to data orchestration tools like Apache Airflow (via Cloud Composer).
• Familiarity with Python for scripting or automation.
• Understanding of cloud data architecture and AdTech integrations (e.g., DV360, Ad Manager, Google Ads).
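To illustrate the BigQuery side of this role, here is a minimal sketch using the google-cloud-bigquery Python client to compute a simple revenue-per-mille rollup. The project, dataset, table, and column names are hypothetical.

```python
# Sketch: query campaign metrics in BigQuery -- illustrative only.
# Project, dataset, and table names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-adtech-project")

sql = """
    SELECT campaign_id,
           SUM(impressions) AS impressions,
           SUM(revenue)     AS revenue,
           SAFE_DIVIDE(SUM(revenue), SUM(impressions)) * 1000 AS rpm
    FROM `example-adtech-project.ads.daily_delivery`
    WHERE delivery_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
    GROUP BY campaign_id
    ORDER BY revenue DESC
"""
for row in client.query(sql).result():   # feed these rows into a Looker Studio source
    print(row.campaign_id, row.impressions, row.rpm)
```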

Posted 6 days ago

Apply

Exploring Apache Jobs in India

The Apache Software Foundation maintains a wide range of widely used open-source software projects. In India, the demand for professionals with expertise in Apache tools and technologies is on the rise, and job seekers pursuing Apache-related roles have plenty of opportunities across industries. Let's delve into the Apache job market in India to gain a better understanding of the landscape.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Chennai
  5. Mumbai

These cities are known for their thriving IT sectors and see a high demand for Apache professionals across different organizations.

Average Salary Range

The salary range for Apache professionals in India varies based on experience and skill level:

  • Entry-level: INR 3-5 lakhs per annum
  • Mid-level: INR 6-10 lakhs per annum
  • Experienced: INR 12-20 lakhs per annum

Career Path

In the Apache job market in India, a typical career path may progress as follows:

  1. Junior Developer
  2. Developer
  3. Senior Developer
  4. Tech Lead
  5. Architect

Related Skills

Besides expertise in Apache tools and technologies, professionals in this field are often expected to have skills in:

  • Linux
  • Networking
  • Database Management
  • Cloud Computing

Interview Questions

  • What is Apache HTTP Server and how does it differ from Apache Tomcat? (medium)
  • Explain the difference between Apache Hadoop and Apache Spark. (medium)
  • What is mod_rewrite in Apache and how is it used? (medium)
  • How do you troubleshoot common Apache server errors? (medium)
  • What is the purpose of .htaccess file in Apache? (basic)
  • Explain the role of Apache Kafka in real-time data processing. (medium)
  • How do you secure an Apache web server? (medium)
  • What is the significance of Apache Maven in software development? (basic)
  • Explain the concept of virtual hosts in Apache. (basic)
  • How do you optimize Apache web server performance? (medium)
  • Describe the functionality of Apache Solr. (medium)
  • What is the purpose of Apache Camel? (medium)
  • How do you monitor Apache server logs? (medium; see the sketch after this list)
  • Explain the role of Apache ZooKeeper in distributed applications. (advanced)
  • How do you configure SSL/TLS on an Apache web server? (medium)
  • Discuss the advantages of using Apache Cassandra for data management. (medium)
  • What is the Apache Lucene library used for? (basic)
  • How do you handle high traffic on an Apache server? (medium)
  • Explain the concept of .htpasswd in Apache. (basic)
  • What is the role of Apache Thrift in software development? (advanced)
  • How do you troubleshoot Apache server performance issues? (medium)
  • Discuss the importance of Apache Flume in data ingestion. (medium)
  • What is the significance of Apache Storm in real-time data processing? (medium)
  • How do you deploy applications on Apache Tomcat? (medium)
  • Explain the concept of .htaccess directives in Apache. (basic)
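Several of these questions, such as monitoring Apache server logs, lend themselves to hands-on practice. Below is a small Python sketch that tallies HTTP status codes from an access log; it assumes the default combined log format and a typical Debian-style log path.

```python
# Sketch: tally HTTP status codes from an Apache access log -- illustrative only.
# Assumes the default combined/common log format and a typical log path.
import re
from collections import Counter

LOG_PATH = "/var/log/apache2/access.log"   # hypothetical; varies by distribution

# In common/combined format the status code follows the quoted request line.
pattern = re.compile(r'"[A-Z]+ [^"]*" (\d{3}) ')

counts = Counter()
with open(LOG_PATH) as f:
    for line in f:
        m = pattern.search(line)
        if m:
            counts[m.group(1)] += 1

for status, n in counts.most_common():
    print(status, n)
```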

Conclusion

As you embark on your journey to explore Apache jobs in India, it is essential to stay updated on the latest trends and technologies in the field. By honing your skills and preparing thoroughly for interviews, you can position yourself as a competitive candidate in the Apache job market. Stay motivated, keep learning, and pursue your dream career with confidence!

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies