2.0 years
0 Lacs
India
Remote
🌟 We’re Hiring: Customer Service Representatives & Support Managers 📍 Location : Remote 🕒 Employment Type : Contract-based / Freelance / Part-time – 1 Month 📅 Start Date : [Immediate] Are you passionate about delivering exceptional customer experiences and driving support excellence? Join our fast-paced, customer-obsessed team where you’ll play a critical role in shaping how we support users across multiple channels and platforms. 🔧 Key Responsibilities Respond to and resolve multichannel support tickets (email, chat, voice, social, etc.) Monitor and report key support KPIs and metrics (e.g., CSAT, FRT, ART, etc.) Update and maintain internal knowledge bases and help center documentation Handle customer escalations with professionalism and urgency Coach, mentor, and lead junior support agents to consistently meet quality standards Identify and implement process improvements to increase efficiency and customer satisfaction Collaborate with cross-functional teams (product, sales, QA) to relay customer insights 💻 Tools & Platforms You’ll Work With Commercial Support & CX Platforms: Zendesk, Freshdesk, Salesforce Service Cloud, ServiceNow HubSpot Service Hub, Intercom, Helpscout NICE IEX, Verint, Assembled RingCentral, Nextiva Tableau, Qualtrics, SurveyMonkey Slack, Microsoft Teams Open Source / Free Tools: Ticketing: osTicket, Zammad, Request Tracker, UVDesk, FreeScout Messaging: Rocket.Chat, Mattermost, Element, Jitsi Meet Documentation: DokuWiki, BookStack, MediaWiki, Outline Reporting & Analytics: Metabase, Apache Superset, Google Data Studio (free) Survey & Feedback: Google Forms, LimeSurvey ✅ What We’re Looking For 2+ years of experience in customer support or service delivery roles Strong verbal and written communication skills Proven ability to manage and resolve complex customer issues Familiarity with support automation, AI/chatbots, or workflow optimization is a plus Experience with both enterprise and open-source tools is an advantage Leadership or team 
coaching experience (for Support Manager applicants) Interested? Please share your profile with Ganapathikumar@highbrowtechnology.com
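The KPI abbreviations in the posting above (CSAT, FRT, ART) can be made concrete with a small sketch. This is only an illustration, not part of the job description: the ticket field names and the 1-5 CSAT scale are assumptions.

```python
from datetime import datetime, timedelta
from statistics import mean

def support_kpis(tickets):
    """Compute CSAT, average first-response time (FRT), and average
    resolution time (ART) from a list of ticket dicts with
    'created', 'first_reply', 'resolved' datetimes and an optional
    1-5 'csat' rating (field names are assumed for this sketch)."""
    rated = [t["csat"] for t in tickets if t.get("csat") is not None]
    # CSAT: share of "satisfied" ratings (4 or 5 on a 1-5 scale)
    csat = 100 * sum(1 for r in rated if r >= 4) / len(rated) if rated else None
    # FRT in minutes, ART in hours, averaged over all tickets
    frt = mean((t["first_reply"] - t["created"]).total_seconds() / 60 for t in tickets)
    art = mean((t["resolved"] - t["created"]).total_seconds() / 3600 for t in tickets)
    return {"csat_pct": csat, "frt_minutes": frt, "art_hours": art}
```

In practice these numbers would come from a helpdesk platform's reporting API (Zendesk, Freshdesk, etc.) rather than hand-built dicts.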
Posted 4 days ago
0 years
0 Lacs
Mulshi, Maharashtra, India
On-site
Area(s) of responsibility Data Management (AWS) Developer We are looking for a Data Management (AWS) developer who will serve as the technical counterpart to data stewards across various business domains. This role will focus on the technical aspects of data management, including the integration of data catalogs, data quality management, and access management frameworks within our data lakehouse. Key Responsibilities Integrate the Acryl data catalog with the AWS Glue data catalog to enhance data discoverability and management. Develop frameworks and processes for deploying and maintaining data classification and data quality rules in the data lakehouse. Implement and maintain Lake Formation access frameworks, including OpenID Connect (OIDC) for secure data access. Build and maintain data quality and classification reports and visualizations to support data-driven decision-making. Develop and implement mechanisms for column-level data lineage in the data lakehouse. Collaborate with data stewards to ensure effective data ownership, cataloging, and metadata management. Qualifications Relevant experience in data management, data governance, or related technical fields. Strong technical expertise in AWS services, particularly AWS Glue, Lake Formation, and data quality management tools. Familiarity with data security practices, including OIDC and AWS IAM. Experience with AWS Athena and Apache Airflow. Relevant certifications (e.g., CDMP) are a plus. Experience with Terraform, GitHub, and Python.
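The "deploying and maintaining data quality rules" responsibility above can be sketched in plain Python. This is a hypothetical rule-runner, not the posting's actual framework: rule names, columns, and predicates are invented, and in production the pass rates would be published to a catalog such as Glue or Acryl rather than returned as a dict.

```python
def run_quality_rules(rows, rules):
    """Apply named column-level quality rules to an iterable of record
    dicts. Each rule is (name, column, predicate); the result maps the
    rule name to the percentage of rows that pass it."""
    results = {}
    for name, column, predicate in rules:
        checked = [predicate(r.get(column)) for r in rows]
        results[name] = 100 * sum(checked) / len(checked) if checked else None
    return results

# Example rules (illustrative only)
rules = [
    ("customer_id not null", "customer_id", lambda v: v is not None),
    ("email has @", "email", lambda v: isinstance(v, str) and "@" in v),
]
```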
Posted 4 days ago
5.0 years
0 Lacs
Coimbatore, Tamil Nadu, India
On-site
Experience: 5+ years. Notice period: Immediate to 15 days. Interview rounds: 3 (virtual). Mandatory skills: Apache Spark, Hive, Hadoop, Scala, Databricks. Job Description The Role Designing and building optimized data pipelines using cutting-edge technologies in a cloud environment to drive analytical insights. Constructing infrastructure for efficient ETL processes from various sources and storage systems. Leading the implementation of algorithms and prototypes to transform raw data into useful information. Architecting, designing, and maintaining database pipeline architectures, ensuring readiness for AI/ML transformations. Creating innovative data validation methods and data analysis tools. Ensuring compliance with data governance and security policies. Interpreting data trends and patterns to establish operational alerts. Developing analytical tools, programs, and reporting mechanisms. Conducting complex data analysis and presenting results effectively. Preparing data for prescriptive and predictive modeling. Continuously exploring opportunities to enhance data quality and reliability. Applying strong programming and problem-solving skills to develop scalable solutions. Requirements Experience with Big Data technologies (Hadoop, Spark, NiFi, Impala). 5+ years of hands-on experience designing, building, deploying, testing, maintaining, monitoring, and owning scalable, resilient, distributed data pipelines. High proficiency in Scala/Java and Spark for applied large-scale data processing. Expertise with big data technologies, including Spark, Data Lake, and Hive.
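The ETL shape this role describes (filter malformed records, then aggregate) can be sketched with plain Python built-ins. The role itself mandates Spark with Scala; this stand-in only shows the transform logic, roughly what `df.groupBy(key).agg(sum(value))` does on a Spark DataFrame, and the record fields are invented for the example.

```python
from collections import defaultdict

def etl_aggregate(records, key, value):
    """Toy ETL transform: drop rows missing the grouping key or holding a
    non-numeric measure, then sum the measure per key."""
    clean = (r for r in records
             if key in r and isinstance(r.get(value), (int, float)))
    totals = defaultdict(float)
    for r in clean:
        totals[r[key]] += r[value]
    return dict(totals)
```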
Posted 4 days ago
5.0 - 8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job description: Job Description Role Purpose The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists. Do Oversee and support the process by reviewing daily transactions on performance parameters Review the performance dashboard and the scores for the team Support the team in improving performance parameters by providing technical support and process guidance Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions Ensure standard processes and procedures are followed to resolve all client queries Resolve client queries as per the SLAs defined in the contract Develop understanding of the process/product for the team members to facilitate better client interaction and troubleshooting Document and analyze call logs to spot the most frequent trends to prevent future problems Identify red flags and escalate serious client issues to the Team Leader in cases of untimely resolution Ensure all product information and disclosures are given to clients before and after the call/email requests Avoid legal challenges by monitoring compliance with service agreements Handle technical escalations through effective diagnosis and troubleshooting of client queries Manage and resolve technical roadblocks/escalations as per SLA and quality requirements If unable to resolve the issues, escalate them to TA & SES in a timely manner Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions Troubleshoot all client queries in a user-friendly, courteous and professional manner Offer alternative solutions to clients (where appropriate) with the objective of retaining customers’ and clients’ business Organize ideas and effectively communicate oral messages appropriate to listeners and situations Follow up and
make scheduled call backs to customers to record feedback and ensure compliance to contract SLAs Build people capability to ensure operational excellence and maintain superior customer service levels of the existing account/client Mentor and guide Production Specialists on improving technical knowledge Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists Develop and conduct trainings (triages) within products for Production Specialists as per target Inform the client about the triages being conducted Undertake product trainings to stay current with product features, changes and updates Enroll in product-specific and any other trainings per client requirements/recommendations Identify and document the most common problems and recommend appropriate resolutions to the team Update job knowledge by participating in self-learning opportunities and maintaining personal networks Deliver No. | Performance Parameter | Measure 1 | Process | No. of cases resolved per day; compliance to process and quality standards; meeting process-level SLAs; Pulse score; customer feedback; NSAT/ESAT 2 | Team Management | Productivity, efficiency, absenteeism 3 | Capability development | Triages completed; Technical Test performance Mandatory Skills: Apache Spark. Experience: 5-8 Years. Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
Posted 4 days ago
3.0 - 6.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Work experience: 3-6 years Budget is 7 Lac Max Notice period: Immediate to 30days. Linux Ø Install, configure, and maintain Linux servers (Red Hat, CentOS, Ubuntu, Amazon Linux). Ø Linux OS through Network and Kick Start Installation Ø Manage system updates, patch management, kernel upgrades. Ø Create and manage user accounts, file systems, permissions, and storage. Ø Write shell scripts (Bash, Python) for task automation. Ø Monitor server performance and troubleshoot hardware/software issues. Ø Handle incident management, root cause analysis, and preventive maintenance. Ø Implement and manage backup solutions (rsync, cron jobs, snapshot backups). Ø Harden servers by configuring firewalls (iptables, firewalld), securing SSH, and managing SELinux. Ø Configure and troubleshoot networking services (DNS, DHCP, FTP, HTTP, NFS, Samba). Ø Work on virtualization and cloud technologies (AWS EC2, VPC, S3, RDS basics if required). Ø Maintain detailed documentation of system configuration and procedures. Ø Implement and configure APACHE & Tomcat web server with open SSL on Linux. Ø SWAP Space Management. Ø LVM (extending, reducing, removing and merging), Backup and Restoration. Amazon Web Services Ø AWS Infrastructure Management : Provision and manage cloud resources like EC2, S3, RDS, VPC, IAM, EKS, Lambda. Ø Cloud Architecture : Design and implement secure, scalable, and reliable cloud solutions. Ø Automation and IaC : Automate deployments using tools like Terraform, CloudFormation, or AWS CDK. Ø Security Management : Configure IAM roles, security groups, encryption (KMS), and enforce best security practices. Ø Monitoring and Optimization : Monitor cloud resources with CloudWatch, X-Ray, and optimize for cost and performance. Ø Backup and Disaster Recovery : Set up data backups (S3, Glacier, EBS snapshots) and design DR strategies. Ø CI/CD Implementation : Build and maintain CI/CD pipelines using AWS services (CodePipeline, CodeBuild) or Jenkins, GitLab,GitHub. 
Ø Networking: Manage VPCs, Subnets, Internet Gateways, NAT, VPNs, Route53 DNS configurations. Ø Troubleshooting and Support: Identify and fix cloud resource issues, perform root cause analysis. Ø Migration Projects: Migrate on-premises servers, databases, and applications to AWS. Windows Server and Azure: Ø Active Directory: Implementation, Migration, Managing and troubleshooting. Ø Deep knowledge of DHCP Server Ø Deep knowledge in patch management Ø Troubleshooting the Windows operating system Ø Decent knowledge in Azure (Creation of VMs, configuring network rules, Migration, Managing and troubleshooting) Ø Deep knowledge in VMware ESXi (Upgrading the server firmware, creation of VMs, Managing backups, monitoring etc.) Networking: Ø Knowledge of IP Addressing, NAT, P2P protocols, SSL and IPsec VPNs etc. Ø Deep knowledge in VPN Ø Knowledge in MVoIP, VMs, SIP PRI and Leased Line. Ø Monitoring the network bandwidth and maintaining stability Ø Configuring switches and routers Ø Troubleshooting network devices Ø Must be able to work on Cisco Meraki Access Point devices Firewall & Endpoint Security: Ø Decent knowledge in Fortinet Firewalls, which includes creating objects, routing, creating rules and monitoring etc. Ø Decent knowledge in CrowdStrike Ø Knowledge in vulnerability assessment Office365 Ø Deep knowledge in Office365 (Creation of mail, backup and archive, security rules, security filters, creation of distribution lists etc.) Ø Knowledge in MX, TXT and other DNS records Ø Deep knowledge in Office365 apps like Teams, Outlook, Excel etc. Ø SharePoint management Other Tasks: Ø Hardware servicing of laptops and desktops Ø Maintaining asset inventory up to date Ø Managing the utility invoices Ø Handling L1 and L2 troubleshooting Ø Vendor management Ø Handling application-related issues Ø Website hosting and monitoring Ø Tracking all software licenses and cloud service renewal periods and ensuring they are renewed on time Ø Monitoring, managing and troubleshooting servers.
Ø Knowledge in NAS Ø Knowledge in EndPoint Central tool and Ticketing tool.
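The "backup solutions (rsync, cron jobs, snapshot backups)" duty in this posting usually includes a retention policy. A minimal sketch of that policy logic, with an assumed `backup-YYYYMMDD.tar.gz` naming scheme (not specified in the posting):

```python
def prune_backups(filenames, keep=7):
    """Return the snapshot files that should be deleted, keeping the
    `keep` most recent. For zero-padded date-stamped names, lexicographic
    order matches chronological order, so a plain sort suffices."""
    ordered = sorted(filenames, reverse=True)  # newest first
    return ordered[keep:]                      # everything past the cutoff
```

A cron job would call this against the backup directory and `os.remove` each returned file.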
Posted 4 days ago
0.0 - 1.0 years
0 Lacs
Bengaluru, Karnataka
On-site
We're Hiring: GCP DevOps Engineer (with Node.js Skills) Locations: Bengaluru / Chennai / Pune / Hyderabad / Vadodara (On-site/Hybrid as per role) Positions Available: 3 Employment Type: Full-time Salary: ₹10–14 LPA (Based on experience and interview performance) About the Role: We are looking for passionate and curious GCP DevOps Engineers who are comfortable working in dynamic environments and love combining DevOps best practices with backend development. If you have 1–3 years of hands-on experience, basic knowledge of Node.js , and a solid grip on GCP, Kubernetes, and Git , this could be the perfect role to elevate your career. What You’ll Be Doing: Deploy, manage, and monitor cloud infrastructure on Google Cloud Platform (GCP) Work with Kubernetes to orchestrate containerized applications Collaborate with developers to integrate Node.js -based services and APIs Handle Kafka messaging pipelines (consumers & producers) Manage PostgreSQL databases (schema design, queries, performance tuning) Utilize Git and GitHub for version control, code reviews, and CI workflows Use VS Code or similar IDEs for development and troubleshooting Troubleshoot issues independently and ensure smooth deployment cycles Collaborate effectively in distributed teams and maintain clear documentation Minimum Qualifications: Bachelor’s degree in Computer Science, Engineering, or equivalent practical experience 1–3 years of hands-on experience in software development or DevOps engineering Key Skills We’re Looking For: Google Cloud Platform (GCP) services Kubernetes and containerization tools Basic to intermediate Node.js development (especially REST APIs/backend services) Apache Kafka (publishing/consuming messages) PostgreSQL or similar RDBMS Git, GitHub, and collaborative workflows Excellent troubleshooting, problem-solving, and team collaboration skills Good to Have: Experience with CI/CD pipelines (e.g., Jenkins, GitHub Actions) Familiarity with Agile/Scrum methodologies Exposure to 
observability tools (Prometheus, Grafana, ELK, etc.) Why Join Us? Work on impactful, production-grade cloud solutions Collaborate with highly skilled teams across geographies Gain experience across cutting-edge DevOps stacks Fast-paced, learning-rich environment with room to grow Job Types: Full-time, Permanent Pay: ₹1,000,000.00 - ₹1,400,000.00 per year Schedule: Day shift Monday to Friday Ability to commute/relocate: Bengaluru, Karnataka: Reliably commute or planning to relocate before starting work (Required) Education: Bachelor's (Required) Experience: Google Cloud Platform: 2 years (Required) Kubernetes: 1 year (Required) Node.js: 1 year (Preferred) Work Location: In person
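The "Kafka messaging pipelines (consumers & producers)" responsibility above hinges on offset management. This broker-free sketch (in Python, standing in for the Node.js services the role names) shows at-least-once consumption: the committed offset only advances after a message is processed, so a crash replays the in-flight message instead of losing it.

```python
def consume(messages, process, committed_offset=0):
    """At-least-once consumer loop over an in-memory 'topic' (a list).
    Resumes from committed_offset and returns the new committed offset."""
    for offset, msg in enumerate(messages[committed_offset:], start=committed_offset):
        process(msg)                 # handle the message first...
        committed_offset = offset + 1  # ...then commit its offset
    return committed_offset
```

With a real broker the same shape appears as poll/process/commit against the consumer-group API.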
Posted 4 days ago
9.0 - 14.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Greetings from TCS!!! TCS is hiring for Big Data Architect Location - PAN India Years of Experience - 9-14 years Job Description - Experience with Python, Spark, and Hive data pipelines using ETL processes Apache Hadoop development and implementation Experience with streaming frameworks such as Kafka Hands-on experience in Azure/AWS/Google data services Work with big data technologies (Spark, Hadoop, BigQuery, Databricks) for data preprocessing and feature engineering.
Posted 4 days ago
6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About Client: Our Client is a global IT services company headquartered in Southborough, Massachusetts, USA. Founded in 1996, with revenue of $1.8B and 35,000+ associates worldwide, it is a digital engineering and IT services company helping clients modernize their technology infrastructure, adopt cloud and AI solutions, and accelerate innovation. It partners with major firms in banking, healthcare, telecom, and media. Our Client is known for combining deep industry expertise with agile development practices, enabling scalable and cost-effective digital transformation. The company operates in over 50 locations across more than 25 countries, has delivery centers in Asia, Europe, and North America, and is backed by Baring Private Equity Asia. Job Title: Data Engineer Key Skills: Python, ETL, Snowflake, Apache Airflow Job Locations: Pan India Experience: 6-7 years Education Qualification: Any Graduation Work Mode: Hybrid Employment Type: Contract Notice Period: Immediate Job Description: 6 to 10 years of experience in data engineering roles with a focus on building scalable data solutions. Proficiency in Python for ETL, data manipulation, and scripting. Hands-on experience with Snowflake or equivalent cloud-based data warehouses. Strong knowledge of orchestration tools such as Apache Airflow or similar. Expertise in implementing and managing messaging queues like Kafka, AWS SQS, or similar. Demonstrated ability to build and optimize data pipelines at scale, processing terabytes of data. Experience in data modeling, data warehousing, and database design. Proficiency in working with cloud platforms like AWS, Azure, or GCP. Strong understanding of CI/CD pipelines for data engineering workflows. Experience working in an Agile development environment, collaborating with cross-functional teams. Preferred Skills: Familiarity with other programming languages like Scala or Java for data engineering tasks.
Knowledge of containerization and orchestration technologies (Docker, Kubernetes). Experience with stream processing frameworks like Apache Flink. Experience with Apache Iceberg for data lake optimization and management. Exposure to machine learning workflows and integration with data pipelines. Soft Skills: Strong problem-solving skills with a passion for solving complex data challenges. Excellent communication and collaboration skills to work with cross-functional teams. Ability to thrive in a fast-paced, innovative environment.
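The core of what an orchestrator like Airflow (named in this posting) provides is running tasks in dependency order. A minimal sketch of that resolution step, with invented task names and no Airflow dependency:

```python
def topo_order(deps):
    """Resolve a DAG of task dependencies (task -> set of upstream tasks)
    into a runnable order via depth-first traversal; raises on cycles."""
    order, done = [], set()

    def visit(task, seen=frozenset()):
        if task in done:
            return
        if task in seen:
            raise ValueError("cycle involving " + task)
        for up in deps.get(task, ()):
            visit(up, seen | {task})  # upstream tasks run first
        done.add(task)
        order.append(task)

    for t in deps:
        visit(t)
    return order
```

In Airflow itself the same graph is declared with operators and `extract >> transform >> load`; the scheduler performs this ordering for you.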
Posted 4 days ago
0 years
0 Lacs
Gurgaon, Haryana, India
On-site
At Aramya, we’re redefining fashion for India’s underserved Gen X/Y women, offering size-inclusive, comfortable, and stylish ethnic wear at affordable prices. Launched in 2024, we’ve already achieved ₹40 Cr in revenue in our first year, driven by a unique blend of data-driven design, in-house manufacturing, and a proprietary supply chain. Today, with an ARR of ₹100 Cr, we’re scaling rapidly with ambitious growth plans for the future. Our vision is bold: to build the most loved fashion and lifestyle brands across the world while empowering individuals to express themselves effortlessly. Backed by marquee investors like Accel and Z47, we’re on a mission to make high-quality ethnic wear accessible to every woman. We’ve built a community of loyal customers who love our weekly design launches, impeccable quality, and value-for-money offerings. With a fast-moving team driven by creativity, technology, and customer obsession, Aramya is more than a fashion brand—it’s a movement to celebrate every woman’s unique journey. We’re looking for a passionate Data Engineer with a strong technical foundation. The ideal candidate should have a solid understanding of D2C or e-commerce platforms and be able to work across the stack to build high-performing, user-centric digital experiences. Key Responsibilities Design, build, and maintain scalable ETL/ELT pipelines using tools like Apache Airflow, Databricks, and Spark. Own and manage data lakes/warehouses on AWS Redshift (or Snowflake/BigQuery). Optimize SQL queries and data models for analytics, performance, and reliability. Develop and maintain backend APIs using Python (FastAPI/Django/Flask) or Node.js. Integrate external data sources (APIs, SFTP, third-party connectors) and ensure data quality & validation. Implement monitoring, logging, and alerting for data pipeline health. Collaborate with stakeholders to gather requirements and define data contracts. Maintain infrastructure-as-code (Terraform/CDK) for data workflows and services.
Must-Have Skills Strong in SQL and data modeling (OLTP and OLAP). Solid programming experience in Python, preferably for both ETL and backend. Hands-on experience with Databricks, Redshift, or Spark. Experience with building and managing ETL pipelines using tools like Airflow, dbt, or similar. Deep understanding of REST APIs, microservices architecture, and backend design patterns. Familiarity with Docker, Git, and CI/CD pipelines. Good grasp of cloud platforms (preferably AWS) and services like S3, Lambda, ECS/Fargate, CloudWatch. Nice-to-Have Skills Exposure to streaming platforms like Kafka, Kinesis, or Flink. Experience with Snowflake, BigQuery, or Delta Lake. Proficient in data governance, security best practices, and PII handling. Familiarity with GraphQL, gRPC, or event-driven systems. Knowledge of data observability tools (Monte Carlo, Great Expectations, Datafold). Experience working in a D2C/eCommerce or analytics-heavy product environment.
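The "monitoring, logging, and alerting for data pipeline health" responsibility in this posting often starts with a freshness check. A minimal sketch using epoch-second timestamps; the source names and SLA are invented for the example:

```python
def stale_sources(last_loaded, now, max_age_hours=24):
    """Freshness check for pipeline-health alerting: given a mapping of
    source name -> epoch seconds of its last successful load, return the
    sources whose data is older than the SLA, sorted for stable output."""
    return sorted(s for s, ts in last_loaded.items()
                  if (now - ts) / 3600 > max_age_hours)
```

A scheduler would run this periodically and page (or post to Slack) when the returned list is non-empty.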
Posted 4 days ago
0 years
0 Lacs
Navi Mumbai, Maharashtra, India
On-site
Job Title: Senior Java Developer Location: Airoli, Mumbai (Onsite) Industry: BFSI / Fintech About the Role We are looking for a highly skilled and passionate Senior Java Developer with strong hands-on experience in developing scalable, high-performance applications. You will play a critical role in building low-latency, high-throughput systems focused on risk and fraud monitoring in the BFSI domain. Key Responsibilities Design and develop microservices using Java (latest versions), Spring Boot, and RESTful APIs Build robust data streaming solutions using Apache Flink and Kafka Implement business rules using the Drools rule engine Contribute to the development of low-latency, high-throughput platforms for fraud detection and risk monitoring Participate in Agile development, code reviews, and CI/CD pipelines with a strong focus on Test-Driven Development (TDD) Debug complex issues and take full ownership from design to deployment Collaborate with cross-functional teams and participate in cloud-native development using AWS (IaaS/PaaS) Required Skills Java, Spring Boot, REST APIs, Virtual Threads Apache Flink, Kafka – real-time data stream processing Drools rule engine Strong grasp of J2EE, OOP principles, and design patterns Experience working with CI/CD tools, Git, Quay, TDD Familiarity with cloud-native solutions, especially in AWS environments Preferred Experience BFSI / Fintech domain experience in building risk and fraud monitoring applications Exposure to Agile methodology and tools like JIRA Solid communication skills and strong sense of ownership
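A typical fraud-monitoring rule of the kind this posting describes (evaluated in real time by a Drools/Flink job) is a velocity check: flag an account that transacts too often within a time window. A Python sketch of that rule alone, with invented thresholds; the production system would express this as a Drools rule or Flink windowed operator in Java:

```python
from collections import deque

class VelocityRule:
    """Streaming velocity check: flag an account when more than `limit`
    transactions arrive within a sliding `window_s`-second window."""

    def __init__(self, limit=3, window_s=60):
        self.limit, self.window_s = limit, window_s
        self.seen = {}  # account -> deque of recent timestamps

    def check(self, account, ts):
        q = self.seen.setdefault(account, deque())
        while q and ts - q[0] > self.window_s:
            q.popleft()          # evict timestamps outside the window
        q.append(ts)
        return len(q) > self.limit  # True => raise a fraud alert
```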
Posted 4 days ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Req ID: 26743 Includes the following essential duties and responsibilities (other duties may also be assigned): Supermicro seeks a qualified QA Manager with hands-on experience to create and enforce quality standards for web-based products. As a QA manager, you will leverage your expert technical knowledge and past implementation experience in developing processes and standards to build our new cloud solution based on the latest industry cloud software development technologies such as the LAMP stack (Linux, Apache, Python, MySQL, etc.). You will develop comprehensive test plans, strategies, and schedules for enterprise-scale requirements and lead their initial adoption across various test cases. You’ll be responsible for managing the lab hardware and quality processes. You will need an excellent understanding of infrastructure operations, tools, and patterns used in an agile development continuous delivery environment. Monitor and report using test tools for automated, manual and regression tests Knowledgeable of VDBench and Jenkins Skilled in understanding and deploying cloud technologies About Supermicro Supermicro® is a Top Tier provider of advanced server, storage, and networking solutions for Data Center, Cloud Computing, Enterprise IT, Hadoop/Big Data, Hyperscale, HPC and IoT/Embedded customers worldwide. We are the #5 fastest growing company among the Silicon Valley Top 50 technology firms. Our unprecedented global expansion has provided us with the opportunity to offer a large number of new positions to the technology community. We seek talented, passionate, and committed engineers, technologists, and business leaders to join us.
Job Summary Monitor and report using test tools for automated, manual and regression test Knowledgeable of VDBench and Jenkins Skilled in understanding and deploying cloud technologies Qualifications Education and/or Experience: BS/MS EE, CE, ME 5+ years of quality assurance expertise Experience with Agile development tools (Redmine, Git) Confident presenter, and strong influencer; able to adapt level and style to the audience EEO Statement Supermicro is an Equal Opportunity Employer and embraces diversity in our employee population. It is the policy of Supermicro to provide equal opportunity to all qualified applicants and employees without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, protected veteran status or special disabled veteran, marital status, pregnancy, genetic information, or any other legally protected status.
Posted 4 days ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
5 years of experience as a Data Engineer or in a similar role. Bachelor’s degree in Computer Science, Data Engineering, Information Technology, or a related field. Strong knowledge of data engineering tools and technologies (e.g., SQL, ETL, data warehousing). Experience with data pipeline frameworks and data processing platforms (e.g., Apache Kafka, Apache Spark). Proficiency in programming languages such as Python, Java, or Scala. Experience with cloud platforms (e.g., AWS, Google Cloud Platform, Azure). Knowledge of data modeling, database design, and data governance. MongoDB is a must.
Posted 4 days ago
6.0 - 8.0 years
0 Lacs
Kochi, Kerala, India
On-site
6 - 8 years of experience as a full stack Java developer Expertise in UI frameworks like AngularJS, NodeJS, ReactJS, Bootstrap Must have Apache Camel hands-on experience Experience with front-end technologies using HTML5, CSS3, Bootstrap & SASS Experience in JavaScript/TypeScript frameworks such as ReactJs, NodeJs, and AngularJs In-depth experience in responsive web design and development Hands-on experience on the Linux/Unix platform with knowledge of day-to-day routine commands Java, SOA and Web Services (REST/SOAP) required Knowledge of DevOps processes with enterprise architecture Experience in Java 8, Spring, Spring Boot, Microservices, ORM tools, and cloud technologies Experience of Java microservices architecture Experience with designing, implementing, and deploying microservices in distributed systems Good to have knowledge and experience of deploying applications in AWS Cloud using Jenkins, Docker Strong knowledge and experience with SQL queries and databases like PostgreSQL / SQL Server Familiarity with a source control system (GitHub, SVN, etc.) Experience in unit testing code with Jest / Enzyme / Jasmine / Mocha / Chai is desired Experience in agile delivery and tools like Jira
Posted 4 days ago
7.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
TEKsystems is seeking a Senior AWS + Data Engineer to join our dynamic team. The ideal candidate should have Data Engineer + Hadoop + Scala/Python expertise with AWS services. This role involves designing, developing, and maintaining scalable and reliable software solutions. Job Title: Data Engineer – Spark/Scala (Batch Processing) Location: Manyata - Hybrid Experience: 7+ yrs Type: Full-Time Mandatory Skills: 7-10 years’ experience in design, architecture or development in Analytics and Data Warehousing. Experience in building end-to-end solutions with the Big Data platform, Spark or Scala programming. 5 years of solid experience in ETL pipeline building with the Spark or Scala programming framework, with knowledge of developing UNIX shell scripts and Oracle SQL/PL-SQL. Experience in Big Data platform ETL development with the AWS cloud platform. Proficiency in AWS cloud services, specifically EC2, S3, Lambda, Athena, Kinesis, Redshift, Glue, EMR, DynamoDB, IAM, Secrets Manager, Step Functions, SQS, SNS, CloudWatch. Excellent skills in Python-based framework development are mandatory. Should have experience with Oracle SQL database programming, SQL performance tuning, and relational model analysis. Extensive experience with Teradata data warehouses and Cloudera Hadoop. Proficient across Enterprise Analytics/BI/DW/ETL technologies such as Teradata Control Framework, Tableau, OBIEE, SAS, Apache Spark, Hive. Analytics & BI architecture appreciation and broad experience across all technology disciplines. Experience in working within a Data Delivery Life Cycle framework & Agile methodology. Extensive experience in large enterprise environments handling large volumes of datasets with high SLAs. Good knowledge of developing UNIX scripts, Oracle SQL/PL-SQL, and Autosys JIL scripts. Well versed in AI-powered engineering tools like Cline and GitHub Copilot. Please send resumes to nvaseemuddin@teksystems.com or kebhat@teksystems.com
Posted 4 days ago
0.0 - 1.0 years
0 Lacs
Mumbai, Maharashtra
On-site
Job Information Industry IT Services Date Opened 06/16/2025 Job Type Software Engineering Work Experience 0-1 years City Mumbai State/Province Maharashtra Country India Zip/Postal Code 400080 Job Description What we want: We are looking for an Intern DevOps Engineer who has good experience with Linux and exposure to DevOps tools. Who we are: Vertoz (NSEI: VERTOZ), an AI-powered MadTech and CloudTech Platform offering Digital Advertising, Marketing and Monetization (MadTech) & Digital Identity, and Cloud Infrastructure (CloudTech), caters to Businesses, Digital Marketers, Advertising Agencies, Digital Publishers, Cloud Providers, and Technology companies. For more details, please visit our website. What you will do: Linux - be comfortable with the command line (preferably on Ubuntu; completion of a course will be an advantage) Possess knowledge of AWS or an equivalent cloud services provider Virtualization (KVM, VMware, or VirtualBox) Knowledge of networking (OSI, basic troubleshooting, Internet services) Knowledge of web technologies like Redis, Apache Tomcat, or Apache Web Server Should know an SQL-based DB (MySQL, MariaDB, or PostgreSQL) Must be self-driven and able to follow and execute instructions specified in user guides Knowledge of Jenkins, Ansible/Chef/Puppet, Git, and Docker preferred Must be able to document activities, procedures, etc. Requirements BE or BSc in CS/IT, ME in CS, or MSc in CS/IT Linux (RHCE/RHCSA) certification is a must Mumbai candidates only Willing to work in a 24x7 environment Benefits No dress codes Flexible working hours 5 days working 24 annual leaves International presence Celebrations Team outings
Posted 4 days ago
5.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
QA Manager Date: Jun 16, 2025 Location: Chennai, Tamil Nadu, IN Company: Super Micro Computer Job Req ID: 26743 Includes the following essential duties and responsibilities (other duties may also be assigned): Supermicro seeks a qualified QA manager with hands-on experience to create and enforce quality processes for web-based products. As a QA manager, you will leverage your expert technical knowledge and past implementation experience in developing processes and standards to build our new cloud solution based on the latest industry cloud software development technologies such as the LAMP stack (Linux, Apache, Python, MySQL, etc.). You will develop comprehensive test plans, strategies, and schedules for enterprise-scale requirements and lead their initial adoption across various test cases. You’ll be responsible for managing the lab hardware and quality processes. You will need an excellent understanding of infrastructure operations, tools, and patterns used in an agile development continuous delivery environment. Monitor and report using test tools for automated, manual, and regression tests Knowledgeable of VDBench and Jenkins Skilled in understanding and deploying cloud technologies About Supermicro: Supermicro® is a Top Tier provider of advanced server, storage, and networking solutions for Data Center, Cloud Computing, Enterprise IT, Hadoop/Big Data, Hyperscale, HPC and IoT/Embedded customers worldwide. We are the #5 fastest growing company among the Silicon Valley Top 50 technology firms. Our unprecedented global expansion has provided us with the opportunity to offer a large number of new positions to the technology community. We seek talented, passionate, and committed engineers, technologists, and business leaders to join us. 
Job Summary: Monitor and report using test tools for automated, manual, and regression tests Knowledgeable of VDBench and Jenkins Skilled in understanding and deploying cloud technologies Qualifications: Education and/or Experience: BS/MS EE, CE, ME 5+ years of quality assurance expertise Experience with Agile development tools (Redmine, Git) Confident presenter and strong influencer; able to adapt level and style to the audience EEO Statement Supermicro is an Equal Opportunity Employer and embraces diversity in our employee population. It is the policy of Supermicro to provide equal opportunity to all qualified applicants and employees without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, protected veteran status or special disabled veteran, marital status, pregnancy, genetic information, or any other legally protected status.
Posted 4 days ago
0.0 - 6.0 years
0 Lacs
Delhi, Delhi
Remote
Apache Superset Data Engineer Experience: 3 - 6 years Bhubaneswar, Delhi - NCR, Remote Working About the Job The Apache Superset Data Engineer plays a key role in designing, developing, and maintaining scalable data pipelines and analytics infrastructure, with a primary emphasis on data visualization and dashboarding using Apache Superset. This role sits at the intersection of data engineering and business intelligence, enabling stakeholders to access accurate, actionable insights through intuitive dashboards and reports. Core Responsibilities Create, customize, and maintain interactive dashboards in Apache Superset to support KPIs, experimentation, and business insights Work closely with analysts, BI teams, and business users to gather requirements and deliver effective Superset-based visualizations Perform data validation, feature engineering, and exploratory data analysis to ensure data accuracy and integrity Analyze A/B test results and deliver insights that inform business strategies Establish and maintain standards for statistical testing, data validation, and analytical workflows Integrate Superset with various database systems (e.g., MySQL, PostgreSQL) and manage associated drivers and connections Ensure Superset deployments are secure, scalable, and high-performing Clearly communicate findings and recommendations to both technical and non-technical stakeholders Required Skills Proven expertise in building dashboards and visualizations using Apache Superset Strong command of SQL and experience working with relational databases like MySQL or PostgreSQL Proficiency in Python (or Java) for data manipulation and workflow automation Solid understanding of data modelling, ETL/ELT pipelines, and data warehousing principles Excellent problem-solving skills and a keen eye for data quality and detail Strong communication skills, with the ability to simplify complex technical concepts for non-technical audiences Nice to have: familiarity with cloud platforms 
(AWS, ECS). Qualifications Bachelor’s degree in Computer Science, Engineering, or a related field 3+ years of relevant experience
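Since the responsibilities above include analyzing A/B test results, a standard check is the two-proportion z-test; this is a generic, stdlib-only sketch (the conversion counts are made up for illustration, and a real workflow would also handle sample-size and multiple-testing concerns):

```python
import math

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Return (z statistic, two-sided p-value) for conversion rates A vs B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled proportion under the null hypothesis that both rates are equal.
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF, computed via erf.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Variant B converts 165/2400 vs A's 120/2400 (illustrative numbers).
z, p = two_proportion_z(conv_a=120, n_a=2400, conv_b=165, n_b=2400)
print(round(z, 2), round(p, 4))
```

A p-value below the chosen significance level (commonly 0.05) would support shipping variant B.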
Posted 4 days ago
5.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
IT Full-Time Job ID: DGC00739 Chennai, Tamil Nadu 1-4 Yrs ₹1.5 - ₹5 Yearly Job description React JS / React Native Developer Job Summary: We are seeking a highly skilled and motivated React Developer with experience in real-time data integration, visualizations, and charting libraries such as Apache ECharts. As part of our team, you will work on building interactive and high-performance web applications that handle real-time data and display dynamic charts. Knowledge of D3.js is a plus, and experience with React Native is an added advantage for expanding to mobile platforms. Key Responsibilities: Develop and maintain responsive and performant React applications that integrate real-time data. Use Apache ECharts (or similar charting libraries) to build interactive and dynamic charts that visualize real-time data. Implement real-time data streaming using WebSockets or REST APIs to update visualizations and ensure smooth, live data feeds. Work closely with UI/UX designers to ensure data visualizations and user interfaces are intuitive and user-friendly. Ensure the performance and scalability of applications handling large volumes of real-time data. Utilize D3.js (as an additional skill) for advanced data visualizations and custom chart components. Collaborate with backend developers to design and integrate APIs for real-time data synchronization. Optimize application performance for both desktop and mobile environments. Maintain code quality by writing tests, debugging issues, and performing code reviews. Stay up-to-date with the latest React development practices, real-time technologies, and charting libraries. Requirements: At least 5 years of experience in React development with a strong focus on front-end technologies. Strong knowledge of JavaScript (ES6+), React, and modern web development practices. Proven experience integrating real-time data using WebSockets or REST APIs. 
Hands-on experience with Apache ECharts (or similar charting libraries like Chart.js or Highcharts) to build interactive charts. Familiarity with charting libraries for creating custom data visualizations and charts. Proficiency in state management with tools such as Redux, Context API, or similar. Experience with version control systems, primarily Git. Strong problem-solving and debugging skills to work with real-time data and dynamic UI. Familiarity with responsive design principles and building mobile-first applications. Strong communication skills and the ability to work in a collaborative team environment. Preferred Qualifications: Experience with React Native for building cross-platform mobile applications. Familiarity with Agile/Scrum methodologies. Knowledge of build tools like Webpack, Babel, and NPM/Yarn for front-end development. Experience with CI/CD pipelines and automated testing. Familiarity with cloud platforms (AWS, Azure, Google Cloud) and hosting services. Basic understanding of backend technologies and APIs to assist with integration.

At Aetram, we are looking for a creative senior digital marketing executive to oversee our digital strategy and increase brand awareness. The ideal applicant should have extensive experience in brand marketing and the ability to create, organise, and manage digital campaigns across many platforms.
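One common way to meet the "large volumes of real-time data" performance requirement above is to downsample a series before handing it to the charting layer, so the chart draws far fewer points while keeping the overall shape. The real implementation would live in the React/ECharts frontend; this plain-Python sketch only illustrates the bucket-mean idea:

```python
def downsample(points, buckets: int):
    """Reduce a long numeric series to `buckets` bucket-averages.

    Short series are returned unchanged; longer ones are split into
    roughly equal buckets and each bucket is replaced by its mean."""
    if len(points) <= buckets:
        return points
    size = len(points) / buckets
    out = []
    for i in range(buckets):
        chunk = points[int(i * size):int((i + 1) * size)]
        out.append(sum(chunk) / len(chunk))
    return out

# A 1,000-point live feed reduced to 50 points for rendering.
stream = [float(i % 10) for i in range(1000)]
print(len(downsample(stream, 50)))  # 50
```

ECharts itself also offers built-in series sampling for large datasets; a hand-rolled downsampler like this is useful when you want to thin the data before it ever reaches the chart.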
Posted 4 days ago
0.0 - 7.0 years
0 Lacs
Bengaluru, Karnataka
Remote
Bengaluru, Karnataka, India Customer Success & Support Full-time Ref ID: 3125726 Our Mission At Palo Alto Networks® everything starts and ends with our mission: Being the cybersecurity partner of choice, protecting our digital way of life. Our vision is a world where each day is safer and more secure than the one before. We are a company built on the foundation of challenging and disrupting the way things are done, and we’re looking for innovators who are as committed to shaping the future of cybersecurity as we are. Who We Are We take our mission of protecting the digital way of life seriously. We are relentless in protecting our customers and we believe that the unique ideas of every member of our team contribute to our collective success. Our values were crowdsourced by employees and are brought to life through each of us every day - from disruptive innovation and collaboration, to execution. From showing up for each other with integrity to creating an environment where we all feel included. As a member of our team, you will be shaping the future of cybersecurity. We work fast, value ongoing learning, and we respect each employee as a unique individual. Knowing we all have different needs, our development and personal wellbeing programs are designed to give you choice in how you are supported. This includes our FLEXBenefits wellbeing spending account with over 1,000 eligible items selected by employees, our mental and financial health resources, and our personalized learning opportunities - just to name a few! At Palo Alto Networks, we believe in the power of collaboration and value in-person interactions. This is why our employees generally work full time from our office with flexibility offered where needed. This setup fosters casual conversations, problem-solving, and trusted relationships. Our goal is to create an environment where we all win with precision. 
Your Career The Engineering TAC (ETAC) Advanced Solutions team is an exciting crossroads between the Technical Assistance Center (TAC) and Engineering. This team is uniquely empowered to drive decisions and to be the thought leaders within the Global Customer Support organization at Palo Alto Networks. We are a relatively small, global team consisting of top performers with support, engineering, and development backgrounds. Our roles are very hands-on and have a high impact on the company. The Advanced Solutions team role also includes building/architecting robust environments to assist with complex issue reproduction/resolution, as well as large-scale, cross-platform lab buildouts for feature testing, software releases, etc. Our team consists of engineers who are experienced in Network Engineering, NetSec, QA, Software Development/DevOps, and Cloud, as well as SMEs in bleeding-edge tools such as Ixia/Keysight and Spirent. The team's mission includes Application and Tools Development, AI/Machine Learning R&D, DB Systems Administration, Release Management, and Data Analytics. You will network and collaborate with key stakeholders within Global Support, Engineering, QA, PM, Sales, and more, leveraging your capability of detailing difficult technical issues to both non-technical and technical professionals. Your Impact An ETAC engineer has the highest level of expertise amongst support teams, and is responsible for staying up to date with technical details on Palo Alto Networks' new products and the industry in general Work with TAC to provide expert-level technical support of customer issues that involve very complex network topologies, architectures, and security designs Lead technical discussions with cross-functional teams, fostering an environment of transparency that ultimately leads to better products. 
Develop advanced troubleshooting focused tools and scripts to help solve complex customer issues and improve product supportability Help drive and enable ML/AI related projects Own critical and executive level issues, partnering primarily with Customer Support and Engineering to provide expertise in identifying and resolving customer issues, which entails working with the TAC case owner and Engineering on a replication or verification and communicating updates Lead in identifying problems and taking action to fix them across support and product life cycles Develop and deliver expert level training materials for TAC support, Engineering, and Professional Services teams Ownership of Release Management: Assist with managing the end-to-end release process, including coordinating with various teams to gather release requirements and dependencies. Responsible for scheduling, planning, and controlling the software delivery process for on-prem and cloud products (CSP/Adminsite/AWS/Azure/OCI/GCP) Coordinate with IT/Dev/QA to ensure IT requirements are met for a seamless release process Release software after completing the testing/deployment stage Define strategic usage of release management tools (Autoex/Jenkins/Automation Staging Scripts) Collaborate on product development with cross-functional teams including Engineering/QA/PM Triage production issues impacting customer deliverables on the Palo Alto Networks Support Portal Your Experience Minimum of 7 years of professional experience Technical Support or Development experience supporting enterprise customers with very complex LAN/WAN environments Deep understanding of TCP/IP and advanced knowledge of LAN/WAN technologies, expertise with general routing/switching, Routing protocols (e.g. 
BGP, OSPF, Multicast), branch and DataCenter Architectures Expertise with Remote Access VPN solutions, IPSEC, PKI & SSL Expertise with Cloud services and Infrastructure a plus Familiarity with C, Python, or at least one scripting language - While this is not a role to be a developer, one should have some experience in automating moderately complex tasks. Experience with Palo Alto Networks products is highly desired Understand how data packets get processed - Devices shouldn’t be a “black box”; one should have an understanding of packet processing at various stages and how that can result in different symptoms/outcomes. Excellent communication skills with the ability to deliver highly technical, informative presentations - While you will not be involved with taking calls from a queue, there may be cases where your expertise is called upon to speak with customers from time to time, along with Support members, Developers, Sales Engineers, and the rest of your team Proficiency in creating technical documentation using applications such as PowerPoint/Google Slides or knowledge-base/intranet platforms such as LumApps, Jive, or Confluence Familiar with automation with Jenkins, Terraform, etc. Understanding of Linux operating systems. Able to operate headless Linux systems and shell scripting. Basic knowledge of deploying and configuring web servers, e.g., Nginx, Apache, IIS. Understanding of load balancing technologies and HTTP forwarding with Nginx, HAProxy, and load balancers provided by AWS, Azure, and Google Cloud. Familiarity with virtualization technologies including VMware, KVM, OpenStack, AWS, Google Cloud, and Azure. Familiarity with Docker. Able to create, manage, and deploy Docker images on a Docker server. Manage running containers. Create docker-compose YAML files. Familiar with front-end technologies including JavaScript, React, HTML, and CSS for building responsive, user-friendly interfaces. 
Experienced in back-end development using Python frameworks such as Flask Brings a creative and hands-on approach to testing and enhancing small applications, participating in all aspects of the testing lifecycle—from functional and performance testing to idea generation and continuous monitoring—with a focus on improvement and efficacy to ensure optimal quality and user satisfaction. Willing to work flexible times including occasional weekends and evenings. The Team Our technical support team is critical to our success and mission. As part of this team, you enable customer success by providing support to clients after they have purchased our products. Our dedication to our customers doesn’t stop once they sign – it evolves. As threats and technology change, we stay in step to accomplish our mission. You’ll be involved in implementing new products, transitioning from old products to new, and will fix integrations and critical issues as they are raised – in fact, you’ll seek them out to ensure our clients are safely supported. We fix and identify technical problems, with a pointed focus on providing the best customer support in the industry. Our Commitment We’re problem solvers that take risks and challenge cybersecurity’s status quo. It’s simple: we can’t accomplish our mission without diverse teams innovating, together. We are committed to providing reasonable accommodations for all qualified individuals with a disability. If you require assistance or accommodation due to a disability or special need, please contact us at accommodations@paloaltonetworks.com. Palo Alto Networks is an equal opportunity employer. 
We celebrate diversity in our workplace, and all qualified applicants will receive consideration for employment without regard to age, ancestry, color, family or medical care leave, gender identity or expression, genetic information, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran status, race, religion, sex (including pregnancy), sexual orientation, or other legally protected characteristics. All your information will be kept confidential according to EEO guidelines.
Posted 4 days ago
10.0 years
0 Lacs
Kerala, India
On-site
Senior Data Engineer – AWS Expert (Lead/Associate Architect Level) 📍 Location: Trivandrum or Kochi (On-site/Hybrid) Experience: 10+ Years (relevant AWS experience of 5+ years is mandatory) About the Role We’re hiring a Senior Data Engineer with deep expertise in AWS services, strong hands-on experience in data ingestion, quality, and API development, and the leadership skills to operate at a Lead or Associate Architect level. This role demands a high level of technical ownership, especially in architecting scalable, reliable data pipelines and robust API integrations. You’ll collaborate with cross-functional teams across geographies, so a willingness to work night shifts overlapping with US hours (till 10 AM IST) is essential. Key Responsibilities Data Engineering Leadership: Design and implement scalable, end-to-end data ingestion and processing frameworks using AWS. AWS Architecture: Hands-on development using AWS Glue, Lambda, EMR, Step Functions, S3, ECS, and other AWS services. Data Quality & Validation: Build automated checks, validation layers, and monitoring for ensuring data accuracy and integrity. API Development: Develop secure, high-performance REST APIs for internal and external data integration. Collaboration: Work closely with product, analytics, and DevOps teams across geographies. Participate in Agile ceremonies and CI/CD pipelines using tools like GitLab. What We’re Looking For Experience: 5+ years in Data Engineering, with a proven track record in designing scalable AWS-based data systems. Technical Mastery: Proficient in Python/PySpark, SQL, and building big data pipelines. AWS Expert: Deep knowledge of core AWS services used for data ingestion and processing. API Expertise: Experience designing and managing scalable APIs. Leadership Qualities: Ability to work independently, lead discussions, and drive technical decisions. Preferred Qualifications Experience with Kinesis, Firehose, SQS, and data lakehouse architectures. 
Exposure to tools like Apache Iceberg, Aurora, Redshift, and DynamoDB. Prior experience in distributed, multi-cluster environments. Working Hours US Time Zone Overlap Required: Must be available to work night shifts overlapping with US hours (up to 10:00 AM IST). Work Location Trivandrum or Kochi – On-site or hybrid options available for the right candidate.
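The "Data Quality & Validation" responsibility above can be sketched as a small rule-based check layer; this is a generic, stdlib-only illustration (the field names and rules are assumptions, not the actual pipeline's schema):

```python
# Each rule maps a required field to a predicate the value must satisfy.
# These fields/rules are illustrative only.
RULES = {
    "order_id": lambda v: isinstance(v, str) and len(v) > 0,
    "amount":   lambda v: isinstance(v, (int, float)) and v >= 0,
    "country":  lambda v: isinstance(v, str) and len(v) == 2,
}

def validate(record: dict) -> list:
    """Return the names of failed rules for one record (empty list = clean)."""
    return [field for field, ok in RULES.items()
            if field not in record or not ok(record[field])]

good = {"order_id": "A1", "amount": 10.0, "country": "IN"}
bad  = {"order_id": "",   "amount": -5,   "country": "India"}
print(validate(good), validate(bad))  # [] ['order_id', 'amount', 'country']
```

In an AWS pipeline, checks of this shape typically run inside a Glue or Lambda step, with failing records routed to a quarantine location and failure counts pushed to CloudWatch-style monitoring.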
Posted 4 days ago
11.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Contract Duration: 12 Months Location: Pune Preferred Experience: 7–11 Years 🔧 Key Skills Required AEM Backend Development Custom components, templates, experience fragments AEM as a Cloud Service, Dynamic Media, Smart Crop, Dispatcher Cache OSGi bundles/services, CRX, Apache Sling Frontend with React.js API integration, React Hooks, functional components Web performance optimization, reducing re-renders
Posted 4 days ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Before you apply to a job, select your language preference from the options available at the top right of this page. Explore your next opportunity at a Fortune Global 500 organization. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that help you become better every day. We know what it takes to lead UPS into tomorrow—people with a unique combination of skill + passion. If you have the qualities and drive to lead yourself or teams, there are roles ready to cultivate your skills and take you to the next level. Job Description Job Title: Intermediate Application Developer Experience Range: 5-7 Years Location: Chennai, Hybrid Employment Type: Full-Time About UPS UPS is a global leader in logistics, offering a broad range of solutions that include transportation, distribution, supply chain management, and e-commerce. Founded in 1907, UPS operates in over 220 countries and territories, delivering packages and providing specialized services worldwide. Our mission is to enable commerce by connecting people, places, and businesses, with a strong focus on sustainability and innovation. About UPS Track Alert The UPS Track Alert API aims to enhance the tracking experience for high-volume customers and third parties, while reducing the overall load on the Track API and monetizing our tracking services. The goal of the Track Alert API is to reduce unnecessary burden on our system while giving our customers the ability to receive status updates on their small packages quickly and accurately. The Track Alert API’s benefits include enhanced customer experience, operational efficiency, data-driven decision making, optimized cash flow through near real-time delivery tracking, and mitigation of fraud and theft through near real-time package status monitoring. 
About The Role The Intermediate Applications Developer participates in full systems life cycle management activities (e.g., analyses, technical requirements, design, coding, testing, implementation of applications software, etc.) for business critical UPS.com Track Visibility applications. Candidate must have strong analytical and problem-solving skills. He/She collaborates with teams and supports emerging technologies and must have strong verbal and written communication skills. The ideal candidate should have experience in designing, developing, and deploying scalable web applications while adhering to the SAFe Agile methodology using Azure DevOps. Key Responsibilities Collaborate with cross-functional teams to design, develop, and maintain Java Spring Boot based Restful Web Services. Design and implement microservices-based solutions for high scalability and maintainability. Develop and maintain OCP4 and GCP hosted solutions, ensuring high availability and security. Participate in SAFe Agile ceremonies including PI planning, daily stand-ups, and retrospectives. Utilize Azure DevOps for CI/CD pipeline setup, version control, and automated deployments. Perform code reviews, ensure coding standards, and mentor junior developers. Troubleshoot and resolve complex technical issues across frontend and backend systems. Primary Skills Backend: Java, Spring Boot, AMQ, WMQ, Kafka, Apache Camel, JSON, and XML Cloud: OpenShift, Google Cloud Platform DevOps & CI/CD: Azure DevOps – Pipelines, Repos, Boards. Architecture & Design Patterns: Restful Web Service Client/Server Development Microservices Architecture. Object Oriented Analysis and Design Secondary Skills Testing: Unit Testing (xUnit, NUnit) and Integration Testing. Cucumber JMeter API Management: RESTful API design and development. API Gateway, OAuth, OpenAPI/Swagger. Security & Performance: Application performance optimization and monitoring. 
Methodologies: SAFe Agile Framework – Familiarity with PI Planning, Iterations, and Agile ceremonies. Tools & Collaboration: Git, IntelliJ or Eclipse Collaboration tools like Microsoft Teams. Qualifications Bachelor’s degree in Computer Science, Information Technology, or related field. Proven experience in client and server-side web service development. Strong understanding of cloud-native application design, especially on GCP. Excellent problem-solving skills and the ability to lead technical discussions. Nice To Have Exposure to containerization technologies (Docker, Kubernetes). Knowledge of Code Quality Inspection Tools, Dependency Management Systems and Software Vulnerability Detection and Remediation Soft Skills Strong problem-solving abilities and attention to detail. Excellent communication skills, both verbal and written. Effective time management and organizational capabilities. Ability to work independently and within a collaborative team environment. Strong interpersonal skills to engage with cross-functional teams. About The Team You will be part of a dynamic and collaborative team of passionate developers, architects, and product owners dedicated to building high-performance web applications. Our team values innovation, continuous learning, and agile best practices. We work closely using the SAFe Agile framework and foster an inclusive environment where everyone's ideas are valued. Employee Type Permanent UPS is committed to providing a workplace free of discrimination, harassment, and retaliation.
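The RESTful web-service shape this role builds (the actual stack is Java/Spring Boot on OpenShift/GCP) can be illustrated with a minimal tracking-status endpoint; this stdlib-Python sketch is only an analogy, and the `/track/{number}` path and response fields are assumptions:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class TrackHandler(BaseHTTPRequestHandler):
    """Illustrative GET /track/{number} endpoint returning a JSON status."""
    def do_GET(self):
        if self.path.startswith("/track/"):
            body = json.dumps({"tracking_number": self.path.split("/")[-1],
                               "status": "IN_TRANSIT"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):  # keep the demo quiet
        pass

# Bind an ephemeral port, serve in the background, and make one request.
server = HTTPServer(("127.0.0.1", 0), TrackHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/track/1Z999"
with urllib.request.urlopen(url) as resp:
    payload = json.loads(resp.read())
server.shutdown()
server.server_close()
print(payload)  # {'tracking_number': '1Z999', 'status': 'IN_TRANSIT'}
```

The Spring Boot equivalent would be a `@RestController` with a `@GetMapping` handler returning a DTO serialized to the same JSON shape.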
Posted 4 days ago
5.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Before you apply to a job, select your language preference from the options available at the top right of this page. Explore your next opportunity at an organization that ranks among the world's 500 largest companies. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that push you to grow every day. We know what it takes to lead UPS into the future: passionate people with a unique combination of skills. If you have the qualities, motivation, autonomy, or leadership to lead teams, there are roles suited to your aspirations and skills, today and tomorrow. Job Description Job Title: Intermediate Application Developer Experience Range: 5-7 Years Location: Chennai, Hybrid Employment Type: Full-Time About UPS UPS is a global leader in logistics, offering a broad range of solutions that include transportation, distribution, supply chain management, and e-commerce. Founded in 1907, UPS operates in over 220 countries and territories, delivering packages and providing specialized services worldwide. Our mission is to enable commerce by connecting people, places, and businesses, with a strong focus on sustainability and innovation. About UPS Track Alert The UPS Track Alert API aims to enhance the tracking experience for high-volume customers and third parties, while reducing the overall load on the Track API and monetizing our tracking services. The goal of the Track Alert API is to reduce unnecessary burden on our system while giving our customers the ability to receive status updates on their small packages quickly and accurately. 
The Track Alert API’s benefits include enhanced customer experience, operational efficiency, data-driven decision making, optimized cash flow through near real-time delivery tracking, and mitigation of fraud and theft through near real-time package status monitoring. About The Role The Intermediate Applications Developer participates in full systems life cycle management activities (e.g., analyses, technical requirements, design, coding, testing, implementation of applications software, etc.) for business critical UPS.com Track Visibility applications. Candidate must have strong analytical and problem-solving skills. He/She collaborates with teams and supports emerging technologies and must have strong verbal and written communication skills. The ideal candidate should have experience in designing, developing, and deploying scalable web applications while adhering to the SAFe Agile methodology using Azure DevOps. Key Responsibilities Collaborate with cross-functional teams to design, develop, and maintain Java Spring Boot based Restful Web Services. Design and implement microservices-based solutions for high scalability and maintainability. Develop and maintain OCP4 and GCP hosted solutions, ensuring high availability and security. Participate in SAFe Agile ceremonies including PI planning, daily stand-ups, and retrospectives. Utilize Azure DevOps for CI/CD pipeline setup, version control, and automated deployments. Perform code reviews, ensure coding standards, and mentor junior developers. Troubleshoot and resolve complex technical issues across frontend and backend systems. Primary Skills Backend: Java, Spring Boot, AMQ, WMQ, Kafka, Apache Camel, JSON, and XML Cloud: OpenShift, Google Cloud Platform DevOps & CI/CD: Azure DevOps – Pipelines, Repos, Boards. Architecture & Design Patterns: Restful Web Service Client/Server Development Microservices Architecture. Object Oriented Analysis and Design Secondary Skills Testing: Unit Testing (xUnit, NUnit) and Integration Testing. 
Cucumber JMeter API Management: RESTful API design and development. API Gateway, OAuth, OpenAPI/Swagger. Security & Performance: Application performance optimization and monitoring. Methodologies: SAFe Agile Framework – Familiarity with PI Planning, Iterations, and Agile ceremonies. Tools & Collaboration: Git, IntelliJ or Eclipse Collaboration tools like Microsoft Teams. Qualifications Bachelor’s degree in Computer Science, Information Technology, or related field. Proven experience in client and server-side web service development. Strong understanding of cloud-native application design, especially on GCP. Excellent problem-solving skills and the ability to lead technical discussions. Nice To Have Exposure to containerization technologies (Docker, Kubernetes). Knowledge of Code Quality Inspection Tools, Dependency Management Systems and Software Vulnerability Detection and Remediation Soft Skills Strong problem-solving abilities and attention to detail. Excellent communication skills, both verbal and written. Effective time management and organizational capabilities. Ability to work independently and within a collaborative team environment. Strong interpersonal skills to engage with cross-functional teams. About The Team You will be part of a dynamic and collaborative team of passionate developers, architects, and product owners dedicated to building high-performance web applications. Our team values innovation, continuous learning, and agile best practices. We work closely using the SAFe Agile framework and foster an inclusive environment where everyone's ideas are valued. Contract Type: Permanent At UPS, equal opportunity, fair treatment, and an inclusive work environment are key values to which we are committed.
Posted 4 days ago
5.0 - 7.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you.

You are passionate about quality and how customers experience the products you test. You can create, maintain, and execute test plans to verify requirements. As a Quality Engineer at Equifax, you will be a catalyst in both the development and the testing of high-priority initiatives. You will develop and test new products to support technology operations while maintaining exemplary standards. As a collaborative member of the team, you will deliver QA services (code quality, testing services, performance engineering, development collaboration, and continuous integration). You will conduct quality control tests to ensure full compliance with specified standards and end-user requirements, execute tests using established plans and scripts, document problems in an issues log, and retest to ensure problems are resolved. You will create test files to thoroughly test program logic and verify system flow, and identify, recommend, and implement changes to enhance the effectiveness of QA strategies.

What You Will Do
- Independently develop scalable and reliable automated tests and frameworks for testing software solutions.
- Specify and automate test scenarios and test data for a highly complex business by analyzing integration points, data flows, personas, authorization schemes, and environments.
- Develop regression suites, develop automation scenarios, and move automation to an agile continuous testing model.
- Proactively and collaboratively take part in all testing-related activities while establishing partnerships with key stakeholders in Product, Development/Engineering, and Technology Operations.

What Experience You Need
- Bachelor's degree in a STEM major or equivalent experience
- 5-7 years of software testing experience
- Ability to create and review test automation according to specifications
- Ability to write, debug, and troubleshoot code in Java, Spring Boot, TypeScript/JavaScript, HTML, and CSS
- Creation and use of big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, Pub/Sub, GCS, Composer/Airflow, and others, with respect to software validation
- Created test strategies and plans
- Led complex testing efforts or projects
- Participated in sprint planning as the test lead
- Collaborated with product owners, SREs, and technical architects to define testing strategies and plans
- Design and development of microservices using Java, Spring Boot, GCP SDKs, and GKE/Kubernetes
- Deploy and release software using Jenkins CI/CD pipelines; understand infrastructure-as-code concepts, Helm charts, and Terraform constructs
- Cloud certification strongly preferred

What Could Set You Apart
An ability to demonstrate successful performance of our Success Profile skills, including:
- Attention to Detail: define test case candidates for automation that are outside of product specifications (i.e., negative testing); create thorough and accurate documentation of all work, including status updates that summarize project highlights; validate that processes operate properly and conform to standards.
- Automation: automate defined test cases and test suites per project.
- Collaboration: collaborate with product owners and the development team to plan and assist with user acceptance testing; collaborate with product owners, development leads, and architects on functional and non-functional test strategies and plans.
- Execution: develop scalable and reliable automated tests; develop performance testing scripts to assure products adhere to the documented SLOs/SLIs/SLAs; specify the test data types needed for automated testing; create automated tests and test data for projects; develop automated regression suites; integrate automated regression tests into the CI/CD pipeline; work with teams on E2E testing strategies and plans against multiple product integration points.
- Quality Control: perform defect analysis and in-depth technical root cause analysis, identifying trends and recommendations to resolve complex functional issues and improve processes; analyze the results of functional and non-functional tests and make recommendations for improvements.
- Performance / Resilience: understand application and network architecture as inputs to create performance and resilience test strategies and plans for each product and platform; conduct performance and resilience testing to ensure products meet SLAs/SLOs.
- Quality Focus: review test cases for complete functional coverage; review the quality section of the Production Readiness Review for completeness; recommend changes to existing testing methodologies to improve the effectiveness and efficiency of product validation; ensure communications are thorough and accurate for all work documentation, including status and project updates.
- Risk Mitigation: work with product owners, QE, and development team leads to track and prioritize defect fixes.

We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks.

Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference!

Who is Equifax? At Equifax, we believe knowledge drives progress. As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life's pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real, and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference, and we are looking for talented team players to join us as we help people live their financial best.

Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
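The role emphasizes scalable automated tests and regression suites. As a rough, hypothetical illustration of a table-driven regression check of the kind such a suite might contain (the `normalizeStatus` function and its cases are invented for this sketch, not Equifax code; a real suite would typically use JUnit or a similar framework):

```java
import java.util.Map;

public class RegressionSuiteSketch {
    // Hypothetical unit under test: normalizes a raw status string to a canonical code.
    static String normalizeStatus(String raw) {
        if (raw == null || raw.isBlank()) return "UNKNOWN";
        return raw.trim().toUpperCase().replace(' ', '_');
    }

    public static void main(String[] args) {
        // Table-driven regression cases: input -> expected output.
        Map<String, String> cases = Map.of(
                "in transit", "IN_TRANSIT",
                " Delivered ", "DELIVERED",
                "", "UNKNOWN");

        int failures = 0;
        for (var c : cases.entrySet()) {
            String actual = normalizeStatus(c.getKey());
            if (!actual.equals(c.getValue())) {
                failures++;
                System.out.println("FAIL: '" + c.getKey() + "' -> " + actual
                        + " (expected " + c.getValue() + ")");
            }
        }
        // Prints ALL PASS when every case matches its expected value.
        System.out.println(failures == 0 ? "ALL PASS" : failures + " failures");
    }
}
```

Keeping cases in a data table makes it cheap to add new regressions as defects are found, and the same table can feed a CI/CD pipeline stage so the suite runs on every build.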
Posted 4 days ago
Apache refers to the Apache Software Foundation, which maintains a wide range of widely used open-source software projects. In India, the demand for professionals with expertise in Apache tools and technologies is on the rise, and job seekers pursuing Apache-related roles have opportunities across many industries. Let's delve into the Apache job market in India to gain a better understanding of the landscape.
India's major IT hubs are known for their thriving technology sectors and see high demand for Apache professionals across organizations of all sizes.
The salary range for Apache professionals in India varies based on experience and skill level:
- Entry-level: INR 3-5 lakhs per annum
- Mid-level: INR 6-10 lakhs per annum
- Experienced: INR 12-20 lakhs per annum
In the Apache job market in India, a typical career path may progress as follows:
1. Junior Developer
2. Developer
3. Senior Developer
4. Tech Lead
5. Architect
Besides expertise in Apache tools and technologies, professionals in this field are often expected to have skills in:
- Linux
- Networking
- Database Management
- Cloud Computing
As you embark on your journey to explore Apache jobs in India, it is essential to stay updated on the latest trends and technologies in the field. By honing your skills and preparing thoroughly for interviews, you can position yourself as a competitive candidate in the Apache job market. Stay motivated, keep learning, and pursue your dream career with confidence!