
366 HBase Jobs - Page 3

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

5.0 - 10.0 years

5 - 9 Lacs

Bengaluru

Work from Office


About PhonePe Group
PhonePe is India's leading digital payments company, with 50 crore (500 million) registered users and 3.7 crore (37 million) merchants covering over 99% of postal codes across India. On the back of its leadership in digital payments, PhonePe has expanded into financial services (insurance, mutual funds, stock broking, and lending) as well as adjacent tech-enabled businesses such as Pincode for hyperlocal shopping and Indus App Store, India's first localized app store. The PhonePe Group is a portfolio of businesses aligned with the company's vision to offer every Indian an equal opportunity to accelerate their progress by unlocking the flow of money and access to services.

Culture
At PhonePe, we take extra care to make sure you give your best at work, every day. Creating the right environment for you is just one of the things we do. We empower people and trust them to do the right thing. Here, you own your work from start to finish, right from day one. Being enthusiastic about tech is a big part of being at PhonePe. If you like building technology that impacts millions, ideating with some of the best minds in the country, and executing on your dreams with purpose and speed, join us!

Challenges
Building for scale, rapid iterative development, and customer-centric product thinking at each step define every day for a developer at PhonePe. Though we engineer for a 50 million+ strong user base, we code with every individual user in mind. While we are quick to adopt the latest in engineering, we care utmost for security, stability, and automation. Apply if you want to experience the best combination of passionate application development and product-driven thinking.

As a Software Engineer, you will:
Build robust and scalable web-based applications, thinking in terms of platforms and reuse
Build abstractions and contracts with separation of concerns for a larger scope
Drive problem-solving for high-level business and technical problems
Do high-level design with guidance: functional modeling and break-down of modules
Make incremental changes to architecture and perform impact analysis of those changes
Do performance tuning and improvements in large-scale distributed systems
Mentor young minds and foster team spirit; break down execution into phases to bring predictability to overall execution
Work closely with Product Managers to derive a capability view from features/solutions and lead execution of medium-sized projects
Work with broader stakeholders to track the impact of projects/features and proactively iterate to improve them

As a Senior Software Engineer, you must have:
Extensive, expert programming experience in at least one general-purpose programming language (e.g. Java, C, C++) and its tech stack, writing maintainable, scalable, unit-tested code
Experience with multi-threading and concurrent programming
Extensive object-oriented design skills, knowledge of design patterns, and a strong passion and ability to design intuitive module- and class-level interfaces
Excellent coding skills - able to convert a design into code fluently
Knowledge of Test-Driven Development
Good understanding of relational databases (e.g. MySQL) and NoSQL stores (e.g. HBase, Elasticsearch, Aerospike)
A strong desire to solve complex and interesting real-world problems
Experience with full life-cycle development in any programming language on a Linux platform
A go-getter attitude reflected in the energy and intent behind assigned tasks
Experience in a startup environment with high levels of ownership and commitment
BTech, MTech, or Ph.D. in Computer Science or a related technical discipline (or equivalent)
Experience building highly scalable business applications that implement large, complex business flows and deal with huge amounts of data
5+ years of experience writing code and solving problems at large scale
Open communication: shares thoughts and opinions frequently, listens intently, and takes constructive feedback

As a Software Engineer, good to have:
The ability to drive the design and architecture of multiple subsystems
The ability to break down larger/fuzzier problems into smaller ones within the scope of the product
Understanding of industry coding standards and the ability to create appropriate technical documentation

PhonePe Full-Time Employee Benefits (not applicable for intern or contract roles):
Insurance Benefits - Medical Insurance, Critical Illness Insurance, Accidental Insurance, Life Insurance
Wellness Program - Employee Assistance Program, Onsite Medical Center, Emergency Support System
Parental Support - Maternity Benefit, Paternity Benefit Program, Adoption Assistance Program, Day-care Support Program
Mobility Benefits - Relocation Benefits, Transfer Support Policy, Travel Policy
Retirement Benefits - Employee PF Contribution, Flexible PF Contribution, Gratuity, NPS, Leave Encashment
Other Benefits - Higher Education Assistance, Car Lease, Salary Advance Policy

Working at PhonePe is a rewarding experience! Great people, a work environment that thrives on creativity, and the opportunity to take on roles beyond a defined job description are just some of the reasons you should work with us. Read more about PhonePe on our blog: Life at PhonePe, PhonePe in the news.

Posted 1 week ago

Apply

4.0 - 9.0 years

9 - 13 Lacs

Bengaluru

Work from Office


About us:
As a Fortune 50 company with more than 400,000 team members worldwide, Target is an iconic brand and one of America's leading retailers. Joining Target means promoting a culture of mutual care and respect and striving to make the most meaningful and positive impact. Becoming a Target team member means joining a community that values different voices and lifts each other up. Here, we believe your unique perspective is important, and you'll build relationships by being authentic and respectful.

Overview about TII
At Target, we have a timeless purpose and a proven strategy, and that hasn't happened by accident. Some of the best minds from different backgrounds come together at Target to redefine retail in an inclusive learning environment that values people and delivers world-class outcomes. That winning formula is especially apparent in Bengaluru, where Target in India operates as a fully integrated part of Target's global team and has more than 4,000 team members supporting the company's global strategy and operations.

Pyramid overview
A role with Target Data Science & Engineering means the chance to help develop and manage state-of-the-art predictive algorithms that use data at scale to automate and optimize decisions. Whether you join our Statistics, Optimization, or Machine Learning teams, you'll be challenged to harness Target's impressive data breadth to build the algorithms that power solutions our partners in Marketing, Supply Chain Optimization, Network Security, and Personalization rely on.

Position Overview
As a Senior Engineer on the Search Team, you serve as a specialist in the engineering team that supports the product. You help develop and gain insight into the application architecture. You can distill an abstract architecture into a concrete design and influence the implementation. You show expertise in applying the appropriate software engineering patterns to build robust and scalable systems. You are an expert in programming and apply your skills in developing the product. You have the skills to design and implement the architecture on your own, but choose to influence your fellow engineers by proposing software designs and providing feedback on software designs and/or implementations. You leverage data science in solving complex business problems and make decisions based on data. You show good problem-solving skills and can help the team in triaging operational issues, leveraging your expertise to eliminate repeat occurrences.

About You
4-year degree in a quantitative discipline (Science, Technology, Engineering, Mathematics) or equivalent experience
Experience with search engines like Solr and Elasticsearch
Strong hands-on programming skills in Java, Kotlin, Micronaut, and Python; experience with PySpark, SQL, and Hadoop/Hive is an added advantage
Experience with streaming systems like Kafka; experience with Kafka Streams is an added advantage
Experience in MLOps is an added advantage
Experience in Data Engineering is an added advantage
Strong analytical thinking skills with an ability to creatively solve business problems, innovating new approaches where required
Able to produce reasonable documents/narratives suggesting actionable insights
Self-driven and results-oriented
Strong team player with the ability to collaborate effectively across geographies/time zones

Know More About Us here:
Life at Target - https://india.target.com/
Benefits - https://india.target.com/life-at-target/workplace/benefits
Culture - https://india.target.com/life-at-target/belonging

Posted 1 week ago

Apply

12.0 - 15.0 years

13 - 17 Lacs

Mumbai

Work from Office


Job Information
Job Opening ID: ZR_1688_JOB
Date Opened: 24/12/2022
Industry: Technology
Work Experience: 12-15 years
Job Title: Big Data Architect
City: Mumbai
Province: Maharashtra
Country: India
Postal Code: 400008
Number of Positions: 4
Location: Mumbai, Pune, Chennai, Hyderabad, Coimbatore, Kolkata

12+ years of experience in the big data space across architecture, design, development, testing, and deployment, with a full understanding of the SDLC.
1. Experience with Hadoop and its related technology stack.
2. Experience with the Hadoop ecosystem (HDP and CDP) / big data (especially Hive); hands-on experience with programming languages such as Java/Scala/Python; hands-on experience with/knowledge of Spark.
3. Responsible for, and focused on, the uptime and reliable running of all our ingestion/ETL jobs.
4. Good SQL and comfort working in a Unix/Linux environment is a must.
5. Create and maintain optimal data pipeline architecture.
6. Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
7. Good to have: cloud experience.
8. Good to have: experience integrating Hadoop with data visualization tools like Power BI.

Posted 1 week ago

Apply

6.0 - 10.0 years

3 - 7 Lacs

Chennai

Work from Office


Job Information
Job Opening ID: ZR_2199_JOB
Date Opened: 15/04/2024
Industry: Technology
Work Experience: 6-10 years
Job Title: Sr Data Engineer
City: Chennai
Province: Tamil Nadu
Country: India
Postal Code: 600004
Number of Positions: 4

Strong experience in Python
Good experience in Databricks
Experience working on the AWS/Azure cloud platforms
Experience working with REST APIs and services, messaging, and event technologies
Experience with ETL or data pipeline build tools
Experience with streaming platforms such as Kafka
Demonstrated experience working with large and complex data sets
Ability to document data pipeline architecture and design
Experience in Airflow is nice to have
To build complex Delta Lake

Posted 1 week ago

Apply

5.0 - 8.0 years

2 - 5 Lacs

Chennai

Work from Office


Job Information
Job Opening ID: ZR_2168_JOB
Date Opened: 10/04/2024
Industry: Technology
Work Experience: 5-8 years
Job Title: AWS Data Engineer
City: Chennai
Province: Tamil Nadu
Country: India
Postal Code: 600002
Number of Positions: 4

Mandatory Skills: AWS, Python, SQL, Spark, Airflow, Snowflake

Responsibilities
Create and manage cloud resources in AWS
Ingest data from different sources that expose data through different technologies, such as RDBMS, REST HTTP APIs, flat files, streams, and time-series data from various proprietary systems
Implement data ingestion and processing with the help of big data technologies
Process and transform data using technologies such as Spark and cloud services; understand your part of the business logic and implement it using the language supported by the base data platform
Develop automated data quality checks to make sure the right data enters the platform, and verify the results of the calculations (see the sketch below)
Develop infrastructure to collect, transform, combine, and publish/distribute customer data
Define process improvement opportunities to optimize data collection, insights, and displays
Ensure data and results are accessible, scalable, efficient, accurate, complete, and flexible
Identify and interpret trends and patterns from complex data sets
Construct a framework utilizing data visualization tools and techniques to present consolidated, actionable analytical results to relevant stakeholders
Key participant in regular Scrum ceremonies with the agile teams
Proficient at developing queries, writing reports, and presenting findings
Mentor junior members and bring in industry best practices
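The automated data-quality check called out in the responsibilities above is typically a small gate that runs between ingestion and publication. The following PySpark sketch is purely illustrative: the S3 paths, the expected column set, and the 1% rejection threshold are assumptions for the example, not details from the posting.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("ingest-quality-check").getOrCreate()

# Hypothetical input path and expected columns; substitute your own sources.
df = spark.read.parquet("s3://example-bucket/raw/orders/")
required_cols = {"order_id", "customer_id", "amount", "event_ts"}

# Schema check: fail fast if the feed dropped a column.
missing = required_cols - set(df.columns)
if missing:
    raise ValueError(f"Schema check failed, missing columns: {missing}")

total = df.count()
null_keys = df.filter(F.col("order_id").isNull()).count()
duplicates = total - df.dropDuplicates(["order_id"]).count()

# Quality gate: reject the batch if more than 1% of rows are unusable (illustrative threshold).
if total == 0 or (null_keys + duplicates) / total > 0.01:
    raise ValueError(
        f"Quality gate failed: rows={total}, null_keys={null_keys}, duplicates={duplicates}"
    )

# Only validated data is published for downstream consumers.
df.write.mode("overwrite").parquet("s3://example-bucket/validated/orders/")
```

In practice a check like this would be one task in the Airflow DAG that orchestrates the pipeline, so a failed gate stops the downstream load rather than silently passing bad data along.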

Posted 1 week ago

Apply

5.0 - 8.0 years

2 - 6 Lacs

Bengaluru

Work from Office


Job Information
Job Opening ID: ZR_1628_JOB
Date Opened: 09/12/2022
Industry: Technology
Work Experience: 5-8 years
Job Title: Data Engineer
City: Bangalore
Province: Karnataka
Country: India
Postal Code: 560001
Number of Positions: 4

Roles and Responsibilities:
4+ years of experience as a data developer using Python
Knowledge of Spark and PySpark preferable but not mandatory
Azure cloud experience preferred; alternate cloud experience is fine
Preferred experience with the Azure platform, including Azure Data Lake, Databricks, and Data Factory
Working knowledge of different file formats such as JSON, Parquet, and CSV
Familiarity with data encryption and data masking
Database experience in SQL Server is preferable; experience with NoSQL databases like MongoDB is preferred
Team player, reliable, self-motivated, and self-disciplined

Posted 1 week ago

Apply

6.0 - 11.0 years

8 - 13 Lacs

Hyderabad

Work from Office


10+ years of software development experience building large-scale distributed data processing systems/applications, data engineering, or large-scale internet systems
At least 4 years of experience developing/leading big data solutions at enterprise scale, with at least one end-to-end implementation
Strong experience in programming languages: Java/J2EE/Scala
Good experience with Spark/Hadoop/HDFS architecture, YARN, Confluent Kafka, HBase, Hive, Impala, and NoSQL databases
Experience with batch processing and AutoSys job scheduling and monitoring
Performance analysis, troubleshooting, and resolution (including familiarity with and investigation of Cloudera/Hadoop logs)
Work with Cloudera on open issues that would result in cluster configuration changes, and implement them as needed
Strong experience with databases such as SQL, Hive, Elasticsearch, HBase, etc.
Knowledge of Hadoop security, data management, and governance
Primary Skills: Java/Scala, ETL, Spark, Hadoop, Hive, Impala, Sqoop, HBase, Confluent Kafka, Oracle, Linux, Git, Jenkins CI/CD

Posted 1 week ago

Apply

5.0 - 10.0 years

4 - 7 Lacs

Bengaluru

Work from Office


We're seeking a Senior Software Engineer or a Lead Software Engineer to join one of our Data Layer teams. As the name implies, the Data Layer is at the core of all things data at Zeta. Our responsibilities include:
Developing and maintaining the Zeta Identity Graph platform, which collects billions of behavioural, demographic, location, and transactional signals to power people-based marketing
Ingesting vast amounts of identity and event data from our customers and partners
Facilitating data transfers across systems
Ensuring the integrity and health of our datasets
And much more

As a member of this team, the data engineer will be responsible for designing and expanding our existing data infrastructure, enabling easy access to data, supporting complex data analyses, and automating optimization workflows for business and marketing operations.

Essential Responsibilities:
As a Senior Software Engineer or a Lead Software Engineer, your responsibilities will include:
Building, refining, tuning, and maintaining our real-time and batch data infrastructure
Daily use of technologies such as Spark, Airflow, Snowflake, Hive, Scylla, Django, FastAPI, etc.
Maintaining data quality and accuracy across production data systems
Working with Data Engineers to optimize data models and workflows
Working with Data Analysts to develop ETL processes for analysis and reporting
Working with Product Managers to design and build data products
Working with our DevOps team to scale and optimize our data infrastructure
Participating in architecture discussions, influencing the road map, and taking ownership and responsibility for new projects
Participating in on-call rotation in their respective time zones (be available by phone or email in case something goes wrong)

Desired Characteristics:
Minimum 5-10 years of software engineering experience
Proven long-term experience with, and enthusiasm for, distributed data processing at scale; eagerness to learn new things
Expertise in designing and architecting distributed, low-latency, and scalable solutions in either cloud or on-premises environments
Exposure to the whole software development lifecycle, from inception to production and monitoring
Fluency in Python, or solid experience in Scala or Java
Proficiency with relational databases and advanced SQL
Expert in the usage of services like Spark and Hive
Experience with web frameworks such as Flask and Django
Experience with schedulers such as Apache Airflow, Apache Luigi, Chronos, etc.
Experience with Kafka or other stream message processing solutions
Experience using cloud services (AWS) at scale
Experience with agile software development processes
Excellent interpersonal and communication skills

Nice to have:
Experience with large-scale / multi-tenant distributed systems
Experience with columnar / NoSQL databases - Vertica, Snowflake, HBase, Scylla, Couchbase
Experience with real-time streaming frameworks - Flink, Storm
Experience with open table formats such as Iceberg, Hudi, or Delta Lake

Posted 1 week ago

Apply

7.0 - 12.0 years

9 - 15 Lacs

Bengaluru

Work from Office


We are looking for Lead or Principal Software Engineers to join our Data Cloud team. Our Data Cloud team is responsible for the Zeta Identity Graph platform, which captures billions of behavioural, demographic, environmental, and transactional signals for people-based marketing. As part of this team, the data engineer will be designing and growing our existing data infrastructure to democratize data access, enable complex data analyses, and automate optimization workflows for business and marketing operations.

Job Description:

Essential Responsibilities:
As a Lead or Principal Data Engineer, your responsibilities will include:
Building, refining, tuning, and maintaining our real-time and batch data infrastructure
Daily use of technologies such as HDFS, Spark, Snowflake, Hive, HBase, Scylla, Django, FastAPI, etc.
Maintaining data quality and accuracy across production data systems
Working with Data Engineers to optimize data models and workflows
Working with Data Analysts to develop ETL processes for analysis and reporting
Working with Product Managers to design and build data products
Working with our DevOps team to scale and optimize our data infrastructure
Participating in architecture discussions, influencing the road map, and taking ownership and responsibility for new projects
Participating in a 24/7 on-call rotation (be available by phone or email in case something goes wrong)

Desired Characteristics:
Minimum 7 years of software engineering experience
Proven long-term experience with, and enthusiasm for, distributed data processing at scale; eagerness to learn new things
Expertise in designing and architecting distributed, low-latency, and scalable solutions in either cloud or on-premises environments
Exposure to the whole software development lifecycle, from inception to production and monitoring
Fluency in Python, or solid experience in Scala or Java
Proficiency with relational databases and advanced SQL
Expert in the usage of services like Spark, HDFS, Hive, and HBase
Experience with schedulers such as Apache Airflow, Apache Luigi, Chronos, etc.
Experience using cloud services (AWS) at scale
Experience with agile software development processes
Excellent interpersonal and communication skills

Nice to have:
Experience with large-scale / multi-tenant distributed systems
Experience with columnar / NoSQL databases - Vertica, Snowflake, HBase, Scylla, Couchbase
Experience with real-time streaming frameworks - Flink, Storm
Experience with web frameworks such as Flask and Django

Posted 1 week ago

Apply

2.0 - 4.0 years

2 - 6 Lacs

Hyderabad

Work from Office


Fusion Plus Solutions Inc is looking for Data Engineer to join our dynamic team and embark on a rewarding career journey. Liaising with coworkers and clients to elucidate the requirements for each task. Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed. Reformulating existing frameworks to optimize their functioning. Testing such structures to ensure that they are fit for use. Preparing raw data for manipulation by data scientists. Detecting and correcting errors in your work. Ensuring that your work remains backed up and readily accessible to relevant coworkers. Remaining up-to-date with industry standards and technological advancements that will improve the quality of your outputs.

Posted 1 week ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Hyderabad

Work from Office


Tech Stalwart Solution Private Limited is looking for Sr. Data Engineer to join our dynamic team and embark on a rewarding career journey. Liaising with coworkers and clients to elucidate the requirements for each task. Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed. Reformulating existing frameworks to optimize their functioning. Testing such structures to ensure that they are fit for use. Preparing raw data for manipulation by data scientists. Detecting and correcting errors in your work. Ensuring that your work remains backed up and readily accessible to relevant coworkers. Remaining up-to-date with industry standards and technological advancements that will improve the quality of your outputs.

Posted 1 week ago

Apply

10.0 - 15.0 years

16 - 20 Lacs

Pune

Work from Office


Arista Networks is seeking high-caliber, self-motivated test engineers for its enterprise and cloud-managed WiFi products. This role involves testing and validating the complex cloud infrastructure and services that power WiFi networks, ensuring they perform at scale and meet the highest standards of reliability, security, and efficiency. You will play a key role in the design, development, and execution of comprehensive test plans for cloud-based Wi-Fi solutions, working closely with cross-functional teams to deliver industry-leading products.

Responsibilities:
Lead the testing efforts on the cloud infrastructure that supports Wi-Fi networks, ensuring scalability, performance, and resilience in different environments (cloud and on-premise deployments)
Design and execute performance, scalability, and load tests to ensure the cloud services can handle heavy traffic and provide reliable WiFi services under various conditions
Collaborate with engineering teams to conduct integration tests for cloud-based services that interface with hardware; ensure seamless interactions between cloud APIs and the Access Points
Create solution-based, detailed test plans, test cases, and test reports for all stages of the development lifecycle; ensure traceability between requirements, test cases, and defects
Own the feature and core area of the product, and be the single point of contact and testing expert for your deliverables
Track defects and issues using defect management systems, ensuring they are escalated appropriately and resolved in a timely manner
Work closely with product managers, developers, and other stakeholders to understand requirements, provide feedback, and ensure high-quality product delivery
Participate in escalation debugging and reproduce the complex issues reported by customers
Mentor junior team members and share best practices in testing and cloud-based WiFi solution validation

Requirements:
Bachelor's degree in E&TC/EE/CE/CS/IT or a related field; Master's degree a plus
10-15 years of experience in testing and troubleshooting cloud technologies, virtualization, microservices, and debugging complex software
Good experience working with Hadoop and big data technologies like HDFS, HBase, Spark, Kafka streaming, and Elasticsearch
Good working knowledge of databases (SQL) and server-side validations
Experience with various performance testing methods (load test, stress test, and capacity test) is a plus
Strong understanding of distributed systems fundamentals around scalability, elasticity, availability, and fault tolerance
Experience with networking tools and networking fundamentals across the OSI stack is a plus
Strong analytical and debugging skills and proven expertise in creating test methodologies and writing test plans
Strong knowledge of Linux systems
Familiarity with scripting languages such as Python is a plus
Excellent written and oral English communication skills
Adaptive to learning new things and technologies

Posted 1 week ago

Apply

2.0 - 6.0 years

4 - 8 Lacs

Bengaluru

Work from Office


The role requires strategic thinking and planning and provides expertise throughout the entire product development life cycle, with strong knowledge of SAS Viya programming, API architecture, Kubernetes, and the Risk and Finance domain. The role also requires ownership, making sure that quality is baked in from the start.

Key Responsibilities
N/A

Skills and Experience
Hands-on Python for data manipulation - Pandas, NumPy; SAS developer framework and expertise in SAS development work
Desirable skills in the SAS admin framework
Desirable skills in the Hadoop development framework
Sound statistical knowledge and analytical and problem-solving skills are desirable
Good to have knowledge of big data technologies (Hortonworks HDP, Apache Hadoop, HDFS, Hive, Sqoop, Flume, ZooKeeper, HBase, Oozie, Spark, NiFi, Kafka, SnapLogic, AWS, Redshift)
Experience with monitoring tools
Development capabilities using Python, Spark, SAS, and R
Good management and analytical skills
Good writing and oral communication skills
Good understanding of and experience in projects (e.g. SDLC, Agile methodology)
Desirable skills in the big data space (Hadoop stack like HDFS, Pig, Hive, HBase, Sqoop, etc.)
Ability to debug and write/modify shell scripts and Python
Willing to work on-call support over weekends
Liaise with multiple application teams and coordinate issue resolution
Good analytical and interaction skills

Qualifications
N/A

Together we:
Do the right thing and are assertive, challenge one another, and live with integrity, while putting the client at the heart of what we do
Never settle, continuously striving to improve and innovate, keeping things simple and learning from doing well, and not so well
Are better together: we can be ourselves, be inclusive, see more good in others, and work collectively to build for the long term

In line with our Fair Pay Charter, we offer a competitive salary and benefits to support your mental, physical, financial, and social wellbeing:
Core bank funding for retirement savings, medical and life insurance, with flexible and voluntary benefits available in some locations
Time off including annual leave, parental/maternity leave (20 weeks), sabbatical (12 months maximum), and volunteering leave (3 days), along with minimum global standards for annual and public holidays, combined to a minimum of 30 days
Flexible working options based around home and office locations, with flexible working patterns
Proactive wellbeing support through Unmind, a market-leading digital wellbeing platform, development courses for resilience and other human skills, a global Employee Assistance Programme, sick leave, mental health first-aiders, and all sorts of self-help toolkits
A continuous learning culture to support your growth, with opportunities to reskill and upskill and access to physical, virtual, and digital learning
Being part of an inclusive and values-driven organisation, one that embraces and celebrates our unique diversity across our teams, business functions, and geographies - everyone feels respected and can realise their full potential

26272

Posted 1 week ago

Apply

10.0 - 15.0 years

30 - 35 Lacs

Pune

Work from Office


Who You'll Work With
Arista Networks is seeking high-caliber, self-motivated test engineers for its enterprise and cloud-managed WiFi products. This role involves testing and validating the complex cloud infrastructure and services that power WiFi networks, ensuring they perform at scale and meet the highest standards of reliability, security, and efficiency. You will play a key role in the design, development, and execution of comprehensive test plans for cloud-based Wi-Fi solutions, working closely with cross-functional teams to deliver industry-leading products.

What You'll Do
Lead the testing efforts on the cloud infrastructure that supports Wi-Fi networks, ensuring scalability, performance, and resilience in different environments (cloud and on-premise deployments)
Design and execute performance, scalability, and load tests to ensure the cloud services can handle heavy traffic and provide reliable WiFi services under various conditions
Collaborate with engineering teams to conduct integration tests for cloud-based services that interface with hardware; ensure seamless interactions between cloud APIs and the Access Points
Create solution-based, detailed test plans, test cases, and test reports for all stages of the development lifecycle; ensure traceability between requirements, test cases, and defects
Own the feature and core area of the product, and be the single point of contact and testing expert for your deliverables
Track defects and issues using defect management systems, ensuring they are escalated appropriately and resolved in a timely manner
Work closely with product managers, developers, and other stakeholders to understand requirements, provide feedback, and ensure high-quality product delivery
Participate in escalation debugging and reproduce the complex issues reported by customers
Mentor junior team members and share best practices in testing and cloud-based WiFi solution validation

Bachelor's degree in E&TC/EE/CE/CS/IT or a related field; Master's degree a plus
10-15 years of experience in testing and troubleshooting cloud technologies, virtualization, microservices, and debugging complex software

Posted 1 week ago

Apply

3.0 - 6.0 years

13 - 18 Lacs

Pune

Work from Office


ZS is a place where passion changes lives. As a management consulting and technology firm focused on improving life and how we live it, our most valuable asset is our people. Here you'll work side-by-side with a powerful collective of thinkers and experts shaping life-changing solutions for patients, caregivers, and consumers worldwide. ZSers drive impact by bringing a client-first mentality to each and every engagement. We partner collaboratively with our clients to develop custom solutions and technology products that create value and deliver company results across critical areas of their business. Bring your curiosity for learning, bold ideas, courage, and passion to drive life-changing impact to ZS.

Our most valuable asset is our people. At ZS we honor the visible and invisible elements of our identities, personal experiences, and belief systems - the ones that comprise us as individuals, shape who we are, and make us unique. We believe your personal interests, identities, and desire to learn are part of your success here. Learn more about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about.

ZS's Platform Development team designs, implements, tests, and supports ZS's ZAIDYN Platform, which helps drive superior customer experiences and revenue outcomes through integrated products and analytics. Whether writing distributed optimization algorithms or advanced mapping and visualization interfaces, you will have an opportunity to solve challenging problems, make an immediate impact, and contribute to better health outcomes.

What you'll do:
Pair program, write unit tests, lead code reviews, and collaborate with QA analysts to ensure you develop the highest-quality multi-tenant software that can be productized
As part of our full-stack product engineering team, build multi-tenant, cloud-based software products/platforms and internal assets that leverage cutting-edge technologies on the Amazon AWS cloud platform
Work with junior developers to implement large features that are on the cutting edge of big data
Be a technical leader to your team, and help them improve their technical skills
Stand up for engineering practices that ensure quality products: automated testing, unit testing, agile development, continuous integration, code reviews, and technical design
Work with product managers and architects to design product architecture and to work on POCs
Take immediate responsibility for project deliverables
Understand client business issues and design features that meet client needs
Undergo on-the-job and formal trainings and certifications, and constantly advance your knowledge and problem-solving skills

What you'll bring:
Bachelor's degree in CS, IT, or a related discipline
Strong analytic, problem-solving, and programming ability
Experience coding in an object-oriented language such as Python, Java, or C#
Hands-on experience with Apache Spark, EMR, Hadoop, HDFS, or other big data technologies
Experience with development on the AWS (Amazon Web Services) platform is preferable
Experience in Linux shell or PowerShell scripting is preferable
Experience in HTML5, JavaScript, and JavaScript libraries is preferable
Understanding of data science algorithms
Good to have: Pharma domain understanding
Initiative and drive to contribute
Excellent organizational and task management skills
Strong communication skills
Ability to work in global cross-office teams; ZS is a global firm and fluency in English is required

Perks & Benefits
ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth, and professional development. Our robust skills development programs, multiple career progression options, internal mobility paths, and collaborative culture empower you to thrive as an individual and global team member. We are committed to giving our employees a flexible and connected way of working. A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections.

Travel
Travel is a requirement at ZS for client-facing ZSers; the business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed. Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures.

Considering applying?
At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above. ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law.

To Complete Your Application
Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered. NO AGENCY CALLS, PLEASE. Find out more at www.zs.com

Posted 1 week ago

Apply

5.0 - 7.0 years

14 - 18 Lacs

Bengaluru

Work from Office


As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the Azure Cloud Data Platform.

Responsibilities:
Experienced in building data pipelines to ingest, process, and transform data from files, streams, and databases
Process data with Spark, Python, PySpark, and Hive, with HBase or other NoSQL databases, on the Azure Cloud Data Platform or HDFS
Experienced in developing efficient software code for multiple use cases built on the platform, leveraging the Spark framework with Python or Scala and big data technologies
Experience in developing streaming pipelines (see the sketch below)
Experience working with Hadoop/Azure ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data/cloud technologies such as Apache Spark, Kafka, and cloud computing services

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
Total 5-7+ years of experience in data management (DW, DL, data platform, lakehouse) and data engineering
Minimum 4+ years of experience in big data technologies, with extensive data engineering experience in Spark with Python or Scala
Minimum 3 years of experience on cloud data platforms on Azure
Experience in Databricks / Azure HDInsight / Azure Data Factory, Synapse, SQL Server DB
Exposure to streaming solutions and message brokers like Kafka
Experience with Unix/Linux commands and basic work experience in shell scripting

Preferred technical and professional experience:
Certification in Azure and Databricks, or Cloudera Spark certified developers
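For the streaming-pipeline experience mentioned above, a common Spark-plus-Kafka pattern is Structured Streaming. The sketch below is a minimal illustration rather than this team's actual pipeline: the broker address, topic, JSON schema, and output paths are assumed, and the Kafka source additionally requires the spark-sql-kafka connector package on the Spark classpath.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("kafka-stream-ingest").getOrCreate()

# Hypothetical event schema for the JSON payloads on the topic.
schema = StructType([
    StructField("device_id", StringType()),
    StructField("reading", DoubleType()),
    StructField("event_ts", TimestampType()),
])

# Read a continuous stream from Kafka (broker and topic are placeholders).
raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker1:9092")
       .option("subscribe", "sensor-events")
       .load())

# Kafka delivers bytes; cast the value and parse the JSON into columns.
events = (raw.selectExpr("CAST(value AS STRING) AS json")
             .select(F.from_json("json", schema).alias("e"))
             .select("e.*"))

# Append each micro-batch to a landing zone; the checkpoint enables exactly-once recovery.
query = (events.writeStream
         .format("parquet")
         .option("path", "/data/bronze/sensor_events")
         .option("checkpointLocation", "/data/checkpoints/sensor_events")
         .outputMode("append")
         .start())

query.awaitTermination()
```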

Posted 1 week ago

Apply

6.0 - 11.0 years

18 - 25 Lacs

Hyderabad

Work from Office


SUMMARY
Data Modeling Professional
Location: Hyderabad/Pune
Experience: The ideal candidate should possess at least 6 years of relevant experience in data modeling, with proficiency in SQL, Python, PySpark, Hive, ETL, Unix, and Control-M (or similar scheduling tools).

Key Responsibilities:
Develop and configure data pipelines across various platforms and technologies.
Write complex SQL queries for data analysis on databases such as SQL Server, Oracle, and Hive.
Create solutions to support AI/ML models and generative AI.
Work independently on specialized assignments within project deliverables.
Provide solutions and tools to enhance engineering efficiencies.
Design processes, systems, and operational models for end-to-end execution of data pipelines (a scheduling sketch follows below).

Preferred Skills:
Experience with GCP, particularly Airflow, Dataproc, and BigQuery, is advantageous.

Requirements:
Minimum 6 years of experience in data modeling with SQL, Python, PySpark, Hive, ETL, Unix, and Control-M (or similar scheduling tools).
Proficiency in writing complex SQL queries for data analysis.
Experience with GCP, particularly Airflow, Dataproc, and BigQuery, is an advantage.
Strong problem-solving and analytical abilities.
Excellent communication and presentation skills.
Ability to deliver high-quality materials against tight deadlines.
Ability to work effectively under pressure with rapidly changing priorities.
Note: The ability to communicate efficiently at a global level is paramount.
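Because the role leans on schedulers (Control-M or Airflow), here is a minimal, illustrative Airflow 2.x DAG showing how a daily extract-then-transform pipeline might be wired up. The DAG id, cron schedule, and script paths are hypothetical placeholders, not details from the posting.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# A daily two-step pipeline: pull source tables, then build model tables with Spark.
with DAG(
    dag_id="daily_model_refresh",          # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 2 * * *",          # run daily at 02:00
    catchup=False,
) as dag:

    extract = BashOperator(
        task_id="extract_source_tables",
        # {{ ds }} is Airflow's templated execution date, passed to the script.
        bash_command="python /opt/pipelines/extract.py --date {{ ds }}",
    )

    transform = BashOperator(
        task_id="build_model_tables",
        bash_command="spark-submit /opt/pipelines/build_models.py --date {{ ds }}",
    )

    extract >> transform  # transform only runs after a successful extract
```

A Control-M setup would express the same dependency as job conditions rather than Python, but the shape of the pipeline is the same.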

Posted 1 week ago

Apply

5.0 - 8.0 years

15 - 25 Lacs

Pune

Work from Office


Design, develop, and maintain scalable data pipelines and systems using PySpark and other big data tools. Monitor, troubleshoot, and resolve issues in data workflows and pipelines. Implement best practices for data processing, security, and storage.

Required candidate profile: Strong programming skills in PySpark and Python. Experience with big data frameworks like Hadoop, Spark, or Kafka. Proficiency in working with cloud platforms. Experience with data modeling and working with databases.

Posted 1 week ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Pune

Work from Office


Lead Software Engineer - Java Developer, Cloud

Overview
The Offers team in the Loyalty group at Mastercard is at the heart of providing end consumers highly personalized card-linked offers by marrying offers from our advertising network with anonymized transaction data. We are looking for strong, innovative engineers who can bring unique perspectives and innovative ideas to all areas of development and are interested in continuing to improve our platform through the ever-changing technology landscape. In this role, you will work on multiple projects which are critical to Mastercard's role. An ideal candidate is someone who works well with a cohesive, team-oriented group of developers, who has a passion for developing great software, who enjoys solving problems in a challenging environment, and who has the desire to take their career to the next level. If any of these opportunities excite you, we would love to talk.

Responsibilities:
Design, develop, test, deploy, maintain, and continuously improve software products
Manage individual project priorities, deadlines, and deliverables
Ensure the final product is highly performant, responsive, and aligned with business goals
Actively participate in agile ceremonies including daily stand-ups, story pointing, backlog refinement, and retrospectives
Collaborate with cross-functional teams including Product, Business, and Engineering
Coach and mentor junior and new team members to foster professional growth and team alignment

All About You / Experience:
Proven experience as a senior or lead software engineer with a strong technical foundation
Proficient in at least one modern programming language, preferably Java
Strong knowledge of object-oriented design, data structures, algorithms, and complexity analysis
Extensive experience building RESTful APIs and designing microservices-based architectures
Hands-on expertise with React, Spring Boot, PostgreSQL, and Elasticsearch
Demonstrated full-stack development experience, including frontend and backend integration
Familiarity with cloud platforms, along with distributed systems
Exposure to big data technologies like Apache Spark, Python, and HBase is a plus
Strong communication and leadership skills with a proactive, collaborative mindset

Posted 2 weeks ago

Apply

6.0 - 10.0 years

8 - 12 Lacs

Pune

Work from Office


Lead Software Engineer - Java Developer, Cloud

Overview
The Offers team in the Loyalty group at Mastercard is at the heart of providing end consumers highly personalized card-linked offers by marrying offers from our advertising network with anonymized transaction data. We are looking for strong, innovative engineers who can bring unique perspectives and innovative ideas to all areas of development and are interested in continuing to improve our platform through the ever-changing technology landscape. In this role, you will work on multiple projects which are critical to Mastercard's role. An ideal candidate is someone who works well with a cohesive, team-oriented group of developers, who has a passion for developing great software, who enjoys solving problems in a challenging environment, and who has the desire to take their career to the next level. If any of these opportunities excite you, we would love to talk.

Responsibilities:
Design, develop, test, deploy, maintain, and continuously improve software products
Manage individual project priorities, deadlines, and deliverables
Ensure the final product is highly performant, responsive, and aligned with business goals
Actively participate in agile ceremonies including daily stand-ups, story pointing, backlog refinement, and retrospectives
Collaborate with cross-functional teams including Product, Business, and Engineering
Coach and mentor junior and new team members to foster professional growth and team alignment

All About You / Experience:
Proven experience as a senior or lead software engineer with a strong technical foundation
Proficient in at least one modern programming language, preferably Java
Strong knowledge of object-oriented design, data structures, algorithms, and complexity analysis
Extensive experience building RESTful APIs and designing microservices-based architectures
Hands-on expertise with React, Spring Boot, PostgreSQL, and Elasticsearch
Demonstrated full-stack development experience, including frontend and backend integration
Familiarity with cloud platforms, along with distributed systems
Exposure to big data technologies like Apache Spark, Python, and HBase is a plus
Strong communication and leadership skills with a proactive, collaborative mindset

Posted 2 weeks ago

Apply

5.0 - 10.0 years

14 - 17 Lacs

Navi Mumbai

Work from Office


As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating source-to-target pipelines/workflows and implementing solutions that tackle the client's needs.

Your primary responsibilities include:
Design, build, optimize, and support new and existing data models and ETL processes based on our client's business requirements.
Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization.
Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
Must have 5+ years of experience in big data - Hadoop, Spark, Scala, Python, HBase, Hive
Good to have: AWS - S3, Athena, DynamoDB, Lambda, Jenkins, Git
Developed Python and PySpark programs for data analysis
Good working experience with Python to develop a custom framework for generating rules (like a rules engine)
Developed Python code to gather data from HBase and designed the solution for implementation using PySpark; Apache Spark DataFrames/RDDs were used to apply business transformations, and Hive context objects were used to perform read/write operations (see the sketch below)

Preferred technical and professional experience:
Understanding of DevOps
Experience in building scalable end-to-end data ingestion and processing solutions
Experience with object-oriented and/or functional programming languages, such as Python, Java, and Scala
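The HBase/Hive read-write flow described above maps naturally onto modern PySpark, where the old HiveContext behaviour is exposed through SparkSession with enableHiveSupport(). The sketch below is illustrative only: the database and table names are assumptions, and reading HBase directly would additionally need an HBase-Spark connector or a Hive table defined over HBase via the HBase storage handler.

```python
from pyspark.sql import SparkSession, functions as F

# SparkSession with Hive support is the modern replacement for HiveContext.
spark = (SparkSession.builder
         .appName("hbase-hive-transform")
         .enableHiveSupport()
         .getOrCreate())

# Source table (e.g. a Hive table mapped over HBase, or a snapshot exported to Hive).
txns = spark.table("raw_db.customer_transactions")

# Business transformation expressed with the DataFrame API.
daily = (txns
         .filter(F.col("status") == "SETTLED")
         .groupBy("customer_id", F.to_date("txn_ts").alias("txn_date"))
         .agg(F.sum("amount").alias("daily_spend"),
              F.count("*").alias("txn_count")))

# Persist the result back to Hive for downstream consumers.
daily.write.mode("overwrite").saveAsTable("curated_db.customer_daily_spend")
```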

Posted 2 weeks ago

Apply

2.0 - 5.0 years

14 - 17 Lacs

Gurugram

Work from Office


As a Big Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include: developing, maintaining, evaluating, and testing big data solutions, and taking part in data engineering activities like creating source-to-target pipelines/workflows and implementing solutions that tackle the client's needs.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
Big Data development: Hadoop, Hive, Spark, PySpark, strong SQL
Ability to incorporate a variety of statistical and machine learning techniques
Basic understanding of cloud (AWS, Azure, etc.)
Ability to use programming languages like Java, Python, Scala, etc., to build pipelines that extract and transform data from a repository to a data consumer
Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed
Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop, and Java

Preferred technical and professional experience:
Basic understanding of or experience with predictive/prescriptive modeling
Thrive on teamwork, with excellent verbal and written communication skills
Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions

Posted 2 weeks ago

Apply

6.0 - 11.0 years

14 - 17 Lacs

Mysuru

Work from Office


As an Application Developer, you will lead IBM into the future by translating system requirements into the design and development of customized systems in an agile environment. The success of IBM is in your hands as you transform vital business needs into code and drive innovation. Your work will power IBM and its clients globally, collaborating and integrating code into enterprise systems. You will have access to the latest education, tools, and technology, and a limitless career path with the world's technology leader. Come to IBM and make a global impact.

Responsibilities:
Manage end-to-end feature development and resolve challenges faced in implementing it
Learn new technologies and apply them in feature development within the time frame provided
Manage debugging, root cause analysis, and fixing of the issues reported on the Content Management back-end software system

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
Overall, more than 6 years of experience, with more than 4 years of strong hands-on experience in Python and Spark
Strong technical ability to understand, design, write, and debug applications in Python and PySpark
Strong problem-solving skills

Preferred technical and professional experience:
Good to have: hands-on experience with cloud technology (AWS/GCP/Azure)

Posted 2 weeks ago

Apply

3.0 - 7.0 years

11 - 15 Lacs

Bengaluru

Work from Office


Your Role
Experience in data engineering and end-to-end implementation of CDP projects.
Proficient in SQL, CDP (Treasure Data), Python/Digdag, Presto/SQL, and data engineering.
Hands-on experience with Treasure Data CDP implementation and management.
Excellent SQL skills, including advanced query writing and optimization.
Oversee the end-to-end maintenance and operation of the Treasure Data CDP.
Familiarity with data integration, API operations, and audience segmentation.

Your Profile
Experience in unifying data across multiple brands and regions, ensuring consistency and accuracy.
Ability to create and manage data workflows in Treasure Data.
Collaborate with cross-functional teams to ensure successful data integration and usage.
Troubleshoot and optimize data pipelines and processes for scalability and performance.
Stay updated on the latest features and best practices in Treasure Data and related technologies.

Posted 2 weeks ago

Apply

6.0 - 11.0 years

8 - 13 Lacs

Bengaluru

Work from Office


About the Role:
This role is responsible for managing and maintaining complex, distributed big data ecosystems. It ensures the reliability, scalability, and security of large-scale production infrastructure. Key responsibilities include automating processes, optimizing workflows, troubleshooting production issues, and driving system improvements across multiple business verticals.

Roles and Responsibilities:
Manage, maintain, and support incremental changes to Linux/Unix environments.
Lead on-call rotations and incident responses, conducting root cause analysis and driving postmortem processes.
Design and implement automation systems for managing big data infrastructure, including provisioning, scaling, upgrades, and patching clusters.
Troubleshoot and resolve complex production issues while identifying root causes and implementing mitigating strategies.
Design and review scalable and reliable system architectures.
Collaborate with teams to optimize overall system performance.
Enforce security standards across systems and infrastructure.
Set technical direction, drive standardization, and operate independently.
Ensure availability, performance, and scalability of systems and services through proactive monitoring, maintenance, and capacity planning.
Resolve, analyze, and respond to system outages and disruptions, and implement measures to prevent similar incidents from recurring.
Develop tools and scripts to automate operational processes, reducing manual workload, increasing efficiency, and improving system resilience (a small sketch follows below).
Monitor and optimize system performance and resource usage, identify and address bottlenecks, and implement best practices for performance tuning.
Collaborate with development teams to integrate best practices for reliability, scalability, and performance into the software development lifecycle.
Stay informed of industry technology trends and innovations, and actively contribute to the organization's technology communities.
Develop and enforce SRE best practices and principles.
Align across functional teams on priorities and deliverables.
Drive automation to enhance operational efficiency.

Skills Required:
Over 6 years of experience managing and maintaining distributed big data ecosystems.
Strong expertise in Linux, including IP, iptables, and IPsec.
Proficiency in scripting/programming with languages like Perl, Golang, or Python.
Hands-on experience with the Hadoop stack (HDFS, HBase, Airflow, YARN, Ranger, Kafka, Pinot).
Familiarity with open-source configuration management and deployment tools such as Puppet, Salt, Chef, or Ansible.
Solid understanding of networking, open-source technologies, and related tools.
Excellent communication and collaboration skills.
DevOps tools: SaltStack, Ansible, Docker, Git.
SRE logging and monitoring tools: ELK stack, Grafana, Prometheus, OpenTSDB, OpenTelemetry.

Good to Have:
Experience managing infrastructure on public cloud platforms (AWS, Azure, GCP).
Experience in designing and reviewing system architectures for scalability and reliability.
Experience with observability tools to visualize and alert on system performance.
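As one concrete example of the operational scripting mentioned above, the small Python sketch below checks HDFS capacity by parsing the summary of `hdfs dfsadmin -report`. It assumes the Hadoop CLI is on PATH and the user is allowed to run dfsadmin; the 80% threshold and plain-print alerting are placeholders for whatever paging or chat integration a real deployment would use.

```python
import subprocess
import sys

# Illustrative threshold; real deployments would page or post to chat instead of printing.
USAGE_THRESHOLD_PCT = 80.0

def hdfs_dfs_used_pct() -> float:
    """Parse the 'DFS Used%' line from `hdfs dfsadmin -report` output."""
    report = subprocess.run(
        ["hdfs", "dfsadmin", "-report"],
        capture_output=True, text=True, check=True,
    ).stdout
    for line in report.splitlines():
        if line.startswith("DFS Used%"):
            return float(line.split(":", 1)[1].strip().rstrip("%"))
    raise RuntimeError("DFS Used% not found in dfsadmin report")

if __name__ == "__main__":
    used = hdfs_dfs_used_pct()
    if used >= USAGE_THRESHOLD_PCT:
        print(f"ALERT: HDFS capacity at {used:.1f}% (threshold {USAGE_THRESHOLD_PCT}%)")
        sys.exit(1)  # non-zero exit lets a scheduler or monitor flag the run
    print(f"OK: HDFS capacity at {used:.1f}%")
```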

Posted 2 weeks ago

Apply