1 - 3 years
3.0 - 6.0 Lacs P.A.
Pune
Posted: 2 months ago
Hybrid
Full Time
Responsibilities:
- Test big data ingestion and aggregation flows using the Spark shell and related queries.
- Develop automation frameworks in programming languages such as Python to automate big data workflows such as ingestion, aggregation, and ETL processing.
- Debug and troubleshoot issues within the big data ecosystem.
- Set up the big data platform and Hadoop ecosystem for testing.
- Define the test strategy and write test plans for data platform enhancements and new features/services built on it.
- Define operating procedures, service monitors, and alerts, and work with the NOC team to get them implemented.
- Own system and performance testing of the data platform and its applications.
- Solve problems, establish plans, and provide technical consultation in the design, development, and testing of complex engineering projects.
- Review product specifications, write test cases, and develop test plans for assigned areas.
- Identify issues and technical interdependencies and suggest possible solutions.
- Recreate complex customer- and production-reported issues to determine root cause and verify fixes.

Requirements:
- 1-3 years of experience working as an SDET doing meaningful automation.
- Good programming skills; Python preferred.
- Hands-on experience automating backend applications (e.g., databases, REST APIs, server-side services).
- Knowledge of relational databases and SQL.
- Good debugging skills.
- Working experience in a Linux/Unix environment.
- Good understanding of testing methodologies.
- Good to have: hands-on experience with big data technologies such as Hadoop and Spark.
- Quick learner and good team member with a positive attitude.
- Good verbal and written communication skills.

Qualifications:
Primary (Mandatory) Skills:
- Good hands-on experience with Unix/Linux.
- Good hands-on experience writing Python code.
- Understanding of QA methodologies.
Secondary Skills (Good to have):
- Experience in big data platform and data analytics testing is an advantage.
- Knowledge of distributed systems and technologies such as Hadoop and Spark.

Competencies:
- We Celebrate Teamwork
- We Use Data to Solve Problems
- We are Biased Towards Action
- We are Leaders and Innovators
- We put the Customer First

Return to Office: PubMatic employees across the globe have returned to our offices on a hybrid work schedule (3 days in office and 2 days working remotely) that is intended to maximize collaboration, innovation, and productivity among teams and across functions.