JobPe aggregates listings for convenient access; applications are submitted directly on the original job portal.

2.0 years

8 - 18 Lacs

Pune, Maharashtra, India

On-site

Industry & Sector: Recruitment & staffing for technology and analytics roles supporting Financial Services, Retail and Enterprise Data platforms. We are hiring on behalf of clients for an on-site data engineering QA role focused on validating ETL pipelines, data quality, and production-ready data warehouse solutions.

Primary Job Title: ETL QA Engineer
Location: India — On-site

Role & Responsibilities
- Execute end-to-end ETL test cycles: validate source-to-target mappings, transformations, row counts, and data reconciliation for batch and incremental loads.
- Create and maintain detailed test plans, test cases, and traceability matrices from functional and technical specifications.
- Author and run complex SQL/PL/SQL queries to perform record-level validation, aggregate checks, and anomaly detection; capture and report metrics.
- Automate repetitive validation tasks using scripts (Python/Shell) or ETL tool features, and integrate checks into CI/CD pipelines where applicable.
- Log, triage and manage defects in JIRA/ALM; reproduce issues, collaborate with ETL developers to resolve root causes, and validate fixes through regression testing.
- Participate in requirement and design reviews to improve testability, support production cutovers, and execute post-deployment validations.

Skills & Qualifications

Must-Have
- 2+ years of hands-on ETL / data warehouse testing experience (source-to-target testing, reconciliation).
- Strong SQL skills (complex joins, aggregations, window functions) and experience writing validation queries for large datasets.
- Hands-on experience with at least one ETL tool: Informatica PowerCenter, Microsoft SSIS or Talend.
- Solid understanding of data warehouse concepts: star/snowflake schemas, SCDs, fact and dimension tables, partitions.
- Experience with test management and defect-tracking tools (JIRA, HP ALM) and basic Unix/Linux command-line proficiency.
- Good analytical thinking, attention to detail, and effective verbal/written communication for on-site client collaboration.

Preferred
- Experience automating data validation using Python or shell scripts and integrating checks into CI/CD workflows.
- Familiarity with cloud data platforms (Snowflake, Redshift, BigQuery, Azure Synapse) and ETL scheduling tools (Control-M, Autosys).
- Exposure to performance testing for ETL jobs and experience working in Agile delivery teams.

Benefits & Culture Highlights
- Exposure to large-scale data warehouse projects across banking, retail and enterprise clients, enabling fast skill growth.
- Collaborative, client-facing environment with structured onboarding and an emphasis on practical, hands-on learning.
- On-site role enabling close collaboration with cross-functional teams and direct impact on production readiness.

To apply: Submit an updated CV highlighting ETL testing experience, sample SQL queries or automation snippets (if available), and your immediate availability. Competitive opportunities for candidates who demonstrate strong technical verification skills and attention to data quality.

Skills: ETL testing, big data, data warehouse testing
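The source-to-target reconciliation this role centers on (row counts, aggregate comparisons, record-level validation) reduces to a few repeatable SQL patterns. A minimal sketch using an in-memory SQLite database; the `src`/`tgt` staging tables and the `amount` measure are invented for illustration, and in practice the same queries would run against the real source and warehouse connections:

```python
import sqlite3

def reconcile(conn, source, target, key, measure):
    """Compare row counts and a column aggregate between source and target,
    then list keys present in source but missing from target."""
    cur = conn.cursor()
    checks = {}
    for table in (source, target):
        cur.execute(f"SELECT COUNT(*), COALESCE(SUM({measure}), 0) FROM {table}")
        checks[table] = cur.fetchone()
    # Record-level validation: keys loaded into source but absent from target.
    cur.execute(f"SELECT {key} FROM {source} EXCEPT SELECT {key} FROM {target}")
    missing = [row[0] for row in cur.fetchall()]
    return {
        "row_counts_match": checks[source][0] == checks[target][0],
        "sums_match": checks[source][1] == checks[target][1],
        "missing_keys": missing,
    }

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src (id INTEGER, amount REAL);
    CREATE TABLE tgt (id INTEGER, amount REAL);
    INSERT INTO src VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO tgt VALUES (1, 10.0), (2, 20.0);
""")
result = reconcile(conn, "src", "tgt", "id", "amount")
```

Here the target dropped one row, so both the count check and the sum check fail and key 3 is reported missing; the result dictionary would feed a test report or a CI gate.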

Posted 3 weeks ago

Apply

2.0 years

8 - 18 Lacs

Noida, Uttar Pradesh, India

On-site

Industry & Sector: Recruitment & staffing for technology and analytics roles supporting Financial Services, Retail and Enterprise Data platforms. We are hiring on behalf of clients for an on-site data engineering QA role focused on validating ETL pipelines, data quality, and production-ready data warehouse solutions.

Primary Job Title: ETL QA Engineer
Location: India — On-site

Role & Responsibilities
- Execute end-to-end ETL test cycles: validate source-to-target mappings, transformations, row counts, and data reconciliation for batch and incremental loads.
- Create and maintain detailed test plans, test cases, and traceability matrices from functional and technical specifications.
- Author and run complex SQL/PL/SQL queries to perform record-level validation, aggregate checks, and anomaly detection; capture and report metrics.
- Automate repetitive validation tasks using scripts (Python/Shell) or ETL tool features, and integrate checks into CI/CD pipelines where applicable.
- Log, triage and manage defects in JIRA/ALM; reproduce issues, collaborate with ETL developers to resolve root causes, and validate fixes through regression testing.
- Participate in requirement and design reviews to improve testability, support production cutovers, and execute post-deployment validations.

Skills & Qualifications

Must-Have
- 2+ years of hands-on ETL / data warehouse testing experience (source-to-target testing, reconciliation).
- Strong SQL skills (complex joins, aggregations, window functions) and experience writing validation queries for large datasets.
- Hands-on experience with at least one ETL tool: Informatica PowerCenter, Microsoft SSIS or Talend.
- Solid understanding of data warehouse concepts: star/snowflake schemas, SCDs, fact and dimension tables, partitions.
- Experience with test management and defect-tracking tools (JIRA, HP ALM) and basic Unix/Linux command-line proficiency.
- Good analytical thinking, attention to detail, and effective verbal/written communication for on-site client collaboration.

Preferred
- Experience automating data validation using Python or shell scripts and integrating checks into CI/CD workflows.
- Familiarity with cloud data platforms (Snowflake, Redshift, BigQuery, Azure Synapse) and ETL scheduling tools (Control-M, Autosys).
- Exposure to performance testing for ETL jobs and experience working in Agile delivery teams.

Benefits & Culture Highlights
- Exposure to large-scale data warehouse projects across banking, retail and enterprise clients, enabling fast skill growth.
- Collaborative, client-facing environment with structured onboarding and an emphasis on practical, hands-on learning.
- On-site role enabling close collaboration with cross-functional teams and direct impact on production readiness.

To apply: Submit an updated CV highlighting ETL testing experience, sample SQL queries or automation snippets (if available), and your immediate availability. Competitive opportunities for candidates who demonstrate strong technical verification skills and attention to data quality.

Skills: ETL testing, big data, data warehouse testing
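Two of the other checks this role calls for, duplicate detection and null handling, can be scripted in the same style. A hedged sketch against an in-memory SQLite table (the `stage_orders` table and its columns are hypothetical):

```python
import sqlite3

def data_quality_checks(conn, table, key, not_null_cols):
    """Flag duplicate business keys and unexpected NULLs, two staple ETL QA checks."""
    cur = conn.cursor()
    # Duplicate keys: any business key appearing more than once in the stage table.
    cur.execute(
        f"SELECT {key}, COUNT(*) FROM {table} GROUP BY {key} HAVING COUNT(*) > 1"
    )
    duplicates = cur.fetchall()
    # Null handling: count NULLs in columns the mapping spec says must be populated.
    null_counts = {}
    for col in not_null_cols:
        cur.execute(f"SELECT COUNT(*) FROM {table} WHERE {col} IS NULL")
        null_counts[col] = cur.fetchone()[0]
    return duplicates, null_counts

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stage_orders (order_id INTEGER, customer TEXT);
    INSERT INTO stage_orders VALUES (1, 'a'), (1, 'b'), (2, NULL);
""")
dupes, nulls = data_quality_checks(conn, "stage_orders", "order_id", ["customer"])
```

Any non-empty duplicate list or non-zero null count would be logged as a defect in JIRA/ALM with the offending keys attached.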

Posted 3 weeks ago

Apply

8.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Job Description: SQL Tester (Immediate Joiners can apply)

Position: SQL Tester
Experience: 4–8 Years
Location: Gurgaon / Bangalore
Budget: Up to 11 LPA

Role Overview
We are seeking an experienced SQL Tester with strong expertise in database testing and SQL query validation. The ideal candidate will be responsible for ensuring data integrity, accuracy, and reliability across multiple applications and databases.

Key Responsibilities
- Design, develop, and execute test cases for validating SQL databases and applications.
- Write and optimize complex SQL queries to test stored procedures, triggers, views, and data models.
- Perform data integrity, ETL, and data migration testing.
- Conduct performance testing of SQL queries and optimize execution plans.
- Collaborate with development and QA teams to analyze requirements and resolve issues.
- Document test results and defects, and maintain test reports.
- Work with defect management tools (JIRA, Bugzilla, etc.) for tracking and reporting.
- Automate database testing processes where applicable.

Required Skills
- Strong proficiency in SQL (joins, indexing, procedures, optimization, etc.).
- Hands-on experience in database testing (Oracle, MS SQL, MySQL, PostgreSQL, etc.).
- Good understanding of ETL testing, data warehouse concepts, and data validation techniques.
- Familiarity with automation frameworks for DB testing is a plus.
- Knowledge of SDLC, STLC, and Agile testing methodologies.
- Strong analytical and problem-solving skills.

Preferred Skills
- Exposure to Big Data testing (Hadoop, Hive, Spark SQL).
- Knowledge of cloud databases (AWS RDS, Azure SQL, Google BigQuery).
- Experience with scripting languages (Python, Shell, etc.) for test automation.

Educational Qualification
Bachelor's degree in Computer Science, IT, or a related field.

Posted 3 weeks ago

Apply

6.0 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site

Company Description
threeS Data, a cutting-edge technology startup based in Coimbatore, India, specializes in Data Architecture, Management, Governance, Analytics, Intelligence, Business Intelligence, Automation, and Machine Learning. Founded in 2024, we focus on delivering simple, smart, and significant solutions that meet our clients' desired outcomes. Our engagements are partnerships, dedicated to understanding the complexities of day-to-day operations and offering practical, honest approaches to deliver exceptional results.

Role Description
This is a contract role ideal for professionals who can independently deliver high-quality ETL solutions in a cloud-native, fast-paced environment. The position is hybrid, based in Coimbatore, with some work-from-home flexibility. Day-to-day tasks include designing, developing, and maintaining data pipelines, performing data modeling, implementing ETL processes, and managing data warehousing solutions. We are looking for candidates with 6+ years of experience and expertise in Apache Airflow, Redshift, and SQL-based data pipelines, with upcoming transitions to Snowflake.

Key Responsibilities:

ETL Design and Development:
- Design and develop scalable and modular ETL pipelines using Apache Airflow, with orchestration and monitoring capabilities.
- Translate business requirements into robust data transformation pipelines across cloud data platforms.
- Develop reusable ETL components to support a configuration-driven architecture.

Data Integration and Transformation:
- Integrate data from multiple sources: Redshift, flat files, APIs, Excel, and relational databases.
- Implement transformation logic such as cleansing, standardization, enrichment, and de-duplication.
- Manage incremental and full loads, along with SCD handling strategies.

SQL and Database Development:
- Write performant SQL queries for data staging and transformation within Redshift and Snowflake.
- Utilize joins, window functions, and aggregations effectively.
- Ensure indexing and query tuning for high-performance workloads.

Performance Tuning:
- Optimize data pipelines and orchestrations for large-scale data volumes.
- Tune SQL queries and monitor execution plans.
- Implement best practices in distributed data processing and cloud-native optimizations.

Error Handling and Logging:
- Implement robust error handling and logging in Airflow DAGs.
- Enable retry logic, alerting mechanisms, and failure notifications.

Testing and Quality Assurance:
- Conduct unit and integration testing of ETL jobs.
- Validate data outputs against business rules and source systems.
- Support QA during UAT cycles and help resolve data defects.

Deployment and Scheduling:
- Deploy pipelines using Git-based CI/CD practices.
- Schedule and monitor DAGs using Apache Airflow and integrated tools.
- Troubleshoot failures and ensure data pipeline reliability.

Documentation and Maintenance:
- Document data flows, DAG configurations, transformation logic, and operational procedures.
- Maintain change logs and update job dependency charts.

Collaboration and Communication:
- Work closely with data architects, analysts, and BI teams to define and fulfill data needs.
- Participate in stand-ups, sprint planning, and post-deployment reviews.

Compliance and Best Practices:
- Ensure ETL processes adhere to data security, governance, and privacy regulations (HIPAA, GDPR, etc.).
- Follow naming conventions, version control standards, and deployment protocols.

Qualifications
- 6+ years of hands-on experience in ETL development.
- Proven experience with Apache Airflow, Amazon Redshift, and strong SQL.
- Strong understanding of data warehousing concepts and cloud-based data ecosystems.
- Familiarity with handling flat files, APIs, and external sources.
- Experience with job orchestration, error handling, and scalable transformation patterns.
- Ability to work independently and meet deadlines.

Preferred Skills:
- Exposure to Snowflake or plans to migrate to Snowflake platforms.
- Experience in healthcare, life sciences, or regulated environments is a plus.
- Familiarity with Azure Data Factory, Power BI, or other cloud BI tools.
- Knowledge of Git, Azure DevOps, or other version control and CI/CD platforms.
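The error-handling requirements in this listing map onto Airflow's `retries`, `retry_delay`, and `on_failure_callback` task arguments. The underlying pattern is just retry-then-alert, sketched below in plain Python so it runs standalone; the flaky extract task is hypothetical:

```python
import logging
import time

def run_with_retries(task, retries=3, delay_seconds=0, on_failure=None):
    """Retry a task callable, log each failure, and fire an alert callback
    only when every attempt is exhausted."""
    last_error = None
    for attempt in range(1, retries + 1):
        try:
            return task()
        except Exception as exc:
            last_error = exc
            logging.warning("attempt %d/%d failed: %s", attempt, retries, exc)
            time.sleep(delay_seconds)
    if on_failure is not None:
        on_failure(last_error)   # e.g. page the on-call channel
    raise last_error

# Hypothetical flaky extract: fails twice, then succeeds on the third call.
calls = {"n": 0}
def flaky_extract():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient source outage")
    return "42 rows loaded"

alerts = []
result = run_with_retries(flaky_extract, retries=5, on_failure=alerts.append)
```

In an actual DAG the same behaviour comes for free from the task's `default_args`; the point of the sketch is that alerting should fire only after retries are exhausted, not on every transient failure.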

Posted 3 weeks ago

Apply

8.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Job Description: SQL Tester

Position: SQL Tester
Experience: 4–8 Years
Location: Gurgaon / Bangalore
Budget: Up to 11 LPA

Role Overview
We are seeking an experienced SQL Tester with strong expertise in database testing and SQL query validation. The ideal candidate will be responsible for ensuring data integrity, accuracy, and reliability across multiple applications and databases.

Key Responsibilities
- Design, develop, and execute test cases for validating SQL databases and applications.
- Write and optimize complex SQL queries to test stored procedures, triggers, views, and data models.
- Perform data integrity, ETL, and data migration testing.
- Conduct performance testing of SQL queries and optimize execution plans.
- Collaborate with development and QA teams to analyze requirements and resolve issues.
- Document test results and defects, and maintain test reports.
- Work with defect management tools (JIRA, Bugzilla, etc.) for tracking and reporting.
- Automate database testing processes where applicable.

Required Skills
- Strong proficiency in SQL (joins, indexing, procedures, optimization, etc.).
- Hands-on experience in database testing (Oracle, MS SQL, MySQL, PostgreSQL, etc.).
- Good understanding of ETL testing, data warehouse concepts, and data validation techniques.
- Familiarity with automation frameworks for DB testing is a plus.
- Knowledge of SDLC, STLC, and Agile testing methodologies.
- Strong analytical and problem-solving skills.

Preferred Skills
- Exposure to Big Data testing (Hadoop, Hive, Spark SQL).
- Knowledge of cloud databases (AWS RDS, Azure SQL, Google BigQuery).
- Experience with scripting languages (Python, Shell, etc.) for test automation.

Educational Qualification
Bachelor's degree in Computer Science, IT, or a related field.

Posted 3 weeks ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

Key Responsibilities

Data Validation
- Write SQL queries to check source vs. target data.
- Validate CRUD operations (Create, Read, Update, Delete).
- Ensure data completeness and correctness after migrations.

ETL Testing
- Validate transformations, aggregations, joins, and filters.
- Check row counts, duplicate data, and null handling.
- Test incremental loads (daily, weekly).

Database Testing
- Verify indexes, constraints, triggers, and stored procedures.
- Check performance of queries (execution plans, indexing).
- Validate transactions and rollback scenarios.

Automation
- Use Python/Java with PyTest/JUnit for automated SQL validations.
- Integrate with CI/CD (Jenkins, GitHub Actions).

Required Skills
- SQL (core skill): joins, subqueries, window functions, CTEs.
- Database systems: Oracle, MySQL, PostgreSQL, SQL Server, Teradata, Snowflake, Redshift.
- ETL and data warehousing concepts: fact and dimension validation; Slowly Changing Dimensions (SCDs).
- Testing tools: JIRA, HP ALM, Selenium (for UI + backend testing).
- Programming basics: Python / shell scripting for automation.
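The "PyTest for automated SQL validations" item usually amounts to plain test functions that open a database connection and assert on query results. A sketch using SQLite as a stand-in warehouse (the table names and data are invented; under pytest these functions would be collected and run automatically):

```python
import sqlite3

def get_conn():
    """Stand-in for a fixture that would connect to the real warehouse."""
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE src_sales (id INTEGER, region TEXT, amount REAL);
        CREATE TABLE dim_region (region TEXT PRIMARY KEY);
        INSERT INTO src_sales VALUES (1, 'north', 10), (2, 'south', 15);
        INSERT INTO dim_region VALUES ('north'), ('south');
    """)
    return conn

def test_no_orphan_regions():
    """Every fact row must join to a dimension row (referential check)."""
    conn = get_conn()
    orphans = conn.execute("""
        SELECT COUNT(*) FROM src_sales s
        LEFT JOIN dim_region d ON s.region = d.region
        WHERE d.region IS NULL
    """).fetchone()[0]
    assert orphans == 0

def test_amounts_are_positive():
    """Business rule: no zero or negative sale amounts should load."""
    conn = get_conn()
    bad = conn.execute(
        "SELECT COUNT(*) FROM src_sales WHERE amount <= 0"
    ).fetchone()[0]
    assert bad == 0
```

Wired into Jenkins or GitHub Actions, a failing assertion fails the pipeline, which is exactly the CI/CD integration the listing asks for.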

Posted 3 weeks ago

Apply

0 years

0 Lacs

India

Remote

🔍 Real-Time Projects | Remote | Performance-Based Stipend
📅 Application Deadline: 1st September 2025

Are you eager to dig into databases, write meaningful SQL queries, and extract insights that drive decisions? We're offering a remote internship for aspiring data professionals to work on real-time, industry-level capstone projects that build both skill and portfolio.

🧠 Role: Database Insights Intern
Location: Remote
Duration: Flexible (minimum commitment required)
Stipend: Performance-based (top performers are rewarded)
Start Date: Rolling basis
Deadline to Apply: 1st September 2025

🔧 What You'll Be Doing:
- Work on real-world datasets from various industries
- Write and optimize SQL queries to extract, clean, and transform data
- Analyze data and generate insights to support business decisions
- Build basic dashboards and reports to communicate findings
- Collaborate with mentors and peers in a remote team environment

✅ What You Need:
- Basic to intermediate SQL skills (joins, subqueries, aggregations, etc.)
- Understanding of databases and data types
- Interest in analytics, business intelligence, and storytelling with data
- Familiarity with Excel, Power BI, or Tableau is a plus
- Self-motivation, curiosity, and a problem-solving mindset

🎁 What You'll Gain:
- Hands-on experience with real-time capstone projects
- Mentorship and guidance from industry professionals
- Flexible schedule to work at your own pace
- Performance-based stipend and rewards for top performers
- Internship certificate and letter of recommendation for high achievers
- A strong portfolio to showcase your skills to future employers

📩 How to Apply:
Deadline to Apply: 1st September 2025

Posted 3 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site

Function: People & Culture
Sub-function: HR Operations

Role Overview
Behind every great employee experience is an HR Operations engine that just works. From the first day someone joins to the moment they grow, move, or even exit, HR Ops is the team that makes these transitions smooth, fair, and meaningful. We're looking for an HR Operations Manager who believes that the right data, the right process, and the right experience can shape someone's career. This isn't just about running checklists; it's about creating moments that matter while removing the friction from day-to-day HR interactions.

Champion employee experience
- Make every HR touchpoint (onboarding, confirmations, transfers, exits) simple, supportive, and human.
- Be the go-to partner when employees have questions, ensuring they feel heard and cared for.
- Oversee HR data in HRIS/HRMS platforms, ensuring it is complete, accurate, and career-impact safe.

Guard the accuracy of employee data
- Recognize that a single wrong entry can affect an employee's career progression.
- Keep employee records clean, consistent, and trustworthy, so leaders and employees can rely on them.

Automate what slows people down
- Spot repetitive, manual tasks and replace them with smart automation.
- Build self-service tools that let employees get what they need instantly, without waiting.
- Ensure automation enhances employee convenience without losing the human touch.

Simplify compliance
- Make sure our HR practices follow the law and company policy, but in a way that feels transparent and employee-friendly.

Insights & Continuous Improvement
- Use AI-generated dashboards to track key HR Ops metrics (turnaround times, employee query patterns, process efficiency).
- Translate data into action by improving processes, reducing errors, and enhancing employee trust in HR Ops.

Use data to tell stories
- Turn numbers into insights: help leaders understand trends in retention, engagement, and growth.
- Share dashboards and reports that actually drive decisions, not just fill inboxes.

What Makes You a Great Fit
- 6-10 years in HR Operations, preferably in a fast-scaling or multi-location setup.
- Strong understanding of HR processes and their impact on employee experience.
- Comfortable working with HRMS platforms and open to adopting AI/automation tools.
- Highly detail-oriented with a strong sense of accountability for data accuracy.
- Empathetic and approachable, with strong problem-solving skills.
- Collaborative mindset: you partner with payroll, compliance, and business teams effectively.

How Success Looks
- Employees feel HR Ops is responsive, supportive, and accurate.
- Repetitive HR Ops work is automated, freeing you to focus on employee needs.
- HR processes run smoothly, with no delays and no bottlenecks.
- The function is seen as an enabler of employee growth and trust, not just a processor of tasks.

Posted 3 weeks ago

Apply

4.0 - 6.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Our client is a global technology company headquartered in Santa Clara, California. It focuses on helping organisations harness the power of data to drive digital transformation, enhance operational efficiency, and achieve sustainability. It combines over 100 years of experience in operational technology (OT) and more than 60 years in IT to unlock the power of data from your business, your people and your machines. We help enterprises store, enrich, activate and monetise their data to improve their customers' experiences, develop new revenue streams and lower their business costs. Over 80% of the Fortune 100 trust our client for data solutions. The company's consolidated revenues for fiscal 2024 (ended March 31, 2024) were approximately USD 57.5 billion, and it has approximately 296,000 employees worldwide. It delivers digital solutions utilising Lumada in five sectors (Mobility, Smart Life, Industry, Energy and IT) to increase customers' social, environmental and economic value.

Job Title: Python Full-Stack Developer
Location: Bengaluru
Experience: 4-6 Years
Job Type: Contract to hire
Notice Period: Immediate joiners
Mandatory Skills: Angular, Python, DevOps, Python Flask

JD: Full-Stack Developer (Angular + Python Flask)

Job Summary: Experienced full-stack developer with expertise in Angular (TypeScript) for front-end development and Python Flask for back-end API development. Strong background in Microsoft SQL Server, authentication using Azure AD (MSAL), and implementing efficient API integrations. Skilled in unit testing, debugging, and optimizing performance.

Key Skills:
- Front-end: Angular, TypeScript, PrimeNG, RxJS, state management, React JS
- Back-end: Python Flask, SQLAlchemy, RESTful API development
- Database: Microsoft SQL Server (SQL, joins, query optimization)
- Authentication: Azure AD, MSAL, JWT-based authentication
- DevOps & deployment: Git, CI/CD (Azure DevOps, GitHub Actions)
- Additional: data validation, pagination, performance tuning
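Of the back-end items listed (Flask APIs, data validation, pagination), pagination is the most self-contained to sketch. Below is a hedged example of the parameter validation and slicing a Flask list endpoint would typically perform before serializing results to JSON for an Angular/PrimeNG paginator; all names are illustrative and the Flask routing itself is omitted so the snippet runs standalone:

```python
def paginate(items, page=1, page_size=20, max_page_size=100):
    """Validate paging parameters and slice a result set, returning the
    metadata a front-end paginator needs (total rows, page count)."""
    if page < 1:
        raise ValueError("page must be >= 1")
    # Clamp page_size so a client cannot request an unbounded payload.
    page_size = min(max(page_size, 1), max_page_size)
    total = len(items)
    start = (page - 1) * page_size
    return {
        "items": items[start:start + page_size],
        "page": page,
        "page_size": page_size,
        "total": total,
        "pages": (total + page_size - 1) // page_size,  # ceiling division
    }

# 95 hypothetical rows; page 2 should hold rows 20-39.
result = paginate(list(range(95)), page=2, page_size=20)
```

In a real Flask view the list would come from a SQLAlchemy query (ideally with `LIMIT`/`OFFSET` pushed into SQL rather than slicing in Python) and the dictionary would be returned via `jsonify`.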

Posted 3 weeks ago

Apply

0 years

0 Lacs

India

On-site

Company Overview
BroskiesHub is a forward-thinking organization dedicated to cultivating the next generation of technology leaders. Our mission is to remove the 'fresher' tag by bridging the critical gap between academic knowledge and industry demands. We provide a structured, project-based environment where aspiring professionals gain verifiable, hands-on experience. This transforms them from students into skilled individuals, equipped with the confidence and a portfolio to prove they are more than just freshers.

Position Summary
BroskiesHub is seeking highly motivated and detail-oriented individuals for our 45-day unpaid SQL Developer Internship program. This role is designed as a rigorous, performance-based evaluation for a potential paid position. The successful intern will gain invaluable experience in database management, data analysis, and query optimization, and will have a direct opportunity to transition into a paid role based on demonstrated merit. Please note, this is a strictly unpaid training and evaluation internship.

Key Responsibilities
- Independently manage and execute data-related tasks, including writing and optimizing SQL queries.
- Assist in designing, developing, and maintaining databases.
- Apply strong analytical skills to extract, manipulate, and analyze data to generate insights.
- Participate in mandatory progress reviews and mentorship sessions.
- Maintain comprehensive documentation for all queries, database schemas, and project work.

Required Qualifications
- Currently enrolled in or a recent graduate of a B.Tech, B.E., BCA, or equivalent program in Computer Science, IT, or a related technical discipline.
- A solid foundational understanding of relational databases and core SQL concepts (including JOINs, GROUP BY, and subqueries).
- Basic familiarity with at least one major database system (e.g., MySQL, PostgreSQL, MS SQL Server).
- Demonstrated ability to work independently and manage time effectively.
- A strong desire to learn, improve, and build a career in data and database development.
- Excellent written and verbal communication skills.

Program Structure: A 3-Phase Evaluation for a Paid Opportunity
This unpaid internship is structured as a comprehensive evaluation process. Your performance is your interview.
- Phase 1 (Days 1–15): Foundational Skill Assessment. Execute 10 practical tasks designed to validate your core SQL competencies.
- Phase 2 (Days 16–30): Live Project Execution. Take ownership of a real-world data project to demonstrate your ability to apply skills in a practical setting.
- Phase 3 (Days 31–45): Innovation Challenge. Design and develop a database project of your own choosing, showcasing your technical creativity and strategic thinking.

Compensation and Future Opportunities
- This is a completely unpaid internship. There will be no stipend or salary provided during the 45-day evaluation period.
- Performance-Based Paid Role: Top-performing interns who consistently exceed expectations and demonstrate exceptional technical aptitude and professionalism will be extended an offer for a paid internship or a full-time position within our core team.
- Professional Development: All interns who complete the program will receive a Certificate of Completion and a powerful portfolio of work. High performers will also receive a strong Letter of Recommendation.
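For reference, the three core SQL concepts this internship validates (JOINs, GROUP BY, subqueries) in one small runnable example; the tables and data are invented, and SQLite stands in for MySQL/PostgreSQL/MS SQL Server:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
    INSERT INTO customers VALUES (1, 'Asha'), (2, 'Ravi'), (3, 'Meera');
    INSERT INTO orders VALUES (10, 1, 250.0), (11, 1, 100.0), (12, 2, 75.0);
""")

# JOIN + GROUP BY: total spend per customer, keeping customers with no orders.
totals = conn.execute("""
    SELECT c.name, COALESCE(SUM(o.amount), 0) AS total
    FROM customers c
    LEFT JOIN orders o ON o.customer_id = c.id
    GROUP BY c.id, c.name
    ORDER BY total DESC
""").fetchall()

# Subquery: customers whose total spend exceeds the average order amount.
big_spenders = conn.execute("""
    SELECT name FROM customers
    WHERE id IN (
        SELECT customer_id FROM orders
        GROUP BY customer_id
        HAVING SUM(amount) > (SELECT AVG(amount) FROM orders)
    )
""").fetchall()
```

Note the LEFT JOIN: an INNER JOIN would silently drop Meera, who has no orders, which is exactly the kind of distinction the phase-1 tasks are likely to probe.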

Posted 3 weeks ago

Apply

0 years

12 - 15 Lacs

Bengaluru, Karnataka, India

On-site

Role Overview
We are seeking a highly skilled Power BI Developer with strong SQL expertise to design, develop, and deploy interactive dashboards and analytical solutions. The ideal candidate will have hands-on experience in data modeling, DAX, and end-to-end dashboard development, with a strong analytical mindset and attention to detail.

Key Responsibilities
- Design and develop Power BI dashboards and reports that deliver actionable insights.
- Build robust data models and implement DAX calculations for complex business logic.
- Collaborate with business stakeholders to gather requirements and translate them into technical solutions.
- Optimize dashboards for performance and scalability.
- Develop and maintain SQL queries, stored procedures, and ETL processes to support reporting needs.
- Ensure data quality, accuracy, and consistency across dashboards.
- Work with cross-functional teams (business analysts, data engineers, stakeholders) to deliver end-to-end BI solutions.

Required Skills
- Power BI expertise: data modeling, DAX, Power Query (M), custom visuals, and advanced dashboard design.
- Strong SQL development skills: query optimization, joins, functions, stored procedures.
- Experience with the end-to-end dashboard lifecycle (requirement gathering, design, development, deployment, and maintenance).
- Good understanding of ETL concepts and data pipelines.
- Strong problem-solving and communication skills.

Skills: Power BI, SQL, DAX

Posted 3 weeks ago

Apply

2.0 - 5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Eurofins Scientific is an international life sciences company, providing a unique range of analytical testing services to clients across multiple industries, to make life and the environment safer, healthier and more sustainable. From the food you eat to the medicines you rely on, Eurofins works with the biggest companies in the world to ensure the products they supply are safe, their ingredients are authentic and labelling is accurate.

Eurofins is a global leader in food, environmental, pharmaceutical and cosmetic product testing and in agroscience CRO services. It is also one of the global independent market leaders in certain testing and laboratory services for genomics, discovery pharmacology, forensics, CDMO, advanced material sciences and in the support of clinical studies. In just over 35 years, Eurofins has grown from one laboratory in Nantes, France to 62,000 staff across a network of over 1,000 independent companies in 61 countries, operating 900 laboratories. Performing over 450 million tests every year, Eurofins offers a portfolio of over 200,000 analytical methods to evaluate the safety, identity, composition, authenticity, origin, traceability and purity of biological substances and products, as well as providing innovative clinical diagnostic testing services, as one of the leading global emerging players in specialised clinical diagnostics testing. Eurofins is one of the fastest growing listed European companies, with a listing on the French stock exchange since 1997.

Eurofins IT Solutions India Pvt Ltd (EITSI) is a fully owned subsidiary of Eurofins and functions as a Global Software Delivery Center exclusively catering to Eurofins' global IT business needs. The code shipped out of EITSI impacts the global network of Eurofins labs and services. The primary focus at EITSI is to develop the next-generation LIMS (Lab Information Management System), customer portals, e-commerce solutions, ERP/CRM systems, mobile apps and other B2B platforms for various Eurofins laboratories and businesses. Young and dynamic, we have a rich culture and we offer fulfilling careers.

Job Description
QA Engineer, Eurofins IT Solutions, Bengaluru, Karnataka, India

With 36 facilities worldwide, Eurofins BioPharma Product Testing (BPT) is the largest network of bio/pharmaceutical GMP product testing laboratories, providing comprehensive laboratory services for the world's largest pharmaceutical, biopharmaceutical, and medical device companies. BPT is enabled by global engineering teams working on next-generation applications and Laboratory Information Management Systems (LIMS). As Automation Engineer, you will be a crucial part of our delivery team, ensuring product features are completely automated and reducing idea-to-live time for the business. As a technology leader, BPT wants to give you the opportunity not just to accept new challenges and opportunities but to impress with your ingenuity, focus, attention to detail and collaboration with a global team of professionals. This role reports to a Senior Manager.

Required Experience and Skills
- 2 to 5 years of experience.
- Expertise in automation testing of web- and Windows-based applications.
- Good experience in building and using automation frameworks with Java/C#.
- Experience in in-sprint automation.
- Strong automation background with experience in identifying and reviewing test cases and testing results.
- Ability to understand complex requirements and transform them into test scenarios, test cases and test scripts.
- Proficient in a version control tool (e.g., Git); experience automating CI/CD pipelines.
- Good knowledge of manual testing of web- and Windows-based applications.
- Excellent understanding of the SDLC and STLC lifecycles.
- Hands-on experience preparing test strategies, test plans and requirement traceability matrices.
- Exposure to different testing types: sanity, functional, integration, exploratory and system testing.
- Understanding of Agile/Scrum methodology and working experience in an Agile environment.
- Proficient in creating test reports; able to analyze risks.
- Prior experience in testing LIMS would be an added advantage.

Specific Skills Required
- Expertise in automation testing using Selenium/Protractor/WebdriverIO.
- Hands-on experience in programming languages (C#, Java, Python).
- Experience creating VMs for automation and setting up the configuration in the test suite and runners.
- Hands-on experience in SQL to write medium-complexity queries (e.g., joins, sub-queries).
- Good understanding of JSON, XML and REST, and experience testing web services using Postman, REST Assured, JSON Lint or SOAP UI.
- Experience with a defect tracking tool such as Jira, TFS/MTM or Bugzilla.
- Experience with test management tools such as TFS/MTM or ALM.

Desirable Experience
- Knowledge of performance testing.
- Knowledge of security testing.
- Knowledge of mobile automation testing.

Additional Information
Personal skills:
- Excellent analytical and problem-solving skills.
- Excellent verbal and written communication skills.
- Ability to articulate and present different points of view on various topics related to the project and otherwise.
- Eager to learn and continuously develop personal and technical capabilities.

Required Qualifications: MCA or Bachelor's in Engineering, Computer Science or equivalent.

Performance Appraisal Criteria: Eurofins has a strong focus on its performance management system. This includes quarterly calibrations, half-yearly reviews and annual reviews. KPIs will be set, may vary slightly between projects, and will be clearly communicated and documented during the first 30 days of your joining.
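The web-service testing this role mentions (Postman, REST Assured, SOAP UI) largely reduces to asserting on status codes and payload shape. A minimal Python sketch of those assertions against a hypothetical LIMS sample-lookup response; no real endpoint is called, the body is a canned JSON string:

```python
import json

def check_response(body, required_fields, expected_status):
    """Collect the violations one would otherwise script as Postman test
    assertions: status code, presence of fields, and field types."""
    errors = []
    if body["status"] != expected_status:
        errors.append(f"unexpected status {body['status']}")
    for field, ftype in required_fields.items():
        if field not in body["data"]:
            errors.append(f"missing field: {field}")
        elif not isinstance(body["data"][field], ftype):
            errors.append(f"wrong type for {field}")
    return errors

# Hypothetical response body, as parsed from the HTTP payload.
raw = '{"status": 200, "data": {"sample_id": "S-1042", "result": 7.4}}'
body = json.loads(raw)
errors = check_response(body, {"sample_id": str, "result": float}, expected_status=200)
```

In a live suite the `body` would come from an HTTP client call, and a non-empty error list would fail the test and be logged against the defect tracker.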

Posted 3 weeks ago

Apply

0.0 - 2.0 years

0 - 0 Lacs

pitampura, delhi, delhi

On-site

Job Title: Data Analyst Location: DMALL, 815, Netaji Subhash Place, Pitampura, New Delhi – 110034 Job Type: Full-time Note: Freshers need not apply. Preferred: Male candidates only Job Summary: We are looking for a Data Analyst with strong expertise in Google Sheets, the WhatsApp API, and automation scripting. The candidate should have hands-on experience with data handling, formulas, API integration, and data visualization to support real-time reporting, automation, and business intelligence. Key Responsibilities (KRA): 1. Data Handling & Reporting Manage, clean, and organize datasets using Google Sheets & SQL. Build dashboards and MIS reports with advanced Google Sheets formulas: VLOOKUP / HLOOKUP INDEX-MATCH QUERY ARRAYFORMULA IMPORTRANGE REGEXMATCH / REGEXEXTRACT Pivot Tables & Charts Automate data imports/exports between systems using Google Apps Script. 2. WhatsApp API & Integrations Implement and manage the WhatsApp Business API (via Twilio / Gupshup / WATI / Meta API). Write scripts to fetch messages, leads, and responses from WhatsApp and link them with Google Sheets / CRM. Automate sending alerts, reports, or notifications through the WhatsApp API. Track and analyze WhatsApp campaign metrics (open rates, response time, engagement). Deliver ad-hoc analysis for sales and operations teams, providing actionable insights and business intelligence support. Integrate the WhatsApp Business API with Google Sheets and CRM systems for lead management and response tracking. Automate WhatsApp alerts, notifications, and daily/weekly reports using Google Apps Script, reducing manual intervention. Design and deliver interactive dashboards in Power BI and Google Data Studio for sales, operations, and marketing teams, enhancing data-driven decision-making. Manage and clean large datasets using advanced Google Sheets formulas and SQL queries to ensure data accuracy. 
Build and maintain MIS dashboards with pivot tables, query functions, and automated reporting, improving decision-making for management. Automate manual workflows using Google Apps Script and Python, saving 10+ hours of repetitive work per week. Implement WhatsApp API integration for campaign tracking, enabling real-time engagement and performance reporting. 3. Scripting & Automation Develop automation workflows using: Google Apps Script (JavaScript-based) Python (for data extraction, cleaning, API calls) REST API integration (JSON, XML handling) Automate repetitive reporting tasks and ensure data accuracy. 4. Data Analysis & Visualization Use Google Data Studio / Looker Studio, Power BI, or Tableau for visualization. Generate insights from sales, customer, and operations data. Provide analytical reports for decision-making. 5. Collaboration & Support Work with Sales, Marketing, and Operations teams on data requirements. Provide accurate daily/weekly/monthly MIS reports. Support management with ad-hoc analysis. 
Technical Skills / Languages Required: Google Sheets (advanced formulas, pivot tables, dashboards) Google Apps Script (JavaScript) for automation Python (data extraction, automation, API handling) SQL (basic queries, joins, filtering, data manipulation) REST API handling (WhatsApp API, CRM APIs, JSON/XML) Visualization Tools: Google Data Studio / Power BI / Tableau Additional Good-to-Have Skills: Experience with BigQuery or cloud-based databases Familiarity with Excel VBA macros Knowledge of CRM systems integration Understanding of business communication analytics Share your resume via WhatsApp: 7290050558 Office Hours: 9:30 AM to 6:30 PM Weekend off: Sunday Probation Period: 3 months Job Type: Full-time Pay: ₹25,000.00 - ₹30,000.00 per month Benefits: Cell phone reimbursement Health insurance Leave encashment Paid sick time Provident Fund Application Question(s): Male candidates only; female candidates need not apply Experience: Data analytics: 2 years (Preferred) Google Analytics: 2 years (Preferred) Google Sheets: 2 years (Preferred) Language: English (Preferred) Location: Pitampura, Delhi, Delhi (Preferred) Work Location: In person
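As a rough illustration of the campaign-metrics analysis mentioned above (open rates and response times), here is a minimal Python sketch. The JSON field names are invented for the example; they are not the actual WhatsApp Business API schema:

```python
import json
from datetime import datetime

# Hypothetical payload shaped like a campaign-metrics export.
payload = json.loads("""
[
  {"id": 1, "sent": "2024-05-01T09:00:00", "opened": "2024-05-01T09:05:00"},
  {"id": 2, "sent": "2024-05-01T09:00:00", "opened": "2024-05-01T10:30:00"},
  {"id": 3, "sent": "2024-05-01T09:00:00", "opened": null}
]
""")

opened = [m for m in payload if m["opened"]]
open_rate = len(opened) / len(payload)  # fraction of messages opened

def minutes(a, b):
    """Minutes elapsed between two ISO-8601 timestamps."""
    return (datetime.fromisoformat(b) - datetime.fromisoformat(a)).total_seconds() / 60

# Average send-to-open latency, over opened messages only.
avg_open_minutes = sum(minutes(m["sent"], m["opened"]) for m in opened) / len(opened)
print(round(open_rate, 3), avg_open_minutes)
```

The same aggregation logic carries over to a Google Apps Script version feeding a Sheet, since the transformation is just filter-then-average over the API response.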

Posted 3 weeks ago

Apply

3.0 years

0 Lacs

ahmedabad, gujarat, india

On-site

✨ We’re Hiring: Talent Acquisition Executive (TA Executive) ✨ 📍 Location: Sindhubhavan, Ahmedabad 🗓️ Working Days: 6 days a week 💰 Salary: ₹30,000 – ₹40,000 per month About the Company HOF FURNITURE SYSTEM PVT LTD is a professionally managed, fast-growing organization working in design and furniture manufacturing. Started in 1986 as a limited company, our firm is a well-known manufacturer of office chairs, executive chairs, premium chairs, visitor chairs, arm chairs, living room sofas and many more. All these products are provided to the customer after being tested against various quality standards. All our business activities take place at our company’s headquarters in Ahmedabad, Gujarat. With the skilled administration of our knowledgeable workforce, we have been able to attain a large client base in the market. About the Role We are looking for a dynamic Talent Acquisition Executive to join our HR team at HOF. This role will be responsible for end-to-end recruitment, from sourcing to onboarding, ensuring the best talent joins our growing organization. Responsibilities Handle full-cycle recruitment: sourcing, screening, interviewing, and closing positions across departments. Build and maintain a strong candidate pipeline through job portals, LinkedIn, referrals, and networking. Coordinate with hiring managers to understand manpower requirements. Draft and post job descriptions across platforms to attract top talent. Schedule interviews and manage communication with candidates throughout the process. Support HR operations during onboarding and induction. Maintain hiring metrics and recruitment reports. What We’re Looking For: Graduate / Postgraduate in HR or a related field (MBA preferred) 1–3 years of proven experience in recruitment (preferably in the manufacturing/corporate sector). Excellent communication and interpersonal skills. Ability to multitask and meet hiring deadlines. 
Strong knowledge of sourcing techniques & social media recruitment. Pay range and compensation package Salary: ₹30,000 – ₹40,000 per month Equal Opportunity Statement HOF is an equal opportunity employer and is committed to creating a diverse and inclusive workplace. 📧 To Apply: Share your resume at hr1@hofindia.com

Posted 3 weeks ago

Apply

0.0 years

0 Lacs

delhi

On-site

Job requisition ID :: 86808 Date: Aug 25, 2025 Location: Delhi Designation: Consultant Entity: Deloitte Touche Tohmatsu India LLP Job Summary: We are looking for a detail-oriented and technically proficient professional with strong expertise in Python and SQL as core competencies. The ideal candidate will also have working knowledge of Power BI or Tableau for data visualization, and exposure to PySpark is a plus. This role involves building robust data pipelines, performing advanced analytics, and delivering actionable insights through visual storytelling. Key Responsibilities: Develop and maintain scalable data pipelines using Python and SQL. Perform data extraction, transformation, and loading (ETL) from various sources. Conduct exploratory data analysis and statistical modeling to support business decisions. Create dashboards and reports using Power BI or Tableau to visualize key metrics. Collaborate with cross-functional teams to understand data requirements and deliver solutions. Optimize query performance and ensure data integrity across systems. Contribute to big data processing tasks using PySpark (nice to have). Document processes, workflows, and technical specifications. Required Skills: Primary Skills: Python: Strong experience with data manipulation, scripting, and libraries such as pandas, numpy, and matplotlib. SQL: Advanced proficiency in writing and optimizing queries, joins, and stored procedures. Secondary Skills: Power BI or Tableau: Ability to create interactive dashboards and visualizations. PySpark: Exposure to distributed data processing and Spark-based workflows (preferred but not mandatory). Preferred Qualifications: Bachelor’s or Master’s degree in Computer Science, Data Science, or related field. Experience with cloud data platforms (Azure, AWS, GCP) is a plus. Familiarity with version control (Git) and Agile development practices.
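The pipeline work described above (extraction, transformation, and loading with Python and SQL) can be sketched in miniature with the standard library. The CSV data and table names are hypothetical, and SQLite stands in for the real target warehouse:

```python
import csv
import io
import sqlite3

# Extract: parse raw CSV (here an inline string standing in for a source file).
raw = "region,sales\nNorth,100\nSouth,250\nNorth,150\nEast,not_a_number\n"

# Transform: cast types and drop malformed rows instead of failing the load.
rows = []
for rec in csv.DictReader(io.StringIO(raw)):
    try:
        rows.append((rec["region"], float(rec["sales"])))
    except ValueError:
        continue  # quarantine/skip bad records

# Load: insert into the target table, then aggregate with SQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)", rows)

totals = dict(conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
).fetchall())
print(totals)  # {'North': 250.0, 'South': 250.0}
```

A production pipeline would add logging, schema checks, and idempotent loads, but the extract/transform/load separation is the same.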

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

chennai, tamil nadu, india

On-site

Job Summary We are seeking a highly skilled and detail-oriented Senior SQL Data Analyst to join our data-driven team. This role will be responsible for leveraging advanced SQL skills to extract, analyze, and interpret complex datasets, delivering actionable insights to support business decisions. You will work closely with cross-functional teams to identify trends, solve problems, and drive data-informed strategies across the organization. Key Responsibilities Develop, write, and optimize advanced SQL queries to retrieve and analyze data from multiple sources. Design and maintain complex data models, dashboards, and reports. Collaborate with stakeholders to understand business needs and translate them into analytical requirements. Conduct deep-dive analysis to identify key business trends and opportunities for growth or improvement. Ensure data integrity and accuracy across systems and reporting tools. Automate recurring reports and develop scalable data pipelines. Present findings in a clear, compelling way to both technical and non-technical audiences. Qualifications Required: Bachelor's degree in Computer Science, Information Systems, Mathematics, Statistics, or related field. 5+ years of experience in data analysis or a similar role with a strong focus on SQL. Expert proficiency in SQL (window functions, joins, CTEs, indexing, etc.). Strong understanding of data warehousing concepts and relational database systems (e.g., PostgreSQL, SQL Server, Snowflake, Redshift). Experience with BI tools like Tableau, Power BI, or Looker. Excellent analytical, problem-solving, and communication skills. Preferred Experience with scripting languages (Python, R) for data manipulation. Familiarity with cloud data platforms (AWS, Azure). Knowledge of ETL tools and best practices. Previous experience in a fast-paced, agile environment.
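The window-function and CTE proficiency listed above can be shown with a small, self-contained query: top sales rep per region via a CTE and `ROW_NUMBER()`. SQLite serves as a stand-in engine here, and the data is invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (rep TEXT, region TEXT, amount REAL);
    INSERT INTO sales VALUES
        ('A', 'North', 100), ('B', 'North', 300),
        ('C', 'South', 200), ('D', 'South', 150);
""")

# CTE + window function: rank reps within each region, keep the top one.
top = conn.execute("""
    WITH ranked AS (
        SELECT rep, region, amount,
               ROW_NUMBER() OVER (PARTITION BY region ORDER BY amount DESC) AS rn
        FROM sales
    )
    SELECT region, rep FROM ranked WHERE rn = 1 ORDER BY region
""").fetchall()
print(top)  # [('North', 'B'), ('South', 'C')]
```

The same pattern (rank inside a CTE, filter outside) works unchanged on PostgreSQL, SQL Server, Snowflake, and Redshift.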

Posted 3 weeks ago

Apply

0 years

0 Lacs

noida, uttar pradesh, india

On-site

We are looking for an experienced Senior Business Analyst with strong expertise in the banking domain and proven skills in writing and optimizing complex SQL queries. The ideal candidate should have strong analytical abilities, requirement-gathering expertise, and hands-on experience working with stakeholders in an Agile environment. Key Responsibilities: Collaborate with business stakeholders to elaborate, document, and prioritize requirements. Translate business needs into functional specifications and user stories. Perform data analysis using advanced SQL queries to validate business requirements and support decision-making. Work closely with development and QA teams to ensure requirements are clearly understood and implemented. Utilize JIRA for backlog management, user story tracking, and sprint planning. Provide insights through data-driven analysis, supporting business decisions and regulatory requirements in the banking domain. Required Skills & Experience: Strong SQL expertise: ability to write, optimize, and troubleshoot complex queries, joins, subqueries, and stored procedures. Hands-on experience in the Banking / Financial Services domain. Strong analytical and problem-solving skills with the ability to interpret large datasets. Experience in business analysis, requirement gathering, and documentation. Familiarity with Agile methodology and tools such as JIRA. Excellent communication and stakeholder management skills.
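The SQL-based validation described above can be sketched as a typical banking reconciliation check: join a per-account aggregate sub-query against stored balances and flag disagreements. The schema is hypothetical, with SQLite standing in for the bank's actual database:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL);
    CREATE TABLE txns (account_id INTEGER, amount REAL);
    INSERT INTO accounts VALUES (1, 150.0), (2, 80.0);
    INSERT INTO txns VALUES (1, 100.0), (1, 50.0), (2, 100.0), (2, -30.0);
""")

# Flag any account whose stored balance disagrees with the sum of its
# transactions -- a classic requirement-validation query for a BA.
mismatches = conn.execute("""
    SELECT a.id, a.balance, t.total
    FROM accounts a
    JOIN (SELECT account_id, SUM(amount) AS total
          FROM txns GROUP BY account_id) t
      ON t.account_id = a.id
    WHERE a.balance <> t.total
""").fetchall()
print(mismatches)  # [(2, 80.0, 70.0)]
```

Queries like this double as acceptance criteria: a user story "balances must equal the transaction ledger" is verified by this result set being empty.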

Posted 3 weeks ago

Apply

0 years

0 Lacs

noida, uttar pradesh, india

On-site

About Delhivery: Delhivery is India’s leading fulfillment platform for digital commerce. With a vast logistics network spanning 18,000+ pin codes and over 2,500 cities, Delhivery provides a comprehensive suite of services including express parcel transportation, freight solutions, reverse logistics, cross-border commerce, warehousing, and cutting-edge technology services. Since 2011, we’ve fulfilled over 550 million transactions and empowered 10,000+ businesses, from startups to large enterprises. Vision: To become the operating system for commerce in India by combining world-class infrastructure, robust logistics operations, and technology excellence. About the Role: Senior Data Engineer We're looking for a Senior Data Engineer who can design, optimize, and own our high-throughput data infrastructure. You’ll work across batch and real-time pipelines, scale distributed processing on petabyte-scale data, and bring AI-assisted tooling into your workflow for debugging, testing, and documentation. This is a hands-on role where you'll work with a wide range of big data technologies (Spark, Kafka, Hive, Hudi/Iceberg, Databricks, EMR), data modeling best practices, and real-time systems to power analytics, data products, and machine learning. As a senior engineer, you'll review complex pipelines, manage SLAs, and mentor junior team members — while leveraging GenAI tools to scale your impact. What You’ll Do Build and optimize scalable batch and streaming data pipelines using Apache Spark, Kafka, Flink, Hive, and Airflow. Design and implement efficient data lake architectures with Hudi, Iceberg, or Delta for versioning, compaction, schema evolution, and time travel. Architect and maintain cloud-native data systems (AWS EMR, S3, Glue, Lambda, Athena), focusing on cost, performance, and availability. Model complex analytical and operational data workflows for warehouse and data lake environments. 
Own pipeline observability — define and monitor SLAs, alerts, and lineage across batch and real-time systems. Debug performance bottlenecks across Spark, Hive, Kafka, and S3 — optimizing jobs with broadcast joins, file formats, resource configs, and partitioning strategies. Leverage AI tools (e.g., Cursor AI, Copilot, Gemini, Windsurf) for: code generation and refactoring of DAGs or Spark jobs; debugging logs, stack traces, and SQL errors; generating tests for data pipelines; documenting complex pipeline dependencies and architecture. Collaborate with product, analytics, data science, and platform teams to deliver end-to-end data products. Mentor junior engineers and establish AI-native development workflows, including prompt libraries and automation best practices. What We’re Looking For: Experience in building and maintaining large-scale data systems. Strong hands-on experience with Apache Spark, Kafka, Hive, and Airflow in production. Deep knowledge of the Hadoop ecosystem (HDFS, YARN, MapReduce tuning, NameNode HA). Expert in SQL (windowing, recursive queries, tuning) and experience with NoSQL stores (e.g., DynamoDB, HBase). Experience with Trino/Presto. Experience with cloud-native data platforms — especially AWS Glue, S3 lifecycle policies, EMR, and Athena. Working knowledge of file formats and internals like Parquet, Avro, and best practices for efficient storage. Familiarity with modern Lakehouse formats (Hudi, Iceberg, Delta Lake) and their compaction, versioning, and schema evolution. Hands-on experience managing Databricks or EMR. Solid grounding in data modeling, DWH design, and slowly changing dimensions (SCD). Strong programming in Python/Scala/Java, and ability to write clean, modular, testable code. Proficiency with CI/CD practices, Git, Jenkins/GitHub Actions for data engineering workflows. Bonus: Experience with distributed systems, consensus protocols, and real-time data guarantees. 
Passion for AI-native engineering — using and evolving prompt-based workflows for greater efficiency and quality.
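One optimization named above, the broadcast join, can be illustrated conceptually in plain Python: the small dimension table is held in memory (as Spark would ship it to every executor), so the large fact stream is enriched without a shuffle. All data and names here are illustrative:

```python
# Small dimension table: cheap to broadcast to every worker.
dim_city = {1: "Delhi", 2: "Mumbai", 3: "Bengaluru"}

# Large fact stream: in practice a partitioned, distributed dataset.
fact_shipments = [
    {"shipment_id": "S1", "city_id": 2, "weight_kg": 1.2},
    {"shipment_id": "S2", "city_id": 1, "weight_kg": 0.4},
    {"shipment_id": "S3", "city_id": 9, "weight_kg": 2.0},  # unknown city
]

# Map-side join: each fact row does an O(1) dict lookup instead of
# shuffling both sides by join key across the cluster.
enriched = [
    {**row, "city": dim_city.get(row["city_id"], "UNKNOWN")}
    for row in fact_shipments
]
cities = [r["city"] for r in enriched]
print(cities)  # ['Mumbai', 'Delhi', 'UNKNOWN']
```

In Spark the same idea is `broadcast(dim_df)` in a join; it wins whenever the dimension side fits comfortably in executor memory.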

Posted 3 weeks ago

Apply

1.0 years

3 - 5 Lacs

bengaluru

On-site

DESCRIPTION Amazon.com strives to be Earth's most customer-centric company where people can find and discover virtually anything they want to buy online. Amazon India is launching a new service, Strategic Brand Services, aimed at offering dedicated support to top-tiered brands to grow with Amazon. Under this service, Brand Specialists will work on identifying and improving key customer inputs for growth such as content, marketing and stock availability, among others. Apart from this, Brand Specialists will also help brands leverage Amazon’s tools and programs to improve on their business inputs. We are seeking creative, goal-oriented and highly entrepreneurial people to join our exciting and fast-paced team. About the Role: As a Brand Specialist, you will focus on delivering 5 core focus areas for the brand: selection, demand generation, catalogue quality, business advice and availability. The person who joins the leadership team in this position must share our passion and commitment for serving our customers. The ideal candidate should have experience in forging and building brand relationships. Some understanding of planning product cycles and selling online is preferred. The right candidate will be flexible, action- and results-oriented, and self-starting, with strong analytical skills. He or she must have a proven track record of taking ownership, driving results and moving with speed to implement ideas in a fast-paced environment, and should be entrepreneurial, with the confidence to make independent, data-driven decisions. The candidate must demonstrate the ability to succeed at planning, forecasting, and driving an online business, and must be an effective communicator in working with some of Amazon’s most important partners and vendors, as well as with internal colleagues and groups. Responsibilities This person will have responsibility for: Building selection: Identify selection gaps. 
Track brand’s offline catalogue to ensure all relevant selection is present on Amazon. Demand generation: Responsible for demand generation. This includes working with other members on the category management team to create a marketing calendar based on vendor's objectives Business Advice: Support participation of brand in Amazon programs Availability: Ensuring continuous availability of products Catalogue Quality on Amazon: Ensuring the best input from brand is updated for customer interface on Amazon Detail Pages through perfect Images, Product descriptions, etc. BASIC QUALIFICATIONS 1+ years of account management, project or program management or buying experience Bachelor's degree Experience using analytical specific tools such as Google Analytics, SQL or HTML PREFERRED QUALIFICATIONS Experience in process improvement Experience managing large amounts of data Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.

Posted 3 weeks ago

Apply

30.0 years

3 - 6 Lacs

No locations specified

On-site

ABOUT TEMENOS Temenos powers a world of banking that creates opportunities for billions of people and businesses everywhere. We have been doing this for over 30 years through the pioneering spirit of our Temenosians, who are passionate about making banking better, together. We serve over 3000 clients, from the largest banks to challengers and community banks, in 150+ countries. We collaborate with clients to build new banking services and state-of-the-art customer experiences on our open banking platform, helping them operate more sustainably. At Temenos, we have an open-minded and inclusive culture, where everyone has the power to create their own destiny and make a positive contribution to the world of banking and society. THE ROLE The Product Analysis and Customer Support (PACS) team provides expert-level support to clients using Temenos products, including those in the implementation phase as well as live customers. PACS manages all client support requests, ensuring a seamless product experience and client satisfaction. An MSSQL Database Administrator (DBA) is required to support our production SQL Server DB and core instances in a support model. The SQL Server DBA will be responsible for performance tuning, administering, monitoring, troubleshooting performance issues, and recovering SQL Server databases from any critical state, and will also be responsible for reviewing applications/databases, identifying performance bottlenecks and fixing them. Work shift applicability: Work shifts on rotation apply (work from office – 3 days from office and 2 days from home) OPPORTUNITIES Production support on any critical DB issues, DB backup, restore and disaster recovery Evaluate and analyze custom SQL Server programs/T-SQL for performance issues Create SQL objects like tables, views, indexes, stored procedures, triggers, rules, defaults and user-defined functions. 
This role requires strong expertise in advanced query concepts, including dynamic SQL scripting, recursive CTEs and partition management Apply advanced SQL Server concepts (performance tuning, large subqueries, JOINs, PerfMon tooling) Analyze, tune & test the identified top 30 programs, integrations and reports based on current CPU, I/O & throughput utilization, and implement SQL Server best practices for improved performance Proficiency in tuning T-SQL queries using SQL Server Profiler, execution plans, Performance Monitor, DBCC commands, Database Engine Tuning Advisor and index-fragmentation techniques to improve database performance and ensure high availability Document improvements in SQL-level metrics (CPU, I/O requests, throughput, files and filegroups) to a desirable level Compare trace files on the required parameters before and after optimization to demonstrate the improvement in statistics Analyze trace files to identify problem queries, I/O, waits, correlated subqueries and parameter validation issues Tune queries, using materialized or temp tables if required, and rewrite logic for better performance where needed Replace correlated subqueries, reducing loop overhead for DML statements and queries Write computation-intensive programs in T-SQL & monitor SQL performance and I/O SKILLS BE/BTech candidate with 7 to 9 years of experience as a SQL Server DBA Experience in database backup & restore activities and HA configuration Experience working in a 24/7 critical environment Experience using Microsoft SQL Server tools like SSMS and BI tools Knowledge of T-SQL development & Azure SQL development Excellent written and verbal communication skills Nice to have: Microsoft SQL Server certification & skills in Linux/Unix or any other OS VALUES Care We care and listen to each other, our clients, partners and the communities we serve Commit We commit with determination and persistence to make things happen Collaborate We collaborate within Temenos and across a wider partner 
ecosystem Challenge We challenge the status quo, try to look at things differently and drive change SOME OF OUR BENEFITS include: Maternity leave: Transition back with 3 days per week in the first month and 4 days per week in the second month Civil Partnership: 1 week of paid leave if you're getting married. This covers marriages and civil partnerships, including same-sex/civil partnerships Family care: 4 weeks of paid family care leave Recharge days: 4 days per year to use when you need to physically or mentally recharge Study leave: 2 weeks of paid leave each year for study or personal development
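The recursive-CTE expertise mentioned above can be sketched with a small hierarchy walk. This is a runnable illustration using SQLite via Python (SQL Server's T-SQL `WITH ... AS` recursion is analogous), with invented data:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Hypothetical employee -> manager hierarchy.
conn.executescript("""
    CREATE TABLE emp (id INTEGER PRIMARY KEY, name TEXT, manager_id INTEGER);
    INSERT INTO emp VALUES (1, 'CEO', NULL), (2, 'VP', 1), (3, 'Engineer', 2);
""")

# Recursive CTE: anchor member selects the root (no manager), the
# recursive member repeatedly joins children onto the rows found so far.
rows = conn.execute("""
    WITH RECURSIVE chain(id, name, depth) AS (
        SELECT id, name, 0 FROM emp WHERE manager_id IS NULL
        UNION ALL
        SELECT e.id, e.name, c.depth + 1
        FROM emp e JOIN chain c ON e.manager_id = c.id
    )
    SELECT name, depth FROM chain ORDER BY depth
""").fetchall()
print(rows)  # [('CEO', 0), ('VP', 1), ('Engineer', 2)]
```

The same anchor/recursive-member structure handles bill-of-materials expansion, date-series generation, and org charts; on SQL Server, `OPTION (MAXRECURSION n)` bounds the recursion depth.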

Posted 3 weeks ago

Apply

2.0 years

2 - 3 Lacs

shiliguri

On-site

This is a full-time Accountant position for someone with knowledge of Income Tax and GST compliance. Extensive experience is not essential; the main qualities we are looking for are energy, quick learning, a clear grasp of accountancy, an intention to work with us long term, and full concentration on the job. We provide medical insurance, medical leave, discounts on products for self-use, and a good working atmosphere for our employees. Moreover, if the new joiner joins us by mid-September, we may offer one month's extra pay, equivalent to his/her previous job's salary, as a joining bonus. Job Types: Full-time, Permanent Pay: ₹18,000.00 - ₹25,000.00 per month Benefits: Health insurance Paid sick time Ability to commute/relocate: Shiliguri, West Bengal: Reliably commute or planning to relocate before starting work (Required) Education: Bachelor's (Required) Experience: total work: 2 years (Preferred) Accounting: 1 year (Preferred)

Posted 3 weeks ago

Apply

30.0 years

0 Lacs

chennai, tamil nadu, india

On-site

About Temenos Temenos powers a world of banking that creates opportunities for billions of people and businesses everywhere. We have been doing this for over 30 years through the pioneering spirit of our Temenosians, who are passionate about making banking better, together. We serve over 3000 clients, from the largest banks to challengers and community banks, in 150+ countries. We collaborate with clients to build new banking services and state-of-the-art customer experiences on our open banking platform, helping them operate more sustainably. At Temenos, we have an open-minded and inclusive culture, where everyone has the power to create their own destiny and make a positive contribution to the world of banking and society. THE ROLE The Product Analysis and Customer Support (PACS) team provides expert-level support to clients using Temenos products, including those in the implementation phase as well as live customers. PACS manages all client support requests, ensuring a seamless product experience and client satisfaction. An MSSQL Database Administrator (DBA) is required to support our production SQL Server DB and core instances in a support model. The SQL Server DBA will be responsible for performance tuning, administering, monitoring, troubleshooting performance issues, and recovering SQL Server databases from any critical state, and will also be responsible for reviewing applications/databases, identifying performance bottlenecks and fixing them. Work shift applicability: Work shifts on rotation apply (work from office – 3 days from office and 2 days from home) OPPORTUNITIES Production support on any critical DB issues, DB backup, restore and disaster recovery Evaluate and analyze custom SQL Server programs/T-SQL for performance issues Create SQL objects like tables, views, indexes, stored procedures, triggers, rules, defaults and user-defined functions. 
This role requires strong expertise in advanced query concepts, including dynamic SQL scripting, recursive CTEs and partition management Apply advanced SQL Server concepts (performance tuning, large subqueries, JOINs, PerfMon tooling) Analyze, tune & test the identified top 30 programs, integrations and reports based on current CPU, I/O & throughput utilization, and implement SQL Server best practices for improved performance Proficiency in tuning T-SQL queries using SQL Server Profiler, execution plans, Performance Monitor, DBCC commands, Database Engine Tuning Advisor and index-fragmentation techniques to improve database performance and ensure high availability Document improvements in SQL-level metrics (CPU, I/O requests, throughput, files and filegroups) to a desirable level Compare trace files on the required parameters before and after optimization to demonstrate the improvement in statistics Analyze trace files to identify problem queries, I/O, waits, correlated subqueries and parameter validation issues Tune queries, using materialized or temp tables if required, and rewrite logic for better performance where needed Replace correlated subqueries, reducing loop overhead for DML statements and queries Write computation-intensive programs in T-SQL & monitor SQL performance and I/O Skills BE/BTech candidate with 7 to 9 years of experience as a SQL Server DBA Experience in database backup & restore activities and HA configuration Experience working in a 24/7 critical environment Experience using Microsoft SQL Server tools like SSMS and BI tools Knowledge of T-SQL development & Azure SQL development Excellent written and verbal communication skills Nice to have: Microsoft SQL Server certification & skills in Linux/Unix or any other OS VALUES Care We care and listen to each other, our clients, partners and the communities we serve Commit We commit with determination and persistence to make things happen Collaborate We collaborate within Temenos and across a wider partner 
ecosystem Challenge We challenge the status quo, try to look at things differently and drive change SOME OF OUR BENEFITS include: Maternity leave: Transition back with 3 days per week in the first month and 4 days per week in the second month Civil Partnership: 1 week of paid leave if you're getting married. This covers marriages and civil partnerships, including same-sex/civil partnerships Family care: 4 weeks of paid family care leave Recharge days: 4 days per year to use when you need to physically or mentally recharge Study leave: 2 weeks of paid leave each year for study or personal development Please make sure to read our Recruitment Privacy Policy

Posted 3 weeks ago

Apply

7.0 years

0 Lacs

bhubaneshwar, odisha, india

On-site

Job Description
FRESHERS kindly do not apply. CVs showing less than 7 years of experience will not be considered. This job is for Bhubaneswar (on-site).

We are seeking an experienced Senior Java Developer with a solid background in designing, developing, and maintaining high-quality, scalable, and secure applications. The ideal candidate should have a minimum of 7 years of hands-on coding experience, a strong understanding of Java technologies, and experience leading development teams. This role requires expertise in Java, Spring Boot, and related frameworks, alongside a strong capability to mentor and guide junior developers while ensuring the successful delivery of complex backend services and microservices.

Key Responsibilities
Design, develop, and maintain high-quality Java-based applications and services.
Write clean, efficient, and scalable code adhering to best practices.
Implement and manage Spring Boot, Hibernate, and Spring Security to develop microservices.
Optimize and maintain existing applications for performance, scalability, and reliability.
Lead and mentor a team of developers, ensuring high standards of code quality and adherence to best practices.
Collaborate with cross-functional teams to design and implement REST APIs and integrate them with front-end applications.
Manage team activities, code reviews, and task prioritization, ensuring timely delivery of projects.
Foster a collaborative and productive team environment while addressing challenges and resolving technical issues.
Utilize the Stream API and Tomcat Server for deployment and server management.
Apply MVC architecture, design patterns, and SOLID principles in software design and development.
Ensure knowledge sharing and continuous improvement within the team.
Work with PL/SQL to perform database operations such as joins, triggers, cursors, and ACID transactions.

Requirements
Java 8+ with a strong focus on Spring Boot, Hibernate, and Spring Security.
Strong experience in microservices architecture.
Expertise in the Stream API and Tomcat Server.
Strong understanding of MVC architecture, design patterns, and SOLID principles.
Solid experience with PL/SQL: joins, triggers, cursors, and ACID compliance.
Proven experience in leading and mentoring development teams.

Qualifications
B.Tech (in Computer Science, Information Technology, or a related field) or MCA.
Location (ref:hirist.tech)

Posted 3 weeks ago

Apply

2.0 years

12 - 20 Lacs

india

Remote

Freelance Recruiter (Mid–Senior Level): UK Market Specialist
Location: Remote
Engagement Type: Freelance

About The Role
We are seeking an experienced Freelance Recruiter (Mid–Senior) with a focus on the UK market to support our clients in sourcing, screening, and delivering top-quality talent across various industries. We welcome recruiters with targeted specialisms (for example Technology, Finance, Sales & Marketing, or Professional Services); please highlight your specialism when applying. This role is ideal for independent recruiters who want flexibility while working on live roles from multiple companies, and who have experience hiring for mid to senior-level positions.

Key Responsibilities
Source candidates through job boards, LinkedIn, databases, and networking, with emphasis on mid to senior-level placements.
Screen, evaluate, and shortlist candidates based on job requirements and role seniority.
Coordinate with clients and hiring managers to understand role specifications, compensation expectations, and market availability.
Submit high-quality profiles via our recruitment platform.
Maintain regular communication with candidates throughout the hiring process and provide timely feedback.
Ensure compliance with data protection and confidentiality standards.

Requirements
Minimum 2+ years of experience in recruitment (agency or in-house), with proven success placing mid to senior-level candidates.
Strong knowledge of the UK hiring market and compliance requirements.
Excellent sourcing skills across platforms such as LinkedIn, CV databases, and referrals.
Strong communication and relationship-building skills; able to manage client and candidate expectations.
Self-motivated, target-oriented, and able to work independently.
Access to necessary recruitment tools (job boards, LinkedIn, etc.) is an advantage.
Desirable: a clear target specialism (e.g., Technology, Finance, Sales & Marketing, Professional Services); please indicate this in your application.

What We Offer
Access to live client roles through our platform.
Competitive commission-based payouts per successful hire; recruiters are paid on a success-fee basis once a candidate joins the client.
Flexible, remote working with no fixed hours.
Support from our team for queries and platform usage.
Transparent payout cycle and dedicated partner support.

Skills: hiring, recruitment, linkedin, recruiter, sourcing, freelancer, communication

Posted 3 weeks ago

Apply

2.0 years

0 Lacs

pune, maharashtra, india

Remote

At MARMA.AI, we’re building the next generation of analysts. Our AI-powered platform transforms the way people learn analytics by providing real-world business challenges, instant feedback, personalized learning paths, and dynamic industry datasets. Our mission is to bridge the gap between theory and execution and help aspiring professionals thrive in the AI era.

💡 About the Role
We are looking for a Data Analyst with hands-on experience in SQL and Python. In this role, you’ll work with real-world datasets, uncover actionable insights, and help design analytical challenges that power our interactive learning platform.

🌍 Who Should Apply
Recent graduates or early-career professionals with 0–2 years of experience
Hands-on skills in SQL (data extraction, joins, aggregations) and Python (data wrangling, analysis, visualization)
Prior experience in e-commerce data analytics is a plus, but not required
Strong curiosity, problem-solving skills, and a passion for data-driven decision making

📌 What You’ll Do
Use SQL and Python to extract, clean, and analyze large datasets
Generate insights that guide both internal product decisions and external learning content
Contribute to the creation of industry-oriented case studies and datasets for learners
Collaborate with cross-functional teams to design data-driven solutions
Continuously explore new tools and methods in analytics and AI

✨ Why Join Us?
Work on real-world business problems across industries
Be part of an AI-powered learning platform shaping the future of data analytics
Grow in a collaborative, innovation-driven environment
Make an impact by helping build a global community of data-literate professionals

🔎 Location: Remote
📅 Experience: 0–2 years
💼 Skills: SQL, Python, Data Analysis
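As a minimal sketch of the SQL-style joins and aggregations this kind of role involves, done in Python with pandas; the e-commerce tables and column names here are hypothetical illustrations, not MARMA.AI data:

```python
import pandas as pd

# Hypothetical e-commerce sample data (illustrative only).
orders = pd.DataFrame({
    "order_id": [1, 2, 3, 4],
    "user_id": [10, 10, 20, 30],
    "amount": [120.0, 80.0, 200.0, 40.0],
})
users = pd.DataFrame({
    "user_id": [10, 20, 30],
    "city": ["Pune", "Mumbai", "Pune"],
})

# Join (the SQL LEFT JOIN equivalent), then aggregate:
# order count and average order value per city.
merged = orders.merge(users, on="user_id", how="left")
summary = (merged.groupby("city")["amount"]
                 .agg(["count", "mean"])
                 .reset_index())
print(summary)
```

The same analysis in SQL would be a `JOIN` followed by `GROUP BY city` with `COUNT(*)` and `AVG(amount)`; being able to express it both ways is the core of the SQL + Python skill pairing the posting asks for.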

Posted 3 weeks ago

Apply