0.0 - 4.0 years
0 - 0 Lacs
Gurugram, Haryana
On-site
Location : Gurugram Sector 43 Employment Type : Full-Time Experience Level: 2 years+ Job Summary: We are looking for a highly skilled Motion Graphic Designer with strong expertise in Adobe After Effects and Premiere Pro to join our creative team. In this role, you will be responsible for conceptualizing, designing, and producing high-quality motion content for branding, marketing campaigns, and social platforms. If you’re passionate about visual storytelling and have a flair for design and animation, we’d love to meet you. Key Responsibilities : Create visually compelling motion graphics for social media, ads, websites, presentations, and internal communications. Design custom animations using After Effects, including kinetic typography, icon animation, explainer elements, and transitions. Edit and assemble raw footage using Premiere Pro, applying color correction, sound design, and visual enhancements. Collaborate closely with designers, video editors, marketers, and copywriters to bring stories to life through motion. Develop animation assets from scratch or enhance static designs with animation and transitions. Stay updated on design trends, motion techniques, and new tools/plugins for more efficient and modern workflows. Manage multiple projects, meet tight deadlines, and maintain a high standard of quality and creativity. Required Skills & Qualifications: Bachelor’s degree in Motion Design, Animation, Graphic Design, or related field. 2–4 years of proven experience in motion design, preferably in an agency or digital content environment. Expert-level proficiency in Adobe After Effects and Adobe Premiere Pro. Strong understanding of animation principles, video editing, typography, and visual hierarchy. Experience in compositing, masking, motion tracking, rotoscoping, and applying visual effects. Familiarity with sound design, color grading, and working with audio in video projects. Ability to integrate After Effects with Premiere Pro for efficient dynamic workflows. Portfolio showcasing a strong body of work in both motion graphics and video editing. Preferred (Bonus) Skills: Experience with 3D tools like Cinema 4D or Blender. Basic scripting or expressions in After Effects. Job Type: Full-time Pay: ₹10,880.62 - ₹50,115.34 per month Benefits: Internet reimbursement Leave encashment Paid sick time Paid time off Schedule: Day shift Monday to Friday Weekend availability Ability to commute/relocate: Gurugram, Haryana: Reliably commute or planning to relocate before starting work (Preferred) Application Question(s): Current CTC Expected CTC Work Location: In person
Posted 4 weeks ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About Us Zelis is modernizing the healthcare financial experience in the United States (U.S.) by providing a connected platform that bridges the gaps and aligns interests across payers, providers, and healthcare consumers. This platform serves more than 750 payers, including the top 5 health plans, BCBS insurers, regional health plans, TPAs and self-insured employers, and millions of healthcare providers and consumers in the U.S. Zelis sees across the system to identify, optimize, and solve problems holistically with technology built by healthcare experts—driving real, measurable results for clients. Why We Do What We Do In the U.S., consumers, payers, and providers face significant challenges throughout the healthcare financial journey. Zelis helps streamline the process by offering solutions that improve transparency, efficiency, and communication among all parties involved. By addressing the obstacles that patients face in accessing care, navigating the intricacies of insurance claims, and the logistical challenges healthcare providers encounter with processing payments, Zelis aims to create a more seamless and effective healthcare financial system. Zelis India plays a crucial role in this mission by supporting various initiatives that enhance the healthcare financial experience. The local team contributes to the development and implementation of innovative solutions, ensuring that technology and processes are optimized for efficiency and effectiveness. Beyond operational expertise, Zelis India cultivates a collaborative work culture, leadership development, and global exposure, creating a dynamic environment for professional growth. With hybrid work flexibility, comprehensive healthcare benefits, financial wellness programs, and cultural celebrations, we foster a holistic workplace experience. Additionally, the team plays a vital role in maintaining high standards of service delivery and contributes to Zelis’ award-winning culture. Position Overview This role will lead and mentor a team of engineers, modelers, and analysts, providing technical guidance while overseeing the optimization of data assets. With a focus on hands-on technical experience and leadership skills, you will drive impactful outcomes and collaborate with global stakeholders to align data initiatives with business objectives, fostering innovation in healthcare data solutions. About Zelis Zelis is a leading payments company in healthcare, guiding, pricing, explaining, and paying for care on behalf of insurers and their members. We align the interests of payers, providers, and consumers to deliver a better financial experience and more affordable, transparent care for all. Partnering with 700+ payers, supporting 4 million+ providers and 100 million members across the healthcare industry. About ZDI Zelis Data Intelligence (ZDI) is a centralized data team that partners across Zelis business units to unlock the value of data through intelligence and AI solutions. Our mission is to transform data into a strategic and competitive asset by fostering collaboration and innovation. Enable the democratization and productization of data assets to drive insights and decision-making. Develop new data and product capabilities through advanced analytics and AI-driven solutions. Collaborate closely with business units and enterprise functions to maximize the impact of data. Leverage intelligence solutions to unlock efficiency, transparency, and value across the organization. 
Job Title - Sr Lead / Lead / Associate Manager Location - Hyderabad, India Department - Data Intelligence & Innovation, ZDI Reports To - Tower Lead, ZDI Job Summary This role will lead and mentor a team of engineers, modelers, and analysts, providing technical guidance while overseeing the optimization of data assets. With a focus on hands-on technical experience and leadership skills, you will drive impactful outcomes and collaborate with global stakeholders to align data initiatives with business objectives, fostering innovation in healthcare data solutions. Key Responsibilities Team Leadership & Management (80%) Build and mentor a high-performing, diverse team aligned with organizational needs, fostering a collaborative and inclusive environment. Participate in hiring processes, performance reviews, and professional development initiatives, including upskilling and training opportunities. Collaborate with matrix leads and managers in the United States to align goals, architecture, standards, and technical requirements. Set performance goals, KPIs, and clear expectations for team members, conducting regular reviews and providing actionable feedback. Plan and monitor quarterly deliverables, removing dependencies and blockers to ensure successful project execution. Promote a culture of continuous improvement, collaboration, and technical excellence. Technical Contribution & Oversight (20%) Provide technical leadership in the design and implementation of end-to-end data solutions, including storage, integration, processing, and visualization. Collaborate with data science and business intelligence leaders to ensure scalable, compliant, and secure data products. Optimize deployment workflows, oversee CI/CD processes, and implement data governance and security best practices. Utilize tools like Azure, Snowflake, DBT, and Python to design scalable data pipelines and architectures. Drive process improvements in data quality, masking, automation, and the creation of reusable data products. Conduct code reviews, enforce engineering best practices, and stay informed about emerging technologies to drive innovation. Additional Responsibilities Partner with product teams to align data strategies with business needs, integrating data assets into innovative solutions. Act as a liaison between technical teams and healthcare subject matter experts to translate business needs into technical solutions. Monitor and report on key performance indicators, driving measurable outcomes and continuous improvement. Advocate for adopting innovative tools and methodologies to enhance the organization’s data capabilities. Qualifications Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field. Minimum of 8 years of hands-on technical experience in data engineering, including expertise in Azure, Snowflake, DBT, and Python. At least 2 years of people management experience managing teams in data engineering or intelligence initiatives. Strong understanding of healthcare data requirements, regulatory frameworks, and operational challenges. Proficiency in Agile methodologies and project management tools such as Jira and Confluence. Excellent problem-solving and communication skills, with a focus on conflict resolution and team building. Preferred Qualifications Certifications in cloud platforms (Azure, AWS, Snowflake) or BI tools (Power BI, Tableau). Experience with advanced analytics, machine learning models, or AI applications in healthcare. 
Familiarity with additional cloud platforms such as AWS and GCP.
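As a loose illustration of the DBT-plus-Python tooling this role references (not part of the posting), the sketch below wraps the dbt CLI in a small Python runner of the kind a scheduler step might call; the model selector and project are placeholders.

```python
# Hypothetical sketch: sequence dbt transformations from a Python wrapper.
import subprocess
import sys

def run_dbt(*args: str) -> None:
    """Run a dbt CLI command and exit with its code if it fails."""
    cmd = ["dbt", *args]
    print("running:", " ".join(cmd))
    result = subprocess.run(cmd, check=False)
    if result.returncode != 0:
        sys.exit(result.returncode)

if __name__ == "__main__":
    run_dbt("deps")                          # install package dependencies
    run_dbt("run", "--select", "staging+")   # build staging models and downstream children
    run_dbt("test", "--select", "staging+")  # run data-quality tests on the same selection
```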
Posted 4 weeks ago
0 years
0 Lacs
Guwahati, Assam, India
Remote
Company Description "AGITUFF" is a registered trademark of Abhishek Glass Industries Limited, which manufactures a wide range of glass products including Toughened Glass, Insulating Glass, Laminated Glass, Switchable Glass, Bend Glass, and Decorative Glass. The company also produces Floor Springs, Patch Fittings, Automatic Sensor Doors, Masking Tape, and Aluminium Composite Panels (ACP). Role Description This is a full-time hybrid role for a Sales Executive based in Guwahati, with the flexibility to work from home part of the time. The Sales Executive will be responsible for identifying and pursuing new sales opportunities, managing customer relationships, developing sales strategies, and closing deals. The role will also include preparing sales reports, conducting market research, and collaborating with other departments to ensure customer satisfaction. Qualifications Proven experience in sales, business development, or related field Strong communication, negotiation, and interpersonal skills Ability to develop and execute effective sales strategies Proficiency with CRM software and MS Office Suite Ability to work independently and in a team environment Strong organizational and time management skills Bachelor's degree in Business, Marketing, or related field Experience in the glass or construction industry is a plus
Posted 4 weeks ago
4.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Role Overview
We are seeking an experienced and highly skilled PostgreSQL Database Administrator (DBA) to manage, maintain, and optimize our PostgreSQL database systems. The ideal candidate will be responsible for ensuring database availability, security, performance, and scalability. You will work closely with application developers, system engineers, and DevOps teams to provide high-quality data solutions and troubleshoot complex issues in a mission-critical environment.
Responsibilities:
Install, configure, and upgrade PostgreSQL databases in high-availability environments
Design and implement database architecture, including replication, partitioning, and sharding
Perform daily database administration tasks including backups, restores, monitoring, and tuning
Optimize queries, indexes, and overall performance of PostgreSQL systems
Ensure high availability and disaster recovery by configuring replication (streaming, logical) and backup solutions (pgBackRest, Barman, WAL archiving)
Implement and maintain security policies, user access control, and encryption
Monitor database health using tools such as pgAdmin, Nagios, Zabbix, or similar
Troubleshoot database-related issues in development, test, and production environments
Automate routine tasks using shell scripting, Python, or Ansible
Work with DevOps/SRE teams to integrate PostgreSQL into CI/CD pipelines and cloud environments
Technical Skills:
PostgreSQL Expertise: Proven experience with PostgreSQL 11+ (latest version experience preferred); strong knowledge of SQL, PL/pgSQL, database objects, and data types; experience with PostgreSQL replication (streaming, logical, and hot standby); deep understanding of VACUUM, ANALYZE, and autovacuum configuration and tuning; knowledge of PostGIS, pgBouncer, and pg_stat_statements is a plus
Performance Tuning & Monitoring: Query optimization and slow-query analysis using EXPLAIN and ANALYZE; experience with database performance monitoring tools (e.g., pg_stat_activity, pgBadger); strong debugging and troubleshooting of locking, deadlocks, and resource contention
Cloud & DevOps Integration: PostgreSQL experience on AWS RDS, Azure Database for PostgreSQL, or GCP Cloud SQL; familiarity with IaC tools like Terraform or CloudFormation is a plus; experience with CI/CD integration and containerization tools (Docker, Kubernetes) for databases
Security & Compliance: Implement role-based access control, data masking, and audit logging; ensure compliance with standards like GDPR, ISO 27001, or SOC 2 where applicable
Qualifications:
Bachelor's or Master's degree in Computer Science, Information Technology, or a related field
Minimum 4+ years of experience in PostgreSQL database administration
PostgreSQL certification (e.g., EDB Certified Associate/Professional) is a plus
Experience in 24x7 production environments supporting high-volume systems
Preferred Experience:
Exposure to multi-tenant architectures
Experience migrating from Oracle/MySQL to PostgreSQL
Knowledge of NoSQL systems (MongoDB, Redis) is a plus
Understanding of data warehousing and ETL processes (ref:hirist.tech)
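As a small, purely illustrative sketch of the slow-query analysis mentioned above (not part of the posting), the snippet below captures an EXPLAIN (ANALYZE, BUFFERS) plan through the psycopg2 driver; the DSN, table, and query are placeholders.

```python
# Hypothetical sketch: capture an execution plan for a slow query.
import psycopg2

DSN = "host=localhost dbname=appdb user=dba password=secret"   # placeholder connection string
SLOW_QUERY = "SELECT * FROM orders WHERE customer_id = %s AND status = 'OPEN'"

def explain(query: str, params: tuple) -> str:
    conn = psycopg2.connect(DSN)
    try:
        with conn.cursor() as cur:
            # EXPLAIN ANALYZE actually executes the statement to get real timings.
            cur.execute(f"EXPLAIN (ANALYZE, BUFFERS) {query}", params)
            plan = "\n".join(row[0] for row in cur.fetchall())
        conn.rollback()  # discard the implicit transaction; nothing should be committed
        return plan
    finally:
        conn.close()

if __name__ == "__main__":
    print(explain(SLOW_QUERY, (42,)))
```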
Posted 1 month ago
6.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
JOB_POSTING-3-72171-1
Job Description
Role Title: AVP, Enterprise Logging & Observability (L11)
Company Overview
Synchrony (NYSE: SYF) is a premier consumer financial services company delivering one of the industry’s most complete digitally enabled product suites. Our experience, expertise and scale encompass a broad spectrum of industries including digital, health and wellness, retail, telecommunications, home, auto, outdoors, pet and more. We have recently been ranked #2 among India’s Best Companies to Work for by Great Place to Work. We were among the Top 50 India’s Best Workplaces in Building a Culture of Innovation for All by GPTW and Top 25 among Best Workplaces in BFSI by GPTW. We have also been recognized by AmbitionBox Employee Choice Awards among the Top 20 Mid-Sized Companies, ranked #3 among Top Rated Companies for Women, and Top-Rated Financial Services Companies. Synchrony celebrates ~51% women diversity, 105+ people with disabilities, and ~50 veterans and veteran family members. We offer Flexibility and Choice for all employees and provide best-in-class employee benefits and programs that cater to work-life integration and overall well-being. We provide career advancement and upskilling opportunities, focusing on Advancing Diverse Talent to take up leadership roles.
Organizational Overview
Splunk is Synchrony's enterprise logging solution. Splunk searches and indexes log files and helps derive insights from the data. The primary goal is to ingest massive datasets from disparate sources and employ advanced analytics to automate operations and improve data analysis. It also offers predictive analytics and unified monitoring for applications, services and infrastructure. Many applications forward data to the Splunk logging solution. The Splunk team, which includes Engineering, Development, Operations, Onboarding, and Monitoring, maintains Splunk and provides solutions to teams across Synchrony.
Role Summary/Purpose
The AVP, Enterprise Logging & Observability is a key leadership role responsible for driving the strategic vision, roadmap, and development of the organization’s centralized logging and observability platform. This role supports multiple enterprise initiatives including applications, security monitoring, compliance reporting, operational insights, and platform health tracking. This role leads platform development using Agile methodology, manages stakeholder priorities, ensures logging standards across applications and infrastructure, and supports security initiatives. This position bridges the gap between technology teams, applications, platforms, cloud, cybersecurity, infrastructure, DevOps, governance, audit, and risk teams and business partners, owning and evolving the logging ecosystem to support real-time insights, compliance monitoring, and operational excellence.
Key Responsibilities
Splunk Development & Platform Management
Lead and coordinate development activities, ingestion pipeline enhancements, onboarding frameworks, and alerting solutions.
Collaborate with engineering, operations, and Splunk admins to ensure scalability, performance, and reliability of the platform.
Establish governance controls for source naming, indexing strategies, retention, access controls, and audit readiness.
Splunk ITSI Implementation & Management
Develop and configure ITSI services, entities, and correlation searches.
Implement notable events aggregation policies and automate response actions.
Fine-tune ITSI performance by optimizing data models, summary indexing, and saved searches. Help identify patterns and anomalies in logs and metrics. Develop ML models for anomaly detection, capacity planning, and predictive analytics. Utilize Splunk MLTK to build and train models for IT operations monitoring.
Security & Compliance Enablement
Partner with InfoSec, Risk, and Compliance to align logging practices with regulations (e.g., PCI-DSS, GDPR, RBI).
Enable visibility for encryption events, access anomalies, secrets management, and audit trails.
Support security control mapping and automation through observability.
Stakeholder Engagement
Act as a strategic advisor and point of contact for business units, application, infrastructure, and security stakeholders and business teams leveraging Splunk.
Conduct stakeholder workshops, backlog grooming, and sprint reviews to ensure alignment.
Maintain clear and timely communications across all levels of the organization.
Process & Governance
Drive logging and observability governance standards, including naming conventions, access controls, and data retention policies.
Lead initiatives for process improvement in log ingestion, normalization, and compliance readiness.
Ensure alignment with enterprise architecture and data classification models.
Lead improvements in logging onboarding lifecycle time, automation pipelines, and self-service ingestion tools.
Mentor junior team members and guide engineering teams on secure, standardized logging practices.
Required Skills/Knowledge
Bachelor's degree with a minimum of 6+ years of experience in Technology, or in lieu of a degree, 8+ years of experience in Technology.
Minimum of 3+ years of experience leading a development team, or an equivalent role, in observability, logging, or security platforms.
Splunk Subject Matter Expert (SME).
Strong hands-on understanding of Splunk architecture, pipelines, dashboards, alerting, data ingestion, search optimization, and enterprise-scale operations.
Experience supporting security use cases, encryption visibility, secrets management, and compliance logging.
Splunk Development & Platform Management, Security & Compliance Enablement, Stakeholder Engagement, and Process & Governance.
Experience with Splunk Premium Apps, at a minimum ITSI and Enterprise Security (ES).
Experience with data streaming platforms and tools such as Cribl and Splunk Edge Processor.
Proven ability to work in Agile environments using tools such as JIRA or JIRA Align.
Strong communication, leadership, and stakeholder management skills.
Familiarity with security, risk, and compliance standards relevant to BFSI.
Proven experience leading product development teams and managing cross-functional initiatives using Agile methods.
Strong knowledge and hands-on experience with Splunk Enterprise/Splunk Cloud.
Design and implement Splunk ITSI solutions for proactive monitoring and service health tracking.
Develop KPIs, Services, Glass Tables, Entities, Deep Dives, and Notable Events to improve service reliability for users across the firm.
Develop scripts (Python, JavaScript, etc.) as needed in support of data collection or integration.
Develop new applications leveraging Splunk’s analytics and Machine Learning tools to maximize performance, availability, and security, improving business insight and operations.
Support senior engineers in analyzing system issues and performing root cause analysis (RCA).
Desired Skills/Knowledge
Deep knowledge of Splunk development, data ingestion, search optimization, alerting, dashboarding, and enterprise-scale operations.
Exposure to SIEM integration, security orchestration, or SOAR platforms.
Knowledge of cloud-native observability (e.g., AWS/GCP/Azure logging).
Experience in BFSI or regulated industries with high-volume data handling.
Familiarity with CI/CD pipelines, DevSecOps integration, and cloud-native logging.
Working knowledge of scripting or automation (e.g., Python, Terraform, Ansible) for observability tooling.
Splunk certifications (Power User, Admin, Architect, or equivalent) will be an advantage.
Awareness of data classification, retention, and masking/anonymization strategies.
Awareness of integration between Splunk and ITSM or incident management tools (e.g., ServiceNow, PagerDuty).
Experience with version control tools such as Git and Bitbucket.
Eligibility Criteria
Bachelor's degree with a minimum of 6+ years of experience in Technology, or in lieu of a degree, 8+ years of experience in Technology.
Minimum of 3+ years of experience leading a development team, or an equivalent role, in observability, logging, or security platforms.
Demonstrated success in managing large-scale logging platforms in regulated environments.
Excellent communication, leadership, and cross-functional collaboration skills.
Experience with scripting languages such as Python, Bash, or PowerShell for automation and integration purposes.
Prior experience in large-scale, security-driven logging or observability platform development.
Excellent problem-solving skills and the ability to work independently or as part of a team.
Strong communication and interpersonal skills to interact effectively with team members and stakeholders.
Knowledge of IT Service Management (ITSM) and monitoring tools.
Knowledge of other data analytics tools or platforms is a plus.
WORK TIMINGS: 01:00 PM to 10:00 PM IST
This role qualifies for Enhanced Flexibility and Choice offered in Synchrony India and will require the incumbent to be available between 06:00 AM Eastern Time and 11:30 AM Eastern Time (timings are anchored to US Eastern hours and will adjust twice a year locally). This window is for meetings with India and US teams. The remaining hours will be flexible for the employee to choose. Exceptions may apply periodically due to business needs. Please discuss this with the hiring manager for more details.
For Internal Applicants
Understand the criteria or mandatory skills required for the role before applying.
Inform your manager and HRM before applying for any role on Workday.
Ensure that your professional profile is updated (fields such as education, prior experience, other skills); it is mandatory to upload your updated resume (Word or PDF format).
Must not be on any corrective action plan (First Formal/Final Formal, PIP).
Only L09+ employees who have completed 18 months in the organization and 12 months in their current role and level are eligible to apply.
Level / Grade: 11
Job Family Group: Information Technology
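The ITSI and alerting work above ultimately runs as Splunk searches. As a rough illustration only (not part of the posting), the sketch below uses the official splunk-sdk for Python to run a one-shot SPL search; the host, credentials, index, and sourcetype are invented placeholders, and older SDK versions expose ResultsReader instead of JSONResultsReader.

```python
# Hypothetical example: run a one-shot Splunk search from Python and print the rows.
import splunklib.client as client
import splunklib.results as results

service = client.connect(
    host="splunk.example.com", port=8089,          # placeholder management endpoint
    username="svc_observability", password="***",  # use a vaulted service account in practice
)

# Count recent 5xx errors by host for a hypothetical index/sourcetype.
spl = ("search index=app_logs sourcetype=payment_api status>=500 earliest=-15m "
       "| stats count BY host, status")

reader = results.JSONResultsReader(service.jobs.oneshot(spl, output_mode="json"))
for item in reader:
    if isinstance(item, dict):   # dicts are result rows; Message objects are diagnostics
        print(item)
```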
Posted 1 month ago
1.0 - 31.0 years
2 - 2 Lacs
Anandpuri, Patna
On-site
Company Name: Chemist Box Pvt. Ltd.
🎨 Job Title: Commercial Painter (Experience: 1–2 years)
💰 Salary: ₹18,000–₹21,000 per month (depending on experience and skill level)
Key Responsibilities
Surface Preparation: Clean, sand, prime, and fill walls, shelves, and fixtures, ensuring smooth and damage-free surfaces
Paint Mixing & Color Matching: Accurately mix paint and match colors per store specifications
Applying Paint: Use brushes, rollers, and spray guns to apply latex, enamel, or varnish coatings, achieving consistent coverage
Masking & Coverage Protection: Safeguard areas not to be painted (e.g., fixtures, glass, signage) using tape and protective coverings
Touch-ups & Finishing: Inspect surfaces, correct imperfections, and ensure a polished, professional appearance
Equipment Maintenance: Clean and maintain painting tools and spray guns; perform routine upkeep
Safety Compliance: Follow safety protocols and use PPE (masks, gloves, protective wear) while handling paints and solvents
Collaboration: Coordinate with store managers, carpenters, and interior teams to ensure on-time completion
Qualifications & Skills
Experience: 1–2 years as a painter (preferably in retail, interior, or spray-paint contexts)
Education: High school diploma or equivalent; vocational training (ITI/Fitter or a painting certification) is a plus
Technical Skills: Proficient with spray guns, brushes, and rollers; strong color-matching ability; familiarity with various paint types, finishes, and application methods
Soft Skills: Keen attention to detail; good manual dexterity and hand–eye coordination; ability to stand for extended periods
Safety Awareness: Knowledge of handling chemicals and equipment safely
Physical Requirements & Work Environment
Capable of standing, bending, and climbing ladders as needed
Comfortable handling paints, solvents, and cleaning supplies
Day-shift, in-store role; occasional night or weekend work may be required when needed
📈 Career Growth
Potential to advance to Senior Painter, Paint Shop Supervisor, or Maintenance Lead roles with demonstrated performance
💡 Additional Notes
Jobs for painters with similar experience typically offer ₹15,000–₹20,000; this role's upper range of ₹21,000 is competitive and attainable
Posted 1 month ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description
The Global Data Insight & Analytics organization is looking for a top-notch Software Engineer who also has Machine Learning knowledge and experience to join our team and drive the next generation of Cloud platform Fullstack Developers. In this role you will work in a small, cross-functional team. The position will collaborate directly and continuously with other engineers, business partners, product managers and designers from distributed locations, and will release early and often. The team you will be working on is focused on building a Cloud platform to democratize Machine Learning. We strongly believe that data has the power to help create great products and experiences which delight our customers. We believe that actionable and persistent insights, based on a high-quality data platform, help business and engineering make more impactful decisions. Our ambitions reach well beyond existing solutions, and we are in search of innovative individuals to join this Agile team. This is an exciting, fast-paced role which requires outstanding technical and organizational skills combined with critical thinking, problem-solving and agile management tools to support team success.
Requirements
5+ years of experience in data engineering or software engineering, with at least 2 years focused on cloud data platforms (GCP preferred).
Technical Skills: Proficient in Java, SpringBoot, and Angular/React, with experience in designing and deploying cloud-based data pipelines and microservices using GCP tools like BigQuery, Dataflow, and Dataproc.
Service-Oriented Architecture and Microservices: Strong understanding of SOA, microservices, and their application within a cloud data platform context.
Full-Stack Development: Knowledge of front-end and back-end technologies, enabling collaboration on data access and visualization layers (e.g., Angular, React, Node.js).
Database Management: Experience with relational (e.g., PostgreSQL, MySQL) and NoSQL databases, as well as columnar databases like BigQuery.
Data Governance and Security: Understanding of data governance frameworks and implementing RBAC, encryption, and data masking in cloud environments.
CI/CD and Automation: Familiarity with CI/CD pipelines, Infrastructure as Code (IaC) tools like Terraform, and automation frameworks.
Problem-Solving: Strong analytical skills with the ability to troubleshoot complex data platform and microservices issues.
Responsibilities
Design and Build Data Pipelines: Architect, develop, and maintain scalable data pipelines and microservices that support real-time and batch processing on GCP.
Service-Oriented Architecture (SOA) and Microservices: Design and implement SOA and microservices-based architectures to ensure modular, flexible, and maintainable data solutions.
Full-Stack Integration: Leverage your full-stack expertise to contribute to the seamless integration of front-end and back-end components, ensuring robust data access and UI-driven data exploration.
Data Ingestion and Integration: Lead the ingestion and integration of data from various sources into the data platform, ensuring data is standardized and optimized for analytics.
GCP Data Solutions: Utilize GCP services (BigQuery, Dataflow, Pub/Sub, Cloud Functions, etc.) to build and manage data platforms that meet business needs.
Data Governance and Security: Implement and manage data governance, access controls, and security best practices while leveraging GCP’s native row- and column-level security features.
Performance Optimization: Continuously monitor and improve the performance, scalability, and efficiency of data pipelines and storage solutions. Collaboration and Best Practices: Work closely with data architects, software engineers, and cross-functional teams to define best practices, design patterns, and frameworks for cloud data engineering. Automation and Reliability: Automate data platform processes to enhance reliability, reduce manual intervention, and improve operational efficiency.
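Purely as an illustration of the kind of GCP tooling this role touches (not taken from the posting), the snippet below runs a BigQuery aggregation with the google-cloud-bigquery client; the project, dataset, and table names are invented placeholders, and authentication is assumed to come from Application Default Credentials.

```python
# Hypothetical example: query a BigQuery table from Python.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # placeholder project id

sql = """
    SELECT event_date, COUNT(*) AS events
    FROM `my-gcp-project.analytics.events`          -- placeholder dataset.table
    WHERE event_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
    GROUP BY event_date
    ORDER BY event_date
"""

for row in client.query(sql).result():   # waits for the job, then iterates result rows
    print(row.event_date, row.events)
```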
Posted 1 month ago
2.0 - 4.0 years
1 - 6 Lacs
Gurgaon
On-site
Location: Gurugram Salary: Up to ₹50,000 per month Working Days: 5 days a week Job Type: Full-time Job Summary: We are looking for a highly skilled Motion Graphic Designer with strong expertise in Adobe After Effects and Premiere Pro to join our creative team. In this role, you will be responsible for conceptualizing, designing, and producing high-quality motion content for branding, marketing campaigns, and social platforms. If you’re passionate about visual storytelling and have a flair for design and animation, we’d love to meet you. Key Responsibilities: Create visually compelling motion graphics for social media, ads, websites, presentations, and internal communications. Design custom animations using After Effects, including kinetic typography, icon animation, explainer elements, and transitions. Edit and assemble raw footage using Premiere Pro, applying color correction, sound design, and visual enhancements. Collaborate closely with designers, video editors, marketers, and copywriters to bring stories to life through motion. Develop animation assets from scratch or enhance static designs with animation and transitions. Stay updated on design trends, motion techniques, and new tools/plugins for more efficient and modern workflows. Manage multiple projects, meet tight deadlines, and maintain a high standard of quality and creativity. Required Skills & Qualifications: Bachelor’s degree in Motion Design, Animation, Graphic Design, or related field. 2–4 years of proven experience in motion design, preferably in an agency or digital content environment. Expert-level proficiency in Adobe After Effects and Adobe Premiere Pro. Strong understanding of animation principles, video editing, typography, and visual hierarchy. Experience in compositing, masking, motion tracking, rotoscoping, and applying visual effects. Familiarity with sound design, color grading, and working with audio in video projects. Ability to integrate After Effects with Premiere Pro for efficient dynamic workflows. Portfolio showcasing a strong body of work in both motion graphics and video editing. Preferred (Bonus) Skills: Experience with 3D tools like Cinema 4D or Blender. Basic scripting or expressions in After Effects. Job Types: Full-time, Permanent Pay: ₹13,205.71 - ₹52,155.35 per month Benefits: Leave encashment Paid sick time Paid time off Schedule: Day shift Fixed shift Monday to Friday Weekend availability Supplemental Pay: Overtime pay Ability to commute/relocate: Gurgaon, Haryana: Reliably commute or planning to relocate before starting work (Preferred) Application Question(s): Current CTC Expected CTC Location: Gurgaon, Haryana (Preferred) Work Location: In person
Posted 1 month ago
0 years
1 - 1 Lacs
Farīdābād
On-site
We are looking for a creative and detail-oriented Houdini Artist to join our 3D production team. The ideal candidate should be skilled in procedural modeling using Houdini, have a good understanding of AI image generation tools like Midjourney, and have basic proficiency in Photoshop for post-processing and texture editing. This role is perfect for someone with a passion for combining traditional 3D workflows with modern AI-assisted creativity.
Proficiency in Houdini (procedural modeling, dynamics, particles, etc.).
Familiarity with Midjourney prompts or other AI image tools.
Basic Photoshop skills (layering, masking, cleanup).
Understanding of 3D pipelines and asset optimization.
Strong visual sense and ability to translate creative briefs into output.
Interest in blending AI + 3D workflows.
Knowledge of Blender, Unreal, or other 3D software is a plus.
Job Types: Full-time, Permanent, Fresher
Pay: ₹10,000.00 - ₹15,000.00 per month
Benefits: Paid sick time
Schedule: Day shift Fixed shift Morning shift
Supplemental Pay: Performance bonus
Work Location: In person
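As a loose illustration of the procedural workflow mentioned above (not part of the posting), the sketch below builds a tiny scatter-and-copy network with Houdini's Python API; it only runs inside a Houdini session, and node and parameter names follow the standard SOPs but may vary slightly between Houdini versions.

```python
# Hypothetical sketch: build a small procedural network from Houdini's Python shell.
import hou  # available only inside a running Houdini session

geo = hou.node("/obj").createNode("geo", node_name="procedural_demo")

box = geo.createNode("box")              # base shape
scatter = geo.createNode("scatter")      # scatter points over the box surface
scatter.setInput(0, box)
scatter.parm("npts").set(250)            # point count (parameter name in recent versions)

sphere = geo.createNode("sphere")
sphere.parm("scale").set(0.05)           # small spheres to instance

copy = geo.createNode("copytopoints")    # copy the sphere onto every scattered point
copy.setInput(0, sphere)
copy.setInput(1, scatter)
copy.setDisplayFlag(True)

geo.layoutChildren()                     # tidy the network layout
```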
Posted 1 month ago
0 years
0 Lacs
Bhubaneswar, Odisha, India
On-site
Shadow the design discussions the Senior Designer holds with clients; prepare Minutes of Meetings and track project milestones to ensure timely, high-quality delivery. Assist the Senior Designer in 3D designs using SpaceCraft (HomeLane software) and SketchUp; recommend enhancements and be a sounding board for the Senior Designer. Be available for Site Visits and Masking along with the Senior Designer; take on the responsibility of file management across HomeLane tech systems. Assist the Senior Designer in creating commercial proposals using SpaceCraft and other quoting tools; validate quotes to ensure customers get a transparent and fair estimate. Coordinate with various stakeholders to ensure a great design outcome; build relationships with teams like sales, drawing QC, project management teams and planners.
Mandatory Qualifications:
Design education background - B.Arch, B.Des, M.Des, Diploma in Design
0–1 year of experience in Interior Design / Architecture
Good communication & presentation skills
Basic knowledge of modular furniture
Practical knowledge of SketchUp
A great attitude.
Posted 1 month ago
7.0 years
0 Lacs
Chennai
On-site
Ford Credit's Tech Team in India is actively seeking a highly experienced and strategic Full Stack Automation Engineer with a proven background automating tests for industrial core banking platforms. In this tech role, you will be responsible for establishing, leading, and managing the test automation strategy, standards, and practices specifically for our core banking and integrated financial product systems. You will drive the design, development, and scaling of robust, full-stack automation frameworks, providing comprehensive test coverage for user interfaces, APIs, microservices, and critical integration layers interacting with enterprise core banking systems (such as those provided by Fiserv, FIS, Finacle, or similar). Your expertise will be crucial in ensuring the highest levels of quality, performance, security, and financial data accuracy through efficient and effective automated testing solutions. This position requires you to be a subject matter expert in full-stack test automation, with a demonstrated ability to build and scale test automation frameworks in complex, regulated environments. You will lead by example, mentor teams, and drive the necessary cultural change to embed advanced automation practices across the organization, ensuring the delivery of reliable and compliant financial software. Required Skills: Must Have: 7+ years of progressive experience in Quality Engineering and Test Automation. 5+ years of direct, hands-on testing, QA, or automation experience with at least one of the following industrial core banking platforms: Fiserv, FIS, or Finacle. Strong understanding of core banking domain concepts, processes, and data models (account lifecycle, transaction types, payment processing, interest calculation, regulatory reporting, customer data) and how they function within enterprise systems. Strong Scripting and Programming knowledge in languages such as Java, Python, JavaScript, or Groovy, with proven ability to build robust, maintainable automation frameworks and scripts for complex financial applications. Must have hands-on Experience in Developing Automation Scripts for UI using frameworks/tools like Selenium WebDriver, Appium, Playwright, or Cypress. Experience with BDD frameworks like Cucumber is required. (Experience with tools like Tosca is also valuable but focus on code-based automation skills). Must have strong experience in API Automation using tools/frameworks like Postman, SoapUI, or Rest Assured, specifically for testing APIs, web services, and microservices that interface with or extend the core banking platform. Extensive experience with database testing and advanced SQL scripting for data validation, test data management, and verifying transaction outcomes within relational databases. Exposure to MySQL, SQL Server, and/or PostgreSQL is required. Experience in using build tools like Gradle or Maven and testing frameworks like TestNG. Must have Experience in GitHub for version control and collaborative development of automation code. Very strong experience in designing, implementing, and maintaining CI/CD pipelines (preferred experience with Tekton, Cloud Build, and/or Jenkins) to integrate automated tests and implement quality gates for changes impacting core banking systems. Good to have Public cloud experience, especially GCP, demonstrating the ability to leverage cloud services for test environment management, test execution, and scaling automation infrastructure securely. 
Must have working experience in mobile cloud platforms like HeadSpin or Perfecto for automating testing of mobile banking applications. Must have strong experience with multi-channel and system integration testing, specifically validating data flow and interactions between the core banking system and other internal/external applications. Must have strong knowledge of data visualization and reporting using tools like Extent Report and QlikSense to effectively communicate test results, quality metrics, and automation coverage for banking applications. Experience in using test management tools like Xray, TestRail, or ALM for managing test cases, execution cycles, and defect tracking within a structured QA process. Must have experience in Jira for issue tracking and project management. Must have experience in designing and automating end-to-end user journeys that simulate real-world banking scenarios across multiple channels and system touchpoints. Ability to work effectively in diversified global teams and projects, collaborating across different time zones and cultures. Advanced troubleshooting skills, with the ability to diagnose and resolve complex issues across the full stack, particularly those involving core banking interactions or data discrepancies. Excellent communication, collaboration, and interpersonal skills, with the ability to articulate technical concepts and quality concerns clearly to both technical and non-technical stakeholders. Understanding of data security and privacy principles (data masking, encryption) and familiarity with regulatory compliance requirements in banking as they relate to testing and test data.
Nice to Have:
Experience with multiple of the listed core banking platforms (Fiserv, FIS, Finacle).
Knowledge of performance testing concepts and tools (e.g., JMeter, LoadRunner) for high-volume transaction systems.
Exposure to Unix and Linux environments for managing test execution or environments.
Exposure to AI tools like GenAI for potential applications in test case generation, test data creation, or test analysis.
Knowledge of current market trends in automation tools and frameworks, specifically in the FinTech or banking space.
Experience with Infrastructure as Code (IaC), virtualization, and container orchestration (Kubernetes - K8s) related to setting up test environments.
Preferred Qualification:
Bachelor’s Degree in Computer Science, Engineering, or equivalent work experience
Minimum of 5+ years of SDET experience
Minimum of 5+ years of Test Automation Engineering experience
Full Stack Automation Engineer, Core Banking - Role & Responsibilities:
Core Banking Automation Strategy & Standards: Establish, lead, and continuously refine the test automation strategy specifically for Ford Credit’s core banking applications and integrated financial products, ensuring rigorous quality standards aligned with business goals, regulatory requirements, and audit needs. Define and implement comprehensive test automation standards, best practices, and guidelines tailored for testing complex, high-transaction financial systems.
Full Stack Automation Development: Design, develop, and maintain scalable, robust automated test suites covering the full application stack – including UI (Web and Desktop applications), APIs, and Microservices – with a critical focus on components that interact directly with or extend the core banking platform.
Develop and expand advanced test automation frameworks, modernizing them to align with DevOps principles and cloud-native architectures. CI/CD Integration & Quality Gates: Enhance existing automation frameworks and develop new solutions to integrate seamlessly with CI/CD pipelines, ensuring continuous testing of core banking-related code changes. Design and implement automated quality gates and checkpoints within the CI/CD pipeline to prevent regressions and ensure the integrity of builds impacting core banking functionalities. Develop DevOps solutions for automating testing tasks, reporting, and automatically breaking builds upon critical test failures or quality degradation. Comprehensive Testing & Validation: Build and execute a comprehensive automated testing strategy covering unit, integration, regression, performance, and end-to-end testing, with a strong emphasis on validating core banking workflows, transaction processing, and financial data accuracy. Conduct meticulous software testing, verification, and validation of changes, especially focusing on preventing defects and incidents that could impact core banking operations or financial data integrity in production. Data Integrity & Test Data Management: Focus on automating tests that rigorously validate the accuracy, consistency, and integrity of financial data throughout its lifecycle within and across systems interacting with the core banking platform. Ensure the existence and availability of adequate, comprehensive, and appropriately obfuscated/anonymized test data that accurately reflects complex core banking scenarios and complies with data privacy standards and regulations. System Integration Testing: Develop and execute automated tests specifically for integration points between the core banking system and various upstream and downstream applications (e.g., payment gateways, general ledger systems, online/mobile banking platforms), validating data flow and system interactions. Compliance, Security, and Documentation: Create and maintain detailed testing evidence, test reports, and documentation for all automated tests, ensuring full compliance with internal policies, external regulations, and audit requirements specific to the financial industry. Incorporate security testing practices (e.g., API security testing) into automation where relevant, focusing on the secure handling of financial data. Identify and promote the adoption of best practices in code health, testability, observability, and maintainability within the automation code base and the applications being tested, contributing to the overall reliability and auditability of financial systems. Performance & Efficiency: Contribute to identifying and automating performance and load tests for critical core banking transactions and integration points to ensure scalability and responsiveness under peak financial loads. Continuously improve test strategies, test cases, and automation scripts to ensure optimal test coverage and efficient quality engineering practices for the core banking domain. Collaboration & Business Alignment: Collaborate closely with Product Owners, Business Analysts, Software Engineers, and Core Banking domain experts to understand complex financial requirements, define precise testing criteria, and prioritize automation efforts. Support development teams in troubleshooting and resolving technical issues, particularly those related to core banking integrations, data discrepancies, and test environment challenges. 
Leverage test automation insights to improve the reliability of core banking operations, contributing directly to positive business outcomes and streamlined financial processes.
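To make the API-layer automation described in this posting concrete, here is a minimal, illustrative sketch that is not from the posting and is written in Python with pytest and requests rather than the Java/Rest Assured stack the role names; the base URL, endpoint, fields, and token are all invented placeholders.

```python
# test_accounts_api.py — hypothetical contract-style API tests (run with pytest).
import requests

BASE_URL = "https://api.example-bank.test"   # placeholder test environment
TOKEN = "REPLACE_WITH_TEST_TOKEN"            # never hard-code real credentials

def _headers():
    return {"Authorization": f"Bearer {TOKEN}", "Accept": "application/json"}

def test_get_account_returns_expected_fields():
    resp = requests.get(f"{BASE_URL}/accounts/1234567890",
                        headers=_headers(), timeout=10)
    assert resp.status_code == 200
    body = resp.json()
    # Basic contract checks: field presence and a simple data-accuracy invariant.
    for field in ("accountId", "status", "currency", "availableBalance"):
        assert field in body
    assert body["availableBalance"] >= 0 or body["status"] == "OVERDRAWN"

def test_unauthenticated_request_is_rejected():
    resp = requests.get(f"{BASE_URL}/accounts/1234567890", timeout=10)
    assert resp.status_code in (401, 403)
```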
Posted 1 month ago
1.0 - 3.0 years
2 - 3 Lacs
Karūr
On-site
Minimum 1 to 3 years of experience needed in painting.
Able to paint jobs uniformly.
Basic knowledge of paint types and mixing.
Prepare surfaces by cleaning, sanding, filling (putty work), and masking as required.
Mix and apply paint, primers, and coatings as per specification.
Use brushes, rollers, and spray guns for paint application.
Maintain tools and painting equipment in good working condition.
Follow safety procedures and use personal protective equipment (PPE).
Job Types: Full-time, Fresher
Pay: ₹20,000.00 - ₹30,000.00 per month
Benefits: Food provided Leave encashment Provident Fund
Schedule: Day shift
Supplemental Pay: Overtime pay Yearly bonus
Experience: Painting: 2 years (Required)
Posted 1 month ago
6.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Role Summary: As a Senior Solution Engineer, you are a technical advisor and customer strategist, engaged across the customer lifecycle—from pre-sales discovery and solution design to post-sales enablement, implementation oversight, and value realization. You drive outcomes, build trust, and scale adoption through both strategic and hands-on execution.
Key Responsibilities
● Engage from Day 0: Lead discovery, identify gaps, and co-create value roadmaps.
● Deliver tailored demos, proofs of concept, and technical win strategies.
● Define SoWs and solution architectures that align to customer goals.
● Oversee deployment quality, readiness checks, and performance optimization.
● Act as an advisor during implementation and post-go-live phases.
● Conduct QBRs, MBRs, architecture reviews, and deliver value realization plans.
● Monitor license utilization, usage metrics, and time-to-value KPIs.
● Influence renewals and expansions through strong technical credibility.
Required Skills & Experience:
● 6+ years in solution engineering or technical architecture roles.
● Mastery of modern data clouds (Snowflake, Databricks, Cloudera).
● Proficient in RBAC/ABAC policy management, masking, tagging, and access control.
● Experience with container orchestration (Kubernetes), IaC (Terraform), and SSO (Okta, Azure AD).
● Exposure to metadata platforms (Atlan, Collibra, Unity).
● Capability to perform health checks, RCA, and platform audits.
● Track record of leading customer success programs and influencing commercial growth.
Preferred Qualifications:
● Cloud certifications (AWS, Azure, GCP), or data security (CISSP, CIPP).
● Experience in regulated industries (finance, healthcare, public sector).
● Strong API/scripting knowledge (Python, REST, CLI-based automation).
● Experience mentoring or enabling junior SEs or partner teams.
What You'll Gain:
● Strategic ownership of customer success from qualification to expansion.
● A voice in shaping product evolution and delivery best practices.
● Career path to Principal, Field CTO, or leadership in Customer Engineering.
● Opportunity to work on the frontier of data security and governance at scale.
Posted 1 month ago
10.0 years
0 Lacs
Kolkata metropolitan area, West Bengal, India
Remote
· Location: Hybrid/Remote
· Experience: 10+ Years
· Type: Full-time
Company Overview
Techylla is a specialized IT consulting firm with offices in India and the US, focused on delivering high-impact data and analytics solutions for clients. We help organizations leverage modern data platforms to unlock actionable insights and drive transformation. Our teams work at the intersection of business strategy and technical depth, delivering scalable solutions that are robust, secure, and built for the future. At Techylla, we foster a culture of accountability, collaboration, and innovation.
Role Overview
We are seeking a Senior Snowflake Solutions Architect to lead the design, implementation, and governance of complex, enterprise-grade data platforms built on Snowflake. This role requires an expert-level understanding of cloud data architecture, data modelling, performance tuning, cost governance, and modern data engineering frameworks. As a strategic thought partner to both technical and business stakeholders, you will define architecture standards, oversee end-to-end migrations, implement robust change data capture (CDC) frameworks, and ensure secure, scalable, and governed data solutions. You will also mentor engineering teams and play a hands-on role in technical solutioning across our client environments. This position is ideal for a senior professional with a proven record of leading Snowflake architecture at scale, deep knowledge of modern data stacks (dbt, Airflow, CI/CD), and a strong focus on business alignment, data governance, and platform sustainability.
Key Responsibilities
Define and own the Snowflake architecture strategy, ensuring alignment with enterprise goals, data governance policies, and operational requirements
Lead the migration of large-scale data assets from legacy and cloud data platforms into Snowflake
Architect and implement advanced CDC frameworks for real-time or near-real-time data replication
Design and optimize modular, scalable ELT pipelines using dbt, ensuring reusable and production-grade data transformations
Govern and monitor Snowflake environments through performance tuning, multi-cluster strategies, credit optimization, and cost forecasting
Oversee security and compliance in Snowflake through RBAC, encryption, data masking, OAuth 2.0, and integration with identity providers like Okta
Establish standards for metadata management, data lineage, and enterprise documentation
Drive adoption of best practices in data modeling (3NF, star, snowflake schemas), query optimization, and storage management
Integrate Snowflake with orchestration (Airflow, dbt Cloud), observability (Datadog), and CI/CD platforms (Bitbucket, Azure DevOps)
Provide technical leadership and mentorship to engineering teams across projects and clients
Conduct architectural reviews, proof-of-concepts, and tool evaluations for ingestion, processing, CDC, and security frameworks
Collaborate with business, analytics, and IT stakeholders to translate business goals into robust, secure data architectures
Implement principles of data quality monitoring, including integration with tools like Informatica IDQ or enterprise MDM platforms
Manage data sharing across environments, monitor query usage, and enforce governance around EOL data and reference datasets
Required Skills & Qualifications
Must-Have:
10+ years of experience in data engineering and architecture, with a strong focus on Snowflake over the last 4–5 years
Demonstrated leadership in enterprise-scale Snowflake platform design and optimization
Proven ability to plan and execute large data migrations and platform consolidations
Expertise in CDC tools and patterns (Kafka, Debezium, Fivetran, Qlik Replicate)
Advanced Snowflake administration skills: warehouse sizing, multi-cluster tuning, resource monitors, performance tracking
Mastery of SQL, dbt, and modern data transformation techniques
Experience integrating Snowflake into CI/CD, version-controlled workflows, and agile delivery models
Strong grasp of data governance, metadata management, and platform observability
Excellent communication, stakeholder engagement, and cross-functional leadership skills
Nice-to-Have:
Domain knowledge in SAP supply chain
Familiarity with Tableau or Power BI for analytics enablement
Exposure to Snowflake AI/ML capabilities
Experience advising on technical debt, platform modernization, or data architecture strategies
Snowflake Certifications, such as:
o SnowPro Core Certification
o SnowPro Advanced Architect Certification
o SnowPro Advanced Data Engineer Certification
Why Join Techylla?
At Techylla, you’ll operate at the forefront of cloud data architecture, influencing both strategy and execution across high-value client environments. You will have the opportunity to lead end-to-end initiatives, mentor top talent, and shape best practices for scalable, future-ready data platforms. This is a strategic role offering deep technical engagement, autonomy, and impact.
#Techylla #SnowflakeArchitect #DataEngineering #SnowPro #CloudDataArchitecture #SnowflakeSolutions #DataPlatformStrategy #SeniorDataRoles #CDC #dbt #Airflow #Azure #EnterpriseArchitecture #Hiring
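As a rough, self-contained illustration of the masking and RBAC governance work this role describes (not drawn from the posting), the sketch below applies a Snowflake column masking policy through the snowflake-connector-python driver; the account, role, warehouse, table, and column names are placeholders, and dynamic data masking requires a Snowflake edition that supports it.

```python
# Hypothetical sketch: create and attach a column masking policy in Snowflake.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="governance_admin", password="***",   # placeholders
    role="SECURITYADMIN", warehouse="ADMIN_WH", database="CRM", schema="PUBLIC",
)

statements = [
    # Only a privileged analyst role sees the raw value; everyone else sees a mask.
    """CREATE MASKING POLICY IF NOT EXISTS email_mask AS (val STRING)
       RETURNS STRING ->
       CASE WHEN CURRENT_ROLE() IN ('PII_ANALYST') THEN val
            ELSE '*** MASKED ***' END""",
    "ALTER TABLE customers MODIFY COLUMN email SET MASKING POLICY email_mask",
]

cur = conn.cursor()
try:
    for stmt in statements:
        cur.execute(stmt)
finally:
    cur.close()
    conn.close()
```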
Posted 1 month ago
0 years
20 - 25 Lacs
Pune, Maharashtra, India
On-site
We are a fast-growing data-analytics consultancy dedicated to the Life Sciences / Pharmaceutical commercial analytics space. Our teams build cloud-native data platforms that power sales, marketing, and patient-centric insights for leading global pharma brands—delivering compliant, high-impact solutions at enterprise scale.
Role & Responsibilities
Architect, build, and tune Snowflake data warehouses and ELT pipelines (SQL, Streams, Tasks, UDFs, Stored Procedures) to meet complex commercial-analytics workloads.
Integrate diverse pharma data sources (Veeva, Salesforce, IQVIA, Symphony, RWD, patient-services feeds) via Fivetran, ADF, or Python-based frameworks, ensuring end-to-end data quality.
Establish robust data models (star, snowflake, Data Vault) optimized for sales reporting, market-share analytics, and AI/ML use cases.
Drive governance & compliance (HIPAA, GDPR, GxP) through fine-grained access controls, masking, lineage, and metadata management.
Lead code reviews, mentor engineers, and resolve performance bottlenecks while right-sizing compute for cost efficiency.
Partner with business stakeholders to translate commercial objectives into scalable data solutions and actionable insights.
Skills & Qualifications
Must-Have
7+ yrs data-engineering / warehousing experience, incl. 4+ yrs hands-on Snowflake design & development.
Expert-level SQL plus strong data-modeling (Dimensional, Data Vault) and ETL/ELT optimization skills.
Proficiency in Python (or similar) for automation, API integrations, and orchestration.
Proven governance/security acumen within regulated industries (HIPAA, GDPR, PII).
Bachelor’s in Computer Science, Engineering, or Information Systems (Master’s preferred).
Strong client-facing communication and problem-solving ability in fast-paced, agile environments.
Preferred
Direct experience with pharma commercial datasets (sales, CRM, claims, MDM, adherence KPIs).
Cloud-platform depth (AWS, Azure, or GCP) and familiarity with tools such as Matillion/DBT/Airflow, Git.
Snowflake certifications (SnowPro Core / Advanced) plus Tableau, Power BI, or Qlik connectivity know-how.
Skills: Snowflake, AWS, analytics, sales, SQL, data, ETL/ELT optimization, Python, data warehousing, Azure, data modeling, data governance, cloud
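The Streams and Tasks mentioned above are Snowflake's built-in primitives for incremental ELT. As an invented illustration only (table, warehouse, and role names are placeholders, not from the posting), the sketch below creates a stream on a raw table and a scheduled task that loads new rows into an analytics table, submitted through snowflake-connector-python.

```python
# Hypothetical sketch: incremental load with a Snowflake Stream + Task.
import snowflake.connector

ELT_DDL = """
CREATE OR REPLACE STREAM raw_sales_stream ON TABLE raw.sales;

CREATE OR REPLACE TASK load_sales_fact
  WAREHOUSE = transform_wh
  SCHEDULE  = '15 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('raw_sales_stream')
AS
INSERT INTO analytics.fct_sales (sale_id, product_id, territory_id, sale_date, units, amount)
SELECT sale_id, product_id, territory_id, sale_date, units, amount
FROM raw_sales_stream
WHERE METADATA$ACTION = 'INSERT';

ALTER TASK load_sales_fact RESUME;
"""

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",   # placeholders
    role="TRANSFORMER", warehouse="transform_wh", database="PHARMA_DW",
)
try:
    conn.execute_string(ELT_DDL)   # runs each ';'-terminated statement in order
finally:
    conn.close()
```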
Posted 1 month ago
7.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description Ford Credit's Tech Team in India is actively seeking a highly experienced and strategic Full Stack Automation Engineer with a proven background automating tests for industrial core banking platforms. In this tech role, you will be responsible for establishing, leading, and managing the test automation strategy, standards, and practices specifically for our core banking and integrated financial product systems. You will drive the design, development, and scaling of robust, full-stack automation frameworks, providing comprehensive test coverage for user interfaces, APIs, microservices, and critical integration layers interacting with enterprise core banking systems (such as those provided by Fiserv, FIS, Finacle, or similar). Your expertise will be crucial in ensuring the highest levels of quality, performance, security, and financial data accuracy through efficient and effective automated testing solutions. This position requires you to be a subject matter expert in full-stack test automation, with a demonstrated ability to build and scale test automation frameworks in complex, regulated environments. You will lead by example, mentor teams, and drive the necessary cultural change to embed advanced automation practices across the organization, ensuring the delivery of reliable and compliant financial software. Responsibilities Full Stack Automation Engineer, Core Banking - Role & Responsibilities : Core Banking Automation Strategy & Standards: Establish, lead, and continuously refine the test automation strategy specifically for Ford Credit’s core banking applications and integrated financial products, ensuring rigorous quality standards aligned with business goals, regulatory requirements, and audit needs. Define and implement comprehensive test automation standards, best practices, and guidelines tailored for testing complex, high-transaction financial systems. Full Stack Automation Development: Design, develop, and maintain scalable, robust automated test suites covering the full application stack – including UI (Web and Desktop applications), APIs, and Microservices – with a critical focus on components that interact directly with or extend the core banking platform. Develop and expand advanced test automation frameworks, modernizing them to align with DevOps principles and cloud-native architectures. CI/CD Integration & Quality Gates: Enhance existing automation frameworks and develop new solutions to integrate seamlessly with CI/CD pipelines, ensuring continuous testing of core banking-related code changes. Design and implement automated quality gates and checkpoints within the CI/CD pipeline to prevent regressions and ensure the integrity of builds impacting core banking functionalities. Develop DevOps solutions for automating testing tasks, reporting, and automatically breaking builds upon critical test failures or quality degradation. Comprehensive Testing & Validation: Build and execute a comprehensive automated testing strategy covering unit, integration, regression, performance, and end-to-end testing, with a strong emphasis on validating core banking workflows, transaction processing, and financial data accuracy. Conduct meticulous software testing, verification, and validation of changes, especially focusing on preventing defects and incidents that could impact core banking operations or financial data integrity in production. 
Data Integrity & Test Data Management: Focus on automating tests that rigorously validate the accuracy, consistency, and integrity of financial data throughout its lifecycle within and across systems interacting with the core banking platform. Ensure the existence and availability of adequate, comprehensive, and appropriately obfuscated/anonymized test data that accurately reflects complex core banking scenarios and complies with data privacy standards and regulations. System Integration Testing: Develop and execute automated tests specifically for integration points between the core banking system and various upstream and downstream applications (e.g., payment gateways, general ledger systems, online/mobile banking platforms), validating data flow and system interactions. Compliance, Security, and Documentation: Create and maintain detailed testing evidence, test reports, and documentation for all automated tests, ensuring full compliance with internal policies, external regulations, and audit requirements specific to the financial industry. Incorporate security testing practices (e.g., API security testing) into automation where relevant, focusing on the secure handling of financial data. Identify and promote the adoption of best practices in code health, testability, observability, and maintainability within the automation code base and the applications being tested, contributing to the overall reliability and auditability of financial systems. Performance & Efficiency: Contribute to identifying and automating performance and load tests for critical core banking transactions and integration points to ensure scalability and responsiveness under peak financial loads. Continuously improve test strategies, test cases, and automation scripts to ensure optimal test coverage and efficient quality engineering practices for the core banking domain. Collaboration & Business Alignment: Collaborate closely with Product Owners, Business Analysts, Software Engineers, and Core Banking domain experts to understand complex financial requirements, define precise testing criteria, and prioritize automation efforts. Support development teams in troubleshooting and resolving technical issues, particularly those related to core banking integrations, data discrepancies, and test environment challenges. Leverage test automation insights to improve the reliability of core banking operations, contributing directly to positive business outcomes and streamlined financial processes. Qualifications Required Skills: Must Have: 7+ years of progressive experience in Quality Engineering and Test Automation. 5+ years of direct, hands-on testing, QA, or automation experience with at least one of the following industrial core banking platforms: Fiserv, FIS, or Finacle. Strong understanding of core banking domain concepts, processes, and data models (account lifecycle, transaction types, payment processing, interest calculation, regulatory reporting, customer data) and how they function within enterprise systems. Strong Scripting and Programming knowledge in languages such as Java, Python, JavaScript, or Groovy, with proven ability to build robust, maintainable automation frameworks and scripts for complex financial applications. Must have hands-on Experience in Developing Automation Scripts for UI using frameworks/tools like Selenium WebDriver, Appium, Playwright, or Cypress. Experience with BDD frameworks like Cucumber is required. 
(Experience with tools like Tosca is also valuable, but the focus is on code-based automation skills.) Must have strong experience in API automation using tools/frameworks like Postman, SoapUI, or Rest Assured, specifically for testing APIs, web services, and microservices that interface with or extend the core banking platform (an illustrative sketch follows this posting). Extensive experience with database testing and advanced SQL scripting for data validation, test data management, and verifying transaction outcomes within relational databases; exposure to MySQL, SQL Server, and/or PostgreSQL is required. Experience in using build tools like Gradle or Maven and testing frameworks like TestNG. Must have experience in GitHub for version control and collaborative development of automation code. Very strong experience in designing, implementing, and maintaining CI/CD pipelines (preferred experience with Tekton, Cloud Build, and/or Jenkins) to integrate automated tests and implement quality gates for changes impacting core banking systems. Good to have: public cloud experience, especially GCP, demonstrating the ability to leverage cloud services for test environment management, test execution, and scaling automation infrastructure securely. Must have working experience in mobile cloud platforms like HeadSpin or Perfecto for automating testing of mobile banking applications. Must have strong experience with multi-channel and system integration testing, specifically validating data flow and interactions between the core banking system and other internal/external applications. Must have strong knowledge of data visualization and reporting using tools like Extent Reports and Qlik Sense to effectively communicate test results, quality metrics, and automation coverage for banking applications. Experience in using test management tools like Xray, TestRail, or ALM for managing test cases, execution cycles, and defect tracking within a structured QA process. Must have experience in Jira for issue tracking and project management. Must have experience in designing and automating end-to-end user journeys that simulate real-world banking scenarios across multiple channels and system touchpoints. Ability to work effectively in diversified global teams and projects, collaborating across different time zones and cultures. Advanced troubleshooting skills, with the ability to diagnose and resolve complex issues across the full stack, particularly those involving core banking interactions or data discrepancies. Excellent communication, collaboration, and interpersonal skills, with the ability to articulate technical concepts and quality concerns clearly to both technical and non-technical stakeholders. Understanding of data security and privacy principles (data masking, encryption) and familiarity with regulatory compliance requirements in banking as they relate to testing and test data. Nice to Have: Experience with more than one of the listed core banking platforms (Fiserv, FIS, Finacle). Knowledge of performance testing concepts and tools (e.g., JMeter, LoadRunner) for high-volume transaction systems. Exposure to Unix and Linux environments for managing test execution or environments. Exposure to AI tools like GenAI for potential applications in test case generation, test data creation, or test analysis. Knowledge of current market trends in automation tools and frameworks, specifically in the FinTech or banking space.
Experience with Infrastructure as Code (IaC), virtualization, and container orchestration (Kubernetes/K8s) related to setting up test environments. Preferred Qualifications: Bachelor's degree in Computer Science, Engineering, or equivalent work experience; at least 5 years of SDET experience; at least 5 years of test automation engineering experience.
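For readers unfamiliar with the API-automation requirement referenced above, here is a tiny, hypothetical sketch of the pattern. The posting names Java-based tooling such as Rest Assured; this example uses Python's requests with pytest purely to illustrate the idea, and the endpoint, payload, and response fields are invented.

```python
# pip install pytest requests
import requests

BASE_URL = "https://api.example-bank.test"  # hypothetical core-banking gateway


def post_transfer(payload: dict) -> requests.Response:
    """Submit a funds-transfer request to the (hypothetical) payments API."""
    return requests.post(f"{BASE_URL}/v1/transfers", json=payload, timeout=10)


def test_transfer_is_accepted_and_echoes_amount():
    payload = {"from_account": "1001", "to_account": "2002", "amount": "250.00", "currency": "INR"}
    response = post_transfer(payload)

    # Contract checks: status code, content type, and key response fields.
    assert response.status_code == 201
    assert response.headers["Content-Type"].startswith("application/json")
    body = response.json()
    assert body["status"] == "ACCEPTED"
    assert body["amount"] == "250.00"
```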
Posted 1 month ago
0 years
0 Lacs
Ranchi, Jharkhand, India
On-site
Company Description AGITUFF is a registered trademark of Abhishek Glass Industries Limited, which produces a wide range of glass products including Toughened Glass, Insulating Glass, Laminated Glass, Switchable Glass, Bend Glass, Decorative Glass, Floor Springs, Patch Fittings, Automatic Sensor Doors, Masking Tape, and Aluminium Composite Panels (ACP). Role Description The Sales Manager will be responsible for leading and managing the sales team, developing sales strategies, and ensuring the achievement of sales targets. Daily tasks will include identifying new business opportunities, building and maintaining client relationships, conducting market research, and preparing sales reports. This is a full-time, on-site role located in Durgapur. Qualifications Experience in sales, client relationship management, and business development Strong leadership, team management, and motivational skills Market research and strategic planning abilities Excellent communication, presentation, and negotiation skills Proficiency in using CRM software and Microsoft Office Suite Ability to work independently and as part of a team Bachelor's degree in Business Administration, Marketing, or related field
Posted 1 month ago
0 years
0 Lacs
Coimbatore, Tamil Nadu, India
On-site
Shadow the design discussions the Senior Designer holds with clients; prepare Minutes of Meetings and keep track of project milestones to ensure timely, high-quality delivery.
Assist the Senior Designer with 3D designs using SpaceCraft (HomeLane software) and SketchUp; recommend enhancements and be a sounding board for the Senior Designer.
Be available for site visits and masking along with the Senior Designer; take on the responsibility of file management across HomeLane tech systems.
Assist the Senior Designer in creating commercial proposals using SpaceCraft and other quoting tools; validate quotes to ensure customers get a transparent and fair estimate.
Coordinate with various stakeholders to ensure a great design outcome; build relationships with teams like sales, drawing QC, project management teams, and planners.
Mandatory Qualifications:
Design education background - B.Arch, B.Des, M.Des, Diploma in Design
0–1 year of experience in Interior Design / Architecture
Good communication & presentation skills
Basic knowledge of modular furniture
Practical knowledge of SketchUp
A great attitude.
Posted 1 month ago
1.0 - 5.0 years
3 Lacs
Chennai
On-site
Job Title: Powder Coating Operator
Department: Production / Finishing
Reports To: Production Supervisor
Job Summary: Elite Elevators is seeking a skilled and detail-oriented Powder Coating Operator with 1–5 years of experience to join our manufacturing team. The ideal candidate will be responsible for preparing and applying powder coatings to metal components used in our premium residential elevator systems, ensuring high-quality surface finishes and adherence to safety and quality standards.
Key Responsibilities:
Prepare surfaces for coating by cleaning, sanding, masking, or applying pre-treatment chemicals.
Set up and operate manual or automated powder coating equipment.
Apply powder coatings evenly to components, ensuring correct thickness and finish quality.
Monitor oven temperatures and curing cycles to ensure optimal adhesion and finish.
Inspect coated items for defects and perform touch-ups or rework as required.
Maintain cleanliness of coating booths, equipment, and work areas.
Follow safety protocols, including proper use of PPE and ventilation systems.
Maintain records of daily production, materials used, and process parameters.
Coordinate with the Quality Control team to ensure finished components meet company standards.
Report equipment issues or maintenance needs promptly.
Job Type: Full-time
Pay: ₹25,000.00 per month
Benefits:
Health insurance
Leave encashment
Paid sick time
Provident Fund
Schedule:
Rotational shift
Supplemental Pay:
Overtime pay
Performance bonus
Work Location: In person
Posted 1 month ago
3.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Project Role: Software Development Engineer
Project Role Description: Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work.
Must-have skills: Data Masking
Good-to-have skills: ASP.NET MVC, Angular
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Software Development Engineer, you will engage in a dynamic work environment where you will analyze, design, code, and test various components of application code for multiple clients. Your day will involve collaborating with team members to ensure the successful implementation of software solutions, while also performing maintenance and enhancements to existing applications. You will be responsible for delivering high-quality code and contributing to the overall success of the projects you are involved in, ensuring that client requirements are met effectively and efficiently.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to gather requirements and provide technical insights.
- Conduct code reviews to ensure adherence to best practices and coding standards.
Professional & Technical Skills:
- Must-Have Skills: Proficiency in Data Masking.
- Good-to-Have Skills: Experience with ASP.NET MVC, Angular.
- Strong understanding of software development life cycle methodologies.
- Experience with database management and data security practices.
- Familiarity with version control systems such as Git.
Additional Information:
- The candidate should have a minimum of 3 years of experience in Data Masking.
- This position is based at our Mumbai office.
- A 15 years full time education is required.
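Since the must-have skill here is data masking and no specific tool is named, the following is a generic, illustrative Python sketch of two common approaches: deterministic pseudonymization (hashing) for identifiers and partial masking for display values. It is not tied to any particular product the client may use, and the salt and sample record are invented.

```python
import hashlib

SECRET_SALT = "rotate-me"  # hypothetical salt; manage via a secrets store in practice


def pseudonymize(value: str) -> str:
    """Deterministically replace an identifier so joins still work on masked data."""
    digest = hashlib.sha256((SECRET_SALT + value).encode("utf-8")).hexdigest()
    return digest[:16]


def mask_account_number(account: str) -> str:
    """Show only the last four digits, e.g. '1234567890' -> '******7890'."""
    return "*" * max(len(account) - 4, 0) + account[-4:]


if __name__ == "__main__":
    record = {"customer_id": "CUST-00123", "account": "1234567890"}
    masked = {
        "customer_id": pseudonymize(record["customer_id"]),
        "account": mask_account_number(record["account"]),
    }
    print(masked)
```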
Posted 1 month ago
10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
This role is for a true database leader — someone who has designed, scaled, and maintained mission-critical MS SQL Server environments that handle terabytes of data and thousands of concurrent users. You'll own every aspect of the SQL Server stack — from core engine tuning and HA/DR setup to schema optimization and security hardening. This is a pure on-premises DBA role, built for those who understand the cost of downtime and the value of precision. You'll work closely with engineering, DevOps, support, and business teams to ensure our data layer is fast, available, secure, and future-ready.
Key Responsibilities
Architecture & Design
Design, implement, and support enterprise-scale SQL Server infrastructure (SQL Server 2014 through 2022)
Architect and maintain Always On Availability Groups, clustering, log shipping, and replication across data centers
Build fault-tolerant database clusters optimized for low-latency and high-throughput use cases
Evaluate and recommend hardware, disk I/O strategies, and storage tiers for OLTP and OLAP workloads
Performance & Optimization
Analyze execution plans and wait stats to optimize server, query, and index performance
Implement and refine indexing, partitioning, and compression strategies for very large databases (1TB+)
Tune system-wide performance across tempdb, memory grants, parallelism, and disk subsystems
Automate performance monitoring and anomaly detection
Operations & Automation
Own and manage all backup, restore, and recovery strategies using native tools and scripting
Automate maintenance plans using SQL Agent, PowerShell, and in-house tooling
Implement robust alerting and health checks across 24x7 production environments
Lead patching, version upgrades, and schema deployments with zero downtime
Security & Compliance
Define and enforce least-privilege access, authentication protocols, and encryption policies
Implement row-level security, auditing, data masking, and compliance controls (HIPAA, GDPR, ISO)
Conduct periodic vulnerability scans and participate in audit reviews
Incident Management & Troubleshooting
Lead RCA efforts on slowdowns, deadlocks, blocking, I/O contention, and unplanned outages
Coordinate escalation with vendors and internal teams for resolution and knowledge sharing
Document incident runbooks and create preventive SOPs
Collaboration & Mentorship
Act as the primary DBA resource for application development, DevOps, and infrastructure teams
Guide developers on query design, execution plans, indexing, and transaction control
Mentor junior DBAs and create a knowledge-driven culture of continuous improvement
Required Experience & Skills
10+ years in hands-on SQL Server DBA roles in high-volume, on-prem environments
Expertise in HA/DR solutions: Always On AGs, failover clustering, log shipping, replication
In-depth understanding of SQL Server internals: memory architecture, query engine, locking, wait types
Proficiency in T-SQL, dynamic SQL, and procedural development
Strong in performance troubleshooting using DMVs, Extended Events, Query Store, Profiler
Advanced PowerShell scripting for task automation and orchestration
Proven experience with backup tools (native, Redgate, Quest) and enterprise monitoring solutions
Experience managing databases over 1TB with more than 1000 concurrent sessions
Solid understanding of Windows Server, Active Directory, and storage subsystems
Strong documentation, incident reporting, and change management discipline
Nice to Have
Experience with SSIS, SSRS (maintenance and troubleshooting)
Familiarity with SAN/NAS storage tuning
Basic understanding of DevOps pipelines for DB changes (Liquibase, Redgate, etc.)
What You’ll Get
Competitive pay with bonus structure linked to uptime, performance, and ownership
A chance to lead high-availability database systems with real impact
Work with senior engineering and infrastructure teams in a flat, collaborative culture
Clear promotion paths and quarterly reviews
Training budget and paid certifications to stay ahead in SQL Server and automation
An environment that values accountability, craftsmanship, and clarity
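As an illustration of the DMV-based performance troubleshooting this posting emphasizes, here is a small sketch that pulls the top waits from sys.dm_os_wait_stats. The posting highlights PowerShell for automation; this sketch uses Python with pyodbc simply to keep code samples in one language, and the server name and ODBC driver are assumptions about the environment.

```python
import pyodbc  # pip install pyodbc; assumes Microsoft ODBC Driver 18 for SQL Server is installed

CONN_STR = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=sqlprod01;DATABASE=master;"          # hypothetical server name
    "Trusted_Connection=yes;TrustServerCertificate=yes;"
)

# Common benign waits that are usually excluded from tuning analysis.
IGNORED = ("SLEEP_TASK", "BROKER_TASK_STOP", "WAITFOR", "LAZYWRITER_SLEEP", "XE_TIMER_EVENT")

QUERY = f"""
SELECT TOP (10)
       wait_type,
       waiting_tasks_count,
       wait_time_ms,
       signal_wait_time_ms
FROM sys.dm_os_wait_stats
WHERE wait_type NOT IN ({", ".join("?" for _ in IGNORED)})
ORDER BY wait_time_ms DESC;
"""

conn = pyodbc.connect(CONN_STR)
rows = conn.cursor().execute(QUERY, IGNORED).fetchall()
for wait_type, tasks, wait_ms, signal_ms in rows:
    print(f"{wait_type:<30} tasks={tasks:>8} wait_ms={wait_ms:>12} signal_ms={signal_ms:>10}")
conn.close()
```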
Posted 1 month ago
5.0 years
10 Lacs
Hyderābād
On-site
To get the best candidate experience, please consider applying for a maximum of 3 roles within 12 months to ensure you are not duplicating efforts.
Job Category
Software Engineering
Job Details
About Salesforce
We’re Salesforce, the Customer Company, inspiring the future of business with AI + Data + CRM. Leading with our core values, we help companies across every industry blaze new trails and connect with customers in a whole new way. And, we empower you to be a Trailblazer, too — driving your performance and career growth, charting new paths, and improving the state of the world. If you believe in business as the greatest platform for change and in companies doing well and doing good – you’ve come to the right place.
Salesforce is looking for a Senior Software Engineer to join the Trailhead team. Trailhead is an online learning platform created by Salesforce with a big, bold mission to democratize education and skill up anyone for the future of work. The Trailhead team has immediate opportunities for talented software engineers who want to make a significant and measurable positive impact on users, the company’s bottom line, and the industry. Trailhead is where developers, admins, and business users get the skills they need for the jobs of the future. And thanks to gamification they have a little fun along the way. This is a rare opportunity to build something that positively impacts millions of users, helping folks develop new skills and break into new careers. Feel free to explore our app, trailhead.salesforce.com, and maybe even snag a few badges (we'd recommend the Introduction to Agentforce module)! Bonus points if you download the Trailhead GO app from the App Store and earn the badge on mobile!
The team focuses on understanding our Trailblazers’ career needs and optimising their learning journey. We build solutions across product and marketing based on the full point of view of the Trailblazer to cultivate more credentialed, employable individuals in the Salesforce ecosystem. We multiply our efforts across the Trailhead marketing, engineering, content, and credentialing teams to align our strategies and change the culture to use data to make decisions.
In this role, you will work on building data pipelines and optimizing and delivering data for core Trailhead KPIs. You will also contribute to setting the vision for and delivering the future of Trailhead core analytical funnel metrics and user behavior tracking/experiments. You will work on high-impact and high-visibility projects that are used by Salesforce executives. You will be encouraged to leverage and implement the latest Salesforce products and technologies. In addition, you will often be challenged to solve ad-hoc/unstructured problems in a highly fast-paced environment and to partner with key stakeholders across teams.
Equality is a core value at Salesforce. We strive to create workplaces that reflect the communities we serve and where everyone feels empowered to bring their full, authentic selves to work. People of different backgrounds, experiences, abilities, and perspectives are warmly encouraged to apply.
Responsibilities
Build & maintain pipelines – Develop Airflow workflows to ingest data from S3, APIs, and Kafka into Snowflake, ensuring reliability and scalability (see the illustrative sketch after this posting).
Define data contracts & governance – Align with source teams on schemas/SLAs and enforce data classification, masking, and privacy standards.
Model for analytics – Create well-structured fact/dimension tables and business measures that power self-service dashboards. Safeguard data quality & lineage – Automate tests, monitoring, and lineage tracking to surface issues early and expedite root-cause analysis. Enable collaboration & learning – Partner with analysts and data scientists, document data definitions, and share best practices across the team. About You Collaborative team player who is kind, friendly, and cares about doing the right thing Desire to keep learning and growing, both technically and otherwise, and keeping informed of new data engineering methods and techniques Ability to ask good questions and learn quickly Openness and courage to give and receive feedback Respect towards people from diverse backgrounds and commitment to upholding diversity, equity, and inclusion at work Some Qualifications We Look For B.S/M.S. in Computer Sciences or equivalent field, and 5+ years of relevant experience within big data engineering Excellent understanding of data structures and distributed data processing patterns Experience with many of the following: Implementing and operating big data technologies like Redshift, Hadoop, Spark, Presto, Hive, etc. especially in the evolving areas of security, compliance (GDPR/CCPA/Data Privacy), and data retention Cloud computing and data processing, preferably AWS, security, cluster sizing, and performance tuning ETL design and implementing pipelines in languages like Java, Scala or scripting in Python Hands on experience with Airflow, CI/CD pipelines via Jenkins or similar tools, GitHub Well versed with Snowflake/Google BigQuery/Redshift. Version control systems (Github, Stash, etc..) and deployment tools Implementing and managing Python open-source data orchestration tools such as Airflow, Pandas, etc Experience working with Web analytics platforms, metrics, and data sets (Google Analytics preferred) Plusses Salesforce experience/ certification is a plus but not required Heroku app development experience is a plus but not required Data Cloud experience is a plus but not required Accommodations If you require assistance due to a disability applying for open positions please submit a request via this Accommodations Request Form . Posting Statement Salesforce is an equal opportunity employer and maintains a policy of non-discrimination with all employees and applicants for employment. What does that mean exactly? It means that at Salesforce, we believe in equality for all. And we believe we can lead the path to equality in part by creating a workplace that’s inclusive, and free from discrimination. Know your rights: workplace discrimination is illegal. Any employee or potential employee will be assessed on the basis of merit, competence and qualifications – without regard to race, religion, color, national origin, sex, sexual orientation, gender expression or identity, transgender status, age, disability, veteran or marital status, political viewpoint, or other classifications protected by law. This policy applies to current and prospective employees, no matter where they are in their Salesforce employment journey. It also applies to recruiting, hiring, job assignment, compensation, promotion, benefits, training, assessment of job performance, discipline, termination, and everything in between. Recruiting, hiring, and promotion decisions at Salesforce are fair and based on merit. 
The same goes for compensation, benefits, promotions, transfers, reduction in workforce, recall, training, and education.
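The responsibilities above mention Airflow workflows that land S3 data in Snowflake; the skeleton below is a hypothetical sketch of that shape using core Airflow 2.x operators only. The DAG id, bucket, table, and callables are invented, and a real load would typically use a COPY INTO statement or a Snowflake provider operator rather than placeholder functions.

```python
# A minimal, hypothetical Airflow 2.x DAG: stage files from S3, load into Snowflake,
# then run a simple row-count quality check. Names and callables are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def stage_from_s3(**context):
    """Placeholder: list/download new objects from s3://trailhead-raw/events/."""


def load_into_snowflake(**context):
    """Placeholder: run a COPY INTO raw.events statement via the Snowflake connection."""


def check_row_counts(**context):
    """Placeholder: fail the run if today's partition is empty or out of tolerance."""


with DAG(
    dag_id="trailhead_events_s3_to_snowflake",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",   # Airflow >= 2.4 can use schedule="@daily" instead
    catchup=False,
    tags=["example"],
) as dag:
    stage = PythonOperator(task_id="stage_from_s3", python_callable=stage_from_s3)
    load = PythonOperator(task_id="load_into_snowflake", python_callable=load_into_snowflake)
    check = PythonOperator(task_id="check_row_counts", python_callable=check_row_counts)

    stage >> load >> check
```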
Posted 1 month ago
3.0 years
20 - 23 Lacs
Chennai, Tamil Nadu, India
On-site
We are hiring a detail-oriented and technically strong ETL Test Engineer to validate, verify, and maintain the quality of complex ETL pipelines and Data Warehouse systems. The ideal candidate will have a solid understanding of SQL, data validation techniques, regression testing, and Azure-based data platforms including Databricks.
Key Responsibilities
Perform comprehensive testing of ETL pipelines, ensuring data accuracy and completeness across systems.
Validate Data Warehouse (DWH) objects including fact and dimension tables.
Design and execute test cases and test plans for data extraction, transformation, and loading processes.
Conduct regression testing to validate enhancements and ensure no breakage of existing data flows.
Work with SQL to write complex queries for data verification and backend testing.
Test data processing workflows in Azure Data Factory and Databricks environments.
Collaborate with developers, data engineers, and business analysts to understand requirements and raise defects proactively.
Perform root cause analysis for data-related issues and suggest improvements.
Create clear and concise test documentation, logs, and reports.
Required Technical Skills
Strong knowledge of ETL testing methodologies and tools
Excellent skills in SQL (joins, aggregation, subqueries, performance tuning)
Hands-on experience with Data Warehousing and data models (Star/Snowflake)
Experience in test case creation, execution, defect logging, and closure
Proficient in regression testing, data validation, data reconciliation
Working knowledge of Azure Data Factory (ADF), Azure Synapse, and Databricks
Experience with test management tools like JIRA, TestRail, or HP ALM
Nice to Have
Exposure to automation testing for data pipelines
Scripting knowledge in Python or PySpark
Understanding of CI/CD in data testing
Experience with data masking, data governance, and privacy rules
Qualifications
Bachelor’s degree in Computer Science, Information Systems, or related field
3+ years of hands-on experience in ETL/Data Warehouse testing
Excellent analytical and problem-solving skills
Strong attention to detail and communication skills
Skills: regression, azure, data reconciliation, test management tools, data validation, azure databricks, etl testing, data warehousing, dwh, etl pipeline, test case creation, azure data factory, test cases, etl tester, regression testing, sql, databricks
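As a small illustration of the data reconciliation and regression-style checks described above, here is a self-contained, hypothetical sketch: a helper compares source and target key sets, and a pytest case exercises it with in-memory sample rows. In a real pipeline the two row sets would come from SQL queries against the source system and the warehouse; the table and column names here are invented.

```python
# pip install pytest
from typing import Dict, Iterable, Tuple


def reconcile(source: Iterable[Tuple], target: Iterable[Tuple], key_index: int = 0) -> Dict[str, set]:
    """Compare two row sets by business key and report what is missing or unexpected."""
    source_keys = {row[key_index] for row in source}
    target_keys = {row[key_index] for row in target}
    return {
        "missing_in_target": source_keys - target_keys,
        "unexpected_in_target": target_keys - source_keys,
    }


def test_all_source_orders_reach_the_warehouse():
    # Stand-ins for "SELECT order_id, amount FROM src.orders" and the DWH fact table.
    source_rows = [("ORD-1", 100.0), ("ORD-2", 250.0), ("ORD-3", 75.5)]
    target_rows = [("ORD-1", 100.0), ("ORD-2", 250.0), ("ORD-3", 75.5)]

    result = reconcile(source_rows, target_rows)
    assert not result["missing_in_target"]
    assert not result["unexpected_in_target"]
```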
Posted 1 month ago
3.0 years
20 - 23 Lacs
Gurugram, Haryana, India
On-site
We are hiring a detail-oriented and technically strong ETL Test Engineer to validate, verify, and maintain the quality of complex ETL pipelines and Data Warehouse systems. The ideal candidate will have a solid understanding of SQL, data validation techniques, regression testing, and Azure-based data platforms including Databricks.
Key Responsibilities
Perform comprehensive testing of ETL pipelines, ensuring data accuracy and completeness across systems.
Validate Data Warehouse (DWH) objects including fact and dimension tables.
Design and execute test cases and test plans for data extraction, transformation, and loading processes.
Conduct regression testing to validate enhancements and ensure no breakage of existing data flows.
Work with SQL to write complex queries for data verification and backend testing.
Test data processing workflows in Azure Data Factory and Databricks environments.
Collaborate with developers, data engineers, and business analysts to understand requirements and raise defects proactively.
Perform root cause analysis for data-related issues and suggest improvements.
Create clear and concise test documentation, logs, and reports.
Required Technical Skills
Strong knowledge of ETL testing methodologies and tools
Excellent skills in SQL (joins, aggregation, subqueries, performance tuning)
Hands-on experience with Data Warehousing and data models (Star/Snowflake)
Experience in test case creation, execution, defect logging, and closure
Proficient in regression testing, data validation, data reconciliation
Working knowledge of Azure Data Factory (ADF), Azure Synapse, and Databricks
Experience with test management tools like JIRA, TestRail, or HP ALM
Nice to Have
Exposure to automation testing for data pipelines
Scripting knowledge in Python or PySpark
Understanding of CI/CD in data testing
Experience with data masking, data governance, and privacy rules
Qualifications
Bachelor’s degree in Computer Science, Information Systems, or related field
3+ years of hands-on experience in ETL/Data Warehouse testing
Excellent analytical and problem-solving skills
Strong attention to detail and communication skills
Skills: regression, azure, data reconciliation, test management tools, data validation, azure databricks, etl testing, data warehousing, dwh, etl pipeline, test case creation, azure data factory, test cases, etl tester, regression testing, sql, databricks
Posted 1 month ago
3.0 years
20 - 23 Lacs
Pune, Maharashtra, India
On-site
We are hiring a detail-oriented and technically strong ETL Test Engineer to validate, verify, and maintain the quality of complex ETL pipelines and Data Warehouse systems. The ideal candidate will have a solid understanding of SQL, data validation techniques, regression testing, and Azure-based data platforms including Databricks.
Key Responsibilities
Perform comprehensive testing of ETL pipelines, ensuring data accuracy and completeness across systems.
Validate Data Warehouse (DWH) objects including fact and dimension tables.
Design and execute test cases and test plans for data extraction, transformation, and loading processes.
Conduct regression testing to validate enhancements and ensure no breakage of existing data flows.
Work with SQL to write complex queries for data verification and backend testing.
Test data processing workflows in Azure Data Factory and Databricks environments.
Collaborate with developers, data engineers, and business analysts to understand requirements and raise defects proactively.
Perform root cause analysis for data-related issues and suggest improvements.
Create clear and concise test documentation, logs, and reports.
Required Technical Skills
Strong knowledge of ETL testing methodologies and tools
Excellent skills in SQL (joins, aggregation, subqueries, performance tuning)
Hands-on experience with Data Warehousing and data models (Star/Snowflake)
Experience in test case creation, execution, defect logging, and closure
Proficient in regression testing, data validation, data reconciliation
Working knowledge of Azure Data Factory (ADF), Azure Synapse, and Databricks
Experience with test management tools like JIRA, TestRail, or HP ALM
Nice to Have
Exposure to automation testing for data pipelines
Scripting knowledge in Python or PySpark
Understanding of CI/CD in data testing
Experience with data masking, data governance, and privacy rules
Qualifications
Bachelor’s degree in Computer Science, Information Systems, or related field
3+ years of hands-on experience in ETL/Data Warehouse testing
Excellent analytical and problem-solving skills
Strong attention to detail and communication skills
Skills: regression, azure, data reconciliation, test management tools, data validation, azure databricks, etl testing, data warehousing, dwh, etl pipeline, test case creation, azure data factory, test cases, etl tester, regression testing, sql, databricks
Posted 1 month ago