
2818 Snowflake Jobs - Page 7

Set up a Job Alert
JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

6.0 - 9.0 years

18 - 25 Lacs

Bengaluru

Hybrid

About the Role
We are seeking a BI Architect to advise the BI Lead of a global CPG organization and architect an intelligent, scalable Business Intelligence ecosystem. This includes an enterprise-wide KPI dashboard suite augmented by a GenAI-driven natural language interface for insight discovery. The ideal candidate will be responsible for the end-to-end architecture: from scalable data models and dashboards to a conversational interface powered by Retrieval-Augmented Generation (RAG) and/or Knowledge Graphs. The solution must synthesize internal BI data with external (web-scraped and competitor) data to deliver intelligent, context-rich insights.

Key Responsibilities
• Architect BI Stack: Design and oversee a scalable, performant BI platform that serves as a single source of truth for key business metrics across functions (Sales, Marketing, Supply Chain, Finance, etc.).
• Advise BI Lead: Act as a technical thought partner to the BI Lead, aligning architecture decisions with long-term strategy and business priorities.
• Design GenAI Layer: Architect a GenAI-powered natural language interface on top of BI dashboards that lets business users query KPIs, trends, and anomalies conversationally.
• RAG/Graph Approach: Select and implement appropriate architectures (e.g., RAG using vector stores, Knowledge Graphs) to support intelligent, context-aware insights (a retrieval sketch follows this listing).
• External Data Integration: Build mechanisms to ingest and structure data from public sources (e.g., competitor websites, industry reports, social sentiment) to augment internal insights.
• Security & Governance: Ensure all layers (BI + GenAI) adhere to enterprise data governance, security, and compliance standards.
• Cross-functional Collaboration: Work closely with Data Engineering, Analytics, and Product teams to ensure seamless integration and operationalization.

Qualifications
• 6-9 years of experience in BI architecture and analytics platforms, with at least 2 years working on GenAI, RAG, or LLM-based solutions.
• Strong expertise in BI tools (e.g., Power BI, Tableau, Looker) and data modeling.
• Experience with GenAI frameworks (e.g., LangChain, LlamaIndex, Semantic Kernel) and vector databases (e.g., Pinecone, FAISS, Weaviate).
• Knowledge of graph-based data models and tools (e.g., Neo4j, RDF, SPARQL) is a plus.
• Proficiency in Python or a relevant scripting language for pipeline orchestration and AI integration.
• Familiarity with web scraping and structuring external/third-party datasets.
• Prior experience in the CPG domain or large-scale KPI dashboarding preferred.
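To make the RAG retrieval pattern concrete, here is a minimal Python sketch of the retrieval step only: embed a corpus of KPI metadata, then return the most similar snippets for a user question. The `embed()` function is a toy stand-in for a real embedding model, and the corpus entries are hypothetical; a production build would use a real embedding endpoint and a vector database such as Pinecone, FAISS, or Weaviate.

```python
# Minimal RAG retrieval sketch (illustrative only; not the employer's design).
import hashlib
import numpy as np

def embed(text: str) -> np.ndarray:
    """Toy deterministic "embedding": seeds a RNG from an MD5 digest.
    Replace with a real embedding model; this stand-in has no semantics."""
    seed = int.from_bytes(hashlib.md5(text.encode()).digest()[:4], "big")
    v = np.random.default_rng(seed).standard_normal(64)
    return v / np.linalg.norm(v)

# Hypothetical KPI metadata drawn from dashboards and external feeds.
corpus = [
    "Net revenue by region, monthly, sourced from the Sales data mart.",
    "Competitor shelf-price index, scraped weekly from public retail sites.",
    "Forecast accuracy (MAPE) for Supply Chain demand planning.",
]
index = np.stack([embed(doc) for doc in corpus])

def retrieve(question: str, k: int = 2) -> list[str]:
    """Return the top-k most similar snippets; in a full RAG pipeline these
    would be passed to an LLM as grounding context."""
    q = embed(question)
    scores = index @ q  # cosine similarity (all vectors unit-normalized)
    return [corpus[i] for i in np.argsort(scores)[::-1][:k]]

# With a real embedding model, the competitor-pricing snippet would rank first.
print(retrieve("How did our prices compare to competitors last week?"))
```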

Posted 4 days ago

Apply

8.0 - 13.0 years

15 - 22 Lacs

Chennai

Work from Office

Technical Specifications
• 7+ years of experience in managing Data & Analytics service delivery, preferably within a Managed Services or consulting environment.
• Experience in managing support for modern data platforms across Azure, Databricks, Fabric, or Snowflake environments.
• Strong understanding of data engineering and analytics concepts, including ELT/ETL pipelines, data warehousing, and reporting layers.
• Experience in ticketing, issue triaging, SLAs, and capacity planning for BAU operations.
• Hands-on understanding of SQL and scripting languages (Python preferred) for debugging and troubleshooting.
• Proficiency with cloud platforms like Azure and AWS; familiarity with DevOps practices is a plus.
• Familiarity with orchestration and data pipeline tools such as ADF, Synapse, dbt, Matillion, or Fabric.
• Understanding of monitoring tools, incident management practices, and alerting systems (e.g., Datadog, Azure Monitor, PagerDuty).
• Strong stakeholder communication, documentation, and presentation skills.
• Experience working with global teams and collaborating across time zones.

Responsibilities
• Serve as the primary owner for all managed service engagements across all clients, ensuring SLAs and KPIs are met consistently.
• Continuously improve the operating model, including ticket workflows, escalation paths, and monitoring practices.
• Coordinate triaging and resolution of incidents and service requests raised by client stakeholders.
• Collaborate with client and internal cluster teams to manage operational roadmaps, recurring issues, and enhancement backlogs.
• Lead a 40+ member team of Data Engineers and Consultants across offices, ensuring high-quality delivery and adherence to standards.
• Support the transition from project mode to Managed Services, including knowledge transfer, documentation, and platform walkthroughs.
• Ensure documentation is up to date for architecture, SOPs, and common issues.
• Contribute to service reviews, retrospectives, and continuous improvement planning.
• Report on service metrics, root cause analyses, and team utilization to internal and client stakeholders.
• Participate in resourcing and onboarding planning in collaboration with engagement managers, resourcing managers, and internal cluster leads.
• Act as a coach and mentor to junior team members, promoting skill development and a strong delivery culture.

Required Skillset
• ETL or ELT: Azure Data Factory, Databricks, Synapse, dbt (any two mandatory).
• Data Warehousing: Azure SQL Server/Redshift/BigQuery/Databricks/Snowflake (any one mandatory).
• Data Visualization: Looker, Power BI, Tableau (basic understanding to support stakeholder queries).
• Cloud: Azure (mandatory); AWS or GCP (good to have).
• SQL and Scripting: Ability to read and debug SQL and Python scripts.
• Monitoring: Azure Monitor, Log Analytics, Datadog, or equivalent tools.
• Ticketing & Workflow Tools: Freshdesk, Jira, ServiceNow, or similar.
• DevOps: Containerization technologies (e.g., Docker, Kubernetes), Git, CI/CD pipelines (exposure preferred).

Behavioural Competencies
At JMAN, we expect our team members to embody the following:
• Self-Driven & Proactive: Own delivery and service outcomes, ensure proactive communication, and manage expectations confidently.
• Adaptability & Resilience: Thrive in a high-performance, entrepreneurial environment and navigate dynamic challenges effectively.
• Operational Excellence: Be process-oriented and focused on SLA adherence, documentation, and delivery consistency.
• Agility & Problem Solving: Adapt quickly to changing priorities, debug effectively, and escalate when needed with clarity.
• Commitment & Engagement: Ensure timesheet compliance, attend meetings regularly, follow company policies, and actively participate in org-wide initiatives.
• Teamwork & Collaboration: Share knowledge, support colleagues, and contribute to talent retention and team success.
• Professionalism & Continuous Improvement: Maintain a professional demeanour and commit to ongoing learning and self-improvement.
• Mentoring & Knowledge Sharing: Guide and support junior team members, fostering a culture of continuous learning and professional growth.
• Advocacy & Organizational Citizenship: Represent JMAN positively, uphold company values, respect others, and honour commitments, including punctuality and timely delivery.

Posted 4 days ago

Apply

5.0 - 10.0 years

25 - 40 Lacs

Bengaluru

Work from Office

Job Title: Data Engineer
Job Type: Full-time
Department: Data Engineering / Data Science
Reports To: Data Engineering Manager / Chief Data Officer

About the Role:
We are looking for a talented Data Engineer to join our team. As a Data Engineer, you will be responsible for designing, building, and maintaining robust data pipelines and systems that process and store large volumes of data. You will collaborate closely with data scientists, analysts, and business stakeholders to deliver high-quality, actionable data solutions. This role requires a strong background in data engineering, database technologies, and cloud platforms, along with the ability to work in an Agile environment to drive data initiatives forward.

Responsibilities:
• Design, build, and maintain scalable and efficient data pipelines that move, transform, and store large datasets.
• Develop and optimize ETL processes using tools such as Apache Spark, Apache Kafka, or AWS Glue (a PySpark sketch follows this listing).
• Work with SQL and NoSQL databases to ensure the availability, consistency, and reliability of data.
• Collaborate with data scientists and analysts to ensure data requirements and quality standards are met.
• Design and implement data models, schemas, and architectures for data lakes and data warehouses.
• Automate manual data processes to improve efficiency and data processing speed.
• Ensure data security, privacy, and compliance with industry standards and regulations.
• Continuously evaluate and integrate new tools and technologies to enhance data engineering processes.
• Troubleshoot and resolve data quality and performance issues.
• Participate in code reviews and contribute to a culture of best practices in data engineering.

Requirements:
• 3-10 years of experience as a Data Engineer or in a similar role.
• Strong proficiency in SQL and experience with NoSQL databases (e.g., MongoDB, Cassandra).
• Experience with big data technologies such as Apache Hadoop, Spark, Hive, and Kafka.
• Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud.
• Proficiency in Python, Java, or Scala for data processing and scripting.
• Familiarity with data warehousing concepts, tools, and technologies (e.g., Snowflake, Redshift, BigQuery).
• Experience working with data modeling, data lakes, and data pipelines.
• Solid understanding of data governance, data privacy, and security best practices.
• Strong problem-solving and debugging skills.
• Ability to work in an Agile development environment.
• Excellent communication skills and the ability to work cross-functionally.
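As a rough illustration of the pipeline work described above, here is a minimal PySpark ETL sketch: read raw CSV events, clean and type the columns, and write partitioned Parquet. The storage paths and column names are hypothetical, not details from the posting.

```python
# Illustrative batch ETL sketch in PySpark (paths and columns are placeholders).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

raw = (spark.read
       .option("header", True)
       .csv("s3a://raw-bucket/orders/"))           # hypothetical source path

cleaned = (raw
           .withColumn("order_ts", F.to_timestamp("order_ts"))
           .withColumn("amount", F.col("amount").cast("double"))
           .filter(F.col("amount") > 0)            # drop invalid rows
           .withColumn("order_date", F.to_date("order_ts")))

(cleaned.write
 .mode("overwrite")
 .partitionBy("order_date")                        # partition for downstream scans
 .parquet("s3a://curated-bucket/orders/"))         # hypothetical target path
```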

Posted 4 days ago

Apply

5.0 - 10.0 years

25 - 40 Lacs

Pune

Work from Office

Job Title: Data Engineer
Job Type: Full-time
Department: Data Engineering / Data Science
Reports To: Data Engineering Manager / Chief Data Officer

About the Role:
We are looking for a talented Data Engineer to join our team. As a Data Engineer, you will be responsible for designing, building, and maintaining robust data pipelines and systems that process and store large volumes of data. You will collaborate closely with data scientists, analysts, and business stakeholders to deliver high-quality, actionable data solutions. This role requires a strong background in data engineering, database technologies, and cloud platforms, along with the ability to work in an Agile environment to drive data initiatives forward.

Responsibilities:
• Design, build, and maintain scalable and efficient data pipelines that move, transform, and store large datasets.
• Develop and optimize ETL processes using tools such as Apache Spark, Apache Kafka, or AWS Glue.
• Work with SQL and NoSQL databases to ensure the availability, consistency, and reliability of data.
• Collaborate with data scientists and analysts to ensure data requirements and quality standards are met.
• Design and implement data models, schemas, and architectures for data lakes and data warehouses.
• Automate manual data processes to improve efficiency and data processing speed.
• Ensure data security, privacy, and compliance with industry standards and regulations.
• Continuously evaluate and integrate new tools and technologies to enhance data engineering processes.
• Troubleshoot and resolve data quality and performance issues.
• Participate in code reviews and contribute to a culture of best practices in data engineering.

Requirements:
• 3-10 years of experience as a Data Engineer or in a similar role.
• Strong proficiency in SQL and experience with NoSQL databases (e.g., MongoDB, Cassandra).
• Experience with big data technologies such as Apache Hadoop, Spark, Hive, and Kafka.
• Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud.
• Proficiency in Python, Java, or Scala for data processing and scripting.
• Familiarity with data warehousing concepts, tools, and technologies (e.g., Snowflake, Redshift, BigQuery).
• Experience working with data modeling, data lakes, and data pipelines.
• Solid understanding of data governance, data privacy, and security best practices.
• Strong problem-solving and debugging skills.
• Ability to work in an Agile development environment.
• Excellent communication skills and the ability to work cross-functionally.

Posted 4 days ago

Apply

5.0 - 10.0 years

25 - 40 Lacs

Noida

Work from Office

Job Title: Data Engineer
Job Type: Full-time
Department: Data Engineering / Data Science
Reports To: Data Engineering Manager / Chief Data Officer

About the Role:
We are looking for a talented Data Engineer to join our team. As a Data Engineer, you will be responsible for designing, building, and maintaining robust data pipelines and systems that process and store large volumes of data. You will collaborate closely with data scientists, analysts, and business stakeholders to deliver high-quality, actionable data solutions. This role requires a strong background in data engineering, database technologies, and cloud platforms, along with the ability to work in an Agile environment to drive data initiatives forward.

Responsibilities:
• Design, build, and maintain scalable and efficient data pipelines that move, transform, and store large datasets.
• Develop and optimize ETL processes using tools such as Apache Spark, Apache Kafka, or AWS Glue.
• Work with SQL and NoSQL databases to ensure the availability, consistency, and reliability of data.
• Collaborate with data scientists and analysts to ensure data requirements and quality standards are met.
• Design and implement data models, schemas, and architectures for data lakes and data warehouses.
• Automate manual data processes to improve efficiency and data processing speed.
• Ensure data security, privacy, and compliance with industry standards and regulations.
• Continuously evaluate and integrate new tools and technologies to enhance data engineering processes.
• Troubleshoot and resolve data quality and performance issues.
• Participate in code reviews and contribute to a culture of best practices in data engineering.

Requirements:
• 3-10 years of experience as a Data Engineer or in a similar role.
• Strong proficiency in SQL and experience with NoSQL databases (e.g., MongoDB, Cassandra).
• Experience with big data technologies such as Apache Hadoop, Spark, Hive, and Kafka.
• Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud.
• Proficiency in Python, Java, or Scala for data processing and scripting.
• Familiarity with data warehousing concepts, tools, and technologies (e.g., Snowflake, Redshift, BigQuery).
• Experience working with data modeling, data lakes, and data pipelines.
• Solid understanding of data governance, data privacy, and security best practices.
• Strong problem-solving and debugging skills.
• Ability to work in an Agile development environment.
• Excellent communication skills and the ability to work cross-functionally.

Posted 4 days ago

Apply

5.0 - 8.0 years

5 - 15 Lacs

Pune

Work from Office

Role & responsibilities
• Design and implement scalable ELT pipelines using DBT and Snowflake.
• Develop and optimize complex SQL queries and transformations.
• Work with data loading/integration tools like StreamSets.
• Collaborate with stakeholders to gather business requirements and translate them into technical solutions.
• Version control and CI/CD using Git.
• Schedule and monitor workflows using Apache Airflow (preferred); see the DAG sketch after this listing.
• Leverage Python for custom data manipulation, scripting, and automation.
• Ensure data quality, integrity, and availability across various business use cases.

Preferred candidate profile
• Strong expertise in DBT (Data Build Tool).
• Hands-on experience with Snowflake and ELT processing.
• Proficiency in SQL.

Good to Have Skills:
• Experience with StreamSets or other data ingestion tools.
• Working knowledge of Airflow for orchestration.
• Familiarity with Python for data engineering tasks.
• Strong understanding of Git and version control practices.
• Exposure to Agile methodologies.
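To show how the Airflow orchestration mentioned above might tie dbt and Snowflake together, here is a minimal DAG sketch. The dag_id, schedule, project path, and target name are assumptions rather than details from the posting, and the `schedule` argument assumes Airflow 2.4 or later.

```python
# Minimal Airflow DAG sketch: run dbt models, then dbt tests (illustrative).
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_snowflake_elt",            # hypothetical name
    start_date=datetime(2025, 1, 1),
    schedule="0 2 * * *",                  # nightly at 02:00
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt_project && dbt run --target prod",   # placeholder path/target
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt_project && dbt test --target prod",
    )
    dbt_run >> dbt_test   # tests run only after the models build successfully
```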

Posted 4 days ago

Apply

5.0 - 10.0 years

15 - 30 Lacs

Gurugram

Work from Office

Requirement: Data Architect & Business Intelligence
Experience: 5+ Years
Location: Gurgaon
Preferred: Immediate Joiners

Job Summary:
We are looking for a Data Architect & Business Intelligence Expert who will be responsible for designing and implementing enterprise-level data architecture solutions. The ideal candidate will have extensive experience in data warehousing, data modeling, and BI frameworks, with a strong focus on Salesforce, Informatica, DBT, IICS, and Snowflake.

Key Responsibilities:
• Design and implement scalable and efficient data architecture solutions for enterprise applications.
• Develop and maintain robust data models that support business intelligence and analytics.
• Build data warehouses to support structured and unstructured data storage needs.
• Optimize data pipelines, ETL processes, and real-time data processing.
• Work with business stakeholders to define data strategies that support analytics and reporting.
• Ensure seamless integration of Salesforce, Informatica, DBT, IICS, and Snowflake into the data ecosystem.
• Establish and enforce data governance, security policies, and best practices.
• Conduct performance tuning and optimization for large-scale databases and data processing systems.
• Provide technical leadership and mentorship to development teams.

Key Skills & Requirements:
• Strong experience in data architecture, data warehousing, and data modeling.
• Hands-on expertise with Salesforce, Informatica, DBT, IICS, and Snowflake.
• Deep understanding of ETL pipelines, real-time data streaming, and cloud-based data solutions.
• Experience in designing scalable, high-performance, and secure data environments.
• Ability to work with big data frameworks and BI tools for reporting and visualization.
• Strong analytical, problem-solving, and communication skills.

Posted 4 days ago

Apply

6.0 - 10.0 years

8 - 12 Lacs

Hyderabad

Work from Office

Job Title: Project Manager
Work Location: 5th Floor, Sri Durga Towers, Above UCO Bank, Road No 10, Banjara Hills, Hyderabad 500034, Telangana, India
Work Timings: 11:00 AM - 8:00 PM IST
Job Type: Full Time

Roles and Responsibilities:
• Lead and coordinate cross-functional teams (data engineers, architects, BI, QA, security) for Snowflake project delivery.
• Define project scope, timelines, resources, and budget tailored to Snowflake and cloud infrastructure.
• Oversee Snowflake environment setup, data migration, and integration with source/target systems (e.g., ERP, CRM, BI tools).
• Ensure Snowflake best practices around cost control, data governance, security, and performance optimization.
• Collaborate with architects on Snowflake deployment strategies (multi-cloud, data sharing, scalability).
• Manage risks, issues, and changes using industry-standard practices (RAID logs, impact assessments).
• Monitor data pipelines, SLA compliance, and usage costs; coordinate tuning and optimization as needed.
• Facilitate agile/hybrid delivery; maintain backlogs, sprints, and reports via Jira or similar tools.
• Drive stakeholder communication, training, and documentation to ensure successful adoption and operational readiness.

Required Skills:
• Strong knowledge of Snowflake architecture, components, and deployment models (multi-cloud, data sharing, etc.).
• Proven experience in managing data migration and integration projects, especially involving cloud data platforms.
• Proficiency in Agile/Scrum methodologies and project tracking tools (e.g., Jira, Azure DevOps).
• Ability to manage project risks, scope, timelines, and budgets effectively.
• Familiarity with ETL/ELT tools (e.g., dbt, Informatica, Talend, Fivetran) and BI/reporting tools (e.g., Power BI, Tableau).
• Understanding of RBAC, data masking, encryption, and other Snowflake security features.
• Strong communication and stakeholder management skills.
• Experience in monitoring and controlling Snowflake compute costs and usage reports (see the sketch after this listing).

Eligibility Criteria:
• Minimum: Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field.
• Preferred: Master's degree (MBA or MS in Information Systems/Data Analytics).
• 6-10 years of IT Project Management experience.
• 2+ years of direct experience managing Snowflake projects or large-scale cloud data initiatives.

Certifications:
• PMP® (Project Management Professional)
• Agile/Scrum Certification (e.g., CSM, PMI-ACP, or SAFe)
• Snowflake SnowPro Core Certification
• Cloud certifications (AWS/Azure/GCP)
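As a concrete example of the Snowflake cost monitoring this role oversees, the sketch below queries the documented ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY view for the last week's credit consumption per warehouse, using the snowflake-connector-python package. Connection parameters are placeholders.

```python
# Hedged sketch of Snowflake compute-cost monitoring (illustrative only).
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account",     # placeholder
    user="your_user",           # placeholder
    password="...",             # placeholder; prefer key-pair auth in practice
    warehouse="ADMIN_WH",       # hypothetical admin warehouse
)

query = """
    SELECT warehouse_name,
           DATE_TRUNC('day', start_time) AS usage_day,
           SUM(credits_used)             AS credits
    FROM snowflake.account_usage.warehouse_metering_history
    WHERE start_time >= DATEADD('day', -7, CURRENT_TIMESTAMP())
    GROUP BY 1, 2
    ORDER BY credits DESC
"""
for wh, day, credits in conn.cursor().execute(query):
    print(f"{day} {wh}: {credits} credits")
conn.close()
```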

Posted 4 days ago

Apply

8.0 - 13.0 years

15 - 25 Lacs

Hyderabad, Pune

Hybrid

Role & responsibilities

Job Description - Snowflake Senior Developer
Experience: 8+ years
Location: India, Hybrid
Employment Type: Full-time

Job Summary
We are seeking a skilled Snowflake Developer with 8+ years of experience in designing, developing, and optimizing Snowflake data solutions. The ideal candidate will have strong expertise in Snowflake SQL, ETL/ELT pipelines, and cloud data integration. This role involves building scalable data warehouses, implementing efficient data models, and ensuring high-performance data processing in Snowflake.

Key Responsibilities
1. Snowflake Development & Optimization
• Design and develop Snowflake databases, schemas, tables, and views following best practices.
• Write complex SQL queries, stored procedures, and UDFs for data transformation.
• Optimize query performance using clustering, partitioning, and materialized views.
• Implement Snowflake features (Time Travel, Zero-Copy Cloning, Streams & Tasks); see the sketch after this listing.
2. Data Pipeline Development
• Build and maintain ETL/ELT pipelines using Snowflake, Snowpark, Python, or Spark.
• Integrate Snowflake with cloud storage (S3, Blob) and data ingestion tools (Snowpipe).
• Develop CDC (Change Data Capture) and real-time data processing solutions.
3. Data Modeling & Warehousing
• Design star schema, snowflake schema, and data vault models in Snowflake.
• Implement data sharing, secure views, and dynamic data masking.
• Ensure data quality, consistency, and governance across Snowflake environments.
4. Performance Tuning & Troubleshooting
• Monitor and optimize Snowflake warehouse performance (scaling, caching, resource usage).
• Troubleshoot data pipeline failures, latency issues, and query bottlenecks.
• Work with DevOps teams to automate deployments and CI/CD pipelines.
5. Collaboration & Documentation
• Work closely with data analysts, BI teams, and business stakeholders to deliver data solutions.
• Document data flows, architecture, and technical specifications.
• Mentor junior developers on Snowflake best practices.

Required Skills & Qualifications
• 8+ years in database development, data warehousing, or ETL.
• 4+ years of hands-on Snowflake development experience.
• Strong SQL or Python skills for data processing.
• Experience with Snowflake utilities (SnowSQL, Snowsight, Snowpark).
• Knowledge of cloud platforms (AWS/Azure) and data integration tools (Coalesce, Airflow, DBT).
• Certifications: SnowPro Core Certification (preferred).

Preferred Skills
• Familiarity with data governance and metadata management.
• Familiarity with DBT, Airflow, SSIS & IICS.
• Knowledge of CI/CD pipelines (Azure DevOps).

If interested, kindly share your updated CV at Himanshu.mehra@thehrsolutions.in
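Two of the Snowflake features named above, Time Travel and Zero-Copy Cloning, can be exercised in a few lines of Snowpark. The sketch below is illustrative only; the table names and connection settings are placeholders, not details from the posting.

```python
# Illustrative Snowpark sketch of Time Travel and Zero-Copy Cloning.
from snowflake.snowpark import Session

session = Session.builder.configs({
    "account": "your_account",   # placeholder credentials
    "user": "your_user",
    "password": "...",
    "warehouse": "DEV_WH",
    "database": "ANALYTICS",
    "schema": "PUBLIC",
}).create()

# Time Travel: query the table as it looked one hour ago.
session.sql("SELECT COUNT(*) FROM orders AT (OFFSET => -3600)").show()

# Zero-Copy Cloning: instant, storage-free copy for a dev/test sandbox.
session.sql("CREATE TABLE orders_dev CLONE orders").collect()

session.close()
```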

Posted 4 days ago

Apply

6.0 - 11.0 years

30 - 35 Lacs

Pune

Remote

Role & responsibilities
We are seeking a Production Support Lead with expertise in modern data platforms to oversee the reliability, performance, and user access control of our analytics and reporting environment. This individual will lead operational support across tools like Snowflake, dbt, Fivetran, Tableau, Azure Entra, AWS, and Terraform, ensuring compliance and high data availability. The ideal candidate will not only resolve technical issues but also guide the team in scaling and automating platform operations.

Posted 4 days ago

Apply

4.0 - 9.0 years

8 - 18 Lacs

Chennai, Coimbatore, Vellore

Work from Office

We at Blackstraw.ai are organizing a Walk-in Interview Drive for Data Engineers with a minimum of 3 years' experience.

Data Engineer: minimum 3 years' experience in Python, Spark, PySpark, Hadoop, Hive, Snowflake, AWS, Databricks.

We are looking for a Data Engineer to join our team. You will use various methods to transform raw data into useful data systems. You'll strive for efficiency by aligning data systems with business goals. To succeed in this position, you should have strong analytical skills and the ability to combine data from different sources. Data engineer skills also include familiarity with several programming languages and an understanding of machine learning methods. If you are detail-oriented, with excellent organizational skills and experience in this field, we'd like to hear from you.

Job Requirements
• Participate in the customer's system design meetings and collect the functional/technical requirements.
• Responsible for meeting customer expectations on real-time data integrity and implementing efficient solutions.
• A clear understanding of Python, Spark, PySpark, Hive, Kafka, and RDBMS architecture (a Kafka consumer sketch follows this listing).
• Experience in writing Spark/Python programs and SQL queries.
• Suggest and implement best practices in data integration.
• Guide the QA team in defining system integration tests as needed.
• Split the planned deliverables into tasks and assign them to the team.

Good to have: Knowledge of CI/CD concepts, Apache Kafka.

Key traits:
• Excellent communication skills.
• Self-motivated and willing to work as part of a team.
• Able to collaborate and coordinate in a remote environment.
• A proactive problem solver who tackles challenges as they come.

Important Instructions:
• Carry a hard copy of your resume, one passport photograph, and a government identity proof for ease of access to our premises.
• Please do not carry any electronic devices apart from your mobile phone at office premises.
• Send your resume to chennai.walkin@blackstraw.ai
• Kindly fill in the form below to submit your registration: https://forms.gle/LtNYvGM8pbxMifXw6

Preference will be given to immediate joiners or those who can join within 10-15 days.
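For a flavour of the Kafka work listed in the requirements, here is a minimal Python consumer sketch using the kafka-python package. The topic, broker address, and message fields are hypothetical.

```python
# Illustrative Kafka consumer sketch (topic, broker, and schema are placeholders).
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "orders-events",                       # hypothetical topic
    bootstrap_servers="localhost:9092",    # placeholder broker
    group_id="etl-loader",
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # In a real pipeline, validation/transformation would happen here
    # before loading into Hive, Snowflake, or a data lake.
    print(event.get("order_id"), event.get("amount"))
```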

Posted 4 days ago

Apply

3.0 - 5.0 years

3 - 7 Lacs

Gurugram

Work from Office

About the Opportunity
Job Type: Application | 26 July 2025

About The Role
Title: Senior Test Analyst
Department: ISS DELIVERY - DEVELOPMENT - GURGAON
Location: GGN
Level: 3

We're proud to have been helping our clients build better financial futures for over 50 years. How have we achieved this? By working together - and supporting each other - all over the world. So, join our ISS Delivery team and feel like you're part of something bigger.

About your team
The Investment Solutions Services (ISS) delivery team provides systems development, implementation, and support services for FIL's global Investment Management businesses across the asset management lifecycle. We support Fund Managers, Research Analysts, Traders, and Investment Services Operations in all of FIL's international locations, including London, Hong Kong, and Tokyo.

About your role
You will be joining this position as Senior Test Analyst in the QA chapter, and will therefore be responsible for executing testing activities for all applications under IM Technology based out of India. Here are the expectations, and roughly how your day in the job will look:
• Understand business needs and analyse requirements and user stories to carry out different testing activities.
• Collaborate with developers and BAs to understand new features, bug fixes, and changes in the codebase.
• Create and execute functional as well as automated test cases on different test environments to validate the functionality.
• Log defects in the defect tracker and work with PMs and devs to prioritise and resolve them.
• Develop and maintain automation scripts, preferably using the Python stack.
• Apply a deep understanding of databases, both relational and non-relational.
• Document test cases, results, and any other issues encountered during testing.
• Attend team meetings and stand-ups to discuss progress, risks, and any issues that affect project deliveries.
• Stay updated with new tools, techniques, and industry trends.

About You
• Seasoned software test analyst with 5+ years of hands-on experience.
• Hands-on experience in automating web and backend tests using open-source tools (Playwright, pytest, Selenium, Requests, REST Assured, NumPy, pandas); a small pytest example follows this listing.
• Proficiency in writing and understanding complex DB queries in various databases (Oracle, Snowflake).
• Good understanding of cloud platforms (AWS, Azure).
• Finance/investment domain experience is preferable.
• Strong logical reasoning and problem-solving skills.
• Preferred programming languages: Python and Java.
• Familiarity with CI/CD tools (e.g., Jenkins, GitLab CI) for automating deployment and testing workflows.

Feel rewarded
For starters, we'll offer you a comprehensive benefits package. We'll value your wellbeing and support your development. And we'll be as flexible as we can about where and when you work, finding a balance that works for all of us. It's all part of our commitment to making you feel motivated by the work you do and happy to be part of our team. For more about our work, our approach to dynamic working, and how you could build your future here, visit careers.fidelityinternational.com.
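As a small illustration of the pytest-plus-Requests style of backend automation named in the posting, consider the sketch below. The endpoint URL and response fields are assumptions for illustration only.

```python
# Hedged sketch of automated backend API tests with pytest and Requests.
import requests

BASE_URL = "https://api.example.com"   # placeholder environment URL

def test_fund_lookup_returns_ok():
    resp = requests.get(f"{BASE_URL}/funds/FID123", timeout=10)
    assert resp.status_code == 200

def test_fund_payload_has_required_fields():
    resp = requests.get(f"{BASE_URL}/funds/FID123", timeout=10)
    body = resp.json()
    # Field names below are assumptions for illustration.
    for field in ("fund_id", "nav", "currency"):
        assert field in body
```

Run with `pytest -v`; in CI (e.g., Jenkins or GitLab CI) the same suite would execute against each test environment.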

Posted 4 days ago

Apply

5.0 - 10.0 years

18 - 22 Lacs

Pune

Work from Office

About The Role:
DB Global Technology is Deutsche Bank's technology center in Central and Eastern Europe. Opened in January 2014, the Bucharest office is constantly expanding and now hosts over 1,600 employees. It is designed as an agile working environment, custom-made to encourage innovation, collaboration, and productivity in a modern, high-tech atmosphere.

Enterprise Architecture is one of the key pillars of Deutsche Bank's IT Strategy and plays a key part in all aspects of defining, managing, governing, and realizing our technology strategy for our clients. The Group Architecture Data team works collaboratively with federated business and technology departments to ensure that Deutsche Bank has a clear target-state data architecture whose delivery is managed and supported by appropriate data architecture principles, standards, frameworks, tools, and governance processes. Our ultimate goal is to accelerate the delivery of a bank-wide simplified target architecture, improve technology agility, increase speed-to-market, and reduce cost across our technology landscape.

The role will work closely with the data architects within Group Architecture and the federated data teams across all locations and functions within Deutsche Bank. The Data Architecture Project/Process Manager role is instrumental in driving the execution and continuous improvement of data architecture processes. This role ensures the effective implementation of data architecture standards, governance, and project delivery methodologies to enhance data quality, integration, and accessibility across the organization.

Responsibilities:
• Lead the design and review of defined data architecture processes; identify opportunities for efficiencies and automation.
• Track architecture milestones, ensure they are timely and correctly mapped to Clarity, and provide end-to-end oversight of the process.
• Oversee the execution and implementation of data architecture processes, ensuring adherence to standards and best practices.
• Collaborate with domain architects and data stakeholders to ensure consistent application of data architecture frameworks.
• Conduct data quality assessments and implement improvements to ensure data integrity and completeness.
• Provide guidance and support to project and program managers on data architecture methodologies and delivery processes.
• Develop and deliver training materials to upskill teams on data architecture governance and implementation practices.
• Produce insightful metrics and reports to inform decision-making and support continuous process improvement.
• Prepare communication materials for senior management and regulatory bodies regarding data architecture initiatives.

Skills:
• Proven experience in managing technology projects and driving process improvements in large, complex organizations.
• Understanding of data architecture principles, standards, and implementation approaches.
• Project management skills with a track record of delivering technology initiatives on time and within scope.
• Proficient in data analysis, visualization, and reporting to support architecture governance and decision-making.
• Effective communicator with the ability to convey complex data architecture concepts to diverse stakeholders.
• Experience with tools such as JIRA, Confluence, MS Office, and data visualization tools (Looker or Tableau).
• Strong stakeholder management skills, with the ability to influence and collaborate across global teams.

Well-being & Benefits
Emotionally and mentally balanced: we support you in dealing with life crises, maintaining stability through illness, and maintaining good mental health.
• Empowering managers who value your ideas and decisions. Show your positive attitude, determination, and open-mindedness.
• A professional, passionate, and fun workplace with flexible Work from Home options.
• A modern office with fun and relaxing areas to boost creativity.
• Continuous learning culture with coaching and support from team experts.
Physically thriving: we support you in managing your physical health by taking appropriate preventive measures and providing a workplace that helps you thrive.
• Private healthcare and life insurance with premium benefits for you and discounts for your loved ones.
Socially connected: we strongly believe in collaboration, inclusion, and feeling connected to open up new perspectives and strengthen our self-confidence and wellbeing.
• Kids@TheOffice - support for unexpected events requiring you to care for your kids during work hours.
• Enjoy retailer discounts, cultural and CSR activities, employee sport clubs, workshops, and more.
Financially secure: we support you in meeting personal financial goals during your active career and for the future.
• Competitive income, performance-based promotions, and a sense of purpose.
• 24 days' holiday, loyalty days, and bank holidays (including weekdays for weekend bank holidays).

We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative, and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair, and inclusive work environment.

Posted 4 days ago

Apply

4.0 - 7.0 years

6 - 9 Lacs

Hyderabad

Work from Office

Fusion Plus Solutions Inc is looking for a Snowflake / Oracle PL/SQL developer to join our dynamic team and embark on a rewarding career journey.
• Collaborate with cross-functional teams to achieve strategic outcomes.
• Apply subject expertise to support operations, planning, and decision-making.
• Utilize tools, analytics, or platforms relevant to the job domain.
• Ensure compliance with policies while improving efficiency and outcomes.

Posted 4 days ago

Apply

5.0 - 7.0 years

18 - 20 Lacs

Pune

Work from Office

Critical Skills to Possess:
• 5+ years of experience in data engineering or ETL development.
• 5+ years of hands-on experience with Informatica.
• Experience in production support, handling tickets, and monitoring ETL systems.
• Strong SQL skills with experience in querying large datasets.
• Familiarity with data warehousing concepts and design (e.g., star schema, snowflake schema).
• Experience with relational databases such as Oracle, SQL Server, or PostgreSQL.
• Knowledge of cloud platforms such as AWS, Azure, or GCP is a plus.

Preferred Qualifications:
• BS degree in Computer Science or Engineering, or equivalent experience.

Roles and Responsibilities:
• Design, develop, and maintain robust ETL pipelines using Informatica.
• Work with data architects and business stakeholders to understand data requirements and translate them into technical solutions.
• Integrate data from various sources including relational databases, flat files, APIs, and cloud-based systems.
• Optimize and troubleshoot existing Informatica workflows for performance and reliability.
• Monitor ETL workflows and proactively address failures, performance issues, and data anomalies.
• Respond to and resolve support tickets related to data loads, ETL job failures, and data discrepancies.
• Provide support for production data pipelines and jobs.
• Ensure data quality and consistency across different systems and pipelines.
• Implement data validation, error handling, and auditing mechanisms within ETL processes (see the sketch after this listing).
• Collaborate with data analysts, data scientists, and other engineers to ensure a consistent and accurate data platform.
• Maintain documentation of ETL processes, data flows, and technical designs.
• Monitor daily data loads and resolve any ETL failures or data quality issues.
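One simple form of the data validation mentioned in the responsibilities is a source-to-target row-count reconciliation after each load. The sketch below shows the idea with SQLAlchemy; the connection strings, table names, and load_date column are placeholders, not details from the posting.

```python
# Illustrative post-load validation: compare source and target row counts.
import sqlalchemy as sa

src = sa.create_engine("oracle+oracledb://user:pwd@src-host/ORCL")     # placeholder
tgt = sa.create_engine("postgresql+psycopg2://user:pwd@dwh-host/dwh")  # placeholder

def row_count(engine, table: str, load_date: str) -> int:
    """Count rows for one load date; table names are trusted constants here."""
    with engine.connect() as conn:
        return conn.execute(
            sa.text(f"SELECT COUNT(*) FROM {table} WHERE load_date = :d"),
            {"d": load_date},
        ).scalar_one()

src_n = row_count(src, "stg_orders", "2025-06-01")   # hypothetical tables
tgt_n = row_count(tgt, "dw_orders", "2025-06-01")
if src_n != tgt_n:
    # In production this would raise an alert or open a ticket instead.
    print(f"MISMATCH: source={src_n}, target={tgt_n}")
```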

Posted 4 days ago

Apply

7.0 - 11.0 years

4 - 7 Lacs

Bengaluru

Work from Office

About The Role
Skill required: Delivery - Fraud Risk Management
Designation: I&F Decision Sci Practitioner Specialist
Qualifications: Any Graduation
Years of Experience: 7 to 11 years

About Accenture
Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. Visit us at www.accenture.com.

What would you do?
Business and regulatory requirements, governance, operating model, process, and system controls to identify, detect, measure, prevent, and report internal and external fraud for an organization. The fraud strategy program managers will work for the Fraud Risk & Solutions manager to design and implement innovative and robust fraud controls across financial products and customer journeys, from application to account closure, which strengthen fraud prevention measures and enhance digital experience. This role requires fraud domain expertise, data analytics skills, and the ability to track and manage engineering projects on a day-to-day basis.

What are we looking for?
• Conduct data analysis, running queries to process large-scale datasets involving millions of identities and transactions to develop data-driven fraud solutions.
• Perform root-cause analysis and fraud investigations, leveraging fraud review tools and vendor systems such as SentiLink, iOvation, ThreatMetrix, LexisNexis, etc.
• Work with SQL, Tableau, and Looker for data analysis and reporting.
• Ensure compliance with KYC, CIP, AML, Reg E, and other fraud-related regulations and guidelines.
• Product/program/project management: 5+ years.
• Fraud prevention, fraud management, fraud RCA: 2+ years.
• Data analytics: 1+ year.

Roles and Responsibilities:
• Develop, implement, and enhance fraud prevention controls to identify and mitigate fraud risks, including but not limited to identity fraud, account takeover fraud, and transaction fraud.
• Manage engineering development and fraud risk implementation projects, preparing PSR, PRD, and solution documentation. Work with the engineering team in daily standups and grooming sessions to clarify requirement details, support accuracy, and ensure smooth engineering implementation.
• Collaborate closely with fraud analytics and fraud operations teams to design effective fraud prevention methods that balance fraud controls and customer experience.
• In this role you are required to analyse and solve moderately complex problems.
• Typically creates new solutions, leveraging and, where needed, adapting existing methods and procedures.
• Requires understanding of the strategic direction set by senior management as it relates to team goals.
• Primary upward interaction is with the direct supervisor or team leads.
• Generally interacts with peers and/or management levels at a client and/or within Accenture.
• Should require minimal guidance when determining methods and procedures on new assignments.
• Decisions often impact the team in which they reside and occasionally impact other teams.
• Would manage medium-to-small sized teams and/or work efforts (if in an individual contributor role) at a client or within Accenture.

Qualification
Any Graduation

Posted 4 days ago

Apply

6.0 - 11.0 years

5 - 9 Lacs

Pune

Work from Office

This position involves performing feasibility and impact assessments, reviewing documentation to ensure conformity to methods, designs, and standards, and achieving economies of scale during the support phase. As a Senior Business Analyst you will also be responsible for stakeholder communication, conducting primary and secondary research based on solution and project needs, supporting new solution design and other organizational initiatives, and collaborating with all stakeholders in a multi-disciplinary team environment to build consensus on various data and analytics projects.

Location: Nagpur/Pune/Chennai/Bangalore

Key Result Areas and Activities:
• Stakeholder Collaboration: Collaborate with Retail business product owners and end users to understand data and analytics requirements.
• Requirement Analysis: Analyse and triage business requirements to determine scope and data sources.
• Documentation: Create business and functional requirement documents (FSD, BRD, etc.) detailing the metrics and KPIs needed by the business.
• Development Collaboration: Collaborate with development and engineering teams on the delivery of data products.
• Testing Coordination: Coordinate and manage testing cycles, including user acceptance testing.

Essential Skills:
• Business Data Analyst with a D&A background.
• Understanding of the luxury retail domain.
• Experience in the full lifecycle of Data and Analytics projects, from discovery to deployment.
• Experience working with multiple stakeholders for requirements gathering and business prioritization.
• Ability to articulate and measure the business value of requirements.
• Knowledge of AI/GenAI concepts.
• Ability to convert business requirements into implementation-ready epics and user stories.
• Ability to contribute to estimations and project plans across data engineering and analytics teams.
• Understanding of data visualization tools like Power BI.
• Ability to write SQL for data analysis.

Tools: Power BI, Snowflake, SQL, JIRA, Microsoft Office suite.

Desirable Skills:
• Understanding of AI solutions.
• Create and manage epics and user stories, including acceptance criteria, in JIRA.
• Hands-on knowledge of data querying, data analysis, data mining, reporting, and analytics is a plus.

Qualifications:
• Bachelor's degree in Computer Science, Engineering, or a related field (Master's degree is a plus).
• 6 years of experience as a Business Analyst.
• Demonstrated continued learning through one or more technical certifications or related methods.

Qualities:
• Self-motivated and focused on delivering outcomes for a fast-growing team and firm.
• Able to communicate persuasively through speaking, writing, and client presentations.
• Able to consult, write, and present persuasively.
• Able to work in a self-organized and cross-functional team.
• Able to iterate based on new information, peer reviews, and feedback.
• Able to work with teams and clients in different time zones.
• Research-focused mindset.

Posted 4 days ago

Apply

1.0 - 4.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Project Role: Software Development Engineer
Project Role Description: Analyze, design, code, and test multiple components of application code across one or more clients. Perform maintenance, enhancements, and/or development work.
Must have skills: Snowflake Data Warehouse
Good to have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary:
As a Software Development Engineer, you will engage in a dynamic work environment where you will analyze, design, code, and test various components of application code for multiple clients. Your day will involve collaborating with team members to perform maintenance and enhancements, ensuring that the applications meet the evolving needs of users while adhering to best practices in software development.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Continuously evaluate and improve development processes to increase efficiency.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Snowflake Data Warehouse.
- Good To Have Skills: Experience with data integration tools and ETL processes.
- Strong understanding of SQL and database management.
- Familiarity with cloud computing concepts and services.
- Experience in performance tuning and optimization of data queries.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Qualification
15 years full time education

Posted 4 days ago

Apply

2.0 - 4.0 years

2 - 5 Lacs

Pune

Work from Office

Project Role: Quality Engineer (Tester)
Project Role Description: Enables full stack solutions through multi-disciplinary team planning and ecosystem integration to accelerate delivery and drive quality across the application lifecycle. Performs continuous testing for security, API, and regression suites. Creates automation strategy and automated scripts, and supports data and environment configuration. Participates in code reviews, monitors, and reports defects to support continuous improvement activities for the end-to-end testing process.
Must have skills: Snowflake Data Warehouse
Good to have skills: NA
Minimum 2 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary:
As a Quality Engineer, you will enable full stack solutions through multi-disciplinary team planning and ecosystem integration to accelerate delivery and drive quality across the application lifecycle. Your typical day will involve performing continuous testing for security, API, and regression suites, creating automation strategies, and supporting data and environment configurations. You will also participate in code reviews and monitor defects to support continuous improvement activities for the end-to-end testing process, ensuring that the highest quality standards are met throughout the project lifecycle.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Develop and implement automated testing scripts to enhance testing efficiency.
- Collaborate with cross-functional teams to ensure seamless integration of testing processes.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Snowflake Data Warehouse.
- Good To Have Skills: Experience with data integration tools and ETL processes.
- Strong understanding of database management and SQL querying.
- Familiarity with continuous integration and continuous deployment (CI/CD) practices.
- Experience in performance testing and monitoring tools.

Additional Information:
- The candidate should have a minimum of 2 years of experience in Snowflake Data Warehouse.
- This position is based at our Pune office.
- A 15 years full time education is required.

Qualification
15 years full time education

Posted 4 days ago

Apply

3.0 - 8.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Data Modeling Techniques and Methodologies
Good to have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary:
As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the overall data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives.

Experience:
- Overall IT experience: 7+ years
- Data Modeling experience: 3+ years
- Data Vault Modeling experience: 2+ years

Key Responsibilities:
- Drive discussions with client deal teams to understand business requirements and how the data model fits into implementation and solutioning.
- Develop the solution blueprint, scoping, and estimation in delivery projects and solutioning.
- Drive discovery activities and design workshops with clients; lead strategic road-mapping and operating model design discussions.
- Design and develop Data Vault 2.0-compliant models, including Hubs, Links, and Satellites (see the sketch after this listing).
- Design and develop Raw Data Vault and Business Data Vault.
- Translate business requirements into conceptual, logical, and physical data models.
- Work with source system analysts to understand data structures and lineage.
- Ensure conformance to data modeling standards and best practices.
- Collaborate with ETL/ELT developers to implement data models in a modern data warehouse environment (e.g., Snowflake, Databricks, Redshift, BigQuery).
- Document models, data definitions, and metadata.

Technical Experience:
- 7+ years overall IT experience, 3+ years in Data Modeling, and 2+ years in Data Vault Modeling.
- Design and development of Raw Data Vault and Business Data Vault.
- Strong understanding of Data Vault 2.0 methodology, including business keys, record tracking, and historical tracking.
- Data modeling experience in Dimensional Modeling/3NF modeling.
- Hands-on experience with data modeling tools (e.g., ER/Studio, ERwin, or similar).
- Solid understanding of ETL/ELT processes, data integration, and warehousing concepts.
- Experience with modern cloud data platforms (e.g., Snowflake, Databricks, Azure Synapse, AWS Redshift, or Google BigQuery).
- Excellent SQL skills.

Good to Have Skills:
- Any one of these add-on skills: Graph Database Modeling, RDF, Document DB Modeling, Ontology, Semantic Data Modeling.
- Hands-on experience with a Data Vault automation tool (e.g., VaultSpeed, WhereScape, biGENIUS-X, dbt, or similar).
- Understanding of the Data Analytics on Cloud landscape and Data Lake design knowledge.
- Cloud Data Engineering, Cloud Data Integration.

Professional Experience:
- Strong requirement analysis and technical solutioning skills in Data and Analytics.
- Excellent writing, communication, and presentation skills.
- Eagerness to learn new skills and develop on an ongoing basis.
- Good client-facing and interpersonal skills.

Educational Qualification:
- B.E. or B.Tech is a must.

Qualification
15 years full time education
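To ground the Data Vault 2.0 terminology above, here is a hedged sketch of a Hub and its Satellite as Snowflake-flavoured DDL, held in Python strings so it can be executed through any connector. All table and column names are hypothetical; a real model would derive them from the business keys of the source domain.

```python
# Illustrative Data Vault 2.0 structures: a Hub (unique business keys) and a
# Satellite (descriptive, historized attributes). Names are placeholders.
HUB_CUSTOMER = """
CREATE TABLE IF NOT EXISTS hub_customer (
    hub_customer_hk  BINARY(32)    NOT NULL,  -- hash key of the business key
    customer_bk      VARCHAR(100)  NOT NULL,  -- business key from the source
    load_dts         TIMESTAMP_NTZ NOT NULL,  -- when the key was first seen
    record_source    VARCHAR(50)   NOT NULL,  -- lineage: originating system
    PRIMARY KEY (hub_customer_hk)
)
"""

SAT_CUSTOMER_DETAILS = """
CREATE TABLE IF NOT EXISTS sat_customer_details (
    hub_customer_hk  BINARY(32)    NOT NULL,  -- FK to hub_customer
    load_dts         TIMESTAMP_NTZ NOT NULL,  -- start of validity for this row
    hash_diff        BINARY(32)    NOT NULL,  -- change detection over payload
    customer_name    VARCHAR(200),
    customer_segment VARCHAR(50),
    record_source    VARCHAR(50)   NOT NULL,
    PRIMARY KEY (hub_customer_hk, load_dts)
)
"""

# Execute each statement via your connector of choice, e.g.:
#   cursor.execute(HUB_CUSTOMER); cursor.execute(SAT_CUSTOMER_DETAILS)
```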

Posted 4 days ago

Apply

7.0 - 12.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must have skills: Snowflake Data Warehouse
Good to have skills: NA
Minimum 7.5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary:
As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, manage project timelines, and contribute to the overall success of application development initiatives.

Roles & Responsibilities:
1. Serve as a client-facing technical lead, working closely with stakeholders to gather requirements and translate them into actionable ETL solutions.
2. Design and develop new stored procedures in MS SQL Server, with a strong focus on performance and maintainability.
3. Build/enhance SSIS packages, implementing best practices for modularity, reusability, and error handling.
4. Architect and design ETL workflows, including staging, cleansing, data masking, transformation, and loading strategies.
5. Implement comprehensive error handling and logging mechanisms to support reliable, auditable data pipelines (see the sketch after this listing).
6. Design and maintain ETL-related tables, including staging, audit/logging, and dimensional/historical tables.
7. Work with Snowflake to build scalable cloud-based data integration and warehousing solutions.
8. Reverse-engineer and optimize existing ETL processes and stored procedures for better performance and maintainability.
9. Troubleshoot job failures and data discrepancies in Production.

Professional & Technical Skills:
1. 7+ years of experience in Data Warehousing (MS SQL, Snowflake) and MS SQL Server (T-SQL, stored procedures, indexing, performance tuning).
2. Proven expertise in SSIS package development, including parameterization, data flow, and control flow design.
3. Strong experience in ETL architecture, including logging, exception handling, and data validation.
4. Proficient in data modeling for ETL, including staging, target, and history tables.
5. Hands-on experience with Snowflake, including data loading, transformation scripting, and optimization.
6. Ability to manage historical data using SCDs, auditing fields, and temporal modeling.
7. Set up Git repositories, define version control standards, and manage code branching/releases; DevOps and CI/CD practices for data pipelines.
8. Ability to work independently while managing multiple issues and deadlines.
9. Excellent communication skills, both verbal and written, with demonstrated client interaction.

Would be a plus:
10. DW migration from MS SQL to Snowflake.
11. Experience with modern data integration tools such as Matillion.
12. Knowledge of BI tools like Tableau.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Qualification
15 years full time education
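The audit-logging pattern described in the responsibilities can be sketched as follows: wrap each load step, record its outcome and row counts in an audit table, and surface failures rather than swallowing them. This uses the real pyodbc package, but the connection string, audit table, and stored procedure names are placeholders, not details from the posting.

```python
# Hedged sketch of an ETL audit-logging wrapper against MS SQL Server.
import pyodbc
from datetime import datetime, timezone

CONN_STR = ("DRIVER={ODBC Driver 17 for SQL Server};"
            "SERVER=dwh;DATABASE=etl;Trusted_Connection=yes")  # placeholder

def run_step(step_name: str, proc: str) -> None:
    """Execute one load procedure and record the outcome in an audit table."""
    conn = pyodbc.connect(CONN_STR)
    cur = conn.cursor()
    started = datetime.now(timezone.utc)
    try:
        cur.execute(f"EXEC {proc}")          # hypothetical stored procedure
        rows = cur.rowcount
        cur.execute(
            "INSERT INTO etl_audit_log (step_name, started_at, status, rows_affected) "
            "VALUES (?, ?, 'SUCCESS', ?)",
            step_name, started, rows,
        )
    except pyodbc.Error as exc:
        cur.execute(
            "INSERT INTO etl_audit_log (step_name, started_at, status, error_msg) "
            "VALUES (?, ?, 'FAILED', ?)",
            step_name, started, str(exc)[:4000],
        )
        raise                                 # fail loudly after logging
    finally:
        conn.commit()
        conn.close()

run_step("load_customers", "dbo.usp_load_customers")   # hypothetical step
```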

Posted 4 days ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary:
As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities and contribute to the overall success of projects by leveraging your expertise in application development.

Roles & Responsibilities:
- Expected to be an SME; collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- At least 3+ years of overall experience.
- Experience in building data solutions using Azure Databricks and PySpark (see the sketch after this listing).
- Experience in building complex SQL queries; understanding of data warehousing concepts.
- Experience with Azure DevOps CI/CD pipelines for automated deployment and release management.
- Excellent problem-solving and analytical skills.
- Ability to work independently as well as collaboratively in a team environment.
- Strong communication and interpersonal skills.

Good to have:
- Experience with Snowflake data warehousing.
- Experience building pipelines and Data Flows using Azure Data Factory.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Hyderabad office.
- A 15 years full time education is required.

Qualification
15 years full time education
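As a minimal illustration of the Databricks/PySpark work described above, the sketch below reads raw JSON from ADLS, aggregates it, and writes a Delta table. The storage path, column names, and target table are hypothetical; on Databricks the `spark` session is provided automatically.

```python
# Illustrative Azure Databricks / PySpark sketch (paths and names are placeholders).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()   # pre-created in Databricks notebooks

raw = spark.read.json("abfss://raw@yourlake.dfs.core.windows.net/events/")

daily = (raw
         .withColumn("event_date", F.to_date("event_ts"))
         .groupBy("event_date", "event_type")
         .agg(F.count("*").alias("events")))

(daily.write
 .format("delta")
 .mode("overwrite")
 .saveAsTable("analytics.daily_event_counts"))   # hypothetical target table
```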

Posted 4 days ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Pune

Work from Office

Required Skills and Competencies:
• Experience: 3+ years.
• Expertise in the Python language is a MUST.
• SQL (should be able to write complex SQL queries) is a MUST.
• Hands-on experience in Apache Flink Streaming or Spark Streaming is a MUST (see the sketch after this listing).
• Hands-on expertise with Apache Kafka is a MUST.
• Data Lake development experience.
• Orchestration (Apache Airflow is preferred).
• Spark and Hive: optimization of Spark/PySpark and Hive apps.
• Trino/AWS Athena (good to have).
• Snowflake (good to have).
• Data Quality (good to have).
• File Storage (S3 is good to have).
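To illustrate the Spark Streaming plus Kafka combination required above, here is a minimal Structured Streaming sketch in Python. The broker, topic, sink path, and checkpoint location are placeholders.

```python
# Illustrative Spark Structured Streaming job reading from Kafka.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("kafka_stream").getOrCreate()

stream = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")   # placeholder broker
          .option("subscribe", "clickstream")                 # placeholder topic
          .load())

# Kafka delivers key/value as binary; decode the value for processing.
events = stream.select(F.col("value").cast("string").alias("json"))

query = (events.writeStream
         .format("parquet")
         .option("path", "s3a://lake/clickstream/")           # placeholder sink
         .option("checkpointLocation", "s3a://lake/_chk/clickstream/")
         .outputMode("append")
         .start())

query.awaitTermination()
```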

Posted 4 days ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Coimbatore

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must have skills: Snowflake Data Warehouse
Good to have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary:
As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that the applications function seamlessly within the business environment. You will also engage in testing and troubleshooting to enhance application performance and user experience, while continuously seeking ways to improve processes and solutions.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application processes and workflows.
- Engage in code reviews to ensure quality and adherence to best practices.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Snowflake Data Warehouse.
- Good To Have Skills: Experience with data integration tools.
- Strong understanding of SQL and database management.
- Familiarity with cloud computing concepts and services.
- Experience in application testing and debugging methodologies.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Snowflake Data Warehouse.
- This position is based at our Coimbatore office.
- A 15 years full time education is required.

Qualification
15 years full time education

Posted 4 days ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary:
As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and debugging processes to deliver high-quality applications that meet user expectations and business goals.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application specifications and user guides.
- Collaborate with cross-functional teams to gather requirements and provide technical insights.

Professional & Technical Skills:
- At least 3+ years of overall experience.
- Experience in building data solutions using Azure Databricks and PySpark.
- Experience in building complex SQL queries; understanding of data warehousing concepts.
- Experience with Azure DevOps CI/CD pipelines for automated deployment and release management.
- Excellent problem-solving and analytical skills.
- Ability to work independently as well as collaboratively in a team environment.
- Strong communication and interpersonal skills.

Good to have:
- Experience with Snowflake data warehousing.
- Experience building pipelines and Data Flows using Azure Data Factory.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Hyderabad office.
- A 15 years full time education is required.

Qualification
15 years full time education

Posted 4 days ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
