
683 OLAP Jobs

JobPe aggregates listings so you can browse them in one place, but you apply directly on the original job portal.

4.0 - 6.0 years

3 - 7 Lacs

Nagar, Pune

Work from Office


Title: REF64648E - Python Developer + Chatbot with 4-6 years' experience - Pune/Mumbai/Bengaluru/Gurugram/Chennai, Assistant Manager - WTS.

4-6 years of experience as a Python Developer with a strong understanding of Python programming concepts and best practices. Bachelor's Degree/B.Tech/B.E. in Computer Science or a related discipline. Design, develop, and maintain robust and scalable Python-based applications, tools, and frameworks that integrate machine learning models and algorithms. Demonstrated expertise in developing machine learning solutions, including feature selection, model training, and evaluation. Proficiency in data manipulation libraries (e.g., Pandas, NumPy) and machine learning frameworks (e.g., Scikit-learn, TensorFlow, PyTorch, Keras). Experience with web frameworks such as Django or Flask. Contribute to the architecture and design of data-driven solutions, ensuring they meet both functional and non-functional requirements. Experience with databases such as MS SQL Server, PostgreSQL, or MySQL. Solid knowledge of OLTP and OLAP concepts. Experience with CI/CD tooling (at least Git and Jenkins). Experience with the Agile/Scrum/Kanban way of working. Self-motivated and hard-working. Knowledge of testing frameworks, including Mocha and Jest. Knowledge of RESTful APIs. Understanding of AWS and Azure cloud services. Experience with chatbot and NLU/NLP-based applications is required.

Qualifications: Bachelor's Degree/B.Tech/B.E. in Computer Science or a related discipline.
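The posting pairs Python development with chatbot and NLU/NLP work. As a hedged sketch (illustrative only; the intents and phrases below are invented, not the employer's data), the core of a retrieval-style NLU component is matching an utterance to the closest known intent, shown here with bag-of-words cosine similarity in pure Python:

```python
import math
from collections import Counter

# Hypothetical intents -- invented for illustration, not real training data.
INTENTS = {
    "order_status": "where is my order track shipment delivery status",
    "reset_password": "forgot password reset login cannot sign in",
    "refund": "refund return money back cancel order",
}

def vectorize(text):
    """Turn text into a bag-of-words term-count vector."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two Counter vectors."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def classify(utterance):
    """Return the intent whose example text is most similar to the utterance."""
    scores = {name: cosine(vectorize(utterance), vectorize(doc))
              for name, doc in INTENTS.items()}
    return max(scores, key=scores.get)

print(classify("I forgot my password and cannot log in"))  # reset_password
```

A production NLU stack would replace this with trained embeddings or a framework model, but the score-and-argmax shape stays the same.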

Posted 1 day ago

Apply

7.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Experience: 7+ years. Location: Noida, Sector 64.

Key Responsibilities:

Data Architecture Design: Design, develop, and maintain the enterprise data architecture, including data models, database schemas, and data flow diagrams. Develop a data strategy and roadmap that aligns with the business objectives and ensures the scalability of data systems. Architect both transactional (OLTP) and analytical (OLAP) databases, ensuring optimal performance and data consistency.

Data Integration & Management: Oversee the integration of disparate data sources into a unified data platform, leveraging ETL/ELT processes and data integration tools. Design and implement data warehousing solutions, data lakes, and/or data marts that enable efficient storage and retrieval of large datasets. Ensure proper data governance, including the definition of data ownership, security, and privacy controls in accordance with compliance standards (GDPR, HIPAA, etc.).

Collaboration with Stakeholders: Work closely with business stakeholders, including analysts, developers, and executives, to understand data requirements and ensure that the architecture supports analytics and reporting needs. Collaborate with DevOps and engineering teams to optimize database performance and support large-scale data processing pipelines.

Technology Leadership: Guide the selection of data technologies, including databases (SQL/NoSQL), data processing frameworks (Hadoop, Spark), cloud platforms (Azure is a must), and analytics tools. Stay updated on emerging data management technologies, trends, and best practices, and assess their potential application within the organization.

Data Quality & Security: Define data quality standards and implement processes to ensure the accuracy, completeness, and consistency of data across all systems. Establish protocols for data security, encryption, and backup/recovery to protect data assets and ensure business continuity.

Mentorship & Leadership: Lead and mentor data engineers, data modelers, and other technical staff in best practices for data architecture and management. Provide strategic guidance on data-related projects and initiatives, ensuring that all efforts are aligned with the enterprise data strategy.

Required Skills & Experience:

Extensive Data Architecture Expertise: Over 7 years of experience in data architecture, data modeling, and database management. Proficiency in designing and implementing relational (SQL) and non-relational (NoSQL) database solutions. Strong experience with data integration tools (Azure tools are a must, plus any other third-party tools), ETL/ELT processes, and data pipelines.

Advanced Knowledge of Data Platforms: Expertise in the Azure cloud data platform is a must; experience with other platforms such as AWS (Redshift, S3), Azure (Data Lake, Synapse), and/or Google Cloud Platform (BigQuery, Dataproc) is a bonus. Experience with big data technologies (Hadoop, Spark) and distributed systems for large-scale data processing. Hands-on experience with data warehousing solutions and BI tools (e.g., Power BI, Tableau, Looker).

Data Governance & Compliance: Strong understanding of data governance principles, data lineage, and data stewardship. Knowledge of industry standards and compliance requirements (e.g., GDPR, HIPAA, SOX) and the ability to architect solutions that meet these standards.

Technical Leadership: Proven ability to lead data-driven projects, manage stakeholders, and drive data strategies across the enterprise. Strong programming skills in languages such as Python, SQL, R, or Scala.

Certification: Azure Certified Solutions Architect, Data Engineer, or Data Scientist certifications are mandatory.

Pre-Sales Responsibilities:

Stakeholder Engagement: Work with product stakeholders to analyze functional and non-functional requirements, ensuring alignment with business objectives.

Solution Development: Develop end-to-end solutions involving multiple products, ensuring security and performance benchmarks are established, achieved, and maintained.

Proof of Concepts (POCs): Develop POCs to demonstrate the feasibility and benefits of proposed solutions.

Client Communication: Communicate system requirements and solution architecture to clients and stakeholders, providing technical assistance and guidance throughout the pre-sales process.

Technical Presentations: Prepare and deliver technical presentations to prospective clients, demonstrating how proposed solutions meet their needs and requirements.

Additional Responsibilities:

Stakeholder Collaboration: Engage with stakeholders to understand their requirements and translate them into effective technical solutions.

Technology Leadership: Provide technical leadership and guidance to development teams, ensuring the use of best practices and innovative solutions.

Integration Management: Oversee the integration of solutions with existing systems and third-party applications, ensuring seamless interoperability and data flow.

Performance Optimization: Ensure solutions are optimized for performance, scalability, and security, addressing any technical challenges that arise.

Quality Assurance: Establish and enforce quality assurance standards, conducting regular reviews and testing to ensure robustness and reliability.

Documentation: Maintain comprehensive documentation of the architecture, design decisions, and technical specifications.

Mentoring: Mentor fellow developers and team leads, fostering a collaborative and growth-oriented environment.

Qualifications:

Education: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.

Experience: Minimum of 7 years of experience in data architecture, with a focus on developing scalable and high-performance solutions.

Technical Expertise: Proficient in architectural frameworks, cloud computing, database management, and web technologies.

Analytical Thinking: Strong problem-solving skills, with the ability to analyze complex requirements and design scalable solutions.

Leadership Skills: Demonstrated ability to lead and mentor technical teams, with excellent project management skills.

Communication: Excellent verbal and written communication skills, with the ability to convey technical concepts to both technical and non-technical stakeholders.
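Several postings here ask candidates to architect both transactional (OLTP) and analytical (OLAP) databases. A minimal sketch of the difference, using Python's built-in sqlite3 (the table and data are invented for illustration): an OLTP query touches one row by key, while an OLAP query scans and aggregates across a dimension.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# OLTP side: a normalized orders table, optimized for single-row reads/writes.
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, region TEXT, amount REAL)")
rows = [(1, "asha", "north", 120.0), (2, "ravi", "south", 80.0),
        (3, "asha", "north", 40.0), (4, "meera", "south", 60.0)]
cur.executemany("INSERT INTO orders VALUES (?, ?, ?, ?)", rows)

# Typical OLTP query: fetch one order by primary key.
cur.execute("SELECT amount FROM orders WHERE id = ?", (2,))
print(cur.fetchone()[0])  # 80.0

# Typical OLAP query: scan many rows and aggregate by a dimension.
cur.execute("SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region")
print(cur.fetchall())  # [('north', 160.0), ('south', 140.0)]
```

Real OLTP and OLAP systems diverge much further (row vs. columnar storage, indexing strategy, denormalized star schemas), but the two query shapes above are the distinction the posting asks candidates to know.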

Posted 3 hours ago

Apply

4.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


It's fun to work in a company where people truly BELIEVE in what they are doing! We're committed to bringing passion and customer focus to the business.

Job Description

About Company: Fractal is a leading AI & analytics organization. We have a strong Fullstack team with great leaders accelerating the growth. Our people enjoy a collaborative work environment, exceptional training, and career development as well as unlimited growth opportunities. We have a Glassdoor rating of 4/5 and achieve a customer NPS of 9/10. If you like working with a curious, supportive, high-performing team, Fractal is the place for you.

Responsibilities: As a Fullstack Engineer, you would be part of a team consisting of a Scrum Master, Cloud Engineers, AI/ML Engineers, and UI/UX Engineers to build end-to-end Data to Decision systems. You would report to a Senior Fullstack Engineer and will be responsible for: managing, developing & maintaining the backend and frontend for various Data to Decision projects for our Fortune 500 client; working closely with the data science & engineering team to integrate the algorithmic output from the backend REST APIs; working closely with business and product owners to create dynamic infographics with intuitive user controls.

Required Qualifications: 4+ years of demonstrable experience designing, building, and working as a Fullstack Engineer for enterprise web applications. Ideally, this would include the following: expert-level proficiency with Angular; expert-level proficiency with ReactJS or VueJS; expert-level proficiency with Node.js; experience with databases (e.g., MongoDB) and data warehousing concepts (OLAP, OLTP); understanding of REST concepts and building/interacting with REST APIs; deep understanding of UI concepts, including cross-browser compatibility and implementing responsive web design; familiarity with code versioning tools such as GitHub.

Preferred Qualifications: Familiarity with Microsoft Azure Cloud Services (particularly Azure Web App, Storage, and VM), or familiarity with AWS (EC2 containers) or GCP services. Familiarity with GitHub Actions or any other CI/CD tool (e.g., Jenkins).

Job Location: BLR/MUM/GUR/PUNE/CHENNAI.

If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us!

Posted 3 hours ago

Apply

5.0 years

0 Lacs

Ahmedabad, Gujarat

On-site


Job Title: Dot Net Developer. Location: Gujarat. Experience Required: Minimum 5 years post-qualification. Employment Type: Full-Time. Department: IT/Software Development.

Key Responsibilities: Design, develop, and maintain scalable and secure client-server and distributed web applications using Microsoft .NET technologies. Collaborate with cross-functional teams (analysts, testers, developers) to implement project requirements. Ensure adherence to architectural and coding standards and apply best practices in .NET stack development. Integrate applications with third-party libraries and RESTful APIs for seamless data sharing. Develop and manage robust SQL queries, stored procedures, views, and functions using MS SQL Server. Implement SQL Server features such as replication and Always On availability groups. Develop and manage ETL workflows, SSIS packages, and SSRS reports. (Preferred) Develop OLAP solutions for advanced data analytics. Participate in debugging and troubleshooting complex issues to deliver stable software solutions. Support IT application deployment and ensure smooth post-implementation functioning. Take ownership of assigned tasks and respond to changing project needs and timelines. Quickly adapt and learn new tools, frameworks, and technologies as required.
Technical Skills Required: .NET Framework (4.0/3.5/2.0), C#, ASP.NET, MVC Bootstrap, jQuery, HTML/CSS Multi-layered architecture design Experience with RESTful APIs and third-party integrations MS SQL Server – Advanced SQL, Replication, SSIS, SSRS Exposure to ETL and OLAP (added advantage) Soft Skills: Excellent problem-solving and debugging abilities Strong team collaboration and communication skills Ability to work under pressure and meet deadlines Proactive learner with a willingness to adopt new technologies Job Types: Full-time, Permanent Pay: ₹60,000.00 - ₹90,000.00 per month Benefits: Flexible schedule Provident Fund Location Type: In-person Schedule: Fixed shift Experience: .NET: 5 years (Required) Location: Ahmedabad, Gujarat (Required) Work Location: In person Speak with the employer +91 7888499500

Posted 7 hours ago

Apply

12.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Title: Senior Database Administrator (DBA). Location: Gachibowli, Hyderabad. Experience: 12+ years. Interview Mode: Face-to-Face (F2F). Type: Full-Time/Permanent. Work Mode: On-site.

Job Summary: We are looking for a highly experienced Senior Database Administrator (DBA) with over 12 years of hands-on experience to manage mission-critical database systems. The ideal candidate will play a key role in ensuring optimal performance, security, availability, and reliability of our data infrastructure.

Key Responsibilities: Design, implement, and manage complex database architectures. Administer, monitor, and optimize large-scale databases (OLTP and OLAP systems). Ensure high availability, failover, replication, and disaster recovery strategies are in place and effective. Conduct performance tuning, query optimization, and capacity planning. Automate database maintenance tasks using scripting languages (Shell, Python, PowerShell, etc.). Lead major database upgrades, migrations, and patching with zero or minimal downtime. Collaborate with DevOps, application, and infrastructure teams to support integration and delivery pipelines. Implement strong database security practices, user roles, and access controls. Maintain up-to-date documentation of database environments and configurations. Mentor junior DBAs and participate in database architecture reviews.

Required Skills & Experience: 12+ years of solid experience as a Database Administrator. Expertise in Oracle, MS SQL Server, MySQL, or PostgreSQL databases. Experience in managing cloud-based database platforms (AWS RDS, Azure SQL, or GCP Cloud SQL). Strong knowledge of backup and recovery, replication, and clustering technologies. Proficient in database monitoring tools (e.g., OEM, SolarWinds, Nagios). Hands-on experience with automation scripts for routine tasks and monitoring. Solid understanding of data privacy, compliance, and audit requirements. Excellent problem-solving and troubleshooting abilities. Strong verbal and written communication skills.

Preferred Qualifications: Professional certifications (e.g., Oracle OCP, Microsoft Certified DBA, AWS Database Specialty). Experience with NoSQL databases like MongoDB or Cassandra. Exposure to Agile/DevOps environments and CI/CD database automation.

Additional Information: Location: Gachibowli, Hyderabad (candidates must be willing to work from office). Interview Mode: Face-to-face interviews are mandatory; no virtual interviews will be conducted. Joining Timeline: Immediate joiners preferred, or within 15-30 days.
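The responsibilities above include automating routine database maintenance with scripting languages. A hedged sketch of that pattern using Python's stdlib sqlite3 as a stand-in (an Oracle or SQL Server DBA would drive the vendor's own tools instead, and the function name here is invented): run an integrity check, then take an online backup, failing loudly if the check does not pass.

```python
import sqlite3

def run_maintenance(db_path, backup_path):
    """Integrity check followed by an online backup: the shape of a routine DBA job."""
    src = sqlite3.connect(db_path)
    try:
        # PRAGMA integrity_check returns the single row ('ok',) on a healthy database.
        status = src.execute("PRAGMA integrity_check").fetchone()[0]
        if status != "ok":
            raise RuntimeError(f"integrity check failed: {status}")
        dst = sqlite3.connect(backup_path)
        with dst:
            src.backup(dst)  # sqlite3's online backup API (safe while readers are active)
        dst.close()
        return status
    finally:
        src.close()
```

The check-then-backup-then-verify shape applies whatever the engine; scheduling it via cron or an orchestrator is what turns it into the minimal-downtime routine the posting describes.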

Posted 13 hours ago

Apply

0 years

0 Lacs

Delhi, India

On-site


What You'll Do: Architect and scale modern data infrastructure: ingestion, transformation, warehousing, and access. Define and drive enterprise data strategy: governance, quality, security, and lifecycle management. Design scalable data platforms that support both operational insights and ML/AI applications. Translate complex business requirements into robust, modular data systems. Lead cross-functional teams of engineers, analysts, and developers on large-scale data initiatives. Evaluate and implement best-in-class tools for orchestration, warehousing, and metadata management. Establish technical standards and best practices for data engineering at scale. Spearhead integration efforts to unify data across legacy and modern platforms.

What You Bring: Experience in data engineering, architecture, or backend systems. Strong grasp of system design, distributed data platforms, and scalable infrastructure. Deep hands-on experience with cloud platforms (AWS, Azure, or GCP) and tools like Redshift, BigQuery, Snowflake, S3, and Lambda. Expertise in data modeling (OLTP/OLAP), ETL pipelines, and data warehousing. Experience with big data ecosystems: Kafka, Spark, Hive, Presto. Solid understanding of data governance, security, and compliance frameworks. Proven track record of technical leadership and mentoring. Strong collaboration and communication skills to align tech with business. Bachelor's or Master's in Computer Science, Data Engineering, or a related field.

Nice To Have (Your Edge): Experience with real-time data streaming and event-driven architectures. Exposure to MLOps and model deployment pipelines. Familiarity with data DevOps and Infrastructure as Code (Terraform, CloudFormation, CI/CD pipelines).
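The role above centers on ETL pipelines feeding a warehouse. A minimal extract-transform-load sketch in stdlib Python (the CSV payload, table, and column names are invented for illustration): parse raw input, drop or cast bad records, then load the clean rows for querying.

```python
import csv
import io
import sqlite3

# Invented raw feed; the middle row has a missing amount and should be dropped.
RAW = """date,region,amount
2024-01-01,north,100
2024-01-01,south,
2024-01-02,north,50
"""

def extract(text):
    """Extract: parse the raw CSV into dict rows."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: drop rows with missing amounts and cast types."""
    return [(r["date"], r["region"], float(r["amount"])) for r in rows if r["amount"]]

def load(rows):
    """Load: write clean rows into a queryable store."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales (date TEXT, region TEXT, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)
    return conn

conn = load(transform(extract(RAW)))
print(conn.execute("SELECT COUNT(*), SUM(amount) FROM sales").fetchone())  # (2, 150.0)
```

Orchestrators like Airflow, named in several postings here, schedule and retry exactly these three stages; the stages themselves keep this shape.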

Posted 13 hours ago

Apply

7.0 years

0 Lacs

Greater Kolkata Area

Remote


Omni's team is passionate about Commerce and Digital Transformation. We've been successfully delivering Commerce solutions for clients across North America, Europe, Asia, and Australia. The team has experience executing and delivering projects in B2B and B2C solutions. Job Description This is a remote position. We are seeking a Senior Data Engineer to architect and build robust, scalable, and efficient data systems that power AI and Analytics solutions. You will design end-to-end data pipelines, optimize data storage, and ensure seamless data availability for machine learning and business analytics use cases. This role demands deep engineering excellence balancing performance, reliability, security, and cost to support real-world AI applications. Key Responsibilities Architect, design, and implement high-throughput ETL/ELT pipelines for batch and real-time data processing. Build cloud-native data platforms : data lakes, data warehouses, feature stores. Work with structured, semi-structured, and unstructured data at petabyte scale. Optimize data pipelines for latency, throughput, cost-efficiency, and fault tolerance. Implement data governance, lineage, quality checks, and metadata management. Collaborate closely with Data Scientists and ML Engineers to prepare data pipelines for model training and inference. Implement streaming data architectures using Kafka, Spark Streaming, or AWS Kinesis. Automate infrastructure deployment using Terraform, CloudFormation, or Kubernetes operators. Requirements 7+ years in Data Engineering, Big Data, or Cloud Data Platform roles. Strong proficiency in Python and SQL. Deep expertise in distributed data systems (Spark, Hive, Presto, Dask). Cloud-native engineering experience (AWS, GCP, Azure) : BigQuery, Redshift, EMR, Databricks, etc. Experience designing event-driven architectures and streaming systems (Kafka, Pub/Sub, Flink). Strong background in data modeling (star schema, OLAP cubes, graph databases). 
Proven experience with data security, encryption, and compliance standards (e.g., GDPR, HIPAA).

Preferred Skills: Experience in MLOps enablement: creating feature stores, versioned datasets. Familiarity with real-time analytics platforms (ClickHouse, Apache Pinot). Exposure to data observability tools like Monte Carlo, Databand, or similar. Passionate about building high-scale, resilient, and secure data systems. Excited to support AI/ML innovation with state-of-the-art data infrastructure. Obsessed with automation, scalability, and best engineering practices.
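The requirements mention streaming systems (Kafka, Spark Streaming, Flink) and event-driven architectures. The core idea those engines implement, windowed aggregation over an event stream, can be sketched in a few lines of pure Python (the event data is invented for illustration):

```python
from collections import defaultdict

def tumbling_windows(events, width):
    """Group (timestamp, value) events into fixed-width tumbling windows and sum them.
    A toy model of the windowed aggregation streaming engines perform."""
    windows = defaultdict(float)
    for ts, value in events:
        # Key each event by the start of the window containing it.
        windows[ts - ts % width] += value
    return dict(sorted(windows.items()))

events = [(1, 10.0), (4, 5.0), (11, 2.0), (14, 3.0), (21, 7.0)]
print(tumbling_windows(events, 10))  # {0: 15.0, 10: 5.0, 20: 7.0}
```

Real engines add out-of-order handling, watermarks, and state checkpointing on top of this grouping logic, but the window-key-then-aggregate step is the same.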

Posted 13 hours ago

Apply

6.0 - 11.0 years

8 - 13 Lacs

Gurugram

Work from Office


Develop, test, and support future-ready data solutions for customers across industry verticals. Develop, test, and support end-to-end batch and near real-time data flows/pipelines. Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance and information management, and associated technologies. Communicate risks and ensure understanding of these risks. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Graduate with a minimum of 6+ years of related experience. Experience in modelling and business system designs. Good hands-on experience with DataStage and cloud-based ETL services. Great expertise in writing T-SQL code. Well-versed with data warehouse schemas and OLAP techniques. Preferred technical and professional experience: Ability to manage and make decisions about competing priorities and resources. Ability to delegate where appropriate. Must be a strong team player/leader. Ability to lead data transformation projects with multiple junior data engineers. Strong oral, written, and interpersonal skills for interacting throughout all levels of the organization. Ability to communicate complex business problems and technical solutions.

Posted 1 day ago

Apply

2.0 - 7.0 years

4 - 9 Lacs

Bengaluru

Work from Office


Develop, test, and support future-ready data solutions for customers across industry verticals. Develop, test, and support end-to-end batch and near real-time data flows/pipelines. Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance and information management, and associated technologies. Communicate risks and ensure understanding of these risks. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Minimum of 2+ years of related experience. Experience in modeling and business system designs. Good hands-on experience with DataStage and cloud-based ETL services. Great expertise in writing T-SQL code. Well-versed with data warehouse schemas and OLAP techniques. Preferred technical and professional experience: Ability to manage and make decisions about competing priorities and resources. Ability to delegate where appropriate. Must be a strong team player/leader. Ability to lead data transformation projects with multiple junior data engineers. Strong oral, written, and interpersonal skills for interacting throughout all levels of the organization. Ability to clearly communicate complex business problems and technical solutions.

Posted 1 day ago

Apply

6.0 - 11.0 years

8 - 13 Lacs

Bengaluru

Work from Office


Develop, test, and support future-ready data solutions for customers across industry verticals. Develop, test, and support end-to-end batch and near real-time data flows/pipelines. Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance and information management, and associated technologies. Communicate risks and ensure understanding of these risks. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Graduate with a minimum of 6+ years of related experience. Experience in modelling and business system designs. Good hands-on experience with DataStage and cloud-based ETL services. Great expertise in writing T-SQL code. Well-versed with data warehouse schemas and OLAP techniques. Preferred technical and professional experience: Ability to manage and make decisions about competing priorities and resources. Ability to delegate where appropriate. Must be a strong team player/leader. Ability to lead data transformation projects with multiple junior data engineers. Strong oral, written, and interpersonal skills for interacting throughout all levels of the organization. Ability to communicate complex business problems and technical solutions.

Posted 1 day ago

Apply

5.0 years

10 - 15 Lacs

Gurgaon

On-site

Job Role: ETL SSIS + SSAS. Job Location: Gurugram. Interview: 1 internal technical round || PM round || client interview (candidate should be ready for F2F). Experience Range: 5-8 years. Job Description: · Microsoft SQL · Microsoft SSIS, SSAS · Data warehouse/data migration · Experience in analytics/OLAP cube development (Microsoft SSAS and MDX). · Analyze, design, build, query, troubleshoot, and maintain cubes. · Knowledge/expertise in SSAS Tabular with the DAX language; proficient in MDX and DAX. · Strong conceptual knowledge of ETL fundamentals. · Exposure to the following will be beneficial. Job Type: Full-time. Pay: ₹1,000,000.00 - ₹1,500,000.00 per year. Schedule: Day shift. Work Location: In person.

Posted 1 day ago

Apply

0.6 - 1.6 years

0 Lacs

Bengaluru East, Karnataka, India

On-site


Visa is a world leader in payments and technology, with over 259 billion payment transactions flowing safely between consumers, merchants, financial institutions, and government entities in more than 200 countries and territories each year. Our mission is to connect the world through the most innovative, convenient, reliable, and secure payments network, enabling individuals, businesses, and economies to thrive while driven by a common purpose: to uplift everyone, everywhere by being the best way to pay and be paid. Make an impact with a purpose-driven industry leader. Join us today and experience Life at Visa.

Job Description: Work collaboratively with Data Analysts, Data Scientists, Software Engineers, and cross-functional partners to design and deploy data pipelines that deliver analytical solutions. Responsible for building data pipelines, data models, data marts, and data warehouses, including OLAP cubes in a multidimensional data model, with proficiency/conceptual understanding of PySpark and SQL scripting. Responsible for the design, development, testing, implementation, and support of functional semantic data marts using various modeling techniques from underlying data stores/data warehouses, facilitating Business Intelligence data solutions. Experience in building reports, dashboards, scorecards & visualizations using Tableau/Power BI and other data analysis techniques to collect, explore, and extract insights from structured and unstructured data. Responsible for AI/ML models utilizing machine learning, statistical methods, data mining, forecasting, and predictive modeling techniques. Following the DevOps model: Agile implementation, CI/CD method of deployment, and JIRA creation/management for projects. Define and build technical/data documentation, with experience in code version control systems (e.g., Git). Assist the owner with periodic evaluation of next-generation technologies and modernization of the platform.

Exhibit leadership principles such as Accountability & Ownership of High Standards, given the criticality & sensitivity of data, and Customer Focus: going above & beyond in finding innovative solutions and products to best serve the business needs and thereby Visa. This is a hybrid position. Expectation of days in office will be confirmed by your hiring manager.

Qualifications: Basic Qualifications: Bachelor's degree, or 0.6-1.6 years of work experience with a Bachelor's or Master's Degree in computer/information science with relevant work experience in the IT industry. Enthusiastic, energetic, and self-learning candidates with loads of curiosity and flexibility. Proven hands-on capability in the development of data pipelines and data engineering. Experience in creating data-driven business solutions and solving data problems using technologies such as Hadoop, Hive, and Spark. Ability to program in one or more scripting languages such as Python and one or more programming languages such as Java or Scala. Familiarity with AI-centric libraries like TensorFlow, PyTorch, and Keras. Familiarity with machine learning algorithms and statistical models is beneficial. Critical ability to interpret complex data and provide actionable insights, encompassing statistical analysis, predictive modeling, and data visualization. Extended experience in Agile release management practices, governance, and planning. Strong leadership skills with demonstrated ability to lead global, cross-functional teams.

Additional Information: Visa is an EEO Employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability or protected veteran status. Visa will also consider for employment qualified applicants with criminal histories in a manner consistent with EEOC guidelines and applicable local law.

Posted 1 day ago

Apply

0 years

0 Lacs

Gurgaon, Haryana, India

On-site


At Aramya, we’re redefining fashion for India’s underserved Gen X/Y women, offering size-inclusive, comfortable, and stylish ethnic wear at affordable prices. Launched in 2024, we’ve already achieved ₹40 Cr in revenue in our first year, driven by a unique blend of data-driven design, in-house manufacturing, and a proprietary supply chain. Today, with an ARR of ₹100 Cr, we’re scaling rapidly with ambitious growth plans for the future. Our vision is bold to build the most loved fashion and lifestyle brands across the world while empowering individuals to express themselves effortlessly. Backed by marquee investors like Accel and Z47, we’re on a mission to make high-quality ethnic wear accessible to every woman. We’ve built a community of loyal customers who love our weekly design launches, impeccable quality, and value-for-money offerings. With a fast-moving team driven by creativity, technology, and customer obsession, Aramya is more than a fashion brand—it’s a movement to celebrate every woman’s unique journey. We’re looking for a passionate Data Engineer with a strong foundation. The ideal candidate should have a solid understanding of D2C or e-commerce platforms and be able to work across the stack to build high-performing, user-centric digital experiences. Key Responsibilities Design, build, and maintain scalable ETL/ELT pipelines using tools like Apache Airflow, Databricks , and Spark . Own and manage data lakes/warehouses on AWS Redshift (or Snowflake/BigQuery). Optimize SQL queries and data models for analytics, performance, and reliability. Develop and maintain backend APIs using Python (FastAPI/Django/Flask) or Node.js . Integrate external data sources (APIs, SFTP, third-party connectors) and ensure data quality & validation. Implement monitoring, logging, and alerting for data pipeline health. Collaborate with stakeholders to gather requirements and define data contracts. Maintain infrastructure-as-code (Terraform/CDK) for data workflows and services. 
Must-Have Skills

- Strong in SQL and data modeling (OLTP and OLAP).
- Solid programming experience in Python, preferably for both ETL and backend.
- Hands-on experience with Databricks, Redshift, or Spark.
- Experience building and managing ETL pipelines using tools like Airflow, dbt, or similar.
- Deep understanding of REST APIs, microservices architecture, and backend design patterns.
- Familiarity with Docker, Git, and CI/CD pipelines.
- Good grasp of cloud platforms (preferably AWS) and services like S3, Lambda, ECS/Fargate, CloudWatch.

Nice-to-Have Skills

- Exposure to streaming platforms like Kafka, Kinesis, or Flink.
- Experience with Snowflake, BigQuery, or Delta Lake.
- Proficiency in data governance, security best practices, and PII handling.
- Familiarity with GraphQL, gRPC, or event-driven systems.
- Knowledge of data observability tools (Monte Carlo, Great Expectations, Datafold).
- Experience working in a D2C/eCommerce or analytics-heavy product environment.
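The OLTP-vs-OLAP modeling skill above can be illustrated with a toy star schema: a fact table of measures joined to a dimension table for rollups. This is a minimal sketch using SQLite's stdlib driver; the table names and sample data are invented for illustration, not taken from the posting.

```python
import sqlite3

# Toy star schema: a dimension table (descriptive attributes) and a
# fact table (measures plus foreign keys). Names/data are illustrative only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT)")
cur.execute("""CREATE TABLE fact_sales (
    sale_id INTEGER PRIMARY KEY,
    product_id INTEGER REFERENCES dim_product(product_id),
    amount REAL)""")

cur.executemany("INSERT INTO dim_product VALUES (?, ?)",
                [(1, "kurta"), (2, "saree")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(1, 1, 999.0), (2, 1, 1299.0), (3, 2, 1799.0)])

# OLAP-style rollup: aggregate facts grouped by a dimension attribute.
cur.execute("""
    SELECT d.category, SUM(f.amount)
    FROM fact_sales f JOIN dim_product d USING (product_id)
    GROUP BY d.category ORDER BY d.category
""")
print(cur.fetchall())  # [('kurta', 2298.0), ('saree', 1799.0)]
```

The same tables serve OLTP-style single-row inserts and lookups; the OLAP workload is the grouped aggregate over the fact table.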

Posted 1 day ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Description

- Total 6+ years of experience.
- 3 years of experience with leading automation tools and white-box testing (Java APIs), i.e. JUnit.
- 2 years of software development in Java 2EE.
- Experience with other automation tools (i.e. Selenium, Mercury tools) or a self-created test-harness tool.
- Four-year college degree in Computer Science or a related field, i.e. BE or MCA.
- Good understanding of XML, XSL/XSLT, RDBMS, and Unix platforms.
- Experience in multi-dimensional (OLAP) technology, data warehousing, and financial software is desirable.
- Motivated individual, eager to learn leading-edge technology and test complex software.

Career Level - IC3

About Us

As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry leaders in almost every sector, and we continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options.
We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.

Posted 2 days ago

Apply

7.0 - 10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Job Description: We are seeking an experienced engineer with strong expertise in PostgreSQL, PL/SQL programming, and cloud-based data migration. The ideal candidate will have hands-on experience migrating and tuning databases, particularly from Oracle to PostgreSQL on GCP (AlloyDB / Cloud SQL), and be skilled in modern data architecture and cloud services.

Locations - Indore/Bengaluru/Noida

Key Responsibilities

- Design, build, test, and maintain scalable data architectures on GCP.
- Lead Oracle to PostgreSQL data migration initiatives (preferably AlloyDB / Cloud SQL).
- Optimize PostgreSQL performance (e.g., tuning autovacuum, stored procedures).
- Translate Oracle PL/SQL code to PostgreSQL equivalents.
- Integrate hybrid data storage using GCP services (BigQuery, Firestore, MemoryStore, Spanner).
- Implement database job scheduling, disaster recovery, and logging.
- Work with GCP Dataflow, MongoDB, and data migration services.
- Mentor and lead database engineering teams.

Required Technical Skills

- Advanced PostgreSQL and PL/SQL programming (queries, procedures, functions).
- Strong experience with database migration (Oracle ➝ PostgreSQL on GCP).
- Proficiency in Cloud SQL, AlloyDB, and performance tuning.
- Hands-on experience with BigQuery, Firestore, Spanner, MemoryStore, MongoDB, and Cloud Dataflow.
- Understanding of OLTP and OLAP systems.

Desirable Qualifications

- GCP Database Engineer Certification.
- Exposure to enterprise architecture, project delivery, and performance benchmarking.
- Strong analytical, problem-solving, and leadership skills.

Years of Experience - 7 to 10 Years

Education/Qualification - BE / B.Tech / MCA / M.Tech / M.Com

Interested candidates can directly share their resume at anubhav.pathania@impetus.com
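Translating Oracle PL/SQL to PostgreSQL in practice combines migration tooling (such as ora2pg) with manual rewrites of Oracle-specific idioms. As a rough, hedged illustration of the kind of substitutions involved (the mapping table and function below are invented for this sketch, not part of the role or any real tool):

```python
import re

# Hypothetical, deliberately incomplete mapping of common Oracle idioms to
# PostgreSQL equivalents; real migrations need tooling plus manual review.
ORACLE_TO_PG = [
    (r"\bNVL\s*\(", "COALESCE("),          # NVL(a, b)   -> COALESCE(a, b)
    (r"\bSYSDATE\b", "CURRENT_TIMESTAMP"), # SYSDATE     -> CURRENT_TIMESTAMP
    (r"\bVARCHAR2\b", "VARCHAR"),          # VARCHAR2(n) -> VARCHAR(n)
    (r"\bFROM\s+DUAL\b", ""),              # PostgreSQL allows SELECT without FROM
]

def sketch_translate(sql: str) -> str:
    """Apply the naive substitutions above; case-insensitive, no real parsing."""
    for pattern, replacement in ORACLE_TO_PG:
        sql = re.sub(pattern, replacement, sql, flags=re.IGNORECASE)
    return sql.strip()

print(sketch_translate("SELECT NVL(name, 'n/a'), SYSDATE FROM DUAL"))
# SELECT COALESCE(name, 'n/a'), CURRENT_TIMESTAMP
```

Regex substitution is only a starting point; constructs like `CONNECT BY`, packages, and `%ROWTYPE` semantics need hand translation to PL/pgSQL.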

Posted 2 days ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Experience - 5 to 9 Years
Location - Pune
Job Type - Contract for Client

Job Description

- Development of high-quality database solutions.
- Develop, implement, and optimize stored procedures and functions using SQL.
- Review and interpret ongoing business report requirements.
- Analyze existing SQL queries for performance improvements.
- Gather user requirements and identify new features.
- Provide data management support to users.
- Ensure all database programs meet company and performance requirements.
- Build appropriate and useful reporting deliverables.
- Suggest new queries.
- Provide timely scheduled management reporting.
- Investigate exceptions in asset movements.
- Mentor junior team members as needed.
- Work with data architects to ensure that solutions are aligned with company-wide technology directions.

Required Technical Skills

- Bachelor's degree in IT, Computer Science, or a related field.
- 5+ years of experience as a SQL Developer or in a similar role.
- Strong proficiency with SQL and its variations among popular databases (Snowflake).
- Strong skills in performance tuning of complex SQL queries, procedures, and indexing strategies.
- Experience designing OLAP databases using data warehouse patterns and schemas, including facts, dimensions, sort keys, indexes, and constraints.
- Query design and performance tuning of complex queries for very large data sets.
- Knowledge of best practices when dealing with relational databases.
- Capable of troubleshooting common database issues.
- Translating functional and technical requirements into detailed design.
- Data analysis experience, for example mapping source-to-target rules and fields.

Job Category: SQL Developer
Job Type: Contract
Job Location: Pune
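Performance tuning of complex queries usually starts with reading the query plan and checking whether an index is used. A minimal, database-agnostic sketch using SQLite's stdlib driver (the table and index names are made up for illustration; the role itself targets Snowflake, whose plans are read via `EXPLAIN` in its own console):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE trades (id INTEGER PRIMARY KEY, asset TEXT, qty INTEGER)")
cur.executemany("INSERT INTO trades (asset, qty) VALUES (?, ?)",
                [("INFY", i) for i in range(1000)])

def plan(sql):
    # EXPLAIN QUERY PLAN reports whether SQLite scans the table or uses an index;
    # the detail text is the last column of each plan row.
    return " ".join(row[-1] for row in cur.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT SUM(qty) FROM trades WHERE asset = 'INFY'"
print(plan(query))   # before the index: a full table scan

cur.execute("CREATE INDEX idx_trades_asset ON trades(asset)")
print(plan(query))   # after: an index search via idx_trades_asset
```

The same discipline applies at warehouse scale: inspect the plan first, then add or adjust indexes, sort keys, or clustering to remove full scans.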

Posted 2 days ago

Apply

0 years

0 Lacs

Kolkata, West Bengal, India

On-site


Greetings from Tata Consultancy Services. Join the walk-in drive on 21st June 2025 and pave your path to value with the TCS AI Cloud Team.

We are hiring for the below skills. Experience: 4 yrs to 12 yrs.

Azure Data Engineer - Required: Implementation and operations of OLTP, OLAP, and DW technologies such as Azure SQL and Azure SQL DW.

Posted 2 days ago

Apply


3.0 years

0 Lacs

India

On-site


Who Are We

Third Chair (YC X25) is building AI agents for in-house legal teams. The team comprises two second-time founders with past exits. Yoav previously co-founded the social media analytics startup Trendpop (YC W21), which scaled to 1M in ARR in 16 months building a platform that processed millions of social posts per day. Shourya previously co-founded the consumer finance startup Fello (YC W22), which scaled to over 2 million users and managed over 600k monthly active users and over $250,000 in monthly investments.

Third Chair is building vertical AI-native workflows for legal teams that help them complete end-to-end workflows that previously required hundreds of hours. This is accomplished by building state-of-the-art AI agents that browse the web, download and collect evidence, draft letters, and more. We grew 88% last month and went from 0 to 100k ARR in 5 months.

What Makes You a Good Fit

- 3+ years of hands-on experience developing production-level Node.js/TypeScript backends.
- Strong experience with structured DBMSs like PostgreSQL and OLAP databases like Redshift.
- Strong understanding of AWS services such as ECS, RDS, S3, CloudWatch, and ElastiCache.
- Experience working with telemetry, CI/CD, and IaC pipelines.
- Comfortable with US time zones.

What Makes You a Great Fit

- Past experience with OpenAI APIs for completions, function calling, and building context-aware assistants.
- Past experience with Go routines.
- Building multi-agent systems using frameworks like CrewAI.
- Strong sense of cost-optimization strategies, system design, and building efficient API stacks.

Benefits

- Work from anywhere: we're a distributed team across multiple time zones with a focus on outputs instead of location or working hours.
- Generous PTO policy.
- Competitive pay bracket.
- Equity at a fast-growing YC-backed company in a disruptive market.

Posted 3 days ago

Apply

12.0 years

1 - 6 Lacs

Hyderābād

On-site

The Windows Data Team is responsible for developing and operating one of the world’s largest data ecosystems: petabytes (PiB) of data are processed, stored, and accessed every day. In addition to Azure, Fabric, and Microsoft offerings, the team also utilizes modern open-source technologies such as Spark, StarRocks, and ClickHouse. Thousands of developers in Windows, Bing, Ads, Edge, MSN, etc. work on top of the data products the team builds. We’re looking for passionate engineers to join us in the mission of powering Microsoft businesses through the data substrate and infusing our data capabilities into the industry.

We are looking for a Principal Software Engineering Manager who can lead a team to design, develop, and maintain data pipelines and applications using Spark, SQL, map-reduce, and other technologies on our big data platforms. You will work with a team of data scientists, analysts, and engineers to deliver high-quality data solutions that support our business goals and customer needs. You will also collaborate with other teams across the organization to ensure data quality, security, and compliance.

Microsoft’s mission is to empower every person and every organization on the planet to achieve more. As employees we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond.

Responsibilities

- Lead a team of software developers to develop and optimize data pipelines and applications using Spark, Cosmos, Azure, SQL, and other frameworks.
- Implement data ingestion, transformation, and processing logic using various data sources and formats.
- Perform data quality checks, testing, and debugging to ensure data accuracy and reliability.
- Document and maintain data pipeline specifications, code, and best practices.
- Research and evaluate new data technologies and tools to improve data performance and scalability.
- Work with a world-class engineer/scientist team on Big Data, analytics, and OLAP/OLTP.
- Embrace both Microsoft technology and cutting-edge open-source technology.

Qualifications

Required Qualifications:

- Bachelor's Degree in Computer Science or a related technical field AND 12+ years of technical engineering experience with coding in languages including, but not limited to, C, C++, C#, Java, JavaScript, or Python; OR Master's Degree in Computer Science or a related technical field AND 10+ years of such experience; OR equivalent experience.
- 4+ years of people management experience.
- Demonstrated working knowledge of cloud and distributed computing platforms such as Azure or AWS.
- Strong knowledge of and experience with MapReduce, Spark, Kafka, Synapse, Fabric, or other data processing frameworks.
- Fluent in English, both written and spoken.

Preferred Qualifications:

- Experience with Cosmos DB or other NoSQL databases is a plus.
- Experience in data engineering, data analysis, or related data fields.
- Experience with data science and ML tools such as scikit-learn, R, Azure AI, PySpark, or similar.
- Experience with data modeling, data warehousing, and ETL techniques.
- Experience designing, developing, and shipping services with secure continuous integration and continuous delivery (CI/CD) practices.
- Relational and/or non-relational (NoSQL) databases.
- C/C++ and lower-level languages are a plus.

#W+Djobs #WindowsIndia #WDXIndia

Microsoft is an equal opportunity employer.
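The data-quality-check responsibility above (rejecting rows with missing or out-of-range values before they reach downstream consumers) can be sketched as a simple validation gate; the field names and rules here are invented for illustration, not taken from the posting.

```python
# Illustrative data-quality gate for a pipeline stage; the required fields
# and the value range are hypothetical examples.
def check_quality(records, required_fields=("event_id", "ts", "value")):
    """Return (good_records, errors): reject rows with missing/null fields
    or out-of-range measures, keeping an error log for debugging."""
    good, errors = [], []
    for i, rec in enumerate(records):
        missing = [f for f in required_fields if rec.get(f) is None]
        if missing:
            errors.append((i, f"missing fields: {missing}"))
        elif not (0 <= rec["value"] <= 1_000_000):
            errors.append((i, "value out of range"))
        else:
            good.append(rec)
    return good, errors

rows = [
    {"event_id": 1, "ts": "2024-01-01", "value": 42},
    {"event_id": 2, "ts": None, "value": 7},           # rejected: null ts
    {"event_id": 3, "ts": "2024-01-01", "value": -5},  # rejected: out of range
]
good, errors = check_quality(rows)
print(len(good), len(errors))  # 1 2
```

At Spark scale the same idea is expressed as filter/flag transformations over a DataFrame, with the rejects routed to a quarantine table for debugging.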
Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.

Posted 3 days ago

Apply

3.0 years

1 - 4 Lacs

Bengaluru

On-site

Job Description:

Position Overview: We are seeking a skilled FLEXCUBE Reports Developer with expertise in Qlik Sense to join our team. The ideal candidate will be responsible for designing, developing, and maintaining reports and dashboards that provide valuable insights from FLEXCUBE core banking data.

Key Responsibilities:

- Report Development: Design and create interactive reports and dashboards using Qlik Sense to visualize FLEXCUBE data for business users.
- FLEXCUBE 14.7 Backend Tables: Knowledge of the FLEXCUBE data model is a must.
- Data Modelling: Develop data models and relationships within Qlik Sense to ensure accurate representation of FLEXCUBE data.
- Customization: Customize reports to meet specific business requirements and ensure they align with industry best practices.
- Performance Optimization: Optimize report performance for efficient data retrieval and rendering.
- Data Integration: Integrate data from various sources into Qlik Sense reports, including FLEXCUBE and other data repositories.
- Data Security: Implement data security and access controls within Qlik Sense to protect sensitive information.
- User Training: Provide training and support to end users to enable them to effectively utilize Qlik Sense reports.
- Documentation: Maintain documentation for reports, data models, and best practices.

Mastery of the FLEXCUBE 14.7 backend tables and data model is essential.

Qualifications:

- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 3 to 7 years of proven experience developing reports and dashboards using Qlik Sense.
- Familiarity with FLEXCUBE core banking systems.
- Familiarity with OLAP cubes, data marts, and data warehouses.
- Proficiency in data modelling and data visualization concepts.
- Strong SQL skills for data extraction and transformation.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration abilities.
- Banking or financial industry experience is beneficial.
- Qlik Sense certifications are a plus.

Additional Information: This role offers an opportunity to work with cutting-edge reporting and analytics tools in the banking sector. The candidate should be prepared to work closely with business stakeholders and contribute to data-driven decision-making. Candidates with a strong background in FLEXCUBE reports development and Qlik Sense are encouraged to apply. We are committed to providing a collaborative and growth-oriented work environment.

Career Level - IC2

Diversity and Inclusion: An Oracle career can span industries, roles, countries, and cultures, giving you the opportunity to flourish in new roles and innovate, while blending work and life. Oracle has thrived through 40+ years of change by innovating and operating with integrity while delivering for the top companies in almost every industry. To nurture the talent that makes this happen, we are committed to an inclusive culture that celebrates and values diverse insights and perspectives, a workforce that inspires thought leadership and innovation. Oracle offers a highly competitive suite of employee benefits designed on the principles of parity, consistency, and affordability. The overall package includes certain core elements such as medical, life insurance, access to retirement planning, and much more. We also encourage our employees to engage in the culture of giving back to the communities where we live and do business. At Oracle, we believe that innovation starts with diversity and inclusion, and to create the future we need talent from various backgrounds, perspectives, and abilities. We ensure that individuals with disabilities are provided reasonable accommodation to successfully participate in the job application and interview process and to perform crucial job functions in potential roles. That’s why we’re committed to creating a workforce where all individuals can do their best work. It’s when everyone’s voice is heard and valued that we’re inspired to go beyond what’s been done before.

Posted 3 days ago

Apply

0 years

5 - 8 Lacs

Bengaluru

On-site

Position Summary

The BI Developer will possess advanced knowledge of SQL Server, SSIS, SSRS, and T-SQL, plus some data modelling experience. This role provides outstanding analytical and problem-solving expertise with a good grasp of the technical side of Business Intelligence.

Principal Job Responsibilities:

- Logical and physical design of the database; creation of tables, views, procedures, packages, triggers, and other database objects.
- Generating scripts of database objects.
- Writing and debugging complex queries.
- Performance tuning and SQL tuning.
- Product development knowledge, with familiarity with issues related to database design and schema changes.
- Good experience in reporting and dashboarding, especially using SSRS.
- Experience in design and development of stored procedures and views necessary to support SSRS reports.
- Experience in creating complex SSIS packages with translation handling and logging is a plus.
- Experience in OLAP, especially using SSAS, is a plus.

Education

B Tech / BE / M Tech / MCA / BSc / MSc from a reputed university

Skills

- BC - Dependability and Reliability
- BC - Initiative
- BC - Time Management
- DC - US Healthcare Domain Knowledge
- FC - Client Focus
- FC - Oral Communication
- FC - Written Communication
- PC - Jiva Product Knowledge

Competencies

- BC - Collaboration & Interpersonal Skills
- BC - Time Management
- FC - Analytical Skills
- FC - Communication Skills
- FC - Quality
- TC - Relational Database - SQL Server
- TC - SQL Server Analysis Services (SSAS) - Tabular
- TC - SQL Server Integration Services (SSIS)
- TC - SQL Server Reporting Services (SSRS)

Posted 3 days ago

Apply

3.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


Job Summary

- Effectively facilitate meetings and brainstorming sessions with business as well as the technical team.
- Strategically design and map business requirements and solutions to system/technical requirements.
- Examine functional product requirements and break them down into detailed technical stories and tasks.
- Construct use-case diagrams and workflow charts to help clarify and elaborate upon technical requirements.
- Identify and engage all key stakeholders, contributors, and business and technical resources required for product updates, and ensure contributors are motivated to complete tasks within the parameters of the requirements.
- Work with the entire team and customers to resolve any conflicts or confusion related to requirements or desired functionality.

Responsibilities

- Effectively collaborate with technical and non-technical team members and customers.
- Oversee and take ownership of the successful completion of the assigned project.
- Lead ERP-level project development efforts with minimal direction from the director or manager.
- Effectively facilitate meetings and brainstorming sessions to build consensus among customer representatives and the technical team (development + QA).
- Create detailed documentation covering use cases and business requirements.
- Scrum planning.
- Report to management and obtain approval before taking any key project decision.
- Provide guidance to technical teams regarding functional requirements.
- Ensure and validate that delivered functionality meets the customer's expectations.
- Coordinate UAT efforts.
- Demo the released features/application to customers.

Key Skills (must have)

- 3+ years' work experience in the end-to-end systems development process.
- Excellent verbal and written communication skills; must have good listening skills.
- Knowledge of and experience using the Postman tool for APIs.
- Extensive knowledge of relevant technology concepts (e.g. client-server, relational databases, cloud-based and web-based architectures).
- Basic competence in at least one programming language (e.g. C#, Node.js, JavaScript, or PHP).
- Basic understanding of the implementation of ERP or CRM systems.
- Ability to quickly assimilate and apply business models and technologies.
- Team player with strong interpersonal skills and the ability to lead the team when required.
- Proactive risk analysis in the project, providing steps to the customer/internal dev team to mitigate risk.
- Extensive working knowledge of Business Intelligence concepts (e.g. reporting, querying software, OLAP, spreadsheets, dashboards, and data mining).
- Knowledge of and experience with service/API patterns, including protocols and formats such as SOAP, REST, XML, and Swagger.
- Strong communication skills, including prioritizing, problem-solving, and interpersonal relationship building.
- Extensive experience in technical business analysis.
- Advanced knowledge of programming languages like SQL and of system integration solutions.
- Proven time management and organization skills (must be able to prioritize workload and meet deadlines).
- Excellent idea-presentation skills using PPT or Word.

Add-on Skills

- Experience in the BFSI (Banking, Finance & Insurance) domain.
- Knowledge of information security/identity management.
- Understanding of testing methodology and processes.

Posted 3 days ago

Apply