
532 Data Manipulation Jobs - Page 13

JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

2.0 - 5.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Education: Bachelor's/Master's degree in computer science or equivalent.

Mandatory Skills:
- 2-5 years of hands-on software engineering experience
- Excellent hands-on experience in Java (17+, 21 preferred)
- Experience developing Spring Boot and REST services
- Experience with unit test frameworks
- Ability to provide solutions based on business requirements
- Ability to collaborate with cross-functional teams
- Ability to work with global teams and a flexible work schedule
- Excellent problem-solving skills and a customer-centric mindset
- Excellent communication skills

Preferred Skills:
- Experience with microservices, CI/CD, event-oriented architectures, and distributed systems
- Experience with cloud environments (e.g., Google Cloud Platform, Azure, Amazon Web Services)
- Familiarity with web technologies (e.g., JavaScript, HTML, CSS), data manipulation (e.g., SQL), and version control systems (e.g., GitHub/GitLab)
- Familiarity with DevOps practices/principles, Agile/Scrum methodologies, CI/CD pipelines, and the product development lifecycle
- Familiarity with modern web APIs and full-stack frameworks
- Experience with Kubernetes, Kafka, PostgreSQL
- Experience developing eCommerce systems, especially B2B eCommerce, is a plus

Posted 1 month ago

Apply

17.0 - 18.0 years

8 Lacs

Mumbai, Navi Mumbai

Work from Office

Position Requirements:
- Administration and data entry of supplier invoices
- Reconciliation of month-end statements
- Multi-currency invoice handling
- Matching of purchase orders and invoices, and liaising with the purchase team for completion
- Reconciliation of purchase order receipts and following up on goods received but not invoiced
- Payment processing in bank
- Bank reconciliation
- Multi-currency account maintenance
- Liaising with various departments, both locally and internationally
- Staff claim processing and credit card reconciliation
- Following internal procedures and escalating appropriately for non-compliance
- Administrative tasks such as creating letters, reviewing documents, and assisting with HR-related issues

Must-have experience:
1. Dealing with international locations and people from overseas
2. Multi-currency invoices and payments
3. Bank reconciliation
4. Detail orientation and adherence to set procedures
5. Good command of the English language
6. Experience in international entities with overseas head offices (essential)
7. Good Excel knowledge, e.g. pivot tables, lookups, data manipulation
8. Experience handling raw data from multiple sources to create meaningful reports
9. Bachelor's in accounting or a similar level of education
10. Experience with a Tier 1 ERP such as Oracle, JDE, SAP, or IFS (essential)
11. Working knowledge of GST
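The pivot-table, lookup, and reconciliation skills this posting asks for can be sketched in a few lines of pandas. This is an illustrative example only; the purchase-order and invoice data below are invented, not from any real ERP.

```python
import pandas as pd

# Hypothetical purchase orders and supplier invoices (illustrative only).
orders = pd.DataFrame({
    "po_number": ["PO-1", "PO-2", "PO-3"],
    "currency": ["USD", "EUR", "USD"],
    "po_amount": [1000.0, 250.0, 480.0],
})
invoices = pd.DataFrame({
    "po_number": ["PO-1", "PO-3"],
    "invoice_amount": [1000.0, 500.0],
})

# Lookup-style match of each purchase order to its invoice; orders with
# no invoice surface as "received but not invoiced" for follow-up.
matched = orders.merge(invoices, on="po_number", how="left", indicator=True)
not_invoiced = matched[matched["_merge"] == "left_only"]["po_number"].tolist()

# Pivot-table style summary of purchase-order value by currency.
by_currency = matched.pivot_table(index="currency", values="po_amount", aggfunc="sum")
```

The `indicator=True` flag is what turns an ordinary merge into a reconciliation: it labels each row by which side it came from.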

Posted 1 month ago

Apply

2.0 - 7.0 years

4 - 7 Lacs

Bengaluru

Work from Office

As a Product Data Taxonomy Specialist in our eCommerce organization at MilliporeSigma, you will play a critical role in ensuring data governance, compliance, and effective taxonomy management across our extensive portfolio of over 350,000 products. You will be accountable for evolving, expanding, and enriching our product data taxonomy structure to support a best-in-class shopping experience for our customers.

Who You Are:
- Lead the development of product category taxonomies to effectively organize and categorize our expansive and diverse catalog of reagents, chemicals, and instrumentation
- Establish and maintain effective taxonomy structures that enhance product categorization and facilitate data retrieval
- Develop and maintain standard operating procedures (SOPs) for implementing taxonomy changes
- Collaborate with our product teams on establishing requirements and enhancements to the Taxonomy Management Service/Application
- Conduct competitive research on product taxonomy structures, both internally and externally
- Analyze website metrics and product discovery behaviors to drive data-informed decisions
- Collaborate with stakeholders to promote the availability of high-quality, trusted data as a strategic asset in support of operational effectiveness and decision-making
- Monitor and evaluate product data quality on an ongoing basis, identifying areas for improvement and driving corrective actions
- Mentor internal users on data governance principles, taxonomy management, and compliance requirements
- Educate technical and non-technical audiences on information architecture concepts

Basic/Minimum Qualifications:
- 2+ years of experience in scientific taxonomy/ontology and its applications to eCommerce taxonomy management, search engine management, or information architecture
- Bachelor's degree in Chemistry, Biology, Materials Science, Data Science, or a related field
- Experience working with PIM systems
- Familiarity with digital marketing, web content development, and the implications of product data structure on marketing strategies
- Experience creating and communicating technical requirements to engineering teams
- Strong understanding of data management best practices, including taxonomy management, metadata management, data quality management, and compliance frameworks
- Understanding of the flow of eCommerce product catalog information from search to checkout

Preferred Qualifications:
- Master's degree in Chemistry, Biology, Materials Science, Data Science, or a related field
- Experience with scientific products and laboratory workflows
- Familiarity with database management concepts and experience with advanced data manipulation techniques
- Demonstrable ability to devise, communicate, and implement an information architecture strategy
- Deep familiarity with taxonomies, ontologies, and classification systems and their management
- Familiarity with machine learning and natural language concepts and applications
- Adept at diving deep into search, including algorithms and technical details
- Intellectually curious, technically savvy, detail-oriented, entrepreneurial
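A product taxonomy of the kind described above is, at its simplest, a tree of categories with parent links; the breadcrumb a shopper sees is the path from the root to a leaf. The sketch below is a minimal illustration; the category names are hypothetical, not MilliporeSigma's actual taxonomy.

```python
# Minimal product taxonomy as a child -> parent mapping; None marks a root.
taxonomy = {
    "Reagents": None,
    "Solvents": "Reagents",
    "Acetonitrile": "Solvents",
    "Instrumentation": None,
    "Chromatography": "Instrumentation",
}

def category_path(node, taxonomy):
    """Walk parent links upward to build the full breadcrumb for a category."""
    path = []
    while node is not None:
        path.append(node)
        node = taxonomy[node]
    return list(reversed(path))
```

Keeping only parent links makes taxonomy changes cheap: moving a subtree is a single reassignment, which is why many taxonomy management services store categories this way.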

Posted 1 month ago

Apply

8.0 - 13.0 years

13 - 18 Lacs

Bengaluru

Work from Office

We are seeking a Senior Snowflake Developer/Architect who will be responsible for designing, developing, and maintaining scalable data solutions that effectively meet the needs of our organization. The role serves as the primary point of accountability for the technical implementation of the data flows, repositories, and data-centric solutions in your area, translating requirements into efficient implementations. The data repositories, data flows, and data-centric solutions you create will support a wide range of reporting, analytics, decision support, and (generative) AI solutions.

Your Role:
- Implement and manage data modelling techniques, including OLTP, OLAP, and Data Vault 2.0 methodologies
- Write optimized SQL queries for data extraction, transformation, and loading
- Utilize Python for advanced data processing, automation tasks, and system integration
- Act as an advisor with in-depth knowledge of Snowflake architecture, features, and best practices
- Develop and maintain complex data pipelines and ETL processes in Snowflake
- Collaborate with data architects, analysts, and stakeholders to design optimal and scalable data solutions
- Automate DBT jobs and build CI/CD pipelines using Azure DevOps for seamless deployment of data solutions
- Ensure data quality, integrity, and compliance throughout the data lifecycle
- Troubleshoot, optimize, and enhance existing data processes and queries for performance improvements
- Document data models, processes, and workflows clearly for future reference and knowledge sharing
- Build data tests, unit tests, and mock data frameworks

Who You Are:
- Master's degree in Computer Science, Information Technology, or a related field
- At least 3+ years of proven experience as a Snowflake Developer and a minimum of 8+ years of total experience with data modelling (OLAP and OLTP)
- Extensive hands-on experience writing complex SQL queries and advanced Python, demonstrating proficiency in data manipulation and analysis for large data volumes
- Strong understanding of data warehousing concepts, methodologies, and technologies, with in-depth experience in data modelling techniques (OLTP, OLAP, Data Vault 2.0)
- Experience building data pipelines using DBT (Data Build Tool) for data transformation
- Familiarity with advanced performance tuning methodologies in Snowflake, including query optimization
- Strong knowledge of CI/CD pipelines, preferably in Azure DevOps
- Excellent problem-solving, analytical, and critical thinking skills
- Strong communication, collaboration, and interpersonal skills
- Knowledge of additional data technologies (e.g., AWS, Azure, GCP) is a plus
- Knowledge of Infrastructure as Code (IaC) tools such as Terraform or CloudFormation is a plus
- Experience leading projects or mentoring junior developers is advantageous
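The "optimized SQL for extraction and transformation" responsibility above usually means pushing aggregation and filtering into the warehouse rather than into application code. A minimal sketch, with SQLite standing in for Snowflake and an invented `orders` table:

```python
import sqlite3

# SQLite stands in for Snowflake here; the table and columns are made up.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer_id TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("c1", 10.0), ("c1", 15.0), ("c2", 7.5)])

# Aggregate per customer and filter with HAVING inside the database,
# so the engine does the heavy lifting on large data volumes.
rows = conn.execute("""
    SELECT customer_id, SUM(amount) AS total
    FROM orders
    GROUP BY customer_id
    HAVING SUM(amount) > 10
    ORDER BY customer_id
""").fetchall()
```

In Snowflake the same GROUP BY/HAVING shape applies; the optimization work then shifts to clustering, warehouse sizing, and reading the query profile.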

Posted 1 month ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Project description
We are seeking a highly skilled Data Modelling Expert with deep experience in the Avaloq Core Banking platform to join our technology team. The ideal candidate will be responsible for designing, maintaining, and optimizing complex data models that support our banking products, client data, and regulatory reporting needs. This role requires a blend of domain expertise in banking and wealth management, strong data architecture capabilities, and hands-on experience working with the Avaloq platform.

Responsibilities:
- Design, implement, and maintain conceptual, logical, and physical data models within the Avaloq Core Banking system
- Collaborate with business analysts, product owners, and Avaloq parameterisation teams to translate business requirements into robust data models
- Ensure alignment of data models with Avaloq's object model and industry best practices
- Perform data profiling, quality checks, and lineage tracing to support regulatory and compliance requirements (e.g., Basel III, MiFID II, ESG)
- Support integration of Avaloq data with downstream systems (e.g., CRM, data warehouses, reporting platforms)
- Provide expert input on data governance, metadata management, and model documentation
- Contribute to change requests, upgrades, and data migration projects involving Avaloq
- Collaborate with cross-functional teams to drive data consistency, reusability, and scalability
- Review and validate existing data models, identifying gaps and optimisation opportunities
- Ensure data models meet performance, security, and privacy requirements

Skills

Must have:
- Proven experience (5+ years) in data modelling or data architecture, preferably within financial services
- 3+ years of hands-on experience with the Avaloq Core Banking Platform, especially its data structures and object model
- Strong understanding of relational databases and data modelling tools (e.g., ER/Studio, ERwin, or similar)
- Proficiency in SQL and data manipulation in Avaloq environments
- Knowledge of banking products, client lifecycle data, and regulatory data requirements
- Familiarity with data governance, data quality, and master data management concepts
- Experience working in Agile or hybrid project delivery environments

Nice to have:
- Exposure to Avaloq Scripting or parameterisation is a strong plus
- Experience integrating Avaloq with data lakes, BI/reporting tools, or regulatory platforms
- Understanding of data privacy regulations (GDPR, FINMA, etc.)
- Certification in Avaloq or relevant financial data management domains is advantageous

Other Languages: English: C1 Advanced
Location: Pune, Bangalore, Hyderabad, Chennai, Noida

Posted 1 month ago

Apply

4.0 - 6.0 years

12 - 18 Lacs

Noida, Greater Noida

Work from Office

Role & responsibilities:
- Utilize Python (specifically Pandas) to clean, transform, and analyze data, automate repetitive tasks, and create custom reports and visualizations
- Analyze and interpret complex datasets, deriving actionable insights to support business decisions
- Write and optimize advanced SQL queries for data extraction, manipulation, and analysis from various sources, including relational databases and cloud-based data storage
- Collaborate with cross-functional teams to understand data needs and deliver data-driven solutions
- Create and maintain dashboards and reports that visualize key metrics and performance indicators
- Identify trends, patterns, and anomalies in data to support business intelligence efforts and provide strategic recommendations
- Ensure data integrity and accuracy by developing and implementing data validation techniques
- Support data migration, transformation, and ETL processes within cloud environments

Requirements:
- 3-5 years of experience as a data analyst or in an equivalent role
- Good experience in Python, with hands-on experience using Pandas for data analysis and manipulation
- Expertise in analytical SQL, including writing complex queries for data extraction, aggregation, and transformation
- Knowledge of cloud platforms, particularly AWS (Amazon Web Services)
- Strong analytical thinking, problem-solving, and troubleshooting abilities
- Familiarity with data visualization tools (e.g., Tableau, Power BI, QuickSight, Superset) is a plus
- Excellent communication skills, with the ability to explain complex data insights in a clear and actionable manner
- Detail-oriented with a focus on data quality and accuracy

Preferred Qualifications:
- Experience working in a cloud-based data analytics environment
- Familiarity with additional cloud services and tools (e.g., Snowflake, Athena)
- Experience working in an Agile environment or with data-oriented teams
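The clean-transform-validate loop this role describes can be sketched in a few Pandas calls. The dataset below is invented purely for illustration; it contains the typical defects (duplicates, missing values, string-typed numbers) the role asks candidates to handle.

```python
import pandas as pd

# Hypothetical raw extract with duplicates, a missing value,
# and numbers stored as strings (illustrative data only).
raw = pd.DataFrame({
    "order_id": [1, 1, 2, 3],
    "region": ["North", "North", "South", None],
    "revenue": ["100", "100", "250", "80"],
})

clean = (
    raw.drop_duplicates(subset="order_id")               # dedupe
       .assign(revenue=lambda d: d["revenue"].astype(float),  # fix types
               region=lambda d: d["region"].fillna("Unknown")) # fill gaps
)

# A simple validation check of the kind used to guard data integrity.
assert clean["order_id"].is_unique

summary = clean.groupby("region")["revenue"].sum()
```

Chaining `drop_duplicates`/`assign` keeps each cleaning step visible and auditable, which matters when the output feeds dashboards.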

Posted 1 month ago

Apply

4.0 - 6.0 years

11 - 15 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Responsibilities / Essential Job Functions:
- Design, develop, test, debug, and maintain mainframe applications using the COBOL programming language
- Write and maintain JCL (Job Control Language) for batch processing and job scheduling
- Utilize VSAM (Virtual Storage Access Method) for efficient data access and management
- Interact with databases using DB2, including SQL query optimization and performance tuning
- Utilize tools such as File-AID for data manipulation, browsing, and editing
- Perform debugging and testing of mainframe applications using XPEDITER
- Write and execute SQL queries using SPUFI for data retrieval and manipulation
- Collaborate with cross-functional teams to analyze requirements, design solutions, and implement changes
- Provide technical support and assistance to resolve issues and troubleshoot production problems
- Stay updated with emerging technologies and industry trends in mainframe development

Skills

Must have:
- 4-6 years of strong mainframe development experience
- Experience designing, developing, testing, debugging, and maintaining mainframe applications using COBOL
- VSAM for efficient data access and management
- DB2, including SQL query optimization and performance tuning
- Tools such as File-AID for data manipulation, browsing, and editing
- SQL queries using SPUFI for data retrieval and manipulation
- Collaboration with cross-functional teams to analyze requirements, design solutions, and implement changes

Nice to have:
- Insurance domain experience
- Agile/Scrum experience

Other Languages: English: C1 Advanced
Seniority: Regular
Location: Pune, Bengaluru, Hyderabad, Chennai, Noida

Posted 1 month ago

Apply

3.0 - 5.0 years

15 - 30 Lacs

Bengaluru

Work from Office

Position summary: We are seeking a Senior Software Development Engineer (Data Engineering) with 3-5 years of experience to design, develop, and optimize data pipelines and analytics workflows using Snowflake, Databricks, and Apache Spark. The ideal candidate will have a strong background in big data processing, cloud data platforms, and performance optimization to enable scalable data-driven solutions.

Key Responsibilities:
- Work with cloud-based data solutions (Azure, AWS, GCP)
- Implement data modeling and warehousing solutions
- Develop and maintain data pipelines for efficient data extraction, transformation, and loading (ETL) processes
- Design and optimize data storage solutions, including data warehouses and data lakes
- Ensure data quality and integrity through data validation, cleansing, and error handling
- Collaborate with data analysts, data architects, and software engineers to understand data requirements and deliver relevant data sets (e.g., for business intelligence)
- Implement data security measures and access controls to protect sensitive information
- Monitor and troubleshoot issues in data pipelines, notebooks, and SQL queries to ensure seamless data processing
- Develop and maintain Power BI dashboards and reports
- Work with DAX and Power Query to manipulate and transform data

Basic Qualifications:
- Bachelor's or master's degree in computer science or data science
- 3-5 years of experience in data engineering, big data processing, and cloud-based data platforms
- Proficiency in SQL, Python, or Scala for data manipulation and processing
- Proficiency in developing data pipelines using Azure Synapse, Azure Data Factory, and Microsoft Fabric
- Experience with Apache Spark, Databricks, and Snowflake is highly beneficial for handling big data and cloud-based analytics solutions

Preferred Qualifications:
- Knowledge of streaming data processing (Apache Kafka, Flink, Kinesis, Pub/Sub)
- Experience with BI and analytics tools (Tableau, Power BI, Looker)
- Familiarity with data observability tools (Monte Carlo, Great Expectations)
- Contributions to open-source data engineering projects
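The validate-cleanse-and-handle-errors responsibility above is the core of most ETL work. A minimal extract-transform sketch with a quarantine step, using Pandas and invented records (not any specific platform's API):

```python
import pandas as pd

def run_pipeline(records):
    """Tiny ETL sketch: parse, quarantine invalid rows, cast types."""
    df = pd.DataFrame(records)                        # extract
    df["ts"] = pd.to_datetime(df["ts"], errors="coerce")  # bad dates -> NaT
    bad = df[df["ts"].isna()]                         # quarantine failures
    good = df.dropna(subset=["ts"]).copy()
    good["value"] = good["value"].astype(float)       # transform
    return good, bad

good, bad = run_pipeline([
    {"ts": "2024-01-01", "value": "1.5"},
    {"ts": "not-a-date", "value": "2.0"},
])
```

Routing failures to a quarantine frame instead of raising keeps one malformed record from stopping the whole load, and the quarantined rows remain available for alerting and replay.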

Posted 1 month ago

Apply

5.0 - 9.0 years

25 - 35 Lacs

Hyderabad

Hybrid

Role Summary: The Senior Data Scientist is a technical leadership role with direct accountability for the delivery of multiple projects within FMCG, including the management of the analytics quality on these projects. The Senior Data Scientist collaborates closely with other Data Scientists and consultants on projects (across the FMCG client portfolio) to transform client questions and issues into analytical solutions.

Key responsibilities:
- Accountable for the analytics quality and delivery of key FMCG projects
- Transforming client questions and issues into analytical solutions, partnering with consultants to scope solutions and ensure projects successfully deliver value to clients
- Task management, including overseeing and delegating technical tasks among data scientists
- Team management: leading a small team of technical analysts (~1-3 direct reports)

Key activities:
- Accountable for the analytics quality and delivery of key FMCG projects: delivery and build of analytical solutions, transforming transaction- or customer-level data sets into client-ready strategic insights (e.g. data exploration, cleansing, manipulation, insight extraction, and clear presentation via spreadsheets, PowerPoint presentations, and self-service analytical visualization tools); task management, including overseeing and delegating technical tasks among analysts within the project team and reviewing code, analysis, and outputs; accountability for solution design, analytical methodology/technique, and quality assurance for analytical projects; providing estimates of work effort, timeframes, costings, and resourcing for analytical tasks.
- Partnering with consultants to ensure projects are successfully delivered and provide value to clients: converting data into meaningful, actionable recommendations for Quantium's clients; transforming client questions and issues into analytical solutions through project scoping and solution design.
- Team management: leading a small team of Data Scientists: directly managing all direct reports, including onboarding, driving performance, supporting development and progression, and running the formal performance review process; providing technical guidance for direct reports and other analysts within the project team as needed; supporting team initiatives such as interviewing candidates, facilitating team training or knowledge sharing, and organizing events; being a role model of Quantium's DNA and supporting a culture of collaboration, innovation, high performance, and fun.

Experience and education required:
1. Approximately 5+ years' experience in a highly technical analytics environment carrying out data analytics or data science work; strong coding experience in SQL; experience working with large datasets to solve commercial business problems (customer-level and/or sales transaction-level datasets preferred); advanced knowledge of the technical analytics discipline, including data preparation, feature engineering, foundational analytics concepts, model development, and model training.
2. Well-developed commercial acumen to understand business needs and suggest the commercial impacts of different analytics solutions or approaches; experience transforming client questions and issues into analytical solutions through project scoping and solution design; FMCG/CPG or retail industry experience (preferred); consulting environment experience (preferred).
3. Experience supervising the work of others as a senior team member, project manager, or team leader.
4. Experience successfully managing both internal and external stakeholders, delivering against projects, tasks, and activities in dynamic, deadline-driven environments.
5. Experience and interest in people management, mentoring, and development.
6. Tertiary qualifications in engineering, mathematics, actuarial studies, statistics, physics, or a related discipline.

Key business capabilities required:
1. Resource Planning: coordination of resources to effectively achieve committed client outcomes at a high level of quality.
2. Client Project Planning: effective capture, planning, and management of the series of inter-related activities involved in delivering the agreed client scope of work, including managing dependencies, making decisions on variances to the original plan, and aligning stakeholders.
3. Client Project Delivery: effective management that integrates people, systems, structures, and practices across Quantium and a client organization to ensure high-quality project delivery.
4. Business and Product Requirements: ability to effectively translate business and client needs or problems into a sustainable technical or product solution.
5. Method Evaluation: assess the appropriateness of the solution to the original problem and the application of the method.
6. Method Selection: choose the appropriate type of analytics or modelling to best solve the problem, given all inputs and prescribed outputs.
7. Method Design: design the most appropriate structure for the method that best solves the problem, given inputs and prescribed outputs.
8. Insight Generation: convert information into meaningful, actionable recommendations for Quantium's clients and partners.

Key people and leadership capabilities required:
1. Self-aware: you leverage diversity across people, tasks, client interactions, and projects, taking responsibility for yourself and others.
2. Agile and innovative: you possess strong lateral thinking skills and actively develop these in others.
3. Achieve and perform: you have exceptional execution skills and are achievement focused.
4. Brand advocate: you anticipate and consider brand and cultural impact in decision making.
5. Purposeful and aligned: you can set clear, tangible objectives which deliver against our strategy.
6. Achievement oriented: you effectively facilitate challenging performance-related conversations in a timely and appropriate manner.
7. Coach: you demonstrate the ability to coach and develop others while facilitating learning, growth, and engagement.

Posted 1 month ago

Apply

5.0 - 10.0 years

5 - 15 Lacs

Gurugram

Work from Office

Role/Designation: Data Scientist
Location: Gurgaon
Level/Band: Band C (Manager/Sr. Manager)

Brief about the role/team/candidate requirement: We are looking for a motivated and highly skilled Data Scientist to join our Flight Safety team. This role will be pivotal in using advanced data analytics and machine learning techniques to enhance flight safety decisions and processes. The candidate will work with time-series, multivariate data to detect anomalies, identify patterns, and build predictive models that support safety management. The ideal candidate should have a strong foundation in data science, machine learning, and deep learning techniques, along with expertise in applying these skills in the aviation safety domain. The candidate will play a key role in developing algorithms, performing exploratory data analysis, and utilizing artificial intelligence tools to forecast potential safety risks, providing valuable insights for strategic safety decisions.

Qualification & Experience:
- Educational requirements: engineering graduate with a specialization in Data Science or a related field; completion of a Data Scientist course or equivalent certification
- Experience: a minimum of 6 years of hands-on experience working directly with data, particularly in the context of data science and machine learning

Skill set: data analysis, anomaly detection, machine learning, big data analytics, programming skills

Job responsibilities:
- Data analysis expertise: ability to conduct exploratory data analysis to uncover patterns, trends, and insights from complex datasets
- Anomaly detection: expertise in anomaly detection techniques for time-series, multivariate data, with a focus on identifying outliers and irregular behavior
- Machine learning and deep learning: strong understanding of data science methodologies, machine learning algorithms, and deep learning architectures such as Artificial Neural Networks (ANN), Recurrent Neural Networks (RNN), and Convolutional Neural Networks (CNN)
- Big data analytics: proficiency in utilizing AI and machine learning techniques for big data analytics, with the goal of predicting and mitigating potential safety risks
- Programming skills: expertise in Python and MATLAB for data manipulation, modeling, and algorithm development
- Algorithm development: strong ability to formulate custom algorithms tailored to specific requirements, ensuring accurate results in the context of flight safety and operational excellence
- Data collection: proficiency in gathering and preprocessing data from various sources to ensure high-quality datasets for analysis and modeling
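The simplest form of the anomaly detection this role describes is a z-score flag on a univariate time series: readings far from the series mean, in standard-deviation units, are outliers. The data and threshold below are invented for illustration, not drawn from any flight-data recorder.

```python
import numpy as np

def flag_anomalies(series, threshold=3.0):
    """Return indices whose z-score magnitude exceeds the threshold."""
    x = np.asarray(series, dtype=float)
    z = (x - x.mean()) / x.std()          # standardize against series stats
    return np.where(np.abs(z) > threshold)[0]

# Synthetic sensor trace with one injected outlier at index 50.
readings = [10.0] * 50 + [95.0] + [10.0] * 49
anomalies = flag_anomalies(readings)
```

Real flight-safety pipelines would extend this to multivariate methods (e.g. Mahalanobis distance or learned models), but the standardize-then-threshold pattern is the common starting point.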

Posted 1 month ago

Apply

10.0 - 15.0 years

22 - 30 Lacs

Mumbai

Work from Office

At Amazon Ads, we sit at the intersection of Advertising, Media and eCommerce. With millions of customers visiting us every day to find, discover, and buy products, we believe that advertising, when done well, can enhance the value of the customer experience and generate a positive ROI for our advertising partners. We strive to make advertising relevant so that customers welcome it across Amazon's ecosystem of mobile and desktop websites, proprietary devices, and the Amazon Advertising Platform. If you're interested in innovative advertising solutions with a relentless focus on the customer, you've come to the right place!

This is a leadership role reporting into the head of advertising sales. You should be comfortable building structure in a high-growth and ambiguous environment, building and implementing a strategy that helps us maximize our revenue, and working with multiple stakeholders and teams.

As a Monetization Strategy Leader, you will:
- Develop and own the monetization strategy and initiatives shaping the advertising sales GTM approach. This will require you to analyze audience data, engagement metrics, and revenue performance to identify opportunities for growth and optimization.
- Evaluate new ideas and strategies for ad-based monetization: define and build the strategy, get buy-in from senior leadership, define the approach, and then execute the plan to bring it to viability.
- Leverage market research to set strategic business goals and work with the India Advertising leader, LCS, GCS, and indirect revenue teams to establish CPMs and pricing models for packages.
- Partner closely with the content, marketing, and brand solutions teams to develop and present compelling opportunities for brands to participate in larger/tentpole shows.
- Establish KPIs and reporting mechanisms to track the success of monetization initiatives.
- Build and own client success metrics and present industry insights to key brands/advertisers.

Qualifications:
- Experience managing teams
- Experience using data and metrics to drive improvements
- Experience with Excel or Tableau (data manipulation, macros, charts, and pivot tables)
- Experience driving direction and alignment with cross-functional teams
- 10+ years in media sales, monetization strategy, or owning ad revenue targets
- Overall 15+ years of non-internship experience
- Bachelor's or Master's from a reputed university
- Experience developing partnerships with partner and tech teams and operating at leadership levels
- MBA
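For readers unfamiliar with the CPM pricing models mentioned above: CPM is the price charged per 1,000 ad impressions, so package revenue is impressions divided by 1,000 times the CPM. A back-of-envelope sketch with invented numbers (not Amazon's actual rates):

```python
def package_revenue(impressions, cpm):
    """CPM is the price per 1,000 impressions; revenue scales linearly."""
    return impressions / 1000 * cpm

# A hypothetical tentpole package: 2M impressions sold at a 250 CPM.
revenue = package_revenue(2_000_000, 250.0)
```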

Posted 1 month ago

Apply

2.0 - 5.0 years

5 - 6 Lacs

Bengaluru

Work from Office

Hiring for Power BI Developer - Bangalore

As a Power BI developer, your primary role will be to deliver business intelligence services, lead BI software development, and present Power BI reports. You will transform raw data into cohesive, valuable reports capturing meaningful business insights. Here are some of your potential responsibilities:
- Designing and developing Power BI reports and dashboards to meet business stakeholders' needs
- Gathering and understanding business requirements for data visualization and analysis
- Collaborating with data engineers and analysts to acquire, clean, and transform data for reporting purposes
- Creating complex DAX calculations and measures to support data analysis
- Ensuring data security and compliance with best practices
- Troubleshooting and resolving issues in Power BI reports
- Providing training and support to end users on using Power BI
- Keeping up to date with the latest Power BI features and trends

Required Power BI developer requirements, qualifications & skills:
- Proficiency in Power BI development, including report and dashboard creation
- Strong understanding of data modeling and data visualization concepts
- Experience with SQL for data manipulation and extraction
- Knowledge of Data Analysis Expressions (DAX) for creating calculations
- Familiarity with data warehouse concepts
- Excellent attention to detail and problem-solving skills
- Excellent communication and collaboration skills
- Ability to work independently and as part of a team
- Adaptability to changing business requirements
- A bachelor's degree in computer science, data analytics, or a relevant field
- Power BI certifications are a plus

Posted 1 month ago

Apply

4.0 - 6.0 years

3 - 7 Lacs

Nagar, Pune

Work from Office

Title: REF64648E - Python Developer + Chatbot with 4-6 years' experience - Pune/Mum/BNG/GGN/CHN, Assistant Manager - WTS. 4-6 years of experience as a Python Developer with a strong understanding of Python programming concepts and best practices. Bachelor's Degree/B.Tech/B.E in Computer Science or a related discipline. Design, develop, and maintain robust and scalable Python-based applications, tools, and frameworks that integrate machine learning models and algorithms. Demonstrated expertise in developing machine learning solutions, including feature selection, model training, and evaluation. Proficiency in data manipulation libraries (e.g., Pandas, NumPy) and machine learning frameworks (e.g., Scikit-learn, TensorFlow, PyTorch, Keras). Experience with web frameworks like Django or Flask. Contribute to the architecture and design of data-driven solutions, ensuring they meet both functional and non-functional requirements. Experience with databases such as MS-SQL Server, PostgreSQL or MySQL. Solid knowledge of OLTP and OLAP concepts. Experience with CI/CD tooling (at least Git and Jenkins). Experience with the Agile/Scrum/Kanban way of working. Self-motivated and hard-working. Knowledge of testing frameworks including Mocha and Jest. Knowledge of RESTful APIs. Understanding of AWS and Azure Cloud services. Experience with chatbot and NLU/NLP based applications is required. Qualifications: Bachelor's Degree/B.Tech/B.E in Computer Science or a related discipline. Job Location
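The posting above asks for chatbot and NLU experience alongside core Python. As a rough illustration of the simplest form of intent detection a chatbot might use, here is a fuzzy-matching sketch built only on the standard library; the intent names and phrases are invented for the example, and a production NLU system would use a trained model rather than string similarity.

```python
from difflib import SequenceMatcher

# Hypothetical intent catalogue -- names and phrases are illustrative only.
INTENTS = {
    "check_order_status": ["where is my order", "track my order", "order status"],
    "reset_password": ["reset my password", "forgot password", "cannot log in"],
}

def match_intent(utterance: str, threshold: float = 0.6):
    """Return the best-matching intent, or None if no phrase is similar enough."""
    utterance = utterance.lower().strip()
    best_intent, best_score = None, 0.0
    for intent, phrases in INTENTS.items():
        for phrase in phrases:
            # ratio() gives a similarity in [0, 1] between the two strings.
            score = SequenceMatcher(None, utterance, phrase).ratio()
            if score > best_score:
                best_intent, best_score = intent, score
    return best_intent if best_score >= threshold else None

print(match_intent("please track my order"))  # check_order_status
print(match_intent("asdf qwerty"))            # None
```

The threshold is the key design knob: too low and unrelated utterances match, too high and paraphrases are missed.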

Posted 1 month ago

Apply

3.0 - 8.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Python Developer. Location: Bengaluru, Karnataka, India. Experience Level: 3-5 Years. Employment Type: Full-Time. Role Overview: We are seeking a skilled Python Developer with a strong background in data manipulation and analysis using NumPy and Pandas, coupled with proficiency in SQL. The ideal candidate will have experience in building and optimizing data pipelines, ensuring efficient data processing and integration. Key Responsibilities: Develop and maintain robust data pipelines and ETL processes using Python, NumPy, and Pandas. Write efficient SQL queries for data extraction, transformation, and loading. Collaborate with cross-functional teams to understand data requirements and deliver solutions. Implement data validation and quality checks to ensure data integrity. Optimize existing codebases for performance and scalability. Document processes and maintain clear records of data workflows. Required Qualifications: Bachelor's degree in Computer Science, Engineering, or a related field. 2-5 years of professional experience in Python development. Proficiency in NumPy and Pandas for data manipulation and analysis. Strong command of SQL and experience with relational databases like MySQL, PostgreSQL, or SQL Server. Familiarity with version control systems, particularly Git. Experience with data visualization tools and libraries is a plus. Preferred Skills: Experience with data visualization libraries such as Matplotlib or Seaborn. Familiarity with cloud platforms like AWS, Azure, or GCP. Knowledge of big data tools and frameworks like Spark or Hadoop. Understanding of machine learning concepts and libraries. Why Join Enterprise Minds: Enterprise Minds is a forward-thinking technology consulting firm dedicated to delivering next-generation solutions. By joining our team, you'll work on impactful projects, collaborate with industry experts, and contribute to innovative solutions that drive business transformation.
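The extract-transform-load loop this role describes can be sketched in a few lines. The example below uses only the standard library's sqlite3 so it runs anywhere; the table, columns, and validation rule are illustrative assumptions, not anything from the posting, and a real pipeline would target MySQL/PostgreSQL/SQL Server and use Pandas for the transform step.

```python
import sqlite3

# Minimal ETL sketch: extract and aggregate in SQL, validate, then "load".
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("south", 120.0), ("south", 80.0), ("north", 50.0)],
)

# Extract + transform: push the aggregation down into the database.
rows = conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
).fetchall()

# Validate: a simple data-quality check before loading downstream.
assert all(total >= 0 for _, total in rows), "negative regional total"

print(rows)  # [('north', 50.0), ('south', 200.0)]
```

Doing the aggregation in SQL rather than in Python keeps the data movement small, which is usually the first optimization in pipelines like these.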

Posted 1 month ago

Apply

5.0 - 10.0 years

4 - 8 Lacs

Pune

Work from Office

The Apex Group was established in Bermuda in 2003 and is now one of the world's largest fund administration and middle office solutions providers. Our business is unique in its ability to reach globally, service locally and provide cross-jurisdictional services. With our clients at the heart of everything we do, our hard-working team has successfully delivered on an unprecedented growth and transformation journey, and we are now represented by circa 13,000 employees across 112 offices worldwide. Your career with us should reflect your energy and passion. That's why, at Apex Group, we will do more than simply empower you. We will work to supercharge your unique skills and experience. Take the lead and we'll give you the support you need to be at the top of your game. And we offer you the freedom to be a positive disrupter and turn big ideas into bold, industry-changing realities. For our business, for clients, and for you. Job Summary: We are seeking a highly skilled and motivated Operational Reporting Specialist to join our dynamic Operations team. The ideal candidate will have extensive experience with Microsoft tools, particularly PowerBI, SharePoint, MS Excel, and MS Lists. They will be capable of not only using these tools but also understanding and interpreting business requirements. This role requires a proactive individual who can think independently and contribute to the continuous improvement of our reporting processes. Key Responsibilities: Develop, maintain, and enhance operational reports and dashboards using PowerBI, SharePoint, MS Excel, and MS Lists. Collaborate with various departments to gather and understand business requirements and translate them into effective reporting solutions. Analyze data to identify trends, patterns, and insights that can drive business decisions. Ensure data accuracy and integrity in all reports and dashboards. Manipulate and transform data to create meaningful and actionable insights.
Provide training and support to team members on the use of PowerBI, SharePoint, MS Excel, and MS Lists. Continuously seek opportunities to improve reporting processes and tools. Respond to ad-hoc reporting requests and provide timely and accurate information. Skills Required: Proven experience with Microsoft PowerBI, SharePoint, MS Excel, and MS Lists. Strong analytical and problem-solving skills. Proficiency in data manipulation and transformation. Data-oriented mindset with a keen eye for detail. Ability to understand and interpret business requirements. Excellent communication and interpersonal skills. Ability to work independently and as part of a team. Detail-oriented with a focus on data accuracy and quality. Qualifications and Experience: Bachelor's degree in Business, Information Technology, or a related field. 5-10 years of relevant experience in an operational reporting role. Disclaimer: Unsolicited CVs sent to Apex (Talent Acquisition Team or Hiring Managers) by recruitment agencies will not be accepted for this position. Apex operates a direct sourcing model and where agency assistance is required, the Talent Acquisition team will engage directly with our exclusive recruitment partners.

Posted 1 month ago

Apply

8.0 - 13.0 years

37 - 45 Lacs

Hyderabad

Hybrid

Role Summary: Join Quantium's India leadership team in shaping the future of data-driven business operations. The Data Systems Lead (India) is pivotal in executing Quantium's internal data and systems transformation strategy, working as part of a globally distributed team to ensure our corporate data ecosystem supports business growth and operational excellence. This role will lead and develop the India-based analytics operations team, working closely with the Australia-based leadership team to deliver high-quality data integration, system maintenance, and analytical solutions across Quantium's internal systems ecosystem. The successful candidate will combine strong technical analytics engineering capabilities with stakeholder management skills, enabling them to work autonomously with business stakeholders when requirements need clarification whilst maintaining close collaboration with the Australian leadership team. This role is critical to Quantium's India growth strategy, building local capability whilst maintaining global standards and connectivity. The position is designed to support both professional excellence and personal fulfilment, with flexible working arrangements and a commitment to inclusive leadership development.
Key Responsibilities
Team Leadership & Talent Development: Guide and develop a team of 3-5 analysts focused on internal data systems and operations. Recruit and nurture local India talent, including analysts similar to current team members. Foster a culture of continuous improvement and operational excellence within the India operations team. Partner closely with Australian squad leaders on resource planning and capability development.
Data Systems & Analytics Engineering: Execute and enhance data integration strategies across Quantium's internal systems ecosystem (Kantata, HubSpot, Anaplan, HRMS, Finance systems). Orchestrate ETL processes, data quality monitoring, and system maintenance operations. Ensure data consistency and integrity across the corporate data warehouse. Drive automation of manual processes and implement operational efficiencies.
Stakeholder Engagement: Collaborate with the Australian leadership team to understand and translate business requirements. Engage directly with internal business stakeholders across verticals when requirements need clarification. Participate actively in regular forums with Finance, People & Culture, and vertical teams to gather feedback and requirements. Support change management initiatives related to systems transformation.
Operational Excellence: Coordinate day-to-day operations of internal data systems and reporting processes. Monitor and resolve data quality issues, implementing systematic approaches to problem-solving. Support month-end financial processes, management reporting, and compliance requirements. Maintain adherence to Quantium's data governance and security standards.
Key Activities
Strategic Execution: Execute business plans for foundational internal data assets, including corporate data management, system integrations, and operational reporting. Support the development of the internal data and systems roadmap based on business requirements. Coordinate with global teams on system enhancements and new implementations.
Team Management: Recruit and develop local analytics talent with strong ETL and data integration capabilities. Create development plans and objectives aligned with operational excellence and business support objectives. Provide coaching and mentorship to develop team members' technical and stakeholder management capabilities. Build inclusive teams that leverage diverse perspectives and experiences. Manage resource allocation across multiple internal projects and business-as-usual operations.
Technical Delivery: Oversee technical designs for data integration solutions and system enhancements. Ensure quality standards are maintained across all data processes and deliverables. Champion best practices for analytics engineering and data operations. Support troubleshooting and resolution of complex data and system issues.
Qualifications
Essential Requirements: Analytics Engineering Foundation: strong experience in ETL development, data integration, and analytics engineering practices. Enterprise Systems Knowledge: previous experience working with ERP systems, enterprise applications, or corporate data ecosystems. Team Leadership and Development: proven experience managing and developing analytics teams (3+ people). Data Integration Expertise: strong understanding of data structures, data transformation, and system integration patterns. Stakeholder Partnership: demonstrated ability to work with business stakeholders and translate requirements into technical solutions. Cross-cultural Communication: comfort working across time zones and cultural contexts.
Highly Desirable: Experience with enterprise applications such as Kantata (or similar PSA tools), HubSpot, Anaplan, or HRMS systems. Background in finance or HR data management and reporting. Experience in professional services or consulting environments. Previous experience working in globally distributed teams. Understanding of data governance and compliance requirements. Experience mentoring and developing junior team members.
Skills Required: Data Engineering: proficiency in SQL, data transformation tools, and ETL processes. Analytics Tools: experience with data analysis and reporting tools. System Integration: understanding of API integrations and data warehouse concepts. Problem Solving: strong systematic approach to troubleshooting data and system issues. Data Quality: experience implementing data quality monitoring and improvement processes.
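The data-quality monitoring this role emphasizes often starts as a small rule engine run against each batch before it is loaded. The sketch below is a minimal illustration of that idea; the field names and rules are invented for the example and are not Quantium's actual schema.

```python
# Illustrative per-field data-quality rules; layout and rules are assumptions.
RULES = {
    "id": lambda v: isinstance(v, int) and v > 0,
    "email": lambda v: isinstance(v, str) and "@" in v,
}

def quality_report(records):
    """Count rule violations per field across a batch of records."""
    failures = {field: 0 for field in RULES}
    for rec in records:
        for field, rule in RULES.items():
            if not rule(rec.get(field)):
                failures[field] += 1
    return failures

batch = [
    {"id": 1, "email": "a@x.com"},   # clean record
    {"id": -2, "email": "broken"},   # fails both rules
]
print(quality_report(batch))  # {'id': 1, 'email': 1}
```

Reporting counts per field, rather than failing on the first bad record, is what makes the output useful for the systematic monitoring the listing describes.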

Posted 1 month ago

Apply

4.0 - 6.0 years

6 - 10 Lacs

Pune, Chennai, Bengaluru

Hybrid

Job Opportunity with Hexaware Technologies! I am looking for a TIBCO Spotfire consultant with 4+ years of experience; immediate joiners only. Interested resources, please share your details to manojkumark2@hexaware.com: Total IT Exp, Exp in Spotfire, CCTC & ECTC, Date of Join in Hexaware, Location. Key Responsibilities: Design, develop, and implement interactive dashboards and reports using TIBCO Spotfire. Strong knowledge of TIBCO Spotfire and data visualization techniques. Experience with SQL and data manipulation. Familiarity with ETL processes and data integration. Excellent problem-solving and analytical skills. Strong communication and collaboration skills.

Posted 1 month ago

Apply

2.0 - 6.0 years

7 - 11 Lacs

Chennai

Work from Office

Senior Software Engineer. Are you an experienced developer with a can-do attitude and enthusiasm that inspires others? Do you enjoy being part of a team that develops high-quality code? About the Business: At Cirium, our goal is to keep the world connected. We are the industry leader in aviation analytics, helping our customers understand the past, present, and predict what will happen tomorrow. Our mission is to transform the aviation industry by enabling airlines, airports, travel companies, tech giants, aircraft manufacturers, financial institutions, and many more to accelerate their own digital transformation. You can learn more about Cirium at the link below. About the Role: We are looking for a Software Engineer to join our team. In this role, you will engage in complex research, design, and software development assignments within various software areas or product lines. You will also contribute directly to project plans, schedules, and methodologies in the development of cross-functional software products. RESPONSIBILITIES: Collaborate with team members to finalize requirements. Write and review detailed specifications for the development of complex system components.
Complete bug fixes. Translate product requirements into software designs. Implement development processes, coding best practices, and code reviews. Work in various development environments (e.g., Agile, Waterfall). Resolve technical issues as necessary. Mentor junior software engineers, sharing knowledge and supporting their growth. Stay informed about new technology developments. Design and work with complex data models. Requirements: Several years of software engineering experience or equivalent. Knowledge of software development methodologies (e.g., Agile, Waterfall). Proficiency with data manipulation languages and optimization techniques. Understanding of normalized/dimensional data modeling principles. Knowledge of multiple data storage subsystems. Experience with development languages and tools (e.g., Java, Docker, Kubernetes). Strong research skills. Knowledge of industry best practices in development. Ability to use and develop applicable tool sets. Ability to collaborate effectively with team members to finalize requirements. Ability to complete bug fixes. Good oral and written communication skills. Learn more about the LexisNexis Risk team and how we work here. We are committed to providing a fair and accessible hiring process. If you have a disability or other need that requires accommodation or adjustment, please let us know by completing our Applicant Request Support Form or please contact 1-855-833-5120. Criminals may pose as recruiters asking for money or personal information. We never request money or banking details from job applicants. Learn more about spotting and avoiding scams here. Please read our Candidate Privacy Policy.
We are an equal opportunity employer: qualified applicants are considered for and treated during employment without regard to race, color, creed, religion, sex, national origin, citizenship status, disability status, protected veteran status, age, marital status, sexual orientation, gender identity, genetic information, or any other characteristic protected by law. USA Job Seekers: EEO Know Your Rights .

Posted 1 month ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Bengaluru

Work from Office

As a Data Engineer, you will develop, maintain, evaluate and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the AWS Cloud Data Platform. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: AWS Data Vault 2.0 development for agile data ingestion, storage and scaling. Databricks for complex queries on transformation, aggregation and business logic implementation. AWS Redshift and Redshift Spectrum for complex queries on transformation, aggregation and business logic implementation. DWH concepts: star schema and materialized views. Strong SQL and data manipulation/transformation skills. Preferred technical and professional experience: Robust and scalable cloud infrastructure. End-to-end data engineering pipelines. Versatile programming capabilities.

Posted 1 month ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Gurugram

Work from Office

Hello Visionary! We know that the only way a business thrives is if our people are growing. That's why we always put our people first. Our global, diverse team would be happy to support you and challenge you to grow in new ways. Who knows where our shared journey will take you? We are looking for a Senior Tableau Developer. We are seeking a highly skilled Senior Tableau Developer with a minimum of 5 years of relevant experience to join our team. The ideal candidate will be responsible for designing, developing, and maintaining Tableau dashboards and reports to support business decision-making processes. You'll make a difference by: Develop and maintain Tableau dashboards and reports. Collaborate with business stakeholders to gather requirements and translate them into effective visualizations. Optimize and enhance existing Tableau solutions for better performance and usability. Provide training and support to end-users on Tableau functionalities. Ensure data accuracy and integrity in all reports and dashboards. Drive report requirements and specifications, provide feasibility analysis, and effort estimation. Manage Tableau report access, deployment, and development with best practices. Provide daily/weekly project status updates to Project Managers. Create or update necessary documentation as per project requirements. Collaborate with the team and all stakeholders. You'll win us over by: Expertise in developing innovative and complex dashboards in Tableau. Strong understanding of data visualization principles. Proficiency in SQL and data manipulation. Excellent analytical and problem-solving skills. Working knowledge of databases like Snowflake and Oracle. Extensive experience in writing complex database queries using SQL. Demonstrated experience in creating and presenting data, dashboards, and analysis to the management team with the ability to explain complex analytical concepts. Good to Have: Tableau certification such as Desktop Specialist/Professional.
Working experience in Agile methodologies. Experience with other BI tools like Power BI. Strong skills in Microsoft Excel and PowerPoint. AWS know-how. Experience in the Finance domain. Data security and handling expertise. Create a better #TomorrowWithUs! This role, based in Bangalore, is an individual contributor position. You may be required to visit other locations within India and internationally. In return, you'll have the opportunity to work with teams shaping the future. At Siemens, we are a collection of over 312,000 minds building the future, one day at a time, worldwide. We are dedicated to equality and welcome applications that reflect the diversity of the communities we serve. All employment decisions at Siemens are based on qualifications, merit, and business need. Bring your curiosity and imagination, and help us shape tomorrow. Find out more about Siemens careers at www.siemens.com/careers

Posted 1 month ago

Apply

15.0 - 20.0 years

17 - 22 Lacs

Chennai

Work from Office

Project Role: Data Engineer. Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills: Databricks Unified Data Analytics Platform. Good to have skills: NA. A minimum of 7.5 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand their data needs and provide effective solutions, ensuring that the data infrastructure is robust and scalable to meet the demands of the organization. Roles & Responsibilities: Expected to be an SME. Collaborate with and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute on key decisions. Provide solutions to problems for their immediate team and across multiple teams. Mentor junior team members to enhance their skills and knowledge in data engineering. Continuously evaluate and improve data processes to enhance efficiency and effectiveness. Professional & Technical Skills: Must-have skills: Proficiency in Databricks Unified Data Analytics Platform. Experience with data integration and ETL tools. Strong understanding of data modeling and database design principles. Familiarity with cloud platforms and services related to data storage and processing. Knowledge of programming languages such as Python or Scala for data manipulation.
Additional Information: The candidate should have a minimum of 7.5 years of experience in the Databricks Unified Data Analytics Platform. This position is based in Chennai. A 15 years full time education is required. Qualification: 15 years full time education.

Posted 1 month ago

Apply

15.0 - 20.0 years

17 - 22 Lacs

Bengaluru

Work from Office

Project Role: Data Engineer. Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills: Databricks Unified Data Analytics Platform. Good to have skills: NA. A minimum of 7.5 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand their data needs and provide effective solutions, ensuring that the data infrastructure is robust and scalable to meet the demands of the organization. Roles & Responsibilities: Expected to be an SME. Collaborate with and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute on key decisions. Provide solutions to problems for their immediate team and across multiple teams. Mentor junior team members to enhance their skills and knowledge in data engineering. Continuously evaluate and improve data processes to enhance efficiency and effectiveness. Professional & Technical Skills: Must-have skills: Proficiency in Databricks Unified Data Analytics Platform. Experience with data pipeline orchestration tools such as Apache Airflow or similar. Strong understanding of ETL processes and data warehousing concepts. Familiarity with cloud platforms like AWS, Azure, or Google Cloud. Knowledge of programming languages such as Python or Scala for data manipulation.
Additional Information: The candidate should have a minimum of 7.5 years of experience in the Databricks Unified Data Analytics Platform. This position is based at our Bengaluru office. A 15 years full time education is required. Qualification: 15 years full time education.

Posted 1 month ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Hyderabad

Work from Office

Project Role: Data Engineer. Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills: Databricks Unified Data Analytics Platform. Good to have skills: NA. A minimum of 3 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and contribute to the overall data strategy of the organization, ensuring that data is accessible, reliable, and actionable for stakeholders. Roles & Responsibilities: Expected to perform independently and become an SME. Required active participation/contribution in team discussions. Contribute to providing solutions to work-related problems. Assist in the design and implementation of data architecture and data models. Monitor and optimize data pipelines for performance and reliability. Professional & Technical Skills: Must-have skills: Proficiency in Databricks Unified Data Analytics Platform. Strong understanding of data integration techniques and ETL processes. Experience with data quality frameworks and data governance practices. Familiarity with cloud platforms and services related to data storage and processing. Knowledge of programming languages such as Python or Scala for data manipulation.
Additional Information: The candidate should have a minimum of 3 years of experience in the Databricks Unified Data Analytics Platform. This position is based at our Hyderabad office. A 15 years full time education is required. Qualification: 15 years full time education.

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies