
1286 Relational Databases Jobs - Page 13

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

As an Engineer at Pramana, Inc., you will evaluate problem definitions and requirements, develop solutions, and assess their operational feasibility. You will document your solutions through flowcharts, layouts, diagrams, charts, code comments, and clear code. Designing system specifications and standards, and programming to prepare and install solutions, will be a key part of your role. You will conduct systems analysis to improve operations and recommend changes in policies and procedures, and you will obtain and license software from vendors, recommend purchases, and test and approve products. Staying current with the latest development tools and programming techniques is essential, so you will pursue educational opportunities, read professional publications, and maintain the confidentiality of operations. You will also collect, analyze, and summarize development and service issues to inform engineering and organizational goals. Expertise in software development with Python and C++, designing interactive applications, knowledge of relational databases and SQL, experience with test-driven development, proficiency in software engineering tools, and the ability to document requirements and specifications are essential for success in this role. To qualify, you should hold a Bachelor's or Master's degree in Computer Science Engineering or an equivalent field. Join us at Pramana, Inc. to help bring fully autonomous systems to pathology labs and enable AI-enabled workflows that serve patients. Visit https://pramana.ai/ to learn more.
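Since the role calls for Python and test-driven development, here is a minimal TDD-style sketch: the test states the requirement first, and the function is written to satisfy it. All names are hypothetical illustrations, not Pramana code.

```python
# Hypothetical example: normalize a scanner slide identifier.
def normalize_slide_id(raw: str) -> str:
    """Trim, uppercase, and join tokens with hyphens."""
    return "-".join(raw.strip().upper().split())

# In a TDD workflow this test is written first, then the function above.
def test_normalize_slide_id():
    assert normalize_slide_id("  ab 123 ") == "AB-123"

if __name__ == "__main__":
    test_normalize_slide_id()
    print("ok")
```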

Posted 3 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

Delhi

On-site

You are invited to join our team at ThoughtSol Infotech Pvt. Ltd. as a skilled and experienced Power BI Developer. If you have a passion for data visualization, analytics, and business intelligence, along with a strong background in developing and implementing Power BI solutions, we are looking for you. Your responsibilities will include developing and maintaining Power BI dashboards, reports, and data visualizations to meet business requirements. You will design and implement data models, ETL processes, and data integration solutions using Power BI and related technologies. Collaborating with business stakeholders to gather requirements, understand data needs, and deliver actionable insights will be a key part of your role. It will also be essential to optimize the performance and usability of Power BI solutions through data modeling, query optimization, and UI/UX enhancements. Implementing data governance and security best practices to ensure data accuracy, integrity, and confidentiality is another crucial aspect of the position. Additionally, you will provide training and support to end users on Power BI usage, best practices, and troubleshooting, and you will stay updated on the latest Power BI features, trends, and best practices in order to recommend improvements to existing solutions. To qualify for this role, you should have a minimum of 2 years of experience and hold a Bachelor's degree in Computer Science, Information Systems, or a related field. Proficiency in Power BI Desktop, Power Query, DAX, and Power BI Service is required. A strong understanding of data warehousing concepts, data modeling techniques, and ETL processes is essential. Experience with SQL, T-SQL, and relational databases (e.g., SQL Server, MySQL, PostgreSQL) is also necessary. Familiarity with Azure services (e.g., Azure SQL Database, Azure Data Lake, Azure Analysis Services) is considered a plus. Excellent analytical, problem-solving, and communication skills are a must, along with the ability to work independently and collaboratively in a fast-paced environment.
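Power Query and DAX are Power BI-specific, so as a language-neutral illustration of the ETL-plus-aggregation step such dashboards sit on, here is a minimal pandas sketch; the table and column names are invented.

```python
import pandas as pd

# Extract: a stand-in for a source-system feed (hypothetical data).
sales = pd.DataFrame({
    "region": ["North", "North", "South"],
    "amount": [120.0, 80.0, 200.0],
})

# Transform: aggregate to the grain the dashboard reports on.
by_region = sales.groupby("region", as_index=False)["amount"].sum()

# Load: write a clean extract a BI tool can import.
by_region.to_csv("sales_by_region.csv", index=False)
print(by_region)
```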

Posted 3 weeks ago

Apply

5.0 - 10.0 years

16 - 20 Lacs

Bengaluru

Work from Office

Project description
We are seeking a highly skilled Data Modelling Expert with deep experience in the Avaloq Core Banking platform to join our technology team. The ideal candidate will be responsible for designing, maintaining, and optimizing complex data models that support our banking products, client data, and regulatory reporting needs. This role requires a blend of domain expertise in banking and wealth management, strong data architecture capabilities, and hands-on experience with the Avaloq platform.

Responsibilities
- Design, implement, and maintain conceptual, logical, and physical data models within the Avaloq Core Banking system.
- Collaborate with business analysts, product owners, and Avaloq parameterisation teams to translate business requirements into robust data models.
- Ensure alignment of data models with Avaloq's object model and industry best practices.
- Perform data profiling, quality checks, and lineage tracing to support regulatory and compliance requirements (e.g., Basel III, MiFID II, ESG); see the profiling sketch after this listing.
- Support integration of Avaloq data with downstream systems (e.g., CRM, data warehouses, reporting platforms).
- Provide expert input on data governance, metadata management, and model documentation.
- Contribute to change requests, upgrades, and data migration projects involving Avaloq.
- Collaborate with cross-functional teams to drive data consistency, reusability, and scalability.
- Review and validate existing data models; identify gaps and optimisation opportunities.
- Ensure data models meet performance, security, and privacy requirements.

Skills
Must have
- Proven experience (5+ years) in data modelling or data architecture, preferably within financial services.
- 3+ years of hands-on experience with the Avaloq Core Banking Platform, especially its data structures and object model.
- Strong understanding of relational databases and data modelling tools (e.g., ER/Studio, ERwin, or similar).
- Proficient in SQL and data manipulation in Avaloq environments.
- Knowledge of banking products, client lifecycle data, and regulatory data requirements.
- Familiarity with data governance, data quality, and master data management concepts.
- Experience working in Agile or hybrid project delivery environments.

Nice to have
- Exposure to Avaloq Scripting or parameterisation is a strong plus.
- Experience integrating Avaloq with data lakes, BI/reporting tools, or regulatory platforms.
- Understanding of data privacy regulations (GDPR, FINMA, etc.).
- Certification in Avaloq or relevant financial data management domains is advantageous.
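Avaloq's schemas and tooling are proprietary, so as a generic illustration of the data-profiling and quality checks the role describes, here is a minimal sketch over an in-memory SQLite table; the table and columns are invented stand-ins.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE client (id INTEGER, domicile TEXT)")
con.executemany("INSERT INTO client VALUES (?, ?)",
                [(1, "CH"), (2, None), (3, "CH")])

# Profile one attribute: row count, non-null count, distinct count.
row = con.execute(
    "SELECT COUNT(*)              AS rows_total, "
    "       COUNT(domicile)       AS domicile_filled, "
    "       COUNT(DISTINCT domicile) AS domicile_distinct "
    "FROM client"
).fetchone()
print(dict(zip(["rows_total", "domicile_filled", "domicile_distinct"], row)))
# {'rows_total': 3, 'domicile_filled': 2, 'domicile_distinct': 1}
```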

Posted 3 weeks ago

Apply

3.0 - 6.0 years

3 - 7 Lacs

Mumbai

Work from Office

Objectives of This Role
- Design and implement efficient, scalable backend services using Python.
- Work closely with healthcare domain experts to create innovative and accurate diagnostics solutions.
- Build APIs, services, and scripts to support data processing pipelines and front-end applications.
- Automate recurring tasks and ensure robust integration with cloud services.
- Maintain high standards of software quality and performance using clean coding principles and testing practices.
- Collaborate within the team to upskill and unblock each other for faster and better outcomes.

Primary Skills
- Python development: proficient in Python 3 and its ecosystem.
- Frameworks: Flask / Django / FastAPI (a minimal FastAPI sketch follows this listing).
- RESTful API development.
- Understanding of OOP and SOLID design principles.
- Asynchronous programming (asyncio, aiohttp).
- Experience with task queues (Celery, RQ).
- Relational databases: PostgreSQL / MySQL.
- NoSQL: MongoDB / Redis / Cassandra.
- ORM tools: SQLAlchemy / Django ORM.

Bonus Skills
- Data handling libraries: Pandas / NumPy.
- Scripting experience: Bash / PowerShell.
- Functional programming concepts.
- Familiarity with front-end integration (REST API usage, JSON handling).

Other Skills
- Innovation and thought leadership.
- Interest in learning new tools, languages, and workflows.
- Strong communication and collaboration skills.
- Basic understanding of UI/UX principles.

Skills: Python, Algorithms, Flask, Django, and MongoDB
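A minimal sketch of the kind of async FastAPI endpoint this stack implies, assuming the fastapi package; the route, model, and status values are hypothetical. Run with `uvicorn app:app`.

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class Report(BaseModel):
    patient_id: str
    status: str

@app.get("/reports/{patient_id}", response_model=Report)
async def get_report(patient_id: str) -> Report:
    # In a real service this would await a database or task-queue result.
    return Report(patient_id=patient_id, status="pending")
```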

Posted 3 weeks ago

Apply

2.0 - 7.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Project Role: Software Development Engineer
Project Role Description: Analyze, design, code, and test multiple components of application code across one or more clients. Perform maintenance, enhancements, and/or development work.
Must have skills: Go Programming Language
Good to have skills: Java Full Stack Development
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Software Development Engineer, you will engage in a dynamic work environment where you will analyze, design, code, and test various components of application code across multiple clients. Your day will involve collaborating with team members to ensure the successful implementation of software solutions, while also performing maintenance and enhancements to existing applications. You will be responsible for delivering high-quality code and contributing to the overall success of your projects, ensuring that client requirements are met effectively and efficiently.

Roles & Responsibilities:
- Develop, test, and maintain scalable back-end services, APIs, and microservices; full unit test coverage is mandatory.
- Design and implement robust, secure, and reliable systems to handle complex workflows.
- Collaborate with US-based cross-functional teams to gather requirements and create solutions tailored to enterprise needs.
- Optimize existing back-end systems for performance, scalability, and maintainability.
- Implement and enforce best practices for code quality, testing, deployment, and documentation.
- Troubleshoot and resolve back-end system issues, ensuring high availability.

Professional & Technical Skills:
- 5+ years of professional software engineering experience in a product-oriented, live production environment.
- 2+ years developing Golang software.
- 2+ years of experience building on a microservices architecture with AWS.
- Strong background in building scalable, reliable, and low-latency systems.
- Experience with back-end frameworks including GraphQL and REST.
- Proficiency in working with relational databases like PostgreSQL.
- Strong understanding of modern web protocols, security concerns, and system integrations.

Additional Information:
- The candidate should have a minimum of 3 years of experience in the Go Programming Language.
- This position is based at our Hyderabad office.
- B.Tech required.
Qualification: 15 years full time education

Posted 3 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

Pune, Maharashtra

On-site

You should have a minimum of 8 years of experience developing applications using React Native. Experience leading a team and expertise in micro frontends are essential. A solid understanding of ES6, TypeScript, and mobile architecture is required for this role. You should have hands-on experience writing unit test cases with Jest and conducting end-to-end testing, along with proficiency in React Native state management libraries. In addition, you must have strong knowledge of the iOS and Android platforms and experience working with git repositories. The ability to troubleshoot and analyze technical issues effectively, along with skills in web performance optimization, is highly valued. A good understanding of REST APIs and relational or NoSQL databases is crucial for this position, and familiarity with build tools and unit testing is also expected.

Posted 3 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Data Engineer, you will be responsible for designing, building, and maintaining scalable ETL pipelines using Java and SQL-based frameworks. Your role involves extracting data from various structured and unstructured sources, transforming it into formats suitable for analytics and reporting, and collaborating with data scientists, analysts, and business stakeholders to gather data requirements and optimize data delivery. Additionally, you will develop and maintain data models, databases, and data integration solutions, while monitoring data pipelines and troubleshooting data issues to ensure data quality and integrity. Your expertise in Java for backend/ETL development and proficiency in SQL for data manipulation, querying, and performance tuning will be crucial in this role. You should have hands-on experience with ETL tools such as Apache NiFi, Talend, Informatica, or custom-built ETL pipelines (the sketch below shows the general shape), along with familiarity with relational databases like PostgreSQL, MySQL, and Oracle, and with data warehousing concepts. Experience with version control systems like Git is also required. Furthermore, you will be responsible for optimizing data flow and pipeline architecture for performance and scalability, documenting data flow diagrams, ETL processes, and technical specifications, and ensuring adherence to security, governance, and compliance standards related to data. To qualify for this position, you should hold a Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field, along with at least 5 years of professional experience as a Data Engineer or in a similar role. Strong technical skills and practical experience in data engineering will be essential to fulfilling the responsibilities of this role.
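The posting is Java-centric; purely to illustrate the extract-transform-load shape it describes (and to keep this page's sketches in one language), here is a minimal Python version. The file names and the quality rule are invented.

```python
import csv

def extract(path):
    """Extract: stream rows from a source file."""
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def transform(rows):
    """Transform: coerce types and apply a basic quality rule."""
    for r in rows:
        r["amount"] = float(r["amount"])
        if r["amount"] >= 0:
            yield r

def load(rows, out_path):
    """Load: write the cleaned rows to the target file."""
    rows = list(rows)
    with open(out_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=rows[0].keys())
        writer.writeheader()
        writer.writerows(rows)

# Hypothetical usage:
# load(transform(extract("orders.csv")), "orders_clean.csv")
```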

Posted 3 weeks ago

Apply

6.0 - 12.0 years

0 Lacs

Karnataka

On-site

You will be leading a cross-functional team as a Team Lead Software Development, responsible for developing scalable trading tools, APIs, and platforms. Your role will involve collaborating closely with product managers and clients to deliver both in-house solutions and customized implementations for our client base in the trading domain. This position is a unique blend of product ownership, client interaction, team management, and hands-on technical delivery, making it ideal for individuals who excel in a fast-paced finance-tech environment. As a Team Lead, your key responsibilities will include leading and mentoring a team of junior and mid-level developers, prioritizing development efforts between product roadmap features and custom client requests, overseeing the design, development, and delivery of web, mobile, and backend applications, collaborating with various teams to plan and prioritize development activities, architecting scalable solutions, reviewing code, ensuring high standards of performance, security, and reliability, managing technical delivery across multiple projects, engaging with stakeholders for requirements gathering, sprint planning, and roadmap alignment, monitoring timelines, conducting risk assessments, staying current with emerging technologies, and proposing innovations to enhance productivity and user experience. To be successful in this role, you should possess a Bachelor's degree in Computer Science, Software Engineering, or a related field from a prestigious institution such as the IITs or NITs, at least 6 years of professional software development experience, and strong knowledge of programming languages and DSA concepts, along with 12+ years of experience in a team lead or technical leadership role. Additionally, exposure to the product lifecycle, hands-on experience with automated testing frameworks, familiarity with relational and NoSQL databases, expertise in REST APIs, microservices, and version control, and proficiency in Agile/Scrum methodologies, CI/CD pipelines, and DevOps collaboration are essential qualifications. Knowledge of financial markets, trading concepts, algorithms, quantitative analysis, statistics, risk management, order execution systems, the FIX Protocol, market data feeds, and competitive coding platforms, along with strong mathematical and statistical skills, would be advantageous. In this role, your problem-solving and analytical skills, your ability to work under pressure in a fast-paced environment, and your excellent communication, teamwork, and leadership skills will be key to effectively leading your team and driving successful outcomes.

Posted 3 weeks ago

Apply

10.0 - 15.0 years

20 - 35 Lacs

Noida, Bengaluru

Work from Office

Description: We are looking for a Python Developer with working knowledge of ETL workflows. Experience in data extraction using APIs and writing queries in PostgreSQL is mandatory (a minimal sketch of both follows this listing).

Requirements and Responsibilities:
- Good experience in Python programming and problem solving.
- Strong grasp of data structures and their implementation.
- Good knowledge of databases, i.e., relational databases and SQL.
- Proficiency in taking requirements through to implementation.
- A degree in Computer Science.
- Good communication, prioritization, and organization skills.
- Keenness to learn and upskill.

What We Offer:
- Exciting Projects: We focus on industries like high-tech, communication, media, healthcare, retail, and telecom. Our customer list is full of fantastic global brands and leaders who love what we build for them.
- Collaborative Environment: You can expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment, or even abroad in one of our global centers or client facilities.
- Work-Life Balance: GlobalLogic prioritizes work-life balance, which is why we offer flexible work schedules, opportunities to work from home, and paid time off and holidays.
- Professional Development: Our dedicated Learning & Development team regularly organizes communication skills training (GL Vantage, Toastmasters), stress management programs, professional certifications, and technical and soft-skill trainings.
- Excellent Benefits: We provide our employees with competitive salaries, family medical insurance, Group Term Life Insurance, Group Personal Accident Insurance, NPS (National Pension Scheme), periodic health awareness programs, extended maternity leave, annual performance bonuses, and referral bonuses.
- Fun Perks: We want you to love where you work, which is why we host sports events and cultural activities, offer food at subsidised rates, and throw corporate parties. Our vibrant offices also include dedicated GL Zones, rooftop decks, and the GL Club, where you can drink coffee or tea with your colleagues over a game of table tennis, plus discounts for popular stores and restaurants.
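The two mandatory skills above are API extraction and PostgreSQL queries; here is a minimal sketch of both together, assuming the requests and psycopg2 packages. The endpoint, DSN, and table are placeholders, not real systems.

```python
import requests
import psycopg2

# Extract: pull records from a hypothetical HTTP API.
resp = requests.get("https://api.example.com/items", timeout=30)
resp.raise_for_status()

# Load: upsert into PostgreSQL (placeholder DSN; items.id assumed UNIQUE).
conn = psycopg2.connect("dbname=etl user=etl")
with conn, conn.cursor() as cur:  # the connection context commits on success
    for item in resp.json():
        cur.execute(
            "INSERT INTO items (id, name) VALUES (%s, %s) "
            "ON CONFLICT (id) DO UPDATE SET name = EXCLUDED.name",
            (item["id"], item["name"]),
        )
conn.close()
```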

Posted 3 weeks ago

Apply

5.0 - 10.0 years

19 - 25 Lacs

Hyderabad, Gurugram, Bengaluru

Work from Office

As a full-spectrum integrator, we assist hundreds of companies in realizing the value, efficiency, and productivity of the cloud. We take customers on their journey to enable, operate, and innovate using cloud technologies, from migration strategy to operational excellence and immersive transformation. If you like a challenge, you'll love it here, because we're solving complex business problems every day, building and promoting great technology solutions that impact our customers' success. The best part is, we're committed to you and your growth, both professionally and personally. You will be part of a team designing, automating, and deploying services on behalf of our customers to the cloud in a way that allows these services to automatically heal themselves if things go south. We have deep experience applying cloud architecture techniques in virtually every industry. Every week is different, and the problems you will be challenged to solve are constantly evolving. We build solutions using infrastructure-as-code so our customers can refine and reuse these processes again and again, all without having to come back to us for additional deployments. Key Responsibilities Create well-designed, documented, and tested software features that meet customer requirements. Identify and address product bugs, deficiencies, and performance bottlenecks. Participate in an agile delivery team, helping to ensure the technical quality of the features delivered across the team, including documentation, testing strategies, and code. Help determine technical feasibility and solutions for business requirements. Remain up-to-date on emerging technologies and architectures and propose ways to use them in current and upcoming projects. Leverage technical knowledge to cut scope while maintaining or achieving the overall goals of the product. Leverage technical knowledge to improve the quality and efficiency of product applications and tools. Willingness to travel to client locations and deliver professional services. Qualifications Experience developing software in GCP, AWS, or Azure. 5+ years of experience developing applications in Java. 3+ years with at least one other programming language, such as Scala, Python, Go, C#, TypeScript, or Ruby. Experience with relational databases, including designing complex schemas and queries. Experience developing within distributed systems or a microservice-based architecture. Strong verbal and written communication skills for documenting workflows, tools, and complex areas of a codebase. Ability to thrive in a fast-paced environment and multi-task efficiently. Strong analytical and troubleshooting skills. 3+ years of experience as a technical specialist in customer-facing roles. Experience with Agile development methodologies. Experience with Continuous Integration and Continuous Delivery (CI/CD). Preferred Qualifications Experience with GCP. Building applications using container and serverless technologies. Cloud certifications. Good exposure to Agile software development and DevOps practices such as Infrastructure as Code (IaC), continuous integration, and automated deployment. Exposure to Continuous Integration (CI) tools (e.g., Jenkins). Strong practical application development experience on Linux- and Windows-based systems. Experience working directly with customers, partners, or third-party developers. Location: Remote, Bangalore, Gurgaon, Hyderabad

Posted 3 weeks ago

Apply

5.0 - 10.0 years

12 - 16 Lacs

Kolkata

Work from Office

Capgemini Invent
Capgemini Invent is the digital innovation, consulting and transformation brand of the Capgemini Group, a global business line that combines market-leading expertise in strategy, technology, data science and creative design to help CxOs envision and build what's next for their businesses.

Your role
- 5+ years of experience in creating data strategy frameworks/roadmaps.
- Relevant experience in data exploration and profiling; involvement in data literacy activities for all stakeholders.
- 5+ years in analytics and data maturity evaluation based on current as-is vs. to-be frameworks.
- 5+ years of relevant experience in creating functional requirements documents and enterprise to-be data architectures.
- Relevant experience in identifying and prioritizing use cases for the business, including identification of important KPIs and opex/capex considerations for CXOs.
- 2+ years of working knowledge in Data Strategy, Data Governance, MDM, etc.
- 4+ years of experience in Data Analytics operating models with a vision spanning prescriptive, descriptive, predictive, and cognitive analytics.
- Identify, design, and recommend internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Identify data tools for analytics and data science team members that assist them in building and optimizing our product into an innovative industry leader.
- Work with data and analytics experts to create frameworks for digital twins/digital threads.
- Relevant experience in coordinating with cross-functional teams; act as the SPOC for global master data.

Your profile
- 8+ years of experience in a Data Strategy role, with a graduate degree in Computer Science, Informatics, Information Systems, or another quantitative field, and experience using the following software/tools:
- Understanding of big data tools: Hadoop, Spark, Kafka, etc.
- Understanding of relational SQL and NoSQL databases, including Postgres and Cassandra/MongoDB.
- Understanding of data pipeline and workflow management tools: Luigi, Airflow, etc. (a minimal Airflow sketch follows this listing).
- Good-to-have cloud skill sets (Azure/AWS/GCP).
- 5+ years of advanced working SQL knowledge and experience working with relational databases and query authoring (SQL), as well as working familiarity with a variety of databases: Postgres/SQL/Mongo.
- 2+ years of working knowledge in Data Strategy, Data Governance, MDM, etc.
- Experience performing root-cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills related to working with unstructured datasets.
- A successful history of manipulating, processing, and extracting value from large, disconnected datasets.
- Working knowledge of message queuing, stream processing, and highly scalable big data stores.
- Strong project management and organizational skills.
- Experience supporting and working with cross-functional teams in a dynamic environment.

About Capgemini
Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over-55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, cloud and data, combined with its deep industry expertise and partner ecosystem. The Group reported 2023 global revenues of €22.5 billion.
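Since the profile names Airflow and Luigi among the workflow management tools, here is a minimal Apache Airflow (2.x) DAG sketch; the DAG ID and task logic are invented placeholders.

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def profile_sources():
    # Stand-in for real data-profiling logic.
    print("profiling source tables...")

with DAG(
    dag_id="data_strategy_profiling",   # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                  # Airflow 2.4+ parameter name
    catchup=False,
) as dag:
    PythonOperator(task_id="profile_sources", python_callable=profile_sources)
```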

Posted 3 weeks ago

Apply

2.0 - 5.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Responsibilities / Tasks
Full Stack Developer: The candidate should be experienced in both frontend and backend development using TypeScript, with a strong focus on building scalable web applications in a modern monorepo environment.

Required Skills
- Minimum 2-5 years of professional experience in full stack development.
- Proficiency in TypeScript.
- Solid experience in backend development.
- Experience with NestJS (alternatively, ExpressJS with a good understanding of dependency injection).
- Strong understanding of relational databases.
- Proficiency in Angular.
- Experience with RxJS (if not yet proficient in RxJS, the candidate must commit to learning it before starting, as it is a core part of the frontend codebase).

Nice to Have (or Willingness to Learn)
- Experience with TypeORM.
- Familiarity with Joi for input validation.
- Experience using @nestjs/swagger for API documentation.
- Even without prior experience with object-relational mapping, input validation, or Swagger, these can be learned quickly with a solid foundation in backend development.

Your Profile / Qualifications
The required skills listed above.

Did we spark your interest? Then please click Apply above to access our guided application process.

Posted 3 weeks ago

Apply

7.0 - 10.0 years

7 - 15 Lacs

Bengaluru, Karnataka, India

On-site

Pradeepit Consulting Services is actively seeking a Java AWS Full Stack Developer to join our dynamic team and embark on a rewarding career journey. This pivotal role requires a seasoned developer with expertise in Java and Spring Boot, coupled with significant experience in building and managing highly scalable cloud-based applications on the AWS platform. You'll be instrumental in designing, developing, and maintaining both backend and frontend components, ensuring security, performance, and seamless integration.

Key Responsibilities
- Application Design & Development: Design, develop, and maintain highly scalable cloud-based applications using Java and Spring Boot on the AWS platform.
- Collaboration & Requirements: Collaborate with cross-functional teams to gather requirements, analyze business needs, and translate them into robust technical solutions.
- Cloud Best Practices: Implement best practices for cloud application development, including security, scalability, and performance optimization.
- API & Microservices Management: Develop and manage APIs, microservices, and serverless components on AWS.
- AWS Service Configuration: Set up and configure AWS services, such as EC2, Lambda, S3, RDS, DynamoDB, and API Gateway, to support application functionality.
- Security Implementation: Ensure application and data security by implementing appropriate AWS security measures and encryption techniques.
- Performance Monitoring & Troubleshooting: Monitor application performance, troubleshoot issues, and implement necessary improvements to maintain high availability and reliability.
- DevOps & CI/CD: Collaborate with DevOps teams to set up CI/CD pipelines for automated deployment and continuous integration of the applications.
- Continuous Learning: Stay updated with the latest AWS services, tools, and best practices, and recommend their adoption to improve development processes.
- Documentation: Write clear and comprehensive technical documentation for applications, APIs, and AWS infrastructure.
- UI Development: Hands-on UI development, with a preference for ReactJS.

Requirements (Mandatory)
- Proven experience as a Java Developer, with hands-on expertise in Spring Boot.
- Proficiency in developing RESTful APIs and microservices architectures.
- Experience with relational databases (e.g., MySQL, PostgreSQL).
- Familiarity with containerization technologies such as Docker.
- Understanding of DevOps principles and experience with CI/CD pipelines (e.g., Jenkins, GitLab CI/CD).
- Knowledge of application security practices and implementing secure AWS solutions.
- Hands-on experience with UI development, preferably ReactJS.
- Strong problem-solving skills and the ability to work independently or in a team environment.
- Understanding of performance monitoring and debugging tools for cloud applications.
- Familiarity with the Bosch Quality Process.

Preferred Skills
- Experience with NoSQL databases (e.g., DynamoDB, MongoDB).
- AWS certifications demonstrating expertise in AWS services.
- Experience with serverless computing using AWS Lambda.
- Knowledge of infrastructure-as-code tools like AWS CloudFormation or Terraform.

Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Proven experience as a Java Developer with a strong focus on AWS and full stack development.
- Typically seeking candidates with 7-10 years of experience.

Posted 3 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

You are an experienced Alteryx developer responsible for designing and developing new applications and enhancing existing models using Alteryx Design Studio. You will be involved in the entire Software Development Life Cycle (SDLC), requiring excellent communication skills and direct collaboration with the business. It is crucial that you are self-sufficient and adept at building internal networks within the business and technology teams. Your responsibilities include owning changes from inception to deployment, implementing new functionality, identifying process gaps for improvement, and focusing on scalability and stability. You must be results-oriented, self-motivated, and able to multitask across different teams and applications. Effective communication with remotely dispersed teams is also essential for this role. Your technical expertise should include workflow enhancement, designing macros, integrating Alteryx with various tools, maintaining user roles in the Alteryx Gallery, using version control systems like Git, and working with multiple data sources compatible with Alteryx. You should possess advanced development and troubleshooting skills, document training and support materials, understand SDLC methodologies, have strong communication skills, be proficient in SQL database query tools, and understand data warehouse architecture. In addition to the technical requirements, you will need experience working in an Agile environment, experience managing ETL/ELT data load processes, and knowledge of cloud infrastructure and of integration with data sources and relational databases. Being self-motivated, working independently, and collaborating as a team player are essential. Your analytical and problem-solving skills, your ability to handle multiple stakeholders and queries, and your capacity to prioritize tasks and meet prompt deadlines are crucial. A strong client service focus and willingness to respond promptly to queries and deliverables are expected. Preferred Skills: Data Analytics - Alteryx.

Posted 3 weeks ago

Apply

4.0 - 7.0 years

4 - 7 Lacs

Hyderabad, Telangana, India

On-site

Responsibilities:
- Backend Development: Develop and maintain robust backend services using Java, Spring Boot, and Hibernate/JPA.
- Frontend Development: Build dynamic and responsive user interfaces with Angular, TypeScript, HTML5, and CSS3.
- API & Microservices: Design and implement RESTful APIs and contribute to a microservices architecture.
- Database Interaction: Work with relational databases such as MySQL, PostgreSQL, or Oracle, ensuring efficient data management.
- Version Control: Utilize version control systems (Git) for collaborative code management.
- Agile Collaboration: Participate effectively in development cycles following Agile methodologies.

Required Skills:
- Strong proficiency in Java, Spring Boot, and Hibernate/JPA.
- Proficiency in Angular, TypeScript, HTML5, and CSS3.
- Experience with RESTful APIs and microservices architecture.
- Good understanding of relational databases like MySQL, PostgreSQL, or Oracle.
- Familiarity with version control systems (Git).
- Knowledge of Agile methodologies.

Posted 3 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

Karnataka

On-site

At EG, we are dedicated to developing software solutions that enable our customers to focus on their profession while we handle the intricacies of technology. Our industry-specific software is crafted by professionals who understand the sector intricately, supported by the stability, innovation, and security provided by EG. We are on a mission to drive industries forward by addressing significant challenges such as resource optimization, efficiency enhancement, and sustainability promotion. With a thriving global workforce exceeding 3000 employees, including a 700+ strong team located in Mangaluru, India, we foster a people-first culture that encourages innovation, collaboration, and continuous learning. We invite individuals to join us in the journey of creating software that serves people rather than making people work for it. EG Healthcare, a division of EG, is dedicated to building intelligent solutions that enhance healthcare services across the Nordics and Europe. Our goal is to simplify complexities, empower care providers, and improve patient outcomes through technological innovation. Our core values revolve around collaboration, curiosity, and purpose-driven progress. As a Senior Software Developer at EG Healthcare, you will leverage your passion for software engineering, backed by over 8 years of experience, to develop impactful solutions using cutting-edge technologies. Your role will involve designing and implementing robust, scalable software solutions utilizing Java and associated technologies. You will be responsible for creating and maintaining RESTful APIs for seamless system integration, collaborating with diverse teams to deliver high-impact features, ensuring code quality through best practices and testing, and contributing to architectural decisions and technical enhancements. Key Responsibilities: - Design and develop robust, scalable software solutions using Java and related technologies. - Build and maintain RESTful APIs for seamless integration across systems. - Collaborate with cross-functional teams to deliver high-impact features. - Ensure code quality through best practices, automated testing, and peer reviews. - Utilize Docker, Kubernetes, and CI/CD pipelines for modern DevOps workflows. - Troubleshoot issues efficiently and provide timely solutions. - Contribute to architectural decisions and technical improvements. Must-Have Skills: - Proficiency in Java (8+ years of professional experience). - Experience with Spring Boot for backend service development. - Strong understanding of REST API design and implementation. - Familiarity with Docker and Kubernetes. - Exposure to Event Sourcing / CQRS patterns. - Hands-on experience with front-end technology, preferably React.js. - Proficient in relational databases, Git, and testing practices. Good-to-Have Skills: - Knowledge of ElasticSearch for advanced search and indexing. - Experience with Axon Framework for distributed systems in Java. - Familiarity with tools like Grafana for observability. - Exposure to ArgoCD for GitOps-based deployments. - Full-stack mindset or experience collaborating with front-end teams. Who You Are: - Analytical and structured thinker. - Reliable, self-driven, and team-oriented. - Strong communication skills. - Eager to contribute to a meaningful mission in healthcare. What We Offer: - Competitive salary and benefits. - Opportunities for professional growth. - Collaborative, innovative work culture. 
Join us at EG Healthcare and become a part of a team that is dedicated to building smarter healthcare solutions for the future.
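The posting above lists Event Sourcing/CQRS exposure (there via Java and the Axon Framework); the pattern itself is language-neutral, so here is a miniature Python sketch purely as an illustration: current state is rebuilt by replaying an append-only event log. The event types are invented.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Admitted:
    patient_id: str

@dataclass(frozen=True)
class Discharged:
    patient_id: str

def replay(events):
    """Fold the event stream into current state: the set of admitted patients."""
    admitted = set()
    for e in events:
        if isinstance(e, Admitted):
            admitted.add(e.patient_id)
        elif isinstance(e, Discharged):
            admitted.discard(e.patient_id)
    return admitted

log = [Admitted("p1"), Admitted("p2"), Discharged("p1")]
print(replay(log))  # {'p2'}
```

Commands append new events; queries fold the log (or a snapshot) into whatever view they need, which is the read/write separation CQRS formalizes.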

Posted 3 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Data Services ETL Developer at Zeta Global, you will play a crucial role in managing and supporting the Delivery Operations Team through the implementation and maintenance of ETL and automation procedures. Your responsibilities will include processing data conversions on multiple platforms and performing tasks such as address standardization, merge/purge, database updates, client mailings, and postal presort. You will be expected to automate scripts for transferring and manipulating data feeds both internally and externally (see the SFTP sketch below). The ability to multitask and manage multiple jobs simultaneously to ensure timely client deliverability is essential. Collaborating with technical staff to maintain and support an ETL environment, and working closely with database/CRM teams, modelers, analysts, and application programmers to deliver results for clients, will also be part of your role. To excel in this position, you must have experience in database marketing and be proficient in transforming and manipulating data. Your expertise with Oracle and SQL will be crucial for automating scripts that process and manipulate marketing data. Familiarity with tools such as DMExpress, Talend, Snowflake, the SAP DQM suite, Excel, and SQL Server is required for data exports and imports and for running SQL Server Agent jobs and SSIS packages. Experience with editors like Notepad++, UltraEdit, or similar tools, as well as knowledge of SFTP and PGP for ensuring data security and client data protection, will be beneficial. Working with large-scale customer databases in a relational database environment and demonstrating the ability to handle multiple tasks concurrently are key skills for this role. Effective communication and teamwork skills are essential for collaborating with colleagues to ensure tasks are completed promptly. The ideal candidate should hold a Bachelor's degree or equivalent and have at least 5 years of experience in database marketing. Strong oral and written communication skills are required to fulfill the responsibilities of this role effectively. If you are looking to be part of a dynamic and innovative company that leverages data-powered marketing technology to drive business growth, Zeta Global offers a rewarding opportunity to contribute to the success of leading brands through personalized marketing experiences.
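For the automated feed transfers the role mentions, here is a minimal SFTP upload sketch assuming the paramiko package; the host, user, key path, and file paths are placeholders.

```python
import paramiko

with paramiko.SSHClient() as ssh:
    # For illustration only; pin known host keys in production.
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh.connect("sftp.example.com", username="feeds",
                key_filename="/home/feeds/.ssh/id_rsa")
    with ssh.open_sftp() as sftp:
        sftp.put("daily_feed.csv", "/inbound/daily_feed.csv")
```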

Posted 3 weeks ago

Apply

3.0 - 8.0 years

3 - 7 Lacs

Bengaluru, Karnataka, India

On-site

Responsibilities:
- GCP Solution Architecture & Implementation: Implement and architect data solutions on Google Cloud Platform (GCP), leveraging its various components.
- End-to-End Data Pipeline Development: Design and create end-to-end data pipelines using technologies like Apache Beam, Google Dataflow, or Apache Spark (see the pipeline sketch after this listing).
- Data Ingestion & Transformation: Implement data pipelines to automate the ingestion, transformation, and augmentation of data sources, providing best practices for pipeline operations.
- Data Technologies Proficiency: Work with Python, Hadoop, Spark, SQL, BigQuery, BigTable, Cloud Storage, Datastore, Spanner, Cloud SQL, and Machine Learning services.
- Database Expertise: Demonstrate expertise in at least two of these technologies: relational databases, analytical databases, or NoSQL databases.
- SQL Development & Data Mining: Possess expert knowledge in SQL development and experience in data mining (SQL, ETL, data warehouse, etc.) using complex datasets in a business environment.
- Data Integration & Preparation: Build data integration and preparation tools using cloud technologies (like Snaplogic, Google Dataflow, Cloud Dataprep, Python, etc.).
- Data Quality & Regulatory Compliance: Identify downstream implications of data loads/migration, considering aspects like data quality and regulatory compliance.
- Scalable Data Solutions: Develop scalable data solutions that simplify user access to massive data, capable of adapting to a rapidly changing business environment.
- Programming: Proficient in programming languages such as Java and Python.

Required Skills:
- GCP Data Engineering Expertise: Strong experience with GCP data engineering, including BigQuery, SQL, Cloud Composer/Python, Cloud Functions, Dataproc + PySpark, Python injection, Dataflow + Pub/Sub.
- Expert knowledge of Google Cloud Platform; other cloud platforms are a plus.
- Expert knowledge in SQL development.
- Expertise in building data integration and preparation tools using cloud technologies (like Snaplogic, Google Dataflow, Cloud Dataprep, Python, etc.).
- Proficiency with Apache Beam / Google Dataflow / Apache Spark in creating end-to-end data pipelines.
- Experience in some of the following: Python, Hadoop, Spark, SQL, BigQuery, BigTable, Cloud Storage, Datastore, Spanner, Cloud SQL, Machine Learning.
- Proficiency in programming in Java, Python, etc.
- Expertise in at least two of these technologies: relational databases, analytical databases, NoSQL databases.
- Strong analytical and problem-solving skills.
- Capability to work in a rapidly changing business environment.

Certifications (Major Advantage): Certified Google Professional Data Engineer / Solution Architect.
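A minimal Apache Beam pipeline in Python of the kind this listing describes; run locally it uses the DirectRunner, and the same code can target Dataflow on GCP via pipeline options. The element values are invented.

```python
import apache_beam as beam

with beam.Pipeline() as p:  # DirectRunner by default; Dataflow via options
    (
        p
        | "Create" >> beam.Create([("north", 120), ("south", 200), ("north", 80)])
        | "SumPerKey" >> beam.CombinePerKey(sum)
        | "Print" >> beam.Map(print)
    )
```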

Posted 3 weeks ago

Apply

1.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You are a motivated and detail-oriented BI Developer with 1 to 10 years of experience in Jaspersoft for report development and conversion. Your strong background in PL/SQL, data handling, and business intelligence reporting will be beneficial in this role. Your key responsibilities will include designing, developing, and maintaining reports and dashboards using Jaspersoft/JasperReports. You will also be involved in migrating and converting existing reports from legacy systems into Jaspersoft. Writing and optimizing complex PL/SQL queries for data retrieval and reporting purposes is a crucial part of your role. To excel in this position, you should have 1 to 3 years of experience in BI development, specifically with Jaspersoft/JasperReports. Proficiency in PL/SQL, the ability to write complex queries and stored procedures, and experience in report conversion and migration projects are essential. A solid understanding of relational databases such as Oracle and MySQL, good analytical and problem-solving skills, and the ability to work independently and as part of a team in a fast-paced environment are also required. Excellent communication skills will be an asset in this role. Immediate or short-notice joiners are preferred, and experience working in Agile environments will be a plus. About the company: DataTerrain is a leading, full-scale solutions and services company in the Data Analytics and Data Visualization domain. With over a decade of experience, DataTerrain has served more than 270 customers in the United States and has developed multiple data analytics automation tools in-house. The company values reliability, a strong work culture, and long-term relationships with its customers. DataTerrain encourages ingenuity and innovation in implementing real-world solutions and is an equal opportunity employer that welcomes new ideas and talent. The company trusts in the unlimited potential of its team to bring together customers' visions and engineers' passion.

Posted 3 weeks ago

Apply

5.0 - 8.0 years

5 - 8 Lacs

Chennai, Tamil Nadu, India

On-site

Responsibilities:
- GCP Solution Architecture & Implementation: Implement and architect data solutions on Google Cloud Platform (GCP), leveraging its various components.
- End-to-End Data Pipeline Development: Design and create end-to-end data pipelines using technologies like Apache Beam, Google Dataflow, or Apache Spark.
- Data Ingestion & Transformation: Implement data pipelines to automate the ingestion, transformation, and augmentation of data sources, providing best practices for pipeline operations.
- Data Technologies Proficiency: Work with Python, Hadoop, Spark, SQL, BigQuery, BigTable, Cloud Storage, Datastore, Spanner, Cloud SQL, and Machine Learning services.
- Database Expertise: Demonstrate expertise in at least two of these technologies: relational databases, analytical databases, or NoSQL databases.
- SQL Development & Data Mining: Possess expert knowledge in SQL development and experience in data mining (SQL, ETL, data warehouse, etc.) using complex datasets in a business environment.
- Data Integration & Preparation: Build data integration and preparation tools using cloud technologies (like Snaplogic, Google Dataflow, Cloud Dataprep, Python, etc.).
- Data Quality & Regulatory Compliance: Identify downstream implications of data loads/migration, considering aspects like data quality and regulatory compliance.
- Scalable Data Solutions: Develop scalable data solutions that simplify user access to massive data, capable of adapting to a rapidly changing business environment.
- Programming: Proficient in programming languages such as Java and Python.

Required Skills:
- GCP Data Engineering Expertise: Strong experience with GCP data engineering, including BigQuery, SQL, Cloud Composer/Python, Cloud Functions, Dataproc + PySpark, Python injection, Dataflow + Pub/Sub.
- Expert knowledge of Google Cloud Platform; other cloud platforms are a plus.
- Expert knowledge in SQL development.
- Expertise in building data integration and preparation tools using cloud technologies (like Snaplogic, Google Dataflow, Cloud Dataprep, Python, etc.).
- Proficiency with Apache Beam / Google Dataflow / Apache Spark in creating end-to-end data pipelines.
- Experience in some of the following: Python, Hadoop, Spark, SQL, BigQuery, BigTable, Cloud Storage, Datastore, Spanner, Cloud SQL, Machine Learning.
- Proficiency in programming in Java, Python, etc.
- Expertise in at least two of these technologies: relational databases, analytical databases, NoSQL databases.
- Strong analytical and problem-solving skills.
- Capability to work in a rapidly changing business environment.

Certifications (Major Advantage): Certified Google Professional Data Engineer / Solution Architect.

Posted 3 weeks ago

Apply

5.0 - 8.0 years

5 - 8 Lacs

Hyderabad, Telangana, India

On-site

Responsibilities:
- GCP Solution Architecture & Implementation: Implement and architect data solutions on Google Cloud Platform (GCP), leveraging its various components.
- End-to-End Data Pipeline Development: Design and create end-to-end data pipelines using technologies like Apache Beam, Google Dataflow, or Apache Spark.
- Data Ingestion & Transformation: Implement data pipelines to automate the ingestion, transformation, and augmentation of data sources, providing best practices for pipeline operations.
- Data Technologies Proficiency: Work with Python, Hadoop, Spark, SQL, BigQuery, BigTable, Cloud Storage, Datastore, Spanner, Cloud SQL, and Machine Learning services.
- Database Expertise: Demonstrate expertise in at least two of these technologies: relational databases, analytical databases, or NoSQL databases.
- SQL Development & Data Mining: Possess expert knowledge in SQL development and experience in data mining (SQL, ETL, data warehouse, etc.) using complex datasets in a business environment.
- Data Integration & Preparation: Build data integration and preparation tools using cloud technologies (like Snaplogic, Google Dataflow, Cloud Dataprep, Python, etc.).
- Data Quality & Regulatory Compliance: Identify downstream implications of data loads/migration, considering aspects like data quality and regulatory compliance.
- Scalable Data Solutions: Develop scalable data solutions that simplify user access to massive data, capable of adapting to a rapidly changing business environment.
- Programming: Proficient in programming languages such as Java and Python.

Required Skills:
- GCP Data Engineering Expertise: Strong experience with GCP data engineering, including BigQuery, SQL, Cloud Composer/Python, Cloud Functions, Dataproc + PySpark, Python injection, Dataflow + Pub/Sub.
- Expert knowledge of Google Cloud Platform; other cloud platforms are a plus.
- Expert knowledge in SQL development.
- Expertise in building data integration and preparation tools using cloud technologies (like Snaplogic, Google Dataflow, Cloud Dataprep, Python, etc.).
- Proficiency with Apache Beam / Google Dataflow / Apache Spark in creating end-to-end data pipelines.
- Experience in some of the following: Python, Hadoop, Spark, SQL, BigQuery, BigTable, Cloud Storage, Datastore, Spanner, Cloud SQL, Machine Learning.
- Proficiency in programming in Java, Python, etc.
- Expertise in at least two of these technologies: relational databases, analytical databases, NoSQL databases.
- Strong analytical and problem-solving skills.
- Capability to work in a rapidly changing business environment.

Certifications (Major Advantage): Certified Google Professional Data Engineer / Solution Architect.

Posted 3 weeks ago

Apply

3.0 - 7.0 years

8 - 14 Lacs

Bengaluru, Karnataka, India

On-site

Data Integration:
- Design, develop, and implement data integration solutions using Talend and other ETL tools.
- Ensure seamless integration of data from various sources, including databases, APIs, and flat files.

ETL Development:
- Create and maintain ETL workflows and processes to extract, transform, and load data into target systems.
- Optimize ETL processes for performance, scalability, and reliability.

Data Pipeline Management:
- Design and manage data pipelines to support data ingestion, processing, and storage.
- Monitor and troubleshoot pipeline performance and resolve issues promptly.

Java Development:
- Develop and maintain custom Java components and scripts to support data integration and transformation needs.
- Ensure code quality, reusability, and adherence to best practices.

Collaboration:
- Work closely with business analysts, data architects, and stakeholders to understand data integration requirements.
- Collaborate with team members to deliver high-quality solutions on time.

Documentation and Reporting:
- Create and maintain comprehensive documentation for data integration processes and workflows.
- Provide regular updates and reports on project progress and data integration activities.

Required Qualifications:
- Minimum 5 years of experience in data integration and ETL processes.
- Hands-on experience with Talend data integration tools.
- Proficiency in Java programming for data-related tasks.
- Strong understanding of ETL concepts and best practices.
- Experience in designing and managing data pipelines.
- Solid knowledge of relational databases, SQL, and data modeling.
- Familiarity with data governance and data quality principles.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration abilities.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

0 Lacs

Coimbatore, Tamil Nadu

On-site

You are an experienced Qlik Sense & Qlik Cloud Developer responsible for designing, developing, and implementing business intelligence solutions using Qlik Sense and Qlik Cloud. Your expertise in data visualization, dashboard development, and cloud-based analytics will be crucial in supporting data-driven decision-making. Your key responsibilities will include developing, maintaining, and enhancing Qlik Sense dashboards and Qlik Cloud applications to meet business analytics needs. You will design and implement data models, ETL processes, and data integration solutions from various sources, and you will optimize Qlik applications for performance, scalability, and efficiency. Collaboration with business stakeholders to gather requirements and deliver insightful analytics solutions is essential, as is ensuring data accuracy, integrity, and security across Qlik Sense and Qlik Cloud environments. You will troubleshoot and resolve issues related to data connectivity, scripting, and performance tuning; stay updated with the latest Qlik technologies, best practices, and industry trends; provide technical guidance and training to business users on Qlik Sense and Qlik Cloud functionality; and collaborate with IT and Data Engineering teams to ensure seamless integration with enterprise data systems. To qualify for this position, you should have 5 to 10 years of hands-on experience in Qlik Sense and Qlik Cloud development. Strong expertise in Qlik scripting, expressions, and set analysis is necessary, as is experience with data modeling, ETL processes, and data transformation. Knowledge of SQL, relational databases, and data warehousing concepts is essential. Experience integrating Qlik Sense/Qlik Cloud with different data sources (SAP, REST APIs, cloud storage, etc.) is preferred. A strong understanding of the Qlik Management Console (QMC) and security configurations is important, along with proficiency in performance optimization, data governance, and dashboard usability. Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud is a plus. You should be able to work independently and collaboratively in a fast-paced environment, and excellent communication and problem-solving skills are necessary for this role. This is a full-time position based in Coimbatore or performed remotely. Interested candidates can send their resumes to fazilahamed.r@forartech.in or contact +91-7305020181. We are excited to meet you and explore the potential of having you as a valuable member of our team. Benefits include commuter assistance, a flexible schedule, health insurance, leave encashment, provident fund, and the opportunity to work from home. The work schedule is a day shift from Monday to Friday, and a performance bonus is offered. If you are interested in applying for this position, please provide the following information: number of years of experience in Qlik Sense, current CTC, minimum expected CTC, notice period or availability to join, and present location. Work Location: Coimbatore / Remote (Work from Home)

Posted 3 weeks ago

Apply

10.0 - 14.0 years

0 Lacs

Haryana

On-site

As a highly skilled and experienced Senior .NET Developer, you will be joining our team to work closely with Red Hat and customer teams. Your role will be pivotal in designing, developing, and mentoring others in the adoption of modern Cloud Native Development practices. If you are passionate about pairing, fostering technical growth, and building robust microservices-based solutions with .NET and Podman, we are eager to hear from you.

Key Responsibilities
- Lead the design, development, and implementation of high-quality, scalable, and secure microservices using C# and the .NET (Core) ecosystem.
- Drive the adoption and implementation of Continuous Delivery (CD) pipelines, ensuring efficient and reliable software releases for microservices.
- Demonstrate highly skilled Test-Driven Development (TDD) practices by writing comprehensive unit, integration, and end-to-end tests to ensure code quality and maintainability within a microservices architecture.
- Design, develop, and deploy .NET microservices within containers, leveraging inner-loop practices.
- Utilize Podman/Docker Compose (or similar multi-container tooling compatible with Podman) for local development environments and multi-service application setups.
- Implement robust API testing strategies, including automated tests for RESTful APIs across microservices.
- Integrate and utilize observability tools and practices (e.g., logging, metrics, tracing) to monitor application health and performance and to troubleshoot issues effectively in a containerized microservices environment (see the tracing sketch after this listing).
- Collaborate closely with product owners, architects, and other developers to translate business requirements into technical solutions, specifically focusing on microservices design.
- Play a key mentoring role by actively participating in pairing sessions, providing technical guidance, and fostering the development of junior and mid-level engineers in microservices development.
- Contribute to code reviews with an eye for quality, maintainability, and knowledge transfer within a microservices context.
- Actively participate in architectural discussions and contribute to technical decision-making, particularly concerning microservices design patterns, containerization strategies with Podman, and overall system architecture.
- Stay up-to-date with emerging technologies and industry best practices in .NET, microservices, and containerization, advocating for their adoption where appropriate.
- Troubleshoot and debug complex issues across various environments, including Podman containers and distributed microservices.

Required Skills and Experience
- 10+ years of professional experience in software development with a strong focus on the Microsoft .NET (Core) ecosystem, ideally .NET 6+ or .NET 8+.
- Expertise in C# and building modern applications with .NET Core.
- Demonstrable experience in designing, developing, and deploying a microservices architecture.
- Experience with Continuous Delivery (CD) principles and tools such as Azure DevOps, GitLab CI/CD, or Jenkins.
- A proven track record of applying Test-Driven Development (TDD) methodologies.
- Practical experience with Podman, including building and running .NET applications in Podman containers, and an understanding of its daemonless/rootless architecture benefits.
- Proficiency in using Podman Compose (or similar approaches) for managing multi-container .NET applications locally.
- Extensive experience with API testing frameworks and strategies such as Postman, Newman, SpecFlow, Playwright, and xUnit/NUnit for integration tests.
- Deep understanding and practical experience with observability principles and tools like Application Insights, Prometheus, Grafana, OpenTelemetry, the ELK Stack, and Splunk.
- Solid understanding of RESTful API design and development.
- Experience with relational databases (e.g., SQL Server, PostgreSQL) and ORMs (e.g., Entity Framework Core).
- Excellent mentorship and communication skills, with a passion for knowledge sharing and team development.
- Excellent problem-solving, analytical, and communication skills.
- Ability to work independently and as part of a collaborative team.
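The posting above is .NET-centric, but the observability tooling it names (OpenTelemetry) is cross-language; here is a minimal tracing sketch with the OpenTelemetry Python SDK, exporting spans to the console, kept in Python only for consistency with this page's other sketches. The service and attribute names are invented.

```python
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor, ConsoleSpanExporter

# Wire up a tracer that prints finished spans to stdout.
provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(provider)

tracer = trace.get_tracer("orders-service")  # hypothetical service name
with tracer.start_as_current_span("place-order") as span:
    span.set_attribute("order.id", "o-123")  # hypothetical attribute
```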

Posted 3 weeks ago

Apply

1.0 - 5.0 years

0 Lacs

Telangana

On-site

As a Data Modeler at TRST01, a premier sustainable-tech company in Hyderabad, you will play a crucial role in designing, implementing, and optimizing data models to support carbon accounting, lifecycle analysis, and sustainability reporting. Your expertise in Carbon Footprint and Reduction Data (CFRD) will be instrumental in tracking carbon emissions, energy consumption, and sustainability metrics. Collaborating with cross-functional teams, including data scientists, software engineers, and sustainability experts, you will ensure the integrity, accuracy, and scalability of environmental data models. Your responsibilities will include developing robust data models, establishing reliable and scalable data pipelines for climate impact analysis, and optimizing the storage and retrieval of climate-related data through entity-relationship diagrams and schema designs (a schema sketch follows this listing). You will work closely with data engineers and scientists to integrate climate and sustainability data into existing platforms, implement data validation and quality control measures, and support the development of AI/ML models for predictive analysis of carbon reduction strategies. To excel in this role, you should hold a Bachelor's or Master's degree in Data Science, Computer Science, Environmental Science, Sustainability, or a related field, along with 1-3+ years of experience in data modeling, database design, and schema optimization. Expertise in CFRD, a strong understanding of relational and non-relational databases, hands-on experience with big data tools, and proficiency in data modeling tools are essential qualifications. Familiarity with carbon accounting standards, strong analytical skills, and excellent communication abilities are also required. Preferred qualifications include experience in cloud-based data solutions for sustainability analytics, exposure to machine learning models for climate risk assessment, and familiarity with GIS-based modeling for environmental impact analysis. Keeping abreast of the latest advancements in climate tech, data modeling, and carbon accounting methodologies will be crucial for success in this role.
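As an illustration of the kind of emissions schema design the role describes, here is a minimal relational sketch using SQLite; all table and column names are hypothetical, not TRST01's model.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE facility (
    facility_id INTEGER PRIMARY KEY,
    name        TEXT NOT NULL
);
CREATE TABLE emission (
    emission_id INTEGER PRIMARY KEY,
    facility_id INTEGER NOT NULL REFERENCES facility(facility_id),
    period      TEXT NOT NULL,           -- e.g. '2024-Q1'
    scope       INTEGER CHECK (scope IN (1, 2, 3)),
    tco2e       REAL NOT NULL            -- tonnes of CO2-equivalent
);
""")
```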

Posted 3 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
