5.0 - 10.0 years
12 - 16 Lacs
Bengaluru
Work from Office
Develop, test, and support future-ready data solutions for customers across industry verticals. Develop, test, and support end-to-end batch and near real-time data flows/pipelines. Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance, information management, and associated technologies. Communicate risks and ensure they are understood.

Requirements:
- Graduate with a minimum of 5+ years of related experience.
- Experience in modelling and business system design.
- Good hands-on experience with DataStage and cloud-based ETL services.
- Strong expertise in writing T-SQL code.
- Well-versed in data warehouse schemas and OLAP techniques.

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise:
- Ability to manage and make decisions about competing priorities and resources, and to delegate where appropriate.
- Strong team player/leader, able to lead data transformation projects with multiple junior data engineers.
- Strong oral, written, and interpersonal skills for interacting across all levels of the organization; able to communicate complex business problems and technical solutions.
Posted 2 months ago
4.0 - 8.0 years
27 - 42 Lacs
Hyderabad
Work from Office
Job Summary: We are looking for an experienced Infra Dev Specialist with 4 to 8 years of experience to join our team. The ideal candidate will have expertise in KSQL, Kafka Schema Registry, Kafka Connect, and Kafka. This role follows a hybrid model with day shifts and does not require travel. The candidate will play a crucial role in developing and maintaining our infrastructure to ensure seamless data flow and integration.

Responsibilities:
- Develop and maintain infrastructure solutions using KSQL, Kafka Schema Registry, Kafka Connect, and Kafka.
- Oversee the implementation of data streaming and integration solutions to ensure high availability and performance.
- Provide technical support and troubleshooting for Kafka-related issues to minimize downtime and ensure data integrity.
- Collaborate with cross-functional teams to design and implement scalable, reliable data pipelines.
- Monitor and optimize the performance of Kafka clusters to meet the demands of the business.
- Ensure compliance with security and data governance policies while managing Kafka infrastructure.
- Implement best practices for data streaming and integration to enhance system efficiency.
- Conduct regular reviews and updates of the infrastructure to align with evolving business needs.
- Provide training and support to team members on Kafka-related technologies and best practices.
- Develop and maintain documentation for infrastructure processes and configurations.
- Participate in code reviews and contribute to the continuous improvement of the development process.
- Stay updated on the latest trends and advancements in Kafka and related technologies.

Qualifications:
- Strong experience in KSQL, Kafka Schema Registry, Kafka Connect, and Kafka, with a solid understanding of data streaming and integration concepts.
- Proven track record of troubleshooting and resolving Kafka-related issues.
- Expertise in designing and implementing scalable data pipelines.
- Knowledge of security and data governance practices in managing Kafka infrastructure.
- Proficiency in monitoring and optimizing Kafka cluster performance.
- Experience providing technical support and training to team members; skilled in developing and maintaining infrastructure documentation.
- Excellent communication and collaboration skills, with a proactive approach to problem-solving and continuous improvement.
- Ability to work effectively in a hybrid work model and a commitment to delivering high-quality infrastructure solutions.

Certification required: Certified Apache Kafka Developer.
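To make the KSQL requirement concrete, here is a minimal sketch of defining streams on a ksqlDB server through its REST API; the server URL, topic name, and stream definitions are illustrative assumptions, not part of the posting.

```python
import requests

KSQLDB_URL = "http://localhost:8088/ksql"  # assumed local ksqlDB server

# Register a stream over an existing Kafka topic, then derive a filtered
# stream from it -- the kind of KSQL artifact this role would maintain.
statements = """
    CREATE STREAM orders_raw (order_id VARCHAR, amount DOUBLE)
        WITH (KAFKA_TOPIC='orders', VALUE_FORMAT='JSON');
    CREATE STREAM large_orders AS
        SELECT order_id, amount FROM orders_raw WHERE amount > 1000;
"""

resp = requests.post(
    KSQLDB_URL,
    headers={"Content-Type": "application/vnd.ksql.v1+json; charset=utf-8"},
    json={"ksql": statements, "streamsProperties": {}},
)
resp.raise_for_status()
print(resp.json())
```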
Posted 2 months ago
6.0 - 11.0 years
20 - 25 Lacs
Bengaluru
Work from Office
At Branch, we're transforming how brands and users interact across digital platforms. Our mobile marketing and deep linking solutions are trusted to deliver seamless experiences that increase ROI, decrease wasted spend, and eliminate siloed attribution. Our Branch team consists of smart, humble, and collaborative people who value ownership over all. Everything we do is centered around creating a great product, team, and company that lives and breathes our motto: Build Together, Grow Together, Win Together.

As a Senior Data Engineer, you will design, build, and manage components of our highly available real-time and batch data pipelines handling petabytes of data. Our pipeline platform is designed to ingest and process billions of events per day and make the resulting aggregations and insights available within minutes in our analytical data stores. Data analytics is at the core of our business, and we're constantly innovating to make our systems more performant, timely, cost-effective, and capable while maintaining high reliability. You will build on top of our core data infrastructure and pipelines using technologies and tools tailored for massive data sets, including Flink, Spark, Kafka, Iceberg, and Druid, while working in the AWS cloud environment. If you are interested in building systems that can consume and explore billions of data points a day, work with petabytes of data, and want to push what is possible with data, this is the place for you!

As a Senior Data Engineer, you'll get to:
- Architect, build, and own real-time and batch data aggregation systems to deliver quality analytical reports for our internal and external customers.
- Collaborate with Data Scientists, Backend Engineers, Data Infrastructure Operations, and Product Managers to deliver new features and capabilities for customers.
- Develop clean, safe, testable, and cost-efficient solutions.
- Make well-informed decisions with deep knowledge of both the internal and external impacts on teams and projects.
- Foresee shortcomings ahead of time and drive them to resolution.

You'll be a good fit if you have:
- A Bachelor's in CS or equivalent.
- 6+ years of software engineering or data engineering experience with a recent focus on big data.
- Strong development skills in Java or Scala, and Python.
- A solid background in the fundamentals of computer science, distributed systems, and large-scale data processing, as well as database schema design and data warehousing.
- Practical experience managing AWS or Google Cloud environments. Experience in containerized deployment or Kubernetes is a big plus!
- A good understanding of a broad spectrum of NoSQL, traditional RDBMS, and analytical/columnar data stores, including Postgres, Druid, Vertica, Redshift, Hadoop, Hive, Cassandra, Aerospike, and Redis.
- The ability to build systems that balance scalability, availability, and latency.
- A strong ability to advocate for continual deployment, automation tooling, monitoring, and self-healing systems that improve the lives of our engineers.
- Great communication skills; you are a team player with a proven track record of building strong relationships with management, co-workers, and customers.
- A desire to learn and grow, push yourself and your team, share lessons with others, provide constructive and continuous feedback, and be receptive to feedback from others.

This role will be based at our Bengaluru, KA office and follows a Hybrid schedule aligned with our Return to Office guidelines.
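As an illustration of the Spark-plus-Kafka pipeline work this role describes, below is a minimal PySpark Structured Streaming sketch that reads events from Kafka and maintains minute-level aggregations; the broker address, topic, and event schema are assumed for the example.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json, window
from pyspark.sql.types import StringType, StructType, TimestampType

# Illustrative event schema; a real pipeline would source this from a registry.
schema = (StructType()
          .add("event_id", StringType())
          .add("event_type", StringType())
          .add("ts", TimestampType()))

spark = SparkSession.builder.appName("event-aggregation").getOrCreate()

events = (spark.readStream.format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")  # assumed broker
          .option("subscribe", "events")                      # assumed topic
          .load()
          .select(from_json(col("value").cast("string"), schema).alias("e"))
          .select("e.*"))

# Minute-level counts per event type: the shape of a near-real-time aggregation.
counts = (events
          .withWatermark("ts", "5 minutes")
          .groupBy(window(col("ts"), "1 minute"), col("event_type"))
          .count())

query = counts.writeStream.outputMode("update").format("console").start()
query.awaitTermination()
```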
The salary range provided represents base compensation and does not include potential equity, which is available for qualifying positions. At Branch, we are committed to the well-being of our team by offering a comprehensive benefits package. From health and wellness programs to paid time off and retirement planning options, we provide a range of benefits for qualified employees. For detailed information on the benefits specific to your position, please consult with your recruiter.

Branch is an equal opportunity employer. All applicants will be considered for employment without attention to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran or disability status. If you think you'd be a good fit for this role, we'd love for you to apply! At Branch, we strive to create an inclusive culture that encourages people from all walks of life to bring their unique, diverse perspectives to work. We aim every day to build an environment that empowers us all to do the best work of our careers, and we can't wait to show you what we have to offer!

A little bit about us: Branch is the leading provider of engagement and performance mobile SaaS solutions for growth-focused teams, trusted to maximize the value of their evolving digital strategies. The Branch platform provides a seamless experience across paid and organic, on all channels and platforms, online and offline, to eliminate friction and drive valuable action at the moments of highest intent. With Branch, businesses gain accurate mobile measurement and insights into user interactions, enabling them to drive conversions, engagement, and more intelligent marketing spend. Branch is an award-winning employer headquartered in Mountain View, CA. World-class brands like Instacart, Western Union, NBCUniversal, Zocdoc and Sephora acquire users, retain customers and drive more conversions with Branch.

Candidate Privacy Information: For more information on the data that Branch will collect through your application, and how we use, share, delete, and retain that information as part of our recruitment and employment efforts, please see our HR Privacy Policy.
Posted 2 months ago
3.0 - 7.0 years
14 - 18 Lacs
Hyderabad
Work from Office
Responsibilities:
- Design, develop, and maintain scalable and robust data solutions in the cloud using Apache Spark and Databricks.
- Gather and analyse data requirements from business stakeholders and identify opportunities for data-driven insights.
- Build and optimize data pipelines for data ingestion, processing, and integration using Spark and Databricks.
- Ensure data quality, integrity, and security throughout all stages of the data lifecycle.
- Collaborate with cross-functional teams to design and implement data models, schemas, and storage solutions.
- Optimize data processing and analytics performance by tuning Spark jobs and leveraging Databricks features.
- Provide technical guidance and expertise to junior data engineers and developers.
- Stay up to date with emerging trends and technologies in cloud computing, big data, and data engineering.
- Contribute to the continuous improvement of data engineering processes, tools, and best practices.

Qualifications:
- Bachelor's or Master's degree in computer science, engineering, or a related field.
- 10+ years of experience as a Data Engineer, Software Engineer, or similar role, with a focus on building cloud-based data solutions. Strong
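For context on the Spark pipeline work described above, here is a small PySpark sketch of a batch ingestion job with date partitioning, one of the tuning levers the posting mentions; the bucket paths and column names are placeholders.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-ingest").getOrCreate()

# Ingest: raw files to a cleaned, partitioned table (paths are placeholders).
raw = spark.read.json("s3://raw-bucket/orders/2024-01-01/")

cleaned = (raw
           .dropDuplicates(["order_id"])
           .withColumn("order_date", F.to_date("created_at"))
           .filter(F.col("amount") > 0))

# Partitioning by date keeps downstream scans narrow: a common Spark tuning lever.
(cleaned.repartition("order_date")
        .write.mode("overwrite")
        .partitionBy("order_date")
        .parquet("s3://curated-bucket/orders/"))
```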
Posted 2 months ago
8.0 - 12.0 years
37 - 45 Lacs
Bengaluru
Work from Office
Design and develop cloud-native enterprise software products and services, focusing on distributed, scalable, fault-tolerant, and multi-tenant cloud services for healthcare applications using FHIR and LLM-based models for natural language processing (NLP).

Responsibilities:
- Build FHIR-compliant integrations with Oracle Health's healthcare data systems, ensuring adherence to FHIR standards in data exchange, patient record management, and clinical workflows, while integrating LLMs to provide contextual insights and automated processing of clinical data.
- Develop healthcare cloud applications following microservices and twelve-factor application principles, optimizing for data integration with FHIR-based services and LLMs for advanced text-based analysis and healthcare recommendations.
- Focus on API-first design using OpenAPI, Swagger, and test-driven development, ensuring integration with FHIR-based healthcare data through RESTful APIs and leveraging LLMs for real-time patient data analytics.
- Code in Java, leveraging RESTful APIs, microservices, Docker, and Kubernetes, with integration into healthcare data systems at Oracle Health that use FHIR, and incorporating LLMs for scalable NLP tasks such as summarizing patient information or automating documentation.
- Work with prominent healthcare APIs and other systems like AWS, Microsoft Azure, GCP, and Salesforce to facilitate healthcare data exchange, with programming languages like Java, Go, Python, JavaScript, and NodeJS, integrating AI-driven solutions and LLMs for intelligent data extraction.
- Implement message interchange formats such as JSON/JSON Schema, XML, Avro, and FHIR resources, while integrating LLM-based AI models to enhance decision-making, predictive analytics, and patient engagement.
- Collaborate with JavaScript frameworks like OJET, ReactJS, and AngularJS to create dynamic, healthcare-driven user interfaces, integrating with FHIR-based patient data systems and enhancing user experience with LLM-driven insights and contextual support.
- Provide technical evaluations, optimize applications, and implement best practices in developing cloud-native healthcare applications with FHIR and LLM capabilities for Oracle Health.
- Create reusable solutions that accelerate the creation of healthcare applications using FHIR and LLMs, streamlining integration with EHR and clinical systems and delivering smarter healthcare insights.

Key areas:
- Design and Development: Create and implement scalable, multi-tenant cloud services on Oracle Cloud Infrastructure (OCI) tailored for Oracle Health.
- Integration: Develop and integrate healthcare systems using FHIR standards to ensure smooth data exchange in clinical workflows.
- Cloud-Native Applications: Build cloud-native applications and ensure compliance with FHIR standards for healthcare integrations.
- Decision-Making and Analytics: Leverage large language models (LLMs) and AI-driven solutions to enhance decision-making processes, predictive analytics, and automation in healthcare.
- API Development: Design and develop RESTful APIs to facilitate seamless integration and interaction with various healthcare data formats.
- Microservices Architecture: Implement and maintain a robust microservices architecture to support scalable and efficient cloud services.
- Collaborative Efforts: Work closely with cross-functional teams to create reusable, FHIR-based solutions focusing on healthcare interoperability and AI integration.
- Operational Excellence: Monitor, operate, and maintain cloud services to ensure optimal performance, reliability, and scalability.
- Compliance and Standards: Ensure all cloud services and integrations comply with industry standards and regulations, particularly in healthcare.
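Since the role centers on FHIR-compliant data exchange, the following sketch shows the basic FHIR REST read and search interactions in Python; the base URL and patient ID are placeholders, and a real Oracle Health integration would add authentication.

```python
import requests

FHIR_BASE = "https://fhir.example.com/r4"  # placeholder FHIR R4 server base URL

# Read one Patient resource, then search its recent Observations --
# the basic read/search interactions any FHIR integration builds on.
patient = requests.get(
    f"{FHIR_BASE}/Patient/123",
    headers={"Accept": "application/fhir+json"},
).json()
print(patient.get("name"))

obs_bundle = requests.get(
    f"{FHIR_BASE}/Observation",
    params={"patient": "123", "_sort": "-date", "_count": 5},
    headers={"Accept": "application/fhir+json"},
).json()
for entry in obs_bundle.get("entry", []):
    print(entry["resource"]["resourceType"], entry["resource"].get("id"))
```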
Posted 2 months ago
3.0 - 8.0 years
10 - 14 Lacs
Bengaluru
Work from Office
We are seeking skilled data and analytics developers to support and extend our existing analytics capabilities. The current work is maintained by a team of five in-house developers, and we are looking to augment or outsource this function to ensure scalability, consistency, and efficiency. The role involves data pipeline development, dashboard/report creation, data modeling, and integration with business systems.

Key Responsibilities:
- Design, develop, and maintain data pipelines, ETL processes, and data transformation workflows.
- Build and enhance interactive dashboards and reports using tools like Power BI, Tableau, or equivalent.
- Work closely with business teams to gather reporting requirements and translate them into actionable analytics solutions.
- Ensure data quality, governance, and integrity across multiple systems.
- Collaborate with in-house developers and analysts to align on data architecture and standards.
- Optimize data queries and storage solutions for performance and scalability.
- Provide technical documentation, version control, and regular updates on project progress.

Required Skills & Experience:
- 3+ years of experience in data engineering, analytics development, or BI.
- Proficiency in SQL and at least one ETL tool (e.g., Azure Data Factory, Talend, SSIS).
- Hands-on experience with BI platforms such as Power BI, Tableau, or Qlik.
- Familiarity with data warehousing concepts (e.g., star/snowflake schema).
- Experience with cloud data platforms such as Azure, AWS, or GCP.
- Strong understanding of data governance, security, and privacy practices.
- Version control using Git or other source control tools.

Preferred Qualifications:
- Experience with Azure Synapse, Databricks, or Snowflake.
- Familiarity with CI/CD pipelines for data deployment.
- Exposure to Python or R for data manipulation or statistical analysis.
- Knowledge of business domains such as finance, operations, or sales analytics.
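As a small illustration of the dimensional-modeling skills listed above, here is a toy Python sketch that splits raw rows into a product dimension and a fact table in star-schema style; the data and column names are invented for the example.

```python
import pandas as pd

# Toy extract: raw sales rows as they might land from a source system.
raw = pd.DataFrame({
    "sold_at": ["2024-01-01", "2024-01-01", "2024-01-02"],
    "product": ["A", "B", "A"],
    "amount": [120.0, 80.0, 95.0],
})

# Dimension: one row per product, with a surrogate key.
dim_product = raw[["product"]].drop_duplicates().reset_index(drop=True)
dim_product["product_key"] = dim_product.index + 1

# Fact: measures plus foreign keys into the dimensions -- the star-schema core.
fact_sales = (raw.merge(dim_product, on="product")
                  [["sold_at", "product_key", "amount"]])
print(fact_sales)
```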
Posted 2 months ago
4.0 - 9.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Experience: 4+ years
Contract Duration: 6 months
Location: Hyderabad
Notice Period: Immediate only

Requirements:
- Strong expertise in Java (Java 13+) and the Spring Boot framework.
- Extensive experience with AWS cloud services and deploying applications in a cloud environment.
- Proven hands-on experience with Apache Kafka (producer/consumer, topics, partitions).
- Deep knowledge of PostgreSQL, including schema design, indexing, and query optimization.
- Solid experience writing JUnit test cases and developing unit/integration test suites.
- Familiarity with code coverage tools (e.g., JaCoCo, SonarQube) and implementing best practices for test coverage.
- Excellent verbal and written communication skills, with the ability to clearly explain complex technical concepts to non-technical stakeholders.
- Demonstrated leadership skills with experience managing, mentoring, and motivating technical teams.
- Proven experience in stakeholder management, including gathering requirements, setting expectations, and delivering technical solutions aligned with business goals.
- Familiarity with microservices architecture and RESTful API design.
- Experience with containerization (Docker) and orchestration platforms like Kubernetes (EKS).
- Strong understanding of CI/CD pipelines and DevOps practices.
- Solid problem-solving skills with the ability to handle complex technical challenges.
- Familiarity with monitoring tools (Prometheus, Grafana) and log management.
- Experience with version control systems (Git) and Agile/Scrum methodologies.
- BPM knowledge: modeler, Groovy scripts, event listeners, and interceptors.
- Camel framework, Kafka APIs, API security with JWT, AWS Glue, React JS.

Key Skills: Java (Java 13+) and Spring Boot, AWS, Kafka, React JS, PostgreSQL, BPM (modeler, Groovy scripts, event listeners, and interceptors).
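Although this role calls for Java, a compact way to picture the Kafka producer/consumer work it lists is the following Python sketch using the confluent-kafka client; the broker address, topic, and group id are assumptions.

```python
from confluent_kafka import Consumer, Producer

BROKER = "localhost:9092"  # assumed broker address

producer = Producer({"bootstrap.servers": BROKER})
producer.produce("orders", key="order-1", value='{"amount": 42}')
producer.flush()  # block until delivery

consumer = Consumer({
    "bootstrap.servers": BROKER,
    "group.id": "order-processors",   # consumers in one group share partitions
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["orders"])
msg = consumer.poll(timeout=10.0)
if msg is not None and msg.error() is None:
    print(msg.topic(), msg.partition(), msg.value())
consumer.close()
```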
Posted 2 months ago
10.0 - 15.0 years
7 - 11 Lacs
Hyderabad
Work from Office
Experience: 10+ years
Contract Duration: 6 months
Location: Hyderabad
Notice Period: Immediate only

Key Skills: Java (Java 17+) and Spring Boot, AWS, PostgreSQL

Requirements:
- Strong expertise in Java (Java 17+) and the Spring Boot framework.
- Extensive experience with AWS cloud services and deploying applications in a cloud environment.
- Proven hands-on experience with Apache Kafka (producer/consumer, topics, partitions).
- Deep knowledge of PostgreSQL, including schema design, indexing, and query optimization.
- Solid experience writing JUnit test cases and developing unit/integration test suites.
- Familiarity with code coverage tools (e.g., JaCoCo, SonarQube) and implementing best practices for test coverage.
- Excellent verbal and written communication skills, with the ability to clearly explain complex technical concepts to non-technical stakeholders.
- Demonstrated leadership skills with experience managing, mentoring, and motivating technical teams.
- Proven experience in stakeholder management, including gathering requirements, setting expectations, and delivering technical solutions aligned with business goals.
- Familiarity with microservices architecture and RESTful API design.
- Experience with containerization (Docker) and orchestration platforms like Kubernetes (EKS).
- Strong understanding of CI/CD pipelines and DevOps practices.
- Solid problem-solving skills with the ability to handle complex technical challenges.
- Familiarity with monitoring tools (Prometheus, Grafana) and log management.
- Experience with version control systems (Git) and Agile/Scrum methodologies.
Posted 2 months ago
2.0 - 5.0 years
8 - 12 Lacs
Bengaluru
Work from Office
Job Title: SEO-AI Executive
Experience Required: 2 to 5 years
Location: Bengaluru

About Us: Boss Wallah is a platform empowering small business owners and aspiring entrepreneurs with the skills, support, and expert guidance needed to start and grow their businesses. We offer courses from successful entrepreneurs across 100+ business areas, access to 2,000+ experts, and content in six languages (Telugu, Tamil, Kannada, Hindi, Malayalam, and English).

Role Summary: Boss Wallah Technologies seeks an AI Specialist with SEO expertise to develop advanced search strategies. The role involves using AI tools to optimize content, improve visibility, enhance engagement, and streamline SEO workflows while staying current with emerging AI search trends.

Key Responsibilities:
- Optimize content for traditional SEO and emerging AI search engines, voice search, and featured snippets.
- Use AI tools (e.g., ChatGPT, Jasper, Surfer SEO) to generate SEO-friendly content, meta descriptions, FAQs, and schema markup.
- Collaborate with cross-functional teams to align SEO efforts with evolving AI search technologies and digital marketing goals.
- Stay updated on the latest SEO best practices, Google algorithm changes, and advancements in AI-powered search technologies.
- Write and optimize content tailored for AI consumption, including conversational search and AI knowledge graphs.
- Apply structured data and schema markup to improve content discoverability and search clarity (see the sketch after this listing).

Skills & Experience:
- 2 to 5 years of SEO experience with a solid understanding of on-page, off-page, and technical SEO fundamentals.
- Hands-on experience using AI SEO tools like ChatGPT, Jasper, Surfer SEO, Frase, Scalenut, or MarketMuse for content creation and optimization.
- Experience with keyword research, competitor analysis, and SEO audit tools.
- Familiarity with Google Search Console, Google Analytics, and SEO plugins (e.g., RankMath, Yoast).
- Basic understanding of content management systems (WordPress, Wix).
- Understanding of SEO for voice search, featured snippets, and AI-based search platforms.
- Willingness to learn and adapt to new AI tools and SEO trends.

Preferred Skills:
- Experience creating AI-friendly content.
- Knowledge of schema/structured data.
- Basic scripting for SEO automation.
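As a sketch of the structured-data responsibility above, this Python snippet assembles schema.org FAQPage JSON-LD from question/answer pairs; the questions shown are invented examples.

```python
import json

# Build FAQPage structured data (schema.org) from plain Q&A pairs;
# the resulting <script> tag is what a CMS template would embed.
faqs = [
    ("How do I register a business?", "Follow the steps in our starter course."),
    ("Which languages are courses in?", "Six languages, including Telugu and Tamil."),
]

json_ld = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": q,
            "acceptedAnswer": {"@type": "Answer", "text": a},
        }
        for q, a in faqs
    ],
}

print(f'<script type="application/ld+json">{json.dumps(json_ld)}</script>')
```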
Posted 2 months ago
6.0 - 7.0 years
8 - 9 Lacs
Bengaluru
Work from Office
ECMS Req #: 526872
Relevant and total years of experience: 6-7+ years relevant experience (Informatica plus strong SQL: 7-8 years; Tibco, Control-M, and data modeling can be 4-5 years each).

Detailed job description - skill set:
- 5+ years of experience in ETL (Informatica) and TIBCO.
- Experience in PL/SQL and writing SQL statements.
- Strong analytical, problem-solving, verbal, listening, and interpersonal skills.
- Excellent written and verbal communication skills.
- Ability to work independently and with a team in a diverse, fast-paced, and collaborative environment.
- Proven success working in on/offshore development teams; good knowledge of industry tools.
- Knowledge of data warehousing concepts, Kimball methodology, and dimensional modeling concepts like star/snowflake schema.
- Knowledge of any BI reporting tool is an added advantage.

Mandatory skills: Informatica, SQL, TIBCO
Vendor billing range (local currency, per day): INR 10,500/day
Work location: Bangalore, Hyderabad (preferred)
Notice period: 15 days, not more than that
WFO/WFH/Hybrid: WFO/Hybrid
BG check (before or after onboarding): Post onboarding
Posted 2 months ago
6.0 - 8.0 years
8 - 10 Lacs
Bengaluru
Work from Office
Responsibilities:
- Design, develop, and maintain scalable data pipelines and Snowflake data warehouse models.
- Implement data ingestion processes using Snowflake and ETL/ELT tools (e.g., dbt, Informatica, Talend).
- Optimize Snowflake SQL queries and manage performance tuning and data modeling.
- Develop and maintain stored procedures, UDFs, and other Snowflake scripting components.
- Work with cross-functional teams to understand business requirements and translate them into technical solutions.
- Collaborate with BI developers to provide clean, transformed, and well-modeled data for analytics and reporting.
- Maintain data governance, security, and compliance within the Snowflake environment.
- Monitor data pipelines for reliability and troubleshoot issues as needed.
- Support integration of Snowflake with various data sources and analytics tools like Tableau, Power BI, and Looker.

Required Skills and Qualifications:
- 6 to 8 years of experience in data engineering, data warehousing, or data analytics roles.
- Minimum 3+ years of hands-on experience with Snowflake (data modeling, performance tuning, schema design, etc.).
- Strong proficiency in SQL, with expertise in writing complex queries and stored procedures.
- Solid experience with ETL/ELT tools and frameworks.
- Familiarity with cloud platforms (AWS, Azure, or GCP) and data lake architectures.
- Experience integrating Snowflake with third-party BI tools and APIs.
- Strong understanding of data warehousing concepts, data lakes, and dimensional modeling.
- Working knowledge of version control (Git), CI/CD practices, and Agile methodologies.
- Excellent problem-solving skills and ability to work in a collaborative environment.

Preferred Qualifications:
- Snowflake certification (SnowPro Core or Advanced).
- Experience with dbt (Data Build Tool) or similar modern data transformation frameworks.
- Exposure to Python or other scripting languages for data manipulation.
- Knowledge of data privacy and compliance standards (e.g., GDPR, HIPAA).
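To ground the Snowflake scripting items above, here is a hedged Python sketch using the Snowflake connector to run a staged bulk load and a transform; the account details, stage, and table names are placeholders.

```python
import snowflake.connector

# Connection parameters are placeholders; real deployments would pull
# them from a secrets manager rather than hard-coding credentials.
conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="...",
    warehouse="ETL_WH", database="ANALYTICS", schema="STAGING",
)

cur = conn.cursor()
# Typical ELT step: bulk-load staged files, then transform into a model table.
cur.execute("COPY INTO staging.orders FROM @orders_stage FILE_FORMAT = (TYPE = CSV)")
cur.execute("""
    CREATE OR REPLACE TABLE marts.daily_orders AS
    SELECT order_date, COUNT(*) AS orders, SUM(amount) AS revenue
    FROM staging.orders
    GROUP BY order_date
""")
cur.close()
conn.close()
```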
Posted 2 months ago
6.0 - 10.0 years
8 - 12 Lacs
Bengaluru
Work from Office
Tekion's Automotive Partner Cloud (APC) powers all partner-facing integrations through a robust API platform. We are looking for a Lead Product Manager to drive the evolution of this platform and own its roadmap, adoption, and ecosystem impact. If you're passionate about API-first ecosystems, integration scalability, and building great developer and partner experiences, we'd love to talk to you.

Key Responsibilities. As the Lead PM for the APC platform, you will:
- Own the vision, strategy, and roadmap for APC's integration platform, including standard products like APIs, webhooks, data feeds, and more.
- Define and track key product success metrics (adoption, performance, time-to-integrate, etc.) to evolve the platform.
- Collaborate with Tekion's architecture panel to define and evolve best practices, governance guidelines, and platform capabilities for APIs and integration products.
- Work closely with all internal application teams to support consistent, scalable, and secure API development across the organization.
- Monitor adoption, reliability, and usage trends for all integration products, proactively identifying and addressing friction areas or high-volume issues.
- Champion a platform mindset, delivering a consistent experience across partners, internal product teams, and support teams.
- Ensure transparency and visibility into the status of APIs, webhooks, and other integration features for internal client-facing teams and partners.
- Continuously improve internal processes, including the product development lifecycle, documentation, and certification workflows.
- Work with Sales, Solution Engineering, and Customer Success to help design integration solutions for new clients using our standard integration products.
- Partner with Marketing and Enablement teams to market our APIs and webhooks, and ensure Sales and Support teams are well trained.
- Contribute to developer experience and partner onboarding by driving enhancements across developer documentation, API usability, and the partner integration lifecycle.

Skills and Experience:
- 6-10 years of total product management experience, with at least 3-4 years focused on API platforms, developer tooling, or SaaS platforms.
- Strong domain experience with OpenAPI, webhooks, event-driven architectures, and multi-tenant B2B SaaS platforms.
- Solid technical acumen: the ability to collaborate closely with architects and engineers, with a deep understanding of REST APIs, versioning, rate limits, and schema governance.
- Proven experience working cross-functionally across product, engineering, architecture, customer success, sales, marketing, and partner teams.
- Skilled at identifying and resolving integration adoption challenges, driving API usability improvements, and simplifying onboarding at scale.
- Exceptional communication skills, with experience leading internal Confluence pages, product documentation, and developer-facing content.
- Based in Bangalore, with flexibility to collaborate and coordinate with global teams across the US, Canada, and the UK.

Preferred Skills:
- Partner with Security and Legal teams to ensure API policies are compliant with regional regulations (e.g., SOC 2, GDPR).
- Identify opportunities to productize commonly repeated integration patterns or workflows into reusable templates or SDKs.
- Influence platform evangelism and developer outreach through demos, tech talks, and external documentation standards (like OpenAPI, AsyncAPI).
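As a concrete picture of the webhook products this platform role owns, below is a minimal Python (Flask) sketch of a webhook receiver that verifies an HMAC signature before accepting a delivery; the endpoint path, header name, and shared secret are illustrative assumptions rather than Tekion's actual contract.

```python
import hashlib
import hmac

from flask import Flask, abort, request

app = Flask(__name__)
SHARED_SECRET = b"replace-me"  # placeholder; each partner would get its own secret

@app.post("/webhooks/events")
def receive_event():
    # Verify the payload signature before trusting it -- standard practice
    # for webhook products so partners can authenticate deliveries.
    sent_sig = request.headers.get("X-Signature", "")
    expected = hmac.new(SHARED_SECRET, request.get_data(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sent_sig, expected):
        abort(401)
    event = request.get_json()
    print("received", event.get("type"))
    return {"status": "accepted"}, 202
```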
Perks and Benefits:
- Competitive compensation and generous stock options.
- Medical insurance coverage.
- Work with some of the brightest minds from Silicon Valley's most dominant and successful companies.
Posted 2 months ago
15.0 - 20.0 years
5 - 9 Lacs
Gurugram
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: SAP HCM Payroll
Good to have skills: NA
Minimum 12 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with teams to develop solutions that align with business needs and requirements.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Lead the design and development of SAP HCM Payroll applications.
- Provide technical expertise and guidance to the team.
- Ensure the applications meet business process requirements.

Professional & Technical Skills:
- Must-have skills: proficiency in SAP HCM Payroll.
- Strong understanding of SAP HCM Payroll processes.
- Experience in configuring SAP Payroll schemas and rules.
- Knowledge of SAP Payroll reporting and integration.
- Hands-on experience in SAP Payroll implementation.
- Good-to-have skills: experience with SAP SuccessFactors.

Additional Information:
- The candidate should have a minimum of 12 years of experience in SAP HCM Payroll.
- This position is based at our Gurugram office.
- A 15 years full-time education is required.
Posted 2 months ago
3.0 - 8.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: SAP HCM Payroll
Good to have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your day will involve collaborating with team members to develop innovative solutions and ensure seamless application functionality.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Develop and implement SAP HCM Payroll solutions.
- Collaborate with cross-functional teams to analyze and address business requirements.
- Provide technical expertise and support in application development.
- Conduct testing and debugging to ensure application functionality.
- Stay updated on industry trends and best practices for continuous improvement.

Professional & Technical Skills:
- Must-have skills: proficiency in SAP HCM Payroll.
- Strong understanding of SAP HCM modules.
- Experience in SAP Payroll configuration and customization.
- Knowledge of the ABAP programming language.
- Hands-on experience in SAP Payroll schema and rules configuration.

Additional Information:
- The candidate should have a minimum of 3 years of experience in SAP HCM Payroll.
- This position is based at our Hyderabad office.
- A 15 years full-time education is required.
Posted 2 months ago
5.0 - 10.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: SAP HANA DB Administration
Good to have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your day will involve collaborating with teams to create innovative solutions and contribute to key decisions in application development.

Roles & Responsibilities:
- The DBA is responsible for designing, implementing, and maintaining the database system, and must establish policies and procedures pertaining to the management, security, maintenance, and use of the database management system. The DBA ensures that databases are protected and secured, enacts measures to maintain database integrity in terms of data accuracy, and makes sure unauthorized users cannot access the data.
- Manage the components, memory, and data persistence of SAP HANA, while integrating with SAP and non-SAP applications.
- Diagnose and resolve startup and shutdown issues effectively, ensuring minimal downtime.
- Hands-on experience with backup, restore, and recovery activities to support data integrity; perform regular and disaster recovery activities to maintain system resilience.
- Generate DB health check reports to optimize performance and functionality.
- Perform schema refreshes, cloning, and data provisioning tasks for various purposes, including development, testing, and reporting, and provide support for those activities.
- Manage HANA high availability (HA) and disaster recovery (DR) solutions, including system replication, failover mechanisms, and data synchronization for continuous operations.
- Strong understanding of Linux operating systems, including command-line navigation, user management, and system monitoring.
- HANA performance and health checks: status of all services, quality checks, mini-check runs, and EWA report analysis.
- Identify, analyze, and optimize expensive SQL statements to improve application performance.
- Design table partitioning strategy/architecture.
- HANA SDA (Smart Data Access): HANA data provisioning and integration with non-SAP systems; virtual table creation and refresh.
- HANA space overview (data volumes, log volumes, backup volumes, trace management).

Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP HANA DB Administration.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.
Posted 2 months ago
3.0 - 8.0 years
5 - 9 Lacs
Pune
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: SAP HCM Time Management
Good to have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will be involved in designing, building, and configuring applications to meet business process and application requirements. Your typical day will revolve around creating solutions that align with business needs and ensuring seamless application functionality.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Develop and implement SAP HCM Time Management solutions.
- Collaborate with cross-functional teams to gather requirements.
- Troubleshoot and resolve application issues efficiently.
- Stay updated on industry trends and best practices.
- Provide training and support to end-users.

Professional & Technical Skills:
- Must-have skills: proficiency in SAP HCM Time Management.
- Strong understanding of SAP HCM modules and integration.
- Experience in ABAP programming for SAP HCM.
- Knowledge of SAP HR processes and configurations.
- Hands-on experience in SAP Time Evaluation and schema customization.

Additional Information:
- The candidate should have a minimum of 3 years of experience in SAP HCM Time Management.
- This position is based at our Pune office.
- A 15 years full-time education is required.
Posted 2 months ago
5.0 - 10.0 years
10 - 14 Lacs
Pune
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Microsoft Dynamics CRM Technical
Good to have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for overseeing the application development process and ensuring successful project delivery.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Lead the application development process and ensure timely project delivery.
- Provide technical guidance and support to the team.

Professional & Technical Skills:
- Lead the design and development of D365 and Power Platform solutions, ensuring best practices and scalability.
- Sales and Marketing experience is mandatory.
- Architect solutions integrating Power Apps, Power Automate, Power BI, Dataverse, and external systems.
- Define best practices for Power Platform governance, security, and compliance.
- Oversee Azure-based integrations using API Management, Logic Apps, Azure Functions, Service Bus, and Graph API.
- Ensure alignment with enterprise architecture, DevOps strategies, and Microsoft best practices.

Development & Integration:
- Guide teams in building canvas and model-driven apps with custom components (PCF), Power Fx, and JavaScript.
- Implement API-based integrations with Dynamics 365, Azure, Microsoft 365, and third-party services.
- Optimize Dataverse schema, security roles, business rules, and plugins for performance and maintainability.
- Enable Power Automate workflows for process automation, including cloud flows, desktop flows, and RPA.

DevOps, ALM & Performance Optimization:
- Establish CI/CD pipelines for Power Platform ALM using DevOps and Power Platform Build Tools.
- Drive automated testing strategies for Power Platform solutions.
- Monitor and optimize app performance, capacity planning, and data management.

Security & Compliance:
- Ensure Azure AD-based authentication, role-based access control (RBAC), and data security policies.
- Implement Power Platform Center of Excellence (CoE) best practices.
- Define strategies for data governance, compliance, and tenant-wide policies.

Team Leadership & Stakeholder Collaboration:
- Lead and mentor a team of Power Platform developers, testers, and analysts.
- Work closely with business stakeholders, architects, and cross-functional teams to translate requirements into technical solutions.
- Conduct technical reviews, training sessions, and knowledge-sharing initiatives.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Microsoft Dynamics CRM Technical.
- This position is based at our Pune office.
- A 15 years full-time education is required.
Posted 2 months ago
2.0 - 3.0 years
7 - 11 Lacs
Bengaluru
Work from Office
Job Title: Java Developer

About Us: Capco, a Wipro company, is a global technology and management consulting firm. Capco was awarded Consultancy of the Year in the British Bank Awards and ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence across 32 cities worldwide, we support 100+ clients across the banking, financial, and energy sectors. We are recognized for our deep transformation execution and delivery.

WHY JOIN CAPCO: You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers, and other key players in the industry. These are projects that will transform the financial services industry.

MAKE AN IMPACT: Innovative thinking, delivery excellence, and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.

#BEYOURSELFATWORK: Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.

CAREER ADVANCEMENT: With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.

DIVERSITY & INCLUSION: We believe that diversity of people and perspective gives us a competitive advantage.

Work location: Pune. Experience: 5+ years. Notice period: 30 days only.

We are looking for a highly skilled Java Developer with expertise in Spring Boot, Confluent Kafka, and distributed systems. The ideal candidate should have strong experience in designing, developing, and optimizing event-driven applications using Confluent Kafka while leveraging Spring Boot/Spring Cloud for microservices-based architectures.

Key Responsibilities:
- Develop, deploy, and maintain scalable and high-performance applications using Java (Core Java, Collections, Multithreading, Executor Services, CompletableFuture, etc.).
- Work extensively with Confluent Kafka, including producer-consumer frameworks, offset management, and optimization of consumer instances based on message volume.
- Ensure efficient message serialization and deserialization using JSON, Avro, and Protobuf with Kafka Schema Registry.
- Design and implement event-driven architectures with real-time processing capabilities.
- Optimize Kafka consumers for high-throughput and low-latency scenarios.
- Collaborate with cross-functional teams to ensure seamless integration and deployment of services.
- Troubleshoot and resolve performance bottlenecks and scalability issues in distributed environments.
- Familiarity with containerization (Docker, Kubernetes) and cloud platforms is a plus.
- Experience with monitoring and logging tools (Splunk) is a plus.
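To illustrate the Avro-with-Schema-Registry serialization the posting emphasizes (in Python rather than the Java the role requires), here is a hedged sketch using the confluent-kafka client; the registry URL, broker, topic, and the Payment schema are invented for the example.

```python
from confluent_kafka import Producer
from confluent_kafka.schema_registry import SchemaRegistryClient
from confluent_kafka.schema_registry.avro import AvroSerializer
from confluent_kafka.serialization import MessageField, SerializationContext

# Illustrative Avro schema; real schemas would be versioned in the registry.
SCHEMA = """
{"type": "record", "name": "Payment",
 "fields": [{"name": "id", "type": "string"},
            {"name": "amount", "type": "double"}]}
"""

registry = SchemaRegistryClient({"url": "http://localhost:8081"})  # assumed registry
serialize = AvroSerializer(registry, SCHEMA)

producer = Producer({"bootstrap.servers": "localhost:9092"})       # assumed broker
payload = serialize({"id": "p-1", "amount": 99.5},
                    SerializationContext("payments", MessageField.VALUE))
producer.produce("payments", value=payload)
producer.flush()
```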
Posted 2 months ago
6.0 - 8.0 years
7 - 12 Lacs
Bengaluru
Work from Office
Requirements and Responsibilities:
- 6-8 years of experience in database administration with a focus on cloud and migration projects.
- Strong hands-on experience with MongoDB, Azure Cosmos DB for NoSQL, and Azure Database for PostgreSQL.
- Proven experience in Oracle to PostgreSQL migrations, including schema and stored procedure conversions.
- Hands-on experience with NoSQL database migrations.
- Experience with Azure Cosmos DB, MongoDB, and PostgreSQL on Azure; experience with AWS to Azure database migration projects.
- Lead and execute end-to-end NoSQL and relational database migrations across cloud platforms.
- Design and implement solutions involving Azure Cosmos DB for NoSQL, MongoDB, and Azure Database for PostgreSQL.
- Perform Oracle to PostgreSQL database migrations, including schema conversion, data migration, and performance tuning.
- Lead the analysis, planning, and execution of complex database migration projects.
- Use tools and scripting to automate data transformation, migration, and validation processes.
- Troubleshoot issues related to performance, availability, and data consistency.
- Support and lead cloud migration efforts, especially transitioning database workloads from AWS to Azure.
- Solid understanding of cloud-native database services and infrastructure on Azure.
- Document architecture, configurations, and migration strategies.
- Collaborate with architects, developers, and cloud engineers to define and implement best practices for cloud-native database architectures.
- Communicate effectively with stakeholders and cross-functional teams.

Preferred certifications (if any):
- MongoDB Certified DBA
- Microsoft Certified: Azure Database Administrator Associate
- Microsoft Certified: Azure Administrator Associate

Good to have skills:
- Experience with CI/CD pipelines for database deployments.
- Exposure to containerized database deployments using Kubernetes or Docker.
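As a sketch of the kind of migration scripting this role describes, the following Python snippet copies a MongoDB collection between clusters in batches and does a row-count check; the connection URIs and names are placeholders, and a real Cosmos DB migration would also handle throughput limits and retries.

```python
from pymongo import MongoClient

# Source and target URIs are placeholders for, e.g., a self-hosted MongoDB
# and an Azure Cosmos DB for MongoDB account.
source = MongoClient("mongodb://source-host:27017")["shop"]["orders"]
target = MongoClient("mongodb://target-host:27017")["shop"]["orders"]

batch, BATCH_SIZE = [], 1000
for doc in source.find({}, no_cursor_timeout=True):
    batch.append(doc)
    if len(batch) == BATCH_SIZE:
        target.insert_many(batch)   # bulk writes keep migration throughput up
        batch.clear()
if batch:
    target.insert_many(batch)

# Cheap validation pass: document counts should match after the copy.
assert source.count_documents({}) == target.count_documents({})
```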
Posted 2 months ago
7.0 - 11.0 years
6 - 10 Lacs
Hyderabad
Work from Office
Job Description: We are seeking a highly skilled and motivated Senior AWS Snowflake Engineer to join our growing data engineering team. In this role, you will be responsible for building scalable and secure data pipelines and Snowflake-based architectures that power data analytics across the organization. You'll collaborate with business and technical stakeholders to design robust solutions in an AWS environment and play a key role in driving our data strategy forward.

Responsibilities:
- Design, develop, and maintain efficient and scalable Snowflake data warehouse solutions on AWS.
- Build robust ETL/ELT pipelines using SQL, Python, and AWS services (e.g., Glue, Lambda, S3).
- Collaborate with data analysts, engineers, and business teams to gather requirements and design data models aligned with business needs.
- Optimize Snowflake performance through best practices in clustering, partitioning, caching, and query tuning.
- Ensure data quality, accuracy, and completeness across data pipelines and warehouse processes.
- Maintain documentation and enforce best practices for data architecture, governance, and security.
- Continuously evaluate tools, technologies, and processes to improve system reliability, scalability, and performance.
- Ensure compliance with relevant data privacy and security regulations (e.g., GDPR, CCPA).

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Minimum 5 years of experience in data engineering, with at least 3 years of hands-on experience with Snowflake.
Posted 2 months ago
7.0 - 11.0 years
6 - 10 Lacs
Hyderabad
Work from Office
We are seeking a highly skilled and motivated Senior AWS Snowflake Engineer to join our growing data engineering team. In this role, you will be responsible for building scalable and secure data pipelines and Snowflake-based architectures that power data analytics across the organization. You'll collaborate with business and technical stakeholders to design robust solutions in an AWS environment and play a key role in driving our data strategy forward.

Responsibilities:
- Design, develop, and maintain efficient and scalable Snowflake data warehouse solutions on AWS.
- Build robust ETL/ELT pipelines using SQL, Python, and AWS services (e.g., Glue, Lambda, S3).
- Collaborate with data analysts, engineers, and business teams to gather requirements and design data models aligned with business needs.
- Optimize Snowflake performance through best practices in clustering, partitioning, caching, and query tuning.
- Ensure data quality, accuracy, and completeness across data pipelines and warehouse processes.
- Maintain documentation and enforce best practices for data architecture, governance, and security.
- Continuously evaluate tools, technologies, and processes to improve system reliability, scalability, and performance.
- Ensure compliance with relevant data privacy and security regulations (e.g., GDPR, CCPA).

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Minimum 5 years of experience in data engineering, with at least 3 years of hands-on experience with Snowflake.
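To make the Glue/Lambda/S3 pipeline item concrete, here is a minimal sketch of an AWS Lambda handler reacting to S3 object-created events, a common trigger point for loading new files into Snowflake; the bucket and function names are illustrative, and the actual load step is only indicated in a comment.

```python
import json

import boto3

s3 = boto3.client("s3")

def handler(event, context):
    """Lambda entry point: triggered by S3 object-created events, it would
    hand new files to the warehouse load step (names are illustrative)."""
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        head = s3.head_object(Bucket=bucket, Key=key)
        print(json.dumps({"bucket": bucket, "key": key,
                          "bytes": head["ContentLength"]}))
        # A real pipeline would enqueue a Snowflake COPY INTO or a Snowpipe
        # notification here rather than just logging the new object.
```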
Posted 2 months ago
5.0 - 10.0 years
11 - 12 Lacs
Pune
Work from Office
We are looking forward to hiring QAD professionals in the following areas.

Experience required: 7-10 years.

We are looking to hire an experienced administrator who can independently manage the QAD landscape for its place of work, located in Pune, India. The candidate will work on QAD support and projects, working closely with other admins and a team of QAD developers based both onsite and offshore. The person will be responsible for ensuring the QAD environment, Progress databases, and systems operate efficiently and securely by managing the core QAD DBA activities along with Progress DBA, QAD NetUI, QXtend, and technical issues. In addition, the person is expected to have a good understanding of how database AI and BI files work, and must be proactive in monitoring the QAD environment and systems to avoid application issues, ensuring high availability.

Required Skills:
- 5 to 7 years of technical experience with QAD Enterprise Edition (EE), QAD Standard Edition, lower versions of QAD, and Progress products, including being well versed with RDBMS concepts and comfortable performing Progress DBA tasks and YAB administration.
- Minimum 5 years of hands-on experience in end-to-end management of a QAD landscape.
- QAD financial patch and add-on tool installation.
- QAD i19 add-on package installation and setup.
- QAD YAB and add-on tools upgrades.
- Good knowledge of Unix and Linux.
- Manage physical aspects of the database such as DBMS setup, backups, extent addition, after-image file management, schema addition, and dump/load.
- Work with the QAD development team on schema management, suggesting schema modifications if any.
- Ensure Progress database systems operate efficiently and securely.
- Proactive production monitoring (and manual monitoring, if any), plus non-production monitoring.
- Perform database refreshes and other DBA tasks.
- Create performance, DB, table, and index analysis reports.
- Work on assigned incidents and tasks, resolving them per defined Service Level Agreements (SLAs).

Additional skills:
- Building and implementing processes in the QAD landscape.
- Experience leading a team would be a plus.
- Experience with QXtend and QAD internationalization is a plus.
- A strong "let's do it" attitude; strong analytical and conceptual skills.
- Likes working independently on multiple assignments simultaneously.
- Proactive attitude, proposing improvements where applicable.
- Excellent English verbal and written communication skills.

Educational Qualification: Graduate or Post Graduate.

Benefits:
- Opportunity to work on QAD 2017 EE.
- Exposure beyond just QAD systems, like TMS.

This role is best suited for someone with zest and an intrapreneurial spirit. Our Hyperlearning workplace is grounded upon four principles: flexible work arrangements, free spirit, and emotional positivity; agile self-determination, trust, transparency, and open collaboration; all support needed for the realization of business goals; and stable employment with a great atmosphere and an ethical corporate culture.
Posted 2 months ago
4.0 - 6.0 years
14 - 15 Lacs
Bengaluru
Work from Office
Role: Oracle Database Developer (Senior)
Location: Offshore/India

Who are we looking for? We are looking for 5-7 years of experience in database development, with strong knowledge of database management systems such as SQL Server and Oracle.

Technical Skills:
- 5-7 years of database development experience.
- Strong SQL experience in creating database objects like tables, stored procedures, DDL/DML triggers, views, indexes, cursors, functions, and user-defined data types.
- Good experience with the Snowflake cloud data platform, including Snowflake utilities like SnowSQL and Snowpipe, administering Snowflake, and data loading within the cloud (AWS).
- Strong understanding of data warehousing (including star schema and snowflake schema) and extraction, transformation, loading (ETL).
- Experience in MongoDB, PostgreSQL, and Amazon DB.
- Experience in Talend and Cognos.
- Familiarity with modern frameworks (e.g., .NET, Java, Python); familiarity with programming languages such as Python or Java is a plus.
- Experience with cloud development/technologies and architecture would be a plus.
- Strong written and oral communication skills; excellent problem-solving and quantitative skills.
- Demonstrated ability to work as part of a team.
- Investment management experience in the past.

Process Skills:
- Ability to evaluate, analyze, design, and implement solutions based on technical requirements.
- Develop and peer-review LLDs (initiate/participate in peer reviews).
- Strong design and technical skills; ability to translate business needs into technical solutions and analyze their impact.

Behavioral Skills:
- Resolve technical issues of projects and explore alternate designs.
- Participate as a team member and foster teamwork through inter-group coordination within the modules of the project.
- Effectively collaborate and communicate with stakeholders and ensure client satisfaction.
- Train and coach members of project groups to ensure effective knowledge management activity.

Qualification: At least 5-7 years of work experience; any degree from a reputed college.
Posted 2 months ago
4.0 - 8.0 years
6 - 10 Lacs
Bengaluru
Work from Office
Key Responsibilities:
- Collaborate with stakeholders to understand their data needs and develop a MongoDB strategy that aligns with the organization's overall data goals. This includes determining the best use cases for MongoDB, considering scalability, performance, and security requirements.
- Define and implement data governance policies and procedures specifically for MongoDB data, ensuring data quality, consistency, and compliance with relevant regulations. This may involve setting standards for data modelling, schema design, and access control.
- Design and document the MongoDB architecture, including data models, data flows, sharding strategies, replication setups, and indexing plans, ensuring a robust and scalable MongoDB infrastructure that meets the organization's needs.
- Evaluate and recommend appropriate MongoDB tools and technologies, including drivers, libraries, monitoring tools, and cloud-based services, to optimize MongoDB performance and management.
- Create and maintain data models for MongoDB, leveraging its document-oriented nature to represent data effectively. This involves understanding data relationships, designing efficient schemas, and optimizing data structures for specific use cases.
- Deploy, configure, and manage MongoDB instances, including sharding, replication, and backup/recovery mechanisms, ensuring high availability, scalability, and data resilience.
- Analyze MongoDB performance, identify bottlenecks, and implement solutions to improve query performance, data access speed, and overall efficiency. This may involve optimizing queries, creating appropriate indexes, and adjusting sharding strategies.
- Implement security measures for MongoDB, including access control, authentication, data encryption, and auditing, to protect sensitive data and comply with relevant regulations.
- Integrate MongoDB with cloud platforms like AWS, Azure, or GCP, leveraging cloud-based services for storage, compute, and management to enable scalability, cost optimization, and enhanced manageability.
- Monitor MongoDB instances, track performance metrics, and address any issues or anomalies, using monitoring tools and log analysis to proactively identify and resolve potential problems.
- Design and manage data pipelines that integrate MongoDB with other data sources and systems, ensuring data flows smoothly and efficiently between MongoDB and other applications.
- Plan and manage MongoDB capacity, forecasting data storage requirements and scaling MongoDB instances to meet current and future data growth.
- Collaborate effectively with stakeholders, including developers, data analysts, operations teams, and business users, to communicate MongoDB architecture plans, address concerns, and ensure alignment.
- Stay updated with the latest MongoDB features, best practices, and industry trends to continuously improve the organization's MongoDB architecture and processes.
- Participate in Agile Scrum planning, estimation, and sprint execution.

Required Qualifications (to be successful in this role):
- Deep understanding of MongoDB architecture (document-oriented data model, sharding, replication, indexing, etc.).
- Proficiency in creating effective schema structures, understanding data relationships, and optimizing queries for specific use cases.
- Strong skills in analyzing performance bottlenecks, identifying areas for improvement, and implementing appropriate solutions.
- Strong technical foundation in data warehousing, data modelling, data integration, and data governance principles.
- Familiarity with security best practices, access control mechanisms, data encryption, and backup/recovery strategies.

Additional Information:
- Job Type: Full Time
- Work Profile: Hybrid (Work from Office/Remote)
- Years of Experience: 8-12 years
- Location: Bangalore

What We Offer:
- Competitive salaries and comprehensive health benefits.
- Flexible work hours and remote work options.
- Professional development and training opportunities.
- A supportive and inclusive work environment.
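As an illustration of the sharding and indexing responsibilities above, this hedged pymongo sketch shards a collection on a hashed key, builds a compound index, and checks the query plan; the host, database, and key names are assumptions.

```python
from pymongo import ASCENDING, MongoClient

client = MongoClient("mongodb://mongos-host:27017")  # assumed mongos router

# Shard the collection on a hashed key so writes spread evenly across shards.
client.admin.command("enableSharding", "analytics")
client.admin.command("shardCollection", "analytics.events",
                     key={"user_id": "hashed"})

events = client["analytics"]["events"]
# Compound index supporting the dominant query pattern (per-user, time-ordered).
events.create_index([("user_id", ASCENDING), ("ts", ASCENDING)])

# explain() confirms the planner uses the index rather than a collection scan.
plan = events.find({"user_id": 42}).sort("ts", 1).explain()
print(plan["queryPlanner"]["winningPlan"])
```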
Posted 2 months ago
3.0 - 4.0 years
4 - 8 Lacs
Noida
Work from Office
Responsibilities:
- Build and maintain web applications using MongoDB, Express.js, React.js, and Node.js.
- Develop responsive and dynamic user interfaces with React.js.
- Design and implement RESTful APIs using Node.js and Express.js.
- Handle database operations using MongoDB, including schema design and query optimization.
- Design and build applications and systems based on requirements and wireframes.
- Write well-designed, efficient, and testable code.
- Understanding of MVC, design patterns, and data structures.
- Actively communicate with clients to understand functional requirements.
- Work closely with designers and other developers to implement functional and visually appealing applications.
- Identify and fix bugs in the application, and write unit and integration tests to ensure application stability.
- Use Git for version control and participate in code reviews to maintain code quality.
- Optimize applications for maximum speed and scalability.
- Adhere to industry best practices and contribute to internal coding standards.
- Participate in agile ceremonies like stand-ups, sprint planning, and retrospectives.

Requirements:
- At least 3-4 years of experience in MERN stack development.
- Proficiency in JavaScript and ES6+ features.
- Strong experience with React.js and understanding of state management libraries like Redux.
- Experience with Node.js, Express.js, and MongoDB.
- Familiarity with front-end libraries like Material-UI or Bootstrap, and front-end build tools like Webpack and Babel.
- Knowledge of RESTful APIs and web services.
- Familiarity with development processes using cloud platforms like AWS or Azure; hands-on experience preferred.
- Experience with Docker and containerization.

Benefits:
- Office hours: 5 days a week, with the first and third Saturday working; office timing 10 am to 7:00 pm.
- Small and friendly team culture with high exposure to learning in different domains.
- Increment: as per market standards.
- Provident fund, paid leave encashment, and medical insurance.
Posted 2 months ago