Roles & Responsibilities: 1. Experience in a webMethods support role 2. Experience in L1, L2, and L3 support 3. Experience with webMethods integrations 4. Experience handling P1, P2, and P3 incidents 5. Experience with the webMethods publish-subscribe (pub-sub) model
A full-stack web developer is required for an MNC client, with strong experience in Node.js, React.js, Redux, and AWS.
Roles & Responsibilities: 8-10 years of hands-on functional consulting experience in SAP B1 Finance modules. Experience in configuring and mapping business functionalities to AP, AR, GL, Assets, and Taxation. Understanding of Finance, Taxation, and Assets.
Roles & Responsibilities: 4-6 years of hands-on functional consulting experience in SAP B1, with a deep focus on the Manufacturing module. Experience in SAP B1 to SAP ECC migration. Experience in make-to-stock and make-to-order scenarios, and in data migration.
Roles & Responsibilities: 4-6 years of hands-on functional consulting experience in SAP B1, covering the Sales (SD) and Inventory Management (MM) modules. Experience in full-lifecycle SAP implementation projects. Experience in generating/extracting Sales
Roles & Responsibilities: 4-6 years of hands-on technical experience with SAP Business One (SAP B1). Proven experience in SAP B1 to SAP ECC migration projects. Proficiency in ABAP development. Experience building customised functionality in SAP B1.
Roles & Responsibilities: Experience with Salesforce Agentforce AI implementations. Experience with Sales Cloud and Service Cloud. Expertise in SFDC with Apex code, Visualforce, Lightning UI, Triggers, and LWC. Experience with Agile methodologies.
1. Data Platform Design & Implementation
Architect and deploy scalable, secure, and high-performing Snowflake environments in line with data segregation policies. Automate infrastructure provisioning, testing, and deployment for seamless operations.
2. Data Integration & Pipeline Development
Develop, optimize, and maintain data pipelines (ETL/ELT) to ensure efficient data ingestion, transformation, and migration. Implement best practices for data consistency, quality, and performance across cloud and on-premises systems.
3. Data Transformation & Modeling
Design and implement data models that enable efficient reporting and analytics. Develop data transformation processes using Snowflake, DBT, and Python to enhance usability and accessibility.
4. Networking, Security & Compliance
Configure and manage secure network connectivity for data ingestion. Ensure compliance with GDPR, CISO policies, and industry security standards.
5. Data Quality & Governance
Ensure the Data Segregation Policy is firmly followed for the data sets enabled. Implement data validation, anomaly detection, and quality-assurance frameworks. Collaborate with the Data Governance team to maintain compliance and integrate quality checks into data pipelines.
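The data-quality duties above (validation, anomaly detection, quality checks integrated into pipelines) can be sketched in plain Python. This is a minimal illustration only: the column names, rules, and thresholds are assumptions, not taken from the posting, and a real pipeline would run such checks inside Snowflake/DBT rather than in application code.

```python
# Minimal data-validation sketch: null checks and range checks over rows.
# Column names ("id", "amount") and the range are illustrative assumptions.

def validate_rows(rows, required=("id", "amount"), amount_range=(0, 1_000_000)):
    """Split rows into (valid, errors) after basic quality checks."""
    valid, errors = [], []
    lo, hi = amount_range
    for i, row in enumerate(rows):
        missing = [c for c in required if row.get(c) is None]
        if missing:
            errors.append((i, f"missing columns: {missing}"))
            continue
        if not (lo <= row["amount"] <= hi):
            errors.append((i, f"amount {row['amount']} out of range"))
            continue
        valid.append(row)
    return valid, errors

rows = [
    {"id": 1, "amount": 250.0},
    {"id": 2, "amount": None},       # fails the null check
    {"id": 3, "amount": 2_000_000},  # fails the range check
]
valid, errors = validate_rows(rows)
```

Rejected rows would typically be routed to a quarantine table and surfaced to the Data Governance team rather than silently dropped.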
Job Title: Senior Python Backend Developer (AWS Serverless & Event-Driven Architecture)
Job Description: We are seeking an experienced Python backend developer with expertise in asynchronous programming and AWS serverless architecture to design and develop scalable, event-driven microservices.
Key Responsibilities: Develop APIs using FastAPI, Flask, or Django (async views). Design and implement event-driven microservices using AWS Lambda, API Gateway, DynamoDB (GSI/LSI), EventBridge, Step Functions, SNS, and SQS. Apply API standards with Pydantic, OAuth2/JWT, and rate limiting. Build resilient, idempotent services with observability using AWS X-Ray, CloudWatch, DLQs, and retries. Optimize DynamoDB schemas, TTLs, and streams.
Requirements: 4+ years of backend development experience with Python. Strong expertise in the AWS serverless stack.
• Design, develop, and maintain applications on the IBM i (AS/400) platform. • Write programs using RPG IV/ILE, CL, and DDS. • Analyze business requirements and translate them into technical specifications. • Perform unit testing, debugging, and performance tuning.
Are you passionate about connecting talent with opportunity? Manage the end-to-end IT recruitment lifecycle. Source, screen, and shortlist candidates for technical roles. Coordinate with stakeholders. Maintain a strong candidate pipeline.
Role & responsibilities: Lead the architecture, design, and implementation of robust, scalable, and high-performing ML and AI platforms. Hands-on implementation of parallel computing and distributed training methodologies to enhance the efficiency and scalability of machine learning models. Collaborate closely with data scientists and engineers to deploy complex ML and deep learning models into mission-critical production systems. Ensure best practices in CI/CD, containerization, orchestration, and infrastructure-as-code are consistently applied across platforms. Foster a culture of innovation, continuous improvement, and self-service analytics across the team and organization.
Knowledge and Skills: Deep expertise in designing and deploying ML/AI platforms, specifically using AWS SageMaker and MLOps. Strong proficiency in Python and its ecosystem (e.g., TensorFlow, PyTorch, scikit-learn). Extensive hands-on experience with parallel computing frameworks and distributed processing. Proficient in SQL, ETL, data warehousing, and data modeling techniques. Thorough understanding of statistical analysis, predictive modeling, and data mining methodologies. Proven capability of deploying machine learning models into production at scale. Familiarity with CI/CD pipelines, containerization (Docker), and version control (Git).
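The parallel-computing skill above can be illustrated with a data-parallel batch-scoring sketch: the input is split across workers and results are reassembled in order. This is only a stand-in: a ThreadPoolExecutor substitutes for distributed workers (e.g., a SageMaker batch transform), and the "model" is a stub function, not a trained estimator.

```python
# Data-parallel scoring sketch: split data round-robin across workers,
# score chunks concurrently, then reassemble in the original order.
# Threads stand in for distributed workers; predict() is a stub model.
from concurrent.futures import ThreadPoolExecutor

def predict(x):
    """Stub model: a fixed linear rule standing in for a trained estimator."""
    return 2 * x + 1

def score_chunk(chunk):
    return [predict(x) for x in chunk]

def parallel_score(data, workers=4):
    chunks = [data[i::workers] for i in range(workers)]  # round-robin split
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(score_chunk, chunks))    # preserves chunk order
    out = [None] * len(data)
    for offset, scored in enumerate(results):            # undo the split
        for j, y in enumerate(scored):
            out[offset + j * workers] = y
    return out

scores = parallel_score(list(range(10)))
```

The same split/score/merge shape applies whether the workers are threads, processes, or remote nodes; only the executor changes.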
Work Location: Bangalore Whitefield
Mode of Hire: Contract to Hire (C2H)
Notice Period: Immediate to 10 Days
Experience: 8+ Years (6+ Years in Database/Infrastructure Management)
Role Overview: We are looking for an experienced Redis & Kafka Administrator (L3 Support) to join our dynamic infrastructure team. The ideal candidate will have strong expertise in managing, optimizing, and securing large-scale Redis and Kafka clusters while providing advanced support and automation for mission-critical systems.
Key Roles & Responsibilities
Redis Administration: Manage and maintain Redis clusters (Standalone, Sentinel, and Redis Enterprise setups). Perform upgrades, patching, and capacity planning. Configure persistence (RDB, AOF) and ensure high availability. Monitor and optimize Redis performance through configuration tuning. Manage backup, recovery, and disaster recovery processes. Troubleshoot and resolve production-related issues.
Kafka Administration: Deploy, configure, and maintain Kafka clusters (including ZooKeeper). Monitor performance, optimize parameters, and ensure message reliability. Manage topics, partitions, replication, and retention policies. Configure security (SSL, SASL, ACLs) for secure data streaming. Support Schema Registry, Kafka Connect, Kafka Streams, and MirrorMaker. Troubleshoot consumer/producer lag and broker-related issues.
L3 Support & Automation: Act as an escalation point for complex incidents and perform root cause analysis. Automate routine DBA tasks using scripting (Python, Bash, etc.). Develop and manage Infrastructure as Code (Terraform, Ansible). Collaborate with SRE and DevOps teams to integrate with CI/CD pipelines. Participate in on-call rotations and incident response activities.
Must-Have Skills: 8+ years of overall experience, including 6+ years in database/infrastructure management. Minimum 3 years of hands-on experience with Redis. Minimum 3 years of hands-on experience with Kafka. Strong Linux system administration skills. Proficiency in scripting (Bash, Python, or equivalent). Experience with monitoring tools like Prometheus, Grafana, ELK, etc. Working knowledge of Docker and Kubernetes. Good understanding of networking, firewalls, DNS, and load balancing.
Good to Have: Experience with Redis Enterprise and Confluent Kafka. Certifications in Redis, Kafka, or Linux. Familiarity with AWS, GCP, or Azure. Experience in PCI/HIPAA-compliant environments.
Qualification: Bachelor's degree in Computer Science, Information Technology, or a related field.
Preferred Certifications: MongoDB Certified DBA Associate or Professional. Other relevant certifications in database or infrastructure management are a plus.
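The consumer-lag troubleshooting mentioned above reduces to one number per partition: the log-end offset minus the group's committed offset. As a hedged sketch, the offsets below are plain dicts standing in for what `kafka-consumer-groups.sh --describe` (or an AdminClient query) would report; the partition numbers and offsets are made up for illustration.

```python
# Consumer-lag sketch: lag = log-end offset - committed offset, per
# partition. A partition with no committed offset lags by the full log.

def consumer_lag(end_offsets, committed):
    """Return {partition: lag} for one consumer group on one topic."""
    return {p: end - committed.get(p, 0) for p, end in end_offsets.items()}

end_offsets = {0: 1500, 1: 980, 2: 2040}  # log-end offset per partition
committed = {0: 1500, 1: 900}             # partition 2: nothing committed yet
lag = consumer_lag(end_offsets, committed)
```

A steadily growing total lag is the usual signal that consumers are underprovisioned or a partition's consumer has stalled; per-partition lag pinpoints which one.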
• Design, develop, and maintain applications using Java (Spring Boot microservices) and Node.js. • Build and integrate RESTful APIs and microservices to support front-end and third-party integrations. • Develop responsive and dynamic front-end applications using Angular / React.
Design and implement end-to-end Power BI solutions using Power BI interactive and paginated reporting. Build efficient and scalable semantic models using DAX and Power BI best practices. Collaborate with business and technical stakeholders to gather requirements. Optimize data performance and visualization across reports and dashboards. Guide and mentor BI developers and analysts.
Required Skills: 8 to 13 years of experience in BI, with a strong focus on Power BI and Microsoft Fabric. Mandatory: deep hands-on experience with semantic modeling and distribution of reports. Proficient in DAX, Power Query (M), and data modeling techniques. Proficient in Power BI Apps and subscriptions. Experience in dynamic RLS, field parameters, and calculation groups. Experience in Dataflow Gen1 and Gen2, and deployment pipelines. Experience in Power Platform (Power Automate and Power Apps). Experience in Lakehouse, Medallion Architecture, Warehouse, and OneLake. Experience with data warehousing, ETL processes, and cloud platforms.
Hiring a Senior Android Developer (Kotlin) with 6+ years of experience. Strong in Kotlin, MVVM, Jetpack, Coroutines, Dagger/Hilt, and Android TV/OTT apps. Must ensure scalability and performance, and mentor team members. Experience with Jetpack Compose and TIF is a plus.