Role & Responsibilities:
- Eager to learn
- Problem solving
- Good communication skills
- Ability to resolve customer queries
We are seeking an experienced Data Engineer with 4 to 5 years of experience, including at least 3 years designing and implementing data solutions on the Databricks platform. The ideal candidate will have a solid background in distributed computing, big data technologies, and cloud platforms.
Key Responsibilities:
- Design, implement, and maintain scalable data solutions using Databricks.
- Work with programming languages such as Python, SQL, or Scala.
- Collaborate with cross-functional teams to deliver data-driven solutions.
- Optimize and manage data pipelines using Apache Spark and cloud services (AWS, Azure, or GCP).
- Leverage DevOps practices to automate deployments and monitor data pipelines.
Key Skills:
- Proficiency in Python, SQL, or Scala.
- Hands-on experience with Databricks and Apache Spark.
- Strong knowledge of cloud platforms such as AWS, Azure, or GCP.
- Good understanding of distributed computing and big data technologies.
- Familiarity with containerization (Docker, Kubernetes) and DevOps practices.
Good to Have:
- Experience with containerization technologies (Docker, Kubernetes).
- Knowledge of automated deployment and monitoring for data pipelines.
Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
Work Location: Bangalore, Hyderabad (preferred)
How to Apply: Please send your updated resume to [email] with the subject line "Data Engineer - Databricks, SQL, Python".
Job Category: Developer
TechMantra Global is looking for an SAP CPI Consultant (with BTP knowledge) to join our dynamic team and embark on a rewarding career journey.
Responsibilities:
- Undertake short-term or long-term projects to address a variety of issues and needs.
- Meet with management or appropriate staff to understand their requirements.
- Use interviews, surveys, etc. to collect necessary data.
- Conduct situational and data analysis to identify and understand a problem or issue.
- Present and explain findings to appropriate executives.
- Provide advice or suggestions for improvement according to objectives.
- Formulate plans to implement recommendations and overcome objections.
- Arrange for or provide training to people affected by change.
- Evaluate the situation periodically and make adjustments when needed.
- Replenish knowledge of the industry, products, and field.
We are seeking a skilled Boomi Associate with extensive experience in Dell Boomi integrations. The role involves handling the full software lifecycle, from planning, requirement gathering, system design, and configuration to testing, integration, and deployment. You will work in an Agile Scrum environment, ensuring timely project delivery while handling complex situations under pressure.
Key Responsibilities:
- Implement Boomi integrations with various applications.
- Manage the end-to-end software lifecycle, including system design, configuration, and testing.
- Collaborate with cross-functional teams and ensure timely project delivery.
- Provide user training and troubleshoot ongoing projects.
Mandatory Skills:
- Dell Boomi
- Dell Boomi Associate Developer and Professional Developer certifications
Good to Have:
- Experience with client interaction.
- Strong analytical, communication, and problem-solving skills.
- Team management and mentoring experience.
Job Category: Information Technology
We are looking for an experienced SAP FIORI and OData Developer with expertise in SAP T-codes, OData services, and UI5 apps. The role involves supporting custom-built UI5 applications and developing new custom apps for our clients.
Key Responsibilities:
- Support and maintain existing UI5 custom applications.
- Design and develop new custom apps for clients.
- Work with SAP NetWeaver Gateway services and OData/JSON/RESTful data models.
- Hands-on experience with CDS views, HANA views, and SAP ABAP development.
- Knowledge of SAP Gateway and core ABAP methodologies.
Mandatory Skills:
- SAP FIORI, OData services, SAP T-codes
- Experience with SAP NetWeaver Gateway, CDS views, HANA views
- SAP ABAP development and programming
Desired Skills: Same as mandatory skills.
Domain: Manufacturing
Job Category: Information Technology
We are looking for an experienced Power BI Developer with 3+ years of expertise in business intelligence or data analysis. This role requires expert-level proficiency in Power BI Desktop and Power BI Service. You will be responsible for creating data-driven solutions, developing dashboards, and providing insights for business decision-making.
Key Responsibilities:
- Design and implement Power BI dashboards and reports.
- Analyze and visualize data for business insights.
- Collaborate with stakeholders to understand requirements and deliver solutions.
- Manage end-to-end data analysis and reporting tasks.
Mandatory Skills:
- 3+ years of experience in business intelligence or data analysis.
- Expert-level proficiency in Power BI Desktop and Power BI Service.
Good to Have:
- Knowledge of cloud-based data platforms (Azure, AWS, GCP).
Job Category: Information Technology
We are seeking an experienced Azure Data Engineer with strong expertise in designing, developing, and implementing data pipelines using Azure Data Factory (ADF). You will be responsible for extracting, transforming, and loading data into Azure Data Lake Storage (ADLS) from various sources.
Key Responsibilities:
- Design, develop, and manage ETL pipelines using Azure Data Factory.
- Extract, transform, and load data into Azure Data Lake Storage (ADLS).
- Collaborate with data teams and stakeholders to ensure optimal data architecture.
Mandatory Skills:
- Proficiency in Azure Data Factory for data pipeline development.
- Experience with Azure Data Lake Storage (ADLS).
Good to Have:
- Experience with Azure Databricks, Python, and PySpark.
Work Mode: Hybrid
Start Date: Immediate
Job Category: Information Technology
We are looking for a Technical Lead with 8+ years of total experience and 5+ years of relevant experience in Snowflake, SQL, and Python. The candidate will be responsible for leading a team and contributing to design and development activities. Hands-on expertise in Snowflake, SQL, and Tableau is essential.
Key Responsibilities:
- Lead a team of developers to deliver high-quality solutions.
- Design and develop solutions using Snowflake, SQL, and Python.
- Provide problem-solving and technical expertise.
- Collaborate with cross-functional teams to ensure seamless project execution.
Mandatory Skills:
- Snowflake
- SQL
- Python
Work Model: Hybrid (work from office 2-3 days/week)
Joining Time: Within 15-20 days
Job Category: Information Technology
We are seeking an experienced Technology Lead with expertise in Quarkus, Java Spring, Kong APIM, and DevOps. This role involves leading integration projects, developing microservices, and working in an agile, CI/CD environment. You will also manage and groom a geographically distributed team of developers.
Key Responsibilities:
- Lead the development of microservices and integrations using Quarkus, Java Spring, and Kong APIM.
- Collaborate in a DevOps environment, ensuring smooth build and deployment processes.
- Provide technical leadership and guidance to a team of developers.
- Ensure adherence to CI/CD and Agile methodologies.
- Manage integration projects related to Identity and Access Management.
Key Skills:
- Proficient in Quarkus, Java Spring, and Kong APIM.
- Experience in API management.
- Strong knowledge of DevOps tools and practices.
- Experience leading development teams and working with distributed teams.
Qualifications:
- 7+ years of relevant experience in integration and related technologies.
- Strong communication and leadership skills.
- Familiarity with the Identity and Access Management domain.
Job Category: Information Technology
Strong hands-on experience in Java technologies as the primary skill, with Java 17 or above. Strong hands-on experience with Angular 15+ and TypeScript; exposure to ReactJS.
Mandatory skills:
- Unix systems, shell scripting, Python, etc.
- GitHub, Gradle/Maven, Jenkins, and Ansible
Desired skills:
- Agile methodologies such as Kanban/Scrum
Must be an individual contributor who can manage tasks with minimal guidance.
Responsibilities:
- Perform system testing, system integration testing, and regression testing, manually and/or via automation.
- Create detailed, comprehensive, and well-structured test plans and test cases.
- Automate basic regression scenarios.
- Track quality assurance metrics, such as defect density and defect leakage.
- Identify, isolate, and track bugs throughout testing.
- Liaise with internal teams (e.g., developers and product managers) to identify system requirements.
- Provide summary and detailed test execution reports to the Test Lead/Test Manager.
Knowledge, Skills & Experience:
- Bachelor's degree in computer science, IT, or a similar technical field.
- 7-9 years of functional testing experience in Oracle Retail RMS/RPM/ReSA/WMS.
- Strong knowledge of software QA methodologies and working experience with current test tools (Azure TFS/JIRA/ALM).
- Good experience in end-to-end testing of integrated applications.
- Strong knowledge of Oracle SQL, PL/SQL, and scripting; proficient in Excel.
- Strong knowledge of Unix scripting.
- Hands-on test automation experience and proficiency in at least one scripting language.
- Experience writing clear, concise, and comprehensive test plans and test cases.
- Experience working in an Agile/Scrum development process.
- Excellent verbal and written communication skills, with problem-solving skills.
Required Leadership Skills:
- Demonstrated ability to work well with others.
- Responds effectively to multiple demands.
- Assumes responsibility for the accuracy of work processes and the flow of multiple tasks.
- Good communication skills, written and verbal.
Total Yrs. of Experience: 8+
Relevant Yrs. of Experience: 8
Detailed JD (Roles and Responsibilities):
- Analyze and design the system requirements.
- Create RAML specifications as API contracts for development.
- Understand and implement DataWeave scripts.
- Communicate with the project client and onshore counterpart.
- Deploy APIs to CloudHub, Runtime Fabric, on-prem workers, etc.
- Practice API-led connectivity.
- Implement basic policies in API Manager.
- Excellent problem-solving skills.
- Maintain quality documentation for API specifications.
- Perform extensive unit and integration testing.
- Create/manage basic CI/CD pipelines (Maven plugin, Jenkins, Bamboo, etc.).
Mandatory skills:
- Review, interpret, and respond to business requirements and specifications to ensure alignment between business expectations, the integration roadmap, and core systems.
- Within the agreed program scope, deliver and contribute to a suite of integration architecture artefacts and deliverables that define how the Future Integration Platform (FIP) will integrate with other core systems.
- Work closely with the offshore integration development team to guide the detailed design, development, and implementation of the integrations according to the enterprise design, within agreed budget and timeframes.
- Lead the development of repeatable integration technology/information solution patterns for applying appropriate technology in alignment with business requirements and objectives.
- Help standardize automation, logging, and monitoring systems that will enable teams to be more agile.
- Implement architectural designs based on MuleSoft best practices.
- Extensive experience in designing, developing, and delivering MuleSoft artifacts.
- Experience creating architecture artifacts associated with the C4E model.
- Experience leading an offshore integration development team.
- Tertiary qualification in IT.
- Strong technical and problem-solving skills; able to deliver under pressure.
- Strong stakeholder engagement and communication skills.
- Strong technical leadership and the ability to influence and communicate with different audiences.
- Able to lead or assist in the creation and development of integration platform roadmaps, including performing current-state analysis, defining the future state, and creating the roadmap to the future-state architecture.
- Takes responsibility for the global integration architecture direction of the solution.
- Strong experience and interest in solving problems in the system integration domain.
- Strong grasp of Service-Oriented Architecture (SOA), Event-Driven Architecture (EDA), and microservices.
- Strong engineering design skills: knows how to break down complex problems and follow industry design methods to describe solutions.
- Strong foundation in integration technology and integration patterns.
- Solid hands-on experience in MuleSoft is a must.
- Experience with webMethods, Kafka, SAP Integration Suite, and similar technologies is a plus.
- Keen interest in open-source technologies and contributing to the C4E.
- Solution delivery with Agile, Waterfall, and CI/CD practices, such as GitLab and Jenkins.
- Familiar with techniques for automating platforms, deployments, system alerts, and monitoring dashboards.
CERTIFICATION REQUIREMENT: Preferably Agile certified.
- 10+ years of experience leading the design and implementation of AI/ML initiatives.
- Well versed in the GenAI ecosystem.
- Experience deploying AI systems in production at scale (MLOps), ideally in the retail or e-commerce domains.
- Sound, proven experience in Python is required.
- Experience with large-scale consulting and program execution engagements in AI and data.
- Proven experience leading and working with AI engineering teams and delivering production-grade AI/ML systems at scale.
- Deep expertise in machine learning, deep learning, and NLP (LLMs, embeddings, anomaly detection).
- Must have implemented at least one recommendation engine (end to end); should be familiar with algorithms such as time series forecasting (autoregressive and exponential smoothing models), gradient boosting, and reinforcement learning. Knowledge of collaborative and content-based filtering is also required.
- Strong hands-on experience with Python, TensorFlow/PyTorch, vector databases, MLOps, and cloud platforms (AWS/GCP/Azure).
- Strong expertise in machine learning algorithms, deep learning, NLP, computer vision, and generative AI technologies.
- Hands-on experience with AI/ML frameworks and libraries such as TensorFlow, PyTorch, Keras, Hugging Face Transformers, LangChain, MLflow, and related tools.
- Solid understanding of data engineering concepts and ETL pipelines, and working knowledge of distributed computing frameworks (Spark, Hadoop).
- Experience with cloud platforms (AWS, Azure, GCP) and container orchestration (Kubernetes, Docker).
- SAP BTP Integration Suite (CPI) experience in implementation and production support.
- 8-10 years of hands-on experience in SAP integration.
- Should play the role of an SME/technical expert and help in the resolution of complex business problems.
- Should demonstrate team-lead skills and have successfully executed such roles in the past.
- Should have good experience with SAP S/4HANA integration (OData, web services, IDoc, and proxy technologies).
- Should be aware of the SAP RISE construct.
- Should possess knowledge of retail business processes (P2P and O2C).
- As a lead, should have good communication skills and be able to collaborate effectively with business, security and infrastructure, IT support, and third-party vendors to drive topics.
- Must be an expert at developing iFlows and configuring standard adapters, with a good understanding of all key configurations (OData, IDoc, SFTP, Proxy, SOAP, HTTP, Process Direct, REST, JDBC).
- Should have expertise in mapping data structures using the graphical editor, Groovy scripts, and JavaScript, with very good knowledge of JSON, XML, XSD, WSDL, RAML, and XSLT.
- Should be adept at analyzing integration issues and driving teams toward successful resolution.
Skills Required:
- Familiarity with data processing engines such as Apache Spark, Flink, or other big data tools.
- Design, develop, and implement robust data lake architectures on cloud platforms (AWS/Azure).
- Implement streaming and batch data pipelines using Apache Hudi, Apache Hive, and cloud-native services such as AWS Glue, Azure Data Lake, etc.
- Architect and optimize ingestion, compaction, partitioning, and indexing strategies in Apache Hudi.
- Develop scalable data transformation and ETL frameworks using Python, Spark, and Flink.
- Work closely with DataOps/DevOps to build CI/CD pipelines and monitoring tools for data lake platforms.
- Ensure data governance, schema evolution handling, lineage tracking, and compliance.
- Sound knowledge of Hive, Parquet/ORC formats, and Delta Lake vs. Hudi vs. Iceberg.
- Strong understanding of schema evolution, data versioning, and ACID guarantees in data lakes.
- Collaborate with analytics and BI teams to deliver clean, reliable, and timely datasets.
- Troubleshoot performance bottlenecks in big data processing workloads and pipelines.
- Experience with data governance tools and practices, including data cataloging, data lineage, and metadata management.
- Strong understanding of data integration and movement between different storage systems (databases, data lakes, data warehouses).
- Strong understanding of API integration for data ingestion, including RESTful services and streaming data.
- Experience in data migration strategies, tools, and frameworks for moving data from legacy on-premises systems to cloud-based solutions.
- Proficiency with data warehousing solutions (e.g., Google BigQuery, Snowflake).
- Expertise in data modeling tools and techniques (e.g., SAP Datasphere, EA Sparx).
- Strong knowledge of SQL and NoSQL databases (e.g., MongoDB, Cassandra).
- Familiarity with cloud platforms (e.g., AWS, Azure, Google Cloud).
Nice to Have:
- Experience with Apache Iceberg, Delta Lake
- Familiarity with Kinesis, Kafka, or any streaming platform
- Exposure to dbt, Airflow, or Dagster
- Experience in data cataloging, data governance tools, and column-level lineage tracking
We are looking for a very strong D365 Customer Service candidate who has experience with connectors and the CIF framework.
Exp: 2-3 years
Skills: Proficiency in Python and C#/.NET; prompt engineering; knowledge of Azure AI Foundry projects, Azure OpenAI, and Azure AI Search.
Total Yrs. of Experience: 12+ yrs
Relevant Yrs. of Experience: 8+ yrs
Detailed JD (Roles and Responsibilities):
- Take responsibility for solutions, from design to deployment.
- Ensure that architectural choices are based on best practices and live up to the agility required to continue being best in class.
- Inspire colleagues by setting an example of professionalism and proper work ethics.
- Represent the project/area in internal forums.
- Contribute to the evolution of the technology and software development practice stack.
- Test our thinking based on your experience with new technologies.
- Mature the tribe's usage of micro frontends.
Primary Skills:
- 12+ years' experience developing web applications using modern web technologies, with at least 8+ years of React experience.
- Proficiency working with JavaScript and CSS/HTML.
- Good knowledge of unit/integration testing and debugging React applications.
- Experience with both consuming and designing RESTful APIs.
- Enthusiasm about delivering well-designed, optimized, and scalable solutions, and the ability to write efficient, secure, well-documented, and clean code.
- Solid communication and interpersonal skills and an advanced level of English.
- Level 4: experience from large projects, having carried out assignments/projects with high quality; takes primary responsibility for management of a larger group, with the ability to lead and develop.
Required Proficiency: Advanced
Mandatory skills:
- React JS
- TypeScript
- JavaScript
- React JS Hooks and Redux
- HTML/CSS
ReactJS profiles must have TypeScript experience, which is mandatory.
Looking for a candidate who has experience in C#, .NET, and WinForms. The candidate should have good communication skills. Immediate joiners are preferred.