Job Title: SAP Solution Architect
Location: Atlanta, GA (Onsite)
Duration: 12+ Months

Job Responsibilities:
- Participates in developing functional requirements, testing, training, and implementing applications.
- Conducts business process analyses, needs assessments, and preliminary cost/benefit analyses to align information technology with business objectives.
- Facilitates the implementation and support of SAP modules to enhance the client's business functionality and overall performance, while maintaining a high degree of customer satisfaction.
- Implements and configures SAP Transportation Management (TM).

Technical Skills:
- Transportation network, Freight Unit Building Rule (FUBR), freight units, order tendering, and carrier determination.
- CIF integration models: definition, configuration, and error remediation.
- Transportation Charge Management configuration.
- Knowledgeable in SAP TM master data management: product, business partner, dangerous goods, transportation network (locations, routes, and zones), and resources (vehicles, trailers, handling units).
- Order management integration with SAP TM.
- Planning: selection profiles and planning profiles, conditions, optimizer planning, schedules, freight execution, carrier selection, and tendering.
- SAP TM business processes: inbound and outbound shipments (domestic and international); Transportation Load Builder (FTL, LTL); Package Builder (mixed product packing and layer building); shipper scenario with all modes of transport (road, rail, ocean, and air); freight order transportation planning (freight units, freight proposals, and freight orders); transportation execution (carrier selection, tendering, delivery, and shipment proposals); freight settlement (charge management, charge calculations, freight settlement) and cost distribution.
- Experience in a full project lifecycle highly desired.
- Knowledge of RFCs, BAPIs, BAdIs, BDCs, user exits, and enhancement points (an illustrative RFC call sketch follows this posting).
- Knowledge of TM queue processing.
- Strong knowledge of outbound/inbound ABAP interfaces using IDocs, BAPIs, and RFCs to/from middleware and flat files.
- Strong verbal and written communication skills with the ability to communicate at different levels within the organization.
- SAP Global Trade Services (GTS) integration and configuration experience highly desired.
- SAP ERP Central Component (ECC) cross-module experience highly desired (MM, PM).
- Agile methodology project experience.

Qualifications:
- BS in Computer Science, Engineering, or equivalent preferred.
- 5+ years of experience with TM 9.3 or above.
- Overall, 5+ years of SAP functional experience in SAP logistics.

You can reach me at [HIDDEN TEXT].
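For illustration only (not part of the client's stated requirements): a minimal sketch of calling a standard SAP BAPI over RFC from Python, assuming the pyrfc connector is installed and using placeholder connection parameters. The BAPI shown is a common read-only example; a TM functional role would mainly need to recognize such interfaces, not build them.

```python
from pyrfc import Connection

# Hypothetical connection parameters; real values come from the SAP Basis team.
conn = Connection(
    ashost="sap-app-host",
    sysnr="00",
    client="100",
    user="RFC_USER",
    passwd="********",
)

# Call a standard read-only BAPI over RFC and inspect the returned structure.
result = conn.call("BAPI_COMPANYCODE_GETDETAIL", COMPANYCODEID="1000")
print(result["COMPANYCODE_DETAIL"])
```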
Job Openings: AI & Data Professionals / Data Architects / Power BI Developers / ServiceNow / SAP / Cloud / Salesforce
Company: Asoft Consulting
Work Location: Bangalore & Hyderabad
Interview Location: Coimbatore
Experience: 5–15 years (Required)

About
We are hiring skilled professionals in AI, data engineering, data architecture, Power BI development, ServiceNow, SAP, Cloud, and Salesforce to join our growing team. If you have strong expertise in any of these areas, this is your chance to work on exciting projects with top clients.

Education
Bachelor's/Master's in Computer Science, Data Science, Engineering, or related fields.

Walk-in Interview Details
📍 Venue: Asoft Consulting, 144, Sengupta St, Ram Nagar, Coimbatore – 641009
📅 Date: 10th October 2025
⏰ Time: 10:00 AM – 5:00 PM
👉 Register Now to confirm your participation! https://forms.gle/LxgSUrFEt39739FP6
Job Title: Azure Data Engineer Lead
Experience Required: 8-12 Years
Location: Bangalore
Employment Type: Full-time

Job Summary
Below are the job requirements:
- Experience with Azure Data Factory (ADF), including support projects; able to resolve ADF pipeline issues within defined SLAs.
- Hands-on experience with ADF pipeline monitoring, debugging, and minor enhancements (an illustrative monitoring sketch follows this posting).
- Minimum 3-4 years of relevant team-handling experience.
- Strong knowledge of SQL Server and Azure SQL Database.
- Must have worked on ETL projects.
- Must have knowledge of Azure cloud architecture.
- Must know Azure Monitor, Log Analytics, and other monitoring tools for pipeline health checks.
- DP-203 certification is not mandatory, but certified candidates will be given priority.
- Knowledge of Power BI is good to have.
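As context for the monitoring requirement above, here is a minimal sketch, assuming the azure-identity and azure-mgmt-datafactory Python packages and placeholder subscription, resource group, and factory names, that lists ADF pipeline runs which failed in the last 24 hours. Production support would typically layer Azure Monitor alerts on top of this kind of check.

```python
from datetime import datetime, timedelta, timezone

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    RunFilterParameters,
    RunQueryFilter,
    RunQueryFilterOperand,
    RunQueryFilterOperator,
)

# Hypothetical identifiers -- replace with real values for your environment.
SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "<resource-group>"
FACTORY_NAME = "<data-factory-name>"


def list_failed_runs(hours_back: int = 24):
    """Return ADF pipeline runs that failed within the last `hours_back` hours."""
    client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

    now = datetime.now(timezone.utc)
    filters = RunFilterParameters(
        last_updated_after=now - timedelta(hours=hours_back),
        last_updated_before=now,
        filters=[
            RunQueryFilter(
                operand=RunQueryFilterOperand.STATUS,
                operator=RunQueryFilterOperator.EQUALS,
                values=["Failed"],
            )
        ],
    )
    response = client.pipeline_runs.query_by_factory(RESOURCE_GROUP, FACTORY_NAME, filters)
    return response.value


if __name__ == "__main__":
    for run in list_failed_runs():
        print(run.pipeline_name, run.run_id, run.message)
```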
Job Title: Data Analyst
Experience Required: 4-7 Years
Location: Bangalore
Employment Type: Full-time

Required Skills & Experience
- 4–8 years of experience as a Data Analyst or Business Analyst in manufacturing or industrial projects.
- Strong hands-on experience with Power BI (data modeling, DAX, visualization best practices).
- Proficiency in SQL (joins, CTEs, stored procedures, performance tuning).
- Demonstrated ability to perform root cause and variance analysis using real production or supply chain data.
- Familiarity with manufacturing processes such as OEE, downtime analysis, production planning, and inventory management (an illustrative OEE calculation follows this posting).
- Strong communication and presentation skills to explain insights to non-technical stakeholders.
- Exposure to the Microsoft Fabric analytics ecosystem.

Key Responsibilities
- Analyze manufacturing performance data to identify trends, inefficiencies, and improvement opportunities.
- Conduct root cause analysis on production, quality, and supply chain issues using structured problem-solving methods.
- Design and build insightful dashboards and reports in Power BI.
- Write and optimize SQL queries to extract, transform, and validate data from multiple systems (SAP, MES, Salesforce, and quality databases).
- Collaborate with cross-functional teams (Production, Quality, Finance, SCM) to translate business requirements into analytical solutions.
- Develop and maintain KPIs for operational excellence, cost optimization, and yield improvement.
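As a hedged illustration of the OEE familiarity mentioned above: a minimal pandas sketch that computes OEE as Availability × Performance × Quality per production line. The column names and sample figures are hypothetical, not taken from any client system.

```python
import pandas as pd

# Hypothetical column names and figures; real MES/SAP extracts will differ.
records = pd.DataFrame(
    {
        "line": ["L1", "L2"],
        "planned_minutes": [480, 480],   # planned production time
        "downtime_minutes": [45, 20],    # unplanned stops
        "ideal_cycle_sec": [30, 30],     # ideal seconds per unit
        "total_count": [800, 900],       # units produced
        "good_count": [760, 885],        # units passing quality
    }
)


def add_oee(df: pd.DataFrame) -> pd.DataFrame:
    """Compute Availability, Performance, Quality, and OEE per production line."""
    out = df.copy()
    run_minutes = out["planned_minutes"] - out["downtime_minutes"]
    out["availability"] = run_minutes / out["planned_minutes"]
    out["performance"] = (out["ideal_cycle_sec"] * out["total_count"] / 60) / run_minutes
    out["quality"] = out["good_count"] / out["total_count"]
    out["oee"] = out["availability"] * out["performance"] * out["quality"]
    return out


print(add_oee(records)[["line", "availability", "performance", "quality", "oee"]])
```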
Job Summary
Company: Asoft Consulting
Work Type: Full Time
Work Location: Bangalore
Interview Location: Hyderabad
Venue: 5th Floor, Naspur, V4 Info Private Limited, 3-6-438/5, Above Hyundai Showroom, Himayatnagar, Hyderabad, Telangana - 500029, India.

Skills Required:
- Proficiency in Databricks Workflows, Delta Live Tables (DLT), PySpark, and SQL for designing scalable ETL pipelines within the Medallion Architecture (Bronze–Silver–Gold); an illustrative Bronze-to-Silver sketch follows this posting.
- Strong knowledge of Unity Catalog and Delta Lake.
- Hands-on experience with Databricks Genie, Vector Search, and MLflow for developing, fine-tuning, and deploying AI/LLM-based solutions.
- Expertise in Databricks SQL Endpoints, LangChain, and RAG (Retrieval-Augmented Generation) frameworks for conversational analytics and chatbot development.
- Familiarity with DevOps practices (CI/CD, GitHub Actions, Terraform) and DataOps workflows within the Databricks ecosystem.
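As an illustrative sketch of the Medallion-style ETL work described above, assuming hypothetical Unity Catalog table names and a hypothetical landing path: a plain-PySpark Bronze-to-Silver hop on Delta tables. A production pipeline on this stack would more likely be expressed as Delta Live Tables; this version just shows the shape of the transformation.

```python
from pyspark.sql import SparkSession, functions as F

# Hypothetical table names and landing path; real Unity Catalog names differ per workspace.
BRONZE_TABLE = "main.bronze.orders_raw"
SILVER_TABLE = "main.silver.orders_clean"

spark = SparkSession.builder.getOrCreate()

# Bronze: ingest raw JSON files as-is, stamping each row with an ingestion timestamp.
raw = (
    spark.read.format("json")
    .load("/Volumes/main/landing/orders/")  # hypothetical landing location
    .withColumn("_ingested_at", F.current_timestamp())
)
raw.write.format("delta").mode("append").saveAsTable(BRONZE_TABLE)

# Silver: de-duplicate, enforce types, and drop malformed rows.
silver = (
    spark.table(BRONZE_TABLE)
    .dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .filter(F.col("order_id").isNotNull())
)
silver.write.format("delta").mode("overwrite").saveAsTable(SILVER_TABLE)
```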