Tango's system implementation and support follows a process of requirements, design, configure/build, test and remediate, deploy, and support. The Services and Support teams in the Bangalore office work directly with Tango's US-based teams and with clients across the globe.

The role requires candidates to work across the following areas:
- System design and configuration
- Report design and development
- Testing
- Data conversion
- Integration design
- Support

Requirements:
- Ability to communicate effectively with international team members and clients
- Ability to quickly understand business processes and how technology enables those processes
- Good understanding of the SDLC or software implementation lifecycle (applications such as SAP, Oracle, or other ERP or web applications)
- Prior system configuration experience based on design specifications is a plus. The role will focus on configuring a proprietary application built on Oracle's ADF (Application Development Framework); no prior Oracle or ADF experience is necessary.
- Investigate and reproduce issues, provide fixes and workarounds, and verify changes to ensure continued support of the software solution
- Ability to investigate and close challenging client tickets
- Monitor and own all support-related metrics, including ticket backlog, average time to close, workload distribution, first response time, resolution time, and ticket escalations
- Strong understanding of databases such as Oracle and SQL Server
- Strong Excel skills
- Strong SQL skills
- Apply appropriate development methodologies (e.g., agile, waterfall) and best practices (e.g., mid-development client reviews, embedded QA procedures, unit testing) to ensure successful and timely completion
- Collaborate with other team members (involved in the requirements gathering, testing, roll-out, and operations phases) to ensure seamless transitions
- Summarize and present results to team members and clients globally
- Understand technical requirements to ensure accurate understanding and implementation
- This role has the potential to grow into extensive knowledge of multiple client areas, with the ability to provide consulting and expertise on business processes
- Ability to work within a virtual global team environment and contribute to the timely delivery of multiple projects
- Willingness to travel internationally as needed to work with clients
- Willingness to work from 3 PM IST to 12 AM IST with appropriate breaks for meals

Qualifications
- Bachelor's/Master's degree with specialization in Computer Science, MIS, IT, or another computer-related discipline
- 3-6 years of relevant consulting-industry experience on medium- to large-scale technology solution delivery engagements
- Experience working with Jira, Zendesk, and similar tools
- Extensive experience with system implementation, reporting, back-end database management (e.g., Oracle, Teradata), and/or ETL interfacing (e.g., Informatica, SSIS) technologies

Additional Information
Need for immediate joining.
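The combination of strong SQL skills and ownership of support metrics (ticket backlog, average time to close) named above can be pictured with a small query sketch. The table and column names below are hypothetical illustrations, not Tango's actual ticketing model.

```python
import sqlite3

# Hypothetical support-ticket table, created in-memory purely for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE tickets (
    id INTEGER PRIMARY KEY,
    opened_day INTEGER,   -- day number the ticket was opened
    closed_day INTEGER    -- NULL while the ticket is still open
);
INSERT INTO tickets VALUES
    (1, 1, 3), (2, 2, 6), (3, 4, NULL), (4, 5, NULL);
""")

# Two of the metrics mentioned above: open backlog and average days to close.
backlog = conn.execute(
    "SELECT COUNT(*) FROM tickets WHERE closed_day IS NULL"
).fetchone()[0]
avg_close = conn.execute(
    "SELECT AVG(closed_day - opened_day) FROM tickets WHERE closed_day IS NOT NULL"
).fetchone()[0]

print(backlog)    # 2 open tickets
print(avg_close)  # (2 + 4) / 2 = 3.0 days
```

In practice the same aggregate queries would run against the production ticketing database (Oracle, SQL Server, etc.) rather than an in-memory table.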
Job Description
- Configure and implement Single Sign-On (SSO) solutions using Microsoft Entra ID (Azure AD)
- Assess the current environment and onboard applications into Entra ID for secure access
- Perform requirements analysis, capture integration needs, and document configurations
- Configure SAML-based authentication and manage Entra Enterprise Applications
- Assign users/groups and validate access across SP-initiated and IdP-initiated login flows
- Troubleshoot issues related to authentication, certificates, and attribute mapping
- Deliver clear handover documentation for internal IT/admin teams

Qualifications
- Proven hands-on experience configuring SSO in Microsoft Entra ID / Azure AD
- Strong understanding of SAML 2.0, IdP/SP flows, certificates, and claims mapping
- Experience onboarding SaaS applications into Entra/Azure AD
- Good troubleshooting and problem-solving skills
- Ability to deliver results independently within tight timelines
- Excellent documentation and communication skills

Additional Information
Need for immediate joining. Work from home. Duration: 2-3 weeks (short-term contract).
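The claims-mapping and attribute troubleshooting described above boils down to reading AttributeStatement entries out of a SAML assertion. The sketch below is a simplified illustration: the assertion XML and the `extract_claims` helper are invented for this example, and a real Entra ID response is signed and must be cryptographically verified before any attribute is trusted.

```python
import xml.etree.ElementTree as ET

SAML_NS = {"saml": "urn:oasis:names:tc:SAML:2.0:assertion"}

# Minimal, unsigned SAML-style assertion used only to show attribute extraction.
assertion_xml = """
<saml:Assertion xmlns:saml="urn:oasis:names:tc:SAML:2.0:assertion">
  <saml:AttributeStatement>
    <saml:Attribute Name="http://schemas.xmlsoap.org/ws/2005/05/identity/claims/emailaddress">
      <saml:AttributeValue>user@example.com</saml:AttributeValue>
    </saml:Attribute>
    <saml:Attribute Name="http://schemas.microsoft.com/ws/2008/06/identity/claims/groups">
      <saml:AttributeValue>app-users</saml:AttributeValue>
    </saml:Attribute>
  </saml:AttributeStatement>
</saml:Assertion>
"""

def extract_claims(xml_text):
    """Map each SAML attribute name to its list of values."""
    root = ET.fromstring(xml_text)
    claims = {}
    for attr in root.iter("{urn:oasis:names:tc:SAML:2.0:assertion}Attribute"):
        values = [v.text for v in attr.findall("saml:AttributeValue", SAML_NS)]
        claims[attr.get("Name")] = values
    return claims

claims = extract_claims(assertion_xml)
print(claims["http://schemas.xmlsoap.org/ws/2005/05/identity/claims/emailaddress"])
```

When an SP rejects a login, comparing the attribute names the SP expects against the claims actually emitted (as above) is a common first troubleshooting step.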
Job Description
- Assist in configuring and maintaining integration workflows using middleware tools to support data exchange between Tango's API architecture and third-party systems
- Support ETL processes, including data extraction, transformation, and loading, following Tango's proprietary data migration methodologies, incorporating data analysis of input, internal, and output message data
- Participate in testing and validating integrations, ensuring data quality, asynchronous processing, and polling-based synchronization meet client requirements; plan and lead integration tests with external systems
- Collaborate on low-touch implementations by leveraging standard API endpoints and flat-file transfers (e.g., SFTP-based), deploying standard integrations, and providing daily operational support
- Provide Level 3 production support for integration-related issues, including root-cause analysis and remediation within defined SLAs; serve as the primary coordinator for internal engineering resources
- Contribute to documentation updates for integration playbooks, Swagger files, user guides, test procedures, performance specifications, and product manuals to enhance self-service capabilities
- Assist in estimating Level of Effort (LOE) for custom integrations during SOW/CO development and client engagements; prepare data templates based on Source to Target Mapping (STM) documents
- Perform data transformations, including merging, ordering, aggregation, and resolving data cleansing/quality issues using various data processing tools; add/remove columns in ETL scripts (e.g., Kettle) using STM/Data Templates
- Run lookup queries, update lookups, execute data quality scripts, format and validate data quality reports, and run scripts on data in staging databases while updating data load checklists
- Conduct internal smoke testing of loaded data in the Tango application and prepare/add/remove columns in sample Business Data validation trackers using STM
- Integrate information from multiple data sources, solving common transformation problems, and communicate effectively with managers and project teams regarding data sets and reporting needs
- Engage in agile iterations to refine transformation routines and business rules, prioritizing critical-path data elements while understanding the business-domain context well enough to clean and transform data for analysis

Qualifications
- Bachelor's degree in Computer Science, Information Technology, Data Science, Mathematics, Statistics, or a related quantitative field (or equivalent experience)
- 0-2 years of professional experience in software development, integrations, or data wrangling (internships or academic projects count); at least 2 years of SQL proficiency
- Proficiency in JavaScript, with a solid understanding of scripting for data manipulation and automation; 2+ years of experience with Kettle (Pentaho) or similar ETL/data processing tools
- Hands-on experience with any ETL/reporting tool and manual data analysis using Excel or other spreadsheet tools
- Basic knowledge of RESTful APIs, asynchronous processing, and data formats (e.g., JSON, CSV); understanding of both waterfall and agile project management methodologies
- Proficient in Microsoft Office applications such as Word, Excel, and PowerPoint
- Excellent analytical and problem-solving skills, with the ability to perform root-cause analysis; strong attention to detail and prioritization skills to handle multiple tasks in a fast-paced environment
- Strong written and verbal communication skills; aptitude to learn new technologies

Preferred Qualifications:
- Experience with Node.js for building scalable backend services and handling data pipelines; 3 years of scripting experience for data analysis in languages like SQL, Python, or R
- Familiarity with Node-RED or similar low-code/no-code flow-based tools for middleware and integration workflows
- Demonstrated experience in data wrangling, including extracting and transforming different types of data files (e.g., CSV, XLSX) into analyzable structures using reproducible coding and scripting techniques
- Exposure to ETL tools, data migration processes, or middleware architectures (e.g., Node-RED in a production environment)
- Understanding of security protocols such as JWT authentication, PGP encryption, or SAML 2.0 for SSO
- Prior experience in SaaS environments, particularly IWMS or enterprise software integrations

Additional Information
Need for immediate joining.
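The merge-and-aggregate transformations described in the responsibilities above can be sketched with only the Python standard library. The CSV contents and column names below are invented for illustration and do not come from any real Tango data template.

```python
import csv
import io
from collections import defaultdict

# Two small CSV "extracts" standing in for source files.
locations_csv = "location_id,city\nL1,Austin\nL2,Chicago\n"
leases_csv = "lease_id,location_id,annual_rent\n1,L1,1200\n2,L1,800\n3,L2,500\n"

# Index the lookup source by key so each lease row can be merged in O(1).
locations = {r["location_id"]: r for r in csv.DictReader(io.StringIO(locations_csv))}

# Merge each lease with its location, then aggregate rent by city --
# the kind of merge/aggregate step a transformation routine performs.
rent_by_city = defaultdict(int)
for lease in csv.DictReader(io.StringIO(leases_csv)):
    city = locations[lease["location_id"]]["city"]
    rent_by_city[city] += int(lease["annual_rent"])

print(dict(rent_by_city))  # {'Austin': 2000, 'Chicago': 500}
```

The same merge/aggregate pattern carries over directly to Kettle (Pentaho) steps such as stream lookups followed by a group-by.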
Attain ETL - Data Expert | SmartRecruiters

Company Description
Founded in 2018 in Bangalore, the center of India's high-tech industry, Attain has grown to serve a global client base of startups, early-stage companies, and SMEs. Attain is the India partner of TANGO, a US-based software company. TANGO's software solution helps customers find, build, and manage retail and office locations in a single suite/software platform. Attain supports Tango with services in the areas of development, implementation, QA, and client support.

Job Description
- Prepare data templates based on the Source to Target Mapping (STM) document
- Add/remove columns in Kettle scripts using STM/Data Templates
- Run lookup queries and update lookups
- Run data quality scripts
- Format and validate the data quality report
- Run scripts on loaded staging data and update the data load checklist
- Execute internal smoke testing of loaded data in the TANGO application
- Prepare/add/remove columns in the Sample Business Data validation tracker using STM
- Assist in creating training materials and help documentation
- Perform data transformations, including merging, ordering, aggregation, etc., using various data processing tools
- Understand the business-domain context for the data well enough to clean and transform it into a form useful for subsequent analysis
- Integrate information from multiple data sources, solving common transformation problems, and resolve data cleansing and quality issues
- Communicate effectively with managers and project teams across different disciplines regarding data sets and reporting needs

Qualifications
Must have:
- SQL basics
- Understanding of waterfall methodology
- Willingness to learn new technologies
- Proficient in Microsoft Office applications such as Word, Excel, and PowerPoint
- Strong attention to detail
- Strong prioritization skills; ability to perform multiple tasks and meet deadlines in a fast-paced, dynamic environment
- Strong written and verbal communication skills

Desirable skills (nice to have):
- Hands-on experience with any ETL/reporting tool
- Excellent analytical and problem-solving skills; ability to perform root-cause analysis
- Manual data analysis using Excel or other spreadsheet tools

Education/Experience:
- Bachelor's degree in Data Science, Computer Science, Mathematics, Statistics, or a similar quantitative field, plus at least 3 years of relevant experience working with data
- Demonstrated experience in data wrangling, including extracting and transforming different types of data files (CSV, XLSX) into analyzable data using reproducible coding and scripting techniques
- Minimum 3 years of data scripting experience in a common language (SQL, Python, R, etc.)
- Experience with data formatting and storage

Skills/Training:
- Data-oriented personality; analytical, planning, problem-solving, and decision-making skills
- Good scripting practices
- Experience with SQL
- Ability to learn and implement new techniques and skills, and to seek guidance when needed
- Excellent written and oral communication skills, including effectively explaining concepts to managers and other project team members
- Ability to work effectively in a multi-disciplinary setting
- Ability to manage multiple concurrent tasks and seek direction on competing priorities
- Experience with data wrangling and harmonization across a variety of data file types and formats
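The "run data quality scripts" and "format and validate the data quality report" duties above can be sketched as a couple of SQL checks over a staging table. The table, columns, and checks below are hypothetical examples, not Tango's actual scripts.

```python
import sqlite3

# Hypothetical staging table, created in-memory for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE stg_locations (
    location_id TEXT,
    city TEXT
);
INSERT INTO stg_locations VALUES
    ('L1', 'Austin'), ('L2', NULL), ('L2', 'Chicago');
""")

checks = {
    # Rows missing a required field.
    "null_city": "SELECT COUNT(*) FROM stg_locations WHERE city IS NULL",
    # Key values that appear more than once.
    "duplicate_ids": """
        SELECT COUNT(*) FROM (
            SELECT location_id FROM stg_locations
            GROUP BY location_id HAVING COUNT(*) > 1
        )
    """,
}

# A minimal "data quality report": check name -> count of offending rows/keys.
report = {name: conn.execute(sql).fetchone()[0] for name, sql in checks.items()}
print(report)  # {'null_city': 1, 'duplicate_ids': 1}
```

Each non-zero count would become a line item in the formatted data quality report, and a fully clean run (all zeros) would tick off the corresponding entry in the data load checklist.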
Attain Data and Integration Engineer | SmartRecruiters

Company Description
Founded in 2018 in Bangalore, the center of India's high-tech industry, Attain has grown to serve a global client base of startups, early-stage companies, and SMEs. Attain is the India partner of TANGO, a US-based software company. TANGO's software solution helps customers find, build, and manage retail and office locations in a single suite/software platform. Attain supports Tango with services in the areas of development, implementation, QA, and client support.

Job Description
- Assist in configuring and maintaining integration workflows using middleware tools to support data exchange between Tango's API architecture and third-party systems
- Support ETL processes, including data extraction, transformation, and loading, following Tango's proprietary data migration methodologies, incorporating data analysis of input, internal, and output message data
- Participate in testing and validating integrations, ensuring data quality, asynchronous processing, and polling-based synchronization meet client requirements; plan and lead integration tests with external systems
- Collaborate on low-touch implementations by leveraging standard API endpoints and flat-file transfers (e.g., SFTP-based), deploying standard integrations, and providing daily operational support
- Provide Level 3 production support for integration-related issues, including root-cause analysis and remediation within defined SLAs; serve as the primary coordinator for internal engineering resources
- Contribute to documentation updates for integration playbooks, Swagger files, user guides, test procedures, performance specifications, and product manuals to enhance self-service capabilities
- Assist in estimating Level of Effort (LOE) for custom integrations during SOW/CO development and client engagements; prepare data templates based on Source to Target Mapping (STM) documents
- Perform data transformations, including merging, ordering, aggregation, and resolving data cleansing/quality issues using various data processing tools; add/remove columns in ETL scripts (e.g., Kettle) using STM/Data Templates
- Run lookup queries, update lookups, execute data quality scripts, format and validate data quality reports, and run scripts on data in staging databases while updating data load checklists
- Conduct internal smoke testing of loaded data in the Tango application and prepare/add/remove columns in sample Business Data validation trackers using STM
- Integrate information from multiple data sources, solving common transformation problems, and communicate effectively with managers and project teams regarding data sets and reporting needs
- Engage in agile iterations to refine transformation routines and business rules, prioritizing critical-path data elements while understanding the business-domain context well enough to clean and transform data for analysis

Qualifications
- Bachelor's degree in Computer Science, Information Technology, Data Science, Mathematics, Statistics, or a related quantitative field (or equivalent experience)
- 3-6 years of professional experience in software development, integrations, or data wrangling, with at least 3 years of SQL proficiency
- Proficiency in JavaScript, with a solid understanding of scripting for data manipulation and automation; 2+ years of experience with Kettle (Pentaho) or similar ETL/data processing tools
- Hands-on experience with any ETL/reporting tool and manual data analysis using Excel or other spreadsheet tools
- Basic knowledge of RESTful APIs, asynchronous processing, and data formats (e.g., JSON, CSV); understanding of both waterfall and agile project management methodologies
- Proficient in Microsoft Office applications such as Word, Excel, and PowerPoint
- Excellent analytical and problem-solving skills, with the ability to perform root-cause analysis; strong attention to detail and prioritization skills to handle multiple tasks in a fast-paced environment
- Strong written and verbal communication skills; aptitude to learn new technologies

Preferred Qualifications:
- Experience with Node.js for building scalable backend services and handling data pipelines; 3 years of scripting experience for data analysis in languages like SQL, Python, or R
- Familiarity with Node-RED or similar low-code/no-code flow-based tools for middleware and integration workflows
- Demonstrated experience in data wrangling, including extracting and transforming different types of data files (e.g., CSV, XLSX) into analyzable structures using reproducible coding and scripting techniques
- Exposure to ETL tools, data migration processes, or middleware architectures (e.g., Node-RED in a production environment)
- Understanding of security protocols such as JWT authentication, PGP encryption, or SAML 2.0 for SSO
- Prior experience in SaaS environments, particularly IWMS or enterprise software integrations
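The STM-driven template work that recurs throughout these roles (preparing data templates and adding/removing columns from a Source to Target Mapping) can be sketched as a simple column mapping. Every name below, including the STM entries, the `apply_stm` helper, and the sample row, is hypothetical.

```python
# A toy Source-to-Target Mapping (STM): source column -> target column.
stm = {
    "SiteRef":  "location_id",
    "SiteCity": "city",
    "Rent$":    "annual_rent",
}

def apply_stm(source_rows, mapping):
    """Rename and keep only the columns the STM maps; drop everything else."""
    return [
        {target: row[source] for source, target in mapping.items() if source in row}
        for row in source_rows
    ]

source_rows = [
    {"SiteRef": "L1", "SiteCity": "Austin", "Rent$": 1200, "Notes": "ignore me"},
]
print(apply_stm(source_rows, stm))
# [{'location_id': 'L1', 'city': 'Austin', 'annual_rent': 1200}]
```

Adding or removing a column in the downstream ETL script (e.g., a Kettle transformation) then amounts to editing one entry in the mapping rather than the transformation logic itself.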