Attain Data and Integration Engineer - I

0 - 2 years

0 Lacs

Posted: 21 hours ago | Platform: Naukri


Work Mode

Work from Office

Job Type

Internship

Job Description

  • Assist in configuring and maintaining integration workflows using middleware tools to support data exchange between Tango's API architecture and third-party systems.
  • Support ETL processes, including data extraction, transformation, and loading, following Tango's proprietary data migration methodologies and incorporating data analysis of input, internal, and output message data (a minimal sketch of such an ETL step follows this list).
  • Participate in testing and validating integrations, ensuring data quality, asynchronous processing, and polling-based synchronization meet client requirements; plan and lead integration tests with external systems.
  • Collaborate on low-touch implementations by leveraging standard API endpoints and flat file transfers (e.g., SFTP-based), deploying standard integrations, and providing daily operational support.
  • Provide Level 3 Production Support for integration-related issues, including root-cause analysis and remediation within defined SLAs; serve as the primary coordinator for internal engineering resources.
  • Contribute to documentation updates for integration playbooks, Swagger files, user guides, test procedures, performance specifications, and product manuals to enhance self-service capabilities.
  • Assist in estimating Level of Effort (LOE) for custom integrations during SOW/CO development and client engagements; prepare data templates based on Source to Target Mapping (STM) documents.
  • Perform data transformations, including merging, ordering, aggregation, and resolving data cleansing/quality issues using various data processing tools; add/remove columns in ETL scripts (e.g., Kettle) using STM/Data Templates.
  • Run lookup queries, update lookups, execute data quality scripts, format and validate data quality reports, and run scripts on data in staging databases while updating data load checklists.
  • Conduct internal smoke testing of loaded data in the Tango application and prepare/add/remove columns in sample Business Data validation trackers using STM.
  • Integrate information from multiple data sources, solve common transformation problems, and communicate effectively with managers and project teams regarding data sets and reporting needs.
  • Engage in agile iterations for refining transformation routines and business rules, prioritizing critical path data elements while understanding business-domain context to clean and transform data for analysis.
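
The following is a minimal sketch of the kind of ETL step described above, assuming a comma-separated extract and an illustrative source-to-target mapping; the file names, column names, and mapping shape are placeholders, not Tango's actual STM or data template format.

```javascript
// Minimal ETL sketch (illustrative only): read a CSV extract, keep and
// rename columns per a hypothetical source-to-target mapping, aggregate
// row counts, and write the result to a staging file.
const fs = require('fs');

// Hypothetical mapping: source column -> target column (unmapped columns are dropped).
const mapping = { emp_id: 'employeeId', bldg: 'buildingCode', sqft: 'areaSqFt' };

const [header, ...rows] = fs.readFileSync('extract.csv', 'utf8')
  .trim()
  .split('\n')
  .map((line) => line.split(','));

// Transform: keep only mapped source columns, renamed to their target names.
const kept = header
  .map((col, i) => [col, i])
  .filter(([col]) => col in mapping);
const outHeader = kept.map(([col]) => mapping[col]);
const outRows = rows.map((row) => kept.map(([, i]) => row[i]));

// Simple aggregation example: row count per building code.
const bIdx = outHeader.indexOf('buildingCode');
const counts = {};
for (const row of outRows) counts[row[bIdx]] = (counts[row[bIdx]] || 0) + 1;

// Load: write the transformed rows to a staging CSV and report the counts.
fs.writeFileSync('staging.csv', [outHeader, ...outRows].map((r) => r.join(',')).join('\n'));
console.log('rows per building:', counts);
```

A real transformation in Kettle or a similar tool would also handle quoting, data types, and error rows; the sketch only shows the extract-transform-load shape of the work.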

Qualifications
  • Bachelor's degree in Computer Science, Information Technology, Data Science, Mathematics, Statistics, or a related quantitative field (or equivalent experience).
  • 0-2 years of professional experience in software development, integrations, or data wrangling (internships or academic projects count); 2+ years of SQL proficiency.
  • Proficiency in JavaScript, with a solid understanding of scripting for data manipulation and automation; 2+ years of experience with Kettle (Pentaho) or similar ETL/data processing tools.
  • Hands-on experience with any ETL/reporting tool, plus manual data analysis using Excel or other spreadsheet tools.
  • Basic knowledge of RESTful APIs, asynchronous processing, and data formats (e.g., JSON, CSV; see the sketch after this list); understanding of both waterfall and agile project management methodologies.
  • Proficient in Microsoft Office software applications, such as Word, Excel, and PowerPoint.
  • Excellent analytical and problem-solving skills with the ability to perform root-cause analysis; strong attention to detail and prioritization skills to handle multiple tasks in a fast-paced environment.
  • Strong written and verbal communication skills; aptitude to learn new technologies.
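
Below is a small sketch of the asynchronous REST consumption and JSON/CSV handling mentioned above, assuming Node 18+ (for the global fetch) and a placeholder endpoint that returns an array of flat JSON objects.

```javascript
// Fetch JSON from a REST endpoint asynchronously and flatten it to CSV.
// The URL and response shape are assumptions for illustration only.
const fs = require('fs/promises');

async function exportToCsv() {
  const res = await fetch('https://api.example.com/v1/assets');
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  const items = await res.json(); // assumed: an array of flat objects

  const header = Object.keys(items[0]);
  const lines = [
    header.join(','),
    ...items.map((item) => header.map((k) => JSON.stringify(item[k] ?? '')).join(',')),
  ];
  await fs.writeFile('assets.csv', lines.join('\n'));
}

exportToCsv().catch(console.error);
```

Production code would add pagination, retries, and proper CSV escaping; this only illustrates asynchronous request handling and format conversion.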

Preferred Qualifications
  • Experience with Node.js for building scalable backend services and handling data pipelines; 3 years of scripting experience for data analysis in languages like SQL, Python, or R.
  • Familiarity with Node-RED or similar low-code/no-code flow-based tools for middleware and integration workflows.
  • Demonstrated experience in data wrangling, including extracting and transforming different types of data files (e.g., CSV, XLSX) into analyzable structures using reproducible coding and scripting techniques.
  • Exposure to ETL tools, data migration processes, or middleware architectures (e.g., Node-RED in a production environment).
  • Understanding of security protocols such as JWT authentication, PGP encryption, or SAML 2.0 for SSO (a JWT sketch follows this list).
  • Prior experience in SaaS environments, particularly IWMS or enterprise software integrations.
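
As context for the JWT item above, here is a minimal HS256 sign-and-verify sketch using only Node's built-in crypto module; a production integration would rely on a vetted library and the identity provider's keys, and the secret and claims below are stand-ins.

```javascript
// Build and check an HS256 JWT by hand to show the header.payload.signature structure.
const crypto = require('crypto');

const b64url = (data) =>
  Buffer.from(data).toString('base64').replace(/\+/g, '-').replace(/\//g, '_').replace(/=+$/, '');

function sign(payload, secret) {
  const header = b64url(JSON.stringify({ alg: 'HS256', typ: 'JWT' }));
  const body = b64url(JSON.stringify(payload));
  const sig = b64url(crypto.createHmac('sha256', secret).update(`${header}.${body}`).digest());
  return `${header}.${body}.${sig}`;
}

function verify(token, secret) {
  const [header, body, sig] = token.split('.');
  const expected = b64url(crypto.createHmac('sha256', secret).update(`${header}.${body}`).digest());
  if (!crypto.timingSafeEqual(Buffer.from(sig), Buffer.from(expected))) return null;
  return JSON.parse(Buffer.from(body, 'base64url').toString());
}

const token = sign({ sub: 'user-123', exp: Math.floor(Date.now() / 1000) + 3600 }, 'dev-secret');
console.log(verify(token, 'dev-secret')); // prints the decoded claims
```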

Additional Information
  • Immediate joining required.

Attain

Consulting / Technology

Philadelphia
