Posted: 2 days ago
Job Title: Automation Engineer (with Strong Data Skills)
Location: Remote
Job Type: Contract
Department: Engineering / Data Engineering / IT
Primary Skills: ETL, Data Warehousing, SQL Joins, Selenium, Java/C#, TestNG
Timings: 2:00 pm - 11:00 pm IST

We are seeking a skilled Automation Engineer with strong data capabilities to join our innovative team. This role is ideal for someone who thrives at the intersection of software automation and data engineering, with a passion for streamlining workflows, building scalable automation systems, and working with data pipelines and analytics tools.

Responsibilities:
- Design, develop, and maintain automation scripts, tools, and frameworks to improve operational efficiency across systems and teams.
- Collaborate with data engineers and analysts to automate data extraction, transformation, and loading (ETL) processes.
- Build scalable and reusable automation solutions for repetitive data processes, data quality checks, and reporting.
- Integrate APIs, databases, and third-party tools to support end-to-end automation workflows.
- Monitor and maintain data pipelines, ensuring data integrity, reliability, and performance.
- Implement testing and validation mechanisms for automated systems and data workflows.
- Collaborate with cross-functional teams (engineering, data, QA, DevOps) to identify automation opportunities and ensure seamless implementation.
- Document automation procedures and maintain version control of scripts using Git or similar tools.

Requirements:
- Bachelor's degree in Computer Science, Data Engineering, Information Technology, or a related field.
- 7+ years of experience in automation engineering, scripting, and/or data pipeline development.
- Proficiency in at least one programming language such as Python, JavaScript, or Shell scripting.
- Strong knowledge of SQL and experience with relational databases (e.g., PostgreSQL, MySQL, SQL Server).
- Hands-on experience with automation tools and frameworks (e.g., Airflow, Jenkins, Selenium, or custom scripts).
- Familiarity with data pipeline technologies (e.g., Apache Airflow, DBT, Kafka, Fivetran).
- Experience working with APIs, JSON, and RESTful services.
- Good understanding of data warehousing concepts and cloud platforms (e.g., AWS, GCP, or Azure).

Good to have:
- Experience with cloud-native automation and orchestration tools (e.g., AWS Lambda, Step Functions).
- Exposure to CI/CD pipelines and infrastructure-as-code (Terraform, Ansible).
- Understanding of data governance, security, and compliance best practices.
- Strong problem-solving skills, attention to detail, and ability to work independently.
xStride
Salary: Not disclosed