14 Job openings at ResourceTree Global Services Pvt Ltd
Fraud Strategy - 3 to 9 years (Immediate Joiners)

Chennai, Tamil Nadu, India

3 - 7 years

Not disclosed

On-site

Full Time

Job Role: Fraud Analytics
Years of Experience: 3 to 7 years

Roles and Responsibilities:
· Utilise data analytics tools and methodologies to conduct in-depth assessments and generate fraud rules and reports on fraud trends (including merchant fraud, first-party fraud, and third-party fraud).
· Investigate suspicious activities and transactions, determine root causes, and develop strategies and statistical models to prevent future occurrences.
· Develop fraud rules on workflow systems.
· Develop fraud reports for ongoing monitoring of fraud events.
· Generate reports and presentations outlining fraud risks, incidents, and mitigation strategies for management review.
· Collaborate with cross-functional teams, including risk management, operations, and compliance, to enhance fraud prevention measures.
· Monitor industry trends, regulatory changes, and best practices to continually enhance fraud prevention strategies.

Skills Required:
· Bachelor's degree in engineering, technology, computer science, or a related field.
· 3+ years of proven data analytics experience in fraud prevention, risk management, or a related field.
· Hands-on experience across fraud mitigation, first-party fraud, transactional fraud, merchant fraud, and digital fraud detection.
· Good experience with Python and SQL.
· Familiarity with fraud detection software, risk assessment methodologies, and regulatory compliance.
· Excellent communication and presentation skills with the ability to convey complex information clearly and concisely.
· Detail-oriented with a proactive mindset toward problem-solving and risk mitigation.
· Ability to work collaboratively in a cross-functional team environment.
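
To make the "develop fraud rules" responsibility concrete, here is a minimal sketch in Python/pandas of a velocity-style rule that flags customers with several high-value transactions inside a short window. The column names (customer_id, amount, txn_ts), thresholds, and window are illustrative assumptions, not details from the posting.

```python
# Hypothetical velocity rule: flag customers with several high-value
# transactions within a short time window. Column names and thresholds
# are assumptions for illustration only.
import pandas as pd

def flag_rapid_high_value(txns: pd.DataFrame,
                          amount_threshold: float = 50_000,
                          window: str = "10min",
                          min_txn_count: int = 3) -> pd.DataFrame:
    """Return (customer_id, txn_ts, hv_txn_count) rows that breach the rule."""
    high_value = txns[txns["amount"] >= amount_threshold]
    high_value = high_value.sort_values("txn_ts").set_index("txn_ts")
    # Rolling count of high-value transactions per customer within the window.
    counts = (high_value.groupby("customer_id")["amount"]
                        .rolling(window).count()
                        .rename("hv_txn_count")
                        .reset_index())
    return counts[counts["hv_txn_count"] >= min_txn_count]

if __name__ == "__main__":
    sample = pd.DataFrame({
        "customer_id": ["C1", "C1", "C1", "C2"],
        "amount": [60_000, 75_000, 52_000, 1_200],
        "txn_ts": pd.to_datetime(["2024-01-05 10:00", "2024-01-05 10:04",
                                  "2024-01-05 10:08", "2024-01-05 11:00"]),
    })
    print(flag_rapid_high_value(sample))  # flags C1's burst of three transactions
```

In practice a rule like this would be tuned against labelled fraud outcomes and then deployed on the workflow system the posting refers to.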

Metrology Engineer - 5 years

Chennai, Tamil Nadu, India

5 years

Not disclosed

On-site

Full Time

Job Role: Senior Application Engineer – Metrology Solutions

Your Mission: As a Senior Application Engineer – Metrology Solutions, you won't just support customers — you'll be their go-to expert, problem-solver, and innovation partner. You'll bridge the gap between cutting-edge technology and real-world applications, helping customers elevate their processes through intelligent measurement solutions.

What You'll Do:
🔧 Be the Expert: Provide hands-on technical support and insight to customers, helping them solve complex metrology challenges. Understand customer needs deeply and propose tailored, high-impact solutions.
🚀 Drive Innovation: Recommend advanced instruments and strategies that enhance customer productivity and accuracy. Collaborate with internal teams to craft customized solutions that push boundaries.
🎓 Educate & Inspire: Conduct engaging training sessions and live demos on metrology tools and software. Empower customers to use our technologies with confidence and precision.
🤝 Build Strong Partnerships: Cultivate meaningful relationships with customers and key decision-makers. Become a trusted advisor who helps drive their success — and ours.
📘 Stay Ahead of the Curve: Stay current with the latest technologies in metrology and contribute insights to our product development roadmap.

What We're Looking For:
✅ Experience: 5+ years in application engineering or technical support, preferably in metrology or precision engineering.
✅ Expertise: In-depth understanding of metrology tools, principles, and practices.
✅ Communication: Clear, confident, and engaging, both with customers and teammates.
✅ Drive: A proactive mindset and a love for solving complex problems.
✅ Collaboration: Team spirit with the ability to work independently when needed.

Project Manager - Industry 4.0 - IoT - 7 to 8 years

Chennai, Tamil Nadu, India

0 years

Not disclosed

On-site

Full Time

Each role will have a core competency in IIoT, ITIL, or Data Analysis, with Project Management as the common denominator.

A - Core area: ITIL
• Bring up infrastructure and accelerate deployment
• Identify gaps in infrastructure; hand-hold customers to resolve issues
• Ownership of process; prevent delays in deployment
• Industrial nodes, edge, network, IoT server and gateways
• Collaborate with our vendor partners in IIoT product development, deployment, and testing

B - Core area: Data Analysis
• Collect process data and documents from customers
• Construct the data stream for the manufacturing process in scope
• Analyse data for presence and quality
• Build test data
• Ownership of data models and validation of all charts and dashboards in the platform

C - Core area: DevOps
• Automate build pipelines
• Carry updates from customer versions to the base product
• Release updates from the base product to customer environments
• License management
• Handshake with test automation
• Version control

Project Management (Common Denominator) - Skills & Deliverables
- The critical success factor is integration of functions across a small team
- The primary failure modes are in customer communication and Kanban breaks
- Hands-on project management
- Conceptual and communication clarity
- Ownership and outcome orientation
- Role span: product configuration, deployment, integration; customisations are minimal
- Zoho Sprints based Agile project management
- Requirements gathering and product management are adjunct functions
- Voice of Customer / meeting minutes are written up as storyboards by the tech lead
- Acceptance criteria and test scenarios will be written by test engineers
- The engineering team is growing at around 8 engineers
- Tasks are largely based on API configuration
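
As an illustration of the "analyse data for presence and quality" item under Core area B, here is a minimal Python/pandas sketch that checks a manufacturing data stream for missing machines, bad values, and late readings. The machine list, column names, and 5-minute cadence are assumptions made for the example, not details from the posting.

```python
# Hypothetical presence/quality check over one batch of IIoT readings.
# Expected machines, column names (machine_id, ts, value), and the cadence
# are placeholders for illustration.
import pandas as pd

EXPECTED_MACHINES = {"press_01", "press_02", "cnc_01"}
EXPECTED_INTERVAL = pd.Timedelta(minutes=5)

def check_stream(readings: pd.DataFrame) -> dict:
    """Return a small presence/quality report for a batch of readings."""
    report = {}
    # Presence: every expected machine should appear in the stream.
    seen = set(readings["machine_id"].unique())
    report["missing_machines"] = sorted(EXPECTED_MACHINES - seen)
    # Quality: null or non-positive sensor values.
    report["null_values"] = int(readings["value"].isna().sum())
    report["non_positive_values"] = int((readings["value"] <= 0).sum())
    # Cadence: per-machine gaps larger than the expected interval.
    gaps = (readings.sort_values("ts")
                    .groupby("machine_id")["ts"]
                    .diff()
                    .gt(EXPECTED_INTERVAL)
                    .sum())
    report["late_readings"] = int(gaps)
    return report
```

A report like this would typically feed the chart and dashboard validation that the Data Analysis role owns.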

Senior Data Engineer (Informatica IICS, AWS) - 7 years

Chennai, Tamil Nadu, India

5 years

Not disclosed

On-site

Full Time

Job Name: Senior Data Engineer - IICS
Years of Experience: 5

Job Description: We are looking for a skilled and motivated Senior Data Engineer to join our data integration and analytics team. The ideal candidate will have hands-on experience with Informatica IICS, AWS Redshift, Python scripting, and Unix/Linux systems. You will be responsible for building and maintaining scalable ETL pipelines to support business intelligence and analytics needs. A strong passion for continuous learning, problem-solving, and enabling data-driven decision-making is highly valued.

Primary Skills: Informatica IICS, AWS
Secondary Skills: Python, Unix/Linux

Role Description: We are looking for a Senior Data Engineer to lead the design, development, and management of scalable data platforms and pipelines. This role demands a strong technical foundation in data architecture, big data technologies, and database systems (both SQL and NoSQL), along with the ability to work across functional teams to deliver robust, secure, and high-performing data solutions.

Role Responsibility:
- Design, develop, and maintain end-to-end data pipelines and infrastructure.
- Translate business and functional requirements into scalable, well-documented technical solutions.
- Build and manage data flows across structured and unstructured data sources, including streaming and batch integrations.
- Ensure data integrity and quality through automated validations, unit testing, and robust documentation.
- Optimize data processing performance and manage large datasets efficiently.
- Collaborate closely with stakeholders and project teams to align data solutions with business objectives.
- Implement and maintain security and privacy protocols to ensure safe data handling.
- Lead development environment setup and configuration of tools and services.
- Mentor junior data engineers and contribute to continuous improvement and automation initiatives.
- Coordinate with QA and UAT teams during testing and release phases.

Role Requirement:
- Strong proficiency in SQL (including procedures, performance tuning, and analytical functions).
- Solid understanding of data warehousing concepts, including dimensional modeling and SCDs.
- Hands-on experience with scripting languages (Shell / PowerShell).
- Familiarity with cloud and big data technologies.
- Experience working with relational databases, non-relational databases, and data streaming systems.
- Proficiency in data profiling, validation, and testing practices.
- Excellent problem-solving, communication (written and verbal), and documentation skills.
- Exposure to Agile methodologies and CI/CD practices.
- Self-motivated, adaptable, and capable of working in a fast-paced environment.

Additional Requirements:
- 5 years overall, with 3+ years of hands-on experience with Informatica IICS (Cloud Data Integration, Application Integration).
- Strong proficiency in AWS Redshift and writing complex SQL queries.
- Solid programming experience in Python for scripting, data wrangling, and automation.
- Experience with version control tools like Git and CI/CD workflows.
- Knowledge of data modeling and data warehousing concepts.
- Prior experience with data lakes and big data technologies is a plus.
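
As a hedged illustration of the automated validations the responsibilities mention, the sketch below reconciles a row count in Redshift after a load, using psycopg2 (Redshift speaks the PostgreSQL wire protocol). The connection details, table name, and expected count are placeholders, not a real environment.

```python
# Hypothetical post-load validation for an IICS -> Redshift pipeline:
# compare the rows landed in the target table against the count reported
# by the upstream extract. All identifiers here are placeholders.
import psycopg2

def redshift_row_count(table: str, conn_params: dict) -> int:
    """Return SELECT COUNT(*) for a table in Redshift."""
    with psycopg2.connect(**conn_params) as conn:
        with conn.cursor() as cur:
            # Table name comes from internal pipeline config, not user input.
            cur.execute(f"SELECT COUNT(*) FROM {table};")
            return cur.fetchone()[0]

def validate_load(table: str, expected_rows: int, conn_params: dict) -> None:
    actual = redshift_row_count(table, conn_params)
    if actual != expected_rows:
        raise ValueError(f"{table}: expected {expected_rows} rows, found {actual}")
    print(f"{table}: row count OK ({actual})")

if __name__ == "__main__":
    params = {  # placeholder credentials for illustration
        "host": "example-cluster.redshift.amazonaws.com",
        "port": 5439, "dbname": "analytics",
        "user": "etl_user", "password": "***",
    }
    validate_load("staging.orders", expected_rows=1_000_000, conn_params=params)
```

A check like this would normally run as the last task of the pipeline, failing the job (and alerting) when source and target counts diverge.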

Project Manager - Industry 4.0 - IoT - 7 to 8 years

Chennai, Tamil Nadu, India

0 years

Not disclosed

On-site

Full Time

Each role will have a core competency in IIoT, ITIL, or Data Analysis, with Project Management as the common denominator.

A - Core area: ITIL
• Bring up infrastructure and accelerate deployment
• Identify gaps in infrastructure; hand-hold customers to resolve issues
• Ownership of process; prevent delays in deployment
• Industrial nodes, edge, network, IoT server and gateways
• Collaborate with our vendor partners in IIoT product development, deployment, and testing

B - Core area: Data Analysis
• Collect process data and documents from customers
• Construct the data stream for the manufacturing process in scope
• Analyse data for presence and quality
• Build test data
• Ownership of data models and validation of all charts and dashboards in the platform

C - Core area: DevOps
• Automate build pipelines
• Carry updates from customer versions to the base product
• Release updates from the base product to customer environments
• License management
• Handshake with test automation
• Version control

Project Management (Common Denominator) - Skills & Deliverables
- The critical success factor is integration of functions across a small team
- The primary failure modes are in customer communication and Kanban breaks
- Hands-on project management
- Conceptual and communication clarity
- Ownership and outcome orientation
- Role span: product configuration, deployment, integration; customisations are minimal
- Zoho Sprints based Agile project management
- Requirements gathering and product management are adjunct functions
- Voice of Customer / meeting minutes are written up as storyboards by the tech lead
- Acceptance criteria and test scenarios will be written by test engineers
- The engineering team is growing at around 8 engineers
- Tasks are largely based on API configuration

Azure Data Engineer - 5 years

Chennai, Tamil Nadu, India

0 years

Not disclosed

On-site

Full Time

Job Name: Senior Data Engineer - Azure
Years of Experience: 5

Job Description: We are looking for a skilled and experienced Senior Azure Developer to join our team! As part of the team, you will be involved in the implementation of ongoing and new initiatives for our company. If you love learning, thinking strategically, innovating, and helping others, this job is for you!

Primary Skills: ADF, Databricks
Secondary Skills: DBT, Python, Databricks, Airflow, Fivetran, Glue, Snowflake

Role Description: This data engineering role requires creating and managing the technological infrastructure of a data platform: architecting, building, and managing data flows and pipelines, constructing data storage (NoSQL, SQL), working with big data tools (Hadoop, Kafka), and using integration tools to connect sources and other databases.

Role Responsibility:
- Translate functional specifications and change requests into technical specifications.
- Translate business requirement documents, functional specifications, and technical specifications into the related coding.
- Develop efficient code with unit testing and code documentation.
- Ensure accuracy and integrity of data and applications through analysis, coding, documenting, testing, and problem solving.
- Set up the development environment and configure the development tools.
- Communicate with all project stakeholders on project status.
- Manage, monitor, and ensure the security and privacy of data to satisfy business needs.
- Contribute to the automation of modules wherever required.
- Be proficient in written, verbal, and presentation communication (English).
- Coordinate with the UAT team.

Role Requirement:
- Proficient in basic and advanced SQL programming concepts (procedures, analytical functions, etc.).
- Good knowledge and understanding of data warehouse concepts (dimensional modeling, change data capture, slowly changing dimensions, etc.).
- Knowledgeable in Shell / PowerShell scripting.
- Knowledgeable in relational databases, non-relational databases, data streams, and file stores.
- Knowledgeable in performance tuning and optimization.
- Experience in data profiling and data validation.
- Experience in requirements gathering and documentation processes and in performing unit testing.
- Understanding and implementing QA and various testing processes in the project.
- Knowledge of any BI tool is an added advantage.
- Sound aptitude, outstanding logical reasoning, and analytical skills.
- Willingness to learn and take initiative.
- Ability to adapt to a fast-paced Agile environment.

Additional Requirement:
- Demonstrated expertise as a Data Engineer, specializing in Azure cloud services.
- Highly skilled in Azure Data Factory, Azure Data Lake, Azure Databricks, and Azure Synapse Analytics.
- Create and execute efficient, scalable, and dependable data pipelines utilizing Azure Data Factory.
- Utilize Azure Databricks for data transformation and processing.
- Effectively oversee and enhance data storage solutions, emphasizing Azure Data Lake and other Azure storage services.
- Construct and uphold workflows for data orchestration and scheduling using Azure Data Factory or equivalent tools.
- Proficient in programming languages like Python and SQL, and conversant with pertinent scripting languages.
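
For a concrete flavour of the ADF/Databricks pipeline work described above, here is a minimal, hypothetical PySpark job that reads raw files from a data-lake path, applies basic cleansing, and writes a curated Delta table. The storage paths, column names, and availability of Delta Lake are assumptions for illustration, not details from the posting.

```python
# Hypothetical Databricks-style curation step: raw parquet in, curated Delta out.
# Paths and column names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_curation").getOrCreate()

raw = spark.read.format("parquet").load(
    "abfss://raw@examplelake.dfs.core.windows.net/orders/")

curated = (
    raw.dropDuplicates(["order_id"])                      # basic data-quality step
       .filter(F.col("amount").isNotNull())
       .withColumn("order_date", F.to_date("order_ts"))   # derive a partition column
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
)

(curated.write
    .format("delta")                 # assumes Delta Lake is available (Databricks default)
    .mode("overwrite")
    .partitionBy("order_date")
    .save("abfss://curated@examplelake.dfs.core.windows.net/orders/"))
```

In an end-to-end pipeline, an Azure Data Factory trigger would typically invoke a notebook or job containing a step like this on a schedule or on file arrival.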

Azure Senior Data Engineer - 5 years

Chennai, Tamil Nadu, India

0 years

Not disclosed

On-site

Full Time

Job Name: Senior Data Engineer - Azure
Years of Experience: 5

Job Description: We are looking for a skilled and experienced Senior Azure Developer to join our team! As part of the team, you will be involved in the implementation of ongoing and new initiatives for our company. If you love learning, thinking strategically, innovating, and helping others, this job is for you!

Primary Skills: ADF, Databricks
Secondary Skills: DBT, Python, Databricks, Airflow, Fivetran, Glue, Snowflake

Role Description: This data engineering role requires creating and managing the technological infrastructure of a data platform: architecting, building, and managing data flows and pipelines, constructing data storage (NoSQL, SQL), working with big data tools (Hadoop, Kafka), and using integration tools to connect sources and other databases.

Role Responsibility:
- Translate functional specifications and change requests into technical specifications.
- Translate business requirement documents, functional specifications, and technical specifications into the related coding.
- Develop efficient code with unit testing and code documentation.
- Ensure accuracy and integrity of data and applications through analysis, coding, documenting, testing, and problem solving.
- Set up the development environment and configure the development tools.
- Communicate with all project stakeholders on project status.
- Manage, monitor, and ensure the security and privacy of data to satisfy business needs.
- Contribute to the automation of modules wherever required.
- Be proficient in written, verbal, and presentation communication (English).
- Coordinate with the UAT team.

Role Requirement:
- Proficient in basic and advanced SQL programming concepts (procedures, analytical functions, etc.).
- Good knowledge and understanding of data warehouse concepts (dimensional modeling, change data capture, slowly changing dimensions, etc.).
- Knowledgeable in Shell / PowerShell scripting.
- Knowledgeable in relational databases, non-relational databases, data streams, and file stores.
- Knowledgeable in performance tuning and optimization.
- Experience in data profiling and data validation.
- Experience in requirements gathering and documentation processes and in performing unit testing.
- Understanding and implementing QA and various testing processes in the project.
- Knowledge of any BI tool is an added advantage.
- Sound aptitude, outstanding logical reasoning, and analytical skills.
- Willingness to learn and take initiative.
- Ability to adapt to a fast-paced Agile environment.

Additional Requirement:
- Demonstrated expertise as a Data Engineer, specializing in Azure cloud services.
- Highly skilled in Azure Data Factory, Azure Data Lake, Azure Databricks, and Azure Synapse Analytics.
- Create and execute efficient, scalable, and dependable data pipelines utilizing Azure Data Factory.
- Utilize Azure Databricks for data transformation and processing.
- Effectively oversee and enhance data storage solutions, emphasizing Azure Data Lake and other Azure storage services.
- Construct and uphold workflows for data orchestration and scheduling using Azure Data Factory or equivalent tools.
- Proficient in programming languages like Python and SQL, and conversant with pertinent scripting languages.

Senior Data Engineer - DBT and Snowflake - 6 years (Immediate Joiner)

Chennai, Tamil Nadu, India

0 years

Not disclosed

On-site

Full Time

Job Name: Senior Data Engineer - DBT & Snowflake
Years of Experience: 5

Job Description: We are looking for a skilled and experienced DBT-Snowflake Developer to join our team! As part of the team, you will be involved in the implementation of ongoing and new initiatives for our company. If you love learning, thinking strategically, innovating, and helping others, this job is for you!

Primary Skills: DBT, Snowflake
Secondary Skills: ADF, Databricks, Python, Airflow, Fivetran, Glue

Role Description: This data engineering role requires creating and managing the technological infrastructure of a data platform: architecting, building, and managing data flows and pipelines, constructing data storage (NoSQL, SQL), working with big data tools (Hadoop, Kafka), and using integration tools to connect sources and other databases.

Role Responsibility:
- Translate functional specifications and change requests into technical specifications.
- Translate business requirement documents, functional specifications, and technical specifications into the related coding.
- Develop efficient code with unit testing and code documentation.
- Ensure accuracy and integrity of data and applications through analysis, coding, documenting, testing, and problem solving.
- Set up the development environment and configure the development tools.
- Communicate with all project stakeholders on project status.
- Manage, monitor, and ensure the security and privacy of data to satisfy business needs.
- Contribute to the automation of modules wherever required.
- Be proficient in written, verbal, and presentation communication (English).
- Coordinate with the UAT team.

Role Requirement:
- Proficient in basic and advanced SQL programming concepts (procedures, analytical functions, etc.).
- Good knowledge and understanding of data warehouse concepts (dimensional modeling, change data capture, slowly changing dimensions, etc.).
- Knowledgeable in Shell / PowerShell scripting.
- Knowledgeable in relational databases, non-relational databases, data streams, and file stores.
- Knowledgeable in performance tuning and optimization.
- Experience in data profiling and data validation.
- Experience in requirements gathering and documentation processes and in performing unit testing.
- Understanding and implementing QA and various testing processes in the project.
- Knowledge of any BI tool is an added advantage.
- Sound aptitude, outstanding logical reasoning, and analytical skills.
- Willingness to learn and take initiative.
- Ability to adapt to a fast-paced Agile environment.

Additional Requirement:
• Design, develop, and maintain scalable data models and transformations using DBT in conjunction with Snowflake; ensure the effective transformation and loading of data from diverse sources into the data warehouse or data lake.
• Implement and manage data models in DBT, guaranteeing accurate data transformation and alignment with business needs.
• Utilize DBT to convert raw, unstructured data into structured datasets, enabling efficient analysis and reporting.
• Write and optimize SQL queries within DBT to enhance data transformation processes and improve overall performance.
• Establish DBT best practices to improve performance, scalability, and reliability.
• Expertise in SQL and a strong understanding of data warehouse concepts and modern data architectures.
• Familiarity with cloud-based platforms (e.g., AWS, Azure, GCP).
• Migrate legacy transformation code into modular DBT data models.
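
The requirements call out slowly changing dimensions. The sketch below shows the Type 2 (SCD2) pattern in plain Python/pandas purely to illustrate the logic; in this role it would normally be expressed as DBT models or snapshots against Snowflake. The table layout, the single tracked attribute (city), and the column names are assumptions for the example.

```python
# Minimal SCD Type 2 illustration: expire changed rows, append new versions.
# Dimension columns assumed: customer_id, city, effective_from, effective_to, is_current.
import pandas as pd

def scd2_merge(dim: pd.DataFrame, incoming: pd.DataFrame, load_date: str) -> pd.DataFrame:
    """Close out changed current rows and append new versions for a daily load."""
    current = dim[dim["is_current"]]
    merged = current.merge(incoming, on="customer_id", suffixes=("", "_new"))
    changed_ids = merged.loc[merged["city"] != merged["city_new"], "customer_id"]

    out = dim.copy()
    # Expire the old versions of customers whose tracked attribute changed.
    mask = out["customer_id"].isin(changed_ids) & out["is_current"]
    out.loc[mask, "is_current"] = False
    out.loc[mask, "effective_to"] = load_date

    # Append new versions for changed customers and rows for brand-new customers.
    new_ids = set(changed_ids) | (set(incoming["customer_id"]) - set(dim["customer_id"]))
    new_rows = incoming[incoming["customer_id"].isin(new_ids)].copy()
    new_rows["effective_from"] = load_date
    new_rows["effective_to"] = None
    new_rows["is_current"] = True
    return pd.concat([out, new_rows], ignore_index=True)
```

dbt's snapshot feature provides the same Type 2 behaviour out of the box, maintaining the validity-range columns automatically.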

Senior Data Engineer AZURE - 6 years (Immediate Joiner)

Chennai, Tamil Nadu, India

0 years

Not disclosed

On-site

Full Time

Job Name: Senior Data Engineer - Azure
Years of Experience: 5

Job Description: We are looking for a skilled and experienced Senior Azure Developer to join our team! As part of the team, you will be involved in the implementation of ongoing and new initiatives for our company. If you love learning, thinking strategically, innovating, and helping others, this job is for you!

Primary Skills: ADF, Databricks
Secondary Skills: DBT, Python, Databricks, Airflow, Fivetran, Glue, Snowflake

Role Description: This data engineering role requires creating and managing the technological infrastructure of a data platform: architecting, building, and managing data flows and pipelines, constructing data storage (NoSQL, SQL), working with big data tools (Hadoop, Kafka), and using integration tools to connect sources and other databases.

Role Responsibility:
- Translate functional specifications and change requests into technical specifications.
- Translate business requirement documents, functional specifications, and technical specifications into the related coding.
- Develop efficient code with unit testing and code documentation.
- Ensure accuracy and integrity of data and applications through analysis, coding, documenting, testing, and problem solving.
- Set up the development environment and configure the development tools.
- Communicate with all project stakeholders on project status.
- Manage, monitor, and ensure the security and privacy of data to satisfy business needs.
- Contribute to the automation of modules wherever required.
- Be proficient in written, verbal, and presentation communication (English).
- Coordinate with the UAT team.

Role Requirement:
- Proficient in basic and advanced SQL programming concepts (procedures, analytical functions, etc.).
- Good knowledge and understanding of data warehouse concepts (dimensional modeling, change data capture, slowly changing dimensions, etc.).
- Knowledgeable in Shell / PowerShell scripting.
- Knowledgeable in relational databases, non-relational databases, data streams, and file stores.
- Knowledgeable in performance tuning and optimization.
- Experience in data profiling and data validation.
- Experience in requirements gathering and documentation processes and in performing unit testing.
- Understanding and implementing QA and various testing processes in the project.
- Knowledge of any BI tool is an added advantage.
- Sound aptitude, outstanding logical reasoning, and analytical skills.
- Willingness to learn and take initiative.
- Ability to adapt to a fast-paced Agile environment.

Additional Requirement:
- Demonstrated expertise as a Data Engineer, specializing in Azure cloud services.
- Highly skilled in Azure Data Factory, Azure Data Lake, Azure Databricks, and Azure Synapse Analytics.
- Create and execute efficient, scalable, and dependable data pipelines utilizing Azure Data Factory.
- Utilize Azure Databricks for data transformation and processing.
- Effectively oversee and enhance data storage solutions, emphasizing Azure Data Lake and other Azure storage services.
- Construct and uphold workflows for data orchestration and scheduling using Azure Data Factory or equivalent tools.
- Proficient in programming languages like Python and SQL, and conversant with pertinent scripting languages.

Interesting Job Opportunity: Fraud Strategy/Analytics Engineer

Kolkata, West Bengal, India

3 years

Not disclosed

On-site

Full Time

Job Description

Roles and Responsibilities:
- Utilize data analytics tools and methodologies to conduct in-depth assessments and generate fraud rules and reports on fraud trends (including merchant fraud, first-party and third-party fraud).
- Investigate suspicious activities and transactions, determine root causes, and develop strategies and statistical models to prevent future occurrences.
- Develop fraud rules on workflow systems.
- Develop fraud reports for ongoing monitoring of fraud events.
- Generate reports and presentations outlining fraud risks, incidents, and mitigation strategies for management review.
- Collaborate with cross-functional teams, including risk management, operations, and compliance, to enhance fraud prevention measures.
- Monitor industry trends, regulatory changes, and best practices to continually enhance fraud prevention strategies.

Requirements:
- Bachelor's degree in engineering, technology, computer science, or a related field.
- 3+ years of proven data analytics experience in fraud prevention, risk management, or a related field.
- Hands-on experience across fraud mitigation, first-party fraud, transactional fraud, merchant fraud, and digital fraud detection.
- Good experience with Python and SQL.
- Familiarity with fraud detection software, risk assessment methodologies, and regulatory compliance.
- Excellent communication and presentation skills with the ability to convey complex information clearly and concisely.
- Detail-oriented with a proactive mindset toward problem-solving and risk mitigation.
- Ability to work collaboratively in a cross-functional team environment.

(ref:hirist.tech)

Design Thinking - UI/UX - 15 years

Chennai, Tamil Nadu, India

12 years

Not disclosed

On-site

Full Time

Design Thinker

Overview: The Design Thinker is a pivotal role responsible for driving user-centered design initiatives within the organization. This role involves conducting comprehensive research, defining problem statements, facilitating ideation sessions, and developing prototypes to create intuitive and impactful user experiences. The ideal candidate will have extensive experience in design thinking, exceptional facilitation skills, and the ability to collaborate effectively with cross-functional teams.

Roles & Responsibilities

Empathy and Research:
• Conduct qualitative and quantitative research to understand user behaviors, motivations, and challenges.
• Engage with stakeholders, customers, and end-users to gather insights and identify pain points.

Problem Definition:
• Synthesize research findings to define clear, actionable problem statements.
• Use tools such as user personas, journey maps, and empathy maps to frame design challenges.

Ideation:
• Facilitate brainstorming sessions and workshops to generate innovative ideas.
• Encourage collaboration and creativity across teams to explore diverse perspectives.

Prototyping:
• Develop low- and high-fidelity prototypes to test concepts and solutions.
• Iterate designs based on feedback from stakeholders and end-users.

Testing and Validation:
• Conduct usability testing and gather user feedback to refine solutions.
• Analyze test results to ensure designs meet user needs and business objectives.

Implementation and Delivery:
• Collaborate with product, engineering, and marketing teams to ensure seamless implementation of solutions.
• Monitor the impact of design solutions post-launch and identify opportunities for continuous improvement.

Advocacy:
• Champion design thinking methodologies and foster a user-centered culture within the organization.
• Educate and mentor teams on applying design thinking principles in projects.

Required Skills
• Bachelor's or Master's degree in Design, Human-Computer Interaction (HCI), Psychology, or a related field.
• Proven experience (12+ years) in design thinking, UX/UI design, or a similar role.
• Strong knowledge of design thinking tools and techniques (e.g., journey mapping, prototyping, personas).
• Exceptional facilitation skills for workshops and brainstorming sessions.
• Proficiency in design and prototyping tools such as Figma, Sketch, or Adobe Creative Suite.
• Excellent communication and storytelling skills to articulate design decisions.
• Ability to work collaboratively in cross-functional teams and manage multiple projects simultaneously.

Desired Skills
• Experience with agile methodologies and working in iterative design environments.
• Knowledge of service design, business strategy, or systems thinking.

Interesting Job Opportunity: Fraud Strategy Analyst

Gurugram, Haryana, India

5 years

Not disclosed

On-site

Full Time

Roles and Responsibilities
- Utilize data analytics tools and methodologies to conduct in-depth assessments and generate fraud rules and reports on fraud trends (including merchant fraud, first-party and third-party fraud).
- Investigate suspicious activities and transactions, determine root causes, and develop strategies and statistical models to prevent future occurrences.
- Develop fraud rules on workflow systems.
- Develop fraud reports for ongoing monitoring of fraud events.
- Generate reports and presentations outlining fraud risks, incidents, and mitigation strategies for management review.
- Collaborate with cross-functional teams, including risk management, operations, and compliance, to enhance fraud prevention measures.
- Monitor industry trends, regulatory changes, and best practices to continually enhance fraud prevention strategies.

Skills Required
- Bachelor's degree in engineering, technology, computer science, or a related field.
- 5+ years of proven data analytics experience in fraud prevention, risk management, or a related field.
- Hands-on experience across fraud mitigation, first-party fraud, transactional fraud, merchant fraud, and digital fraud detection.
- Good experience with Python and SQL.
- Familiarity with fraud detection software, risk assessment methodologies, and regulatory compliance.
- Excellent communication and presentation skills with the ability to convey complex information clearly and concisely.
- Detail-oriented with a proactive mindset toward problem-solving and risk mitigation.
- Ability to work collaboratively in a cross-functional team environment.

(ref:hirist.tech)

Interesting Job Opportunity: Senior Data Engineer - Informatica ICS

Chennai, Tamil Nadu, India

5 years

Not disclosed

On-site

Full Time

Job Description: We are looking for a skilled and motivated Senior Data Engineer to join our data integration and analytics team. The ideal candidate will have hands-on experience with Informatica IICS, AWS Redshift, Python scripting, and Unix/Linux systems. You will be responsible for building and maintaining scalable ETL pipelines to support business intelligence and analytics needs. A strong passion for continuous learning, problem-solving, and enabling data-driven decision-making is highly valued.

Primary Skills: Informatica IICS, AWS
Secondary Skills: Python, Unix/Linux

Role Description: We are looking for a Senior Data Engineer to lead the design, development, and management of scalable data platforms and pipelines. This role demands a strong technical foundation in data architecture, big data technologies, and database systems (both SQL and NoSQL), along with the ability to work across functional teams to deliver robust, secure, and high-performing data solutions.

Role Responsibility:
- Design, develop, and maintain end-to-end data pipelines and infrastructure.
- Translate business and functional requirements into scalable, well-documented technical solutions.
- Build and manage data flows across structured and unstructured data sources, including streaming and batch integrations.
- Ensure data integrity and quality through automated validations, unit testing, and robust documentation.
- Optimize data processing performance and manage large datasets efficiently.
- Collaborate closely with stakeholders and project teams to align data solutions with business objectives.
- Implement and maintain security and privacy protocols to ensure safe data handling.
- Lead development environment setup and configuration of tools and services.
- Mentor junior data engineers and contribute to continuous improvement and automation initiatives.
- Coordinate with QA and UAT teams during testing and release phases.

Role Requirement:
- Strong proficiency in SQL (including procedures, performance tuning, and analytical functions).
- Solid understanding of data warehousing concepts, including dimensional modeling and SCDs.
- Hands-on experience with scripting languages (Shell / PowerShell).
- Familiarity with cloud and big data technologies.
- Experience working with relational databases, non-relational databases, and data streaming systems.
- Proficiency in data profiling, validation, and testing practices.
- Excellent problem-solving, communication (written and verbal), and documentation skills.
- Exposure to Agile methodologies and CI/CD practices.
- Self-motivated, adaptable, and capable of working in a fast-paced environment.

Additional Requirement:
- 5 years overall, with 3+ years of hands-on experience with Informatica IICS (Cloud Data Integration, Application Integration).
- Strong proficiency in AWS Redshift and writing complex SQL queries.
- Solid programming experience in Python for scripting, data wrangling, and automation.
- Experience with version control tools like Git and CI/CD workflows.
- Knowledge of data modeling and data warehousing concepts.
- Prior experience with data lakes and big data technologies is a plus.

(ref:hirist.tech)

Principal Data Scientist

Chennai, Tamil Nadu, India

10 - 13 years

Not disclosed

On-site

Full Time

Title: Principal Data Scientist
Location: Chennai or Bangalore
Experience: 10-13 years

Job Summary: We are seeking a highly skilled and techno-functional Optimization Specialist with 10–13 years of experience in developing enterprise-grade optimization solutions and platforms. The ideal candidate will possess deep expertise in mathematical optimization, strong hands-on Python programming skills, and the ability to bridge the gap between technical and business teams. You will lead the design and deployment of scalable optimization engines to solve complex business problems across supply chain, manufacturing, pricing, logistics, and workforce planning.

Key Responsibilities
- Design & Development: Architect and implement optimization models (LP, MILP, CP, metaheuristics) using solvers like Gurobi, CPLEX, or open-source equivalents.
- Platform Building: Lead the design and development of optimization-as-a-service platforms with modular, reusable architecture.
- Techno-Functional Role: Translate business requirements into formal optimization problems and provide functional consulting support across domains.
- End-to-End Ownership: Manage the full lifecycle from problem formulation, model design, and data pipeline integration to production deployment.
- Python Expertise: Build robust, production-grade code with modular design using Python, Pandas, NumPy, Pyomo/PuLP, and APIs (FastAPI/Flask).
- Collaboration: Work with business stakeholders, data scientists, and software engineers to ensure solutions are accurate, scalable, and aligned with objectives.
- Performance Tuning: Continuously improve model runtime and performance; conduct sensitivity analysis and scenario modeling.
- Innovation: Stay abreast of the latest in optimization techniques, frameworks, and tools; proactively suggest enhancements.

Required Skills & Qualifications
- Bachelor's or Master's degree in Operations Research, Industrial Engineering, Computer Science, or related fields.
- 10–12 years of experience in solving real-world optimization problems.
- Deep understanding of mathematical programming (LP/MILP/CP), heuristics/metaheuristics, and stochastic modeling.
- Proficiency in Python and experience with relevant libraries (Pyomo, PuLP, OR-Tools, SciPy).
- Strong experience building end-to-end platforms or optimization engines deployed in production.
- Functional understanding of at least one domain: supply chain, logistics, manufacturing, pricing, scheduling, or workforce planning.
- Excellent communication skills; able to interact with technical and business teams effectively.
- Experience integrating optimization models into enterprise systems (APIs, cloud deployment, etc.).

Preferred Qualifications
- Exposure to cloud platforms (AWS, GCP, Azure) and MLOps pipelines.
- Familiarity with data visualization (Dash, Plotly, Streamlit) to present optimization results.
- Certification or training in operations research or mathematical optimization tools.
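
As a small, hypothetical example of the LP/MILP modelling this role centres on (the posting names Gurobi, CPLEX, Pyomo, PuLP, and OR-Tools), here is a production-mix problem formulated in PuLP. The products, coefficients, and capacity figures are invented purely for illustration.

```python
# Hypothetical production-mix MILP in PuLP: choose integer quantities of two
# products to maximize profit under a shared machine-hours constraint and a
# demand cap. All numbers are illustrative.
from pulp import LpProblem, LpMaximize, LpVariable, lpSum, LpStatus, value

products = ["A", "B"]
profit = {"A": 40, "B": 30}     # profit per unit
hours = {"A": 2, "B": 1}        # machine hours per unit
capacity = 100                  # total machine hours available

model = LpProblem("production_mix", LpMaximize)
qty = {p: LpVariable(f"qty_{p}", lowBound=0, cat="Integer") for p in products}

model += lpSum(profit[p] * qty[p] for p in products)             # objective
model += lpSum(hours[p] * qty[p] for p in products) <= capacity  # capacity constraint
model += qty["B"] <= 60                                          # demand cap on product B

model.solve()  # uses the bundled CBC solver by default
print(LpStatus[model.status])
for p in products:
    print(p, value(qty[p]))
print("objective:", value(model.objective))
```

In the platform setting the posting describes, a formulation like this would typically be wrapped behind an API (e.g., FastAPI) so that business systems can submit scenarios and retrieve solutions.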
