Responsibilities:
- Identify, qualify, and generate new business opportunities for AI/ML products and services in the US region.
- Understand customer needs and position suitable solutions from our AI/ML offerings.
- Build and nurture a strong sales pipeline through outbound efforts such as cold emails, LinkedIn outreach, networking, and referrals.
- Manage the complete sales cycle from lead generation to deal closure.
- Collaborate with technical teams to create tailored proposals and demos.
- Attend virtual events, webinars, and conferences to represent the company and generate leads.
- Maintain the CRM with accurate lead and sales tracking.
- Provide regular market feedback and contribute to sales strategy development.

Required Skills:
- 5+ years of experience in international B2B sales, preferably in AI/ML, data science, SaaS, or IT services.
- Strong understanding of AI/ML concepts and the ability to explain tech-driven solutions to non-technical clients.
- Proven ability to generate leads, build a pipeline, and close deals independently.
- Excellent communication, negotiation, and interpersonal skills.
- Familiarity with CRM tools (e.g., HubSpot, Salesforce).
- Must be comfortable working in US time zones.
- Bachelor's degree in Business, Marketing, Engineering, or a related field; an MBA is a plus.

Preferred Qualifications:
- Prior experience selling to US-based clients.
- Understanding of data platforms, MLOps, or the AI product development lifecycle.
- Exposure to industry verticals such as Healthcare, Finance, Retail, or Manufacturing.
Role & responsibilities SUMMARY: Data Engineer will be responsible for ETL and documentation in building data warehouse and analytics capabilities. Additionally, maintain existing systems/processes and develop new features, along with reviewing, presenting and implementing performance improvements. Duties and Responsibilities: Build ETL (extract, transform, and loading) jobs using Fivetran and dbt for our internal projects and for customers that use various platforms like Azure, Salesforce and AWS technologies Monitoring active ETL jobs in production. Build out data lineage artifacts to ensure all current and future systems are properly documented Assist with the build out design/mapping documentation to ensure development is clear and testable for QA and UAT purposes Assess current and future data transformation needs to recommend, develop, and train new data integration tool technologies Discover efficiencies with shared data processes and batch schedules to help ensure no redundancy and smooth operations Assist the Data Quality Analyst to implement checks and balances across all jobs to ensure data quality throughout the entire environment for current and future batch jobs. Hands-on experience in developing and implementing large-scale data warehouses, Business Intelligence and MDM solutions, including Data Lakes/Data Vaults. Required Skills This job has no supervisory responsibilities. Bachelor's Degree in Computer Science, Math, Software Engineering, Computer Engineering, or related field AND 6+ years experience in business analytics, data science, software development, data modeling or data engineering work 5+ years experience with a strong proficiency with SQL query/development skills Develop ETL routines that manipulate and transfer large volumes of data and perform quality checks Hands-on experience with ETL tools (e.g Informatica, Talend, dbt, Azure Data Factory) Experience working in the healthcare industry with PHI/PII Creative, lateral, and critical thinker Excellent communicator Well-developed interpersonal skills Good at prioritizing tasks and time management Ability to describe, create and implement new solutions Experience with related or complementary open source software platforms and languages (e.g. Java, Linux, Apache, Perl/Python/PHP, Chef) Knowledge / Hands-on experience with BI tools and reporting software (e.g. Cognos, Power BI, Tableau)
Role: Power BI Analyst
Experience: 4+ Years
Location: On-site
Employment Type: Full-time

Role Summary: We are seeking an experienced Senior BI Analyst to join our data analytics team, with a strong focus on migrating legacy Qlik dashboards to Power BI. This role requires deep expertise in Power BI and SQL, and preferably experience in the healthcare domain. Familiarity with Snowflake as a data warehouse platform is a strong plus.

Key Responsibilities:
- Lead the migration of dashboards and reports from QlikView/Qlik Sense to Power BI, ensuring consistency in data logic, design, and user experience.
- Design, build, and optimize scalable, interactive Power BI dashboards to support key business decisions.
- Write complex SQL queries for data extraction, transformation, and validation.
- Collaborate with business users, analysts, and data engineers to gather requirements and deliver analytics solutions.
- Leverage data modeling and DAX to build robust and reusable datasets in Power BI.
- Perform data validation and QA to ensure accuracy during and after migration.
- Work closely with Snowflake-based datasets, or assist in transitioning data sources to Snowflake where applicable.
- Translate healthcare data metrics into actionable insights and visualizations.

Required Skills:
- 4+ years of experience in Business Intelligence or Data Analytics roles.
- Strong expertise in Power BI, including DAX, Power Query, custom visuals, and row-level security.
- Hands-on experience with QlikView or Qlik Sense, especially in migration scenarios.
- Advanced proficiency in SQL: complex joins, performance tuning, and stored procedures.
- Exposure to Snowflake or similar cloud data platforms (e.g., Redshift, BigQuery).
- Experience working with healthcare datasets (claims, clinical, EMR/EHR data, etc.) is a strong advantage.
- Strong analytical and problem-solving mindset.
- Effective communication and stakeholder management skills.
🚨 We’re Hiring: IT Administrator (3+ Years Exp.) 🚨
📍 Location: Coimbatore
📅 Experience: 3+ Years
🎓 Qualification: Diploma / B.Sc in IT or related field
🕒 Join Us: Immediate Requirement

Are you a proactive and dynamic IT professional looking to make an impact? We’re on the lookout for an IT Administrator to support and manage our tech infrastructure.

💻 What You’ll Do:
- Maintain and troubleshoot IT systems, networks, and servers
- Ensure system security, data backup, and smooth daily operations
- Provide user support, manage IT assets, and handle software/hardware setups
- Collaborate with teams to keep everything running efficiently

🔧 What We’re Looking For:
- 3+ years of hands-on IT admin experience
- Strong knowledge of system & network management
- Problem-solving mindset and good communication skills
- Familiarity with cloud tools & IT security practices is a plus
No. of Positions: 1
Position: DevOps Architect
Location: Coimbatore (On-site)
Total Years of Experience: 7+ years

Key Responsibilities:
- Design, implement, and optimize scalable and reliable DevOps processes for continuous integration, continuous deployment (CI/CD), and infrastructure as code (IaC).
- Lead the architecture and implementation of cloud-based infrastructure solutions, leveraging AWS, Azure, or GCP depending on project requirements.
- Collaborate with software development teams to ensure smooth integration of development, testing, and production environments.
- Implement and manage tools for automation, monitoring, and alerting across development and production environments (e.g., Jenkins, GitLab CI, Ansible, Terraform, Docker, Kubernetes).
- Oversee the management of version control, release pipelines, and deployment processes for a variety of applications.
- Design and implement infrastructure monitoring solutions, ensuring high availability and performance of systems.
- Foster a culture of continuous improvement and work closely with development and operations teams to enhance automation, testing, and release pipelines.
- Ensure security best practices are followed in the development and deployment pipeline (e.g., secret management, vulnerability scanning).
- Lead efforts to address performance bottlenecks, scaling challenges, and infrastructure optimization.
- Mentor and guide junior engineers in the DevOps space.

Required Skills:
- Bachelor's degree in Computer Science, Information Technology, or a related field, or equivalent work experience.
- 7+ years of experience in DevOps, cloud infrastructure, and automation tools.
- Strong experience with cloud platforms (AWS, Azure, GCP) and their services (EC2, Lambda, S3, etc.).
- Expertise in containerization technologies (Docker, Kubernetes) and orchestration tools.
- Extensive experience with automation tools (Jenkins, Ansible, Chef, Puppet, Terraform).
- Familiarity with infrastructure as code (IaC) principles and practices.
- Proficiency with scripting languages (Bash, Python, Go, etc.).
- Strong knowledge of version control systems (Git, SVN).
- Experience with monitoring and logging tools (Prometheus, Grafana, ELK stack, New Relic).
- Excellent troubleshooting skills, with the ability to quickly identify and resolve complex issues.
- Strong communication and leadership skills, with a proven ability to collaborate across multiple teams.
- Solid understanding of Agile and Scrum methodologies.

Preferred Qualifications:
- Certifications in DevOps tools, cloud technologies, or Kubernetes.
- Experience with serverless architecture.
- Familiarity with security best practices in a DevOps environment.
- Experience with database management and backup strategies.

Don't see a role that fits? We are growing rapidly and always on the lookout for passionate and smart engineers! If you are passionate about your career, reach out to us at careers@hashagile.com.
No. of Positions: 1
Position: Lead Data Engineer
Location: Hybrid or Remote
Total Years of Experience: 5+ years

Key Responsibilities:
- Build ETL (extract, transform, load) jobs using Fivetran and dbt for our internal projects and for customers that use platforms such as Azure, Salesforce, and AWS technologies.
- Monitor active ETL jobs in production.
- Build out data lineage artifacts to ensure all current and future systems are properly documented.
- Assist with the build-out of design/mapping documentation to ensure development is clear and testable for QA and UAT purposes.
- Assess current and future data transformation needs to recommend, develop, and provide training on new data integration tools.
- Discover efficiencies in shared data processes and batch schedules to help eliminate redundancy and ensure smooth operations.
- Assist the Data Quality Analyst in implementing checks and balances across all jobs to ensure data quality throughout the environment for current and future batch jobs.
- Apply hands-on experience in developing and implementing large-scale data warehouses, Business Intelligence, and MDM solutions, including Data Lakes/Data Vaults.

Required Skills:
- This job has no supervisory responsibilities.
- Bachelor's degree in Computer Science, Math, Software Engineering, Computer Engineering, or a related field AND 6+ years' experience in business analytics, data science, software development, data modeling, or data engineering work.
- 5+ years' experience with strong SQL query/development skills.
- Ability to develop ETL routines that manipulate and transfer large volumes of data and perform quality checks.
- Hands-on experience with ETL tools (e.g., Informatica, Talend, dbt, Azure Data Factory).
- Experience working in the healthcare industry with PHI/PII.
- Creative, lateral, and critical thinker.
- Excellent communicator with well-developed interpersonal skills.
- Good at prioritizing tasks and time management.
- Ability to describe, create, and implement new solutions.
- Experience with related or complementary open-source software platforms and languages (e.g., Java, Linux, Apache, Perl/Python/PHP, Chef).
- Knowledge of / hands-on experience with BI tools and reporting software (e.g., Cognos, Power BI, Tableau).

Don't see a role that fits? We are growing rapidly and always on the lookout for passionate and smart engineers! If you are passionate about your career, reach out to us at careers@hashagile.com.
No. of Positions: 1
Position: Data Integration Technical Lead
Location: Hybrid or Remote
Total Years of Experience: 8+ years

Experience:
- 8+ years of experience in data integration, cloud technologies, and API-based integrations.
- At least 3 years in a technical leadership role overseeing integration projects.
- Proven experience in integrating cloud-based systems, on-premise systems, databases, and legacy platforms.
- Informatica Cloud (IICS) or Mulesoft certifications preferable.

Technical Expertise:
- Expertise in designing and implementing integration workflows using IICS, Mulesoft, or other integration platforms.
- Proficient in integrating cloud and on-premise systems, databases, and legacy platforms using API integrations, REST/SOAP, and middleware tools (see the illustrative sketch after this listing).
- Strong knowledge of Salesforce CRM, Microsoft Dynamics CRM, and other enterprise systems for integration.
- Experience in creating scalable, secure, and high-performance data integration solutions.
- Deep understanding of data modeling, transformation, and normalization techniques for integrations.
- Strong experience in troubleshooting and resolving integration issues.

Key Responsibilities:
- Work with architects and client stakeholders to design data integration solutions that align with business needs and industry best practices.
- Lead the design and implementation of data integration pipelines, frameworks, and cloud integrations.
- Lead and mentor a team of data integration professionals, conducting code reviews and ensuring high-quality deliverables.
- Design and implement integrations with external systems using APIs, middleware, and cloud services.
- Develop data transformation workflows and custom scripts to integrate data between systems.
- Stay updated on new integration technologies and recommend improvements as necessary.
- Excellent verbal and written communication skills to engage with both technical and non-technical stakeholders.
- Proven ability to explain complex technical concepts clearly and concisely.

Don't see a role that fits? We are growing rapidly and always on the lookout for passionate and smart engineers! If you are passionate about your career, reach out to us at careers@hashagile.com.
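In practice the integration work above is done on platforms such as IICS or Mulesoft, but the underlying extract-transform-load pattern over REST can be sketched in plain Python with the requests library. Every URL, field name, and token below is hypothetical.

```python
import requests

SOURCE_URL = "https://source.example.com/api/v1/accounts"  # hypothetical source system
TARGET_URL = "https://crm.example.com/api/v1/contacts"     # hypothetical target CRM

def transform(record: dict) -> dict:
    """Map and normalize source fields to the target system's schema."""
    return {
        "externalId": record["id"],
        "fullName": record["name"].strip().title(),
        "email": record["email"].lower(),
    }

def sync() -> None:
    # Extract: pull records from the source REST API.
    resp = requests.get(SOURCE_URL, headers={"Authorization": "Bearer <token>"}, timeout=30)
    resp.raise_for_status()

    # Transform and load: push each normalized record to the target API.
    for record in resp.json():
        out = requests.post(TARGET_URL, json=transform(record), timeout=30)
        out.raise_for_status()

if __name__ == "__main__":
    sync()
```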
Job Title: IT & System Administrator Location: Coimbatore (On site) Employment Type: Full-time Job Summary: We are seeking a highly skilled and proactive IT & System Administrator to manage and maintain our organization's IT infrastructure. The ideal candidate will be responsible for ensuring the stable operation, integrity, and efficient performance of information systems, networks, and servers. This includes installing, configuring, administering, and troubleshooting hardware, software, and network components. Key Responsibilities: 1-2 years hands on experience in Networking, Subnetting (IPV4, IPV6), VLAN, Crimping, Routers, Switches and Access points. Strong network troubleshooting skills. 1 year experience in Linux servers. Knowledge in ITIL process and ticketing tools like JIRA, ServiceDesk. CCNA Certification (Optional). Redhat / Linux Certifications (Optional). Required Qualifications: Bachelor’s degree in computer science, Information Technology, or related field. Diploma holders also preferred (They have realtime work experience). 1+ years of proven experience in system and network administration. Strong knowledge of Windows Server, Linux, VMware/Hyper-V, and Office 365 administration. Experience with firewalls (e.g., Fortinet, Cisco), networking protocols, and VPNs. Familiarity with backup systems, disaster recovery planning, and cybersecurity principles. Additional Qualifications: Relevant certifications (e.g., MCSA, CompTIA Network+, CCNA, ITIL). Experience with cloud services (e.g., AWS, Azure). Scripting skills (e.g., PowerShell, Bash) for automation tasks. Soft Skills: Excellent problem-solving and troubleshooting abilities. Strong communication and interpersonal skills. Ability to work independently and manage multiple tasks effectively. Show more Show less
Job Title: Data Engineer
Location: Coimbatore
Job Type: Full-Time

Job Description: We are seeking a skilled Data Engineer with proficient knowledge of Spark and SQL to join our dynamic team. The ideal candidate will be responsible for designing, implementing, and optimizing data pipelines on our data platform. You will work closely with data architects and other stakeholders to ensure data accessibility, reliability, and performance.

Key Responsibilities:
- Data Pipeline Development: Design, develop, and maintain scalable data pipelines using Azure Synapse, Databricks, and Apache Spark (PySpark) (see the illustrative sketch after this listing).
- Data Integration: Integrate data from various sources, ensuring data quality and consistency.
- Performance Optimization: Optimize data processing workflows for performance and cost-efficiency.
- Collaboration: Work with data architects, analysts, and product owners to understand data requirements and deliver solutions.
- Monitoring and Troubleshooting: Monitor data pipelines and troubleshoot issues to ensure data integrity.
- Documentation: Document data workflows, processes, and best practices.

Technical Skills:
- Proficiency in Azure Synapse/Databricks and Apache Spark.
- Strong PySpark and SQL skills for data manipulation and querying.
- Familiarity with Delta Live Tables and Databricks workflows.
- Experience with ETL tools and processes.
- Knowledge of cloud platforms (AWS, Azure, GCP).

Soft Skills:
- Excellent problem-solving abilities.
- Strong communication and collaboration skills.
- Ability to work in a fast-paced environment and manage multiple priorities.
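A minimal sketch of the kind of PySpark pipeline this role describes, assuming a hypothetical sales dataset and output path. On Databricks or Synapse the SparkSession is provided by the platform and the sink would typically be a Delta table rather than plain Parquet.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-sales-pipeline").getOrCreate()

# Extract: read raw CSV landed by an upstream process (path is hypothetical).
raw = (
    spark.read.option("header", "true")
    .option("inferSchema", "true")
    .csv("/landing/sales/*.csv")
)

# Transform: basic cleansing and a daily revenue aggregate.
clean = (
    raw.dropDuplicates(["order_id"])       # guard against replayed files
    .filter(F.col("amount") > 0)           # drop refunds/bad rows for this summary
    .withColumn("order_date", F.to_date("order_ts"))
)
daily = clean.groupBy("order_date").agg(F.sum("amount").alias("revenue"))

# Load: write the curated aggregate (a Delta table would be typical on Databricks).
daily.write.mode("overwrite").parquet("/curated/daily_revenue")
```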
Job Title: Python REST API Developer
Location: Coimbatore
Job Type: Full-Time

Your Responsibilities:
- Develop and maintain Python-based REST APIs with a strong focus on OpenAPI (Swagger) specifications and clean, testable code (see the illustrative sketch after this listing).
- Collaborate with internal teams to align on data structures, endpoints, versioning strategies, and deployment timelines.
- Work with tools such as Postman and Swagger UI to validate and document API endpoints.
- Monitor and enhance the performance, reliability, and security of deployed APIs.
- Support consumers of the APIs by maintaining clear documentation and assisting with technical queries.
- Contribute to continuous improvement efforts in our development practices, code quality, and system observability (e.g., logging, error handling).
- Use GitHub, Azure DevOps, or similar tools for version control and CI/CD workflows.

Your Profile:
- Strong experience (3+ years) in backend development using Python (e.g., FastAPI, Flask).
- Solid understanding of REST API design, versioning, authentication, and documentation (especially OpenAPI/Swagger).
- Proficient in using tools like Postman, VS Code, and GitHub, and in working with SQL-based databases.
- Familiar with Azure Functions or cloud-based deployment patterns (experience with Azure is a plus but not mandatory).
- Comfortable troubleshooting technical issues, analyzing logs, and collaborating with support or development teams to identify root causes.
- Experience or interest in distributed data processing with Spark or real-time data pipelines using Kafka is a plus, but not required.
- Team player with a collaborative mindset and a proactive approach to sharing knowledge and solving problems.
- Fluent in English, written and spoken.
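For context, a minimal FastAPI sketch of the OpenAPI-first endpoint work described above; the model and routes are hypothetical. FastAPI generates the OpenAPI (Swagger) document automatically, which is what tools like Swagger UI and Postman then consume.

```python
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="Orders API", version="1.0.0")  # metadata flows into the OpenAPI spec

class Order(BaseModel):
    id: int
    customer: str
    total: float

# In-memory store standing in for a real SQL database.
ORDERS: dict[int, Order] = {}

@app.post("/orders", response_model=Order, status_code=201)
def create_order(order: Order) -> Order:
    ORDERS[order.id] = order
    return order

@app.get("/orders/{order_id}", response_model=Order)
def get_order(order_id: int) -> Order:
    if order_id not in ORDERS:
        raise HTTPException(status_code=404, detail="order not found")
    return ORDERS[order_id]

# Run with: uvicorn main:app --reload
# Swagger UI is then served at /docs, the raw OpenAPI spec at /openapi.json.
```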
No. of Positions: 2
Position: Python REST API Developer
Location: On-site (Coimbatore)
Total Years of Experience: 3+ years

Responsibilities:
- Develop and maintain Python-based REST APIs with a strong focus on OpenAPI (Swagger) specifications and clean, testable code.
- Collaborate with internal teams to align on data structures, endpoints, versioning strategies, and deployment timelines.
- Work with tools such as Postman and Swagger UI to validate and document API endpoints.
- Monitor and enhance the performance, reliability, and security of deployed APIs.
- Support consumers of the APIs by maintaining clear documentation and assisting with technical queries.
- Contribute to continuous improvement efforts in our development practices, code quality, and system observability (e.g., logging, error handling).
- Use GitHub, Azure DevOps, or similar tools for version control and CI/CD workflows.

Experience:
- Strong experience (3+ years) in backend development using Python (e.g., FastAPI, Flask).
- Solid understanding of REST API design, versioning, authentication, and documentation (especially OpenAPI/Swagger).
- Proficient in using tools like Postman, VS Code, and GitHub, and in working with SQL-based databases.
- Familiar with Azure Functions or cloud-based deployment patterns (experience with Azure is a plus but not mandatory).
- Comfortable troubleshooting technical issues, analyzing logs, and collaborating with support or development teams to identify root causes.
- Experience or interest in distributed data processing with Spark or real-time data pipelines using Kafka is a plus.
- Team player with a collaborative mindset and a proactive approach to sharing knowledge and solving problems.
- Fluent in English, written and spoken.

Don't see a role that fits? We are growing rapidly and always on the lookout for passionate and smart engineers! If you are passionate about your career, reach out to us at careers@hashagile.com.
No. of Positions: 2
Position: Data Engineer
Location: On-site (Coimbatore)
Total Years of Experience: 3+ years

Key Responsibilities:
- Data Pipeline Development: Design, develop, and maintain scalable data pipelines using Azure Synapse, Databricks, and Apache Spark (PySpark).
- Data Integration: Integrate data from various sources, ensuring data quality and consistency.
- Performance Optimization: Optimize data processing workflows for performance and cost-efficiency.
- Collaboration: Work with data architects, analysts, and product owners to understand data requirements and deliver solutions.
- Monitoring and Troubleshooting: Monitor data pipelines and troubleshoot issues to ensure data integrity.
- Documentation: Document data workflows, processes, and best practices.

Technical Skills:
- Proficiency in Azure Synapse/Databricks and Apache Spark.
- Strong PySpark and SQL skills for data manipulation and querying.
- Familiarity with Delta Live Tables and Databricks workflows.
- Experience with ETL tools and processes.
- Knowledge of cloud platforms (AWS, Azure, GCP).

Soft Skills:
- Excellent problem-solving abilities.
- Strong communication and collaboration skills.
- Ability to work in a fast-paced environment and manage multiple priorities.

Don't see a role that fits? We are growing rapidly and always on the lookout for passionate and smart engineers! If you are passionate about your career, reach out to us at careers@hashagile.com.
As a Senior ETL Developer with 5+ years of experience based in Coimbatore, your primary responsibilities will revolve around designing and developing robust ETL processes to extract, transform, and load data from various sources into the data warehouse. Your role will encompass integrating data from multiple sources to ensure data consistency and quality, optimizing ETL performance for efficiency and scalability, and analyzing complex datasets to identify data quality issues and inconsistencies.

Collaboration with data analysts, data scientists, and business stakeholders will be crucial as you work closely with them to understand data requirements and deliver effective data solutions. You will also be responsible for maintaining data integrity throughout the ETL process, troubleshooting and resolving any issues that may arise, and documenting detailed procedures and processes for future reference. Monitoring ETL jobs for successful execution, conducting regular audits to ensure data accuracy, and implementing industry best practices for ETL development and data integration will be part of your routine tasks. Ensuring compliance with data governance policies and data privacy regulations will also be essential in your role.

Your technical proficiency in ETL tools and technologies, programming skills, strong understanding of data warehousing concepts, exceptional analytical and problem-solving abilities, excellent communication skills, attention to detail, project management skills, and continuous learning mindset will all be key factors in your success as a Senior ETL Developer. Additionally, you will be expected to provide ongoing support and maintenance for ETL processes, mentor junior developers, and stay updated with the latest trends in ETL technologies and data integration.
Responsibilities:
- Design, develop, and maintain interactive dashboards and reports using Tableau.
- Collaborate with stakeholders to understand data visualization needs and translate them into effective visual solutions.
- Analyse, interpret, and model large data sets using SQL or Python to uncover insights and support business decisions (see the illustrative sketch after this listing).
- Work with cross-functional teams including data engineering, analytics, and business units.
- Ensure visualizations follow best practices in terms of usability, performance, and storytelling.
- Identify trends, outliers, and actionable opportunities through visual analysis.
- Provide thought leadership and subject matter expertise in data visualization and business intelligence.

Required Qualifications:
- Minimum 5+ years of experience in data visualization using any BI tool.
- At least 3+ years of hands-on experience in Tableau (dashboard creation, calculations, filters, parameters, actions, etc.).
- Proficient in SQL or Python for data extraction, manipulation, and analysis.
- Strong analytical thinking, storytelling, and data presentation skills.
- Excellent communication and collaboration abilities.

Preferred Qualifications:
- Experience in the pharmaceutical or life sciences domain is a strong plus.
- Familiarity with data warehousing and cloud platforms (e.g., Snowflake, AWS, Azure) is advantageous.
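Surfacing the trends and outliers this role mentions could look like the following minimal pandas sketch, assuming a hypothetical monthly sales extract with region, month, and sales columns; in practice the result would feed a Tableau data source.

```python
import pandas as pd

# Hypothetical extract with columns: region, month, sales.
df = pd.read_csv("monthly_sales.csv", parse_dates=["month"])
df = df.sort_values(["region", "month"])

# Trend: 3-month rolling average of sales per region.
df["trend"] = df.groupby("region")["sales"].transform(
    lambda s: s.rolling(3, min_periods=1).mean()
)

# Outliers: points more than 2 standard deviations from their region's mean.
stats = df.groupby("region")["sales"].agg(["mean", "std"]).reset_index()
df = df.merge(stats, on="region")
df["is_outlier"] = (df["sales"] - df["mean"]).abs() > 2 * df["std"]

print(df.loc[df["is_outlier"], ["region", "month", "sales"]])
```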
Responsibilities:
- Design, implement, and manage data governance policies, standards, and procedures aligned with enterprise goals.
- Drive initiatives in areas such as Data Control, Data Privacy, Data Ethics, and Data Strategy.
- Collaborate with cross-functional and geographically distributed teams to establish governance frameworks.
- Contribute to the strategic direction of the Data Management Center of Excellence (CoE).
- Provide guidance on industry-standard frameworks such as EDM, DAMA, and DCAM.
- Ensure compliance with regulatory and privacy requirements across data assets.
- Act as a liaison between business and technical teams to communicate governance goals and initiatives.
- Support data quality improvement programs and monitor adherence to governance policies.

Required Qualifications:
- Minimum of 10 years of experience in data management, with significant exposure to data governance.
- Hands-on experience in at least one of the following areas: Data Control, Data Privacy, Data Ethics, or Data Strategy, with working knowledge of the others.
- Solid understanding of EDM, DAMA, and DCAM frameworks.
- Proven experience working in multi-team, global collaboration environments.
- Prior experience in strategic data initiatives or data governance CoE roles.
- Excellent communication and stakeholder management skills.

Preferred Qualifications:
- Pharmaceutical industry experience or familiarity with regulated environments.
- Experience working with non-US clients and multicultural teams.
- Consulting background, ideally in data governance or enterprise data strategy.
You will be responsible for developing and maintaining Python-based REST APIs with a strong emphasis on adhering to OpenAPI (Swagger) specifications and writing clean, testable code. It will be crucial to collaborate effectively with internal teams to ensure alignment on data structures, endpoints, versioning strategies, and deployment timelines. You will utilize tools such as Postman and Swagger UI for validating and documenting API endpoints. Monitoring and improving the performance, reliability, and security of deployed APIs will be a key part of your role. Additionally, you will provide support to API consumers by maintaining clear documentation and assisting with technical queries. Your contributions will extend to continuous improvement initiatives in development practices, code quality, and system observability, including logging and error handling. Version control and CI/CD workflows will be managed using tools like GitHub, Azure DevOps, or similar platforms.

The ideal candidate should possess a minimum of 3 years of experience in backend development using Python, with familiarity working with frameworks like FastAPI and Flask. A solid understanding of REST API design, versioning, authentication, and documentation, particularly OpenAPI/Swagger, is required. Proficiency in tools such as Postman, VS Code, GitHub, and SQL databases is essential. Knowledge of Azure Functions or cloud-based deployment patterns is advantageous, and experience with Azure is preferred but not mandatory. Troubleshooting technical issues, analyzing logs, and collaborating with support or development teams to identify root causes will be part of your day-to-day responsibilities. Experience or interest in distributed data processing with Spark or real-time data pipelines using Kafka is a plus. You should be a team player with a collaborative mindset, proactive in sharing knowledge, and adept at problem-solving. Proficiency in English, both written and spoken, is necessary for effective communication within the team.

If you do not find a suitable role among the current openings but are a passionate and skilled engineer, we encourage you to reach out to us at careers@hashagile.com. Our company is growing rapidly, and we are always looking for enthusiastic individuals to join our team.