Job Title: Python REST API Developer

Note: We are looking for someone who can join our team immediately.

Your Mission
Join our Global BI Development Team as we shape the future of our BI landscape in the Azure Cloud. As a key member of the team, you’ll focus on building and maintaining robust, scalable, and well-documented REST APIs that power our Data-as-a-Service platform (PUMA API Portal). You’ll work closely with stakeholders to ensure reliable data delivery, clean design, and seamless integration across systems.

Your Responsibilities
- Develop and maintain Python-based REST APIs with a strong focus on OpenAPI (Swagger) specifications and clean, testable code (see the sketch after this posting).
- Collaborate with internal teams to align on data structures, endpoints, versioning strategies, and deployment timelines.
- Work with tools such as Postman and Swagger UI to validate and document API endpoints.
- Monitor and enhance the performance, reliability, and security of deployed APIs.
- Support consumers of the APIs by maintaining clear documentation and assisting with technical queries.
- Contribute to continuous improvement efforts in our development practices, code quality, and system observability (e.g., logging, error handling).
- Use GitHub, Azure DevOps, or similar tools for version control and CI/CD workflows.

Your Profile
- Strong experience (3+ years) in backend development using Python (e.g., FastAPI, Flask).
- Solid understanding of REST API design, versioning, authentication, and documentation (especially OpenAPI/Swagger).
- Proficient with tools like Postman, VS Code, and GitHub, and comfortable working with SQL-based databases.
- Familiar with Azure Functions or cloud-based deployment patterns (experience with Azure is a plus but not mandatory).
- Comfortable troubleshooting technical issues, analyzing logs, and collaborating with support or development teams to identify root causes.
- Experience or interest in distributed data processing with Spark or real-time data pipelines using Kafka is a plus, but not required.
- Team player with a collaborative mindset and a proactive approach to sharing knowledge and solving problems.
- Fluent in English, written and spoken.
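For illustration only: below is a minimal sketch of the kind of endpoint this role centers on. FastAPI derives the OpenAPI (Swagger) specification directly from typed route signatures, which is what makes the documentation-focused workflow above practical. The app title, route, and Product model are hypothetical examples, not details taken from the posting.

```python
# Minimal FastAPI sketch (illustrative; all names are hypothetical).
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="Data-as-a-Service API", version="1.0.0")

class Product(BaseModel):
    id: int
    name: str
    category: str

# Hypothetical in-memory store standing in for a real SQL-backed repository.
PRODUCTS = {1: Product(id=1, name="Runner X", category="footwear")}

@app.get("/v1/products/{product_id}", response_model=Product)
def get_product(product_id: int) -> Product:
    """Typed parameters and response models feed the auto-generated OpenAPI spec."""
    product = PRODUCTS.get(product_id)
    if product is None:
        raise HTTPException(status_code=404, detail="Product not found")
    return product
```

Served with uvicorn (uvicorn main:app), FastAPI exposes the interactive Swagger UI at /docs, which is typically used alongside Postman to validate endpoints like this one.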
Company Description
Texplorers Inc. specializes in driving digital transformation for global enterprises through cutting-edge cloud technologies and AI expertise. The company enables businesses to scale efficiently, optimize operations, and foster continuous innovation. Texplorers equips clients with the digital capabilities and forward-thinking strategies necessary for sustained success.

Role Description
This is a full-time, on-site role for a Senior Human Resources Manager located in Hyderabad. The Senior HR Manager will be responsible for overseeing all aspects of HR functions, including recruitment, training, employee relations, performance management, and compliance with employment laws. This role involves collaborating with leadership to develop and implement HR strategies that support business objectives and promote a positive workplace culture.

Qualifications
- Experience in recruitment, training, employee relations, and performance management
- Knowledge of employment laws and compliance
- Strong communication and interpersonal skills
- Ability to develop and implement HR strategies
- Excellent organizational and leadership abilities
- Experience in a global corporate environment is a plus
- Bachelor's degree in Human Resources Management or a related field
Data Engineer

We are hiring for a Data Engineer role and are looking for professionals with strong expertise in Spark and SQL to join our dynamic team. This position offers the opportunity to work on modern data platforms with technologies like Azure Synapse, Databricks, and Apache Spark (PySpark).

Key Responsibilities:
- Designing and developing scalable data pipelines using Azure Synapse, Databricks, and PySpark (see the sketch after this posting)
- Integrating data from various sources with a focus on quality and consistency
- Optimizing workflows for performance and cost-efficiency
- Collaborating with cross-functional teams to deliver reliable data solutions
- Monitoring pipelines and ensuring data integrity
- Documenting workflows and best practices

Skills We're Looking For:
- Technical: PySpark, SQL, Azure Synapse, Databricks, Delta Live Tables, ETL processes, cloud platforms (Azure, AWS, GCP)
- Soft skills: strong problem-solving, communication, and the ability to work in a fast-paced, collaborative environment
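As a hedged illustration of the pipeline work described above, here is a small PySpark sketch. The paths, column names, and aggregation are hypothetical, and the Delta output assumes a Databricks-style environment where Delta Lake is available.

```python
# Illustrative PySpark pipeline sketch (hypothetical paths and columns).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("sales-pipeline").getOrCreate()

# Ingest raw data, enforce basic quality rules, and write a curated table.
orders = spark.read.option("header", True).csv("/mnt/raw/orders.csv")

curated = (
    orders
    .filter(F.col("order_id").isNotNull())            # drop incomplete rows early
    .withColumn("amount", F.col("amount").cast("double"))
    .groupBy("region")
    .agg(F.sum("amount").alias("total_amount"))       # single-pass aggregation
)

# Delta format assumes Databricks or a cluster with Delta Lake configured.
curated.write.mode("overwrite").format("delta").save("/mnt/curated/sales_by_region")
```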
Note: We are looking for someone who can join us immediately.

About the Role
We are looking for a highly skilled and business-savvy Data Analyst to join our team. This role is ideal for someone who thrives on translating complex business requirements into impactful reporting solutions using Power BI and DAX. You will work closely with stakeholders across departments to deliver insights that drive strategic decisions and operational excellence.

Key Responsibilities
- Collaborate with business stakeholders to gather and understand reporting needs and translate them into scalable data models and dashboards.
- Design, develop, and maintain Power BI reports and dashboards that provide actionable insights and support data-driven decision-making.
- Apply advanced DAX to create dynamic measures and KPIs that reflect business logic and performance metrics.
- Build and optimize semantic models using best practices in data modeling (star schema, normalization, etc.).
- Ensure data quality, consistency, and performance across all reporting layers.
- Communicate findings clearly and effectively to both technical and non-technical audiences.
- Support self-service BI initiatives by enabling business users through training and documentation.

Required Skills & Experience
- 5+ years of strong experience in Power BI, including DAX, Power Query, and data modeling.
- Solid understanding of relational databases and proficiency in SQL.
- Proven ability to translate business questions into analytical solutions.
- Excellent communication and stakeholder management skills.
- Experience working in agile, cross-functional teams.
- High attention to detail and a proactive, solution-oriented mindset.

Preferred Qualifications
- Bachelor’s or Master’s degree in Data Science, Business Analytics, Information Systems, or a related field.
- Experience with CI/CD pipelines for Power BI (e.g., using GitHub or Azure DevOps).
- Familiarity with tools like SSAS, Azure Synapse, or Databricks is a plus.
- Prior experience in retail, e-commerce, or consumer analytics is advantageous.
Position Overview
We are seeking a detail-oriented QA Tester to join our data transformation team as we migrate our analytics infrastructure to Databricks. This role focuses on validating Power BI reports and ensuring data accuracy throughout our enterprise-wide transformation program. The successful candidate will play a critical role in maintaining data integrity and report reliability during this strategic migration.

Key Responsibilities

Data Validation & Testing
- Design and execute comprehensive test plans for Power BI reports and dashboards affected by the Databricks migration
- Perform end-to-end data validation comparing source systems, transformed data, and final report outputs
- Validate data accuracy, completeness, and consistency across multiple data sources and reporting layers
- Create and maintain automated data validation scripts using Python or similar scripting languages (see the sketch after this posting)

SQL & Database Testing
- Write complex SQL queries to validate data transformations and business logic
- Compare data sets between legacy systems and the new Databricks environment
- Identify and document data discrepancies, investigating root causes with data engineering teams
- Perform data profiling and quality assessments on large datasets

Power BI Report Validation
- Test Power BI report functionality, performance, and user experience across different scenarios
- Validate calculations, filters, drill-down capabilities, and visual representations
- Ensure reports meet business requirements and maintain expected behaviour post-migration
- Document and track issues using established defect management processes

Documentation & Reporting
- Create detailed test documentation including test cases, test data, and validation procedures
- Maintain comprehensive defect logs and provide regular status reports to stakeholders
- Develop reusable testing frameworks and validation procedures for future use
- Collaborate with business users to understand reporting requirements and acceptance criteria

Required Qualifications

Technical Skills
- 3+ years of experience in software/data quality assurance or testing
- Good knowledge of SQL, with the ability to write queries for data validation
- Experience with Python, R, or similar scripting languages for automation
- Hands-on experience with Power BI
- Understanding of data warehousing concepts and ETL/ELT processes

Professional Experience
- Experience testing business intelligence or analytics solutions
- Background in data migration or transformation projects preferred
- Familiarity with cloud data platforms (Azure, AWS, or similar) is a plus
- Experience with Databricks or Apache Spark environments is advantageous

Soft Skills
- Strong analytical and problem-solving abilities
- Excellent attention to detail and commitment to data accuracy
- Effective communication skills for collaborating with technical and business teams
- Ability to work independently and manage multiple testing streams simultaneously
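As a hedged sketch of the automated validation scripts mentioned above, the following compares a legacy extract against migrated output. The frames, the key column, and the mismatch it flags are all hypothetical examples.

```python
# Illustrative migration-validation sketch (all data and names hypothetical).
import pandas as pd

def validate_migration(source: pd.DataFrame, target: pd.DataFrame, key: str) -> list:
    """Return a list of discrepancies between source and target extracts."""
    issues = []
    if len(source) != len(target):
        issues.append(f"Row count mismatch: {len(source)} vs {len(target)}")
    merged = source.merge(target, on=key, suffixes=("_src", "_tgt"))
    for col in (c for c in source.columns if c != key):
        mismatched = merged[merged[f"{col}_src"] != merged[f"{col}_tgt"]]
        if not mismatched.empty:
            issues.append(f"Column '{col}': {len(mismatched)} mismatched value(s)")
    return issues

legacy = pd.DataFrame({"id": [1, 2], "revenue": [100.0, 250.0]})
migrated = pd.DataFrame({"id": [1, 2], "revenue": [100.0, 251.0]})
print(validate_migration(legacy, migrated, key="id"))  # flags the revenue mismatch
```

In practice the same comparison would run against query results pulled from the legacy system and Databricks rather than in-memory frames.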
We are looking for a hands-on Data Engineer who will be responsible for designing, building, and maintaining scalable data ingestion pipelines. Your main focus will be on delivering high-quality, reliable, and scalable data pipelines to support downstream analytics, machine learning, and business intelligence solutions. You will work with various internal and external sources to onboard structured and semi-structured data using Azure-native services such as Data Factory, Azure Data Lake, and Event Hubs, as well as tools like Databricks or Apache Spark for data ingestion and transformation.

Your responsibilities will include developing metadata-driven ingestion frameworks (see the sketch after this posting), collaborating with source system owners to define data ingestion specifications, implementing monitoring and alerting on ingestion jobs, and embedding data quality, lineage, and governance principles into ingestion processes. You will also optimize ingestion processes for performance, reliability, and cloud cost efficiency, and support both batch and real-time ingestion needs, including streaming data pipelines where applicable.

To qualify for this role, you should have at least 3 years of hands-on experience in data engineering, with a specific focus on data ingestion or integration. You should have hands-on experience with Azure Data Services or equivalent cloud-native tools, experience in Python (PySpark) for data processing tasks, and familiarity with ETL frameworks, orchestration tools, and API-based data ingestion. Knowledge of data quality and validation strategies, CI/CD practices, version control, and infrastructure as code is also required. Bonus qualifications include experience with SAP.
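The metadata-driven ingestion framework mentioned above could look roughly like the sketch below. The source list, landing paths, and bronze-layer naming are hypothetical assumptions, and the Delta write assumes a Databricks-style environment where Delta Lake is available.

```python
# Illustrative metadata-driven ingestion loop (hypothetical sources and paths).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("metadata-ingestion").getOrCreate()

# In a real framework this list would come from a control table or config store.
SOURCES = [
    {"name": "orders",    "format": "json", "path": "/mnt/landing/orders/"},
    {"name": "customers", "format": "csv",  "path": "/mnt/landing/customers/"},
]

for source in SOURCES:
    reader = spark.read.format(source["format"])
    if source["format"] == "csv":
        reader = reader.option("header", True)
    df = reader.load(source["path"])

    # Embed simple lineage metadata so downstream layers can trace each row.
    df = (df.withColumn("_ingested_at", F.current_timestamp())
            .withColumn("_source_name", F.lit(source["name"])))

    # Delta output assumes Databricks or a cluster with Delta Lake configured.
    df.write.mode("append").format("delta").save(f"/mnt/bronze/{source['name']}")
```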
Job Title: Power BI Support Engineer
Job Duration: Full time
Job Location: Hyderabad, TG
Work Hours: Night shift, 5:30 PM to 3:00 AM IST
Required Experience: 5 years

Job Description
This BI Support Engineer will provide level 2 support for Data Engineering and Cube applications.

Tasks:
- Troubleshooting and issue resolution: identify and resolve issues with Data & Analytics reports, dashboards, and data connections end-to-end, from source systems down to Data & Analytics data products (a sketch of one such check follows this posting).
- Data integration and ETL support: help integrate various data sources and provide first- and second-level support to ensure ETL (Extract, Transform, Load) processes run without errors in both on-premises and cloud environments.
- Documentation: create documentation for troubleshooting procedures, system configurations, and user guides.

Skills:
- Knowledge of working with cloud technologies such as Azure and Databricks
- Knowledge of creating and maintaining ETL tools such as SSIS and Synapse data pipelines
- Strong knowledge of T-SQL and managing relational data
- Knowledge of creating and maintaining Power BI reports and datasets
- Knowledge of working with GitHub repositories
- Familiarity with supporting data engineering and Power BI applications in a high-availability, high-performance environment
- Knowledge of Python and PySpark
- Excellent customer service skills and strong technical troubleshooting and problem-solving skills
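As one hedged illustration of the troubleshooting described above, a level 2 support engineer might script a recurring check for failed ETL runs. The connection string is a placeholder, and the etl_run_log table and its columns are hypothetical, not part of the posting.

```python
# Illustrative support check (hypothetical table and columns; placeholder DSN).
import pyodbc

# Placeholder connection string; real server/database details would go here.
CONN_STR = "DRIVER={ODBC Driver 18 for SQL Server};SERVER=...;DATABASE=...;"

def failed_runs(hours: int = 24):
    """Return ETL runs that failed within the last `hours` hours."""
    query = """
        SELECT run_id, pipeline_name, error_message, finished_at
        FROM etl_run_log
        WHERE status = 'Failed'
          AND finished_at >= DATEADD(HOUR, ?, SYSUTCDATETIME())
        ORDER BY finished_at DESC
    """
    with pyodbc.connect(CONN_STR) as conn:
        return conn.cursor().execute(query, -hours).fetchall()
```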
As a key member of the team, your main focus will be on constructing and maintaining robust, scalable, and well-documented REST APIs that serve as the foundation for our Data-as-a-Service platform (PUMA API Portal). You will collaborate closely with stakeholders to ensure dependable data delivery, clean design, and seamless integration across various systems.

Your responsibilities will include developing and managing Python-based REST APIs with a strong emphasis on OpenAPI (Swagger) specifications and clean, testable code. You will work alongside internal teams to align on data structures, endpoints, versioning strategies, and deployment schedules. Using tools such as Postman and Swagger UI, you will validate and document API endpoints. You will also be responsible for continuously monitoring and improving the performance, reliability, and security of deployed APIs. Supporting API consumers by maintaining clear documentation and assisting with technical inquiries will also be part of your duties. Furthermore, your contribution to enhancing development practices, code quality, and system observability will be vital, as will leveraging tools like GitHub, Azure DevOps, or similar platforms for version control and CI/CD workflows.

To be successful in this role, you should have a strong background (3+ years) in backend development with Python (e.g., FastAPI, Flask). A solid grasp of REST API design, versioning, authentication, and documentation (particularly OpenAPI/Swagger) is essential, as is proficiency with tools like Postman, VS Code, and GitHub, and experience with SQL-based databases. Familiarity with Azure Functions or cloud-based deployment patterns is preferred (Azure experience is advantageous but not obligatory), and you should be comfortable troubleshooting technical issues, analyzing logs, and collaborating with support or development teams to pinpoint root causes. Experience or interest in distributed data processing with Spark or real-time data pipelines using Kafka is beneficial but not mandatory. A team player with a collaborative mindset, proactive in sharing knowledge, and adept at problem-solving will thrive in this role. Proficiency in both written and spoken English is a must.
Job Title: Python REST API Developer

Your Mission
Join the Global BI Development Team as we shape the future of our BI landscape in the Azure Cloud. As a key member of the team, you’ll focus on building and maintaining robust, scalable, and well-documented REST APIs that power our Data-as-a-Service platform (API Portal). You’ll work closely with stakeholders to ensure reliable data delivery, clean design, and seamless integration across systems.

Your Responsibilities
- Develop and maintain Python-based REST APIs with a strong focus on OpenAPI (Swagger) specifications and clean, testable code.
- Collaborate with internal teams to align on data structures, endpoints, versioning strategies, and deployment timelines.
- Work with tools such as Postman and Swagger UI to validate and document API endpoints.
- Monitor and enhance the performance, reliability, and security of deployed APIs.
- Support consumers of the APIs by maintaining clear documentation and assisting with technical queries.
- Contribute to continuous improvement efforts in our development practices, code quality, and system observability (e.g., logging, error handling).
- Use GitHub, Azure DevOps, or similar tools for version control and CI/CD workflows.

Your Profile
- Strong experience (6+ years) in backend development using Python (e.g., FastAPI, Flask).
- Solid understanding of REST API design, versioning, authentication, and documentation (especially OpenAPI/Swagger).
- Proficient with tools like Postman, VS Code, and GitHub, and comfortable working with SQL-based databases.
- Familiar with Azure Functions or cloud-based deployment patterns (experience with Azure is a plus but not mandatory).
- Comfortable troubleshooting technical issues, analyzing logs, and collaborating with support or development teams to identify root causes.
- Experience or interest in distributed data processing with Spark or real-time data pipelines using Kafka is a plus, but not required.
- Team player with a collaborative mindset and a proactive approach to sharing knowledge and solving problems.
- Fluent in English, written and spoken.
DevOps Engineer – Data Solutions

YOUR MISSION
We’re building something solid, and we’re nearly there. Our team has been steadily laying the foundation for a robust DevOps practice to support our Azure-based data platform. The team is in place, core processes are already running, and now we’re ready to level up. The goal is to make deployments faster, more reliable, and less dependent on manual work, so developers can focus on building. We’re looking for a hands-on DevOps Engineer who can work independently and take ownership of topics end-to-end.

What You'll Do
- Design and implement GitHub Actions workflows for Azure Databricks, database solutions, Azure Functions, App Services, REST API solutions (APIOps), Power BI solutions, and AI/ML solutions (MLOps).
- Define the pull request flow, including pull request, review, merging, build, acceptance, and deployment.
- Understand developers' deployment needs and define GitHub Actions for each project, which developers will use to deploy their code to production.
- Propose scalable architecture solutions to support development and operations.
- Install software and configure GitHub runners.
- Contribute light infrastructure automation using Terraform when required.
- Guide and cooperate: be the go-to person for developers, providing clarifications grounded in an understanding of the overall architecture setup, and support the operations and development teams in organizing proper processes and in making sure development adheres to them.

YOUR TALENT
- University degree in Computer Science or a similar field of study
- 3+ years of experience setting up GitOps processes and creating GitHub Actions
- Basic experience with Terraform and Infrastructure as Code (IaC)
- Strong understanding of the following Azure services: Azure Storage accounts (ADLS), Azure Function Apps, App Services, and Databricks hosted in Azure
- Background ideally in both data solution development and CI/CD automation
- Very high motivation to help and guide teammates to succeed in projects
- Fluent in English