
Codehive Labs Hyderabad

We provide custom software development solutions tailored to your specific needs: applications built from scratch, designed specifically to address your unique requirements and business processes.

13 Job openings at Codehive Labs Hyderabad
BI Migration Developer | Hyderabad | 6-9 years | INR 15.0-20.0 Lacs P.A. | Remote | Full Time

Role Overview:
We are looking for an experienced Tableau to Power BI Migration Developer to lead and execute the migration of enterprise Tableau dashboards and reports to Power BI. The ideal candidate has strong expertise in both Tableau and Power BI, with a proven track record of delivering large-scale BI migration projects, including semantic layer alignment, visual redesign, performance optimization, and user training.

Responsibilities:
- Analyze existing Tableau dashboards/reports and source systems to define the migration strategy to Power BI.
- Perform detailed mapping of Tableau workbook components (data sources, dimensions, measures, calculations, LODs) to the Power BI semantic model.
- Redesign and rebuild reports and dashboards in Power BI, ensuring parity with or improvement upon the Tableau implementations.
- Optimize Power BI reports for performance, usability, and scalability.
- Collaborate with Data Engineering teams to ensure proper data integration and modeling in Power BI.
- Develop and maintain documentation on migration processes, transformations, and Power BI best practices.
- Conduct user acceptance testing (UAT) with business users and incorporate feedback.
- Support training and enablement sessions for end users transitioning from Tableau to Power BI.
- Identify and automate repeatable patterns to improve migration efficiency.
- Troubleshoot and resolve issues during the migration lifecycle.

Required Skills & Qualifications:
- 5 to 7 years of overall experience in Business Intelligence and Data Visualization.
- 3+ years of hands-on experience with Power BI: developing reports, dashboards, DAX calculations, Power Query transformations, and data models.
- 3+ years of hands-on experience with Tableau: developing dashboards, data extracts, calculated fields, LOD expressions, and advanced visualizations.
- Strong experience in Tableau to Power BI migrations or similar BI platform migrations.
- Solid understanding of data modeling principles (star schema, snowflake schema) and data visualization best practices.
- Proficiency in DAX, M (Power Query), and SQL.
- Experience working with enterprise data sources: data warehouses, data lakes, RDBMS, APIs.
- Familiarity with version control (Git) and CI/CD pipelines for Power BI artifacts is a plus.
- Strong analytical and problem-solving skills.
- Excellent communication and documentation skills.

Preferred Skills:
- Experience with BI migration automation tools.
- Experience working in Agile delivery environments.
- Experience with the Azure Data Platform or similar cloud BI architectures.
- Familiarity with Power BI service administration and workspace governance.
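The workbook-component mapping step this role describes usually starts with an inventory of Tableau calculated fields. Tableau `.twb` files are plain XML, so a first pass can be scripted; the snippet below is a minimal sketch against a hypothetical, simplified workbook fragment, not a full parser.

```python
# Sketch: inventory the calculated fields in a Tableau workbook (.twb files
# are plain XML) as a first step toward mapping them to Power BI measures.
# SAMPLE_TWB is a hypothetical, heavily simplified workbook fragment.
import xml.etree.ElementTree as ET

SAMPLE_TWB = """
<workbook>
  <datasources>
    <datasource caption='Sales'>
      <column caption='Profit Ratio' name='[Calculation_1]'>
        <calculation class='tableau' formula='SUM([Profit]) / SUM([Sales])'/>
      </column>
      <column caption='Order Year' name='[Calculation_2]'>
        <calculation class='tableau' formula='YEAR([Order Date])'/>
      </column>
    </datasource>
  </datasources>
</workbook>
"""

def inventory_calculations(twb_xml: str) -> dict:
    """Return {caption: formula} for every calculated field in the workbook."""
    root = ET.fromstring(twb_xml)
    inventory = {}
    for column in root.iter("column"):
        calc = column.find("calculation")
        if calc is not None:
            inventory[column.get("caption")] = calc.get("formula")
    return inventory

if __name__ == "__main__":
    for caption, formula in inventory_calculations(SAMPLE_TWB).items():
        print(f"{caption}: {formula}")
```

Each extracted formula then becomes a row in the migration mapping sheet, paired with its DAX equivalent.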

Senior Python Developer (Django + FastAPI) | Hyderabad | 6-11 years | INR 15.0-22.5 Lacs P.A. | Remote | Full Time

We are looking for a highly skilled and self-driven Senior Python Developer with strong hands-on experience in Django and FastAPI to join our growing development team. This is a remote, full-time role based in India, working on scalable and high-performance backend systems. The ideal candidate is well versed in building RESTful APIs, has experience with Dockerized deployments, and can work in a fast-paced, agile environment.

Responsibilities:
- Design, develop, and maintain robust, scalable backend systems using Python, Django, and FastAPI.
- Develop and expose secure, well-documented REST APIs and/or GraphQL endpoints.
- Collaborate closely with frontend developers (ReactJS preferred) to integrate APIs and deliver cohesive user experiences.
- Design and optimize relational databases (PostgreSQL/MySQL) and NoSQL solutions as needed.
- Implement containerization and deployment pipelines using Docker and CI/CD tools.
- Ensure application security, performance, and responsiveness at scale.
- Participate in code reviews, testing, and system monitoring with a focus on best practices and maintainability.
- Work in Agile/Scrum teams and contribute to sprint planning, estimation, and delivery.

Preferred candidate profile:
- 6-10 years of experience in backend development using Python.
- Proven expertise in the Django and FastAPI frameworks.
- Experience building and consuming RESTful APIs.
- Solid knowledge of Docker and deploying containerized applications.
- Experience working with relational databases (e.g., PostgreSQL, MySQL).
- Strong debugging, performance tuning, and problem-solving skills.
- Good understanding of software design principles, modular architecture, and clean code practices.
- Familiarity with version control systems like Git.
- Frontend integration experience, especially with ReactJS or similar JS frameworks.
- Familiarity with CI/CD pipelines and cloud platforms (AWS/GCP/Azure).
- Experience with asynchronous programming in Python.
- Exposure to unit testing, integration testing, and test-driven development (TDD).
- Experience with logging/monitoring tools such as Prometheus, Grafana, or the ELK stack.
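The asynchronous-programming requirement above is the core idiom behind FastAPI services: fan out several I/O-bound calls concurrently instead of awaiting them one by one. A stdlib-only sketch (the "fetch" is simulated with `asyncio.sleep` rather than a real HTTP or database client):

```python
# Sketch of async fan-out with asyncio (stdlib only): three simulated
# I/O calls run concurrently, so total time is roughly the slowest call,
# not the sum of all three delays.
import asyncio

async def fetch_resource(name: str, delay: float) -> str:
    # Stand-in for an awaitable database query or HTTP request.
    await asyncio.sleep(delay)
    return f"{name}:ok"

async def gather_all() -> list:
    # asyncio.gather preserves argument order in its result list.
    return list(await asyncio.gather(
        fetch_resource("users", 0.01),
        fetch_resource("orders", 0.02),
        fetch_resource("stock", 0.01),
    ))

if __name__ == "__main__":
    print(asyncio.run(gather_all()))
```

In a FastAPI endpoint the same pattern appears inside an `async def` route handler, with real awaitable clients in place of `fetch_resource`.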

Big Data Developer | Hyderabad | 5-10 years | INR 15.0-25.0 Lacs P.A. | Remote | Full Time

We are seeking a talented and motivated Big Data Developer to design, develop, and maintain large-scale data processing applications. You will work with modern Big Data technologies, leveraging PySpark and Java/Scala, to deliver scalable, high-performance data solutions on AWS. The ideal candidate is skilled in big data frameworks, cloud services, and modern CI/CD practices.

Responsibilities:
- Design and develop scalable data processing pipelines using PySpark and Java/Scala.
- Build and optimize data workflows for batch and real-time data processing.
- Integrate and manage data solutions on AWS services such as EMR, S3, Glue, Airflow, RDS, and DynamoDB.
- Implement containerized applications using Docker, Kubernetes, or similar technologies.
- Develop and maintain APIs and microservices/domain services as part of the data ecosystem.
- Participate in continuous integration and continuous deployment (CI/CD) processes using Jenkins or similar tools.
- Optimize and tune the performance of Big Data applications and databases (both relational and NoSQL).
- Collaborate with data architects, data engineers, and business stakeholders to deliver end-to-end data solutions.
- Ensure best practices in data security, quality, and governance are followed.
Must-Have Skills:
- Proficiency with Big Data frameworks and programming using PySpark and Java/Scala.
- Experience designing and building data pipelines for large-scale data processing.
- Solid knowledge of distributed data systems and best practices in performance optimization.

Preferred Skills:
- Experience with AWS services (EMR, S3, Glue, Airflow, RDS, DynamoDB, or similar).
- Familiarity with container orchestration tools (Docker, Kubernetes, or similar).
- Knowledge of CI/CD pipelines (e.g., Jenkins or similar tools).
- Hands-on experience with relational databases and SQL.
- Experience with NoSQL databases (e.g., MongoDB, Cassandra, DynamoDB).
- Exposure to microservices or API gateway frameworks.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 5+ years of experience in Big Data development.
- Strong analytical, problem-solving, and communication skills.
- Experience working in an Agile environment is a plus.
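The batch pipelines this role describes follow the classic map/shuffle/reduce shape of a PySpark job. As an illustrative stand-in (standard library only, since a Spark cluster isn't assumed here), the same grouped aggregation as `rdd.map(...).reduceByKey(...)` over toy order records:

```python
# Illustrative stand-in for a PySpark batch aggregation, written with the
# standard library so it runs anywhere. The order records are made up.
from collections import defaultdict

orders = [
    {"region": "south", "amount": 120.0},
    {"region": "north", "amount": 75.5},
    {"region": "south", "amount": 30.0},
]

def total_by_region(records):
    """Group order amounts by region and sum them (a reduceByKey in spirit)."""
    totals = defaultdict(float)
    for rec in records:                          # "map": emit (region, amount)
        totals[rec["region"]] += rec["amount"]   # "shuffle + reduce" per key
    return dict(totals)

if __name__ == "__main__":
    print(total_by_region(orders))
```

In Spark the same logic distributes across executors; the per-key reduce is what keeps the shuffle cheap.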

Azure Data Engineer | Hyderabad | 5-10 years | INR 10.0-20.0 Lacs P.A. | Remote | Full Time

We are seeking a skilled Azure Data Engineer with strong Power BI capabilities to design, build, and maintain enterprise data lakes on Azure, ingest data from diverse sources, and develop insightful reports and dashboards. This role requires hands-on experience in Azure data services, ETL processes, and BI visualization to support data-driven decision-making.

Key Responsibilities:
- Design and implement end-to-end data pipelines using Azure Data Factory (ADF) for batch ingestion from various enterprise sources.
- Build and maintain a multi-zone Medallion Architecture data lake in Azure Data Lake Storage Gen2 (ADLS Gen2), including raw staging with metadata tracking, silver-layer transformations (cleansing, enrichment, schema standardization), and gold-layer curation (joins, aggregations).
- Perform data processing and transformations using Azure Databricks (PySpark/SQL) and ADF, ensuring data lineage, traceability, and compliance.
- Integrate data governance and security using Databricks Unity Catalog, Azure Active Directory (Azure AD), Role-Based Access Control (RBAC), and Access Control Lists (ACLs) for fine-grained access.
- Develop and optimize analytical reports and dashboards in Power BI, including KPI identification, custom visuals, responsive designs, and export to Excel/Word.
- Conduct data modeling, mapping, and extraction during discovery phases, aligning with functional requirements for enterprise analytics.
- Collaborate with cross-functional teams to define schemas, handle API-based ingestion (REST/OData), and implement audit trails, logging, and compliance with data protection policies.
- Participate in testing (unit, integration, performance), UAT support, and production deployment, ensuring high availability and scalability.
- Create training content and provide knowledge transfer on the data lake implementation and Power BI usage.
- Monitor and troubleshoot pipelines, optimizing for batch processing efficiency and data quality.
Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- 5+ years of experience in data engineering, with at least 3 years focused on Azure cloud services.
- Proven expertise in Azure Data Factory (ADF) for ETL/orchestration, Azure Data Lake Storage Gen2 (ADLS Gen2) for data lake management, and Azure Databricks for Spark-based transformations.
- Strong proficiency in Power BI for report and dashboard development, including DAX, custom visuals, data modeling, and integration with Azure data sources (e.g., DirectQuery or Import modes).
- Hands-on experience with Medallion Architecture (raw/silver/gold layers), data wrangling, and multi-source joins.
- Familiarity with API ingestion (REST, OData) from enterprise systems.
- Solid understanding of data governance tools like Databricks Unity Catalog, Azure AD for authentication, and RBAC/ACLs for security.
- Proficiency in SQL, PySpark, and data modeling techniques for dimensional and analytical schemas.
- Experience with agile methodologies and the ability to deliver phased outcomes.

Preferred Skills:
- Certifications such as Microsoft Certified: Azure Data Engineer Associate (DP-203) or Power BI Data Analyst Associate (PL-300).
- Knowledge of Azure Synapse Analytics, Azure Monitor for logging, and integration with hybrid/on-premises sources.
- Experience in domains like energy, mobility, or enterprise analytics, with exposure to moderate data volumes.
- Strong problem-solving skills, including handling rate limits, pagination, and dynamic data in APIs.
- Familiarity with tools like Azure DevOps for CI/CD and version control of pipelines/notebooks.

What We Offer:
- Opportunity to work on cutting-edge data transformation projects.
- Competitive salary and benefits package.
- Collaborative environment with access to advanced Azure tools and training.
- Flexible work arrangements and professional growth opportunities.
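The Medallion flow this posting describes (raw staging, silver cleansing, gold aggregation) can be sketched in a few lines. Plain Python stands in for Databricks/PySpark here, and the field names are hypothetical:

```python
# Minimal sketch of the raw -> silver -> gold medallion flow, with plain
# Python in place of Databricks/PySpark. Records and fields are invented.

raw = [  # bronze/raw zone: ingested as-is, possibly dirty
    {"id": "1", "city": " Hyderabad ", "kwh": "12.5"},
    {"id": "2", "city": "hyderabad", "kwh": "7.5"},
    {"id": "2", "city": "hyderabad", "kwh": "7.5"},  # duplicate row
]

def to_silver(rows):
    """Cleanse and standardize: trim/case-fold city, cast kwh, deduplicate on id."""
    seen, silver = set(), []
    for r in rows:
        if r["id"] in seen:
            continue
        seen.add(r["id"])
        silver.append({"id": r["id"],
                       "city": r["city"].strip().title(),
                       "kwh": float(r["kwh"])})
    return silver

def to_gold(rows):
    """Curate: aggregate consumption per city for reporting."""
    gold = {}
    for r in rows:
        gold[r["city"]] = gold.get(r["city"], 0.0) + r["kwh"]
    return gold

if __name__ == "__main__":
    print(to_gold(to_silver(raw)))
```

In the real pipeline each layer is a Delta table and ADF/Databricks handles orchestration; the transformations keep the same shape.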
If you are a proactive engineer passionate about building scalable data solutions and delivering actionable insights, apply now.

Cognos PowerBI Migration Developer | Hyderabad | 6-10 years | INR 12.0-18.0 Lacs P.A. | Work from Office | Full Time

Responsibilities:
* Design, develop, test & maintain Cognos/Power BI solutions.
* Collaborate with stakeholders on data requirements & reporting needs.
* Ensure data accuracy, security & compliance standards are met.

Cognos to Power BI Migration Developer | Hyderabad | 5-10 years | INR 12.0-20.0 Lacs P.A. | Remote | Full Time

*****Apply only if you are an immediate joiner*****

We're seeking a seasoned BI Migration Developer to join our Analytics Engineering team and lead end-to-end migrations from IBM Cognos to Microsoft Power BI. You'll combine deep hands-on expertise in both platforms with our in-house Migration Accelerator, a custom automation toolkit, to streamline metadata extraction, transformation, and report/model conversion at scale.

Metadata Extraction & Analysis
- Use the Cognos SDK/REST APIs and Framework Manager packages to extract data-source schemas, joins, calculations, and report definitions.
- Profile and catalogue Cognos metadata (query subjects, data modules, prompt pages, conditional styles).

Model Design & Build
- Translate Cognos relational models into Power BI's tabular semantic layer: define tables, relationships, hierarchies, and role-level security.
- Author DAX measures and Power Query (M) transformations that replicate or enhance Cognos calculations and filters.

Automation & Tooling
- Leverage and extend our Migration Accelerator to automate up to 50% of mapping and deployment tasks.
- Develop custom connectors or scripts (Python/.NET) that plug into the accelerator for bespoke Cognos tasks or unsupported metadata scenarios.
- Integrate the accelerator toolkit into CI/CD pipelines (Azure DevOps or GitHub Actions) for repeatable, auditable migrations.

Report & Dashboard Recreation
- Rebuild Cognos report layouts in Power BI Desktop and Power BI Paginated Reports where pixel-perfect output is required.
- Adapt interactive dashboards using bookmarks, drill-throughs, and custom visuals to match or exceed Cognos functionality.

Quality Assurance & Validation
- Design automated test suites that compare row counts, aggregations, and visualization outputs between Cognos and Power BI.
- Drive user acceptance testing (UAT) with business stakeholders, capture feedback, and iterate quickly.
Performance Tuning
- Optimize data models for large-scale datasets: implement incremental refresh, aggregation tables, composite models, and performance best practices in Power BI Premium.

Documentation & Knowledge Transfer
- Produce clear runbooks, mapping guides, and "how-to" documentation for both the migration tool and manual processes.
- Train BI developers and power users on the migration accelerator, migration patterns, and Power BI adoption best practices.

IBM Cognos Expertise
- Deep experience with Framework Manager, Report Studio, Cognos Analytics REST/SDK, and package/model design.
- Familiarity with complex query subjects, namespaces, macros, and prompt pages.

Microsoft Power BI Mastery
- Advanced DAX (time intelligence, virtual tables), Power Query (M) transformations, and semantic modeling.
- Experience with Power BI Premium features: incremental refresh, XMLA endpoints, and Paginated Reports.

Automation & Scripting
- Proficient in Python or C# for API integrations, metadata parsing, and custom tool development.
- Comfortable building and extending automation frameworks or SDKs.

Data Engineering & ETL
- Strong SQL skills (SQL Server, Azure Synapse) and ETL orchestration (Azure Data Factory or equivalent).
- Knowledge of data warehouse schemas (star, snowflake) and dimensional modeling.

DevOps & CI/CD
- Hands-on with Git, Azure DevOps/GitHub Actions pipelines, and deploying Power BI artifacts via the Power BI REST API or Tabular Editor CLI.

Soft Skills
- Excellent analytical abilities, meticulous attention to detail, and strong written/verbal communication.
- Proven track record of collaborating with cross-functional teams and driving technical workshops.
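The validation step above (comparing row counts and aggregations between Cognos and Power BI) reduces to diffing two keyed result sets. A minimal sketch, with hypothetical fixture data standing in for the two exports:

```python
# Sketch of a row-count / aggregation parity check between a Cognos export
# and the same query re-run in Power BI. The two result sets are made-up
# fixtures of (key, value) pairs, e.g. (year, total revenue).
import math

cognos_rows = [("2023", 1250.0), ("2024", 990.5)]
powerbi_rows = [("2023", 1250.0), ("2024", 990.5)]

def parity_report(a, b, tol=1e-6):
    """Return (row_count_match, mismatched_keys) for two (key, value) sets."""
    da, db = dict(a), dict(b)
    mismatches = [k for k in da.keys() | db.keys()
                  if k not in da or k not in db
                  or not math.isclose(da[k], db[k], abs_tol=tol)]
    return len(a) == len(b), sorted(mismatches)

if __name__ == "__main__":
    print(parity_report(cognos_rows, powerbi_rows))
```

A real suite would run this per report and per aggregation level, with the tolerance set to the source system's rounding behavior.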

Python Automation Architect | Hyderabad | 13-17 years | INR 20.0-30.0 Lacs P.A. | Remote | Full Time

We are seeking a seasoned Python Architect to lead the architecture, design, and development of scalable, high-performance enterprise and SaaS solutions. The ideal candidate will bring extensive technical expertise, hands-on Python development skills, and proven experience in designing modern, cloud-native architectures. This role requires strong leadership, cross-functional collaboration, and mentoring capabilities.

Responsibilities:
- Architectural Leadership: Define and own the overall technical architecture for complex applications, ensuring scalability, reliability, and security.
- Software Design & Development: Drive high-quality software design patterns, code reviews, and best practices across development teams.
- Python & Framework Expertise: Architect and implement backend services using Python and Django/Flask, with optimized APIs and reusable components.
- Cloud Architecture: Design and deploy secure, scalable cloud solutions on Azure (AWS experience is a plus).
- Containerization & DevOps: Implement Docker-based deployments, CI/CD pipelines, and infrastructure automation.
- Web & UI Integration: Collaborate with frontend engineers on React integration to ensure seamless user experiences.
- SaaS Integration: Architect and integrate multi-tenant SaaS platforms, ensuring smooth data flows and third-party system interoperability.
- Test Automation: Establish and enforce test automation frameworks to improve quality, reduce regression issues, and accelerate delivery.
- AI for Code Optimization: Leverage AI-powered tools for code optimization, refactoring, and improving development productivity.
- Documentation: Prepare design documents, sequence diagrams, and technical specifications.
- Mentorship & Collaboration: Provide technical guidance to teams, partner with product managers, and communicate effectively with stakeholders.
Preferred candidate profile:
- Experience: 12+ years in software engineering, with at least 5 years in an architectural/technical leadership role.
- Programming & Backend: Deep expertise in Python, Django, Flask, RESTful API design, and scalable backend architectures.
- Software Design: Proven experience applying architectural patterns, microservices design, and reusable frameworks.
- Cloud & DevOps: Hands-on experience with Azure, CI/CD pipelines, Docker, and infrastructure as code (IaC).
- UI & Frontend Integration: Understanding of React-based frontend applications and their integration with Python backends.
- Testing & Automation: Experience designing and implementing test automation frameworks.
- SaaS Systems: Strong background in SaaS platform design and integration, including multi-tenant solutions.
- AI in Development: Practical exposure to AI tools for code optimization, automated refactoring, and quality improvements.
- Databases: Strong skills in SQL and NoSQL, schema design, and performance tuning.
- Soft Skills: Excellent problem-solving, communication, and leadership skills.
- Container orchestration (Kubernetes) and API gateway experience.
- Exposure to data pipelines, ML/AI integrations, or event-driven architectures.
- TOGAF, Azure Solutions Architect, or similar certifications.

MicroStrategy to Power BI Migration Developer | Hyderabad | 5-10 years | INR 12.0-16.0 Lacs P.A. | Remote | Full Time

Responsibilities:
- Assess and document existing MicroStrategy reports, dashboards, metrics, and data models to create a prioritized migration inventory, utilizing tools like MicroStrategy Enterprise Manager for usage analytics.
- Develop and execute a comprehensive migration strategy, including data mapping, transformation, and validation, to replicate MicroStrategy functionality in Power BI.
- Utilize automation tools (e.g., Power BI Dataflows, custom scripts, or third-party solutions like the Travinto Technologies code converter) to streamline code conversion, schema mapping, and report replication, reducing manual effort by up to 90%.
- Convert MicroStrategy metrics (e.g., level metrics with curly-bracket notation) into Power BI measures using DAX, ensuring accuracy and performance optimization.
- Optimize Power BI reports for enhanced visualization, user-friendliness, and scalability, leveraging Power BI's dynamic features to improve upon MicroStrategy's limitations.
- Conduct rigorous testing and quality assurance to ensure data accuracy, query performance, and error-free deployment of migrated reports.
- Collaborate with stakeholders to define success criteria, develop Proof of Concept (PoC) deliverables, and provide user training to facilitate Power BI adoption.
- Implement data governance practices, including row-level security and Azure Active Directory integration, to maintain data integrity post-migration.
- Document the migration process, challenges, and solutions to enable knowledge transfer and support post-migration maintenance.
- Provide ongoing support during the parallel-run phase, addressing user queries and ensuring a seamless transition from MicroStrategy to Power BI.

Preferred candidate profile:
- Metadata extraction and analysis: use the MicroStrategy SDK/REST APIs to extract data-source schemas, joins, calculations, and report definitions.
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 4+ years of experience in Business Intelligence, with at least 2 years working on MicroStrategy and Power BI platforms.
- Proven experience in MicroStrategy to Power BI migration projects, including handling complex metrics, cubes, and SDK implementations.
- Strong proficiency in automation tools and techniques, such as Power BI Dataflows, Python, SQL scripting, or third-party migration tools (e.g., Travinto's code converter).
- Expertise in DAX, Power Query, and Power BI's visualization capabilities for report optimization.
- Familiarity with MicroStrategy features like Command Manager scripts, LDAP integration, and Enterprise Manager for report analysis.
- Knowledge of Indian labor laws and compliance requirements (e.g., PF, ESI) is a plus for contract roles.
- Excellent problem-solving skills, attention to detail, and the ability to work under tight deadlines.
- Strong communication skills to engage with technical and non-technical stakeholders.
- Experience with cloud-based BI solutions and integration with Azure or other Microsoft ecosystems.
- Certification in Power BI (e.g., Microsoft Certified: Data Analyst Associate) or MicroStrategy.
- Familiarity with ETL processes and data warehousing concepts.
- Prior experience in the Indian IT services industry or with global clients.
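The metric-conversion responsibility above (curly-bracket level metrics to DAX measures) is the kind of repeatable pattern a migration script can template. A heavily simplified sketch: the single pattern handled, the `Sales` table name, and the ALLEXCEPT translation are all illustrative assumptions, since real level-metric semantics are richer than this.

```python
# Heavily simplified sketch of turning a MicroStrategy-style level metric,
# e.g. "Sum(Revenue) {~, Region}", into a DAX measure string. The table
# name 'Sales' and the single supported pattern are assumptions made for
# illustration; a real converter needs the full level-metric grammar.
import re

LEVEL_METRIC = re.compile(r"Sum\((?P<fact>\w+)\)\s*\{~,\s*(?P<level>\w+)\}")

def to_dax(expr: str, table: str = "Sales") -> str:
    m = LEVEL_METRIC.fullmatch(expr.strip())
    if not m:
        raise ValueError(f"unsupported metric: {expr}")
    fact, level = m["fact"], m["level"]
    # Pin the aggregation level by clearing all filters except the level attribute.
    return (f"CALCULATE(SUM({table}[{fact}]), "
            f"ALLEXCEPT({table}, {table}[{level}]))")

if __name__ == "__main__":
    print(to_dax("Sum(Revenue) {~, Region}"))
```

Unsupported expressions fail loudly rather than converting silently, which keeps the manual-review queue honest.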

Corporate Trainer - Business Intelligence (Tableau & Power BI) | Hyderabad | 10-16 years | INR 10.0-15.0 Lacs P.A. | Hybrid | Full Time

We are seeking an experienced Corporate Trainer with deep expertise in Business Intelligence (BI) tools, particularly Tableau and Microsoft Power BI, to design and deliver high-impact training programs for corporate clients. The ideal candidate will empower professionals across industries to leverage data visualization, analytics, and reporting for data-driven decision-making. This role involves creating customized curricula, facilitating interactive workshops, and ensuring participants gain practical skills in BI tools. As a trainer, you'll bridge technical knowledge with real-world applications, helping organizations unlock insights from their data.

Key Responsibilities:
- Training Delivery: Conduct engaging, hands-on training sessions (virtual and in-person) on Tableau and Power BI, covering topics like data connection, dashboard creation, advanced visualizations, DAX formulas (Power BI), calculated fields (Tableau), and interactive reporting. Tailor content to corporate audiences, from beginners to advanced users.
- Curriculum Development: Design and update training materials, including slides, exercises, case studies, and certification prep resources. Incorporate real-world business scenarios, such as sales analytics, financial reporting, or operational dashboards.
- Participant Assessment & Support: Evaluate trainee progress through quizzes, projects, and feedback sessions. Provide post-training support, such as Q&A sessions or mentoring, to ensure skill retention and application on the job.
- Tool Integration & Best Practices: Demonstrate integrations between Tableau/Power BI and other tools (e.g., SQL, Excel, Azure), while teaching data modeling, ETL processes, and performance optimization. Promote best practices in data governance, security, and ethical BI usage.
- Client Collaboration: Work with corporate clients to assess training needs, customize programs, and measure ROI through metrics like improved data literacy or faster reporting cycles.
- Stay Current: Keep abreast of updates in Tableau, Power BI, and broader BI trends (e.g., AI-driven analytics). Contribute to internal knowledge sharing or blog content on BI advancements.
- Administrative Tasks: Schedule sessions, track attendance, and prepare reports on training outcomes. Support sales teams by demoing training capabilities during pre-sales engagements.

Required Qualifications & Skills:
- Education: Bachelor's degree in Computer Science, Information Technology, Business Analytics, or a related field. Certifications in Tableau (e.g., Tableau Desktop Specialist/Certified Associate) and Power BI (e.g., Microsoft Certified: Power BI Data Analyst Associate) are highly preferred.
- Experience: 10+ years as a BI trainer, consultant, or analyst, with proven hands-on expertise in both Tableau and Power BI. Corporate training experience is essential; experience in diverse industries (e.g., finance, healthcare, retail) is a plus.
- Technical Skills:
  - Advanced proficiency in Tableau: data blending, LOD expressions, geospatial mapping, and publishing to Tableau Server/Online.
  - Advanced proficiency in Power BI: DAX, Power Query, data modeling, and integration with Power BI Service for sharing and collaboration.
  - Strong understanding of BI fundamentals: data warehousing, SQL querying, ETL processes, and visualization best practices.
  - Familiarity with complementary tools like Excel, SQL Server, or Python/R for data prep (bonus).
- Soft Skills: Excellent communication and presentation skills, with the ability to simplify complex concepts for non-technical audiences. Strong problem-solving, adaptability, and a passion for teaching. Experience with virtual training platforms (e.g., Zoom, Microsoft Teams) is required.
- Other: Willingness to travel (up to 20%) and work flexible hours to accommodate global clients.

Preferred Qualifications:
- Master's degree in Data Science or Business Analytics.
- Experience with other BI tools (e.g., QlikView, Looker) for comparative training.
- Background in adult learning principles or instructional design certifications (e.g., CPTD).

BI Migration (Tableau to Power BI) Developer | Hyderabad | 6-10 years | INR 15.0-22.5 Lacs P.A. | Remote | Full Time

Role Overview:
We are looking for an experienced Tableau to Power BI Migration Developer to lead and execute the migration of enterprise Tableau dashboards and reports to Power BI. The ideal candidate has strong expertise in both Tableau and Power BI, with a proven track record of delivering large-scale BI migration projects, including semantic layer alignment, visual redesign, performance optimization, and user training.

Responsibilities:
- Analyze existing Tableau dashboards/reports and source systems to define the migration strategy to Power BI.
- Perform detailed mapping of Tableau workbook components (data sources, dimensions, measures, calculations, LODs) to the Power BI semantic model.
- Redesign and rebuild reports and dashboards in Power BI, ensuring parity with or improvement upon the Tableau implementations.
- Optimize Power BI reports for performance, usability, and scalability.
- Collaborate with Data Engineering teams to ensure proper data integration and modeling in Power BI.
- Develop and maintain documentation on migration processes, transformations, and Power BI best practices.
- Conduct user acceptance testing (UAT) with business users and incorporate feedback.
- Support training and enablement sessions for end users transitioning from Tableau to Power BI.
- Identify and automate repeatable patterns to improve migration efficiency.
- Troubleshoot and resolve issues during the migration lifecycle.

Required Skills & Qualifications:
- 7 to 10 years of overall experience in Business Intelligence and Data Visualization.
- 6+ years of hands-on experience with Power BI: developing reports, dashboards, DAX calculations, Power Query transformations, and data models.
- 3+ years of hands-on experience with Tableau: developing dashboards, data extracts, calculated fields, LOD expressions, and advanced visualizations.
- Strong experience in Tableau to Power BI migrations or similar BI platform migrations.
- Solid understanding of data modeling principles (star schema, snowflake schema) and data visualization best practices.
- Proficiency in DAX, M (Power Query), and SQL.
- Experience working with enterprise data sources: data warehouses, data lakes, RDBMS, APIs.
- Familiarity with version control (Git) and CI/CD pipelines for Power BI artifacts is a plus.
- Strong analytical and problem-solving skills.
- Excellent communication and documentation skills.

Preferred Skills:
- Experience with BI migration automation tools.
- Experience working in Agile delivery environments.
- Experience with the Azure Data Platform or similar cloud BI architectures.
- Familiarity with Power BI service administration and workspace governance.

Lead Data Engineer (GCP) | Hyderabad | 7-12 years | INR 20.0-30.0 Lacs P.A. | Remote | Full Time

The Lead Data Engineer will build, optimize, and maintain high-throughput data pipelines on GCP, ensuring resilient imports, freshness monitoring, and automation readiness. This role leads engineering teams to deliver scalable data solutions.

Responsibilities:
- Develop and maintain large-scale ingestion and transformation pipelines on GCP.
- Implement monitoring and alerting for data freshness and resiliency.
- Optimize ETL/ELT using BigQuery, Dataflow, Pub/Sub, and Dataproc.
- Collaborate with AI teams to embed automation components.
- Drive best practices in DevOps, CI/CD, and testing.
- Mentor engineers and lead Agile delivery cycles.

Technical Skills:
- Proficiency in the GCP data stack (BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage).
- Strong programming skills in Python, SQL, PySpark, and Java/Scala.
- Experience with Airflow/Cloud Composer for orchestration.
- Knowledge of CI/CD with Cloud Build, Cloud Deploy, and Terraform for GCP.
- Hands-on experience with data quality frameworks and observability tools.
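The freshness monitoring called out above boils down to comparing each source's last successful load against its SLA. A minimal sketch, assuming made-up source names and SLAs; a production check would read pipeline metadata (e.g. from BigQuery or Cloud Monitoring) rather than an in-memory dict:

```python
# Sketch of a data-freshness check: flag any source whose last successful
# load is older than its freshness SLA. Sources and SLAs are invented.
from datetime import datetime, timedelta, timezone

def stale_sources(last_loaded: dict, slas: dict, now=None):
    """Return the sources whose data is older than their freshness SLA."""
    now = now or datetime.now(timezone.utc)
    return sorted(name for name, ts in last_loaded.items()
                  if now - ts > slas[name])

NOW = datetime(2024, 1, 2, 12, 0, tzinfo=timezone.utc)
last_loaded = {
    "orders":  NOW - timedelta(hours=2),   # fresh: within its 6h SLA
    "billing": NOW - timedelta(hours=30),  # stale: past its 24h SLA
}
slas = {"orders": timedelta(hours=6), "billing": timedelta(hours=24)}

if __name__ == "__main__":
    print(stale_sources(last_loaded, slas, now=NOW))
```

Anything the check returns would feed an alerting channel (Cloud Monitoring, PagerDuty, Slack) rather than a print statement.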

AI / LLM Specialist | Hyderabad | 10-16 years | INR 25.0-30.0 Lacs P.A. | Remote | Full Time

The AI/LLM Specialist will focus on developing, fine-tuning, and embedding large language models into GCP-based data processing and automation pipelines. This role ensures the accuracy, scalability, and cost efficiency of AI-driven automation.

Responsibilities:
- Train, fine-tune, and optimize LLMs for automation, mapping, and anomaly detection.
- Design prompt engineering, embeddings, and RAG strategies on GCP.
- Collaborate with engineers to embed LLMs into automation pipelines.
- Define evaluation frameworks for LLM accuracy and efficiency.
- Research emerging LLMs and frameworks for continuous innovation.
- Balance cost-performance tradeoffs for Vertex AI deployments.

Technical Skills:
- Deep expertise in LLMs (Vertex AI models, Hugging Face, open-source LLMs).
- Proficiency in Python, LangChain, Transformers, and vector databases (Vertex AI Matching Engine, Pinecone, FAISS).
- Knowledge of Vertex AI pipelines, MLflow, and Cloud AI services.
- Strong understanding of RAG pipelines, embeddings, and semantic search.
- Familiarity with fine-tuning, LoRA, and RLHF techniques on GCP.
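The retrieval half of the RAG pipelines mentioned above ranks documents by embedding similarity to the query. A toy illustration with hand-made 3-d vectors (real pipelines use learned embeddings and a vector store such as Vertex AI Matching Engine, Pinecone, or FAISS):

```python
# Minimal illustration of the retrieval step of RAG: rank documents by
# cosine similarity between embedding vectors. The 3-d vectors and doc
# ids below are hand-made for the example, not real embeddings.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

docs = {
    "invoice_faq": [0.9, 0.1, 0.0],
    "hiking_blog": [0.0, 0.2, 0.9],
}

def top_match(query_vec, corpus):
    """Return the doc id whose embedding is most similar to the query."""
    return max(corpus, key=lambda d: cosine(query_vec, corpus[d]))

if __name__ == "__main__":
    print(top_match([1.0, 0.0, 0.1], docs))
```

The retrieved documents are then stuffed into the LLM prompt as grounding context; everything upstream of that call is ordinary ranking code like this.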

Data Scientist | Bengaluru | 3-7 years | INR 20.0-25.0 Lacs P.A. | Work from Office | Full Time

We are seeking a Data Scientist to join the Forecasting & Planning Research team. The data scientist will be responsible for modeling complex problems, discovering insights, and identifying opportunities through the use of statistical, machine learning, algorithmic, data mining, and visualization techniques. They will also be responsible for moving models to a production environment and automating them with appropriate drift monitoring and model improvement processes. They will need to collaborate effectively with internal stakeholders and cross-functional teams to understand requirements, solve problems, create operational efficiencies, and deliver successfully against high organizational standards. Candidates should be able to apply a breadth of tools, data sources, and analytical techniques to answer a wide range of high-impact business questions and present the insights in a concise and effective manner. Additionally, candidates should be effective communicators capable of independently driving issues to resolution, communicating insights to non-technical audiences, and documenting artifacts. This is a high-impact role with goals that directly affect the bottom line of the business.

Key Deliverables:
• Development of accurate and reliable workforce forecasting models.
• Automate training and forecasting by deploying data and model pipelines on AWS cloud infrastructure with drift monitoring, with an emphasis on accuracy and speed.
• Build project launch impact estimate models using tools such as analytics, time series, probability, and deep learning.
• Reduce MAPE (forecasting error) across different risk functions.
• Automate data ingestion of project inputs from Excel into the forecasting models.
• Improve data quality through rigorous cleaning and transformation processes, and automate them.
• Clear documentation of code and artifacts.
• Actionable insights derived from data analysis to support strategic decisions.
• Experiment with the latest forecasting algorithms and processes to optimize the existing modeling infrastructure.

Qualifications Needed:
• 5+ years of data scientist experience, preferably with forecasting systems and operations research.
• 3+ years of experience with data querying languages (e.g., SQL) and scripting languages (e.g., Python).
• 3+ years of experience with machine learning/statistical modeling, data analysis tools and techniques, and the parameters that affect their performance.
• 3+ years of AWS cloud experience building end-to-end products: deploying, monitoring, and updating them using tools such as Amazon SageMaker pipelines and Docker containers.
• Experience applying theoretical models in an applied environment.
• Understanding of demand forecasting and its impact on operational capacity planning.
• Knowledge of time series models and deep learning for time series is an asset.
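The MAPE metric the deliverables reference is computed directly from its definition: the mean of |actual - forecast| / |actual| over the evaluation horizon, expressed as a percentage.

```python
# MAPE (mean absolute percentage error), the forecast-error metric the
# deliverables above target. Lower is better; actuals must be nonzero.

def mape(actuals, forecasts):
    """Return MAPE as a percentage over paired actual/forecast values."""
    terms = [abs(a - f) / abs(a) for a, f in zip(actuals, forecasts)]
    return 100.0 * sum(terms) / len(terms)

# e.g. mape([100, 200], [110, 190]) -> 7.5
```

Note that MAPE over-penalizes errors on small actuals and is undefined at zero, which is why demand-forecasting teams often track it alongside weighted variants.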
