4 Job Openings at Innovantes IT
About Innovantes IT

Innovantes IT is a technology consulting firm specializing in software development, IT services, and digital transformation solutions. The firm focuses on creating innovative strategies that help clients enhance efficiency and productivity through technology.

Project Manager

Chandigarh

8 - 10 years

INR 12.0 - 15.0 Lacs P.A.

Work from Office

Full Time

About Innovantes
Innovantes IT Solutions LLP is a digital transformation partner specializing in custom software development, advanced analytics, Power BI solutions, and AI-driven applications for automotive OEMs, healthcare providers, and enterprises worldwide. We foster a culture of innovation, collaboration, and continuous learning.

Role Overview
As a Project Manager at Innovantes, you will drive the successful delivery of software and analytics projects for domestic and international enterprise clients. You will partner closely with clients, internal development teams, and senior leadership to ensure on-time, on-budget delivery and to support pre-sales activities and proof-of-concept (POC) engagements.

Key Responsibilities
- Lead end-to-end project delivery: planning, execution, monitoring, and closure
- Define project scope, objectives, deliverables, milestones, and success criteria
- Develop and maintain detailed project plans, resource allocations, and budgets
- Coordinate cross-functional teams (developers, QA, analytics, BI, DevOps) to meet project goals
- Communicate project status, risks, and issues to stakeholders and senior management
- Proactively identify and mitigate project risks; escalate when necessary
- Drive POCs: collaborate with product architects, tech leads, and clients to validate feasibility
- Support pre-sales efforts: contribute to solution scoping, estimates, and proposal development
- Establish and maintain strong client relationships; act as the primary point of contact
- Ensure adherence to quality standards, agile/Scrum processes, and best practices
- Mentor and guide junior project managers and coordinators

Required Qualifications
- Bachelor's degree in Computer Science, Engineering, Business, or a related field
- Minimum 8 years of project management experience in an IT services environment
- Proven track record of managing both domestic and international enterprise engagements
- Exceptional verbal and written communication skills; client-facing expertise
- Strong understanding of software development life cycles (SDLC) and agile frameworks
- Solid knowledge of data analytics concepts and BI tool implementations (e.g., Power BI)
- Hands-on programming experience (e.g., Python, Java, C#) to effectively liaise with development teams
- Demonstrated ability to lead POCs and pre-sales technical discussions

Preferred Skills & Attributes
- PMP, PRINCE2, or Scrum Master certification
- Experience with cloud platforms (AWS, Azure, GCP) and modern DevOps practices
- Familiarity with database technologies (SQL, NoSQL) and ETL/ELT processes
- Strong analytical mindset with a problem-solving orientation
- Ability to thrive in a fast-paced startup environment and drive multiple projects in parallel
- Proactive, collaborative, and customer-centric attitude

ETL Developer

Chandigarh

3 - 5 years

INR 6.0 - 8.0 Lacs P.A.

Work from Office

Full Time

Role Overview
We are seeking a talented ETL Engineer to design, implement, and maintain end-to-end data ingestion and transformation pipelines on Google Cloud Platform (GCP). You will collaborate closely with data architects, analysts, and BI developers to ensure high-quality, performant data delivery into BigQuery and the downstream Power BI reporting layers.

Key Responsibilities
Data Ingestion & Landing
- Architect and implement landing zones in Cloud Storage for raw data
- Manage buckets/objects and handle diverse file formats (Parquet, Avro, CSV, JSON, ORC)

ETL Pipeline Development
- Build and orchestrate extraction, transformation, and loading workflows using Cloud Data Fusion
- Leverage Data Fusion Wrangler for data cleansing, filtering, imputation, normalization, type conversion, splitting, joining, sorting, union, pivot/unpivot, and format adjustments

Data Modeling
- Design and maintain fact and dimension tables using Star and Snowflake schemas
- Collaborate on semantic layer definitions to support downstream reporting

Load & Orchestration
- Load curated datasets into BigQuery across different zones (raw, staging, curated)
- Develop SQL-based orchestration and transformation within BigQuery (scheduled queries, scripting); a minimal sketch of this flow follows this listing

Performance & Quality
- Optimize ETL jobs for throughput, cost, and reliability
- Implement monitoring, error handling, and data quality checks

Collaboration & Documentation
- Work with data analysts and BI developers to understand requirements and ensure data readiness for Power BI
- Maintain clear documentation of pipeline designs, data lineage, and operational runbooks

Required Skills & Experience
- Bachelor's degree in Computer Science, Engineering, or a related field
- 3+ years of hands-on experience building ETL pipelines in GCP
- Proficiency with Cloud Data Fusion, including Wrangler transformations
- Strong command of SQL, including performance tuning in BigQuery
- Experience managing Cloud Storage buckets and handling Parquet, Avro, CSV, JSON, and ORC formats
- Solid understanding of dimensional modeling: fact vs. dimension tables, Star and Snowflake schemas
- Familiarity with BigQuery data zones (raw, staging, curated) and dataset organization
- Experience with scheduling and orchestration tools (Cloud Composer, Airflow, or BigQuery scheduled queries)
- Excellent problem-solving skills and attention to detail

Preferred (Good to Have)
- Exposure to Power BI data modeling and DAX
- Experience with other GCP services (Dataflow, Dataproc)
- Familiarity with Git, CI/CD pipelines, and infrastructure as code (Terraform)
- Knowledge of Python for custom transformations or orchestration scripts
- Understanding of data governance best practices and metadata management
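
For candidates who want the Load & Orchestration flow above made concrete, here is a minimal sketch using the google-cloud-bigquery Python client. It is illustrative only: the project ID, bucket, datasets, and table and column names are hypothetical, and a production pipeline would more likely run the SQL step as a BigQuery scheduled query or a Cloud Data Fusion pipeline than as an ad-hoc script.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # hypothetical project ID

# 1) Load raw Parquet files from the Cloud Storage landing zone into the
#    raw zone of BigQuery.
load_job = client.load_table_from_uri(
    "gs://innovantes-landing/orders/*.parquet",  # hypothetical landing path
    "my-gcp-project.raw.orders",                 # hypothetical raw-zone table
    job_config=bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.PARQUET,
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    ),
)
load_job.result()  # block until the load job finishes

# 2) SQL-based transformation into the staging zone -- the kind of statement
#    a BigQuery scheduled query would run on a timer.
transform_sql = """
CREATE OR REPLACE TABLE `my-gcp-project.staging.orders` AS
SELECT
  order_id,
  CAST(order_ts AS TIMESTAMP) AS order_ts,       -- type conversion
  NULLIF(TRIM(customer_id), '') AS customer_id,  -- basic cleansing
  amount
FROM `my-gcp-project.raw.orders`
WHERE amount IS NOT NULL                         -- simple quality filter
"""
client.query(transform_sql).result()
```

In practice both steps would be scheduled and monitored by whichever orchestrator the team uses (Cloud Composer, Airflow, or BigQuery scheduled queries, as listed above).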

ETL & BI Developer

Chandigarh

3 - 5 years

INR 6.0 - 9.0 Lacs P.A.

Work from Office

Full Time

We're looking for a hands-on ETL & BI Engineer to design, build, and maintain robust data pipelines on Google Cloud Platform and turn that trusted data into compelling, actionable reports in Power BI. You'll partner with data architects, analysts, and BI developers to ensure timely delivery of clean, well-modeled data into BigQuery, and then translate it into high-impact dashboards and metrics.

Key Responsibilities
1. Data Ingestion & Landing
- Architect and manage landing zones in Cloud Storage for raw feeds
- Handle batch and streaming input in Parquet, Avro, CSV, JSON, ORC

2. ETL Pipeline Development
- Develop and orchestrate ETL workflows with Cloud Data Fusion (including Wrangler)
- Perform data cleansing, imputation, type conversions, joins/unions, pivots

3. Data Modeling & Semantic Layer
- Design star- and snowflake-schema fact and dimension tables in BigQuery (see the schema sketch after this listing)
- Define and document the semantic layer to support Power BI datasets

4. Load & Orchestration
- Load curated datasets into BigQuery zones (raw, staging, curated)
- Implement orchestration via scheduled queries, Cloud Composer/Airflow, or Terraform-driven pipelines

5. Performance, Quality & Monitoring
- Tune SQL queries and ETL jobs for throughput, cost-efficiency, and reliability
- Implement automated data-quality checks, logging, and alerting

Required Skills & Experience
- Bachelor's degree in Computer Science, Engineering, or a related field
- 3+ years building ETL pipelines on GCP (Cloud Data Fusion, Cloud Storage, BigQuery)
- Solid SQL expertise, including query optimization in BigQuery
- Strong grasp of dimensional modeling (star/snowflake schemas)
- Experience managing Cloud Storage buckets and handling diverse file formats
- Familiarity with orchestration tools (Cloud Composer, Airflow, or BigQuery scheduled queries)
- Excellent problem-solving skills, attention to detail, and a collaborative mindset

Preferred (Nice to Have)
- Experience with other GCP data services (Dataflow, Dataproc)
- Power BI skills: data modeling, report development, DAX calculations, and performance tuning
- Python scripting for custom transformations or orchestration
- Understanding of CI/CD best practices (Git, Terraform, deployment pipelines)
- Knowledge of data governance frameworks and metadata management
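
To make the dimensional-modeling responsibility concrete, here is a hedged star-schema sketch in BigQuery DDL, run through the same Python client as in the previous listing's sketch. Every dataset, table, and column name is hypothetical, and the partitioning and clustering choices are one reasonable option for report-style filtering, not a prescribed design.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # hypothetical project ID

ddl = """
-- Dimension: one row per customer, referenced by the fact table below.
CREATE TABLE IF NOT EXISTS `my-gcp-project.curated.dim_customer` (
  customer_key INT64 NOT NULL,  -- surrogate key
  customer_id  STRING,          -- natural/business key
  region       STRING,
  segment      STRING
);

-- Fact: one row per order line, keyed to the dimension. Partitioned by date
-- and clustered to suit common report filters.
CREATE TABLE IF NOT EXISTS `my-gcp-project.curated.fct_orders` (
  order_date   DATE NOT NULL,
  customer_key INT64 NOT NULL,  -- joins to dim_customer.customer_key
  order_id     STRING,
  quantity     INT64,
  amount       NUMERIC
)
PARTITION BY order_date
CLUSTER BY customer_key;
"""
client.query(ddl).result()  # BigQuery scripting runs both statements
```

Keeping dimensions narrow and facts append-only, with partitioning and clustering on the columns reports filter by, is a common way to meet the throughput and cost-efficiency goals named above.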

Data Engineer

Chandigarh

3 - 4 years

INR 5.0 - 7.0 Lacs P.A.

Work from Office

Full Time

Key Responsibilities
- Design, develop, and maintain scalable ETL workflows using Cloud Data Fusion and Apache Airflow (a minimal DAG sketch follows this listing)
- Configure and manage various data connectors (e.g., Cloud Storage, Pub/Sub, JDBC, SaaS APIs) for batch and streaming data ingestion
- Implement data transformations, cleansing, and enrichment logic in Python (and SQL) to meet analytic requirements
- Optimize BigQuery data models (fact/dimension tables, partitioning, clustering) for performance and cost-efficiency
- Monitor, troubleshoot, and tune pipeline performance; implement robust error-handling and alerting mechanisms
- Collaborate with data analysts, BI developers, and architects to understand data requirements and deliver accurate datasets
- Maintain documentation for data pipelines, schemas, and operational runbooks
- Ensure data security and governance best practices are followed across the data lifecycle

Minimum Qualifications
- 3+ years of hands-on experience in data engineering, with a focus on cloud-native ETL
- Proven expertise with Google Cloud Data Fusion, including pipeline authoring and custom plugin development
- Solid experience building and orchestrating pipelines in Apache Airflow (DAG design, operators, hooks)
- Strong Python programming skills for data manipulation and automation
- Deep understanding of BigQuery: schema design, SQL scripting, performance tuning, and cost management
- Familiarity with additional GCP services: Cloud Storage, Pub/Sub, Dataflow, and IAM
- Experience with version control (Git), CI/CD pipelines, and DevOps practices for data projects
- Excellent problem-solving skills, attention to detail, and the ability to work independently in a fast-paced environment
- Immediate availability to join

Preferred (Nice-to-Have)
- Experience with other data integration tools (e.g., Dataflow, Talend, Informatica)
- Knowledge of containerization (Docker, Kubernetes) for scalable data workloads
- Familiarity with streaming frameworks (Apache Beam, Spark Streaming)
- Background in data modeling methodologies (Star/Snowflake schemas)
- Exposure to metadata management, data cataloguing, and data governance frameworks
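
As a reference point for the Airflow expectations above, here is a minimal DAG sketch that wires a Cloud Storage ingest into a BigQuery transform using operators from the apache-airflow-providers-google package. It assumes Airflow 2.x with that provider installed; the DAG id, bucket, datasets, and schedule are hypothetical placeholders.

```python
import pendulum
from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_orders_etl",        # hypothetical DAG name
    schedule="0 2 * * *",             # daily at 02:00 UTC
    start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
    catchup=False,
) as dag:
    # Ingest the day's raw Parquet files from the landing bucket.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_orders",
        bucket="innovantes-landing",  # hypothetical bucket
        source_objects=["orders/*.parquet"],
        source_format="PARQUET",
        destination_project_dataset_table="raw.orders",
        write_disposition="WRITE_APPEND",
    )

    # Promote cleaned rows into the curated zone with a SQL job.
    transform = BigQueryInsertJobOperator(
        task_id="transform_to_curated",
        configuration={
            "query": {
                "query": (
                    "CREATE OR REPLACE TABLE `curated.orders` AS "
                    "SELECT order_id, order_date, customer_id, amount "
                    "FROM `raw.orders` WHERE amount IS NOT NULL"
                ),
                "useLegacySql": False,
            }
        },
    )

    load_raw >> transform  # transform runs only after ingestion succeeds
```

Cloud Composer, GCP's managed Airflow, can run a DAG like this unmodified, which is one common way teams pair Airflow with BigQuery.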

Innovantes IT

Information Technology and Services

Tech City

50-100 Employees

4 Jobs
