Impetus Technologies is a global technology company focused on building innovative products and solutions across multiple industries including finance, healthcare, and telecommunications.
Bengaluru, Karnataka
Not disclosed
On-site
Not specified
Bangalore, Karnataka, India; Noida, Uttar Pradesh, India; Indore, Madhya Pradesh, India; Gurgaon, Haryana, India

Position Summary
As a BI Architect, you will be a core member of the BI Practice at Impetus. You will enable our clients to get real value out of their petabytes of data. You will work closely with client business stakeholders to create BI strategy, and to architect, design, and lead the implementation of end-to-end BI solutions. You will help business teams make a quantum leap in accessing key business metrics and improve customer experience across multiple client engagements.

Description of Role
- Gather, describe, and prioritize requirements with client business stakeholders
- Architect and design overall BI solutions, including logical and physical data models, ETL workflows, dashboards, and reports
- Create BI development, testing, and production deployment strategies
- Lead teams of BI leads and developers: translate designed solutions to them, review their work, and provide guidance
- Manage client communications and alignment on BI architecture and project plans
- Ensure high-quality BI solution deliveries
- Create and keep up to date the BI standards, guidelines, and best practices
- Create a skillset roadmap for the company and manage skills development across teams
- Be a thought leader in the industry: share knowledge, conduct training sessions, write whitepapers / case studies
- Contribute to pre-sales activities and acquire new projects

Skills Required: BI Architecture, BI Solutioning, ETL, Data Models

Skills / Requirements
- Experience architecting, designing, and implementing complex BI solutions in large-scale data management environments
- Expertise in at least two of the following BI tools - Power BI, Tableau, Qlik, Spotfire, QuickSight, Looker, MicroStrategy, SAP BO, Cognos - along with overall know-how of multiple BI tools
- Deep knowledge of BI architecture and data warehousing
- Experience working with high-volume databases and MPPs
- Experience with data preparation (e.g., data profiling, data cleansing, volume assessment, partitioning)
- Strong knowledge of at least one cloud provider - AWS / Azure / GCP
- Strong skills in cloud-based data intelligence platforms (Databricks / Snowflake)
- Strong skills in databases (Oracle / MySQL / DB2) and expertise in writing SQL queries
- Experience with and understanding of Generative AI implementation in BI tools
- Very well versed in HiveQL / Spark SQL / Impala SQL
- Working knowledge of scripting languages such as Perl, Shell, or Python is desirable
- Working knowledge of data lake and data lakehouse architecture
- Hands-on knowledge of an ETL tool
- Hands-on knowledge of enterprise repository, data modelling, data mapping, and data profiling tools
- Experience working on BI platform migration project(s)
- Good business understanding to build the formulae and calculations needed to recreate or migrate existing reports and dashboards
- Skills in administering Power BI, Tableau, or MicroStrategy servers for maximum efficiency and performance
- Skills in setting up infrastructure, including BI servers, sizing / capacity planning, and clustered deployment architecture, with the ability to provide deployment solutions
- Experience customizing / extending the default functionality of BI tools
- Experience working in multiple business domains (e.g., BFSI, healthcare, telecom) is desirable
- Experience with agile-based development
- Outstanding communication, problem-solving, and interpersonal skills
- Self-starter and resourceful, skilled in identifying and mitigating risks
- Out-of-the-box thinker

Experience: 14 to 16 years
Job Reference Number: 13100
Indore
INR 7.155 - 9.695 Lacs P.A.
On-site
Part Time
Indore, Madhya Pradesh, India; Noida, Uttar Pradesh, India; Bangalore, Karnataka, India

Primary Tool & Expertise: Power BI and semantic modelling
Key Project Focus: Leading the migration of legacy reporting systems (Cognos or MicroStrategy) to Power BI solutions.

Core Responsibilities
- Build and develop optimized semantic models, metrics, and complex reports and dashboards
- Work closely with business analysts / BI teams to help business teams drive improvement in key business metrics and customer experience
- Be responsible for timely, quality, and successful deliveries
- Share knowledge and experience within the team and with other groups in the organization
- Lead teams of BI engineers: translate designed solutions to them, review their work, and provide guidance
- Manage client communications and deliverables

Skills Required: Power BI, Semantic Modelling, DAX, Power Query, Power BI Service, Data Warehousing, Data Modeling, Data Visualization, SQL

Core BI Skills
- Power BI (semantic modelling, DAX, Power Query, Power BI Service)
- Data warehousing, data modeling, data visualization, SQL

Data Warehouse, Database & Querying
- Strong skills in databases (Oracle / MySQL / DB2 / Postgres) and expertise in writing SQL queries
- Experience with cloud-based data intelligence platforms (Databricks / Snowflake)
- Strong understanding of data warehousing and data modelling concepts and principles
- Strong skills and experience in creating semantic models in Power BI or similar tools

Additional BI & Data Skills (Good to Have)
- Certifications in Power BI and any data platform
- Experience with other tools such as MicroStrategy and Cognos
- Proven experience migrating existing BI solutions to Power BI or other modern BI platforms
- Experience with the broader Power Platform (Power Automate, Power Apps) to create integrated solutions
- Knowledge and experience with Power BI admin features such as versioning, usage reports, capacity planning, and creation of deployment pipelines
- Sound knowledge of various forms of data analysis and presentation methodologies
- Experience with formal project management methodologies
- Exposure to multiple BI tools is desirable
- Experience with Generative BI implementation
- Working knowledge of scripting languages such as Perl, Shell, or Python is desirable
- Exposure to one of the cloud providers - AWS / Azure / GCP

Soft Skills & Business Acumen
- Exposure to multiple business domains (e.g., Insurance, Reinsurance, Retail, BFSI, healthcare, telecom) is desirable
- Exposure to the complete SDLC
- Out-of-the-box thinker, not limited to the work done in past projects
- Capable of working as an individual contributor and within a team
- Good communication, problem-solving, and interpersonal skills
- Self-starter and resourceful, skilled in identifying and mitigating risks

Experience: 10 to 12 years
Job Reference Number: 13099
Bengaluru, Karnataka
INR Not disclosed
On-site
Not specified
Bangalore, Karnataka, India; Pune, Maharashtra, India; Noida, Uttar Pradesh, India

Job Overview
We are seeking a highly skilled AI/ML Architect with a strong foundation in Data Engineering and a proven ability to apply design thinking to complex business challenges. The ideal candidate will play a key role in shaping end-to-end AI/ML solutions, from data architecture to model deployment, while ensuring the design remains user-centric, scalable, and aligned with business objectives.

Key Responsibilities
- Lead the architecture and design of AI/ML solutions across multiple business domains.
- Collaborate with stakeholders to identify use cases and translate them into scalable, production-ready ML architectures.
- Leverage design thinking methodologies to drive innovative and user-centric solution design.
- Architect data pipelines and feature engineering processes for structured and unstructured data.
- Oversee the full ML lifecycle, including data preprocessing, model training, evaluation, deployment, and monitoring.
- Ensure best practices in MLOps, including CI/CD for ML, model governance, and retraining strategies.
- Collaborate with data scientists, engineers, and product teams to align architecture with business goals.
- Mentor and guide engineering and data science teams on solution design, performance optimization, and system integration.

Skills Required: AWS, machine learning, artificial intelligence, Python, production environments, deep learning, NLP
Experience: 10 to 15 years
Job Reference Number: 13029
Bengaluru, Karnataka
INR Not disclosed
On-site
Not specified
Bangalore, Karnataka, India; Gurgaon, Haryana, India; Indore, Madhya Pradesh, India

Job Title: Java + Bigdata Engineer
Company Name: Impetus Technologies

Job Description
Impetus Technologies is seeking a skilled Java + Bigdata Engineer to join our dynamic team. The ideal candidate will possess strong expertise in Java programming and hands-on experience with big data technologies.

Responsibilities
- Design, develop, and maintain robust, scalable big data applications using Java and related technologies.
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.
- Optimize application performance and scalability to handle large data sets effectively, and ensure data integrity throughout the data lifecycle.
- Implement data processing and analytics solutions using frameworks such as Apache Hadoop, Apache Spark, or similar tools.
- Participate in code reviews, debugging, and troubleshooting to ensure high-quality, maintainable code.
- Develop and maintain technical documentation covering application architecture, design, and deployment.
- Participate in Agile development processes, including sprint planning, backlog grooming, and daily stand-ups.
- Mentor junior engineers and provide technical guidance to ensure successful project delivery.
- Stay updated with the latest trends and advancements in big data technologies and Java development.

Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Strong proficiency in Java programming and experience with object-oriented design principles.
- Hands-on experience with big data technologies such as Hadoop, Spark, Kafka, or similar frameworks.
- Familiarity with cloud platforms and data storage solutions (AWS, Azure, etc.).
- Excellent problem-solving skills and a proactive approach to resolving technical challenges.
- Strong communication and interpersonal skills, with the ability to work collaboratively in a team-oriented environment.

At Impetus Technologies, we value innovation and encourage our employees to push boundaries. If you are a passionate Java + Bigdata Engineer looking to take your career to the next level, we invite you to apply and be part of our growing team.

Skills Required: Java, Spark, PySpark, Hive, microservices
Experience: 4 to 7 years
Job Reference Number: 13044
Bengaluru, Karnataka
INR Not disclosed
On-site
Not specified
Bengaluru, Karnataka, India; Indore, Madhya Pradesh, India; Pune, Maharashtra, India; Hyderabad, Telangana, India

Qualification
- Overall 10-18 years of data engineering experience, with a minimum of 4+ years of hands-on experience in Databricks.
- Proven hands-on experience as a Databricks Architect or in a similar role, with a deep understanding of the Databricks platform and its capabilities.
- Ready to travel onsite and work at client locations.

Responsibilities
- Analyze business requirements and translate them into technical specifications for data pipelines, data lakes, and analytical processes on the Databricks platform.
- Design and architect end-to-end data solutions, including data ingestion, storage, transformation, and presentation layers, to meet business needs and performance requirements.
- Lead the setup, configuration, and optimization of Databricks clusters, workspaces, and jobs to ensure the platform operates efficiently and meets performance benchmarks.
- Manage access controls and security configurations to ensure data privacy and compliance.
- Design and implement data integration processes, ETL workflows, and data pipelines to extract, transform, and load data from various sources into the Databricks platform; optimize ETL processes for high data quality and low latency.
- Monitor and optimize query and overall platform performance; identify and resolve performance bottlenecks in the Databricks environment.
- Establish and enforce best practices, standards, and guidelines for Databricks development, ensuring data quality, consistency, and maintainability; implement data governance and data lineage processes to ensure data accuracy and traceability.
- Mentor and train team members on Databricks best practices, features, and capabilities; conduct knowledge-sharing sessions and workshops to foster a data-driven culture within the organization.
- Take responsibility for Databricks Practice technical and partnership initiatives, and build skills in technical areas that support the deployment and integration of Databricks-based solutions for customer projects.

Skills Required: Databricks, Unity Catalog, PySpark, ETL, SQL, Delta Live Tables

Role / Requirements
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- In-depth, hands-on implementation knowledge of Databricks: Delta Lake, managing Delta tables, cluster configuration, and cluster policies.
- Experience handling structured and unstructured datasets.
- Strong proficiency in programming languages such as Python, Scala, or SQL.
- Experience with cloud platforms (AWS, Azure, or Google Cloud) and understanding of cloud-based data storage and computing services.
- Familiarity with big data technologies such as Apache Spark, Hadoop, and data lake architectures.
- Ability to develop and maintain data pipelines, ETL workflows, and analytical processes on the Databricks platform.
- Good experience with both batch and streaming data engineering in Databricks.
- Good experience creating workflows and scheduling pipelines.
- Good exposure to making packages and libraries available in Databricks, and familiarity with default Databricks runtimes.
- Databricks Certified Data Engineer Associate/Professional certification (desirable).
- Experience working in Agile methodology.
- Strong verbal and written communication skills.
- Strong analytical and problem-solving skills with high attention to detail.

Experience: 10 to 18 years
Job Reference Number: 12932
Bengaluru, Karnataka
INR Not disclosed
On-site
Not specified
Bangalore, Karnataka, India; Gurgaon, Haryana, India; Indore, Madhya Pradesh, India

Core Requirements
- 5 to 10 years of experience in C# and .NET
- Practical experience with Azure and cloud-based solutions
- Proficiency in the MVC framework
- Excellent communication skills, with the ability to collaborate directly with client stakeholders
- A software engineering mindset, demonstrating innovation, problem-solving, and value addition to the team and clients
- Availability to join within 2 weeks

Skills Required: .NET, Azure
Experience: 5 to 10 years
Job Reference Number: 13054
Bengaluru, Karnataka
INR Not disclosed
On-site
Not specified
Bengaluru, Karnataka, India; Noida, Uttar Pradesh, India; Indore, Madhya Pradesh, India

Qualification
- Exceptional skills in evolving technologies and in writing highly efficient algorithms for complex data structures
- Strong in core Java (exception handling, collections, multi-threading)
- Good mathematics and problem-solving skills
- Experience in the use of design patterns
- Good to have: exposure to microservices architecture, Linux, and AWS/GCP/Azure

Role
- Lead a small but highly efficient team
- Perform low-level design and develop the code base
- Collaborate with the team in release execution cycles and independently manage the implementation of assigned modules
- Innovate solutions for complex challenges in the big data and analytics domain
- Identify architectural gaps and weaknesses and recommend appropriate solutions

Experience: 6 to 10 years
Job Reference Number: 10709
Bengaluru, Karnataka
INR Not disclosed
On-site
Not specified
Bangalore, Karnataka, India; Noida, Uttar Pradesh, India; Pune, Maharashtra, India

Qualification
- Strong verbal and written communication skills
- Excellent analytical and problem-solving skills
- Hands-on experience and good understanding of DevOps tools and automation frameworks
- Demonstrated hands-on experience building continuous integration solutions using Jenkins, Git, Maven, and Nexus
- Experience with containerization and orchestration using Docker, Docker Compose, and Kubernetes
- Experience with configuration management tools such as Ansible, Chef, or Puppet
- Proficiency in at least one scripting language: Batch/PowerShell scripting, shell scripting, or NSIS scripting
- Experience working in UNIX, Linux, and/or Windows environments
- Hands-on experience with cloud technology (AWS/Azure/GCP)
- Experience managing web-based application environments on AWS, including services such as EC2, ELB, RDS, and S3
- Good knowledge of cloud concepts such as networking, security, monitoring, and instances: DNS, load balancers, reverse proxies (Nginx, HAProxy, Apache Proxy)
- Familiarity with monitoring tools such as CloudWatch, Nagios, Prometheus, and Grafana
- Hands-on experience in cloud deployment and automation using CloudFormation, Terraform, etc.
- Experience working with Java-based applications: Java/J2EE, Spring, web services
- Build and release management: Maven, Nexus
- Maintain Linux-based systems for optimal performance and stability
- Collaborate with development, operations, and QA teams to support deployment and release workflows
- Identify, implement, and promote DevOps best practices, focusing on automation and Infrastructure as Code (IaC)
- Proactively troubleshoot and resolve issues across the CI/CD pipeline, infrastructure, and applications
- Research and propose new technologies and improvements to enhance platform reliability, performance, and security

Skills Required: DevOps, Kubernetes, CI/CD pipelines, Jenkins, Ansible, Terraform, AWS

Role
- Docker: strong hands-on experience, including containerization best practices and orchestration
- Jenkins/Ansible: expertise in building and managing CI/CD pipelines and deploying applications using configuration management tools
- Linux/Windows OS: proficiency in Linux system administration and shell scripting; working experience with Windows OS is good to have
- Solr & Kafka: experience deploying and managing distributed search and messaging systems is good to have
- Kubernetes (K8s): basic to intermediate understanding of Kubernetes for application scaling and orchestration is good to have
- Soft skills: strong communication skills, a problem-solving mindset, and the ability to work independently as well as in a collaborative team environment
- Leadership: experience leading DevOps initiatives, mentoring team members, and working cross-functionally with other engineering teams

Experience: 9 to 15 years
Job Reference Number: 12637
Noida, Indore
INR 10.0 - 14.0 Lacs P.A.
Work from Office
Full Time
Primary Tool & Expertise: Power BI and semantic modelling
Key Project Focus: Leading the migration of legacy reporting systems (Cognos or MicroStrategy) to Power BI solutions.

Core Responsibilities
- Build and develop optimized semantic models, metrics, and complex reports and dashboards
- Work closely with business analysts / BI teams to help business teams drive improvement in key business metrics and customer experience
- Be responsible for timely, quality, and successful deliveries
- Share knowledge and experience within the team and with other groups in the organization
- Lead teams of BI engineers: translate designed solutions to them, review their work, and provide guidance
- Manage client communications and deliverables

Core BI Skills
- Power BI (semantic modelling, DAX, Power Query, Power BI Service)
- Data warehousing, data modeling, data visualization, SQL

Data Warehouse, Database & Querying
- Strong skills in databases (Oracle / MySQL / DB2 / Postgres) and expertise in writing SQL queries
- Experience with cloud-based data intelligence platforms (Databricks / Snowflake)
- Strong understanding of data warehousing and data modelling concepts and principles
- Strong skills and experience in creating semantic models in Power BI or similar tools

Additional BI & Data Skills (Good to Have)
- Certifications in Power BI and any data platform
- Experience with other tools such as MicroStrategy and Cognos
- Proven experience migrating existing BI solutions to Power BI or other modern BI platforms
- Experience with the broader Power Platform (Power Automate, Power Apps) to create integrated solutions
- Knowledge and experience with Power BI admin features such as versioning, usage reports, capacity planning, and creation of deployment pipelines
- Sound knowledge of various forms of data analysis and presentation methodologies
- Experience with formal project management methodologies
- Exposure to multiple BI tools is desirable
- Experience with Generative BI implementation
- Working knowledge of scripting languages such as Perl, Shell, or Python is desirable
- Exposure to one of the cloud providers - AWS / Azure / GCP

Soft Skills & Business Acumen
- Exposure to multiple business domains (e.g., Insurance, Reinsurance, Retail, BFSI, healthcare, telecom) is desirable
- Exposure to the complete SDLC
- Out-of-the-box thinker, not limited to the work done in past projects
- Capable of working as an individual contributor and within a team
- Good communication, problem-solving, and interpersonal skills
- Self-starter and resourceful, skilled in identifying and mitigating risks
Noida, Indore
INR 14.0 - 19.0 Lacs P.A.
Work from Office
Full Time
Position Summary
As a BI Architect, you will be a core member of the BI Practice at Impetus. You will enable our clients to get real value out of their petabytes of data. You will work closely with client business stakeholders to create BI strategy, and to architect, design, and lead the implementation of end-to-end BI solutions. You will help business teams make a quantum leap in accessing key business metrics and improve customer experience across multiple client engagements.

Description of Role
- Gather, describe, and prioritize requirements with client business stakeholders
- Architect and design overall BI solutions, including logical and physical data models, ETL workflows, dashboards, and reports
- Create BI development, testing, and production deployment strategies
- Lead teams of BI leads and developers: translate designed solutions to them, review their work, and provide guidance
- Manage client communications and alignment on BI architecture and project plans
- Ensure high-quality BI solution deliveries
- Create and keep up to date the BI standards, guidelines, and best practices
- Create a skillset roadmap for the company and manage skills development across teams
- Be a thought leader in the industry: share knowledge, conduct training sessions, write whitepapers / case studies
- Contribute to pre-sales activities and acquire new projects

Skills / Requirements
- Experience architecting, designing, and implementing complex BI solutions in large-scale data management environments
- Expertise in at least two of the following BI tools - Power BI, Tableau, Qlik, Spotfire, QuickSight, Looker, MicroStrategy, SAP BO, Cognos - along with overall know-how of multiple BI tools
- Deep knowledge of BI architecture and data warehousing
- Experience working with high-volume databases and MPPs
- Experience with data preparation (e.g., data profiling, data cleansing, volume assessment, partitioning)
- Strong knowledge of at least one cloud provider - AWS / Azure / GCP
- Strong skills in cloud-based data intelligence platforms (Databricks / Snowflake)
- Strong skills in databases (Oracle / MySQL / DB2) and expertise in writing SQL queries
- Experience with and understanding of Generative AI implementation in BI tools
- Very well versed in HiveQL / Spark SQL / Impala SQL
- Working knowledge of scripting languages such as Perl, Shell, or Python is desirable
- Working knowledge of data lake and data lakehouse architecture
- Hands-on knowledge of an ETL tool
- Hands-on knowledge of enterprise repository, data modelling, data mapping, and data profiling tools
- Experience working on BI platform migration project(s)
- Good business understanding to build the formulae and calculations needed to recreate or migrate existing reports and dashboards
- Skills in administering Power BI, Tableau, or MicroStrategy servers for maximum efficiency and performance
- Skills in setting up infrastructure, including BI servers, sizing / capacity planning, and clustered deployment architecture, with the ability to provide deployment solutions
- Experience customizing / extending the default functionality of BI tools
- Experience working in multiple business domains (e.g., BFSI, healthcare, telecom) is desirable
- Experience with agile-based development
- Outstanding communication, problem-solving, and interpersonal skills
- Self-starter and resourceful, skilled in identifying and mitigating risks
- Out-of-the-box thinker
Information Technology and Services
1001-5000 Employees
152 Jobs