152 Job openings at Impetus Technologies
About Impetus Technologies

Impetus Technologies is a global technology company focused on building innovative products and solutions across multiple industries including finance, healthcare, and telecommunications.

Technical Support Engineer

Indore, Madhya Pradesh

0 - 6 years

INR Not disclosed

On-site

Not specified

Indore, Madhya Pradesh, India; Noida, Uttar Pradesh, India; Bangalore, Karnataka, India

Qualification:
Technology: PL-SQL, Databases, Reporting, BI, Web Servers, Java
Skills Required:
- Strong understanding of BI reporting and the related ecosystem; prior experience preferred
- In-depth understanding of databases
- Knowledge of and exposure to cloud deployment
- Experience working with Linux and Windows operating systems
- Experience supporting US-based customers
- Good verbal and written communication skills (English)
- Open to working in rotational shifts

Role: The role exercises the same technology stack and skills listed above.

Experience: 4 to 6 years
Job Reference Number: 10954

Pre-Sales Solution Engineer- India

Indore, Madhya Pradesh

0 - 6 years

INR Not disclosed

On-site

Not specified

Indore, Madhya Pradesh, India; Bangalore, Karnataka, India; Noida, Uttar Pradesh, India

Qualification (experience areas or skills):
- Pre-sales experience with software or analytics products
- Excellent verbal and written communication skills
- OLAP tools or Microsoft Analysis Services (MSAS)
- Data engineering, data warehousing, or ETL
- Hadoop ecosystem, or AWS, Azure, or GCP cluster and processing
- Tableau, MicroStrategy, or any BI tool
- HiveQL, Spark SQL, PL/SQL, or T-SQL
- Writing and troubleshooting SQL or MDX queries
- Working on Linux; programming in Python, Java, or JavaScript would be a plus
- Filling out RFPs or questionnaires from customers
- NDA, success criteria, project closure, and other documentation
- Willingness to travel or relocate as required

Role:
- Act as the main point of contact for customer contacts involved in the evaluation process
- Give product demonstrations to qualified leads, and in support of marketing activities such as events or webinars
- Own the RFP, NDA, PoC success criteria document, PoC closure, and other documents
- Secure alignment on process and documents with the customer/prospect
- Own the technical-win phases of all active opportunities
- Understand the customer domain and database schema
- Provide OLAP and reporting solutions
- Work closely with customers to understand and resolve environment, OLAP cube, or reporting issues
- Coordinate with the solutioning team to execute the PoC per the success plan
- Create enhancement requests or identify requests for new features on behalf of customers or hot prospects

Experience: 3 to 6 years
Job Reference Number: 10771

Senior Data Scientist

Indore, Madhya Pradesh

0 - 16 years

INR Not disclosed

On-site

Not specified

Indore, Madhya Pradesh, India; Hyderabad, Telangana, India

Qualification:
- Define and manage the analytics strategy
- Provide analytical expertise in model development, refinement, and implementation across a variety of analytics problems
- Oversee a team of data scientists, de-bottlenecking issues related to project execution
- Work closely with business and product management teams to develop and implement analytics solutions
- Collaborate with data engineers and architects to implement and deploy scalable solutions
- Communicate results to diverse technical and non-technical audiences
- Design accurate and scalable prediction algorithms
- Generate actionable insights for business improvements
- Ability to understand business requirements
- Use-case derivation and solution creation from structured/unstructured data
- Storytelling, business communication, and documentation
- Actively drive a culture of knowledge-building and sharing within the team
- Encourage continuous innovation and out-of-the-box thinking

Skills Required: R, Python, TensorFlow, PyTorch, Deep Learning Algorithms, Image/Video Analytics

Role:
- R, Python, Scikit-Learn, TensorFlow, PyTorch
- Exploratory data analysis
- Machine learning and deep learning algorithms
- Model building, hyperparameter tuning, and model performance metrics
- MLOps, data pipelines, data engineering
- Statistics knowledge (probability distributions, hypothesis testing)
- Time series modeling, forecasting, image/video analytics, and natural language processing (NLP)

Experience: 12 to 16 years
Job Reference Number: 11688

Senior Technical Project Manager

Indore, Madhya Pradesh

0 - 20 years

INR Not disclosed

On-site

Not specified

Indore, Madhya Pradesh, India; Bengaluru, Karnataka, India; Pune, Maharashtra, India; Hyderabad, Telangana, India; Noida, Uttar Pradesh, India

Qualification:
- 15+ years of experience managing and implementing high-end software products
- Expertise in Java/J2EE, EDW/SQL, or Hadoop/Hive/Spark, preferably hands-on
- Good knowledge of any of the clouds (AWS/Azure/GCP) (Must Have)
- Managed, delivered, and implemented complex projects handling considerable data sizes (TB/PB) and high complexity
- Experience handling migration projects
- Good to have: knowledge of data ingestion, processing, and orchestration

Skills Required: Java Architecture, Big Data, Cloud Technologies

Role:
Senior Technical Project Managers (STPMs) are in charge of handling all aspects of technical projects. This is a multi-dimensional, multi-functional role. You will need to be comfortable reporting program status to executives as well as diving deep into technical discussions with internal engineering teams and external partners. You should collaborate with, and leverage, colleagues in business development, product management, analytics, marketing, engineering, and partner organizations. You have to manage multiple projects and ensure all releases ship on time. You are responsible for managing and delivering the technical solution that supports the organization's vision and strategic direction. You should be capable of working with different types of customers and should possess good customer-handling skills.
- Experience working in the ODC model; capable of presenting the technical design and architecture to senior technical stakeholders
- Experience defining the project and delivery plan for each assignment
- Capable of making resource allocations as per the requirements of each assignment
- Experience driving RFPs
- Experience with account management: revenue forecasting, invoicing, SOW creation, etc.

Experience: 15 to 20 years
Job Reference Number: 13010

Big Data (PySpark) Engineer

Chennai, Tamil Nadu

0 - 7 years

INR Not disclosed

On-site

Not specified

Chennai, Tamil Nadu, India

Qualification:
Skills: Big Data, PySpark, Python, Hadoop/HDFS, Spark; good to have: GCP

Roles/Responsibilities:
- Develop and maintain scalable data pipelines to support continuing increases in data volume and complexity
- Collaborate with analytics and business teams to improve the data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision-making across the organization
- Implement processes and systems to monitor data quality, ensuring production data is always accurate and available for the key stakeholders and business processes that depend on it
- Write unit/integration tests, contribute to the engineering wiki, and document work
- Perform the data analysis required to troubleshoot data-related issues and assist in their resolution
- Work closely with a team of frontend and backend engineers, product managers, and analysts
- Define company data assets (data models) and the Spark, Spark SQL, and Hive SQL jobs that populate them
- Design data integrations and a data quality framework

Basic Qualifications:
- BS or MS degree in Computer Science or a related technical field
- 4+ years of SQL experience (NoSQL experience is a plus)
- 4+ years of experience with schema design and dimensional data modelling
- 4+ years of experience with Big Data technologies like Spark and Hive
- 2+ years of data engineering experience on Google Cloud Platform services such as BigQuery

Skills Required: Big Data, PySpark, Python, Hadoop/HDFS, Spark

Experience: 4 to 7 years
Job Reference Number: 12907

Java + Big Data

Chennai, Tamil Nadu

0 - 7 years

INR Not disclosed

On-site

Not specified

Chennai, Tamil Nadu, India

Qualification:
5+ years of experience with Java + Big Data as the minimum required skill set: Java, Microservices, Spring Boot, APIs, Big Data (Hive), Spark, PySpark

Skills Required: Java, Big Data, Spark

Experience: 5 to 7 years
Job Reference Number: 13049

Senior Software Engineer (AI/ML)

Noida, Uttar Pradesh

0 - 5 years

INR Not disclosed

On-site

Not specified

Noida, Uttar Pradesh, India; Bangalore, Karnataka, India; Gurugram, Haryana, India; Indore, Madhya Pradesh, India; Pune, Maharashtra, India; Hyderabad, Telangana, India

Qualification:
- Strong experience in Python
- 2+ years' experience working on feature/data pipelines using PySpark
- Understanding of and experience with data science
- Exposure to AWS cloud services such as SageMaker, Bedrock, Kendra, etc.
- Experience with machine learning model lifecycle management tools, and an understanding of MLOps principles and best practices
- Experience with statistical models, e.g., multinomial logistic regression
- Technical architecture, design, deployment, and operational-level knowledge
- Exploratory data analysis
- Knowledge of model building, hyperparameter tuning, and model performance metrics
- Statistics knowledge (probability distributions, hypothesis testing)
- Time series modelling, forecasting, image/video analytics, and natural language processing (NLP)

Good to have:
- Experience researching and applying large language and generative AI models
- Experience with LangChain, LlamaIndex, foundation model tuning, data augmentation, and performance evaluation frameworks
- Able to provide analytical expertise in model development, refinement, and implementation across a variety of analytics problems
- Knowledge of Docker and Kubernetes

Skills Required: Machine Learning, Natural Language Processing, AWS SageMaker, Python

Role:
- Generate actionable insights for business improvements
- Understand business requirements
- Write clean, efficient, and reusable code following best practices
- Troubleshoot and debug applications to ensure optimal performance
- Write unit test cases
- Collaborate with cross-functional teams to define and deliver new features
- Use-case derivation and solution creation from structured/unstructured data
- Actively drive a culture of knowledge-building and sharing within the team
- Experience applying theoretical models in an applied environment
- MLOps, data pipelines, data engineering
- Statistics knowledge (probability distributions, hypothesis testing)

Experience: 4 to 5 years
Job Reference Number: 13027

Senior Big Data Cloud QA

Noida, Uttar Pradesh

0 - 7 years

INR Not disclosed

On-site

Not specified

Noida, Uttar Pradesh, India; Gurgaon, Haryana, India; Bangalore, Karnataka, India; Indore, Madhya Pradesh, India; Pune, Maharashtra, India

Qualification:
Job Description: We are seeking an experienced Senior Big Data Cloud Quality Assurance Engineer to join our dynamic team. In this role, you will be responsible for ensuring the quality and performance of our big data applications and services deployed in cloud environments. You will work closely with developers, product managers, and other stakeholders to define testing strategies, develop test plans, and execute comprehensive testing processes.

Key Responsibilities:
- Design and implement test plans and test cases for big data applications in cloud environments
- Perform functional, performance, and scalability testing on large datasets
- Identify, record, and track defects using bug-tracking tools
- Collaborate with development teams to understand product requirements and provide feedback on potential quality issues early in the development cycle
- Develop and maintain automated test scripts and frameworks for continuous integration and deployment
- Analyze test results and provide detailed reports on the quality of releases
- Mentor junior QA team members and share best practices in testing methodologies and tools
- Stay updated on industry trends and advancements in big data and cloud technologies to continuously improve QA processes

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field
- Minimum of 5 years of experience in software testing, with at least 2 years focused on big data applications and cloud technologies
- Proficiency in testing frameworks and tools such as JUnit, TestNG, Apache JMeter, or similar
- Experience with big data technologies such as Hadoop, Spark, or distributed databases
- Strong understanding of cloud platforms such as AWS, Azure, or Google Cloud
- Familiarity with programming languages such as Java, Python, or Scala
- Excellent analytical and problem-solving skills, with keen attention to detail
- Strong communication skills, both verbal and written, along with the ability to work collaboratively in a team environment

If you are a motivated and detail-oriented professional looking to advance your career in big data quality assurance, we encourage you to apply for this exciting opportunity.

Skills Required: ETL Testing, Big Data, Database Testing, API Testing, Selenium, SQL, Linux, Cloud Testing

Role (roles and responsibilities):
1. Design and implement comprehensive test plans and test cases for big data applications deployed in cloud environments.
2. Collaborate with data engineers and developers to understand system architecture and data flow for effective testing.
3. Perform manual and automated testing of big data processing frameworks and tools, ensuring data quality and integrity.
4. Lead and mentor junior QA team members, providing guidance on best practices for testing big data solutions.
5. Identify and document defects, track their resolution, and verify fixes in a timely manner.
6. Develop and maintain automated test scripts using appropriate testing frameworks compatible with cloud big data platforms.
7. Execute performance testing to assess the scalability and reliability of big data applications in cloud environments.
8. Participate in design and code reviews, providing insights on testability and quality.
9. Work with stakeholders to define acceptance criteria and ensure that deliverables meet business requirements.
10. Stay updated on industry trends and advancements in big data technologies and cloud services to continually improve testing processes.
11. Ensure compliance with security and data governance policies during testing activities.
12. Provide detailed reports and metrics on testing progress, coverage, and outcomes to project stakeholders.

Experience: 5 to 7 years
Job Reference Number: 12944

Sr. Executive - Transport and Admin

Noida, Uttar Pradesh

0 - 8 years

INR Not disclosed

On-site

Not specified

Noida, Uttar Pradesh, India

Qualification:
If you have experience in both transportation management and administrative tasks, we have an exciting opportunity for you at Impetus Technologies. We are currently seeking a Senior Executive to join our team and oversee our transportation and administrative functions.

At Impetus Technologies, we are a technology solutions company that thrives on innovation and excellence. Our team is dedicated to providing top-notch services to our clients and ensuring a smooth, efficient operation at all times. As a Senior Executive - Admin, you will play a crucial role in maintaining the seamless flow of our transportation and administrative operations.

Key Responsibilities:
- Oversee transportation operations, including scheduling, routing, and ensuring timely delivery of goods and services
- Manage and maintain fleet vehicles, including coordinating repairs, inspections, and registration renewals
- Develop and implement transportation policies and procedures to ensure compliance with regulatory standards
- Supervise and train transportation staff to ensure efficient and safe operations
- Handle administrative tasks such as managing office supplies, coordinating travel arrangements, and overseeing office maintenance
- Supervise administrative staff and ensure smooth day-to-day office operations
- Assist in budget planning and control for transportation and administrative expenses

Qualifications:
- Bachelor's degree in business administration, logistics, or a related field
- Proven experience in transportation management and administrative roles
- Strong organizational and leadership skills
- Excellent communication and interpersonal abilities
- Proficiency in MS Office and transportation management software

If you possess the skills and experience required for this position and are seeking a challenging and rewarding career, we encourage you to apply for the Senior Executive - Admin role at Impetus Technologies. Join us in our mission to drive excellence and innovation in all aspects of our operations. We look forward to welcoming you to our team.

Skills Required: Office Admin, Stakeholder Management, Facility Management, Facility Administration, Travel Arrangements, Housekeeping Management

Role:
When it comes to the smooth functioning of a business, the roles and responsibilities of a Senior Executive in Transport and Administration at Impetus Technologies are critical. This role requires a keen eye for detail and a proactive approach to ensure that all transportation and administrative tasks are carried out efficiently and effectively.

Roles:
1. Overseeing transportation operations: manage transportation operations, including arranging and scheduling transportation for employees, visitors, and materials as necessary. This includes coordinating with transportation vendors, ensuring compliance with transportation regulations, and maintaining transportation records.
2. Facility management: oversee the day-to-day operations of the company's facilities, including office spaces, parking facilities, and other amenities, ensuring that all facilities are well-maintained, clean, and safe for employees and visitors.
3. Vendor management: manage relationships with vendors and service providers related to transportation and facility management, including negotiating contracts, monitoring service levels, and resolving any issues that may arise.
4. Budgeting and cost control: manage the budget for transportation and facility management, ensuring that costs are kept under control and that resources are used efficiently.

Responsibilities:
1. Develop and implement transportation and facility management policies and procedures to ensure compliance with company standards and regulations.
2. Plan and coordinate transportation for employees, visitors, and materials, ensuring timely and cost-effective delivery.
3. Maintain accurate and up-to-date records of transportation activities, including vehicle maintenance, fuel usage, and driver schedules.
4. Oversee the maintenance and upkeep of company facilities, ensuring that they are clean, safe, and well-maintained.
5. Monitor vendor performance and service levels, and participate in contract negotiations as necessary.
6. Prepare and manage budgets for transportation and facility management, and monitor expenses to ensure cost-effective operations.
7. Collaborate with other departments to ensure that transportation and facility management support the overall goals and objectives of the company.

In summary, the Senior Executive in Transport and Admin plays a crucial role in ensuring that transportation and facility management operations run smoothly and efficiently. Attention to detail, a proactive approach, and strong organizational skills are essential to the success of these critical functions within Impetus Technologies.

Experience: 4 to 8 years
Job Reference Number: 12506

Lead Software Engineer (PySpark/Python)

Noida, Uttar Pradesh

0 - 10 years

INR Not disclosed

On-site

Not specified

Noida, Uttar Pradesh, India; Gurgaon, Haryana, India; Hyderabad, Telangana, India; Indore, Madhya Pradesh, India; Bangalore, Karnataka, India

Qualification:
- 5-7 years of good hands-on exposure to Big Data technologies: PySpark (DataFrame and Spark SQL), Hadoop, and Hive
- Good hands-on experience with Python and Bash scripts
- Good understanding of SQL and data warehouse concepts
- Strong analytical, problem-solving, data analysis, and research skills
- Demonstrable ability to think outside the box and not be dependent on readily available tools
- Excellent communication, presentation, and interpersonal skills are a must

Good to have:
- Hands-on experience with Big Data technologies provided by a cloud platform (e.g., IAM, Glue, EMR, Redshift, S3, Kinesis)
- Orchestration with Airflow; any job scheduler experience
- Experience migrating workloads from on-premise to cloud, and cloud-to-cloud migrations

Skills Required: Python, PySpark, AWS

Role:
- Develop efficient ETL pipelines per business requirements, following development standards and best practices
- Perform integration testing of the pipelines created in the AWS environment
- Provide estimates for development, testing, and deployment across environments
- Participate in peer code reviews to ensure our applications comply with best practices
- Create cost-effective AWS pipelines with the required AWS services, e.g., S3, IAM, Glue, EMR, Redshift

Experience: 8 to 10 years
Job Reference Number: 13025
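For illustration, the extract-transform-load shape this role describes can be sketched in a few lines. This is a minimal standard-library sketch, not Impetus's implementation: a real pipeline here would use PySpark DataFrames on AWS services such as S3, Glue, and EMR, and every name, column, and value below is a hypothetical stand-in.

```python
import csv
import io

# Stand-in for a raw CSV object read from a source such as S3.
RAW_CSV = "order_id,amount\n1,10.5\n2,3.0\n3,99.9\n"

def extract(raw_text):
    # Extract stage: parse the raw CSV into row dicts.
    return list(csv.DictReader(io.StringIO(raw_text)))

def transform(rows, min_amount=5.0):
    # Transform stage: filter and type-cast. Spark would express this
    # as DataFrame operations (filter/select) instead of a comprehension.
    return [
        {"order_id": int(r["order_id"]), "amount": float(r["amount"])}
        for r in rows
        if float(r["amount"]) >= min_amount
    ]

def load(rows):
    # Load stage: serialize the curated records for the sink
    # (a warehouse such as Redshift in the role's stack).
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=["order_id", "amount"],
                            lineterminator="\n")
    writer.writeheader()
    writer.writerows(rows)
    return out.getvalue()

result = load(transform(extract(RAW_CSV)))
print(result)
# order_id,amount
# 1,10.5
# 3,99.9
```

Keeping the three stages as separate functions is also what makes the integration testing mentioned in the role practical: each stage can be exercised on its own before the pipeline runs end to end.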

Technical Architect

Noida, Uttar Pradesh

0 - 15 years

INR Not disclosed

On-site

Not specified

Noida, Uttar Pradesh, India; Hyderabad, Telangana, India; Indore, Madhya Pradesh, India; Bangalore, Karnataka, India; Pune, Maharashtra, India

Qualification:
Position Summary: We are looking for candidates with hands-on experience in Big Data and cloud technologies.

Technical skills:
- 10+ years of experience
- Expertise in designing and developing applications using Big Data and cloud technologies (Must Have)
- Expertise and hands-on experience with Spark and Hadoop ecosystem components (Must Have)
- Expertise and hands-on experience with any of the clouds (AWS/Azure/GCP) (Must Have)
- Good knowledge of shell scripting and Java/Python (Must Have)
- Good knowledge of migration projects on Hadoop (Good to Have)
- Good knowledge of a workflow engine such as Oozie or Autosys (Good to Have)
- Good knowledge of Agile development (Good to Have)
- Passionate about exploring new technologies (Must Have)
- Automation approach (Good to Have)
- Good communication skills (Must Have)
- Knowledge of data ingestion, processing, and orchestration

Skills Required: Solution Architecting, Solution Design, Orchestration, Migration

Role (responsibilities):
- Define the data warehouse modernization approach and strategy for the customer
- Align the customer on the overall approach and solution
- Design systems that meet performance SLAs
- Resolve technical queries and issues for the team
- Work with the team to establish an end-to-end migration approach for one use case so that the team can replicate it for other iterations

Experience: 10 to 15 years
Job Reference Number: 12968

Analytical engineers

Noida, Uttar Pradesh

0 - 3 years

INR Not disclosed

On-site

Not specified

Noida, Uttar Pradesh, India; Gurugram, Haryana, India; Indore, Madhya Pradesh, India; Bengaluru, Karnataka, India; Pune, Maharashtra, India; Hyderabad, Telangana, India

Qualification:
- 2-4 years of experience in designing, developing, and training machine learning models using diverse algorithms and techniques, including deep learning, NLP, computer vision, and time series analysis
- Proven ability to optimize model performance through experimentation with architectures, hyperparameter tuning, and evaluation metrics
- Hands-on experience processing large datasets, including preprocessing, feature engineering, and data augmentation
- Demonstrated ability to deploy trained AI/ML models to production using frameworks like Kubernetes and cloud-based ML platforms
- Solid understanding of monitoring and logging for performance tracking
- Experience exploring new AI/ML methodologies and documenting the development and deployment lifecycle, including performance metrics
- Familiarity with AWS services, particularly SageMaker, is expected
- Excellent communication, presentation, and interpersonal skills are essential

Good to have:
- Knowledge of GenAI (LangChain, foundation model tuning, and GPT-3)
- AWS Certified Machine Learning - Specialty certification

Skills Required: Machine Learning, LangChain, AWS SageMaker, Python

Role:
- Explore different models and transform data science prototypes for a given problem
- Analyze datasets; perform data enrichment, feature engineering, and model training
- Able to write code using Python, Pandas, and DataFrame APIs
- Develop machine learning applications according to requirements
- Perform statistical analysis and fine-tuning using test results
- Collaborate with data engineers and architects to implement and deploy scalable solutions
- Encourage continuous innovation and out-of-the-box thinking
- Experience applying theoretical models in an applied environment

Experience: 1 to 3 years
Job Reference Number: 13047

MLSE (Python/PySpark)

Noida, Uttar Pradesh

0 - 8 years

INR Not disclosed

On-site

Not specified

Noida, Uttar Pradesh, India; Gurgaon, Haryana, India; Hyderabad, Telangana, India; Bangalore, Karnataka, India; Indore, Madhya Pradesh, India

Qualification:
- 6-8 years of good hands-on exposure to Big Data technologies: PySpark (DataFrame and Spark SQL), Hadoop, and Hive
- Good hands-on experience with Python and Bash scripts
- Good understanding of SQL and data warehouse concepts
- Strong analytical, problem-solving, data analysis, and research skills
- Demonstrable ability to think outside the box and not be dependent on readily available tools
- Excellent communication, presentation, and interpersonal skills are a must
- Hands-on experience with Big Data technologies provided by a cloud platform (e.g., IAM, Glue, EMR, Redshift, S3, Kinesis)
- Orchestration with Airflow; any job scheduler experience
- Experience migrating workloads from on-premise to cloud, and cloud-to-cloud migrations

Skills Required: Python, PySpark, SQL

Role:
- Develop efficient ETL pipelines per business requirements, following development standards and best practices
- Perform integration testing of the pipelines created in the AWS environment
- Provide estimates for development, testing, and deployment across environments
- Participate in peer code reviews to ensure our applications comply with best practices
- Create cost-effective AWS pipelines with the required AWS services, e.g., S3, IAM, Glue, EMR, Redshift

Experience: 6 to 8 years
Job Reference Number: 13024

Lead Big Data Solution Engineer

Noida, Uttar Pradesh

0 - 10 years

INR Not disclosed

On-site

Not specified

Noida, Uttar Pradesh, India; Indore, Madhya Pradesh, India; Bangalore, Karnataka, India; Hyderabad, Telangana, India; Gurgaon, Haryana, India

Qualification:
Required:
- Proven hands-on experience designing, developing, and supporting database projects for analysis in a demanding environment
- Proficient in database design techniques: relational and dimensional designs
- Experience with, and a strong understanding of, the business analysis techniques used
- High proficiency in SQL or MDX queries
- Ability to manage multiple maintenance, enhancement, and project-related tasks
- Ability to work independently on multiple assignments and to work collaboratively within a team
- Strong communication skills with both internal team members and external business stakeholders

Added advantage:
- Hadoop ecosystem, or AWS, Azure, or GCP cluster and processing
- Experience working on Hive, Spark SQL, Redshift, or Snowflake
- Experience working on Linux systems
- Experience with Tableau, MicroStrategy, Power BI, or any BI tool
- Expertise in programming in Python, Java, or shell script would be a plus

Role (roles and responsibilities):
- Be the frontend person of the world's most scalable OLAP product company, Kyvos Insights
- Interact with the senior-most technical and business people of large enterprises to understand their big data strategy and their problem statements in that area
- Create, present, align customers with, and implement solutions around Kyvos products for the most challenging enterprise BI/DW problems
- Be the go-to person for customers regarding technical issues during the project
- Be instrumental in reading the pulse of the big data market and defining the roadmap of the product
- Lead a few small but highly efficient teams of big data engineers
- Report task status efficiently to stakeholders and the customer
- Good verbal and written communication skills
- Be willing to work off hours to meet timelines
- Be willing to travel or relocate as per project requirements

Experience: 5 to 10 years
Job Reference Number: 11078

Lead Big Data Quality Engineer

Noida, Uttar Pradesh

0 - 9 years

INR Not disclosed

On-site

Not specified

Noida, Uttar Pradesh, India;Gurugram, Haryana, India;Bangalore, Karnataka, India;Indore, Madhya Pradesh, India;Pune, Maharashtra, India;Hyderabad, Telangana, India

Qualification :
Must Have:
- Good knowledge of Big Data components and ETL.
- Thorough understanding of SDLC & STLC.
- Good experience in writing and executing functional/non-functional end-to-end test cases.
- Good hands-on knowledge of any database.
- Good hands-on experience in end-to-end debugging of a solution.
- Good hands-on experience in Linux OS w.r.t. filesystem and application debugging.
- Good communication skills.
- Good understanding of test strategy and test plans.
- Basic understanding of networking.
Good to have:
- Experience in automation using automation tools, scripting, or a programming language (Python, Java, Shell).
- Basic understanding of Layer 2/3 protocols and end-to-end data flow in a network.

Skills Required : Big Data Testing, ETL, Automation

Role :
- Design and implement comprehensive test strategies for big data applications and systems.
- Develop and execute test plans, test cases, and test scripts to ensure functionality, performance, and reliability of big data solutions.
- Collaborate with development teams to understand data pipelines and architecture for effective testing.
- Conduct data validation and data accuracy testing to ensure quality standards are met.
- Analyze large datasets to identify trends, anomalies, and issues using big data tools and technologies.
- Perform regression testing to verify that new code changes do not adversely affect existing functionality.
- Identify, record, and track defects using defect management tools, ensuring timely resolution.
- Work with cross-functional teams to facilitate integration testing and ensure end-to-end data flow integrity.
- Stay updated on industry best practices and emerging technologies related to big data testing and quality assurance.
- Mentor and train junior testers on best practices, tools, and methodologies in big data testing.
- Prepare detailed test reports and documentation for stakeholders and management, highlighting test results and recommendations for improvements.
- Participate in project planning and meetings, providing insights on testing timelines and resource requirements.

Experience : 7 to 9 years
Job Reference Number : 13004
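The data validation and accuracy testing this role describes can be sketched in plain Python. The function and rule names below (`validate_batch`, the sample rows) are illustrative, not part of any specific framework:

```python
# Illustrative sketch of automated batch validation checks, as a big data
# tester might write; validate_batch and its inputs are hypothetical names.

def validate_batch(source_rows, target_rows, not_null_cols):
    """Compare an extracted batch against its loaded copy."""
    issues = []
    # Reconciliation: every row extracted must arrive in the target.
    if len(source_rows) != len(target_rows):
        issues.append(f"row count mismatch: {len(source_rows)} vs {len(target_rows)}")
    # Data accuracy: mandatory columns must never be null/missing.
    for i, row in enumerate(target_rows):
        for col in not_null_cols:
            if row.get(col) is None:
                issues.append(f"row {i}: null in required column '{col}'")
    return issues

source = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 5.5}]
target = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": None}]
problems = validate_batch(source, target, not_null_cols=["id", "amount"])
# problems flags the null amount in row 1
```

In practice the same checks would run over Hive/Spark query results rather than in-memory lists, but keeping them as small pure functions makes them easy to wire into regression suites.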

Senior Software Developer (Python/Pyspark)

Noida, Uttar Pradesh

0 - 6 years

INR Not disclosed

On-site

Not specified

Noida, Uttar Pradesh, India;Gurgaon, Haryana, India;Hyderabad, Telangana, India;Pune, Maharashtra, India;Indore, Madhya Pradesh, India;Bangalore, Karnataka, India

Qualification :
- Strong hands-on experience in Python
- Good experience with Spark/Spark Structured Streaming
- Experience of working with MSK (Kafka) and Kinesis
- Ability to design, build, and unit test applications on the Spark framework in Python
- Exposure to AWS cloud services such as Glue/EMR, RDS, SNS, SQS, Lambda, Redshift, etc.
- Good experience writing SQL queries
- Strong technical development experience in effectively writing code, code reviews, and best practices
- Ability to solve complex data-driven scenarios and triage defects and production issues
- Ability to learn-unlearn-relearn concepts with an open and analytical mindset

Skills Required : Pyspark, SQL

Role :
- Work closely with business and product management teams to develop and implement analytics solutions.
- Collaborate with engineers and architects to implement and deploy scalable solutions.
- Actively drive a culture of knowledge-building and sharing within the team.
- Able to quickly adapt and learn.
- Able to jump into an ambiguous situation and take the lead on resolution.
Good To Have:
- Experience of working with MSK (Kafka), Amazon Elastic Kubernetes Service, and Docker
- Exposure to GitHub Actions, Argo CD, Argo Workflows
- Experience of working on Databricks

Experience : 4 to 6 years
Job Reference Number : 12555
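One common way to keep Spark applications unit-testable, as this role asks, is to isolate the transformation logic in plain functions that can be exercised without a cluster. A minimal stdlib-only sketch (the event schema and function name are invented for illustration; a real PySpark job would express the same aggregation as `df.groupBy("user_id").sum("amount")`):

```python
# Hypothetical sketch: the transform is a pure function, so it can be unit
# tested without spinning up Spark; a PySpark job would apply the same logic
# via DataFrame operations over a batch or streaming micro-batch.
from collections import defaultdict

def total_amount_per_user(events):
    """Aggregate a micro-batch of events into per-user totals."""
    totals = defaultdict(float)
    for event in events:
        totals[event["user_id"]] += event["amount"]
    return dict(totals)

batch = [
    {"user_id": "u1", "amount": 10.0},
    {"user_id": "u2", "amount": 4.0},
    {"user_id": "u1", "amount": 2.5},
]
result = total_amount_per_user(batch)
```

Structuring jobs this way means the business logic can be covered by fast tests while a thin Spark wrapper handles Kafka/Kinesis sources and sinks.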

Technical Project Manager

Noida, Uttar Pradesh

0 - 18 years

INR Not disclosed

On-site

Not specified

Noida, Uttar Pradesh, India;Indore, Madhya Pradesh, India

Qualification :
We are seeking a highly experienced and dynamic Technical Project Manager to lead and manage our service engagements. The ideal candidate will possess a strong technical background, exceptional project management skills, and a proven track record of successfully delivering large-scale IT projects. You will be responsible for leading cross-functional teams, managing client relationships, and ensuring projects are delivered on time, within budget, and to the highest quality standards.
- 14+ years of experience managing and implementing high-end software products, combined with technical knowledge in the Business Intelligence (BI) and Data Engineering domains
- 5+ years of experience in project management, with strong leadership and team management skills
- Hands-on with project management tools (e.g., Jira, Rally, MS Project) and strong expertise in Agile methodologies (certifications such as SAFe, CSM, PMP, or PMI-ACP are a plus)
- Well versed in tracking project performance using appropriate metrics, tools, and processes to successfully meet short- and long-term goals
- Rich experience interacting with clients, translating business needs into technical requirements, and delivering customer-focused solutions
- Exceptional verbal and written communication skills, with the ability to present complex concepts to technical and non-technical stakeholders alike
- Strong understanding of BI concepts (reporting, analytics, data warehousing, ETL), leveraging expertise in tools such as Tableau, Power BI, Looker, etc.
- Knowledge of data modeling, database design, and data governance principles
- Proficiency in Data Engineering technologies (e.g., SQL, Python, cloud-based data solutions/platforms like AWS Redshift, Google BigQuery, Azure Synapse, Snowflake, Databricks) is a plus

Skills Required : SAP BO, MicroStrategy, OBIEE, Tableau, Power BI

Role :
This is a multi-dimensional and multi-functional role. You will need to be comfortable reporting program status to executives, as well as diving deep into technical discussions with internal engineering teams and external partners.
- Act as the primary point of contact for stakeholders and customers, gathering requirements, managing expectations, and delivering regular updates on project progress
- Manage and mentor cross-functional teams, fostering collaboration and ensuring high performance while meeting project milestones
- Drive Agile practices (e.g., Scrum, Kanban) to ensure iterative delivery, adaptability, and continuous improvement throughout the project lifecycle
- Identify, assess, and mitigate project risks, ensuring timely resolution of issues and adherence to quality standards
- Maintain comprehensive project documentation, including status reports, roadmaps, and post-mortem analyses, to ensure transparency and accountability
- Define the project and delivery plan, including scope, timelines, budgets, and deliverables for each assignment
- Capable of doing resource allocation as per the requirements of each assignment

Experience : 14 to 18 years
Job Reference Number : 12929

AWS Lead Big Data Engineer

Noida, Uttar Pradesh

0 - 12 years

INR Not disclosed

On-site

Not specified

Noida, Uttar Pradesh, India;Bangalore, Karnataka, India;Gurugram, Haryana, India;Hyderabad, Telangana, India;Indore, Madhya Pradesh, India;Pune, Maharashtra, India

Qualification :
Do you love to work on bleeding-edge Big Data technologies, do you want to work with the best minds in the industry, and create high-performance scalable solutions? Do you want to be part of the team that is solutioning next-gen data platforms? Then this is the place for you. You want to architect and deliver solutions involving data engineering on a petabyte scale of data that solve complex business problems. Impetus is looking for a Big Data Developer who loves solving complex problems and architecting and delivering scalable solutions across a full spectrum of technologies.
- Experience in providing technical leadership in the Big Data space (Hadoop stack like Spark, M/R, HDFS, Hive, etc.)
- Should be able to communicate with the customer on both functional and technical aspects
- Expert-level proficiency in Python/Pyspark
- Hands-on experience with Shell/Bash scripting (creating and modifying script files)
- Experience with Control-M, AutoSys, or any job scheduler
- Experience in visualizing and evangelizing next-generation infrastructure in the Big Data space (batch, near real-time, and real-time technologies)
- Should be able to guide the team through any functional and technical issues
- Strong technical development experience in effectively writing code, code reviews, best practices, and code refactoring
- Passionate about continuous learning, experimenting with, and contributing towards cutting-edge open-source technologies and software paradigms
- Good communication, problem-solving, and interpersonal skills
- Self-starter and resourceful personality with the ability to manage pressure situations
- Capable of providing the design and architecture for typical business problems
- Exposure to and awareness of the complete PDLC/SDLC
- Out-of-the-box thinker, not just limited to the work done in past projects
Must Have:
- Experience with AWS (EMR, Glue, S3, RDS, Redshift)
- Cloud certification

Skills Required : AWS, Pyspark, Spark

Role :
- Evaluate and recommend the Big Data technology stack best suited for customer needs
- Design/architect/implement various solutions arising out of high-concurrency systems
- Responsible for timely and quality deliveries
- Anticipate technological evolutions
- Ensure the technical directions and choices
- Develop efficient ETL pipelines through Spark or Hive
- Drive significant technology initiatives end to end and across multiple layers of architecture
- Provide strong technical leadership in adopting and contributing to open-source technologies related to Big Data across multiple engagements
- Design/architect complex, highly available, distributed, failsafe compute systems dealing with a considerable amount (GB/TB) of data
- Identify and incorporate non-functional requirements into the solution (performance, scalability, monitoring, etc.)

Experience : 8 to 12 years
Job Reference Number : 12400
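The "efficient ETL pipelines through Spark or Hive" in this role follow an extract-transform-load shape. As a minimal stand-in, this stdlib-only Python sketch shows that shape using sqlite3; the table and column names are invented for illustration, and a real pipeline would read from S3 and run the aggregation as a Spark or Hive job:

```python
# Minimal ETL sketch using stdlib sqlite3 as a stand-in for a Spark/Hive
# pipeline; table and column names are invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")

# Extract: land raw records (in production, read from S3 / a source system).
conn.execute("CREATE TABLE raw_orders (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?)",
    [("north", 100.0), ("south", 50.0), ("north", 25.0)],
)

# Transform: aggregate per region, as a Spark job might via groupBy().sum().
conn.execute(
    """CREATE TABLE region_totals AS
       SELECT region, SUM(amount) AS total
       FROM raw_orders GROUP BY region"""
)

# Load/verify: read the curated table back out.
totals = dict(conn.execute("SELECT region, total FROM region_totals"))
```

The same three stages map directly onto an EMR/Glue job: Glue crawlers or Spark readers for extract, Spark SQL for transform, and Redshift or S3 writes for load.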

Devops Manager

New Delhi, Pune, Bengaluru

10 - 14 years

INR 25.0 - 40.0 Lacs P.A.

Work from Office

Full Time

Technical Project Manager - DevOps

Job Description
- 10+ years overall IT experience
- 5+ years of experience in technical project management, with a focus on DevOps, platform operations, and cloud infrastructure.
- Strong understanding of DevOps methodologies, tools, and best practices (CI/CD, Docker, Kubernetes, Terraform, Ansible, Jenkins, etc.).
- Experience with cloud platforms (AWS, Azure, GCP) and infrastructure automation.
- Good to have experience in Data Platform and Data Pipeline Management
- Experience in managing SRE teams
- Proven experience in managing cross-functional teams and collaborating with multiple stakeholders.
- Excellent leadership, organizational, and communication skills.
- Experience providing technical support and incident management in production environments.
- Project Management Professional (PMP), Agile (Scrum Master), or relevant certifications are a plus.
- Familiarity with security best practices

Roles & Responsibilities
Project Management:
- Lead, plan, and execute DevOps projects, ensuring alignment with customer goals and objectives.
- Define project scope, objectives, deliverables, and timelines in collaboration with stakeholders.
- Develop detailed project plans, manage risks, and monitor progress.
- Ensure project deliverables meet quality standards and are delivered on time and within estimated effort.
Team Leadership:
- Manage and mentor the DevOps team, providing guidance on technical challenges and career growth.
- Foster a collaborative and productive team environment by encouraging continuous learning and improvement.
- Collaborate with cross-functional teams such as development, QA, operations, and security to ensure seamless project execution.
CI/CD Pipeline Management:
- Oversee the design, implementation, and optimization of continuous integration and continuous delivery (CI/CD) pipelines.
- Ensure automation of deployment processes, reducing manual intervention and improving release cycles.
- Work closely with development teams to integrate automated testing, security scanning, and monitoring in the CI/CD process.
Platform Management:
- Provide oversight and management of infrastructure platforms, ensuring availability, reliability, and scalability.
- Collaborate with infrastructure teams to manage cloud environments (AWS, Azure, GCP) and optimize platform performance.
- Ensure proactive platform monitoring, incident management, and problem resolution.
Technical Support:
- Provide technical support and guidance to the DevOps and platform teams during critical incidents or escalations.
- Serve as an escalation point for complex technical issues, ensuring timely resolution and post-incident reviews.
- Collaborate with the support teams to improve incident management processes and reduce mean time to resolution (MTTR).
Stakeholder Management:
- Serve as the main point of contact for project stakeholders, ensuring clear communication of project status, risks, and issues.
- Manage stakeholder expectations and ensure alignment with business goals and priorities.
- Provide regular updates on project progress, risks, and roadmaps to senior management.
Resource & Budget Management:
- Identify resource needs and allocate appropriate team members for project tasks.
- Coordinate with management for recruitment and resource planning based on project requirements.
- Ensure efficient utilization of team resources and budget control throughout the project lifecycle.
Process Improvement:
- Identify opportunities for process improvements within the DevOps and platform teams to enhance operational efficiency.
- Standardize and document DevOps processes, methodologies, and tools to ensure consistency across projects.
- Stay up-to-date with industry trends and emerging technologies, recommending new tools and practices that enhance DevOps and platform capabilities.

Mandatory Skills: Technical Project Manager, DevOps
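The MTTR metric this role is asked to reduce is simply the average time from an incident being opened to its resolution. A stdlib-only sketch (the incident records and function name are invented for illustration):

```python
# Hypothetical sketch of the MTTR (mean time to resolution) metric the
# listing mentions: the average of (resolved - opened) across incidents.
from datetime import datetime

def mttr_minutes(records):
    """Mean time to resolution, in minutes."""
    durations = [
        (r["resolved"] - r["opened"]).total_seconds() / 60 for r in records
    ]
    return sum(durations) / len(durations)

incidents = [
    {"opened": datetime(2024, 1, 1, 9, 0), "resolved": datetime(2024, 1, 1, 10, 30)},
    {"opened": datetime(2024, 1, 2, 14, 0), "resolved": datetime(2024, 1, 2, 14, 30)},
]
avg = mttr_minutes(incidents)  # (90 + 30) / 2 = 60.0 minutes
```

In practice these timestamps would come from an incident-management tool's API rather than hand-built records, and the metric would be tracked per service and per severity.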

Lead BI Engineer

Noida, Indore, Bengaluru

10 - 12 years

INR 13.0 - 18.0 Lacs P.A.

Work from Office

Full Time

Primary Tool & Expertise: Power BI and semantic modelling
Key Project Focus: Leading the migration of legacy reporting systems (Cognos or MicroStrategy) to Power BI solutions.

Core Responsibilities:
- Build/develop optimized semantic models, metrics, and complex reports and dashboards
- Work closely with business analysts/BI teams to help business teams drive improvement in key business metrics and customer experience
- Responsible for timely, quality, and successful deliveries
- Share knowledge and experience within the team and with other groups in the org
- Lead teams of BI engineers: translate designed solutions for them, review their work, and provide guidance
- Manage client communications and deliverables

Roles and Responsibilities
Core BI Skills: Power BI (Semantic Modelling, DAX, Power Query, Power BI Service), Data Warehousing, Data Modeling, Data Visualization, SQL
Data Warehouse, Database & Querying:
- Strong skills in databases (Oracle / MySQL / DB2 / Postgres) and expertise in writing SQL queries.
- Experience with cloud-based data intelligence platforms (Databricks / Snowflake).
- Strong understanding of data warehousing and data modelling concepts and principles.
- Strong skills and experience in creating semantic models in Power BI or similar tools.
Additional BI & Data Skills (Good to Have):
- Certifications in Power BI and any data platform
- Experience in other tools like MicroStrategy and Cognos
- Proven experience in migrating existing BI solutions to Power BI or other modern BI platforms.
- Experience with the broader Power Platform (Power Automate, Power Apps) to create integrated solutions
- Knowledge and experience in Power BI admin features such as versioning, usage reports, capacity planning, and creation of deployment pipelines
- Sound knowledge of various forms of data analysis and presentation methodologies.
- Experience in formal project management methodologies
- Exposure to multiple BI tools is desirable.
- Experience with Generative BI implementation.
- Working knowledge of scripting languages like Perl, Shell, Python is desirable.
- Exposure to one of the cloud providers - AWS / Azure / GCP.
Soft Skills & Business Acumen:
- Exposure to multiple business domains (e.g., Insurance, Reinsurance, Retail, BFSI, Healthcare, Telecom) is desirable.
- Exposure to the complete SDLC.
- Out-of-the-box thinker, not just limited to the work done in past projects.
- Capable of working as an individual contributor and within a team.
- Good communication, problem-solving, and interpersonal skills.
- Self-starter and resourceful, skilled in identifying and mitigating risks.
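The semantic models and metrics this role builds boil down to reusable aggregations defined over a star schema. A stdlib-only Python/SQL sketch (the schema and names are invented for illustration; in Power BI the same metric would be a DAX measure over related tables):

```python
# Invented star-schema sketch of the kind of metric a semantic model defines;
# in Power BI this would be a DAX measure such as SUM(fact_sales[revenue])
# sliced by the related dim_product[category] column.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_id INTEGER, category TEXT);
    CREATE TABLE fact_sales (product_id INTEGER, revenue REAL);
    INSERT INTO dim_product VALUES (1, 'Books'), (2, 'Games');
    INSERT INTO fact_sales VALUES (1, 20.0), (2, 35.0), (1, 5.0);
""")

# Metric "revenue by category": a fact-to-dimension join plus aggregation.
rows = conn.execute("""
    SELECT p.category, SUM(s.revenue)
    FROM fact_sales s JOIN dim_product p USING (product_id)
    GROUP BY p.category ORDER BY p.category
""").fetchall()

revenue_by_category = dict(rows)
```

Defining the metric once in the model, rather than in each report, is what keeps numbers consistent across the dashboards the listing describes.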

Impetus Technologies


Information Technology and Services

Bengaluru

1001-5000 Employees

152 Jobs

    Key People

  • Sunil Rao

    Managing Director
  • Shivendra Kumar

    Chief Technology Officer

