152 Job openings at Impetus Technologies
About Impetus Technologies

Impetus Technologies is a global technology company focused on building innovative products and solutions across multiple industries including finance, healthcare, and telecommunications.

DotNet Developer

Indore, Gurugram, Bengaluru

5 - 10 years

INR 8.0 - 12.0 Lacs P.A.

Work from Office

Full Time

Core Requirements:
- 5 to 10 years of experience in C# and .NET
- Practical experience with Azure and cloud-based solutions
- Proficiency in the MVC framework
- Excellent communication skills, with the ability to collaborate directly with client stakeholders
- A software engineering mindset, demonstrating innovation, problem-solving, and value addition to the team and clients
- Availability to join within 2 weeks

Business Intelligence Engineer

Noida, Pune, Bengaluru

3 - 6 years

INR 9.0 - 16.0 Lacs P.A.

Work from Office

Full Time

Description:
- Design and implement solutions using Superset and/or Cognos
- Work closely with business analysts and the BI lead to help business teams drive improvement in key business metrics and customer experience
- Responsible for timely, quality, and successful deliveries
- Share knowledge and experience within the team and with other groups in the organisation

Role:
- Expertise in Superset and/or Cognos
- Strong skills in databases (Oracle / MySQL / DB2) and expertise in writing SQL queries
- Exposure to at least one of the following BI tools: Power BI, Tableau, Qlik, Spotfire, QuickSight, Looker, SAP BO; certification in one of these tools is good to have
- Sound knowledge of various forms of data analysis and presentation methodologies
- Working knowledge of scripting languages like Perl, Shell, or Python is desirable
- Working knowledge of HiveQL / Spark SQL / Impala SQL is desirable
- Exposure to one of the cloud providers: AWS / Azure / GCP
- Out-of-the-box thinker, not limited to the work done in current projects
- Capable of working both as an individual contributor and within a team
- Good communication, problem-solving, and interpersonal skills
- Self-starter and resourceful, skilled in identifying and mitigating risks

Senior Marketing Manager

Noida, Indore, Bengaluru

10 - 16 years

INR 30.0 - 40.0 Lacs P.A.

Hybrid

Full Time

Job Description
The Senior Manager / Director of Marketing will drive brand reach, positioning, and revenue growth for the company's services and solutions by directing all online and traditional marketing initiatives in a fast-paced, entrepreneurial environment. The role is pivotal in designing an effective marketing strategy to attract, engage, and delight our ideal customer personas, starting from the challenges they are looking to solve. It is responsible for developing and promoting the unique value, positioning, messaging, and go-to-market strategy for the company's products and services, striving for excellence and driving the organization to touch, move, and inspire partners, customers, and the market. The ideal candidate has an analytics / AI / data platform background with hands-on ABM experience; experience managing analyst relations is a big plus.

Roles & Responsibilities
- Own and execute end-to-end marketing programs; a marketing generalist who can quickly pivot to specialize according to GTM requirements
- Evaluate and develop marketing strategies, plan and coordinate marketing efforts, communicate the marketing plans to those involved, and build awareness and positioning around the company's services and products
- Collaborate with senior leadership to develop a strong positioning and go-to-market strategy
- Develop strategies for attracting the relevant customer profile (users, influencers, and decision makers) based on their intent and on their preferred channels
- Maintain a thorough, current understanding of the services and solutions being developed, and reflect the pitch through content collateral at a defined cadence
- Develop and drive a content marketing strategy that creates high traffic from our relevant personas, lead-converting resources, and shareable creative projects
- Manage the demand generation lifecycle, from attracting new conversations to nurturing them in the most valuable and relevant way, so that customers see their needs being addressed and are motivated to try our products and solutions
- Empower Sales and Sales Enablement with relevant content to send to customers, and fuel PR, partner, and influencer relations relevant to our industry
- Serve as the creative leader behind all our online and offline events for prospects and customers
- Engage with key clients on an ongoing basis to better understand how to communicate value to prospective clients, and obtain data for use in marketing materials (case studies, videos, sales sheets)
- Conduct market surveys and research to stay ahead of the competition
- Create sales tools like brochures, datasheets, presentations, and proposals
- Determine and coordinate online and onsite events like webinars, tradeshows, seminars, and customer events
- Report periodically on activities, results, and ROI for marketing
- Plan and implement global software product launches
- Implement marketing strategies to meet or exceed demand generation and revenue targets

Java backend Developer

Indore, Gurugram, Bengaluru

4 - 7 years

INR 5.0 - 15.0 Lacs P.A.

Work from Office

Full Time

JD:
- 4-7 years of experience building large-scale applications using Java and the Spring framework
- Strong knowledge and experience of software development methods; performs due diligence in all lifecycle stages of analysis, build, and testing
- Ability to write good JUnit tests and extend code coverage
- Ability to troubleshoot problems in test and production
- Strong communication skills and a team player

Skills: Java, Spring Boot, API, Microservices

Senior Dot net Developer

Noida, Indore, Pune, Gurugram

8 - 13 years

INR 25.0 - 30.0 Lacs P.A.

Work from Office

Full Time

Requirements:
- 4 or more years of professional experience using GitHub version control
- 4 or more years of professional experience building and integrating with RESTful Web APIs
- 4 or more years of dedicated work with the following Microsoft technologies: C#, .NET Framework, .NET Core, and Entity Framework
- 4 or more years with relational database development (such as Microsoft SQL Server, Oracle, DB2, etc.)
- 4 or more years of professional experience participating in a Scrum development environment
- Professional experience working with repositories, artifacts, and creating build & release pipelines in Azure DevOps is a strong plus
- Experience with TDD/BDD and testing frameworks (Jest / React Testing Library / JUnit)
- Understanding of common design patterns
- Exposure to containerization (Docker, Kubernetes)
- Knowledge of React JS for front-end development is good to have
- Experience with gRPC and Kafka
- Strong knowledge of object-oriented programming is required
- Experience working both independently and in a team-oriented, collaborative environment is essential
- Demonstrated ability to conform to shifting priorities, demands, and timelines through analytical and problem-solving capabilities
- Ability to remain flexible during times of change and react to project adjustments and alterations promptly, efficiently, and positively
- Strong written and oral communication skills; strong interpersonal skills
- Must be able to learn, understand, and apply new technologies; strong customer orientation
- Excellent analytical and problem-solving capability; the ability to effectively prioritize and execute tasks in a high-pressure environment is crucial
- Ability to influence colleagues and communicate effectively across all levels of the organization
- Ability to manage multiple projects and work effectively under time constraints as necessary
- Excellent verbal, written, and relationship skills for interacting with a global group of technical and non-technical people
- Attention to detail is a must

Responsibilities:
- Follow our software development practices and methodologies
- Participate in scrum ceremonies, including story pointing, sprint planning, sprint reviews, and sprint retrospectives
- Estimate time and effort by defining and planning development tasks
- Create and execute unit test cases
- Write code consistent with the defined technology stack and standards
- Facilitate and assist in design sessions
- Act as a technical mentor for other software developers

Sr. Technical Architect - Snowflake / DBT

Kolkata, Mumbai, New Delhi, Hyderabad, Pune, Chennai, Bengaluru

10 - 15 years

INR 35.0 - 40.0 Lacs P.A.

Work from Office

Full Time

We are looking for an experienced Technical Lead or Technical Architect with expertise in Data Warehousing, DBT, Snowflake, Data Modeling, and Data Engineering. The ideal candidate will have strong hands-on experience in customer handling, requirements gathering, and analysis for new requirements and Change Requests (CRs). You will lead and manage key technical initiatives, ensuring smooth delivery and execution of data engineering and warehousing solutions.

Key Responsibilities:
- Lead and manage data warehousing projects, ensuring timely and efficient execution
- Work extensively with Amazon Aurora, Amazon RDS, AWS DMS, Amazon DynamoDB, Oracle, and PL/SQL
- Design and implement data models, data architecture, and data pipelines
- Collaborate with clients to gather requirements, analyze needs, and provide solutions for new features and Change Requests (CRs)
- Optimize database queries and maintain data integrity across multiple platforms
- Provide technical guidance on data engineering best practices, ensuring the implementation of scalable, efficient, and secure solutions
- Work closely with stakeholders to understand business requirements and translate them into technical solutions
- Ensure proper documentation and reporting of data models, architecture, and configurations
- Stay up to date with industry trends and innovations in data warehousing and engineering

Requirements:
- Data warehousing: hands-on experience with DBT, Snowflake, data modeling, and data engineering
- Strong technical skills: proficiency in Amazon Aurora, Amazon RDS, AWS DMS, Amazon DynamoDB, Oracle, PL/SQL, and SQL
- Data architecture: ability to design and implement scalable and efficient data architectures
- Customer handling: experience managing client relationships, gathering requirements, and analyzing business needs
- Database management: in-depth knowledge of relational and NoSQL databases and experience optimizing data pipelines
- Problem-solving skills: strong analytical and troubleshooting abilities to ensure data integrity and quality
- Communication skills: excellent communication skills to interact with clients, stakeholders, and internal teams
- Leadership experience: ability to lead technical teams and ensure the successful delivery of data-driven projects

Preferred Qualifications:
- Certifications in AWS or related data engineering fields
- Experience working in Agile or similar methodologies
- Previous experience handling large-scale data engineering projects

Software Engineer (Java + Big Data)

Chennai

5 - 7 years

INR 10.0 - 20.0 Lacs P.A.

Work from Office

Full Time

Job: Software Developer (Java + Big Data)
Location: Indore
Years of experience: 5-7 years

Requisition Description
1. Problem-solving and analytical skills
2. Good verbal and written communication skills

Roles and Responsibilities
1. Design and develop high-performance, scalable applications with Java + Big Data as the minimum required skill set: Java, Microservices, Spring Boot, API, Big Data (Hive, Spark, PySpark)
2. Build and maintain efficient data pipelines to process large volumes of structured and unstructured data
3. Develop microservices, APIs, and distributed systems
4. Experience with Spark, HDFS, Ceph, Solr/Elasticsearch, Kafka, and Delta Lake
5. Mentor and guide junior members

Java Full Stack Developer

Bengaluru

4 - 7 years

INR 11.0 - 21.0 Lacs P.A.

Work from Office

Full Time

Role & responsibilities
- Proven experience as a Full Stack Developer or similar role
- Experience developing desktop and mobile applications
- Knowledge of multiple front-end languages and libraries (e.g. HTML/CSS, JavaScript, XML, jQuery)
- Knowledge of multiple back-end languages (e.g. Java) and JavaScript frameworks (e.g. React)
- Familiarity with databases (e.g. MySQL, MongoDB), web servers (e.g. Apache), and UI/UX design

Preferred candidate profile
- Great attention to detail
- Organizational skills
- An analytical mind
- Excellent communication and teamwork skills

QA - Bigdata Testing

Bengaluru

4 - 7 years

INR 10.0 - 19.0 Lacs P.A.

Work from Office

Full Time

We are looking for an energetic, high-performing, and highly skilled Quality Assurance Engineer to help shape our technology and product roadmap. You will be part of the fast-paced, entrepreneurial Enterprise Personalization portfolio, focused on delivering the next generation of global marketing capabilities. This team is responsible for global campaign tracking of new account acquisition and bounty payments, and leverages transformational technologies such as SQL, Hadoop, Spark, PySpark, HDFS, MapReduce, Hive, HBase, Kafka, and Java.

Focus: Provides domain expertise to engineers on automation, testing, and Quality Assurance (QA) methodologies and processes; crafts and executes test scripts; assists in the preparation of test strategies; sets up and maintains test data and environments; and logs results.

Requirements:
- 4-6 years of hands-on software testing experience in developing test cases and test plans, with extensive knowledge of automated testing and architecture
- Expert knowledge of testing frameworks and test automation design patterns such as TDD and BDD
- Expertise in developing software test cases for Hive, Spark, and SQL written in PySpark SQL and Scala
- Hands-on experience with performance and load testing tools such as JMeter, pytest, or similar
- Experience with industry-standard tools for defect tracking, source code management, test case management, test automation, and other management and monitoring tools
- Experience working with Agile methodology
- Experience with a cloud platform (GCP)
- Experience designing, developing, testing, debugging, and operating resilient distributed systems using Big Data clusters
- Good sense for software quality, clean code principles, test-driven development, and an agile mindset
- High engagement, self-organization, strong communication skills, and team spirit
- Experience with building and adopting new test frameworks
- Bonus skills: testing machine learning / data mining

Roles and Responsibilities
- Responsible for testing and quality assurance of large data processing pipelines using PySpark and SQL
- Develops and tests software, including ongoing refactoring of code, and drives continuous improvement in code structure and quality
- Functions as a platform SME who drives quality and automation strategy at the application level, identifies new opportunities, and drives software engineers to deliver the highest-quality code
- Delivers on capabilities for the portfolio automation strategy and executes against the test and automation strategy defined at the portfolio level
- Works with engineers to drive improvements in code quality via manual and automated testing
- Reviews the user story backlog and requirements specifications for completeness and weaknesses in function, performance, reliability, scalability, testability, usability, security, and compliance testing, and provides recommendations
- Plans and defines the testing approach, providing advice on prioritization of testing activity in support of identified risks in project schedules or test scenarios

GCP Data Engineer

Indore, Gurugram, Bengaluru

4 - 7 years

INR 10.0 - 19.0 Lacs P.A.

Work from Office

Full Time

We need GCP engineers for capacity building:
- The candidate should have extensive production experience (1-2 years) in GCP; other cloud experience would be a strong bonus
- Strong background in data engineering, with 2-3 years of experience in Big Data technologies including Hadoop, NoSQL, Spark, Kafka, etc.
- Exposure to enterprise application development is a must

Roles and Responsibilities
- 4-7 years of IT experience is preferred
- Able to effectively use GCP managed services, e.g. Dataproc, Dataflow, Pub/Sub, Cloud Functions, BigQuery, GCS (at least 4 of these services)
- Good to have knowledge of Cloud Composer, Cloud SQL, Bigtable, and Cloud Functions
- Strong experience in Big Data technologies (Hadoop, Sqoop, Hive, and Spark), including DevOps
- Good hands-on expertise in either Python or Java programming
- Good understanding of GCP core services like Google Cloud Storage, Google Compute Engine, Cloud SQL, and Cloud IAM
- Good to have knowledge of GCP services like App Engine, GKE, Cloud Run, Cloud Build, and Anthos
- Ability to drive the deployment of customers' workloads into GCP and provide guidance, a cloud adoption model, service integrations, appropriate recommendations to overcome blockers, and technical roadmaps for GCP cloud implementations
- Experience with technical solutions based on industry standards using GCP IaaS, PaaS, and SaaS capabilities
- Extensive real-world experience designing technology components for enterprise solutions and defining solution architectures and reference architectures with a focus on cloud technologies
- Act as a subject-matter expert or developer around GCP and become a trusted advisor to multiple teams
- Technical ability to become certified in required GCP technical certifications

Devops Architect

Indore, Pune, Gurugram

10 - 19 years

INR 35.0 - 50.0 Lacs P.A.

Work from Office

Full Time

We are looking for a Senior Cloud Engineer/Architect with experience designing and implementing the overall cloud architecture for a data-intelligent platform, ensuring scalability, reliability, and security.

Preferred candidate profile
- Designing and implementing the overall cloud architecture for the data-intelligent platform, ensuring scalability, reliability, and security
- Evaluating and selecting appropriate AWS services to meet the platform's requirements
- Developing and maintaining reusable Infrastructure as Code (IaC) using Terraform to automate the provisioning and management of cloud resources
- Implementing CI/CD pipelines for continuous integration and deployment of infrastructure changes
- Ensuring that the platform adheres to security best practices and MMC compliance requirements, including data encryption, access controls, and monitoring
- Collaborating with security teams to implement security measures and conduct regular audits
- Integrating the marketplace with backend services to facilitate service requests and provisioning
- Should have experience working on the Databricks platform

Etl Test Engineer

Noida, Indore, Bengaluru

5 - 8 years

INR 15.0 - 30.0 Lacs P.A.

Work from Office

Full Time

Role & responsibilities
- Design, develop, and maintain Selenium automation scripts
- Perform ETL testing: validate data transformation logic, perform data reconciliation, and ensure data integrity across source and target systems (SQL, flat files, etc.)
- Conduct manual functional, regression, and integration testing for front-end and back-end components
- Create and execute test cases, test plans, and test strategies based on business requirements and technical specifications
- Participate in daily stand-ups, sprint planning, and defect triage meetings
- Identify, document, and track defects using tools like JIRA or similar
- Collaborate with developers, product owners, and business analysts to understand requirements and ensure test coverage
- Support UAT testing and coordinate with cross-functional teams for test sign-off

Preferred candidate profile
- A detail-oriented QA engineer with 5-7 years of hands-on experience in Selenium automation, ETL/data pipeline testing, and manual testing
- 3+ years of experience with Selenium WebDriver and TestNG/JUnit or similar frameworks
- Strong programming/scripting skills in Java for automation
- Solid experience with ETL/data warehouse testing, writing and executing complex SQL queries
- Strong understanding of data validation, data transformation, and data quality rules
- Experience with manual testing in Agile environments
- Familiarity with CI/CD tools like Jenkins, Git, and Maven
- Exposure to bug tracking and test management tools like JIRA, Zephyr, or TestRail

Good-to-have skills:
- Knowledge of API testing using tools like Postman or REST Assured
- Experience with Big Data testing or cloud platforms (AWS/GCP/Azure) is a plus
- Exposure to BDD frameworks like Cucumber
- Basic understanding of data modeling and data lake environments

Java + Bigdata

Chennai

5 - 7 years

INR 10.0 - 14.0 Lacs P.A.

Work from Office

Full Time

Skills: 5+ years of experience with Java + Big Data as the minimum required skill set: Java, Microservices, Spring Boot, API, Big Data (Hive, Spark, PySpark)

Senior Devops Engineer

Pune, Gurugram, Bengaluru

5 - 9 years

INR 10.0 - 20.0 Lacs P.A.

Work from Office

Full Time

Job Description:
- Expertise in GitLab Actions and Git workflows
- Databricks administration experience
- Strong scripting skills (Shell, Python, Bash)
- Experience with Jira integration in CI/CD workflows
- Familiarity with DORA metrics and performance tracking
- Proficient with SonarQube and JFrog Artifactory
- Deep understanding of branching and merging strategies
- Strong CI/CD and automated testing integration skills
- Infrastructure as Code experience (Terraform, Ansible)
- Exposure to cloud platforms (Azure/AWS)
- Familiarity with monitoring/logging (Dynatrace, Grafana, Prometheus, ELK)

Roles & Responsibilities
- Build and manage CI/CD pipelines using GitLab Actions for seamless integration and delivery
- Administer Databricks workspaces, including access control, cluster management, and job orchestration
- Automate infrastructure and deployment tasks using scripts (Shell, Python, Bash, etc.)
- Implement source control best practices, including branching, merging, and tagging
- Integrate Jira with CI/CD pipelines to automate ticket updates and traceability
- Track and improve DORA metrics (deployment frequency, lead time for changes, mean time to restore, change failure rate)
- Manage code quality using SonarQube and the artifact lifecycle using JFrog Artifactory
- Ensure end-to-end testing is integrated into the delivery pipelines
- Collaborate across Dev, QA, and Ops teams to streamline DevOps practices
- Troubleshoot build and deployment issues and ensure high system reliability
- Maintain up-to-date documentation and contribute to DevOps process improvements

Analytical engineers

Noida, Indore, Pune

2 - 4 years

INR 4.0 - 6.0 Lacs P.A.

Work from Office

Full Time

Requirements:
- 2-4 years of experience in designing, developing, and training machine learning models using diverse algorithms and techniques, including deep learning, NLP, computer vision, and time series analysis
- Proven ability to optimize model performance through experimentation with architectures, hyperparameter tuning, and evaluation metrics
- Hands-on experience in processing large datasets, including preprocessing, feature engineering, and data augmentation
- Demonstrated ability to deploy trained AI/ML models to production using frameworks like Kubernetes and cloud-based ML platforms
- Solid understanding of monitoring and logging for performance tracking
- Experience in exploring new AI/ML methodologies and documenting the development and deployment lifecycle, including performance metrics
- Familiarity with AWS services, particularly SageMaker, is expected
- Excellent communication, presentation, and interpersonal skills are essential

Good to have:
- Knowledge of GenAI (LangChain, foundation model tuning, and GPT-3)
- AWS Certified Machine Learning - Specialty certification

Responsibilities:
- Explore different models and transform data science prototypes for a given problem
- Analyze datasets and perform data enrichment, feature engineering, and model training
- Able to write code using Python, Pandas, and DataFrame APIs
- Develop machine learning applications according to requirements
- Perform statistical analysis and fine-tuning using test results
- Collaborate with data engineers and architects to implement and deploy scalable solutions
- Encourage continuous innovation and out-of-the-box thinking
- Experience applying theoretical models in an applied environment

MLSE (Python/Pyspark) Professional

Noida, Indore

6 - 8 years

INR 8.0 - 10.0 Lacs P.A.

Work from Office

Full Time

Requirements:
- 6-8 years of good hands-on exposure to Big Data technologies: PySpark (DataFrame and Spark SQL), Hadoop, and Hive
- Good hands-on experience with Python and Bash scripts
- Good understanding of SQL and data warehouse concepts
- Strong analytical, problem-solving, data analysis, and research skills
- Demonstrable ability to think outside the box and not be dependent on readily available tools
- Excellent communication, presentation, and interpersonal skills are a must
- Hands-on experience with cloud-platform Big Data technologies (e.g. IAM, Glue, EMR, Redshift, S3, Kinesis)
- Orchestration with Airflow; any job scheduler experience
- Experience in migrating workloads from on-premise to cloud and cloud-to-cloud migrations

Responsibilities:
- Develop efficient ETL pipelines per business requirements, following development standards and best practices
- Perform integration testing of the created pipelines in the AWS environment
- Provide estimates for development, testing, and deployment on different environments
- Participate in peer code reviews to ensure our applications comply with best practices
- Create cost-effective AWS pipelines with the required AWS services, e.g. S3, IAM, Glue, EMR, Redshift

Lead Software Engineer (Pyspark/Python)

Noida, Indore

5 - 7 years

INR 7.0 - 9.0 Lacs P.A.

Work from Office

Full Time

Requirements:
- 5-7 years of good hands-on exposure to Big Data technologies: PySpark (DataFrame and Spark SQL), Hadoop, and Hive
- Good hands-on experience with Python and Bash scripts
- Good understanding of SQL and data warehouse concepts
- Strong analytical, problem-solving, data analysis, and research skills
- Demonstrable ability to think outside the box and not be dependent on readily available tools
- Excellent communication, presentation, and interpersonal skills are a must

Good to have:
- Hands-on experience with cloud-platform Big Data technologies (e.g. IAM, Glue, EMR, Redshift, S3, Kinesis)
- Orchestration with Airflow; any job scheduler experience
- Experience in migrating workloads from on-premise to cloud and cloud-to-cloud migrations

Responsibilities:
- Develop efficient ETL pipelines per business requirements, following development standards and best practices
- Perform integration testing of the created pipelines in the AWS environment
- Provide estimates for development, testing, and deployment on different environments
- Participate in peer code reviews to ensure our applications comply with best practices
- Create cost-effective AWS pipelines with the required AWS services, e.g. S3, IAM, Glue, EMR, Redshift

Senior Software Engineer (AI/ML)

Noida, Indore, Pune

2 - 7 years

INR 4.0 - 9.0 Lacs P.A.

Work from Office

Full Time

Requirements:
- Strong experience in Python
- 2+ years of experience working on feature/data pipelines using PySpark
- Understanding of and experience around data science
- Exposure to AWS cloud services such as SageMaker, Bedrock, Kendra, etc.
- Experience with machine learning model lifecycle management tools, and an understanding of MLOps principles and best practices
- Experience with statistical models, e.g. multinomial logistic regression
- Technical architecture, design, deployment, and operational-level knowledge
- Exploratory data analysis
- Knowledge of model building, hyperparameter tuning, and model performance metrics
- Statistics knowledge (probability distributions, hypothesis testing)
- Time series modelling, forecasting, image/video analytics, and natural language processing (NLP)

Good to have:
- Experience researching and applying large language and generative AI models
- Experience with LangChain, LlamaIndex, foundation model tuning, data augmentation, and performance evaluation frameworks
- Knowledge of Docker and Kubernetes

Responsibilities:
- Provide analytical expertise in the process of model development, refinement, and implementation across a variety of analytics problems
- Generate actionable insights for business improvements; ability to understand business requirements
- Write clean, efficient, and reusable code following best practices
- Troubleshoot and debug applications to ensure optimal performance; write unit test cases
- Collaborate with cross-functional teams to define and deliver new features
- Derive use cases and create solutions from structured/unstructured data
- Actively drive a culture of knowledge-building and sharing within the team
- Experience applying theoretical models in an applied environment; MLOps, data pipelines, data engineering

Technical Solution Architect - Databricks

Indore, Pune

4 - 8 years

INR 6.0 - 10.0 Lacs P.A.

Work from Office

Full Time

Requirements:
- Overall 10-18 years of data engineering experience, with a minimum of 4+ years of hands-on experience in Databricks
- Ready to travel onsite and work at client locations
- Proven hands-on experience as a Databricks Architect or similar role, with a deep understanding of the Databricks platform and its capabilities
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field
- In-depth hands-on implementation knowledge of Databricks: Delta Lake, Delta tables (managing Delta tables), Databricks cluster configuration, and cluster policies
- Experience handling structured and unstructured datasets
- Strong proficiency in programming languages like Python, Scala, or SQL
- Experience with cloud platforms like AWS, Azure, or Google Cloud, and understanding of cloud-based data storage and computing services
- Familiarity with big data technologies like Apache Spark, Hadoop, and data lake architectures
- Good experience in data engineering on Databricks, both batch processing and streaming
- Good experience in creating workflows and scheduling pipelines
- Good exposure to making packages or libraries available in Databricks
- Familiarity with Databricks default runtimes
- Databricks Certified Data Engineer Associate/Professional certification (desirable)
- Experience working in Agile methodology
- Strong verbal and written communication skills
- Strong analytical and problem-solving skills with high attention to detail

Responsibilities:
- Analyze business requirements and translate them into technical specifications for data pipelines, data lakes, and analytical processes on the Databricks platform
- Design and architect end-to-end data solutions, including data ingestion, storage, transformation, and presentation layers, to meet business needs and performance requirements
- Lead the setup, configuration, and optimization of Databricks clusters, workspaces, and jobs to ensure the platform operates efficiently and meets performance benchmarks
- Manage access controls and security configurations to ensure data privacy and compliance
- Design and implement data integration processes, ETL workflows, and data pipelines to extract, transform, and load data from various sources into the Databricks platform; optimize ETL processes to achieve high data quality and reduce latency
- Monitor and optimize query performance and overall platform performance; identify and resolve performance bottlenecks in the Databricks environment
- Establish and enforce best practices, standards, and guidelines for Databricks development, ensuring data quality, consistency, and maintainability
- Implement data governance and data lineage processes to ensure data accuracy and traceability
- Mentor and train team members on Databricks best practices, features, and capabilities; conduct knowledge-sharing sessions and workshops to foster a data-driven culture within the organization
- Develop and maintain data pipelines, ETL workflows, and analytical processes on the Databricks platform
- Take responsibility for Databricks practice technical/partnership initiatives, and build skills in technical areas that support the deployment and integration of Databricks-based solutions for customer projects

Qa Lead

Noida, Indore, Bengaluru

10 - 13 years

INR 20.0 - 30.0 Lacs P.A.

Work from Office

Full Time

Experience: 10 to 13 years

Must have:
- Expertise in Selenium and RestEasy, or experience with multiple automation frameworks
- Able to identify and architect an automation framework from scratch
- Can do high-level and low-level framework development and grooming, and guide all team members technically and logistically
- Programming: Python
- Problem-solving, logic, and analytical abilities
- Hands-on with QA frameworks: BDD, TDD, data-driven
- OS (Windows/Linux) concepts
- DBMS concepts and the ability to write queries
- Communication, confidence, attitude, and client interaction; able to handle a team and the client
- Tracking of end-to-end deliverables
- Overall QA processes, metrics, and reports

Good to have:
- Big Data technologies
- Cloud technology
- Document-based databases (MongoDB)
- Data analysis

Impetus Technologies


Information Technology and Services

Bengaluru

1001-5000 Employees

152 Jobs

Key People
- Sunil Rao, Managing Director
- Shivendra Kumar, Chief Technology Officer