Bengaluru, Karnataka, India
Not disclosed
On-site
Full Time
We are looking for an energetic, high-performing, and highly skilled Quality Assurance Engineer to help shape our technology and product roadmap. You will be part of the fast-paced, entrepreneurial Enterprise Personalization portfolio focused on delivering the next generation of global marketing capabilities.

About the Role
This team is responsible for global campaign tracking of new account acquisitions and bounty payments, and leverages transformational technologies such as SQL, Hadoop, Spark, PySpark, HDFS, MapReduce, Hive, HBase, Kafka, and Java.

Responsibilities
- Provides domain expertise to engineers on Automation, Testing, and Quality Assurance (QA) methodologies and processes.
- Crafts and executes test scripts.
- Assists in preparation of test strategies.
- Sets up and maintains test data and environments.
- Logs results.

Requirements
- High-performing and highly skilled in Quality Assurance.
- Expertise in Automation, Testing, and QA methodologies.
- Proficient in SQL, Hadoop, Spark, PySpark, HDFS, MapReduce, Hive, HBase, Kafka, and Java.

If you are interested and can join us within 0-30 days, please share your resume at vaishali.tyagi@impetus.com
Indore, Madhya Pradesh, India
Not disclosed
On-site
Full Time
Open Location - Indore, Noida, Gurgaon, Bangalore, Hyderabad, Pune

Job Description
- 3-8 years' experience working on data engineering, ETL/ELT processes, data warehousing, and data lake implementation with AWS services
- Hands-on experience designing and implementing solutions: creating/deploying jobs, orchestrating jobs/pipelines, and configuring infrastructure
- Expertise in designing and implementing PySpark and Spark SQL based solutions
- Design and implement data warehouses using Amazon Redshift, ensuring optimal performance and cost efficiency
- Good understanding of security, compliance, and governance standards

Roles & Responsibilities
- Design and implement robust and scalable data pipelines using AWS/Azure services
- Drive architectural decisions for data solutions on AWS, ensuring scalability, security, and cost-effectiveness
- Hands-on experience developing and deploying ETL/ELT processes from various data sources using Glue/Azure Data Factory, Lambda/Azure Functions, Step Functions/Azure Logic Apps/MWAA, S3, and Lake Formation
- Strong proficiency in PySpark, SQL, and Python
- Proficiency in SQL for data querying and manipulation
- Experience with data modelling, ETL processes, and data warehousing concepts
- Create and maintain documentation for data pipelines and processes, following best practices
- Knowledge of Spark optimization techniques, monitoring, and automation would be a plus
- Participate in code reviews and ensure adherence to coding standards and best practices
- Understanding of data governance, compliance, and security best practices
- Strong problem-solving and troubleshooting skills
- Excellent communication and collaboration skills, with an understanding of stakeholder mapping

Good to Have
- Understanding of Databricks
- GenAI and working with LLMs

Mandatory Skills - AWS or Azure Cloud, Python programming, SQL, Spark SQL, Hive, Spark optimization techniques, and PySpark.
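The extract-transform-load pattern this posting centers on can be sketched in miniature with plain Python. This is a toy stand-in only: the function names and sample data are invented for illustration, and in the stack described above the same three steps would run as a Glue/PySpark job writing to Redshift.

```python
import csv
import io

# Toy ETL job: extract CSV rows, transform (filter + derive a column),
# and load into an in-memory "warehouse" table.

RAW = """order_id,amount,country
1,120.50,IN
2,80.00,US
3,15.25,IN
"""

def extract(text):
    # Source read: parse CSV text into a list of dicts.
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    # Business rule: keep IN orders and derive the amount in paise.
    return [
        {"order_id": int(r["order_id"]), "amount_paise": int(float(r["amount"]) * 100)}
        for r in rows
        if r["country"] == "IN"
    ]

def load(rows, warehouse):
    # Target write: append into the warehouse table.
    warehouse.setdefault("orders_in", []).extend(rows)
    return warehouse

warehouse = load(transform(extract(RAW)), {})
print(warehouse["orders_in"])  # [{'order_id': 1, 'amount_paise': 12050}, {'order_id': 3, 'amount_paise': 1525}]
```

The separation into three small functions mirrors how such jobs are usually orchestrated: each stage can be validated and retried independently.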
Share resume at sonali.mangore@impetus.com with details (CTC, Expected CTC, Notice Period)
Bengaluru, Karnataka, India
Not disclosed
On-site
Full Time
About the Role:
We are seeking a skilled QA/Test Engineer with a strong focus on backend systems to join our growing technology team. The ideal candidate will have hands-on experience in API testing, automation scripting using C# or Kotlin, and working within distributed, event-driven architectures. You'll play a key role in validating complex backend workflows, ensuring data consistency, and integrating tests within CI/CD pipelines.

Key Responsibilities:
- Design, develop, and maintain automated test scripts for backend components
- Perform robust API testing using Postman
- Debug and validate large-scale data workflows and backend logic
- Test event-driven systems and validate Kafka message flows
- Integrate automated tests into CI/CD pipelines
- Work closely with developers, DevOps, and product teams to ensure quality at all stages

Must-Have Skills:
- 4+ years of experience in QA/Test Engineering with a backend focus
- Strong experience with API testing (Postman)
- Hands-on experience in test automation using C# or Kotlin
- Solid understanding of SQL and experience debugging data inconsistencies
- Familiarity with CI/CD tools and pipelines

Good to Have:
- Experience with Kafka or similar messaging/event-streaming systems
- Background in financial systems or high-volume data validation
- Experience with distributed systems or microservices testing

What We Offer:
- Opportunity to work on high-impact, scalable backend systems
- A collaborative environment focused on innovation and quality
- Competitive compensation and career growth opportunities
- Flexible work options and a supportive team
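The validation work this role describes (checking API contracts and data consistency) boils down to asserting properties of a response payload. A minimal sketch in plain Python, with a stubbed response standing in for a real HTTP call; the payload fields and status values are invented for the example, and on the job the equivalent checks would live in Postman or a C#/Kotlin test client:

```python
import json

# Stubbed backend response standing in for a real HTTP call.
RESPONSE_BODY = json.dumps({
    "order_id": "A-1001",
    "status": "SETTLED",
    "events": [
        {"type": "CREATED", "seq": 1},
        {"type": "SETTLED", "seq": 2},
    ],
})

def validate_order_payload(body):
    """Return a list of validation failures (empty means the payload passed)."""
    payload = json.loads(body)
    failures = []
    # Contract checks: required fields and allowed status values.
    for field in ("order_id", "status", "events"):
        if field not in payload:
            failures.append("missing field: " + field)
    if payload.get("status") not in {"CREATED", "SETTLED", "FAILED"}:
        failures.append("unexpected status: " + str(payload.get("status")))
    # Consistency check: event sequence numbers must be strictly increasing,
    # mirroring the ordering you would assert on a Kafka-backed flow.
    seqs = [e["seq"] for e in payload.get("events", [])]
    if seqs != sorted(set(seqs)):
        failures.append("event sequence not strictly increasing")
    return failures

print(validate_order_payload(RESPONSE_BODY))  # [] when the contract holds
```

Returning a list of failures rather than raising on the first one makes a single test run report every broken field at once, which shortens the debug loop for large payloads.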
Bengaluru, Karnataka, India
Not disclosed
On-site
Full Time
Job Summary:
We are looking for highly motivated and analytical Machine Learning Engineers with 1-3 years of experience in building scalable, production-ready AI/ML models. This role involves working on complex business problems using advanced ML/DL techniques across domains such as Natural Language Processing (NLP), Computer Vision, Time Series Forecasting, and Generative AI. You will be responsible for end-to-end model development, deployment, and performance tracking while collaborating with cross-functional teams including data engineering, DevOps, and product.

Location: Noida / Gurugram / Indore / Bengaluru / Pune / Hyderabad
Experience: 1-3 Years
Education: BE / B.Tech / M.Tech / MCA / M.Com

Key Responsibilities:

Model Development & Experimentation
- Design and build machine learning models for NLP, computer vision, and time series prediction using supervised, unsupervised, and deep learning techniques.
- Conduct experiments to improve model performance via architectural modifications, hyperparameter tuning, and feature selection.
- Apply statistical analysis to validate and interpret model results.
- Evaluate models using appropriate metrics (e.g., accuracy, precision, recall, F1-score, AUC-ROC).

Data Handling & Feature Engineering
- Process large structured and unstructured datasets using Python, Pandas, and DataFrame APIs.
- Perform feature extraction, transformation, and selection tailored to specific ML problems.
- Implement data augmentation and enrichment techniques to enhance training quality.

Model Deployment & Productionization
- Deploy trained models to production environments using cloud platforms such as AWS (especially SageMaker).
- Containerize models using Docker and orchestrate deployments with Kubernetes.
- Implement monitoring, logging, and automated retraining pipelines for model health tracking.

Collaboration & Innovation
- Collaborate with data engineers and architects to ensure smooth data flow and infrastructure alignment.
- Explore and adopt cutting-edge AI/ML methodologies and GenAI frameworks (e.g., LangChain, GPT-3).
- Contribute to documentation, versioning, and knowledge-sharing across teams.
- Drive innovation and continuous improvement in AI/ML delivery and engineering practices.

Mandatory Technical Skills:
- Languages & Tools: Python (Pandas, NumPy, Scikit-learn, TensorFlow/PyTorch)
- Model Development: Deep Learning, NLP, Time Series, Computer Vision
- Cloud Platforms: AWS (especially SageMaker)
- Model Deployment: Docker, Kubernetes, REST APIs
- ML Ops: Model monitoring, performance logging, CI/CD
- Frameworks: LangChain (for GenAI), Transformers, Hugging Face

Preferred / Good to Have:
- Experience with Foundation Model tuning and prompt engineering
- Hands-on with Generative AI (GPT-3/4, OpenAI APIs, LangChain integrations)
- Certifications: AWS Certified Machine Learning - Specialty
- Experience with version control (Git) and experiment tracking tools (MLflow, Weights & Biases)

Soft Skills:
- Excellent communication and presentation abilities
- Strong analytical and problem-solving mindset
- Ability to work in collaborative, fast-paced environments
- Curiosity to learn emerging technologies and apply them to real-world problems
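The evaluation metrics named in the responsibilities (precision, recall, F1-score) reduce to simple counts over predictions. A minimal sketch in plain Python with toy labels; in practice scikit-learn computes the same values from label arrays in one call:

```python
# Hand-computed binary classification metrics. y_true/y_pred are toy labels.
y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]

tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)  # true positives
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)  # false positives
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)  # false negatives

precision = tp / (tp + fp)  # of predicted positives, how many were real
recall = tp / (tp + fn)     # of real positives, how many were found
f1 = 2 * precision * recall / (precision + recall)  # harmonic mean of the two

print(f"precision={precision:.3f} recall={recall:.3f} f1={f1:.3f}")
# precision=0.750 recall=0.750 f1=0.750
```

Reporting precision and recall together matters because each can be gamed alone: predicting positive everywhere maximizes recall, predicting positive almost never can maximize precision.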
Chennai, Tamil Nadu, India
Not disclosed
On-site
Full Time
Impetus Technologies
Impetus Technologies is a leading digital engineering company specializing in data, cloud, and AI-driven solutions. Headquartered in Los Gatos, California, with global offices in India, Australia, and Canada, Impetus partners with Fortune 100 clients across banking, airlines, pharmaceuticals, and other industries. Innovative establishments like Gathr, ClearTrail, Intellicus, and Kyvos have emerged from the Impetus family. Impetus and its affiliate companies employ over 3700 engineers across continents. The group counts some of the largest companies in the world as its customers, including the biggest banks, airlines, and pharmaceutical companies across the globe. Impetus was named Databricks Migration Partner of the Year, 2024 and is also recognized among the best employers in the industry by a variety of national and international organizations, including Great Place to Work, AVTAR, and HRD Congress. To know more about us, visit: https://www.impetus.com/about/

QA - Automation Testing
Skills: QA - automation testing, Java, API testing; knowledge of or exposure to AI/ML will be an added advantage
Chennai, Tamil Nadu, India
Not disclosed
On-site
Full Time
Impetus Technologies

Java Bigdata Developer

Job Description
Skills required:
- 4+ years of experience in Java + Bigdata
- Experience in Java, Microservices, Spring Boot, APIs, and Bigdata (Hive, Spark, PySpark)
Chennai, Tamil Nadu, India
Not disclosed
On-site
Full Time
Impetus Technologies

JD:
- 4-7 years of experience building large-scale applications using Java and the Spring framework
- Strong knowledge and experience of software development methods; performs due diligence in all lifecycle stages of analysis, build, and testing
- Ability to write good JUnit tests and extend code coverage
- Ability to troubleshoot problems in test and production
- Strong communication skills and a team player
Chennai, Tamil Nadu, India
Not disclosed
On-site
Full Time
Impetus Technologies

QA - Automation Testing
Skills: QA - automation testing, Java, SQL, API testing; knowledge of or exposure to AI/ML will be an added advantage
Chennai, Tamil Nadu, India
Not disclosed
On-site
Full Time
Impetus Technologies

JAVA Full Stack Developer (Java + React)
Skills: Java, React/ReactJS, Redux/ReduxJS, JavaScript, SQL, Spring Boot, Python, Data Analytics

Full stack developer with good problem-solving skills, competent in Java, React JS, and SQL as core skills. Ability to learn and adapt to change.
Gurugram, Haryana, India
Not disclosed
On-site
Full Time
We are looking for an experienced AWS Databricks Architect with strong DevOps capabilities to lead the design, development, and deployment of scalable data solutions on the cloud. This role demands deep hands-on knowledge of AWS, Databricks, and CI/CD pipelines, with an emphasis on automation and performance optimization.

Key Responsibilities:
- Architect and implement scalable data solutions using AWS and Databricks.
- Develop and manage robust CI/CD pipelines for code deployment and infrastructure automation.
- Work closely with data engineering and DevOps teams to integrate data flows and cloud infrastructure.
- Ensure high performance, reliability, and security across deployed solutions.
- Provide technical leadership and mentorship to engineering teams.

Required Skills:
- 8+ years of experience in DevOps/Cloud/Architecture roles.
- Strong expertise in AWS services (EC2, S3, Lambda, CloudFormation, etc.).
- Hands-on experience with Databricks (workflows, notebooks, clusters).
- Proficiency in Terraform, Jenkins, Git, Docker, Kubernetes.
- Solid understanding of Python/Scala and SQL.
- Strong communication skills with the ability to work cross-functionally.

Nice to Have:
- Exposure to Azure or GCP cloud environments.
- Experience with Big Data tools (Spark, Kafka, etc.).
- AWS or Databricks certifications.
Gurugram, Haryana, India
Not disclosed
On-site
Full Time
About Impetus
Impetus Technologies is a digital engineering company focused on delivering expert services and products to help enterprises achieve their transformation goals. We solve the analytics, AI, and cloud puzzle, enabling businesses to drive unmatched innovation and growth. Founded in 1991, we are cloud and data engineering leaders providing solutions to Fortune 100 enterprises, headquartered in Los Gatos, California, with development centers in Noida, Indore, Gurugram, Bengaluru, Pune, and Hyderabad, and over 3000 global team members. We also have offices in Canada and Australia and collaborate with a number of established companies, including American Express, Bank of America, Capital One, Toyota, United Airlines, and Verizon.

Job Role: Data Scientist
Experience: 3.5+ years
Locations: Indore, Bangalore, Pune, Gurgaon, Noida

Job Description
- Hands-on experience working with LLMs (Large Language Models), LangChain, and OpenAI in particular
- Implementing and fine-tuning AI-generated text prompts using LLMs (e.g., GPT-4)
- Skilled in AI-specific utilities like ChatGPT, Hugging Face Transformers, etc.
- Ability to understand business requirements
- Use-case derivation and solution creation from structured/unstructured data
- Storytelling, business communication, and documentation
- Programming skills: Python, Scikit-Learn, TensorFlow, PyTorch, Keras
- Exploratory Data Analysis
- Machine learning and deep learning algorithms
- Model building, hyperparameter tuning, and model performance metrics
- MLOps, data pipelines, data engineering
- Statistics knowledge (probability distributions, hypothesis testing)
- Time series modeling, forecasting, image/video analytics, Natural Language Processing (NLP)
- ML services from clouds such as AWS, GCP, Azure, and Databricks
- Optional: Big Data - basic knowledge of Spark and Hive

Roles & Responsibilities
- Acquire skills required for building machine learning models and deploy them to production.
- Feature engineering, EDA, pipeline creation, model training, and hyperparameter tuning with structured and unstructured data sets.
- Develop cloud-based applications, including LLM/GenAI, and deploy them into production.

Qualification
Degree: Graduates/Postgraduates in CSE/IT or related field
Hyderabad, Telangana, India
Not disclosed
On-site
Full Time
About Impetus
Impetus Technologies is a digital engineering company focused on delivering expert services and products to help enterprises achieve their transformation goals. We solve the analytics, AI, and cloud puzzle, enabling businesses to drive unmatched innovation and growth. Founded in 1991, we are cloud and data engineering leaders providing solutions to Fortune 100 enterprises, headquartered in Los Gatos, California, with development centers in Noida, Indore, Gurugram, Bengaluru, Pune, and Hyderabad, and over 3000 global team members. We also have offices in Canada and Australia and collaborate with a number of established companies, including American Express, Bank of America, Capital One, Toyota, United Airlines, and Verizon.

Locations: Bangalore/Hyderabad/Noida

Job Overview:
We are seeking an experienced and highly skilled AWS Solutions Architect to join our team. The ideal candidate will have a strong background in designing, deploying, and managing scalable, secure, and cost-effective cloud architectures using Amazon Web Services (AWS). This role requires a deep understanding of AWS services and best practices, and a commitment to driving the organization's cloud strategy forward.

Key Responsibilities:
- Cloud Architecture Design: Design scalable, high-availability, and fault-tolerant cloud architectures using AWS services like EC2, S3, Lambda, RDS, VPC, etc.
- Solution Implementation: Work with development and operations teams to ensure AWS architecture is implemented, deployed, and optimized according to best practices.
- Cloud Migration: Lead cloud migration projects from on-premises infrastructure to AWS, ensuring smooth transitions with minimal disruption.
- Security & Compliance: Ensure that cloud solutions adhere to industry standards and compliance requirements (e.g., GDPR, HIPAA). Implement robust security measures like encryption, IAM, and multi-factor authentication.
- Cost Optimization: Continuously monitor and optimize AWS environments to reduce costs and improve efficiency, using tools such as AWS Cost Explorer, Trusted Advisor, and CloudWatch.
- Technical Leadership: Provide technical leadership, guidance, and mentorship to junior team members, fostering a collaborative and innovative environment.
- Documentation & Reporting: Develop and maintain comprehensive architecture documentation and regular reports on system performance, security, and cost efficiency.

Required Skills & Qualifications:
- Bachelor's degree in Computer Science, Engineering, or related field (or equivalent experience).
- Minimum of 11 years of experience in cloud architecture, specifically with AWS.
- Expertise in core AWS services, including EC2, Lambda, S3, VPC, RDS, and others.
- Hands-on experience with AWS management tools (CloudFormation, CloudWatch, etc.).
- Solid understanding of networking, security, and storage concepts in AWS.
- Ability to lead discussions with non-technical business stakeholders, understand the business domain and business challenges, and devise technical solutions/architectures that systematically cater to various business needs.
- Experience implementing a Customer 360 solution is a big plus.
- Experience with Infrastructure as Code (IaC) tools such as Terraform, CloudFormation, or AWS CDK.
- Familiarity with DevOps practices and CI/CD pipelines.
- Strong troubleshooting and problem-solving skills.
- Excellent communication skills, both written and verbal, with the ability to communicate complex technical concepts to non-technical stakeholders.

Preferred Qualifications:
- AWS Certified Solutions Architect - Associate or Professional.
- Experience with multi-cloud environments (e.g., Azure, Google Cloud).
- Knowledge of containerization and orchestration technologies (e.g., Docker, Kubernetes).
- Experience with serverless architectures and microservices.
Bengaluru, Karnataka, India
Not disclosed
On-site
Full Time
About Impetus
Impetus Technologies is a digital engineering company focused on delivering expert services and products to help enterprises achieve their transformation goals. We solve the analytics, AI, and cloud puzzle, enabling businesses to drive unmatched innovation and growth. Founded in 1991, we are cloud and data engineering leaders providing solutions to Fortune 100 enterprises, headquartered in Los Gatos, California, with development centers in Noida, Indore, Gurugram, Bengaluru, Pune, and Hyderabad, and over 3000 global team members. We also have offices in Canada and collaborate with a number of established companies, including American Express, Bank of America, Capital One, Toyota, United Airlines, and Verizon.

Experience: 3-8 years
Location: Gurgaon & Bangalore

Job Description
- Extensive production experience in GCP; other cloud experience would be a strong bonus.
- Strong background in data engineering, with 2-3 years of experience in Big Data technologies including Hadoop, NoSQL, Spark, Kafka, etc.
- Exposure to enterprise application development is a must.

Roles & Responsibilities
- Able to effectively use GCP managed services, e.g. Dataproc, Dataflow, Pub/Sub, Cloud Functions, BigQuery, GCS (at least 4 of these services).
- Good to have knowledge of Cloud Composer, Cloud SQL, Bigtable, and Cloud Functions.
- Strong experience in Big Data technologies - Hadoop, Sqoop, Hive, and Spark - including DevOps.
- Good hands-on expertise in either Python or Java programming.
- Good understanding of GCP core services like Google Cloud Storage, Google Compute Engine, Cloud SQL, and Cloud IAM.
- Good to have knowledge of GCP services like App Engine, GKE, Cloud Run, Cloud Build, and Anthos.
- Ability to drive the deployment of customers' workloads into GCP and provide guidance, a cloud adoption model, service integrations, appropriate recommendations to overcome blockers, and technical road-maps for GCP cloud implementations.
- Experience with technical solutions based on industry standards using GCP IaaS, PaaS, and SaaS capabilities.
- Extensive, real-world experience designing technology components for enterprise solutions and defining solution architectures and reference architectures with a focus on cloud technologies.
- Act as a subject-matter expert or developer around GCP and become a trusted advisor to multiple teams.
Noida, Uttar Pradesh, India
Not disclosed
On-site
Full Time
Job Description
- Design and implement solutions using Tableau and/or other leading BI tools
- Work closely with business analysts and the BI lead to help business teams drive improvement in key business metrics and customer experience
- Responsible for timely, quality, and successful deliveries
- Share knowledge and experience within the team and with other groups in the organization

Roles & Responsibilities
- Experience in Tableau, along with knowledge of another leading BI tool such as Power BI, MicroStrategy, Qlik Sense, or Looker
- Experience working with high-volume databases and MPPs
- Experience with the preparation of data (e.g., data profiling, data cleansing, volume assessment and partitioning, etc.)
- Good skills in databases (Oracle / MySQL / DB2) and expertise in writing SQL queries
- Very well versed in HiveQL / Spark SQL / Impala SQL
- Working knowledge of scripting languages like Perl, Shell, and Python is desirable
- Hands-on knowledge of ETL tools
- Hands-on knowledge of enterprise repository tools, data modeling tools, data mapping tools, and data profiling tools
- Experience working on BI platform migration project(s)
- Good understanding of the business to build the formulae and calculations needed to recreate or migrate existing reports and dashboards
- Experience customizing/extending the default functionalities of BI tools
- Experience working in multiple business domains (e.g. BFSI, healthcare, telecom) is desirable
- Experience with agile-based development
- Outstanding communication, problem-solving, and interpersonal skills
- Self-starter and resourceful, skilled in identifying and mitigating risks
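As a small illustration of the SQL expertise the list calls out, here is a grouped aggregation run through Python's stdlib `sqlite3`. The table and column names are invented for the example; on the job the same query shape would target Oracle/MySQL/DB2 or HiveQL/Spark SQL behind a BI dashboard measure.

```python
import sqlite3

# Build a tiny in-memory sales table and compute per-region revenue,
# the kind of aggregate that typically backs a BI dashboard measure.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("north", 100.0), ("south", 250.0), ("north", 50.0)],
)

rows = conn.execute(
    """
    SELECT region, SUM(amount) AS revenue
    FROM sales
    GROUP BY region
    ORDER BY revenue DESC
    """
).fetchall()

print(rows)  # [('south', 250.0), ('north', 150.0)]
conn.close()
```

When migrating such a calculation between BI platforms, the GROUP BY/aggregate shape usually survives intact; it is the tool-specific calculated-field syntax around it that has to be rebuilt.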
Gurugram, Haryana, India
Not disclosed
On-site
Full Time
Job Description
- Good understanding of program logic with Python
- Proficiency in automation testing using Selenium WebDriver / Robot Framework
- Significant exposure to and experience with SQL queries and database testing
- Experience in API testing / web services (WCF services, Web API)
- Familiarity with Agile frameworks/methodology
- Ability to create test plans and test cases
- Ability to document and troubleshoot defects
- Strong problem-solving and troubleshooting skills
- Strong QA fundamentals
- Knowledge of cloud would be an added advantage
- Excellent team player and self-motivator

Roles & Responsibilities
- Establish QA-related procedures and metrics to measure product quality
- Act as a bridge between the Development and Test Engineering teams
- Define the test framework and architecture as per the client requirements
- Roll out best practices for test engineering to achieve the optimum solution for product testing
- Guide and help in the resolution of the team's technical and personal issues
- Provide technical support and technical quality control throughout all stages of the project
- Treat this as a role rather than just a job
- Build smarter, more efficient, productive, self-driven, and motivated teams
- Perform effective team and people management
Pune, Maharashtra, India
Not disclosed
On-site
Full Time
Location: Pune / Indore / Bangalore / Noida
Job Title: Big Data Engineer

Job Description
Key Technical Skills (Must-Have & Good-to-Have):
- 2-4 years of professional experience
- Expertise and hands-on experience with Python - Must Have
- In-depth knowledge of Spark SQL / Spark DataFrame - Must Have
- Strong understanding of SQL - Good to Have
- Experience with shell scripting - Good to Have
- Knowledge of workflow engines such as Oozie, Autosys - Good to Have
- Familiarity with Agile development methodologies - Good to Have
- Understanding of cloud technologies - Good to Have
- Passion for exploring new technologies - Good to Have
- Approach to automation - Good to Have

Roles & Responsibilities:
The selected candidate will be responsible for Data Warehouse modernization projects, including:
- Developing programs/scripts using Python/Java combined with Spark SQL/Spark DataFrame, or Python/Java plus cloud-native SQL (e.g., Redshift SQL, SnowSQL)
- Script validation and ensuring optimal performance
- Performance tuning of data processes
- Data ingestion from source to target platforms
- Job orchestration

Experience Required: 2 to 4 years
Education/Qualification: BE / B.Tech / MCA / M.Tech / M.Com
Noida, Uttar Pradesh, India
Not disclosed
On-site
Full Time
At Impetus Technologies, we are a technology solutions company that thrives on innovation and excellence. Our team is dedicated to providing top-notch services to our clients and ensuring smooth, efficient operations at all times. As a Senior Executive - Admin, you will play a crucial role in maintaining the seamless flow of our transportation and administrative operations.

Key Responsibilities:
- Oversee transportation operations, including scheduling, routing, and ensuring timely delivery of goods and services.
- Manage and maintain fleet vehicles, including coordinating repairs, inspections, and registration renewals.
- Develop and implement transportation policies and procedures to ensure compliance with regulatory standards.
- Supervise and train transportation staff to ensure efficient and safe operations.
- Handle administrative tasks such as managing office supplies, coordinating travel arrangements, and overseeing office maintenance.
- Supervise administrative staff and ensure smooth day-to-day office operations.
- Assist in budget planning and control for transportation and administrative expenses.

Qualifications:
- Bachelor's degree in business administration, logistics, or a related field.
- Proven experience in transportation management and administrative roles.
- Strong organizational and leadership skills.
- Excellent communication and interpersonal abilities.
- Proficiency in MS Office and transportation management software.

If you possess the skills and experience required for this position and are seeking a challenging and rewarding career, we encourage you to apply for the Senior Executive - Admin role at Impetus Technologies. Join us in our mission to drive excellence and innovation in all aspects of our operations. We look forward to welcoming you to our team.
Pune, Maharashtra, India
Not disclosed
On-site
Full Time
Job Description:
Location: Indore, Noida, Pune, and Bengaluru
Qualifications: BE/B.Tech/MCA/M.Tech/M.Com in Computer Science or related field
Experience Required: 1-3 years

Required Skills:
- EDW Expertise: Hands-on experience with Teradata or Oracle.
- PL/SQL Proficiency: Strong ability to write complex queries.
- Performance Tuning: Expertise in optimizing queries to meet SLA requirements.
- Communication: Strong verbal and written communication skills.

Preferred Skills:
- Cloud Technologies: Working knowledge of AWS S3 and Redshift or equivalent.
- Database Migration: Familiarity with database migration processes.
- Big Data Tools: Understanding of Spark SQL and PySpark.
- Programming: Experience with Python for data processing and analytics.
- Data Management: Experience with import/export operations.

Roles & Responsibilities
- Module Ownership: Manage a module and assist the team.
- Optimized PL/SQL Development: Write efficient queries.
- Performance Tuning: Improve database speed and efficiency.
- Requirement Analysis: Work with business users to refine needs.
- Application Development: Build solutions using complex SQL queries.
- Data Validation: Ensure integrity of large datasets (TB/PB).
- Testing & Debugging: Conduct unit testing and fix issues.
- Database Strategies: Apply best practices for development.

Interested candidates can share their resumes at anubhav.pathania@impetus.com
Bengaluru, Karnataka, India
Not disclosed
On-site
Full Time
Job Description:
We are seeking an experienced engineer with strong expertise in PostgreSQL, PL/SQL programming, and cloud-based data migration. The ideal candidate will have hands-on experience migrating and tuning databases, particularly from Oracle to PostgreSQL on GCP (AlloyDB / Cloud SQL), and be skilled in modern data architecture and cloud services.

Locations: Indore/Bengaluru/Noida

Key Responsibilities:
- Design, build, test, and maintain scalable data architectures on GCP.
- Lead Oracle-to-PostgreSQL data migration initiatives (preferably AlloyDB / Cloud SQL).
- Optimize PostgreSQL performance (e.g., tuning autovacuum, stored procedures).
- Translate Oracle PL/SQL code to PostgreSQL equivalents.
- Integrate hybrid data storage using GCP services (BigQuery, Firestore, MemoryStore, Spanner).
- Implement database job scheduling, disaster recovery, and logging.
- Work with GCP Dataflow, MongoDB, and data migration services.
- Mentor and lead database engineering teams.

Required Technical Skills:
- Advanced PostgreSQL & PL/SQL programming (queries, procedures, functions).
- Strong experience with database migration (Oracle ➝ PostgreSQL on GCP).
- Proficiency in Cloud SQL, AlloyDB, and performance tuning.
- Hands-on experience with BigQuery, Firestore, Spanner, MemoryStore, MongoDB, and Cloud Dataflow.
- Understanding of OLTP and OLAP systems.

Desirable Qualifications:
- GCP Database Engineer Certification.
- Exposure to enterprise architecture, project delivery, and performance benchmarking.
- Strong analytical, problem-solving, and leadership skills.

Years of Experience: 7 to 10 years
Education/Qualification: BE / B.Tech / MCA / M.Tech / M.Com

Interested candidates can directly share their resume at anubhav.pathania@impetus.com
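Translating Oracle PL/SQL to PostgreSQL, one of the responsibilities this role names, is largely a matter of mapping Oracle-specific idioms to their PostgreSQL equivalents: NVL becomes COALESCE, SYSDATE becomes CURRENT_TIMESTAMP, the DUAL dummy table disappears. A toy Python sketch of that mapping (real migrations rely on tooling such as ora2pg plus careful review, not regex substitution):

```python
import re

# A few well-known Oracle -> PostgreSQL idiom mappings (illustrative subset;
# production translation needs a real SQL parser, not string replacement).
ORACLE_TO_PG = [
    (r"\bNVL\s*\(", "COALESCE("),           # NVL(a, b)  -> COALESCE(a, b)
    (r"\bSYSDATE\b", "CURRENT_TIMESTAMP"),  # Oracle pseudo-column -> SQL standard
    (r"\bFROM\s+DUAL\b", ""),               # PostgreSQL allows SELECT with no FROM
]

def translate(sql: str) -> str:
    for pattern, replacement in ORACLE_TO_PG:
        sql = re.sub(pattern, replacement, sql, flags=re.IGNORECASE)
    return sql.strip()

print(translate("SELECT NVL(bonus, 0), SYSDATE FROM DUAL"))
# -> SELECT COALESCE(bonus, 0), CURRENT_TIMESTAMP
```

Syntax rewrites like these are the easy part; semantics (empty string vs NULL, date arithmetic, autonomous transactions) are where migration effort actually goes.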
Greater Bengaluru Area
Not disclosed
On-site
Full Time
We are looking for a skilled ETL pipeline support engineer to join our DevOps team. In this role, you will ensure the smooth operation of production ETL pipelines and be responsible for monitoring and troubleshooting existing pipelines. The role requires a strong understanding of SQL and Spark, along with experience with AWS Glue and Redshift.

Required Skills and Experience:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Proven experience in supporting and maintaining ETL pipelines.
- Strong proficiency in SQL and experience with relational databases (e.g., Redshift).
- Solid understanding of distributed computing concepts and experience with Apache Spark.
- Hands-on experience with AWS Glue and other AWS data services (e.g., S3, Lambda).
- Experience with data warehousing concepts and best practices.
- Excellent problem-solving and analytical skills; strong communication and collaboration skills.
- Ability to work independently and as part of a team.

Preferred Skills and Experience:
- Experience with other ETL tools and technologies.
- Experience with scripting languages (e.g., Python).
- Familiarity with Agile development methodologies.
- Experience with data visualization tools (e.g., Tableau, Power BI).

Roles & Responsibilities:
- Monitor and maintain existing ETL pipelines, ensuring data quality and availability.
- Identify and resolve pipeline issues and data errors.
- Troubleshoot data integration processes.
- Collaborate with data engineers and other stakeholders to resolve complex issues when needed.
- Develop and maintain necessary documentation for ETL processes and pipelines.
- Participate in the on-call rotation for production support.
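Day-to-day pipeline support of the kind described above usually involves lightweight data-quality checks between pipeline stages, so bad batches are caught before they land in the warehouse. A minimal, framework-agnostic sketch in plain Python (the function name and threshold are hypothetical; in practice such checks would run inside a Glue job or against Redshift):

```python
from typing import Iterable

def check_batch(rows: Iterable[dict], required_fields: tuple,
                max_null_fraction: float = 0.05) -> list:
    """Return a list of data-quality issues found in one ETL batch (hypothetical check)."""
    rows = list(rows)
    issues = []
    if not rows:
        issues.append("batch is empty")
        return issues
    for field in required_fields:
        # Flag any required field whose null rate exceeds the allowed fraction.
        nulls = sum(1 for r in rows if r.get(field) is None)
        if nulls / len(rows) > max_null_fraction:
            issues.append(f"{field}: {nulls}/{len(rows)} null values exceed threshold")
    return issues

batch = [{"order_id": 1, "amount": 9.99},
         {"order_id": 2, "amount": None},
         {"order_id": None, "amount": 5.00}]
print(check_batch(batch, ("order_id", "amount")))
```

Emitting a list of issues (rather than raising on the first one) lets the on-call engineer log everything wrong with a batch in a single monitoring pass.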