
921 Sqoop Jobs - Page 13

Set up a Job Alert
JobPe aggregates listings for easy application access, but you apply directly on the original job portal.

3.0 - 8.0 years

9 - 13 Lacs

Ahmedabad

Work from Office

Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. You will play a crucial role in shaping the data platform components.

Roles & Responsibilities:
- Expected to perform independently and become an SME
- Active participation/contribution in team discussions is required
- Contribute to providing solutions to work-related problems
- Collaborate with cross-functional teams to design and implement data platform solutions
- Develop and maintain data pipelines for efficient data processing (see the sketch after this posting)
- Implement data security and privacy measures to protect sensitive information
- Optimize data storage and retrieval processes for improved performance
- Conduct regular data platform performance monitoring and troubleshooting

Professional & Technical Skills:
- Must-have: Proficiency in Databricks Unified Data Analytics Platform
- Strong understanding of cloud-based data platforms
- Experience with data modeling and database design
- Hands-on experience with ETL processes and tools
- Knowledge of data governance and compliance standards

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform
- This position is based at our Ahmedabad office
- 15 years of full-time education is required
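The pipeline work described above usually reduces to reading raw files, applying transformations, and writing governed tables. A minimal sketch of one such step on Databricks, assuming hypothetical paths and table names (`/mnt/raw/orders`, `analytics.orders_clean`):

```python
# Minimal Databricks-style pipeline step: raw files -> cleaned managed table.
# Paths and table names are placeholders, not from the posting.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_clean").getOrCreate()

raw = (spark.read
       .option("header", "true")
       .csv("/mnt/raw/orders"))          # assumed landing location

clean = (raw
         .dropDuplicates(["order_id"])                  # basic de-duplication
         .withColumn("amount", F.col("amount").cast("double"))
         .filter(F.col("order_date").isNotNull()))      # drop unusable rows

# Write as a managed table so downstream consumers query it by name.
clean.write.mode("overwrite").saveAsTable("analytics.orders_clean")
```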

Posted 4 weeks ago

Apply

3.0 - 5.0 years

5 - 9 Lacs

Navi Mumbai

Work from Office

Skill required: Network Billing Operations - Problem Management
Designation: Network & Services Operation Analyst
Qualifications: Any Graduation
Years of Experience: 3 to 5 years

About Accenture: Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. Visit us at www.accenture.com

What would you do? Help transform back-office and network operations, reduce time to market, and grow revenue by improving customer experience and capex efficiency and reducing cost-to-serve. Good customer support experience is preferred, along with good networking knowledge. Manage problems caused by information technology infrastructure errors to minimize their adverse impact on business and to prevent their recurrence by seeking the root cause of those incidents and initiating actions to improve or correct the situation.

What are we looking for?
- 5 years of advanced-level programming skills, with responsibility for maintaining existing and creating new queries via SQL scripts; Python and PySpark programming skills; experience with Databricks (Palantir is an advantage)
- Must be self-motivated and understand short turnaround expectations
- Desire to learn and understand data models and billing processes
- Critical thinking
- Experience with reporting and metrics; strong numerical skills
- Experience in expense, billing, or financial management
- Experience in process/system management
- Good organizational skills; self-disciplined, systematic approach with good interpersonal skills
- Flexible, analytical mind, problem solver
- Knowledge of telecom products and services

Roles and Responsibilities:
- In this role you are required to analyze and solve lower-complexity problems
- Your day-to-day interaction is with peers within Accenture before updating supervisors
- You may have limited exposure to clients and/or Accenture management
- You will be given moderate-level instruction on daily work tasks and detailed instructions on new assignments
- The decisions you make impact your own work and may impact the work of others
- You will be an individual contributor as part of a team, with a focused scope of work
- Please note that this role may require you to work in rotational shifts

Qualification: Any Graduation

Posted 4 weeks ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Pune

Work from Office

Contract | Exp: 5-10 yrs | Location: Pune, Chennai, Hyderabad | Type: Third Party Payroll | Mode: Office (strictly no WFH/hybrid) | Interview: F2F mandate - 2nd round

Required details:
- Total Experience
- Relevant Experience
- Current Company
- Designation
- Current CTC
- Expected CTC
- Notice Period
- Current Location
- Expected Location
- Offer in Hand
- PAN Number
- DOB
- Reason for Job Change
- Education: Passed-out Year, Percentage, Highest Degree
- Interview Availability (any 3 slots)
- UAN Number

Reach out: nithya@natobotics.com
Role: PySpark, Snowflake Developer

Posted 4 weeks ago

Apply

4.0 - 9.0 years

7 - 11 Lacs

Pune

Work from Office

What You'll Do: The Global Analytics & Insights (GAI) team is looking for a Data Engineer to help build the data infrastructure for Avalara's core data assets, empowering the organization with accurate, timely data to drive data-backed decisions. As a Data Engineer, you will help implement and maintain our data infrastructure using Snowflake, dbt (Data Build Tool), Python, Terraform, and Airflow. You will learn the ins and outs of Avalara's financial, sales, and marketing data to become a go-to resource of Avalara knowledge. You will have foundational SQL experience, an understanding of modern data stacks and technology, a desire to build things the right way using modern software principles, and experience with data and all things data-related.

What Your Responsibilities Will Be:
- Design functional data models by demonstrating understanding of business use cases and different data sources
- Develop scalable, reliable, and efficient data pipelines using dbt, Python, or other ELT tools
- Build scalable, complex dbt models to support a variety of data products (see the sketch after this posting)
- Implement and maintain scalable data orchestration and transformation, ensuring data accuracy, consistency, and timeliness
- Collaborate with cross-functional teams to understand complex requirements and translate them into technical solutions

You will report to the Senior Manager, Data & Analytics Engineering.

What You'll Need to Be Successful:
- Bachelor's degree in Computer Science, Engineering, or a related field
- 4+ years of experience in data engineering, with deep SQL knowledge
- 3+ years of working with Git, and demonstrated experience collaborating with other engineers across repositories
- 2+ years of working with Snowflake
- 2+ years working with dbt (dbt Core preferred)
- Experience working with complex Salesforce data
- Functional experience with AWS
- Functional experience with Infrastructure as Code, preferably Terraform
- Functional experience with CI/CD and DevOps concepts
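dbt models are SQL, but the incremental-load pattern they automate can be sketched directly against Snowflake from Python. A sketch using snowflake-connector-python, with hypothetical table names and credentials:

```python
# Sketch of an incremental load against Snowflake, the pattern a dbt
# incremental model automates. Names and credentials are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # assumed account identifier
    user="etl_user",
    password="***",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="MARTS",
)

MERGE_SQL = """
MERGE INTO orders_fact AS tgt
USING (
    SELECT order_id, customer_id, amount, updated_at
    FROM raw.orders
    WHERE updated_at > (SELECT COALESCE(MAX(updated_at), '1970-01-01') FROM orders_fact)
) AS src
ON tgt.order_id = src.order_id
WHEN MATCHED THEN UPDATE SET amount = src.amount, updated_at = src.updated_at
WHEN NOT MATCHED THEN INSERT (order_id, customer_id, amount, updated_at)
    VALUES (src.order_id, src.customer_id, src.amount, src.updated_at)
"""

with conn.cursor() as cur:
    cur.execute(MERGE_SQL)   # only new or changed rows are applied
conn.close()
```

The MERGE keeps reruns idempotent, which is the same guarantee a dbt incremental materialization provides.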

Posted 4 weeks ago

Apply

4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

At Medtronic you can begin a life-long career of exploration and innovation, while helping champion healthcare access and equity for all. You'll lead with purpose, breaking down barriers to innovation in a more connected, compassionate world.

A Day in the Life - Careers that Change Lives: At Medtronic, we push the limits of technology to make tomorrow better than today, which makes it an exciting and rewarding place to work. We value what makes you unique. Be a part of a company that thinks differently to solve problems, make progress, and deliver meaningful innovations. As a Data Engineer II, you will be part of our data engineering team responsible for developing, deploying, monitoring, and supporting the data mart platform. In addition, you will be responsible for creating tools and automating operational tasks to integrate the data platform with external systems. Your entrepreneurial mindset and technical skills will be used to create solutions that meet business needs and optimize customer experience, directly impacting the organization and affecting the lives of millions. We believe that when people from diverse cultures, genders, and points of view come together, innovation is the result, and everyone wins. Medtronic walks the walk, creating an inclusive culture where you can thrive.

Responsibilities for the Data Engineer II role include, but are not limited to, the following:
- Work effectively within geographically dispersed and cross-functional teams during all phases of the product development process
- Be responsive, flexible, and self-motivated, and able to succeed within an open, collaborative peer environment
- Participate in reviews and code inspections, and support the development of required documentation
- Be Agile and effectively navigate changing project priorities
- Work independently under limited supervision
- Set up proactive monitoring and alerting
- Troubleshoot production issues

Qualifications - Must-Have (Minimum Requirements; to be considered for this role, please be sure the minimum requirements are evident on your resume):
- Overall 4-7 years of IT experience with a Bachelor's degree in Computer Engineering, Software Engineering, Computer Science, Electrical Engineering, or a related technical field
- Minimum 3 years of relevant experience in data engineering
- Minimum 2 years of working experience in PySpark and other data processing tools like Hive, Sqoop, etc.
- Minimum 1 year of experience in AWS and AWS-native tools: S3, Glue, Lambda, EMR, Athena (see the sketch after this posting)
- Minimum 1 year of hands-on experience with programming languages such as Python
- Strong expertise in writing SQL queries
- Source control experience with Git/GitHub
- Strong problem-solving skills
- Experience in writing unit tests and developing data quality frameworks
- Strong written and verbal communication and presentation skills

Nice to Have:
- Previous healthcare industry experience
- Experience working with CI/CD tools, preferably Azure Pipelines and Terraform
- AWS certifications (AWS Developer / AWS Data Engineer)
- Working experience with a reporting tool like Power BI

Physical Job Requirements: The above statements are intended to describe the general nature and level of work being performed by employees assigned to this position, but they are not an exhaustive list of all the required responsibilities and skills of this position.

Benefits & Compensation: Medtronic offers a competitive salary and flexible benefits package. A commitment to our employees' lives is at the core of our values. We recognize their contributions. They share in the success they help to create. We offer a wide range of benefits, resources, and competitive compensation plans designed to support you at every career and life stage.

About Medtronic: We lead global healthcare technology and boldly attack the most challenging health problems facing humanity by searching out and finding solutions. Our Mission, to alleviate pain, restore health, and extend life, unites a global team of 95,000+ passionate people. We are engineers at heart, putting ambitious ideas to work to generate real solutions for real people. From the R&D lab, to the factory floor, to the conference room, every one of us experiments, creates, builds, improves and solves. We have the talent, diverse perspectives, and guts to engineer the extraordinary. Learn more about our business, mission, and our commitment to diversity here.
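On the AWS-native stack named above, the day-to-day work is typically PySpark jobs on EMR or Glue reading and writing S3. A minimal sketch, with hypothetical bucket names:

```python
# Minimal EMR/Glue-style PySpark job: read raw S3 data, aggregate, write
# back partitioned Parquet. Bucket names and prefixes are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily_event_rollup").getOrCreate()

events = spark.read.json("s3://example-raw-bucket/events/")   # assumed input

daily = (events
         .withColumn("event_date", F.to_date("event_ts"))
         .groupBy("event_date", "device_id")
         .agg(F.count("*").alias("event_count")))

# Partitioned Parquet keeps Athena scans cheap: queries prune by event_date.
(daily.write
      .mode("overwrite")
      .partitionBy("event_date")
      .parquet("s3://example-curated-bucket/daily_rollup/"))
```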

Posted 4 weeks ago

Apply

5.0 - 10.0 years

20 - 35 Lacs

Pune

Work from Office

Description: Hiring Data Engineer with AWS or GCP Cloud

Role Summary: The Data Engineer will be responsible for designing, implementing, and maintaining the data infrastructure and pipelines necessary for AI/ML model training and deployment. They will work closely with data scientists and engineers to ensure data is clean, accessible, and efficiently processed.

Required Experience:
- 6-8 years of experience in data engineering, ideally in financial services
- Strong proficiency in SQL, Python, and big data technologies (e.g., Hadoop, Spark)
- Experience with cloud platforms (e.g., AWS, Azure, GCP) and data warehousing solutions
- Familiarity with ETL processes and tools
- Knowledge of data governance, security, and compliance best practices

Key Responsibilities:
- Build and maintain scalable data pipelines for data collection, processing, and analysis
- Ensure data quality and consistency for training and testing AI models (see the sketch after this posting)
- Collaborate with data scientists and AI engineers to provide the required data for model development
- Optimize data storage and retrieval to support AI-driven applications
- Implement data governance practices to ensure compliance and security

What We Offer:
- Exciting Projects: We focus on industries like high-tech, communication, media, healthcare, retail, and telecom. Our customer list is full of fantastic global brands and leaders who love what we build for them.
- Collaborative Environment: You can expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment, or even abroad in one of our global centers or client facilities!
- Work-Life Balance: GlobalLogic prioritizes work-life balance, which is why we offer flexible work schedules, opportunities to work from home, and paid time off and holidays.
- Professional Development: Our dedicated Learning & Development team regularly organizes communication skills training (GL Vantage, Toastmasters), stress management programs, professional certifications, and technical and soft-skill trainings.
- Excellent Benefits: We provide our employees with competitive salaries, family medical insurance, Group Term Life Insurance, Group Personal Accident Insurance, NPS (National Pension Scheme), periodic health awareness programs, extended maternity leave, annual performance bonuses, and referral bonuses.
- Fun Perks: We want you to love where you work, which is why we host sports events and cultural activities, offer food at subsidized rates, and throw corporate parties. Our vibrant offices also include dedicated GL Zones, rooftop decks, and the GL Club, where you can have coffee or tea with your colleagues over a game, and we offer discounts at popular stores and restaurants!
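Since the role stresses data quality and consistency for model training, a simple pre-training validation gate can be sketched in PySpark. Thresholds and column names below are illustrative assumptions:

```python
# Illustrative data-quality gate before handing a dataset to model training.
# Thresholds, paths, and column names are assumptions, not from the posting.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq_gate").getOrCreate()
df = spark.read.parquet("s3://example-bucket/features/")  # assumed location

total = df.count()
null_labels = df.filter(F.col("label").isNull()).count()
dup_keys = total - df.dropDuplicates(["record_id"]).count()

# Fail fast if the batch is unusable; the pipeline should not train
# silently on bad data.
assert total > 0, "empty batch"
assert null_labels / total < 0.01, f"too many null labels: {null_labels}"
assert dup_keys == 0, f"duplicate record_ids: {dup_keys}"
```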

Posted 4 weeks ago

Apply

5.0 - 10.0 years

15 - 27 Lacs

Bengaluru

Work from Office

Job Summary: We are seeking a highly motivated Senior Data Engineer with expertise in designing, building, and securing data systems. The ideal candidate will have a strong background in data engineering, security compliance, and distributed systems, with a focus on ensuring adherence to industry standards and regulatory requirements.

Location: Bangalore
Experience: 4 to 13 years
Must have: Informatica BDM, Oozie scheduling, Hive, HDFS

Key Responsibilities:
- Design, implement, and maintain secure data systems, including wrapper solutions for components with minimal security controls, ensuring compliance with bank standards
- Identify security design gaps in existing and proposed architectures and recommend enhancements to strengthen system resilience
- Develop and enforce security controls for data transfers, including CRON, ETLs, and JDBC-ODBC scripts
- Ensure compliance with data sensitivity standards, such as avoiding storage of card numbers or PII in logs, and maintaining data integrity
- Collaborate on distributed systems, focusing on resiliency, monitoring, and troubleshooting in production environments
- Work with Agile/DevOps practices, CI/CD pipelines (GitHub, Jenkins), and scripting tools to optimize data workflows
- Troubleshoot and resolve issues in large-scale data infrastructures, including SQL/NoSQL databases, HDFS, Hive, and HQL

Requirements:
- 5+ years of total experience, with 4+ years in Informatica Big Data Management
- Extensive knowledge of Oozie scheduling, HQL, Hive, HDFS, and data partitioning (see the sketch after this posting)
- Proficiency in SQL and NoSQL databases, along with Linux OS configuration and shell scripting
- Strong understanding of networking concepts (DNS, proxy, ACL, policy) and data transfer security
- In-depth knowledge of compliance and regulatory requirements (encryption, anonymization, policy controls)
- Familiarity with Agile/DevOps, CI/CD, and distributed systems monitoring
- Ability to address data sensitivity concerns in logging, events, and in-memory storage

About Us: For a customer in the banking sector with financial services requirements, we worked on Informatica Big Data Management, Oozie, Hive, and security compliance frameworks. Contact [dlt] and [slt] for more details.
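Hive/HDFS partitioning work of the kind listed above is commonly scripted through Spark's Hive support. A sketch with assumed database, path, and column names:

```python
# Sketch: write a partitioned Hive table from Spark. All names are
# placeholders for illustration.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("txn_partitioned_load")
         .enableHiveSupport()          # routes saveAsTable to the Hive metastore
         .getOrCreate())

txns = spark.read.parquet("/data/landing/transactions")   # assumed HDFS path

# Partition by business date so queries and retention jobs touch one
# directory per day instead of scanning the whole table.
(txns.write
     .mode("overwrite")
     .partitionBy("txn_date")
     .saveAsTable("bank_dw.transactions"))

spark.sql("SHOW PARTITIONS bank_dw.transactions").show(5, truncate=False)
```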

Posted 4 weeks ago

Apply

2.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Title: AI Engineer
Experience: Minimum 2 years
CTC: 3 LPA - 6 LPA
Location: Hinjewadi, Pune
Work Mode: Work from Office
Availability: Immediate Joiner

About Us: Rasta.AI, a product of AI Unika Technologies (P) Ltd, is a pioneering technology company based in Pune. We specialize in road infrastructure monitoring and maintenance using cutting-edge AI, computer vision, and 360-degree imaging. Our platform delivers real-time insights into road conditions to improve safety, efficiency, and sustainability. We collaborate with government agencies, private enterprises, and citizens to enhance road management through innovative tools and solutions.

Role Description: This is a full-time, on-site role. As an AI Engineer, you will be responsible for developing innovative AI models and software solutions to address real-world challenges. You will collaborate with cross-functional teams to identify business opportunities and provide customized solutions. You will also work alongside talented engineers, designers, and data scientists to implement and maintain these models and solutions.

Technical Skills Required:
- Programming languages: Python (and other AI-supported languages)
- Generative AI: LLMs (e.g., GPT, LLaMA), LangChain, Hugging Face Transformers, OpenAI API
- Databases: SQL, MongoDB
- Vector databases: FAISS, Pinecone, Weaviate, ChromaDB
- Python libraries: NumPy, Pandas, scikit-learn, Streamlit
- Deep neural networks: CNN, RNN, and LLM
- Data analysis libraries: TensorFlow, Pandas, NumPy, scikit-learn, Matplotlib, TensorBoard
- Frameworks: Flask, Django
- Operating systems: Ubuntu, Windows
- Tools: Jupyter Notebook, PyCharm IDE, Excel, Roboflow
- Big Data (bonus): Hadoop (Hive, Sqoop, Flume), Kafka, Spark
- Code repository tools: Git, GitHub
- DevOps/AWS: Docker, Kubernetes, instance hosting and management

Analytical Skills: Exploratory data analysis, predictive modeling, text mining, natural language processing, machine learning, image processing, object detection, instance segmentation, deep learning, DevOps, AWS

Knowledge Expertise:
- Proficiency in the TensorFlow library with RNNs and CNNs
- Familiarity with pre-trained models like VGG-16, ResNet-50, and MobileNet
- Knowledge of Spark Core, Spark SQL, Spark Streaming, Cassandra, and Kafka
- Designing and architecting Hadoop applications
- Experience with chatbot platforms (a bonus)

Responsibilities: The entire lifecycle of model development: data collection and preprocessing, model development, training, testing, validation, deployment and maintenance, plus collaboration and communication.

Qualifications:
- Bachelor's or Master's degree in a relevant field (AI, Data Science, Computer Science, etc.)
- Minimum 2 years of experience developing and deploying AI-based software products
- Strong programming skills in Python (and potentially C++ or Java)
- Experience with machine learning libraries (TensorFlow, PyTorch, Keras, scikit-learn)
- Experience with Generative AI (e.g., LLMs, prompt engineering, RAG pipelines using tools like LangChain or Hugging Face)
- Experience with vector databases (e.g., FAISS, Pinecone, Weaviate) for semantic search and retrieval-augmented generation (see the sketch after this posting)
- Experience with computer vision, natural language processing, or recommendation systems
- Experience with cloud computing platforms (Google Cloud, AWS)
- Problem-solving skills
- Excellent communication and presentation skills
- Experience with data infrastructure and tools (SQL, NoSQL, and big data platforms)
- Teamwork skills

Join Us! If you are passionate about AI and want to contribute to groundbreaking projects in a dynamic startup environment, we encourage you to apply! Be part of our mission to drive technological advancement in India. Drop your CV at hr@aiunika.com
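The vector-database and RAG items above boil down to embedding text and running nearest-neighbour search. A minimal FAISS sketch, with the embedding function left as an assumed placeholder rather than a real model:

```python
# Minimal semantic-search index with FAISS, the retrieval half of a RAG
# pipeline. embed() is a stand-in for a real sentence-embedding model.
import numpy as np
import faiss

DIM = 384  # typical sentence-embedding width; an assumption here

def embed(texts):
    # Placeholder: a real system would call a sentence-transformer or an
    # embeddings API. Random vectors keep the sketch self-contained.
    rng = np.random.default_rng(0)
    return rng.standard_normal((len(texts), DIM)).astype("float32")

docs = ["pothole reported on NH-48",
        "lane marking faded near toll plaza",
        "road surface cracked after monsoon"]

index = faiss.IndexFlatL2(DIM)        # exact L2 search, fine at small scale
index.add(embed(docs))

distances, ids = index.search(embed(["cracked road surface"]), k=2)
for rank, doc_id in enumerate(ids[0]):
    print(rank, docs[doc_id], distances[0][rank])
```

In a full RAG pipeline, the retrieved documents would be stuffed into the LLM prompt as context before generation.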

Posted 4 weeks ago

Apply

4.0 - 9.0 years

7 - 8 Lacs

Bengaluru

Work from Office

Hiring for Snowflake Developer for Bangalore/Chennai

The individual must have 4+ years of diversified experience developing applications using the Java language, with 1+ years of experience in Big Data technologies such as Spark, Kafka/Spark Streaming, HBase, Hive, Oozie, Knox, Hadoop (Hortonworks) and the related ecosystem.

Mandatory skills: Core Java, Spark, Kafka, Hive, SQL
- Relevant experience in Big Data
- Strong development experience
- Strong experience in Spark, Scala, and Hive
- Experience in Hadoop development
- Experience in data loading tools like Sqoop
- Knowledge of workflow schedulers like Oozie and Airflow
- Proven understanding of Hadoop, HBase, and Sqoop
- Experience in AWS or Azure

Good to have: Spark, Scala, Oozie, Hive, shell scripting, Jenkins, Ansible, GitHub, NiFi, Elastic, Kibana, Grafana, Kafka
- Experience in Kafka
- Good experience with the Talend ETL tool
- Experience in creating and consuming RESTful web services
- Knowledge of building stream-processing systems using solutions such as Spark Streaming and Kafka (see the sketch after this posting)
- Knowledge of the principles and components of Big Data processing and analytics is highly desired
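The stream-processing requirement (Spark plus Kafka) is usually met with Structured Streaming; a minimal sketch, with broker address, topic, and paths assumed, and the spark-sql-kafka connector assumed to be on the classpath:

```python
# Minimal Spark Structured Streaming job consuming a Kafka topic.
# Broker, topic, and checkpoint path are placeholder values.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("kafka_ingest").getOrCreate()

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")  # assumed broker
       .option("subscribe", "orders")                     # assumed topic
       .load())

# Kafka delivers bytes; cast the payload before any parsing.
orders = raw.select(F.col("value").cast("string").alias("json_payload"))

query = (orders.writeStream
         .format("parquet")
         .option("path", "/data/streams/orders")
         .option("checkpointLocation", "/chk/orders")     # required for recovery
         .start())
query.awaitTermination()
```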

Posted 1 month ago

Apply

12.0 years

8 - 10 Lacs

Chennai

On-site

Job Description

About Us: At Bank of America, we are guided by a common purpose to help make financial lives better through the power of every connection. Responsible Growth is how we run our company and how we deliver for our clients, teammates, communities, and shareholders every day. One of the keys to driving Responsible Growth is being a great place to work for our teammates around the world. We're devoted to being a diverse and inclusive workplace for everyone. We hire individuals with a broad range of backgrounds and experiences and invest heavily in our teammates and their families by offering competitive benefits to support their physical, emotional, and financial well-being. Bank of America believes both in the importance of working together and offering flexibility to our employees. We use a multi-faceted approach for flexibility, depending on the various roles in our organization. Working at Bank of America will give you a great career with opportunities to learn, grow and make an impact, along with the power to make a difference. Join us!

Global Business Services: Global Business Services delivers Technology and Operations capabilities to Lines of Business and Staff Support Functions of Bank of America through a centrally managed, globally integrated delivery model and globally resilient operations. Global Business Services is recognized for flawless execution, sound risk management, operational resiliency, operational excellence and innovation. In India, we are present in five locations and operate as BA Continuum India Private Limited (BACI), a non-banking subsidiary of Bank of America Corporation and the operating company for India operations of Global Business Services.

Process Overview: The Analytics and Intelligence Engine (AIE) team transforms analytical and operational data into Consumer and Wealth Client insights and enables personalization opportunities that are provided to Associate- and Customer-facing operational applications. The Big Data technologies used are Hadoop/PySpark/Scala with HQL as ETL, Unix as the file-landing environment, and real-time (or near-real-time) streaming applications.

Job Description: We are actively seeking a talented and motivated Senior Hadoop Developer/Lead to join our dynamic and energetic team. As a key contributor to our agile scrum teams, you will collaborate closely with the Insights division. We are looking for a candidate who can showcase strong technical expertise in Hadoop and related technologies, and who excels at collaborating with both onshore and offshore team members. The role requires both hands-on coding and collaboration with stakeholders to drive strategic design decisions. While functioning as an individual contributor for one or more teams, the Senior Hadoop Data Engineer may also have the opportunity to lead and take responsibility for end-to-end solution design and delivery, based on the scale of implementation and required skillsets.

Responsibilities:
- Develop high-performance and scalable solutions for Insights, using the Big Data platform to facilitate the collection, storage, and analysis of massive data sets from multiple channels
- Utilize your in-depth knowledge of the Hadoop stack and storage technologies, including HDFS, Spark, Scala, MapReduce, YARN, Hive, Sqoop, Impala, Hue, and Oozie, to design and optimize data processing workflows
- Implement near-real-time and streaming data solutions to provide up-to-date information to millions of Bank customers using Spark Streaming and Kafka (see the sketch after this posting)
- Collaborate with cross-functional teams to identify system bottlenecks, benchmark performance, and propose innovative solutions to enhance system efficiency
- Take ownership of defining Big Data strategies and roadmaps for the Enterprise, aligning them with business objectives
- Apply your expertise in NoSQL technologies like MongoDB, SingleStore, or HBase to efficiently handle diverse data types and storage requirements
- Stay abreast of emerging technologies and industry trends related to Big Data, continuously evaluating new tools and frameworks for potential integration
- Provide guidance and mentorship to junior teammates

Requirements:
- Education: Graduation / Post Graduation (BE/B.Tech/MCA)
- Certifications, if any: NA
- Experience Range: 12+ years

Foundational Skills:
- Minimum of 12 years of industry experience, with at least 10 years focused on hands-on work in the Big Data domain
- Highly skilled in Hadoop stack technologies such as HDFS, Spark, Hive, YARN, Sqoop, Impala and Hue
- Strong proficiency in programming languages such as Python, Scala, and Bash/shell scripting
- Excellent problem-solving abilities and the capability to deliver effective solutions for business-critical applications
- Strong command of visual analytics tools, with a focus on Tableau

Desired Skills:
- Experience in real-time streaming technologies like Spark Streaming, Kafka, Flink, or Storm
- Proficiency in NoSQL technologies like HBase, MongoDB, SingleStore, etc.
- Familiarity with cloud technologies such as Azure, AWS, or GCP
- Working knowledge of machine learning algorithms, statistical analysis, and programming languages (Python or R) to conduct data analysis and develop predictive models that uncover valuable patterns and trends
- Proficiency in data integration and data security within the Hadoop ecosystem, including knowledge of Kerberos

Work Timings: 12:00 PM to 9:00 PM IST
Job Location: Chennai
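Near-real-time insight delivery of the kind described usually involves event-time windowing over the Kafka stream. A sketch of that aggregation step; the schema, topic, and five-minute window are illustrative assumptions:

```python
# Sketch: event-time windowed aggregation over a parsed Kafka stream.
# Schema, topic, broker, and window sizes are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("cust_activity").getOrCreate()

schema = (StructType()
          .add("customer_id", StringType())
          .add("amount", DoubleType())
          .add("event_ts", TimestampType()))

events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "customer-events")
          .load()
          .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
          .select("e.*"))

# The watermark bounds state: events more than 10 minutes late are dropped.
activity = (events
            .withWatermark("event_ts", "10 minutes")
            .groupBy(F.window("event_ts", "5 minutes"), "customer_id")
            .agg(F.sum("amount").alias("total_amount")))

(activity.writeStream
         .outputMode("update")
         .format("console")       # a real job would target a serving store
         .start()
         .awaitTermination())
```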

Posted 1 month ago

Apply

0.0 - 3.0 years

15 - 27 Lacs

Bengaluru, Karnataka

On-site

5-8 years of software development experience with Hadoop framework components (HDFS, Spark, PySpark, Sqoop, Hive, HQL, Scala); 4+ years of experience using Python, SQL and shell scripting. Experience in developing and tuning Spark applications (see the sketch after this posting); excellent understanding of Spark architecture, data frames, and Spark tuning. Strong knowledge of database concepts, systems architecture, and data structures is a must.

Qualification: BE/BTech in IT, BSc-IT, or MCA (full time), or a related qualification

Job Types: Full-time, Permanent
Pay: ₹1,500,000.00 - ₹2,700,000.00 per year
Location Type: In-person
Schedule: Day shift, fixed shift
Experience: Python: 4 years (required); Hadoop: 3 years (required); Spark architecture: 3 years (required)
Location: Bangalore, Karnataka (required)
Work Location: In person
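Spark tuning of the sort the posting asks about is mostly a matter of sizing shuffle partitions, enabling adaptive execution, and choosing join strategies. A sketch of common knobs; the values are illustrative, not recommendations:

```python
# Illustrative Spark tuning knobs; the right values depend on data volume
# and cluster size, so treat these numbers as placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = (SparkSession.builder
         .appName("tuned_job")
         .config("spark.sql.shuffle.partitions", "400")         # match data volume
         .config("spark.sql.adaptive.enabled", "true")          # let AQE re-split skew
         .config("spark.serializer",
                 "org.apache.spark.serializer.KryoSerializer")  # faster serialization
         .getOrCreate())

facts = spark.read.parquet("/data/facts")      # assumed large table
dims = spark.read.parquet("/data/dims")        # assumed small lookup table

# Broadcasting the small side avoids a full shuffle join.
joined = facts.join(broadcast(dims), "dim_id")
joined.explain()   # verify the plan shows BroadcastHashJoin
```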

Posted 1 month ago

Apply

4.0 - 6.0 years

7 - 8 Lacs

Bengaluru

Work from Office

Diverse Lynx is looking for a Snowflake Developer to join our dynamic team and embark on a rewarding career journey:
- Design and develop data solutions within the Snowflake cloud data platform, including data warehousing, data lake, and data modeling solutions
- Participate in the design and implementation of data migration strategies
- Ensure the quality of custom solutions through appropriate testing and debugging procedures
- Provide technical support and troubleshoot issues as needed
- Stay up to date with the latest developments in the Snowflake platform and data warehousing technologies
- Contribute to the ongoing improvement of development processes and best practices

Posted 1 month ago

Apply

2.0 - 6.0 years

7 - 11 Lacs

Bengaluru

Work from Office

As a Software Developer you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys. Your primary responsibilities include:
- Comprehensive Feature Development and Issue Resolution: Working on end-to-end feature development and solving challenges faced in the implementation
- Stakeholder Collaboration and Issue Resolution: Collaborating with key stakeholders, internal and external, to understand the problems and issues with the product and features, and solving them as per defined SLAs
- Continuous Learning and Technology Integration: Being eager to learn new technologies and implementing them in feature development

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- GCP services: Pub/Sub, BigQuery, Dataflow, Cloud Workflows/Cloud Scheduler, Cloud Run, Dataproc, Cloud Functions
- Cloud data engineers with GCP PDE certification and working experience with GCP
- Building end-to-end data pipelines in GCP using Pub/Sub, BigQuery, Dataflow, Cloud Workflows/Cloud Scheduler, Cloud Run, Dataproc, and Cloud Functions (see the sketch after this posting)
- Experience in logging and monitoring of GCP services; experience in Terraform and infrastructure automation
- Expertise in the Python coding language
- Develops, supports, and maintains data engineering solutions on the Google Cloud ecosystem

Preferred technical and professional experience:
- Stay updated with the latest trends and advancements in cloud technologies, frameworks, and tools
- Conduct code reviews and provide constructive feedback to maintain code quality and ensure adherence to best practices
- Troubleshoot and debug issues and deploy applications to the cloud platform
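A pipeline step on the stack above often ends in BigQuery; a minimal sketch using the google-cloud-bigquery client library, where the project, dataset, and query are placeholders:

```python
# Minimal parameterized BigQuery query; project/dataset/table names are
# placeholders for illustration.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")   # assumed project id

query = """
    SELECT user_id, COUNT(*) AS sessions
    FROM `example-project.analytics.events`
    WHERE event_date = @day
    GROUP BY user_id
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[bigquery.ScalarQueryParameter("day", "DATE", "2024-01-01")]
)

# result() blocks until the job finishes and streams rows back.
for row in client.query(query, job_config=job_config).result():
    print(row.user_id, row.sessions)
```

Query parameters keep dates and identifiers out of the SQL string, which matters once pipelines template the same query per day.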

Posted 1 month ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Hyderabad

Work from Office

We are seeking an experienced AIX Graphics Device Driver Developer to design, implement, and maintain 2D and 3D graphics device drivers for the IBM AIX operating system. This role requires deep knowledge of AIX/Linux device driver architecture, graphics hardware interaction, and kernel programming on AIX.

Key Responsibilities:
* Design, develop, and maintain graphics device drivers for AIX on IBM Power Systems.
* Port and adapt open-source or proprietary graphics stacks (e.g., X11, GLX, OpenGL components) to AIX.
* Integrate and debug graphics drivers across kernel and user-space interfaces.
* Work with low-level graphics subsystems, including framebuffer, DRM (Direct Rendering Manager), and X Server extensions.
* Enable and optimize support for GPU hardware.
* Analyze and resolve system-level issues, kernel crashes, or graphics anomalies reported by QA or customer support.
* Participate in system bring-up, debugging graphics acceleration and mode-setting issues.
* Contribute to documentation and tooling to improve diagnostics and driver deployment on AIX.

Required education: Bachelor's Degree
Preferred education: Bachelor's Degree

Required technical and professional expertise:
* 5+ years of experience in C programming and kernel-mode driver development.
* Deep knowledge of AIX and Linux kernel internals, the device driver framework, and ODM (Object Data Manager).
* Experience with the X Window System, the X11 device driver model, Xorg, GLX, and the OpenGL/Mesa stack.
* Experience working with graphics hardware (e.g., Radeon, NVIDIA, or IBM graphics).
* Familiarity with framebuffer, KMS (Kernel Mode Setting), and DRM subsystem concepts.
* Strong debugging and profiling skills using kdb, truss, snap, crash, and errpt.
* Comfortable with hardware programming concepts such as MMIO, PCIe, interrupt handling, and DMA.
* Ability to read and interpret datasheets and hardware specs.

Posted 1 month ago

Apply

5.0 - 7.0 years

7 - 9 Lacs

Gurugram

Work from Office

A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat.

In your role, you will be responsible for:
- Working with multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc.
- Python and SQL work experience is a must; being proactive and collaborative, with the ability to respond to critical situations
- Analyzing data for functional business requirements and interfacing directly with the customer

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform
- Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow (see the sketch after this posting), Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer
- An ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies
- End-to-end functional knowledge of the data pipeline/transformation implementations you have done; you should understand the purpose/KPIs for which the data transformation was done

Preferred technical and professional experience:
- Experience with AEM Core Technologies: OSGi Services, Apache Sling, Granite Framework, Java Content Repository API, Java 8+, Localization
- Familiarity with build tools such as Jenkins and Maven; knowledge of version control tools, especially Git; knowledge of patterns and good practices to design and develop quality, clean code; knowledge of HTML, CSS, JavaScript, and jQuery
- Familiarity with task management, bug tracking, and collaboration tools like JIRA and Confluence
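Dataflow jobs on this stack are written against the Apache Beam SDK. A minimal batch pipeline sketch; the project, region, and bucket paths are assumptions:

```python
# Minimal Apache Beam pipeline of the kind run on Dataflow: read text from
# GCS, count records per key, write results. All names are placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(
    runner="DataflowRunner",            # use "DirectRunner" to test locally
    project="example-project",
    region="us-central1",
    temp_location="gs://example-bucket/tmp",
)

with beam.Pipeline(options=options) as p:
    (p
     | "Read" >> beam.io.ReadFromText("gs://example-bucket/input/*.csv")
     | "KeyByFirstField" >> beam.Map(lambda line: (line.split(",")[0], 1))
     | "Count" >> beam.CombinePerKey(sum)
     | "Format" >> beam.MapTuple(lambda k, v: f"{k},{v}")
     | "Write" >> beam.io.WriteToText("gs://example-bucket/output/counts"))
```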

Posted 1 month ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Mumbai

Work from Office

As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows from source to target and implementing solutions that tackle the client's needs.

Your primary responsibilities include:
- Design, build, optimize and support new and existing data models and ETL processes based on our clients' business requirements
- Build, deploy and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization
- Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Must have 5+ years of experience in Big Data: Hadoop, Spark, Scala, Python, HBase, Hive
- Good to have: AWS (S3, Athena, DynamoDB, Lambda), Jenkins, Git
- Developed Python and PySpark programs for data analysis
- Good working experience with Python to develop a custom framework for generating rules (much like a rules engine)
- Developed Python code to gather data from HBase and designed solutions implemented with PySpark
- Used Apache Spark DataFrames/RDDs to apply business transformations and Hive context objects to perform read/write operations (see the sketch after this posting)

Preferred technical and professional experience:
- Understanding of DevOps
- Experience in building scalable end-to-end data ingestion and processing solutions
- Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala
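The HiveContext pattern mentioned above is expressed through SparkSession in current Spark versions. A sketch of the read-transform-write loop it describes, with assumed table and column names:

```python
# Read from a Hive table, apply a business rule, write back. Table and
# column names are illustrative. SparkSession with Hive support replaces
# the older HiveContext API the posting mentions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (SparkSession.builder
         .appName("rules_engine_step")
         .enableHiveSupport()
         .getOrCreate())

accounts = spark.table("staging.accounts")   # assumed source table

# Example rule: flag high-value dormant accounts for review.
flagged = (accounts
           .withColumn("review_flag",
                       (F.col("balance") > 100000) &
                       (F.col("last_txn_days") > 180)))

flagged.write.mode("overwrite").saveAsTable("curated.accounts_flagged")
```

A rules engine generalizes this by reading the predicate expressions from configuration instead of hard-coding them.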

Posted 1 month ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Mumbai

Work from Office

Role Overview: Lead the architectural design and implementation of a secure, scalable Cloudera-based Data Lakehouse for one of India’s top public sector banks. Key Responsibilities: * Design end-to-end Lakehouse architecture on Cloudera * Define data ingestion, processing, storage, and consumption layers * Guide data modeling, governance, lineage, and security best practices * Define migration roadmap from existing DWH to CDP * Lead reviews with client stakeholders and engineering teams Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise Proven experience with Cloudera CDP, Spark, Hive, HDFS, Iceberg * Deep understanding of Lakehouse patterns and data mesh principles * Familiarity with data governance tools (e.g., Apache Atlas, Collibra) * Banking/FSI domain knowledge highly desirable.
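Lakehouse table formats like the Iceberg requirement above are driven from Spark SQL once a catalog is configured. A sketch, assuming the iceberg-spark runtime jar is on the classpath and using a local Hadoop catalog purely for illustration:

```python
# Sketch: create and query an Iceberg table from Spark. Assumes the
# iceberg-spark-runtime package is available; the catalog name and
# warehouse path are illustrative.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("iceberg_demo")
         .config("spark.sql.extensions",
                 "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
         .config("spark.sql.catalog.lake", "org.apache.iceberg.spark.SparkCatalog")
         .config("spark.sql.catalog.lake.type", "hadoop")
         .config("spark.sql.catalog.lake.warehouse", "/data/warehouse")
         .getOrCreate())

spark.sql("CREATE NAMESPACE IF NOT EXISTS lake.db")
spark.sql("""
    CREATE TABLE IF NOT EXISTS lake.db.txns (
        txn_id BIGINT, account STRING, amount DOUBLE, ts TIMESTAMP)
    USING iceberg
    PARTITIONED BY (days(ts))
""")

# Iceberg keeps snapshots, enabling time travel and safe schema evolution,
# which is what distinguishes a Lakehouse table from plain HDFS files.
spark.sql("SELECT * FROM lake.db.txns.snapshots").show(truncate=False)
```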

Posted 1 month ago

Apply

16.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

About Us: Zycus is a pioneer in Cognitive Procurement software and has been a trusted partner of choice for large global enterprises for two decades. Zycus has been consistently recognized by Gartner, Forrester, and other analysts for its Source-to-Pay integrated suite. Zycus powers its S2P software with the revolutionary Merlin AI Suite. Merlin AI takes over the tactical tasks and empowers procurement and AP officers to focus on strategic projects; offers data-driven actionable insights for quicker and smarter decisions; and its conversational AI offers a B2C-type user experience to end users. Zycus helps enterprises drive real savings, reduce risks, and boost compliance, and its seamless, intuitive, and easy-to-use user interface ensures high adoption and value across the organization. Start your #CognitiveProcurement journey with us, as you are #MeantforMore

We Are An Equal Opportunity Employer: Zycus is committed to providing equal opportunities in employment and creating an inclusive work environment. We do not discriminate against applicants on the basis of race, color, religion, gender, sexual orientation, national origin, age, disability, or any other legally protected characteristic. All hiring decisions will be based solely on qualifications, skills, and experience relevant to the job requirements.

Job Description: We are looking for a Director of Engineering to lead one of our key product engineering teams. This role will report directly to the VP of Engineering and will be responsible for successful execution of the company's business mission through development of cutting-edge software products and solutions. As an owner of the product, you will be required to plan and execute the product roadmap and provide technical leadership to the engineering team. You will collaborate with Product Management and Implementation teams to build a commercially successful product. You will be responsible for recruiting and leading a team of highly skilled software engineers and providing strong hands-on engineering leadership. Deep technical knowledge of software product engineering using Java/J2EE, Node.js, React.js, full-stack development, NoSQL databases (MongoDB, Cassandra, Neo4j), Elasticsearch, Kibana, ELK, Kafka, Redis, Kubernetes, Apache, Solr, ActiveMQ, RabbitMQ, Spark, Scala, Sqoop, HBase, Hive, WebSockets, web crawlers, Spring Boot, etc. is a must.

Job Requirements:
- 16+ years of experience in Software Engineering with at least 5+ years as an engineering leader in a software product company
- Hands-on technical leadership with proven ability to recruit high-performance talent
- High technical credibility: the ability to audit technical decisions and push for the best solution to a problem
- Experience building end-to-end applications, from the backend database to the persistence layer
- Experience with UI technologies (Angular, React.js, Node.js) or a full-stack environment is preferred
- Experience with NoSQL technologies (MongoDB, Cassandra, Neo4j, DynamoDB, etc.), Elasticsearch, Kibana, ELK, Logstash
- Experience in developing enterprise software using Agile methodology
- Good understanding of Kafka, Redis, ActiveMQ, RabbitMQ, Solr, etc.
- SaaS cloud-based platform exposure
- End-to-end ownership of design, development, and delivery of quality enterprise products/applications
- A track record of setting and achieving high standards
- Strong understanding of modern technology architecture
- Key programming skills: Java, J2EE with cutting-edge technologies
- Excellent team building, mentoring and coaching skills are a must-have

Five Reasons Why You Should Join Zycus:
- Cloud Product Company: We are a cloud SaaS company and our products are created using the latest technologies like ML and AI. Our UI is in Angular JS and we are developing our mobile apps using React.
- A Market Leader: Zycus is recognized by Gartner (the world's leading market research analyst) as a Leader in Procurement Software Suites.
- Move between Roles: We believe that change leads to growth, and therefore we allow our employees to shift careers and move to different roles and functions within the organization.
- Get Global Exposure: You get to work with and deal with our global customers.
- Create an Impact: Zycus gives you the environment to create an impact on the product and transform your ideas into reality. Even our junior engineers get the opportunity to work on different product features.

Posted 1 month ago

Apply

16.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

About Us: Zycus is a pioneer in Cognitive Procurement software and has been a trusted partner of choice for large global enterprises for two decades. Zycus has been consistently recognized by Gartner, Forrester, and other analysts for its Source-to-Pay integrated suite. Zycus powers its S2P software with the revolutionary Merlin AI Suite. Merlin AI takes over the tactical tasks and empowers procurement and AP officers to focus on strategic projects; offers data-driven actionable insights for quicker and smarter decisions; and its conversational AI offers a B2C-type user experience to end users. Zycus helps enterprises drive real savings, reduce risks, and boost compliance, and its seamless, intuitive, and easy-to-use user interface ensures high adoption and value across the organization. Start your #CognitiveProcurement journey with us, as you are #MeantforMore

We Are An Equal Opportunity Employer: Zycus is committed to providing equal opportunities in employment and creating an inclusive work environment. We do not discriminate against applicants on the basis of race, color, religion, gender, sexual orientation, national origin, age, disability, or any other legally protected characteristic. All hiring decisions will be based solely on qualifications, skills, and experience relevant to the job requirements.

Job Description: We are looking for a Director of Engineering to lead one of our key product engineering teams. This role will report directly to the VP of Engineering and will be responsible for successful execution of the company's business mission through development of cutting-edge software products and solutions. As an owner of the product, you will be required to plan and execute the product roadmap and provide technical leadership to the engineering team. You will collaborate with Product Management and Implementation teams to build a commercially successful product. You will be responsible for recruiting and leading a team of highly skilled software engineers and providing strong hands-on engineering leadership. Deep technical knowledge of software product engineering using Amazon Web Services, Java 8, Java/J2EE, Node.js, React.js, full-stack development, NoSQL databases (MongoDB, Cassandra, Neo4j), Elasticsearch, Kibana, ELK, Kafka, Redis, Docker, Kubernetes, architecture concepts, design patterns, data structures & algorithms, distributed computing, multi-threading, Apache, Solr, ActiveMQ, RabbitMQ, Spark, Scala, Sqoop, HBase, Hive, WebSockets, web crawlers, Spring Boot, etc. is a must.

Job Requirements:
- 16+ years of experience in Software Engineering with at least 5+ years as an engineering leader in a software product company
- Hands-on technical leadership with proven ability to recruit high-performance talent
- High technical credibility: the ability to audit technical decisions and push for the best solution to a problem
- Experience building end-to-end applications, from the backend database to the persistence layer
- Experience with UI technologies (Angular, React.js, Node.js) or a full-stack environment is preferred
- Experience with NoSQL technologies (MongoDB, Cassandra, Neo4j, DynamoDB, etc.), Elasticsearch, Kibana, ELK, Logstash
- Experience in developing enterprise software using Agile methodology
- Good understanding of Kafka, Redis, ActiveMQ, RabbitMQ, Solr, etc.
- SaaS cloud-based platform exposure
- Experience with Docker, Kubernetes, etc.
- End-to-end ownership of design, development, and delivery of quality enterprise products/applications
- A track record of setting and achieving high standards
- Strong understanding of modern technology architecture
- Key programming skills: Java, J2EE with cutting-edge technologies
- Excellent team building, mentoring and coaching skills are a must-have

Five Reasons Why You Should Join Zycus:
- Cloud Product Company: We are a cloud SaaS company and our products are created using the latest technologies like ML and AI. Our UI is in Angular JS and we are developing our mobile apps using React.
- A Market Leader: Zycus is recognized by Gartner (the world's leading market research analyst) as a Leader in Procurement Software Suites.
- Move between Roles: We believe that change leads to growth, and therefore we allow our employees to shift careers and move to different roles and functions within the organization.
- Get Global Exposure: You get to work with and deal with our global customers.
- Create an Impact: Zycus gives you the environment to create an impact on the product and transform your ideas into reality. Even our junior engineers get the opportunity to work on different product features.

Posted 1 month ago

Apply

3.0 - 5.0 years

0 Lacs

Hyderābād

On-site

Skills:
- 3-5 years of data warehouse / business intelligence work experience required, working with Talend Open Studio
- Extensive experience with the Talend Real-Time Big Data Platform in the areas of design, development and testing, with a focus on Talend Data Integration and Talend Big Data Real-Time, including Big Data streaming (Spark) jobs with different databases
- Experience working with databases like Greenplum, HAWQ, Oracle, Teradata, MS SQL Server, Sybase, Cassandra, MongoDB, flat files, APIs, and different Hadoop concepts in Big Data (ecosystems like Hive, Pig, Sqoop, and MapReduce)
- Working knowledge of Java is preferred
- Advanced knowledge of ETL, including the ability to read and write efficient, robust code, follow or implement best practices and coding standards, design and implement common ETL strategies (CDC, SCD, etc.), and create reusable, maintainable jobs
- Solid background in database systems (such as Oracle, SQL Server, Redshift and Salesforce) along with strong knowledge of PL/SQL and SQL
- Hands-on knowledge of Unix commands and shell scripting
- Good knowledge of SQL, including the ability to write stored procedures, triggers, functions, etc.

Posted 1 month ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

Impetus is hiring GCP Data Engineers. If you are strong in Big Data, Spark, PySpark, and GCP (Pub/Sub, Dataproc, BigQuery, etc.), are an immediate joiner, and can join us in 0-30 days, please share your resume at rashmeet.g.tuteja@impetus.com.

Responsibilities:
- Able to effectively use GCP managed services, e.g., Dataproc, Dataflow, Pub/Sub, Cloud Functions, BigQuery, GCS (at least 4 of these services; see the sketch after this posting)
- Strong experience in Big Data, Spark, PySpark and Python
- Strong experience in Big Data technologies: Hadoop, Sqoop, Hive and Spark
- Good hands-on expertise in either Python or Java programming
- Good understanding of GCP core services like Google Cloud Storage, Google Compute Engine, Cloud SQL, and Cloud IAM
- Good to have: knowledge of GCP services like App Engine, GKE, Cloud Run, and Cloud Build
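Pub/Sub, one of the named services, is straightforward from Python; a minimal publish sketch, with the project and topic ids as placeholders:

```python
# Minimal Pub/Sub publisher; project and topic ids are placeholders.
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("example-project", "ingest-events")

# Messages are bytes; attributes carry routing metadata so subscribers
# can filter without parsing the payload.
future = publisher.publish(topic_path, b'{"id": 1}', source="batch-loader")
print("published message id:", future.result())   # blocks until acked
```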

Posted 1 month ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

Impetus is hiring GCP Data Engineers. If you are strong in Big Data, Spark, PySpark, and GCP (Pub/Sub, Dataproc, BigQuery, etc.), are an immediate joiner, and can join us in 0-30 days, please share your resume at vaishali.tyagi@impetus.com.

Responsibilities:
- Able to effectively use GCP managed services, e.g., Dataproc, Dataflow, Pub/Sub, Cloud Functions, BigQuery, GCS (at least 4 of these services)
- Strong experience in Big Data, Spark, PySpark and Python
- Strong experience in Big Data technologies: Hadoop, Sqoop, Hive and Spark
- Good hands-on expertise in either Python or Java programming
- Good understanding of GCP core services like Google Cloud Storage, Google Compute Engine, Cloud SQL, and Cloud IAM
- Good to have: knowledge of GCP services like App Engine, GKE, Cloud Run, and Cloud Build

Posted 1 month ago

Apply

8.0 - 13.0 years

9 - 13 Lacs

Bengaluru

Work from Office

As a Technical Specialist, you will develop and enhance Optical Network Management applications, leveraging experience in optical networks. You will work with fault supervision and performance monitoring. Collaborating in an agile environment, you will drive innovation, optimize efficiency, and explore UI technologies like React. Your role will focus on designing, coding, testing, and improving network management applications to enhance functionality and customer satisfaction.

You have:
- Bachelor's degree and 8 years of experience (or equivalent) in optical networks
- Hands-on working experience with Core Java, Spring, Kafka, ZooKeeper, Hibernate, and Python
- Working knowledge of RDBMS, PL/SQL, Linux, Docker, and database concepts
- Exposure to UI technologies like React

It would be nice if you also had:
- Domain knowledge in OTN and photonic network management
- Strong communication skills and the ability to manage complex relationships

Responsibilities:
- Develop software for network management of Optics Division products, including Photonic/WDM, Optical Transport, SDH, and SONET
- Enable user control over network configuration through Optics Network Management applications
- Utilize Core Java, Spring, Kafka, Python, and RDBMS to build high-performing solutions for network configuration
- Interface Optics Network Management applications with various network elements, providing a user-friendly graphical interface and implementing algorithms to simplify network management and reduce OPEX
- Deploy Optics Network Management applications globally, supporting hundreds of installations for customers
- Contribute to new developments and maintain applications as part of the development team, focusing on enhancing functionality and customer satisfaction

Posted 1 month ago

Apply

8.0 - 13.0 years

19 - 25 Lacs

Bengaluru

Work from Office

In this role, you will play a key part in designing, building, and optimizing scalable data products within the telecom analytics domain. You will collaborate with cross-functional teams to implement AI-driven analytics, autonomous operations, and programmable data solutions. This position offers the opportunity to work with cutting-edge Big Data and cloud technologies, enhance your data engineering expertise, and contribute to advancing Nokia's data-driven telecom strategies. If you are passionate about creating innovative data solutions, mastering cloud and big data platforms, and working in a fast-paced, collaborative environment, this role is for you!

You have:
- Bachelor's or master's degree in computer science, data engineering, or a related field, with 8+ years of experience in data engineering focused on Big Data, cloud, and telecom analytics
- Hands-on expertise in Ab Initio for data cataloguing, metadata management, and lineage
- Skills in data warehousing, OLAP, and modelling using BigQuery, ClickHouse, and SQL
- Experience with data persistence technologies like S3, HDFS, and Iceberg
- Hands-on experience with Python and scripting languages

It would be nice if you also had:
- Experience with data exploration and visualization using Superset or BI tools
- Knowledge of ETL processes and streaming tools such as Kafka
- Background in building data products for the telecom domain and understanding of AI and machine learning pipeline integration

Responsibilities:
- Data Governance: Manage source data within the Metadata Hub and Data Catalog
- ETL Development: Develop and execute data processing graphs using Express It and the Co-Operating System
- ETL Optimization: Debug and optimize data processing graphs using the Graphical Development Environment (GDE)
- API Integration: Leverage Ab Initio APIs for metadata and graph artifact management
- CI/CD Implementation: Implement and maintain CI/CD pipelines for metadata and graph deployments
- Team Leadership & Mentorship: Mentor team members and foster best practices in Ab Initio development and deployment

Posted 1 month ago

Apply

7.0 - 9.0 years

7 - 11 Lacs

Bengaluru

Work from Office

As a Data Engineer on the Nokia Enterprise Data Platform (EDP) team, you will play a vital role in shaping our data-as-a-service platform. You will collaborate closely with a dynamic team of developers, architects, and product managers, tackling exciting challenges related to data integrations and performance monitoring in a fast-paced Agile environment. Your expertise will drive improvements in application performance while growing our data capabilities amid increasing complexity.

You have:
- Bachelor of Engineering in Computer Science, Information Technology, or Communication, with 7-9 years of relevant experience in a software development or equivalent role
- Proficiency in SQL, Python, Spark, and Azure data engineering tools
- Experience in designing and maintaining scalable Azure-based infrastructure
- Experience with Power BI, SAP, or Git
- Experience in implementing security best practices for cloud infrastructure

It would be nice if you also had:
- Familiarity with Lakehouse architecture, Delta tables, and Databricks (see the sketch after this posting)
- Motivation and drive to overcome challenges and succeed in a fast-growing technical landscape
- Knowledge of Agile/Scrum methodologies

Responsibilities:
- Develop, maintain, and optimize Nokia's Enterprise Data Platform to ensure seamless data delivery across the organization
- Collaborate with cross-functional teams to gather requirements, analyze demands, and implement data solutions aligned with business needs
- Monitor and troubleshoot data pipeline performance, ensuring efficient data integration and throughput
- Design and implement scalable Azure-based infrastructure, adhering to best practices for security and reliability
- Build and manage CI/CD pipelines using Azure DevOps to streamline software development and deployment processes
- Provide technical expertise in coding, testing, and system analysis, ensuring adherence to standard development practices
- Communicate effectively with stakeholders, articulating technical challenges and project updates in a clear and concise manner
- Foster a proactive and team-oriented environment, mentoring junior engineers and promoting knowledge sharing across the team
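On the Azure/Databricks side of this stack, Delta tables carry the Lakehouse pattern mentioned above. A short sketch, assuming a Databricks runtime or the delta-spark package, with storage paths as placeholders:

```python
# Sketch: upsert into a Delta table, the Lakehouse primitive mentioned in
# the posting. Requires a Databricks runtime or delta-spark; paths and
# column names are placeholders.
from pyspark.sql import SparkSession
from delta.tables import DeltaTable

spark = SparkSession.builder.appName("edp_upsert").getOrCreate()

updates = spark.read.parquet("/mnt/landing/suppliers")   # assumed staging data
target = DeltaTable.forPath(spark, "/mnt/curated/suppliers")

# MERGE gives idempotent loads: re-running the batch cannot double-insert.
(target.alias("t")
       .merge(updates.alias("u"), "t.supplier_id = u.supplier_id")
       .whenMatchedUpdateAll()
       .whenNotMatchedInsertAll()
       .execute())
```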

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
