
916 Sqoop Jobs - Page 9

JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

5.0 - 10.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Job Title: DevOps - AWS Glue, KMS, ALB, ECS and Terraform/Terragrunt. Experience: 5-10 Years. Location: Bangalore. Skills: DevOps, AWS, Glue, KMS, ALB, ECS, Terraform, Terragrunt

Posted 2 weeks ago

Apply

3.0 - 6.0 years

13 - 18 Lacs

Bengaluru

Work from Office

We are looking to hire a Data Engineer for the Platform Engineering team. It is a collection of highly skilled individuals, ranging from development to operations with a security-first mindset, who strive to push the boundaries of technology. We champion a DevSecOps culture and raise the bar on how and when we deploy applications to production. Our core principles are centered around automation, testing, quality, and immutability, all via code. The role is responsible for building self-service capabilities that improve our security posture and productivity and reduce time to market, with automation at the core of these objectives. The individual collaborates with teams across the organization to ensure applications are designed for Continuous Delivery (CD) and are well-architected for their targeted platform, which can be on-premises or the cloud. If you are passionate about developer productivity, cloud-native applications, and container orchestration, this job is for you! Principal Accountabilities: The incumbent is mentored by senior individuals on the team to capture the flow and bottlenecks in the holistic IT delivery process and define future tool sets. Skills and Software Requirements: Experience with a language such as Python, Go, SQL, Java, or Scala; GCP data services (BigQuery, Dataflow, Dataproc, Cloud Composer, Pub/Sub, Google Cloud Storage, IAM); Experience with Jenkins, Maven, Git, Ansible, or Chef; Experience working with containers, orchestration tools (Kubernetes, Mesos, Docker Swarm, etc.) and container registries (GCE, Docker Hub, etc.)
Experience with [SPI]aaS: Software-as-a-Service, Platform-as-a-Service, or Infrastructure-as-a-Service; Acquire, cleanse, and ingest structured and unstructured data on the cloud; Combine data from disparate sources into a single, unified, authoritative view of data (e.g., a Data Lake); Enable and support data movement from one system or service to another; Experience implementing or supporting automated solutions to technical problems; Experience working in a team environment, proactively executing tasks while meeting agreed delivery timelines; Ability to contribute to effective and timely solutions; Excellent oral and written communication skills
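The "combine data from disparate sources into a single, unified, authoritative view" requirement can be sketched with a minimal, self-contained example. The sources, field names, and merge policy below are illustrative assumptions, not the employer's actual pipeline:

```python
# Minimal sketch: merge records from two hypothetical sources (a CRM export
# and a billing export) into one unified view keyed by id, letting the most
# recently updated record win per field.

def unify(records):
    """Group records by id and keep the latest non-null value per field."""
    unified = {}
    for rec in sorted(records, key=lambda r: r["updated_at"]):
        entry = unified.setdefault(rec["id"], {})
        for field, value in rec.items():
            if field not in ("id", "updated_at") and value is not None:
                entry[field] = value  # later records overwrite earlier ones
    return unified

crm = [{"id": 1, "email": "a@x.com", "phone": None, "updated_at": "2024-01-01"}]
billing = [{"id": 1, "email": None, "phone": "555-0100", "updated_at": "2024-02-01"}]

view = unify(crm + billing)
print(view)  # {1: {'email': 'a@x.com', 'phone': '555-0100'}}
```

In a real data lake the same idea runs at scale (e.g. as a Spark or BigQuery job), but the core decision is identical: pick a record key and a conflict-resolution rule such as latest-update-wins.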

Posted 2 weeks ago

Apply

4.0 - 7.0 years

5 - 10 Lacs

Bengaluru

Work from Office

1. Have a good understanding of AWS services, specifically RDS, S3, EC2, VPC, KMS, ECS, Lambda, AWS Organizations, and IAM policy setup; also Python as a main skill. 2. Architect, design, and code database infrastructure deployment using Terraform; should be able to write Terraform modules that deploy database services in AWS. 3. Provide automation solutions using Python Lambdas for repetitive tasks such as running quarterly audits and daily health checks on RDS across multiple accounts. 4. Have a fair understanding of Ansible to automate Postgres infrastructure deployment and repetitive tasks for on-prem servers. 5. Knowledge of Postgres and PL/pgSQL functions. 6. Hands-on experience with Ansible and Terraform and the ability to contribute to ongoing projects with minimal coaching.
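The RDS health-check automation described in point 3 boils down to filtering instance statuses. As a rough sketch, the AWS call is stubbed here with a sample response shaped like boto3's `rds.describe_db_instances` output; in a real Lambda you would call the live API per account and region, and the instance names below are made up:

```python
# Sketch of the health-check logic such a Lambda might run. SAMPLE_RESPONSE
# mimics the shape of boto3's rds.describe_db_instances response; the
# identifiers and statuses are invented for illustration.

SAMPLE_RESPONSE = {
    "DBInstances": [
        {"DBInstanceIdentifier": "orders-db", "DBInstanceStatus": "available"},
        {"DBInstanceIdentifier": "audit-db", "DBInstanceStatus": "storage-full"},
    ]
}

def unhealthy_instances(response, healthy_states=("available",)):
    """Return identifiers of RDS instances not in a healthy state."""
    return [
        db["DBInstanceIdentifier"]
        for db in response["DBInstances"]
        if db["DBInstanceStatus"] not in healthy_states
    ]

print(unhealthy_instances(SAMPLE_RESPONSE))  # ['audit-db']
```

A daily-check Lambda would run this filter across accounts and forward any non-empty result to an alerting channel.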

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

maharashtra

On-site

The role of a Data Engineer is crucial for ensuring the smooth operation of the Data Platform in Azure/AWS Databricks. As a Data Engineer, you will be responsible for the continuous development, enhancement, support, and maintenance of data availability, data quality, performance, and stability of the system. Your primary responsibilities will include designing and implementing data ingestion pipelines from various sources using Azure Databricks, ensuring the efficient and smooth running of data pipelines, and adhering to security, regulatory, and audit control guidelines. You will also be tasked with driving optimization, continuous improvement, and efficiency in data processes.

To excel in this role, it is essential to have a minimum of 5 years of experience in the data analytics field, hands-on experience with Azure/AWS Databricks, proficiency in building and optimizing data pipelines, architectures, and data sets, and excellent skills in Scala or Python, PySpark, and SQL. Additionally, you should be capable of troubleshooting and optimizing complex queries on the Spark platform, possess knowledge of structured and unstructured data design/modelling, data access, and data storage techniques, and have expertise in designing and deploying data applications on cloud solutions such as Azure or AWS. Moreover, practical experience in performance tuning and optimizing code running in a Databricks environment, and demonstrated analytical and problem-solving skills, particularly in a big data environment, are essential for success in this role.

In terms of technical/professional skills, proficiency in Azure/AWS Databricks, Python/Scala/Spark/PySpark, Hive/HBase/Impala/Parquet, Sqoop, Kafka, Flume, SQL, RDBMS, Airflow, Jenkins/Bamboo, GitHub/Bitbucket, and Nexus will be advantageous for executing the responsibilities effectively.

Posted 2 weeks ago

Apply

4.0 - 7.0 years

6 - 9 Lacs

Noida, India

Work from Office

Spark/PySpark: technical hands-on data processing. Table design knowledge using Hive (similar to RDBMS knowledge). SQL knowledge for retrieval of data and transformation queries such as joins (full, left, right), ranking, and group by. Good communication skills. Additional skills: GitHub, Jenkins, and shell scripting would be an added advantage. Mandatory Competencies: Big Data - Hadoop; Big Data - Spark; Big Data - PySpark; DevOps/Configuration Mgmt - Jenkins; Beh - Communication and collaboration; Database Programming - SQL; DevOps/Configuration Mgmt - GitLab/GitHub/Bitbucket; DevOps/Configuration Mgmt - Basic Bash/Shell script writing; Big Data - Hive
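The transformation queries this posting asks about (joins, ranking, group by) can be sketched on an in-memory SQLite table; the table and values are made up, and the same SQL patterns carry over to Hive or Spark SQL:

```python
# Joins, ranking, and group-by, demonstrated with stdlib sqlite3.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE orders (id INTEGER, region TEXT, amount REAL);
    INSERT INTO orders VALUES (1, 'north', 10), (2, 'north', 30), (3, 'south', 20);
    CREATE TABLE customers (id INTEGER, name TEXT);
    INSERT INTO customers VALUES (1, 'Asha'), (4, 'Ravi');
""")

# GROUP BY: total amount per region
totals = con.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
).fetchall()
print(totals)  # [('north', 40.0), ('south', 20.0)]

# LEFT JOIN: keep all customers, even those with no matching order
joined = con.execute(
    "SELECT c.name, o.amount FROM customers c "
    "LEFT JOIN orders o ON o.id = c.id ORDER BY c.id"
).fetchall()
print(joined)  # [('Asha', 10.0), ('Ravi', None)]

# Ranking: RANK() window function (SQLite 3.25+)
ranked = con.execute(
    "SELECT id, RANK() OVER (ORDER BY amount DESC) AS rnk "
    "FROM orders ORDER BY rnk"
).fetchall()
print(ranked)  # [(2, 1), (3, 2), (1, 3)]
```

In Hive/Spark SQL the syntax for these three patterns is essentially identical, which is why RDBMS-style SQL knowledge transfers directly.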

Posted 2 weeks ago

Apply

7.0 - 12.0 years

11 - 16 Lacs

Bengaluru

Work from Office

As a Fortune 50 company with more than 400,000 team members worldwide, Target is an iconic brand and one of America's leading retailers. Joining Target means promoting a culture of mutual care and respect and striving to make the most meaningful and positive impact. Becoming a Target team member means joining a community that values different voices and lifts each other up. Here, we believe your unique perspective is important, and you'll build relationships by being authentic and respectful. Overview about TII: At Target, we have a timeless purpose and a proven strategy. And that hasn't happened by accident. Some of the best minds from different backgrounds come together at Target to redefine retail in an inclusive learning environment that values people and delivers world-class outcomes. That winning formula is especially apparent in Bengaluru, where Target in India operates as a fully integrated part of Target's global team and has more than 4,000 team members supporting the company's global strategy and operations. Target as a tech company? Absolutely. We're the behind-the-scenes powerhouse that fuels Target's passion and commitment to cutting-edge innovation. We anchor every facet of one of the world's best-loved retailers with a strong technology framework that relies on the latest tools and technologies, and the brightest people, to deliver incredible value to guests online and in stores. Target Technology Services is on a mission to offer the systems, tools and support that guests and team members need and deserve. Our high-performing teams balance independence with collaboration, and we pride ourselves on being versatile, agile and creative. We drive industry-leading technologies in support of every angle of the business, and help ensure that Target operates smoothly, securely and reliably from the inside out. Role overview: As a Lead Engineer, you serve as the technical anchor for the engineering team that supports a product.
You create, own, and are responsible for the application architecture that best serves the product in its functional and non-functional needs. You identify and drive architectural changes to accelerate feature development or improve the quality of service (or both). You have deep and broad engineering skills and are capable of standing up an architecture in its entirety on your own, but you choose to influence a wider team by acting as a force multiplier. Core responsibilities of this job are described within this job description. Job duties may change at any time due to business needs. Use your skills, experience and talents to be a part of groundbreaking thinking and visionary goals. As a Lead Engineer, you'll take the lead as you: Use your technology acumen to apply and maintain knowledge of current and emerging technologies within specialized area(s) of the technology domain. Evaluate new technologies and participate in decision-making, accounting for several factors such as viability within Target's technical environment, maintainability, and cost of ownership. Initiate and execute research and proof-of-concept activities for new technologies. Lead or set strategy for testing and debugging at the platform or enterprise level. In complex and unstructured situations, serve as an expert resource to create and improve standards and best practices to ensure high-performance, scalable, repeatable, and secure deliverables. Lead the design, lifecycle management, and total cost of ownership of services. Provide the team with thought leadership to promote re-use and develop consistent, scalable patterns. Participate in planning services that have enterprise impact. Provide suggestions for handling routine and moderately complex technical problems, escalating issues when appropriate.
Gather information, data, and input from a wide variety of sources; identify additional resources when appropriate, engage with appropriate stakeholders, and conduct in-depth analysis of information. Develop plans and schedules, estimate resource requirements, and define milestones and deliverables. Monitor workflow and risks; play a leadership role in mitigating risks and removing obstacles. Lead and participate in complex construction, automation, and implementation activities, ensuring successful implementation with architectural and operational requirements met. Establish new standards and best practices to monitor, test, automate, and maintain IT components or systems. Serve as an expert resource in disaster recovery and disaster recovery planning. Stay current with Target's technical capabilities, infrastructure, and technical environment. Develop fully attributed data models, including logical, physical, and canonical. Influence data standards, policies, and procedures. Install, configure, and/or tune data management solutions with minimal guidance. Monitor data management solution(s) and identify optimization opportunities. About you: Bachelor's degree (or equivalent experience) in Computer Science, Engineering, or a related field. 7+ years of hands-on software development experience, including at least one full-cycle project implementation. Expertise in Target's technology landscape, with a solid understanding of industry trends, competitors' products, and differentiating features. Proficient in Kotlin with advanced knowledge of microservices architecture and event-driven architectures. Strong experience with high-priority, large-scale applications capable of processing millions of records. Proven ability to design and implement highly scalable and observable systems.
Working on mission-critical applications with large transaction volumes and high throughput. Building systems that are scalable, with a focus on performance and resilience. Leveraging cutting-edge tools for data correlation and pattern analysis. Experience with Scala, Hadoop, and other Big Data technologies is preferred. Strong retail domain knowledge with experience working on multi-channel platforms. Hands-on experience with high-performance messaging platforms that are highly scalable. Useful Links: Life at Target: https://india.target.com/ Benefits: https://india.target.com/life-at-target/workplace/benefits Culture: https://india.target.com/life-at-target/belonging

Posted 2 weeks ago

Apply

4.0 - 9.0 years

9 - 13 Lacs

Bengaluru

Work from Office

As a Fortune 50 company with more than 400,000 team members worldwide, Target is an iconic brand and one of America's leading retailers. Joining Target means promoting a culture of mutual care and respect and striving to make the most meaningful and positive impact. Becoming a Target team member means joining a community that values different voices and lifts each other up. Here, we believe your unique perspective is important, and you'll build relationships by being authentic and respectful. Overview about TII: At Target, we have a timeless purpose and a proven strategy. And that hasn't happened by accident. Some of the best minds from different backgrounds come together at Target to redefine retail in an inclusive learning environment that values people and delivers world-class outcomes. That winning formula is especially apparent in Bengaluru, where Target in India operates as a fully integrated part of Target's global team and has more than 4,000 team members supporting the company's global strategy and operations. Our global in-house technology team of more than 5,000 engineers, data scientists, architects, coaches and product managers strives to make Target the most convenient, safe and joyful place to shop. We use agile practices and leverage open-source software to adapt and build best-in-class technology for our team members and guests, and we do so with a focus on diversity and inclusion, experimentation and continuous learning. Pyramid Overview: Our Product Engineering teams fuel Target's business with cutting-edge technology to deliver incredible experiences and value for guests and team members. Using a responsive architecture platform, we build and deploy industry-leading technology enabling Target to operate efficiently, securely, and reliably from the inside out. We work across Target, developing comprehensive product strategies, leveraging enterprise and guest feedback to set the standard for best in retail.
Position Overview: 4+ years of experience in software design and development, with 3+ years of experience in building scalable backend applications using Java. Demonstrates broad and deep expertise in Java/Kotlin and frameworks. Designs, develops, and approves end-to-end functionality of a product line, platform, or infrastructure. Communicates and coordinates with project team, partners, and stakeholders. Demonstrates expertise in analysis and optimization of systems capacity, performance, and operational health. Maintains deep technical knowledge within areas of expertise. Stays current with new and evolving technologies via formal training and self-directed education. Experience integrating with third-party and open-source frameworks. About You: 4-year degree or equivalent experience. Experience: 4-7 years. Programming experience with Java (Spring Boot) and Kotlin (Micronaut). Strong problem-solving skills with a good understanding of data structures and algorithms. Must have exposure to non-relational databases like MongoDB. Must have exposure to distributed systems and microservice architecture. Good to have: exposure to Data Pipeline, ML Ops, Spark, Python. Demonstrates a solid understanding of the impact of own work on the team and/or guests. Writes and organizes code using multiple computer languages, including distributed programming, and understands different frameworks and paradigms. Delivers high-performance, scalable, repeatable, and secure deliverables with broad impact (high throughput and low latency). Influences and applies data standards, policies, and procedures. Maintains technical knowledge within areas of expertise. Stays current with new and evolving technologies via formal training and self-directed education. Know More Here: Life at Target - https://india.target.com/ Benefits - https://india.target.com/life-at-target/workplace/benefits Follow us on social media - https://www.linkedin.com/company/target/ Target Tech - https://tech.target.com/

Posted 2 weeks ago

Apply

9.0 - 16.0 years

30 - 35 Lacs

Bengaluru

Work from Office

KPMG India is looking for Manager - Azure Data Engineering to join our dynamic team and embark on a rewarding career journey Delegating responsibilities and supervising business operations Hiring, training, motivating and coaching employees as they provide attentive, efficient service to customers, assessing employee performance and providing helpful feedback and training opportunities. Resolving conflicts or complaints from customers and employees. Monitoring store activity and ensuring it is properly provisioned and staffed. Analyzing information and processes and developing more effective or efficient processes and strategies. Establishing and achieving business and profit objectives. Maintaining a clean, tidy business, ensuring that signage and displays are attractive. Generating reports and presenting information to upper-level managers or other parties. Ensuring staff members follow company policies and procedures. Other duties to ensure the overall health and success of the business.

Posted 2 weeks ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Introduction: A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio. Your Role And Responsibilities: The Developer leads the cloud application development/deployment. A developer's responsibility is to lead the execution of a project by working with a senior-level resource on assigned development/deployment activities, and to design, build, and maintain cloud environments focusing on uptime, access, control, and network security using automation and configuration management tools. Preferred Education: Master's Degree. Required Technical And Professional Expertise: Strong proficiency in Java, Spring Framework, Spring Boot, and RESTful APIs; excellent understanding of OOP and design patterns. Strong knowledge of ORM tools like Hibernate or JPA; Java-based microservices frameworks; hands-on experience with Spring Boot microservices. Primary Skills: Core Java, Spring Boot, Java2/EE, Microservices, Hadoop Ecosystem (HBase, Hive, MapReduce, HDFS, Pig, Sqoop, etc.), Spark; good to have Python. Strong knowledge of microservice logging, monitoring, debugging and testing; in-depth knowledge of relational databases (e.g., MySQL). Experience in container platforms such as Docker and Kubernetes; experience in messaging platforms such as Kafka or IBM MQ; good understanding of Test-Driven Development. Familiar with Ant, Maven or another build automation framework; good knowledge of basic UNIX commands; experience in concurrent design and multi-threading.
Preferred Technical And Professional Experience Experience in Concurrent design and multi-threading Primary Skills: - Core Java, Spring Boot, Java2/EE, Microservices - Hadoop Ecosystem (HBase, Hive, MapReduce, HDFS, Pig, Sqoop etc) - Spark Good to have Python.

Posted 2 weeks ago

Apply

0 years

0 Lacs

India

On-site

Hiring: Part-Time Hadoop & Apache Spark Trainer (Big Data – Data Engineering) Join LCMGO – The Live Course Marketplace Are you a Big Data professional with hands-on expertise in Hadoop, Apache Spark, and data engineering tools? LCMGO is looking for engaging trainers to deliver live, hands-on sessions on industry-relevant Big Data technologies. 🔍 What You’ll Do • Deliver live, interactive classes on Big Data tools & frameworks: Hadoop, Spark (RDD/DataFrame/Structured Streaming), Hive, HDFS, Sqoop, Kafka, etc. • Guide learners through data engineering workflows, real-world big data projects, and hands-on labs • Provide job-focused insights, interview prep, Q&A support, and mentoring • Help professionals become job-ready for roles in Big Data and Data Engineering • Design practice assignments, live demos, and capstone projects simulating enterprise data pipelines ✅ Who Can Apply • Professionals with practical experience in Big Data technologies (Hadoop/Spark and related tools) • Trainers with prior teaching or mentoring experience (preferred) – freshers passionate about teaching are welcome too • Strong communication skills with a passion for simplifying complex data topics 💡 Why LCMGO? • Flexible part-time opportunity • Teach passionate learners globally • Be part of a fast-growing learning platform 📩 Apply now and help build the next generation of data engineering experts!

Posted 2 weeks ago

Apply

5.0 - 8.0 years

4 - 7 Lacs

Bengaluru

Work from Office

Role Purpose: The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists. Do: Oversee and support the process by reviewing daily transactions on performance parameters. Review the performance dashboard and the scores for the team. Support the team in improving performance parameters by providing technical support and process guidance. Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions. Ensure standard processes and procedures are followed to resolve all client queries. Resolve client queries as per the SLAs defined in the contract. Develop an understanding of the process/product for the team members to facilitate better client interaction and troubleshooting. Document and analyze call logs to spot the most frequent trends and prevent future problems. Identify red flags and escalate serious client issues to the team leader in cases of untimely resolution. Ensure all product information and disclosures are given to clients before and after the call/email requests. Avoid legal challenges by monitoring compliance with service agreements. Handle technical escalations through effective diagnosis and troubleshooting of client queries. Manage and resolve technical roadblocks/escalations as per SLA and quality requirements. If unable to resolve the issues, escalate them to TA & SES in a timely manner. Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions. Troubleshoot all client queries in a user-friendly, courteous and professional manner. Offer alternative solutions to clients (where appropriate) with the objective of retaining the customer's and client's business. Organize ideas and effectively communicate oral messages appropriate to listeners and situations. Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs. Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client. Mentor and guide Production Specialists on improving technical knowledge. Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists. Develop and conduct trainings (triages) within products for Production Specialists as per target. Inform the client about the triages being conducted. Undertake product trainings to stay current with product features, changes and updates. Enroll in product-specific and any other trainings per client requirements/recommendations. Identify and document the most common problems and recommend appropriate resolutions to the team. Update job knowledge by participating in self-learning opportunities and maintaining personal networks. Mandatory Skills: AWS Glue. Experience: 5-8 Years.

Posted 2 weeks ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Introduction: A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio. Your Role And Responsibilities: The Developer leads the cloud application development/deployment. A developer's responsibility is to lead the execution of a project by working with a senior-level resource on assigned development/deployment activities, and to design, build, and maintain cloud environments focusing on uptime, access, control, and network security using automation and configuration management tools. Preferred Education: Master's Degree. Required Technical And Professional Expertise: Strong proficiency in Java, Spring Framework, Spring Boot, and RESTful APIs; excellent understanding of OOP and design patterns. Strong knowledge of ORM tools like Hibernate or JPA; Java-based microservices frameworks; hands-on experience with Spring Boot microservices. Primary Skills: Core Java, Spring Boot, Java2/EE, Microservices, Hadoop Ecosystem (HBase, Hive, MapReduce, HDFS, Pig, Sqoop, etc.), Spark; good to have Python. Strong knowledge of microservice logging, monitoring, debugging and testing; in-depth knowledge of relational databases (e.g., MySQL). Experience in container platforms such as Docker and Kubernetes; experience in messaging platforms such as Kafka or IBM MQ; good understanding of Test-Driven Development. Familiar with Ant, Maven or another build automation framework; good knowledge of basic UNIX commands; experience in concurrent design and multi-threading.
Preferred Technical And Professional Experience Experience in Concurrent design and multi-threading Primary Skills: - Core Java, Spring Boot, Java2/EE, Microservices - Hadoop Ecosystem (HBase, Hive, MapReduce, HDFS, Pig, Sqoop etc) - Spark Good to have Python.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

4 - 8 Lacs

Bengaluru

Work from Office

The purpose of this role is to design, test, and maintain software programs for operating systems or applications which need to be deployed at a client end, and to ensure they meet 100% quality assurance parameters. Position Overview: Seeking a skilled Python and Airflow developer with a strong data engineering background. Key Responsibilities: Develop and maintain data pipelines using Apache Airflow and Python. Collaborate with business managers to understand data requirements. Optimize existing data workflows for performance and reliability. Manage SQL and NoSQL databases for business intelligence. Ensure data quality and security. Troubleshoot data pipeline issues. Required Qualifications: Proven experience in Python programming and data engineering. Hands-on experience with Apache Airflow. Strong understanding of SQL and experience with SQL databases. Familiarity with NoSQL databases. Experience in data modeling and ETL processes. Must have: Python and Apache Airflow, preferably 5+ years of experience.
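An Airflow pipeline is a DAG of tasks executed in dependency order. As a rough, stdlib-only illustration of that idea (this is not Airflow's API, and the task names are invented), a chain of extract, transform, and load steps can be ordered and run like this:

```python
# Illustration of what an orchestrator like Airflow does at its core:
# resolve task dependencies into an execution order, then run each task.
from graphlib import TopologicalSorter

def extract():   return "raw"
def transform(): return "clean"
def load():      return "loaded"

tasks = {"extract": extract, "transform": transform, "load": load}
# Each key depends on the tasks in its set (task -> upstream tasks).
deps = {"transform": {"extract"}, "load": {"transform"}}

order = list(TopologicalSorter(deps).static_order())
results = {name: tasks[name]() for name in order}
print(order)  # ['extract', 'transform', 'load']
```

Airflow adds scheduling, retries, backfills, and monitoring on top, but declaring tasks and the edges between them is the same mental model as the `deps` mapping above.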

Posted 2 weeks ago

Apply

5.0 - 8.0 years

7 - 10 Lacs

Bengaluru

Work from Office

Must-have skills: Cloud certified in one of these categories - Azure Data Engineer; Azure Data Factory, Azure Databricks Spark (PySpark or Scala), SQL, data ingestion, curation; Semantic modelling / optimization of the data model to work within Rahona; Experience in Azure ingestion from on-prem sources, e.g. mainframe, SQL Server, Oracle; Experience in Sqoop / Hadoop; Microsoft Excel (for metadata files with requirements for ingestion); Any other certificate in Azure/AWS/GCP and hands-on data engineering experience in cloud; Strong programming skills with at least one of Python, Scala, or Java. Mandatory Skills: DataBricks - Data Engineering. Experience: 5-8 Years.
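The "Sqoop / Hadoop" ingestion from on-prem sources mentioned above typically means running `sqoop import` against a JDBC source and landing the data in HDFS. As a sketch, the helper below only builds such a command as an argument list (the connection string, table, and paths are made up); actually running it requires Sqoop on a Hadoop edge node:

```python
# Build a typical `sqoop import` invocation as an argument list.
# All values here are illustrative, not a real environment.
def sqoop_import_cmd(jdbc_url, table, target_dir, split_by, mappers=4):
    return [
        "sqoop", "import",
        "--connect", jdbc_url,       # JDBC URL of the on-prem source
        "--table", table,            # source table to pull
        "--target-dir", target_dir,  # HDFS landing directory
        "--split-by", split_by,      # column used to parallelize mappers
        "--num-mappers", str(mappers),
    ]

cmd = sqoop_import_cmd("jdbc:oracle:thin:@db:1521/ORCL", "ORDERS",
                       "/data/raw/orders", "ORDER_ID")
print(" ".join(cmd))
```

The `--split-by` column and mapper count control how the import is parallelized, which is usually the first tuning knob for large on-prem tables.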

Posted 2 weeks ago

Apply

1.0 - 5.0 years

3 - 7 Lacs

Pune

Work from Office

Role Purpose: The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists. Key skills: Azure Data Factory (primary), Azure Databricks Spark (PySpark, SQL). Experience: 5 to 10 years. Must-have skills: Cloud certified in one of these categories - Azure Data Engineer; Azure Data Factory, Azure Databricks Spark (PySpark or Scala), SQL, data ingestion, curation; Semantic modelling / optimization of the data model to work within Rahona; Experience in Azure ingestion from on-prem sources, e.g. mainframe, SQL Server, Oracle; Experience in Sqoop / Hadoop; Microsoft Excel (for metadata files with requirements for ingestion); Any other certificate in Azure/AWS/GCP and hands-on data engineering experience in cloud; Strong programming skills with at least one of Python, Scala, or Java; Strong SQL skills (T-SQL or PL-SQL); Data file movement via mailbox; Source-code versioning/promotion tools, e.g. Git/Jenkins; Orchestration tools, e.g. Autosys, Oozie. Nice-to-have skills: Experience working with mainframe files; Experience in an Agile environment, JIRA/Confluence tool.

Posted 2 weeks ago

Apply

5.0 - 8.0 years

7 - 10 Lacs

Chennai

Work from Office

Role Purpose: The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists. Do: Oversee and support the process by reviewing daily transactions on performance parameters. Review the performance dashboard and the scores for the team. Support the team in improving performance parameters by providing technical support and process guidance. Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions. Ensure standard processes and procedures are followed to resolve all client queries. Resolve client queries as per the SLAs defined in the contract. Develop an understanding of the process/product for the team members to facilitate better client interaction and troubleshooting. Document and analyze call logs to spot the most frequent trends and prevent future problems. Identify red flags and escalate serious client issues to the team leader in cases of untimely resolution. Ensure all product information and disclosures are given to clients before and after the call/email requests. Avoid legal challenges by monitoring compliance with service agreements. Handle technical escalations through effective diagnosis and troubleshooting of client queries. Manage and resolve technical roadblocks/escalations as per SLA and quality requirements. If unable to resolve the issues, escalate them to TA & SES in a timely manner. Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions. Troubleshoot all client queries in a user-friendly, courteous and professional manner. Offer alternative solutions to clients (where appropriate) with the objective of retaining the customer's and client's business. Organize ideas and effectively communicate oral messages appropriate to listeners and situations. Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs. Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client. Mentor and guide Production Specialists on improving technical knowledge. Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists. Develop and conduct trainings (triages) within products for Production Specialists as per target. Inform the client about the triages being conducted. Undertake product trainings to stay current with product features, changes and updates. Enroll in product-specific and any other trainings per client requirements/recommendations. Identify and document the most common problems and recommend appropriate resolutions to the team. Update job knowledge by participating in self-learning opportunities and maintaining personal networks. Mandatory Skills: DataBricks - Data Engineering. Experience: 5-8 Years.

Posted 2 weeks ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Company Description Blend is a premier AI services provider, committed to co-creating meaningful impact for its clients through the power of data science, AI, technology, and people. With a mission to fuel bold visions, Blend tackles significant challenges by seamlessly aligning human expertise with artificial intelligence. The company is dedicated to unlocking value and fostering innovation for its clients by harnessing world-class people and data-driven strategy. We believe that the power of people and AI can have a meaningful impact on your world, creating more fulfilling work and projects for our people and clients. For more information, visit www.blend360.com Job Description You will be a key member of our Data Engineering team, focused on designing, developing, and maintaining robust data solutions on on-premise environments. You will work closely with internal teams and client stakeholders to build and optimize data pipelines and analytical tools using Python, PySpark, SQL, and Hadoop ecosystem technologies. This role requires deep hands-on experience with big data technologies in traditional data center environments (non-cloud). What you’ll be doing? 
- Design, build, and maintain on-premise data pipelines to ingest, process, and transform large volumes of data from multiple sources into data warehouses and data lakes
- Develop and optimize PySpark and SQL jobs for high-performance batch and real-time data processing
- Ensure the scalability, reliability, and performance of data infrastructure in an on-premise setup
- Collaborate with data scientists, analysts, and business teams to translate their data requirements into technical solutions
- Troubleshoot and resolve issues in data pipelines and data processing workflows
- Monitor, tune, and improve Hadoop clusters and data jobs for cost and resource efficiency
- Stay current with on-premise big data technology trends and suggest enhancements to improve data engineering capabilities

Qualifications:
- Bachelor's degree in Computer Science, Software Engineering, or a related field
- 6+ years of experience in data engineering or a related domain
- Strong programming skills in Python (with experience in PySpark)
- Expertise in SQL with a solid understanding of data warehousing concepts
- Hands-on experience with Hadoop ecosystem components (e.g., HDFS, Hive, Oozie, Sqoop)
- Proven ability to design and manage data solutions in on-premise environments (no cloud dependency)
- Strong problem-solving skills with an ability to work independently and collaboratively
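A recurring concern in batch pipelines of the kind described above is incremental (watermark-based) ingestion: only pull rows newer than the last successful load. A minimal sketch in plain Python, with hypothetical record data and field names; in a real job the source would be a JDBC table or an HDFS path.

```python
# Watermark-based incremental load: keep only records newer than the last
# watermark, then advance the watermark. Records and field names are
# hypothetical placeholders.

def incremental_load(records, watermark):
    """Return records newer than the watermark, plus the new watermark."""
    fresh = [r for r in records if r["updated_at"] > watermark]
    new_watermark = max((r["updated_at"] for r in fresh), default=watermark)
    return fresh, new_watermark

source = [
    {"id": 1, "updated_at": "2024-01-01T10:00:00"},
    {"id": 2, "updated_at": "2024-01-02T09:30:00"},
    {"id": 3, "updated_at": "2024-01-03T08:15:00"},
]

# First run picks up everything after the initial watermark.
batch, wm = incremental_load(source, "2024-01-01T12:00:00")
print([r["id"] for r in batch], wm)  # → [2, 3] 2024-01-03T08:15:00
```

A second run with the returned watermark yields an empty batch, which is the idempotence property that makes reruns of a failed job safe.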

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Key responsibilities:
- Work with clients to understand their data; based on that understanding, build the data structures and pipelines
- Work on the application end to end, collaborating with UI and other development teams
- Build the data pipelines to migrate and load the data into HDFS, either on-prem or in the cloud
- Develop data ingestion/processing/integration pipelines effectively
- Create Hive data structures and metadata, and load the data into data lakes / big data warehouse environments
- Optimize (performance-tune) data pipelines effectively to minimize cost
- Keep code version control and the git repository up to date
- Build and maintain CI/CD for the data pipelines
- Manage the unit testing of all data pipelines

Requirements, Skills & Experience:
- Bachelor's degree in computer science or a related field
- Minimum of 5+ years of working experience with Spark and Hadoop ecosystems
- Minimum of 4+ years of working experience designing data streaming pipelines
- Expert in Python, Scala, or Java
- Experience in data ingestion and integration into a data lake using Hadoop ecosystem tools such as Sqoop, Spark, SQL, Hive, Airflow, etc.
- Experience optimizing (performance tuning) data pipelines
- Minimum of 3+ years of experience with NoSQL and Spark Streaming
- Knowledge of Kubernetes and Docker is a plus
- Experience with cloud services, either Azure or AWS
- Experience with an on-prem distribution such as Cloudera/Hortonworks/MapR
- Basic understanding of CI/CD pipelines
- Basic knowledge of the Linux environment and commands
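The Sqoop-based ingestion step mentioned above is typically parameterised per source table. A sketch of how such a command might be assembled; the JDBC URL, table, and database names are hypothetical, while the flags (`--connect`, `--hive-import`, `--hive-database`, `--split-by`, `--num-mappers`) are standard Sqoop 1.x import options.

```python
# Assemble a Sqoop import command that lands a relational table in Hive.
# Connection and table names below are illustrative placeholders.

def sqoop_import_cmd(jdbc_url, table, hive_db, split_col, mappers=4):
    return " ".join([
        "sqoop import",
        f"--connect {jdbc_url}",
        f"--table {table}",
        "--hive-import",                 # create/load a Hive table
        f"--hive-database {hive_db}",
        f"--split-by {split_col}",       # column used to parallelise the import
        f"--num-mappers {mappers}",      # degree of parallelism
    ])

cmd = sqoop_import_cmd("jdbc:mysql://dbhost:3306/sales", "orders", "staging", "order_id")
print(cmd)
```

In practice an orchestrator such as Airflow or Oozie would render one such command per table and run it as a shell task.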

Posted 3 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

You are a Senior Software Engineer at Elevance Health, a prominent health company in America dedicated to enhancing lives and simplifying healthcare. Elevance Health is the largest managed healthcare company in the Blue Cross Blue Shield (BCBS) Association, serving over 45 million lives across 14 states. This Fortune 500 company is currently ranked 20th and led by Gail Boudreaux, a prominent figure in the Fortune list of most powerful women. Your role will be within Carelon Global Solutions (CGS), a subsidiary of Elevance Health, focused on simplifying complex operational processes in the healthcare system. CGS brings together a global team of innovators across various locations, including Bengaluru and Gurugram in India, to optimize healthcare operations effectively and efficiently. As a Senior Software Engineer, your primary responsibility involves collaborating with data architects to implement data models and ensure seamless integration with AWS services. You will be responsible for supporting, monitoring, and resolving production issues to meet SLAs, being available 24/7 for business application support. You should have hands-on experience with technologies like Snowflake, Python, AWS S3-Athena, RDS, Cloudwatch, Lambda, and more. Your expertise should include handling nested JSON files, analyzing daily loads/issues, working closely with admin/architect teams, and understanding complex job and data flows in the project. To qualify for this role, you need a Bachelor's degree in Information Technology/Data Engineering or equivalent education and experience, along with 5-8 years of overall IT experience and 2-9 years in AWS services. Experience in agile development processes is preferred. You are expected to have skills in Snowflake, AWS services, complex SQL queries, and technologies like Hadoop, Kafka, HBase, Sqoop, and Scala. Your ability to analyze, research, and solve technical problems will be crucial for success in this role. 
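The role above calls out handling nested JSON files before loading into stores like Snowflake or Athena. A common first step is flattening nested objects into dot-separated column names; a minimal plain-Python sketch, with a hypothetical sample payload.

```python
# Flatten nested JSON objects into flat column names (claim.member.state etc.).
# The payload below is a fabricated example, not a real claims record.
import json

def flatten(obj, prefix=""):
    """Recursively flatten nested dicts into dot-separated keys."""
    out = {}
    for key, value in obj.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            out.update(flatten(value, name + "."))
        else:
            out[name] = value
    return out

payload = json.loads('{"claim": {"id": 42, "member": {"state": "KY"}}, "status": "open"}')
flat = flatten(payload)
print(flat)  # → {'claim.id': 42, 'claim.member.state': 'KY', 'status': 'open'}
```

Snowflake can also query nested JSON natively via its VARIANT type, so flattening like this is one design choice among several, usually made for downstream tools that expect tabular data.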
Carelon promises limitless opportunities for its associates, emphasizing growth, well-being, purpose, and belonging. With a focus on learning and development, an innovative culture, and comprehensive rewards, Carelon offers a supportive environment for personal and professional growth. Carelon is an equal opportunity employer that values diversity and inclusivity. If you require accommodations due to a disability, you can request the Reasonable Accommodation Request Form. This is a full-time position that offers a competitive benefits package and a conducive work environment.

Posted 3 weeks ago

Apply

5.0 years

6 - 9 Lacs

Bengaluru

On-site

As a member of the Support organization, your focus is to deliver post-sales support and solutions to the Oracle customer base while serving as an advocate for customer needs. This involves resolving post-sales non-technical customer inquiries via phone and electronic means, as well as technical questions regarding the use of and troubleshooting for our Electronic Support Services. As a primary point of contact for customers, you are responsible for facilitating customer relationships with Support and providing advice and assistance to internal Oracle employees on diverse customer situations and escalated issues. Career Level - IC3. As a Sr. Support Engineer, you will be the technical interface to customers, Original Equipment Manufacturers (OEMs), and Value-Added Resellers (VARs) for the resolution of problems related to the installation, recommended maintenance, and use of Oracle products. You should have an understanding of all Oracle products in your competencies and in-depth knowledge of several products and/or platforms. You should also be highly experienced on multiple platforms and able to complete assigned duties with minimal direction from management. In this position, you will routinely act independently while researching and developing solutions to customer issues.
RESPONSIBILITIES:
- Manage and resolve Service Requests logged by customers (internal and external) on Oracle products and contribute to proactive support activities according to the product support strategy and model
- Provide expert-level troubleshooting and technical support for Oracle's Big Data Service (BDS), DFS, DIS, Data Catalog, and associated cloud services
- Diagnose and resolve complex issues across the Hadoop ecosystem (e.g., HDFS, YARN, Spark, Hive, Impala, Sqoop, Oozie)
- Manage cluster configurations, upgrades, patches, and installations using tools like Ambari
- Support real-time data processing frameworks (Kafka, Flink) and ETL pipelines (ODI, Informatica)
- Collaborate with OCI platform teams to support secure and scalable AI/ML data workflows
- Engage in hands-on support for agentic frameworks (LangChain, Semantic Kernel, CrewAI) and RAG-based systems
- Interact regularly with customers, build technical documentation, and contribute to knowledge sharing
- Collaborate cross-functionally with product engineering, infrastructure, and cloud ops teams for holistic support delivery

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related technical field
- 5+ years of proven experience supporting Oracle Big Data platforms, including Oracle's Big Data Service (BDS), DFS, DIS, Data Catalog, and Oracle Cloud Infrastructure (OCI)
- Strong expertise in the Hadoop ecosystem: HDFS, YARN, Spark, Hive, Impala, Sqoop, Oozie, Ranger, Kerberos
- Experience in Linux OS administration, networking, TLS/SSL, and SSO integration
- Experience with data integration tools (ODI/Informatica) and cloud data sources (Fusion Apps/BICC, Snowflake)
- Hands-on experience with LLMs, agentic frameworks (LangChain, Semantic Kernel, CrewAI), RAG pipelines, and vector databases (FAISS, Pinecone, Weaviate)
- Proficiency in Python and shell scripting

Skills & Competencies:
- Deep understanding of Oracle's Big Data Service (BDS), Data Flow Service (DFS), Data Integration Service (DIS), and Data Catalog architecture and operations
- Cluster administration using Ambari and troubleshooting across the Cloudera stack
- Real-time processing using Kafka and Flink
- AI/ML workflow support, including OCI Gen AI services and integration of agentic pipelines
- Working knowledge of cloud services, networking, system-level security, and distributed architectures
- Experience supporting multi-tier enterprise applications

Personal Competencies:
- Strong customer focus with the ability to handle escalations and technical deep dives
- Structured problem-solving mindset
- Self-motivated with a continuous-learning attitude
- Excellent communication, documentation, and global collaboration skills
- Results-oriented with a passion for service quality and technical excellence
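Diagnosis work across a Hadoop stack like the one above usually begins with triaging service logs for error hotspots. A toy sketch in plain Python; the log lines and component names are fabricated examples, not real Oracle BDS output.

```python
# Count ERROR lines per Hadoop component from raw log text, a first-pass
# triage step. Log content below is invented for illustration.
import re
from collections import Counter

LOG = """\
2024-05-01 10:01:22 INFO  yarn.ResourceManager: container allocated
2024-05-01 10:02:03 ERROR hive.HiveServer2: OutOfMemoryError in query compile
2024-05-01 10:02:40 ERROR hdfs.DataNode: block replica missing
2024-05-01 10:03:11 ERROR hive.HiveServer2: session timed out
"""

def error_counts(log_text):
    """Map component prefix (hive, hdfs, yarn, ...) to its ERROR line count."""
    pattern = re.compile(r"ERROR\s+(\w+)\.")
    return Counter(m.group(1) for m in pattern.finditer(log_text))

counts = error_counts(LOG)
print(counts)  # → Counter({'hive': 2, 'hdfs': 1})
```

In production this would read rotated logs from each node (or a log aggregator) rather than an inline string, but the pattern of clustering errors by component before deep-diving is the same.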

Posted 3 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

hyderabad, telangana

On-site

You should have 5-8 years of experience with the Hadoop ecosystem, especially in utilizing Hive for data querying and analysis. Experience in data modeling and ETL processes is essential. Proficiency in MySQL is required, including the ability to write complex queries and stored procedures and to optimize queries. You should be capable of working with large datasets for data analysis purposes. In this role, you will be expected to work closely with and mentor the team, actively contribute to discussions, and present findings clearly. The key skills for this position are Big Data, Hive, Spark, Sqoop, and MySQL.
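The "complex queries over large datasets" asked for above often reduces to patterns like latest-row-per-key. A sketch using sqlite3 from the Python standard library so it runs anywhere; the table and data are hypothetical, and the same SQL works in MySQL (and, with minor dialect changes, in HiveQL).

```python
# Latest recharge per subscriber via a grouped self-join, a staple
# analytical pattern. Table and rows are illustrative placeholders.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE recharge (msisdn TEXT, amount INT, ts TEXT)")
conn.executemany(
    "INSERT INTO recharge VALUES (?, ?, ?)",
    [
        ("9001", 100, "2024-01-01"),
        ("9001", 250, "2024-02-01"),
        ("9002", 50,  "2024-01-15"),
    ],
)

rows = conn.execute("""
    SELECT r.msisdn, r.amount, r.ts
    FROM recharge r
    JOIN (SELECT msisdn, MAX(ts) AS max_ts
          FROM recharge GROUP BY msisdn) m
      ON r.msisdn = m.msisdn AND r.ts = m.max_ts
    ORDER BY r.msisdn
""").fetchall()
print(rows)  # → [('9001', 250, '2024-02-01'), ('9002', 50, '2024-01-15')]
```

On engines that support window functions, the same result is usually written with `ROW_NUMBER() OVER (PARTITION BY msisdn ORDER BY ts DESC)`, which avoids the self-join.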

Posted 3 weeks ago

Apply

7.0 - 11.0 years

0 Lacs

haryana

On-site

About Prospecta Founded in 2002 in Sydney, Australia, with additional offices in India, North America, Canada, and a local presence in Europe, the UK, and Southeast Asia, Prospecta is dedicated to providing top-tier data management and automation software for enterprise clients. Our journey began with a mission to offer innovative solutions, leading us to become a prominent data management software company over the years. Our flagship product, MDO (Master Data Online), is an enterprise Master Data Management (MDM) platform designed to streamline data management processes, ensuring accurate, compliant, and relevant master data creation, as well as efficient data disposal. With a strong presence in asset-intensive industries such as Energy and Utilities, Oil and Gas, Mining, Infrastructure, and Manufacturing, we have established ourselves as a trusted partner in the field. Culture at Prospecta At Prospecta, our culture is centered around growth and embracing new challenges. We boast a passionate team that collaborates seamlessly to deliver value to our customers. Our diverse backgrounds create an exciting work environment that fosters a rich tapestry of perspectives and ideas. We are committed to nurturing an environment that focuses on both professional and personal development. Career progression at Prospecta is not just about climbing the corporate ladder but about encountering a continuous stream of meaningful opportunities that enhance personal growth and technical proficiency, all under the guidance of exceptional leaders. Our organizational structure emphasizes agility, responsiveness, and achieving tangible outcomes. If you thrive in a dynamic environment, enjoy taking on various roles, and are willing to go the extra mile to achieve goals, Prospecta is the ideal workplace for you. We continuously push boundaries while maintaining a sense of fun and celebrating victories, both big and small. About the Job Position: Jr. Platform Architect/ Sr. 
Backend Developer
Location: Gurgaon
Role Summary: In this role, you will be responsible for implementing technology solutions in a cost-effective manner by understanding project requirements and communicating them effectively to all stakeholders and facilitators.

Key Responsibilities:
- Collaborate with enterprise architects, data architects, developers & engineers, data scientists, and information designers to identify and define the necessary data structures, formats, pipelines, metadata, and workload orchestration capabilities
- Possess expertise in service architecture and development, ensuring high performance and scalability
- Demonstrate experience in Spark, Elasticsearch, and SQL performance tuning and optimization
- Showcase proficiency in the architectural design and development of large-scale data platforms and data applications
- Hands-on experience with AWS, Azure, and OpenShift
- Deep understanding of Spark and its internal architecture
- Expertise in designing and building new cloud data platforms and optimizing them at the organizational level
- Strong hands-on experience in Big Data technologies such as Hadoop, Sqoop, Hive, and Spark, including DevOps
- Solid SQL (Hive/Spark) skills and experience tuning complex queries

Must-Have:
- 7+ years of experience
- Proficiency in Java, Spring Boot, Apache Spark, AWS, OpenShift, PostgreSQL, Elasticsearch, message queues, microservice architecture, and Spark

Nice-to-Have:
- Knowledge of Angular, Python, Scala, Azure, and Kafka; familiarity with file formats such as Parquet, Avro, CSV, and JSON; and exposure to Hadoop, Hive, and HBase

What Will You Get
Growth Path: At Prospecta, your career journey is filled with growth and opportunities. Depending on your career trajectory, you can kickstart your career or accelerate your professional development in a dynamic work environment. Your success is our priority, and as you exhibit your abilities and achieve results, you will have the opportunity to quickly progress into leadership roles. We are dedicated to helping you enhance your experience and skills, providing you with the tools, support, and opportunities needed to reach new heights in your career.

Benefits:
- Competitive salary
- Health insurance
- Paid time off and holidays
- Continuous learning and career progression
- Opportunities to work onsite at various office locations and/or client sites
- Participation in annual company events and workshops

Posted 3 weeks ago

Apply

3.0 - 8.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Big Data Engineer

Qualification and Experience: B.Tech/B.E/MSc/MCA (any specialization); 3-8 years of experience

Responsibilities:
- Manage the core analytics support for the assigned account and lead the analysis and publication of reports
- Import, clean, transform, validate, or model data with the purpose of understanding it or drawing conclusions from it for business decisions
- Evaluate CVM campaign offerings for their impact, profitability, ROI, and targets
- Apply skills in segment identification and base management to enhance MOU and ARPU, and customer retention & loyalty
- Perform regular data mining to find recharge patterns, MoU trends, grace & churn, unique recharges, Power STV penetration, incremental ARPU, and zero usage
- Make extensive use of data-mining tools such as SAS, MS Access, SQL, and MS Excel to identify and exploit potential revenue streams
- Monitor and analyze the market penetration of various products
- Use Business Intelligence tools for pre- and post-launch analysis of products

Requirements of the role:
- 3 years of relevant experience in Hadoop architecture, MapReduce/YARN concepts, and Hive/Pig/Sqoop/Oozie

Job Code: Big Data Engineer
Location: Trivandrum/Mumbai
For more information, please mail: recruitment@flytxt.com

Posted 3 weeks ago

Apply

5.0 - 8.0 years

9 - 14 Lacs

Chennai

Work from Office

Role Purpose: The purpose of the role is to support process delivery by ensuring the daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Responsibilities:
- Oversee and support the process by reviewing daily transactions against performance parameters
- Review the performance dashboard and the scores for the team
- Support the team in improving performance parameters by providing technical support and process guidance
- Record, track, and document all queries received, problem-solving steps taken, and all successful and unsuccessful resolutions
- Ensure standard processes and procedures are followed to resolve all client queries
- Resolve client queries as per the SLAs defined in the contract
- Develop an understanding of the process/product to help team members facilitate better client interaction and troubleshooting
- Document and analyze call logs to spot the most frequent trends and prevent future problems
- Identify red flags and escalate serious client issues to the Team Leader in cases of untimely resolution
- Ensure all product information and disclosures are given to clients before and after call/email requests
- Avoid legal challenges by monitoring compliance with service agreements
- Handle technical escalations through effective diagnosis and troubleshooting of client queries
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements; if unable to resolve an issue, escalate it to TA & SES in a timely manner
- Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions
- Troubleshoot all client queries in a user-friendly, courteous, and professional manner
- Offer alternative solutions to clients (where appropriate) with the objective of retaining the customer's and client's business
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations
- Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs
- Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client
- Mentor and guide Production Specialists on improving technical knowledge
- Collate trainings to be conducted as triage to bridge skill gaps identified through interviews with the Production Specialists
- Develop and conduct trainings (triages) within products for Production Specialists as per target
- Inform the client about the triages being conducted
- Undertake product trainings to stay current with product features, changes, and updates
- Enroll in product-specific and any other trainings per client requirements/recommendations
- Identify and document the most common problems and recommend appropriate resolutions to the team
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Mandatory Skills: DataBricks - Data Engineering
Experience: 5-8 Years

Posted 3 weeks ago

Apply

5.0 - 8.0 years

9 - 14 Lacs

Pune

Work from Office

Role Purpose: The purpose of the role is to support process delivery by ensuring the daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Responsibilities:
- Oversee and support the process by reviewing daily transactions against performance parameters
- Review the performance dashboard and the scores for the team
- Support the team in improving performance parameters by providing technical support and process guidance
- Record, track, and document all queries received, problem-solving steps taken, and all successful and unsuccessful resolutions
- Ensure standard processes and procedures are followed to resolve all client queries
- Resolve client queries as per the SLAs defined in the contract
- Develop an understanding of the process/product to help team members facilitate better client interaction and troubleshooting
- Document and analyze call logs to spot the most frequent trends and prevent future problems
- Identify red flags and escalate serious client issues to the Team Leader in cases of untimely resolution
- Ensure all product information and disclosures are given to clients before and after call/email requests
- Avoid legal challenges by monitoring compliance with service agreements
- Handle technical escalations through effective diagnosis and troubleshooting of client queries
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements; if unable to resolve an issue, escalate it to TA & SES in a timely manner
- Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions
- Troubleshoot all client queries in a user-friendly, courteous, and professional manner
- Offer alternative solutions to clients (where appropriate) with the objective of retaining the customer's and client's business
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations
- Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs
- Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client
- Mentor and guide Production Specialists on improving technical knowledge
- Collate trainings to be conducted as triage to bridge skill gaps identified through interviews with the Production Specialists
- Develop and conduct trainings (triages) within products for Production Specialists as per target
- Inform the client about the triages being conducted
- Undertake product trainings to stay current with product features, changes, and updates
- Enroll in product-specific and any other trainings per client requirements/recommendations
- Identify and document the most common problems and recommend appropriate resolutions to the team
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Mandatory Skills: DataBricks - Data Engineering
Experience: 5-8 Years

Posted 3 weeks ago

Apply