
437 Apache Pig Jobs

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the employer's job portal.

3.0 - 6.0 years

6 - 12 Lacs

Bengaluru

Work from Office

ACL Digital is looking for a Data Engineer to join our dynamic team and embark on a rewarding career journey:
- Liaising with coworkers and clients to elucidate the requirements for each task
- Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed
- Reformulating existing frameworks to optimize their functioning
- Testing such structures to ensure that they are fit for use
- Preparing raw data for manipulation by data scientists
- Detecting and correcting errors in your work
- Ensuring that your work remains backed up and readily accessible to relevant coworkers
- Remaining up-to-date with industry standards and technological advancements that will improve the quality of your outputs

Posted 9 hours ago

Apply

3.0 - 6.0 years

5 - 8 Lacs

Noida

Work from Office

Diverse Lynx is looking for a Data Engineer to join our dynamic team and embark on a rewarding career journey:
- Liaising with coworkers and clients to elucidate the requirements for each task
- Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed
- Reformulating existing frameworks to optimize their functioning
- Testing such structures to ensure that they are fit for use
- Preparing raw data for manipulation by data scientists
- Detecting and correcting errors in your work
- Ensuring that your work remains backed up and readily accessible to relevant coworkers
- Remaining up-to-date with industry standards and technological advancements that will improve the quality of your outputs

Posted 9 hours ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Bengaluru

Work from Office

Diverse Lynx is looking for a Big Data Developer to join our dynamic team and embark on a rewarding career journey:
- Design, develop, and maintain big data solutions to meet business requirements and support data-driven decision making
- Work with stakeholders to understand their data needs and determine how best to use big data technologies to meet those needs
- Design and implement scalable, high-performance big data architectures using technologies such as Hadoop, Spark, and NoSQL databases
- Extract, transform, and load large data sets into a big data platform for analysis and reporting
- Write complex SQL queries and develop custom scripts to process big data
- Collaborate with data scientists, data analysts, and other stakeholders to develop predictive models and algorithms that drive insights and decision making
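
Listings like this one center on the extract-transform-load cycle. As a rough illustration only (the records, field names, and cleaning rules below are made up, not from any real system), the pattern looks like:

```python
# Minimal ETL sketch: extract raw records, clean them, load into a target store.
# All data and field names are illustrative.

def extract():
    # Stand-in for reading from a source system (files, HDFS, a database, ...).
    return [
        {"user_id": "1", "amount": " 120.50 ", "city": "bengaluru"},
        {"user_id": "2", "amount": "80", "city": "Chennai"},
        {"user_id": "2", "amount": "not-a-number", "city": "Chennai"},
    ]

def transform(rows):
    # Normalize types and casing; drop rows that fail validation.
    out = []
    for row in rows:
        try:
            out.append({
                "user_id": int(row["user_id"]),
                "amount": float(row["amount"].strip()),
                "city": row["city"].strip().title(),
            })
        except ValueError:
            continue  # skip malformed records
    return out

def load(rows, warehouse):
    # Append into the target store (a list standing in for a table).
    warehouse.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
print(loaded)                 # 2 rows survive cleaning
print(warehouse[0]["city"])   # Bengaluru
```

In a real role these stages would typically run on Spark or a similar engine, but the shape of the pipeline is the same.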

Posted 9 hours ago

Apply

3.0 - 6.0 years

5 - 8 Lacs

Chennai

Work from Office

Diverse Lynx is looking for a Data Engineer to join our dynamic team and embark on a rewarding career journey:
- Liaising with coworkers and clients to elucidate the requirements for each task
- Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed
- Reformulating existing frameworks to optimize their functioning
- Testing such structures to ensure that they are fit for use
- Preparing raw data for manipulation by data scientists
- Detecting and correcting errors in your work
- Ensuring that your work remains backed up and readily accessible to relevant coworkers
- Remaining up-to-date with industry standards and technological advancements that will improve the quality of your outputs

Posted 9 hours ago

Apply

3.0 - 7.0 years

10 - 14 Lacs

Hyderabad

Work from Office

The developer leads cloud application development and deployment: leading the execution of a project by working with a senior-level resource on assigned development/deployment activities, and designing, building, and maintaining cloud environments with a focus on uptime, access, control, and network security using automation and configuration management tools.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Strong proficiency in Java, Spring Framework, Spring Boot, and RESTful APIs; excellent understanding of OOP and design patterns
- Strong knowledge of ORM tools like Hibernate or JPA and Java-based microservices frameworks; hands-on experience with Spring Boot microservices
- Primary skills: Core Java, Spring Boot, Java2/EE, Microservices, Hadoop ecosystem (HBase, Hive, MapReduce, HDFS, Pig, Sqoop, etc.), Spark; good to have: Python
- Strong knowledge of microservice logging, monitoring, debugging, and testing
- In-depth knowledge of relational databases (e.g., MySQL)
- Experience with container platforms such as Docker and Kubernetes, and messaging platforms such as Kafka or IBM MQ
- Good understanding of test-driven development; familiarity with Ant, Maven, or other build automation frameworks
- Good knowledge of basic UNIX commands
- Experience in concurrent design and multi-threading

Preferred technical and professional experience:
- Experience in concurrent design and multi-threading

Posted 9 hours ago

Apply

6.0 - 10.0 years

5 - 10 Lacs

Bengaluru

Work from Office

As a member of the Support organization, your focus is to deliver post-sales support and solutions to the Oracle customer base while serving as an advocate for customer needs. This involves resolving post-sales non-technical customer inquiries via phone and electronic means, as well as technical questions regarding the use of and troubleshooting for our Electronic Support Services. As a primary point of contact for customers, you are responsible for facilitating customer relationships with Support and providing advice and assistance to internal Oracle employees on diverse customer situations and escalated issues. Career Level - IC3

As a Sr. Support Engineer, you will be the technical interface to customers, Original Equipment Manufacturers (OEMs), and Value-Added Resellers (VARs) for the resolution of problems related to the installation, recommended maintenance, and use of Oracle products. You should have an understanding of all Oracle products in your competency area and in-depth knowledge of several products and/or platforms, be highly experienced in multiple platforms, and be able to complete assigned duties with minimal direction from management. In this position, you will routinely act independently while researching and developing solutions to customer issues.

RESPONSIBILITIES:
- Manage and resolve Service Requests logged by customers (internal and external) on Oracle products, and contribute to proactive support activities according to the product support strategy and model
- Own and resolve problems and manage customer expectations throughout the Service Request lifecycle in accordance with global standards
- Work towards, adopt, and contribute to new processes and tools (diagnostic methodology, health checks, scripting tools, etc.)
- Contribute to Knowledge Management content creation and maintenance
- Work with development on product improvement programs (testing, SRP, BETA programs, etc.) as required
- Operate within Oracle business processes and procedures
- Respond to and resolve customer issues within Key Performance Indicator targets
- Maintain product expertise within the team
- Maintain an up-to-date and in-depth knowledge of new products released in the market for the supported product

QUALIFICATIONS:
- Bachelor's degree in Computer Science, Engineering, or a related technical field
- 5+ years of proven professional and technical experience in Big Data Appliance (BDA), Oracle Cloud Infrastructure (OCI), Linux OS, and areas such as the Cloudera distribution for Hadoop (CDH), HDFS, YARN, Spark, Hive, Sqoop, Oozie, and Intelligent Data Lake
- Excellent verbal and written skills in English

SKILLS & COMPETENCIES:
Minimum technical skills: As a member of the Big Data Appliance (BDA) team, the focus is to troubleshoot highly complex technical issues related to the Big Data Appliance and areas such as the Cloudera distribution for Hadoop (CDH), HDFS, YARN, Spark, Hive, Sqoop, Oozie, and Intelligent Data Lake. Good hands-on experience with Linux systems and Cloudera Hadoop architecture, administration, and troubleshooting, with good knowledge of different technology products, services, and processes. Responsible for resolving complex issues for BDA customers, including issues pertaining to Cloudera Hadoop, Big Data SQL, and BDA upgrades, patches, and installs. The candidate will also collaborate with other teams (Hardware, development, ODI, Oracle R, etc.) to help resolve customer issues on the BDA machine, interact with customer counterparts on a regular basis, and serve as the technology expert on the customer's behalf. Experience in a multi-tier architecture environment is required, along with a fundamental understanding of computer networking, systems, and database technologies.

Personal competencies:
- Desire to learn, or expand knowledge about, Oracle database and associated products
- Customer focus
- Structured problem recognition and resolution
- Experience of contributing to a shared knowledge base
- Experience of support-level work, such as resolving customer problems and managing customer expectations and escalations
- Communication, planning and organizing, working globally, quality, teamwork, and a results orientation

Qualifications: Career Level - IC3

Posted 13 hours ago

Apply

6.0 - 10.0 years

9 - 14 Lacs

Bengaluru

Work from Office

About Oracle FSGIU - Finergy: The Finergy division within Oracle FSGIU is dedicated to the Banking, Financial Services, and Insurance (BFSI) sector. We offer deep industry knowledge and expertise to address the complex financial needs of our clients. With proven methodologies that accelerate deployment and personalization tools that create loyal customers, Finergy has established itself as a leading provider of end-to-end banking solutions. Our single platform for a wide range of banking services enhances operational efficiency, and our expert consulting services ensure technology aligns with our clients' business goals.

Job Summary: We are seeking a skilled Senior Data Migration Engineer with expertise in AWS, Databricks, Python, PySpark, and SQL to lead and execute complex data migration projects. The ideal candidate will design, develop, and implement data migration solutions to move large volumes of data from legacy systems to modern cloud-based platforms, ensuring data integrity, accuracy, and minimal downtime.

Job Responsibilities:
- Software development: Design, develop, test, and deploy high-performance and scalable data solutions using Python, PySpark, and SQL. Collaborate with cross-functional teams to understand business requirements and translate them into technical specifications. Implement efficient and maintainable code using best practices and coding standards.
- AWS & Databricks implementation: Work with the Databricks platform for big data processing and analytics. Develop and maintain ETL processes using Databricks notebooks, and implement and optimize data pipelines for data transformation and integration. Utilize AWS services (e.g., S3, Glue, Redshift, Lambda) and Databricks to build and optimize data migration pipelines, leveraging PySpark for large-scale data processing and transformation tasks.
- Continuous learning: Stay updated on the latest industry trends, tools, and technologies related to Python, SQL, and Databricks. Share knowledge with the team and contribute to a culture of continuous improvement.
- SQL database management: Utilize expertise in SQL to design, optimize, and maintain relational databases. Write complex SQL queries for data retrieval, manipulation, and analysis.

Qualifications & Skills:
- Education: Bachelor's degree in Computer Science, Engineering, Data Science, or a related field; advanced degrees are a plus
- 4 to 8 years of experience in Databricks and big data frameworks
- Proficiency in AWS services and data migration
- Experience in Unity Catalog
- Familiarity with batch and real-time processing
- Data engineering experience with strong skills in Python, PySpark, and SQL
- Certifications: AWS Certified Solutions Architect, Databricks Certified Professional, or similar are a plus
- Soft skills: strong problem-solving and analytical skills, excellent communication and collaboration abilities, and the ability to work in a fast-paced, agile environment
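
The posting stresses data integrity during migration. One common post-migration sanity check is reconciling row counts and an order-independent content checksum between source and target. A minimal plain-Python sketch (the tables and key names are hypothetical; at production scale this would run in PySpark or SQL):

```python
import hashlib

# Post-migration reconciliation: compare row counts and an order-independent
# content fingerprint between source and target tables.

def table_fingerprint(rows, key_fields):
    # XOR of per-row digests is order-independent, so a reordered but
    # otherwise identical table still matches.
    fp = 0
    for row in rows:
        canon = "|".join(str(row[k]) for k in key_fields)
        fp ^= int(hashlib.sha256(canon.encode()).hexdigest(), 16)
    return fp

def reconcile(source, target, key_fields):
    return {
        "count_match": len(source) == len(target),
        "content_match": table_fingerprint(source, key_fields)
                         == table_fingerprint(target, key_fields),
    }

source = [{"id": 1, "amt": 10}, {"id": 2, "amt": 20}]
target = [{"id": 2, "amt": 20}, {"id": 1, "amt": 10}]  # same rows, new order
print(reconcile(source, target, ["id", "amt"]))
# {'count_match': True, 'content_match': True}
```

Because the fingerprint ignores row order, the check survives the repartitioning that typically happens when data moves between systems.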

Posted 13 hours ago

Apply

6.0 - 7.0 years

5 - 9 Lacs

Pune

Work from Office

Diverse Lynx is looking for a Cloudera Hadoop Administrator to join our dynamic team and embark on a rewarding career journey:
- Collaborate with cross-functional teams to achieve strategic outcomes
- Apply subject expertise to support operations, planning, and decision-making
- Utilize tools, analytics, or platforms relevant to the job domain
- Ensure compliance with policies while improving efficiency and outcomes

Disclaimer: This job description has been sourced from a public domain and may have been modified by Naukri.com to improve clarity for our users. We encourage job seekers to verify all details directly with the employer via their official channels before applying.

Posted 3 days ago

Apply

3.0 - 5.0 years

5 - 9 Lacs

Pune

Work from Office

Role Purpose: The purpose of this role is to design, test, and maintain software programs for operating systems or applications that need to be deployed at a client end, and to ensure they meet 100% quality assurance parameters.

Do:
1. Be instrumental in understanding the requirements and design of the product/software:
- Develop software solutions by studying information needs, systems flow, data usage, and work processes
- Investigate problem areas following the software development life cycle
- Facilitate root cause analysis of system issues and the problem statement
- Identify ideas to improve system performance and impact availability
- Analyze client requirements and convert requirements to feasible design
- Collaborate with functional teams or systems analysts who carry out the detailed investigation into software requirements
- Confer with project managers to obtain information on software capabilities

2. Perform coding and ensure optimal software/module development:
- Determine operational feasibility by evaluating analysis, problem definition, requirements, software development, and proposed software
- Develop and automate processes for software validation by setting up and designing test cases/scenarios/usage cases and executing them
- Modify software to fix errors, adapt it to new hardware, improve its performance, or upgrade interfaces
- Analyze information to recommend and plan the installation of new systems or modifications of an existing system
- Ensure that code is error-free, with no bugs or test failures
- Prepare reports on programming project specifications, activities, and status
- Ensure all defects are raised as per the norms defined for the project/program/account, with clear descriptions and replication patterns
- Compile timely, comprehensive, and accurate documentation and reports as requested
- Coordinate with the team on daily project status and progress, and document it
- Provide feedback on usability and serviceability, trace results to quality risk, and report them to concerned stakeholders

3. Provide status reporting and maintain customer focus on an ongoing basis with respect to the project and its execution:
- Capture all requirements and clarifications from the client for better-quality work
- Take feedback on a regular basis to ensure smooth and on-time delivery
- Participate in continuing education and training to remain current on best practices, learn new programming languages, and better assist other team members
- Consult with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements
- Document and demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments, and clear code
- Document all necessary details and reports formally for a proper understanding of the software, from client proposal to implementation
- Ensure good quality of interaction with the customer (e-mail content, fault report tracking, voice calls, business etiquette, etc.)
- Respond to customer requests in a timely manner, with no instances of complaints either internally or externally

Mandatory Skills: Scala programming. Experience: 3-5 years.

Posted 3 days ago

Apply

2.0 - 5.0 years

4 - 8 Lacs

Hyderabad

Work from Office

NTT DATA Services currently seeks a Python Hadoop Developer to join our team. Skillset: Python, Hadoop, ETL, RDBMS, Unix, GCP Vertex AI. Responsibilities: data ingestion pipelines, AI/ML model deployments, data engineering, and ML engineering with Python.

Posted 3 days ago

Apply

2.0 - 4.0 years

15 - 25 Lacs

Hyderabad

Work from Office

Overview: The Data Science team develops Machine Learning (ML) and Artificial Intelligence (AI) projects. The specific scope of this role is to develop ML solutions in support of ML/AI projects using big analytics toolsets in a CI/CD environment. Analytics toolsets may include DS tools, Spark, Databricks, and other technologies offered by Microsoft Azure or open-source toolsets. This role will also help automate the end-to-end cycle with Azure Pipelines. You will be part of a collaborative interdisciplinary team around data, where you will be responsible for our continuous delivery of statistical/ML models. You will work closely with process owners, product owners, and final business users, which will give you the right visibility into, and understanding of, the criticality of your developments.

Responsibilities:
- Deliver key Advanced Analytics/Data Science projects within time and budget, particularly around DevOps/MLOps and the Machine Learning models in scope
- Contribute actively to code and development in projects and services
- Partner with data engineers to ensure data access for discovery and that proper data is prepared for model consumption
- Partner with ML engineers working on industrialization
- Communicate with business stakeholders in the process of service design, training, and knowledge transfer
- Support large-scale experimentation and build data-driven models
- Refine requirements into modelling problems
- Influence product teams through data-based recommendations
- Research state-of-the-art methodologies
- Create documentation for learnings and knowledge transfer
- Create reusable packages or libraries
- Ensure on-time and on-budget delivery that satisfies project requirements, while adhering to enterprise architecture standards
- Leverage big data technologies to help process data and build scaled data pipelines (batch to real time)
- Implement the end-to-end ML lifecycle with Azure Databricks and Azure Pipelines
- Automate ML model deployments

Qualifications:
- BE/B.Tech in Computer Science, Maths, or related technical fields
- Overall 2-4 years of experience working as a Data Scientist
- 2+ years of experience building solutions in the commercial or supply chain space
- 2+ years working in a team to deliver production-level analytic solutions
- Fluent in git (version control); understanding of Jenkins and Docker is a plus
- Fluent in SQL syntax
- 2+ years of experience in statistical/ML techniques to solve supervised (regression, classification) and unsupervised problems
- 2+ years of experience developing business-problem-related statistical/ML modeling with industry tools, with a primary focus on Python or PySpark development
- Data science: hands-on experience and strong knowledge of building supervised and unsupervised machine learning models; knowledge of time series/demand forecast models is a plus
- Programming skills: hands-on experience in statistical programming languages like Python and PySpark, and database query languages like SQL
- Statistics: good applied statistical skills, including knowledge of statistical tests, distributions, regression, and maximum likelihood estimators
- Cloud (Azure): experience in Databricks and ADF is desirable; familiarity with Spark, Hive, and Pig is an added advantage
- Business storytelling and communicating data insights in a business-consumable format; fluency in one visualization tool
- Strong communication and organizational skills, with the ability to deal with ambiguity while juggling multiple priorities
- Experience with Agile methodology for teamwork and analytics product creation
- Experience in Reinforcement Learning is a plus
- Experience in simulation and optimization problems in any space is a plus
- Experience with Bayesian methods, causal inference, NLP, Responsible AI, or distributed machine learning is a plus
- Experience in DevOps, with hands-on experience with one or more cloud service providers: AWS, GCP, Azure (preferred)
- Model deployment experience is a plus
- Experience with version control systems like GitHub and CI/CD tools
- Experience in exploratory data analysis
- Knowledge of MLOps/DevOps and deploying ML models is preferred; experience using MLflow, Kubeflow, etc. is preferred
- Experience executing and contributing to MLOps automation infrastructure is good to have
- Exceptional analytical and problem-solving skills
- Stakeholder engagement: BU, vendors
- Experience building statistical models in the retail or supply chain space is a plus
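
Supervised regression, which qualifications like these mention repeatedly, reduces in its simplest form to fitting a line by ordinary least squares. A self-contained sketch with made-up data points (real work would use PySpark MLlib or scikit-learn, but the math is the same):

```python
# Fit y = a*x + b by ordinary least squares in closed form.

def fit_line(xs, ys):
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    # slope = covariance(x, y) / variance(x)
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    b = my - a * mx  # intercept from the means
    return a, b

xs = [1, 2, 3, 4]
ys = [3, 5, 7, 9]  # exactly y = 2x + 1
a, b = fit_line(xs, ys)
print(round(a, 6), round(b, 6))  # 2.0 1.0
```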

Posted 4 days ago

Apply

6.0 - 10.0 years

11 - 16 Lacs

Pune, Chennai, Bengaluru

Work from Office

ADB - DE: Primary: Azure, Databricks, ADF, PySpark/Python. Secondary: Data warehouse.

Must have:
- 10+ years of IT experience in data warehousing and ETL
- Hands-on data experience with cloud technologies on Azure: ADF, Synapse, PySpark/Python
- Ability to understand design and source-to-target mapping (STTM) and create specification documents
- Flexibility to operate from client office locations
- Ability to mentor and guide junior resources as needed

Nice to have:
- Any relevant certifications
- Banking experience in Risk & Regulatory, Commercial, or Credit Cards/Retail

Posted 4 days ago

Apply

3.0 - 7.0 years

10 - 14 Lacs

Chennai

Work from Office

The developer leads cloud application development and deployment: leading the execution of a project by working with a senior-level resource on assigned development/deployment activities, and designing, building, and maintaining cloud environments with a focus on uptime, access, control, and network security using automation and configuration management tools.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Strong proficiency in Java, Spring Framework, Spring Boot, and RESTful APIs; excellent understanding of OOP and design patterns
- Strong knowledge of ORM tools like Hibernate or JPA and Java-based microservices frameworks; hands-on experience with Spring Boot microservices
- Strong knowledge of microservice logging, monitoring, debugging, and testing
- In-depth knowledge of relational databases (e.g., MySQL)
- Experience with container platforms such as Docker and Kubernetes, and messaging platforms such as Kafka or IBM MQ
- Good understanding of test-driven development; familiarity with Ant, Maven, or other build automation frameworks
- Good knowledge of basic UNIX commands

Preferred technical and professional experience:
- Experience in concurrent design and multi-threading
- Primary skills: Core Java, Spring Boot, Java2/EE, Microservices, Hadoop ecosystem (HBase, Hive, MapReduce, HDFS, Pig, Sqoop, etc.), Spark; good to have: Python

Posted 4 days ago

Apply

2.0 - 5.0 years

8 - 9 Lacs

Chennai

Work from Office

Job Title: Hadoop Administrator
Location: Chennai, India
Experience: 5 years of experience in IT, with at least 2+ years of experience in cloud and system administration, and at least 3 years of experience with, and a strong understanding of, big data technologies in the Hadoop ecosystem: Hive, HDFS, Map/Reduce, Flume, Pig, Cloudera, HBase, Sqoop, Spark, etc.

Company: Smartavya Analytica Private Limited is a niche Data and AI company. Based in Pune, we are pioneers in data-driven innovation, transforming enterprise data into strategic insights. Established in 2017, our team has experience in handling large datasets of up to 20 PB in a single implementation, delivering many successful data and AI projects across major industries, including retail, finance, telecom, manufacturing, insurance, and capital markets. We are leaders in Big Data, Cloud, and Analytics projects with super-specialization in very large data platforms. https://smart-analytica.com

Smartavya Analytica is a leader in Big Data, Data Warehouse and Data Lake solutions, Data Migration services, and Machine Learning/Data Science projects on all possible flavours, namely on-prem, cloud, and migration both ways, across platforms such as traditional DWH/DL platforms, Big Data solutions on Hadoop, Public Cloud, and Private Cloud. Empowering your digital transformation with data modernization and AI.

Job Overview: Smartavya Analytica Private Limited is seeking an experienced Hadoop Administrator to manage and support our Hadoop ecosystem. The ideal candidate will have strong expertise in Hadoop cluster administration, excellent troubleshooting skills, and a proven track record of maintaining and optimizing Hadoop environments.

Key Responsibilities:
- Install, configure, and manage Hadoop clusters, including HDFS, YARN, Hive, HBase, and other ecosystem components
- Monitor and manage Hadoop cluster performance, capacity, and security
- Perform routine maintenance tasks such as upgrades, patching, and backups
- Implement and maintain data ingestion processes using tools like Sqoop, Flume, and Kafka
- Ensure high availability and disaster recovery of Hadoop clusters
- Collaborate with development teams to understand requirements and provide appropriate Hadoop solutions
- Troubleshoot and resolve issues related to the Hadoop ecosystem
- Maintain documentation of Hadoop environment configurations, processes, and procedures

Requirements:
- Experience installing, configuring, and tuning Hadoop distributions; hands-on experience with Cloudera
- Understanding of Hadoop design principles and the factors that affect distributed system performance, including hardware and network considerations
- Provide infrastructure recommendations, capacity planning, and workload management
- Develop utilities to better monitor clusters (Ganglia, Nagios, etc.)
- Manage large clusters with huge volumes of data; perform cluster maintenance tasks, including adding and removing nodes, cluster monitoring, and troubleshooting
- Manage and review Hadoop log files; install and implement security for Hadoop clusters
- Install Hadoop updates, patches, and version upgrades, and automate them through scripts
- Act as the point of contact for vendor escalation; work with Hortonworks to resolve issues
- Conceptual/working knowledge of basic data management concepts such as ETL, reference/master data, data quality, and RDBMS
- Working knowledge of a scripting language such as Shell, Python, or Perl
- Experience with orchestration and deployment tools

Academic Qualification: BE/B.Tech in Computer Science or equivalent, along with hands-on experience dealing with large data sets and distributed computing in data warehousing and business intelligence systems using Hadoop.

Posted 4 days ago

Apply

3.0 - 8.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Choosing Capgemini means choosing a company where you will be empowered to shape your career the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world.

Job Role:
- Very good understanding of the current work and the tools and technologies being used
- Comprehensive knowledge of, and clarity on, BigQuery, ETL, GCS, Airflow/Composer, SQL, and Python
- Experience working with fact and dimension tables and SCD
- Minimum 3 years' experience in GCP data engineering
- Java/Python/Spark on GCP; programming experience in at least one of Python, Java, or PySpark, plus SQL
- GCS (Cloud Storage), Composer (Airflow), and BigQuery experience
- Should have worked on handling big data

Your Profile:
- Strong data engineering experience using the Java or Python programming languages or Spark on Google Cloud
- Pipeline development experience using Dataflow or Dataproc (Apache Beam, etc.)
- Experience with any other GCP services or databases, such as Datastore, Bigtable, Spanner, Cloud Run, Cloud Functions, etc.
- Proven analytical skills and a problem-solving attitude
- Excellent communication skills

What you'll love about working here: You can shape your career with us. We offer a range of career paths and internal opportunities within the Capgemini group, and you will get personalized career guidance from our leaders. You will get comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, and new-parent support via flexible work. You will have the opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications.

About Capgemini: Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over-55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions, leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, generative AI, cloud, and data, combined with its deep industry expertise and partner ecosystem.
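
This role calls out slowly changing dimensions (SCD). In a Type 2 dimension, history is preserved by closing the current row and opening a new version whenever a tracked attribute changes. A minimal plain-Python sketch (the row layout, keys, and dates are illustrative; in BigQuery this would typically be a MERGE statement):

```python
# SCD Type 2: close the current dimension row and append a new version
# when a tracked attribute changes; no-op when nothing changed.

def scd2_apply(dim_rows, update, load_date):
    # dim_rows: dicts with keys id, city, valid_from, valid_to, current
    for row in dim_rows:
        if row["id"] == update["id"] and row["current"]:
            if row["city"] == update["city"]:
                return dim_rows  # no change: keep the current version
            row["valid_to"] = load_date  # close the old version
            row["current"] = False
            break
    dim_rows.append({
        "id": update["id"], "city": update["city"],
        "valid_from": load_date, "valid_to": None, "current": True,
    })
    return dim_rows

dim = [{"id": 7, "city": "Pune", "valid_from": "2023-01-01",
        "valid_to": None, "current": True}]
scd2_apply(dim, {"id": 7, "city": "Bengaluru"}, "2024-06-01")
print(len(dim))  # 2: the closed historical row plus the new current row
```

Fact rows loaded before the change keep pointing at the closed version, which is exactly what lets historical reports reproduce old attribute values.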

Posted 4 days ago

Apply

6.0 - 11.0 years

14 - 17 Lacs

Hyderabad

Work from Office

As an Application Developer, you will lead IBM into the future by translating system requirements into the design and development of customized systems in an agile environment. The success of IBM is in your hands as you transform vital business needs into code and drive innovation. Your work will power IBM and its clients globally, collaborating and integrating code into enterprise systems. You will have access to the latest education, tools, and technology, and a limitless career path with the world's technology leader. Come to IBM and make a global impact.

Responsibilities:
- Manage end-to-end feature development and resolve challenges faced in implementing it
- Learn new technologies and apply them in feature development within the time frame provided
- Manage debugging, root cause analysis, and fixing of the issues reported on the Content Management back-end software system

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Overall, more than 6 years of experience, with more than 4 years of strong hands-on experience in Python and Spark
- Strong technical ability to understand, design, write, and debug applications in Python and PySpark
- Strong problem-solving skills

Preferred technical and professional experience:
- Good to have: hands-on experience with cloud technology (AWS/GCP/Azure)

Posted 5 days ago

Apply

3.0 - 7.0 years

10 - 14 Lacs

chennai

Work from Office

Developer leads the cloud application development/deployment. The developer's responsibility is to lead the execution of a project by working with a senior-level resource on assigned development/deployment activities, and to design, build, and maintain cloud environments focusing on uptime, access, control, and network security using automation and configuration management tools.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Strong proficiency in Java, Spring Framework, Spring Boot, and RESTful APIs; excellent understanding of OOP and design patterns
- Strong knowledge of ORM tools such as Hibernate or JPA and of Java-based microservices frameworks; hands-on experience with Spring Boot microservices
- Strong knowledge of microservice logging, monitoring, debugging, and testing; in-depth knowledge of relational databases (e.g., MySQL)
- Experience with container platforms such as Docker and Kubernetes, and with messaging platforms such as Kafka or IBM MQ; good understanding of Test-Driven Development
- Familiarity with Ant, Maven, or another build automation framework; good knowledge of basic UNIX commands
- Experience in concurrent design and multi-threading

Primary Skills: Core Java, Spring Boot, Java2/EE, Microservices, Hadoop Ecosystem (HBase, Hive, MapReduce, HDFS, Pig, Sqoop, etc.), Spark; Python is good to have.

Preferred technical and professional experience: Experience in concurrent design and multi-threading.

Posted 5 days ago

Apply

5.0 - 8.0 years

10 - 14 Lacs

bengaluru

Work from Office

About The Role
Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: AWS BigData
Good-to-have skills: NA
Minimum 7.5 years of experience is required
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, addressing any challenges that arise, and providing guidance to team members to foster a productive work environment. You will also engage in strategic discussions to align project goals with organizational objectives, ensuring that the applications developed meet the highest standards of quality and functionality. Your role will be pivotal in driving innovation and efficiency within the team, while maintaining a focus on delivering exceptional results for stakeholders.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for the immediate team and across multiple teams
- Facilitate knowledge-sharing sessions to enhance team capabilities
- Monitor project progress and implement necessary adjustments to meet deadlines

Professional & Technical Skills:
- Must-have: Proficiency in AWS BigData
- Strong understanding of cloud computing concepts and services
- Experience with data processing frameworks such as Apache Spark or Hadoop
- Familiarity with data warehousing solutions and ETL processes
- Ability to design and implement scalable data architectures

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in AWS BigData
- This position is based at our Bengaluru office
- 15 years of full-time education is required
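The ETL familiarity asked for here follows the same pattern on any platform: extract records from a source, transform and validate them, load them into a store. A hypothetical stdlib-only sketch, with sqlite3 standing in for a real warehouse (the table, column names, and sample rows are invented):

```python
import sqlite3

# Extract: source records (in practice, files on S3, a stream, or an upstream DB)
source = [("2024-01-05", "widget", "19.99"),
          ("2024-01-06", "gadget", "5.00"),
          ("2024-01-06", "widget", "bad-value")]

# Transform: validate and coerce types, discarding rows that fail
def transform(rows):
    clean = []
    for day, sku, price in rows:
        try:
            clean.append((day, sku, float(price)))
        except ValueError:
            pass  # a real pipeline would route these to a reject table
    return clean

# Load: write the clean rows into the warehouse table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (day TEXT, sku TEXT, price REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", transform(source))

total = conn.execute("SELECT ROUND(SUM(price), 2) FROM sales").fetchone()[0]
print(total)  # 24.99
```

Scaling this shape to Spark or Hadoop changes the engine, not the extract/transform/load structure.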

Posted 5 days ago

Apply

4.0 - 9.0 years

9 - 13 Lacs

bengaluru

Work from Office

Job Area: Engineering Group > Hardware Engineering

General Summary: As a leading technology innovator, Qualcomm pushes the boundaries of what's possible to enable next-generation experiences and drives digital transformation to help create a smarter, connected future for all. As a Qualcomm Hardware Engineer, you will plan, design, optimize, verify, and test electronic systems, bring-up yield, circuits, mechanical systems, Digital/Analog/RF/optical systems, equipment and packaging, test systems, FPGA, and/or DSP systems that launch cutting-edge, world-class products. Qualcomm Hardware Engineers collaborate with cross-functional teams to develop solutions and meet performance requirements.

Minimum Qualifications:
- Bachelor's degree in Computer Science, Electrical/Electronics Engineering, Engineering, or a related field and 4+ years of Hardware Engineering or related work experience; OR
- Master's degree in Computer Science, Electrical/Electronics Engineering, Engineering, or a related field and 3+ years of Hardware Engineering or related work experience; OR
- PhD in Computer Science, Electrical/Electronics Engineering, Engineering, or a related field and 2+ years of Hardware Engineering or related work experience.

Job Description: We are seeking a skilled LSF Cluster Support Engineer to join our Infrastructure team at Qualcomm. This role involves both supporting and developing for our LSF clusters. The ideal candidate will have extensive experience with LSF, Python, Bash, and Linux systems. You will be responsible for troubleshooting, maintaining, and optimizing our LSF environments, while also contributing to development projects.

Key Responsibilities:
- Provide expert support for LSF cluster environments
- Troubleshoot and resolve issues related to job scheduling and cluster performance
- Collaborate with development teams to enhance LSF functionalities
- Develop and maintain scripts and tools using Python and Bash
- Ensure high availability and reliability of cluster operations

Preferred Qualifications:
- Proven experience of 5+ years with LSF cluster support and administration, including user-facing support
- Strong proficiency in Python, Bash scripting, and Linux systems
- Excellent problem-solving skills and the ability to work under pressure
- Experience in development and support roles, ideally in a high-performance computing environment
- Experience with additional cluster management tools such as PBS, NC, etc.
- Experience with containers and Docker
- Experience with and knowledge of other programming languages

Principal Duties and Responsibilities:
- Support internal users with grid job issues: act as the first point of contact for internal teams facing problems with job submission, monitoring, or execution on the grid; provide timely troubleshooting and guidance.
- Diagnose and resolve grid-related failures: investigate job failures, queue delays, and resource allocation issues; work with users to resolve errors and optimize job configurations.
- Configure and maintain job submission environments: assist users in setting up and adjusting job submission tools, scripts, and environment variables to ensure compatibility with grid requirements.
- Collaborate on internal tooling: work with developers to improve, test, and support internal tools that interface with the grid, such as job submission portals or automation scripts.
- Document processes and common issues: create and maintain internal documentation covering grid usage, troubleshooting steps, and best practices to streamline support and reduce recurring issues.

Level of Responsibility:
- Works independently with minimal supervision
- Provides supervision/guidance to other team members
- Decision-making is significant in nature and affects work beyond the immediate work group
- Requires verbal and written communication skills to convey complex information; may require negotiation, influence, tact, etc.
- Has a moderate amount of influence over key organizational decisions
- Tasks do not have defined steps; planning, problem-solving, and prioritization must occur to complete them effectively
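The "job submission tools and automation scripts" this role maintains are often thin wrappers that assemble an LSF `bsub` command line for users. A minimal sketch under stated assumptions: the helper function, queue names, and defaults are hypothetical, though `-q` (queue), `-n` (slots), `-o` (stdout log, with `%J` expanding to the job ID), and `-J` (job name) are standard `bsub` options:

```python
import shlex

def build_bsub_command(script, queue="normal", slots=4,
                       log="job.%J.out", job_name=None):
    """Assemble (but do not run) an LSF bsub submission command line."""
    cmd = ["bsub", "-q", queue, "-n", str(slots), "-o", log]
    if job_name:
        cmd += ["-J", job_name]
    cmd.append(script)
    return cmd

if __name__ == "__main__":
    cmd = build_bsub_command("./run_sim.sh", queue="priority",
                             slots=8, job_name="sim42")
    # In a real tool this list would be passed to subprocess.run(cmd)
    print(shlex.join(cmd))
```

Keeping command assembly separate from execution makes this kind of wrapper easy to unit-test without touching a live cluster — useful when debugging users' submission failures.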

Posted 6 days ago

Apply

5.0 - 9.0 years

5 - 9 Lacs

bengaluru

Work from Office

" Educational Requirements Bachelor of Engineering , BCA , BSc , MCA , MTech , MSc Service Line Data & Analytics Unit Responsibilities 5-8 yrs exp in Azure (Hands on experience in Azure Data bricks and Azure Data Factory) Good knowledge in SQL, PySpark. Should have knowledge in Medallion architecture pattern Knowledge on Integration Runtime Knowledge on different ways of scheduling jobs via ADF (Event/Schedule etc) Should have knowledge of AAS, Cubes. To create, manage and optimize the Cube processing. Good Communication Skills. Experience in leading a team Additional Responsibilities: Good knowledge on software configuration management systems Strong business acumen, strategy and cross-industry thought leadership Awareness of latest technologies and Industry trends Logical thinking and problem solving skills along with an ability to collaborate Two or three industry domain knowledge Understanding of the financial processes for various types of projects and the various pricing models available Client Interfacing skills Knowledge of SDLC and agile methodologies Project and Team management Preferred Skills: Technology->Big Data - Data Processing->Spark

Posted 6 days ago

Apply

3.0 - 7.0 years

10 - 14 Lacs

chennai

Work from Office

Developer leads the cloud application development/deployment. The developer's responsibility is to lead the execution of a project by working with a senior-level resource on assigned development/deployment activities, and to design, build, and maintain cloud environments focusing on uptime, access, control, and network security using automation and configuration management tools.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Strong proficiency in Java, Spring Framework, Spring Boot, and RESTful APIs; excellent understanding of OOP and design patterns
- Strong knowledge of ORM tools such as Hibernate or JPA and of Java-based microservices frameworks; hands-on experience with Spring Boot microservices
- Strong knowledge of microservice logging, monitoring, debugging, and testing; in-depth knowledge of relational databases (e.g., MySQL)
- Experience with container platforms such as Docker and Kubernetes, and with messaging platforms such as Kafka or IBM MQ; good understanding of Test-Driven Development
- Familiarity with Ant, Maven, or another build automation framework; good knowledge of basic UNIX commands
- Experience in concurrent design and multi-threading

Primary Skills: Core Java, Spring Boot, Java2/EE, Microservices, Hadoop Ecosystem (HBase, Hive, MapReduce, HDFS, Pig, Sqoop, etc.), Spark; Python is good to have.

Preferred technical and professional experience: Experience in concurrent design and multi-threading.

Posted 6 days ago

Apply

3.0 - 7.0 years

10 - 14 Lacs

chennai

Work from Office

Developer leads the cloud application development/deployment. The developer's responsibility is to lead the execution of a project by working with a senior-level resource on assigned development/deployment activities, and to design, build, and maintain cloud environments focusing on uptime, access, control, and network security using automation and configuration management tools.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Strong proficiency in Java, Spring Framework, Spring Boot, and RESTful APIs; excellent understanding of OOP and design patterns
- Strong knowledge of ORM tools such as Hibernate or JPA and of Java-based microservices frameworks; hands-on experience with Spring Boot microservices
- Strong knowledge of microservice logging, monitoring, debugging, and testing; in-depth knowledge of relational databases (e.g., MySQL)
- Experience with container platforms such as Docker and Kubernetes, and with messaging platforms such as Kafka or IBM MQ; good understanding of Test-Driven Development
- Familiarity with Ant, Maven, or another build automation framework; good knowledge of basic UNIX commands
- Experience in concurrent design and multi-threading

Primary Skills: Core Java, Spring Boot, Java2/EE, Microservices, Hadoop Ecosystem (HBase, Hive, MapReduce, HDFS, Pig, Sqoop, etc.), Spark; Python is good to have.

Preferred technical and professional experience: Experience in concurrent design and multi-threading.

Posted 1 week ago

Apply

3.0 - 7.0 years

10 - 14 Lacs

bengaluru

Work from Office

Developer leads the cloud application development/deployment. The developer's responsibility is to lead the execution of a project by working with a senior-level resource on assigned development/deployment activities, and to design, build, and maintain cloud environments focusing on uptime, access, control, and network security using automation and configuration management tools.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Strong proficiency in Java, Spring Framework, Spring Boot, and RESTful APIs; excellent understanding of OOP and design patterns
- Strong knowledge of ORM tools such as Hibernate or JPA and of Java-based microservices frameworks; hands-on experience with Spring Boot microservices
- Strong knowledge of microservice logging, monitoring, debugging, and testing; in-depth knowledge of relational databases (e.g., MySQL)
- Experience with container platforms such as Docker and Kubernetes, and with messaging platforms such as Kafka or IBM MQ; good understanding of Test-Driven Development
- Familiarity with Ant, Maven, or another build automation framework; good knowledge of basic UNIX commands
- Experience in concurrent design and multi-threading

Primary Skills: Core Java, Spring Boot, Java2/EE, Microservices, Hadoop Ecosystem (HBase, Hive, MapReduce, HDFS, Pig, Sqoop, etc.), Spark; Python is good to have.

Preferred technical and professional experience: Experience in concurrent design and multi-threading.

Posted 1 week ago

Apply

3.0 - 7.0 years

10 - 14 Lacs

chennai

Work from Office

As an Associate Software Developer at IBM, you'll work with clients to co-create solutions to major real-world challenges by using best-practice technologies, tools, techniques, and products to translate system requirements into the design and development of customized systems.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise: Spring Boot, Java2/EE, Microservices, Hadoop Ecosystem (HBase, Hive, MapReduce, HDFS, Pig, Sqoop, etc.), Spark; Python is good to have.
Preferred technical and professional experience: None

Posted 1 week ago

Apply

3.0 - 7.0 years

10 - 14 Lacs

chennai

Work from Office

Developer leads the cloud application development/deployment. The developer's responsibility is to lead the execution of a project by working with a senior-level resource on assigned development/deployment activities, and to design, build, and maintain cloud environments focusing on uptime, access, control, and network security using automation and configuration management tools.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Strong proficiency in Java, Spring Framework, Spring Boot, and RESTful APIs; excellent understanding of OOP and design patterns
- Strong knowledge of ORM tools such as Hibernate or JPA and of Java-based microservices frameworks; hands-on experience with Spring Boot microservices
- Strong knowledge of microservice logging, monitoring, debugging, and testing; in-depth knowledge of relational databases (e.g., MySQL)
- Experience with container platforms such as Docker and Kubernetes, and with messaging platforms such as Kafka or IBM MQ; good understanding of Test-Driven Development
- Familiarity with Ant, Maven, or another build automation framework; good knowledge of basic UNIX commands
- Experience in concurrent design and multi-threading

Primary Skills: Core Java, Spring Boot, Java2/EE, Microservices, Hadoop Ecosystem (HBase, Hive, MapReduce, HDFS, Pig, Sqoop, etc.), Spark; Python is good to have.

Preferred technical and professional experience: Experience in concurrent design and multi-threading.

Posted 1 week ago

Apply