
198 Cloudera Jobs - Page 7

JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

10 - 14 years

12 - 16 Lacs

Pune

Work from Office

Naukri logo

Client expectations beyond the JD: extensive AWS data engineering experience (Glue, Spark, ECR/ECS, Docker), Python, PySpark, Hudi/Iceberg, Terraform, and Kafka. Early-career Java experience (for the OOP fundamentals and Java connectors) would be a great addition, but is not a priority.

Posted 3 months ago

Apply

7 - 10 years

0 - 3 Lacs

Mumbai

Hybrid

Mandatory skills: Azure Synapse, PySpark, Cloudera.

Posted 3 months ago

Apply

7 - 10 years

0 - 3 Lacs

Bangalore Rural

Hybrid

Mandatory skills: Azure Synapse, PySpark, Cloudera.

Posted 3 months ago

Apply

7 - 10 years

0 - 3 Lacs

Bengaluru

Hybrid

Mandatory skills: Azure Synapse, PySpark, Cloudera.

Posted 3 months ago

Apply

7 - 10 years

0 - 3 Lacs

Hyderabad

Hybrid

Mandatory skills: Azure Synapse, PySpark, Cloudera.

Posted 3 months ago

Apply

4 - 8 years

9 - 13 Lacs

Chennai, Pune, Delhi

Work from Office

Responsibilities:
- Develop, test, and deploy data processing applications using Apache Spark and Scala.
- Optimize and tune Spark applications for better performance on large-scale data sets.
- Work with the Cloudera Hadoop ecosystem (e.g., HDFS, Hive, Impala, HBase, Kafka) to build data pipelines and storage solutions.
- Collaborate with data scientists, business analysts, and other developers to understand data requirements and deliver solutions.
- Design and implement high-performance data processing and analytics solutions.
- Ensure data integrity, accuracy, and security across all processing tasks.
- Troubleshoot and resolve performance issues in Spark, Cloudera, and related technologies.
- Implement version control and CI/CD pipelines for Spark applications.

Required Skills & Experience:
- Minimum 8 years of experience in application development.
- Strong hands-on experience in Apache Spark, Scala, and Spark SQL for distributed data processing.
- Hands-on experience with Cloudera Hadoop (CDH) components such as HDFS, Hive, Impala, HBase, Kafka, and Sqoop.
- Familiarity with other big data technologies, including Apache Kafka, Flume, Oozie, and NiFi.
- Experience building and optimizing ETL pipelines using Spark and working with structured and unstructured data.
- Experience with SQL and NoSQL databases such as HBase, Hive, and PostgreSQL.
- Knowledge of data warehousing concepts, dimensional modeling, and data lakes.
- Ability to troubleshoot and optimize Spark and Cloudera platform performance.
- Familiarity with version control tools like Git and CI/CD tools (e.g., Jenkins, GitLab).
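For candidates gauging the Spark SQL work this role describes, the core query pattern is a distributed GROUP BY aggregation. The sketch below illustrates that pattern using Python's built-in sqlite3 as a lightweight stand-in for Spark SQL/Hive; the table name, columns, and data are invented for illustration only.

```python
import sqlite3

# Hypothetical stand-in for a Spark SQL / Hive aggregation: the "events"
# table and its columns are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id TEXT, category TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [("u1", "retail", 10.0), ("u1", "retail", 5.0), ("u2", "travel", 20.0)],
)

# In Spark the same statement would run via
# spark.sql("SELECT category, SUM(amount) FROM events GROUP BY category").
rows = conn.execute(
    "SELECT category, SUM(amount) FROM events GROUP BY category ORDER BY category"
).fetchall()
print(rows)  # [('retail', 15.0), ('travel', 20.0)]
```

In Spark the optimizer distributes this aggregation across partitions, but the SQL itself is identical, which is why strong SQL fundamentals are listed alongside Spark experience.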

Posted 3 months ago

Apply

15 - 19 years

40 - 50 Lacs

Hyderabad

Work from Office

Role: The ideal professional for this Cloud Architect role will:
- Have a passion for design, technology, analysis, collaboration, agility, and planning, along with a drive for continuous improvement and innovation.
- Exhibit expertise in managing high-volume data projects that leverage cloud platforms, data warehouse reporting and BI tools, and the development of relational databases.
- Research, identify, and internally market enabling data management technologies based on business and end-user requirements.
- Seek ways to apply new technology to business processes with a focus on modernizing the approach to data management.
- Consult with technical subject matter experts and develop alternative technical solutions.
- Advise on options, risks, costs versus benefits, and impact on other business processes and system priorities.
- Demonstrate strong technical leadership skills and the ability to mentor others in related technologies.

Qualifications:
- Bachelor's degree in a computer-related field or equivalent professional experience is required. A master's degree in computer science, information systems, or a related discipline, or equivalent and extensive related project experience, is preferred.
- 10+ years of hands-on software development experience building data platforms with tools and technologies such as Hadoop, Cloudera, Spark, Kafka, relational SQL and NoSQL databases, and data pipeline/workflow management tools.
- 6+ years of experience working with various cloud platforms (at least two from among AWS, Azure, and GCP).
- Experience in multi-cloud data platform migration and hands-on experience working with AWS, Azure, and GCP.
- Experience in Data & Analytics projects is a must.
- Data modeling experience, relational and dimensional, with consumption requirements (reporting, dashboarding, and analytics).
- Thorough understanding and application of AWS services related to cloud data platform and data lake implementation: S3 data lake, AWS EMR, AWS Glue, Amazon Redshift, AWS Lambda, and Step Functions, with file formats such as Parquet, Avro, and Iceberg.
- Must know the key tenets of architecting and designing solutions on the AWS and Azure clouds.
- Expertise and implementation experience in data-specific areas such as AWS Data Lake, data lakehouse architecture, Azure Synapse, and SQL Data Warehouse.
- Apply technical knowledge to architect and design solutions that meet business and IT needs, create Data & Analytics roadmaps, drive POCs and MVPs, and ensure the long-term technical viability of new deployments, infusing key Data & Analytics technologies where applicable.
- Be the voice of the customer to share insights and best practices, connect with the engineering team to remove key blockers, and drive migration solutions and implementations.
- Familiarity with tools like DBT, Airflow, and data test automation.
- Must have experience with Python/PySpark/Scala in big data environments.
- Strong skills in SQL queries in big data tools such as Hive, Impala, and Presto.
- Experience working with and extracting value from large, disconnected, and/or unstructured datasets.
- Demonstrated ability to build processes that support data transformation, data structures, metadata, dependency, and workload management.
- Advanced working SQL knowledge, experience with relational databases and query authoring, and working familiarity with a variety of databases.
- Experience building and optimizing big data pipelines, architectures, and data sets.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.

Posted 3 months ago

Apply

3 - 6 years

3 - 6 Lacs

Hyderabad

Work from Office

Hadoop Admin (1 position). Required skills: Hadoop administration; automation (Ansible, shell scripting, or Python scripting); DevOps skills (should be able to code in at least one language, preferably Python). Location: preferably Bangalore; otherwise Chennai, Pune, or Hyderabad. Working type: Remote.
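The Python-scripting side of a Hadoop admin role like this is often small automation around cluster health. A hedged, dependency-free sketch: parsing capacity figures from `hdfs dfsadmin -report`-style output and flagging high usage. The sample report text is invented; a real script would capture the command's actual output (e.g. via `subprocess`) instead.

```python
import re

# Invented sample of the capacity lines `hdfs dfsadmin -report` prints;
# in production this string would come from running the command itself.
SAMPLE_REPORT = """\
Configured Capacity: 1000000 (976.56 KB)
DFS Used: 870000 (849.61 KB)
"""

def dfs_usage_percent(report: str) -> float:
    """Extract configured capacity and DFS used (bytes) and return usage %."""
    capacity = int(re.search(r"Configured Capacity:\s*(\d+)", report).group(1))
    used = int(re.search(r"DFS Used:\s*(\d+)", report).group(1))
    return 100.0 * used / capacity

usage = dfs_usage_percent(SAMPLE_REPORT)
print(f"DFS usage: {usage:.1f}%")  # DFS usage: 87.0%
if usage > 80.0:
    # Threshold is an assumption; tune to the cluster's alerting policy.
    print("WARN: consider rebalancing or adding datanodes")
```

The same parse-and-threshold pattern drops straight into an Ansible playbook or cron job, which is presumably why the listing pairs scripting with DevOps skills.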

Posted 3 months ago

Apply

4 - 8 years

6 - 10 Lacs

Hyderabad

Work from Office

JR REQ: Big Data Engineer | 4 to 8 years | HYD | Karuppiah Mg | TCS C2H | 900000

Posted 3 months ago

Apply

6 - 11 years

0 - 3 Lacs

Bengaluru

Work from Office

SUMMARY: This is a remote position.

Job Description: EMR Admin. We are seeking an experienced EMR Admin with expertise in big data services such as Hive, Metastore, HBase, and Hue. The ideal candidate should also possess knowledge of Terraform and Jenkins. Familiarity with Kerberos and Ansible would be an added advantage, although not mandatory. Candidates with Hadoop admin skills, proficiency in Terraform and Jenkins, and the ability to handle EMR Admin responsibilities are also encouraged to apply. Location: Remote. Experience: 6+ years. Must-have: at least 4 years in EMR administration.

Requirements:
- Proven experience in EMR administration
- Proficiency in big data services including Hive, Metastore, HBase, and Hue
- Knowledge of Terraform and Jenkins
- Familiarity with Kerberos and Ansible (preferred)
- Experience in Hadoop administration (preferred)

Posted 3 months ago

Apply

7 - 9 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Cloudera Data Platform
Good-to-have skills: NA
Minimum experience: 7.5 years
Educational qualification: Graduation

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using Cloudera Data Platform. Your typical day will involve collaborating with cross-functional teams, developing and deploying applications, and ensuring their smooth functioning.

Roles & Responsibilities:
- Design, build, and configure applications using Cloudera Data Platform to meet business process and application requirements.
- Collaborate with cross-functional teams to identify and prioritize application requirements.
- Develop and deploy applications, ensuring their smooth functioning and adherence to quality standards.
- Troubleshoot and debug applications, identifying and resolving technical issues in a timely manner.
- Stay updated with the latest advancements in Cloudera Data Platform and related technologies, integrating innovative approaches for sustained competitive advantage.

Professional & Technical Skills:
- Must have: expertise in Cloudera Data Platform.
- Good to have: experience with Hadoop, Spark, and other big data technologies.
- Strong understanding of data engineering concepts and principles.
- Experience with application development using Java, Python, or other programming languages.
- Solid grasp of database technologies, including SQL and NoSQL databases.
- Experience with data integration and ETL tools such as Apache NiFi or Talend.

Additional Information: The candidate should have a minimum of 7.5 years of experience in Cloudera Data Platform. The ideal candidate will possess a strong educational background in computer science, software engineering, or a related field, along with a proven track record of delivering impactful data-driven solutions. This position is based at our Bengaluru office.

Posted 3 months ago

Apply

5 - 7 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Informatica Data Quality
Good-to-have skills: NA
Minimum experience: 5 years

Key Responsibilities:
- Mid-to-senior Data Governance specialist (Informatica Data Quality).
- Knowledge and experience in Data Governance.
- Minimum technical skills needed: Cloudera Data Steward Studio, Atlas, Ranger, Informatica DQ.
- Liaise between various business units.
- Assist in designing, developing, testing, and deploying data quality processes using tools such as Informatica, Trillium, Oracle, SAP, etc.
- Collaborate with business users to identify attributes that require data quality.

Technical Experience:
- Overall experience of 3-5 years with a strong database background.
- 2-3 years of experience with a data quality tool such as Informatica, Trillium, Oracle, SAP, etc.
- Bachelor's degree or equivalent experience.
- Experience in data profiling and data quality analysis preferred.
- Strong understanding of data quality best practices and concepts.
- Strong data and SQL skills required.

Professional Attributes: good communication skills, good analytical skills, leadership qualities, multitasking ability.

Posted 3 months ago

Apply

15 - 20 years

15 - 20 Lacs

Mumbai

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Data Architecture Principles
Good-to-have skills: NA
Minimum experience: 15 years
Educational qualification: Graduate (mandatory)

Key Responsibilities:
- Good knowledge of the Cloudera data lake platform.
- Good knowledge of Informatica BDM.
- Service delivery experience of more than 5 years on the Cloudera/Informatica stack.
- Good communication and client stakeholder management.

Technical Experience:
- Nice to have: Spark/HBase/API.
- Nice to have: knowledge of the Qlik Sense environment.
- Good knowledge of data architecture and data pipelines.
- Good knowledge of tuning Spark and Impala workloads.

Professional Attributes: team player; good decision maker; team-building skills; cool under pressure; role includes visits to the client site.

Posted 3 months ago

Apply

7 - 12 years

9 - 14 Lacs

Gurgaon

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Informatica Data Quality
Good-to-have skills: NA
Minimum experience: 7.5 years
Educational qualification: Graduation

Key Responsibilities:
- Mid-to-senior Data Governance specialist (Informatica Data Quality).
- Knowledge and experience in Data Governance.
- Minimum technical skills needed: Cloudera Data Steward Studio, Atlas, Ranger, Informatica DQ.
- Liaise between various business units.
- Assist in designing, developing, testing, and deploying data quality processes using tools such as Informatica, Trillium, Oracle, SAP, etc.
- Collaborate with business users to identify attributes that require data quality.

Technical Experience:
- Overall experience of 3-5 years with a strong database background.
- 2-3 years of experience with a data quality tool such as Informatica, Trillium, Oracle, SAP, etc.
- Bachelor's degree or equivalent experience.
- Experience in data profiling and data quality analysis preferred.
- Strong understanding of data quality best practices and concepts.
- Strong data and SQL skills required.

Professional Attributes: good communication skills, good analytical skills, leadership qualities, multitasking ability.

Posted 3 months ago

Apply

7 - 11 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Hadoop Administration, Risk Assessment, URS Preparation
Good-to-have skills: NA
Minimum experience: 7.5 years
Educational qualification: Graduation

Summary: As a Hadoop Administrator, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. Your typical day will involve working with Hadoop, managing and monitoring Hadoop clusters, and ensuring the security and scalability of the Hadoop infrastructure.

Roles & Responsibilities:
- Lead the design, build, and configuration of Hadoop applications to meet business process and application requirements.
- Manage and monitor Hadoop clusters, ensuring the security and scalability of the Hadoop infrastructure.
- Collaborate with cross-functional teams, applying expertise in Hadoop administration and related technologies.
- Communicate technical findings effectively to stakeholders, utilizing data visualization tools for clarity.
- Stay updated with the latest advancements in Hadoop administration and related technologies, integrating innovative approaches for sustained competitive advantage.

Professional & Technical Skills:
- Must have: experience in Hadoop administration.
- Good to have: experience in related technologies such as Hive, Pig, and HBase.
- Strong understanding of Hadoop architecture and related technologies.
- Experience in managing and monitoring Hadoop clusters.
- Experience in ensuring the security and scalability of the Hadoop infrastructure.

Additional Information: The candidate should have a minimum of 7.5 years of experience in Hadoop administration. The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful data-driven solutions. This position is based at our Bengaluru office.

Posted 3 months ago

Apply

5 - 10 years

7 - 12 Lacs

Chennai

Work from Office

Project Role: AI/ML Engineer
Project Role Description: Develops applications and systems that utilize AI tools and cloud AI services, with a proper cloud or on-prem application pipeline of production-ready quality. Able to apply GenAI models as part of the solution. May also include, but is not limited to, deep learning, neural networks, chatbots, and image processing.
Must-have skills: Natural Language Processing (NLP)
Good-to-have skills: Big data (Cloudera) and Spark
Minimum experience: 5 years
Educational qualification: BE

Summary: As an AI/ML Engineer specializing in Natural Language Processing (NLP), you will be responsible for developing applications and systems that utilize AI to improve performance and efficiency. Your typical day will involve working with deep learning, neural networks, chatbots, and NLP to create innovative solutions for our clients.

Roles & Responsibilities:
- Develop applications and systems that utilize AI to improve performance and efficiency, including but not limited to deep learning, neural networks, chatbots, and natural language processing.
- Design and develop NLP-based applications and systems using deep learning, neural networks, and chatbots.
- Collaborate with cross-functional teams to identify business requirements and translate them into technical solutions.
- Implement and optimize NLP algorithms and models to improve performance and accuracy.
- Stay up to date with the latest advancements in NLP and AI technologies, and integrate innovative approaches for sustained competitive advantage.
- Communicate technical findings effectively to stakeholders, utilizing data visualization tools for clarity.
- Build predictive models and develop advanced algorithms that extract and classify information from large datasets; quantify model performance.
- Evaluate emerging technologies that may contribute to our analytical platform.
- Identify and exploit new patterns in data using various techniques.

Professional & Technical Skills:
- Has worked with ML and data mining toolkits such as CoreNLP, NLTK, and R, and with Semantic Web technologies.
- Experience with information retrieval libraries such as Lucene and Solr, in a fast-paced, test-driven, collaborative, and iterative programming environment.
- Able to engage with customers to solve business problems leveraging artificial intelligence and machine learning.
- Solid understanding of machine learning algorithms and statistical analysis, including algorithms such as CRFs and SVMs.
- Ability to identify the applicability of machine learning and NLP to use cases, and to project both the business and technology benefits.
- Experience implementing various NLP algorithms and models.
- Able to identify ways of embedding and integrating AI/ML services into the enterprise architecture seamlessly.
- Keeps abreast of new technology innovations in machine learning and NLP and brings them in.
- Working experience in any of the NLP application areas: Semantic Web and ontologies, machine translation, sentiment analysis, document classification, question-answer matching, text summarization.
- Has worked with RNNs, LSTMs, etc.
- Work experience creating NLP pipelines for processing large document corpora.
- Expertise in Python.

Additional Information: The ideal candidate will possess a strong educational background in computer science, mathematics, or a related field, along with a proven track record of delivering impactful data-driven solutions. Should be a self-starter and fast learner; possess strong problem-solving skills with the ability to methodically analyze and resolve technical challenges; and possess strong written, verbal, communication, analytical, technical, interpersonal, and presentation skills.

Qualifications: BE
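The NLP pipeline work this listing describes starts with steps like tokenization and term counting. The role names toolkits such as NLTK and CoreNLP; as a dependency-free sketch of just that first stage (the stopword list here is a tiny illustrative subset, not a real one):

```python
import re
from collections import Counter

# Illustrative stopword subset; real pipelines use full lists from NLTK/spaCy.
STOPWORDS = {"the", "a", "of", "and", "to", "is"}

def term_frequencies(text: str) -> Counter:
    """Lowercase, tokenize on letters/apostrophes, drop stopwords, count."""
    tokens = re.findall(r"[a-z']+", text.lower())
    return Counter(t for t in tokens if t not in STOPWORDS)

doc = ("The model classifies the sentiment of the document "
       "and summarizes the document.")
freqs = term_frequencies(doc)
print(freqs.most_common(2))  # [('document', 2), ('model', 1)]
```

Document classification, sentiment analysis, and summarization (all named in the listing) build on exactly these token counts, typically via TF-IDF vectors or learned embeddings.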

Posted 3 months ago

Apply

4 - 6 years

6 - 8 Lacs

Hyderabad

Work from Office

- Around 4 to 6 years of experience with excellent coding skills in the Java programming language.
- Python/Spark/Scala experience; AWS experience is an added advantage.
- Professional hands-on experience in Scala/Python.
- Knowledge of (or hands-on experience with) big data platforms and frameworks is good to have.
- Excellent code-comprehension skills: should be able to read open-source code (Trino) and build optimizations or improvements on top of it.
- Working experience in Presto/Trino is a great advantage.
- Knowledge of Elasticsearch and Grafana is good to have.
- Experience working under Agile methodology.

Required education: Bachelor's degree. Preferred education: Master's degree.

Required technical and professional expertise: an intuitive individual with an ability to manage change and proven time management; proven interpersonal skills while contributing to team effort by accomplishing related results as needed; up-to-date technical knowledge maintained by attending educational workshops and reviewing publications.

Posted 3 months ago

Apply

4 - 8 years

15 - 20 Lacs

Gurgaon

Work from Office

Data Strategy Consultant, Public Services Domain. Find endless opportunities to solve our clients' toughest challenges, as you work with exceptional people, the latest tech, and leading companies across industries.

Practice: Public Services, Industry Consulting, Global Network | Areas of Work: Data Strategy | Level: Consultant | Location: Gurgaon/Mumbai/Bangalore/Hyderabad/Pune/Kolkata/Chennai | Years of Exp: 4-8 years

Explore an Exciting Career at Accenture: Do you believe in creating an impact? Are you a problem solver who enjoys working on transformative strategies for global clients? Are you passionate about being part of an inclusive, diverse, and collaborative culture? Then this is the right place for you! Welcome to a host of exciting global opportunities in Accenture Strategy.

The Practice, in Brief: The GN Management Consulting Group is part of Accenture Strategy and Consulting; we help clients with strategies at the intersection of business and technology that drive value and impact, shape new businesses, and design operating models for the future. As part of this high-performing team, you will help with the following:
a. Project Delivery: Deliver projects for global Public Service clients by working together with medium to large teams. Responsibilities may include strategy, implementation, process design, and change management for specific modules.
b. Business Development: Support the global sales team's efforts to identify and win potential opportunities within the practice.
c. Industry Experience: Provide industry expertise in Public Service industry segments.
d. Domain Development: Develop assets and methodologies, points of view, research or white papers, and internal tools or materials for use by the larger community.

Industry/Functional Skills:
- Very good understanding of global public-service market trends, best practices, and clients across both developed and emerging economies.
- Very good exposure to large transformation programs in the public-service domain.
- Thought-leadership experience is preferred: exposure to consulting assets, methodologies, points of view, research or white papers, marketing collateral, etc., in public service.
- Analytical skills to provide clarity on complex issues and gather data-driven insights.

Qualifications, Job Role: Data Strategy
- Data capability maturity assessment; Data & Analytics/AI strategy; data operating model and governance; data hub enablement; data-on-cloud strategy; data architecture strategy.
- Develop and implement data collection strategies to ensure accurate and relevant data capture from new support services such as a Viva Connections support site (integrated with ServiceNow), IVR, and a Gen-AI chatbot.
- Lead the ongoing monitoring and analysis of data post-implementation to identify trends, areas for improvement, and emerging issues in the support experience.
- Data SME: technical understanding of data platforms, data-on-cloud strategy, data strategy, data operating model, change management of data transformation programs, and data modeling skills.
- Experience with information strategy, data architecture, data modernization, data governance, data management, data operating model, and data security across all stages of the innovation spectrum, with a remit to build the future in real time.
- Certification in DAMA (Data Management), Azure Data Architecture, Google Cloud Data Analytics, or AWS Data Analytics is desirable.
- Data-on-Cloud Architect: technical understanding of data platform strategy for data-on-cloud migrations and big data technologies; experience architecting large-scale data lake and DW-on-cloud solutions; experience with one or more technologies in this space: AWS, Azure, GCP, AliCloud, Snowflake, Hadoop, Cloudera.
- Data Transformation Lead: understanding of the data supply chain and data platforms on cloud; experience conducting alignment workshops and building value-realization frameworks for data transformations; program management experience.

Posted 3 months ago

Apply

3 - 8 years

5 - 10 Lacs

Noida

Work from Office

About the Role: The purpose of this role is to interpret data and turn it into information (reports, dashboards, interactive visualizations, etc.) which can offer ways to improve a business, thus affecting business decisions.

Do:
1. Manage the technical scope of the project in line with the requirements at all stages:
a. Gather information from various sources (data warehouses, databases, data integration and modelling) and interpret patterns and trends.
b. Develop record management processes and policies.
c. Build and maintain relationships at all levels within the client base and understand their requirements.
d. Provide sales data, proposals, data insights, and account reviews to the client base.
e. Identify areas to increase efficiency and automation of processes.
f. Set up and maintain automated data processes.
g. Identify, evaluate, and implement external services and tools to support data validation and cleansing.
h. Produce and track key performance indicators.
2. Analyze the data sets and provide adequate information:
a. Liaise with internal and external clients to fully understand the data content.
b. Design and carry out surveys and analyze survey data as per the customer requirement.
c. Analyze and interpret complex data sets relating to the customer's business and prepare reports for internal and external audiences using business analytics reporting tools.
d. Create data dashboards, graphs, and visualizations to showcase business performance, and provide sector and competitor benchmarking.
e. Mine and analyze large datasets, draw valid inferences, and present them successfully to management using a reporting tool.
f. Develop predictive models and share insights with the clients as per their requirements.

Deliver:
No. | Performance Parameter | Measure
1 | Analyzes data sets and provides relevant information to the client | No. of automations done, on-time delivery, CSAT score, zero customer escalations, data accuracy
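KPI tracking of the kind this role measures against (on-time delivery, CSAT, data accuracy) reduces to simple aggregates over records. A stdlib-only sketch, where the record fields and the sample data are hypothetical:

```python
from statistics import mean

# Hypothetical delivery records; a real report would load these from a
# data warehouse or CSV export.
tickets = [
    {"on_time": True,  "csat": 5},
    {"on_time": True,  "csat": 4},
    {"on_time": False, "csat": 3},
    {"on_time": True,  "csat": 5},
]

# On-time rate: fraction of records delivered on time.
on_time_rate = sum(t["on_time"] for t in tickets) / len(tickets)
# Average CSAT across the batch.
avg_csat = mean(t["csat"] for t in tickets)

print(f"On-time delivery: {on_time_rate:.0%}, average CSAT: {avg_csat:.2f}")
# On-time delivery: 75%, average CSAT: 4.25
```

The same aggregates feed the dashboards and visualizations the role calls for; only the presentation layer (BI tool vs. script) differs.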

Posted 3 months ago

Apply

3 - 8 years

5 - 10 Lacs

Bengaluru

Work from Office

Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients' most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com.

Responsibilities:
1. Understand the NBA requirements.
2. Provide subject matter expertise in relation to Pega CDH from a technology perspective.
3. Participate actively in the creation and review of the conceptual design, detailed design, and estimations.
4. Implement the NBAs as per the agreed requirement/solution.
5. Support end-to-end testing and provide fixes with quick TAT.
6. Deployment knowledge to manage the implementation activities.
7. Experience in Pega CDH v8.8 multi-app or 24.1 and the retail banking domain is preferred.
8. Good communication skills.

Perform coding and ensure optimal software/module development: determine operational feasibility by evaluating analysis, problem definition, requirements, software development, and proposed software; develop and automate processes for software validation by setting up and designing test cases/scenarios/usage cases and executing them; modify software to fix errors, adapt it to new hardware, improve its performance, or upgrade interfaces; analyze information to recommend and plan the installation of new systems or modifications of an existing system; ensure that code is error-free, with no bugs or test failures; prepare reports on programming project specifications, activities, and status; ensure all codes are raised as per the norm defined for the project/program/account with clear descriptions and replication patterns; compile timely, comprehensive, and accurate documentation and reports as requested; coordinate with the team on daily project status and progress and document it; provide feedback on usability and serviceability, trace the result to quality risk, and report it to the concerned stakeholders.

Status reporting and customer focus on an ongoing basis with respect to the project and its execution: capture all requirements and clarifications from the client for better-quality work; take feedback on a regular basis to ensure smooth and on-time delivery; participate in continuing education and training to remain current on best practices, learn new programming languages, and better assist other team members; consult with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements; document and demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments, and clear code; document necessary details and reports formally for a proper understanding of the software from client proposal to implementation; ensure good quality of interaction with the customer w.r.t. e-mail content, fault report tracking, voice calls, business etiquette, etc.; respond to customer requests in a timely manner, with no instances of complaints either internally or externally.

Deliver:
No. | Performance Parameter | Measure
1 | Continuous integration, deployment & monitoring of software | 100% error-free onboarding & implementation, throughput %, adherence to the schedule/release plan
2 | Quality & CSAT | On-time delivery, manage software, troubleshoot queries, customer experience, completion of assigned certifications for skill upgradation
3 | MIS & reporting | 100% on-time MIS & report generation

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention: of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA; as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.

Posted 3 months ago

Apply

3 - 8 years

5 - 8 Lacs

Chennai, Pune, Delhi

Work from Office

Tech Stalwart Solution Private Limited is looking for a Data Engineer - SSIS and Tableau to join our dynamic team and embark on a rewarding career journey.
- Liaise with coworkers and clients to elucidate the requirements for each task.
- Conceptualize and generate infrastructure that allows big data to be accessed and analyzed.
- Reformulate existing frameworks to optimize their functioning.
- Test such structures to ensure that they are fit for use.
- Prepare raw data for manipulation by data scientists.
- Detect and correct errors in your work.
- Ensure that your work remains backed up and readily accessible to relevant coworkers.
- Remain up to date with industry standards and technological advancements that will improve the quality of your outputs.

Posted 3 months ago

Apply

4 - 9 years

6 - 11 Lacs

Kochi

Work from Office

Responsibilities: As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in developing data solutions using the Spark framework with Python or Scala on Hadoop and the AWS Cloud Data Platform.
- Build data pipelines to ingest, process, and transform data from files, streams, and databases.
- Process data with Spark, Python, PySpark, Scala, and Hive, HBase, or other NoSQL databases on cloud data platforms (AWS) or HDFS.
- Develop efficient software code for multiple use cases, leveraging the Spark framework with Python or Scala and big data technologies built on the platform.
- Develop streaming pipelines.
- Work with Hadoop/AWS ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data and cloud technologies such as Apache Spark and Kafka.

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise:
- Minimum 4+ years of experience in big data technologies, with extensive data engineering experience in Spark with Python or Scala.
- Minimum 3 years of experience on cloud data platforms on AWS.
- Experience in AWS EMR / AWS Glue / Databricks, Amazon Redshift, and DynamoDB.
- Good to excellent SQL skills.
- Exposure to streaming solutions and message brokers such as Kafka.

Preferred technical and professional experience:
- Certification in AWS and Databricks, or Cloudera Spark certified developer.

Posted 3 months ago

Apply

4 - 9 years

6 - 11 Lacs

Kochi

Work from Office

Responsibilities: As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in developing data solutions using the Spark framework with Python or Scala on Hadoop and the AWS Cloud Data Platform.
- Build data pipelines to ingest, process, and transform data from files, streams, and databases.
- Process data with Spark, Python, PySpark, Scala, and Hive, HBase, or other NoSQL databases on cloud data platforms (AWS) or HDFS.
- Develop efficient software code for multiple use cases, leveraging the Spark framework with Python or Scala and big data technologies built on the platform.
- Develop streaming pipelines.
- Work with Hadoop/AWS ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data and cloud technologies such as Apache Spark and Kafka.

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise:
- Total 5 to 7+ years of experience in data management (DW, DL, data platform, lakehouse) and data engineering skills.
- Minimum 4+ years of experience in big data technologies, with extensive data engineering experience in Spark with Python or Scala.
- Minimum 3 years of experience on cloud data platforms on AWS.
- Exposure to streaming solutions and message brokers such as Kafka.
- Experience in AWS EMR / AWS Glue / Databricks, Amazon Redshift, and DynamoDB.
- Good to excellent SQL skills.

Preferred technical and professional experience:
- Certification in AWS and Databricks, or Cloudera Spark certified developer.

Posted 3 months ago

Apply

4 - 9 years

6 - 11 Lacs

Kochi

Work from Office

Responsibilities: As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in developing data solutions using the Spark framework with Python or Scala on Hadoop and the Azure Cloud Data Platform.
- Build data pipelines to ingest, process, and transform data from files, streams, and databases.
- Process data with Spark, Python, PySpark, and Hive, HBase, or other NoSQL databases on the Azure Cloud Data Platform or HDFS.
- Develop efficient software code for multiple use cases, leveraging the Spark framework with Python or Scala and big data technologies built on the platform.
- Develop streaming pipelines.
- Work with Hadoop/Azure ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data and cloud technologies such as Apache Spark and Kafka.

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise:
- Minimum 4+ years of experience in big data technologies, with extensive data engineering experience in Spark with Python or Scala.
- Minimum 3 years of experience on cloud data platforms on Azure.
- Experience in Databricks / Azure HDInsight / Azure Data Factory, Synapse, and SQL Server DB.
- Good to excellent SQL skills.
- Exposure to streaming solutions and message brokers such as Kafka.

Preferred technical and professional experience:
- Certification in Azure and Databricks, or Cloudera Spark certified developer.

Posted 3 months ago

Apply

4 - 9 years

6 - 11 Lacs

Kochi

Work from Office

Responsibilities: As a senior SAP Consultant, you will serve as a client-facing practitioner, working collaboratively with clients to deliver high-quality solutions, and be a trusted business advisor with a deep understanding of the SAP Accelerate delivery methodology (or equivalent) and its associated work products. You will work on projects that assist clients in integrating strategy, process, technology, and information to enhance effectiveness, reduce costs, and improve profit and shareholder value. There are opportunities for you to acquire new skills, work across different disciplines, take on new challenges, and develop a comprehensive understanding of various industries. Your primary responsibilities include:
- Strategic SAP solution focus: working across technical design, development, and implementation of SAP solutions for simplicity, amplification, and maintainability that meet client needs.
- Comprehensive solution delivery: involvement in strategy development and solution implementation, leveraging your knowledge of SAP and working with the latest technologies.

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise:
- Total 5 to 7+ years of experience in data management (DW, DL, data platform, lakehouse) and data engineering skills.
- Minimum 4+ years of experience in big data technologies, with extensive data engineering experience in Spark with Python or Scala.
- Minimum 3 years of experience on cloud data platforms on AWS.
- Exposure to streaming solutions and message brokers such as Kafka.
- Experience in AWS EMR / AWS Glue / Databricks, Amazon Redshift, and DynamoDB.
- Good to excellent SQL skills.

Preferred technical and professional experience:
- Certification in AWS and Databricks, or Cloudera Spark certified developer.

Posted 3 months ago

Apply