
198 Cloudera Jobs - Page 5

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

5 - 10 years

0 - 1 Lacs

Pune

Work from Office


Position Overview: Cloud Architect with expertise in Hadoop and the Google Cloud Platform (GCP) data stack, along with experience in Big Data architecture and migration. The ideal candidate should have strong proficiency in GCP Big Data tools, including Hadoop, Hive, HDFS, Impala, Spark, MapReduce, MS SQL, Kafka, and Redis. Familiarity with Cloudera, HBase, MongoDB, MariaDB, and Event Hub is a plus.

Key Responsibilities:
• Design, implement, and optimize Big Data architecture on GCP and Hadoop ecosystems.
• Lead data migration projects from on-premise to cloud platforms (GCP).
• Develop and maintain ETL pipelines using tools like Spark, Hive, and Kafka.
• Manage Hadoop clusters, HDFS, and related components.
• Work with data streaming technologies like Kafka and Event Hub for real-time data processing.
• Optimize SQL and NoSQL databases (MS SQL, Redis, MongoDB, MariaDB, HBase) for high availability and scalability.
• Collaborate with data scientists, analysts, and DevOps teams to integrate Big Data solutions.
• Ensure data security, governance, and compliance in cloud and on-premise environments.

Required Skills & Experience:
• 5-10 years of experience as a Cloud Architect
• Strong expertise in Hadoop (HDFS, Hive, Impala, Spark, MapReduce)
• Hands-on experience with GCP Big Data services
• Proficiency in MS SQL, Kafka, and Redis for data processing and analytics
• Experience with Cloudera, HBase, MongoDB, and MariaDB
• Knowledge of real-time data streaming and event-driven architectures
• Understanding of Big Data security and performance optimization
• Ability to design and execute data migration strategies

Location: Koregaon Park, Pune, Maharashtra (India)
Shift Timings: USA time zone (06:30 PM IST to 03:30 AM IST)
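For illustration only, here is a minimal PySpark sketch of the kind of streaming ETL pipeline this role describes (Kafka in, a light Spark transformation, Parquet out for Hive or Impala to query). The broker, topic, and paths are hypothetical placeholders, not details from the posting.

```python
# Hedged sketch: consume events from Kafka, decode them, and land them as
# Parquet that a Hive external table can point at. Names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("orders-ingest")
    .enableHiveSupport()          # allows spark.sql(...) against Hive tables
    .getOrCreate()
)

raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker-1:9092")   # placeholder broker
    .option("subscribe", "orders")                        # placeholder topic
    .load()
)

# Kafka delivers key/value as binary; decode and stamp an ingest time.
events = (
    raw.selectExpr("CAST(value AS STRING) AS payload")
    .withColumn("ingest_ts", F.current_timestamp())
)

query = (
    events.writeStream
    .format("parquet")
    .option("path", "/warehouse/staging/orders")          # placeholder path
    .option("checkpointLocation", "/checkpoints/orders")  # required for recovery
    .trigger(processingTime="1 minute")
    .start()
)
query.awaitTermination()
```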

Posted 2 months ago

Apply

1 - 6 years

2 - 5 Lacs

Hyderabad

Work from Office


Sahaj Retail Limited is looking for a Data Engineer to join our dynamic team and embark on a rewarding career journey. Liaising with coworkers and clients to elucidate the requirements for each task. Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed. Reformulating existing frameworks to optimize their functioning. Testing such structures to ensure that they are fit for use. Preparing raw data for manipulation by data scientists. Detecting and correcting errors in your work. Ensuring that your work remains backed up and readily accessible to relevant coworkers. Remaining up-to-date with industry standards and technological advancements that will improve the quality of your outputs.

Posted 2 months ago

Apply

1 - 5 years

3 - 7 Lacs

Allahabad, Noida

Work from Office


Feather Thread Corporation is looking for a Bigdata administrator to join our dynamic team and embark on a rewarding career journey.

Office Management: Oversee general office operations, including maintenance of office supplies, equipment, and facilities. Manage incoming and outgoing correspondence, including mail, email, and phone calls. Coordinate meetings, appointments, and travel arrangements for staff members as needed.

Administrative Support: Provide administrative support to management and staff, including scheduling meetings, preparing documents, and organizing files. Assist with the preparation of reports, presentations, and other materials for internal and external stakeholders. Maintain accurate records and databases, ensuring data integrity and confidentiality.

Communication and Coordination: Serve as a point of contact for internal and external stakeholders, including clients, vendors, and partners. Facilitate communication between departments and team members, ensuring timely and effective information flow. Coordinate logistics for company events, meetings, and conferences.

Documentation and Compliance: Assist with the development and implementation of company policies, procedures, and guidelines. Maintain compliance with regulatory requirements and industry standards. Ensure proper documentation and record-keeping practices are followed.

Project Support: Provide support to project teams by assisting with project coordination, documentation, and tracking of tasks and deadlines. Collaborate with team members to ensure project deliverables are met on time and within budget.

Posted 2 months ago

Apply

4 - 9 years

12 - 16 Lacs

Kochi

Work from Office


As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the AWS cloud data platform.

Responsibilities:
• Build data pipelines to ingest, process, and transform data from files, streams, and databases.
• Process data with Spark, Python, PySpark, Scala, and Hive, HBase, or other NoSQL databases on cloud data platforms (AWS) or HDFS.
• Develop efficient software code for multiple use cases built on the platform, leveraging the Spark framework with Python or Scala and Big Data technologies.
• Develop streaming pipelines.
• Work with Hadoop / AWS ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data and cloud technologies such as Apache Spark and Kafka.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
• 6-7+ years of total experience in Data Management (DW, DL, data platform, lakehouse) and data engineering
• Minimum 4+ years of experience in Big Data technologies with extensive data engineering experience in Spark with Python or Scala
• Minimum 3 years of experience on cloud data platforms on AWS
• Experience with AWS EMR / AWS Glue / Databricks, AWS Redshift, DynamoDB
• Good to excellent SQL skills

Preferred technical and professional experience: Certification in AWS and Databricks, or Cloudera Spark Certified Developer
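As a rough illustration of the batch side of this work, the sketch below ingests raw CSV files, cleans them, and writes partitioned Parquet with PySpark; every path and column name is a hypothetical placeholder rather than anything specified by the employer.

```python
# Hedged sketch of a simple batch pipeline: ingest, clean, write Parquet.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-trades-batch").getOrCreate()

raw = (
    spark.read
    .option("header", True)
    .csv("s3a://example-bucket/raw/trades/")              # placeholder input
)

clean = (
    raw.dropDuplicates(["trade_id"])                      # de-duplicate on a business key
    .withColumn("trade_ts", F.to_timestamp("trade_ts"))   # normalize types
    .withColumn("trade_date", F.to_date("trade_ts"))
    .filter(F.col("amount").isNotNull())                  # drop obviously bad rows
)

(
    clean.write
    .mode("overwrite")
    .partitionBy("trade_date")                            # partition for efficient reads
    .parquet("s3a://example-bucket/curated/trades/")      # placeholder output
)
spark.stop()
```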

Posted 2 months ago

Apply

4 - 9 years

12 - 16 Lacs

Pune

Work from Office


As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the Azure cloud data platform.

Responsibilities:
• Build data pipelines to ingest, process, and transform data from files, streams, and databases.
• Process data with Spark, Python, PySpark, and Hive, HBase, or other NoSQL databases on the Azure cloud data platform or HDFS.
• Develop efficient software code for multiple use cases built on the platform, leveraging the Spark framework with Python or Scala and Big Data technologies.
• Develop streaming pipelines.
• Work with Hadoop / Azure ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data and cloud technologies such as Apache Spark and Kafka.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
• Minimum 4+ years of experience in Big Data technologies with extensive data engineering experience in Spark with Python or Scala
• Minimum 3 years of experience on cloud data platforms on Azure
• Experience with Databricks / Azure HDInsight / Azure Data Factory, Synapse, SQL Server DB
• Good to excellent SQL skills
• Exposure to streaming solutions and message brokers such as Kafka

Preferred technical and professional experience: Certification in Azure and Databricks, or Cloudera Spark Certified Developer

Posted 2 months ago

Apply

4 - 9 years

12 - 16 Lacs

Pune

Work from Office


As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the Azure cloud data platform.

Responsibilities:
• Build data pipelines to ingest, process, and transform data from files, streams, and databases.
• Process data with Spark, Python, PySpark, and Hive, HBase, or other NoSQL databases on the Azure cloud data platform or HDFS.
• Develop efficient software code for multiple use cases built on the platform, leveraging the Spark framework with Python or Scala and Big Data technologies.
• Develop streaming pipelines.
• Work with Hadoop / Azure ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data and cloud technologies such as Apache Spark and Kafka.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
• Minimum 4+ years of experience in Big Data technologies with extensive data engineering experience in Spark with Python or Scala
• Minimum 3 years of experience on cloud data platforms on Azure
• Experience with Databricks / Azure HDInsight / Azure Data Factory, Synapse, SQL Server DB
• Good to excellent SQL skills
• Exposure to streaming solutions and message brokers such as Kafka

Preferred technical and professional experience: Certification in Azure and Databricks, or Cloudera Spark Certified Developer

Posted 2 months ago

Apply

4 - 9 years

12 - 16 Lacs

Pune

Work from Office


As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the Azure cloud data platform.

Responsibilities:
• Build data pipelines to ingest, process, and transform data from files, streams, and databases.
• Process data with Spark, Python, PySpark, and Hive, HBase, or other NoSQL databases on the Azure cloud data platform or HDFS.
• Develop efficient software code for multiple use cases built on the platform, leveraging the Spark framework with Python or Scala and Big Data technologies.
• Develop streaming pipelines.
• Work with Hadoop / Azure ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data and cloud technologies such as Apache Spark and Kafka.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
• Minimum 4+ years of experience in Big Data technologies with extensive data engineering experience in Spark with Python or Scala
• Minimum 3 years of experience on cloud data platforms on Azure
• Experience with Databricks / Azure HDInsight / Azure Data Factory, Synapse, SQL Server DB
• Good to excellent SQL skills
• Exposure to streaming solutions and message brokers such as Kafka

Preferred technical and professional experience: Certification in Azure and Databricks, or Cloudera Spark Certified Developer

Posted 2 months ago

Apply

2 - 7 years

4 - 8 Lacs

Chennai

Work from Office


Instrumental in understanding the requirements and design of the product/software. Develop software solutions by studying information needs, systems flow, data usage, and work processes. Investigate problem areas throughout the software development life cycle. Facilitate root cause analysis of system issues and problem statements. Identify ideas to improve system performance and impact availability. Analyze client requirements and convert requirements into feasible designs. Collaborate with functional teams or systems analysts who carry out the detailed investigation into software requirements. Confer with project managers to obtain information on software capabilities.

Posted 2 months ago

Apply

5 - 7 years

11 - 13 Lacs

Nasik, Pune, Nagpur

Work from Office


Euclid Innovations Pvt Ltd is looking for a Data Engineer Drive to join our dynamic team and embark on a rewarding career journey. Liaising with coworkers and clients to elucidate the requirements for each task. Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed. Reformulating existing frameworks to optimize their functioning. Testing such structures to ensure that they are fit for use. Preparing raw data for manipulation by data scientists. Detecting and correcting errors in your work. Ensuring that your work remains backed up and readily accessible to relevant coworkers. Remaining up-to-date with industry standards and technological advancements that will improve the quality of your outputs.

Posted 2 months ago

Apply

8 - 10 years

30 - 35 Lacs

Chennai, Bengaluru

Work from Office


Business Area: Professional Services
Seniority Level: Mid-Senior level

Job Description: At Cloudera, we empower people to transform complex data into clear and actionable insights. With as much data under management as the hyperscalers, we're the preferred data partner for the top companies in almost every industry. Powered by the relentless innovation of the open source community, Cloudera advances digital transformation for the world's largest enterprises. If the idea of making data and analytics easy and accessible for everyone excites you, we want you on our team!

Cloudera seeks a proactive, collaborative, creative, and customer-oriented Sr. Solutions Consultant, Data Services to join our Data Services adoption and acceleration team. You will own, evangelize, and collaborate with our customers to devise reference enterprise data architectures. You will get an opportunity to be part of a team that fosters long-standing relationships with our customers, building strong trusted-advisor relationships. If you're looking for a role that encourages creative freedom, lets you influence the product roadmap, and lets you impact our most interesting use cases, then this is the place for you!

As a Sr. Solutions Consultant, Data Services, you will:
• Help shape customer decisions to adopt Cloudera Data Platform (CDP) as a product specialist and trusted advisor.
• Provide expert-level assistance to customers, addressing technical challenges and offering solutions to optimize the use of Cloudera's products.
• Collaborate closely with sales, product, and engineering teams to ensure alignment with customer needs and the delivery of tailored solutions.
• Gather and communicate customer feedback to internal teams, contributing to product improvements and the development of new features.
• Contribute to the Cloudera community through blogs, meetups, industry events, and by adding to both internal and external knowledge repositories.
• Understand and leverage advancements in AI and Hybrid Cloud to better serve clients and address their unique needs.

We're excited about you if you have:
• 5+ years of professional experience in providing solutions, reference architectures, and driving product adoption.
• 3+ years of hands-on experience with Kubernetes, RedHat OpenShift, Rancher, Docker, or similar technologies.
• Proficiency in Linux/RHEL with hands-on experience.
• Experience with Amazon AWS, Microsoft Azure, and/or Google Cloud, including relevant skills and certifications.
• Strong experience with SDS and SDN components within the Kubernetes ecosystem.
• Hands-on knowledge of the data ecosystem, including AI/ML, data warehousing, data engineering, and data-in-motion technologies.
• Experience with Java/Golang (Go), Python, or other scripting languages.
• Familiarity with NVIDIA GPU technologies.
• A passion for your work and a strong drive for success.
• A Bachelor's or Master's degree from an accredited university is required.

What you can expect from us:
• Generous PTO Policy
• Support for work-life balance with Unplugged Days
• Flexible WFH Policy
• Mental & Physical Wellness programs
• Phone and Internet Reimbursement program
• Access to Continued Career Development
• Comprehensive Benefits and Competitive Packages
• Paid Volunteer Time
• Employee Resource Groups

Posted 2 months ago

Apply

3 - 7 years

3 - 7 Lacs

Karnataka

Work from Office


Description - Detailed JD: RTIM, Pega CDH 8.8 Multi App, Infinity 24.1, Java, RESTful API, OAuth.
1. Understanding the NBA requirements and the complete CDH architecture
2. Review of the Conceptual Design, Detailed Design, and estimations
3. Reviewing and contributing to the deployment activities and practices
4. Contributing to the overall technical solution and putting it into practice
5. Contributing to requirement discussions with subject matter expertise in relation to Pega CDH
6. Experience in Pega CDH v8.8 multi app or 24.1 and the retail banking domain is preferred
7. Conducting peer code reviews
8. Excellent communication skills

Additional Details: Global Grade B | Level: To Be Defined | Named Job Posting? No | Remote work possibility: No | Global Role Family: 60236 (P) Software Engineering | Local Role Name: 6362 Software Developer | Local Skills: 5700 Pega | Languages Required: English | Role Rarity: To Be Defined

Posted 2 months ago

Apply

2 - 5 years

3 - 7 Lacs

Karnataka

Work from Office


Experience: 4 to 6 years | Location: Any PSL location | Rate: below $14

JD - DBT / AWS Glue / Python / PySpark
• Hands-on experience in data engineering, with expertise in DBT, AWS Glue, Python, and PySpark.
• Strong knowledge of data engineering concepts, data pipelines, ETL/ELT processes, and cloud data environments (AWS).
• Technology: DBT, AWS Glue, Athena, SQL, Spark, PySpark.
• Good understanding of Spark internals and how Spark works; good skills in PySpark.
• Good understanding of DBT, in particular its limitations and the situations in which it leads to model explosion.
• Good hands-on experience with AWS Glue.
• AWS expertise: familiarity with the different services, how to configure them, and infrastructure-as-code experience.
• Basic understanding of the different open data formats: Delta, Iceberg, Hudi.
• Ability to engage in technical conversations and suggest enhancements to the current architecture and design.
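To make the Glue/PySpark part of this profile concrete, here is a short, hedged sketch of a typical AWS Glue job: read a table from the Glue Data Catalog, transform it as a Spark DataFrame, and write Parquet back to S3 for Athena or DBT models to build on. The database, table, and bucket names are invented for the example.

```python
# Hedged Glue job sketch; catalog and S3 names are placeholders.
import sys
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext
from pyspark.sql import functions as F

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read the source table registered in the Glue Data Catalog.
dyf = glue_context.create_dynamic_frame.from_catalog(
    database="raw_db", table_name="orders"
)

# Switch to a plain Spark DataFrame for ordinary transformations.
df = dyf.toDF()
curated = (
    df.dropDuplicates(["order_id"])
    .withColumn("order_date", F.to_date("order_ts"))
)

# Write curated, partitioned Parquet for Athena / downstream DBT models.
(
    curated.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-bucket/curated/orders/")       # placeholder path
)
job.commit()
```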

Posted 2 months ago

Apply

2 - 5 years

3 - 7 Lacs

Maharashtra

Work from Office


Description: Overall 10+ years of experience in Python and Shell. Knowledge of distributed systems such as Hadoop and Spark, as well as cloud computing platforms such as Azure and AWS.

Additional Details: Global Grade B | Level: To Be Defined | Named Job Posting? No | Remote work possibility: No | Global Role Family: To be defined | Local Role Name: To be defined | Local Skills: Ruby; automation; Python | Languages Required: English | Role Rarity: To Be Defined

Posted 2 months ago

Apply

3 - 7 years

1 - 5 Lacs

Telangana

Work from Office


Location: Hyderabad / Chennai (Chennai and Hyderabad preferred, but the customer is willing to take resources from Hyderabad). Experience: 5 to 8 years (U3); 5-10 years overall.
• Proven experience as a development data engineer or in a similar role, with an ETL background.
• Experience with data integration / ETL best practices and data quality principles.
• Play a crucial role in ensuring the quality and reliability of the data by designing, implementing, and executing comprehensive testing.
• Working from the User Stories, build the comprehensive code base and business rules for testing and validation of the data.
• Knowledge of continuous integration and continuous deployment (CI/CD) pipelines.
• Familiarity with Agile/Scrum development methodologies.
• Excellent analytical and problem-solving skills.
• Strong communication and collaboration skills.
• Experience with big data technologies (Hadoop, Spark, Hive).

Posted 2 months ago

Apply

2 - 6 years

5 - 9 Lacs

Uttar Pradesh

Work from Office


• Proven experience as a development data engineer or in a similar role, with an ETL background.
• Experience with data integration / ETL best practices and data quality principles.
• Play a crucial role in ensuring the quality and reliability of the data by designing, implementing, and executing comprehensive testing.
• Working from the User Stories, build the comprehensive code base and business rules for testing and validation of the data.
• Knowledge of continuous integration and continuous deployment (CI/CD) pipelines.
• Familiarity with Agile/Scrum development methodologies.
• Excellent analytical and problem-solving skills.
• Strong communication and collaboration skills.
• Experience with big data technologies (Hadoop, Spark, Hive).
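As an illustration of the testing work described above, the following hedged PySpark sketch validates a few business rules on a curated dataset and raises an error so a CI/CD stage can block the release; the dataset path, column names, and rules are hypothetical.

```python
# Hedged data-quality check sketch; table path and rules are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()
df = spark.read.parquet("/data/curated/trades/")          # placeholder dataset

failures = []

# Rule 1: the business key must be unique.
dupes = df.groupBy("trade_id").count().filter(F.col("count") > 1).count()
if dupes:
    failures.append(f"{dupes} duplicated trade_id values")

# Rule 2: mandatory columns must not contain nulls.
for col in ("trade_id", "amount", "trade_date"):
    nulls = df.filter(F.col(col).isNull()).count()
    if nulls:
        failures.append(f"{nulls} null values in {col}")

# Rule 3: amounts must be positive.
bad_amounts = df.filter(F.col("amount") <= 0).count()
if bad_amounts:
    failures.append(f"{bad_amounts} non-positive amounts")

if failures:
    raise AssertionError("Data quality checks failed: " + "; ".join(failures))
print("All data quality checks passed")
```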

Posted 2 months ago

Apply

5 - 10 years

5 - 8 Lacs

Bengaluru

Work from Office


Description: Primary Skills: Kafka Cluster Management, Kafka admin, Kubernetes, Helm, DevOps, Jenkins. Secondary Skills: Grafana, Prometheus, Dynatrace.

Additional Details: Global Grade C | Level: To Be Defined | Named Job Posting? No | Remote work possibility: No | Global Role Family: To be defined | Local Role Name: To be defined | Local Skills: Kafka Cluster Management; Kubernetes | Languages Required: English | Role Rarity: To Be Defined
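For context, a minimal Python sketch (using the kafka-python client) of the routine administration tasks this role covers: creating a replicated topic and listing what already exists. The broker address, topic name, and sizing are illustrative assumptions only.

```python
# Hedged Kafka admin sketch; broker, topic, and sizing are placeholders.
from kafka.admin import KafkaAdminClient, NewTopic

admin = KafkaAdminClient(
    bootstrap_servers="broker-1:9092",   # placeholder bootstrap server
    client_id="ops-admin",
)

# Create a topic sized for durability: 6 partitions, replication factor 3.
topic = NewTopic(name="payments.events", num_partitions=6, replication_factor=3)
admin.create_topics(new_topics=[topic], validate_only=False)

# Confirm the cluster state after the change.
print(sorted(admin.list_topics()))

admin.close()
```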

Posted 2 months ago

Apply

5 - 7 years

4 - 8 Lacs

Bengaluru

Work from Office


Job Title: Spark Python Scala Developer

Responsibilities - A day in the life of an Infoscion: As part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions, and facilitate deployment resulting in client delight. You will develop a proposal by owning parts of the proposal document and by giving inputs in solution design based on areas of expertise. You will plan the activities of configuration, configure the product as per the design, conduct conference room pilots, and assist in resolving any queries related to requirements and solution design. You will conduct solution/product demonstrations and POC/Proof of Technology workshops, and prepare effort estimates which suit the customer budgetary requirements and are in line with the organization's financial guidelines. Actively lead small projects and contribute to unit-level and organizational initiatives with an objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Technical and Professional Requirements: Primary skills: Technology -> Big Data - Data Processing -> Spark
Preferred Skills: Technology -> Big Data - Data Processing -> Spark

Additional Responsibilities: Ability to develop value-creating strategies and models that enable clients to innovate, drive growth, and increase their business profitability. Good knowledge of software configuration management systems. Awareness of latest technologies and industry trends. Logical thinking and problem-solving skills, along with an ability to collaborate. Understanding of the financial processes for various types of projects and the various pricing models available. Ability to assess the current processes, identify improvement areas, and suggest technology solutions. Knowledge of one or two industry domains. Client interfacing skills. Project and team management.

Educational Requirements: Bachelor of Engineering
Service Line: Data & Analytics Unit
* Location of posting is subject to business requirements

Posted 2 months ago

Apply

5 - 8 years

5 - 9 Lacs

Bengaluru

Work from Office


Job Title: Big Data Analyst

Responsibilities - A day in the life of an Infoscion: As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution, and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture, and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management, and adherence to the organizational guidelines and processes. You would be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Technical and Professional Requirements: Primary skills: Technology -> Big Data - NoSQL -> MongoDB
Preferred Skills: Technology -> Big Data - NoSQL -> MongoDB

Additional Responsibilities: Knowledge of more than one technology. Basics of architecture and design fundamentals. Knowledge of testing tools. Knowledge of agile methodologies. Understanding of project life cycle activities on development and maintenance projects. Understanding of one or more estimation methodologies and knowledge of quality processes. Basics of the business domain to understand the business requirements. Analytical abilities, strong technical skills, and good communication skills. Good understanding of the technology and domain. Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles, and modelling methods. Awareness of latest technologies and trends. Excellent problem-solving, analytical, and debugging skills.

Educational Requirements: Bachelor of Engineering
Service Line: Cloud & Infrastructure Services
* Location of posting is subject to business requirements

Posted 2 months ago

Apply

3 - 5 years

5 - 9 Lacs

Bengaluru

Work from Office


Job Title: Big Data Analyst

Responsibilities - A day in the life of an Infoscion: As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution, and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture, and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management, and adherence to the organizational guidelines and processes. You would be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Preferred Skills: Technology -> Big Data -> Oracle BigData Appliance

Educational Requirements: Bachelor of Engineering
Service Line: Cloud & Infrastructure Services
* Location of posting is subject to business requirements

Posted 2 months ago

Apply

5 - 7 years

4 - 8 Lacs

Bengaluru

Work from Office


Job Title: HADOOP ADMIN

Responsibilities - A day in the life of an Infoscion: As part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions, and facilitate deployment resulting in client delight. You will develop a proposal by owning parts of the proposal document and by giving inputs in solution design based on areas of expertise. You will plan the activities of configuration, configure the product as per the design, conduct conference room pilots, and assist in resolving any queries related to requirements and solution design. You will conduct solution/product demonstrations and POC/Proof of Technology workshops, and prepare effort estimates which suit the customer budgetary requirements and are in line with the organization's financial guidelines. Actively lead small projects and contribute to unit-level and organizational initiatives with an objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Technical and Professional Requirements: Primary skills: Technology -> Big Data - Hadoop -> Hadoop Administration
Preferred Skills: Technology -> Big Data - Hadoop -> Hadoop Administration

Additional Responsibilities: Ability to develop value-creating strategies and models that enable clients to innovate, drive growth, and increase their business profitability. Good knowledge of software configuration management systems. Awareness of latest technologies and industry trends. Logical thinking and problem-solving skills, along with an ability to collaborate. Understanding of the financial processes for various types of projects and the various pricing models available. Ability to assess the current processes, identify improvement areas, and suggest technology solutions. Knowledge of one or two industry domains. Client interfacing skills. Project and team management.

Educational Requirements: Bachelor of Engineering
Service Line: Data & Analytics Unit
* Location of posting is subject to business requirements

Posted 2 months ago

Apply

10 - 15 years

35 - 40 Lacs

Mumbai, Bengaluru, Gurgaon

Work from Office


Data Strategy & Data Governance Manager

Join our team in Technology Strategy for an exciting career opportunity to enable our most strategic clients to realize exceptional business value from technology.

Practice: Technology Strategy & Advisory, Capability Network | Areas of Work: Data Strategy | Level: Manager | Location: Bangalore/Gurgaon/Mumbai/Pune/Chennai/Hyderabad/Kolkata | Years of Exp: 10 to 15 years

Explore an Exciting Career at Accenture: Do you believe in creating an impact? Are you a problem solver who enjoys working on transformative strategies for global clients? Are you passionate about being part of an inclusive, diverse and collaborative culture? Then, this is the right place for you! Welcome to a host of exciting global opportunities in Accenture Technology Strategy & Advisory.

The Practice - A Brief Sketch: The Technology Strategy & Advisory Practice focuses on the clients' most strategic priorities. We help clients achieve growth and efficiency through innovative R&D transformation, aimed at redefining business models using agile methodologies. As part of this high-performing team, you will work on scaling Data & Analytics, and the data that fuels it all, to power every single person and every single process. You will be part of our global team of experts who work on the right scalable solutions and services that help clients achieve their business objectives faster.
• Business Transformation: Assessment of Data & Analytics potential and development of use cases that can transform business.
• Transforming Businesses: Envisioning and designing customized, next-generation data and analytics products and services that help clients shift to new business models designed for today's connected landscape of disruptive technologies.
• Formulation of Guiding Principles and Components: Assessing impact to the client's technology landscape/architecture and ensuring formulation of relevant guiding principles and platform components.
• Products and Frameworks: Evaluate existing data and analytics products and frameworks available and develop options for proposed solutions.

Bring your best skills forward to excel in the role:
• Leverage your knowledge of technology trends across Data & Analytics and how they can be applied to address real-world problems and opportunities.
• Interact with client stakeholders to understand their Data & Analytics problems and priority use cases, define a problem statement, understand the scope of the engagement, and drive projects to deliver value to the client.
• Design and guide development of an enterprise-wide Data & Analytics strategy for our clients that includes Data & Analytics architecture, data on cloud, data quality, metadata, and master data strategy.
• Establish a framework for effective data governance across multispeed implementations. Define data ownership, standards, policies, and associated processes.
• Define a Data & Analytics operating model to manage data across the organization. Establish processes around effective data management, ensuring data quality and governance standards as well as roles for data stewards.
• Benchmark against global research benchmarks and leading industry peers to understand the current state and recommend Data & Analytics solutions.
• Conduct discovery workshops and design sessions to elicit Data & Analytics opportunities and client pain areas.
• Develop and drive Data Capability Maturity Assessment, Data & Analytics Operating Model, and Data Governance exercises for clients.
• A fair understanding of data platform strategy for data-on-cloud migrations, big data technologies, and large-scale data lake and DW-on-cloud solutions. Utilize strong expertise and certification in any of the Data & Analytics cloud platforms: Google, Azure, or AWS.
• Collaborate with business experts for business understanding, with other consultants and platform engineers for solutions, and with technology teams for prototyping and client implementations.
• Create expert content and use advanced presentation, public speaking, content creation, and communication skills for C-level discussions.
• Demonstrate a strong understanding of a specific industry, client, or technology and function as an expert to advise senior leadership.
• Manage budgeting and forecasting activities and build financial proposals.

Qualifications - Your experience counts!
• MBA from a tier 1 institute
• 5-7 years of strategy consulting experience at a consulting firm
• 3+ years of experience on projects showcasing skills across these capabilities: Data Capability Maturity Assessment, Data & Analytics Strategy, Data Operating Model & Governance, Data on Cloud Strategy, Data Architecture Strategy
• At least 2 years of experience architecting or designing solutions for any two of these domains: data quality, master data (MDM), metadata, data lineage, data catalog
• Experience in one or more technologies in the data governance space: Collibra, Talend, Informatica, SAP MDG, Stibo, Alteryx, Alation, etc.
• 3+ years of experience designing end-to-end enterprise Data & Analytics strategic solutions leveraging cloud and non-cloud platforms such as AWS, Azure, GCP, AliCloud, Snowflake, Hadoop, Cloudera, Informatica, and Palantir
• Deep understanding of the data supply chain and building a value realization framework for data transformations
• 3+ years of experience leading or managing teams effectively, including planning/structuring analytical work, facilitating team workshops, and developing Data & Analytics strategy recommendations as well as POCs
• Foundational understanding of data privacy is desired
• Mandatory knowledge of IT and enterprise architecture concepts through practical experience, and knowledge of technology trends (e.g. mobility, cloud, digital, collaboration)
• A strong understanding of any of the following industries is preferred: Financial Services, Retail, Consumer Goods, Telecommunications, Life Sciences, Transportation, Hospitality, Automotive/Industrial, Mining and Resources, or equivalent domains
• CDMP Certification from DAMA preferred
• Cloud Data & AI Practitioner Certifications (Azure, AWS, Google) desirable but not essential

Posted 2 months ago

Apply

6 - 10 years

8 - 12 Lacs

Bengaluru

Work from Office


About Accenture: Accenture is a leading global professional services company, providing a broad range of services and solutions in strategy, consulting, digital, technology, and operations. Combining unmatched experience and specialized skills across more than 40 industries and all business functions, underpinned by the world's largest delivery network, Accenture works at the intersection of business and technology to help clients improve their performance and create sustainable value for their stakeholders. With 492,000 people serving clients in more than 120 countries, Accenture drives innovation to improve the way the world works and lives. Visit us at

About Global Network: Accenture Strategy shapes our clients' future, combining deep business insight with the understanding of how technology will impact industry and business models. Our focus on issues such as digital disruption, redefining competitiveness, operating and business models, as well as the workforce of the future, helps our clients find future value and growth in a digital world. Today, every business is a digital business. Digital is changing the way organizations engage with their employees, business partners, customers, and communities: how they manufacture and deliver products and services, and how they run their organizations. This is our unique differentiator. We seek people who recognize and understand the impact that digital and technology have on every industry and every sector, and who share our passion to shape unique strategies that allow our clients to succeed in this environment. To bring this global perspective to our clients, Accenture Strategy's services include those provided by our Global Network, a distributed management consulting organization that provides management consulting and strategy expertise across the client lifecycle. Approximately 10,000 consultants are part of this rapidly expanding network, providing specialized and strategic industry and functional consulting expertise from key locations around the world. Our Global Network teams complement our in-country teams to deliver cutting-edge expertise and measurable value to clients all around the world. For more information visit .

Practice Overview: Skill/Operating Group: Technology Consulting | Level: Consultant | Location: Gurgaon/Mumbai/Bangalore/Kolkata/Pune | Travel Percentage: Expected travel could be anywhere between 0-100%

Principal Duties and Responsibilities: Working closely with our clients, Consulting professionals design, build, and implement strategies that can help enhance business performance. They develop specialized expertise (strategic, industry, functional, technical) in a diverse project environment that offers multiple opportunities for career growth. The opportunities to make a difference within exciting client initiatives are limitless in this ever-changing business landscape. Here are just a few of your day-to-day responsibilities:
• Architect large-scale data lake, DW, and Delta Lake on cloud solutions using AWS, Azure, GCP, Ali Cloud, Snowflake, Hadoop, or Cloudera
• Design Data Mesh strategy and architecture
• Build strategy and roadmap for data migration to cloud
• Establish Data Governance strategy and operating model
• Implement programs/interventions that prepare the organization for implementation of new business processes
• Apply a deep understanding of data and analytics platforms and data integration with cloud
• Provide thought leadership to the downstream teams for developing offerings and assets
• Identify, assess, and solve complex business problems for your area of responsibility, where analysis of situations or data requires an in-depth evaluation of variable factors
• Oversee the production and implementation of solutions covering multiple cloud technologies, associated infrastructure/application architecture, development, and operating models
• Apply your solid understanding of data, data on cloud, and disruptive technologies
• Drive enterprise business, application, and integration architecture
• Help solve key business problems and challenges by enabling a cloud-based architecture transformation, painting a picture of, and charting a journey from, the current state to a "to-be" enterprise environment
• Assist our clients to build the required capabilities for growth and innovation to sustain high performance
• Manage multi-disciplinary teams to shape, sell, communicate, and implement programs
• Participate in client presentations and orals, for proposal defense etc.
• Effectively communicate the target state, architecture, and topology on cloud to clients

Qualifications: Bachelor's degree; MBA degree from a Tier-1 college (preferable); 6-10 years of large-scale consulting experience and/or working with hi-tech companies in data architecture, data governance, data mesh, data security, and management; certified in DAMA (Data Management), Azure Data Architecture, Google Cloud Data Analytics, or AWS Data Analytics.

Experience: We are looking for experienced professionals with data strategy, data architecture, data on cloud, data modernization, data operating model, and data security experience across all stages of the innovation spectrum, with a remit to build the future in real time. The candidate should have practical industry expertise in one of these areas: Financial Services, Retail, Consumer Goods, Telecommunications, Life Sciences, Transportation, Hospitality, Automotive/Industrial, Mining and Resources.

Key Competencies and Skills: The right candidate should have competency and skills aligned to one or more of these archetypes:
• Data SME - Experience in deal shaping and strong presentation skills, leading proposal experience, customer orals; technical understanding of data platforms, data-on-cloud strategy, data strategy, data operating model, change management of data transformation programs, and data modeling skills.
• Data on Cloud Architect - Technical understanding of data platform strategy for data-on-cloud migrations and big data technologies; experience in architecting large-scale data lake and DW on cloud solutions; experience in one or more technologies in this space: AWS, Azure, GCP, AliCloud, Snowflake, Hadoop, Cloudera.
• Data Strategy - Data Capability Maturity Assessment, Data & Analytics / AI Strategy, Data Operating Model & Governance, Data Hub Enablement, Data on Cloud Strategy, Data Architecture Strategy.
• Data Transformation Lead - Understanding of the data supply chain and data platforms on cloud, experience in conducting alignment workshops, building value realization frameworks for data transformations, and program management experience.

Exceptional interpersonal and presentation skills: ability to convey technology and business value propositions to senior stakeholders. Capacity to develop high-impact thought leadership that articulates a forward-thinking view of the market.

Other desired skills:
• Strong desire to work in technology-driven business transformation
• Strong knowledge of technology trends across IT and digital and how they can be applied to companies to address real-world problems and opportunities
• Comfort conveying both high-level and detailed information, adjusting the way ideas are presented to better address varying social styles and audiences
• Leading proof-of-concept and/or pilot implementations and defining the plan to scale implementations across multiple technology domains
• Flexibility to accommodate client travel requirements
• Published thought leadership (whitepapers, POVs)

Posted 2 months ago

Apply

4 - 9 years

6 - 11 Lacs

Pune

Work from Office


As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the Azure cloud data platform.

Responsibilities:
• Build data pipelines to ingest, process, and transform data from files, streams, and databases.
• Process data with Spark, Python, PySpark, and Hive, HBase, or other NoSQL databases on the Azure cloud data platform or HDFS.
• Develop efficient software code for multiple use cases built on the platform, leveraging the Spark framework with Python or Scala and Big Data technologies.
• Develop streaming pipelines.
• Work with Hadoop / Azure ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data and cloud technologies such as Apache Spark and Kafka.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
• Minimum 4+ years of experience in Big Data technologies with extensive data engineering experience in Spark with Python or Scala
• Minimum 3 years of experience on cloud data platforms on Azure
• Experience with Databricks / Azure HDInsight / Azure Data Factory, Synapse, SQL Server DB
• Good to excellent SQL skills
• Exposure to streaming solutions and message brokers such as Kafka

Preferred technical and professional experience: Certification in Azure and Databricks, or Cloudera Spark Certified Developer

Posted 2 months ago

Apply

12 - 18 years

20 - 35 Lacs

Navi Mumbai, Mumbai, Mumbai (All Areas)

Work from Office


1. BACKGROUND
Job Title: Senior Technical Infra Manager - Hadoop Systems and Administration
IT Domain: Big Data, Data Analytics and Data Management
Client Location: BKC, Mumbai, INDIA (Client Facing Role)
Functional Domain: Capital Markets, Banking, BFSI
Experience: Overall, 14-18 years in Systems Administration, with mandatory experience for the last 6 years in Hadoop Cluster Administration on Cloudera or Hortonworks
Expertise: Infrastructure, Systems Administration, Hadoop Platform Administration
Project: Greenfield Hadoop (CDP) based Data Warehouse (DWH) Implementation
Company: Smartavya Analytica Private Limited is a niche Data and AI company. Based in Pune, we are pioneers in data-driven innovation, transforming enterprise data into strategic insights. Established in 2017, our team has experience in handling large datasets of up to 20 PB in a single implementation, delivering many successful data and AI projects across major industries, including retail, finance, telecom, manufacturing, insurance, and capital markets. We are leaders in Hadoop, CDP, Big Data, Cloud and Analytics projects with super specialization in very large data platforms. https://smart-analytica.com - Empowering Your Digital Transformation with Data Modernization and AI

Job Summary: For its flagship project of more than 100 CDP nodes, we are looking for a senior and accomplished Senior / General Manager level infrastructure leader in Hadoop, Big Data, Cloud and Data, encompassing infrastructure, the CDP platform, applications, and other Hadoop areas. The person in this role will actively own and provide leadership for the entire Hadoop infrastructure of 150 nodes encompassing 25+ PB, manage a team of 15 professionals, and lead the team of 15 Hadoop Admins managing the client CDP cluster round the clock, 24 hours a day, 365 days a year. In this role, you will play a key part in technical solutioning, infrastructure management, issue resolution, resolving performance bottlenecks, cluster monitoring, cluster administration, Linux administration, security design and management, and team management and mentoring. The ideal candidate possesses a strong analytical mindset, a deep understanding of Hadoop cluster management techniques, and a knack for presenting complex analysis and resolution in a clear and concise manner.

2. KEY RESPONSIBILITIES & SKILLS
a. Client and Team Leadership and Mentoring
• As owner, interact with VP/Director level senior clients and provide end-to-end leadership
• Lead a team of 15 Infra / Hadoop Administrators working on a 24x7 shift basis
• Design and draw the long-term roadmap of the overall data strategy and its implementation
• Roster management and coverage: ensure all 3 shifts across all 365 days are manned and working efficiently
• Train new team members on Hadoop Administration and get them ready for projects
b. Hadoop Cluster Management and Administration
• Installation and Configuration: Install and configure Hadoop clusters using Cloudera Data Platform (CDP), ensuring optimal setup tailored to project needs.
• Cluster Access Management: Utilize the Cloudera Manager Admin Console to manage cluster access and oversee administrative operations.
• Cluster Maintenance and Upgrades: Plan and execute cluster upgrades, patches, and migrations with minimal downtime.
• Resource Management: Ensure high availability and scalability by effectively managing cluster resources and configurations.
c. Hadoop System Monitoring and Performance Optimization
• Health Monitoring: Proactively monitor cluster health and performance using Cloudera Manager tools to ensure system reliability and high availability.
• Performance Tuning: Analyse system metrics to identify bottlenecks and optimize resource utilization for enhanced performance.
• Troubleshooting: Promptly troubleshoot and resolve issues related to cluster operations and data processing.
• Perform Linux administration tasks and manage system configurations.
• Ensure data integrity and backup procedures, including DR replication.
d. Security Management and Compliance
• Access Control: Configure and manage user access, roles, and permissions within the Hadoop environment to maintain security protocols. Implement and manage security and data governance.
• Data Security: Ensure data security by implementing encryption and authentication mechanisms and adhering to security policies.
• Regulatory Compliance: Maintain compliance with industry regulations pertinent to capital markets, ensuring data handling meets required standards.
• Vulnerability Management: Collaborate with security teams to identify and address vulnerabilities, ensuring system integrity.

3. Qualifications
a. Experience:
• Overall: 14-18 years in senior positions in Infrastructure Management, Data Center Management, Systems Administration, Big Data / Hadoop / CDP Architect, and/or Senior Solutions Architect roles.
• Hadoop Administration: Minimum of 6 years managing Hadoop clusters using Cloudera CDP or Hortonworks.
b. Technical Expertise:
• Proficient in managing infrastructure and Hadoop platform administration of components and tools, including HDFS, YARN, MapReduce, Hive, Ozone, DR Replication, Kudu management, Spark Streaming, and related areas.
• Strong understanding of Cloudera Data Platform (CDP) features and administrative tools.
• Experience with Linux/Unix system administration and scripting languages (e.g., Bash, Python).
• Knowledge of data warehouse concepts and big data best practices.
• Hadoop Technologies (preferred): Hands-on experience with HDFS, Hive, and Spark for handling large-scale data environments.
c. Domain Knowledge: Familiarity with capital markets and financial services is highly desirable.
d. Soft Skills:
• Excellent problem-solving and analytical abilities.
• Strong communication skills, both written and verbal.
• Ability to engage effectively with clients and stakeholders.
• Leadership skills with experience guiding technical teams.

4. Educational Background
Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.

5. What We Offer
• Challenging Projects: Work on a cutting-edge greenfield project in the big data and analytics space.
• Professional Growth: Opportunities for learning and career advancement within the organization.
• Collaborative Environment: Join a dynamic team focused on innovation and excellence.
• Competitive Compensation: Attractive salary package commensurate with experience and expertise.
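As a small illustration of the day-to-day monitoring this role involves, here is a hedged Python sketch that wraps `hdfs dfsadmin -report` and flags dead DataNodes or low remaining capacity. The thresholds and output parsing are assumptions; in practice, Cloudera Manager's own health checks and alerting would be the primary mechanism.

```python
# Hedged HDFS health-check sketch; thresholds and parsing are assumptions.
import re
import subprocess
import sys

def hdfs_report() -> str:
    # Requires the HDFS client on PATH (and a Kerberos ticket on secured clusters).
    return subprocess.run(
        ["hdfs", "dfsadmin", "-report"],
        check=True, capture_output=True, text=True,
    ).stdout

def check(report: str, min_free_pct: float = 20.0) -> list:
    problems = []
    dead = re.search(r"Dead datanodes \((\d+)\)", report)
    if dead and int(dead.group(1)) > 0:
        problems.append(f"{dead.group(1)} dead DataNode(s)")
    used = re.search(r"DFS Used%:\s*([\d.]+)%", report)
    if used:
        free_pct = 100.0 - float(used.group(1))
        if free_pct < min_free_pct:
            problems.append(f"only {free_pct:.1f}% DFS capacity free")
    return problems

if __name__ == "__main__":
    issues = check(hdfs_report())
    if issues:
        print("HDFS health check FAILED: " + "; ".join(issues))
        sys.exit(1)
    print("HDFS health check passed")
```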

Posted 2 months ago

Apply

5 - 9 years

10 - 15 Lacs

Bengaluru

Work from Office


Clifyx Technology is looking for a Lead Data Engineer (Bigdata, Cloudera, Pyspark, Hive, Scala) to join our dynamic team and embark on a rewarding career journey.
• Designing and developing data pipelines: Lead data engineers are responsible for designing and developing data pipelines that move data from various sources to storage and processing systems.
• Building and maintaining data infrastructure: Lead data engineers are responsible for building and maintaining data infrastructure, such as data warehouses, data lakes, and data marts.
• Ensuring data quality and integrity: Lead data engineers are responsible for ensuring data quality and integrity by setting up data validation processes and implementing data quality checks.
• Managing data storage and retrieval: Lead data engineers are responsible for managing data storage and retrieval by designing and implementing data storage systems, such as NoSQL databases or Hadoop clusters.
• Developing and maintaining data models: Lead data engineers are responsible for developing and maintaining data models, such as data dictionaries and entity-relationship diagrams, to ensure consistency in data architecture.
• Managing data security and privacy: Lead data engineers are responsible for managing data security and privacy by implementing security measures, such as access controls and encryption, to protect sensitive data.
• Leading and managing a team: Lead data engineers may be responsible for leading and managing a team of data engineers, providing guidance and support for their work.

Posted 2 months ago

Apply