
237 Cloudera Jobs

JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

15.0 - 20.0 years

10 - 15 Lacs

Mumbai, Gurugram, Bengaluru

Work from Office

S&C GN - Tech Strategy & Advisory - Cloud Architecture - Senior Manager

Global Network is a distributed management consulting organization that provides management consulting and strategy expertise across the client lifecycle. Approximately 10,000 consultants are part of this rapidly expanding network, providing specialized and strategic industry and functional consulting expertise from key locations around the world. Our Global Network teams complement our in-country teams to deliver cutting-edge expertise and measurable value to clients all around the world. For more information visit .

Practice Overview:
- Skill/Operating Group: Technology Consulting
- Level: Senior Manager
- Location: Gurgaon/Mumbai/Bangalore/Pune/Kolkata
- Travel Percentage: expected travel could be anywhere between 0-100%

Principal Duties and Responsibilities:
Working closely with our clients, Consulting professionals design, build, and implement strategies that can help enhance business performance. They develop specialized expertise (strategic, industry, functional, technical) in a diverse project environment that offers multiple opportunities for career growth. The opportunities to make a difference within exciting client initiatives are limitless in this ever-changing business landscape. Here are just a few of your day-to-day responsibilities:
- Provide thought leadership in Cloud Architecture and Data Modernization, with a strong focus on industry-specific trends and best practices
- Lead large-scale Cloud Assessment, Migration, and Application Modernization projects
- Design an Application Modernization strategy for enterprises to meet business goals
- Build strategy and roadmap for migration to cloud
- Design the Application Modernization architecture on cloud solutions such as AWS, Azure, GCP, and Ali Cloud
- Apply a deep understanding of Microservices and event-driven architecture (EDA)
- Lead large-scale Big Data Modernization strategy and architecture engagements
- Design a holistic data strategy to help enterprises meet their business goals
- Architect large-scale data lakes, DW, and Delta Lake on cloud solutions using AWS, Azure, GCP, Ali Cloud, Snowflake, Hadoop, or Cloudera
- Design Data Mesh strategy and architecture
- Build strategy and roadmap for data migration to cloud
- Establish Data Governance strategy and operating model
- Implement programs/interventions that prepare the organization for implementation of new business processes
- Provide thought leadership to the downstream teams for developing offerings and assets
- Identify, assess, and solve complex business problems for the area of responsibility, where analysis of situations or data requires an in-depth evaluation of variable factors
- Oversee the production and implementation of solutions covering multiple cloud technologies, associated infrastructure/application architecture, development, and operating models
- Drive enterprise business, application, and integration architecture
- Help solve key business problems and challenges by enabling a cloud-based architecture transformation, painting a picture of, and charting a journey from, the current state to a to-be enterprise environment
- Assist our clients to build the required capabilities for growth and innovation to sustain high performance
- Manage multi-disciplinary teams to shape, sell, communicate, and implement programs
- Participate in client presentations and orals for proposal defense
- Effectively communicate the target state, architecture, and topology on cloud to clients

Qualifications:
- Bachelor's degree; MBA degree from a Tier-1 college (preferable)
- 15-20 years of large-scale consulting experience and/or working with hi-tech companies, leading projects on cloud strategy, assessment, cloud architecture, app modernization, containers, information security, and information management
- Experience in data architecture, data governance, data mesh, data security, and management
- Certified in DAMA (Data Management), Azure Data Architecture, Google Cloud Data Analytics, or AWS Data Analytics
- Architect certification on Azure/AWS/GCP

Experience:
We are seeking experienced professionals who have led large-scale engagements in application modernization, cloud architecture, and cloud strategy. The ideal candidates will have technical expertise in cloud strategy, assessments, cloud architecture, application modernization, and containers across all stages of the innovation lifecycle, with a focus on shaping the future in real time. The candidate should have practical industry expertise in one of these areas: Financial Services, Retail, Consumer Goods, Telecommunications, Life Sciences, Transportation, Hospitality, Automotive/Industrial, Mining and Resources.

Key Competencies and Skills:
The right candidate should have competency and skills aligned to one or more of these archetypes:
- Application Assessment & Migration: experience leading large-scale assessment and cloud migration projects
- Application Modernization: experience leading and designing composable architecture leveraging Microservices and EDA on cloud platforms
- Cloud Architecture: experience with private, public, and hybrid cloud architectures, their pros/cons, and hybrid cloud integration architecture; cloud-native application development, DevOps, and data integration within the cloud platform
- Cloud Migration: delivering cloud migration roadmaps and managing execution from a project management perspective; cloud deployment across various hyperscaler platforms (AWS, Azure, GCP) and models (PaaS, SaaS, IaaS), containers, and virtualization platforms (VMware)
- Data SME: experience in deal shaping and strong presentation skills, leading proposal experience, customer orals; technical understanding of data platforms, data-on-cloud strategy, data strategy, data operating model, change management of data transformation programs, data modeling skills
- Data on Cloud Architect: technical understanding of data platform strategy for data-on-cloud migrations and big data technologies; experience architecting large-scale data lakes and DW on cloud solutions; experience in one or more technologies in this space: AWS, Azure, GCP, AliCloud, Snowflake, Hadoop, Cloudera
- Data Strategy: Data Capability Maturity Assessment, Data & Analytics / AI Strategy, Data Operating Model & Governance, Data Hub Enablement, Data on Cloud Strategy, Data Architecture Strategy
- Data Transformation Lead: understanding of the data supply chain and data platforms on cloud; experience in conducting alignment workshops, building value realization frameworks for data transformations; program management experience
- Exceptional interpersonal and presentation skills: ability to convey technology and business value propositions to senior stakeholders
- Capacity to develop high-impact thought leadership that articulates a forward-thinking view of the market

Other desired skills:
- Strong desire to work in technology-driven business transformation
- Strong knowledge of technology trends across IT and digital and how they can be applied to companies to address real-world problems and opportunities
- Comfort conveying both high-level and detailed information, adjusting the way ideas are presented to better address varying social styles and audiences
- Leading proof-of-concept and/or pilot implementations and defining the plan to scale implementations across multiple technology domains
- Flexibility to accommodate client travel requirements
- Published thought leadership: whitepapers, POVs

Posted 3 days ago

Apply

7.0 - 12.0 years

15 - 25 Lacs

Bengaluru

Hybrid

Looking for Hadoop Developer

Posted 4 days ago

Apply

2.0 - 6.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Java Developer, you will be responsible for analyzing, designing, programming, debugging, and modifying software enhancements and/or new products used in various computer programs. Your expertise in Java, Spring MVC, Spring Boot, database design, and query handling will be used to write code, complete programming, and perform testing and debugging of applications. You will work on local, networked, cloud-based, or Internet-related computer programs, ensuring the code meets the necessary standards for commercial or end-user applications such as materials management, financial management, HRIS, mobile apps, or desktop application products.

Your role will involve working with RESTful web services/microservices for JSON creation, data parsing/processing in batch and stream mode, and messaging platforms such as Kafka, Pub/Sub, and ActiveMQ, among others. Proficiency with operating systems, Linux, virtual machines, and open-source tools/platforms is crucial for successful implementation. Additionally, you will be expected to have an understanding of data modeling and storage with NoSQL or relational DBs, as well as experience with Jenkins, containerized microservices deployment in cloud environments, and Big Data development (Spark, Hive, Impala, time-series DBs).

To excel in this role, you should have a solid understanding of building microservices/web services using Java frameworks, REST API standards and practices, and object-oriented analysis and design patterns. Experience with cloud technologies like Azure, AWS, and GCP will be advantageous. A candidate with telecom domain experience and familiarity with protocols such as TCP, UDP, SNMP, SSH, FTP, SFTP, CORBA, and SOAP will be preferred. Being enthusiastic about work, passionate about coding, a self-starter, and proactive will be key qualities for success in this position. Strong communication, analytical, and problem-solving skills are essential, along with the ability to write quality, testable, modular code.

Experience in Big Data platforms, participation in agile development methodologies, and working in a start-up environment will be beneficial. Team-leading experience is an added advantage, and immediate joiners will be given special priority. If you possess the necessary skills and experience, have a keen interest in software development, and are ready to contribute to a dynamic team environment, we encourage you to apply for this role.

Posted 5 days ago

Apply

2.0 - 4.0 years

25 - 30 Lacs

Pune

Work from Office

Rapid7 is looking for a Data Engineer to join our dynamic team and embark on a rewarding career journey:
- Liaising with coworkers and clients to elucidate the requirements for each task
- Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed
- Reformulating existing frameworks to optimize their functioning
- Testing such structures to ensure that they are fit for use
- Preparing raw data for manipulation by data scientists
- Detecting and correcting errors in your work
- Ensuring that your work remains backed up and readily accessible to relevant coworkers
- Remaining up to date with industry standards and technological advancements that will improve the quality of your outputs

Posted 6 days ago

Apply

10.0 - 12.0 years

0 - 1 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Experience Required: 10 to 12 yrs (3-4 years in Cloudera + cloud migration)
Work Location: Hyderabad, Bangalore, Chennai, Noida, Pune
Work Type: Hybrid model
Work Time: Canada EST hours (mandatory)

Job Summary: We are seeking a skilled Cloudera Migration Specialist to lead the migration of our on-premises Cloudera cluster to Microsoft Azure. The ideal candidate will have 3–4 years of hands-on experience with Cloudera platform administration, optimization, and migration, along with a strong understanding of Azure cloud services and data engineering best practices.

Key Responsibilities:
• Lead and execute the migration of Cloudera workloads (HDFS, Hive, Spark, Impala, HBase, etc.) from on-premise infrastructure to Azure
• Assess the existing Cloudera cluster, identify dependencies, and prepare a detailed migration roadmap
• Develop and implement data migration scripts, workflows, and cloud-native configurations
• Design and deploy equivalent services on Azure using Azure HDInsight, Azure Data Lake, Azure Synapse, or other relevant services
• Ensure data integrity, performance tuning, and post-migration validation
• Collaborate with infrastructure, security, and DevOps teams to ensure compliance and automation
• Prepare and maintain documentation of the migration plan, architecture, and troubleshooting playbooks
• Provide knowledge transfer and training to internal teams post-migration

Required Skills & Experience:
• 3–4 years of hands-on experience with Cloudera (CDH/CDP) ecosystem components (e.g., HDFS, YARN, Hive, Spark, Impala, HBase)
• Proven experience in Cloudera cluster migrations, preferably to cloud platforms like Azure
• Solid understanding of cloud-native equivalents and data architectures on Azure
• Experience with Azure services such as HDInsight, Data Lake Storage, Synapse Analytics, and Blob Storage
• Proficiency in Linux system administration, shell scripting, and automation tools
• Strong problem-solving and troubleshooting abilities in distributed data environments
• Familiarity with security controls, Kerberos, Ranger, LDAP integration, and data governance

Preferred Qualifications:
• Cloudera Certified Administrator / Developer
• Experience with Azure DevOps, Terraform, or Ansible for infrastructure provisioning
• Knowledge of disaster recovery planning and HA architectures on Azure
• Familiarity with performance tuning in cloud vs. on-prem Hadoop environments
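The assessment step above (identify dependencies, prepare a migration roadmap) is often planned as dependency-ordered "waves". A minimal pure-Python sketch of that idea, under the assumption of a hypothetical workload inventory (the names and structure here are illustrative, not part of the job description):

```python
def plan_migration_waves(workloads, depends_on):
    """Group workloads into migration waves so that each workload moves
    only after everything it depends on has already been migrated."""
    indegree = {w: 0 for w in workloads}
    dependents = {w: [] for w in workloads}
    for w, deps in depends_on.items():
        for d in deps:
            dependents[d].append(w)
            indegree[w] += 1
    waves, remaining = [], set(workloads)
    while remaining:
        # Everything with no unmigrated dependency can move in this wave.
        wave = sorted(w for w in remaining if indegree[w] == 0)
        if not wave:
            raise ValueError("cyclic dependency between workloads")
        waves.append(wave)
        for w in wave:
            remaining.discard(w)
            for nxt in dependents[w]:
                indegree[nxt] -= 1
    return waves

# Hypothetical inventory: raw HDFS data feeds a Hive table, which feeds
# a Spark aggregation job, which feeds an HBase serving layer.
workloads = ["hdfs_raw", "hive_sales", "spark_agg", "hbase_serve"]
depends_on = {"hive_sales": ["hdfs_raw"],
              "spark_agg": ["hive_sales"],
              "hbase_serve": ["spark_agg"]}
print(plan_migration_waves(workloads, depends_on))
```

A real assessment would derive the dependency map from lineage metadata (e.g., Hive table usage) rather than hand-written dictionaries; the wave ordering logic stays the same.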

Posted 1 week ago

Apply

7.0 - 12.0 years

7 - 11 Lacs

Pune

Work from Office

Capco, a Wipro company, is a global technology and management consulting firm, awarded Consultancy of the Year at the British Bank Awards and ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With our presence across 32 cities across the globe, we support 100+ clients across the banking, financial, and energy sectors. We are recognized for our deep transformation execution and delivery.

WHY JOIN CAPCO: You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers and other key players in the industry, projects that will transform the financial services industry.

MAKE AN IMPACT: Innovative thinking, delivery excellence and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.

#BEYOURSELFATWORK: Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.

CAREER ADVANCEMENT: With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.

DIVERSITY & INCLUSION: We believe that diversity of people and perspective gives us a competitive advantage.

JOB SUMMARY:
Position: Sr Consultant
Location: Pune / Bangalore
Band: M3/M4 (7 to 14 years)

Role Description, Must-Have Skills:
- 4+ years of experience (minimum) in PySpark and Scala + Spark
- Proficient in debugging and data analysis
- 4+ years of Spark experience
- Understanding of SDLC and the Big Data application lifecycle
- Experience with GitHub and Git commands
- Good to have: experience in CI/CD tools such as Jenkins and Ansible
- Fast problem solver and self-starter
- Experience using Control-M and ServiceNow (for incident management)
- Positive attitude and good communication skills (written and verbal), without mother-tongue interference

We offer:
- A work culture focused on innovation and creating lasting value for our clients and employees
- Ongoing learning opportunities to help you acquire new skills or deepen existing expertise
- A flat, non-hierarchical structure that will enable you to work with senior partners and directly with clients
- A diverse, inclusive, meritocratic culture

#LI-Hybrid

Posted 1 week ago

Apply

5.0 - 8.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Number of Openings: 4
ECMS ID: 534188
Assignment Duration: 6
Total Yrs. of Experience: 5-8 years
Relevant Yrs. of Experience: 5

Detailed JD (Roles and Responsibilities): We need a candidate with 5-8 years of experience. Must have Python, PySpark ML, Snowpark ML, and Snowflake ML deployment experience. Experience migrating CDSW Python (ML models) to Snowflake-compatible Python. ONNX conversion and MLflow registration experience also required. MLOps experience: deployment of migrated ML code and CI/CD, model object deployment, and model monitoring. Requires testing of the adapted Python code. SageMaker orchestration is also required. Experience with both Cloudera and Snowflake is preferred but not required.

Mandatory Skills: Python, PySpark ML, Snowpark ML, Snowflake ML deployment experience
Desired/Secondary Skills: Python, SQL, Data Engineering
Domain: FS
Proposed Vendor Rate (from ECMS ID, visible to the user/requestor): 6636 INR/day
Delivery Anchor (sourcing statistics, technical evaluation, interviews and feedback): Vamshi Taduri
Work Location: Pune, Mysore
Work Mode: WFO (5 days a week from Infosys ODC)
BG Check: pre-onboarding
Shifts: regular (morning shift), 8 AM to 5 PM

Posted 1 week ago

Apply

5.0 - 10.0 years

22 - 27 Lacs

Kochi

Work from Office

Responsibilities:
- Create Solution Outline and Macro Design to describe end-to-end product implementation in data platforms, including system integration, data ingestion, data processing, serving layer, design patterns, and platform architecture principles
- Contribute to pre-sales and sales support through RfP responses, solution architecture, planning, and estimation
- Contribute to reusable component / asset / accelerator development to support capability development
- Participate in customer presentations as Platform Architect / Subject Matter Expert on Big Data, Azure Cloud, and related technologies
- Participate in customer PoCs to deliver the outcomes
- Participate in delivery reviews / product reviews and quality assurance, and work as design authority

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Experience in designing data products providing descriptive, prescriptive, and predictive analytics to end users or other systems
- Experience in data engineering and architecting data platforms
- Experience in architecting and implementing data platforms on the Azure Cloud Platform; Azure cloud experience is mandatory (ADLS Gen1/Gen2, Data Factory, Databricks, Synapse Analytics, Azure SQL, Cosmos DB, Event Hub, Snowflake), plus Azure Purview, Microsoft Fabric, Kubernetes, Terraform, Airflow
- Experience in the Big Data stack (Hadoop ecosystem: Hive, HBase, Kafka, Spark, Scala, PySpark, Python, etc.) with Cloudera or Hortonworks

Preferred technical and professional experience:
- Experience in architecting complex data platforms on the Azure Cloud Platform and on-prem
- Experience and exposure to implementation of Data Fabric and Data Mesh concepts and solutions such as Microsoft Fabric, Starburst, Denodo, IBM Data Virtualisation, Talend, or Tibco Data Fabric
- Exposure to data cataloging and governance solutions such as Collibra, Alation, Watson Knowledge Catalog, Databricks Unity Catalog, Apache Atlas, Snowflake data glossary, etc.

Posted 1 week ago

Apply

5.0 - 10.0 years

22 - 27 Lacs

Bengaluru

Work from Office

Responsibilities:
- Create Solution Outline and Macro Design to describe end-to-end product implementation in data platforms, including system integration, data ingestion, data processing, serving layer, design patterns, and platform architecture principles
- Contribute to pre-sales and sales support through RfP responses, solution architecture, planning, and estimation
- Contribute to reusable component / asset / accelerator development to support capability development
- Participate in customer presentations as Platform Architect / Subject Matter Expert on Big Data, Azure Cloud, and related technologies
- Participate in customer PoCs to deliver the outcomes

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Candidates must have experience in designing data products providing descriptive, prescriptive, and predictive analytics to end users or other systems
- 10-15 years of experience in data engineering and architecting data platforms
- 5-8 years of experience in architecting and implementing data platforms on the Azure Cloud Platform
- 5-8 years of experience in architecting and implementing data platforms on-prem (Hadoop or DW appliance)
- Azure cloud experience is mandatory (ADLS Gen1/Gen2, Data Factory, Databricks, Synapse Analytics, Azure SQL, Cosmos DB, Event Hub, Snowflake), plus Azure Purview, Microsoft Fabric, Kubernetes, Terraform, Airflow
- Experience in the Big Data stack (Hadoop ecosystem: Hive, HBase, Kafka, Spark, Scala, PySpark, Python, etc.) with Cloudera or Hortonworks

Preferred technical and professional experience:
- Exposure to data cataloging and governance solutions such as Collibra, Alation, Watson Knowledge Catalog, Databricks Unity Catalog, Apache Atlas, Snowflake data glossary, etc.
- Candidates should have experience in delivering both business decision support systems (reporting, analytics) and data science domains / use cases

Posted 1 week ago

Apply

2.0 - 5.0 years

14 - 17 Lacs

Mysuru

Work from Office

As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows from source to target and implementing solutions that tackle the client's needs.

Your primary responsibilities include:
- Design, build, optimize, and support new and existing data models and ETL processes based on our clients' business requirements
- Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization
- Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Must have 5+ years of experience in Big Data: Hadoop, Spark, Scala, Python, HBase, Hive
- Good to have: AWS (S3, Athena, DynamoDB, Lambda), Jenkins, Git
- Experience developing Python and PySpark programs for data analysis
- Good working experience using Python to develop a custom framework for generating rules (like a rules engine)
- Experience developing Python code to gather data from HBase and designing solutions implemented with PySpark
- Experience using Apache Spark DataFrames/RDDs to apply business transformations and Hive context objects to perform read/write operations

Preferred technical and professional experience:
- Understanding of DevOps
- Experience in building scalable end-to-end data ingestion and processing solutions
- Experience with object-oriented and/or functional programming languages, such as Python, Java, and Scala
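The "custom framework for generating rules (like a rules engine)" mentioned in the posting can be pictured with a minimal pure-Python sketch. All names here are hypothetical; in a real pipeline each predicate would typically be compiled to a PySpark column expression rather than run per record:

```python
def make_rule(name, predicate):
    """Wrap a predicate into a named rule returning (name, passed)."""
    def rule(record):
        return name, bool(predicate(record))
    return rule

# Illustrative data-quality rules for a transaction record.
rules = [
    make_rule("amount_positive", lambda r: r["amount"] > 0),
    make_rule("currency_known", lambda r: r["currency"] in {"INR", "USD"}),
]

def evaluate(record, rules):
    """Apply every rule to one record and collect the pass/fail results."""
    return {name: ok for name, ok in (rule(record) for rule in rules)}

print(evaluate({"amount": 120.0, "currency": "INR"}, rules))
```

The appeal of this shape is that new rules are data, not code changes: a config file can list rule names and expressions, and the framework generates the callables.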

Posted 1 week ago

Apply

4.0 - 9.0 years

16 - 25 Lacs

Navi Mumbai, Bengaluru, Mumbai (All Areas)

Hybrid

Role & responsibilities:
- Design and implement scalable data pipelines for feature extraction, transformation, and loading (ETL) using technologies such as PySpark, Scala, and relevant big data frameworks
- Govern and optimize data pipelines to ensure high reliability, efficiency, and data quality across on-premise and cloud environments
- Collaborate closely with data scientists, ML engineers, and business stakeholders to understand data requirements and translate them into technical solutions
- Implement best practices for data governance, metadata management, and compliance with regulatory requirements
- Lead a team of data engineers, providing technical guidance and mentorship and fostering a culture of innovation and collaboration
- Stay updated with industry trends and advancements in big data technologies and contribute to the continuous improvement of our data engineering practices

Preferred candidate profile:
- Strong experience in data engineering with hands-on experience in designing and implementing data pipelines
- Strong proficiency in programming languages such as PySpark and Scala, with experience in big data technologies (Cloudera, Hadoop ecosystem)
- Proven leadership experience in managing and mentoring a team of data engineers
- Experience working in a banking or financial services environment is a plus
- Excellent communication skills with the ability to collaborate effectively across teams and stakeholders
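The extract-transform-load shape described in the first responsibility can be sketched in a few lines of pure Python. This is an illustration only (in the role itself the stages would be PySpark jobs reading from Hive/HDFS, and the field names below are invented):

```python
def extract(rows):
    # Stand-in for reading from a source system (e.g., a Hive table).
    return list(rows)

def transform(rows):
    # Feature extraction: drop rows with a missing amount, then
    # aggregate per-customer spend as a derived feature.
    cleaned = [r for r in rows if r.get("amount") is not None]
    totals = {}
    for r in cleaned:
        totals[r["customer"]] = totals.get(r["customer"], 0.0) + r["amount"]
    return [{"customer": c, "total_spend": t} for c, t in sorted(totals.items())]

def load(features, target):
    # Stand-in for writing to a serving store or warehouse table.
    target.extend(features)
    return len(features)

raw = [{"customer": "a", "amount": 10.0},
       {"customer": "b", "amount": None},   # bad record, filtered out
       {"customer": "a", "amount": 5.0}]
warehouse = []
load(transform(extract(raw)), warehouse)
print(warehouse)
```

Keeping the three stages as separate functions is what makes pipelines like this governable: each stage can be tested, monitored, and rerun independently.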

Posted 1 week ago

Apply

13.0 - 20.0 years

35 - 70 Lacs

Bengaluru, Mumbai (All Areas)

Work from Office

Required Skills and Experience:
- 13+ years overall is a must, with 7+ years of relevant experience working on Big Data platform technologies
- Proven technical experience with Cloudera, Teradata, Databricks, MS Data Fabric, Apache Hadoop, BigQuery, and AWS big data solutions (EMR, Redshift, Kinesis, Qlik)
- Good domain experience in the BFSI or Manufacturing area
- Excellent communication skills to engage with clients and influence decisions
- High level of competence in preparing architectural documentation and presentations
- Must be organized and self-sufficient, and able to manage multiple initiatives simultaneously
- Must have the ability to coordinate with other teams independently
- Work with both internal and external stakeholders to identify business requirements and develop solutions to meet those requirements / build the opportunity

Note: If you have experience in the BFSI domain, the location will be Mumbai only. If you have experience in the Manufacturing domain, the location will be Mumbai or Bangalore only.

Interested candidates can share their updated resumes at shradha.madali@sdnaglobal.com

Posted 1 week ago

Apply

5.0 - 10.0 years

5 - 14 Lacs

Bengaluru

Work from Office

Experience: 5+ years. Location: Bengaluru. Candidates preferred: immediate joiners or those serving notice period (up to 30 days). Role: Big Data QA.

We are looking for a skilled ETL tester with proven experience in designing, developing, and validating data solutions.

PRINCIPAL ACCOUNTABILITIES
• Develop test scenarios and test cases from requirements and design specifications, and execute the test suites
• Perform system integration testing
• Work in agile teams; alongside testing, your contributions could be code review, requirements/user-story grooming, or coding where reasonable
• Provide inputs on, and work on, automation, integration, and regression testing
• Interact with stakeholders, participate in scrum meetings, and provide status on a daily basis

KNOWLEDGE AND EXPERIENCE
• 5 to 8 years of relevant work experience in software testing, primarily on database/ETL, with exposure to Big Data testing
• Hands-on experience testing Big Data applications on Azure and Cloudera
• Understanding of one or more query languages: Pig, HiveQL, etc.
• Excellent skills in writing SQL queries and good knowledge of databases (Oracle/Netezza/SQL)
• Hands-on with at least one scripting language: Java/Scala/Python, etc.
• Good to have: experience in Unix shell scripting
• Experience in agile development; knowledge of Jira
• Good analytical and communication skills
• Prior healthcare industry experience is a plus
• Flexible to work with, and quick to adopt, different technologies and tools
• Experience working in cross-functional teams and collaborating effectively with various stakeholders
• Excellent analytical, problem-solving, and communication skills to present and document technical concepts clearly
• Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field

Interested candidates, share your updated resume to the mail ID below.
Mail ID: arshitha.n@rlabsglobal.com
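A common building block in ETL testing of the kind described above is source-to-target reconciliation: confirming the migrated table holds exactly the source's rows regardless of order. A minimal pure-Python sketch (in practice both sides would be SQL query results; the sample rows are invented):

```python
import hashlib

def table_fingerprint(rows):
    """Order-independent fingerprint: row count plus XOR of per-row hashes."""
    digest = 0
    for row in rows:
        row_bytes = "|".join(str(v) for v in row).encode()
        digest ^= int(hashlib.sha256(row_bytes).hexdigest(), 16)
    return len(rows), digest

def reconcile(source_rows, target_rows):
    """True when source and target contain the same rows (any order)."""
    return table_fingerprint(source_rows) == table_fingerprint(target_rows)

src = [(1, "alice"), (2, "bob")]
tgt = [(2, "bob"), (1, "alice")]   # same data, different order
print(reconcile(src, tgt))
```

XOR-of-hashes is used here because it is order-insensitive and cheap to compute incrementally; a test suite would typically pair this whole-table check with targeted row-level comparisons for any mismatch.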

Posted 1 week ago

Apply

5.0 - 10.0 years

12 - 16 Lacs

Bengaluru

Remote

Job Summary: The Data Engineer for the Central Data Platform will be responsible for designing, developing, and maintaining the data infrastructure. This role involves working with large datasets, ensuring data quality, and implementing data integration solutions to facilitate efficient and accurate tax processing. The Data Engineer will collaborate with cross-functional teams to understand data requirements and provide insights that drive decision-making processes.

Key Responsibilities:
- Data Infrastructure Development: Design and implement scalable data pipelines and ETL processes to collect, process, and store data. Develop and maintain data models and database schemas to support the system.
- Data Integration: Integrate data from various sources, including external systems and internal databases, ensuring data consistency and reliability. Implement data validation and cleansing processes to maintain data quality.
- Data Management: Monitor and optimize data storage solutions to ensure efficient data retrieval and processing. Implement data security measures to protect sensitive tax information.
- Collaboration and Support: Work closely with data analysts, data scientists, and other stakeholders to understand data requirements and provide technical support. Collaborate with IT and software development teams to integrate data solutions into the overall system architecture.
- Performance Optimization: Identify and resolve performance bottlenecks in data processing and storage. Continuously improve data processing workflows to enhance system performance and scalability.
- Documentation and Reporting: Document data engineering processes, data models, and system configurations. Generate reports and dashboards to provide insights into data trends and system performance.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, Data Science, or a related field
- At least 5 years of proven experience as a Data Engineer, with strong expertise in SAP HANA/BW
- Strong proficiency in SQL and experience with relational databases (e.g., SAP HANA/BW, PostgreSQL, MySQL)
- Experience with big data technologies (e.g., Hadoop, Spark, Cloudera) and data processing frameworks
- Familiarity with cloud platforms (e.g., AWS, Azure, Google Cloud) and their data services
- Proficiency in programming languages such as Python, Java, or Scala
- Excellent problem-solving skills and strong analytical abilities
- Experience with data visualization tools (e.g., Tableau, Power BI) is a plus
- Hands-on experience using ETL tools
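The "data validation and cleansing" responsibility above usually reduces to a per-record check-and-normalize step before loading. A minimal pure-Python sketch, with invented field names and entirely hypothetical rules (a real pipeline would drive these from a schema definition):

```python
def cleanse(record, required=("id", "amount")):
    """Validate and normalize one record; return (record, errors).

    Records with a non-empty error list would be routed to a
    quarantine/reject table instead of the main load.
    """
    errors = []
    for field in required:
        if record.get(field) in (None, ""):
            errors.append(f"missing {field}")
    out = dict(record)
    # Type coercion: amounts arrive as strings from some sources.
    if out.get("amount") not in (None, ""):
        try:
            out["amount"] = float(out["amount"])
        except (TypeError, ValueError):
            errors.append("amount not numeric")
    return out, errors

good, errs = cleanse({"id": "t1", "amount": "42.5"})
print(good, errs)
```

Separating validation (collecting errors) from cleansing (normalizing in place) keeps rejected records explainable, which matters when the data feeds something audited like tax processing.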

Posted 1 week ago

Apply

4.0 - 7.0 years

25 - 30 Lacs

Ahmedabad

Work from Office

ManekTech is looking for a Data Engineer to join our dynamic team and embark on a rewarding career journey:
- Liaising with coworkers and clients to elucidate the requirements for each task
- Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed
- Reformulating existing frameworks to optimize their functioning
- Testing such structures to ensure that they are fit for use
- Preparing raw data for manipulation by data scientists
- Detecting and correcting errors in your work
- Ensuring that your work remains backed up and readily accessible to relevant coworkers
- Remaining up to date with industry standards and technological advancements that will improve the quality of your outputs

Posted 1 week ago

Apply

12.0 - 20.0 years

35 - 60 Lacs

Bengaluru

Work from Office

Who We Are At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward – always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities. The Role Join the innovative team at Kyndryl as a Client Technical Solutioner and unlock your potential to shape the future of technology solutions. As a key player in our organization, you will embark on an exciting journey where you get to work closely with customers, understand their unique challenges, and provide them with cutting-edge technical solutions and services. Picture yourself as a trusted advisor – collaborating directly with customers to unravel their business needs, pain points, and technical requirements. Your expertise and deep understanding of our solutions will empower you to craft tailored solutions that address their specific challenges and drive their success. Your role as a Client Technical Solutioner is pivotal in developing domain-specific solutions for our cutting-edge services and offerings. You will be at the forefront of crafting tailored domain solutions and cost cases for both simple and complex, long-term opportunities, demonstrating we meet our customers' requirements while helping them overcome their business challenges. At Kyndryl, we believe in the power of collaboration and your expertise will be essential in supporting our Technical Solutioning and Solutioning Managers during customer technology and business discussions, even at the highest levels of Business/IT Director/LOB. You will have the chance to demonstrate the value of our solutions and products, effectively communicating their business and technical benefits to decision makers and customers. 
In this role, you will thrive as you create innovative technical solutions that align with industry trends and exceed customer expectations. Your ability to collaborate seamlessly with internal stakeholders will enable you to gather the necessary documents and technical insights to deliver compelling bid submissions. Not only will you define winning cost models for deals, but you will also lead these deals to profitability, ensuring the ultimate success of both our customers and Kyndryl. You will play an essential role in contract negotiations, up to the point of signature, and facilitate a smooth engagement hand-over process. As the primary source of engagement management and solution design within your technical domain, you will compile, refine, and take ownership of final solution documents. Your technical expertise will shine through as you present these documents in a professional and concise manner, showcasing your mastery of the subject matter. You’ll have the opportunity to contribute to the growth and success of Kyndryl by standardizing our go-to-market pitches across various industries. By creating differentiated propositions that align with market requirements, you will position Kyndryl as a leader in the industry, opening new avenues of success for our customers and our organization. Join us as a Client Technical Solutioner at Kyndryl and unleash your potential to shape the future of technical solutions while enjoying a stimulating and rewarding career journey filled with innovation, collaboration, and growth. Your Future at Kyndryl Every position at Kyndryl offers a way forward to grow your career. We have opportunities that you won’t find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms. Whether you want to broaden your knowledge base or narrow your scope and specialize in a specific sector, you can find your opportunity here. 
Who You Are You’re good at what you do and possess the required experience to prove it. However, equally as important – you have a growth mindset; keen to drive your own personal and professional development. You are customer-focused – someone who prioritizes customer success in their work. And finally, you’re open and borderless – naturally inclusive in how you work with others. Required Skills and Experience 10 – 15 years (Specialist Seller / Consultant) is a must, with 3 – 4 years of relevant experience in Data. Hands-on experience in the area of Data Platforms (DWH / Data Lake) like Cloudera / Databricks / MS Data Fabric / Teradata / Apache Hadoop / BigQuery / AWS Big Data Solutions (EMR, Redshift, Kinesis) / Qlik etc. Proven past experience in modernizing legacy data / app estates and transforming them to cloud architectures. Strong understanding of data modelling and database design. Expertise in data integration and ETL processes. Knowledge of data warehousing and business intelligence concepts. Experience with data governance and data quality management. Good domain experience in the BFSI or Manufacturing area. Experience with cloud-based data platforms (e.g., AWS, Azure, GCP). Strong understanding of data integration techniques, including ETL (Extract, Transform, Load) processes, data pipelines, and data streaming using Python, Kafka for streams, PySpark, DBT, and ETL services. Understanding of and experience in data security principles – data masking / encryption etc. Knowledge of Data Governance principles and practices, including Data Quality, Data Lineage, Data Privacy, and Compliance. Knowledge of systems development, including the system development life cycle, project management approaches and requirements, design, and testing techniques. Excellent communication skills to engage with clients and influence decisions. High level of competence in preparing architectural documentation and presentations. 
Must be organized, self-sufficient, and able to manage multiple initiatives simultaneously. Must have the ability to coordinate with other teams and vendors independently. Deep knowledge of Services offerings and technical solutions in a practice. Demonstrated experience translating distinctive technical knowledge into actionable customer insights and solutions. Prior consultative selling experience. Externally recognized as an expert in the technology and/or solutioning areas, including technical certifications supporting subdomain focus area(s). Responsible for prospecting and qualifying leads, and doing the relevant product / market research independently in response to a customer’s requirement / pain point. Advising and shaping client requirements to produce high-level designs and technical solutions in response to opportunities and requirements from customers and partners. Work with both internal and external stakeholders to identify business requirements and develop solutions to meet those requirements / build the opportunity. Understand and analyze the application requirements in client RFPs. Design software applications based on the requirements within specified architectural guidelines and constraints. Lead, design, and implement proofs of concept and pilots to demonstrate the solution to clients / prospects. Being You Diversity is a whole lot more than what we look like or where we come from; it’s how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we’re not doing it single-handedly: our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you – and everyone next to you – the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That’s the Kyndryl Way. 
What You Can Expect With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees and support you and your family through the moments that matter – wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you, we want you to succeed so that together, we will all succeed. Get Referred! If you know someone that works at Kyndryl, when asked ‘How Did You Hear About Us’ during the application process, select ‘Employee Referral’ and enter your contact's Kyndryl email address.

Posted 1 week ago

Apply

5.0 - 9.0 years

12 - 17 Lacs

Noida

Work from Office

Spark/PySpark: technical hands-on data processing. Table design knowledge using Hive (similar to RDBMS knowledge). Database SQL knowledge for retrieval of data and transformation queries such as joins (full, left, right), ranking, and group by. Good communication skills. Additional skills: GitHub, Jenkins, and shell scripting would be an added advantage. Mandatory Competencies: Big Data - PySpark; Big Data - Spark; Big Data - Hadoop; Big Data - Hive; DevOps/Configuration Mgmt - Jenkins; Beh - Communication and collaboration; Database - Database Programming - SQL; DevOps/Configuration Mgmt - GitLab, GitHub, Bitbucket; DevOps/Configuration Mgmt - Basic Bash/Shell script writing
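The query patterns this role asks for (joins, ranking, group by) can be illustrated with a minimal, self-contained SQLite sketch; the table and column names below are invented for illustration, and the same SQL patterns carry over to Hive or Spark SQL:

```python
import sqlite3

# In-memory database with two hypothetical tables.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE orders (id INTEGER, customer_id INTEGER, amount REAL);
CREATE TABLE customers (id INTEGER, name TEXT);
INSERT INTO customers VALUES (1, 'Asha'), (2, 'Ravi');
INSERT INTO orders VALUES (10, 1, 250.0), (11, 1, 100.0), (12, 3, 75.0);
""")

# LEFT JOIN: keep every order, even one with no matching customer.
left = cur.execute("""
    SELECT o.id, c.name, o.amount
    FROM orders o LEFT JOIN customers c ON o.customer_id = c.id
""").fetchall()

# GROUP BY with an aggregate.
totals = cur.execute("""
    SELECT customer_id, SUM(amount) FROM orders GROUP BY customer_id
""").fetchall()

# Ranking with a window function (SQLite >= 3.25).
ranked = cur.execute("""
    SELECT id, amount, RANK() OVER (ORDER BY amount DESC) FROM orders
""").fetchall()

print(left)    # order 12 joins to a NULL name (no matching customer)
print(totals)
print(ranked)
```

Note that full and right outer joins, also named in the posting, follow the same shape but require a newer SQLite (3.39+) or an engine like Hive that has always supported them.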

Posted 1 week ago

Apply

5.0 - 8.0 years

4 - 7 Lacs

Mumbai

Work from Office

Excellent knowledge of Spark; the professional must have a thorough understanding of the Spark framework, performance tuning, etc. Excellent knowledge and hands-on experience of at least 4+ years in Scala and PySpark. Excellent knowledge of the Hadoop ecosystem; knowledge of Hive is mandatory. Strong Unix and shell scripting skills. Excellent interpersonal skills and, for experienced candidates, excellent leadership skills. Good knowledge of any of the CSPs (Azure, AWS, or GCP) is mandatory; certifications on Azure will be an additional plus. Mandatory Skills: PySpark. Experience: 5-8 Years.

Posted 2 weeks ago

Apply

2.0 - 4.0 years

5 - 8 Lacs

Gurugram

Work from Office

Role description Do You Make The Cut? FreeCharge is looking for a passionate, independent and results-driven Technical Program Management professional. As a program professional you will be in a unique position to experience the best of both product and engineering functions. You will drive programs, execute and launch products that will impact millions of customers for the largest mobile commerce company in India. The program manager is expected to work in an extremely ambiguous and fast-paced environment and is expected to think on their feet to facilitate smooth and flawless product launches. Education: Bachelors / Masters in Software Engineering Responsibilities The Program Manager manages product releases from concept to successful launch. Work closely with product and engineering teams and other cross-functional teams from concept through development to launch. Plan and deliver a flawless product by coordinating with various engineering functions such as, but not limited to, development, QA and release management. Prepare detailed project plans and generate appropriate metrics to assist with decision making. Track milestones, build consensus, resolve conflicts, prepare risk mitigation plans and strategies, and keep teams focused and aligned on delivery. Enable clear, concise and transparent information sharing across all stakeholders. Proactively identify and clear bottlenecks, carry out escalation management in a timely manner, make tradeoffs, and balance business needs versus technical constraints. De-risk the product launch through contingency plans, iterative execution or out-of-the-box solutions. Identify engineering and product process deficiencies and work with teams to eliminate them. Sounds Like You? Overall 2-4 years of software development experience and at least 4+ years of Technical Program Management experience in the software industry. Bachelor's degree in Engineering, Computer Science or related technical field. 
Hands-on technical design and architecture experience. Analytical thinking skills and the ability to see the bigger picture. Experience with Agile methodologies and tools. Experience with tools such as Excel, PowerPoint, SharePoint. A proven track record of delivering initiatives from conception through completion on time, within budget, and on or beyond scope. Ability to work under tight deadlines in a high-pressure environment and adjust to multiple demands. Excellent oral and written communication skills with both technical and non-technical individuals. Excellent people skills are required. Experience building stream-processing systems using solutions such as Storm or Spark Streaming. Experience with the integration of data from multiple data sources. Experience with SQL and NoSQL databases. Experience with various messaging systems, such as Kafka or RabbitMQ. Experience with Confluent components such as Kafka Connect and Schema Registry. Experience with Cloudera/MapR/Hortonworks. Experience with AWS big data solutions. Good knowledge of Big Data querying tools, such as Pig, Hive, or Impala. Knowledge of various ETL techniques and frameworks, such as Flume.
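The windowed aggregations that stream processors like Storm or Spark Streaming perform over a Kafka topic can be illustrated with a minimal pure-Python sketch; the event data and the tumbling-window helper below are invented for illustration, not part of any of those frameworks:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs):
    """Group (timestamp, key) events into fixed-size tumbling windows and
    count occurrences per key, mimicking the windowed counts a
    Storm or Spark Streaming job would compute over a stream."""
    counts = defaultdict(int)
    for ts, key in events:
        # Align each event to the start of its window.
        window_start = (ts // window_secs) * window_secs
        counts[(window_start, key)] += 1
    return dict(counts)

# Hypothetical click events: (epoch seconds, user id).
events = [(100, "u1"), (102, "u1"), (104, "u2"), (161, "u1")]
print(tumbling_window_counts(events, 60))
# 60s windows: [60,120) sees u1 twice and u2 once; [120,180) sees u1 once
```

A real streaming job would consume the events incrementally and emit windows as they close, but the per-window grouping logic is the same.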

Posted 2 weeks ago

Apply

3.0 - 6.0 years

5 - 8 Lacs

Pune

Work from Office

We are seeking a skilled ML Platform Engineer, responsible for automating, deploying, patching, and maintaining our machine learning platform infrastructure. The ideal candidate will have hands-on experience with Cloudera Data Science Workbench (CDSW), Cloudera Data Platform (CDP), Docker, Kubernetes, Python, Ansible, GitLab, and MLOps best practices. Responsibilities Automate deployment and management processes for machine learning platforms using tools such as Ansible and Python. Deploy, monitor, and patch ML platform components, including Cloudera Data Science Workbench (CDSW), Docker containers, and Kubernetes clusters. Ensure high availability and reliability of ML infrastructure through proactive maintenance and regular updates. Develop and maintain comprehensive documentation for platform configurations, processes, and procedures. Troubleshoot and resolve platform issues, ensuring minimal downtime and optimal performance. Implement best practices for security, scalability, and automation within the ML platform ecosystem. Skills Must have Experience with CDSW or similar data science platforms. Proficiency in containerization and orchestration using Docker and Kubernetes. Solid scripting and automation skills in Python and Ansible. Experience with GitLab for source control and CI/CD automation. Understanding of MLOps principles and practices. Familiarity with patching, updating, and maintaining platform infrastructure. Profound Unix knowledge Excellent problem-solving skills and a collaborative approach to team projects. 
Strong experience with the Python programming language in developing enterprise-level applications Proficient in designing, developing, and maintaining distributed systems and services Experience with the Ansible automation tool for platform IaC, deployment automation, and configuration management Strong problem-solving skills with the ability to handle complex systems Good communication and teamwork skills; able to work independently and collaboratively with other teams to deliver quality software solutions Fluent in English with the ability to explain complex concepts in simple terms Nice to have Agile environment Previous banking domain experience Other Languages English: C1 Advanced Seniority Senior

Posted 2 weeks ago

Apply

3.0 - 8.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Project Role : Application Lead Project Role Description : Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have skills : Palantir Foundry Good to have skills : NA Minimum 3 year(s) of experience is required Educational Qualification : 15 years full time education Summary : As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various stakeholders to gather requirements, overseeing the development process, and ensuring that the applications meet the specified needs. You will also engage in problem-solving discussions with your team, providing guidance and support to ensure successful project outcomes. Additionally, you will monitor project progress, address any challenges that arise, and facilitate communication among team members to foster a productive work environment. Roles & Responsibilities: - Expected to perform independently and become an SME. - Required active participation/contribution in team discussions. - Contribute in providing solutions to work related problems. - Facilitate knowledge sharing sessions to enhance team capabilities. - Mentor junior professionals to support their growth and development. Professional & Technical Skills: - Must To Have Skills: Proficiency in Palantir Foundry. - Strong understanding of application design and development principles. - Experience with data integration and management within Palantir Foundry. - Ability to troubleshoot and resolve application-related issues effectively. - Familiarity with agile methodologies and project management practices. Additional Information: - The candidate should have minimum 3 years of experience in Palantir Foundry. - This position is based at our Bengaluru office. - A 15 years full time education is required. Qualification 15 years full time education

Posted 2 weeks ago

Apply

5.0 - 10.0 years

14 - 17 Lacs

Pune

Work from Office

As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows for Source to Target and implementing solutions that tackle the client's needs. Your primary responsibilities include: Design, build, optimize and support new and existing data models and ETL processes based on our clients' business requirements. Build, deploy and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization. Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to. Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise Must have 5+ years experience in Big Data: Hadoop, Spark, Scala, Python, HBase, Hive. Good to have: AWS - S3, Athena, DynamoDB, Lambda, Jenkins, Git. Developed Python and PySpark programs for data analysis. Good working experience with Python to develop a custom framework for generating rules (like a rules engine). Developed Python code to gather the data from HBase and designed the solution to implement using PySpark. Apache Spark DataFrames/RDDs were used to apply business transformations, and Hive Context objects were utilized to perform read/write operations. Preferred technical and professional experience Understanding of DevOps. Experience in building scalable end-to-end data ingestion and processing solutions Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala
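The rules-engine style framework this posting mentions can be sketched in plain Python; the rule names and record fields below are hypothetical, and in practice each rule would typically be applied as a transformation over Spark DataFrames rather than plain dicts:

```python
# Each rule pairs a name with a predicate over one record; failing the
# predicate means the record violates that rule.
RULES = [
    ("amount_positive", lambda r: r.get("amount", 0) > 0),
    ("has_account_id", lambda r: bool(r.get("account_id"))),
]

def apply_rules(record, rules=RULES):
    """Return the list of rule names the record violates."""
    return [name for name, check in rules if not check(record)]

records = [
    {"account_id": "A1", "amount": 120.5},
    {"account_id": "", "amount": -3.0},
]
print([apply_rules(r) for r in records])
# → [[], ['amount_positive', 'has_account_id']]
```

Keeping rules as data (name plus predicate) is what makes such a framework "generating": new rules can be added or loaded from configuration without touching the evaluation loop.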

Posted 2 weeks ago

Apply

10.0 - 14.0 years

8 - 13 Lacs

Navi Mumbai

Work from Office

Skill required: Network Billing Operations - Problem Management Designation: Network & Svcs Operation Assoc Manager Qualifications: Any Graduation Years of Experience: 10 to 14 years About Accenture Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. Visit us at www.accenture.com What would you do Helps transform back office and network operations, reduce time to market and grow revenue, by improving customer experience and capex efficiency, and reducing cost-to-serve. Good customer support experience preferred, with good networking knowledge. Manage problems caused by information technology infrastructure errors to minimize their adverse impact on business and to prevent their recurrence by seeking the root cause of those incidents and initiating actions to improve or correct the situation. 
What are we looking for 5 years of programming skills (advanced level) in relation to responsibility for maintenance of existing and creation of new queries via SQL scripts; Python and PySpark programming skills; experience with Databricks; Palantir is an advantage; direct active participation on GenAI and Machine Learning projects. Other skills: Desire to learn and understand data models and billing processes Critical thinking Experience with reporting and metrics; strong numerical skills Experience in expense, billing, or financial management Experience in process/system management Good organizational skills, self-disciplined, systematic approach with good interpersonal skills Flexible, analytical mind, problem solver Knowledge of Telecom Products and Services Roles and Responsibilities: In this role you are required to do analysis and solving of moderately complex problems Typically creates new solutions, leveraging and, where needed, adapting existing methods and procedures The person requires understanding of the strategic direction set by senior management as it relates to team goals Primary upward interaction is with the direct supervisor or team leads Generally interacts with peers and/or management levels at a client and/or within Accenture The person should require minimal guidance when determining methods and procedures on new assignments Decisions often impact the team in which they reside and occasionally impact other teams The individual would manage medium-small sized teams and/or work efforts (if in an individual contributor role) at a client or within Accenture Please note that this role may require you to work in rotational shifts Qualification Any Graduation

Posted 2 weeks ago

Apply

8.0 - 13.0 years

5 - 9 Lacs

Mumbai

Work from Office

Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Data Engineering Good to have skills : NA Minimum 7.5 year(s) of experience is required Educational Qualification : 15 years full time education Summary : We are looking for a highly skilled and experienced ETL Lead with a strong background in Data Warehousing, ETL processes, and Big Data technologies. The ideal candidate should have hands-on experience with Cloudera tools, excellent SQL/PLSQL skills, and the ability to lead a team while working closely with clients in the Banking/Insurance domain. Roles & Responsibilities: Design, develop, and maintain end-to-end Data Warehouse and ETL solutions. Work extensively with SQL, PLSQL, Oracle, Hadoop, and Cloudera Data Platform tools (Spark, Hive, Impala). Lead a team of 4+ engineers, guiding both technical and operational tasks. Manage daily operations and provide L1/L2/L3 support, ensuring no SLA breaches. Collaborate with clients to understand requirements and provide scalable, reliable solutions. Conduct team meetings, client discussions, and knowledge transfer (KT) sessions. Utilize shell scripting and Unix commands, and have working knowledge of Java/Python for task automation and integration. Ensure the team follows best practices in coding, documentation, and support. Stay up to date with new technologies and demonstrate a willingness to learn as per client needs. Be available to work from the client location and travel daily as required. Proficient in analysing and rewriting legacy systems. Solid team leadership skills with experience managing 4+ people. Excellent communication and interpersonal skills. Demonstrated ability to manage operations and support teams efficiently. Willingness to work from the client location and travel as required. Proactive and positive attitude toward learning and adopting new technologies. Strong analytical skills and 
solution-oriented mindset.Ability to work under minimal supervision in a fast-paced environment. Professional & Technical Skills: 8+ years of experience in Data Warehousing and ETL Technologies.Strong expertise in SQL, PLSQL, Oracle, and Hadoop ecosystem.Hands-on experience with Spark, Hive, Impala on Cloudera Data Platform.Strong understanding of the Banking/Insurance domain.Understand legacy code and translate it effectively to modern technologies. Additional Information:- The candidate should have minimum 7.5 years of experience in Data Engineering.- This position is based in Mumbai.- A 15 years full time education is required. Qualification 15 years full time education

Posted 2 weeks ago

Apply

5.0 - 10.0 years

10 - 15 Lacs

Bengaluru

Work from Office

Overall Responsibilities: Data Pipeline Development: Design, develop, and maintain highly scalable and optimized ETL pipelines using PySpark on the Cloudera Data Platform, ensuring data integrity and accuracy. Data Ingestion: Implement and manage data ingestion processes from a variety of sources (e.g., relational databases, APIs, file systems) to the data lake or data warehouse on CDP. Data Transformation and Processing: Use PySpark to process, cleanse, and transform large datasets into meaningful formats that support analytical needs and business requirements. Performance Optimization: Conduct performance tuning of PySpark code and Cloudera components, optimizing resource utilization and reducing runtime of ETL processes. Data Quality and Validation: Implement data quality checks, monitoring, and validation routines to ensure data accuracy and reliability throughout the pipeline. Automation and Orchestration: Automate data workflows using tools like Apache Oozie, Airflow, or similar orchestration tools within the Cloudera ecosystem. Monitoring and Maintenance: Monitor pipeline performance, troubleshoot issues, and perform routine maintenance on the Cloudera Data Platform and associated data processes. Collaboration: Work closely with other data engineers, analysts, product managers, and other stakeholders to understand data requirements and support various data-driven initiatives. Documentation: Maintain thorough documentation of data engineering processes, code, and pipeline configurations. Software Requirements: Advanced proficiency in PySpark, including working with RDDs, DataFrames, and optimization techniques. Strong experience with Cloudera Data Platform (CDP) components, including Cloudera Manager, Hive, Impala, HDFS, and HBase. Knowledge of data warehousing concepts, ETL best practices, and experience with SQL-based tools (e.g., Hive, Impala). Familiarity with Hadoop, Kafka, and other distributed computing tools. 
Experience with Apache Oozie, Airflow, or similar orchestration frameworks. Strong scripting skills in Linux. Category-wise Technical Skills: PySpark: Advanced proficiency in PySpark, including working with RDDs, DataFrames, and optimization techniques. Cloudera Data Platform: Strong experience with Cloudera Data Platform (CDP) components, including Cloudera Manager, Hive, Impala, HDFS, and HBase. Data Warehousing: Knowledge of data warehousing concepts, ETL best practices, and experience with SQL-based tools (e.g., Hive, Impala). Big Data Technologies: Familiarity with Hadoop, Kafka, and other distributed computing tools. Orchestration and Scheduling: Experience with Apache Oozie, Airflow, or similar orchestration frameworks. Scripting and Automation: Strong scripting skills in Linux. Experience: 5-12 years of experience as a Data Engineer, with a strong focus on PySpark and the Cloudera Data Platform. Proven track record of implementing data engineering best practices. Experience in data ingestion, transformation, and optimization on the Cloudera Data Platform. Day-to-Day Activities: Design, develop, and maintain ETL pipelines using PySpark on CDP. Implement and manage data ingestion processes from various sources. Process, cleanse, and transform large datasets using PySpark. Conduct performance tuning and optimization of ETL processes. Implement data quality checks and validation routines. Automate data workflows using orchestration tools. Monitor pipeline performance and troubleshoot issues. Collaborate with team members to understand data requirements. Maintain documentation of data engineering processes and configurations. Qualifications: Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field. Relevant certifications in PySpark and Cloudera technologies are a plus. Soft Skills: Strong analytical and problem-solving skills. Excellent verbal and written communication abilities. 
Ability to work independently and collaboratively in a team environment. Attention to detail and commitment to data quality. SYNECHRON'S DIVERSITY & INCLUSION STATEMENT Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and is an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative Same Difference is committed to fostering an inclusive culture promoting equality, diversity and an environment that is respectful to all. We strongly believe that a diverse workforce helps build stronger, successful businesses as a global company. We encourage applicants from across diverse backgrounds, race, ethnicities, religion, age, marital status, gender, sexual orientations, or disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more. All employment decisions at Synechron are based on business needs, job requirements and individual qualifications, without regard to the applicant's gender, gender identity, sexual orientation, race, ethnicity, disabled or veteran status, or any other characteristic protected by law. Candidate Application Notice
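The data quality checks and validation routines described in this listing can be sketched, independent of any pipeline engine, in plain Python; the column names and thresholds below are hypothetical, and a real CDP pipeline would express the same checks over PySpark DataFrames:

```python
def run_quality_checks(rows):
    """Run simple data-quality checks of the kind an ETL pipeline would
    apply before loading: uniqueness, null checks, and range validation."""
    issues = []
    ids = [r["id"] for r in rows]
    if len(ids) != len(set(ids)):
        issues.append("duplicate ids")
    if any(r["event_date"] is None for r in rows):
        issues.append("null event_date")
    if any(not (0 <= r["score"] <= 100) for r in rows):
        issues.append("score out of range")
    return issues

rows = [
    {"id": 1, "event_date": "2024-01-01", "score": 97},
    {"id": 1, "event_date": None, "score": 120},
]
print(run_quality_checks(rows))
# → ['duplicate ids', 'null event_date', 'score out of range']
```

In a production pipeline the returned issue list would typically feed monitoring or quarantine logic rather than a print statement, so bad batches are flagged instead of silently loaded.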

Posted 2 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies