
156 Distributed Computing Jobs

JobPe aggregates listings so they are easy to find in one place; applications are submitted directly on the original job portal.

4.0 - 8.0 years

25 - 30 Lacs

Pune

Hybrid

So, what's the role all about?
As a Data Engineer, you will be responsible for designing, building, and maintaining large-scale data systems, as well as working with cross-functional teams to ensure efficient data processing and integration. You will leverage your knowledge of Apache Spark to create robust ETL processes, optimize data workflows, and manage high volumes of structured and unstructured data.

How will you make an impact?
- Design, implement, and maintain data pipelines using Apache Spark for processing large datasets.
- Work with data engineering teams to optimize data workflows for performance and scalability.
- Integrate data from various sources, ensuring clean, reliable, and high-quality data for analysis.
- Develop and maintain data models, databases, and data lakes.
- Build and manage scalable ETL solutions to support business intelligence and data science initiatives.
- Monitor and troubleshoot data processing jobs, ensuring they run efficiently and effectively.
- Collaborate with data scientists, analysts, and other stakeholders to understand business needs and deliver data solutions.
- Implement data security best practices to protect sensitive information.
- Maintain a high level of data quality and ensure timely delivery of data to end-users.
- Continuously evaluate new technologies and frameworks to improve data engineering processes.

Have you got what it takes?
- 8-11 years of experience as a Data Engineer, with a strong focus on Apache Spark and big data technologies.
- Expertise in Spark SQL, DataFrames, and RDDs for data processing and analysis.
- Proficiency in programming languages such as Python, Scala, or Java for data engineering tasks.
- Hands-on experience with cloud platforms like AWS, specifically with data processing and storage services (e.g., S3, BigQuery, Redshift, Databricks).
- Experience with ETL frameworks and tools such as Apache Kafka, Airflow, or NiFi.
- Strong knowledge of data warehousing concepts and technologies (e.g., Redshift, Snowflake, BigQuery).
- Familiarity with containerization technologies like Docker and Kubernetes.
- Knowledge of SQL and relational databases, with the ability to design and query databases effectively.
- Solid understanding of distributed computing, data modeling, and data architecture principles.
- Strong problem-solving skills and the ability to work with large and complex datasets.
- Excellent communication and collaboration skills to work effectively with cross-functional teams.

You will have an advantage if you also have:
- Knowledge of SQL and relational databases, with the ability to design and query databases effectively.
- Solid understanding of distributed computing, data modeling, and data architecture principles.
- Strong problem-solving skills and the ability to work with large and complex datasets.

What's in it for you?
Join an ever-growing, market-disrupting, global company where the teams, comprised of the best of the best, work in a fast-paced, collaborative, and creative environment! As the market leader, every day at NiCE is a chance to learn and grow, and there are endless internal career opportunities across multiple roles, disciplines, domains, and locations. If you are passionate, innovative, and excited to constantly raise the bar, you may just be our next NiCEr!

Enjoy NiCE-FLEX!
At NiCE, we work according to the NiCE-FLEX hybrid model, which enables maximum flexibility: 2 days working from the office and 3 days of remote work, each week. Naturally, office days focus on face-to-face meetings, where teamwork and collaborative thinking generate innovation, new ideas, and a vibrant, interactive atmosphere.

Requisition ID: 7235
Reporting into: Tech Manager
Role Type: Individual Contributor
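The ETL responsibilities listed above all follow the same extract-transform-load shape. As a rough, framework-agnostic sketch, with plain Python standing in for Spark DataFrames (all function and field names here are illustrative, not from the posting):

```python
# Minimal extract-transform-load pipeline sketch. A real Spark job
# would express transform() as DataFrame operations; plain Python
# shows the same three-stage structure.

def extract(rows):
    """Pull raw records from a source (here, an in-memory list)."""
    return list(rows)

def transform(records):
    """Clean and reshape: drop incomplete rows, normalise fields."""
    return [
        {"name": r["name"].strip().title(), "amount": float(r["amount"])}
        for r in records
        if r.get("name") and r.get("amount") is not None
    ]

def load(records, sink):
    """Write cleaned records to a sink (here, a list acting as a table)."""
    sink.extend(records)
    return len(records)

raw = [
    {"name": "  alice ", "amount": "10.5"},
    {"name": None, "amount": "3"},   # dropped: missing name
    {"name": "bob", "amount": "2"},
]
table = []
loaded = load(transform(extract(raw)), table)
print(loaded)            # 2
print(table[0]["name"])  # Alice
```

The same pipeline shape scales up when each stage is swapped for a distributed equivalent (S3 reads, Spark transformations, warehouse writes).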

Posted 1 hour ago


4.0 - 7.0 years

6 - 9 Lacs

Gurugram

Work from Office

A Data Engineer specializing in enterprise data platforms, experienced in building, managing, and optimizing data pipelines for large-scale environments, with expertise in big data technologies, distributed computing, data ingestion, and transformation frameworks. Proficient in Apache Spark, PySpark, Kafka, and Iceberg tables, with an understanding of how to design and implement scalable, high-performance data processing solutions.

What you'll do
As a Data Engineer – Data Platform Services, responsibilities include:

Data Ingestion & Processing
- Designing and developing data pipelines to migrate workloads from IIAS to Cloudera Data Lake.
- Implementing streaming and batch data ingestion frameworks using Kafka and Apache Spark (PySpark).
- Working with IBM CDC and Universal Data Mover to manage data replication and movement.

Big Data & Data Lakehouse Management
- Implementing Apache Iceberg tables for efficient data storage and retrieval.
- Managing distributed data processing with Cloudera Data Platform (CDP).
- Ensuring data lineage, cataloging, and governance for compliance with bank/regulatory policies.

Optimization & Performance Tuning
- Optimizing Spark and PySpark jobs for performance and scalability.
- Implementing data partitioning, indexing, and caching to enhance query performance.
- Monitoring and troubleshooting pipeline failures and performance bottlenecks.

Security & Compliance
- Ensuring secure data access, encryption, and masking using Thales CipherTrust.
- Implementing role-based access controls (RBAC) and data governance policies.
- Supporting metadata management and data quality initiatives.

Collaboration & Automation
- Working closely with Data Scientists, Analysts, and DevOps teams to integrate data solutions.
- Automating data workflows using Airflow and implementing CI/CD pipelines with GitLab and Sonatype Nexus.
- Supporting Denodo-based data virtualization for seamless data access.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
- 4-7 years of experience in big data engineering, data integration, and distributed computing.
- Strong skills in Apache Spark, PySpark, Kafka, SQL, and Cloudera Data Platform (CDP).
- Proficiency in Python or Scala for data processing.
- Experience with data pipeline orchestration tools (Apache Airflow, Stonebranch UDM).
- Understanding of data security, encryption, and compliance frameworks.

Preferred technical and professional experience
- Experience in banking or financial services data platforms.
- Exposure to Denodo for data virtualization and DGraph for graph-based insights.
- Familiarity with cloud data platforms (AWS, Azure, GCP).
- Certifications in Cloudera Data Engineering, IBM Data Engineering, or AWS Data Analytics.
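The "data partitioning ... to enhance query performance" item above works because a partitioned layout lets a query read only the partitions that match its filter. A toy stdlib sketch of that pruning idea, with dicts standing in for Iceberg/Hive partition files (all names are illustrative):

```python
from collections import defaultdict

# Toy partition pruning: bucket rows by a partition key at write time
# so a key-filtered query touches one bucket instead of the whole
# table. Table formats like Iceberg do this at the file level.

def write_partitioned(rows, key):
    """Group rows into partitions keyed by the given column."""
    partitions = defaultdict(list)
    for row in rows:
        partitions[row[key]].append(row)
    return partitions

def query(partitions, key_value):
    """Prune: read only the matching partition, not every row."""
    return partitions.get(key_value, [])

rows = [
    {"dt": "2024-01-01", "amount": 5},
    {"dt": "2024-01-02", "amount": 7},
    {"dt": "2024-01-01", "amount": 3},
]
parts = write_partitioned(rows, "dt")
hits = query(parts, "2024-01-01")
print(len(hits))                       # 2
print(sum(r["amount"] for r in hits))  # 8
```

The choice of partition key matters: it should match the most common query filter (dates, here), or pruning never kicks in.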

Posted 23 hours ago



15.0 - 20.0 years

17 - 22 Lacs

Mumbai

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Apache Spark
Good to have skills: Java Enterprise Edition
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary:
As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, participate in team meetings, and contribute to the overall success of the projects you are involved in, ensuring that the applications you develop are efficient and effective in meeting user needs.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Continuously evaluate and improve application performance and user experience.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Apache Spark.
- Good To Have Skills: Experience with Java Enterprise Edition.
- Strong understanding of distributed computing principles.
- Experience with data processing frameworks and tools.
- Familiarity with cloud platforms and services.

Additional Information:
- The candidate should have minimum 5 years of experience in Apache Spark.
- This position is based in Mumbai.
- A 15 years full time education is required.

Posted 23 hours ago


2.0 - 7.0 years

4 - 9 Lacs

Pune, Gurugram

Work from Office

ZS is a place where passion changes lives. As a management consulting and technology firm focused on improving life and how we live it, our most valuable asset is our people. Here you'll work side-by-side with a powerful collective of thinkers and experts shaping life-changing solutions for patients, caregivers, and consumers worldwide. ZSers drive impact by bringing a client-first mentality to each and every engagement. We partner collaboratively with our clients to develop custom solutions and technology products that create value and deliver company results across critical areas of their business. Bring your curiosity for learning, bold ideas, courage, and passion to drive life-changing impact to ZS.

At ZS we honor the visible and invisible elements of our identities, personal experiences, and belief systems, the ones that comprise us as individuals, shape who we are, and make us unique. We believe your personal interests, identities, and desire to learn are part of your success here. Learn more about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about.
What You'll Do
- Collaborate with client-facing teams to understand solution context and contribute to technical requirement gathering and analysis.
- Design and implement technical features leveraging best practices for the technology stack being used.
- Work with technical architects on the team to validate design and implementation approach.
- Write production-ready code that is easily testable, understood by other developers, and accounts for edge cases and errors.
- Ensure the highest quality of deliverables by following architecture/design guidelines, coding best practices, and periodic design/code reviews.
- Write unit tests as well as higher-level tests to handle expected edge cases and errors gracefully, as well as happy paths.
- Use bug tracking, code review, version control, and other tools to organize and deliver work.
- Participate in scrum calls and agile ceremonies, and effectively communicate work progress, issues, and dependencies.
- Consistently contribute to researching and evaluating the latest technologies through rapid learning, conducting proofs of concept, and creating prototype solutions.

What You'll Bring
- 2+ years of relevant hands-on experience.
- A CS foundation is a must.
- Strong command of a distributed computing framework like Spark (preferred) or others.
- Strong analytical/problem-solving skills.
- Ability to quickly learn and become hands-on with new technology and be innovative in creating solutions.
- Strength in at least one programming language (Python, Java, Scala, etc.) and programming basics (data structures).
- Hands-on experience in building modules for data management solutions such as data pipelines, orchestration, and ingestion patterns (batch, real time).
- Experience in designing and implementing solutions on distributed computing and cloud services platforms, including (but not limited to) AWS, Azure, and GCP.
- Good understanding of RDBMS; some ETL experience is preferred.

Additional Skills:
- Understanding of DevOps, CI/CD, and data security; experience designing on the AWS cloud platform.
- AWS Solutions Architect certification with understanding of the broader AWS stack.
- Knowledge of data modeling and data warehouse concepts.
- Willingness to travel to other global offices as needed to work with clients or other internal project teams.

Perks & Benefits
ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth, and professional development. Our robust skills development programs, multiple career progression options, internal mobility paths, and collaborative culture empower you to thrive as an individual and global team member. We are committed to giving our employees a flexible and connected way of working. A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections.

Travel
Travel is a requirement at ZS for client-facing ZSers; business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed. Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures.
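The "ingestion patterns (batch, real time)" requirement above can be illustrated with a minimal sketch: batch ingestion collects bounded chunks and processes each chunk at once, while streaming handles records one by one as they arrive. Plain-Python generators stand in for Kafka topics or file drops; all names are illustrative:

```python
# Sketch of the two ingestion patterns: batch vs streaming.

def batch_ingest(source, batch_size):
    """Collect records into fixed-size batches, flushing the remainder."""
    batch = []
    for record in source:
        batch.append(record)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:
        yield batch  # partial final batch

def stream_ingest(source, handler):
    """Apply a handler to each record as it arrives."""
    return [handler(record) for record in source]

events = range(7)
batches = list(batch_ingest(events, 3))
print([len(b) for b in batches])                   # [3, 3, 1]
print(stream_ingest([1, 2, 3], lambda x: x * 10))  # [10, 20, 30]
```

The trade-off the patterns encode: batch amortizes per-record overhead and suits bulk loads; streaming minimizes latency and suits real-time feeds.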
Considering applying?
At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above.

To Complete Your Application
Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered.

Posted 3 days ago


5.0 - 10.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Apache Spark
Good to have skills: Oracle Procedural Language Extensions to SQL (PLSQL), AWS Architecture
Minimum 5 year(s) of experience is required
Educational Qualification: Mandatory 15 years of full-time qualification

Summary:
As an Application Lead, you will be responsible for designing, building, and configuring applications using Apache Spark. Your typical day will involve leading the effort to design and build applications, acting as the primary point of contact, and utilizing your expertise in Apache Spark to deliver impactful solutions.

Roles & Responsibilities:
- Lead the effort to design, build, and configure applications using Apache Spark.
- Act as the primary point of contact for the project, collaborating with cross-functional teams to ensure successful delivery of applications.
- Utilize your expertise in Apache Spark to develop and deploy advanced data processing pipelines, including experience with Spark Streaming and Spark SQL.
- Design and implement scalable and fault-tolerant solutions using Apache Spark, ensuring high performance and reliability.
- Stay updated with the latest advancements in Apache Spark and related technologies, integrating innovative approaches for sustained competitive advantage.

Professional & Technical Skills:
- Must To Have Skills: Expertise in Apache Spark.
- Good To Have Skills: Experience with Oracle Procedural Language Extensions to SQL (PLSQL) and AWS Architecture.
- Strong understanding of distributed computing principles and experience with distributed data processing frameworks.
- Experience with data processing pipelines, including Spark Streaming and Spark SQL.
- Experience designing and implementing scalable and fault-tolerant solutions using Apache Spark.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Apache Spark.
- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful data-driven solutions.
- This position is based at our Gurugram office.
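The Spark Streaming work mentioned above centers on windowed aggregation: aggregating only the events that fall inside a moving time window. A toy single-process sketch of that idea, using integer timestamps (Spark additionally distributes, checkpoints, and handles late data, all of which this sketch omits; names are illustrative):

```python
from collections import deque

# Running sum over a sliding time window: keep a buffer of
# in-window events, evict anything older than `window` time units.

def windowed_sum(events, window):
    """events: (timestamp, value) pairs in time order.
    Returns the in-window sum observed after each event."""
    buf = deque()
    total = 0
    out = []
    for ts, value in events:
        buf.append((ts, value))
        total += value
        # Evict events that fell out of the window (ts' <= ts - window).
        while buf and buf[0][0] <= ts - window:
            total -= buf.popleft()[1]
        out.append(total)
    return out

events = [(1, 10), (2, 20), (3, 30), (6, 5)]
print(windowed_sum(events, 3))  # [10, 30, 60, 5]
```

At t=6 the first three events have aged out of the 3-unit window, so only the newest value remains in the sum.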

Posted 4 days ago


5.0 - 10.0 years

4 - 8 Lacs

Chennai

Work from Office

Project Role: Software Development Engineer
Project Role Description: Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work.
Must have skills: PySpark
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: An Engineering graduate, preferably in computer science, with 15 years of full-time education

Summary:
As a Software Development Engineer, you will analyze, design, code, and test multiple components of application code across one or more clients. You will also perform maintenance, enhancements, and/or development work in a dynamic environment.

Roles & Responsibilities:
- Expected to be an SME; collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Lead the design and implementation of software solutions.
- Conduct code reviews and provide constructive feedback to team members.
- Participate in architectural discussions and propose innovative solutions.
- Mentor junior team members and help in their professional growth.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in PySpark.
- Strong understanding of distributed computing and big data processing.
- Experience with cloud platforms such as AWS or Azure.
- Hands-on experience in building scalable and efficient data pipelines.
- Knowledge of SQL and NoSQL databases.

Additional Information:
- The candidate should have a minimum of 5 years of experience in PySpark.
- This position is based at our Chennai office.
- An Engineering graduate, preferably in computer science, with 15 years of full-time education is required.

Posted 4 days ago


3.0 - 8.0 years

4 - 8 Lacs

Pune

Work from Office

Project Role: Software Development Engineer
Project Role Description: Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work.
Must have skills: PySpark
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: An Engineering graduate, preferably in computer science, with 15 years of full-time education

Summary:
As a Software Development Engineer, you will analyze, design, code, and test multiple components of application code across one or more clients. You will also perform maintenance, enhancements, and/or development work. This role requires a strong understanding of software development principles and the ability to work independently and as part of a team. You will have the opportunity to contribute to the success of our clients by delivering high-quality software solutions.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute in providing solutions to work-related problems.
- Collaborate with cross-functional teams to analyze, design, and develop software solutions.
- Write clean, efficient, and maintainable code that meets the project requirements.
- Perform unit testing and debugging to ensure the quality and stability of the software.
- Participate in code reviews to provide feedback and ensure adherence to coding standards.
- Identify and resolve technical issues and bugs in a timely manner.
- Stay up-to-date with the latest industry trends and technologies to continuously improve skills and knowledge.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in PySpark.
- Good To Have Skills: Experience with data processing frameworks like Apache Spark.
- Strong understanding of software development principles and best practices.
- Experience with distributed computing and parallel processing.
- Knowledge of SQL and relational databases.
- Familiarity with version control systems like Git.
- Excellent problem-solving and analytical skills.
- Ability to work in a fast-paced and dynamic environment.

Additional Information:
- The candidate should have a minimum of 3 years of experience in PySpark.
- This position is based at our Pune office.
- An Engineering graduate, preferably in computer science, with 15 years of full-time education is required.

Posted 4 days ago


3.0 - 8.0 years

5 - 9 Lacs

Chennai

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Apache Hadoop
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary:
As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that the applications function smoothly and efficiently. You will also engage in testing and troubleshooting to enhance application performance and user experience, while continuously seeking ways to improve processes and deliver high-quality solutions.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute in providing solutions to work-related problems.
- Assist in the documentation of application processes and workflows.
- Engage in code reviews to ensure best practices and quality standards are met.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Apache Hadoop.
- Strong understanding of distributed computing principles and frameworks.
- Experience with data processing and analysis using Hadoop ecosystem tools.
- Familiarity with programming languages such as Java or Python.
- Knowledge of data storage solutions and data management best practices.

Additional Information:
- The candidate should have minimum 3 years of experience in Apache Hadoop.
- This position is based at our Chennai office.
- A 15 years full time education is required.

Posted 4 days ago


15.0 - 20.0 years

5 - 9 Lacs

Navi Mumbai

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Apache Spark
Good to have skills: Java Enterprise Edition
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary:
As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project specifications, developing application features, and ensuring that the applications are aligned with business needs. You will also engage in problem-solving discussions and contribute to the overall success of the project by implementing effective solutions and enhancements.

Roles & Responsibilities:
- Expected to be an SME; collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Apache Spark.
- Good To Have Skills: Experience with Java Enterprise Edition.
- Strong understanding of distributed computing principles.
- Experience with data processing frameworks and tools.
- Familiarity with cloud platforms and services.

Additional Information:
- The candidate should have minimum 5 years of experience in Apache Spark.
- This position is based at our Mumbai office.
- A 15 years full time education is required.

Posted 4 days ago


3.0 - 8.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: PySpark, Payroll Conversion and Testing, solution build support
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary:
As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will play a crucial role in developing innovative solutions to enhance business operations and user experience.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute in providing solutions to work-related problems.
- Develop and implement efficient PySpark applications.
- Collaborate with cross-functional teams to analyze and address application requirements.
- Optimize application performance and scalability.
- Troubleshoot and debug applications to ensure seamless functionality.
- Stay updated with industry trends and best practices in PySpark development.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in PySpark.
- Good To Have Skills: Experience with Apache Spark.
- Strong understanding of data processing and manipulation using PySpark.
- Experience in building data pipelines and ETL processes.
- Knowledge of distributed computing and parallel processing.

Additional Information:
- The candidate should have a minimum of 3 years of experience in PySpark.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Posted 4 days ago


15.0 - 20.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Apache Spark
Good to have skills: PySpark
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary:
As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team in implementing effective solutions. You will also engage in strategic planning sessions to align project goals with organizational objectives, ensuring that all stakeholders are informed and involved in the development process. Your role will require a balance of technical expertise and leadership skills to drive the project forward successfully.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Facilitate regular team meetings to discuss progress and address any roadblocks.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Apache Spark.
- Good To Have Skills: Experience with PySpark.
- Strong understanding of distributed computing principles.
- Experience with data processing frameworks and tools.
- Familiarity with cloud platforms and services related to big data.

Additional Information:
- The candidate should have minimum 5 years of experience in Apache Spark.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Posted 4 days ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Hyderabad

Work from Office


Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Apache Hadoop
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. Your typical day will involve collaborating with team members to develop innovative solutions and enhance application functionality.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Develop and implement software solutions to meet business requirements.
- Collaborate with cross-functional teams to enhance application functionality.
- Conduct code reviews and provide technical guidance to team members.
- Troubleshoot and debug applications to ensure optimal performance.
- Stay updated on industry trends and technologies to drive continuous improvement.

Professional & Technical Skills:
- Must-have: Proficiency in Apache Hadoop.
- Strong understanding of distributed computing principles.
- Experience with data processing frameworks like MapReduce and Spark.
- Hands-on experience in designing and implementing scalable data pipelines.
- Knowledge of Hadoop ecosystem components such as HDFS, YARN, and Hive.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Apache Hadoop.
- This position is based at our Hyderabad office.
- A 15 years full-time education is required.
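For context on the MapReduce model this posting asks for, here is a minimal, framework-free Python sketch of the three phases of a word count, the canonical MapReduce example. Hadoop would distribute these phases across a cluster and handle the shuffle over the network; the function names and sample documents below are purely illustrative:

```python
from collections import defaultdict

def map_phase(documents):
    # Emit (word, 1) pairs, as a Hadoop mapper does for word count.
    for doc in documents:
        for word in doc.split():
            yield word.lower(), 1

def shuffle_phase(pairs):
    # Group values by key; Hadoop performs this between map and reduce.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Sum the grouped counts per word, as a reducer would.
    return {word: sum(counts) for word, counts in groups.items()}

docs = ["big data big ideas", "data pipelines"]
counts = reduce_phase(shuffle_phase(map_phase(docs)))
print(counts["big"], counts["data"])  # 2 2
```

The same map/shuffle/reduce structure underlies Spark's `map` and `reduceByKey` operations, which is why the two frameworks are often listed together in requirements like these.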

Posted 4 days ago

Apply

3.0 - 8.0 years

10 - 14 Lacs

Hyderabad

Work from Office


Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Apache Spark
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various stakeholders to gather requirements, overseeing the development process, and ensuring that the applications meet the specified needs. You will also engage in problem-solving discussions with your team, providing guidance and support to ensure successful project outcomes. Your role will require you to stay updated with the latest technologies and methodologies to enhance application performance and user experience.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Mentor junior team members to foster their professional growth.

Professional & Technical Skills:
- Must-have: Proficiency in Apache Spark.
- Strong understanding of distributed computing principles.
- Experience with data processing frameworks and tools.
- Familiarity with cloud platforms and services.
- Ability to optimize application performance and scalability.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Apache Spark.
- This position is based at our Hyderabad office.
- A 15 years full-time education is required.

Posted 4 days ago

Apply

3.0 - 6.0 years

5 - 8 Lacs

Bengaluru

Work from Office


The Systems Engineer II ESP supports, with direction, the Enterprise Server Platforms at CME Group. The incumbent must have an understanding of server (Windows or Linux) deployment, configuration, troubleshooting, security, scripting, and networking. Strong communication and documentation skills are required, as the candidate will typically be working with customers on support and new initiatives.

Principal Accountabilities:
- Analyzes requirements with supervision and supports existing development/QA platforms; documents deployment and configuration procedures.
- Defines simple problems and describes the cause-and-effect relationship; gathers and compares data about the problems with supervision and documents the details; prepares analysis reports and reviews them with supervision; participates in discussions with experienced team members to generate ideas for problem solving.
- Demonstrates knowledge of elementary systems (Linux/Windows), distributed computing architecture (client-server, intranet/internet), and hardware platforms and resources: CPU, memory, disk, end-user devices, peripherals.
- Troubleshoots most known issues with support; works with supervision to document problem resolution.

Posted 5 days ago

Apply

9.0 - 12.0 years

11 - 14 Lacs

Hyderabad

Work from Office


ABOUT THE ROLE

Role Description: We are seeking a Data Solutions Architect with deep expertise in Biotech/Pharma to design, implement, and optimize scalable, high-performance data solutions that support enterprise analytics, AI-driven insights, and digital transformation initiatives. This role will focus on data strategy, architecture, governance, security, and operational efficiency, ensuring seamless data integration across modern cloud platforms. The ideal candidate will work closely with engineering teams, business stakeholders, and leadership to establish a future-ready data ecosystem, balancing performance, cost-efficiency, security, and usability. This position requires expertise in modern cloud-based data architectures, data engineering best practices, and Scaled Agile methodologies.

Roles & Responsibilities:
- Design and implement scalable, modular, and future-proof data architectures that support enterprise initiatives.
- Develop enterprise-wide data frameworks that enable governed, secure, and accessible data across various business domains.
- Define data modeling strategies to support structured and unstructured data, ensuring efficiency, consistency, and usability across analytical platforms.
- Lead the development of high-performance data pipelines for batch and real-time data processing, integrating APIs, streaming sources, transactional systems, and external data platforms.
- Optimize query performance, indexing, caching, and storage strategies to enhance scalability, cost efficiency, and analytical capabilities.
- Establish data interoperability frameworks that enable seamless integration across multiple data sources and platforms.
- Drive data governance strategies, ensuring security, compliance, access controls, and lineage tracking are embedded into enterprise data solutions.
- Implement DataOps best practices, including CI/CD for data pipelines, automated monitoring, and proactive issue resolution, to improve operational efficiency.
- Lead Scaled Agile (SAFe) practices, facilitating Program Increment (PI) Planning, Sprint Planning, and Agile ceremonies, ensuring iterative delivery of enterprise data capabilities.
- Collaborate with business stakeholders, product teams, and technology leaders to align data architecture strategies with organizational goals.
- Act as a trusted advisor on emerging data technologies and trends, ensuring that the enterprise adopts cutting-edge data solutions that provide competitive advantage and long-term scalability.

Must-Have Skills:
- Experience in data architecture, enterprise data management, and cloud-based analytics solutions.
- Well versed in the Biotech/Pharma domain, with a record of solving its complex problems through data strategy.
- Expertise in Databricks, cloud-native data platforms, and distributed computing frameworks.
- Strong proficiency in modern data modeling techniques, including dimensional modeling, NoSQL, and data virtualization.
- Experience designing high-performance ETL/ELT pipelines and real-time data processing solutions.
- Deep understanding of data governance, security, metadata management, and access control frameworks.
- Hands-on experience with CI/CD for data solutions, DataOps automation, and infrastructure as code (IaC).
- Proven ability to collaborate with cross-functional teams, including business executives, data engineers, and analytics teams, to drive successful data initiatives.
- Strong problem-solving, strategic thinking, and technical leadership skills.
- Experience with SQL/NoSQL databases and vector databases for large language models.
- Experience with data modeling and performance tuning for both OLAP and OLTP databases.
- Experience with Apache Spark and Apache Airflow.
- Experience with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps.

Good-to-Have Skills:
- Experience with Data Mesh architectures and federated data governance models.
- Certification in cloud data platforms or enterprise architecture frameworks.
- Knowledge of AI/ML pipeline integration within enterprise data architectures.
- Familiarity with BI & analytics platforms for enabling self-service analytics and enterprise reporting.

Education and Professional Certifications:
- 9 to 12 years of experience in Computer Science, IT, or a related field
- AWS Certified Data Engineer preferred
- Databricks Certification preferred

Soft Skills:
- Excellent analytical and troubleshooting skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global, virtual teams.
- High degree of initiative and self-motivation.
- Ability to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals.
- Ability to learn quickly, be organized, and be detail oriented.
- Strong presentation and public speaking skills.
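Orchestration tools like Apache Airflow, required above, model a data pipeline as a directed acyclic graph (DAG) of tasks and run each task only after its upstream dependencies finish. The core scheduling idea can be sketched with Python's standard library alone; the task names below are hypothetical, not from any real pipeline:

```python
from graphlib import TopologicalSorter

# Each task maps to the set of tasks it depends on, mirroring how an
# orchestrator such as Airflow wires upstream/downstream dependencies.
pipeline = {
    "extract": set(),
    "validate": {"extract"},
    "transform": {"validate"},
    "load": {"transform"},
    "report": {"load"},
}

# static_order() yields a dependency-respecting execution order;
# a real scheduler would additionally run independent tasks in parallel.
order = list(TopologicalSorter(pipeline).static_order())
print(order)  # ['extract', 'validate', 'transform', 'load', 'report']
```

In Airflow proper, the same dependencies would be declared with operators and the `>>` syntax, and the scheduler, rather than a single topological pass, handles retries, backfills, and monitoring.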

Posted 6 days ago

Apply

10.0 - 13.0 years

12 - 15 Lacs

Hyderabad, Gurugram, Ahmedabad

Work from Office


About the Role:
Grade Level (for internal use): 11

S&P Global EDO

The Role: Lead, Software Engineering, IT Application Development.

Join Our Team: Step into a dynamic team at the cutting edge of data innovation! You'll collaborate daily with talented professionals from around the world, designing and developing next-generation data products for our clients. Our team thrives on a diverse toolkit that evolves with emerging technologies, offering you the chance to work in a vibrant, global environment that fosters creativity and teamwork.

The Impact: As a Lead Software Developer at S&P Global, you'll be a driving force in shaping the future of our data products. Your expertise will streamline software development and deployment, aligning cutting-edge solutions with business needs. By ensuring seamless integration and continuous delivery, you'll enhance product capabilities, delivering high-quality systems that meet the highest standards of availability, security, and performance. Your work will empower our clients with impactful, data-driven solutions, making a real difference in the financial world.

What's in it for You:
- Career Development: Build a rewarding career with a global leader in financial information and analytics, supported by continuous learning and a clear path to advancement.
- Dynamic Work Environment: Thrive in a fast-paced, forward-thinking setting where your ideas fuel innovation and your contributions shape groundbreaking solutions.
- Skill Enhancement: Elevate your expertise on an enterprise-level platform, mastering the latest tools and techniques in software development.
- Versatile Experience: Dive into full-stack development with hands-on exposure to cloud computing, big data, and revolutionary GenAI technologies.
- Leadership Opportunities: Guide and inspire a skilled team, steering the direction of our products and leaving your mark on the future of technology at S&P Global.

Responsibilities:
- Architect and develop scalable big data and cloud applications, harnessing a range of cloud services to create robust, high-performing solutions.
- Design and implement advanced CI/CD pipelines, automating software delivery for fast, reliable deployments that keep us ahead of the curve.
- Tackle complex challenges head-on, troubleshooting and resolving issues to ensure our products run flawlessly for clients.
- Lead by example, providing technical guidance and mentoring to your team, driving innovation and embracing new processes.
- Deliver top-tier code and detailed system design documents, setting the standard with technical walkthroughs that inspire excellence.
- Bridge the gap between technical and non-technical stakeholders, turning complex requirements into elegant, actionable solutions.
- Mentor junior developers, nurturing their growth and helping them build skills and careers under your leadership.

What We're Looking For: We're seeking a passionate, experienced professional with:
- 10-13 years of hands-on experience designing and building data-intensive solutions using distributed computing, showcasing your mastery of scalable architectures.
- Proven success implementing and maintaining enterprise search solutions in large-scale environments, ensuring peak performance and reliability.
- A history of partnering with business stakeholders and users to shape research directions and craft robust, maintainable products.
- Extensive experience deploying data engineering solutions in public clouds like AWS, GCP, or Azure, leveraging cloud power to its fullest.
- Advanced programming skills in Python, Java, .NET, or Scala, backed by a portfolio of impressive projects.
- Strong knowledge of GenAI tools (e.g., GitHub Copilot, ChatGPT, Claude, or Gemini) and their power to boost developer productivity.
- Expertise in containerization, scripting, cloud platforms, and CI/CD practices, ready to shine in a modern development ecosystem.
- 5+ years working with Python, Java, .NET, Kubernetes, and data/workflow orchestration tools, proving your technical versatility.
- Deep experience with SQL, NoSQL, Apache Spark, Airflow, or similar tools, operationalizing data-driven pipelines for large-scale batch and stream processing.
- A knack for rapid prototyping and iteration, delivering high-quality solutions under tight deadlines.
- Outstanding communication and documentation skills, adept at explaining complex ideas to technical and non-technical audiences alike.

Take the Next Step: Ready to elevate your career and make a lasting impact in data and technology? Join us at S&P Global and help shape the future of financial information and analytics. Apply today!

Return to Work: Have you taken time out for caring responsibilities and are now looking to return to work? As part of our Return-to-Work initiative, we are encouraging enthusiastic and talented returners to apply and will actively support your return to the workplace.

Posted 6 days ago

Apply

5.0 - 10.0 years

8 - 12 Lacs

Hyderabad, Gurugram

Work from Office


About the Role:
Grade Level (for internal use): 10

Job Summary: We are seeking a talented Java Developer to join our dynamic team. The ideal candidate will have strong proficiency in Java, experience working with public cloud platforms such as AWS or Microsoft Azure, and a solid foundation in computer science principles.

What You'll Do:
- Design, develop, test, document, deploy, maintain, and enhance software applications for a quantitative product that conducts complex mathematical calculations to accurately derive and analyze the various S&P indices.
- Manage individual project priorities, deadlines, and deliverables.
- Collaborate with key stakeholders to develop system architectures, API specifications, and implementation requirements.
- Engage in code reviews, knowledge sharing, and mentorship to promote ongoing technical development within the team.
- Analyze system performance and optimize applications for maximum speed and scalability.

What You'll Need:
- 5+ years of proven experience as a Senior Developer with a strong command of Java and Spring Boot.
- Experience developing RESTful APIs using a variety of tools.
- Hands-on experience with public cloud platforms (AWS, Microsoft Azure).
- Solid understanding of algorithms, data structures, and software architecture.
- Experience with distributed computing frameworks like Apache Spark.
- Familiarity with data lake architectures and data processing.
- Ability to translate abstract business requirements into concrete technical solutions.
- Strong analytical skills to assess the behavior and performance of loosely coupled systems, ensuring they operate efficiently and effectively in a distributed environment.

Educational Qualifications: Bachelor's/Master's degree in Computer Science, Information Technology, or a related field.

Technologies & Tools We Use:
- Programming Languages: Java, Python
- Frameworks: Spring Boot, Apache Spark
- Cloud Platforms: AWS, Microsoft Azure
- Development Tools: Git, Docker, Jenkins

About S&P Global Dow Jones Indices: At S&P Dow Jones Indices, we provide iconic and innovative index solutions backed by unparalleled expertise across the asset-class spectrum. By bringing transparency to the global capital markets, we empower investors everywhere to make decisions with conviction. We're the largest global resource for index-based concepts, data and research, and home to iconic financial market indicators, such as the S&P 500 and the Dow Jones Industrial Average. More assets are invested in products based upon our indices than any other index provider in the world. With over USD 7.4 trillion in passively managed assets linked to our indices and over USD 11.3 trillion benchmarked to our indices, our solutions are widely considered indispensable in tracking market performance, evaluating portfolios and developing investment strategies. S&P Dow Jones Indices is a division of S&P Global (NYSE: SPGI). S&P Global is the world's foremost provider of credit ratings, benchmarks, analytics and workflow solutions in the global capital, commodity and automotive markets. With every one of our offerings, we help many of the world's leading organizations navigate the economic landscape so they can plan for tomorrow, today. For more information, visit www.spglobal.com/spdji.

What's In It For You

Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People, Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global.
- Health & Wellness: Health care coverage designed for the mind and body.
- Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.
For more information on benefits by country, visit https://spgbenefits.com/benefit-summaries

Posted 6 days ago

Apply

8.0 - 13.0 years

11 - 16 Lacs

Hyderabad, Gurugram

Work from Office


About the Role:
Grade Level (for internal use): 11

Job Summary: We are seeking a talented Full Stack Developer to join our dynamic team. The ideal candidate will have strong proficiency in either Java or Python, experience working with public cloud platforms such as AWS or Microsoft Azure, and a solid foundation in computer science principles.

What You'll Do:
- Design, develop, test, document, deploy, maintain, and enhance software applications for a quantitative product that conducts complex mathematical calculations to accurately derive and analyze the various S&P indices.
- Manage individual project priorities, deadlines, and deliverables.
- Collaborate with key stakeholders to develop system architectures, API specifications, and implementation requirements.
- Engage in code reviews, knowledge sharing, and mentorship to promote ongoing technical development within the team.
- Analyze system performance and optimize applications for maximum speed and scalability.

What You'll Need:
- 8+ years of proven experience as a Full Stack Developer with a strong command of either the Java or Python full stack.
- Hands-on experience with public cloud platforms (AWS, Microsoft Azure).
- Solid understanding of algorithms, data structures, and software architecture.
- Experience with distributed computing frameworks like Apache Spark.
- Strong understanding of micro-frontend architecture principles and implementation patterns.
- Ability to translate abstract business requirements into concrete technical solutions.
- Strong analytical skills to assess the behavior and performance of loosely coupled systems, ensuring they operate efficiently and effectively in a distributed environment.

Educational Qualifications: Bachelor's/Master's degree in Computer Science, Information Technology, or a related field.

Technologies & Tools We Use:
- Programming Languages: Java, Python
- Frameworks: Apache Spark; React, Angular, or Vue.js
- Databases: SQL (Postgres or equivalent), NoSQL (MongoDB or equivalent)
- Cloud Platforms: AWS, Microsoft Azure
- Development Tools: Git, Docker, Jenkins

Posted 6 days ago

Apply

2.0 - 7.0 years

11 - 15 Lacs

Bengaluru

Work from Office


About The Role

Job Title: AI Scientist

Position Overview: We are seeking a talented and experienced Core AI Algorithm Developer to join our Lab45 AI Platform Team at Wipro. We are looking for candidates with 4 to 10 years of hands-on experience developing cutting-edge AI algorithms, such as in Generative AI, LLMs, deep learning, and unsupervised AI, along with expertise in Python, TensorFlow, PyTorch, PySpark, distributed computing, statistics, and cloud technologies. The candidate should have a strong foundation in AI and good coding skills.

Key Responsibilities:
- Develop and implement state-of-the-art AI algorithms and models to solve complex problems in diverse domains.
- Collaborate with cross-functional teams to understand business requirements, gather feedback, and translate requirements into scalable, production-grade AI solutions.
- Work with large datasets to extract insights, optimize algorithms, and enhance model performance.
- Contribute to the creation of intellectual property (IP) through patents, research papers, and innovative solutions.
- Stay abreast of the latest advancements in AI research and technologies and apply them to enhance our AI offerings.

Qualifications:
- Master's or Ph.D. degree (preferred) in Computer Science, Artificial Intelligence, Machine Learning, or a related field.
- 4 to 10 years of proven experience in developing cutting-edge AI algorithms and solutions.
- Strong proficiency in Python programming and familiarity with TensorFlow, PyTorch, PySpark, etc.
- Experience with distributed computing and cloud platforms (e.g., Azure, AWS, GCP).
- Demonstrated ability to work with large datasets and optimize algorithms for scalability and efficiency.
- Excellent problem-solving skills and a strong understanding of AI concepts and techniques.
- Proven track record of delivering high-quality, innovative solutions and contributing to IP creation (e.g., patents, research papers).
- Strong communication and collaboration skills, with the ability to work effectively in a team environment.

Posted 1 week ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Bengaluru

Work from Office


Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Apache Spark
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, participate in team meetings, and contribute to the overall success of the projects you are involved in.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of milestones.

Professional & Technical Skills:
- Must-have: Proficiency in Apache Spark.
- Strong understanding of distributed computing principles.
- Experience with data processing frameworks and tools.
- Familiarity with cloud platforms and services.
- Ability to write efficient and scalable code.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Apache Spark.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.

Posted 1 week ago

Apply

2.0 - 7.0 years

5 - 9 Lacs

Bengaluru

Work from Office


Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Apache Spark
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will play a crucial role in developing solutions that align with organizational goals and enhance operational efficiency.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to analyze business requirements and translate them into technical solutions.
- Develop and implement software solutions using Apache Spark to enhance application functionality.
- Troubleshoot and debug applications to optimize performance and ensure seamless operation.
- Stay updated with industry trends and best practices to continuously improve application development processes.
- Provide technical guidance and support to junior team members to foster skill development.

Professional & Technical Skills:
- Must-have: Proficiency in Apache Spark.
- Strong understanding of big data processing and distributed computing.
- Experience with data processing frameworks like Hadoop and Spark SQL.
- Hands-on experience in developing scalable and efficient applications using Apache Spark.
- Knowledge of programming languages such as Scala or Python.

Additional Information:
- The candidate should have a minimum of 2 years of experience in Apache Spark.
- This position is based at our Pune office.
- A 15 years full-time education is required.

Posted 1 week ago


5.0 - 10.0 years

5 - 9 Lacs

Hyderabad

Work from Office


Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Apache Spark
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As an Application Developer, you will be involved in designing, building, and configuring applications to meet business process and application requirements. Your typical day will revolve around creating innovative solutions to address various business needs and ensuring seamless application functionality.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Lead the application development process.
- Conduct code reviews and ensure coding standards are met.
- Implement best practices for application design and development.

Professional & Technical Skills:
- Must-have skills: proficiency in Apache Spark.
- Strong understanding of big data processing.
- Experience with distributed computing frameworks.
- Hands-on experience in developing scalable applications.
- Knowledge of data processing and transformation techniques.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Apache Spark.
- This position is based at our Hyderabad office.
- A 15 years full-time education is required.

Posted 1 week ago


6.0 - 11.0 years

19 - 27 Lacs

Haryana

Work from Office


Job Description

Key responsibilities:
1. Understand, implement, and automate ETL pipelines to strong industry standards.
2. Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, and designing infrastructure for greater scalability.
3. Develop, integrate, test, and maintain existing and new applications.
4. Design and create data pipelines (data lakes / data warehouses) for real-world energy analytical solutions.
5. Expert-level proficiency in Python (preferred) for automating everyday tasks.
6. Strong understanding of and experience with distributed computing frameworks, particularly Spark, Spark SQL, Kafka, Spark Streaming, Hive, and Azure Databricks.
7. Some experience with other leading cloud platforms, preferably Azure.
8. Hands-on experience with Azure Data Factory, Logic Apps, Analysis Services, Azure Blob Storage, etc.
9. Ability to work in a team in an agile setting, familiarity with JIRA, and a clear understanding of how Git works.
10. Must have 5-7 years of experience.
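The listing above centers on automating ETL pipelines in Python. As a rough illustration only (plain Python, not the employer's actual stack or pipeline; the function names, CSV layout, and field names are all invented for this sketch), the extract → transform → load pattern can be reduced to three composable stages:

```python
# Minimal sketch of the extract -> transform -> load pattern.
# All names and the CSV layout here are illustrative assumptions.
import csv
import io

def extract(raw_csv: str):
    """Parse raw CSV text into dict rows (the 'extract' stage)."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows):
    """Normalize fields and drop incomplete records (the 'transform' stage)."""
    cleaned = []
    for row in rows:
        if not row.get("reading"):
            continue  # skip records with no measurement
        cleaned.append({
            "site": row["site"].strip().upper(),
            "reading": float(row["reading"]),
        })
    return cleaned

def load(rows, sink: list):
    """Append cleaned rows to a sink (a stand-in for a warehouse table)."""
    sink.extend(rows)
    return len(rows)

raw = "site,reading\npune ,12.5\nhyd,\nblr,7.25\n"
warehouse: list = []
loaded = load(transform(extract(raw)), warehouse)
print(loaded)                 # 2 rows survive cleaning
print(warehouse[0]["site"])   # PUNE
```

In a production setting each stage would typically be a task in an orchestrator such as Airflow, so failures can be retried per stage rather than rerunning the whole pipeline.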

Posted 1 week ago


4.0 - 7.0 years

14 - 17 Lacs

Bengaluru

Work from Office


A Data Engineer specializing in enterprise data platforms, experienced in building, managing, and optimizing data pipelines for large-scale environments, with expertise in big data technologies, distributed computing, data ingestion, and transformation frameworks. Proficient in Apache Spark, PySpark, Kafka, and Iceberg tables, and able to design and implement scalable, high-performance data processing solutions.

What you'll do
As a Data Engineer – Data Platform Services, responsibilities include:

Data Ingestion & Processing
- Designing and developing data pipelines to migrate workloads from IIAS to Cloudera Data Lake.
- Implementing streaming and batch data ingestion frameworks using Kafka and Apache Spark (PySpark).
- Working with IBM CDC and Universal Data Mover to manage data replication and movement.

Big Data & Data Lakehouse Management
- Implementing Apache Iceberg tables for efficient data storage and retrieval.
- Managing distributed data processing with Cloudera Data Platform (CDP).
- Ensuring data lineage, cataloging, and governance for compliance with bank and regulatory policies.

Optimization & Performance Tuning
- Optimizing Spark and PySpark jobs for performance and scalability.
- Implementing data partitioning, indexing, and caching to enhance query performance.
- Monitoring and troubleshooting pipeline failures and performance bottlenecks.

Security & Compliance
- Ensuring secure data access, encryption, and masking using Thales CipherTrust.
- Implementing role-based access controls (RBAC) and data governance policies.
- Supporting metadata management and data quality initiatives.

Collaboration & Automation
- Working closely with data scientists, analysts, and DevOps teams to integrate data solutions.
- Automating data workflows using Airflow and implementing CI/CD pipelines with GitLab and Sonatype Nexus.
- Supporting Denodo-based data virtualization for seamless data access.

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise:
- 4-7 years of experience in big data engineering, data integration, and distributed computing.
- Strong skills in Apache Spark, PySpark, Kafka, SQL, and Cloudera Data Platform (CDP).
- Proficiency in Python or Scala for data processing.
- Experience with data pipeline orchestration tools (Apache Airflow, Stonebranch UDM).
- Understanding of data security, encryption, and compliance frameworks.

Preferred technical and professional experience:
- Experience in banking or financial services data platforms.
- Exposure to Denodo for data virtualization and DGraph for graph-based insights.
- Familiarity with cloud data platforms (AWS, Azure, GCP).
- Certifications in Cloudera Data Engineering, IBM Data Engineering, or AWS Data Analytics.
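The optimization duties in this listing mention data partitioning to enhance query and job performance. As a concept sketch only (plain Python rather than PySpark, with invented record and field names), the hash-partitioning idea behind operations like Spark's repartition-by-key routes each record to a fixed partition so that partitions can be processed independently and all records sharing a key land together:

```python
# Illustrative sketch of hash partitioning (the idea behind repartitioning
# by key in distributed engines). Plain Python; names are assumptions.
from collections import defaultdict
import zlib

NUM_PARTITIONS = 4

def partition_for(key: str) -> int:
    # crc32 gives a hash that is stable across runs and processes,
    # unlike Python's builtin hash() with string randomization.
    return zlib.crc32(key.encode()) % NUM_PARTITIONS

def hash_partition(records):
    """Group records into NUM_PARTITIONS buckets by their 'key' field."""
    parts = defaultdict(list)
    for rec in records:
        parts[partition_for(rec["key"])].append(rec)
    return parts

records = [{"key": f"acct-{i}", "amount": i * 10} for i in range(8)]
parts = hash_partition(records)

# Every record lands in exactly one partition,
# and equal keys always map to the same partition.
total = sum(len(bucket) for bucket in parts.values())
print(total)  # 8
```

The same principle underlies skew diagnosis: if one partition's bucket is much larger than the rest, that partition becomes the straggler task in a distributed job.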

Posted 1 week ago


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies