
8340 Hadoop Jobs - Page 25

Set up a Job Alert
JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

4.0 - 7.0 years

9 - 13 Lacs

Coimbatore

Work from Office

Project Role: Software Development Lead
Project Role Description: Develop and configure software systems either end-to-end or for a specific stage of the product lifecycle. Apply knowledge of technologies, applications, methodologies, processes and tools to support a client, project or entity.
Must-Have Skills: Microsoft Azure Databricks
Good-to-Have Skills: NA
Minimum Experience: 7.5 year(s)
Educational Qualification: 15 years full time education

Summary: As a Software Development Lead, you will develop and configure software systems, either end-to-end or for specific stages of the product lifecycle. Your typical day will involve collaborating with various teams to ensure the successful implementation of software solutions, applying your knowledge of technologies and methodologies to support project goals and client needs. You will engage in problem-solving activities, guiding your team through challenges while ensuring that the software development process aligns with best practices and client expectations. Your role will also include mentoring team members and fostering a collaborative environment to drive innovation and efficiency in software development.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure adherence to timelines and quality standards.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Microsoft Azure Databricks.
- Strong understanding of cloud computing principles and practices.
- Experience with data engineering and ETL processes.
- Familiarity with programming languages such as Python or Scala.
- Ability to design and implement scalable data solutions.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Microsoft Azure Databricks.
- This position is based in Coimbatore.
- 15 years of full-time education is required.

Posted 1 week ago

Apply

7.0 - 12.0 years

15 - 25 Lacs

Bengaluru

Hybrid

Looking for a Hadoop Developer

Posted 1 week ago

Apply

2.0 - 5.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-Have Skills: Data Modeling Techniques and Methodologies
Good-to-Have Skills: NA
Minimum Experience: 7.5 year(s)
Educational Qualification: 15 years full time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the overall data architecture strategy. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor and evaluate team performance to ensure alignment with project goals.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Data Modeling Techniques and Methodologies.
- Strong understanding of data integration processes and tools.
- Experience with data warehousing concepts and practices.
- Familiarity with ETL processes and data pipeline development.
- Ability to work with various database management systems.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Data Modeling Techniques and Methodologies.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 1 week ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-Have Skills: Informatica PowerCenter
Good-to-Have Skills: NA
Minimum Experience: 3 year(s)
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and debugging processes to ensure the highest quality of deliverables, while continuously seeking opportunities for improvement in application functionality and user experience.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application specifications and user guides.
- Collaborate with cross-functional teams to gather requirements and provide technical insights.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Informatica PowerCenter.
- Strong understanding of ETL processes and data integration techniques.
- Experience with data warehousing concepts and methodologies.
- Familiarity with SQL and database management systems.
- Ability to troubleshoot and resolve application issues efficiently.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Informatica PowerCenter.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 1 week ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Chennai

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-Have Skills: Databricks Unified Data Analytics Platform
Good-to-Have Skills: Scala, PySpark
Minimum Experience: 3 year(s)
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and debugging processes to guarantee the quality of the applications you create, while continuously seeking ways to enhance functionality and user experience.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.
- Conduct thorough testing and debugging of applications to ensure optimal performance and reliability.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Databricks Unified Data Analytics Platform.
- Good-to-Have Skills: Experience with PySpark, Scala.
- Strong understanding of data integration and ETL processes.
- Familiarity with cloud computing concepts and services.
- Experience in application lifecycle management and agile methodologies.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Chennai office.
- 15 years of full-time education is required.

Posted 1 week ago

Apply

7.0 - 12.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-Have Skills: Snowflake Data Warehouse
Good-to-Have Skills: NA
Minimum Experience: 7.5 year(s)
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, manage project timelines, and contribute to the overall success of application development initiatives.

Roles & Responsibilities:
1. Serve as a client-facing technical lead, working closely with stakeholders to gather requirements and translate them into actionable ETL solutions.
2. Design and develop new stored procedures in MS SQL Server, with a strong focus on performance and maintainability.
3. Build and enhance SSIS packages, implementing best practices for modularity, reusability, and error handling.
4. Architect and design ETL workflows, including staging, cleansing, data masking, transformation, and loading strategies.
5. Implement comprehensive error handling and logging mechanisms to support reliable, auditable data pipelines.
6. Design and maintain ETL-related tables, including staging, audit/logging, and dimensional/historical tables.
7. Work with Snowflake to build scalable cloud-based data integration and warehousing solutions.
8. Reverse-engineer and optimize existing ETL processes and stored procedures for better performance and maintainability.
9. Troubleshoot job failures and data discrepancies in production.

Professional & Technical Skills:
1. 7+ years of experience in data warehousing (MS SQL, Snowflake) and MS SQL Server (T-SQL, stored procedures, indexing, performance tuning).
2. Proven expertise in SSIS package development, including parameterization, data flow, and control flow design.
3. Strong experience in ETL architecture, including logging, exception handling, and data validation.
4. Proficiency in data modeling for ETL, including staging, target, and history tables.
5. Hands-on experience with Snowflake, including data loading, transformation scripting, and optimization.
6. Ability to manage historical data using SCDs, auditing fields, and temporal modeling.
7. Ability to set up Git repositories, define version control standards, and manage code branching/releases, along with DevOps and CI/CD practices for data pipelines.
8. Ability to work independently while managing multiple issues and deadlines.
9. Excellent communication skills, both verbal and written, with demonstrated client interaction.

Would Be a Plus:
10. DW migration from MS SQL to Snowflake.
11. Experience with modern data integration tools such as Matillion.
12. Knowledge of BI tools like Tableau.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
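The listing above asks for managing historical data with slowly changing dimensions (SCDs). As a minimal illustration of the idea, here is a Type 2 merge in pure Python: when a tracked attribute changes, the current dimension row is closed out and a new versioned row is inserted. The table layout, column names, and data are hypothetical stand-ins for the T-SQL/Snowflake logic the role describes.

```python
from datetime import date

def scd2_merge(dim_rows, incoming, business_key, tracked_cols, today=None):
    """Apply a Type 2 slowly-changing-dimension merge in memory.

    dim_rows: existing dimension rows (dicts) carrying 'effective_from',
              'effective_to', and 'is_current' audit fields.
    incoming: fresh source rows keyed by `business_key`.
    Returns the updated dimension table as a new list.
    """
    today = today or date.today().isoformat()
    current = {r[business_key]: r for r in dim_rows if r["is_current"]}
    out = list(dim_rows)
    for src in incoming:
        cur = current.get(src[business_key])
        if cur and all(cur[c] == src[c] for c in tracked_cols):
            continue  # no change: leave the current row open
        if cur:
            # attribute changed: close out the old version
            cur["effective_to"] = today
            cur["is_current"] = False
        out.append({**src, "effective_from": today,
                    "effective_to": None, "is_current": True})
    return out

# Hypothetical example: a customer moves city, producing a second version.
dim = [{"cust_id": 1, "city": "Pune", "effective_from": "2023-01-01",
        "effective_to": None, "is_current": True}]
dim = scd2_merge(dim, [{"cust_id": 1, "city": "Bengaluru"}],
                 "cust_id", ["city"], today="2024-06-01")
```

In a real warehouse this would typically be a `MERGE` statement against staging and history tables; the in-memory version only sketches the versioning rule.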

Posted 1 week ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-Have Skills: Data Warehouse ETL Testing
Good-to-Have Skills: NA
Minimum Experience: 3 year(s)
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing solutions, and ensuring that applications function seamlessly to support organizational goals. You will also participate in testing and validation processes to guarantee that the applications meet the required standards and specifications, contributing to the overall success of the projects you are involved in.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application requirements and specifications.
- Engage in continuous learning to stay updated with industry trends and technologies.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Data Warehouse ETL Testing.
- Strong understanding of data integration processes and methodologies.
- Experience with various ETL tools and frameworks.
- Ability to perform data validation and quality assurance checks.
- Familiarity with database management systems and SQL.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Data Warehouse ETL Testing.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
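The ETL testing role above centers on data validation and quality assurance checks. A common pattern is source-to-target reconciliation: compare row counts, key coverage, and a numeric checksum between the two sides of a load. The sketch below uses hypothetical column names (`id`, `amount`) and plain Python lists in place of real database queries.

```python
def reconcile(source_rows, target_rows, key, numeric_col):
    """Basic ETL reconciliation: row counts, key coverage, numeric checksum.

    Returns a list of human-readable issue strings; empty means the
    target matches the source on these three checks.
    """
    issues = []
    if len(source_rows) != len(target_rows):
        issues.append(f"row count mismatch: {len(source_rows)} vs {len(target_rows)}")
    src_keys = {r[key] for r in source_rows}
    tgt_keys = {r[key] for r in target_rows}
    if src_keys - tgt_keys:
        issues.append(f"missing keys in target: {sorted(src_keys - tgt_keys)}")
    src_sum = sum(r[numeric_col] for r in source_rows)
    tgt_sum = sum(r[numeric_col] for r in target_rows)
    if src_sum != tgt_sum:
        issues.append(f"checksum mismatch on {numeric_col}: {src_sum} vs {tgt_sum}")
    return issues

# Hypothetical load where one row was dropped in transit.
src = [{"id": 1, "amount": 100}, {"id": 2, "amount": 250}]
tgt = [{"id": 1, "amount": 100}]
problems = reconcile(src, tgt, "id", "amount")
```

In practice the same checks are usually expressed as paired SQL aggregates against the source and target systems; the structure of the comparison is the same.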

Posted 1 week ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Chennai

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-Have Skills: Apache Spark
Good-to-Have Skills: NA
Minimum Experience: 3 year(s)
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that the applications function seamlessly within the business environment. You will also engage in testing and troubleshooting to enhance application performance and user experience, while continuously seeking ways to improve processes and solutions.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application processes and workflows.
- Engage in code reviews to ensure quality and adherence to best practices.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Apache Spark.
- Good-to-Have Skills: Experience with data processing frameworks.
- Strong understanding of distributed computing principles.
- Familiarity with cloud platforms and services.
- Experience in developing and deploying applications in a microservices architecture.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Apache Spark.
- This position is based at our Chennai office.
- 15 years of full-time education is required.

Posted 1 week ago

Apply

2.0 years

0 Lacs

Gurugram, Haryana, India

On-site

You Lead the Way. We've Got Your Back. With the right backing, people and businesses have the power to progress in incredible ways. When you join Team Amex, you become part of a global and diverse community of colleagues with an unwavering commitment to back our customers, communities and each other. Here, you'll learn and grow as we help you create a career journey that's unique and meaningful to you with benefits, programs, and flexibility that support you personally and professionally. At American Express, you'll be recognized for your contributions, leadership, and impact; every colleague has the opportunity to share in the company's success. Together, we'll win as a team, striving to uphold our company values and powerful backing promise to provide the world's best customer experience every day. And we'll do it with the utmost integrity, and in an environment where everyone is seen, heard and feels like they belong. Join Team Amex and let's lead the way together.

Team Overview: Global Credit & Model Risk Oversight, Transaction Monitoring & GRC Capabilities (CMRC) provides independent challenge and ensures that significant Credit and Model risks are properly evaluated and monitored, and Anti-Money Laundering (AML) risks are mitigated through the transaction monitoring program. In addition, CMRC hosts the central product organization responsible for the ongoing maintenance and modernization of GRC platforms and capabilities.

How will you make an impact in this role? The AML Data Capabilities team was established with a mission to own and govern data across products: raw data, derivations, and organized views that cater to both analytics and production use cases, and to manage end-to-end data quality. The team comprises risk data experts with deep SME knowledge of risk data, systems, and processes covering all aspects of the customer life cycle. Our mission is to build and support Anti-Money Laundering Transaction Monitoring data and rule needs in collaboration with Strategy and Technology partners, with focus on our core tenets of timeliness, quality, and process efficiency.

Responsibilities include:
- Develop and maintain organized data layers to cater to both production use cases and analytics for transaction monitoring of Anti-Money Laundering rules.
- Manage end-to-end big data integration processes for building key variables from disparate source systems, with 100% accuracy and 100% on-time delivery.
- Partner closely with Strategy and Modeling teams in building incremental intelligence, with strong emphasis on maintaining globalization and standardization of attribute calculations across portfolios.
- Partner with tech teams in designing and building next-generation data quality controls.
- Drive automation initiatives within existing processes and fully optimize delivery effort and processing time.
- Effectively manage relationships with stakeholders across multiple geographies.
- Contribute to evaluating and/or developing the right tools, common components, and capabilities.
- Follow industry-best agile practices to deliver on key priorities.
- Implement defined rules on the Lucy platform to identify AML alerts.
- Ensure processes and actions are logged and support regulatory reporting, documenting the analysis and the rule build in a qualitative document for relevant stakeholders.

Minimum Qualifications:
- Academic background: Bachelor's degree with up to 2 years of relevant work experience.
- Strong Hive and SQL skills; knowledge of big data and related technologies.
- Hands-on experience with Hadoop and shell scripting is a plus.
- Understanding of data architecture and data engineering concepts.
- Strong verbal and written communication skills, with the ability to cater to versatile technical and non-technical audiences.
- Willingness to collaborate with cross-functional teams to drive validation and project execution.
- Good-to-have skills: Python / PySpark.
- Excellent analytical and critical thinking with attention to detail.
- Excellent planning and organization skills, including the ability to manage interdependencies and execute under stringent deadlines.
- Exceptional drive and commitment; ability to work and thrive in a fast-changing, results-driven environment; proven ability to handle competing priorities.

Behavioral Skills/Capabilities (Enterprise Leadership Behaviors):
Set the Agenda:
- Ability to apply thought leadership and generate ideas.
- Take the complete perspective into account while designing solutions.
- Use market best practices to design solutions.
Bring Others with You:
- Collaborate with multiple stakeholders and other scrum teams to deliver on promises.
- Learn from peers and leaders.
- Coach and help peers.
Do It the Right Way:
- Communicate effectively; be candid and clear in communications.
- Make decisions quickly and effectively.
- Live the company culture and values.

We back you with benefits that support your holistic well-being so you can be and deliver your best. This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally:
- Competitive base salaries
- Bonus incentives
- Support for financial well-being and retirement
- Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
- Flexible working model with hybrid, onsite, or virtual arrangements depending on role and business need
- Generous paid parental leave policies (depending on your location)
- Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
- Free and confidential counseling support through our Healthy Minds program
- Career development and training opportunities

American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
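The role above involves implementing transaction-monitoring rules that raise AML alerts. As a toy illustration of the kind of rule involved, the sketch below flags possible structuring: several same-day transactions that together cross a reporting threshold. The rule name, thresholds, and data shape are hypothetical; in production this logic would typically run as Hive/SQL or PySpark over organized data layers.

```python
from collections import defaultdict

def structuring_rule(txns, threshold=10_000, min_count=3):
    """Toy AML structuring rule: flag customers with at least `min_count`
    transactions on one day whose total reaches `threshold`.

    txns: iterable of (customer_id, day, amount) tuples.
    Returns a list of alert dicts for downstream review.
    """
    buckets = defaultdict(list)
    for cust, day, amount in txns:
        buckets[(cust, day)].append(amount)
    alerts = []
    for (cust, day), amounts in buckets.items():
        if len(amounts) >= min_count and sum(amounts) >= threshold:
            alerts.append({"customer": cust, "day": day,
                           "count": len(amounts), "total": sum(amounts)})
    return alerts

# Hypothetical data: C1 makes three deposits that together cross 10,000.
alerts = structuring_rule([
    ("C1", "2024-06-01", 4000), ("C1", "2024-06-01", 3500),
    ("C1", "2024-06-01", 3000), ("C2", "2024-06-01", 9000),
])
```

Real monitoring rules add lookback windows, customer segmentation, and suppression logic, but the aggregate-then-threshold shape is the common core.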

Posted 1 week ago

Apply

2.0 - 7.0 years

6 - 10 Lacs

Pune

Work from Office

Atos is seeking a highly skilled and experienced Kubernetes Expert with strong programming skills to join our dynamic team. As a Kubernetes Expert, you will play a crucial role in designing, implementing, and maintaining our Kubernetes infrastructure to ensure scalability, reliability, and efficiency of our services.

Responsibilities:
- Develop and maintain Kubernetes clusters for open-source applications (such as Apache NiFi and Apache Airflow), ensuring high availability, scalability, and security.
- Deploy, configure, and manage clusters on Kubernetes, including setting up leader election, shared state management, and clustering.
- Utilize ArgoCD for GitOps continuous delivery, automating the deployment of applications and resources within the Kubernetes environment.
- Use Crossplane to manage cloud resources and services, ensuring seamless integration and provisioning.
- Implement and manage identity and access management using Keycloak, ensuring secure access to the application.
- Utilize Azure Vault for securely storing and managing sensitive information such as API keys, passwords, and other secrets required for data workflows.
- Manage ingress traffic to the application using Kong, providing features such as load balancing, security, and monitoring of API requests.
- Ensure the availability and management of persistent block storage for various application repositories.
- Set up and manage certificates using Cert-Manager and Trust-Manager to establish secure connections between the applications.
- Implement monitoring and observability solutions to ensure the health and performance of the application and its underlying infrastructure.
- Troubleshoot and resolve issues related to Kubernetes infrastructure, including performance bottlenecks, resource constraints, and network connectivity.
- Implement security best practices for Kubernetes environments, including RBAC, network policies, and secrets management, and define a strategy to integrate security with various virtualization environment service providers such as VMware or cloud hyperscalers.
- Stay updated with the latest Kubernetes features, tools, and technologies, and evaluate their applicability to improve our infrastructure and workflows.
- Mentor and train team members on Kubernetes concepts, best practices, and tools.
- Contribute to the development and maintenance of internal documentation, runbooks, and knowledge base articles related to Kubernetes.

Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field; Master's degree preferred.
- 2+ years of experience in designing, deploying, and managing Kubernetes clusters in production environments.
- Solid experience with infrastructure-as-code tools such as Crossplane.
- Proficiency in Kubernetes and container orchestration.
- Knowledge of Apache NiFi 2.0, including clustering and data flow management.
- Familiarity with GitOps practices and tools like ArgoCD.
- Experience with container monitoring and logging tools such as Prometheus and Grafana.
- Solid understanding of networking principles, including DNS, load balancing, and security in Kubernetes environments.
- Experience with identity and access management tools like Keycloak.
- Proficiency in secrets management using tools like Azure Vault.
- Experience with API gateway management using Kong.
- Knowledge of persistent storage solutions for Kubernetes.
- Experience with certificate management using Cert-Manager and Trust-Manager.

Preferred Qualifications:
- Kubernetes certification (e.g., Certified Kubernetes Administrator - CKA, Certified Kubernetes Application Developer - CKAD, Certified Kubernetes Security Specialist - CKS).
- Familiarity with CI/CD pipelines and tools such as GitHub.
- Knowledge of software-defined networking (SDN) solutions for Kubernetes.
- Contributions to open-source projects related to Kubernetes or containerization technologies.
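Among the security practices the listing names is RBAC. As a minimal sketch, the snippet below builds a namespaced read-only `Role` and a `RoleBinding` for a service account as plain Python dicts and serializes them to JSON, which `kubectl apply` accepts alongside YAML. The namespace and service-account names are hypothetical.

```python
import json

def read_only_role(namespace, service_account):
    """Build a namespaced read-only Role plus a RoleBinding granting it
    to one service account. Returns the two manifests as dicts."""
    role = {
        "apiVersion": "rbac.authorization.k8s.io/v1",
        "kind": "Role",
        "metadata": {"name": "pod-reader", "namespace": namespace},
        # Read-only verbs on pods and their logs, core ("") API group.
        "rules": [{"apiGroups": [""],
                   "resources": ["pods", "pods/log"],
                   "verbs": ["get", "list", "watch"]}],
    }
    binding = {
        "apiVersion": "rbac.authorization.k8s.io/v1",
        "kind": "RoleBinding",
        "metadata": {"name": "pod-reader-binding", "namespace": namespace},
        "subjects": [{"kind": "ServiceAccount",
                      "name": service_account, "namespace": namespace}],
        "roleRef": {"apiGroup": "rbac.authorization.k8s.io",
                    "kind": "Role", "name": "pod-reader"},
    }
    return role, binding

# Hypothetical names: a NiFi operator account in a "data-flows" namespace.
role, binding = read_only_role("data-flows", "nifi-operator")
manifest = json.dumps([role, binding], indent=2)
```

In practice these objects would live in a Git repository and be reconciled by ArgoCD rather than applied by hand, which is the GitOps flow the listing describes.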

Posted 1 week ago

Apply

2.0 - 5.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-Have Skills: PySpark
Good-to-Have Skills: Python (Programming Language), Scala
Minimum Experience: 5 year(s)
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will engage in problem-solving activities, participate in team meetings, and contribute to the overall success of projects by delivering high-quality applications that align with business objectives.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Continuously evaluate and improve application performance and user experience.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in PySpark.
- Good-to-Have Skills: Experience with Python (Programming Language), Scala.
- Strong understanding of data processing frameworks and distributed computing.
- Experience with data integration and ETL processes.
- Familiarity with cloud platforms and services related to application development.

Additional Information:
- The candidate should have a minimum of 5 years of experience in PySpark.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 1 week ago

Apply

2.0 - 5.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-Have Skills: Dataiku Data Science Studio (DSS)
Good-to-Have Skills: NA
Minimum Experience: 5 year(s)
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications are optimized for performance and usability. You will also engage in problem-solving activities, providing support and enhancements to existing applications while staying updated with the latest technologies and methodologies in application development.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Dataiku Data Science Studio (DSS).
- Strong understanding of data integration and transformation processes.
- Experience with application development methodologies such as Agile or Scrum.
- Familiarity with data visualization techniques and tools.
- Ability to troubleshoot and resolve application issues efficiently.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Dataiku Data Science Studio (DSS).
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 1 week ago

Apply

2.0 - 5.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-Have Skills: Informatica PowerCenter
Good-to-Have Skills: NA
Minimum Experience: 5 year(s)
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions, and ensuring that applications function seamlessly within the existing infrastructure. You will also engage in problem-solving activities, providing support and enhancements to existing applications, while continuously seeking ways to improve processes and user experiences.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Informatica PowerCenter.
- Strong understanding of ETL processes and data integration techniques.
- Experience with data warehousing concepts and methodologies.
- Familiarity with SQL and database management systems.
- Ability to troubleshoot and optimize application performance.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Informatica PowerCenter.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 1 week ago

Apply

6.0 years

0 Lacs

Gurugram, Haryana, India

On-site

What are we looking for?

- Must have experience with at least one cloud platform (AWS, GCP, or Azure) – AWS preferred
- Must have experience with lakehouse-based systems such as Iceberg, Hudi, or Delta
- Must have experience with at least one programming language (Python, Scala, or Java) along with SQL
- Must have experience with Big Data technologies such as Spark, Hadoop, Hive, or other distributed systems
- Must have experience with data orchestration tools like Airflow
- Must have experience in building reliable and scalable ETL pipelines
- Good to have experience in data modeling
- Good to have exposure to building AI-led data applications/services

Qualifications and Skills:
- 2–6 years of professional experience in a Data Engineering role.
- Knowledge of distributed systems such as Hadoop, Hive, Spark, Kafka, etc.
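The "reliable and scalable ETL pipelines" called out above all share the same extract-transform-load shape. A minimal plain-Python sketch of that pattern is below; the function names and sample records are illustrative assumptions, and a production pipeline would typically run the same steps in Spark and schedule them with Airflow, as the posting describes:

```python
# Minimal ETL pipeline sketch (illustrative; data and names are hypothetical).
# A real pipeline would read from object storage and write to a lakehouse
# table (Iceberg/Hudi/Delta) instead of in-memory lists.

def extract():
    # Stand-in for reading raw rows from a source system.
    return [
        {"user_id": "1", "amount": "10.50"},
        {"user_id": "2", "amount": "bad"},   # malformed row
        {"user_id": "3", "amount": "7.25"},
    ]

def transform(rows):
    # Cast types and quarantine rows that fail validation, so the
    # pipeline can surface data-quality metrics instead of crashing.
    clean, rejected = [], []
    for row in rows:
        try:
            clean.append({"user_id": int(row["user_id"]),
                          "amount": float(row["amount"])})
        except ValueError:
            rejected.append(row)
    return clean, rejected

def load(rows, target):
    # Idempotent append keyed on user_id, so a retried run
    # does not duplicate rows in the target.
    existing = {r["user_id"] for r in target}
    target.extend(r for r in rows if r["user_id"] not in existing)
    return target

target = []
clean, rejected = transform(extract())
load(clean, target)
print(len(target), len(rejected))  # 2 1
```

The quarantine-and-continue and idempotent-load choices are what "reliable" usually means in practice: bad records are reported rather than fatal, and reruns are safe.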

Posted 1 week ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Informatica PowerCenter
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing application features, and ensuring that the solutions align with organizational goals. You will also participate in testing and troubleshooting to enhance application performance and user experience, contributing to the overall success of the projects you are involved in.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Actively participate in and contribute to team discussions.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application processes and workflows.
- Engage in continuous learning to stay updated with industry trends and technologies.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Informatica PowerCenter.
- Strong understanding of ETL processes and data integration techniques.
- Experience with data warehousing concepts and methodologies.
- Familiarity with SQL and database management systems.
- Ability to troubleshoot and optimize application performance.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Informatica PowerCenter.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Qualification: 15 years full time education

Posted 1 week ago

Apply

12.0 - 15.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: PySpark
Minimum 12 year(s) of experience is required
Educational Qualification: Graduate

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, participate in team meetings, and contribute to the overall success of projects by leveraging your expertise in application development.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Databricks Unified Data Analytics Platform.
- Good-to-Have Skills: Experience with PySpark.
- Strong understanding of data integration and ETL processes.
- Experience in developing scalable applications using cloud technologies.
- Familiarity with data governance and compliance standards.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- A Graduate degree is required.

Qualification: Graduate

Posted 1 week ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Talend ETL
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

As a Data Engineer, you will be responsible for designing, developing, and maintaining data solutions for data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across systems.

Roles & Responsibilities:
- Expected to be an SME with deep knowledge and experience.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Create data pipelines to extract, transform, and load data across systems.
- Implement ETL processes to migrate and deploy data across systems.
- Ensure data quality and integrity throughout the data lifecycle.

Professional & Technical Skills:
- Required Skill: Expert proficiency in Talend Big Data.
- Strong understanding of data engineering principles and best practices.
- Experience with data integration and data warehousing concepts.
- Experience with data migration and deployment.
- Proficiency in SQL and database management.
- Knowledge of data modeling and optimization techniques.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Talend ETL.
- This position is based at our Hyderabad office.
- A 15 years full-time education is required.

Qualification: 15 years full time education
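"Ensuring data quality and integrity throughout the data lifecycle" usually means running declarative rules against each batch. A tool-agnostic Python sketch of rule-based checks is below; in Talend these would be modeled as components in a job, and the rule names and sample rows here are assumptions for illustration:

```python
# Rule-based data-quality check sketch (tool-agnostic; rules and sample
# data are hypothetical). Each rule maps a name to a row-level predicate.

rules = {
    "id_present":      lambda r: r.get("id") is not None,
    "amount_positive": lambda r: isinstance(r.get("amount"), (int, float))
                                 and r["amount"] > 0,
}

def run_checks(rows, rules):
    # Returns {rule_name: [indexes of failing rows]} for reporting,
    # so a pipeline can decide whether to halt or quarantine.
    failures = {name: [] for name in rules}
    for i, row in enumerate(rows):
        for name, predicate in rules.items():
            if not predicate(row):
                failures[name].append(i)
    return failures

rows = [{"id": 1, "amount": 5.0}, {"id": None, "amount": -2.0}]
report = run_checks(rows, rules)
print(report)  # {'id_present': [1], 'amount_positive': [1]}
```

Keeping failure indexes per rule (rather than a single pass/fail flag) makes the quality report actionable: each failing row can be traced back to the specific rule it violated.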

Posted 1 week ago

Apply

2.0 - 5.0 years

5 - 9 Lacs

Chennai

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: PySpark
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and debugging processes to ensure the applications function as intended, while continuously seeking ways to enhance application efficiency and user experience.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Actively participate in and contribute to team discussions.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application specifications and user guides.
- Engage in code reviews to ensure adherence to best practices and standards.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data integration and ETL processes.
- Experience with cloud computing platforms and services.
- Familiarity with programming languages such as Python or Scala.
- Knowledge of data visualization techniques and tools.

Additional Information:
- The candidate should have a minimum of 2 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Chennai office.
- A 15 years full time education is required.
- Candidate should be ready to work in rotational shifts.

Qualification: 15 years full time education

Posted 1 week ago

Apply

9.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About McDonald’s: One of the world’s largest employers with locations in more than 100 countries, McDonald’s Corporation has corporate opportunities in Hyderabad. Our global offices serve as dynamic innovation and operations hubs, designed to expand McDonald's global talent base and in-house expertise. Our new office in Hyderabad will bring together knowledge across business, technology, analytics, and AI, accelerating our ability to deliver impactful solutions for the business and our customers across the globe.

Position Summary: We are seeking an experienced Data Architect to design, implement, and optimize scalable data solutions on Amazon Web Services (AWS) and/or Google Cloud Platform (GCP). The ideal candidate will lead the development of enterprise-grade data architectures that support analytics, machine learning, and business intelligence initiatives while ensuring security, performance, and cost optimization.

Key Responsibilities:

Architecture & Design:
- Design and implement comprehensive data architectures using AWS or GCP services
- Develop data models, schemas, and integration patterns for structured and unstructured data
- Create solution blueprints, technical documentation, architectural diagrams, and best practice guidelines
- Implement data governance frameworks and ensure compliance with security standards
- Design disaster recovery and business continuity strategies for data systems

Technical Leadership:
- Lead cross-functional teams in implementing data solutions and migrations
- Provide technical guidance on cloud data services selection and optimization
- Collaborate with stakeholders to translate business requirements into technical solutions
- Drive adoption of cloud-native data technologies and modern data practices

Platform Implementation:
- Implement data pipelines using cloud-native services (AWS Glue, Google Dataflow, etc.)
- Configure and optimize data lakes and data warehouses (S3/Redshift, GCS/BigQuery)
- Set up real-time streaming data processing solutions (Kafka, Airflow, Pub/Sub)
- Implement automated data quality monitoring and validation processes
- Establish CI/CD pipelines for data infrastructure deployment

Performance & Optimization:
- Monitor and optimize data pipeline performance and cost efficiency
- Implement data partitioning, indexing, and compression strategies
- Conduct capacity planning and scaling recommendations
- Troubleshoot complex data processing issues and performance bottlenecks
- Establish monitoring, alerting, and logging for data systems

Skills:
- Bachelor’s degree in computer science, Data Engineering, or related field
- 9+ years of experience in data architecture and engineering
- 5+ years of hands-on experience with AWS or GCP data services
- Experience with large-scale data processing and analytics platforms
- AWS: Redshift, S3, Glue, EMR, Kinesis, Lambda, Data Pipeline, Step Functions, CloudFormation
- GCP: BigQuery, Cloud Storage, Dataflow, Dataproc, Pub/Sub, Cloud Functions, Cloud Composer, Deployment Manager
- IAM, VPC, and security configurations
- SQL and NoSQL databases
- Big data technologies (Spark, Hadoop, Kafka)
- Programming languages (Python, Java, SQL)
- Data modeling and ETL/ELT processes
- Infrastructure as Code (Terraform, CloudFormation)
- Container technologies (Docker, Kubernetes)
- Data warehousing concepts and dimensional modeling
- Experience with modern data architecture patterns
- Real-time and batch data processing architectures
- Data governance, lineage, and quality frameworks
- Business intelligence and visualization tools
- Machine learning pipeline integration
- Strong communication and presentation abilities
- Leadership and team collaboration skills
- Problem-solving and analytical thinking
- Customer-focused mindset with business acumen

Preferred Qualifications:
- Master’s degree in relevant field
- Cloud certifications (AWS Solutions Architect, GCP Professional Data Engineer)
- Experience with multiple cloud platforms
- Knowledge of data privacy regulations (GDPR, CCPA)

Work location: Hyderabad, India
Work pattern: Full time role.
Work mode: Hybrid.

Additional Information: McDonald’s is committed to providing qualified individuals with disabilities with reasonable accommodations to perform the essential functions of their jobs. McDonald’s provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to sex, sex stereotyping, pregnancy (including pregnancy, childbirth, and medical conditions related to pregnancy, childbirth, or breastfeeding), race, color, religion, ancestry or national origin, age, disability status, medical condition, marital status, sexual orientation, gender, gender identity, gender expression, transgender status, protected military or veteran status, citizenship status, genetic information, or any other characteristic protected by federal, state or local laws. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training.

McDonald’s Capability Center India Private Limited (“McDonald’s in India”) is a proud equal opportunity employer and is committed to hiring a diverse workforce and sustaining an inclusive culture. At McDonald’s in India, employment decisions are based on merit, job requirements, and business needs, and all qualified candidates are considered for employment. McDonald’s in India does not discriminate based on race, religion, colour, age, gender, marital status, nationality, ethnic origin, sexual orientation, political affiliation, veteran status, disability status, medical history, parental status, genetic information, or any other basis protected under state or local laws. Nothing in this job posting or description should be construed as an offer or guarantee of employment.
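The "data partitioning ... strategies" responsibility above most often means Hive-style key=value directory layouts on object storage, which lets query engines prune partitions instead of scanning the whole dataset. A short sketch of generating such paths is below; the bucket and dataset names are hypothetical:

```python
# Hive-style partition path sketch. Engines such as Spark, Athena, or
# BigQuery external tables can prune year=/month=/day= directories at
# query time. Bucket and dataset names are hypothetical.

from datetime import date

def partition_path(bucket: str, dataset: str, event_date: date) -> str:
    return (f"s3://{bucket}/{dataset}/"
            f"year={event_date.year}/month={event_date.month:02d}/"
            f"day={event_date.day:02d}/")

print(partition_path("analytics-lake", "orders", date(2024, 3, 7)))
# s3://analytics-lake/orders/year=2024/month=03/day=07/
```

Zero-padding the month and day keeps lexicographic ordering of paths aligned with chronological ordering, which matters for range listings over object storage.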

Posted 1 week ago

Apply

4.0 - 9.0 years

10 - 14 Lacs

Pune

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Microsoft Azure Data Services
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team through the development process. You will also engage in strategic planning sessions to align project goals with organizational objectives, ensuring that the applications developed meet both user needs and technical requirements. Your role will be pivotal in fostering a collaborative environment that encourages innovation and problem-solving among team members.

Roles & Responsibilities:
- Minimum of 4 years of experience in data engineering or similar roles.
- Proven expertise with Databricks and data processing frameworks.
- Technical skills: SQL, Spark, PySpark, Databricks, Python, Scala, Spark SQL.
- Strong understanding of data warehousing, ETL processes, and data pipeline design.
- Experience with SQL, Python, and Spark.
- Excellent problem-solving and analytical skills.
- Effective communication and teamwork abilities.

Professional & Technical Skills:
- Experience and knowledge of Azure SQL Database, Azure Data Factory, ADLS.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Microsoft Azure Data Services.
- This position is based in Pune.
- A 15 years full time education is required.

Qualification: 15 years full time education

Posted 1 week ago

Apply

5.0 - 8.0 years

10 - 14 Lacs

Gurugram

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Apache Spark
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will be responsible for designing, building, and configuring applications. Acting as the primary point of contact, you will lead the development team, oversee the delivery process, and ensure successful project execution.

Roles & Responsibilities:
- Act as a Subject Matter Expert (SME) in application development.
- Lead and manage a development team to achieve performance goals.
- Make key technical and architectural decisions.
- Collaborate with cross-functional teams and stakeholders.
- Provide technical solutions to complex problems across multiple teams.
- Oversee the complete application development lifecycle.
- Gather and analyze requirements in coordination with stakeholders.
- Ensure timely and high-quality delivery of projects.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Apache Spark.
- Strong understanding of big data processing.
- Experience with data streaming technologies.
- Hands-on experience in building scalable, high-performance applications.
- Knowledge of cloud computing platforms.
- Must-Have Additional Skills: PySpark, Spark SQL/SQL, AWS.

Additional Information:
- This is a full-time, on-site role based in Gurugram.
- Candidates must have a minimum of 5 years of hands-on experience with Apache Spark.
- A minimum of 15 years of full-time formal education is mandatory.

Qualification: 15 years full time education
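The Spark SQL/SQL skill listed above boils down to writing aggregation queries against tabular data. The same query shape can be sketched with Python's stdlib sqlite3 (in Spark it would be submitted via `spark.sql(...)` against a registered DataFrame; the table and column names here are assumptions):

```python
# SQL aggregation sketch using sqlite3 (stdlib). The GROUP BY query is
# the same shape you would run through spark.sql() on a DataFrame
# registered as a temp view. Table and column names are hypothetical.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (region TEXT, amount REAL)")
conn.executemany("INSERT INTO events VALUES (?, ?)",
                 [("apac", 10.0), ("apac", 5.0), ("emea", 3.0)])

rows = conn.execute(
    "SELECT region, SUM(amount) AS total "
    "FROM events GROUP BY region ORDER BY total DESC"
).fetchall()
print(rows)  # [('apac', 15.0), ('emea', 3.0)]
```

The difference in Spark is only in execution, not in the SQL: the same statement is planned into a distributed shuffle-and-aggregate job rather than a single-process scan.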

Posted 1 week ago

Apply

5.0 - 8.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: SAP BW/4HANA Data Modeling & Development
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application requirements are met, overseeing the development process, and providing guidance to team members. You will also engage in problem-solving activities, ensuring that the applications are aligned with business objectives and user needs, while maintaining a focus on quality and efficiency throughout the project lifecycle.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing and mentoring among team members.
- Monitor project progress and ensure timely delivery of milestones.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in SAP BW/4HANA Data Modeling & Development.
- Strong understanding of data warehousing concepts and best practices.
- Experience with ETL processes and data integration techniques.
- Familiarity with reporting tools and data visualization techniques.
- Ability to troubleshoot and optimize data models for performance.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in SAP BW/4HANA Data Modeling & Development.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Qualification: 15 years full time education

Posted 1 week ago

Apply

5.0 - 8.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: Scala, PySpark
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application development aligns with business objectives, overseeing project timelines, and facilitating communication among stakeholders to drive project success. You will also engage in problem-solving activities, providing guidance and support to your team while ensuring that best practices are followed throughout the development process. Your role will be pivotal in shaping the direction of application projects and ensuring that they meet the needs of the organization and its clients.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Facilitate workshops and meetings to gather requirements and feedback from stakeholders.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Databricks Unified Data Analytics Platform.
- Good-to-Have Skills: Experience with PySpark, Scala.
- Strong understanding of data engineering principles and practices.
- Experience with cloud-based data solutions and architectures.
- Familiarity with data governance and compliance standards.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Qualification: 15 years full time education

Posted 1 week ago

Apply

5.0 - 8.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Palantir Foundry
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Project Role: Lead Data Engineer
Project Role Description: Design, build and enhance applications to meet business process and requirements in Palantir Foundry.
Work experience: Minimum 6 years
Must-Have Skills: Palantir Foundry, PySpark
Good-to-Have Skills: Experience in PySpark, Python and SQL; knowledge of Big Data tools and technologies; organizational and project management experience.

Job Requirements & Key Responsibilities:
- Responsible for designing, developing, testing, and supporting data pipelines and applications on Palantir Foundry.
- Configure and customize Workshop to design and implement workflows and ontologies.
- Collaborate with data engineers and stakeholders to ensure successful deployment and operation of Palantir Foundry applications.
- Work with stakeholders including the product owner, data, and design teams to assist with data-related technical issues, understand the requirements, and design the data pipeline.
- Work independently, troubleshoot issues, and optimize performance.
- Communicate design processes, ideas, and solutions clearly and effectively to team and client.
- Assist junior team members in improving efficiency and productivity.

Technical Experience:
- Proficiency in PySpark, Python and SQL, with demonstrable ability to write and optimize SQL and Spark jobs.
- Hands-on experience with Palantir Foundry services such as Data Connection, Code Repository, Contour, data lineage and health checks.
- Good to have working experience with Workshop, Ontology, Slate.
- Hands-on experience in data engineering and building data pipelines (code/no code) for ELT/ETL data migration, data refinement, and data quality checks on Palantir Foundry.
- Experience in ingesting data from different external source systems using data connections and syncs.
- Good knowledge of Spark architecture and hands-on experience with performance tuning and code optimization.
- Proficient in managing both structured and unstructured data, with expertise in handling various file formats such as CSV, JSON, Parquet, and ORC.
- Experience in developing and managing scalable architecture and managing large data sets.
- Good understanding of data loading mechanisms and the ability to implement strategies for capturing CDC (change data capture).
- Nice to have: test-driven development and CI/CD workflows.
- Experience with version control software such as Git and with major hosting services (e.g. Azure DevOps, GitHub, Bitbucket, GitLab).
- Adherence to code best practices that enhance code readability, maintainability, and overall quality.

Educational Qualification: 15 years full time education

Qualification: 15 years full time education
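One of the "strategies for capturing CDC" mentioned above is snapshot comparison: diff the previous and current extracts of a keyed table to classify inserts, updates, and deletes. A minimal sketch of that strategy is below (log-based CDC, reading a database changelog, is the usual production alternative; the keys and records here are hypothetical):

```python
# Snapshot-diff CDC sketch. Compares two keyed extracts of the same
# table and classifies the changes. This illustrates the comparison
# strategy only; sample data is hypothetical.

def capture_changes(previous: dict, current: dict):
    inserts = {k: v for k, v in current.items() if k not in previous}
    deletes = {k: v for k, v in previous.items() if k not in current}
    updates = {k: v for k, v in current.items()
               if k in previous and previous[k] != v}
    return inserts, updates, deletes

prev = {1: {"status": "open"}, 2: {"status": "open"}}
curr = {2: {"status": "closed"}, 3: {"status": "open"}}

ins, upd, dele = capture_changes(prev, curr)
print(ins, upd, dele)
# {3: {'status': 'open'}} {2: {'status': 'closed'}} {1: {'status': 'open'}}
```

The trade-off is cost versus infrastructure: snapshot diffing needs no access to the source system's internals but re-reads the full table each run, while log-based CDC captures changes incrementally.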

Posted 1 week ago

Apply

5.0 - 8.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Data Warehouse ETL Testing
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team in implementing effective solutions. You will also engage in strategic planning sessions to align project goals with organizational objectives, ensuring that all stakeholders are informed and involved in the decision-making process. Your role will be pivotal in driving the success of application development initiatives while fostering a collaborative and innovative work environment.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and ensure adherence to timelines and quality standards.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Data Warehouse ETL Testing.
- Strong understanding of data integration processes and methodologies.
- Experience with various ETL tools and frameworks.
- Ability to design and execute test cases for data validation.
- Familiarity with database management systems and SQL.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Data Warehouse ETL Testing.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Qualification: 15 years full time education

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies