
61 AWS EMR Jobs - Page 2

JobPe aggregates these listings for easy access; applications are submitted directly on the original job portal.

8.0 - 11.0 years

35 - 37 Lacs

Kolkata, Ahmedabad, Bengaluru

Work from Office


Dear Candidate,

We are hiring a Cloud Architect to design and oversee scalable, secure, and cost-efficient cloud solutions. Great for architects who bridge technical vision with business needs.

Key Responsibilities:
- Design cloud-native solutions using AWS, Azure, or GCP
- Lead cloud migration and transformation projects
- Define cloud governance, cost control, and security strategies
- Collaborate with DevOps and engineering teams on implementation

Required Skills & Qualifications:
- Deep expertise in cloud architecture and multi-cloud environments
- Experience with containers, serverless, and microservices
- Proficiency in Terraform, CloudFormation, or equivalent
- Bonus: cloud certification (AWS/Azure/GCP Architect)

Soft Skills:
- Strong troubleshooting and problem-solving skills
- Ability to work independently and in a team
- Excellent communication and documentation skills

Note: If interested, please share your updated resume and a preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa
Delivery Manager, Integra Technologies

Posted 3 weeks ago

Apply

8.0 - 11.0 years

35 - 37 Lacs

Kolkata, Ahmedabad, Bengaluru

Work from Office


Dear Candidate,

We are looking for a Cloud Data Engineer to build cloud-based data pipelines and analytics platforms.

Key Responsibilities:
- Develop ETL workflows using cloud data services
- Manage data storage, lakes, and warehouses
- Ensure data quality and pipeline reliability

Required Skills & Qualifications:
- Experience with BigQuery, Redshift, or Azure Synapse
- Proficiency in SQL, Python, or Spark
- Familiarity with data lake architecture and batch/streaming

Soft Skills:
- Strong troubleshooting and problem-solving skills
- Ability to work independently and in a team
- Excellent communication and documentation skills

Note: If interested, please share your updated resume and a preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa
Delivery Manager, Integra Technologies

Posted 3 weeks ago

Apply

5 - 10 years

15 - 30 Lacs

Hyderabad, Pune, Bengaluru

Hybrid


EPAM has a presence across 40+ countries globally, with 55,000+ professionals and numerous delivery centers. Key locations are North America, Eastern Europe, Central Europe, Western Europe, APAC, and the Middle East, with development centers in India (Hyderabad, Pune & Bangalore).

Location: Gurgaon/Pune/Hyderabad/Bengaluru/Chennai
Work Mode: Hybrid (2-3 days in office per week)

Job Description:
- 5-14 years of experience in Big Data and data-related technologies
- Expert-level understanding of distributed computing principles
- Expert-level knowledge of and experience with Apache Spark
- Hands-on programming with Python
- Proficiency with Hadoop v2, MapReduce, HDFS, Sqoop
- Experience building stream-processing systems using technologies such as Apache Storm or Spark Streaming
- Good understanding of Big Data querying tools such as Hive and Impala
- Experience integrating data from multiple sources such as RDBMS (SQL Server, Oracle), ERP, and files
- Good understanding of SQL queries, joins, stored procedures, and relational schemas
- Experience with NoSQL databases such as HBase, Cassandra, MongoDB
- Knowledge of ETL techniques and frameworks
- Performance tuning of Spark jobs
- Experience with native cloud data services (AWS/Azure/GCP)
- Ability to lead a team efficiently
- Experience designing and implementing Big Data solutions
- Practitioner of Agile methodology

WE OFFER:
- Opportunity to work on technical challenges that may have impact across geographies
- Vast opportunities for self-development: online university, global knowledge sharing, learning through external certifications
- Opportunity to share your ideas on international platforms
- Sponsored Tech Talks & Hackathons
- Possibility to relocate to any EPAM office for short- and long-term projects
- Focused individual development
- Benefit package: health and medical benefits, retirement benefits, paid time off, flexible benefits
- Forums to explore passions beyond work (CSR, photography, painting, sports, etc.)

Posted 1 month ago

Apply

5 - 8 years

5 - 15 Lacs

Pune, Chennai

Work from Office


- SQL: 2-4 years of experience
- Spark: 1-2 years of experience
- NoSQL databases: 1-2 years of experience
- Database architecture: 2-3 years of experience
- Cloud architecture: 1-2 years of experience
- Experience in a programming language such as Python
- Good understanding of ETL (Extract, Transform, Load) concepts
- Good analytical and problem-solving skills
- Inclination for learning and self-motivation
- Knowledge of a ticketing tool such as JIRA/SNOW
- Good communication skills for interacting with customers on issues and requirements

Good to Have:
- Knowledge of / experience in Scala
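The ETL (Extract, Transform, Load) concepts this listing asks for can be illustrated with a minimal, pure-Python sketch. The data, field names, and target store here are invented for illustration and do not come from the listing:

```python
# Minimal ETL sketch: extract raw rows, transform/validate them, load into a target.
# All records and names are illustrative.

def extract():
    # Extract: pull raw records from a source (an in-memory list standing in
    # for a file, database table, or API response).
    return [
        {"order_id": 1, "amount": "120.50", "country": "in"},
        {"order_id": 2, "amount": "80.00", "country": "us"},
        {"order_id": 3, "amount": "bad", "country": "in"},
    ]

def transform(rows):
    # Transform: cast types, normalize values, and drop rows that fail validation.
    clean = []
    for row in rows:
        try:
            amount = float(row["amount"])
        except ValueError:
            continue  # skip malformed records
        clean.append({
            "order_id": row["order_id"],
            "amount": amount,
            "country": row["country"].upper(),
        })
    return clean

def load(rows, target):
    # Load: write the cleaned records into the target store (a dict keyed by id).
    for row in rows:
        target[row["order_id"]] = row
    return target

warehouse = {}
load(transform(extract()), warehouse)
print(len(warehouse))           # 2 (the malformed row was dropped)
print(warehouse[1]["country"])  # IN
```

The same extract/transform/load split carries over directly to the tools named in the listing (Spark jobs, cloud data services); only the sources and sinks change.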

Posted 1 month ago

Apply

2 - 3 years

0 - 0 Lacs

Thiruvananthapuram

Work from Office


Role Proficiency: Acts under very minimal guidance to develop error-free code, and to test and document applications.

Outcomes:
- Understand the application's features and component design, and develop them in accordance with user stories/requirements
- Code, debug, test, and document; communicate product/component/feature development stages
- Independently develop optimized code with appropriate approaches and algorithms, following standards and security guidelines
- Effectively interact with customers and articulate their input
- Optimize efficiency, cost, and quality by identifying opportunities for automation/process improvements and agile delivery models
- Mentor Developer I - Software Engineering to become more effective in their role
- Learn the technology, business domain, and system domain as recommended by the project/account
- Set FAST goals and provide feedback on the FAST goals of mentees

Measures of Outcomes:
- Adherence to engineering processes and standards (coding standards)
- Adherence to schedules/timelines
- Adherence to SLAs where applicable
- Number of defects post-delivery
- Number of non-compliance issues
- Reduction of recurrence of known defects
- Quick turnaround of production bugs
- Meeting the defined productivity standards for the project
- Completion of applicable technical/domain certifications
- Completion of all mandatory training requirements

Outputs Expected:
- Configure: follow the configuration process
- Test: create and conduct unit testing
- Domain relevance: develop features and components with a good understanding of the business problem being addressed for the client
- Manage defects: raise, fix, and retest defects
- Estimate: estimate time, effort, and resource dependence for one's own work
- Mentoring: mentor junior developers in the team; set FAST goals and provide feedback on mentees' FAST goals
- Document: create documentation for one's own work
- Manage knowledge: consume and contribute to project-related documents, SharePoint libraries, and client universities
- Status reporting: report the status of assigned tasks; comply with project-related reporting standards/processes
- Release: adhere to the release management process
- Design: understand the design/LLD and link it to requirements/user stories
- Code: develop code, with guidance, for the above

Skill Examples:
- Explain and communicate the design/development to the customer
- Perform and evaluate test results against product specifications
- Develop user interfaces, business software components, and embedded software components
- Manage and guarantee high levels of cohesion and quality
- Use data models
- Estimate the effort and time required for one's own work
- Perform and evaluate tests in the customer's or target environment
- Be a team player
- Good written and verbal communication abilities
- Proactively ask for and offer help

Knowledge Examples:
- Appropriate software programs/modules
- Technical design
- Programming languages
- DBMS
- Operating systems and software platforms
- Integrated development environments (IDEs)
- Agile methods
- Knowledge of the customer domain and sub-domain where the problem is solved

Additional Comments - Responsibilities and Skills:
- Manage incident response and root cause analysis, and ensure high system availability
- Oversee support for Hadoop, Spark, Hive, PySpark, Snowflake, and AWS EMR
- Maintain Python Flask APIs, Scala applications, and Airflow workflows
- Optimize SQL/HQL queries and manage shell/bash scripts
- Develop monitoring and alerting systems, and provide detailed reporting
- 3+ years in production support/data engineering, with team leadership
- Expertise in Hadoop, Spark, Hive, PySpark, SQL, HQL, Python, Scala, and Python Flask APIs
- Proficiency in Unix/Linux, shell/bash scripting, Snowflake, and AWS EMR
- Experience with Airflow and incident management
- Strong problem-solving and communication skills

Required Skills: Python, PySpark, Airflow
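The "monitoring and alerting" responsibility in the listing can be sketched with a minimal threshold check in plain Python. The metric names and limits below are invented for illustration; a real system would feed in readings from a cluster manager or job scheduler:

```python
# Minimal threshold-based alerting sketch; metric names and limits are illustrative.

def evaluate_alerts(metrics, thresholds):
    """Return an alert message for every metric that exceeds its threshold."""
    alerts = []
    for name, value in metrics.items():
        limit = thresholds.get(name)
        if limit is not None and value > limit:
            alerts.append(f"ALERT: {name}={value} exceeds threshold {limit}")
    return alerts

# Example readings, e.g. collected from a Spark/EMR cluster's metrics endpoint.
readings = {"failed_tasks": 7, "queue_depth": 120, "cpu_pct": 55}
limits = {"failed_tasks": 5, "queue_depth": 500, "cpu_pct": 90}

for alert in evaluate_alerts(readings, limits):
    print(alert)  # only failed_tasks breaches its limit here
```

In production the same predicate loop would typically run inside a scheduler (e.g. an Airflow task) and route its messages to a paging or reporting channel rather than printing them.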

Posted 1 month ago

Apply

4 - 9 years

12 - 16 Lacs

Hyderabad

Work from Office


As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the AWS Cloud Data Platform.

Responsibilities:
- Build data pipelines to ingest, process, and transform data from files, streams, and databases
- Process data with Spark, Python, PySpark, Scala, and Hive, HBase, or other NoSQL databases on cloud data platforms (AWS) or HDFS
- Develop efficient software code for multiple use cases built on the platform, leveraging the Spark framework with Python or Scala and Big Data technologies
- Develop streaming pipelines
- Work with Hadoop/AWS ecosystem components to implement scalable solutions for ever-increasing data volumes, using big data/cloud technologies such as Apache Spark, Kafka, and cloud computing services

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- 5-7+ years of total experience in data management (DW, DL, data platform, lakehouse) and data engineering
- Minimum 4 years of experience in Big Data technologies, with extensive data engineering experience in Spark with Python or Scala
- Minimum 3 years of experience on cloud data platforms on AWS; exposure to streaming solutions and message brokers such as Kafka
- Experience with AWS EMR / AWS Glue / Databricks, AWS Redshift, DynamoDB
- Good to excellent SQL skills

Preferred technical and professional experience:
- Certification in AWS and Databricks, or Cloudera Spark Certified Developer
- AWS S3, Redshift, and EMR for data storage and distributed processing
- AWS Lambda, AWS Step Functions, and AWS Glue to build serverless, event-driven data workflows and orchestrate ETL processes

Posted 1 month ago

Apply

8 - 13 years

12 - 22 Lacs

Gurugram

Work from Office


Data & Information Architecture Lead | 8 to 15 years | Gurgaon

Summary: An excellent opportunity for Data Architect professionals with expertise in data engineering, analytics, AWS, and databases.

Location: Gurgaon

Your Future Employer: A leading financial services provider specializing in delivering innovative and tailored solutions to meet the diverse needs of its clients, offering a wide range of services including investment management, risk analysis, and financial consulting.

Responsibilities:
- Design and optimize the architecture of an end-to-end data fabric, inclusive of the data lake, data stores, and EDW, in alignment with EA guidelines and standards for cataloging and maintaining data repositories
- Undertake detailed analysis of information management requirements across all systems, platforms, and applications to guide the development of information management standards
- Lead the design of the information architecture across multiple data types, working closely with business partners/consumers, the MIS team, the AI/ML team, and other departments to design, deliver, and govern future-proof data assets and solutions
- Design and ensure delivery excellence for (a) large and complex data transformation programs, (b) small and nimble data initiatives to realize quick gains, and (c) work with OEMs and partners to bring the best tools and delivery methods
- Drive data domain modeling, data engineering, and data resiliency design standards across the microservices and analytics application fabric for autonomy, agility, and scale

Requirements:
- Deep understanding of the data and information architecture discipline, processes, concepts, and best practices
- Hands-on expertise in building and implementing data architecture for large enterprises
- Proven architecture modeling skills; strong analytics and reporting experience
- Strong data design, management, and maintenance experience
- Strong experience with data modeling tools
- Extensive experience with cloud-native lake technologies, e.g. AWS native lake solutions

Posted 1 month ago

Apply

5 - 10 years

20 - 35 Lacs

Hyderabad, Pune, Bengaluru

Hybrid


EPAM has a presence across 40+ countries globally, with 55,000+ professionals and numerous delivery centers. Key locations are North America, Eastern Europe, Central Europe, Western Europe, APAC, and the Middle East, with development centers in India (Hyderabad, Pune & Bangalore).

Location: Gurgaon/Pune/Hyderabad/Bengaluru/Chennai
Work Mode: Hybrid (2-3 days in office per week)

Job Description:
- 5-14 years of experience in Big Data and data-related technologies
- Expert-level understanding of distributed computing principles
- Expert-level knowledge of and experience with Apache Spark
- Hands-on programming with Python
- Proficiency with Hadoop v2, MapReduce, HDFS, Sqoop
- Experience building stream-processing systems using technologies such as Apache Storm or Spark Streaming
- Good understanding of Big Data querying tools such as Hive and Impala
- Experience integrating data from multiple sources such as RDBMS (SQL Server, Oracle), ERP, and files
- Good understanding of SQL queries, joins, stored procedures, and relational schemas
- Experience with NoSQL databases such as HBase, Cassandra, MongoDB
- Knowledge of ETL techniques and frameworks
- Performance tuning of Spark jobs
- Experience with native cloud data services (AWS/Azure)
- Ability to lead a team efficiently
- Experience designing and implementing Big Data solutions
- Practitioner of Agile methodology

WE OFFER:
- Opportunity to work on technical challenges that may have impact across geographies
- Vast opportunities for self-development: online university, global knowledge sharing, learning through external certifications
- Opportunity to share your ideas on international platforms
- Sponsored Tech Talks & Hackathons
- Possibility to relocate to any EPAM office for short- and long-term projects
- Focused individual development
- Benefit package: health and medical benefits, retirement benefits, paid time off, flexible benefits
- Forums to explore passions beyond work (CSR, photography, painting, sports, etc.)

Posted 1 month ago

Apply

4 - 9 years

12 - 16 Lacs

Kochi

Work from Office


As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the AWS Cloud Data Platform.

Responsibilities:
- Build data pipelines to ingest, process, and transform data from files, streams, and databases
- Process data with Spark, Python, PySpark, Scala, and Hive, HBase, or other NoSQL databases on cloud data platforms (AWS) or HDFS
- Develop efficient software code for multiple use cases built on the platform, leveraging the Spark framework with Python or Scala and Big Data technologies
- Develop streaming pipelines
- Work with Hadoop/AWS ecosystem components to implement scalable solutions for ever-increasing data volumes, using big data/cloud technologies such as Apache Spark, Kafka, and cloud computing services

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- 5-7+ years of total experience in data management (DW, DL, data platform, lakehouse) and data engineering
- Minimum 4 years of experience in Big Data technologies, with extensive data engineering experience in Spark with Python or Scala
- Minimum 3 years of experience on cloud data platforms on AWS; exposure to streaming solutions and message brokers such as Kafka
- Experience with AWS EMR / AWS Glue / Databricks, AWS Redshift, DynamoDB
- Good to excellent SQL skills

Preferred technical and professional experience:
- Certification in AWS and Databricks, or Cloudera Spark Certified Developer

Posted 2 months ago

Apply

4 - 9 years

12 - 16 Lacs

Kochi

Work from Office


As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the AWS Cloud Data Platform.

Responsibilities:
- Build data pipelines to ingest, process, and transform data from files, streams, and databases
- Process data with Spark, Python, PySpark, Scala, and Hive, HBase, or other NoSQL databases on cloud data platforms (AWS) or HDFS
- Develop efficient software code for multiple use cases built on the platform, leveraging the Spark framework with Python or Scala and Big Data technologies
- Develop streaming pipelines
- Work with Hadoop/AWS ecosystem components to implement scalable solutions for ever-increasing data volumes, using big data/cloud technologies such as Apache Spark, Kafka, and cloud computing services

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Minimum 4 years of experience in Big Data technologies, with extensive data engineering experience in Spark with Python or Scala
- Minimum 3 years of experience on cloud data platforms on AWS
- Experience with AWS EMR / AWS Glue / Databricks, AWS Redshift, DynamoDB
- Good to excellent SQL skills
- Exposure to streaming solutions and message brokers such as Kafka

Preferred technical and professional experience:
- Certification in AWS and Databricks, or Cloudera Spark Certified Developer

Posted 2 months ago

Apply

4 - 9 years

12 - 16 Lacs

Kochi

Work from Office


As a Senior SAP Consultant, you will serve as a client-facing practitioner, working collaboratively with clients to deliver high-quality solutions, and act as a trusted business advisor with a deep understanding of the SAP Accelerate delivery methodology (or equivalent) and associated work products. You will work on projects that assist clients in integrating strategy, process, technology, and information to enhance effectiveness, reduce costs, and improve profit and shareholder value. There are opportunities to acquire new skills, work across different disciplines, take on new challenges, and develop a comprehensive understanding of various industries.

Your primary responsibilities include:
- Strategic SAP solution focus: working across technical design, development, and implementation of SAP solutions for simplicity, amplification, and maintainability that meet client needs
- Comprehensive solution delivery: involvement in strategy development and solution implementation, leveraging your knowledge of SAP and working with the latest technologies

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- 5-7+ years of total experience in data management (DW, DL, data platform, lakehouse) and data engineering
- Minimum 4 years of experience in Big Data technologies, with extensive data engineering experience in Spark with Python or Scala
- Minimum 3 years of experience on cloud data platforms on AWS; exposure to streaming solutions and message brokers such as Kafka
- Experience with AWS EMR / AWS Glue / Databricks, AWS Redshift, DynamoDB
- Good to excellent SQL skills

Preferred technical and professional experience:
- Certification in AWS and Databricks, or Cloudera Spark Certified Developer

Posted 2 months ago

Apply

5 - 10 years

15 - 30 Lacs

Bengaluru

Work from Office


Urgent Hiring: AWS Data Engineer, Senior Data Engineers & Lead Data Engineers

Company: GSPANN Technologies, Inc.
Location: Bangalore (5+ years of experience)
Apply Now: Send your resume to heena.ruchwani@gspann.com

GSPANN Technologies is seeking talented professionals with 5+ years of experience to join our team in Bangalore. We are looking for immediate joiners who are passionate about data engineering and eager to take on exciting challenges.

Key Skills & Experience:
- 5+ years of hands-on experience with AWS data services (Glue, Redshift, S3, Lambda, EMR, Athena, etc.)
- Strong expertise in Big Data technologies (Spark, Hadoop, Kafka)
- Proficiency in SQL, Python, and Scala
- Hands-on experience with ETL pipelines, data modeling, and cloud-based data solutions

Immediate joiners preferred! If you're ready to contribute to dynamic, data-driven projects and advance your career with GSPANN Technologies, apply today!

Posted 2 months ago

Apply

5 - 10 years

15 - 30 Lacs

Bengaluru

Work from Office


Urgent Hiring: AWS Data Engineer, Senior Data Engineers & Lead Data Engineers

Company: GSPANN Technologies, Inc.
Location: Bangalore (5+ years of experience)
Apply Now: Send your resume to heena.ruchwani@gspann.com

GSPANN Technologies is seeking talented professionals with 4+ years of experience to join our team in Bangalore. We are looking for immediate joiners who are passionate about data engineering and eager to take on exciting challenges.

Key Skills & Experience:
- 4+ years of hands-on experience with AWS data services (Glue, Redshift, S3, Lambda, EMR, Athena, etc.)
- Strong expertise in Big Data technologies (Spark, Hadoop, Kafka)
- Proficiency in SQL, Python, and Scala
- Hands-on experience with ETL pipelines, data modeling, and cloud-based data solutions

Immediate joiners preferred! If you're ready to contribute to dynamic, data-driven projects and advance your career with GSPANN Technologies, apply today!

Posted 2 months ago

Apply

8 - 12 years

20 - 35 Lacs

Bengaluru

Work from Office


Job Title: Data Engineer
Location: Bangalore
Experience: 8+ years

Job Description: We are looking for a highly skilled Data Engineer to join our team and contribute to our data ingestion and lakehouse initiatives. The ideal candidate will have extensive experience with data streaming technologies and cloud infrastructure management.

- 8+ years of experience in the relevant fields below (internships, prototypes, and personal projects do not count)
- Coding is required (ideally Python or Java)
- Own the end-to-end lifecycle, from development to deployment in the production environment
- Experience building or deploying solutions in the cloud, either cloud-native (serverless: S3, Lambda, AWS Batch, ECS) or cloud-agnostic (Kubernetes, Helm charts, ArgoCD, Prometheus, Grafana)
- CI/CD experience: GitHub Actions or Jenkins
- Infrastructure as code: e.g., Terraform

And experience in at least one of these focus areas:
- Big Data: building big data pipelines or platforms to process petabytes of data (PySpark, Hudi, data lineage, AWS Glue, AWS EMR, Kafka, Schema Registry)
- GraphDB: ingesting and consuming data in a graph database such as Neo4j, AWS Neptune, JanusGraph, or DGraph

How to Apply: Interested candidates can send their resumes to ritu.singh@calsoftinc.com. We are looking for candidates who can join within 7 days, or at most within 15 days.

Posted 2 months ago

Apply

2 - 6 years

12 - 16 Lacs

Kochi

Work from Office


Design, build, optimize, and support new and existing data models and ETL processes based on our clients' business requirements. Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing, data-driven organization. Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Developed PySpark code for AWS Glue jobs and for EMR
- Worked on scalable distributed data systems using the Hadoop ecosystem in AWS EMR and the MapR distribution
- Developed Python and PySpark programs for data analysis
- Good working experience with Python, developing a custom framework for generating rules (like a rules engine)
- Developed Hadoop streaming jobs using Python to integrate Python-API-supported applications
- Developed Python code to gather data from HBase and designed solutions implemented with PySpark
- Used Apache Spark DataFrames/RDDs to apply business transformations, and utilized HiveContext objects to perform read/write operations
- Rewrote some Hive queries in Spark SQL to reduce overall batch time

Preferred technical and professional experience:
- Understanding of DevOps
- Experience building scalable end-to-end data ingestion and processing solutions
- Experience with object-oriented and/or functional programming languages such as Python, Java, and Scala
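The listing mentions a custom Python framework for generating rules, "like a rules engine", without further detail. As a hedged illustration of that pattern, a tiny rules engine in plain Python might look like the sketch below; the rule names and record fields are invented for the example:

```python
# Tiny rules-engine sketch: each rule is a predicate registered under an action label.
# Field names and rules are invented for illustration.

RULES = []

def rule(label):
    """Decorator that registers a predicate under an action label."""
    def register(predicate):
        RULES.append((label, predicate))
        return predicate
    return register

@rule("flag_high_value")
def high_value(record):
    return record.get("amount", 0) > 10_000

@rule("flag_missing_region")
def missing_region(record):
    return not record.get("region")

def apply_rules(record):
    """Return the labels of all rules the record triggers."""
    return [label for label, predicate in RULES if predicate(record)]

print(apply_rules({"amount": 50_000, "region": "APAC"}))  # ['flag_high_value']
print(apply_rules({"amount": 100}))                       # ['flag_missing_region']
```

Keeping rules as data (a list of label/predicate pairs) is what makes this pattern easy to extend; in a PySpark setting the same predicates could be applied per row via a UDF or as DataFrame filter expressions.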

Posted 2 months ago

Apply

4 - 9 years

12 - 16 Lacs

Kochi

Work from Office


As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the AWS Cloud Data Platform.

Responsibilities:
- Build data pipelines to ingest, process, and transform data from files, streams, and databases
- Process data with Spark, Python, PySpark, Scala, and Hive, HBase, or other NoSQL databases on cloud data platforms (AWS) or HDFS
- Develop efficient software code for multiple use cases built on the platform, leveraging the Spark framework with Python or Scala and Big Data technologies
- Develop streaming pipelines
- Work with Hadoop/AWS ecosystem components to implement scalable solutions for ever-increasing data volumes, using big data/cloud technologies such as Apache Spark, Kafka, and cloud computing services

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- 6-7+ years of total experience in data management (DW, DL, data platform, lakehouse) and data engineering
- Minimum 4 years of experience in Big Data technologies, with extensive data engineering experience in Spark with Python or Scala
- Minimum 3 years of experience on cloud data platforms on AWS
- Experience with AWS EMR / AWS Glue / Databricks, AWS Redshift, DynamoDB
- Good to excellent SQL skills

Preferred technical and professional experience:
- Certification in AWS and Databricks, or Cloudera Spark Certified Developer

Posted 2 months ago

Apply

5 - 10 years

0 - 3 Lacs

Kochi

Hybrid


Dear Candidates,

Greetings from Cognizant! We are excited to let you know about the MEGA walk-in drive at our Kochi location on 22nd Mar '25.

Venue: Infopark Phase 2, Kakkanad, Kochi, Kerala 682030
Timing: 9 AM to 12:30 PM
POC: Ponnarasu M & Saranya E

The skill sets with mandatory experience are given below.

Experience range: 4 to 12 years
Skill set: Azure Databricks | IICS | AWS Glue | AWS Data Engineer | Databricks Admin | IICS Admin | IDMC Admin | Ab Initio | AI/ML | DSA | GenAI | MLOps | Adobe Analytics

Experience range: 5 to 12 years
Skill set: PySpark | Snowflake | Azure Data Engineer | Spark + Hive | Stibo MDM | SAP MDG | Reltio MDM | Informatica MDM | Spark + Scala

Regards,
Talent Acquisition, Cognizant

Posted 3 months ago

Apply

6 - 11 years

8 - 14 Lacs

Kolkata

Work from Office


About The Role:

SET 1: Must have knowledge of Azure Data Lake, Azure Functions, Azure Databricks, Azure Data Factory, and PostgreSQL. Working knowledge of Azure DevOps and Git flow would be an added advantage.

(OR)

SET 2: Must have working knowledge of AWS Kinesis, AWS EMR, AWS Glue, AWS RDS, AWS Athena, and AWS Redshift.

- Should have demonstrable knowledge and expertise in working with time-series data
- Working knowledge of delivering data engineering / data science projects in Industry 4.0 is an added advantage
- Should have knowledge of Palantir
- Strong problem-solving skills, with an emphasis on sustainable and reusable development
- Experience using statistical computing languages to manipulate data and draw insights from large data sets: Python/PySpark, Pandas, NumPy, seaborn/matplotlib; knowledge of Streamlit.io is a plus
- Familiarity with Scala, Go, or Java would be an added advantage
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with relational databases such as Microsoft SQL Server, MySQL, PostgreSQL, Oracle, and NoSQL databases such as Hadoop, Cassandra, MongoDB
- Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
- Experience building and optimizing big data pipelines, architectures, and data sets
- Strong analytic skills related to working with unstructured datasets

Primary Skills:
- Provide innovative solutions to the data engineering problems faced in the project and solve them with technically superior code and skills
- Where possible, document the process of choosing a technology or using integration patterns, and help create a knowledge management artifact that can be reused for other similar areas
- Create and apply best practices to deliver the project with clean code
- Work innovatively and proactively in fulfilling project needs

Additional Information:
Reporting to: Director - Intelligent Insights and Data Strategy
Travel: Must be willing to be deployed at client locations anywhere in the world for long and short terms, and be flexible to travel on shorter durations within India and abroad

Posted 3 months ago

Apply

4 - 8 years

8 - 12 Lacs

Pune

Work from Office


About Persistent We are a trusted Digital Engineering and Enterprise Modernization partner, combining deep technical expertise and industry experience to help our clients anticipate what’s next. Our offerings and proven solutions create a unique competitive advantage for our clients by giving them the power to see beyond and rise above. We work with many industry-leading organizations across the world including 12 of the 30 most innovative US companies, 80% of the largest banks in the US and India, and numerous innovators across the healthcare ecosystem. Our growth trajectory continues, as we reported $1,231M annual revenue (16% Y-o-Y). Along with our growth, we’ve onboarded over 4900 new employees in the past year, bringing our total employee count to over 23,500+ people located in 19 countries across the globe. Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. For more details please login to www.persistent.com About The Position We are looking for a Full Stack Developer who is a skilled professional with exposure to cloud platforms, DevOps, and data visualization, preferably with networking domain (Cisco) product exposure. You will collaborate with internal teams to design, develop, deploy, and maintain software applications at scale. To succeed as a Full Stack Developer, you will be required to ensure the timely completion and approvals of project deliverables. You will also be expected to recommend new technologies and techniques for application development. 
What You'll Do
Design, develop, deploy, and maintain software applications at scale using Java/J2EE, JavaScript frameworks (Angular or React), and associated technologies
Deploy software using CI/CD tools such as Jenkins
Understand the technologies implemented, and interface with the project manager on status and technical issues
Solve and articulate simple and complex problems with application design, development, and user experiences
Collaborate with other developers and designers, and assist with technical matters when required

Expertise You'll Bring
Proven ability in Java/J2EE, Spring, Python, Docker, Kubernetes, and Microservices
Knowledge of the distributed technologies below will give you an added advantage: Apache Spark, MapReduce principles, Kafka (MSK), Apache Hadoop (AWS EMR)

Benefits
Competitive salary and benefits package
Culture focused on talent development, with quarterly promotion cycles and company-sponsored higher education and certifications
Opportunity to work with cutting-edge technologies
Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards
Annual health check-ups
Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents

Inclusive Environment
We offer hybrid work options and flexible working hours to accommodate various needs and preferences. Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities.
Let's unleash your full potential. See Beyond, Rise Above.

Posted 3 months ago

Apply

4 - 8 years

8 - 12 Lacs

Bengaluru

Work from Office

Naukri logo

About Persistent
We are a trusted Digital Engineering and Enterprise Modernization partner, combining deep technical expertise and industry experience to help our clients anticipate what’s next. Our offerings and proven solutions create a unique competitive advantage for our clients by giving them the power to see beyond and rise above. We work with many industry-leading organizations across the world, including 12 of the 30 most innovative US companies, 80% of the largest banks in the US and India, and numerous innovators across the healthcare ecosystem. Our growth trajectory continues: we reported $1,231M annual revenue (16% Y-o-Y). Along with our growth, we’ve onboarded over 4,900 new employees in the past year, bringing our total employee count to more than 23,500 people located in 19 countries across the globe. Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. For more details, please visit www.persistent.com

About The Position
We are looking for a Full Stack Developer: a skilled professional with exposure to cloud platforms, DevOps, and data visualization, preferably with networking domain (Cisco) product exposure. You will collaborate with internal teams to design, develop, deploy, and maintain software applications at scale. To succeed as a Full Stack Developer, you will be required to ensure the timely completion and approval of project deliverables. You will also be expected to recommend new technologies and techniques for application development.
What You'll Do
Design, develop, deploy, and maintain software applications at scale using Java/J2EE, JavaScript frameworks (Angular or React), and associated technologies
Deploy software using CI/CD tools such as Jenkins
Understand the technologies implemented, and interface with the project manager on status and technical issues
Solve and articulate simple and complex problems with application design, development, and user experiences
Collaborate with other developers and designers, and assist with technical matters when required

Expertise You'll Bring
Proven ability in Java/J2EE, Spring, Python, Docker, Kubernetes, and Microservices
Knowledge of the distributed technologies below will give you an added advantage: Apache Spark, MapReduce principles, Kafka (MSK), Apache Hadoop (AWS EMR)

Benefits
Competitive salary and benefits package
Culture focused on talent development, with quarterly promotion cycles and company-sponsored higher education and certifications
Opportunity to work with cutting-edge technologies
Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards
Annual health check-ups
Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents

Inclusive Environment
We offer hybrid work options and flexible working hours to accommodate various needs and preferences. Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities.
Let's unleash your full potential. See Beyond, Rise Above.

Posted 3 months ago

Apply

4 - 8 years

8 - 12 Lacs

Hyderabad

Work from Office

Naukri logo

About Persistent
We are a trusted Digital Engineering and Enterprise Modernization partner, combining deep technical expertise and industry experience to help our clients anticipate what’s next. Our offerings and proven solutions create a unique competitive advantage for our clients by giving them the power to see beyond and rise above. We work with many industry-leading organizations across the world, including 12 of the 30 most innovative US companies, 80% of the largest banks in the US and India, and numerous innovators across the healthcare ecosystem. Our growth trajectory continues: we reported $1,231M annual revenue (16% Y-o-Y). Along with our growth, we’ve onboarded over 4,900 new employees in the past year, bringing our total employee count to more than 23,500 people located in 19 countries across the globe. Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. For more details, please visit www.persistent.com

About The Position
We are looking for a Full Stack Developer: a skilled professional with exposure to cloud platforms, DevOps, and data visualization, preferably with networking domain (Cisco) product exposure. You will collaborate with internal teams to design, develop, deploy, and maintain software applications at scale. To succeed as a Full Stack Developer, you will be required to ensure the timely completion and approval of project deliverables. You will also be expected to recommend new technologies and techniques for application development.
What You'll Do
Design, develop, deploy, and maintain software applications at scale using Java/J2EE, JavaScript frameworks (Angular or React), and associated technologies
Deploy software using CI/CD tools such as Jenkins
Understand the technologies implemented, and interface with the project manager on status and technical issues
Solve and articulate simple and complex problems with application design, development, and user experiences
Collaborate with other developers and designers, and assist with technical matters when required

Expertise You'll Bring
Proven ability in Java/J2EE, Spring, Python, Docker, Kubernetes, and Microservices
Knowledge of the distributed technologies below will give you an added advantage: Apache Spark, MapReduce principles, Kafka (MSK), Apache Hadoop (AWS EMR)

Benefits
Competitive salary and benefits package
Culture focused on talent development, with quarterly promotion cycles and company-sponsored higher education and certifications
Opportunity to work with cutting-edge technologies
Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards
Annual health check-ups
Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents

Inclusive Environment
We offer hybrid work options and flexible working hours to accommodate various needs and preferences. Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities.
Let's unleash your full potential. See Beyond, Rise Above.

Posted 3 months ago

Apply

2 - 5 years

4 - 8 Lacs

Pune

Work from Office

Naukri logo

Capgemini Invent
Capgemini Invent is the digital innovation, consulting and transformation brand of the Capgemini Group, a global business line that combines market-leading expertise in strategy, technology, data science and creative design to help CxOs envision and build what's next for their businesses.

Your Role
Must have developed or worked on at least one Gen AI project.
Data pipeline implementation experience with any of these cloud providers: AWS, Azure, GCP.
Experience with cloud storage, cloud databases, cloud data warehousing and data lake solutions such as Snowflake, BigQuery, AWS Redshift, ADLS, S3.
Good knowledge of cloud compute services and load balancing.
Good knowledge of cloud identity management, authentication and authorization.
Proficiency in using cloud utility functions such as AWS Lambda, AWS Step Functions, Cloud Run, Cloud Functions, Azure Functions.
Experience in using cloud data integration services for structured, semi-structured and unstructured data, such as Azure Databricks, Azure Data Factory, Azure Synapse Analytics, AWS Glue, AWS EMR, Dataflow, Dataproc.

Your Profile
Good knowledge of infra capacity sizing and costing of cloud services to drive an optimized solution architecture, leading to optimal infra investment vs performance and scaling.
Able to contribute to making architectural choices using various cloud services and solution methodologies.
Expertise in programming using Python.
Very good knowledge of cloud DevOps practices such as infrastructure as code, CI/CD components, and automated deployments on cloud.
Must understand networking, security, design principles and best practices in cloud.

What you will love about working here
We recognize the significance of flexible work arrangements to provide support. Be it remote work or flexible work hours, you will get an environment to maintain a healthy work-life balance. At the heart of our mission is your career growth.
Our array of career growth programs and diverse professions are crafted to support you in exploring a world of opportunities. Equip yourself with valuable certifications in the latest technologies such as Generative AI.

About Capgemini
Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong heritage of over 55 years, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, cloud and data, combined with its deep industry expertise and partner ecosystem. The Group reported 2023 global revenues of €22.5 billion.
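The role above calls for proficiency with cloud utility functions such as AWS Lambda. As a hedged illustration of what that looks like in practice, here is a minimal Lambda-style handler sketch in Python that validates a batch of incoming records; the event shape, field names, and routing logic are illustrative assumptions, not part of the listing, and a real deployment would write results to S3 or pass state on to Step Functions rather than return counts in memory.

```python
import json

def handler(event, context=None):
    """Minimal Lambda-style handler sketch: split incoming records into
    'clean' and 'rejected' sets (hypothetical schema: each record should
    carry a non-empty 'id' and a numeric 'amount')."""
    clean, rejected = [], []
    for record in event.get("records", []):
        if record.get("id") and isinstance(record.get("amount"), (int, float)):
            clean.append(record)
        else:
            rejected.append(record)
    # Lambda proxy-style response: status code plus a JSON body.
    return {
        "statusCode": 200,
        "body": json.dumps({"clean": len(clean), "rejected": len(rejected)}),
    }

if __name__ == "__main__":
    event = {"records": [{"id": "a1", "amount": 10.5},
                         {"id": "", "amount": "bad"}]}
    print(handler(event))
```

Invoked with the sample event above, the handler reports one clean and one rejected record; the same function body could be deployed unchanged as a Lambda entry point.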

Posted 3 months ago

Apply

4 - 9 years

6 - 11 Lacs

Hyderabad

Work from Office

Naukri logo

As a Data Engineer, you will develop, maintain, evaluate and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the AWS Cloud Data Platform.

Responsibilities:
Build data pipelines to ingest, process, and transform data from files, streams and databases.
Process data with Spark, Python, PySpark, Scala, and Hive, HBase or other NoSQL databases on Cloud Data Platforms (AWS) or HDFS.
Develop efficient software code leveraging the Spark framework, using Python or Scala and Big Data technologies, for various use cases built on the platform.
Develop streaming pipelines.
Work with Hadoop / AWS ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data/cloud technologies such as Apache Spark and Kafka.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
Minimum 4+ years of experience in Big Data technologies, with extensive data engineering experience in Spark with Python or Scala.
Minimum 3 years of experience on Cloud Data Platforms on AWS.
Experience with AWS EMR / AWS Glue / Databricks, AWS Redshift, DynamoDB.
Good to excellent SQL skills.
Exposure to streaming solutions and message brokers such as Kafka.

Preferred technical and professional experience:
Certification in AWS, and Databricks- or Cloudera-certified Spark developers.
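The pipeline work this listing describes follows an ingest, transform, aggregate shape. As a hedged sketch of that shape, the plain-Python snippet below mirrors what a PySpark batch job would do (read, cast and filter, then group-by-sum); Spark itself is not assumed to be installed here, and the CSV columns and region values are illustrative, not from the listing.

```python
import csv
import io
from collections import defaultdict

# Illustrative raw input, standing in for files landed on S3/HDFS.
RAW = """order_id,region,amount
1,APAC,120.0
2,EMEA,80.0
3,APAC,40.0
"""

def ingest(text):
    """Read delimited text into row dicts (spark.read.csv equivalent)."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Cast amounts to float and drop malformed rows, as a Spark
    withColumn + filter stage would."""
    out = []
    for r in rows:
        try:
            out.append({"region": r["region"], "amount": float(r["amount"])})
        except (KeyError, ValueError):
            continue
    return out

def aggregate(rows):
    """groupBy('region').sum('amount') equivalent."""
    totals = defaultdict(float)
    for r in rows:
        totals[r["region"]] += r["amount"]
    return dict(totals)

if __name__ == "__main__":
    print(aggregate(transform(ingest(RAW))))  # {'APAC': 160.0, 'EMEA': 80.0}
```

In a real EMR or Glue job each of these stages would be a DataFrame operation distributed across the cluster; the staging of the logic stays the same.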

Posted 3 months ago

Apply

5 - 10 years

20 - 35 Lacs

Bengaluru, Hyderabad, Gurgaon

Hybrid

Naukri logo

EPAM has a presence across 40+ countries globally, with 55,000+ professionals and numerous delivery centers. Key locations are North America, Eastern Europe, Central Europe, Western Europe, APAC, and the Middle East, with development centers in India (Hyderabad, Pune & Bangalore).

Location: Gurgaon/Pune/Hyderabad/Bengaluru/Chennai
Work Mode: Hybrid (2-3 days in office per week)

Job Description:
5-14 years of experience in Big Data & data-related technologies
Expert-level understanding of distributed computing principles
Expert-level knowledge and experience in Apache Spark
Hands-on programming with Python/Scala
Proficiency with Hadoop v2, MapReduce, HDFS, Sqoop
Experience building stream-processing systems using technologies such as Apache Storm or Spark Streaming
Good understanding of Big Data querying tools such as Hive and Impala
Experience with integration of data from multiple data sources such as RDBMS (SQL Server, Oracle), ERP, and files
Good understanding of SQL queries, joins, stored procedures, relational schemas
Experience with NoSQL databases such as HBase, Cassandra, MongoDB
Knowledge of ETL techniques and frameworks
Performance tuning of Spark jobs
Experience with native cloud data services: AWS/Azure/GCP
Ability to lead a team efficiently
Experience with designing and implementing Big Data solutions
Practitioner of Agile methodology

WE OFFER
Opportunity to work on technical challenges that may have impact across geographies
Vast opportunities for self-development: online university, global knowledge-sharing opportunities, learning through external certifications
Opportunity to share your ideas on international platforms
Sponsored Tech Talks & Hackathons
Possibility to relocate to any EPAM office for short- and long-term projects
Focused individual development
Benefit package: health benefits, medical benefits, retirement benefits, paid time off, flexible benefits
Forums to explore passions beyond work (CSR, photography, painting, sports, etc.)
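Among the skills above is building stream-processing systems with Apache Storm or Spark Streaming, whose core idea is bucketing an unbounded event stream into fixed windows. The pure-Python sketch below illustrates that tumbling-window counting under stated assumptions: the `(timestamp, key)` event shape is hypothetical, and a real Spark Streaming job would express the same logic as a windowed `groupBy` over micro-batches.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs):
    """Bucket (timestamp_secs, key) events into fixed non-overlapping
    windows and count occurrences of each key per window."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        # Each window covers [window_start, window_start + window_secs).
        window_start = (ts // window_secs) * window_secs
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in sorted(windows.items())}

if __name__ == "__main__":
    events = [(0, "click"), (3, "view"), (7, "click"), (12, "click")]
    print(tumbling_window_counts(events, 10))
    # {0: {'click': 2, 'view': 1}, 10: {'click': 1}}
```

Production systems add what this sketch omits: watermarks for late events, state checkpointing, and emission of results as each window closes.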

Posted 3 months ago

Apply

15 - 19 years

40 - 50 Lacs

Hyderabad

Work from Office

Naukri logo

Role: The ideal professional for this Cloud Architect role will:
Have a passion for design, technology, analysis, collaboration, agility, and planning, along with a drive for continuous improvement and innovation.
Exhibit expertise in managing high-volume data projects that leverage cloud platforms, data warehouse reporting and BI tools, and the development of relational databases.
Research, identify, and internally market enabling data management technologies based on business and end-user requirements.
Seek ways to apply new technology to business processes, with a focus on modernizing the approach to data management.
Consult with technical subject matter experts and develop alternative technical solutions; advise on options, risks, costs versus benefits, and impact on other business processes and system priorities.
Demonstrate strong technical leadership skills and the ability to mentor others in related technologies.

Qualifications
Bachelor's degree in a computer-related field or equivalent professional experience is required. Preferred: master's degree in computer science, information systems or a related discipline, or equivalent and extensive related project experience.
10+ years of hands-on software development experience building data platforms with tools and technologies such as Hadoop, Cloudera, Spark, Kafka, relational SQL and NoSQL databases, and data pipeline/workflow management tools.
6+ years of experience working with various cloud platforms (at least two from among AWS, Azure & GCP).
Experience in multi-cloud data platform migration and hands-on experience working with AWS, Azure, and GCP.
Experience in Data & Analytics projects is a must.
Data modeling experience, relational and dimensional, with consumption requirements (reporting, dashboarding, and analytics).
Thorough understanding and application of AWS services related to cloud data platform and data lake implementation: S3 data lake, AWS EMR, AWS Glue, Amazon Redshift, AWS Lambda, and Step Functions, with file formats such as Parquet, Avro, and Iceberg.
Must know the key tenets of architecting and designing solutions on the AWS and Azure clouds.
Expertise and implementation experience in data-specific areas such as AWS Data Lake, Data Lakehouse architecture, and Azure Synapse and SQL Data Warehouse.
Apply technical knowledge to architect and design solutions that meet business and IT needs, create Data & Analytics roadmaps, drive POCs and MVPs, and ensure the long-term technical viability of new deployments, infusing key Data & Analytics technologies where applicable.
Be the Voice of the Customer to share insights and best practices, connect with the Engineering team to remove key blockers, and drive migration solutions and implementations.
Familiarity with tools like dbt, Airflow, and data test automation.
MUST have experience with Python/PySpark/Scala in Big Data environments.
Strong SQL skills in Big Data tools such as Hive, Impala, and Presto.
Experience working with and extracting value from large, disconnected, and/or unstructured datasets.
Demonstrated ability to build processes that support data transformation, data structures, metadata, dependency, and workload management.
Advanced working SQL knowledge and experience with relational databases, query authoring (SQL), and working familiarity with a variety of databases.
Experience building and optimizing big data pipelines, architectures, and data sets.
Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
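One concrete design detail behind the S3 data lake work this role describes is the Hive-style partitioned key layout that EMR, Glue, and downstream query engines rely on for partition pruning. The sketch below builds such keys in Python; the bucket name, table name, and `dt=` partition column are illustrative assumptions, not from the listing.

```python
from datetime import date

def partitioned_key(bucket, table, dt, part):
    """Build a Hive-style partitioned S3 object key: query engines can
    prune on the dt= partition directory without scanning other dates."""
    return f"s3://{bucket}/{table}/dt={dt.isoformat()}/part-{part:05d}.parquet"

if __name__ == "__main__":
    key = partitioned_key("analytics-lake", "orders", date(2024, 1, 15), 3)
    print(key)  # s3://analytics-lake/orders/dt=2024-01-15/part-00003.parquet
```

Writers such as Spark's `partitionBy("dt")` emit this layout automatically; the point of sketching it by hand is that the partition column becomes part of the path, which is what makes date-bounded queries cheap.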

Posted 3 months ago

Apply