
591 YAML Jobs - Page 18

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 years

50 - 60 Lacs

Ahmedabad, Gujarat, India

Remote

Source: LinkedIn

Experience: 3+ years
Salary: INR 5,000,000-6,000,000 per year (based on experience)
Expected Notice Period: 15 days
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-time permanent position (payroll and compliance managed by Rill Data)
(Note: This is a requirement for one of Uplers' clients, Rill Data.)

Must-have skills: dbt, Iceberg, Kestra, Parquet, SQLGlot, ClickHouse, DuckDB, AWS, Python, SQL

About Rill Data: Rill is the world's fastest BI tool, designed from the ground up for real-time databases like DuckDB and ClickHouse. The platform combines last-mile ETL, an in-memory database, and interactive dashboards into a full-stack solution that is easy to deploy and manage. With a BI-as-code approach, Rill empowers developers to define and collaborate on metrics using SQL and YAML. Trusted by leading companies in e-commerce, digital marketing, and financial services, Rill provides the speed and scalability needed for operational analytics and partner-facing reporting.

Job Summary: Rill is looking for a Staff Data Engineer to join its Field Engineering team. In this role, you will work closely with enterprise customers to design and optimize high-performance data pipelines powered by DuckDB and ClickHouse. You will also collaborate with the platform engineering team to evolve incremental ingestion architectures and support proof-of-concept sales engagements. The ideal candidate has strong SQL fluency, experience with orchestration and transformation frameworks (e.g., Kestra, dbt, SQLGlot), familiarity with data lake table formats (e.g., Iceberg, Parquet), and an understanding of cloud databases (e.g., Snowflake, BigQuery). Most importantly, you should have a passion for solving real-world data engineering challenges at scale.

Key Responsibilities:
- Collaborate with enterprise customers to optimize data models for performance and cost efficiency.
- Work with the platform engineering team to enhance and refine incremental ingestion architectures.
- Partner with account executives and solution architects to rapidly prototype solutions for proof-of-concept sales engagements.

Qualifications (required):
- Fluency in SQL and competency in Python.
- Bachelor's degree in a STEM discipline or equivalent industry experience.
- 3+ years of experience in a data engineering or related role.
- Familiarity with major cloud environments (AWS, Google Cloud, Azure).

Benefits: Competitive salary, health insurance, flexible vacation policy.

How to apply:
Step 1: Click on Apply and register or log in on the Uplers portal.
Step 2: Complete the screening form and upload an updated resume.
Step 3: Increase your chances of being shortlisted and meet the client for the interview.

About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role is to help talent find and apply for relevant contractual opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: Many more opportunities are available on the portal; depending on the assessments you clear, you can apply for those as well.) If you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, apply today. We are waiting for you!
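The Job Summary above centers on incremental ingestion pipelines built on DuckDB. As a rough illustration of that idea, here is a minimal Python sketch that appends only new Parquet partitions into a DuckDB table; the file layout, table name, and watermark column are illustrative assumptions, not Rill's actual pipeline.

```python
# Minimal sketch: incrementally load new Parquet partitions into DuckDB.
# Assumes daily partition files named events_YYYY-MM-DD.parquet with an
# event_date column; these names are assumptions, not Rill's schema.
import glob
import duckdb

con = duckdb.connect("analytics.duckdb")

# Create the target table with the partition schema if it does not exist yet.
con.execute("""
    CREATE TABLE IF NOT EXISTS events AS
    SELECT * FROM read_parquet('data/events_*.parquet') LIMIT 0
""")

# Watermark: the latest date already ingested (NULL on the first run).
last_loaded = con.execute("SELECT max(event_date) FROM events").fetchone()[0]

for path in sorted(glob.glob("data/events_*.parquet")):
    partition_date = path.split("_")[-1].removesuffix(".parquet")
    if last_loaded is not None and str(partition_date) <= str(last_loaded):
        continue  # partition already ingested, skip it
    con.execute("INSERT INTO events SELECT * FROM read_parquet(?)", [path])
    print(f"loaded {path}")
```

In practice an orchestrator such as Kestra or dbt would schedule a step like this and track the watermark more robustly; the sketch only shows the append-only shape of the work.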

Posted 2 weeks ago

Apply

3.0 years

50 - 60 Lacs

Jaipur, Rajasthan, India

Remote

Source: LinkedIn

Staff Data Engineer, Rill Data (via Uplers): same role and description as the Ahmedabad listing above; this posting differs only by location.

Posted 2 weeks ago

Apply

3.0 years

50 - 60 Lacs

Greater Lucknow Area

Remote

Source: LinkedIn

Staff Data Engineer, Rill Data (via Uplers): same role and description as the Ahmedabad listing above; this posting differs only by location.

Posted 2 weeks ago

Apply

3.0 years

50 - 60 Lacs

Thane, Maharashtra, India

Remote

Source: LinkedIn

Staff Data Engineer, Rill Data (via Uplers): same role and description as the Ahmedabad listing above; this posting differs only by location.

Posted 2 weeks ago

Apply

3.0 years

50 - 60 Lacs

Nashik, Maharashtra, India

Remote

Source: LinkedIn

Staff Data Engineer, Rill Data (via Uplers): same role and description as the Ahmedabad listing above; this posting differs only by location.

Posted 2 weeks ago

Apply

3.0 years

50 - 60 Lacs

Kanpur, Uttar Pradesh, India

Remote

Source: LinkedIn

Staff Data Engineer, Rill Data (via Uplers): same role and description as the Ahmedabad listing above; this posting differs only by location.

Posted 2 weeks ago

Apply

3.0 years

50 - 60 Lacs

Nagpur, Maharashtra, India

Remote

Source: LinkedIn

Staff Data Engineer, Rill Data (via Uplers): same role and description as the Ahmedabad listing above; this posting differs only by location.

Posted 2 weeks ago

Apply

3.0 years

50 - 60 Lacs

Kochi, Kerala, India

Remote

Source: LinkedIn

Staff Data Engineer, Rill Data (via Uplers): same role and description as the Ahmedabad listing above; this posting differs only by location.

Posted 2 weeks ago

Apply

3.0 years

50 - 60 Lacs

Greater Bhopal Area

Remote

Source: LinkedIn

Staff Data Engineer, Rill Data (via Uplers): same role and description as the Ahmedabad listing above; this posting differs only by location.

Posted 2 weeks ago

Apply

3.0 years

50 - 60 Lacs

Indore, Madhya Pradesh, India

Remote

Source: LinkedIn

Staff Data Engineer, Rill Data (via Uplers): same role and description as the Ahmedabad listing above; this posting differs only by location.

Posted 2 weeks ago

Apply

3.0 years

50 - 60 Lacs

Chandigarh, India

Remote

Source: LinkedIn

Staff Data Engineer, Rill Data (via Uplers): same role and description as the Ahmedabad listing above; this posting differs only by location.

Posted 2 weeks ago

Apply

3.0 years

50 - 60 Lacs

Mysore, Karnataka, India

Remote

Source: LinkedIn

Staff Data Engineer, Rill Data (via Uplers): same role and description as the Ahmedabad listing above; this posting differs only by location.

Posted 2 weeks ago

Apply

3.0 years

50 - 60 Lacs

Thiruvananthapuram, Kerala, India

Remote

Source: LinkedIn

Staff Data Engineer, Rill Data (via Uplers): same role and description as the Ahmedabad listing above; this posting differs only by location.

Posted 2 weeks ago

Apply

3.0 years

50 - 60 Lacs

Patna, Bihar, India

Remote

Source: LinkedIn

Staff Data Engineer, Rill Data (via Uplers): same role and description as the Ahmedabad listing above; this posting differs only by location.

Posted 2 weeks ago

Apply

3.0 years

50 - 60 Lacs

Vijayawada, Andhra Pradesh, India

Remote

Source: LinkedIn

Staff Data Engineer, Rill Data (via Uplers): same role and description as the Ahmedabad listing above; this posting differs only by location.

Posted 2 weeks ago

Apply

0 years

0 Lacs

India

On-site

Source: LinkedIn

MLOps Engineer L3

As an MLOps Engineer, your day-to-day responsibilities will involve designing and managing machine learning operations pipelines using MLflow, Databricks MLOps, and CI/CD pipelines. You will collaborate with data scientists to integrate machine learning models into production environments and automate model deployment and monitoring using Dataiku and other tools. Ensuring the scalability and reliability of machine learning workflows is crucial, and you will use Azure OpenAI and Argo CD for advanced automation and orchestration. Maintaining and optimizing cloud infrastructure for machine learning applications will be part of your daily tasks, along with monitoring the performance of machine learning models and workflows. You will troubleshoot issues related to machine learning pipelines and infrastructure and maintain comprehensive documentation of processes, configurations, and best practices.

Must-Have Skills:
- MLOps tools: MLflow, Databricks MLOps pipelines, CI/CD pipelines, Dataiku.
- Programming: Python, YAML scripting for pipeline automation.
- Terraform (highly recommended).
- Azure OpenAI.
- Argo CD (beginner level).
- Azure Cloud proficiency: Azure Resource Management, AKS, ACS, Azure Functions, Azure Security.
- Azure resources: Azure App Service, Data Factory, and Azure Databricks (highly recommended).
- Networking: Network Security Groups, virtual networks, route tables (highly recommended).
- Monitoring: Azure Monitor, Application Insights.
- Security: Azure Key Vault, RBAC, IAM, and secure CI/CD practices.

Pluses:
- Azure DevOps (ADO) CI/CD pipelines.
- GitHub Actions.
- ARM templates.
- API skills.

This position pays between 30 and 35 LPA, depending on years of experience.
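To make the MLflow part of this workflow concrete, here is a minimal, hypothetical Python sketch of training, logging, and registering a model so a downstream CI/CD job can promote it; the experiment name, model name, dataset, and model choice are placeholders, not details from the posting.

```python
# Minimal sketch: train a model, log it to MLflow, and register it so a
# downstream CI/CD job can promote it. All names here are placeholders.
import mlflow
import mlflow.sklearn
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

mlflow.set_experiment("demo-mlops-pipeline")

X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

with mlflow.start_run():
    model = RandomForestRegressor(n_estimators=100, random_state=42)
    model.fit(X_train, y_train)

    mae = mean_absolute_error(y_test, model.predict(X_test))
    mlflow.log_param("n_estimators", 100)
    mlflow.log_metric("mae", mae)

    # Registering the model creates a new version in the model registry,
    # which deployment automation (e.g., an Argo CD-triggered job) can pick up.
    mlflow.sklearn.log_model(
        model,
        artifact_path="model",
        registered_model_name="demo-regressor",
    )
```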

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Job Title: Java Backend Developer
Location: Hyderabad (On-site)
Experience: Minimum 5 years
Notice Period: Immediate joiners only
Primary Skills: Core Java, Spring Boot, Microservices, REST APIs, Kubernetes
Secondary Skills: Docker, OpenShift, YAML
Media or telecom domain experience is mandatory.

Key Responsibilities:
- Design, develop, and maintain scalable backend applications using Core Java and Spring Boot.
- Build and manage RESTful APIs and a microservices architecture.
- Deploy and manage services using Kubernetes, with exposure to OpenShift and Docker.
- Write efficient, reusable, and testable code in a fast-paced development environment.
- Collaborate closely with cross-functional teams, including DevOps, product owners, and QA.
- Ensure performance, scalability, and security in all backend implementations.

Posted 3 weeks ago

Apply

5.0 - 9.0 years

10 - 14 Lacs

Mumbai, Chennai, Bengaluru

Work from Office

Source: Naukri

Required skills: Classic pipelines, PowerShell, YAML, Bicep, ARM templates, Terraform/Bicep, CI/CD.
Most important: experience with data lake and analytics technologies in Azure (e.g., Azure Data Lake Storage, Azure Data Factory, Azure Databricks), along with a data background in Azure and PowerShell.
Location: Chennai, Hyderabad, Kolkata, Pune, Ahmedabad, Remote.

Posted 3 weeks ago

Apply

3.0 - 7.0 years

6 - 11 Lacs

Bengaluru

Work from Office

Source: Naukri

As a Consultant, you are responsible for developing application designs and providing regular support and guidance to project teams on complex coding, issue resolution, and execution. Your primary responsibilities include:
- Lead the design and construction of new mobile solutions using the latest technologies, always looking to add business value and meet user requirements.
- Strive for continuous improvement by testing the built solution and working within an agile framework.
- Discover and implement the latest technology trends to build creative solutions.

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise:
- 3-5 years of experience designing and implementing automation for application build and deployment.
- Best practices to follow during Ansible playbook development.
- Automating the current deployment process using Ansible, for example deploying a WAR file (Java application) to Tomcat servers.
- Configuring and managing inventories for different environments.
- Hands-on experience developing common (custom) roles that implement specific functionality and can be reused across multiple applications.

Preferred technical and professional experience:
- Good understanding of the different types of variables.
- Good knowledge of Jinja templates and the Windows and Microsoft ecosystem (Windows, AD, Azure).
- Good knowledge of Ansible filters for manipulating variables.
- Strong knowledge of Unix/Linux.
- Strong knowledge of scripting tools: Shell, PowerShell, YAML.

Posted 3 weeks ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

Job Description Summary As a Full Stack Developer in the imaging platform team, you'll be responsible for designing, building, delivering and maintaining next-generation platform solutions for GE HealthCare products. The role requires extensive hands-on development experience in containerization using Kubernetes/Docker on Linux, Python, Ansible, and CI/CD using DevOps. You will be responsible for the software lifecycle, including activities such as feature grooming, requirement analysis, documentation/procedures, technology assessment, implementation, code reviews and formal release activities. GE Healthcare is a leading global medical technology and digital solutions innovator. Our mission is to improve lives in the moments that matter. Unlock your ambition, turn ideas into world-changing realities, and join an organization where every voice makes a difference, and every difference builds a healthier world. Job Description Roles and Responsibilities: In this role, you will: Collaborate with software developers and imaging teams to implement solutions that are aligned with and extend shared platforms and solutions. Apply Agile principles of the Software Development Life Cycle and methodologies, software and product security, scalability, documentation practices, refactoring and testing techniques. Write code that meets standards and delivers the desired functionality using the technology selected for the project. Build common microservices for the Healthcare Imaging platform and healthcare solutions using a microservices architecture. Perform technical feasibility assessments and share outcomes that help conclude technology decisions. Understand performance parameters, assess application performance and address security issues. Ensure unit test automation using BDD (Behaviour Driven Development) and consistent code quality by adhering to SONAR quality gates. Actively manage formal verification and release activities. Coordinate with internal team members and other imaging teams for platform integration and defect fixes. Education Qualification: Bachelor's Degree in Computer Science or "STEM" majors (Science, Technology, Engineering and Math) with basic experience. 3+ years of software development experience. Primary Skills: REST-based microservices using Go/Python/Java, Undertow or C++. Containerization and orchestration with Kubernetes for microservices development. Hands-on experience with browser-based debugging. Extensive exposure to automated UI testing frameworks like Jasmine/Karma and Mocha. BDD-based automated unit testing using Cucumber. Good communication and presentation skills. Secondary Skills: Developing platform solutions for Healthcare Imaging and Radiology workflows. Exposure to DICOM standards. Helm charts, YAML, Ansible, Python, DevOps pipeline integration. Development experience in web UI technologies - HTML5, JavaScript, CSS3, XML, JSON, HTTP/HTTPS protocols, Angular 2+, Node.js, React and Responsive Web Design. Experience in developing UI frameworks, common components and applications. Inclusion and Diversity GE Healthcare is an Equal Opportunity Employer where inclusion matters. Employment decisions are made without regard to race, color, religion, national or ethnic origin, sex, sexual orientation, gender identity or expression, age, disability, protected veteran status or other characteristics protected by law. We expect all employees to live and breathe our behaviors: to act with humility and build trust; lead with transparency; deliver with focus, and drive ownership – always with unyielding integrity. 
Our total rewards are designed to unlock your ambition by giving you the boost and flexibility you need to turn your ideas into world-changing realities. Our salary and benefits are everything you’d expect from an organization with global strength and scale, and you’ll be surrounded by career opportunities in a culture that fosters care, collaboration and support. #EveryRoleIsVital Additional Information Relocation Assistance Provided: No Show more Show less

Posted 3 weeks ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Linkedin logo

Design, implement, and manage Azure Kubernetes Service (AKS) clusters. Monitor and optimize the performance of AKS clusters. Troubleshoot and resolve issues related to AKS and containerized applications. Implement security measures to protect AKS clusters and containerized applications. Collaborate with development teams to support application deployment and maintenance. Maintain documentation for AKS configurations, processes, and procedures. Automate deployment, scaling, and management of containerized applications using AKS. Participate in an on-call rotation for after-hours support. Design and deploy Kubernetes clusters to ensure stability, scalability, and security. Automate deployment and scaling of containerized applications. Monitor and troubleshoot Kubernetes environments to maintain performance and reliability. Collaborate with development teams to implement CI/CD pipelines and optimize workflows. Troubleshoot network/DNS communication issues using tools like telnet, tracert, curl, and nslookup. Troubleshoot AKS issues such as pod crashes, node restarts, pod volume issues, and FluentD communication problems. Ensure security and compliance of the Kubernetes infrastructure. Develop a self-managed ingress controller with security controls. Write YAML code to convert Azure classic pipelines to YAML pipelines. Upgrade Kubernetes nodes. Primary Skills: Azure Kubernetes Service (AKS), Azure Data Factory, Azure API Management, CI/CD pipelines. Secondary Skills: Ensuring security and compliance of Kubernetes infrastructure. Show more Show less
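As a point of reference for the Kubernetes side of this role, below is a minimal AKS workload manifest sketch. The image name, namespace, and resource figures are illustrative assumptions, not details from the listing.

# deployment.yaml -- illustrative sketch only
apiVersion: apps/v1
kind: Deployment
metadata:
  name: sample-api
  namespace: demo                    # hypothetical namespace
spec:
  replicas: 2
  selector:
    matchLabels:
      app: sample-api
  template:
    metadata:
      labels:
        app: sample-api
    spec:
      containers:
        - name: sample-api
          image: myregistry.azurecr.io/sample-api:1.0.0   # hypothetical image
          ports:
            - containerPort: 8080
          resources:
            requests:
              cpu: 100m
              memory: 128Mi
            limits:
              cpu: 500m
              memory: 256Mi
          livenessProbe:             # probes help surface the pod crashes mentioned above
            httpGet:
              path: /healthz
              port: 8080
            initialDelaySeconds: 10
            periodSeconds: 20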

Posted 3 weeks ago

Apply

1.0 - 3.0 years

16 - 19 Lacs

Bengaluru

Work from Office

Naukri logo

About The Position The Global Capability Center (GCC) IT Foundation Platform (ITFP) Network Product Line (NPL) is responsible for supporting the Business Network, ensuring cost-competitive, reliable, and secure operations of Chevron's network environment globally while also enabling digital capabilities. Products managed include all Business Network infrastructure products and services globally, including Software Defined Networking, Intent Based Networking, Internet First, Wireless, Telephony, Extranet, WAN, Data Center, Security Services and Life Cycle Management. The NPL Automation & Monitoring team drives continuous innovation to improve network asset configuration and reliability. We are seeking a dynamic team player with a razor-sharp focus on system reliability for our Senior Site Reliability Engineer position to help us achieve our goal of higher returns and lower carbon. Key Responsibilities as a Senior Site Reliability Engineer: Participate in reviews of current network process, change, and build procedures, and translate them into network automation projects. Work with Agile team members to design and implement features in support of established security and acceptance criteria. Develop and document standards and provide training to others. Research network automation industry trends and automation tools. Apply knowledge of source code management systems, version control tools and developing web services. Build and maintain CI/CD pipelines to ensure code quality and maintainability. Align product features and the roadmap to NPL strategic themes. Participate in Agile concepts and activities such as daily stand-up meetings, task tracking boards, design and code reviews, automated testing, continuous integration and deployment. Work with leaders across the product line and business units to understand business strategies and shape technology roadmaps to support those strategies. Partner with business units to ensure solutions will operate at scale without issue, and create visualizations for data collected from networking devices for quick interpretation and notification. Provide clear technical direction, prioritization, and delivery excellence for network activities, ensuring team members are delivering against priorities and eliminating roadblocks and technical debt. Identify, analyze, and resolve vulnerabilities as well as deployment and operational issues. Ensure NPL meets Chevron security, architecture, and best-practice guardrails and policies. Required Qualifications Bachelor's or master's degree in Computer Science, Computer Engineering, Information Technologies, or Management Information Systems. Site Reliability Engineering fundamentals: leverage SRE principles and best practices of SLOs, SLIs, SLAs, error budgets, eliminating toil via automation, observability and monitoring, emergency response (triage, postmortem, retrospective), demand forecasting and capacity planning, and delivering results. Cloud fundamentals. Network fundamentals: in-depth knowledge of networking protocols and the TCP/IP stack. Automation and programmability: proficient in Python, Ansible, YAML and asynchronous programming. System and network monitoring. Structured technical problem solving and debugging: a thorough understanding of access control lists, address translation, tunneling, and standard routing protocols. Critical thinking, self-motivation, excellent communication skills, and the demonstrated ability to work both independently and as part of a team. Able to conceive and develop a presentation to a peer group that is logical, well-written, and concise. Experience Basic network routing and switching experience. Familiarity with network switches, load balancers, and firewalls. Network automation/orchestration experience. Experience using RESTful APIs to integrate various technologies. 5 years' experience as a network automation engineer with 5+ years of automation programming experience; 5-10 years of experience overall. Preferred Qualifications/Certifications Experience working in a Linux environment and a working knowledge of basic Linux commands/utilities. Desire to develop new ideas while following best practices for design and coding. Microsoft Cloud Fundamentals (AZ-900) and Designing and Implementing Microsoft DevOps Solutions (AZ-400) certifications. Cisco Certified Network Associate (CCNA), Cisco Certified Network Professional (CCNP), and/or DevNet certifications. Chevron ENGINE supports global operations, supporting business requirements across the world. Accordingly, the work hours for employees will be aligned to support business requirements. The standard work week will be Monday to Friday, with working hours of 8:00am to 5:00pm or 1:30pm to 10:30pm. Chevron participates in E-Verify in certain locations as required by law.

Posted 3 weeks ago

Apply

2.0 - 30.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Indeed logo

Company Description About Eurofins: Eurofins Scientific is an international life sciences company, providing a unique range of analytical testing services to clients across multiple industries, to make life and the environment safer, healthier and more sustainable. From the food you eat to the medicines you rely on, Eurofins works with the biggest companies in the world to ensure the products they supply are safe, their ingredients are authentic and labelling is accurate. Eurofins is a global leader in food, environmental, pharmaceutical and cosmetic product testing and in agroscience CRO services. It is also one of the global independent market leaders in certain testing and laboratory services for genomics, discovery pharmacology, forensics, CDMO, advanced material sciences and in the support of clinical studies. In over just 30 years, Eurofins has grown from one laboratory in Nantes, France to 58,000 staff across a network of over 1,000 independent companies in 54 countries, operating 900 laboratories. Performing over 450 million tests every year, Eurofins offers a portfolio of over 200,000 analytical methods to evaluate the safety, identity, composition, authenticity, origin, traceability and purity of biological substances and products, as well as providing innovative clinical diagnostic testing services, as one of the leading global emerging players in specialised clinical diagnostics testing. Eurofins is one of the fastest growing listed European companies with a listing on the French stock exchange since 1997. In FY 2021, Eurofins achieved a record revenue of over EUR 6.7 billion. Eurofins IT Solutions India Pvt Ltd (EITSI) is a fully owned subsidiary of Eurofins and functions as a Global Software Delivery Center exclusively catering to Eurofins Global IT business needs. The code shipped out of EITSI impacts the global network of Eurofins labs and services. The primary focus at EITSI is to develop the next generation LIMS (Lab Information Management system), Customer portals, e-commerce solutions, ERP/CRM system, Mobile Apps & other B2B platforms for various Eurofins Laboratories and businesses. Young and dynamic, we have a rich culture and we offer fulfilling careers. Job Description Cloud Engineer Eurofins IT Solutions, Bengaluru, Karnataka, India With 36 facilities worldwide, Eurofins BioPharma Product Testing (BPT) is the largest network of bio/pharmaceutical GMP product testing laboratories providing comprehensive laboratory services for the world's largest pharmaceutical, biopharmaceutical, and medical device companies. BPT is enabled by global engineering teams working on next-generation applications and Laboratory Information Management Systems (LIMS). As Site Reliability Engineer, you will be a key part of our cloud strategy, ensuring our IT systems operate effectively on the Azure cloud. As a technology leader, BPT wants to give you the opportunity not just to accept new challenges and opportunities, but to impress with your ingenuity, focus, attention to details and collaboration with a global team of professionals. This role reports to a SRE Manager. Primary Responsibilities : We are looking for a skilled Site Reliability Engineer (SRE) to join our Cloud Engineering and Operations team. The ideal candidate will be responsible for ensuring high availability, performance, and reliability of our cloud-hosted systems, particularly in Microsoft Azure environments. 
This role combines software engineering practices with operational excellence to build scalable, automated, and resilient infrastructure. You’ll work closely with developers, security teams, and platform engineers to implement best practices, reduce toil, and proactively manage incidents and risks across production environments. Key Responsibilities Infrastructure as Code (IaC) Automate deployment and configuration of resources using Bicep, PowerShell, and Azure CLI. Build repeatable, version-controlled infrastructure aligned with Azure Well-Architected Framework. Cloud Operations & Monitoring Manage and monitor Azure cloud resources including VMs, App Services, AKS clusters, and storage solutions. Ensure platform health using tools such as Azure Monitor, Log Analytics, and custom alerting frameworks. Optimize system performance, plan capacity, and proactively identify reliability risks. Access & Identity Lifecycle Management Administer access controls and identity provisioning using Azure Active Directory. Implement RBAC policies and maintain secure access patterns across the environment. Incident Response & Troubleshooting Respond to incidents and performance alerts, ensuring rapid resolution with minimal impact. Collaborate with engineering teams to analyze root causes and implement preventive solutions. Security & Compliance Enforce best practices for cloud security including encryption, key management via Azure Key Vault, and access control. Align infrastructure with compliance standards such as GDPR, HIPAA, or internal policies. Cost Optimization & Resource Management Monitor and manage cloud spending using Azure Cost Management tools. Recommend and implement strategies for efficient resource usage and scaling. Disaster Recovery & Availability Planning Design and maintain robust backup and DR plans across Azure regions. Ensure recovery objectives are met and tested regularly. Collaboration & Documentation Work closely with DevOps, Security, and Architecture teams to align goals. Maintain clear documentation on infrastructure design, operations, and procedures. Specific Deliverables Daily operations support for Azure-hosted workloads. Timely provisioning and access management for end users. Effective triaging and closure of cloud incidents. Secure and scalable infrastructure design and automation. Support and drive adoption of SRE practices, including monitoring, incident postmortems, and continuous improvement. Skills required: Strong knowledge of Azure services (Compute, Networking, Storage, Identity). Hands-on experience with IaC tools (Bicep), scripting (PowerShell, Python), and deployment automation. Solid foundation in networking concepts and cloud-native security. Experience with cloud observability and logging tools. Proficiency in managing access control using Azure AD and RBAC. Preferred Qualities Strong analytical and problem-solving mindset. A passion for automation and eliminating manual work. Documentation-oriented—writes things down to scale learning and onboarding. Proactive attitude toward identifying and fixing reliability gaps. Collaborative, self-driven, and adaptable to changing priorities. Stack: Cloud: Microsoft Azure Languages: PowerShell, Python, YAML Tools: Azure DevOps, Azure Monitor, Bicep Practices: Infrastructure as Code, CI/CD, RBAC, Zero Trust Security, Postmortems Qualifications Preferred Qualifications: Bachelors in Engineering, Computer Science or equivalent. At least 2 years of professional experience in Azure cloud in a multiple region environment

Posted 3 weeks ago

Apply

8.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Linkedin logo

About The Role: Grade Level (for internal use): 11 About The Role: We are looking for a Cloud/DevOps Engineer to join the KY3P team, to manage and automate custodian policies and systems administration in the AWS Cloud environment. The role offers extensive technical challenges in a highly dynamic and collaborative work environment. A passion for quality and a sense of pride in your work are an absolute must for the role. You will build solutions to migrate services, automate resource provisioning and administration of infrastructure in AWS Cloud for KY3P applications. What You'll Work On: Create DevOps pipelines to deliver Infrastructure as Code. Build workflows to create immutable Infrastructure in AWS using Terraform. Develop automation for provisioning compute instances and storage. Provision resources in AWS using Cloud Formation Templates and Orchestrate container deployment. Configure Security Groups, Roles & IAM Policy in AWS. Monitor infrastructure and develop utilization reports. Implementing and maintaining version control systems, configuration management tools, and other DevOps-related technologies. Designing and implementing automation tools and frameworks for continuous integration, delivery, and deployment. Develop and write scripts for pipeline automation using relevant scripting languages like Groovy, YAML. Configure continuous delivery workflows for various environments e.g., development, staging, production. Evaluate new AWS services and solutions. Integrate application build & deployments scripts with GitHub. Create comprehensive documentation and provide technical guidance. Effectively interact with global customers, business users and IT employees What We Look For : B Tech./ M Tech / MCA degree in an IT/ Computer Science or related course is a prerequisite. 8+ years of hands-on professional experience in Infrastructure Engineering and automation Experience in AWS Cloud systems administration. Excellent communication skills and ability to thrive in both team-based and independent environments. What You Need To Get The Job Done Candidates should have a minimum of 8+ years industry experience in cloud and Infrastructure. Expertise in using DevOps tools Terraform, GitHub, Artifactory etc. Cloud engineering certifications (AWS, Terraform) are desirable. Deep understanding of networking and application architecture needs for system migrations. Proficiency in scripting languages: Python, PowerShell, Bash. Evaluate new AWS services and solutions Experience working with customers to diagnose a problem, and work toward resolution. Excellent verbal and written communication skills About S&P Global Market Intelligence At S&P Global Market Intelligence, a division of S&P Global we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence. What’s In It For You? Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. 
At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our Benefits Include: Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring And Opportunity At S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. 
US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf IFTECH202.2 - Middle Professional Tier II (EEO Job Group) Job ID: 315707 Posted On: 2025-05-27 Location: Noida, Uttar Pradesh, India Show more Show less

Posted 3 weeks ago

Apply

8.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Linkedin logo

About The Role: Grade Level (for internal use): 11 About The Role: We are looking for a Cloud/DevOps Engineer to join the KY3P team, to manage and automate custodian policies and systems administration in the AWS Cloud environment. The role offers extensive technical challenges in a highly dynamic and collaborative work environment. A passion for quality and a sense of pride in your work are an absolute must for the role. You will build solutions to migrate services, automate resource provisioning and administration of infrastructure in AWS Cloud for KY3P applications. What You'll Work On: Create DevOps pipelines to deliver Infrastructure as Code. Build workflows to create immutable Infrastructure in AWS using Terraform. Develop automation for provisioning compute instances and storage. Provision resources in AWS using Cloud Formation Templates and Orchestrate container deployment. Configure Security Groups, Roles & IAM Policy in AWS. Monitor infrastructure and develop utilization reports. Implementing and maintaining version control systems, configuration management tools, and other DevOps-related technologies. Designing and implementing automation tools and frameworks for continuous integration, delivery, and deployment. Develop and write scripts for pipeline automation using relevant scripting languages like Groovy, YAML. Configure continuous delivery workflows for various environments e.g., development, staging, production. Evaluate new AWS services and solutions. Integrate application build & deployments scripts with GitHub. Create comprehensive documentation and provide technical guidance. Effectively interact with global customers, business users and IT employees What We Look For : B Tech./ M Tech / MCA degree in an IT/ Computer Science or related course is a prerequisite. 8+ years of hands-on professional experience in Infrastructure Engineering and automation Experience in AWS Cloud systems administration. Excellent communication skills and ability to thrive in both team-based and independent environments. What You Need To Get The Job Done Candidates should have a minimum of 8+ years industry experience in cloud and Infrastructure. Expertise in using DevOps tools Terraform, GitHub, Artifactory etc. Cloud engineering certifications (AWS, Terraform) are desirable. Deep understanding of networking and application architecture needs for system migrations. Proficiency in scripting languages: Python, PowerShell, Bash. Evaluate new AWS services and solutions Experience working with customers to diagnose a problem, and work toward resolution. Excellent verbal and written communication skills About S&P Global Market Intelligence At S&P Global Market Intelligence, a division of S&P Global we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence. What’s In It For You? Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. 
At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our Benefits Include: Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring And Opportunity At S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. 
US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf IFTECH202.2 - Middle Professional Tier II (EEO Job Group) Job ID: 315707 Posted On: 2025-05-27 Location: Noida, Uttar Pradesh, India Show more Show less

Posted 3 weeks ago

Apply

Exploring YAML Jobs in India

YAML (YAML Ain't Markup Language) has seen a surge in demand in the job market in India. Organizations are increasingly looking for professionals who are proficient in YAML to manage configuration files, create data structures, and more. If you are a job seeker interested in YAML roles in India, this article provides valuable insights to help you navigate the job market effectively.
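As a quick illustration of what that looks like in practice, the snippet below is a small, invented configuration file: a mapping of keys to scalar values plus a nested sequence. Every key and value here is made up for the example.

# config.yaml -- invented example
service:
  name: orders-api       # string scalar
  replicas: 3            # integer scalar
  debug: false           # boolean scalar
  regions:               # sequence (list)
    - ap-south-1
    - eu-west-1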

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Chennai
  5. Mumbai

These cities are known for their vibrant tech scenes and have a high demand for YAML professionals.

Average Salary Range

The average salary range for YAML professionals in India varies based on experience levels. Entry-level positions can expect to earn around INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 10 lakhs per annum.

Career Path

In the YAML skill area, a typical career path may involve starting as a Junior Developer, progressing to a Senior Developer, and eventually becoming a Tech Lead. Continuous learning and gaining hands-on experience with YAML will be crucial for career advancement.

Related Skills

Apart from YAML proficiency, other skills that are often expected or helpful alongside YAML include:

- Proficiency in scripting languages like Python or Ruby
- Experience with version control systems like Git
- Knowledge of containerization technologies like Docker
- Understanding of CI/CD pipelines (a minimal pipeline sketch follows this list)
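Because CI/CD pipelines are one of the most common places YAML shows up, here is a minimal, illustrative GitHub Actions workflow. The repository layout, image name, and build command are assumptions made for the example.

# .github/workflows/ci.yml -- illustrative sketch only
name: ci
on:
  push:
    branches: [main]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Build a Docker image
        run: docker build -t my-app:${{ github.sha }} .   # hypothetical image name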

Interview Questions

Here are 25 interview questions for YAML roles (a short snippet illustrating several of these concepts follows the list):

- What is YAML and what are its advantages? (basic)
- Explain the difference between YAML and JSON. (basic)
- How can you include one YAML file in another? (medium)
- What is a YAML anchor? (medium)
- How can you create a multi-line string in YAML? (basic)
- Explain the difference between a sequence and a mapping in YAML. (medium)
- What is the difference between ! and !! in YAML tags? (advanced)
- Provide an example of using YAML in a Kubernetes manifest file. (medium)
- How can you comment in YAML? (basic)
- What is a YAML alias and how is it used? (medium)
- Explain how to define a list in YAML. (basic)
- What is a YAML tag? (medium)
- How can you handle sensitive data in a YAML file? (medium)
- Explain the concept of anchors and references in YAML. (medium)
- How can you represent a null value in YAML? (basic)
- What is the significance of the --- at the beginning of a YAML file? (basic)
- How can you represent a boolean value in YAML? (basic)
- Explain the concept of scalars, sequences, and mappings in YAML. (medium)
- How can you create a complex data structure in YAML? (medium)
- What is the difference between << and & in YAML? (advanced)
- Provide an example of using YAML in an Ansible playbook. (medium)
- Explain what YAML anchors and aliases are used for. (medium)
- How can you control the indentation in a YAML file? (basic)
- What is a YAML directive? (advanced)
- How can you represent special characters in a YAML file? (medium)
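Several of the questions above come up so often that it helps to have a small snippet in mind. The example below is invented purely for illustration and touches anchors and aliases, the merge key, block scalars, booleans and nulls, and the --- document separator.

# Illustrative snippet only; all keys and values are invented.
---                        # "---" starts a YAML document
defaults: &base            # "&base" defines an anchor
  retries: 3
  enabled: true            # boolean scalar
  notes: null              # null value (also writable as ~)
production:
  <<: *base                # "<<" merges the anchored mapping; "*base" is an alias
  retries: 5               # overrides the merged value
description: |             # literal block scalar preserves line breaks
  Multi-line text
  kept exactly as written.
---                        # a second document in the same stream
environment: staging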

Closing Remark

As you prepare for YAML job roles in India, remember to showcase your proficiency in YAML and related skills during interviews. Stay updated with the latest industry trends and continue to enhance your YAML expertise. With the right preparation and confidence, you can excel in the competitive job market for YAML professionals in India. Good luck!

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies