
547 Dataflow Jobs - Page 10

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

0 years

0 Lacs

Greater Kolkata Area

On-site


Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary
At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Responsibilities
- Strong skills in Python and GCP services, including Cloud Composer, BigQuery and Google Cloud Storage
- Strong expertise in writing SQL and PL/SQL in Oracle, MySQL or any other relational database
- Good to have: data warehousing & ETL (any tool)
- Proven experience using GCP services is preferred
- Strong presentation and communication skills
- Analytical and problem-solving skills

Mandatory Skill Sets: GCP Data Engineer
Preferred Skill Sets: GCP Data Engineer
Years of Experience Required: 4-8
Education Qualification: B.Tech/MBA/MCA
Degrees/Field of Study Required: Bachelor of Technology, Bachelor of Engineering, Master of Business Administration
Degrees/Field of Study Preferred: not specified
Certifications: not specified
Required Skills: Data Engineering, GCP Dataflow
Optional Skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis {+ 12 more}
Desired Languages: not specified
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
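The pipeline work this posting describes (Python transforms feeding BigQuery, orchestrated by Cloud Composer) can be sketched, purely illustratively, as a small transform step. The field names and cleaning rules below are invented for the example; a real job would load the result into BigQuery rather than print it.

```python
# Illustrative sketch of a transform step in a Composer/BigQuery pipeline.
# Pure Python stand-in: field names and rules are hypothetical.

def transform(rows):
    """Normalise raw records: cast amounts to floats, drop rows missing a key."""
    out = []
    for row in rows:
        if not row.get("order_id"):
            continue  # skip records without a primary key
        out.append({
            "order_id": row["order_id"],
            "amount": round(float(row.get("amount", 0)), 2),
        })
    return out

raw = [
    {"order_id": "A1", "amount": "19.999"},
    {"order_id": None, "amount": "5"},   # dropped: no key
    {"order_id": "A2"},                  # missing amount defaults to 0.0
]
print(transform(raw))
```

In a production pipeline the same function would sit inside a Composer-scheduled task, with the cleaned rows written to a BigQuery table.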

Posted 2 weeks ago

Apply

2.0 - 3.0 years

0 Lacs

India

On-site


Meridian Trade Links is urgently hiring experienced Nurses for Arab countries.

Requirements:
- HRD authentication, Dataflow verification and Prometric exam cleared
- Bachelor’s/Diploma in Nursing with 2-3 years of experience

Benefits: Competitive salary

Apply now: send your CV to cv@meridiantradelinks.com or hr@meridiantradelinks.com
Job Type: Full-time
Schedule: Day shift
Work Location: In person

Posted 2 weeks ago

Apply

10.0 years

0 Lacs

Kerala

On-site


Job Role: Data Architect
Experience: 10+ years
Notice Period: Immediate to 15 days
Location: Trivandrum / Kochi

Introduction
We are looking for candidates with 10+ years of experience in a data architect role.

Responsibilities include:
- Design and implement scalable, secure, and cost-effective data architectures using GCP.
- Lead the design and development of data pipelines with BigQuery, Dataflow, and Cloud Storage.
- Architect and implement data lakes, data warehouses, and real-time data processing solutions on GCP.
- Ensure data architecture aligns with business goals, governance, and compliance requirements.
- Collaborate with stakeholders to define data strategy and roadmap.
- Design and deploy BigQuery solutions for optimized performance and cost efficiency.
- Build and maintain ETL/ELT pipelines for large-scale data processing.
- Leverage Cloud Pub/Sub, Dataflow, and Cloud Functions for real-time data integration.
- Implement best practices for data security, privacy, and compliance in cloud environments.
- Integrate machine learning workflows with data pipelines and analytics tools.
- Define data governance frameworks and manage data lineage.
- Lead data modeling efforts to ensure consistency, accuracy, and performance across systems.
- Optimize cloud infrastructure for scalability, performance, and reliability.
- Mentor junior team members and ensure adherence to architectural standards.
- Collaborate with DevOps teams to implement Infrastructure as Code (Terraform, Cloud Deployment Manager).
- Ensure high availability and disaster recovery solutions are built into data systems.
- Conduct technical reviews, audits, and performance tuning for data solutions.
- Design solutions for multi-region and multi-cloud data architecture.
- Stay updated on emerging technologies and trends in data engineering and GCP.
- Drive innovation in data architecture, recommending new tools and services on GCP.

Certifications: Google Cloud certification is preferred.

Primary Skills:
- 7+ years of experience in data architecture, with at least 3 years in GCP environments.
- Expertise in BigQuery, Cloud Dataflow, Cloud Pub/Sub, Cloud Storage, and related GCP services.
- Strong experience in data warehousing, data lakes, and real-time data pipelines.
- Proficiency in SQL, Python, or other data processing languages.
- Experience with cloud security, data governance, and compliance frameworks.
- Strong problem-solving skills and ability to architect solutions for complex data environments.
- Google Cloud Certification (Professional Data Engineer, Professional Cloud Architect) preferred.
- Leadership experience and ability to mentor technical teams.
- Excellent communication and collaboration skills.
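The real-time Pub/Sub-to-Dataflow work this role describes centres on windowed aggregation. As a purely illustrative sketch, here is tumbling-window counting in plain Python; the event names and the 60-second window size are made up for the example, and a Dataflow pipeline would express the same idea with Beam's `FixedWindows`.

```python
from collections import defaultdict

# Illustrative sketch of tumbling-window aggregation, the core pattern behind
# real-time Pub/Sub -> Dataflow pipelines. Not production code.

def tumbling_window_counts(events, window_seconds=60):
    """Group (timestamp, key) events into fixed windows and count per key."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = (ts // window_seconds) * window_seconds
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(5, "click"), (59, "click"), (61, "view"), (130, "click")]
print(tumbling_window_counts(events))
```

Each key `(window_start, event_name)` identifies one window pane; a streaming engine additionally handles late data and triggers, which this sketch omits.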

Posted 2 weeks ago

Apply

2.0 years

0 Lacs

Thiruvananthapuram

On-site


What you’ll do:
As a Senior Developer at Equifax, you will oversee and steer the delivery of innovative batch and data products, primarily leveraging Java and the Google Cloud Platform (GCP). Your expertise will ensure the efficient and timely deployment of high-quality data solutions that support our business objectives and client needs.

- Project Development: Development and deployment of batch and data products, ensuring alignment with business goals and technical requirements.
- Technical Oversight: Provide technical direction and oversee the implementation of solutions using Java and GCP, ensuring best practices in coding, performance, and security.
- Team Management: Mentor and guide junior developers and engineers, fostering a collaborative, high-performance environment.
- Stakeholder Collaboration: Work closely with cross-functional teams including product managers, business analysts, and other stakeholders to gather requirements and translate them into technical solutions.
- Documentation and Compliance: Maintain comprehensive documentation for all technical processes, ensuring compliance with internal and external standards.
- Continuous Improvement: Advocate for and implement continuous improvement practices within the team, staying abreast of emerging technologies and methodologies.

What experience you need:
- Education: Bachelor’s degree in Computer Science, Information Technology, or a related field.
- Experience: Minimum of 2-5 years of relevant experience in software development, with a focus on batch processing and data solutions.
- Technical Skills:
  - Proficiency in Java, with a strong understanding of its ecosystems.
  - 2+ years of relevant experience with Java, Spring, Spring Boot, REST, microservices, Hibernate, JPA, and RDBMS.
  - Minimum 2 years with Git, CI/CD pipelines, and Jenkins.
  - Experience with Google Cloud Platform, including BigQuery, Cloud Storage, Dataflow, Bigtable, and other GCP tools.
  - Familiarity with ETL processes, data modeling, and SQL.
- Soft Skills: Strong problem-solving abilities and a proactive approach to project management. Effective communication and interpersonal skills, with the ability to convey technical concepts to non-technical stakeholders.

What could set you apart:
- Certification in Google Cloud (e.g., Associate Cloud Engineer).
- Experience with other cloud platforms (AWS, Azure).
- Understanding of data privacy regulations and compliance frameworks.

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Bhubaneswar, Odisha, India

Remote


Experience: 8.00+ years
Salary: Confidential (based on experience)
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-time permanent position
(Note: This is a requirement for one of Uplers' clients - Forbes Advisor.)

Must-have skills: Terraform, Cloud Composer, Dataproc, Dataflow, AWS, GCP, BigQuery, SRE, GKE, GCP certification

Forbes Advisor is looking for: Senior GCP Cloud Administrator

Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most.

We are looking for an experienced GCP Administrator to join our team. The ideal candidate will have strong hands-on experience with IAM administration, multi-account management, BigQuery administration, performance optimization, monitoring, and cost management within Google Cloud Platform (GCP).

Responsibilities:
- Manage and configure roles/permissions in GCP IAM, following the principle of least privilege
- Manage the BigQuery service by optimizing slot assignments and SQL queries, adopting FinOps practices for cost control, and troubleshooting and resolving critical data queries
- Collaborate with teams such as Data Engineering, Data Warehousing, Cloud Platform Engineering, and SRE for efficient data management and operational practices in GCP
- Create automations and monitoring mechanisms for GCP data-related services, processes, and tasks
- Work with development teams to design the GCP-specific cloud architecture
- Provision and de-provision GCP accounts and resources for internal projects
- Manage and operate multiple GCP subscriptions
- Keep technical documentation up to date
- Proactively stay up to date on GCP announcements, services, and developments

Requirements:
- 5+ years of experience provisioning, operating, and maintaining systems in GCP
- A valid GCP Associate Cloud Engineer or GCP Professional Cloud Architect certification
- Hands-on experience with GCP services such as Identity and Access Management (IAM), BigQuery, and Google Kubernetes Engine (GKE)
- Able to provide support and guidance on GCP operations and services based on enterprise needs
- Working knowledge of Docker containers and Kubernetes
- Strong communication skills and the ability to work both independently and collaboratively
- Fast learner and achiever who sets high personal goals
- Able to work on multiple projects and consistently meet project deadlines
- Willing to work shifts based on project requirements

Good to have:
- Experience with Terraform automation for GCP infrastructure provisioning
- Experience with Cloud Composer, Dataproc, Dataflow, Storage, and Monitoring services
- Experience building and supporting data pipelines
- Multi-cloud experience with AWS
- New Relic monitoring

Perks:
- Day off on the 3rd Friday of every month (one long weekend each month)
- Monthly wellness reimbursement program to promote health and well-being
- Paid paternity and maternity leave

How to apply for this opportunity?
Step 1: Click Apply and register or log in on our portal.
Step 2: Complete the screening form and upload an updated resume.
Step 3: Increase your chances of being shortlisted and meeting the client for an interview!

About Uplers:
Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal; depending on the assessments you clear, you can apply for them as well.)

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
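The least-privilege IAM review this role performs can be sketched, purely as an illustration, as a check of requested roles against a per-principal allowlist. The allowlist below is a made-up policy for the example, not a GCP default.

```python
# Illustrative least-privilege check. The role allowlist is hypothetical;
# real GCP IAM bindings would be inspected via gcloud or the IAM API.

ALLOWED_ROLES = {
    "data-analyst": {"roles/bigquery.dataViewer", "roles/bigquery.jobUser"},
    "pipeline-sa": {"roles/bigquery.dataEditor", "roles/storage.objectViewer"},
}

def violations(principal_type, requested_roles):
    """Return requested roles that exceed the principal's allowed set."""
    return sorted(set(requested_roles) - ALLOWED_ROLES.get(principal_type, set()))

# An analyst asking for project ownership is flagged; in-policy requests pass.
print(violations("data-analyst", ["roles/bigquery.jobUser", "roles/owner"]))
print(violations("pipeline-sa", ["roles/storage.objectViewer"]))
```

In practice such a check would run in CI against Terraform plans or exported IAM policies, rejecting bindings that widen access beyond the approved set.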

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Cuttack, Odisha, India

Remote


(This listing repeats the Forbes Advisor - Senior GCP Cloud Administrator description verbatim; see the Bhubaneswar, Odisha listing above for the full posting.)

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Kolkata, West Bengal, India

Remote


(This listing repeats the Forbes Advisor - Senior GCP Cloud Administrator description verbatim; see the Bhubaneswar, Odisha listing above for the full posting.)

Posted 2 weeks ago

Apply

0 years

0 Lacs

Kolkata, West Bengal, India

On-site


Build the solution for optimal extraction, transformation, and loading of data from a wide variety of data sources using Azure data ingestion and transformation components.

The following technology skills are required:
- Advanced working SQL knowledge and experience with relational databases, query authoring (SQL), and working familiarity with a variety of databases
- Experience with ADF and Dataflow
- Experience with big data tools such as Delta Lake and Azure Databricks
- Experience with Synapse
- Skills in designing an Azure data solution
- Ability to assemble large, complex data sets that meet functional/non-functional business requirements
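The loading work described here typically relies on keyed merges (what Delta Lake exposes as `MERGE INTO`). As a purely illustrative sketch, here is the same upsert pattern on plain Python dictionaries; the `id`/`qty` fields are invented for the example.

```python
# Illustrative keyed upsert -- the pattern Delta Lake's MERGE implements at
# scale. Plain dictionaries stand in for Delta tables here.

def upsert(target, updates, key="id"):
    """Apply updates to target by key: update matching rows, insert the rest."""
    merged = {row[key]: row for row in target}
    for row in updates:
        merged[row[key]] = {**merged.get(row[key], {}), **row}
    return sorted(merged.values(), key=lambda r: r[key])

target = [{"id": 1, "qty": 10}, {"id": 2, "qty": 5}]
updates = [{"id": 2, "qty": 7}, {"id": 3, "qty": 1}]
print(upsert(target, updates))
```

Row 2 is updated in place and row 3 is inserted; a Delta Lake `MERGE` expresses exactly this matched/not-matched split in SQL.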

Posted 2 weeks ago

Apply

4.0 years

0 Lacs

Ghaziabad, Uttar Pradesh, India

On-site


Responsibilities
As a Data Engineer, you will design, develop, and support data pipelines and related data products and platforms. Your primary responsibilities include designing and building data extraction, loading, and transformation pipelines across on-prem and cloud platforms. You will perform application impact assessments, requirements reviews, and develop work estimates. Additionally, you will develop test strategies and site reliability engineering measures for data products and solutions, participate in agile development and solution reviews, mentor junior Data Engineering Specialists, lead the resolution of critical operations issues, and perform technical data stewardship tasks, including metadata management, security, and privacy by design.

Required Skills:
● Design, develop, and support data pipelines and related data products and platforms.
● Design and build data extraction, loading, and transformation pipelines and data products across on-prem and cloud platforms.
● Perform application impact assessments, requirements reviews, and develop work estimates.
● Develop test strategies and site reliability engineering measures for data products and solutions.
● Participate in agile development "scrums" and solution reviews.
● Mentor junior Data Engineers.
● Lead the resolution of critical operations issues, including post-implementation reviews.
● Perform technical data stewardship tasks, including metadata management, security, and privacy by design.
● Design and build data extraction, loading, and transformation pipelines using Python and other GCP data technologies.
● Demonstrate SQL and database proficiency in various data engineering tasks.
● Automate data workflows by setting up DAGs in tools like Control-M, Apache Airflow, and Prefect.
● Develop Unix scripts to support various data operations.
● Model data to support business intelligence and analytics initiatives.
● Utilize infrastructure-as-code tools such as Terraform, Puppet, and Ansible for deployment automation.
● Expertise in GCP data warehousing technologies, including BigQuery, Cloud SQL, Dataflow, Data Catalog, Cloud Composer, Google Cloud Storage, IAM, Compute Engine, Cloud Data Fusion, and Dataproc (good to have).

Qualifications:
● Bachelor's degree in Software Engineering, Computer Science, Business, Mathematics, or a related field.
● 4+ years of data engineering experience.
● 2 years of data solution architecture and design experience.
● GCP Certified Data Engineer (preferred).

Interested candidates can send their resumes to riyanshi@etelligens.in
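The DAG-based workflow automation mentioned above (Control-M, Airflow, Prefect) boils down to scheduling tasks in dependency order. As an illustrative sketch with made-up task names, Python's standard-library `graphlib` can compute that order:

```python
from graphlib import TopologicalSorter

# Illustrative DAG, the idea behind an Airflow/Prefect workflow definition.
# Task names are hypothetical. Each key maps a task to its dependencies.
dag = {
    "load": {"extract_orders", "extract_users"},  # load waits on both extracts
    "report": {"load"},
}

# static_order() yields tasks so every dependency runs before its dependents.
order = list(TopologicalSorter(dag).static_order())
print(order)
```

A scheduler like Airflow does the same resolution per run, plus retries, backfills, and parallel execution of independent tasks (here, the two extracts).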

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Guwahati, Assam, India

Remote


(This listing repeats the Forbes Advisor - Senior GCP Cloud Administrator description verbatim; see the Bhubaneswar, Odisha listing above for the full posting.)

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Jamshedpur, Jharkhand, India

Remote


Experience: 8+ years | Salary: Confidential (based on experience) | Shift: (GMT+05:30) Asia/Kolkata (IST) | Opportunity Type: Remote | Placement Type: Full-time Permanent Position (Note: This is a requirement for one of Uplers' clients - Forbes Advisor)

What do you need for this opportunity?

Must-have skills: Terraform, Cloud Composer, Dataproc, Dataflow, AWS, GCP, BigQuery, SRE, GKE, GCP certification

Forbes Advisor is looking for: Senior GCP Cloud Administrator

Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news, and reviews on all things personal finance, health, business, and everyday life decisions. We do this by giving consumers the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most.

We are looking for an experienced GCP Administrator to join our team. The ideal candidate will have strong hands-on experience with IAM administration, multi-account management, BigQuery administration, performance optimization, monitoring, and cost management within Google Cloud Platform (GCP).

Responsibilities:
- Manage and configure roles/permissions in GCP IAM, following the principle of least-privilege access
- Manage the BigQuery service: optimize slot assignments and SQL queries, adopt FinOps practices for cost control, and troubleshoot and resolve critical data queries
- Collaborate with teams such as Data Engineering, Data Warehousing, Cloud Platform Engineering, and SRE on efficient data management and operational practices in GCP
- Create automation and monitoring mechanisms for GCP data-related services, processes, and tasks
- Work with development teams to design GCP-specific cloud architecture
- Provision and de-provision GCP accounts and resources for internal projects
- Manage and operate multiple GCP subscriptions
- Keep technical documentation up to date
- Stay current on GCP announcements, services, and developments

Requirements:
- 5+ years of experience provisioning, operating, and maintaining systems in GCP
- A valid GCP Associate Cloud Engineer or GCP Professional Cloud Architect certification
- Hands-on experience with GCP services such as Identity and Access Management (IAM), BigQuery, and Google Kubernetes Engine (GKE)
- Ability to provide support and guidance on GCP operations and services according to enterprise needs
- Working knowledge of Docker containers and Kubernetes
- Strong communication skills and the ability to work both independently and collaboratively
- Fast learner and achiever who sets high personal goals
- Ability to work on multiple projects and consistently meet project deadlines
- Willingness to work on a shift basis, depending on project requirements

Good to have:
- Experience with Terraform automation for GCP infrastructure provisioning
- Experience with Cloud Composer, Dataproc, Dataflow, and Storage and Monitoring services
- Experience building and supporting data pipelines of any form
- Multi-cloud experience with AWS
- New Relic monitoring

Perks:
- Day off on the 3rd Friday of every month (one long weekend each month)
- Monthly Wellness Reimbursement Program to promote health and well-being
- Paid paternity and maternity leave

How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of getting shortlisted and meeting the client for an interview!

About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role is to help talent find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal; depending on the assessments you clear, you can apply for those as well.)

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
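The FinOps responsibility named in the listing (slot and cost optimization in BigQuery) can be sketched with a tiny, illustrative cost estimator. This is not from the posting: the $6.25/TiB on-demand rate, the function names, and the job records are all assumptions for illustration; check current GCP pricing before relying on the figure.

```python
# Illustrative sketch only: estimate BigQuery on-demand query cost from
# bytes billed. The $6.25/TiB rate is an assumption for this example.

TIB = 1024 ** 4  # bytes in one tebibyte


def estimate_on_demand_cost(bytes_billed: int, usd_per_tib: float = 6.25) -> float:
    """Approximate USD cost of a query given bytes billed.

    BigQuery bills on-demand queries per TiB scanned (with a small
    per-query minimum, ignored here for simplicity).
    """
    return (bytes_billed / TIB) * usd_per_tib


def flag_expensive_jobs(jobs: list, threshold_usd: float = 1.0) -> list:
    """Return job IDs whose estimated cost exceeds the threshold."""
    return [
        job["job_id"]
        for job in jobs
        if estimate_on_demand_cost(job["total_bytes_billed"]) > threshold_usd
    ]


jobs = [
    {"job_id": "daily_report", "total_bytes_billed": 2 * TIB},       # ~$12.50
    {"job_id": "small_lookup", "total_bytes_billed": 10 * 1024 ** 2},  # ~free
]
print(flag_expensive_jobs(jobs))  # ['daily_report']
```

In practice the `total_bytes_billed` figures would come from BigQuery's job metadata (e.g. the `INFORMATION_SCHEMA.JOBS` views) rather than a hand-written list.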

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Raipur, Chhattisgarh, India

Remote

Linkedin logo

Senior GCP Cloud Administrator - Forbes Advisor (via Uplers). Same role and description as the listing above.

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Amritsar, Punjab, India

Remote

Linkedin logo

Senior GCP Cloud Administrator - Forbes Advisor (via Uplers). Same role and description as the listing above.

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Ranchi, Jharkhand, India

Remote

Linkedin logo

Senior GCP Cloud Administrator - Forbes Advisor (via Uplers). Same role and description as the listing above.

Posted 2 weeks ago

Apply

1.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

🚨 Exciting Opportunity: Data Engineer Contract (1 Year) 🌟

Join a leading UK-based offshore product company as a Data Engineer in Hyderabad! If you thrive on cutting-edge data infrastructure and scalable pipelines on Google Cloud Platform (GCP), this 1-year contract role is perfect for you.

**About the Role:**
- Position: Data Engineer
- Type: Contract (1 Year)
- Experience: 4-13 years
- Notice Period: Immediate to 30 days
- Location: Hyderabad
- Interview: Face-to-face on June 7, June 11, or June 14

**Key Skills Required:**
- Python
- GCP
- SQL
- Google BigQuery

**Responsibilities:**
- Build and maintain scalable data pipelines using Dataflow and Cloud Composer.
- Ingest and transform data into BigQuery from various sources.
- Optimize complex SQL queries for reporting and analytics.
- Automate workflows and orchestrate pipelines with Cloud Composer (Airflow).
- Collaborate with cross-functional teams to meet data requirements.
- Ensure end-to-end data quality, consistency, and availability.
- Monitor, troubleshoot, and resolve data pipeline issues.
- Contribute to documentation, code reviews, and team knowledge sharing.

**Preferred Qualifications:**
- GCP Certification (Professional Data Engineer or similar) is a plus.
- Experience with DevOps practices for data pipelines.
- Exposure to real-time data streaming tools is an advantage.

Ready to take your career to new heights with a global product firm? Don't miss out on this opportunity! Apply now! 🚀

#DataEngineer #GCP #Hyderabad #JobOpening
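The ingestion and data-quality responsibilities above can be illustrated with a minimal, hypothetical transform step of the kind a Dataflow or Composer task might run before loading rows into BigQuery. The field names (`user_id`, `amount`, `ts`) and validation rules are invented for this sketch, not taken from the posting.

```python
# Hedged sketch: validate and normalise records before a warehouse load.
# Field names and rules are hypothetical; a real pipeline would route
# rejected rows to a dead-letter destination instead of silently dropping them.
from datetime import datetime, timezone


def clean_row(row: dict):
    """Validate and normalise one record; return None to drop bad rows."""
    try:
        user_id = str(row["user_id"]).strip()
        amount = float(row["amount"])
        ts = datetime.fromisoformat(row["ts"])
    except (KeyError, ValueError, TypeError):
        return None  # unparseable record
    if not user_id or amount < 0:
        return None  # fails business rules
    if ts.tzinfo is None:  # assume UTC for naive timestamps
        ts = ts.replace(tzinfo=timezone.utc)
    return {"user_id": user_id, "amount": round(amount, 2), "ts": ts.isoformat()}


def transform(rows):
    """Apply clean_row, keeping only valid records."""
    out = []
    for row in rows:
        cleaned = clean_row(row)
        if cleaned is not None:
            out.append(cleaned)
    return out


raw = [
    {"user_id": " u1 ", "amount": "19.991", "ts": "2024-01-05T10:00:00"},
    {"user_id": "", "amount": "5", "ts": "2024-01-05T10:00:00"},  # dropped: empty id
    {"user_id": "u2", "amount": "oops", "ts": "2024-01-05"},      # dropped: bad amount
]
print(transform(raw))
# [{'user_id': 'u1', 'amount': 19.99, 'ts': '2024-01-05T10:00:00+00:00'}]
```

The same shape of function drops cleanly into a Beam `ParDo` or an Airflow task callable, which is why pipeline transforms are often written as plain, unit-testable Python first.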

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Kochi, Kerala, India

Remote

Linkedin logo

Senior GCP Cloud Administrator - Forbes Advisor (via Uplers). Same role and description as the listing above.

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Greater Bhopal Area

Remote

Linkedin logo

Senior GCP Cloud Administrator - Forbes Advisor (via Uplers). Same role and description as the listing above.

Posted 2 weeks ago

Apply

0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Linkedin logo

Role Description

Role Proficiency: This role requires proficiency in developing data pipelines, including coding and testing for ingesting, wrangling, transforming, and joining data from various sources. The ideal candidate should be adept in ETL tools such as Informatica, Glue, Databricks, and Dataproc, with strong coding skills in Python, PySpark, and SQL. This position demands independence and proficiency across various data domains. Expertise in data warehousing solutions such as Snowflake, BigQuery, Lakehouse, and Delta Lake is essential, including the ability to calculate processing costs and address performance issues. A solid understanding of DevOps and infrastructure needs is also required.

Outcomes:
- Act creatively to develop pipelines/applications by selecting appropriate technical options, optimizing application development, maintenance, and performance through design patterns and reuse of proven solutions.
- Support the Project Manager in day-to-day project execution and account for the developmental activities of others.
- Interpret requirements and create optimal architecture and design solutions in accordance with specifications.
- Document and communicate milestones/stages for end-to-end delivery.
- Code using best standards; debug and test solutions to ensure best-in-class quality.
- Tune code performance and align it with the appropriate infrastructure, understanding the cost implications of licenses and infrastructure.
- Create data schemas and models effectively.
- Develop and manage data storage solutions, including relational databases, NoSQL databases, Delta Lakes, and data lakes.
- Validate results with user representatives, integrating the overall solution.
- Influence and enhance customer satisfaction and employee engagement within project teams.

Measures of Outcomes:
- Adherence to engineering processes and standards
- Adherence to schedule/timelines
- Adherence to SLAs where applicable
- Number of defects post-delivery
- Number of non-compliance issues
- Reduction in recurrence of known defects
- Quick turnaround of production bugs
- Completion of applicable technical/domain certifications
- Completion of all mandatory training requirements
- Efficiency improvements in data pipelines (e.g. reduced resource consumption, faster run times)
- Average time to detect, respond to, and resolve pipeline failures or data issues
- Number of data security incidents or compliance breaches

Outputs Expected:
Code: Develop data processing code with guidance, ensuring performance and scalability requirements are met. Define coding standards, templates, and checklists. Review code for team and peers.
Documentation: Create/review templates, checklists, guidelines, and standards for design/process/development. Create/review deliverable documents, including design documents, architecture documents, infra costing, business requirements, source-target mappings, and test cases and results.
Configure: Define and govern the configuration management plan. Ensure compliance from the team.
Test: Review/create unit test cases, scenarios, and execution. Review test plans and strategies created by the testing team. Provide clarifications to the testing team.
Domain Relevance: Advise data engineers on the design and development of features and components, leveraging a deeper understanding of business needs. Learn more about the customer domain and identify opportunities to add value. Complete relevant domain certifications.
Manage Project: Support the Project Manager with project inputs. Provide inputs on project plans or sprints as needed. Manage the delivery of modules.
Manage Defects: Perform defect root cause analysis (RCA) and mitigation. Identify defect trends and implement proactive measures to improve quality.
Estimate: Create and provide input for effort and size estimation and plan resources for projects.
Manage Knowledge: Consume and contribute to project-related documents, SharePoint libraries, and client universities. Review reusable documents created by the team.
Release: Execute and monitor the release process.
Design: Contribute to the creation of design (HLD, LLD, SAD)/architecture for applications, business components, and data models.
Interface with Customer: Clarify requirements and provide guidance to the development team. Present design options to customers. Conduct product demos. Collaborate closely with customer architects to finalize designs.
Manage Team: Set FAST goals and provide feedback. Understand team members' aspirations and provide guidance and opportunities. Ensure team members are upskilled. Engage the team in projects. Proactively identify attrition risks and collaborate with BSE on retention measures.
Certifications: Obtain relevant domain and technology certifications.

Skill Examples:
- Proficiency in SQL, Python, or other programming languages used for data manipulation.
- Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF.
- Hands-on experience with cloud platforms such as AWS, Azure, or Google Cloud, particularly with data-related services (e.g. AWS Glue, BigQuery).
- Conduct tests on data pipelines and evaluate results against data quality and performance specifications.
- Experience in performance tuning.
- Experience in data warehouse design and cost improvements.
- Apply and optimize data models for efficient storage, retrieval, and processing of large datasets.
- Communicate and explain design/development aspects to customers.
- Estimate time and resource requirements for developing/debugging features/components.
- Participate in RFP responses and solutioning.
- Mentor team members and guide them in relevant upskilling and certification.
Knowledge Examples Knowledge Examples Knowledge of various ETL services used by cloud providers including Apache PySpark AWS Glue GCP DataProc/Dataflow Azure ADF and ADLF. Proficient in SQL for analytics and windowing functions. Understanding of data schemas and models. Familiarity with domain-related data. Knowledge of data warehouse optimization techniques. Understanding of data security concepts. Awareness of patterns frameworks and automation practices. Additional Comments Role Proficiency: This role requires proficiency in developing data pipelines including coding and testing for ingesting wrangling transforming and joining data from various sources. The ideal candidate should be adept in ETL tools like Informatica Glue Databricks and DataProc with strong coding skills in Python PySpark and SQL. This position demands independence and proficiency across various data domains. Expertise in data warehousing solutions such as Snowflake BigQuery Lakehouse and Delta Lake is essential including the ability to calculate processing costs and address performance issues. A solid understanding of DevOps and infrastructure needs is also required. Outcomes: Act creatively to develop pipelines/applications by selecting appropriate technical options optimizing application development maintenance and performance through design patterns and reusing proven solutions. Support the Project Manager in day-to-day project execution and account for the developmental activities of others. Interpret requirements create optimal architecture and design solutions in accordance with specifications. Document and communicate milestones/stages for end-to-end delivery. Code using best standards debug and test solutions to ensure best-in-class quality. Tune performance of code and align it with the appropriate infrastructure understanding cost implications of licenses and infrastructure. Create data schemas and models effectively. 
Develop and manage data storage solutions, including relational databases, NoSQL databases, Delta Lakes and data lakes. Validate results with user representatives, integrating the overall solution. Influence and enhance customer satisfaction and employee engagement within project teams. Measures of Outcomes: adherence to engineering processes and standards; adherence to schedules/timelines; adherence to SLAs where applicable; number of defects post-delivery; number of non-compliance issues; reduction in recurrence of known defects; quick turnaround of production bugs; completion of applicable technical/domain certifications; completion of all mandatory training requirements; efficiency improvements in data pipelines (e.g. reduced resource consumption, faster run times); average time to detect, respond to and resolve pipeline failures or data issues; number of data-security incidents or compliance breaches. Outputs Expected: Code: develop data processing code with guidance, ensuring performance and scalability requirements are met; define coding standards, templates and checklists; review code for the team and peers. Documentation: create/review templates, checklists, guidelines and standards for design/process/development; create/review deliverable documents, including design documents, architecture documents, infra costing, business requirements, source-target mappings, test cases and results. Configure: define and govern the configuration management plan; ensure compliance from the team. Test: review/create unit test cases, scenarios and execution; review test plans and strategies created by the testing team; provide clarifications to the testing team. Domain Relevance: advise data engineers on the design and development of features and components, leveraging a deeper understanding of business needs; learn more about the customer domain and identify opportunities to add value.
Complete relevant domain certifications. Manage Project: support the Project Manager with project inputs; provide inputs on project plans or sprints as needed; manage the delivery of modules. Manage Defects: perform defect root cause analysis (RCA) and mitigation; identify defect trends and implement proactive measures to improve quality. Estimate: create and provide input for effort and size estimation, and plan project resources. Manage Knowledge: consume and contribute to project-related documents, SharePoint libraries and client universities; review reusable documents created by the team. Release: execute and monitor the release process. Design: contribute to the creation of design (HLD, LLD, SAD) and architecture for applications, business components and data models. Interface with Customer: clarify requirements and provide guidance to the development team; present design options to customers; conduct product demos; collaborate closely with customer architects to finalize designs. Manage Team: set FAST goals and provide feedback; understand team members' aspirations and provide guidance and opportunities; ensure team members are upskilled and engaged in projects; proactively identify attrition risks and collaborate with BSE on retention measures. Certifications: obtain relevant domain and technology certifications. Skill Examples: proficiency in SQL, Python or other programming languages used for data manipulation; experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue and Dataproc. Skills: Scala, Python, PySpark.
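The "ingesting, wrangling, transforming and joining" pipeline work the role profile describes can be sketched without any framework. All records and field names below are invented for illustration; in practice the same join-and-aggregate shape would be expressed in PySpark or SQL.

```python
# Hypothetical ingested records (all values invented for illustration).
orders = [
    {"order_id": 1, "customer_id": "c1", "amount": 120.0},
    {"order_id": 2, "customer_id": "c2", "amount": 60.0},
    {"order_id": 3, "customer_id": "c1", "amount": 20.0},
]
customers = [
    {"customer_id": "c1", "segment": "retail"},
    {"customer_id": "c2", "segment": "wholesale"},
]

# Wrangle: build a lookup table, then join orders to customers and
# aggregate spend per segment -- a miniature transform-and-join step.
segment_of = {c["customer_id"]: c["segment"] for c in customers}
totals = {}
for o in orders:
    seg = segment_of.get(o["customer_id"], "unknown")
    totals[seg] = totals.get(seg, 0.0) + o["amount"]

print(totals)  # spend aggregated per customer segment
```

In a real pipeline the same logic would be a broadcast join plus `groupBy().sum()` in PySpark, with the `"unknown"` fallback catching referential-integrity defects in the source data.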

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Visakhapatnam, Andhra Pradesh, India

Remote


Experience: 8.00+ years. Salary: confidential (based on experience). Shift: (GMT+05:30) Asia/Kolkata (IST). Opportunity Type: Remote. Placement Type: full-time permanent position. (Note: this is a requirement for one of Uplers' clients, Forbes Advisor.) What do you need for this opportunity? Must-have skills: Terraform, Cloud Composer, Dataproc, Dataflow, AWS, GCP, BigQuery, SRE, GKE, GCP certification. Forbes Advisor is looking for: Senior GCP Cloud Administrator. Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most. We are looking for an experienced GCP Administrator to join our team. The ideal candidate will have strong hands-on experience with IAM administration, multi-account management, BigQuery administration, performance optimization, monitoring and cost management within Google Cloud Platform (GCP). Responsibilities: Manage and configure roles/permissions in GCP IAM, following the principle of least-privilege access. Manage the BigQuery service by optimizing slot assignments and SQL queries, adopting FinOps practices for cost control, and troubleshooting and resolving critical data queries. Collaborate with teams such as Data Engineering, Data Warehousing, Cloud Platform Engineering and SRE on efficient data management and operational practices in GCP. Create automations and monitoring mechanisms for GCP data-related services, processes and tasks. Work with development teams to design the GCP-specific cloud architecture. Provision and de-provision GCP accounts and resources for internal projects.
Manage and operate multiple GCP subscriptions. Keep technical documentation up to date. Proactively stay up to date on GCP announcements, services and developments. Requirements: 5+ years of experience provisioning, operating and maintaining systems in GCP. A valid GCP Associate Cloud Engineer or GCP Professional Cloud Architect certification. Hands-on experience with GCP services such as Identity and Access Management (IAM), BigQuery and Google Kubernetes Engine (GKE). Able to provide support and guidance on GCP operations and services according to enterprise needs. Working knowledge of Docker containers and Kubernetes. Strong communication skills and the ability to work both independently and in a collaborative environment. A fast learner and achiever who sets high personal goals. Able to work on multiple projects and consistently meet project deadlines. Willing to work on a shift basis, based on project requirements. Good to have: experience with Terraform automation for GCP infrastructure provisioning; experience with Cloud Composer, Dataproc, Dataflow, and Storage and Monitoring services; experience building and supporting any form of data pipeline; multi-cloud experience with AWS; New Relic monitoring. Perks: a day off on the 3rd Friday of every month (one long weekend each month); a monthly wellness reimbursement program to promote health and well-being; paid paternity and maternity leave. How to apply for this opportunity? Step 1: Click on Apply and register or log in on our portal. Step 2: Complete the screening form and upload an updated resume. Step 3: Increase your chances of being shortlisted and meeting the client for an interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role is to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers.
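The FinOps-for-BigQuery responsibility above amounts to knowing what a query costs before it runs. A minimal back-of-envelope sketch follows; the dollar rate is an assumption standing in for the published on-demand price per TiB scanned, and should be checked against current Google Cloud pricing rather than relied on.

```python
# Assumed on-demand rate in USD per TiB scanned -- verify against current
# Google Cloud BigQuery pricing before using this for real cost decisions.
ON_DEMAND_USD_PER_TIB = 6.25

def query_cost_usd(bytes_scanned: int) -> float:
    """Estimate on-demand query cost from the bytes the query scans."""
    tib = bytes_scanned / (1024 ** 4)  # bytes -> TiB
    return tib * ON_DEMAND_USD_PER_TIB

# Example: a query scanning 512 GiB is half a TiB.
cost = query_cost_usd(512 * 1024 ** 3)
print(round(cost, 4))
```

The bytes-scanned figure itself comes from a dry run of the query (BigQuery reports it without executing), which is what makes this estimate actionable before any money is spent.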
We will support any grievances or challenges you may face during the engagement. (Note: there are many more opportunities on the portal; depending on the assessments you clear, you can apply for them as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!

Posted 2 weeks ago

Apply






8.0 years

0 Lacs

Vijayawada, Andhra Pradesh, India

Remote

Linkedin logo

Experience: 8.00+ years
Salary: Confidential (based on experience)
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-time Permanent Position
(Note: This is a requirement for one of Uplers' clients - Forbes Advisor)

Must-have skills: Terraform, Cloud Composer, Dataproc, Dataflow, AWS, GCP, BigQuery, SRE, GKE, GCP certification

Forbes Advisor is looking for: Senior GCP Cloud Administrator

Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most.

We are looking for an experienced GCP Administrator to join our team. The ideal candidate will have strong hands-on experience with IAM administration, multi-account management, BigQuery administration, performance optimization, monitoring, and cost management within Google Cloud Platform (GCP).

Responsibilities:

  • Manage and configure roles/permissions in GCP IAM, following the principle of least-privileged access
  • Manage the BigQuery service by optimizing slot assignments and SQL queries, adopting FinOps practices for cost control, and troubleshooting and resolving critical data queries
  • Collaborate with teams such as Data Engineering, Data Warehousing, Cloud Platform Engineering, and SRE for efficient data management and operational practices in GCP
  • Create automation and monitoring mechanisms for GCP data-related services, processes, and tasks
  • Work with development teams to design GCP-specific cloud architecture
  • Provision and de-provision GCP accounts and resources for internal projects
  • Manage and operate multiple GCP subscriptions
  • Keep technical documentation up to date
  • Proactively stay up to date on GCP announcements, services, and developments

Requirements:

  • 5+ years of work experience provisioning, operating, and maintaining systems in GCP
  • A valid GCP Associate Cloud Engineer or GCP Professional Cloud Architect certification
  • Hands-on experience with GCP services such as Identity and Access Management (IAM), BigQuery, and Google Kubernetes Engine (GKE)
  • Ability to provide support and guidance on GCP operations and services based on enterprise needs
  • Working knowledge of Docker containers and Kubernetes
  • Strong communication skills and the ability to work both independently and collaboratively
  • Fast learner and achiever who sets high personal goals
  • Ability to work on multiple projects and consistently meet project deadlines
  • Willingness to work on a shift basis based on project requirements

Good to have:

  • Experience with Terraform automation for GCP infrastructure provisioning
  • Experience with Cloud Composer, Dataproc, Dataflow, Storage, and Monitoring services
  • Experience building and supporting data pipelines
  • Multi-cloud experience with AWS
  • New Relic monitoring

Perks:

  • Day off on the 3rd Friday of every month (one long weekend each month)
  • Monthly Wellness Reimbursement Program to promote health and well-being
  • Paid paternity and maternity leave

How to apply for this opportunity?

  1. Click on Apply and register or log in on our portal.
  2. Complete the screening form and upload your updated resume.
  3. Increase your chances of getting shortlisted and meet the client for the interview!

About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role is to help talent find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal apart from this one; depending on the assessments you clear, you can apply for them as well.)

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!

Posted 2 weeks ago


Exploring Dataflow Jobs in India

The dataflow job market in India is experiencing a surge in demand for skilled professionals. With the increasing reliance on data-driven decision-making across industries, the need for people who can build, manage, and analyze dataflow pipelines is on the rise. This article aims to give job seekers valuable insight into the dataflow job landscape in India.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Pune
  4. Hyderabad
  5. Delhi

These cities are known for their thriving tech ecosystems and are home to numerous companies actively hiring for dataflow roles.

Average Salary Range

The average salary range for dataflow professionals in India varies based on experience levels. Entry-level positions can expect to earn between INR 4-6 lakhs per annum, while experienced professionals can command salaries upwards of INR 12-15 lakhs per annum.

Career Path

In the dataflow domain, a typical career path may involve starting as a Junior Data Analyst or Data Engineer, progressing to roles such as Senior Data Scientist or Data Architect, and eventually reaching positions like Tech Lead or Data Science Manager.

Related Skills

In addition to expertise in dataflow tools and technologies, dataflow professionals are often expected to have proficiency in programming languages such as Python or R, knowledge of databases like SQL, and familiarity with data visualization tools like Tableau or Power BI.
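The Python-plus-SQL combination mentioned above can be illustrated with a small, self-contained sketch using Python's standard-library sqlite3 module. The table, column names, and sample records are invented for the example; they stand in for the kind of cleanup-and-aggregate task a dataflow role routinely involves.

```python
import sqlite3

# Hypothetical sales records; names and values are illustrative only.
rows = [
    ("2024-01-01", "north", 120.0),
    ("2024-01-01", "south", 80.0),
    ("2024-01-02", "north", 150.0),
    ("2024-01-02", "south", None),  # a missing value to be filtered out
]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (day TEXT, region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)

# Aggregate revenue per region, skipping NULL amounts.
totals = dict(
    conn.execute(
        "SELECT region, SUM(amount) FROM sales "
        "WHERE amount IS NOT NULL GROUP BY region"
    )
)
print(totals)  # {'north': 270.0, 'south': 80.0}
```

In an interview, being able to reason about where the NULL filtering happens (in SQL, before aggregation) is often more valuable than memorizing any particular tool's syntax.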

Interview Questions

  • What is dataflow and how is it different from data streaming? (basic)
  • Explain the difference between batch processing and real-time processing. (medium)
  • How do you handle missing or null values in a dataset? (basic)
  • Can you explain the concept of data lineage? (medium)
  • What is the importance of data quality in dataflow processes? (basic)
  • How do you optimize dataflow pipelines for performance? (medium)
  • Describe a time when you had to troubleshoot a dataflow issue. (medium)
  • What are some common challenges faced in dataflow projects? (medium)
  • How do you ensure data security and compliance in dataflow processes? (medium)
  • What are the key components of a dataflow architecture? (medium)
  • Explain the concept of data partitioning in dataflow. (advanced)
  • How would you handle a sudden increase in data volume in a dataflow pipeline? (advanced)
  • What role does data governance play in dataflow processes? (medium)
  • Can you discuss the advantages and disadvantages of using cloud-based dataflow solutions? (medium)
  • How do you stay updated with the latest trends and technologies in dataflow? (basic)
  • What is the significance of metadata in dataflow management? (medium)
  • Walk us through a dataflow project you have worked on from start to finish. (medium)
  • How do you ensure data quality and consistency across different data sources in a dataflow pipeline? (medium)
  • What are some best practices for monitoring and troubleshooting dataflow pipelines? (medium)
  • How do you handle data transformations and aggregations in a dataflow process? (basic)
  • What are the key performance indicators you would track in a dataflow project? (medium)
  • How do you collaborate with cross-functional teams in a dataflow project? (basic)
  • Can you explain the concept of data replication in dataflow management? (advanced)
  • How do you approach data modeling in a dataflow project? (medium)
  • Describe a challenging dataflow problem you encountered and how you resolved it. (advanced)
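Several of the questions above (batch vs. real-time processing, handling malformed input, running aggregations) can be illustrated with a small generator-based pipeline in plain Python. The record layout and event strings are invented for the example; the point is that the same stages serve both batch and streaming use, differing only in whether results are materialized at the end or consumed as they arrive.

```python
from typing import Iterable, Iterator, Tuple, Dict

def parse(lines: Iterable[str]) -> Iterator[Dict]:
    """Turn raw comma-separated lines into records, skipping malformed ones."""
    for line in lines:
        parts = line.strip().split(",")
        if len(parts) != 2:
            continue  # drop malformed input rather than failing the pipeline
        user, value = parts
        yield {"user": user, "value": float(value)}

def running_totals(records: Iterable[Dict]) -> Iterator[Tuple[str, float]]:
    """Emit (user, running_total) as each record arrives -- streaming style."""
    totals: Dict[str, float] = {}
    for rec in records:
        totals[rec["user"]] = totals.get(rec["user"], 0.0) + rec["value"]
        yield rec["user"], totals[rec["user"]]

events = ["alice,10", "bob,5", "garbled-line", "alice,2.5"]

# Batch: materialize everything, then look only at the final state.
batch_result = list(running_totals(parse(events)))[-1]

# Streaming: the same generators process one event at a time, so partial
# results are available before the input is exhausted.
stream = running_totals(parse(iter(events)))
first = next(stream)

print(first)         # ('alice', 10.0)
print(batch_result)  # ('alice', 12.5)
```

Production systems replace these generators with engines like Apache Beam or Spark, but the interview-level distinction is the same: batch waits for complete input, streaming emits incremental results.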

Closing Remark

As you navigate the dataflow job market in India, remember to showcase your skills and experiences confidently during interviews. Stay updated with the latest trends in dataflow and continuously upskill to stand out in a competitive job market. Best of luck in your job search journey!
