Jobs
Interviews

30 GCS Jobs

Set up a Job Alert
JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

8.0 - 12.0 years

0 Lacs

Indore, Madhya Pradesh

On-site

We are seeking a highly skilled and experienced ETL Developer with expertise in data ingestion and extraction to join our team. With 8-12 years of experience, you specialize in building and managing scalable ETL pipelines, integrating diverse data sources, and optimizing data workflows specifically for Snowflake. Your role will involve collaborating with cross-functional teams to extract, transform, and load large-scale datasets in a cloud-based data ecosystem, ensuring data quality, consistency, and performance. Your responsibilities will include designing and implementing processes to extract data from various sources such as on-premise databases, cloud storage (S3, GCS), APIs, and third-party applications. You will ensure seamless data ingestion into Snowflake, utilizing tools like SnowSQL, COPY INTO commands, Snowpipe, and third-party ETL tools (Matillion, Talend, Fivetran). Developing robust solutions for handling data ingestion challenges such as connectivity issues, schema mismatches, and data format inconsistencies will be a key aspect of your role. Within Snowflake, you will perform complex data transformations using SQL-based ELT methodologies, implement incremental loading strategies, and track data changes using Change Data Capture (CDC) techniques. You will optimize transformation processes for performance and scalability, leveraging Snowflake's native capabilities such as clustering, materialized views, and UDFs. Designing and maintaining ETL pipelines capable of efficiently processing terabytes of data will be part of your responsibilities. You will optimize ETL jobs for performance, parallelism, and data compression, ensuring error logging, retry mechanisms, and real-time monitoring for robust pipeline operation. Your role will also involve implementing mechanisms for data validation, integrity checks, duplicate handling, and consistency verification.
Collaborating with stakeholders to ensure adherence to data governance standards and compliance requirements will be essential. You will work closely with data engineers, analysts, and business stakeholders to define requirements and deliver high-quality solutions. Documenting data workflows, technical designs, and operational procedures will also be part of your responsibilities. Your expertise should include 8-12 years of experience in ETL development and data engineering, with significant experience in Snowflake. You should be proficient in tools and technologies such as Snowflake (SnowSQL, COPY INTO, Snowpipe, external tables), ETL Tools (Matillion, Talend, Fivetran), cloud storage (S3, GCS, Azure Blob Storage), databases (Oracle, SQL Server, PostgreSQL, MySQL), and APIs (REST, SOAP for data extraction). Strong SQL skills, performance optimization techniques, data transformation expertise, and soft skills like strong analytical thinking, problem-solving abilities, and excellent communication skills are essential for this role. Location: Bhilai, Indore.
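The COPY INTO-based ingestion this role describes can be sketched as a small statement builder; the function, table, stage, and file-pattern names below are illustrative assumptions, not part of the posting.

```python
from typing import Optional

def build_copy_into(table: str, stage: str, file_format: str = "CSV",
                    pattern: Optional[str] = None) -> str:
    """Assemble a Snowflake COPY INTO statement for bulk ingestion
    from an external stage (e.g. one backed by S3 or GCS)."""
    sql = f"COPY INTO {table} FROM @{stage} FILE_FORMAT = (TYPE = {file_format})"
    if pattern:
        sql += f" PATTERN = '{pattern}'"
    # ON_ERROR = CONTINUE skips bad rows instead of aborting the load,
    # one way to surface data format inconsistencies without stopping the pipeline.
    return sql + " ON_ERROR = CONTINUE"

print(build_copy_into("sales_raw", "ext_stage/sales", pattern=".*[.]csv"))
```

In practice the statement would be executed through SnowSQL or a Snowflake connector session; continuous loading of the same files would go through Snowpipe instead.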

Posted 3 days ago

Apply

9.0 - 11.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About McDonald's: One of the world's largest employers with locations in more than 100 countries, McDonald's Corporation has corporate opportunities in Hyderabad. Our global offices serve as dynamic innovation and operations hubs, designed to expand McDonald's global talent base and in-house expertise. Our new office in Hyderabad will bring together knowledge across business, technology, analytics, and AI, accelerating our ability to deliver impactful solutions for the business and our customers across the globe. Position Summary: We are seeking an experienced Data Architect to design, implement, and optimize scalable data solutions on Amazon Web Services (AWS) and/or Google Cloud Platform (GCP). The ideal candidate will lead the development of enterprise-grade data architectures that support analytics, machine learning, and business intelligence initiatives while ensuring security, performance, and cost optimization. Who we are looking for: Primary Responsibilities: Key Responsibilities Architecture & Design: Design and implement comprehensive data architectures using AWS or GCP services Develop data models, schemas, and integration patterns for structured and unstructured data Create solution blueprints, technical documentation, architectural diagrams, and best practice guidelines Implement data governance frameworks and ensure compliance with security standards Design disaster recovery and business continuity strategies for data systems Technical Leadership: Lead cross-functional teams in implementing data solutions and migrations Provide technical guidance on cloud data services selection and optimization Collaborate with stakeholders to translate business requirements into technical solutions Drive adoption of cloud-native data technologies and modern data practices Platform Implementation: Implement data pipelines using cloud-native services (AWS Glue, Google Dataflow, etc.)
Configure and optimize data lakes and data warehouses (S3/Redshift, GCS/BigQuery) Set up real-time streaming data processing solutions (Kafka, Airflow, Pub/Sub) Implement automated data quality monitoring and validation processes Establish CI/CD pipelines for data infrastructure deployment Performance & Optimization: Monitor and optimize data pipeline performance and cost efficiency Implement data partitioning, indexing, and compression strategies Conduct capacity planning and scaling recommendations Troubleshoot complex data processing issues and performance bottlenecks Establish monitoring, alerting, and logging for data systems Skills: Bachelor's degree in Computer Science, Data Engineering, or related field 9+ years of experience in data architecture and engineering 5+ years of hands-on experience with AWS or GCP data services Experience with large-scale data processing and analytics platforms AWS Redshift, S3, Glue, EMR, Kinesis, Lambda AWS Data Pipeline, Step Functions, CloudFormation BigQuery, Cloud Storage, Dataflow, Dataproc, Pub/Sub GCP Cloud Functions, Cloud Composer, Deployment Manager IAM, VPC, and security configurations SQL and NoSQL databases Big data technologies (Spark, Hadoop, Kafka) Programming languages (Python, Java, SQL) Data modeling and ETL/ELT processes Infrastructure as Code (Terraform, CloudFormation) Container technologies (Docker, Kubernetes) Data warehousing concepts and dimensional modeling Experience with modern data architecture patterns Real-time and batch data processing architectures Data governance, lineage, and quality frameworks Business intelligence and visualization tools Machine learning pipeline integration Strong communication and presentation abilities Leadership and team collaboration skills Problem-solving and analytical thinking Customer-focused mindset with business acumen Preferred Qualifications: Master's degree in relevant field Cloud certifications (AWS Solutions Architect, GCP Professional Data Engineer)
Experience with multiple cloud platforms Knowledge of data privacy regulations (GDPR, CCPA) Work location: Hyderabad, India Work pattern: Full-time role. Work mode: Hybrid. Additional Information: McDonald's is committed to providing qualified individuals with disabilities with reasonable accommodations to perform the essential functions of their jobs. McDonald's provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to sex, sex stereotyping, pregnancy (including pregnancy, childbirth, and medical conditions related to pregnancy, childbirth, or breastfeeding), race, color, religion, ancestry or national origin, age, disability status, medical condition, marital status, sexual orientation, gender, gender identity, gender expression, transgender status, protected military or veteran status, citizenship status, genetic information, or any other characteristic protected by federal, state or local laws. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training. McDonald's Capability Center India Private Limited (McDonald's in India) is a proud equal opportunity employer and is committed to hiring a diverse workforce and sustaining an inclusive culture. At McDonald's in India, employment decisions are based on merit, job requirements, and business needs, and all qualified candidates are considered for employment. McDonald's in India does not discriminate based on race, religion, colour, age, gender, marital status, nationality, ethnic origin, sexual orientation, political affiliation, veteran status, disability status, medical history, parental status, genetic information, or any other basis protected under state or local laws. Nothing in this job posting or description should be construed as an offer or guarantee of employment.
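The partitioning strategies this role lists under Performance & Optimization are often implemented as Hive-style key=value prefixes in the data lake, so engines like BigQuery external tables, Athena, or Spark can prune partitions. A minimal sketch; the bucket and dataset names are assumptions:

```python
from datetime import date

def partition_prefix(bucket: str, dataset: str, d: date) -> str:
    """Build a Hive-style partition prefix (year=/month=/day=) under
    which a day's files would be written in the lake."""
    return (f"{bucket}/{dataset}/"
            f"year={d.year:04d}/month={d.month:02d}/day={d.day:02d}/")

print(partition_prefix("gs://analytics-lake", "orders", date(2024, 1, 5)))
# gs://analytics-lake/orders/year=2024/month=01/day=05/
```

Zero-padding the month and day keeps lexicographic prefix order equal to date order, which is what makes range scans over prefixes cheap.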

Posted 3 days ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

About GlobalLogic GlobalLogic, a leader in digital product engineering with over 30,000 employees, helps brands worldwide in designing and developing innovative products, platforms, and digital experiences. By integrating experience design, complex engineering, and data expertise, GlobalLogic assists clients in envisioning possibilities and accelerating their transition into the digital businesses of tomorrow. Operating design studios and engineering centers globally, GlobalLogic extends its deep expertise to customers in various industries such as communications, financial services, automotive, healthcare, technology, media, manufacturing, and semiconductor. GlobalLogic is a Hitachi Group Company. Requirements Leadership & Strategy As a part of GlobalLogic, you will lead and mentor a team of cloud engineers, providing technical guidance and support for career development. You will define cloud architecture standards and best practices across the organization and collaborate with senior leadership to develop a cloud strategy and roadmap aligned with business objectives. Your responsibilities will include driving technical decision-making for complex cloud infrastructure projects and establishing and maintaining cloud governance frameworks and operational procedures. Leadership Experience With a minimum of 3 years in technical leadership roles managing engineering teams, you should have a proven track record of successfully delivering large-scale cloud transformation projects. Experience in budget management, resource planning, and strong presentation and communication skills for executive-level reporting are essential for this role. Certifications (Preferred) Preferred certifications include Google Cloud Professional Cloud Architect, Google Cloud Professional Data Engineer, and additional relevant cloud or security certifications. 
Technical Excellence You should have over 10 years of experience in designing and implementing enterprise-scale Cloud Solutions using GCP services. As a technical expert, you will architect and oversee the development of sophisticated cloud solutions using Python and advanced GCP services. Your role will involve leading the design and deployment of solutions utilizing Cloud Functions, Docker containers, Dataflow, and other GCP services. Additionally, you will design complex integrations with multiple data sources and systems, implement security best practices, and troubleshoot and resolve technical issues while establishing preventive measures. Job Responsibilities Technical Skills Your expertise should include expert-level proficiency in Python and experience in additional languages such as Java, Go, or Scala. Deep knowledge of GCP services like Dataflow, Compute Engine, BigQuery, Cloud Functions, and others is required. Advanced knowledge of Docker, Kubernetes, and container orchestration patterns, along with experience in cloud security, infrastructure as code, and CI/CD practices, will be crucial for this role. Cross-functional Collaboration Collaborating with C-level executives, senior architects, and product leadership to translate business requirements into technical solutions, leading cross-functional project teams, presenting technical recommendations to executive leadership, and establishing relationships with GCP technical account managers are key aspects of this role. What We Offer At GlobalLogic, we prioritize a culture of caring, continuous learning and development, interesting and meaningful work, balance and flexibility, and a high-trust organization. Join us to experience an inclusive culture, opportunities for growth and advancement, impactful projects, work-life balance, and a safe, reliable, and ethical global company. 
About GlobalLogic GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner known for creating innovative digital products and experiences since 2000. Collaborating with forward-thinking companies globally, GlobalLogic continues to transform businesses and redefine industries through intelligent products, platforms, and services.

Posted 1 week ago

Apply

6.0 - 9.0 years

18 - 25 Lacs

Bangalore Rural, Bengaluru

Work from Office

ETL Tester, ETL/Data Migration Testing, AWS to GCP data migration, PostgreSQL, AlloyDB, Presto, BigQuery, S3, and GCS, Python for test automation, data warehousing and cloud-native tools, PostgreSQL to AlloyDB, Presto to BigQuery, S3 to Google Cloud Storage
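One common check in data migration testing of this kind is reconciling row counts and an order-insensitive fingerprint between source (e.g. PostgreSQL or Presto on AWS) and target (AlloyDB or BigQuery on GCP). A minimal sketch using plain Python lists in place of real query results:

```python
import hashlib

def table_fingerprint(rows):
    """Order-insensitive fingerprint: hash each row, XOR the digests,
    so source and target can be compared without sorting either side."""
    acc = 0
    for row in rows:
        digest = hashlib.sha256(repr(tuple(row)).encode()).hexdigest()
        acc ^= int(digest, 16)
    return len(rows), acc

source = [(1, "a"), (2, "b"), (3, "c")]
target = [(3, "c"), (1, "a"), (2, "b")]  # same data, different order
print("reconciled:", table_fingerprint(source) == table_fingerprint(target))
```

In a real harness the same idea is usually pushed into SQL on both sides (count plus an aggregate hash) so only two small results cross the network.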

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Thiruvananthapuram, Kerala

On-site

You will be part of a dynamic team at Equifax, where we are seeking creative, high-energy, and driven software engineers with hands-on development skills to contribute to various significant projects. As a software engineer at Equifax, you will have the opportunity to work with cutting-edge technology alongside a talented group of engineers. This role is perfect for you if you are a forward-thinking, committed, and enthusiastic individual who is passionate about technology. Your responsibilities will include designing, developing, and operating high-scale applications across the entire engineering stack. You will be involved in all aspects of software development, from design and testing to deployment, maintenance, and continuous improvement. By utilizing modern software development practices such as serverless computing, microservices architecture, CI/CD, and infrastructure-as-code, you will contribute to the integration of our systems with existing internal systems and tools. Additionally, you will participate in technology roadmap discussions and architecture planning to translate business requirements and vision into actionable solutions. Working within a closely-knit, globally distributed engineering team, you will be responsible for triaging product or system issues and resolving them efficiently to ensure the smooth operation and quality of our services. Managing project priorities, deadlines, and deliverables will be a key part of your role, along with researching, creating, and enhancing software applications to advance Equifax Solutions. To excel in this position, you should have a Bachelor's degree or equivalent experience, along with at least 7 years of software engineering experience. Proficiency in mainstream Java, SpringBoot, TypeScript/JavaScript, as well as hands-on experience with Cloud technologies such as GCP, AWS, or Azure, is essential. 
You should also have a solid background in designing and developing cloud-native solutions and microservices using Java, SpringBoot, GCP SDKs, and GKE/Kubernetes. Experience in deploying and releasing software using Jenkins CI/CD pipelines, infrastructure-as-code concepts, Helm Charts, and Terraform constructs is highly valued. Moreover, being a self-starter who can adapt to changing priorities with minimal supervision could set you apart in this role. Additional advantageous skills include designing big data processing solutions, UI development, backend technologies like JAVA/J2EE and SpringBoot, source code control management systems, build tools, working in Agile environments, relational databases, and automated testing. If you are ready to take on this exciting opportunity and contribute to Equifax's innovative projects, apply now and be part of our team of forward-thinking software engineers.

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

If you are seeking an opportunity in the Service Sales field for Measurement System & Solution products, Emerson has an exciting role for you! As a Service Sales Engineer, your responsibilities will include generating spares opportunities, framing Annual Maintenance Contracts, and managing post-sales activities such as shutdown jobs and offshore contracts for field products. You will be accountable for the business growth of MSOL LCS in South India. Your key responsibilities will involve assisting customers in selecting spares and services based on their installed base and budget requirements, proactively following up on quotations, responding to customer inquiries promptly, and ensuring purchase orders align with the proposed solutions. Additionally, you will maintain accurate records in CRM, conduct site walk activities for lead generation, and deliver presentations to showcase Emerson's service strengths. To excel in this role, you must possess good technical knowledge in Level, Flow, Wireless, and Corrosion technologies. Previous experience with products like Radar level, Coriolis flow meters, and Flow computers is essential. Strong presentation and communication skills are also required. Ideally, you should hold a Bachelor's degree in Electronics or Instrumentation Engineering and have 4-6 years of sales experience in related fields. Experience with Analytical Systems for the Power and Oil & Gas Industry, as well as familiarity with Emerson field instruments, will be advantageous. At Emerson, you will have the opportunity to contribute meaningfully through your work. Our compensation and benefits packages are competitive, and we provide comprehensive medical and insurance coverage. We are dedicated to fostering a diverse and inclusive workplace and offer Work Authorization Sponsorship for foreign nationals. We prioritize the development and well-being of our employees, promoting a hybrid work setup for eligible roles to support Work-Life Balance.
Safety is a top priority, and we are committed to providing a safe working environment globally. Join us at Emerson and be part of an organization that values its people and their growth, creating a workplace where everyone can thrive and succeed.

Posted 1 week ago

Apply

6.0 - 8.0 years

18 - 30 Lacs

Hyderabad

Hybrid

Key Skills: Data engineering, Apache Airflow, GCP, BigQuery, GCS, SQL, ETL/ELT, Docker, Kubernetes, data governance, Agile, CI/CD, DevOps, pipeline orchestration, technical leadership. Roles & Responsibilities: Evaluate and provide scalable technical solutions to address complex and interdependent data processes. Ensure data quality and accuracy by implementing data quality checks, data contracts, and governance processes. Collaborate with software development teams and business analysts to understand data requirements and deliver fit-for-purpose data solutions. Lead the team in delivering end-to-end data engineering solutions. Design, develop, and maintain complex applications to support data processing workflows. Develop and manage data pipelines and workflows using Apache Airflow on GCP. Integrate data from various sources into Google BigQuery and Google Cloud Storage (GCS). Write and optimize advanced SQL queries for ETL/ELT processes. Maintain data consistency and troubleshoot issues in data workflows. Create and maintain detailed technical documentation for pipelines and workflows. Mentor junior data engineers and provide technical leadership and support. Lead project planning, execution, and successful delivery of data engineering initiatives. Stay updated with emerging trends and technologies in data engineering and cloud computing. Experience Requirement: 6-8 years of experience in leading the design, development, and deployment of complex data pipelines. Strong working knowledge of Apache Airflow on GCP for orchestration. Hands-on experience integrating data into Google BigQuery and GCS from various sources. Proficient in writing and optimizing complex SQL queries for large-scale data processing. Practical knowledge of containerization technologies like Docker and Kubernetes. Experience in implementing data governance and adhering to data security best practices. Familiarity with Agile methodology and working in cross-functional teams.
Experience with CI/CD pipelines and DevOps practices for data engineering workflows. Education: B.Tech/M.Tech (Dual), B.Tech, M.Tech.
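The data quality checks this role calls for (duplicate handling, required fields) can be sketched as a small pre-load validation step; the function name and column names are illustrative assumptions:

```python
def quality_report(rows, key, required):
    """Return counts of duplicate keys and missing required fields,
    the kind of checks a pipeline can run before loading to BigQuery."""
    seen, dupes, missing = set(), 0, 0
    for row in rows:
        k = row.get(key)
        if k in seen:
            dupes += 1
        seen.add(k)
        if any(row.get(col) in (None, "") for col in required):
            missing += 1
    return {"rows": len(rows), "duplicate_keys": dupes, "missing_required": missing}

batch = [
    {"id": 1, "email": "a@x.com"},
    {"id": 1, "email": "b@x.com"},   # duplicate id
    {"id": 2, "email": None},        # missing required field
]
print(quality_report(batch, key="id", required=["email"]))
```

In an Airflow DAG a check like this would typically sit in its own task, failing the run (or routing to quarantine) when counts exceed a threshold agreed in the data contract.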

Posted 2 weeks ago

Apply

4.0 - 8.0 years

8 - 24 Lacs

Bengaluru, Karnataka, India

On-site

Hi, Exp: 4-8 Years NP: Immediate to 15 Days Location: Pune/Bangalore GCP Core Services: IAM, VPC, GCE (Google Compute Engine), GCS (Google Cloud Storage), Cloud SQL, MySQL, CI/CD tools (Cloud Build/GitHub Actions). Other tools: GitHub, Terraform, Shell Script, Ansible.

Posted 2 weeks ago

Apply

5.0 - 8.0 years

15 - 27 Lacs

Bengaluru

Hybrid

Dear Aspirants, GCP Developer. Experience: 5 to 8 years. Salary Range: Rs. 15.00 LPA to 27.00 LPA. Notice Period: Immediate to 30 days. Location: Bangalore (Hybrid). Skill Area: GCP Platform, Kubernetes, Linux, Automation (Bash/Python), IaC-based infrastructure (Terraform & Ansible), experience with GCS/block storage/object storage including replication, lifecycle, and transfer strategies. Role & responsibilities: GCP Developer. Preferred candidate profile: A Google Cloud Platform certificate is an added advantage, but any of the below certifications can be considered: Google Cloud Skills Boost Associate Cloud Engineer certification, Google Cloud Platform Professional Cloud Architect, Google Professional Cloud Architect (Google certified), Google Professional Data Engineer (Google certified). Please provide the following information along with your updated resume: Total Experience; Relevant Experience in GCP; Certification (any); GCP cloud platform (Yes/No), and if yes, provide your code or share the certification copy; Notice Period; Is a buyout option available (Yes/No), and if yes, the buyout notice period amount; Current Location; Preferred Location; Work from Office/Hybrid; Current Salary; Expected Salary; Any active offer (Yes/No), and if yes, your current offered salary details/offered company name; Interview Date/Time.
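The GCS lifecycle strategies mentioned in the skill area are usually expressed as a JSON policy (the shape accepted by `gcloud storage buckets update --lifecycle-file=...`); a minimal sketch with illustrative age thresholds:

```python
import json

def lifecycle_policy(archive_after_days: int, delete_after_days: int) -> dict:
    """Build a GCS lifecycle policy: demote objects to the ARCHIVE
    storage class after one age threshold, delete them after a later one."""
    return {"rule": [
        {"action": {"type": "SetStorageClass", "storageClass": "ARCHIVE"},
         "condition": {"age": archive_after_days}},
        {"action": {"type": "Delete"},
         "condition": {"age": delete_after_days}},
    ]}

print(json.dumps(lifecycle_policy(30, 365), indent=2))
```

The same dict could equally be fed to Terraform's `lifecycle_rule` blocks; generating it in code keeps the thresholds reviewable and testable.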

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

As a Cloud Engineering Team Leader at GlobalLogic, you will be responsible for providing technical guidance and career development support to a team of cloud engineers. You will define cloud architecture standards and best practices across the organization, collaborating with senior leadership to develop a cloud strategy aligned with business objectives. Your role will involve driving technical decision-making for complex cloud infrastructure projects, establishing and maintaining cloud governance frameworks, and operational procedures. With a background in technical leadership roles managing engineering teams, you will have a proven track record of successfully delivering large-scale cloud transformation projects. Experience in budget management, resource planning, and strong presentation and communication skills for executive-level reporting are essential. Preferred certifications include Google Cloud Professional Cloud Architect, Google Cloud Professional Data Engineer, and additional relevant cloud or security certifications. You will leverage your 10+ years of experience in designing and implementing enterprise-scale Cloud Solutions using GCP services to architect sophisticated cloud solutions using Python and advanced GCP services. Leading the design and deployment of solutions utilizing Cloud Functions, Docker containers, Dataflow, and other GCP services will be part of your responsibilities. Ensuring optimal performance and scalability of complex integrations with multiple data sources and systems, implementing security best practices and compliance frameworks, and troubleshooting and resolving technical issues will be key aspects of your role. 
Your technical skills will include expert-level proficiency in Python with experience in additional languages, deep expertise with GCP services such as Dataflow, Compute Engine, BigQuery, Cloud Functions, and others, advanced knowledge of Docker, Kubernetes, and container orchestration patterns, extensive experience in cloud security, proficiency in Infrastructure as Code tools like Terraform, Cloud Deployment Manager, and CI/CD experience with advanced deployment pipelines and GitOps practices. As part of the GlobalLogic team, you will benefit from a culture of caring, continuous learning and development opportunities, interesting and meaningful work, balance and flexibility in work arrangements, and being part of a high-trust organization. You will have the chance to work on impactful projects, engage with collaborative teammates and supportive leaders, and contribute to shaping cutting-edge solutions in the digital engineering domain.

Posted 2 weeks ago

Apply

4.0 - 6.0 years

8 - 14 Lacs

Chennai

Work from Office

3+ years of experience in Python software development • 3+ years of experience in Cloud technologies & services, preferably GCP • 3+ years of experience practicing statistical methods and their accurate application, e.g. ANOVA, principal component analysis, correspondence analysis, k-means clustering, factor analysis, multivariate analysis, neural networks, causal inference, Gaussian regression, etc. • 3+ years of experience with Python, SQL, BQ • Experience in SonarQube, CI/CD, Tekton, Terraform, GCS, GCP Looker, Vertex AI, Airflow, TensorFlow, etc. • Experience in training, building, and deploying ML and DL models • Ability to understand technical, functional, non-functional, and security aspects of business requirements and deliver them end-to-end • Ability to adapt quickly to open-source products & tools to integrate with ML platforms • Building and deploying models (scikit-learn, DataRobot, TensorFlow, PyTorch, etc.) • Developing and deploying in on-prem & cloud environments • Kubernetes, Tekton, OpenShift, Terraform, Vertex AI. Preferred candidate profile: ML, Python, SQL, BQ, Tekton, Terraform, GCS, GCP Looker, Vertex AI, Airflow, TensorFlow. Chennai location only. 4+ years.
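Of the statistical methods listed, k-means clustering is easy to sketch from first principles; a plain-Python Lloyd's algorithm on toy 2-D points (an illustration only, no claim about the team's actual tooling, which would normally be scikit-learn or BigQuery ML):

```python
import math
import random

def kmeans(points, k, iters=20, seed=0):
    """Plain Lloyd's algorithm on 2-D points: assign each point to its
    nearest centroid, then move each centroid to its cluster's mean."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: math.dist(p, centroids[i]))
            clusters[nearest].append(p)
        centroids = [
            (sum(x for x, _ in c) / len(c), sum(y for _, y in c) / len(c))
            if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
    return centroids

pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
print(sorted(kmeans(pts, 2)))  # one centroid near each of the two clumps
```

The fixed seed makes the run repeatable; production code would also add a convergence test and several restarts, since Lloyd's algorithm only finds a local optimum.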

Posted 2 weeks ago

Apply

8.0 - 12.0 years

18 - 22 Lacs

Navi Mumbai

Work from Office

Job Title: Senior Engineer (HW_GCS_2) Department: R&D - Mech Location: Navi Mumbai, India Job Type: Full-time | On-Site Seniority Level: Mid-Senior Years of Experience: 8-12 Years Minimum Qualification: Bachelor's Degree Job Description: As a Mechatronics/Electronics Engineer, you will be crucial in developing and implementing electro-mechanical systems for avionics and other electronic systems in our aerospace projects. You will work closely with cross-functional teams to ensure the successful integration of electronic components with mechanical systems into our Unmanned Aerial Systems (UAS). This role offers a unique opportunity to work on challenging projects at the forefront of aerospace technology. Key Responsibilities: You will be responsible for spearheading complex electronics system design, analysis, and integration. This role calls for a deep technical expert with a profound understanding of the core principles of electronics engineering and proficiency in mechanical engineering, emphasising the design of state-of-the-art solutions for challenging applications. Design & Development: Responsible for the design and development of high-precision electro-mechanical systems. Defining selection criteria for key electronics & mechanical components, testing and validating them for use in different sub-systems. Coordinate with multidisciplinary teams to seamlessly integrate embedded systems with mechanical systems, ensuring alignment in design parameters and tolerance considerations. Develop and implement comprehensive testing and verification strategies to ensure the robustness and integrity of embedded software & mechanical systems throughout the development lifecycle. Identify and mitigate risks associated with embedded software development & mechanical systems, proactively addressing issues to ensure project success. Experience in designing electro-mechanical/robotic systems through the use of very strong technical fundamentals.
Proficiency in designing electronic circuits and embedded system circuits using microcontrollers (e.g., STM32) with strong fundamentals in Electronics. Proficiency in reviewing and modifying circuits, wiring, and PCB layouts. Knowledge of Embedded C/C++ & familiarity with embedded software development. Developing Best Practices: Work with world-class safety standards to implement best practices within a team. Work from first principles to achieve the highest robustness and value addition in engineering solutions. Establish practices to deliver a design that is highest-performance, reliable, scalable to manufacture, easy to maintain, and re-usable. Skills & Qualification: Bachelor's or Master's degree in Electronics, Mechatronics, Robotics, or Aeronautical (Avionics). Strong electro-mechanical design instincts and a thorough understanding of dynamics & control. Hands-on expertise in housing electronic & circuit assembly. Working/basic knowledge of any parametric modelling CAD software. Basics of GD&T and tolerance stack-up. A deep appreciation for technology evolution and modern engineering practices as deployed in world-class product development processes. Worked in an Indian or global company that delivers high-quality systems integrating mechanical & electronics hardware and/or studied at a reputed academic institution while demonstrating initiative and rigour to learn and create innovative engineering solutions.
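The tolerance stack-up basics asked for above reduce, in the simple 1-D case, to comparing a worst-case sum of tolerances against the statistical root-sum-square (RSS) estimate; the part count and tolerance values below are illustrative:

```python
import math

def stack_up(tolerances):
    """Two standard 1-D stack-up estimates: worst-case simply sums the
    symmetric tolerances; statistical RSS root-sum-squares them."""
    worst_case = sum(tolerances)
    rss = math.sqrt(sum(t * t for t in tolerances))
    return worst_case, rss

# Four parts toleranced at +/-0.1 mm each (illustrative values):
wc, rss = stack_up([0.1, 0.1, 0.1, 0.1])
print(f"worst case +/-{wc:.2f} mm, RSS +/-{rss:.2f} mm")
```

RSS is the smaller (less conservative) figure because it assumes independent, centered variation across parts; worst-case assumes every part sits at its limit simultaneously.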

Posted 3 weeks ago

Apply

8.0 - 12.0 years

18 - 25 Lacs

Navi Mumbai

Work from Office

Design and development of high-precision electro-mechanical systems Develop and implement comprehensive testing and verification strategies Identify and mitigate risks associated with embedded software development & mechanical systems Required Candidate profile Bachelor's or Master's degree in Electronics, Mechatronics, Robotics, Aeronautical (Avionics). 8-12 years experience Strong electro-mechanical design instincts Basics of GD&T and tolerance stack-up.

Posted 3 weeks ago

Apply

3.0 - 4.0 years

6 - 7 Lacs

Chennai

Hybrid

Overview: TekWissen is a global workforce management provider throughout India and many other countries in the world. The below client is a global company with shared ideals and a deep sense of family. From our earliest days as a pioneer of modern transportation, we have sought to make the world a better place, one that benefits lives, communities and the planet. Job Title: Specialty Development Consultant Location: Chennai Work Type: Hybrid Position Description: Software development using React/Angular full stack; work with Tech Anchors, Product Managers and the Team internally and across other Teams. Ability to understand technical, functional, non-functional, and security aspects of business requirements and deliver them end-to-end. Software development using a TDD approach. Experience using GCP products & services. Ability to adapt quickly to open-source products & tools to integrate with ML Platforms. Skills Required: 3+ years of experience in React/Angular full stack software development; 3+ years of experience in Cloud technologies & services, preferably GCP; Experience in SonarQube, CI/CD, Tekton, Terraform, GCS, GCP, etc.; Kubernetes, Tekton, OpenShift, Terraform, Vertex AI. Skills Preferred: Good Communication, Presentation and Collaboration Skills. Experience Required: 2 to 5 yrs. Experience Preferred: API development and GCP deployment. Education Required: BE, BTech, MCA, M.Sc, ME. TekWissen Group is an equal opportunity employer supporting workforce diversity.

Posted 4 weeks ago

Apply

4.0 - 6.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Major skillset: GCP, PySpark, SQL, Python, Cloud Architecture, ETL, Automation

4+ years of experience in data engineering and management, with a strong focus on Spark for building production-ready data pipelines
Experienced in analyzing large data sets from multiple data sources and building automated testing and validations
Knowledge of the Hadoop ecosystem and components such as HDFS, Spark, Hive, Sqoop
Strong Python experience
Hands-on SQL and HQL to write optimized queries
Strong hands-on experience with GCP: BigQuery, Dataproc, Airflow DAGs, Dataflow, GCS, Pub/Sub, Secret Manager, Cloud Functions, Beam
Ability to work in a fast-paced collaborative environment and work with various stakeholders to define strategic optimization initiatives
Deep understanding of distributed computing, memory tuning and Spark optimization
Familiarity with CI/CD workflows and Git
Experience in designing modular, automated, and secure ETL frameworks

Posted 1 month ago

Apply

3.0 - 7.0 years

3 - 7 Lacs

Bengaluru, Karnataka, India

On-site

We are seeking a skilled GCP Data Engineer to join our team in India. The ideal candidate will have a strong background in designing and implementing data processing systems using Google Cloud Platform, with a focus on building scalable and efficient data pipelines.

Responsibilities:
Designing, building, and maintaining scalable data processing systems on Google Cloud Platform (GCP)
Developing data pipelines to ensure efficient data flow and transformation
Collaborating with data scientists and analysts to understand data requirements and deliver appropriate solutions
Implementing data security and compliance measures in accordance with best practices
Monitoring and optimizing data storage and processing performance
Troubleshooting and resolving data-related issues in a timely manner

Skills and Qualifications:
3-7 years of experience in data engineering or a related field
Proficiency in GCP services such as BigQuery, Dataflow, Cloud Storage, and Pub/Sub
Strong experience with SQL and NoSQL databases
Familiarity with data modeling and ETL processes
Knowledge of programming languages such as Python or Java
Understanding of data warehousing concepts and best practices
Experience with CI/CD tools and practices
Ability to work collaboratively in a team environment and communicate effectively with stakeholders

Posted 1 month ago

Apply

3.0 - 7.0 years

3 - 7 Lacs

Gurgaon, Haryana, India

On-site

We are seeking a skilled GCP Data Engineer to join our team in India. The ideal candidate will have a strong background in designing and implementing data processing systems using Google Cloud Platform, with a focus on building scalable and efficient data pipelines.

Responsibilities:
Designing, building, and maintaining scalable data processing systems on Google Cloud Platform (GCP)
Developing data pipelines to ensure efficient data flow and transformation
Collaborating with data scientists and analysts to understand data requirements and deliver appropriate solutions
Implementing data security and compliance measures in accordance with best practices
Monitoring and optimizing data storage and processing performance
Troubleshooting and resolving data-related issues in a timely manner

Skills and Qualifications:
3-7 years of experience in data engineering or a related field
Proficiency in GCP services such as BigQuery, Dataflow, Cloud Storage, and Pub/Sub
Strong experience with SQL and NoSQL databases
Familiarity with data modeling and ETL processes
Knowledge of programming languages such as Python or Java
Understanding of data warehousing concepts and best practices
Experience with CI/CD tools and practices
Ability to work collaboratively in a team environment and communicate effectively with stakeholders

Posted 1 month ago

Apply

3.0 - 7.0 years

3 - 7 Lacs

Mumbai, Maharashtra, India

On-site

We are seeking a skilled GCP Data Engineer to join our team in India. The ideal candidate will have a strong background in designing and implementing data processing systems using Google Cloud Platform, with a focus on building scalable and efficient data pipelines.

Responsibilities:
Designing, building, and maintaining scalable data processing systems on Google Cloud Platform (GCP)
Developing data pipelines to ensure efficient data flow and transformation
Collaborating with data scientists and analysts to understand data requirements and deliver appropriate solutions
Implementing data security and compliance measures in accordance with best practices
Monitoring and optimizing data storage and processing performance
Troubleshooting and resolving data-related issues in a timely manner

Skills and Qualifications:
3-7 years of experience in data engineering or a related field
Proficiency in GCP services such as BigQuery, Dataflow, Cloud Storage, and Pub/Sub
Strong experience with SQL and NoSQL databases
Familiarity with data modeling and ETL processes
Knowledge of programming languages such as Python or Java
Understanding of data warehousing concepts and best practices
Experience with CI/CD tools and practices
Ability to work collaboratively in a team environment and communicate effectively with stakeholders

Posted 1 month ago

Apply

6.0 - 11.0 years

10 - 15 Lacs

Bengaluru

Work from Office

Shift: (GMT+05:30) Asia/Kolkata (IST)

Must-have skills: Machine Learning (ML), ML architectures and lifecycle, Airflow, Kubeflow, MLflow, Spark, Kubernetes, Docker, Python, SQL, machine learning platforms, BigQuery, GCS, Dataproc, AI Platform, search ranking, deep learning, deep learning frameworks, PyTorch, TensorFlow

About the Job:
Candidates for this position are preferred to be based in Bangalore, India and will be expected to comply with their team's hybrid work schedule requirements.

Who We Are:
Wayfair's Advertising business is rapidly expanding, adding hundreds of millions of dollars in profits to Wayfair. We are building Sponsored Products, Display and Video Ad offerings that cater to a variety of advertiser goals while showing highly relevant and engaging ads to millions of customers. We are evolving our Ads Platform to empower advertisers across all sophistication levels to grow their business on Wayfair at a strong, positive ROI, leveraging state-of-the-art machine learning techniques.

What You'll Do:
Provide technical leadership in the development of an automated and intelligent advertising system by advancing the state of the art in machine learning techniques to support recommendations for ad campaigns and other optimizations
Design, build, deploy and refine extensible, reusable, large-scale, real-world platforms that optimize our ads experience
Work cross-functionally with commercial stakeholders to understand business problems or opportunities and develop appropriately scoped machine learning solutions
Collaborate closely with various engineering, infrastructure, and machine learning platform teams to ensure adoption of best practices in how we build and deploy scalable machine learning services
Identify new opportunities and insights from the data (where can the models be improved? what is the projected ROI of a proposed modification?)
Research new developments in advertising, sorting and recommendations research and open-source packages, and incorporate them into our internal packages and systems
Be obsessed with the customer and maintain a customer-centric lens in how we frame, approach, and ultimately solve every problem we work on

We Are a Match Because You Have:
Bachelor's or Master's degree in Computer Science, Mathematics, Statistics, or a related field
6-9 years of industry experience in advanced machine learning and statistical modeling, including hands-on experience designing and building production models at scale
Strong theoretical understanding of statistical models such as regression and clustering, and machine learning algorithms such as decision trees and neural networks
Familiarity with machine learning model development frameworks and machine learning orchestration and pipelines, with experience in Airflow, Kubeflow or MLflow as well as Spark, Kubernetes, Docker, Python, and SQL
Proficiency in Python or one other high-level programming language
Solid hands-on expertise deploying machine learning solutions into production
Strong written and verbal communication skills, the ability to synthesize conclusions for non-experts, and an overall bias towards simplicity

Nice to Have:
Familiarity with machine learning platforms offered by Google Cloud and how to implement them at scale (e.g. BigQuery, GCS, Dataproc, AI Notebooks)
Experience in computational advertising, bidding algorithms, or search ranking
Experience with deep learning frameworks like PyTorch, TensorFlow, etc.

Posted 1 month ago

Apply

7.0 - 12.0 years

15 - 30 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Warm greetings from SP Staffing Services Private Limited! We have an urgent opening with our CMMI Level 5 client for the below position. Please send your updated profile if you are interested.

Relevant Experience: 5-15 years
Location: Pan India

Job Description:
Minimum 2 years of hands-on experience in GCP development (data engineering)
Position: Developer / Tech Lead / Architect

Interested candidates can share their resume to sankarspstaffings@gmail.com with the below details inline:
Overall Experience:
Relevant Experience:
Current CTC:
Expected CTC:
Notice Period:

Posted 1 month ago

Apply

7.0 - 12.0 years

7 - 12 Lacs

Bengaluru

Work from Office

About the Role:
We are looking for a seasoned Engineering Manager, well-versed in emerging technologies, to join our team. As an Engineering Manager, you will ensure consistency and quality by shaping the right strategies. You will keep an eye on all engineering projects and ensure all duties are fulfilled. You will analyse other employees' tasks and carry out collaborations effectively. You will also mentor new joiners into experts and build reports on the progress of all projects.

What You Will Do:
Design tasks for other engineers as per Meesho's guidelines
Perform regular performance evaluations and share and seek feedback
Keep a close eye on various projects and monitor their progress
Carry out smooth collaborations with the sales and design teams to innovate on new products
Manage engineers and take ownership of projects while ensuring product scalability
Conduct regular meetings to plan and develop reports on the progress of projects

What You Will Need:
Bachelor's/Master's in Computer Science
At least 7+ years of professional experience
At least 2 years of experience in managing software development teams
Able to drive sprints and OKRs
Deep understanding of transactional and NoSQL databases
Deep understanding of messaging systems such as Kafka
Good experience with cloud infrastructure (AWS/GCS)
Good to have: data pipelines, ES
Exceptional team management skills; experience in building large-scale distributed systems
Experience in scalable systems
Expertise in Java/Python and multithreading

Posted 1 month ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Mumbai

Work from Office

Job Summary:
The UPS Enterprise Data Analytics team is looking for a talented and motivated Data Scientist to use statistical modelling and state-of-the-art AI tools and techniques to solve complex, large-scale business problems for UPS operations. This role will also support debugging and enhancing existing AI applications in close collaboration with the Machine Learning Operations team. This position will work with multiple stakeholders across different levels of the organization to understand the business problem, and develop and help implement robust and scalable solutions. You will be in a high-visibility position with the opportunity to interact with senior leadership to bring forth innovation within the operational space for UPS. Success in this role requires excellent communication to present your cutting-edge solutions to both technical and business leadership.

Responsibilities:
Become a subject matter expert on UPS business processes and data to help define and solve business needs using data, advanced statistical methods and AI
Be actively involved in understanding and converting business use cases to technical requirements for modelling
Query, analyze and extract insights from large-scale structured and unstructured data from different data sources, utilizing platforms, methods and tools like BigQuery, Google Cloud Storage, etc.
Understand and apply appropriate methods for cleaning and transforming data and engineering relevant features to be used for modelling
Actively drive modelling of business problems into ML/AI models; work closely with stakeholders for model evaluation and acceptance
Work closely with the MLOps team to productionize new models, support enhancements and resolve any issues within existing production AI applications
Prepare extensive technical documentation, dashboards and presentations for technical and business stakeholders, including leadership teams

Qualifications:
Expertise in Python and SQL
Experienced in using data science packages like scikit-learn, NumPy, pandas, TensorFlow, Keras, statsmodels, etc.
Strong understanding of statistical concepts and methods (hypothesis testing, descriptive statistics, etc.) and machine learning techniques for regression, classification and clustering problems, including neural networks and deep learning
Proficient in using GCP tools like Vertex AI, BigQuery, GCS, etc. for model development and other activities in the ML lifecycle
Strong ownership and collaborative qualities in the relevant domain; takes initiative to identify and drive opportunities for improvement and process streamlining
Solid oral and written communication skills, especially around analytical concepts and methods; ability to communicate data through a story framework to convey data-driven results to technical and non-technical audiences
Master's degree in a quantitative field such as mathematics, computer science, physics, economics, engineering or statistics (operations research, quantitative social science, etc.), an international equivalent, or equivalent job experience

Bonus Qualifications:
NLP, Gen AI, LLM knowledge/experience
Knowledge of Operations Research methodologies and experience with packages like CPLEX, PuLP, etc.
Knowledge and experience of MLOps principles and tools in GCP
Experience working in an Agile environment; understanding of Lean-Agile principles

Posted 1 month ago

Apply

5.0 - 10.0 years

8 - 10 Lacs

Chennai, Tamil Nadu, India

On-site

Google Cloud Platform: GCS, Dataproc, BigQuery, Dataflow
Programming languages: Java; scripting languages like Python, Shell Script, SQL
5+ years of experience in IT application delivery with proven experience in agile development methodologies
1 to 2 years of experience in Google Cloud Platform (GCS, Dataproc, BigQuery, Composer, and data processing tools like Dataflow)
Mandatory Key Skills: Google Cloud Platform, GCS, Dataproc, BigQuery, Dataflow, Composer, Data Processing, Java

Posted 1 month ago

Apply

5.0 - 7.0 years

5 - 7 Lacs

Chennai, Tamil Nadu, India

On-site

Position Summary:
As a Data Engineer III at Walmart, you will design, build, and maintain scalable data systems and architectures to enable advanced business intelligence and analytics. Your role will be pivotal in ensuring data quality, integrity, and security to support Walmart's data-driven decision-making and operational efficiency. You will collaborate closely with cross-functional teams to implement robust data solutions that drive business growth.

About the Team:
The Walmart Last Mile Data team operates within the Data and Customer Analytics organization, focusing on optimizing Walmart's last mile delivery operations. This dynamic group uses cutting-edge data engineering and analytics technologies to improve delivery routing, reduce costs, and enhance customer satisfaction.

What You'll Do:
Design, develop, and deploy data pipelines and integration solutions using technologies such as Spark, Scala, Python, Airflow, and Google Cloud Platform (GCP)
Build scalable, efficient data processing systems while optimizing workflows to ensure high data quality and availability
Monitor, troubleshoot, and tune data pipelines to maximize reliability and performance
Partner with executive leadership, product, data, and design teams to address data infrastructure needs and technical challenges
Provide data scientists and analysts with well-structured data sets and tools to facilitate analytics and reporting
Develop analytics tools that contribute to Walmart's position as an industry leader
Stay current with data engineering trends and technologies, incorporating best practices to enhance data infrastructure

What You'll Bring:
Minimum 5 years of proven experience as a Data Engineer
Strong programming skills in Scala and experience with Spark for large-scale data processing and analytics
Expertise with Google Cloud Platform services such as BigQuery, Google Cloud Storage (GCS), and Dataproc
Experience building near real-time data ingestion pipelines using Kafka and Spark Structured Streaming
Solid knowledge of data modeling, data warehousing concepts, and ETL processes
Proficiency in SQL and NoSQL databases
Experience with version control systems, preferably Git
Familiarity with distributed computing frameworks and working with large-scale data sets
Understanding of CI/CD pipelines and tools such as Jenkins or GitLab CI
Experience with workflow schedulers like Airflow
Strong analytical and problem-solving abilities
Familiarity with BI and visualization tools such as Tableau or Looker
Experience or interest in Generative AI is a plus but not required

About Walmart Global Tech:
At Walmart Global Tech, your work impacts millions worldwide by simplifying complex challenges through technology. Join a diverse team of innovators shaping the future of retail, where people and technology come together to drive lasting impact. We support career growth with continuous learning and flexible hybrid work models.

Minimum Qualifications:
Bachelor's degree in Computer Science or a related field with at least 2 years of software engineering experience, OR 4 years of relevant software engineering or data engineering experience without a degree, OR a Master's degree in Computer Science

Posted 1 month ago

Apply

6.0 - 8.0 years

18 - 20 Lacs

Chennai

Work from Office

Google Cloud Platform: GCS, Dataproc, BigQuery, Dataflow
Programming languages: Java; scripting languages like Python, Shell Script, SQL
5+ years of experience in IT application delivery with proven experience in agile development methodologies
1 to 2 years of experience in Google Cloud Platform (GCS, Dataproc, BigQuery, Composer, and data processing tools like Dataflow)

Posted 2 months ago

Apply
Page 1 of 2