5.0 years
0 Lacs
Noida, Uttar Pradesh, India
Remote
GCP Data Engineer – On-Premises to Cloud SQL Migration
Experience: 5-8 years
Location: Noida
Notice period: Immediate / serving
Work Mode: WFO/Remote/Hybrid (depends on client's ask)
Budget: open

DATA ENGINEER IV

Job Description:
• As a Data Engineer focused on migrating on-premises databases to Google Cloud SQL, you will play a critical role in solving complex problems and creating value for the business by ensuring reliable, scalable, and efficient data migration processes. You will be responsible for architecting, designing, and implementing custom pipelines on the GCP stack to facilitate seamless migration (a sketch of one such step follows below).

Required Skills:
• 5+ years of industry experience in data engineering, business intelligence, or a related field, with experience manipulating, processing, and extracting value from datasets.
• Expertise in architecting, designing, building, and deploying internal applications to support technology life cycle management, service delivery management, data, and business intelligence.
• Experience developing modular code for versatile pipelines or complex ingestion frameworks aimed at loading data into Cloud SQL and managing data migration from multiple on-premises sources.
• Strong collaboration with analysts and business process owners to translate business requirements into technical solutions.
• Proficiency in scripting languages (shell scripting, Python, SQL).
• Deep understanding and hands-on experience with Google Cloud Platform (GCP) technologies, especially in data migration and warehousing, including Database Migration Service (DMS), Cloud SQL, BigQuery, Dataflow, Data Catalog, Cloud Composer, Google Cloud Storage (GCS), IAM, Compute Engine, Cloud Data Fusion, and optionally Dataproc.
• Adherence to best development practices, including technical design, solution development, systems configuration, test documentation/execution, issue identification and resolution, and writing clean, modular, self-sustaining code.
• Familiarity with CI/CD processes using GitHub, Cloud Build, and the Google Cloud SDK.

Qualifications:
• Bachelor's degree in Computer Science or a related technical field, or equivalent practical experience.
• GCP Certified Data Engineer (preferred).
• Excellent verbal and written communication skills with the ability to effectively advocate technical solutions to research scientists, engineering teams, and business audiences.
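A minimal sketch of the kind of table-copy step a custom migration pipeline might wrap when the managed Database Migration Service path does not fit; the connection strings, table name, and batch size are hypothetical placeholders, not the client's actual setup:

```python
# Minimal sketch: batch-copy one table from an on-premises MySQL source
# into a Cloud SQL for MySQL target. Requires sqlalchemy and pymysql.
# All connection details below are hypothetical placeholders.
import sqlalchemy as sa

SOURCE_URL = "mysql+pymysql://user:pass@onprem-host:3306/sales"  # hypothetical
TARGET_URL = "mysql+pymysql://user:pass@10.0.0.5:3306/sales"     # Cloud SQL private IP (hypothetical)
BATCH_SIZE = 5000

def copy_table(table_name: str) -> None:
    source = sa.create_engine(SOURCE_URL)
    target = sa.create_engine(TARGET_URL)
    meta = sa.MetaData()
    table = sa.Table(table_name, meta, autoload_with=source)  # reflect schema from source
    meta.create_all(target)                                   # create target table if absent
    with source.connect() as src, target.connect() as dst:
        result = src.execution_options(stream_results=True).execute(sa.select(table))
        while True:
            rows = result.fetchmany(BATCH_SIZE)   # stream in batches to bound memory
            if not rows:
                break
            dst.execute(table.insert(), [dict(r._mapping) for r in rows])
            dst.commit()

if __name__ == "__main__":
    copy_table("orders")   # hypothetical table name
```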
Posted 1 month ago
4.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Required Experience: 4+ years in IT, with at least 3+ years in a Data Engineer role.

Responsibilities:
- Proficient in GCP BigQuery: dataset creation – schemas, tables, materialized views.
- Proficiency in data processing from Pub/Sub to BigQuery to Analytics Hub (see the sketch below).
- Designing data schemas to align with BigQuery-native structures.
- Optimization and testing for production-level loads.
- Publish transaction data into BigQuery.

Required Skills: GCP, BigQuery, Dataflow, SQL
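As an illustration of the Pub/Sub-to-BigQuery flow this role describes, a minimal sketch using the Google Cloud client libraries; the project, subscription, and table IDs are hypothetical placeholders:

```python
# Minimal sketch: consume JSON messages from a Pub/Sub subscription and
# stream them into a BigQuery table. IDs below are hypothetical.
import json
from google.cloud import bigquery, pubsub_v1

PROJECT_ID = "my-project"                      # hypothetical
SUBSCRIPTION = "transactions-sub"              # hypothetical
TABLE_ID = "my-project.payments.transactions"  # hypothetical

bq = bigquery.Client()
subscriber = pubsub_v1.SubscriberClient()
sub_path = subscriber.subscription_path(PROJECT_ID, SUBSCRIPTION)

def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    row = json.loads(message.data.decode("utf-8"))
    errors = bq.insert_rows_json(TABLE_ID, [row])  # streaming insert
    if errors:
        message.nack()   # let Pub/Sub redeliver on failure
    else:
        message.ack()

future = subscriber.subscribe(sub_path, callback=callback)
try:
    future.result()      # block and process messages until interrupted
except KeyboardInterrupt:
    future.cancel()
```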
Posted 1 month ago
6.0 - 10.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Description

Join GlobalLogic to be a valued part of the team working on a huge software project for a world-class company providing M2M / IoT 4G/5G modules to the automotive, healthcare, and logistics industries, among others. Through our engagement, we contribute to developing the end-user modules' firmware, implementing new features, maintaining compatibility with the newest telecommunication and industry standards, and performing analysis and estimation of customer requirements.

Requirements

BA / BS degree in Computer Science, Mathematics, or a related technical field, or equivalent practical experience.
Experience in Cloud SQL and Cloud Bigtable.
Experience in Dataflow, BigQuery, Dataproc, Datalab, Dataprep, Pub/Sub, and Genomics.
Experience in Google Transfer Appliance, Cloud Storage Transfer Service, and BigQuery Data Transfer.
Experience with data processing software (such as Hadoop, Kafka, Spark, Pig, Hive) and data processing algorithms (MapReduce, Flume).
Experience working with technical customers.
Experience writing software in one or more languages such as Java or Python.
6-10 years of relevant consulting, industry, or technology experience.
Strong problem-solving and troubleshooting skills.
Strong communicator.

Job responsibilities

Experience working with data warehouses, including data warehouse technical architectures, infrastructure components, ETL/ELT, and reporting/analytic tools and environments.
Experience in technical consulting.
Experience architecting, developing software, or building internet-scale production-grade Big Data solutions in virtualized environments such as Google Cloud Platform (mandatory) and AWS/Azure (good to have).
Experience working with big data, information retrieval, data mining, or machine learning, as well as experience building multi-tier, high-availability applications with modern web technologies (such as NoSQL, Kafka, NLP, MongoDB, SparkML, TensorFlow).
Working knowledge of ITIL and/or agile methodologies.
Google Data Engineer certified.

What we offer

Culture of caring. At GlobalLogic, we prioritize a culture of caring. Across every region and department, at every level, we consistently put people first. From day one, you'll experience an inclusive culture of acceptance and belonging, where you'll have the chance to build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders.

Learning and development. We are committed to your continuous learning and development. You'll learn and grow daily in an environment with many opportunities to try new things, sharpen your skills, and advance your career at GlobalLogic. With our Career Navigator tool as just one example, GlobalLogic offers a rich array of programs, training curricula, and hands-on opportunities to grow personally and professionally.

Interesting & meaningful work. GlobalLogic is known for engineering impact for and with clients around the world. As part of our team, you'll have the chance to work on projects that matter. Each is a unique opportunity to engage your curiosity and creative problem-solving skills as you help clients reimagine what's possible and bring new solutions to market. In the process, you'll have the privilege of working on some of the most cutting-edge and impactful solutions shaping the world today.

Balance and flexibility. We believe in the importance of balance and flexibility. With many functional career areas, roles, and work arrangements, you can explore ways of achieving the perfect balance between your work and life. Your life extends beyond the office, and we always do our best to help you integrate and balance the best of work and life, having fun along the way!

High-trust organization. We are a high-trust organization where integrity is key. By joining GlobalLogic, you're placing your trust in a safe, reliable, and ethical global company. Integrity and trust are a cornerstone of our value proposition to our employees and clients. You will find truthfulness, candor, and integrity in everything we do.

About GlobalLogic

GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner to the world's largest and most forward-thinking companies. Since 2000, we've been at the forefront of the digital revolution – helping create some of the most innovative and widely used digital products and experiences. Today we continue to collaborate with clients in transforming businesses and redefining industries through intelligent products, platforms, and services.
Posted 1 month ago
6.0 years
0 Lacs
India
Remote
Role: GCP Data Engineer
Experience: 6+ years
Type: Contract
Duration: 6 months
Location: Remote
Time zone: IST shift

Job Description:
We are looking for a skilled GCP Data Engineer with strong expertise in SQL and Python coding. The ideal candidate will have hands-on experience with Google Cloud Platform (GCP) services, especially BigQuery, and will be responsible for designing, building, and optimizing data pipelines and analytics solutions (a sketch follows below).

Key Skills:
Strong proficiency in SQL and Python
Experience with GCP services, especially BigQuery
Data pipeline development and ETL processes
Good understanding of data warehousing and data modeling

Nice to Have:
Experience with Cloud Functions, Dataflow, or Pub/Sub
Exposure to CI/CD and DevOps practices on GCP
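For a flavour of the SQL-plus-Python work described, a minimal sketch that runs a parameterized BigQuery query and writes the result to a destination table; the dataset and table names are hypothetical placeholders:

```python
# Minimal sketch: parameterized BigQuery query from Python, result written
# to a destination table. Table IDs below are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()

dest = bigquery.TableReference.from_string("my-project.analytics.daily_revenue")  # hypothetical
job_config = bigquery.QueryJobConfig(
    destination=dest,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    query_parameters=[
        bigquery.ScalarQueryParameter("run_date", "DATE", "2024-01-01"),
    ],
)

sql = """
    SELECT store_id, SUM(amount) AS revenue
    FROM `my-project.sales.orders`   -- hypothetical source table
    WHERE order_date = @run_date
    GROUP BY store_id
"""

query_job = client.query(sql, job_config=job_config)
query_job.result()   # wait for completion
print(f"Done: {query_job.total_bytes_processed} bytes processed")
```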
Posted 1 month ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About McDonald’s:

One of the world’s largest employers with locations in more than 100 countries, McDonald’s Corporation has corporate opportunities in Hyderabad. Our global offices serve as dynamic innovation and operations hubs, designed to expand McDonald’s global talent base and in-house expertise. Our new office in Hyderabad will bring together knowledge across business, technology, analytics, and AI, accelerating our ability to deliver impactful solutions for the business and our customers across the globe.

Position Summary: Data Platform Lead / Sr Manager, Platform Engineer

As the Data Platform Lead / Sr Manager, Platform Engineer, you will be responsible for leading the design, scalability, and reliability of the enterprise data analytics platform. You will drive secure, efficient, and resilient data transfer and platform enablement services across a global organization. This role is pivotal in ensuring that analytics-ready data is accessible, governed, and delivered at scale to support decision-making, reporting, and advanced data products—particularly in high-volume, fast-paced environments like retail or QSR.

Who we’re looking for:

Primary Responsibilities:

Platform Strategy & Operations
Architect and manage scalable batch processing and data transfer pipelines to serve enterprise-wide analytics use cases (a transfer sketch follows at the end of this listing).
Continuously monitor platform health and optimize for performance, cost efficiency, and uptime.
Implement platform observability, diagnostics, and incident response mechanisms to maintain service excellence.

Security, Compliance & Governance
Ensure secure handling of data across ingestion, transfer, and processing stages, adhering to enterprise and regulatory standards.
Establish protocols for secure, compliant, and auditable data movement and transformation.

Enablement & Support
Provide Level 2/3 technical support for platform services, minimizing disruption and accelerating issue resolution.
Drive user enablement by leading documentation efforts, publishing platform standards, and hosting training sessions.
Collaborate with key business stakeholders to improve platform adoption and usability.

Collaboration & Delivery
Partner with product, engineering, and analytics teams to support data initiatives across domains and markets.
Ensure the platform supports reliable analytics workflows through automation, integration, and governed data access.
Oversee platform releases, upgrades, and maintenance activities to ensure minimal downtime and a seamless user experience.

Skills:
8+ years of experience in platform engineering, data infrastructure, or analytics technology environments.
Deep expertise in:
- Batch processing and data orchestration tools (e.g., Airflow, Dataflow, Composer)
- Secure data transfer protocols (e.g., SFTP, API-based, event streaming)
- Advanced SQL for diagnostics and analytics enablement
- Python for automation, scripting, and platform utilities
Experience with cloud-native platforms (preferably GCP or AWS), including infrastructure-as-code (leveraging Terraform and Ansible) and DevOps tooling.
Proven leadership of cross-functional technical teams delivering high-scale, high-availability platform solutions.
Excellent collaboration, communication, and stakeholder engagement skills.
Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related technical field.
GCP/AWS certification is preferred.

Work location: Hyderabad, India
Work pattern: Full-time role.
Work mode: Hybrid.
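As a concrete illustration of the secure file-transfer automation this role covers, a minimal sketch that pulls a partner file over SFTP and lands it in Cloud Storage; hostnames, credentials, paths, and the bucket name are hypothetical placeholders:

```python
# Minimal sketch: pull a daily file from a partner SFTP server and land it
# in Cloud Storage for downstream batch processing. Requires paramiko and
# google-cloud-storage. All names below are hypothetical.
import paramiko
from google.cloud import storage

SFTP_HOST = "sftp.partner.example.com"     # hypothetical
REMOTE_PATH = "/outbound/sales_20240101.csv"
LOCAL_PATH = "/tmp/sales_20240101.csv"
BUCKET = "acme-raw-ingest"                 # hypothetical

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())  # pin host keys in production
ssh.connect(SFTP_HOST, username="feeds", key_filename="/secrets/id_rsa")
sftp = ssh.open_sftp()
sftp.get(REMOTE_PATH, LOCAL_PATH)   # download over SFTP
sftp.close()
ssh.close()

# Land the file in GCS so batch pipelines can pick it up.
storage.Client().bucket(BUCKET).blob("sales/sales_20240101.csv").upload_from_filename(LOCAL_PATH)
```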
Additional Information:

McDonald’s is committed to providing qualified individuals with disabilities with reasonable accommodations to perform the essential functions of their jobs. McDonald’s provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to sex, sex stereotyping, pregnancy (including pregnancy, childbirth, and medical conditions related to pregnancy, childbirth, or breastfeeding), race, color, religion, ancestry or national origin, age, disability status, medical condition, marital status, sexual orientation, gender, gender identity, gender expression, transgender status, protected military or veteran status, citizenship status, genetic information, or any other characteristic protected by federal, state or local laws. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training.

McDonald’s Capability Center India Private Limited (“McDonald’s in India”) is a proud equal opportunity employer and is committed to hiring a diverse workforce and sustaining an inclusive culture. At McDonald’s in India, employment decisions are based on merit, job requirements, and business needs, and all qualified candidates are considered for employment. McDonald’s in India does not discriminate based on race, religion, color, age, gender, marital status, nationality, ethnic origin, sexual orientation, political affiliation, veteran status, disability status, medical history, parental status, genetic information, or any other basis protected under state or local laws.

Nothing in this job posting or description should be construed as an offer or guarantee of employment.
Posted 1 month ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About McDonald’s:

One of the world’s largest employers with locations in more than 100 countries, McDonald’s Corporation has corporate opportunities in Hyderabad. Our global offices serve as dynamic innovation and operations hubs, designed to expand McDonald’s global talent base and in-house expertise. Our new office in Hyderabad will bring together knowledge across business, technology, analytics, and AI, accelerating our ability to deliver impactful solutions for the business and our customers across the globe.

Position Summary:

We’re seeking a hands-on Platform Engineer to support our enterprise data integration and enablement platform. As a Platform Engineer III, you’ll be responsible for designing, maintaining, and optimizing secure and scalable data movement services—such as batch processing, file transfers, and data orchestration. This role is essential to ensuring reliable data flow across systems to power analytics, reporting, and platform services in a cloud-native environment.

Who we’re looking for:

Primary Responsibilities:

Hands-On Data Integration Engineering
Build and maintain data transfer pipelines, file ingestion processes, and batch workflows for internal and external data sources (an ingestion sketch follows at the end of this listing).
Configure and manage platform components that enable secure, auditable, and resilient data movement.
Automate routine data processing tasks to improve reliability and reduce manual intervention.

Platform Operations & Monitoring
Monitor platform services for performance, availability, and failures; respond quickly to disruptions.
Tune system parameters and job schedules to improve throughput and processing efficiency.
Implement logging, metrics, and alerting to ensure end-to-end observability of data workflows.

Security, Compliance & Support
Apply secure protocols and encryption standards to data transfer processes (e.g., SFTP, HTTPS, GCS/AWS).
Support compliance with internal controls and external regulations (e.g., GDPR, SOC2, PCI).
Collaborate with security and infrastructure teams to manage access controls, service patches, and incident response.

Troubleshooting & Documentation
Investigate and resolve issues related to data processing failures, delays, or quality anomalies.
Document system workflows, configurations, and troubleshooting runbooks for team use.
Provide support for platform users and participate in on-call rotations as needed.

Skills:
5+ years of hands-on experience in data integration, platform engineering, or infrastructure operations.
Proficiency in:
- Designing and supporting batch and file-based data transfers
- Python scripting and SQL for diagnostics, data movement, and automation
- Terraform scripting and deployment of cloud infrastructure services
- GCP (preferred) or AWS data analytics services, such as:
  GCP: Cloud Storage, BigQuery, Cloud Composer, Pub/Sub, Dataflow
  AWS: S3, Glue, Redshift, Athena, Lambda, EventBridge, Step Functions
- Cloud-native storage and compute optimization for data movement and processing
- Infrastructure-as-code and CI/CD practices (e.g., Terraform, Ansible, Cloud Build, GitHub Actions)
Strong analytical and debugging skills for troubleshooting issues in distributed, high-volume environments.
Bachelor’s degree in Computer Science, Information Systems, or a related technical field.

Work location: Hyderabad, India
Work pattern: Full-time role.
Work mode: Hybrid.
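To illustrate the batch file-ingestion work described above, a minimal sketch that loads a CSV file already landed in Cloud Storage into BigQuery with a load job; the URI and table ID are hypothetical placeholders:

```python
# Minimal sketch: batch-load a CSV file from Cloud Storage into BigQuery.
# URI and table ID below are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,   # infer schema here; pin an explicit schema for production loads
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(
    "gs://acme-raw-ingest/sales/sales_20240101.csv",  # hypothetical URI
    "my-project.staging.sales_raw",                   # hypothetical table
    job_config=job_config,
)
load_job.result()   # wait; raises on failure
print(f"Loaded {load_job.output_rows} rows")
```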
Posted 1 month ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About McDonald’s:

One of the world’s largest employers with locations in more than 100 countries, McDonald’s Corporation has corporate opportunities in Hyderabad. Our global offices serve as dynamic innovation and operations hubs, designed to expand McDonald’s global talent base and in-house expertise. Our new office in Hyderabad will bring together knowledge across business, technology, analytics, and AI, accelerating our ability to deliver impactful solutions for the business and our customers across the globe.

Position Summary: Platform Engineering & Enablement Services Lead / Sr Manager, Platform Engineer

As the Platform Engineering & Enablement Services Lead / Sr. Manager, you will be responsible for driving the reliability, performance, and enablement of platform services that support secure data operations and analytics workflows across the enterprise. You will lead a team that designs, builds, and supports the infrastructure and tools necessary for seamless data movement, integration, and user enablement—ensuring agility, compliance, and operational excellence at scale.

Who we’re looking for:

Primary Responsibilities:

Platform Operations & Engineering
Design, implement, and maintain high-performing batch processing and file transfer systems across the enterprise.
Monitor and tune platform performance to ensure availability, reliability, and cost-effectiveness.
Partner with IT and infrastructure teams to plan and coordinate system updates, patching, and platform upgrades.

Enablement & User Support
Develop platform enablement services including onboarding, documentation, knowledge bases, and office hours.
Facilitate training sessions and create self-service models to improve platform usability and adoption.
Provide Tier 2/3 support for platform-related issues and coordinate escalations where needed.

Security & Governance
Ensure platform services meet internal and external security, privacy, and compliance standards (e.g., SOC2, PCI, GDPR).
Manage secure data transfer protocols and implement robust access control mechanisms.

Cross-Functional Collaboration
Act as a liaison between platform engineering, data analytics, architecture, and security teams to ensure alignment.
Drive platform automation, CI/CD enablement, and developer productivity initiatives across teams.

Documentation & Standards
Maintain comprehensive documentation of platform configurations, best practices, and incident runbooks.
Standardize engineering processes to support consistency, observability, and ease of scaling across environments.

Skills:
8+ years of experience in platform engineering, infrastructure operations, or enterprise systems enablement.
Strong technical expertise in:
- Batch processing frameworks and orchestration tools (e.g., Airflow, Composer, Dataflow)
- Infrastructure scripting for cloud services (e.g., Terraform)
- Secure data movement protocols (e.g., SFTP, APIs, messaging queues)
- Infrastructure engineering, networking, and system integration
- SQL and Python for troubleshooting, scripting, and platform automation
Demonstrated experience designing and supporting platform services in cloud-native environments (GCP or AWS preferred).
Proven leadership in managing technical teams and platform support functions.
Excellent communication, cross-team collaboration, and stakeholder management skills.
Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field.

Work location: Hyderabad, India
Work pattern: Full-time role.
Work mode: Hybrid.
Posted 1 month ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About McDonald’s:

One of the world’s largest employers with locations in more than 100 countries, McDonald’s Corporation has corporate opportunities in Hyderabad. Our global offices serve as dynamic innovation and operations hubs, designed to expand McDonald’s global talent base and in-house expertise. Our new office in Hyderabad will bring together knowledge across business, technology, analytics, and AI, accelerating our ability to deliver impactful solutions for the business and our customers across the globe.

Position Summary: Data Platform Support / Platform Engineer III

As a Data Platform Support Engineer, you will design, implement, and maintain scalable and secure platform services that enable efficient data transfers, API integrations, and platform operations across the enterprise. You will play a critical role in ensuring the performance, reliability, and security of data pipelines and platform services that power decision-making, reporting, and advanced data products—particularly in high-volume, fast-paced global environments.

Who we’re looking for:

Primary Responsibilities:

Platform Operations & Support
Design, implement, and maintain secure and efficient platform solutions to support APIs, data transfers, and analytics workflows.
Monitor platform health, optimize for performance and uptime, and implement diagnostic and observability practices.
Provide Level 2/3 technical support for platform services to minimize disruption and accelerate issue resolution.
Support platform upgrades, releases, and maintenance activities to ensure seamless user experiences.

Security, Compliance & Governance
Ensure secure handling of data across transfer and integration processes, adhering to enterprise security and compliance standards.
Implement and manage data transfer protocols (e.g., SFTP, API-based integrations) in a secure and auditable manner.

Enablement & Documentation
Contribute to platform documentation, standards, and usage guidelines to drive user enablement and platform adoption.
Support training sessions and user onboarding for platform services and tools.

Collaboration & Delivery
Partner with engineering, product, and analytics teams to support platform initiatives and enhance system integration.
Collaborate on continuous improvement initiatives to support analytics-ready data delivery across domains and markets.

Skills:
5+ years of experience in platform engineering, data infrastructure, or API/data transfer operations.
Deep expertise in:
- Data orchestration and batch processing tools (e.g., Airflow, Dataflow, Composer preferred)
- Secure data transfer protocols and integration patterns (e.g., SFTP, APIs, event-driven transfers)
- SQL for platform diagnostics and performance analytics
- Python for automation, scripting, and platform utilities
Hands-on experience with cloud-native environments (preferably GCP or AWS), infrastructure-as-code (e.g., Terraform, Ansible), and DevOps practices.
Strong problem-solving and analytical skills with a focus on operational excellence.
Demonstrated ability to collaborate across cross-functional technical teams.
Strong communication and stakeholder engagement skills.
Bachelor’s degree in Computer Science, Information Systems, or a related technical field.
GCP/AWS certification is preferred.

Work location: Hyderabad, India
Work pattern: Full-time role.
Work mode: Hybrid.
Posted 1 month ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Your contributions to the organisation's growth:

Maintain and develop data platforms based on Microsoft Fabric for Business Intelligence and Databricks for real-time data analytics (a Spark sketch follows below).
Design, implement, and maintain standardized production-grade data pipelines using modern data transformation processes and workflows for SAP, MS Dynamics, on-premise or cloud.
Develop an enterprise-scale cloud-based Data Lake for business intelligence solutions.
Translate business and customer needs into data collection, preparation, and processing requirements.
Optimize the performance of algorithms developed by Data Scientists.
General administration and monitoring of the data platforms.

Competencies:

Working with structured and unstructured data.
Experienced in various database technologies (RDBMS, OLAP, time series, etc.).
Solid programming skills (Python, SQL; Scala is a plus).
Experience in Microsoft Fabric (incl. Warehouse, Lakehouse, Data Factory, Dataflow Gen2, Semantic Model) and/or Databricks (Spark).
Proficient in Power BI.
Experienced working with APIs.
Proficient in security best practices.
Data-centered Azure know-how is a plus (Storage, Networking, Security, Billing).

Expertise you need to bring along:

Bachelor's or Master's degree in business informatics, computer science, or equivalent.
A background in software engineering (e.g., agile programming, project organization) and experience with human-centered design would be desirable.
Extensive experience in handling large data sets.
At least 5 years of experience as a data engineer, preferably in an industrial company.
Analytical problem-solving skills and the ability to assimilate complex information.
Programming experience in modern data-oriented languages (SQL, Python).
Experience with Apache Spark and DevOps.
Proven ability to synthesize complex data; advanced technical skills in data modelling, data mining, database design, and performance tuning.
English language proficiency.

Special requirements:

High-quality mindset paired with strong customer orientation, critical thinking, and attention to detail.
Understanding of data processing at scale.
Influence without authority.
Willingness to acquire additional system/technical knowledge as needed.
Problem solver.
Experience working in an international organization and in multicultural teams.
Proactive, creative, and innovative.
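As an illustration of the Spark-based transformation work described, a minimal PySpark sketch that reads raw records, cleans them, and writes a curated table; paths and column names are hypothetical placeholders:

```python
# Minimal sketch: a Spark batch transform -- read raw JSON, deduplicate and
# validate, write a partitioned Parquet table. Paths/columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-curation").getOrCreate()

raw = spark.read.json("/mnt/lake/raw/orders/")   # hypothetical landing path

curated = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))   # partition key
       .filter(F.col("amount") > 0)                       # drop invalid rows
)

curated.write.mode("overwrite").partitionBy("order_date").parquet("/mnt/lake/curated/orders/")
```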
Posted 1 month ago
8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
About the Job:

Developers at Vendasta work in teams, collaborating with Product Managers and Designers to create new features and products. Our Research and Development department works hard to help developers learn, grow, and experiment while at work. With a group of over 100 developers, we have fostered an environment that provides developers with the opportunity to continuously learn from each other. The ideal candidate will demonstrate that they are bright and can tackle tough problems while being able to communicate their solution to others. They are creative and can mix technology with the customer's problems to find the right solution. Lastly, they are driven and will motivate themselves and others to get things done. As an experienced Software Developer, we expect that you will grow into a thought leader at Vendasta, driving better results across our development organization.

Roles & Responsibilities:

Develop software in teams of 3-5 developers, with the ability to take on tasks for the team and independently work them to completion.
Follow best practices to write clean, maintainable, scalable, and tested software.
Contribute to the best engineering practices, including the use of design patterns, CI/CD, maintainable and scalable code, code review, and automated tests.
Provide input for a technical roadmap for the Product Area.
Ensure that NFRs and technical debt get their due focus.
Work collaboratively with Product Managers to design solutions (including a technical roadmap) that help our Partners connect digital solutions to small and medium-sized businesses.
Analyze and improve current system integrations and migration strategies.
Interact and collaborate with our high-quality technical team across India and Canada.

Qualifications:

8+ years of experience in a related field, with at least 5+ years as a full stack developer in an architect or senior development role.
Experience with, or a strong understanding of, high-scalability, data-intensive, distributed Internet applications.
Software development experience, including building distributed, microservice-style and cloud-based application architectures.
Proficiency in a modern software language, and willingness to quickly learn our technology stack.
Preference will be given to candidates with strong Go (programming language) experience who can demonstrate the ability to build and adapt web applications using Angular.
Experience designing, building, and implementing cloud-native architectures (GCP preferred).
Experience working with the Scrum framework.

Technologies We Use:

Cloud-native computing on Google Cloud Platform: BigQuery, Cloud Dataflow, Cloud Pub/Sub, Google Data Studio, Cloud IAM, Cloud Storage, Cloud SQL, Cloud Spanner, Cloud Datastore, Google Maps Platform, Stackdriver, etc. We have been invited to join the Early Access Program on quite a few GCP technologies.
GoLang, TypeScript, Python, JavaScript, HTML, Angular, gRPC, Kubernetes.
Elasticsearch, MySQL, PostgreSQL.

About Vendasta:

So what do we do? We create an entire platform full of digital products & solutions that help small to medium-sized businesses (SMBs) have a stronger presence online through digital advertising, online listings, reputation management, website creation, social media marketing … and much more! Our platform is used exclusively by channel partners, who sell products and services to SMBs, allowing them to leverage us to scale and grow their business. We are trusted by 65,000+ channel partners, serving over 6 million SMBs worldwide!

Perks:

Stock options (as per policy)
Benefits: health insurance
Paid time off
Public transport reimbursement
Flex days
Training & Career Development: professional development plans, leadership workshops, mentorship programs, and more!
Free snacks, hot beverages, and catered lunches on Fridays
Culture comprised of our core values: Drive, Innovation, Respect, and Agility
Provident Fund
Posted 1 month ago
5.0 - 8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Our organization is currently seeking a skilled Senior Data Engineer to join our dynamic team. In this role, you will focus on data integration and ETL projects for cloud-based platforms. Your responsibilities will include crafting and executing sophisticated data solutions to ensure the data remains precise, dependable, and readily available (an ETL sketch follows below).

Responsibilities

Design and execute sophisticated data solutions on cloud-based platforms
Create ETL processes utilizing SQL, Python, and additional relevant technologies
Maintain data precision, reliability, and accessibility for all stakeholders
Work collaboratively with interdisciplinary teams to meet data integration needs and specifications
Produce and update documentation such as technical specifications, data flow diagrams, and data mappings
Enhance and oversee data integration processes to boost performance and efficiency while ensuring data accuracy and integrity

Requirements

Bachelor's degree in Computer Science, Electrical Engineering, or a related field
5-8 years of experience in data engineering
Proficiency in cloud-native or Spark-based ETL tools like AWS Glue, Azure Data Factory, or GCP Dataflow
Strong knowledge of SQL for data querying and manipulation
Background in Snowflake for data warehousing
Familiarity with cloud platforms such as AWS, GCP, or Azure for data storage and processing
Excellent problem-solving skills and attention to detail
Good verbal and written communication skills in English at a B2 level

Nice to have

Background in ETL using Python
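To illustrate the kind of ETL step described, a minimal sketch that loads a transformed batch into Snowflake with the Python connector; the account, credentials, and table names are hypothetical placeholders:

```python
# Minimal sketch: load a small transformed batch into Snowflake.
# Requires snowflake-connector-python. All identifiers are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    user="etl_user", password="***", account="acme-xy12345",  # hypothetical
    warehouse="ETL_WH", database="ANALYTICS", schema="STAGING",
)
cur = conn.cursor()

rows = [("A-100", 25.0), ("A-101", 40.5)]   # stand-in for an extracted, transformed batch
cur.execute(
    "CREATE TABLE IF NOT EXISTS STAGING.ORDERS (order_id STRING, amount FLOAT)"
)
cur.executemany(
    "INSERT INTO STAGING.ORDERS (order_id, amount) VALUES (%s, %s)", rows
)
conn.commit()
cur.close()
conn.close()
```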
Posted 1 month ago
5.0 - 8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
We are seeking a skilled Senior Data Engineer to become an integral part of our team. In this role, you will focus on data integration and ETL projects for cloud-based platforms. Your tasks will include creating and executing sophisticated data solutions, ensuring the integrity, reliability, and accessibility of data.

Responsibilities

Create and execute sophisticated data solutions for cloud-based platforms
Construct ETL processes utilizing SQL, Python, and other applicable technologies
Maintain data accuracy, reliability, and accessibility for all stakeholders
Work with cross-functional teams to comprehend data integration needs and specifications
Produce and uphold documentation, such as technical specifications, data flow diagrams, and data mappings
Enhance data integration processes for performance and efficiency, upholding data accuracy and integrity

Requirements

Bachelor's degree in Computer Science, Electrical Engineering, or a related field
5-8 years of experience in data engineering
Proficiency in cloud-native or Spark-based ETL tools like AWS Glue, Azure Data Factory, or GCP Dataflow
Strong understanding of SQL for data querying and manipulation
Background in Snowflake for data warehousing
Familiarity with cloud platforms such as AWS, GCP, or Azure for data storage and processing
Exceptional problem-solving abilities and meticulous attention to detail
Strong verbal and written communication skills in English at a B2 level

Nice to have

Background in ETL using Python
Posted 1 month ago
5.0 - 8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
We are seeking a skilled Senior Data Engineer to become an integral part of our team. In this role, you will focus on data integration and ETL projects for cloud-based platforms. Your tasks will include creating and executing sophisticated data solutions, ensuring the integrity, reliability, and accessibility of data.

Responsibilities

Create and execute sophisticated data solutions for cloud-based platforms
Construct ETL processes utilizing SQL, Python, and other applicable technologies
Maintain data accuracy, reliability, and accessibility for all stakeholders
Work with cross-functional teams to comprehend data integration needs and specifications
Produce and uphold documentation, such as technical specifications, data flow diagrams, and data mappings
Enhance data integration processes for performance and efficiency, upholding data accuracy and integrity

Requirements

Bachelor's degree in Computer Science, Electrical Engineering, or a related field
5-8 years of experience in data engineering
Proficiency in cloud-native or Spark-based ETL tools like AWS Glue, Azure Data Factory, or GCP Dataflow
Strong understanding of SQL for data querying and manipulation
Background in Snowflake for data warehousing
Familiarity with cloud platforms such as AWS, GCP, or Azure for data storage and processing
Exceptional problem-solving abilities and meticulous attention to detail
Strong verbal and written communication skills in English at a B2 level

Nice to have

Background in ETL using Python
Posted 1 month ago
8.0 years
0 Lacs
India
Remote
🔍 We're Hiring! – ML Ops Engineer (Remote, India)

📍 Location: Remote (within India)
💼 Employment Type: Full-Time / Contractor
📅 Start Date: Immediate
🕒 Working Hours: 1:30 PM IST – 10:30 PM IST (aligned with US CST)

🚀 Join Madlabs Global LLC as we lead the charge in deploying cutting-edge ML and Generative AI solutions at scale! We're looking for a highly skilled ML Ops Engineer to lead the development, deployment, and lifecycle management of AI/ML models in cloud-native (preferably GCP) environments.

💼 Key Responsibilities

Build scalable ML pipelines: ingestion, preprocessing, training, and serving.
Collaborate with Data Scientists to turn prototypes into production-ready systems.
Deploy and optimize LLM-based applications (instruction-tuned, fine-tuned models); a deployment sketch follows below.
Own continuous learning pipelines: retraining, model drift detection, performance tuning.
Automate workflows using CI/CD, MLflow, and orchestration tools.
Leverage GCP services like Vertex AI, BigQuery, Dataflow, Pub/Sub, and Cloud Functions.
Use Docker & Kubernetes to containerize and orchestrate model deployments.
Monitor model performance with Prometheus, TensorBoard, Grafana, etc.
Ensure security, fairness, and compliance across ML systems.

🧠 Required Experience

8+ years in ML Engineering, MLOps, or AI Infrastructure roles.
Strong coding skills in Python with frameworks like TensorFlow, PyTorch, and scikit-learn.
Deep expertise in GCP-native ML stacks.
Hands-on experience in Generative AI model deployment and model optimization.
Proficiency in Docker, Kubernetes, Jenkins, GitLab CI/CD.
Solid understanding of model monitoring, versioning, rollback, and governance.

🕘 Work Hours

Fully remote (India-based).
Must provide overlap with the CST time zone – working hours: 1:30 PM IST to 10:30 PM IST.

💬 Interested or want to learn more?
📞 Contact: +91 98868 11767
📧 Email: naveed@madlabsinfotech.com

Apply now or DM us to explore this opportunity to work with a team pushing the boundaries of AI innovation!

#Hiring #MLOps #MachineLearning #GenerativeAI #LLM #VertexAI #RemoteJobsIndia #DataEngineering #AIJobs #GCP #DevOpsForAI #MLDeployment #LinkedInJobs
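For a flavour of the model-deployment work described, a minimal sketch that registers a trained model in Vertex AI and deploys it to an endpoint; the project, artifact URI, serving container, and feature vector are hypothetical placeholders:

```python
# Minimal sketch: register and deploy a model on Vertex AI.
# Requires google-cloud-aiplatform. All names below are hypothetical.
from google.cloud import aiplatform

aiplatform.init(project="my-ml-project", location="us-central1")  # hypothetical

model = aiplatform.Model.upload(
    display_name="churn-classifier",
    artifact_uri="gs://acme-models/churn/v3/",  # hypothetical model artifacts
    serving_container_image_uri=(
        # a prebuilt sklearn serving image; pin the version you actually trained with
        "us-docker.pkg.dev/vertex-ai/prediction/sklearn-cpu.1-3:latest"
    ),
)

endpoint = model.deploy(machine_type="n1-standard-4", min_replica_count=1)
print(endpoint.predict(instances=[[0.2, 5, 1, 0]]))  # hypothetical feature vector
```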
Posted 1 month ago
12.0 - 20.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
TCS Hiring for Hybrid Cloud Data Solutions Architect (GCP, Azure, AWS) – Kolkata / Hyderabad

Experience: 12 to 20 years only
Job Location: Kolkata / Hyderabad only

Required Technical Skill Set:

Experience: Minimum of 8 years in data architecture or 10 years in engineering roles related to hybrid / multi-cloud solutions, with a proven track record of successful client engagements.
Technical Skills: Proficiency in two or more cloud platforms (AWS, Azure, GCP) and their data services – Redshift, BigQuery, Synapse, Databricks, ADF, Glue, AWS EMR, Azure Insights, GCP Dataproc, GCP Dataflow, etc. Experience in architecting applications leveraging containerization (Docker, Kubernetes), cloud-native (IaaS, PaaS, SaaS), hybrid / multi-cloud, hardware OEMs, network, security, microservices, FinOps, iPaaS and APIs, Infrastructure as Code (IaC) tools (Terraform, CloudFormation), and CI/CD pipelines. Strong knowledge of enterprise architecture principles.
Communication Skills: Excellent communication abilities to engage effectively with both technical and non-technical stakeholders and articulate technical concepts.

Desirable Skill Set:

Knowledge of specific industry verticals (e.g., BFSI, Healthcare, Manufacturing, Telecom / Media).
Technical certifications related to cloud computing (e.g., AWS Certified Solutions Architect, Microsoft Certified: Azure Solutions Architect Expert). Relevant cloud certifications are preferred; must obtain certification within 90 days of employment.
Understanding of DevOps concepts.
Ability to lead cross-functional teams effectively.

Key Responsibilities:

Strategy & Design: Develop a comprehensive data strategy for multi/hybrid cloud scenarios aligned with business goals. Design scalable, secure, and cost-effective data solutions. Evaluate and select cloud platforms (AWS, Azure, GCP, OCI, IBM, Nutanix, Neo Cloud, etc.) and third-party tools. Develop blueprints and roadmaps, and drive implementation of data architecture and framework-related epics/user stories. Data modeling based on business use cases.
Solution Design: Design the data ingestion layer and data movement from the ingestion layer to operational/analytical layers. Design the data consumption layer (visualization, analytics, AI/ML, outbound data). Design the data governance track – framework design for data quality, data security, metadata, etc. Architect tailored cloud solutions that leverage best practices and meet specific client requirements, utilizing native data services on AWS, Azure, or Google Cloud. Ability to understand data pipelines and modern ways of automating them using cloud-based and on-premise technologies. Good knowledge of any RDBMS/NoSQL database with strong SQL writing skills. Good understanding of ML and AI concepts, with the ability to propose solutions that automate processes.
Technical Presentations: Conduct workshops and presentations to demonstrate solution feasibility and value, fostering trust and engagement with stakeholders.
Proof of Concept (POC): Lead the design and implementation of POCs to validate proposed solutions and products against features and cost.
Implementation & Management: Guide technical solution development in engagements related to legacy modernization and migration of applications and infrastructure to hybrid cloud, engineered cloud, etc. Guide and mentor the data development squads and review the deliverables, as required.

Kind Regards,
Priyankha M
Posted 1 month ago
5.0 years
0 Lacs
India
Remote
Job Title: GCP Administrator
Location: Remote (hybrid for Chennai & Mumbai)
Experience: 5 to 8 years

We are looking for an experienced GCP Administrator to join our team. The ideal candidate will have strong hands-on experience with IAM administration, multi-account management, BigQuery administration, performance optimization, monitoring, and cost management within Google Cloud Platform (GCP).

Responsibilities:

● Manage and configure roles/permissions in GCP IAM, following the principle of least-privileged access (a sketch follows below)
● Manage the BigQuery service by optimizing slot assignments and SQL queries, adopting FinOps practices for cost control, troubleshooting and resolving critical data queries, etc.
● Collaborate with teams like Data Engineering, Data Warehousing, Cloud Platform Engineering, and SRE for efficient data management and operational practices in GCP
● Create automations and monitoring mechanisms for GCP data-related services, processes, and tasks
● Work with development teams to design the GCP-specific cloud architecture
● Provision and de-provision GCP accounts and resources for internal projects
● Manage and operate multiple GCP subscriptions
● Keep technical documentation up to date
● Proactively stay up to date on GCP announcements, services, and developments

Requirements:

● Must have 5+ years of work experience provisioning, operating, and maintaining systems in GCP
● Must have a valid certification as either a GCP Associate Cloud Engineer or a GCP Professional Cloud Architect
● Must have hands-on experience with GCP services such as Identity and Access Management (IAM), BigQuery, Google Kubernetes Engine (GKE), etc.
● Must be capable of providing support and guidance on GCP operations and services according to enterprise needs
● Must have a working knowledge of Docker containers and Kubernetes
● Must have strong communication skills and the ability to work both independently and in a collaborative environment
● Fast learner, achiever, sets high personal goals
● Must be able to work on multiple projects and consistently meet project deadlines
● Must be willing to work on a shift basis based on project requirements

Good to Have:

● Experience in Terraform automation for GCP infrastructure provisioning
● Experience in Cloud Composer, Dataproc, Dataflow, Storage, and Monitoring services
● Experience in building and supporting any form of data pipeline
● Multi-cloud experience with AWS
● New Relic monitoring

Perks:

● Day off on the 3rd Friday of every month (one long weekend each month)
● Monthly Wellness Reimbursement Program to promote health and well-being
● Paid paternity and maternity leaves

Notice period: Immediate to 30 days
Email to: poniswarya.m@aptita.com
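As an illustration of least-privilege IAM administration, a minimal sketch that grants a group a read-only BigQuery role on a single project via the Cloud Resource Manager API; the project ID and group are hypothetical placeholders:

```python
# Minimal sketch: add a member to a narrowly-scoped role binding on one
# project. Requires google-api-python-client with application default
# credentials. Project and group below are hypothetical.
from googleapiclient import discovery

PROJECT_ID = "acme-analytics"              # hypothetical
MEMBER = "group:data-analysts@acme.com"    # hypothetical
ROLE = "roles/bigquery.dataViewer"         # read-only, instead of a broad editor role

crm = discovery.build("cloudresourcemanager", "v1")
policy = crm.projects().getIamPolicy(resource=PROJECT_ID, body={}).execute()

# Append the member to the role's binding, creating the binding if missing.
bindings = policy.setdefault("bindings", [])
binding = next((b for b in bindings if b["role"] == ROLE), None)
if binding is None:
    binding = {"role": ROLE, "members": []}
    bindings.append(binding)
if MEMBER not in binding["members"]:
    binding["members"].append(MEMBER)

crm.projects().setIamPolicy(resource=PROJECT_ID, body={"policy": policy}).execute()
```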
Posted 1 month ago
7.0 years
0 Lacs
Bangalore Urban, Karnataka, India
On-site
Greetings from TCS!!

TCS is Hiring for Data Analyst
Interview Mode: Virtual
Required Experience: 7-18 years
Work location: PAN India

Strong knowledge of:
- Data processing software and strategies
- Big Data, information retrieval, data mining
- SQL

4+ years of experience with cloud platforms and customer-facing projects.
Strong ability to successfully interface (verbal and written) with clients in a concise manner while managing expectations at both executive and technical levels.

General:
- Data Platforms & Data Lakes
- Relational & Non-Relational Databases
- Streaming and Batch Pipelines
- SQL Engines. Possible options: MySQL, SQL Server, PostgreSQL
- NoSQL Engines. Possible options: MongoDB, Cassandra, HBase, Dynamo, Redis

Google Cloud Data Services: Cloud SQL, BigQuery, Dataflow, Dataproc, Bigtable, Composer, Cloud Functions

Python

Hadoop Ecosystem / Apache software: Spark, Beam, Hive, Airflow, Sqoop, Oozie

Code Repositories / CI/CD tools. Possible options: GitHub, Cloud Source Repositories, GitLab, Azure DevOps

If interested, kindly send your updated CV and the details below via DM/e-mail: srishti.g2@tcs.com

Name:
E-mail ID:
Contact Number:
Highest qualification:
Preferred Location:
Highest qualification university:
Current organization:
Total years of experience:
Relevant years of experience:
Any gap: mention no. of months/years (career/education):
If any, then reason for gap:
Is it rebegin:
Previous organization name:
Current CTC:
Expected CTC:
Notice Period:
Have you worked with TCS before (Permanent / Contract):
Posted 1 month ago
4.0 years
0 Lacs
Hyderabad, Telangana, India
Remote
Title #1: GCP DATA ENGINEER

Job role: GCP Data Engineer
Required Exp: 4+ years
Employment: Permanent – RandomTrees – https://www.randomtrees.com
Mode of Work: Hyderabad – Hybrid/Remote
Notice period: Max 15-30 days / serving notice up to 30 days max

Job Summary:
We are seeking candidates with 3+ years' experience as a software engineer – or equivalent – designing large data-heavy distributed systems and/or high-traffic web apps.

Primary / Expected skill set: GCP DE, BigQuery, Dataflow, PySpark, GCS, Airflow/Composer.

Key Requirements:
· Hands-on experience designing and managing large data models, writing performant SQL queries, and working with large datasets and related technologies.
· Experience working with cloud platforms such as GCP and BigQuery.
· Strong analytical, problem-solving, and interpersonal skills, a hunger to learn, and the ability to operate in a self-guided manner in a fast-paced, rapidly changing environment.
· Must have: experience in pipeline orchestration (e.g., Airflow; see the DAG sketch at the end of this listing).
· Must have: good hands-on experience with Dataflow (Python or Java) and PySpark.
· Hands-on experience with migration is an added advantage.

Project #2 – UMG

Title #2: GCP DATA ENGINEER AIRFLOW

Job role: GCP Data Engineer
Required Exp: 4+ years
Employment: Permanent – RandomTrees – https://www.randomtrees.com
Mode of Work: Hyderabad – Hybrid/Remote
Notice period: Max 15-30 days / serving notice up to 30 days max

Job Description (Airflow):
We are seeking candidates with 3+ years' experience as a software engineer – or equivalent – designing large data-heavy distributed systems and/or high-traffic web apps.

Primary Skills: GCP, Python coding (must), SQL coding skills, BigQuery, Airflow and Airflow DAGs.

Requirements:
· Hands-on experience designing and managing large data models, writing performant SQL queries, and working with large datasets and related technologies.
· Experience working with cloud platforms such as GCP and BigQuery.
· Strong analytical, problem-solving, and interpersonal skills, a hunger to learn, and the ability to operate in a self-guided manner in a fast-paced, rapidly changing environment.
· Must have: experience in pipeline orchestration (e.g., Airflow).
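To illustrate the Airflow orchestration these roles call for, a minimal DAG sketch that schedules one BigQuery transform daily, assuming Airflow 2.x with the Google provider installed; the DAG id, tables, and SQL are hypothetical placeholders:

```python
# Minimal sketch: one daily BigQuery transform orchestrated by Airflow.
# DAG id, tables, and SQL below are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_orders_rollup",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    rollup = BigQueryInsertJobOperator(
        task_id="rollup_orders",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE `my-project.marts.orders_daily` AS
                    SELECT order_date, COUNT(*) AS orders
                    FROM `my-project.staging.orders`
                    GROUP BY order_date
                """,
                "useLegacySql": False,
            }
        },
    )
```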
Posted 1 month ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
Remote
When you join Verizon

You want more out of a career. A place to share your ideas freely, even if they're daring or different. Where the true you can learn, grow, and thrive. At Verizon, we power and empower how people live, work and play by connecting them to what brings them joy. We do what we love – driving innovation, creativity, and impact in the world. Our V Team is a community of people who anticipate, lead, and believe that listening is where learning begins. In crisis and in celebration, we come together – lifting our communities and building trust in how we show up, everywhere and always. Want in? Join the V Team Life.

What You'll Be Doing

The Wireless Solution Train supports critical network functions and services for 4G/5G wireless applications. We are looking for a dynamic and collaborative individual who will contribute to the growth and evolution of Next Gen OSS for network systems.

Planning, designing, developing, coding and testing software systems or applications for software enhancements and new products; revising and refining as required.
Implementing changes and new features in a manner which promotes efficient, reusable and performant code.
Participating in product feature implementation, both independently and in cooperation with the team.
Maintaining and improving existing code with a pride of ownership.
Leading medium to large scale projects with minimal direction.
Designing, developing, and maintaining data pipelines using GCP services such as BigQuery, Dataflow, Cloud Storage, Pub/Sub, and Dataproc (a pipeline sketch follows below).
Implementing and managing data ingestion processes from various sources (e.g., databases, APIs, streaming platforms).

What we're looking for...

You'll need to have:
Bachelor's degree or four or more years of work experience.
Four or more years of relevant work experience.
Experience in Python and PySpark/Flink.
Experience with the Product Agile model (POD) and a product mindset.
GCP experience with BigQuery, Spanner, and Looker.
Experience with Gen AI solutions & tools.

Even better if you have one or more of the following:
Master's degree or related field.
Any relevant certification.
Excellent communication and collaboration skills.
Experience developing and maintaining data quality checks and monitoring systems to ensure data accuracy and integrity.
Experience optimizing data pipelines for performance, scalability, and cost-effectiveness.
Experience collaborating with data scientists and analysts to understand data requirements and provide data solutions.
Experience building and maintaining Looker dashboards and reports for data visualization and analysis.
Staying up to date with the latest technologies and best practices in cloud data engineering (preferably GCP).

Where you'll be working

In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager.

Scheduled Weekly Hours: 40

Equal Employment Opportunity

Verizon is an equal opportunity employer. We evaluate qualified applicants without regard to race, gender, disability or any other legally protected characteristics.

Locations: Chennai, India; Hyderabad, India.
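As an illustration of the pipeline work described, a minimal Apache Beam sketch of a streaming Dataflow-style job that reads from Pub/Sub and writes to BigQuery; the subscription, table, and schema are hypothetical placeholders:

```python
# Minimal sketch: a streaming Beam pipeline -- Pub/Sub in, BigQuery out.
# Run locally with DirectRunner or pass --runner=DataflowRunner on the CLI.
# Subscription, table, and schema below are hypothetical.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromPubSub(
            subscription="projects/my-project/subscriptions/events-sub"  # hypothetical
        )
        | "Parse" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        | "Write" >> beam.io.WriteToBigQuery(
            "my-project:telemetry.events",                # hypothetical table
            schema="device_id:STRING,ts:TIMESTAMP,value:FLOAT",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```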
Posted 1 month ago
5.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.

What You'll Do
· Design, develop, and operate high-scale applications across the full engineering stack.
· Design, develop, test, deploy, maintain, and improve software.
· Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.).
· Work across teams to integrate our systems with existing internal systems, Data Fabric, and the CSA Toolset.
· Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality.
· Participate in a tight-knit, globally distributed engineering team.
· Triage product or system issues and debug/track/resolve them by analyzing the sources of issues and their impact on network or service operations and quality.
· Independently manage project priorities, deadlines, and deliverables.
· Research, create, and develop software applications to extend and improve Equifax solutions.
· Collaborate on scalability issues involving access to data and information.
· Actively participate in Sprint planning, Sprint retrospectives, and other team activities.

What Experience You Need
· Bachelor's degree or equivalent experience
· 5+ years of software engineering experience
· 5+ years experience writing, debugging, and troubleshooting code in mainstream Java, Spring Boot, TypeScript/JavaScript, HTML, CSS
· 5+ years experience with cloud technology: GCP, AWS, or Azure
· 5+ years experience designing and developing cloud-native solutions
· 5+ years experience designing and developing microservices using Java, Spring Boot, GCP SDKs, GKE/Kubernetes
· 5+ years experience deploying and releasing software using Jenkins CI/CD pipelines, with an understanding of infrastructure-as-code concepts, Helm charts, and Terraform constructs

What could set you apart
· A self-starter who identifies and responds to priority shifts with minimal supervision.
· Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, Pub/Sub, GCS, Composer/Airflow, and others (a batch-pipeline sketch follows this listing).
· UI development (e.g. HTML, JavaScript, Angular and Bootstrap).
· Experience with backend technologies such as Java/J2EE, Spring Boot, SOA and microservices.
· Source code control management systems (e.g. SVN/Git, GitHub) and build tools.
· Agile environments (e.g. Scrum, XP).
· Relational databases.
· Atlassian tooling (e.g. JIRA, Confluence, and GitHub).
· Developing with a modern JDK (v1.7+).
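To make the Dataflow/Apache Beam line above concrete, here is a minimal batch sketch that reads CSV lines from GCS, aggregates per key, and writes the results back out. The bucket paths and record layout are hypothetical; a production pipeline would run on the Dataflow runner with proper options and error handling.

```python
# Minimal batch sketch: GCS text -> per-key totals -> GCS text.
# Bucket paths and the "customer_id,amount" layout are hypothetical.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_line(line: str):
    # Assume "customer_id,amount" CSV rows.
    customer_id, amount = line.split(",")
    return customer_id, float(amount)

with beam.Pipeline(options=PipelineOptions()) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText(
            "gs://example-bucket/input/*.csv", skip_header_lines=1)
        | "Parse" >> beam.Map(parse_line)
        | "SumPerCustomer" >> beam.CombinePerKey(sum)
        | "Format" >> beam.MapTuple(lambda cid, total: f"{cid},{total}")
        | "Write" >> beam.io.WriteToText("gs://example-bucket/output/totals")
    )
```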
Posted 1 month ago
5.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centres (Delivery Centres), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centres offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your Role And Responsibilities
A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat.

In your role, you will be responsible for:
· Working with multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Logs Explorer, etc. (a short GCS-to-BigQuery sketch follows this listing).
· Bringing Python and SQL work experience; being proactive, collaborative, and able to respond to critical situations.
· Analysing data against functional business requirements and interfacing directly with the customer.

Preferred Education: Master's Degree

Required Technical And Professional Expertise
· 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform.
· Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Logs Explorer.
· An ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.
· You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
· End-to-end functional knowledge of the data pipeline/transformation implementations you have delivered; you should understand the purpose/KPIs for which each data transformation was done.

Preferred Technical And Professional Experience
· Experience with AEM core technologies: OSGi services, Apache Sling, the Granite framework, the Java Content Repository API, Java 8+, localization.
· Familiarity with build tools such as Jenkins and Maven; knowledge of version control tools, especially Git.
· Knowledge of patterns and good practices to design and develop quality, clean code; knowledge of HTML, CSS, JavaScript, and jQuery.
· Familiarity with task management, bug tracking, and collaboration tools like JIRA and Confluence.
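As a minimal illustration of the GCS and BigQuery work described above, the sketch below loads a CSV from Cloud Storage into a BigQuery table using the official Python client library. The bucket, project, dataset, and table names are hypothetical placeholders.

```python
# Minimal sketch: load a GCS CSV into BigQuery with google-cloud-bigquery.
# All resource names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()  # uses Application Default Credentials

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,  # infer the schema from the file
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

load_job = client.load_table_from_uri(
    "gs://example-bucket/exports/orders.csv",
    "example-project.staging.orders",
    job_config=job_config,
)
load_job.result()  # block until the load job finishes

table = client.get_table("example-project.staging.orders")
print(f"Loaded {table.num_rows} rows into {table.full_table_id}")
```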
Posted 1 month ago
7.0 years
0 Lacs
Pune, Maharashtra, India
On-site
_VOIS Intro
About _VOIS: _VOIS (Vodafone Intelligent Solutions) is a strategic arm of Vodafone Group Plc, creating value and enhancing quality and efficiency across 28 countries, and operating from 7 locations: Albania, Egypt, Hungary, India, Romania, Spain and the UK. Over 29,000 highly skilled individuals are dedicated to being Vodafone Group's partner of choice for talent, technology, and transformation. We deliver the best services across IT, Business Intelligence Services, Customer Operations, Business Operations, HR, Finance, Supply Chain, HR Operations, and many more.

_VOIS Centre Intro
About _VOIS India: In 2009, _VOIS started operating in India and now has established global delivery centres in Pune, Bangalore and Ahmedabad. With more than 14,500 employees, _VOIS India supports global markets and group functions of Vodafone, and delivers best-in-class customer experience through multi-functional services in the areas of Information Technology, Networks, Business Intelligence and Analytics, Digital Business Solutions (Robotics & AI), Commercial Operations (Consumer & Business), Intelligent Operations, Finance Operations, Supply Chain Operations, HR Operations and more.

Key Responsibilities
· Design and develop high-quality, innovative applications using Pega PRPC and other Pega frameworks.
· Perform code reviews to ensure application quality and performance.
· Pinpoint areas within the application that are experiencing performance issues and optimize them to enhance overall efficiency.
· Identify threats during interface development and recommend measures based on Vodafone's cybersecurity guidelines.
· Serve as a security champion and document best practices from a security perspective.
· Mentor junior system architects and developers on Pega development best practices.

Qualifications
· Bachelor's degree in Computer Science, Information Technology, or a related field.
· Pega CSSA certification with at least 7 years of experience in building and implementing model-driven, enterprise-level business solutions.
· Strong technical skills and a deep understanding of Pega PRPC and system architectures.
· Experience with Agile, Scrum, and waterfall project delivery frameworks.
· Exceptional interpersonal skills and the ability to communicate, partner, and collaborate effectively.
· Proven track record in software development, system architecture, and security champion roles.

Preferred Skills:
· Experience with CDH real-time APIs and decisioning artifacts.
· Experience with the Pega CDH framework, with a focus on dataflows and process flows.
· Knowledge of industry-standard project delivery frameworks, including SAFe Agile.

_VOIS Equal Opportunity Employer Commitment India
_VOIS is proud to be an Equal Employment Opportunity Employer. We celebrate differences and we welcome and value diverse people and insights. We believe that being authentically human and inclusive powers our employees' growth and enables them to create a positive impact on themselves and society. We do not discriminate based on age, colour, gender (including pregnancy, childbirth, or related medical conditions), gender identity, gender expression, national origin, race, religion, sexual orientation, status as an individual with a disability, or other applicable legally protected characteristics.
As a result of living and breathing our commitment, our employees have helped us get certified as a Great Place to Work in India for four years running. We have also been highlighted among the Top 5 Best Workplaces for Diversity, Equity, and Inclusion, Top 10 Best Workplaces for Women, Top 25 Best Workplaces in IT & IT-BPM, and 14th Overall Best Workplaces in India by the Great Place to Work Institute in 2023. These achievements position us among a select group of trustworthy and high-performing companies which put their employees at the heart of everything they do. By joining us, you are part of our commitment. We look forward to welcoming you into our family, which represents a variety of cultures, backgrounds, perspectives, and skills! Apply now, and we'll be in touch!
Posted 1 month ago
4.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary
At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Job Description
We are seeking skilled and dynamic Cloud Data Engineers specializing in AWS, Azure, Databricks, and GCP. The ideal candidate will have a strong background in data engineering, with a focus on data ingestion, transformation, and warehousing. They should also possess excellent knowledge of PySpark or Spark, and a proven ability to optimize performance in Spark job executions.

Responsibilities
· Design, build, and maintain scalable data pipelines for a variety of cloud platforms including AWS, Azure, Databricks, and GCP.
· Implement data ingestion and transformation processes to facilitate efficient data warehousing.
· Utilize cloud services to enhance data processing capabilities:
  - AWS: Glue, Athena, Lambda, Redshift, Step Functions, DynamoDB, SNS.
  - Azure: Data Factory, Synapse Analytics, Functions, Cosmos DB, Event Grid, Logic Apps, Service Bus.
  - GCP: Dataflow, BigQuery, Dataproc, Cloud Functions, Bigtable, Pub/Sub, Data Fusion.
· Optimize Spark job performance to ensure high efficiency and reliability.
· Stay proactive in learning and implementing new technologies to improve data processing frameworks.
· Collaborate with cross-functional teams to deliver robust data solutions.
· Work on Spark Streaming for real-time data processing as necessary.

Qualifications
· 4-7 years of experience in data engineering with a strong focus on cloud environments.
· Proficiency in PySpark or Spark is mandatory.
· Proven experience with data ingestion, transformation, and data warehousing.
· In-depth knowledge of and hands-on experience with cloud services (AWS/Azure/GCP).
· Demonstrated ability in performance optimization of Spark jobs (a short PySpark optimization sketch follows this listing).
· Strong problem-solving skills and the ability to work independently as well as in a team.
· Cloud certification (AWS, Azure, or GCP) is a plus.
· Familiarity with Spark Streaming is a bonus.

Mandatory Skill Sets: Python, PySpark, SQL with (AWS or Azure or GCP)
Preferred Skill Sets: Python, PySpark, SQL with (AWS or Azure or GCP)
Years Of Experience Required: 4-7 years
Education Qualification: BE/BTECH, ME/MTECH, MBA, MCA
Degrees/Field of Study required: Master of Engineering, Bachelor of Technology, Bachelor of Engineering, Master of Business Administration
Degrees/Field Of Study Preferred:
Certifications (if blank, certifications not specified)
Required Skills: Full Stack Development
Optional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline, Data Quality, Data Transformation, Data Validation {+ 19 more}
Desired Languages (If blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date
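As a small, hedged illustration of the Spark performance work this role calls for, the sketch below shows two common optimizations: broadcasting a small dimension table to avoid a shuffle join, and repartitioning plus caching a DataFrame that is reused downstream. Paths and column names are hypothetical placeholders.

```python
# Minimal PySpark optimization sketch; paths and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("spark-optimization-sketch").getOrCreate()

facts = spark.read.parquet("gs://example-bucket/warehouse/sales_facts/")
dims = spark.read.parquet("gs://example-bucket/warehouse/product_dim/")

# 1) Broadcast the small dimension table so the join avoids a full shuffle.
enriched = facts.join(F.broadcast(dims), on="product_id", how="left")

# 2) Repartition on the aggregation key and cache a DataFrame reused downstream.
enriched = enriched.repartition(200, "region").cache()

by_region = enriched.groupBy("region").agg(F.sum("amount").alias("total_amount"))
by_region.write.mode("overwrite").parquet("gs://example-bucket/curated/sales_by_region/")
```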
Posted 1 month ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary
At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities
· Strong skills in Python and GCP services, including Google Composer, BigQuery, and Google Storage (a short parameterized-query sketch follows this listing).
· Strong expertise in writing SQL and PL/SQL in Oracle, MySQL or any other relational database.
· Good to have: data warehousing and ETL (any tool).
· Proven experience using GCP services is preferred.
· Strong presentation and communication skills.
· Analytical and problem-solving skills.

Mandatory Skill Sets: GCP Data Engineer
Preferred Skill Sets: GCP Data Engineer
Years Of Experience Required: 4-8
Education Qualification: BTech/MBA/MCA
Degrees/Field of Study required: Master of Business Administration, Bachelor of Engineering, Bachelor of Technology
Degrees/Field Of Study Preferred:
Certifications (if blank, certifications not specified)
Required Skills: GCP Dataflow, Good Clinical Practice (GCP)
Optional Skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis {+ 12 more}
Desired Languages (If blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date
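For the Python-plus-BigQuery skills listed above, here is a minimal sketch running a parameterized query with the official client library. The project, dataset, and column names are hypothetical placeholders.

```python
# Minimal sketch: parameterized BigQuery query from Python.
# Project, dataset, and columns are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()

sql = """
    SELECT order_date, SUM(amount) AS daily_total
    FROM `example-project.sales.orders`
    WHERE region = @region AND order_date >= @start_date
    GROUP BY order_date
    ORDER BY order_date
"""

job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("region", "STRING", "EMEA"),
        bigquery.ScalarQueryParameter("start_date", "DATE", "2024-01-01"),
    ]
)

for row in client.query(sql, job_config=job_config).result():
    print(row.order_date, row.daily_total)
```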
Posted 1 month ago
4.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary
At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Experience: 4-7 years in Data Engineering

Job Description
We are seeking skilled and dynamic Cloud Data Engineers specializing in AWS, Azure, Databricks, and GCP. The ideal candidate will have a strong background in data engineering, with a focus on data ingestion, transformation, and warehousing. They should also possess excellent knowledge of PySpark or Spark, and a proven ability to optimize performance in Spark job executions.

Key Responsibilities
· Design, build, and maintain scalable data pipelines for a variety of cloud platforms including AWS, Azure, Databricks, and GCP.
· Implement data ingestion and transformation processes to facilitate efficient data warehousing.
· Utilize cloud services to enhance data processing capabilities:
  - AWS: Glue, Athena, Lambda, Redshift, Step Functions, DynamoDB, SNS.
  - Azure: Data Factory, Synapse Analytics, Functions, Cosmos DB, Event Grid, Logic Apps, Service Bus.
  - GCP: Dataflow, BigQuery, Dataproc, Cloud Functions, Bigtable, Pub/Sub, Data Fusion.
· Optimize Spark job performance to ensure high efficiency and reliability.
· Stay proactive in learning and implementing new technologies to improve data processing frameworks.
· Collaborate with cross-functional teams to deliver robust data solutions.
· Work on Spark Streaming for real-time data processing as necessary (a streaming sketch follows this listing).

Qualifications
· 4-7 years of experience in data engineering with a strong focus on cloud environments.
· Proficiency in PySpark or Spark is mandatory.
· Proven experience with data ingestion, transformation, and data warehousing.
· In-depth knowledge of and hands-on experience with cloud services (AWS/Azure/GCP).
· Demonstrated ability in performance optimization of Spark jobs.
· Strong problem-solving skills and the ability to work independently as well as in a team.
· Cloud certification (AWS, Azure, or GCP) is a plus.
· Familiarity with Spark Streaming is a bonus.

Mandatory Skill Sets: Python, PySpark, SQL with (AWS or Azure or GCP)
Preferred Skill Sets: Python, PySpark, SQL with (AWS or Azure or GCP)
Years Of Experience Required: 4-7 years
Education Qualification: BE/BTECH, ME/MTECH, MBA, MCA
Degrees/Field of Study required: Master of Engineering, Bachelor of Technology, Master of Business Administration, Bachelor of Engineering
Degrees/Field Of Study Preferred:
Certifications (if blank, certifications not specified)
Required Skills: Structured Query Language (SQL)
Optional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline, Data Quality, Data Transformation, Data Validation {+ 19 more}
Desired Languages (If blank, desired languages not specified)
Travel Requirements:
Available for Work Visa Sponsorship?
Government Clearance Required?
Job Posting End Date
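Since this listing also calls out Spark Streaming, here is a minimal Structured Streaming sketch that reads JSON events from Kafka and writes them to Parquet. The broker, topic, schema, and paths are hypothetical, and the Kafka source additionally requires the spark-sql-kafka package to be available to Spark.

```python
# Minimal Structured Streaming sketch: Kafka JSON -> Parquet.
# Broker, topic, schema, and paths are hypothetical; requires the
# spark-sql-kafka-0-10 package on the Spark classpath.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("streaming-sketch").getOrCreate()

event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("user_id", StringType()),
    StructField("amount", DoubleType()),
])

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .load()
)

# Kafka delivers bytes; decode the value column and parse the JSON payload.
events = raw.select(
    F.from_json(F.col("value").cast("string"), event_schema).alias("e")
).select("e.*")

query = (
    events.writeStream.outputMode("append")
    .format("parquet")
    .option("path", "gs://example-bucket/streams/events/")
    .option("checkpointLocation", "gs://example-bucket/checkpoints/events/")
    .start()
)
query.awaitTermination()
```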
Posted 1 month ago