
170 Dataproc Jobs - Page 3

JobPe aggregates listings for easy access; you apply directly on the original job portal.

3.0 - 5.0 years

5 - 7 Lacs

Chennai, Tamil Nadu

Work from Office


Duration: 12 Months
Work Type: Onsite

Position Description:
We are seeking an experienced GCP Data Engineer who can build a cloud analytics platform to meet ever-expanding business requirements with speed and quality using lean Agile practices. You will analyze and manipulate large datasets, activating data assets to support Enabling Platforms and Analytics on Google Cloud Platform (GCP). You will be responsible for designing the transformation and modernization on GCP, as well as landing data from source applications to GCP. Experience with large-scale solutions and operationalization of data warehouses, data lakes, and analytics platforms on Google Cloud Platform or another cloud environment is a must. We are looking for candidates with a broad set of technology skills across these areas who can demonstrate an ability to design the right solutions with an appropriate combination of GCP and third-party technologies for deployment on Google Cloud Platform.

Skills Required:
- Experience working on an implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment.
- Implement methods to automate all parts of the pipeline to minimize labor in development and production.
- Experience analyzing complex data, organizing raw data, and integrating massive datasets from multiple data sources to build subject areas and reusable data products.
- Experience working with architects to evaluate and productionize appropriate GCP tools for data ingestion, integration, presentation, and reporting.
- Experience working with all stakeholders to formulate business problems as technical data requirements, identifying and implementing technical solutions while ensuring key business drivers are captured in collaboration with product management.
- Proficient in machine learning model architecture, data pipeline interaction, and metrics interpretation, including designing and deploying a pipeline with automated data lineage.
- Identify, develop, evaluate, and summarize proofs of concept to prove out solutions; test and compare competing solutions and report a point of view on the best solution.
- Integration between GCP Data Catalog and Informatica EDC.
- Design and build production data engineering solutions to deliver pipeline patterns using Google Cloud Platform (GCP) services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine (see the sketch below).

Skills Preferred:
- Strong drive for results and ability to multi-task and work independently
- Self-starter with proven innovation skills
- Ability to communicate and work with cross-functional teams and all levels of management
- Demonstrated commitment to quality and project timing
- Demonstrated ability to document complex systems
- Experience in creating and executing detailed test plans

Experience Required: 3 to 5 years
Education Required: BE or equivalent
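
A minimal batch Apache Beam sketch of the GCS-to-BigQuery pipeline pattern this posting describes, runnable on Dataflow or the local runner. It assumes apache-beam[gcp] is installed; the bucket, dataset, and field names are hypothetical.

```python
import csv
import apache_beam as beam

def parse_line(line):
    # Expect CSV rows like: part_id,plant,qty
    part_id, plant, qty = next(csv.reader([line]))
    return {"part_id": part_id, "plant": plant, "qty": int(qty)}

with beam.Pipeline() as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://example-bucket/landing/parts.csv",
                                         skip_header_lines=1)
        | "Parse" >> beam.Map(parse_line)
        | "Write" >> beam.io.WriteToBigQuery(
            "example-project:materials.parts",
            schema="part_id:STRING,plant:STRING,qty:INTEGER",
            write_disposition=beam.io.BigQueryDisposition.WRITE_TRUNCATE,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        )
    )
```

The same pipeline runs on Dataflow by passing `--runner=DataflowRunner` with a project and region in the pipeline options.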

Posted 3 weeks ago

Apply

4.0 - 7.0 years

10 - 14 Lacs

Noida

Work from Office


Location: Noida (in-office/hybrid; client site if required)
Type: Full-Time | Immediate joiners preferred

Must-Have Skills:
- GCP (BigQuery, Dataflow, Dataproc, Cloud Storage)
- PySpark / Spark
- Distributed computing expertise
- Apache Iceberg (preferred), Hudi, or Delta Lake (see the sketch below)

Role Overview:
Be part of a high-impact Data Engineering team focused on building scalable, cloud-native data pipelines. You'll support and enhance EMR platforms using DevOps principles, helping deliver real-time health alerts and diagnostics for platform performance.

Key Responsibilities:
- Provide data engineering support to EMR platforms
- Design and implement cloud-native, automated data solutions
- Collaborate with internal teams to deliver scalable systems
- Continuously improve infrastructure reliability and observability

Technical Environment:
- Databases: Oracle, MySQL, MSSQL, MongoDB
- Distributed engines: Spark/PySpark, Presto, Flink/Beam
- Cloud infra: GCP (preferred), AWS (nice to have), Terraform
- Big data formats: Iceberg, Hudi, Delta
- Tools: SQL, data modeling, Palantir Foundry, Jenkins, Confluence
- Bonus: stats/math tools (NumPy, PyMC3), Linux scripting

Ideal for engineers with cloud-native, real-time data platform experience, especially those who have worked with EMR and modern lakehouse stacks.
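
A minimal PySpark sketch appending to an Apache Iceberg table, in the spirit of the lakehouse stack above. It assumes Spark 3.x with the Iceberg runtime jar on the classpath and an existing target table; the catalog, bucket, and table names are hypothetical.

```python
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("iceberg-append-sketch")
    .config("spark.sql.catalog.lake", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.lake.type", "hadoop")
    .config("spark.sql.catalog.lake.warehouse", "gs://example-bucket/warehouse")
    .getOrCreate()
)

# Read raw events, deduplicate, and append to the Iceberg table.
events = spark.read.parquet("gs://example-bucket/raw/events/")
cleaned = events.dropDuplicates(["event_id"]).withColumnRenamed("ts", "event_ts")
cleaned.writeTo("lake.analytics.events").append()
```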

Posted 3 weeks ago

Apply

6.0 - 8.0 years

8 - 10 Lacs

Pune

Work from Office


Job Summary:
We are looking for a seasoned Data Modeler / Data Analyst to design and implement scalable, reusable logical and physical data models on Google Cloud Platform, primarily BigQuery. You will partner closely with data engineers, analytics teams, and business stakeholders to translate complex business requirements into performant data models that power reporting, self-service analytics, and advanced data science workloads.

Key Responsibilities:
- Gather and analyze business requirements and translate them into conceptual, logical, and physical data models on GCP (BigQuery, Cloud SQL, Cloud Spanner, etc.).
- Design star/snowflake schemas, data vaults, and other modeling patterns that balance performance, flexibility, and cost.
- Implement partitioning, clustering, and materialized views in BigQuery to optimize query performance and cost efficiency (see the sketch below).
- Establish and maintain data modeling standards, naming conventions, and metadata documentation to ensure consistency across analytic and reporting layers.
- Collaborate with data engineers to define ETL/ELT pipelines and ensure data models align with ingestion and transformation strategies (Dataflow, Cloud Composer, Dataproc, dbt).
- Validate data quality and lineage; work with BI developers and analysts to troubleshoot performance issues or data anomalies.
- Conduct impact assessments for schema changes and guide version-control processes for data models.
- Mentor junior analysts/engineers on data modeling best practices and participate in code/design reviews.
- Contribute to capacity planning and cost-optimization recommendations for BigQuery datasets and reservations.

Must-Have Skills:
- 6-8 years of hands-on experience in data modeling, data warehousing, or database design, including at least 2 years on GCP BigQuery.
- Proficiency in dimensional modeling, 3NF, and modern patterns such as data vault.
- Expert SQL skills with demonstrable ability to optimize complex analytical queries on BigQuery (partitioning, clustering, sharding strategies).
- Strong understanding of ETL/ELT concepts and experience with tools such as Dataflow, Cloud Composer, or dbt.
- Familiarity with BI/reporting tools (Looker, Tableau, Power BI, or similar) and how model design impacts dashboard performance.
- Experience with data governance practices: data cataloging, lineage, and metadata management (e.g., Data Catalog).
- Excellent communication skills to translate technical concepts into business-friendly language and collaborate across functions.

Good to Have:
- Experience working on Azure cloud (Fabric, Synapse, Delta Lake)

Education:
Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, Statistics, or a related field. Equivalent experience will be considered.
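
A minimal sketch of creating a partitioned, clustered BigQuery table plus a materialized view, the optimization techniques named above. It assumes google-cloud-bigquery is installed and credentials are configured; the project, dataset, and table names are hypothetical.

```python
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

# Partition by order date and cluster by customer to prune scans and cut cost.
client.query("""
CREATE TABLE IF NOT EXISTS analytics.orders (
  order_id STRING,
  customer_id STRING,
  amount NUMERIC,
  order_ts TIMESTAMP
)
PARTITION BY DATE(order_ts)
CLUSTER BY customer_id
""").result()

# A materialized view that BigQuery refreshes incrementally.
client.query("""
CREATE MATERIALIZED VIEW IF NOT EXISTS analytics.daily_revenue AS
SELECT DATE(order_ts) AS order_date, SUM(amount) AS revenue
FROM analytics.orders
GROUP BY order_date
""").result()
```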

Posted 3 weeks ago

Apply

4.0 - 9.0 years

8 - 18 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid


Required Skills: SQL, GCP (BigQuery, Composer, Dataflow), Big Data (Scala, Kafka)

You'll need to have:
- Experience in Big Data technologies on GCP: Composer, BigQuery, Dataflow
- Understanding business requirements and converting them into technical designs
- Working on data ingestion, preparation, and transformation
- Developing data streaming applications (see the sketch below)
- Debugging production failures and identifying solutions
- Working on ETL/ELT development
- Experience with data warehouse concepts and the data management life cycle
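
A minimal sketch of the publish side of a streaming application on GCP Pub/Sub, as referenced above. It assumes google-cloud-pubsub is installed; the project and topic names are hypothetical.

```python
import json
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("example-project", "order-events")

event = {"order_id": "42", "status": "CREATED"}
# publish() is asynchronous; result() blocks until the server acks the message.
future = publisher.publish(topic_path, json.dumps(event).encode("utf-8"),
                           origin="checkout-service")
print("published message id:", future.result())
```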

Posted 3 weeks ago

Apply

3.0 - 8.0 years

15 - 30 Lacs

Noida, Gurugram, Delhi / NCR

Hybrid


Salary: 15 to 30 LPA
Experience: 3 to 8 years
Location: Gurgaon (Hybrid)
Notice period: Immediate to 30 days
Key Skills: GCP, Cloud, Pub/Sub, Data Engineer

Posted 3 weeks ago

Apply

4.0 - 8.0 years

10 - 19 Lacs

Chennai

Hybrid


Greetings from Getronics! We have permanent opportunities for GCP Data Engineers in Chennai.

Position Description:
We are currently seeking a seasoned GCP Cloud Data Engineer with 3 to 5 years of experience leading/implementing GCP data projects, preferably implementing a complete data-centric model. This position will design and deploy a data-centric architecture in GCP for a Materials Management platform, which exchanges data with multiple modern and legacy applications in Product Development, Manufacturing, Finance, Purchasing, N-tier Supply Chain, and Supplier Collaboration.

Responsibilities:
- Design and implement data-centric solutions on Google Cloud Platform (GCP) using tools such as Storage Transfer Service, Cloud Data Fusion, Pub/Sub, Dataflow, Cloud compression, Cloud Scheduler, gsutil, FTP/SFTP, Dataproc, Bigtable, etc.
- Build ETL pipelines to ingest data from heterogeneous sources into our system (see the sketch below)
- Develop data processing pipelines in languages such as Java and Python to extract, transform, and load (ETL) data
- Create and maintain data models, ensuring efficient storage, retrieval, and analysis of large datasets
- Deploy and manage SQL and NoSQL databases, such as Bigtable, Firestore, or Cloud SQL, based on project requirements

Skills Required:
- GCP data engineering, Hadoop, Spark/PySpark; Google Cloud Platform services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine
- 4+ years of professional experience in data engineering, data product development, and software product launches
- 3+ years of cloud data/software engineering experience building scalable, reliable, and cost-effective production batch and streaming data pipelines using: data warehouses like Google BigQuery; workflow orchestration tools like Airflow; relational database management systems like MySQL, PostgreSQL, and SQL Server; real-time data streaming platforms like Apache Kafka and GCP Pub/Sub

Education Required: Any Bachelor's degree
Candidates should be willing to take a GCP assessment (1-hour online test).
LOOKING FOR IMMEDIATE TO 30 DAYS NOTICE CANDIDATES ONLY.

Regards,
Narmadha
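
A minimal ingestion sketch loading CSV files from Cloud Storage into BigQuery, one common ETL pattern mentioned above. It assumes google-cloud-bigquery is installed; the URIs and table names are hypothetical.

```python
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,   # header row
    autodetect=True,       # infer the schema from the files
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(
    "gs://example-bucket/materials/*.csv",
    "example-project.materials.inventory",
    job_config=job_config,
)
load_job.result()  # wait for completion
table = client.get_table("example-project.materials.inventory")
print("loaded", table.num_rows, "rows")
```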

Posted 3 weeks ago

Apply

8 - 13 years

14 - 24 Lacs

Chennai

Hybrid


Greetings from Getronics! We have permanent opportunities for GCP Data Engineers in Chennai.

Hope you are doing well! This is Abirami from the Getronics Talent Acquisition team. We have multiple opportunities for Senior GCP Data Engineers for our automotive client at the Chennai Sholinganallur location. Please find the company profile and job description below. If interested, please share your updated resume, a recent professional photograph, and Aadhaar proof at the earliest to abirami.rsk@getronics.com.

Company: Getronics (permanent role)
Client: Automobile industry
Experience Required: 8+ years in IT and a minimum of 4+ years in GCP data engineering
Location: Chennai (ELCOT - Sholinganallur)
Work Mode: Hybrid

Position Description:
We are currently seeking a seasoned GCP Cloud Data Engineer with 4+ years of experience leading/implementing GCP data projects, preferably implementing a complete data-centric model. This position will design and deploy a data-centric architecture in GCP for a Materials Management platform, which exchanges data with multiple modern and legacy applications in Product Development, Manufacturing, Finance, Purchasing, N-tier Supply Chain, and Supplier Collaboration.

Responsibilities:
- Design and implement data-centric solutions on Google Cloud Platform (GCP) using tools such as Storage Transfer Service, Cloud Data Fusion, Pub/Sub, Dataflow, Cloud compression, Cloud Scheduler, gsutil, FTP/SFTP, Dataproc, Bigtable, etc.
- Build ETL pipelines to ingest data from heterogeneous sources into our system
- Develop data processing pipelines in languages such as Java and Python to extract, transform, and load (ETL) data
- Create and maintain data models, ensuring efficient storage, retrieval, and analysis of large datasets
- Deploy and manage SQL and NoSQL databases, such as Bigtable, Firestore, or Cloud SQL, based on project requirements
- Collaborate with cross-functional teams to understand data requirements and design scalable solutions that meet business needs
- Implement security measures and data governance policies to ensure the integrity and confidentiality of data
- Optimize data workflows for performance, reliability, and cost-effectiveness on the GCP infrastructure

Skills Required:
- GCP data engineering, Hadoop, Spark/PySpark; Google Cloud Platform services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine
- 8+ years of professional experience in data engineering, data product development, and software product launches
- 4+ years of cloud data engineering experience building scalable, reliable, and cost-effective production batch and streaming data pipelines using: data warehouses like Google BigQuery; workflow orchestration tools like Airflow; relational database management systems like MySQL, PostgreSQL, and SQL Server; real-time data streaming platforms like Apache Kafka and GCP Pub/Sub

Education Required: Any Bachelor's degree
LOOKING FOR IMMEDIATE TO 30 DAYS NOTICE CANDIDATES ONLY.

Regards,
Abirami
Getronics Recruitment Team

Posted 4 weeks ago

Apply

3 - 6 years

4 - 8 Lacs

Bengaluru

Work from Office


About The Role:
Data engineers are responsible for building reliable and scalable data infrastructure that enables organizations to derive meaningful insights, make data-driven decisions, and unlock the value of their data assets.

About The Role - Grade Specific:
The primary focus is to help organizations design, develop, and optimize their data infrastructure and systems. They help organizations enhance data processes and leverage data effectively to drive business outcomes.

Skills (competencies):
Ab Initio; Agile (Software Development Framework); Apache Hadoop; AWS Airflow; AWS Athena; AWS CodePipeline; AWS EFS; AWS EMR; AWS Redshift; AWS S3; Azure ADLS Gen2; Azure Data Factory; Azure Data Lake Storage; Azure Databricks; Azure Event Hub; Azure Stream Analytics; Azure Synapse; Bitbucket; CentOS; Change Management; Client Centricity; Collaboration; Continuous Integration and Continuous Delivery (CI/CD); Data Architecture Patterns; Data Format Analysis; Data Governance; Data Modeling; Data Validation; Data Vault Modeling; Database Schema Design; Decision-Making; DevOps; Dimensional Modeling; GCP BigQuery; GCP Bigtable; GCP Cloud Storage; GCP Dataflow; GCP Dataproc; Git; Google Bigtable; Google Dataproc; Greenplum; HQL; IBM DataStage; IBM DB2; Industry Standard Data Modeling (FSLDM); Industry Standard Data Modeling (IBM FSDM); Influencing; Informatica IICS; Inmon methodology; JavaScript; Jenkins; Kimball; Linux - Red Hat; Negotiation; Netezza; NewSQL; Oracle Exadata; Performance Tuning; Perl; Platform Update Management; Project Management; PySpark; Python; R; RDD Optimization; SAS; Scala; Shell Script; Snowflake; Spark; Spark Code Optimization; SQL; Stakeholder Management; Sun Solaris; Synapse; Talend; Teradata; Time Management; Ubuntu; Vendor Management

Posted 1 month ago

Apply

5 - 10 years

15 - 30 Lacs

Hyderabad, Pune, Bengaluru

Hybrid


EPAM has a presence across 40+ countries globally, with 55,000+ professionals and numerous delivery centers. Key locations are North America, Eastern Europe, Central Europe, Western Europe, and APAC, with development centers in India (Hyderabad, Pune, and Bangalore).

Location: Gurgaon/Pune/Hyderabad/Bengaluru/Chennai
Work Mode: Hybrid (2-3 days in office per week)

Job Description:
- 5-14 years of experience in Big Data and related technologies
- Expert-level understanding of distributed computing principles
- Expert-level knowledge of and experience in Apache Spark
- Hands-on programming with Python
- Proficiency with Hadoop v2, MapReduce, HDFS, Sqoop
- Experience building stream-processing systems using technologies such as Apache Storm or Spark Streaming (see the sketch below)
- Good understanding of Big Data querying tools such as Hive and Impala
- Experience integrating data from multiple sources such as RDBMS (SQL Server, Oracle), ERP, and files
- Good understanding of SQL queries, joins, stored procedures, and relational schemas
- Experience with NoSQL databases such as HBase, Cassandra, MongoDB
- Knowledge of ETL techniques and frameworks
- Performance tuning of Spark jobs
- Experience with native cloud data services on AWS/Azure/GCP
- Ability to lead a team efficiently
- Experience designing and implementing Big Data solutions
- Practitioner of Agile methodology

WE OFFER:
- Opportunity to work on technical challenges that may impact across geographies
- Vast opportunities for self-development: online university, global knowledge sharing, learning through external certifications
- Opportunity to share your ideas on international platforms
- Sponsored tech talks and hackathons
- Possibility to relocate to any EPAM office for short- and long-term projects
- Focused individual development
- Benefits package: health and medical benefits, retirement benefits, paid time off, flexible benefits
- Forums to explore passions beyond work (CSR, photography, painting, sports, etc.)
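
A minimal Spark Structured Streaming sketch consuming from Kafka, in the spirit of the stream-processing experience listed above. It assumes Spark 3.x with the spark-sql-kafka package available; the broker and topic names are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka-stream-sketch").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker-1:9092")
    .option("subscribe", "clickstream")
    .load()
    # Kafka delivers key/value as bytes; cast to strings for downstream parsing.
    .select(col("key").cast("string"), col("value").cast("string"))
)

query = (
    events.writeStream.format("console")
    .outputMode("append")
    .option("checkpointLocation", "/tmp/chk/clickstream")
    .start()
)
query.awaitTermination()
```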

Posted 1 month ago

Apply

4 - 9 years

15 - 19 Lacs

Pune

Work from Office


About The Role

Job Title: Technical Specialist - GCP Developer
Location: Pune, India

Role Description:
This role is for an engineer responsible for the design, development, and unit testing of software applications. The candidate is expected to ensure that good-quality, maintainable, scalable, and high-performing software applications are delivered to users in an Agile development environment. The candidate should come from a strong technological background and have good working experience in Spark and GCP technology. They should be hands-on, able to work independently with minimal technical/tool guidance, and able to technically guide and mentor junior resources in the team. As a developer, you will bring extensive design and development skills to strengthen the group of developers within the team. The candidate will make extensive use of Continuous Integration tools and practices in the context of Deutsche Bank's digitalization journey.

What we'll offer you:
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leave
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complementary health screening for those 35 years and above

Your key responsibilities:
- Design and discuss your own solutions for addressing user stories and tasks.
- Develop, unit-test, integrate, deploy, maintain, and improve software.
- Perform peer code reviews.
- Actively participate in sprint activities and ceremonies, e.g., daily stand-up/scrum meetings, sprint planning, retrospectives, etc.
- Apply continuous integration best practices (SCM, build automation, unit testing, dependency management).
- Collaborate with other team members to achieve the sprint objectives.
- Report progress and update Agile team management tools (JIRA/Confluence).
- Manage individual task priorities and deliverables.
- Take responsibility for the quality of the solutions you provide.
- Contribute to planning and continuous improvement activities, and support the PO, ITAO, developers, and Scrum Master.

Your skills and experience:
- Engineer with good development experience on the Google Cloud Platform for at least 4 years.
- Hands-on experience in BigQuery, Dataproc, Composer, Terraform, GKE, Cloud SQL, and Cloud Functions (see the sketch below).
- Experience in the setup, maintenance, and ongoing development of continuous build/integration infrastructure as part of DevOps.
- Create and maintain fully automated CI build processes and write build and deployment scripts.
- Experience with development platforms: OpenShift/Kubernetes/Docker configuration and deployment with DevOps tools, e.g., Git, TeamCity, Maven, SONAR.
- Good knowledge of core SDLC processes and tools such as HP ALM, Jira, and ServiceNow.
- Knowledge of working with APIs and microservices, integrating external and internal web services, including SOAP, XML, REST, and JSON.
- Strong analytical skills.
- Proficient communication skills; fluent in English (written/verbal).
- Ability to work in virtual teams and matrixed organizations.
- Excellent team player; open-minded and willing to learn the business and technology; keeps pace with technical innovation; understands the relevant business area.
- Ability to share information and transfer knowledge to team members.

How we'll support you:
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs
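
A minimal sketch of a Pub/Sub-triggered Cloud Function in Python, one of the GCP services named above. It assumes the functions-framework package; the function and payload names are hypothetical.

```python
import base64
import functions_framework

@functions_framework.cloud_event
def handle_trade_event(cloud_event):
    # Pub/Sub wraps the payload in a base64-encoded "message.data" field.
    payload = base64.b64decode(cloud_event.data["message"]["data"]).decode("utf-8")
    print(f"received event: {payload}")
```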

Posted 1 month ago

Apply

4 - 8 years

25 - 30 Lacs

Pune

Hybrid


So, what's the role all about?
As a Data Engineer, you will be responsible for designing, building, and maintaining large-scale data systems, as well as working with cross-functional teams to ensure efficient data processing and integration. You will leverage your knowledge of Apache Spark to create robust ETL processes, optimize data workflows, and manage high volumes of structured and unstructured data.

How will you make an impact?
- Design, implement, and maintain data pipelines using Apache Spark for processing large datasets.
- Work with data engineering teams to optimize data workflows for performance and scalability.
- Integrate data from various sources, ensuring clean, reliable, and high-quality data for analysis.
- Develop and maintain data models, databases, and data lakes.
- Build and manage scalable ETL solutions to support business intelligence and data science initiatives.
- Monitor and troubleshoot data processing jobs, ensuring they run efficiently and effectively.
- Collaborate with data scientists, analysts, and other stakeholders to understand business needs and deliver data solutions.
- Implement data security best practices to protect sensitive information.
- Maintain a high level of data quality and ensure timely delivery of data to end users.
- Continuously evaluate new technologies and frameworks to improve data engineering processes.

Have you got what it takes?
- 4-7 years of experience as a Data Engineer, with a strong focus on Apache Spark and big data technologies.
- Expertise in Spark SQL, DataFrames, and RDDs for data processing and analysis.
- Proficiency in programming languages such as Python, Scala, or Java for data engineering tasks.
- Hands-on experience with cloud platforms like AWS, specifically with data processing and storage services (e.g., S3, BigQuery, Redshift, Databricks).
- Experience with ETL frameworks and tools such as Apache Kafka, Airflow, or NiFi.
- Strong knowledge of data warehousing concepts and technologies (e.g., Redshift, Snowflake, BigQuery).
- Familiarity with containerization technologies like Docker and Kubernetes.
- Knowledge of SQL and relational databases, with the ability to design and query databases effectively.
- Solid understanding of distributed computing, data modeling, and data architecture principles.
- Strong problem-solving skills and the ability to work with large and complex datasets.
- Excellent communication and collaboration skills to work effectively with cross-functional teams.

What's in it for you?
Join an ever-growing, market-disrupting, global company where the teams, comprised of the best of the best, work in a fast-paced, collaborative, and creative environment! As the market leader, every day at NICE is a chance to learn and grow, and there are endless internal career opportunities across multiple roles, disciplines, domains, and locations. If you are passionate, innovative, and excited to constantly raise the bar, you may just be our next NICEr!

Enjoy NICE-FLEX!
At NICE, we work according to the NICE-FLEX hybrid model, which enables maximum flexibility: 2 days working from the office and 3 days of remote work each week. Naturally, office days focus on face-to-face meetings, where teamwork and collaborative thinking generate innovation, new ideas, and a vibrant, interactive atmosphere.

Requisition ID: 7235
Reporting into: Tech Manager
Role Type: Individual Contributor

Posted 1 month ago

Apply

8 - 12 years

25 - 40 Lacs

Hyderabad

Remote


Senior GCP Cloud Administrator

Experience: 8 - 12 years
Salary: Competitive
Preferred Notice Period: within 30 days
Shift: 10:00 AM to 7:00 PM IST
Opportunity Type: Remote
Placement Type: Permanent
(Note: This is a requirement for one of Uplers' clients.)

Must-have skills: GCP, Identity and Access Management (IAM), BigQuery, SRE, GKE, GCP certification
Good-to-have skills: Terraform, Cloud Composer, Dataproc, Dataflow, AWS

Forbes Advisor (one of Uplers' clients) is looking for a Senior GCP Cloud Administrator who is passionate about their work, eager to learn and grow, and committed to delivering exceptional results. If you are a team player with a positive attitude and a desire to make a difference, then we want to hear from you.

Role Overview:
Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news, and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most.

We are looking for an experienced GCP Administrator to join our team. The ideal candidate will have strong hands-on experience with IAM administration, multi-account management, BigQuery administration, performance optimization, monitoring, and cost management within Google Cloud Platform (GCP).

Responsibilities:
- Manage and configure roles/permissions in GCP IAM, following the principle of least-privileged access
- Manage the BigQuery service by optimizing slot assignments and SQL queries, adopting FinOps practices for cost control, and troubleshooting and resolving critical data queries (see the sketch below)
- Collaborate with teams such as Data Engineering, Data Warehousing, Cloud Platform Engineering, and SRE for efficient data management and operational practices in GCP
- Create automations and monitoring mechanisms for GCP data-related services, processes, and tasks
- Work with development teams to design GCP-specific cloud architecture
- Provision and de-provision GCP accounts and resources for internal projects
- Manage and operate multiple GCP subscriptions
- Keep technical documentation up to date
- Proactively stay up to date on GCP announcements, services, and developments

Requirements:
- 5+ years of work experience provisioning, operating, and maintaining systems in GCP
- A valid certification: either GCP Associate Cloud Engineer or GCP Professional Cloud Architect
- Hands-on experience with GCP services such as Identity and Access Management (IAM), BigQuery, and Google Kubernetes Engine (GKE)
- Ability to provide support and guidance on GCP operations and services according to enterprise needs
- Working knowledge of Docker containers and Kubernetes
- Strong communication skills and the ability to work both independently and collaboratively
- Fast learner and achiever who sets high personal goals
- Able to work on multiple projects and consistently meet project deadlines
- Willing to work on a shift basis based on project requirements

Good to Have:
- Experience in Terraform automation for GCP infrastructure provisioning
- Experience in Cloud Composer, Dataproc, Dataflow, Storage, and monitoring services
- Experience building and supporting any form of data pipeline
- Multi-cloud experience with AWS
- New Relic monitoring

Perks:
- Day off on the 3rd Friday of every month (one long weekend each month)
- Monthly Wellness Reimbursement Program to promote health and well-being
- Paid paternity and maternity leaves

How to apply for this opportunity:
1. Click on Apply and register or log in on our portal
2. Upload your updated resume and complete the screening form
3. Increase your chances of getting shortlisted and meeting the client for the interview!

About Our Client:
Forbes Advisor is a global platform dedicated to helping consumers make the best financial choices for their individual lives.

About Uplers:
Our goal is to make hiring and getting hired reliable, simple, and fast. Our role is to help talent find and apply for relevant product and engineering job opportunities and progress in their careers. (Note: There are many more opportunities on the portal apart from this one.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
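
A minimal FinOps-style sketch for finding the most expensive recent BigQuery queries via INFORMATION_SCHEMA, in line with the cost-management duties above. It assumes google-cloud-bigquery; the project and region are hypothetical.

```python
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

sql = """
SELECT user_email,
       job_id,
       total_bytes_billed / POW(1024, 4) AS tib_billed
FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE creation_time > TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
  AND job_type = 'QUERY'
ORDER BY total_bytes_billed DESC
LIMIT 10
"""

for row in client.query(sql):
    print(f"{row.user_email}  {row.job_id}  {row.tib_billed:.2f} TiB billed")
```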

Posted 1 month ago

Apply

5 - 7 years

27 - 30 Lacs

Pune

Work from Office


Must-Have Skills:
- 5+ years of experience as a Big Data Engineer
- 3+ years of experience with Apache Spark, Hive, HDFS, and Beam (optional)
- Strong proficiency in SQL and either Scala or Python
- Experience with ETL processes and working with structured and unstructured data
- 2+ years of experience with cloud platforms (GCP, AWS, or Azure)
- Hands-on experience with software build management tools like Maven or Gradle
- Experience in automation, performance tuning, and optimizing data pipelines
- Familiarity with CI/CD, serverless computing, and infrastructure-as-code practices

Good-to-Have Skills:
- Experience with Google Cloud services (BigQuery, Dataproc, Dataflow, Composer, Datastream)
- Strong knowledge of data pipeline development and optimization
- Familiarity with source control tools (SVN/Git, GitHub)
- Experience working in Agile environments (Scrum, XP, etc.)
- Knowledge of relational databases (SQL Server, Oracle, MySQL)
- Experience with Atlassian tools (JIRA, Confluence, GitHub)

Key Responsibilities:
- Extract, transform, and load (ETL) data from multiple sources using Big Data technologies
- Develop, enhance, and support data ingestion jobs using GCP services like Apache Spark, Dataproc, Dataflow, BigQuery, and Airflow (see the sketch below)
- Work closely with senior engineers and cross-functional teams to improve data accessibility
- Automate manual processes, optimize data pipelines, and enhance infrastructure for scalability
- Modify data extraction pipelines to follow standardized, reusable approaches
- Optimize query performance and data access techniques in collaboration with senior engineers
- Follow modern software development practices, including microservices, CI/CD, and infrastructure-as-code
- Participate in Agile development teams, ensuring best practices for software engineering and data management

Preferred Qualifications:
- Bachelor's degree in Computer Science, Systems Engineering, or a related field
- Self-starter with strong problem-solving skills and adaptability to shifting priorities
- Cloud certifications (GCP, AWS, or Azure) are a plus

Skills: GCP services, Dataproc, Dataflow
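
A minimal Cloud Composer (Airflow) DAG sketch that submits a PySpark job to Dataproc, matching the ingestion-job responsibilities above. It assumes Airflow 2.x with the apache-airflow-providers-google package; the project, cluster, and bucket names are hypothetical.

```python
from datetime import datetime
from airflow import DAG
from airflow.providers.google.cloud.operators.dataproc import DataprocSubmitJobOperator

PYSPARK_JOB = {
    "reference": {"project_id": "example-project"},
    "placement": {"cluster_name": "etl-cluster"},
    "pyspark_job": {"main_python_file_uri": "gs://example-bucket/jobs/ingest.py"},
}

with DAG(
    dag_id="daily_ingest",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest = DataprocSubmitJobOperator(
        task_id="run_ingest",
        job=PYSPARK_JOB,
        region="us-central1",
        project_id="example-project",
    )
```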

Posted 1 month ago

Apply

4 - 7 years

10 - 19 Lacs

Indore, Gurugram, Bengaluru

Work from Office


We need GCP engineers for capacity building.

- The candidate should have extensive production experience (1-2 years) in GCP; other cloud experience would be a strong bonus.
- Strong background in data engineering, with 2-3 years of experience in Big Data technologies including Hadoop, NoSQL, Spark, Kafka, etc.
- Exposure to enterprise application development is a must.

Roles and Responsibilities:
- 4-7 years of IT experience is preferred.
- Able to effectively use GCP managed services, e.g., Dataproc, Dataflow, Pub/Sub, Cloud Functions, BigQuery, GCS; at least 4 of these services (see the sketch below).
- Good to have knowledge of Cloud Composer, Cloud SQL, Bigtable, Cloud Functions.
- Strong experience in Big Data technologies: Hadoop, Sqoop, Hive, and Spark, including DevOps.
- Good hands-on expertise in either Python or Java programming.
- Good understanding of GCP core services like Google Cloud Storage, Google Compute Engine, Cloud SQL, and Cloud IAM.
- Good to have knowledge of GCP services like App Engine, GKE, Cloud Run, Cloud Build, and Anthos.
- Ability to drive the deployment of customers' workloads into GCP, providing guidance, a cloud adoption model, service integrations, appropriate recommendations to overcome blockers, and technical roadmaps for GCP cloud implementations.
- Experience with technical solutions based on industry standards using GCP IaaS, PaaS, and SaaS capabilities.
- Extensive, real-world experience designing technology components for enterprise solutions and defining solution architectures and reference architectures with a focus on cloud technologies.
- Act as a subject-matter expert or developer around GCP and become a trusted advisor to multiple teams.
- Technical ability to become certified in required GCP technical certifications.
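
A minimal sketch of submitting a PySpark job to an existing Dataproc cluster through the client library, reflecting the managed-services experience asked for above. It assumes google-cloud-dataproc is installed; the project, region, cluster, and bucket names are hypothetical.

```python
from google.cloud import dataproc_v1

region = "asia-south1"
client = dataproc_v1.JobControllerClient(
    client_options={"api_endpoint": f"{region}-dataproc.googleapis.com:443"}
)

job = {
    "placement": {"cluster_name": "analytics-cluster"},
    "pyspark_job": {"main_python_file_uri": "gs://example-bucket/jobs/transform.py"},
}

operation = client.submit_job_as_operation(
    request={"project_id": "example-project", "region": region, "job": job}
)
result = operation.result()  # blocks until the job finishes
print("job state:", result.status.state.name)
```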

Posted 1 month ago

Apply

5 - 7 years

0 - 0 Lacs

Kolkata

Work from Office


Role Proficiency:
This role requires proficiency in data pipeline development, including coding and testing data pipelines for ingesting, wrangling, transforming, and joining data from various sources. Must be skilled in ETL tools such as Informatica, Glue, Databricks, and Dataproc, with coding expertise in Python, PySpark, and SQL. Works independently and has a deep understanding of data warehousing solutions including Snowflake, BigQuery, lakehouse, and Delta Lake. Capable of calculating costs and understanding performance issues related to data solutions.

Outcomes:
- Act creatively to develop pipelines and applications by selecting appropriate technical options, optimizing application development, maintenance, and performance using design patterns and reusing proven solutions.
- Interpret requirements to create optimal architecture and design, developing solutions in accordance with specifications.
- Document and communicate milestones/stages for end-to-end delivery.
- Code adhering to best coding standards; debug and test solutions to deliver best-in-class quality.
- Perform performance tuning of code and align it with the appropriate infrastructure to optimize efficiency.
- Validate results with user representatives, integrating the overall solution seamlessly.
- Develop and manage data storage solutions, including relational databases, NoSQL databases, and data lakes.
- Stay updated on the latest trends and best practices in data engineering, cloud technologies, and big data tools.
- Influence and improve customer satisfaction through effective data solutions.

Measures of Outcomes:
- Adherence to engineering processes and standards
- Adherence to schedule/timelines
- Adherence to SLAs where applicable
- Number of defects post delivery
- Number of non-compliance issues
- Reduction of recurrence of known defects
- Quick turnaround of production bugs
- Completion of applicable technical/domain certifications
- Completion of all mandatory training requirements
- Efficiency improvements in data pipelines (e.g., reduced resource consumption, faster run times)
- Average time to detect, respond to, and resolve pipeline failures or data issues
- Number of data security incidents or compliance breaches

Outputs Expected:
- Code Development: Develop data processing code independently, ensuring it meets performance and scalability requirements. Define coding standards, templates, and checklists. Review code for team members and peers.
- Documentation: Create and review templates, checklists, guidelines, and standards for design processes and development. Create and review deliverable documents, including design documents, architecture documents, infrastructure costing, business requirements, source-target mappings, test cases, and results.
- Configuration: Define and govern the configuration management plan. Ensure compliance within the team.
- Testing: Review and create unit test cases, scenarios, and execution plans. Review the test plan and test strategy developed by the testing team. Provide clarifications and support to the testing team as needed.
- Domain Relevance: Advise data engineers on the design and development of features and components, demonstrating a deeper understanding of business needs. Learn about customer domains to identify opportunities for value addition. Complete relevant domain certifications to enhance expertise.
- Project Management: Manage the delivery of modules effectively.
- Defect Management: Perform root cause analysis (RCA) and mitigation of defects. Identify defect trends and take proactive measures to improve quality.
- Estimation: Create and provide input for effort and size estimation for projects.
- Knowledge Management: Consume and contribute to project-related documents, SharePoint libraries, and client universities. Review reusable documents created by the team.
- Release Management: Execute and monitor the release process to ensure smooth transitions.
- Design Contribution: Contribute to the creation of high-level design (HLD), low-level design (LLD), and system architecture for applications, business components, and data models.
- Customer Interface: Clarify requirements and provide guidance to the development team. Present design options to customers and conduct product demonstrations.
- Team Management: Set FAST goals and provide constructive feedback. Understand team members' aspirations and provide guidance and opportunities for growth. Ensure team engagement in projects and initiatives.
- Certifications: Obtain relevant domain and technology certifications to stay competitive and informed.

Skill Examples:
- Proficiency in SQL, Python, or other programming languages used for data manipulation
- Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF
- Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g., AWS Glue, BigQuery)
- Conduct tests on data pipelines and evaluate results against data quality and performance specifications
- Experience in performance tuning of data processes
- Expertise in designing and optimizing data warehouses for cost efficiency
- Ability to apply and optimize data models for efficient storage, retrieval, and processing of large datasets
- Capacity to clearly explain and communicate design and development aspects to customers
- Ability to estimate time and resource requirements for developing and debugging features or components

Knowledge Examples:
- Knowledge of the various ETL services offered by cloud providers, including Apache PySpark, AWS Glue, GCP Dataproc/Dataflow, and Azure ADF and ADLS
- Proficiency in SQL for analytics, including windowing functions
- Understanding of data schemas and models relevant to various business contexts
- Familiarity with domain-related data and its implications
- Expertise in data warehousing optimization techniques
- Knowledge of data security concepts and best practices
- Familiarity with design patterns and frameworks in data engineering

Additional Comments:
Required Skills & Qualifications:
- A degree (preferably an advanced degree) in Computer Science, Engineering, or a related field
- Senior developer with 8+ years of hands-on development experience in Azure using ASB and ADF: extensive experience in designing, developing, and maintaining data solutions/pipelines in the Azure ecosystem, including Azure Service Bus and ADF
- Familiarity with MongoDB and Python is an added advantage

Required Skills: Azure Data Factory, Azure Service Bus, Azure, MongoDB

Posted 1 month ago

Apply

12 - 15 years

0 - 0 Lacs

Thiruvananthapuram

Work from Office


Role Proficiency:
Resolve complex trouble tickets spanning different technologies and fine-tune infrastructure for optimum performance, and/or provide technical, people, and financial management (hierarchical or lateral).

Outcomes:
1) Mentor new team members in understanding customer infrastructure and processes
2) Review and approve RCAs prepared by the team and drive corrective and preventive actions for permanent resolution
3) Review problem tickets for timely closure; measure incident reduction achieved by problem records for showcasing during governance reviews
4) Provide technical leadership for change implementation
5) Review CSI initiatives and aggressively drive time-bound closure to achieve optimization and efficiency goals
6) Drive delivery finance goals to achieve project forecast numbers
7) Work on proposals and PIPs for identified greenfield opportunities to increase revenue

Measures of Outcomes:
1) SLA adherence
2) Time-bound resolution of elevated tickets (OLA)
3) Management of ticket backlog timelines (OLA)
4) Adherence to defined processes (number of NCs in internal/external audits)
5) Number of KB articles created
6) Number of incident and change tickets handled
7) Number of elevated tickets resolved
8) Number of successful change tickets
9) % completion of all mandatory training requirements
10) Overall financial goals of the project
11) % of incident reduction by problem management
12) Number of repeated escalations for the same technical issue

Outputs Expected:
- Resolution/Response: Daily review of resolution and response SLAs for early intervention in SLA management.
- Troubleshooting: Troubleshoot based on available information from previous tickets or by consulting with seniors. Participate in online knowledge forums for reference and convert new steps into KB articles. Perform logical/analytical troubleshooting. Work on problem tickets to identify permanent solutions. Assist and lead technical teams, roping in technology experts for complex issues.
- Escalation/Elevation: Escalate within the organization or to customer peers in case of resolution delays. Define OLAs between delivery layers (L1, L2, L3, etc.) and enforce adherence. Act as SPOC for customer and leadership escalations.
- Tickets Backlog/Resolution: Follow up on tickets based on agreed timelines; manage ticket backlog and last activity as per the defined process. Resolve incidents and SRs within agreed timelines. Execute change tickets for infrastructure.
- Runbook/KB: Review KB compliance and suggest changes. Initiate and drive periodic SOP reviews with customer stakeholders.
- Collaboration: Collaborate with different delivery towers for ticket resolution within SLA; resolve L1 tickets with help from the respective tower. Collaborate with other team members for timely resolution of tickets. Actively participate in team and organization-wide initiatives. Coordinate with UST ISMS teams to resolve connectivity-related issues.
- Stakeholder Management: Lead customer and vendor calls. Organize meetings with different stakeholders. Take ownership of the function's internal communications and related change management.
- Strategic: Define the strategy for data management, policy management, and data retention management. Support definition of the IT strategy for the function's relevant scope and be accountable for ensuring the strategy is tracked, benchmarked, and updated for the area owned.
- Process Adherence: Maintain a thorough understanding of organization- and customer-defined processes. Suggest process improvements and CSI ideas. Adhere to the organization's policies and business conduct.
- Process/Efficiency Improvement: Proactively identify opportunities to increase service levels and mitigate issues in service delivery within or across functions. Take accountability for overall productivity efforts within the function, including coordination of function-specific tasks and close collaboration with Finance.
- Process Implementation: Coordinate and monitor IT process implementation within the function.
- Compliance: Support information governance activities and audit preparations within the function. Act as the function SPOC for IT audits at local sites (including preparation, interface to the local organization, and mitigation of findings) and work closely with ISRM (Information Security Risk Management). Coordinate overall objective setting, preparation, and facilitation to achieve consistent objective setting in the function. Provide coordination support for CSI across all services in CIS and beyond.
- Training: Complete all mandatory training requirements of the organization and customer on time. Provide on-floor training and one-on-one mentorship for new joiners. Complete certifications for respective career paths. Explore cross-training possibilities for improved efficiency and career growth.
- Performance Management: Update FAST goals in NorthStar; track, report, and seek continuous feedback from peers and managers. Set goals for team members and mentees and provide feedback. Assist new team members in understanding the customer environment, day-to-day operations, and people management (e.g., roster, transport, and leaves). Prepare weekly/monthly/quarterly governance review slides. Drive the finance goals of the account.

Skill Examples:
1) Good communication skills (written, verbal, and email etiquette) to interact with different teams and customers
2) Modify/create runbooks based on suggested changes from juniors or newly identified steps
3) Ability to work elevated server tickets to resolution
4) Networking: troubleshooting skills in static and dynamic routing protocols; capable of running NetFlow analyzers across different product lines
5) Server: skills in installing and configuring Active Directory, DNS, DHCP, DFS, IIS, and patch management; excellent troubleshooting skills in technologies such as AD replication and DNS issues; skills in managing high-availability solutions such as failover clustering and VMware clustering
6) Storage and backup: ability to give recommendations to customers; perform storage and backup enhancements and change management; skilled in core fabric technology, storage design, and implementation; hands-on experience with backup and storage command-line interfaces; perform hardware upgrades, firmware upgrades, vulnerability remediation, storage and backup commissioning and decommissioning, and replication setup and management; skilled in server, network, and virtualization technologies and their integration with storage and backup; review technical and architecture diagrams and modify SOPs and documentation based on business requirements; ability to perform ITSM functions for the storage and backup team and review the quality of the ITSM process followed by the team
7) Cloud: skilled in any one of the cloud technologies (AWS, Azure, GCP)
8) Tools: skilled in the administration and configuration of monitoring tools such as CA UIM, SCOM, SolarWinds, Nagios, and ServiceNow; skilled in SQL scripting; skilled in building custom reports on the availability and performance of IT infrastructure based on customer requirements
9) Monitoring: skills in monitoring infrastructure and application components
10) Database: data modeling and database design; database schema creation and management; identification of data integrity violations so that only accurate and appropriate data is entered and maintained; backup and recovery; web-specific technology expertise for e-Biz, Cloud, etc. (examples include XML, CGI, Java, Ruby, firewalls, SSL, and so on); migrating database instances to new hardware and new software versions, from on-premise to cloud-based databases and vice versa
11) Quality analysis: ability to drive service excellence and continuous improvement within the framework defined by IT Operations

Knowledge Examples:
1) Good understanding of customer infrastructure and related CIs
2) ITIL Foundation certification
3) Thorough hardware knowledge
4) Basic understanding of capacity planning
5) Basic understanding of storage and backup
6) Networking: hands-on experience with routers, switches, and firewalls; minimum knowledge of and hands-on experience with BGP; good understanding of load balancers and WAN optimizers; advanced backup and restore knowledge in backup tools
7) Server: basic to intermediate PowerShell/Bash/Python scripting knowledge and demonstrated experience with script-based tasks; knowledge of AD group policy management, group policy tools, and troubleshooting GPOs; basic AD object creation, DNS concepts, DHCP, DFS; knowledge of tools like SCCM and SCOM administration
8) Storage and backup: subject matter expert in any of the storage and backup technologies
9) Tools: proficient in understanding and troubleshooting the Windows and Linux families of operating systems
10) Monitoring: strong knowledge of ITIL processes and functions
11) Database: knowledge of general database management; knowledge of OS, system, and networking skills

Additional Comments:
UST is looking for an SRE (Site Reliability Engineer) to work with one of the leading financial services organizations in the US. The ideal candidate must possess a strong background in DevOps in a cloud (GCP) environment, along with excellent leadership skills and written and verbal communication skills, with the ability to collaborate effectively with domain experts and technical experts in the team.

Responsibilities:
- Participate in release cycles of our offerings, deploying code to non-production and production environments, integrating with continuous integration (CI) and continuous delivery (CD) tools, monitoring, and change management
- Develop and implement appropriate operational procedures to meet timely and high-quality deliverables
- Establish, monitor, and report on operational/production issues, business improvement opportunities, and operational achievements
- Develop standard methodologies and processes to maximize operational output/efficiency in support of business goals
- Respond to and prevent application incidents/outages

Requirements:
- 5+ years of experience in DevOps roles
- Bachelor's degree or equivalent combination of education and related professional experience
- Certification in GCP (Professional Cloud Architect preferred)
- Experience with CI/CD pipeline tools such as GitHub, Jenkins, and Maven
- Experience in Terraform scripting
- Experience deploying and managing infrastructure on public clouds (GCP), including creating Docker images and containers
- Experience working with Kubernetes, Dataproc/Hadoop, or other large-scale cluster management systems
- Experience with PagerDuty, ServiceNow, or other IT Service Management platforms
- Knowledge of Spark, asymmetric/symmetric encryption, and tokenization
- Experience in Linux environments (CentOS)
- Demonstrated ability to script around repeatable tasks
- Demonstrated ability to quickly and accurately troubleshoot system issues
- Excellent written and verbal communication skills, with the ability to communicate with team members at various levels, including business leaders
- A real passion for, and the ability to learn, new technologies

Required Skills: AWS, GCP, Terraform, Jenkins

Posted 1 month ago

Apply

3 - 8 years

5 - 10 Lacs

Bengaluru

Work from Office


Project Role: AI/ML Engineer
Project Role Description: Develops applications and systems that utilize AI to improve performance and efficiency, including but not limited to deep learning, neural networks, chatbots, and natural language processing.
Must-have skills: Google Cloud Machine Learning Services
Good-to-have skills: GCP Dataflow, Google Dataproc, Google Pub/Sub
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full-time education

Key Responsibilities:
A: Implement and maintain data engineering solutions using BigQuery, Dataflow, Vertex AI, Dataproc, and Pub/Sub
B: Collaborate with data scientists to deploy machine learning models (see the sketch below)
C: Ensure the scalability and efficiency of data processing pipelines

Technical Experience:
A: Expertise in BigQuery, Dataflow, Vertex AI, Dataproc, and Pub/Sub
B: Hands-on experience with data engineering in a cloud environment

Professional Attributes:
A: Strong problem-solving skills in optimizing data workflows
B: Effective collaboration with data science and engineering teams

Qualifications: 15 years full-time education
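
A minimal sketch of registering and deploying a trained model with Vertex AI, one of the services listed above. It assumes google-cloud-aiplatform is installed; the project, artifact path, and serving container URI are hypothetical.

```python
from google.cloud import aiplatform

aiplatform.init(project="example-project", location="us-central1")

# Register the model artifact with a prebuilt serving container.
model = aiplatform.Model.upload(
    display_name="churn-model",
    artifact_uri="gs://example-bucket/models/churn/",
    serving_container_image_uri=(
        "us-docker.pkg.dev/vertex-ai/prediction/sklearn-cpu.1-3:latest"
    ),
)

# Deploy to a managed endpoint for online prediction.
endpoint = model.deploy(machine_type="n1-standard-4")
print("endpoint:", endpoint.resource_name)
```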

Posted 1 month ago

Apply

7 - 12 years

13 - 17 Lacs

Gurugram

Work from Office


Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc. Must have Python and SQL work experience. Proactive, collaborative, and able to respond to critical situations. Ability to analyze data for functional business requirements and interface directly with customers.

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise:
- 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform
- Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies
- Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work

Preferred technical and professional experience:
- Intuitive individual with an ability to manage change and proven time management
- Proven interpersonal skills, contributing to team effort by accomplishing related results as needed
- Keeps technical knowledge up to date by attending educational workshops and reviewing publications, focusing on required skills

Posted 1 month ago

Apply

1 - 3 years

3 - 6 Lacs

Bengaluru

Work from Office


Skill required: Record to Report - Invoice Processing
Designation: Record to Report Ops Associate
Qualifications: BCom/MCom
Years of Experience: 1 to 3 years
Language Ability: English (Domestic) - Expert

What would you do?
You will be aligned with our Finance Operations vertical and will help us determine financial outcomes by collecting operational data/reports, conducting analysis, and reconciling transactions. This includes posting journal entries, preparing balance sheet reconciliations, reviewing entries and reconciliations, preparing cash forecasting statements, supporting month-end closing, preparing reports, and supporting audits. Invoice processing refers to the systematic handling and management of incoming invoices within a business or organization. It involves tasks such as verifying the accuracy of the invoice, matching it with purchase orders and delivery receipts, and initiating the payment process. Automated systems and software are often employed to streamline and expedite the invoice processing workflow, improving efficiency and reducing the likelihood of errors.

What are we looking for?
- Google Cloud SQL
- Adaptable and flexible
- Ability to perform under pressure
- Problem-solving skills
- Agility for quick learning
- Commitment to quality

Roles and Responsibilities:
In this role you are required to solve routine problems, largely through precedent and referral to general guidelines. Your expected interactions are within your own team and with your direct supervisor. You will be provided detailed to moderate levels of instruction on daily work tasks and detailed instruction on new assignments. The decisions that you make will impact your own work. You will be an individual contributor as part of a team, with a predetermined, focused scope of work. Please note that this role may require you to work in rotational shifts.

Qualification: BCom, MCom

Posted 1 month ago

Apply

7 - 10 years

16 - 21 Lacs

Mumbai

Work from Office


Position Overview:
The Google Cloud Data Engineering Lead role is ideal for an experienced Google Cloud Data Engineer who will drive the design, development, and optimization of data solutions on the Google Cloud Platform (GCP). The candidate will lead a team of data engineers and collaborate with data scientists, analysts, and business stakeholders to enable scalable, secure, and high-performance data pipelines and analytics platforms.

Key Responsibilities:
- Lead and manage a team of data engineers delivering end-to-end data pipelines and platforms on GCP.
- Design and implement robust, scalable, and secure data architectures using services like BigQuery, Dataflow, Dataproc, Pub/Sub, and Cloud Storage.
- Develop and maintain batch and real-time ETL/ELT workflows using tools such as Apache Beam, Dataflow, or Composer (Airflow); see the sketch below.
- Collaborate with data scientists, analysts, and application teams to gather requirements and ensure data availability and quality.
- Define and enforce data engineering best practices, including version control, testing, code reviews, and documentation.
- Drive automation and infrastructure-as-code approaches using Terraform or Deployment Manager for provisioning GCP resources.
- Implement and monitor data quality, lineage, and governance frameworks across the data platform.
- Optimize query performance and storage strategies, particularly within BigQuery and other GCP analytics tools.
- Mentor team members and contribute to the growth of technical capabilities across the organization.

Qualifications:
- Education: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- Experience: 7+ years of experience in data engineering, including 3+ years working with GCP data services, and proven leadership experience managing and mentoring data engineering teams.
- Skills: Expert-level understanding of BigQuery, Dataflow (Apache Beam), Cloud Storage, and Pub/Sub; strong SQL and Python skills for data processing and orchestration; experience with workflow orchestration tools (Airflow/Composer); hands-on experience with CI/CD, Git, and infrastructure-as-code tools (e.g., Terraform); familiarity with data security, governance, and compliance practices in cloud environments.
- Certifications: GCP Professional Data Engineer certification.
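
A minimal streaming Apache Beam sketch (Pub/Sub to BigQuery with fixed windows), illustrating the real-time ETL work described above. It assumes apache-beam[gcp] is installed; the topic, table, and schema names are hypothetical.

```python
import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromPubSub(topic="projects/example-project/topics/events")
        | "Parse" >> beam.Map(lambda b: json.loads(b.decode("utf-8")))
        | "Window" >> beam.WindowInto(beam.window.FixedWindows(60))  # 1-minute windows
        | "Write" >> beam.io.WriteToBigQuery(
            "example-project:analytics.events",
            schema="event_id:STRING,amount:FLOAT,ts:TIMESTAMP",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```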

Posted 1 month ago

Apply

7 - 11 years

20 - 30 Lacs

Chennai, Bengaluru

Work from Office

Naukri logo

Role & responsibilities
Role:
6+ years of hands-on experience in software development.
Strong in algorithms and data structures; knowledge of object-oriented design, design patterns, and multi-threaded programming.
Strong troubleshooting, debugging, and analytical skills.
Google Cloud development experience.

Mandatory Skills:
Google Cloud BigQuery, Dataproc
SQL skills (BigQuery, Hive, and Spark)
Spark job debugging and tuning
Python, Java, PySpark
Good architecture skills to understand end-to-end flows/data pipelines

Secondary Skills:
Client stakeholder management
Certification in relevant skills

Responsibilities:
The candidate must have a complete understanding of, and hands-on experience in, the design, coding, and testing aspects.
Design, implement, and support multi-tier software applications; document and test systems; modify as necessary.
Work independently and with other engineers and Ops teams; design and develop strategic and tactical processes; create solutions that meet business requirements.
Demonstrate clear thought processes in solving business and technical problems.
Design and code should be easy to maintain, available, performant, and reusable across a sub-system or feature. Code may persist for the lifetime of a software version.
Deliver very high-quality code that is thoroughly tested and supported by unit tests written in Golang for flawless execution.
Can be relied on to deliver features and sub-systems on time and to requirements.
Works well within a team and contributes effectively to the success of those they interact with regularly.
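
As a rough illustration of the Spark work listed above, the following PySpark sketch reads a Hive table, aggregates it, and writes to BigQuery via the spark-bigquery connector available on Dataproc clusters; all table, project, and bucket names are hypothetical:

```python
# A sketch of a PySpark job as it might run on Dataproc: read a Hive table,
# aggregate, and write to BigQuery. Names below are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("daily-orders-rollup")
    .enableHiveSupport()                              # needed to read Hive tables
    .config("spark.sql.shuffle.partitions", "200")    # a common tuning knob
    .getOrCreate()
)

orders = spark.table("warehouse.orders")              # hypothetical Hive table

daily = (
    orders
    .where(F.col("order_date") >= "2024-01-01")
    .groupBy("order_date", "region")
    .agg(F.sum("amount").alias("revenue"), F.count("*").alias("order_count"))
)

(
    daily.write.format("bigquery")
    .option("table", "my-project.analytics.daily_orders")  # placeholder table
    .option("temporaryGcsBucket", "my-temp-bucket")         # staging bucket the connector needs
    .mode("overwrite")
    .save()
)
```

Tuning a job like this usually starts with the Spark UI: skewed stages, shuffle sizes, and partition counts are where the "Spark jobs debugging and tuning" skill above typically gets exercised.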

Posted 1 month ago

Apply

4 - 7 years

6 - 16 Lacs

Bengaluru

Work from Office

Naukri logo

Senior Software Engineer – Google Cloud Platform (GCP)
Location: Bangalore, India

Why Join Fossil Group?
At Fossil Group, we are part of an international team that dares to dream, disrupt, and deliver innovative watches, jewelry, and leather goods to the world. We're committed to long-term value creation, driven by technology and our core values: Authenticity, Grit, Curiosity, Humor, and Impact. If you are a forward-thinker who thrives in a diverse, global setting, we want to hear from you.

Make an Impact (Job Summary + Responsibilities)
We are looking for a Senior Software Engineer – GCP to join our growing Cloud & Data Engineering team at Fossil Group. This role involves building scalable cloud-native data pipelines using Google Cloud Platform services, with a focus on Dataflow, Dataproc, and BigQuery, and strong development skills in Java, Python, and SQL. You will collaborate with global architects and business teams to design and deploy innovative solutions, supporting data analytics, automation, and transformation needs.

What you will do in this role:
Design, develop, deploy, and maintain data pipelines and services using GCP technologies including Dataflow, Dataproc, BigQuery, Composer, and others.
Translate blueprinting documents and business requirements into scalable and maintainable GCP configurations and solutions.
Develop and enhance cloud-based batch/streaming jobs using Java or Python.
Collaborate with global architects and cross-functional teams to define solutions and execute projects across development and testing environments.
Perform unit testing and integration testing, and resolve issues arising during the QA lifecycle.
Work closely with internal stakeholders to gather requirements, present technical solutions, and provide end-to-end delivery.
Own and manage project timelines, priorities, and documentation.
Continuously improve processes and stay current with GCP advancements and big data technologies.

Who You Are (Requirements)
Bachelor's degree in Computer Science or a related field.
4-7 years of experience as a DB/SQL developer or Java/Python developer with strong SQL capabilities.
Hands-on experience with GCP components such as Dataflow, Dataproc, BigQuery, Cloud Functions, Composer, and Data Fusion.
Excellent command of SQL, with the ability to write complex queries and perform advanced data transformation.
Strong programming skills in Java and/or Python, specifically for building cloud-native data pipelines.
Experience with relational and NoSQL databases.
Ability to understand and translate business requirements into functional specifications.
Familiarity with BI dashboards and Google Data Studio is a plus.
Strong problem-solving, communication, and collaboration skills.
Self-directed, with a growth mindset and eagerness to upskill in emerging GCP technologies.
Comfortable leading meetings, gathering requirements, and managing stakeholder communication across regions.

What We Offer
Comprehensive Benefits: Includes health and well-being services.
Paid Parental Leave & Return to Work Program: Support for new parents and caregivers with paid leave and a flexible phase-back schedule.
Generous Paid Time Off: Includes sick time, personal days, and Summer Flex Fridays.
Employee Discounts: Save on Fossil merchandise.

EEO Statement
At Fossil, we believe our differences not only make us stronger as a team, but also help us create better products and a richer community. We are an Equal Employment Opportunity Employer dedicated to a policy of non-discrimination in all employment practices without regard to age, disability, gender identity or expression, marital status, pregnancy, race, religion, sexual orientation, or any other protected characteristic.
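
For a flavor of the BigQuery development involved, here is a minimal parameterized-query sketch using the google-cloud-bigquery Python client; the project, dataset, and table names are illustrative, not Fossil's actual schema:

```python
# A minimal sketch of a parameterized BigQuery query from Python.
# Project, dataset, and table names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project

query = """
    SELECT sku, SUM(quantity) AS units_sold
    FROM `my-project.sales.order_lines`
    WHERE order_date >= @start_date
    GROUP BY sku
    ORDER BY units_sold DESC
    LIMIT 10
"""

job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("start_date", "DATE", "2024-01-01"),
    ]
)

# Run the query and iterate the result rows.
for row in client.query(query, job_config=job_config).result():
    print(f"{row.sku}: {row.units_sold}")
```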

Posted 1 month ago

Apply

5 - 9 years

10 - 20 Lacs

Pune, Bengaluru, Delhi / NCR

Hybrid

Naukri logo

Mandatory Skills: GCS, Composer, BigQuery, Azure, Azure Databricks, ADLS.
Experience: 5-10 years

Good to have skills:
Knowledge of CDP (customer data platform)
Airline domain knowledge

Responsibilities:
Design and develop data-ingestion frameworks, real-time processing solutions, and data processing and transformation frameworks leveraging open-source tools and data processing frameworks.
Hands-on with technologies such as Kafka, Apache Spark (SQL, Scala, Java), Python, the Hadoop platform, Hive, and Airflow.
Experience in GCP Cloud Composer, BigQuery, and Dataproc.
Offer system support as part of a support rotation with other team members.
Operationalize open-source data-analytic tools for enterprise use.
Ensure data governance policies are followed by implementing or validating data lineage, quality checks, and data classification.
Understand and follow the company development lifecycle to develop, deploy, and deliver the solutions.

Minimum Qualifications:
Bachelor's degree in Computer Science, CIS, or a related field
5-7 years of IT experience in software engineering or a related field
Experience on projects involving the implementation of software development life cycles (SDLC)
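
To illustrate the Composer orchestration mentioned above, here is a minimal Airflow DAG sketch that submits a PySpark job to an existing Dataproc cluster; the project, cluster, and GCS URIs are placeholders:

```python
# A sketch of a Cloud Composer (Airflow) DAG that submits a PySpark job to
# an existing Dataproc cluster. All names and URIs are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.dataproc import (
    DataprocSubmitJobOperator,
)

PROJECT_ID = "my-project"        # hypothetical project
REGION = "us-central1"
CLUSTER_NAME = "etl-cluster"     # assumes the cluster already exists

PYSPARK_JOB = {
    "reference": {"project_id": PROJECT_ID},
    "placement": {"cluster_name": CLUSTER_NAME},
    "pyspark_job": {"main_python_file_uri": "gs://my-bucket/jobs/ingest.py"},
}

with DAG(
    dag_id="daily_ingest",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    submit_spark = DataprocSubmitJobOperator(
        task_id="submit_ingest_job",
        project_id=PROJECT_ID,
        region=REGION,
        job=PYSPARK_JOB,
    )
```

In practice the DAG file is synced to the Composer environment's DAGs bucket, and the operator polls the Dataproc job until it completes or fails.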

Posted 1 month ago

Apply

2 - 5 years

13 - 17 Lacs

Bengaluru

Work from Office

Naukri logo

Skilled in multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Ability to analyse data for functional business requirements and interface directly with the customer.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform.
Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer.
You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
An ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.

Preferred technical and professional experience:
An intuitive individual with an ability to manage change and proven time management.
Proven interpersonal skills while contributing to team effort by accomplishing related results as needed.
Keeps technical knowledge up to date by attending educational workshops and reviewing publications.
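
As a small example of working with one of the listed services, here is a minimal google-cloud-pubsub sketch that pulls messages from a subscription; the project and subscription names are hypothetical:

```python
# A minimal sketch of a Pub/Sub streaming-pull subscriber.
# Project and subscription names are placeholders.
from concurrent.futures import TimeoutError

from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path("my-project", "events-sub")


def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    # Process the message payload, then acknowledge it.
    print(f"Received: {message.data.decode('utf-8')}")
    message.ack()


# Listen for messages; block for up to 30 seconds, then shut down cleanly.
future = subscriber.subscribe(subscription_path, callback=callback)
try:
    future.result(timeout=30)
except TimeoutError:
    future.cancel()
    future.result()
```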

Posted 1 month ago

Apply

2 - 6 years

8 - 12 Lacs

Pune

Work from Office

Naukri logo

About The Role:
Job Title: Senior Engineer - Data Engineer (ETL, Big Data, Hadoop, Spark, GCP), AVP
Location: Pune, India

Role Description
The senior engineer is responsible for developing and delivering elements of engineering solutions to accomplish business goals. Awareness of the bank's important engineering principles is expected. Root cause analysis skills are developed through addressing enhancements and fixes to products; reliability and resiliency are built into solutions through early testing, peer reviews, and automating the delivery life cycle. The successful candidate should be able to work independently on medium to large sized projects with strict deadlines, in a cross-application, mixed technical environment, and must demonstrate a solid hands-on development track record while working in an agile methodology. The role demands working alongside a geographically dispersed team. The position is required as part of the buildout of the Compliance tech internal development team in India. The overall team will primarily deliver improvements in compliance tech capabilities that are major components of the regular regulatory portfolio, addressing various regulatory commitments, from common commitments to mandates from monitors.

What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
Best in class leave policy
Gender neutral parental leaves
100% reimbursement under childcare assistance benefit (gender neutral)
Sponsorship for industry-relevant certifications and education
Employee Assistance Program for you and your family members
Comprehensive hospitalization insurance for you and your dependents
Accident and term life insurance
Complimentary health screening for those 35 yrs. and above

Your key responsibilities
Analyze data sets; design and code stable and scalable data ingestion workflows, integrating them into existing workflows.
Work with team members and stakeholders to clarify requirements and provide the appropriate ETL solution.
Work as a senior developer developing analytics algorithms on top of ingested data.
Work as a senior developer for various data sourcing efforts in Hadoop and GCP.
Own unit testing, UAT deployment, end-user sign-off, and production go-live.
Ensure new code is tested at both the unit and system level; design, develop, and peer review new code and functionality.
Operate as a member of an agile scrum team.
Apply root cause analysis skills to identify bugs and issues behind failures.
Support production support and release management teams in their tasks.

Your skills and experience
10+ years of coding experience in reputed organizations
Hands-on experience with Bitbucket and CI/CD pipelines
Proficient in Hadoop, Python, Spark, SQL, Unix, and Hive
Basic understanding of on-prem and GCP data security
Hands-on development experience on large ETL/big data systems; GCP is a big plus
Hands-on experience with Cloud Build, Artifact Registry, Cloud DNS, Cloud Load Balancing, etc.
Hands-on experience with Dataflow, Cloud Composer, Cloud Storage, Dataproc, etc.
Basic understanding of data quality dimensions such as consistency, completeness, accuracy, and lineage (a sketch follows this listing)
Hands-on business and systems knowledge gained in a regulatory delivery environment
Desired: banking experience with regulatory and cross-product knowledge; a passion for test-driven development; prior experience with release management tasks and responsibilities; data visualization experience is good to have.
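
To make those data quality dimensions concrete, here is a sketch of simple completeness, accuracy, and consistency checks expressed in PySpark; the input path and column names are illustrative, not the bank's actual schema:

```python
# A sketch of basic data quality checks (completeness, accuracy, consistency)
# as simple PySpark assertions. Paths and columns are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()

df = spark.read.parquet("gs://my-bucket/curated/trades/")  # placeholder path

total = df.count()

# Completeness: key columns must not be null.
null_keys = df.filter(F.col("trade_id").isNull()).count()

# Accuracy: amounts must be non-negative.
bad_amounts = df.filter(F.col("amount") < 0).count()

# Consistency: trade_id must be unique.
dupes = total - df.select("trade_id").distinct().count()

failures = {
    "null_trade_id": null_keys,
    "negative_amount": bad_amounts,
    "duplicate_trade_id": dupes,
}
failed = {name: count for name, count in failures.items() if count > 0}
if failed:
    raise ValueError(f"Data quality checks failed: {failed}")
print(f"All checks passed on {total} rows")
```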
How we'll support you
Training and development to help you excel in your career.
Coaching and support from experts in your team.
A culture of continuous learning to aid progression.
A range of flexible benefits that you can tailor to suit your needs.

Posted 1 month ago

Apply