962 BigQuery Jobs - Page 5

JobPe aggregates listings for easy access to applications, but you apply directly on the original job portal.

2.0 - 5.0 years

7 - 11 Lacs

Mumbai, Gurugram, Bengaluru

Work from Office

With its increasing global client footprint, advertisers from all app categories trust RevX to deliver incremental value every day in achieving their growth objectives. To enhance the quality, scope, and scale of our delivery capabilities, RevX is looking to expand its campaign management/data analytics function with talented team players who share a passion for business ownership, advertising data, programmatic performance, and holistic collaboration with internal/external stakeholders. Are you passionate about great mobile advertising, curious about making data speak, and proud of being customer-centric?

What you should excel at:
- Spearhead the daily achievement of delivery targets on our programmatic DSP platform for an allocated portfolio of campaigns (P&L target achievement).
- Monitor the performance and scale of campaigns, including regular in-depth data analysis, granular optimization, and reporting.
- Understand, learn, and expand your space of action across different advertiser app verticals, programmatic supply, data integration, deep linking, and dynamic creatives.
- Take initiative and collaborate with multiple functions, including Sales, AM, TSE, Design, Supply, Engineering, Product Management, and Finance.
- Work with Product Management, Tech, and Data Science to provide pragmatic feedback, drive development through qualified input, and ensure smooth testing and feature roll-out.

What prepares and aligns you for this job:
- You love to solve riddles and never give up over technicalities.
- Working within a small task team that takes full responsibility for clients and their growth excites and empowers you to achieve more.
- Holistic knowledge of app performance advertising and the RTB environment.
- Advanced data interpretation and analysis skills: able to look at large data sets with ease and extract insights quickly and regularly.
- Deep understanding of MMPs, SSPs, and other ecosystem players.
- Ability to work well independently while being highly responsive to stakeholders.
- Results-oriented with great attention to detail; a strong analytical, creative, and independent problem solver.
- Strong process, organisational, and time management skills, as well as strong written and verbal communication skills.
- Comfortable using MS Excel, Google Sheets, pivot tables, BigQuery, SQL, Python, and other tools to analyse and manipulate data (see the sketch below).
- Fluency in English (spoken and written) required.
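For a flavor of the day-to-day analysis this role describes, here is a minimal pandas sketch of campaign KPI reporting. The column names and figures are hypothetical, not RevX's actual schema.

```python
# Illustrative only: campaign-performance KPIs with pandas.
# Columns (spend, installs, revenue) are hypothetical.
import pandas as pd

df = pd.DataFrame({
    "campaign": ["A", "A", "B", "B"],
    "spend":    [120.0, 90.0, 200.0, 150.0],
    "installs": [300, 250, 410, 380],
    "revenue":  [180.0, 140.0, 260.0, 210.0],
})

summary = df.groupby("campaign").sum(numeric_only=True)
summary["cpi"] = summary["spend"] / summary["installs"]   # cost per install
summary["roas"] = summary["revenue"] / summary["spend"]   # return on ad spend
print(summary)
```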

Posted 1 week ago

Apply

8.0 - 13.0 years

25 - 37 Lacs

Hyderabad

Work from Office

Position: Data Architect (GCP)
Experience: 10+ years

Roles and Responsibilities:
- 10+ years of relevant work experience, including previous experience leading data projects in reporting and analytics.
- Design, build, and maintain scalable data lakes and data warehouses in the cloud (GCP).
- Expertise in gathering business requirements, analysing business needs, and defining the BI/DW architecture to support and deliver technical solutions to complex business and technical requirements.
- Create solution prototypes and participate in technology selection; perform POCs and technical presentations.
- Architect, develop, and test scalable data warehouse and data pipeline architectures in cloud technologies (GCP).
- Experience in SQL and NoSQL DBMSs such as MS SQL Server, MySQL, PostgreSQL, DynamoDB, Cassandra, and MongoDB.
- Design and develop scalable ETL processes, including error handling (see the sketch below).
- Expert in query and programming languages: MS SQL Server, T-SQL, PostgreSQL, MySQL, Python, R.
- Prepare data structures for advanced analytics and self-service reporting using MS SQL, SSIS, SSRS.
- Write scripts for stored procedures, database snapshot backups, and data archiving.
- Experience with any of these cloud-based technologies:
  - Power BI/Tableau, Azure Data Factory, Azure Synapse, Azure Data Lake
  - AWS Redshift, Glue, Athena, AWS QuickSight, Google Cloud Platform

Good to have:
- Agile development environment pairing DevOps with CI/CD pipelines
- AI/ML background

Interested aspirants can share an updated CV to dikshith.nalapatla@motivitylabs.com for a quick response.
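As a rough illustration of the "scalable ETL with error handling" bullet above, here is a minimal sketch using the google-cloud-bigquery client. The project, dataset, and table names are hypothetical, and a production pipeline would add retries, logging, and archiving.

```python
# A minimal BigQuery transform-and-load step with error handling.
# Assumes Application Default Credentials; table names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()

sql = """
    SELECT order_id, SUM(amount) AS total
    FROM `my_project.sales.orders`      -- hypothetical source table
    GROUP BY order_id
"""

job_config = bigquery.QueryJobConfig(
    destination="my_project.sales.order_totals",  # hypothetical target
    write_disposition="WRITE_TRUNCATE",
)

try:
    client.query(sql, job_config=job_config).result()  # wait for completion
except Exception as exc:
    # Real ETL would route this to alerting / dead-letter handling.
    print(f"Load step failed: {exc}")
```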

Posted 1 week ago

Apply

9.0 - 10.0 years

12 - 14 Lacs

Hyderabad

Work from Office

Responsibilities: * Design, develop & maintain data pipelines using Airflow/Dataflow/Data Lake (a minimal DAG sketch follows below) * Optimize performance & scalability of ETL processes with SQL & Python
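As a reference point for the Airflow work above, a minimal DAG sketch; the DAG id, schedule, and task callables are hypothetical placeholders.

```python
# Minimal two-task Airflow DAG (extract >> load).
# Note: `schedule` is `schedule_interval` on Airflow versions before 2.4.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull rows from source")    # placeholder for real extraction

def load():
    print("write rows to warehouse")  # placeholder for real load

with DAG(
    dag_id="example_etl",             # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="extract", python_callable=extract) >> \
        PythonOperator(task_id="load", python_callable=load)
```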

Posted 1 week ago

Apply

8.0 - 12.0 years

20 - 30 Lacs

Hyderabad

Work from Office

The ideal candidate will have extensive experience with Google Cloud Platform's data services, building scalable data pipelines, and implementing modern data architecture solutions.

Key Responsibilities:
- Design and implement data lake solutions using GCP Storage and Storage Transfer Service (see the load sketch below)
- Develop and maintain ETL/ELT pipelines for data processing and transformation
- Orchestrate complex data workflows using Cloud Composer (managed Apache Airflow)
- Build and optimize BigQuery data models and implement data governance practices
- Configure and maintain Dataplex for unified data management across the organization
- Implement monitoring solutions using Cloud Monitoring to ensure data pipeline reliability
- Create and maintain data visualization solutions using Looker for business stakeholders
- Collaborate with data scientists and analysts to deliver high-quality data products

Required Skills & Experience:
- 8+ years of hands-on experience with GCP data services, including:
  - Cloud Storage and Storage Transfer Service for data lake implementation
  - BigQuery for data warehousing and analytics
  - Cloud Composer for workflow orchestration
  - Dataplex for data management and governance
  - Cloud Monitoring for observability and alerting
- Strong experience with ETL/ELT processes and data pipeline development
- Proficiency in SQL and at least one programming language (Python preferred)
- Experience with Looker or similar BI/visualization tools
- Knowledge of data modeling and dimensional design principles
- Experience implementing data quality monitoring and validation

Preferred Qualifications:
- Google Cloud Professional Data Engineer certification
- Experience with streaming data processing using Dataflow or Pub/Sub
- Knowledge of data mesh or data fabric architectures
- Experience with dbt or similar transformation tools
- Familiarity with CI/CD practices for data pipelines
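As one concrete example of the ETL/ELT building blocks listed above, a hedged sketch of loading data-lake files from Cloud Storage into BigQuery with the google-cloud-bigquery client; the bucket path and table id are hypothetical.

```python
# Load Parquet files from a GCS data lake into a BigQuery table.
# Bucket, path, and table id below are hypothetical examples.
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    write_disposition="WRITE_APPEND",
)

load_job = client.load_table_from_uri(
    "gs://my-data-lake/events/*.parquet",   # hypothetical lake path
    "my_project.analytics.events",          # hypothetical target table
    job_config=job_config,
)
load_job.result()  # block until the load job completes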

Posted 1 week ago

Apply

5.0 - 10.0 years

5 - 15 Lacs

Chennai

Work from Office

Role & Responsibilities:
- 5+ years of hands-on experience in data engineering or production support
- Strong technical expertise in IBM DataStage for building and troubleshooting ETL jobs
- Proven experience with Google Cloud Platform (GCP), especially BigQuery, Cloud Storage, and other native tools
- Advanced SQL skills and a solid understanding of data warehouse concepts
- Practical knowledge of incident, change, and problem management frameworks (ITIL preferred)
- Familiarity with job orchestration tools such as Control-M, AutoSys, or Cloud Composer
- Scripting experience (Shell, Python) for automation and diagnostics

Posted 1 week ago

Apply

5.0 - 7.0 years

7 - 9 Lacs

Bengaluru

Work from Office

Manager, Business Analytics - Bengaluru, India

The Opportunity: Are you a data-driven storyteller with a passion for transforming raw information into actionable insights that drive tangible business outcomes? Do you thrive on collaborating directly with business stakeholders to understand their needs and then architecting elegant data solutions? If so, we have an exciting opportunity for a highly skilled and motivated Manager-level Business Analytics professional to join our growing team in Bangalore. In this role, you will be instrumental in empowering our business users with the data and visualizations they need to make informed decisions, and in analyzing and driving improvements to our Finance operational performance, business decisions, and strategy. You will drive the analytics lifecycle, from initial consultation to the delivery of impactful dashboards and data sets. Please note: this role operates on PST, so you will work from 5pm to 2am IST.

What You'll Do:
- Strategic Alignment: Align analytics initiatives with key business objectives and contribute to the development of data-driven strategies that lead to measurable improvements.
- Become a Trusted Advisor: Partner closely with business users across various departments to understand their strategic objectives, identify their analytical requirements, and translate those needs into clear and actionable data and reporting solutions.
- Consultative Analysis: Engage with stakeholders to explore their business questions, guide them on appropriate analytical approaches, and help them define key metrics and performance indicators (KPIs).
- Data Architecture & Design: Partner with our Data and Insights team to design and develop robust and efficient data models and datasets optimized for visualization and analysis, ensuring data accuracy and integrity.
- Expert Tableau Development: Leverage your deep expertise in Tableau to create intuitive, interactive, and visually compelling dashboards and reports that effectively communicate key insights and trends.
- Data Wrangling & Transformation: Utilize Fivetran and/or Python scripting to extract, transform, and load data from various sources into our data warehouse or analytics platforms (see the sketch below).
- End-to-End Ownership: Take full ownership of the analytics projects you lead, from initial scoping and data acquisition to dashboard deployment, user training, and ongoing maintenance.
- Drive Data Literacy: Educate and empower business users to effectively utilize dashboards and data insights to drive business outcomes.
- Stay Ahead of the Curve: Continuously explore new data visualization techniques, analytical methodologies, and data technologies to enhance our analytics capabilities.
- Collaborate and Communicate: Effectively communicate complex analytical findings and recommendations to stakeholders; lead cross-functional collaborations to achieve project goals.
- Data Governance & Quality: Ensure data accuracy, consistency, and integrity in all developed datasets and dashboards, contributing to data governance efforts.
- Performance Monitoring & Iteration: Monitor the performance and user adoption of developed dashboards, gather feedback, and implement revisions for continuous improvement.
- Documentation & Training: Develop comprehensive documentation for created dashboards and datasets; provide training and support to business users to ensure effective utilization of analytics tools.

What You'll Bring:
- 5-7+ years of experience in a Business Analytics, Data Analytics, or similar role with increasing responsibility.
- Proven experience working directly with business stakeholders to understand their needs and deliver data-driven solutions.
- Expert-level proficiency in Tableau, including advanced calculations, parameters, actions, and performance optimization; strong hands-on experience building and optimizing data sets for Tableau.
- Solid experience with data integration tools, preferably Fivetran, and the ability to design and implement data pipelines.
- Proficiency in Python for data manipulation, cleaning, and transformation (e.g., using libraries like pandas).
- Strong understanding of data warehousing principles and experience with platforms such as Snowflake or Redshift.
- Advanced SQL skills for data extraction, transformation, and querying.
- Excellent problem-solving and analytical skills with strong attention to detail and the ability to translate business questions into analytical frameworks.
- Exceptional written and verbal communication skills, with the ability to present complex data insights effectively to both technical and non-technical audiences.
- Proven ability to manage multiple analytics projects simultaneously, prioritize tasks, and meet deadlines.
- Excellent collaboration and interpersonal skills with the ability to build strong working relationships with business stakeholders.
- Bachelor's degree in a quantitative field such as Engineering, Finance/Accounting/Business, Economics, Statistics, Mathematics, Computer Science, or a related discipline (Master's degree a plus).

Bonus Points For:
- Experience in the SaaS software industry (highly preferred).
- Experience with other data visualization tools (e.g., Power BI, Looker).
- Familiarity with cloud-based data platforms (e.g., BigQuery).
- Familiarity with basic statistical concepts and methodologies.
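To illustrate the Python data-wrangling described above, a small pandas sketch that cleans and aggregates a raw extract before it feeds Tableau; file and column names are hypothetical.

```python
# Clean and reshape a raw extract into a monthly summary for Tableau.
# "bookings_raw.csv" and its columns are hypothetical examples.
import pandas as pd

raw = pd.read_csv("bookings_raw.csv", parse_dates=["booking_date"])

clean = (
    raw.dropna(subset=["account_id"])                  # drop unusable rows
       .assign(amount=lambda d: d["amount"].fillna(0))
       .query("booking_date >= '2024-01-01'")
)

monthly = (
    clean.groupby([pd.Grouper(key="booking_date", freq="MS"), "region"])
         ["amount"].sum()
         .reset_index()
)
monthly.to_csv("bookings_monthly.csv", index=False)    # feed to Tableau
```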

Posted 1 week ago

Apply

6.0 - 11.0 years

18 - 25 Lacs

Hyderabad

Work from Office

SUMMARY: Data Modeling Professional
Location: Hyderabad/Pune
Experience: The ideal candidate should possess at least 6 years of relevant experience in data modeling, with proficiency in SQL, Python, PySpark, Hive, ETL, Unix, and Control-M (or similar scheduling tools), along with GCP.

Key Responsibilities:
- Develop and configure data pipelines across various platforms and technologies (a PySpark sketch follows below).
- Write complex SQL queries for data analysis on databases such as SQL Server, Oracle, and Hive.
- Create solutions to support AI/ML models and generative AI.
- Work independently on specialized assignments within project deliverables.
- Provide solutions and tools to enhance engineering efficiency.
- Design processes, systems, and operational models for end-to-end execution of data pipelines.

Preferred Skills: Experience with GCP, particularly Airflow, Dataproc, and BigQuery, is advantageous.

Requirements:
- Minimum 6 years of experience in data modeling with SQL, Python, PySpark, Hive, ETL, Unix, and Control-M (or similar scheduling tools).
- Proficiency in writing complex SQL queries for data analysis.
- Strong problem-solving and analytical abilities.
- Excellent communication and presentation skills; the ability to communicate efficiently at a global level is paramount.
- Ability to deliver high-quality materials against tight deadlines and to work effectively under pressure with rapidly changing priorities.
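As a sketch of the pipeline development this posting describes, a minimal PySpark job that reads raw data, derives a daily aggregate, and writes a partitioned table; paths and columns are hypothetical.

```python
# Read raw orders, aggregate per day/customer, write partitioned Parquet.
# All paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily_orders_model").getOrCreate()

orders = spark.read.parquet("/data/raw/orders")        # hypothetical path

daily = (
    orders.withColumn("order_date", F.to_date("order_ts"))
          .groupBy("order_date", "customer_id")
          .agg(F.sum("amount").alias("daily_amount"))
)

daily.write.mode("overwrite").partitionBy("order_date") \
     .parquet("/data/curated/daily_orders")            # Hive-style layout
```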

Posted 1 week ago

Apply

2.0 - 7.0 years

1 - 6 Lacs

Hyderabad, Qatar

Work from Office

SUMMARY: Exciting job opportunity as a Registered Nurse in Qatar (Homecare)

Key Responsibilities:
- Develop and assess nursing care plans
- Monitor vital signs and assess holistic patient needs
- Collaborate with physicians, staff nurses, and healthcare team members
- Administer oral and subcutaneous medications while ensuring safety
- Document nursing care, medications, and procedures using the company's Nurses Buddy application
- Conduct client assessment and reassessment using approved tools
- Attend refresher training courses, seminars, and training

Timeline for Migration:
- Application to selection: not more than 5 days
- DataFlow & Prometric: 1 month
- Visa processing: 1-2 months
- Start working in Qatar within 3 months!

Requirements:
- Educational qualification: Bachelor's Degree in Nursing or GNM
- Experience: minimum 2 years of working experience as a Nurse post registration
- Citizenship: Indian
- Age limit: below 45 years
- Certification: registration certificate from a Nursing Council
- Language: basic English proficiency required
- Technical skills: bedside nursing, patient care, patient assessment and monitoring

Benefits:
- High salary & perks: earn 5,000 QAR/month (1,18,000 INR/month)
- Tax benefit: no tax deduction on salary
- Career growth: an advanced nursing career in Qatar with competitive salaries, cutting-edge facilities, and opportunities for specialization
- Relocation support: visa process and flight sponsored; free accommodation and transportation provided
- International work experience: boost your resume with international healthcare expertise
- Comprehensive health insurance: medical coverage under Qatar's healthcare system
- Safe and stable environment: Qatar is known for its low crime rate, political stability, and high quality of life; its strict laws make it one of the safest places to live
- Faster visa processing: with efficient government procedures, work visas for nurses are processed quickly, reducing waiting times
- Simplified licensing process: compared to other countries, Qatar offers a streamlined process for obtaining a nursing license through the QCHP (Qatar Council for Healthcare Practitioners)
- Direct hiring opportunities: many hospitals and healthcare facilities offer direct recruitment, minimizing third-party delays and complications

Limited slots available! Apply now to secure your place in the next batch of nurses migrating to Qatar!

Posted 1 week ago

Apply

4.0 - 9.0 years

10 - 14 Lacs

Hyderabad

Work from Office

Understanding of design, configuring infrastructure based on provided designs, and managing GCP infrastructure using Terraform.
- Automate the provisioning, configuration, and management of GCP resources, including Compute Engine, Cloud Storage, Cloud SQL, Spanner, Kubernetes Engine (GKE), and serverless offerings like Cloud Functions and Cloud Run.
- Manage and configure GCP service accounts, IAM roles, and permissions to ensure secure access to resources.
- Implement and manage load balancers (HTTP(S), TCP/UDP) for high availability and scalability.
- Develop and maintain CI/CD pipelines using Cloud Build, GitHub Actions, or similar tools.
- Monitor and optimize the performance and availability of our GCP infrastructure.

Primary Skills: Terraform, CI/CD pipelines, IaC, Docker, Kubernetes
Secondary Skills: AWS, Azure, GitHub

Posted 1 week ago

Apply

4.0 - 9.0 years

4 - 8 Lacs

Chennai

Work from Office

Your Role: As a senior software engineer with Capgemini, you should have 4+ years of experience as a Snowflake Data Engineer with a strong project track record. In this role you will bring:
- Strong customer orientation, decision making, problem solving, communication, and presentation skills
- Very good judgement and the ability to shape compelling solutions and solve unstructured problems with assumptions
- Very good collaboration skills and the ability to interact with multi-cultural and multi-functional teams spread across geographies
- Strong executive presence and spirit
- Superb leadership and team-building skills, with the ability to build consensus and achieve goals through collaboration rather than direct line authority

Your Profile:
- 4+ years of experience in data warehousing and cloud data solutions.
- Minimum 2+ years of hands-on experience with end-to-end Snowflake implementation.
- Experience developing data architecture and roadmap strategies, with the knowledge to establish data governance and quality frameworks within Snowflake.
- Expertise or strong knowledge of Snowflake best practices, performance tuning, and query optimisation.
- Experience with cloud platforms like AWS or Azure and familiarity with Snowflake's integration with these environments; strong knowledge of at least one cloud (AWS or Azure) is mandatory.

Skills (competencies): Ab Initio, Agile (Software Development Framework), Apache Hadoop, AWS Airflow, AWS Athena, AWS CodePipeline, AWS EFS, AWS EMR, AWS Redshift, AWS S3, Azure ADLS Gen2, Azure Data Factory, Azure Data Lake Storage, Azure Databricks, Azure Event Hub, Azure Stream Analytics, Azure Synapse, Bitbucket, Change Management, Client Centricity, Collaboration, Continuous Integration and Continuous Delivery (CI/CD), Data Architecture Patterns, Data Format Analysis, Data Governance, Data Modeling, Data Validation, Data Vault Modeling, Database Schema Design, Decision-Making, DevOps, Dimensional Modeling, GCP Bigtable, GCP BigQuery, GCP Cloud Storage, GCP Dataflow, GCP Dataproc, Git, Google Bigtable, Google Dataproc, Greenplum, HQL, IBM DataStage, IBM DB2, Industry Standard Data Modeling (FSLDM), Industry Standard Data Modeling (IBM FSDM), Influencing, Informatica IICS, Inmon methodology, JavaScript, Jenkins, Kimball, Linux - Red Hat, Negotiation, Netezza, NewSQL, Oracle Exadata, Performance Tuning, Perl, Platform Update Management, Project Management, PySpark, Python, R, RDD Optimization, CentOS, SAS, Scala, Spark, Shell Script, Snowflake, Spark Code Optimization, SQL, Stakeholder Management, Sun Solaris, Synapse, Talend, Teradata, Time Management, Ubuntu, Vendor Management

Posted 1 week ago

Apply

4.0 - 9.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Your Role: As a senior software engineer with Capgemini, you should have 4+ years of experience as an Azure Data Engineer with a strong project track record. In this role you will bring:
- Strong customer orientation, decision making, problem solving, communication, and presentation skills
- Very good judgement and the ability to shape compelling solutions and solve unstructured problems with assumptions
- Very good collaboration skills and the ability to interact with multi-cultural and multi-functional teams spread across geographies
- Strong executive presence and spirit
- Superb leadership and team-building skills, with the ability to build consensus and achieve goals through collaboration rather than direct line authority

Your Profile:
- Experience with Azure Databricks and Azure Data Factory
- Experience with Azure data components such as Azure SQL Database, Azure SQL Data Warehouse, and Synapse Analytics
- Experience in Python/PySpark/Scala/Hive programming
- Experience building CI/CD pipelines in data environments

Primary Skills: ADF (Azure Data Factory) or ADB (Azure Databricks)
Secondary Skills: Excellent verbal and written communication and interpersonal skills

Skills (competencies): Ab Initio, Agile (Software Development Framework), Apache Hadoop, AWS Airflow, AWS Athena, AWS CodePipeline, AWS EFS, AWS EMR, AWS Redshift, AWS S3, Azure ADLS Gen2, Azure Data Factory, Azure Data Lake Storage, Azure Databricks, Azure Event Hub, Azure Stream Analytics, Azure Synapse, Bitbucket, Change Management, Client Centricity, Collaboration, Continuous Integration and Continuous Delivery (CI/CD), Data Architecture Patterns, Data Format Analysis, Data Governance, Data Modeling, Data Validation, Data Vault Modeling, Database Schema Design, Decision-Making, DevOps, Dimensional Modeling, GCP Bigtable, GCP BigQuery, GCP Cloud Storage, GCP Dataflow, GCP Dataproc, Git, Google Bigtable, Google Dataproc, Greenplum, HQL, IBM DataStage, IBM DB2, Industry Standard Data Modeling (FSLDM), Industry Standard Data Modeling (IBM FSDM), Influencing, Informatica IICS, Inmon methodology, JavaScript, Jenkins, Kimball, Linux - Red Hat, Negotiation, Netezza, NewSQL, Oracle Exadata, Performance Tuning, Perl, Platform Update Management, Project Management, PySpark, Python, R, RDD Optimization, CentOS, SAS, Scala, Spark, Shell Script, Snowflake, Spark Code Optimization, SQL, Stakeholder Management, Sun Solaris, Synapse, Talend, Teradata, Time Management, Ubuntu, Vendor Management

Posted 1 week ago

Apply

4.0 - 9.0 years

6 - 10 Lacs

Mumbai

Work from Office

Your Role: As a senior software engineer with Capgemini, you should have 4+ years of experience as a GCP Data Engineer with a strong project track record. In this role you will bring:
- Strong customer orientation, decision making, problem solving, communication, and presentation skills
- Very good judgement and the ability to shape compelling solutions and solve unstructured problems with assumptions
- Very good collaboration skills and the ability to interact with multi-cultural and multi-functional teams spread across geographies
- Strong executive presence and spirit
- Superb leadership and team-building skills, with the ability to build consensus and achieve goals through collaboration rather than direct line authority

Your Profile:
- Minimum 4 years' experience in GCP data engineering.
- Strong data engineering experience using Java or Python, or Spark on Google Cloud.
- Experience handling big data.
- Strong communication skills; experience with Agile methodologies.
- ETL and ELT skills; data movement and data processing skills.
- Certification as a Professional Google Cloud Data Engineer is an added advantage.
- Proven analytical skills and a problem-solving attitude.
- Ability to function effectively in a cross-team environment.

Primary Skills: GCP data engineering; Java/Python/Spark on GCP, with programming experience in at least one of Python, Java, or PySpark; GCS (Cloud Storage), Composer (Airflow), and BigQuery experience; experience building data pipelines using the above skills (see the Beam sketch below).

Skills (competencies): Ab Initio, Agile (Software Development Framework), Apache Hadoop, AWS Airflow, AWS Athena, AWS CodePipeline, AWS EFS, AWS EMR, AWS Redshift, AWS S3, Azure ADLS Gen2, Azure Data Factory, Azure Data Lake Storage, Azure Databricks, Azure Event Hub, Azure Stream Analytics, Azure Synapse, Bitbucket, Change Management, Client Centricity, Collaboration, Continuous Integration and Continuous Delivery (CI/CD), Data Architecture Patterns, Data Format Analysis, Data Governance, Data Modeling, Data Validation, Data Vault Modeling, Database Schema Design, Decision-Making, DevOps, Dimensional Modeling, GCP Bigtable, GCP BigQuery, GCP Cloud Storage, GCP Dataflow, GCP Dataproc, Git, Google Bigtable, Google Dataproc, Greenplum, HQL, IBM DataStage, IBM DB2, Industry Standard Data Modeling (FSLDM), Industry Standard Data Modeling (IBM FSDM), Influencing, Informatica IICS, Inmon methodology, JavaScript, Jenkins, Kimball, Linux - Red Hat, Negotiation, Netezza, NewSQL, Oracle Exadata, Performance Tuning, Perl, Platform Update Management, Project Management, PySpark, Python, R, RDD Optimization, CentOS, SAS, Scala, Spark, Shell Script, Snowflake, Spark Code Optimization, SQL, Stakeholder Management, Sun Solaris, Synapse, Talend, Teradata, Time Management, Ubuntu, Vendor Management
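For a feel of the pipeline-building skills named above, a tiny Apache Beam pipeline (the programming model behind Dataflow) that runs locally on the DirectRunner; the input values are made up.

```python
# Word-count-style Beam pipeline: parse "key,count" lines and sum per key.
# Runs locally; on GCP the same code would target the DataflowRunner.
import apache_beam as beam

with beam.Pipeline() as p:
    (
        p
        | "Create" >> beam.Create(["gcp,3", "bigquery,5", "gcp,2"])
        | "Parse"  >> beam.Map(lambda line: tuple(line.split(",")))
        | "ToInt"  >> beam.Map(lambda kv: (kv[0], int(kv[1])))
        | "Sum"    >> beam.CombinePerKey(sum)
        | "Print"  >> beam.Map(print)
    )
```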

Posted 1 week ago

Apply

5.0 - 10.0 years

15 - 30 Lacs

Mangaluru

Remote

Job Title: Data Engineer (GCP) | 5+ Years Experience
Location: Remote
Experience: Minimum 5 years
Employment Type: Full-time

Job Description: We are looking for a skilled Data Engineer with hands-on expertise in Google Cloud Platform (GCP), BigQuery, SQL, Python, and Apache Airflow. The ideal candidate will have at least 5 years of experience building scalable data pipelines, working with large datasets, and collaborating with cross-functional teams. Strong communication and presentation skills are essential for success in this role.

Key Responsibilities:
- Design, develop, and maintain robust data pipelines and ETL workflows using Apache Airflow
- Work with BigQuery and SQL to perform data transformations and advanced querying
- Develop and maintain Python scripts for data ingestion, transformation, and automation
- Leverage key GCP services (such as Cloud Storage, Dataflow, Pub/Sub, Cloud Functions) in data engineering workflows (see the Pub/Sub sketch below)
- Ensure data quality, accuracy, and performance optimization across data pipelines
- Collaborate with analytics and business teams to understand data requirements and provide reliable solutions
- Present ideas, insights, and technical designs clearly to both technical and non-technical stakeholders

Required Skills:
- GCP services: hands-on experience with core GCP data services
- BigQuery/SQL: intermediate to advanced level
- Python: solid hands-on development experience
- Apache Airflow: proven ability to build and maintain workflows
- Communication: strong verbal and written skills
- Presentation: clear and concise presentation capabilities

Preferred Qualifications:
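As a small illustration of the GCP services named above, a hedged sketch of publishing a message to Pub/Sub; the project and topic ids are hypothetical.

```python
# Publish one JSON message to a Pub/Sub topic.
# "my-project" and "events" are hypothetical ids.
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "events")

future = publisher.publish(topic_path, b'{"event": "signup"}')
print(f"published message id: {future.result()}")  # blocks until acked
```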

Posted 1 week ago

Apply

3.0 - 8.0 years

6 - 10 Lacs

Hyderabad

Work from Office

ABOUT THE ROLE
We are seeking a highly skilled, hands-on Senior QA & Test Automation Specialist (Test Automation Engineer) with strong experience in data validation, ETL testing, test automation, and QA process ownership. This role combines deep technical execution with a solid foundation in QA best practices, including test planning, defect tracking, and test lifecycle management. You will be responsible for designing and executing manual and automated test strategies for complex real-time and batch data pipelines, contributing to the design of automation frameworks, and ensuring high-quality data delivery across our AWS and Databricks-based analytics platforms. The role is highly technical and hands-on, with a strong focus on automation, metadata validation, and ensuring data governance practices are seamlessly integrated into development pipelines.

Roles & Responsibilities:
- Collaborate with the QA Manager to design and implement end-to-end test strategies for data validation, semantic layer testing, and GraphQL API validation.
- Perform manual validation of data pipelines, including source-to-target data mapping, transformation logic, and business rule verification.
- Develop and maintain automated data validation scripts using Python and PySpark for both real-time and batch pipelines (see the PyTest sketch below).
- Contribute to the design and enhancement of reusable automation frameworks, with components for schema validation, data reconciliation, and anomaly detection.
- Validate semantic layers (e.g., Looker, dbt models) and GraphQL APIs, ensuring data consistency, compliance with contracts, and alignment with business expectations.
- Write and manage test plans, test cases, and test data for structured, semi-structured, and unstructured data.
- Track, manage, and report defects using tools like JIRA, ensuring thorough root cause analysis and timely resolution.
- Collaborate with Data Engineers, Product Managers, and DevOps teams to integrate tests into CI/CD pipelines and enable shift-left testing practices.
- Ensure comprehensive test coverage for all aspects of the data lifecycle, including ingestion, transformation, delivery, and consumption.
- Participate in QA ceremonies (standups, planning, retrospectives) and continuously contribute to improving the QA process and culture.
- Experience building or maintaining test data generators.
- Contributions to internal quality dashboards or data observability systems.
- Awareness of metadata-driven testing approaches and lineage-based validations.
- Experience working with agile testing methodologies such as Scaled Agile.
- Familiarity with automated testing frameworks like Selenium, JUnit, TestNG, or PyTest.

Must-Have Skills:
- 6-9 years of experience in QA roles, with at least 3+ years of strong exposure to data pipeline testing and ETL validation.
- Strong in SQL and Python (and optionally PySpark); comfortable writing complex queries and validation scripts; deep hands-on expertise in automating validation.
- Practical experience with manual validation of data pipelines and source-to-target testing of batch and real-time pipelines.
- Experience validating GraphQL APIs, semantic layers (Looker, dbt, etc.), and schema/data contract compliance.
- Familiarity with data integration tools and platforms such as Databricks, Spark, AWS (Glue, S3, Athena, Redshift), or BigQuery.
- Strong understanding of QA methodologies, test planning, test case design, defect tracking, bug lifecycle management, and QA documentation; experience in Agile/Scrum environments with standard QA processes.
- Knowledge of test case and defect management tools (e.g., JIRA, TestRail, Zephyr).
- Ability to troubleshoot data issues independently and collaborate with engineering on root cause analysis.
- Experience integrating automated tests into CI/CD pipelines (e.g., Jenkins, GitHub Actions).
- Experience validating data from file formats such as JSON, CSV, Parquet, and Avro.
- Strong ability to validate and automate data quality checks: schema validation, null checks, duplicates, thresholds, and transformation validation.
- Hands-on experience with API testing using Postman, pytest, or custom automation scripts.

Good-to-Have Skills:
- Experience with data governance tools such as Apache Atlas, Collibra, or Alation.
- Familiarity with monitoring/observability tools such as Datadog, Prometheus, or CloudWatch.

Education and Professional Certifications: Bachelor's/Master's degree in computer science or engineering preferred.

Soft Skills:
- Excellent analytical and troubleshooting skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global, virtual teams.
- High degree of initiative and self-motivation; ability to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals.
- Strong presentation and public speaking skills.
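To make the automated data-quality checks above concrete, a minimal PyTest sketch covering null, duplicate, and threshold validations; load_output() is a hypothetical stand-in for reading the pipeline's target table.

```python
# Run with: pytest test_data_quality.py
# The in-memory DataFrame stands in for a real pipeline output table.
import pandas as pd

def load_output():
    # Hypothetical placeholder for reading the pipeline's target table.
    return pd.DataFrame({"id": [1, 2, 3], "amount": [10.0, 20.0, 15.0]})

def test_no_null_keys():
    df = load_output()
    assert df["id"].notna().all()

def test_no_duplicate_keys():
    df = load_output()
    assert not df["id"].duplicated().any()

def test_amounts_within_threshold():
    df = load_output()
    assert (df["amount"] >= 0).all()
```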

Posted 1 week ago

Apply

5.0 - 10.0 years

11 - 15 Lacs

Bengaluru

Work from Office

Position: Senior AI/ML Engineer - GCP
Location: Hyderabad/Bangalore/Pune
Work Mode: Hybrid

About the Company: Relanto is a global advisory, consulting, and technology services partner, empowering customers to accelerate innovation by harnessing the power of Data and AI, Automation, NextGen Planning, and Cloud Solutions.

Overview: We're seeking an experienced individual (5-10 years) specializing in AI/ML to build and deploy cloud-based machine learning solutions. You'll work with Google Cloud Platform services to create scalable AI systems and APIs that integrate with various ML models.

What You'll Do:
- Design and implement end-to-end ML pipelines in GCP
- Build and optimize AI models for production deployment using Vertex AI
- Develop RESTful APIs for ML model serving (a FastAPI sketch follows below)
- Implement vector search capabilities using BigQuery ML
- Create automated testing and deployment pipelines for ML models
- Set up model monitoring and performance tracking
- Optimize model inference and serving capabilities

Must-Have Skills:
- Strong Python programming with ML frameworks (PyTorch, TensorFlow)
- Experience with large language models and prompt engineering
- Proficiency in GCP AI services (Vertex AI, Cloud ML Engine)
- Vector search implementation (BigQuery ML, Matching Engine)
- RESTful API development with FastAPI/Flask
- Container orchestration with Docker and Google Kubernetes Engine (GKE)
- CI/CD pipeline experience for ML workflows

Why Join Us:
- Competitive salary and benefits
- Opportunities for professional growth and development
- Collaborative and inclusive work culture
- Flexible working arrangements for work-life balance
- Exposure to innovative data analytics and visualization projects
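As a sketch of the "RESTful APIs for ML model serving" bullet, a bare-bones FastAPI app; the scoring logic is a placeholder for a real Vertex AI or framework model call.

```python
# Minimal model-serving API. The "model" here is a stand-in average;
# a real service would load and call an actual trained model.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class Features(BaseModel):
    values: list[float]

@app.post("/predict")
def predict(features: Features):
    # Placeholder scoring: replace with model.predict(...) in practice.
    score = sum(features.values) / max(len(features.values), 1)
    return {"score": score}

# Run locally with: uvicorn app:app --reload
```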

Posted 1 week ago

Apply

5.0 - 9.0 years

20 - 25 Lacs

Pune

Work from Office

Primary Responsibilities:
- Provide engineering leadership, mentorship, and technical direction to a small team of engineers (~6 members).
- Partner with your Engineering Manager to ensure engineering tasks are understood, broken down, and implemented to the highest quality standards.
- Collaborate with members of the team to solve challenging engineering tasks on time and with high quality.
- Engage in code reviews and training of team members.
- Support continuous deployment pipeline code.
- Situationally troubleshoot production issues alongside the support team.
- Continually research and recommend product improvements.
- Create and integrate features for our enterprise software solution using the latest Python technologies.
- Assist with and adhere to enforcement of project deadlines and schedules.
- Evaluate, recommend, and propose solutions to existing systems.
- Actively communicate with team members to clarify requirements and overcome obstacles to meet the team goals.
- Leverage open-source and other technologies and languages outside of the Python platform.
- Develop cutting-edge solutions to maximize the performance, scalability, and distributed processing capabilities of the system.
- Provide troubleshooting and root cause analysis for production issues that are escalated to the engineering team.
- Work with development teams in an agile context as it relates to software development, including Kanban, automated unit testing, test fixtures, and pair programming.

Requirements:
- 4-8 or more years of experience as a Python developer on enterprise projects using Python, Flask, FastAPI, Django, PyTest, Celery, and other Python frameworks.
- Software development experience including object-oriented programming, concurrency programming, modern design patterns, RESTful service implementation, micro-service architecture, test-driven development, and acceptance testing.
- Familiarity with tools used to automate the deployment of an enterprise software solution to the cloud: Terraform, GitHub Actions, Concourse, Ansible, etc.
- Proficiency with Git as a version control system.
- Experience with Docker and Kubernetes.
- Experience with relational SQL and NoSQL databases, including MongoDB and MSSQL.
- Experience with object-oriented languages: Python, Java, Scala, C#, etc.
- Experience with testing tools such as PyTest, WireMock, xUnit, mocking frameworks, etc.
- Experience with GCP technologies such as BigQuery, GKE, GCS, Dataflow, Kubeflow, and/or Vertex AI.
- Excellent problem-solving and communication skills.
- Experience with Java and Spring a big plus.

Disability Accommodation: UKGCareers@ukg.com

Posted 1 week ago

Apply

7.0 - 10.0 years

12 - 16 Lacs

Bengaluru

Hybrid

Primary skills: GCP (Dataproc, BigQuery). Secondary skills: Python, Spark.

Posted 1 week ago

Apply

5.0 - 6.0 years

6 - 11 Lacs

Chennai

Work from Office

Position: Data Analytics Engineer
Experience: 5-6 years
Notice Period: Immediate to 30 days
Qualification: B.Tech
Location: Chennai (Hybrid)

Primary skills: Google Cloud Platform, Python, and big data pipelines
Secondary skills: BigQuery SQL; coding, testing, implementing, and debugging workflows and apps

Kindly share your updated resume to aishwarya_s@onwardgroup.com and fill in the details below:
- Total Experience:
- Relevant Experience:
- Notice Period:
- CTC:
- Expected CTC:
- If serving notice period: last working day, offered location & CTC:
- Available for video-mode interview on weekdays (Y/N):
- PAN Number:
- Name as per PAN Card:
- Date of Birth:
- Alternative Contact No.:
- Reason for Job Change:

Posted 1 week ago

Apply

4.0 - 9.0 years

9 - 13 Lacs

Bengaluru

Work from Office

About us: As a Fortune 50 company with more than 400,000 team members worldwide, Target is an iconic brand and one of America's leading retailers. Joining Target means promoting a culture of mutual care and respect and striving to make the most meaningful and positive impact. Becoming a Target team member means joining a community that values different voices and lifts each other up. Here, we believe your unique perspective is important, and you'll build relationships by being authentic and respectful.

Overview about TII: At Target, we have a timeless purpose and a proven strategy, and that hasn't happened by accident. Some of the best minds from different backgrounds come together at Target to redefine retail in an inclusive learning environment that values people and delivers world-class outcomes. That winning formula is especially apparent in Bengaluru, where Target in India operates as a fully integrated part of Target's global team and has more than 4,000 team members supporting the company's global strategy and operations.

Team Overview: Every time a guest enters a Target store or browses Target.com or the app, they experience the impact of Target's investments in technology and innovation. We're the technologists behind one of the most loved retail brands, delivering joy to millions of our guests, team members, and communities. Join our global in-house technology team of more than 5,000 engineers, data scientists, architects, and product managers striving to make Target the most convenient, safe, and joyful place to shop. We use agile practices and leverage open-source software to adapt and build best-in-class technology for our team members and guests, and we do so with a focus on diversity and inclusion, experimentation, and continuous learning. At Target, we are gearing up for exponential growth and continuously expanding our guest experience. To support this expansion, Data Engineering is building robust warehouses and enhancing existing datasets to meet business needs across the enterprise. We are looking for talented individuals who are passionate about innovative technology and data warehousing and are eager to contribute to data engineering.

Position Overview:
- Assess client needs and convert business requirements into a business intelligence (BI) solutions roadmap relating to complex issues involving long-term or multi-work streams.
- Analyze technical issues and questions, identifying data needs and delivery mechanisms.
- Implement data structures using best practices in data modeling, ETL/ELT processes, Spark, Scala, SQL, database, and OLAP technologies.
- Manage the overall development cycle, driving best practices and ensuring development of high-quality code for common assets and framework components.
- Develop test-driven solutions, provide technical guidance, and contribute heavily to a team of high-caliber data engineers by developing test-driven solutions and BI applications that can be deployed quickly and in an automated fashion.
- Manage and execute against agile plans and set deadlines based on client, business, and technical requirements.
- Drive resolution of technology roadblocks including code, infrastructure, build, deployment, and operations.
- Ensure all code adheres to development and security standards.

About you:
- 4-year degree or equivalent experience
- 5+ years of software development experience, preferably in data engineering/Hadoop development (Hive, Spark, etc.)
- Hands-on experience in object-oriented or functional programming such as Scala/Java/Python
- Knowledge of or experience with a variety of database technologies (Postgres, Cassandra, SQL Server)
- Knowledge of data integration design using API and streaming technologies (Kafka) as well as ETL and other data integration patterns
- Experience with cloud platforms like Google Cloud, AWS, or Azure; hands-on experience with BigQuery is an added advantage
- Good understanding of distributed storage (HDFS, Google Cloud Storage, Amazon S3) and processing (Spark, Google Dataproc, Amazon EMR, or Databricks)
- Experience with a CI/CD toolchain (Drone, Jenkins, Vela, Kubernetes) a plus
- Familiarity with data warehousing concepts and technologies
- Maintains technical knowledge within areas of expertise
- Constant learner and team player who enjoys solving tech challenges with a global team
- Hands-on experience building complex data pipelines and flow optimizations
- Able to understand the data, draw insights, make recommendations, and identify any data quality issues upfront
- Experience with test-driven development and software test automation
- Follows best coding practices and engineering guidelines as prescribed
- Strong written and verbal communication skills with the ability to present complex technical information clearly and concisely to a variety of audiences

Life at Target: https://india.target.com/
Benefits: https://india.target.com/life-at-target/workplace/benefits
Culture: https://india.target.com/life-at-target/belonging

Posted 1 week ago

Apply

3.0 - 5.0 years

32 - 37 Lacs

Pune

Work from Office

Job Title: AI Engineer
Corporate Title: Assistant Vice President
Location: Pune, India

Role Description
Overview: We are seeking a talented and experienced AI Engineer to join our team. The ideal candidate will be hands-on and drive the design, development, and implementation of AI-based solutions for CB Tech. This role involves working with large datasets, conducting experiments, and staying updated with the latest advancements in AI and machine learning. This person is expected to innovate and lead CB Tech's efforts in modernizing the engineering landscape by identifying AI use cases and providing local support. If you are actively coding, have a passion for AI, and want to be part of developing innovative products, then apply today.

Deutsche Bank's Corporate Bank division is a leading provider of cash management, trade finance, and securities finance. We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel. You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on using the latest technologies and practices, as part of a relentless focus on business value. You will be someone who sees engineering as a team activity, with a predisposition to open code, open discussion, and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support.

What we'll offer you: As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leaves
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for ages 35 and above

Your key responsibilities:
- Design, develop, and implement AI and Gen-AI-based agentic software systems on the cloud.
- Collaborate with other development teams and SMEs to integrate shared services into products.
- Learn Deutsche Bank's AI Governance framework and operate within safe AI principles.
- Leverage architecture decision trees to pick strategic AI patterns to solve business problems.
- Integrate Gen-AI APIs with cloud-native presentation (GKE, Cloud Run) and persistence layers (PostgreSQL, BigQuery).
- Run systems at scale while continuing to innovate and evolve.
- Work with data engineers and scientists to ensure effective data collection and preparation for training AI models.
- Continuously monitor the performance of AI solutions and implement improvements.
- Lead training sessions and create comprehensive documentation to empower end users.
- Function as an active member of an agile team.

Your skills and experience

Skills you'll need:
- AI expertise: proficiency in frameworks (LangChain, Streamlit, or similar), libraries (scikit-learn or similar), and cloud platforms (Vertex AI or OpenAI).
- Prompt engineering & RAG: skills in crafting effective prompts and enhancing AI outputs with external data integration (an illustrative sketch follows below).
- NLP knowledge: strong understanding of natural language processing and conversational AI technologies.
- Deployment & operations: experience in model deployment, monitoring, optimization (MLOps), and problem-solving; proficiency with cloud-native orchestration systems (Docker/Kubernetes).
- Proficiency in Python or Java, and SQL; knowledge of RESTful design.
- Experience working with different types of enterprise and real-world data sets: structured, semi-structured, and unstructured.
- Experience putting ML/AI into production, and the ability to talk through best practices and pitfalls.
- Relationship and consensus-building skills.

Skills that will help you excel:
- Stakeholder communication: ability to explain AI concepts to non-technical audiences and collaborate cross-functionally.
- Adaptability & innovation: flexibility in learning new tools and developing innovative solutions.
- Experience with cloud-native databases/warehouses (PostgreSQL and BigQuery).
- Experience in data visualization and observability, with a focus on real-time serving and monitoring of time series data with alerts.
- Thought leadership & advocacy: develop awareness of industry developments and best practices; provide thought leadership in emerging technologies as they relate to AI topics.

How we'll support you:
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs
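To ground the prompt engineering & RAG skill above, an illustrative framework-free sketch of retrieval-augmented prompt assembly; retrieve() is a hypothetical stand-in for a vector-store lookup, and the assembled prompt would be sent to a Gen-AI API.

```python
# Hypothetical RAG prompt assembly: fetch context, then build the prompt.
def retrieve(question: str, k: int = 3) -> list[str]:
    # Stand-in for a vector index query (e.g., embeddings in a database).
    return ["Doc snippet 1", "Doc snippet 2", "Doc snippet 3"][:k]

def build_prompt(question: str) -> str:
    context = "\n".join(f"- {c}" for c in retrieve(question))
    return (
        "Answer using only the context below.\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}\nAnswer:"
    )

print(build_prompt("What does the settlement process cover?"))
# A real system would send build_prompt(...) to the model endpoint.
```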

Posted 1 week ago

Apply

3.0 - 5.0 years

10 - 14 Lacs

Pune

Work from Office

Job Title: GCP Data Engineer, AS
Location: Pune, India
Corporate Title: Associate

Role Description: An Engineer is responsible for designing and developing entire engineering solutions to accomplish business goals. Key responsibilities of this role include ensuring that solutions are well architected, with maintainability and ease of testing built in from the outset, and that they can be integrated successfully into the end-to-end business process flow. They will have gained significant experience through multiple implementations and have begun to develop both depth and breadth in several engineering competencies. They have extensive knowledge of design and architectural patterns. They will provide engineering thought leadership within their teams and will play a role in mentoring and coaching less experienced engineers.

What we'll offer you: As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leaves
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for ages 35 and above

Your key responsibilities:
- Design, develop, and maintain data pipelines using Python and SQL on GCP.
- Apply Agile methodologies and ETL, ELT, data movement, and data processing skills.
- Work with Cloud Composer to manage and process batch data jobs efficiently.
- Develop and optimize complex SQL queries for data analysis, extraction, and transformation.
- Develop and deploy Google Cloud services using Terraform.
- Implement CI/CD pipelines using GitHub Actions.
- Consume and host REST APIs using Python.
- Monitor and troubleshoot data pipelines, resolving any issues in a timely manner.
- Ensure team collaboration using Jira, Confluence, and other tools.
- Quickly learn new and existing technologies; apply strong problem-solving skills; write advanced SQL and Python scripts.
- Certification as a Professional Google Cloud Data Engineer is an added advantage.

Your skills and experience:
- 6+ years of IT experience as a hands-on technologist.
- Proficient in Python for data engineering and in SQL.
- Hands-on experience with GCP Cloud Composer, Dataflow, BigQuery, Cloud Functions, and Cloud Run; GKE is good to have.
- Hands-on experience in REST API hosting and consumption.
- Proficient in Terraform (HashiCorp).
- Experienced with GitHub, GitHub Actions, and CI/CD.
- Experience automating ETL testing using Python and SQL.
- Good to have: API knowledge, Bitbucket.

How we'll support you:
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs

Posted 1 week ago

Apply

3.0 - 5.0 years

32 - 40 Lacs

Pune

Work from Office

Job Title: Senior Engineer, VP
Location: Pune, India

Role Description: An Engineer is responsible for managing or performing work across multiple areas of the bank's overall IT platform/infrastructure, including analysis, development, and administration. It may also involve taking functional oversight of engineering delivery for specific departments. Work includes:
- Planning and developing entire engineering solutions to accomplish business goals
- Building reliability and resiliency into solutions with appropriate testing and reviewing throughout the delivery lifecycle
- Ensuring maintainability and reusability of engineering solutions
- Ensuring solutions are well architected and can be integrated successfully into the end-to-end business process flow
- Reviewing engineering plans and quality to drive re-use and improve engineering capability
- Participating in industry forums to drive adoption of innovative technologies, tools, and solutions in the Bank

Deutsche Bank's Corporate Bank division is a leading provider of cash management, trade finance, and securities finance. We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel. You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on using the latest technologies and practices, as part of a relentless focus on business value. You will be someone who sees engineering as a team activity, with a predisposition to open code, open discussion, and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support.

What we'll offer you: As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leaves
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for ages 35 and above

Your key responsibilities:
- Hands-on engineering lead involved in analysis, design, design/code reviews, coding, and release activities.
- Champion engineering best practices and guide/mentor the team to achieve high performance.
- Work closely with business stakeholders, the tribe lead, the product owner, and the lead architect to successfully deliver the business outcomes.
- Acquire functional knowledge of the business capability being digitized/re-engineered.
- Demonstrate ownership, inspire others, apply innovative thinking and a growth mindset, and collaborate for success.

Your skills and experience:
- Minimum 15 years of IT industry experience in full-stack development
- Expert in Java, Spring Boot, NodeJS, ReactJS
- Strong experience in big data processing: Apache Spark, Hadoop, BigQuery, Dataproc, Dataflow, etc.
- Strong experience in Kubernetes and the OpenShift container platform
- Experience in data streaming, i.e., Kafka, Pub/Sub, etc.
- Experience working on public cloud: GCP preferred, AWS or Azure
- Knowledge of various distributed/multi-tiered architecture styles: micro-services, data mesh, integration patterns, etc.
- Experience with modern software product delivery practices, processes, and tooling, and BizDevOps skills such as CI/CD pipelines using Jenkins, Git Actions, etc.
- Experience leading teams and mentoring developers

Key skills: Java, Spring Boot, NodeJS, SQL/PLSQL, ReactJS
Advantageous: prior experience in the banking/finance domain; experience with hybrid cloud solutions, preferably using GCP; product development experience

How we'll support you:
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs

About us and our teams: Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative, and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.

Posted 1 week ago

Apply

5.0 - 10.0 years

10 - 15 Lacs

Bengaluru

Work from Office


About us:
As a Fortune 50 company with more than 400,000 team members worldwide, Target is an iconic brand and one of America's leading retailers. Joining Target means promoting a culture of mutual care and respect and striving to make the most meaningful and positive impact. Becoming a Target team member means joining a community that values different voices and lifts each other up. Here, we believe your unique perspective is important, and you'll build relationships by being authentic and respectful.
Overview about Target in India
At Target, we have a timeless purpose and a proven strategy. And that hasn't happened by accident. Some of the best minds from different backgrounds come together at Target to redefine retail in an inclusive learning environment that values people and delivers world-class outcomes. That winning formula is especially apparent in Bengaluru, where Target in India operates as a fully integrated part of Target's global team and has more than 4,000 team members supporting the company's global strategy and operations.
About the Role
The Senior RBX Data Specialist role at Target in India involves the end-to-end management of data: building and maintaining pipelines through ETL/ELT and data modeling, ensuring data accuracy and system performance, and resolving data flow issues. It also requires analyzing data to generate insights, creating visualizations for stakeholders, automating processes for efficiency, and collaborating effectively across both business and technical teams. You will also answer ad-hoc questions from your business users by conducting quick analysis on relevant data, identify trends and correlations, and form hypotheses to explain the observations. Some of this will lead to bigger projects of increased complexity, where you will have to work as part of a bigger team, but also independently execute specific tasks. Finally, you are expected to always adhere to the project schedule and technical rigor as well as requirements for documentation, code versioning, etc.
Key Responsibilities
Data Pipeline and Maintenance: Monitor data pipelines and warehousing systems to ensure optimal health and performance. Ensure data integrity and accuracy throughout the data lifecycle.
Incident Management and Resolution: Drive the resolution of data incidents and document their causes and fixes, collaborating with teams to prevent recurrence.
Automation and Process Improvement: Identify and implement automation opportunities and DataOps best practices to enhance the efficiency, reliability, and scalability of data processes.
Collaboration and Communication: Work closely with data teams and stakeholders to understand data pipeline architecture and dependencies, ensuring timely and accurate data delivery while effectively communicating data issues and participating in relevant discussions.
Data Quality and Governance: Implement and enforce data quality standards, monitor metrics for improvement, and support data governance by ensuring policy compliance.
Documentation and Reporting: Create and maintain clear and concise documentation of data pipelines, processes, and troubleshooting steps. Develop and generate reports on data operations performance and key metrics.
Core responsibilities are described within this job description. Job duties may change at any time due to business needs.
About You
B.Tech / B.E. or equivalent (completed) degree
5+ years of relevant work experience
Experience in Marketing/Customer/Loyalty/Retail analytics is preferable
Exposure to A/B testing
Familiarity with big data technologies, data languages and visualization tools
Exposure to languages such as Python and R for data analysis and modelling
Proficiency in SQL for data extraction, manipulation, and analysis, with experience in big data query frameworks such as Hive, Presto, or BigQuery
Solid foundational knowledge of mathematics, statistics, and predictive modelling techniques, including linear regression, logistic regression, time-series models, and classification techniques
Ability to simplify complex technical and analytical methodologies for easier comprehension by broad audiences
Ability to identify process and tool improvements and implement change
Excellent written and verbal English communication skills for global working
Motivation to initiate, build and maintain global partnerships
Ability to function in group and/or individual settings
Willing and able to work from our office location (Bangalore HQ) as required by business needs and brand initiatives
Useful Links:
Life at Target - https://india.target.com/
Benefits - https://india.target.com/life-at-target/workplace/benefits
Culture - https://india.target.com/life-at-target/belonging
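For a concrete flavour of the ad-hoc SQL analysis the role describes, the sketch below runs an aggregate query through the google-cloud-bigquery Python client. It assumes default application credentials are configured; the project, dataset and table names are hypothetical placeholders, not anything from this listing.

```python
# Hedged sketch of an ad-hoc BigQuery pull: last week's order counts by status.
# Assumes `pip install google-cloud-bigquery` and default credentials;
# `my-project.retail.orders` is a hypothetical table.
from google.cloud import bigquery

client = bigquery.Client()

sql = """
    SELECT status, COUNT(*) AS orders
    FROM `my-project.retail.orders`
    WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
    GROUP BY status
    ORDER BY orders DESC
"""

# result() blocks until the query finishes, then yields Row objects.
for row in client.query(sql).result():
    print(f"{row.status}: {row.orders}")
```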

Posted 1 week ago

Apply

3.0 - 5.0 years

30 - 35 Lacs

Pune

Work from Office


Job Title: DevOps Engineer, AVP
Location: Pune, India
Role Description
We are seeking a highly skilled and experienced DevOps Engineer to join our team, with a focus on Google Cloud as we migrate and build the financial crime risk platforms on the cloud. The successful candidate will be responsible for designing, implementing, and maintaining our team's infrastructure and workflows on Google Cloud Platform. This is a unique opportunity to work at the intersection of software development and infrastructure management, and to contribute to the growth and success of our team.
The DevOps Engineer is responsible for managing or performing work across multiple areas of the bank's overall IT platform/infrastructure, including analysis, development, and administration. The role may also involve taking functional oversight of engineering delivery for specific departments. Work includes:
Planning and developing entire engineering solutions to accomplish business goals
Building reliability and resiliency into solutions with appropriate testing and reviewing throughout the delivery lifecycle
Ensuring maintainability and reusability of engineering solutions
Ensuring solutions are well architected and can be integrated successfully into the end-to-end business process flow
Reviewing engineering plans and quality to drive re-use and improve engineering capability
Participating in industry forums to drive adoption of innovative technologies, tools and solutions in the Bank
Deutsche Bank's Corporate Bank division is a leading provider of cash management, trade finance and securities finance. We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel. You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on using the latest technologies and practices, as part of a relentless focus on business value. You will be someone who sees engineering as a team activity, with a predisposition to open code, open discussion and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support.
What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
Best in class leave policy
Gender neutral parental leaves
100% reimbursement under childcare assistance benefit (gender neutral)
Sponsorship for industry-relevant certifications and education
Employee Assistance Program for you and your family members
Comprehensive hospitalization insurance for you and your dependents
Accident and term life insurance
Complimentary health screening for ages 35 and above
Your key responsibilities
Design, implement, and maintain our team's infrastructure and workflows on Google Cloud Platform, including GCP services such as Google Kubernetes Engine (GKE), Cloud Storage, Vertex AI, Anthos, Monitoring etc.
Design, implement, and maintain our containerization and orchestration strategy using Docker and Kubernetes.
Collaborate with development teams to ensure seamless integration of containerized applications into our production environment.
Collaborate with software developers to integrate machine learning models and algorithms into our products, using PyTorch, TensorFlow or other machine learning frameworks.
Develop and maintain CI/CD pipelines for our products, using tools such as GitHub and GitHub Actions.
Create and maintain Infrastructure as Code templates using Terraform.
Ensure the reliability, scalability, and security of our infrastructure and products, using monitoring and logging tools such as Anthos Service Mesh (ASM) and Google Cloud's operations suite (GCO).
Work closely with other teams, such as software development, data science, and product management, to identify and prioritize infrastructure and machine learning requirements.
Stay up to date with the latest developments in Google Cloud Platform and machine learning, and apply this knowledge to improve our products and processes.
Your skills and experience
Bachelor's degree in computer science, engineering, or a related field
At least 3 years of experience in a DevOps or SRE role, with a focus on Google Cloud Platform
Strong experience with infrastructure as code tools such as Terraform or CloudFormation
Experience with containerization technologies such as Docker and container orchestration tools such as Kubernetes
Knowledge of machine learning frameworks such as TensorFlow or PyTorch
Experience with CI/CD pipelines and automated testing
Strong understanding of security and compliance best practices, including GCP security and compliance features
Excellent communication and collaboration skills, with the ability to work closely with cross-functional teams
Preferred Qualifications
Master's degree in computer science, engineering, or a related field
Knowledge of cloud-native application development, including serverless computing and event-driven architecture
Experience with cloud cost optimization and resource management
Familiarity with agile software development methodologies and version control systems such as Git
How we'll support you
Training and development to help you excel in your career
Coaching and support from experts in your team
A culture of continuous learning to aid progression
A range of flexible benefits that you can tailor to suit your needs
About us and our teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
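To make the Kubernetes side of the role concrete, here is a minimal, hedged sketch of the kind of operational check such a team might automate, using the official Kubernetes Python client against a GKE cluster. The namespace name is a hypothetical placeholder, and a valid kubeconfig is assumed.

```python
# Minimal health-check sketch: flag pods that are not Running in one namespace.
# Assumes `pip install kubernetes` and a kubeconfig pointing at the cluster;
# the namespace "fincrime-risk" is a hypothetical placeholder.
from kubernetes import client, config

config.load_kube_config()  # use load_incluster_config() when running in-cluster
v1 = client.CoreV1Api()

for pod in v1.list_namespaced_pod(namespace="fincrime-risk").items:
    if pod.status.phase != "Running":
        print(f"{pod.metadata.name}: {pod.status.phase}")
```

In practice a team like this would more likely surface such signals through Google Cloud's operations suite dashboards and alerts; the script only illustrates the underlying API interaction.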

Posted 1 week ago

Apply

8.0 - 12.0 years

4 - 8 Lacs

Pune

Work from Office


Job Information
Job Opening ID: ZR_1581_JOB
Date Opened: 25/11/2022
Industry: Technology
Work Experience: 8-12 years
Job Title: Senior Specialist - Data Engineer
City: Pune
Province: Maharashtra
Country: India
Postal Code: 411001
Number of Positions: 4
Location: Pune/Mumbai/Bangalore/Chennai
Roles & Responsibilities:
Total 8-10 years of working experience
8-10 years of experience with big data tools like Spark, Kafka, Hadoop etc.
Design and deliver consumer-centric, high-performance systems. You would be dealing with huge volumes of data sets arriving through batch and streaming platforms.
You will be responsible for building and delivering data pipelines that process, transform, integrate and enrich data to meet various demands from the business
Mentor the team on infrastructure, networking, data migration, monitoring and troubleshooting aspects
Focus on automation using Infrastructure as Code (IaC), Jenkins, DevOps etc.
Design, build, test and deploy streaming pipelines for data processing in real time and at scale
Experience with stream-processing systems like Storm, Spark Streaming, Flink etc.
Experience with object-oriented/object-functional scripting languages: Scala, Java, etc.
Develop software systems using test-driven development employing CI/CD practices
Partner with other engineers and team members to develop software that meets business needs
Follow Agile methodology for software development and technical documentation
Good to have banking/finance domain knowledge
Strong written and oral communication, presentation and interpersonal skills
Exceptional analytical, conceptual, and problem-solving abilities
Able to prioritize and execute tasks in a high-pressure environment
Experience working in a team-oriented, collaborative environment
8-10 years of hands-on coding experience
Proficient in Java, with a good knowledge of its ecosystems
Experience writing Spark code in Scala
Experience with big data tools like Sqoop, Hive, Pig, Hue
Solid understanding of object-oriented programming and HDFS concepts
Familiar with various design and architectural patterns
Experience with big data tools: Hadoop, Spark, Kafka, Flink, Hive, Sqoop etc.
Experience with relational SQL and NoSQL databases like MySQL, PostgreSQL, MongoDB and Cassandra
Experience with data pipeline tools like Airflow, etc.
Experience with cloud services: AWS EC2, S3, EMR, RDS, Redshift; GCP BigQuery
Experience with stream-processing systems: Storm, Spark Streaming, Flink etc.
Experience with object-oriented/object-functional scripting languages: Python, Java, Scala, etc.
Expertise in designing/developing platform components like caching, messaging, event processing, automation, transformation and tooling frameworks
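Since the listing calls out data pipeline tools like Airflow, here is a minimal, hedged sketch of a two-step DAG; the DAG id, schedule and task bodies are illustrative placeholders, and it assumes Apache Airflow 2.4+ (which accepts the `schedule` argument).

```python
# Illustrative two-task Airflow DAG: extract, then transform, once a day.
# Assumes Apache Airflow 2.4+; ids, schedule and task bodies are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw batch files from the landing zone")  # placeholder step

def transform():
    print("clean and enrich the data, e.g. via a Spark job")  # placeholder step

with DAG(
    dag_id="daily_enrichment_sketch",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task  # run extract before transform
```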

Posted 1 week ago

Apply