
1356 BigQuery Jobs - Page 14

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Cloud Data Modeler with 6 to 9 years of experience in GCP environments, you will play a crucial role in designing schema architecture, creating performance-efficient data models, and guiding teams on cloud-based data integration best practices. Your expertise will focus on GCP data platforms such as BigQuery, CloudSQL, and AlloyDB.

Your responsibilities will include architecting and implementing scalable data models for cloud data warehouses and databases, optimizing OLTP/OLAP systems for reporting and analytics, supporting cloud data lake and warehouse architecture, and reviewing and optimizing existing schemas for cost and performance on GCP. You will also be responsible for defining documentation standards, ensuring model version tracking, and collaborating with DevOps and DataOps teams for deployment consistency.

Key Requirements:
- Deep knowledge of GCP data platforms including BigQuery, CloudSQL, and AlloyDB
- Expertise in data modeling, normalization, and dimensional modeling
- Understanding of distributed query engines, table partitioning, and clustering (see the sketch after this listing)
- Familiarity with DBSchema or similar tools

Preferred Skills:
- Prior experience in BFSI or asset management industries
- Working experience with data catalogs, lineage, and governance tools

Soft Skills:
- Collaborative and consultative mindset
- Strong communication and requirements-gathering skills
- Organized and methodical approach to data architecture challenges

By joining our team, you will have the opportunity to contribute to modern data architecture in a cloud-first enterprise, influence critical decisions around GCP-based data infrastructure, and be part of a future-ready data strategy implementation team.
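A minimal sketch of the table partitioning and clustering mentioned above, using the google-cloud-bigquery Python client. The project, dataset, table, and column names are hypothetical, not taken from the posting.

```python
# Create a date-partitioned, clustered BigQuery table (illustrative only).
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project ID

schema = [
    bigquery.SchemaField("txn_id", "STRING"),
    bigquery.SchemaField("customer_id", "STRING"),
    bigquery.SchemaField("amount", "NUMERIC"),
    bigquery.SchemaField("txn_date", "DATE"),
]

table = bigquery.Table("my-project.analytics.transactions", schema=schema)
# Partition by the DATE column so queries can prune scans and control cost.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY, field="txn_date"
)
# Cluster on a frequently filtered column to co-locate related rows.
table.clustering_fields = ["customer_id"]

table = client.create_table(table)
print(f"Created {table.full_table_id}")
```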

Posted 2 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You are a skilled Data Modeler with expertise in using DBSchema within GCP environments. In this role, you will create and optimize data models for both OLTP and OLAP systems, ensuring they are well designed for performance and maintainability. Your key responsibilities will include developing conceptual, logical, and physical models using DBSchema, aligning schema design with application requirements, and optimizing models in BigQuery, CloudSQL, and AlloyDB. Additionally, you will support schema documentation, reverse engineering, and visualization tasks.

Must-have skills:
- Proficiency with the DBSchema modeling tool
- Strong experience with GCP databases such as BigQuery, CloudSQL, and AlloyDB
- Knowledge of OLTP and OLAP system structures and performance tuning
- Expertise in SQL and schema evolution/versioning best practices

Preferred skills:
- Experience integrating DBSchema with CI/CD pipelines
- Knowledge of real-time ingestion pipelines and federated schema design

Soft skills:
- Detail-oriented, organized, and communicative
- Comfortable presenting schema designs to cross-functional teams

By joining this role, you will have the opportunity to work with industry-leading tools in modern GCP environments, enhance modeling workflows, and contribute to enterprise data architecture with visibility and impact.

Posted 2 weeks ago

Apply

10.0 - 15.0 years

0 Lacs

Delhi

On-site

Wingify is looking for a Senior Data Architect to join its team in Delhi. As a Senior Data Architect, you will lead and mentor a team of data engineers, optimize scalable data infrastructure, drive data governance frameworks, collaborate with cross-functional teams, and ensure data security, compliance, and quality. Your role will involve optimizing data processing workflows, fostering a culture of innovation and technical excellence, and aligning technical strategy with business objectives.

To be successful in this role, you should have at least 10 years of experience in software/data engineering, with a minimum of 3 years in a leadership position. You should possess expertise in backend development using languages such as Java, PHP, Python, Node.js, Go, and JavaScript, along with HTML and CSS. Proficiency in SQL, Python, and Scala for data processing and analytics is essential, along with a strong understanding of cloud platforms such as AWS, GCP, or Azure and their data services. Additionally, you should have experience with big data technologies like Spark, Hadoop, Kafka, and distributed computing frameworks, as well as hands-on experience with data warehousing solutions like Snowflake, Redshift, or BigQuery. Deep knowledge of data governance, security, and compliance, along with familiarity with NoSQL databases and automation/DevOps tools, is required. Strong leadership, communication, and stakeholder management skills are crucial for this role.

Preferred qualifications include experience in machine learning infrastructure or MLOps, exposure to real-time data processing and analytics, and interest in data structures, algorithm analysis and design, multicore programming, and scalable architecture. Prior experience in a SaaS or high-growth tech company would be advantageous.

Please note that candidates must have a minimum of 10 years of experience to be eligible for this role. Graduation from a Tier-1 college, such as IIT, is preferred. Candidates from B2B product companies with high data traffic are encouraged to apply; those who do not meet these criteria are kindly requested not to apply.

Posted 2 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

Karnataka

On-site

As an Enablement Engineer (Analytics) at Adobe, you will play a crucial role in the Growth and Martech Engineering team by implementing Analytics and Experience Cloud solutions for Adobe.com and other digital platforms. Your responsibilities will include designing and executing scalable web data collection and tagging strategies using AEP's data collection tools, JavaScript, and other supporting tools.

Your key responsibilities will involve collaborating with marketing, product, and analytics teams to gather tracking requirements and translate them into actionable Tagging Specification Documents (TSDs). You will design, implement, and manage scalable tagging strategies using Adobe Launch/Tags and the AEP Web SDK to ensure accurate data capture, and work closely with engineering teams to align data layer implementations with XDM schema and business logic. You will configure AEP components, including schemas, datasets, and ingestion pipelines, and partner with QA to validate and fix tag deployments using tools like the AEP Debugger, Charles Proxy, and browser DevTools. Ensuring data governance and privacy compliance (e.g., GDPR, CCPA) will be a crucial part of your responsibilities. You will also enable integrations with Martech platforms such as Adobe Target and Audience Manager and third-party tools like Google, Meta, and Contentsquare, and provide documentation, training, and ongoing support to cross-functional teams on standard methodologies for digital data instrumentation.

To excel in this role, you should be proficient in Adobe Experience Platform (AEP), including the Web SDK (Alloy.js), Real-Time CDP, and XDM schema design. Hands-on experience with Adobe Data Collection (Launch or Tags) or an equivalent Tag Management System (TMS) is required, along with a solid understanding of Adobe Analytics and Customer Journey Analytics implementation and reporting components. Skills in front-end technologies like JavaScript, HTML, and DOM manipulation, and familiarity with SPA frameworks such as React or Angular, are necessary. Deep knowledge of data layer architecture, governance, and debugging tools like the AEP Debugger, network sniffers, and browser developer consoles will be beneficial. Strong communication skills are crucial for effective collaboration with both technical and non-technical partners.

The ideal candidate should have at least 6 years of experience in Digital Analytics implementation or Martech engineering, along with a Bachelor's degree or equivalent experience in Computer Science or a related field, with a minimum of 8 years of practical experience. Experience with Consent Management Platforms (CMPs), server-side tagging, or integrations with platforms like Snowflake or BigQuery will be advantageous. Adobe certifications in AEP or Adobe Analytics are preferred.

Adobe is committed to ensuring accessibility for all users. If you require accommodation to navigate the website or complete the application process due to a disability or special need, please contact accommodations@adobe.com or call (408) 536-3015.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Maharashtra

On-site

As a Senior Specialist in Global Data Science at Colgate-Palmolive, you will play a crucial role in the Global Data Science & Advanced Analytics vertical. This department works on business cases with significant financial impact for the company, providing solutions to business questions, recommended actions, and scalability options across markets. As a Data Scientist, you will lead projects within the Analytics Continuum, conceptualizing and developing machine learning, predictive modeling, simulation, and optimization solutions aimed at achieving clear financial objectives for Colgate-Palmolive.

Your responsibilities will include building predictive modeling solutions, applying ML and AI algorithms to analytics, developing end-to-end business solutions from data extraction to building business presentations, conducting model validations, and continuously improving algorithms. You will also deploy models using Airflow and Docker on Google Cloud Platform, own Pricing and Promotion and Marketing Mix projects, and present insights to business teams in an easily interpretable manner.

To qualify for this role, you should have a BE/BTech in Computer Science or Information Technology, an MBA or PGDM in Business Analytics or Data Science, additional certifications in Data Science, or an MSc/MStat in Economics or Statistics. You should have at least 5 years of experience building statistical models, hands-on experience with coding languages such as Python and SQL, and knowledge of visualization frameworks like PyDash, Flask, and Plotly. An understanding of cloud frameworks like Google Cloud and Snowflake is essential. Preferred qualifications include experience managing statistical models for Revenue Growth Management or Marketing Mix models, familiarity with third-party data sources, knowledge of machine learning techniques, and experience with Google Cloud products.

At Colgate-Palmolive, we are committed to fostering an inclusive environment where individuals with diverse backgrounds and perspectives can thrive. Our goal is to develop talent that best serves our consumers globally and to ensure that everyone feels a sense of belonging within our organization. We are an Equal Opportunity Employer dedicated to empowering all individuals to contribute meaningfully to our business.

Posted 2 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You are a versatile Data Model Developer with 6 to 9 years of experience, proficient in designing robust data models across cloud (GCP) and traditional RDBMS environments. Your role involves collaborating with cross-functional teams to develop schemas that cater to both operational systems and analytical use cases.

Key responsibilities:
- Design and implement scalable data models for cloud (GCP) and traditional RDBMS
- Support hybrid data architectures integrating real-time and batch workflows
- Collaborate with engineering teams for seamless schema implementation
- Document conceptual, logical, and physical models
- Assist in aligning ETL and data pipelines with schema definitions
- Monitor and refine performance through partitioning and indexing strategies

Must-have skills:
- Experience with GCP data services such as BigQuery, CloudSQL, and AlloyDB
- Proficiency in relational databases such as PostgreSQL, MySQL, or Oracle
- Solid grounding in OLTP/OLAP modeling principles
- Familiarity with schema design tools like DBSchema and ER/Studio
- SQL expertise for query performance optimization

Preferred skills:
- Experience working in hybrid cloud/on-prem data architectures
- Functional knowledge of BFSI or asset management domains
- Knowledge of metadata management and schema versioning

Soft skills required for this role include adaptability to cloud and legacy tech stacks, clear communication with engineers and analysts, and strong documentation and collaboration skills. Joining this role will allow you to contribute to dual-mode data architecture (cloud + on-prem), solve real-world data design challenges in regulated industries, and influence platform migration and modernization.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

Job Description: Our technology services client is looking for multiple GCP Data Developers specializing in BigQuery and PySpark to join their team on a full-time basis. This role requires a strong background in GCP, BigQuery, PySpark, and SQL, along with 5-8 years of relevant experience. The positions are based in Pune and Hyderabad, with an immediate to 30-day notice period.

As a GCP Data Developer, your responsibilities will include creating and upholding database standards and policies, ensuring database availability and performance, defining and implementing event triggers for potential performance or integrity issues, performing database housekeeping tasks, and monitoring usage, transaction volumes, response times, and concurrency levels.

If you find this opportunity exciting, please share your updated resume with us at pavan.k@s3staff.com.

Posted 2 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

We are looking for a Data Modelling Consultant with 6 to 9 years of experience to work in our Chennai office. As a Data Modelling Consultant, you will provide end-to-end modeling support for OLTP and OLAP systems hosted on Google Cloud. Your responsibilities will include designing and validating conceptual, logical, and physical models for cloud databases, translating requirements into efficient schema designs, and supporting data model reviews, tuning, and implementation. You will also guide teams on best practices for schema evolution, indexing, and governance, enabling the use of models in real-time applications and analytics platforms.

To succeed in this role, you must have strong experience in modeling across OLTP and OLAP systems, hands-on experience with GCP tools like BigQuery, CloudSQL, and AlloyDB, and the ability to understand business rules and translate them into scalable structures. Familiarity with partitioning, sharding, materialized views, and query optimization is essential.

Preferred skills for this role include experience with BFSI or financial domain data schemas and familiarity with modeling methodologies and standards such as 3NF and star schema. Soft skills like excellent stakeholder communication, collaboration, strategic thinking, and attention to scalability are also important.

Joining this role will allow you to deliver advisory value across critical data initiatives, influence the modeling direction for a data-driven organization, and be at the forefront of GCP-based enterprise data transformation.

Posted 2 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Cloud Data Architect specializing in BigQuery and CloudSQL at our Chennai office, you will lead the design and implementation of scalable, secure, and high-performing data architectures using Google Cloud technologies. Your expertise will be essential in shaping architectural direction and ensuring that data solutions meet enterprise-grade standards.

Your responsibilities will include designing data architectures that align with performance, cost-efficiency, and scalability needs; implementing data models, security controls, and access policies across GCP platforms; leading cloud database selection, schema design, and tuning for analytical and transactional workloads; collaborating with DevOps and DataOps teams to deploy and manage data environments; ensuring best practices for data governance, cataloging, and versioning; and enabling real-time and batch integrations using GCP-native tools.

To excel in this role, you must possess deep knowledge of BigQuery, CloudSQL, and the GCP data ecosystem, along with strong experience in schema design, partitioning, clustering, and materialized views. Hands-on experience implementing data encryption, IAM policies, and VPC configurations is crucial, as is an understanding of hybrid and multi-cloud data architecture strategies and data lifecycle management. Proficiency in GCP cost optimization is also required.

Preferred skills include experience with AlloyDB, Firebase, or Spanner; familiarity with LookML, dbt, or DAG-based orchestration tools; and exposure to the BFSI domain or financial services architecture. In addition to technical skills, soft skills such as visionary thinking with practical implementation ability, strong communication, and cross-functional leadership are highly valued. Previous experience guiding data strategy in enterprise settings will be advantageous.

Joining our team will give you the opportunity to own data architecture initiatives in a cloud-native ecosystem, drive innovation through scalable and secure GCP designs, and collaborate with forward-thinking data and engineering teams.

Skills: BigQuery, CloudSQL, AlloyDB, Spanner, Firebase, GCP data ecosystem, data architecture, schema design, partitioning, clustering, materialized views, IAM policies, VPC configurations, data encryption, data lifecycle management, GCP cost optimization, LookML, dbt, DAG-based orchestration tools

Posted 2 weeks ago

Apply

8.0 - 13.0 years

15 - 25 Lacs

Pune, Chennai, Bengaluru

Work from Office

Experience: 8 to 12 years, with Google Cloud, security and compliance, and Terraform experience. The team supports all four major public cloud environments (Google, Azure, IBM, AWS), but for this position Google is preferred.

Minimum Qualifications:
- Knowledge of public cloud architecture and solutions (Google Cloud specifically; Azure, IBM, and/or AWS as well)
- Experience in Python and PowerShell programming/scripting
- Experience with Terraform
- Excellent time and project management skills
- Influencing skills
- Multitasking
- Process analysis

Mandatory Skills: Google Cloud infrastructure, security and compliance, GKE, Kubernetes, Terraform
Desired Skills: Azure, IBM, AWS Cloud, Grafana, Prometheus, Logi, Python

Posted 2 weeks ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Chennai

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: PySpark
Good-to-have skills: Python (Programming Language), Apache Spark, Google BigQuery
Minimum experience: 3 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and debugging processes to ensure the applications function as intended, contributing to the overall success of the projects you are involved in.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application specifications and user guides.
- Collaborate with cross-functional teams to gather requirements and provide technical insights.

Professional & Technical Skills:
- Must-have skills: Proficiency in PySpark (see the sketch after this listing).
- Good-to-have skills: Experience with Python (Programming Language), Apache Spark, Google BigQuery.
- Strong understanding of data processing and transformation techniques.
- Experience in developing scalable applications using distributed computing frameworks.
- Familiarity with cloud platforms and services related to application deployment.

Additional Information:
- The candidate should have a minimum of 3 years of experience in PySpark.
- This position is based at our Chennai office.
- 15 years of full-time education is required.
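A minimal PySpark sketch of the kind of data processing work this role describes: reading raw data, aggregating it, and writing the result to BigQuery. Paths, table names, and the use of the spark-bigquery connector are assumptions for illustration, not details from the posting.

```python
# Aggregate raw order data and write it to BigQuery (illustrative sketch).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-daily-agg").getOrCreate()

# Hypothetical input location on Cloud Storage.
orders = spark.read.parquet("gs://my-bucket/raw/orders/")

daily_totals = (
    orders
    .filter(F.col("status") == "COMPLETED")
    .groupBy("order_date", "region")
    .agg(
        F.sum("amount").alias("total_amount"),
        F.countDistinct("customer_id").alias("unique_customers"),
    )
)

# Indirect BigQuery write via a temporary GCS bucket; assumes the
# spark-bigquery connector is available on the cluster.
(daily_totals.write
    .format("bigquery")
    .option("table", "analytics.daily_order_totals")
    .option("temporaryGcsBucket", "my-temp-bucket")
    .mode("overwrite")
    .save())
```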

Posted 2 weeks ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Navi Mumbai

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Google BigQuery
Good-to-have skills: NA
Minimum experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, contribute to key decisions, and manage the development process to deliver high-quality applications that enhance operational efficiency and user experience.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must-have skills: Proficiency in Google BigQuery.
- Strong understanding of data warehousing concepts and ETL processes.
- Experience with SQL and database management.
- Familiarity with cloud computing platforms and services.
- Ability to analyze and optimize query performance.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Google BigQuery.
- This position is based in Mumbai.
- 15 years of full-time education is required.

Posted 2 weeks ago

Apply

3.0 - 8.0 years

10 - 14 Lacs

Hyderabad

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: Google BigQuery
Good-to-have skills: NA
Minimum experience: 3 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various stakeholders to gather requirements, overseeing the development process, and ensuring that the applications meet the specified needs. You will also engage in problem-solving discussions with your team, providing guidance and support to ensure successful project outcomes. Additionally, you will monitor project progress, address any challenges that arise, and facilitate communication among team members to foster a productive work environment.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Develop and maintain documentation for application processes and workflows.

Professional & Technical Skills:
- Must-have skills: Proficiency in Google BigQuery.
- Strong understanding of data warehousing concepts and architecture.
- Experience with SQL and data manipulation techniques.
- Familiarity with cloud computing platforms and services.
- Ability to troubleshoot and optimize application performance.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Google BigQuery.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.

Posted 2 weeks ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Bengaluru

Work from Office

Project Role: DevOps Engineer
Project Role Description: Responsible for building and setting up new development tools and infrastructure, utilizing knowledge of continuous integration, delivery, and deployment (CI/CD), cloud technologies, container orchestration, and security. Build and test end-to-end CI/CD pipelines, ensuring that systems are safe against security threats.
Must-have skills: Google Cloud Compute Services
Good-to-have skills: Google BigQuery, Google Kubernetes Engine
Minimum experience: 3 years
Educational Qualification: 15 years of full-time education

Summary: As a DevOps Engineer, you will be responsible for building and setting up new development tools and infrastructure. A typical day involves utilizing your expertise in continuous integration, delivery, and deployment, while also focusing on cloud technologies, container orchestration, and security measures. You will engage in building and testing end-to-end CI/CD pipelines, ensuring that systems are secure and efficient, and collaborating with various teams to enhance the development process.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to identify and resolve issues in the development process.
- Implement best practices for CI/CD pipelines to enhance efficiency and security.

Professional & Technical Skills:
- Must-have skills: Proficiency in Google Cloud Compute Services.
- Good-to-have skills: Experience with Google BigQuery, Google Kubernetes Engine.
- Strong understanding of container orchestration and management.
- Experience with security protocols and best practices in cloud environments.
- Familiarity with monitoring and logging tools to ensure system reliability.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Google Cloud Compute Services.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 2 weeks ago

Apply

6.0 - 11.0 years

18 - 33 Lacs

Bengaluru

Remote

Job Title: Senior Data Engineer
Experience: 5+ years
Location: Remote

About the Role
We are seeking a highly skilled and experienced Senior Data Engineer to join our team. The ideal candidate will have strong hands-on experience with Google Cloud Platform (GCP), Python, BigQuery, and Apache Airflow. You will be responsible for building and maintaining scalable, efficient, and reliable data pipelines and workflows to support key data initiatives across the organization.

Key Responsibilities
- Design, build, and maintain robust data pipelines and ETL workflows using Apache Airflow and GCP services (a minimal DAG sketch follows this listing).
- Work extensively with BigQuery for data modeling, query optimization, and large-scale data analysis.
- Develop scalable Python scripts for data extraction, transformation, and automation.
- Ensure high levels of data integrity, quality, and availability across systems.
- Collaborate closely with Data Analysts, Data Scientists, and Engineering teams to gather requirements and deliver data-driven solutions.
- Participate in architecture design, code reviews, and performance tuning.
- Maintain clear documentation of systems, processes, and data flows.

Required Skills and Experience
- 5+ years of experience in Data Engineering.
- Proven hands-on experience with Google Cloud Platform (GCP): BigQuery, Cloud Storage, Cloud Composer, etc.
- Strong command of SQL and working knowledge of BigQuery.
- Advanced proficiency in Python for scripting and automation.
- Experience in designing and managing workflows using Apache Airflow.
- Good understanding of data warehousing concepts, ETL best practices, and data modeling.
- Strong communication and presentation skills, with the ability to explain technical concepts to non-technical stakeholders.

Important Notes
- Do not apply if your notice period is 90 days.
- Offered CTC will depend on current and expected compensation.
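A minimal sketch of the Airflow-plus-BigQuery workflow the posting describes, using BigQueryInsertJobOperator from the Google provider package. The DAG ID, schedule, dataset, and table names are hypothetical.

```python
# Daily DAG that runs an aggregation query in BigQuery (illustrative only).
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_orders_load",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    aggregate_orders = BigQueryInsertJobOperator(
        task_id="aggregate_orders",
        configuration={
            "query": {
                # {{ ds }} is the logical run date, templated by Airflow.
                "query": """
                    CREATE OR REPLACE TABLE analytics.daily_order_totals AS
                    SELECT order_date, region, SUM(amount) AS total_amount
                    FROM analytics.orders
                    WHERE order_date = '{{ ds }}'
                    GROUP BY order_date, region
                """,
                "useLegacySql": False,
            }
        },
    )
```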

Posted 2 weeks ago

Apply

3.0 - 8.0 years

5 - 15 Lacs

Chennai

Work from Office

Role & Responsibilities:
- Assist in executing cloud migration tasks, including VM migrations, database transfers, and application re-platforming.
- Perform GCP resource provisioning using Deployment Manager or Terraform.
- Collaborate with senior engineers on lift-and-shift or re-architecture engagements.
- Troubleshoot basic networking, IAM, and storage issues in GCP.

Preferred Candidate Profile:
- GCP Associate Cloud Engineer certification (mandatory)
- Hands-on experience in at least one GCP migration project (even as a support resource)
- Strong understanding of GCP core services: Compute Engine, Cloud Storage, VPC, IAM
- Familiarity with CI/CD tools and scripting (Bash, Python)

Nice-to-Have Skills:
- Exposure to Kubernetes (GKE) or Docker
- Familiarity with hybrid/multi-cloud tools

Interested candidates can share their resume at valarmathi.venkatesan@securekloud.com

Posted 2 weeks ago

Apply

1.0 - 7.0 years

3 - 9 Lacs

Pune

Work from Office

Required Skills and Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Hands-on experience in data pipeline testing, preferably in a cloud environment.
- Strong experience with Google Cloud Platform services, especially BigQuery.
- Proficient in working with Kafka, Hive, Parquet files, and Snowflake.
- Expertise in data quality testing and metrics calculations for both batch and streaming data (see the sketch after this listing).
- Excellent programming skills in Python and experience with test automation.
- Strong analytical and problem-solving abilities.
- Excellent communication and teamwork skills.
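A hedged illustration of the batch data-quality checks described above, written as pytest tests over a pandas DataFrame. The fixture path, table, and column names are hypothetical; a real suite might read directly from BigQuery, Kafka, or Parquet exports instead.

```python
# Simple data-quality tests for a batch extract (illustrative sketch).
import pandas as pd
import pytest


@pytest.fixture
def orders() -> pd.DataFrame:
    # Stand-in for a batch extract; a real test would query the warehouse.
    return pd.read_parquet("tests/fixtures/orders_sample.parquet")


def test_primary_key_is_unique(orders):
    assert not orders["order_id"].duplicated().any(), "duplicate order_id values found"


def test_mandatory_columns_not_null(orders):
    for col in ["order_id", "order_date", "amount"]:
        assert orders[col].notna().all(), f"nulls found in {col}"


def test_amounts_are_non_negative(orders):
    assert (orders["amount"] >= 0).all(), "negative amounts violate the business rule"
```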

Posted 2 weeks ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Bengaluru

Work from Office

- 4+ years of experience in software development using Java/J2EE technologies.
- Exposure to microservices and RESTful API development with Java and the Spring Framework.
- 4+ years of experience in database technologies, with exposure to NoSQL technologies.
- 4 years of experience working on projects involving the implementation of solutions applying development life cycles (SDLC).
- Working experience with a front-end technology such as ReactJS or another JavaScript framework.

Posted 2 weeks ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Bengaluru

Work from Office

- 4+ years of testing experience, with at least 2 years in ETL testing and automation
- Experience automating ETL flows
- Experience developing automation frameworks for ETL
- Good coding skills in Python and Pytest
- Expert at test data analysis and test design
- Good at database analytics (ETL or BigQuery); Snowflake knowledge is a plus
- Good communication skills with customers and other stakeholders
- Capable of working independently or with little supervision

Posted 2 weeks ago

Apply

4.0 - 5.0 years

6 - 7 Lacs

Bengaluru

Work from Office

Data Engineer

Skills required: Big Data workflows (ETL/ELT), hands-on Python, hands-on SQL, any cloud (GCP and BigQuery preferred), Airflow (good knowledge of Airflow features, operators, scheduling, etc.)

Skills that would add advantage: DBT, Kafka

Experience level: 4-5 years

Note: Candidates will have a coding test (Python and SQL) as part of the interview process, conducted through coders-pad; the panel will set it at run time.

Posted 2 weeks ago

Apply

4.0 - 7.0 years

7 - 14 Lacs

Gurugram

Work from Office

Must have: Big Data, GCP

Years of Experience: 4 to 7 years

Roles & Responsibilities:
- Extensive production experience (1-2 years) in GCP; other cloud experience would be a strong bonus.
- Strong background in data engineering, with 4-5 years of experience in Big Data technologies including Hadoop, NoSQL, Spark, and Kafka.
- Exposure to production applications is a must, along with operating knowledge of cloud computing platforms (GCP, especially BigQuery, Dataflow, Dataproc, Storage, VMs, Networking, Pub/Sub, Cloud Functions, and Composer services).

Posted 2 weeks ago

Apply

6.0 - 11.0 years

15 - 30 Lacs

Hyderabad, Bengaluru

Work from Office

- 6-10 years of hands-on experience in Java development, with a focus on building robust data processing components
- Proficient in working with Google Cloud Pub/Sub or equivalent streaming platforms like Kafka
- Skilled in JSON schema design, data serialization, and handling structured data formats
- Experienced in designing BigQuery views optimized for performance, scalability, and ease of consumption
- Responsible for enhancing and maintaining Java-based adapters to publish transactional data from the Optimus system to Google Pub/Sub (an illustrative publish sketch follows this listing)
- Implement and manage JSON schemas for smooth and accurate ingestion of data into BigQuery
- Collaborate with cross-functional teams to ensure data models are structured to support high-performance queries and business usability
- Strong communication and teamwork skills, with the ability to align technical solutions with stakeholder requirements
- Contribute to continuous improvements in data architecture and integration practices
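The posting centers on Java-based adapters; purely as an illustration, and to keep a single language across the examples on this page, here is a Python sketch of the same pattern: publishing a JSON-serialized transaction to a Pub/Sub topic for downstream BigQuery ingestion. The project, topic, and field names are hypothetical.

```python
# Publish one JSON transaction record to Pub/Sub (illustrative sketch).
import json

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "optimus-transactions")  # hypothetical

record = {
    "txn_id": "T-1001",
    "customer_id": "C-42",
    "amount": "199.99",
    "txn_ts": "2024-01-15T10:32:00Z",
}

# Pub/Sub messages are raw bytes; downstream ingestion into BigQuery depends on
# the JSON matching the agreed schema.
future = publisher.publish(topic_path, data=json.dumps(record).encode("utf-8"))
print(f"Published message {future.result()}")
```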

Posted 2 weeks ago

Apply

4.0 - 9.0 years

15 - 30 Lacs

Noida, Pune, Bengaluru

Hybrid

Location: All EXL locations (hybrid mode)
Salary: 15 to 35 LPA
Experience: 5 to 10 years
Competencies: Analytical thinking and problem solving, communication, teamwork, flexibility
Skills: Agile, Tableau, BigQuery, SQL, T-SQL, data modelling, Snowflake, Redshift, CDP platforms, Power BI, data visualization
Notice period: 15 days or less

Job Description:
- Design, develop, and maintain dashboards and reports using Sigma Computing.
- Collaborate with business stakeholders to understand data requirements and deliver actionable insights.
- Write and optimize SQL queries that run directly on cloud data warehouses.
- Enable self-service analytics for business users via Sigma's spreadsheet interface and templates.
- Apply row-level security and user-level filters to ensure proper data access controls.
- Partner with data engineering teams to validate data accuracy and ensure model alignment.
- Troubleshoot performance or data issues in reports and dashboards.
- Train and support users on Sigma best practices, tools, and data literacy.

Required Skills & Qualifications:
- 5+ years of experience in Business Intelligence, Analytics, or Data Visualization roles.
- Hands-on experience with Sigma Computing is highly preferred.
- Strong SQL skills and experience working with cloud data platforms (Snowflake, BigQuery, Redshift, etc.).
- Experience with data modeling concepts and modern data stacks.
- Ability to translate business requirements into technical solutions.
- Familiarity with data governance, security, and role-based access controls.
- Excellent communication and stakeholder management skills.
- Experience with Looker, Tableau, Power BI, or similar tools (for comparative insight).
- Familiarity with dbt, Fivetran, or other ELT/ETL tools.
- Exposure to Agile or Scrum methodologies.

Follow linkedin.com/in/sonali-nayakawad-088b19199 to stay updated about future job openings.

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

Karnataka

On-site

At PwC, the focus in data and analytics is on leveraging data to drive insights and make informed business decisions, using advanced analytics techniques to help clients optimize their operations and achieve strategic goals. In data analysis at PwC, the emphasis is on applying advanced analytical techniques to extract insights from large datasets and drive data-driven decision-making. Skills in data manipulation, visualization, and statistical modeling play a crucial role in supporting clients in solving complex business problems.

Candidates with 4+ years of hands-on experience are sought for the position of Senior Associate in supply chain analytics. Successful candidates should possess proven expertise in supply chain analytics across domains such as demand forecasting, inventory optimization, logistics, segmentation, and network design. Hands-on experience with optimization methods such as linear programming, mixed integer programming, and scheduling optimization is also required (a toy linear-programming sketch follows this listing). Proficiency in forecasting and machine learning techniques, along with a strong command of statistical modeling, testing, and inference, is essential, as is familiarity with GCP tools like BigQuery, Vertex AI, Dataflow, and Looker.

Required skills include building data pipelines and models for forecasting, optimization, and scenario planning; strong SQL and Python programming skills; experience deploying models in a GCP environment; and knowledge of orchestration tools like Cloud Composer (Airflow). Nice-to-have skills include familiarity with MLOps, containerization (Docker, Kubernetes), and orchestration tools, as well as strong communication and stakeholder engagement skills at the executive level.

The Senior Associate assists with analytics projects within the supply chain domain, driving the design, development, and delivery of data science solutions. They are expected to interact with and advise consultants and clients as subject matter experts, conduct analysis using advanced analytics tools, and implement quality control measures to ensure deliverable integrity. Validating analysis outcomes, making presentations, and contributing to knowledge and firm-building activities are also part of the role.

The ideal candidate should hold a BE / B.Tech / MCA / M.Sc / M.E / M.Tech / Master's degree / MBA from a reputed institute.
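A toy sketch of the linear-programming style of optimization mentioned above: shipping from two plants to two markets at minimum cost, solved with scipy.optimize.linprog. All costs, capacities, and demands are made-up numbers for illustration.

```python
# Minimum-cost transportation problem as a linear program (illustrative sketch).
from scipy.optimize import linprog

# Decision variables: x = [p1->m1, p1->m2, p2->m1, p2->m2] shipment quantities.
cost = [4, 6, 5, 3]  # per-unit shipping cost to minimize

# Plant capacity constraints (A_ub @ x <= b_ub): each plant ships at most its capacity.
A_ub = [
    [1, 1, 0, 0],  # plant 1
    [0, 0, 1, 1],  # plant 2
]
b_ub = [80, 70]

# Market demand constraints (A_eq @ x == b_eq): each market receives exactly its demand.
A_eq = [
    [1, 0, 1, 0],  # market 1
    [0, 1, 0, 1],  # market 2
]
b_eq = [60, 50]

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
print(res.x, res.fun)  # optimal shipment plan and total cost
```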

Posted 2 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Data Modeller specializing in GCP and cloud databases, you will play a crucial role in designing and optimizing data models for both OLTP and OLAP systems. Your expertise in cloud-based databases, data architecture, and modeling will be essential in collaborating with engineering and analytics teams to ensure efficient operational systems and real-time reporting pipelines.

You will be responsible for designing conceptual, logical, and physical data models tailored for OLTP and OLAP systems. Your focus will be on developing and refining models that support performance-optimized cloud data pipelines, implementing models in BigQuery, CloudSQL, and AlloyDB, and designing schemas with indexing, partitioning, and data sharding strategies. Translating business requirements into scalable data architecture and schemas will be a key aspect of your role, along with optimizing for near real-time ingestion, transformation, and query performance. You will use tools like DBSchema for collaborative modeling and documentation while creating and maintaining metadata and documentation around models.

Required skills:
- Hands-on experience with GCP databases (BigQuery, CloudSQL, AlloyDB)
- Strong understanding of OLTP and OLAP systems and database performance tuning
- Familiarity with modeling tools such as DBSchema or ERwin
- Proficiency in SQL, schema definition, and normalization/denormalization techniques

Preferred skills:
- Functional knowledge of the Mutual Fund or BFSI domain
- Experience integrating with cloud-native ETL and data orchestration pipelines
- Familiarity with schema version control and CI/CD in a data context

Soft skills such as strong analytical and communication abilities, attention to detail, and a collaborative approach across engineering, product, and analytics teams are highly valued. Joining this role will provide you with the opportunity to work on enterprise-scale cloud data architectures, drive performance-oriented data modeling for advanced analytics, and collaborate with high-performing cloud-native data teams.

Posted 2 weeks ago

Apply