3.0 - 8.0 years
10 - 18 Lacs
Guwahati
Work from Office
Design and implement scalable data architectures to optimize data flow and analytics capabilities. Develop ETL pipelines, data warehouses, and real-time data processing systems. Must have expertise in SQL, Python, and cloud data platforms such as AWS Redshift or Google BigQuery. Work closely with data scientists to enhance machine learning models with structured and unstructured data. Prior experience handling large-scale datasets is preferred.
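Roles like the one above center on loading and querying data in a cloud warehouse. As a minimal, hedged sketch (project, dataset, and table names are placeholders, not from the posting), here is what a small batch insert into BigQuery looks like with the official Python client:

```python
# Hypothetical sketch: loading a batch of rows into BigQuery with the
# official google-cloud-bigquery client. Project/dataset/table IDs are
# invented placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")  # assumed project ID
table_id = "my-analytics-project.sales_dw.daily_orders"   # assumed table

rows = [
    {"order_id": 1001, "amount": 250.0, "region": "south"},
    {"order_id": 1002, "amount": 125.5, "region": "east"},
]

# Streaming insert; for large batches a load job from GCS is usually cheaper.
errors = client.insert_rows_json(table_id, rows)
if errors:
    raise RuntimeError(f"BigQuery insert failed: {errors}")
```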
Posted 2 weeks ago
3.0 - 8.0 years
10 - 18 Lacs
Kochi
Work from Office
Design and implement scalable data architectures to optimize data flow and analytics capabilities. Develop ETL pipelines, data warehouses, and real-time data processing systems. Must have expertise in SQL, Python, and cloud data platforms such as AWS Redshift or Google BigQuery. Work closely with data scientists to enhance machine learning models with structured and unstructured data. Prior experience handling large-scale datasets is preferred.
Posted 2 weeks ago
3.0 - 8.0 years
10 - 18 Lacs
Kanpur
Work from Office
Design and implement scalable data architectures to optimize data flow and analytics capabilities. Develop ETL pipelines, data warehouses, and real-time data processing systems. Must have expertise in SQL, Python, and cloud data platforms such as AWS Redshift or Google BigQuery. Work closely with data scientists to enhance machine learning models with structured and unstructured data. Prior experience handling large-scale datasets is preferred.
Posted 2 weeks ago
4.0 - 8.0 years
6 - 10 Lacs
Chennai
Work from Office
What you'll be doing: The Wireless Solution Train supports critical network functions and services for 4G/5G wireless applications. We are looking for a dynamic and collaborative individual who will contribute to the growth and evolution of Next Gen OSS for Network systems. Planning, designing, developing, coding, and testing software systems or applications for software enhancements and new products; revising and refining as required. Implementing changes and new features in a manner that promotes efficient, reusable, and performant code. Participating in product feature implementation, both independently and in cooperation with the team. Maintaining and improving existing code with pride of ownership. Leading medium- to large-scale projects with minimal direction. Designing, developing, and maintaining data pipelines using GCP services such as BigQuery, Dataflow, Cloud Storage, Pub/Sub, and Dataproc. Implementing and managing data ingestion processes from various sources (e.g., databases, APIs, streaming platforms). Developing and maintaining data quality checks and monitoring systems to ensure data accuracy and integrity. Optimizing data pipelines for performance, scalability, and cost-effectiveness. Collaborating with data scientists and analysts to understand data requirements and provide data solutions. Building and maintaining Looker dashboards and reports for data visualization and analysis. Staying up to date with the latest technologies and best practices in cloud data engineering (GCP preferred).
What we're looking for: You'll need to have a Bachelor's degree or four or more years of work experience. Four or more years of relevant work experience. Experience in Python and PySpark/Flink. Experience with the product Agile model (POD) and a product mindset. GCP experience with BigQuery, Spanner, and Looker. Experience in GenAI solutions and tools.
Even better if you have one or more of the following: Master's degree in a related field. Any relevant certification. Excellent communication and collaboration skills.
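The posting above pairs Pub/Sub ingestion with Dataflow and BigQuery. As an illustration only (project and topic names are invented), publishing an event to Pub/Sub for a downstream pipeline to consume might look like this:

```python
# Illustrative only: publishing an event to Pub/Sub for downstream ingestion
# (e.g., by a Dataflow job writing to BigQuery). Project/topic are placeholders.
import json
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-network-project", "oss-events")  # assumed

event = {"cell_id": "NR-4412", "kpi": "prb_utilization", "value": 0.83}
future = publisher.publish(topic_path, json.dumps(event).encode("utf-8"))
print("Published message ID:", future.result())  # blocks until publish completes
```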
Posted 2 weeks ago
4.0 - 9.0 years
15 - 25 Lacs
Gurugram
Remote
Dear Candidate, greetings from A2Z HR Consultants. Base Job Location: Gurgaon. Shift timings: Dubai Time Zone (General Shifts). Number of working days: 5. Mode of Work: Remote. Salary range: Up to 25 LPA fixed. Role: Digital Analytics.
Job Description: We are seeking an experienced and detail-oriented App Analytics Implementation Specialist to join our team. The ideal candidate will be responsible for ensuring the accuracy and effectiveness of analytics implementations for mobile applications. This includes performing quality assurance (QA) on Firebase Analytics app events, Adjust SDK integration, and GA4 reporting. You will work closely with cross-functional teams to ensure data integrity, validate event tracking, and support reporting initiatives.
Responsibilities:
Firebase Analytics App Events QA: Test and verify the integration of Firebase Analytics within mobile applications. Ensure proper tracking of app events, including user interactions, sessions, and in-app behavior. Collect Firebase Analytics logs from Android Studio logcat and Xcode from build branches shared by developers. Validate each event and its associated parameters to ensure data accuracy and completeness. Perform troubleshooting and ensure accurate event data collection for reporting. Work with developers and product teams to resolve discrepancies and ensure correct event firing across all devices.
Adjust Implementation QA: Conduct thorough QA for Adjust SDK implementation across mobile platforms (iOS, Android). Verify and test in-app events, user attribution, and campaign tracking in Adjust. Perform troubleshooting of Adjust integrations and resolve any tracking issues. Collaborate with marketing and analytics teams to ensure proper attribution data collection for advertising campaigns.
GA4 Reporting QA: Test Google Analytics 4 (GA4) setup and ensure accurate tracking of web/app data. Perform validation of GA4 custom events, eCommerce tracking, and user properties. Maintain a clear understanding of GA4 user properties, items parameters, and event parameters. Ensure that the tracking aligns with reporting requirements and provides accurate, actionable data. Collaborate with stakeholders to define reporting requirements and ensure GA4 dashboards are accurate. Troubleshoot and resolve data discrepancies in GA4 and ensure proper reporting of app metrics.
Data Layer Creation: Create and maintain a data layer for app analytics based on new designs and requirements from the UX team. Ensure tracking of each call-to-action (CTA) and eCommerce event in detail, aligning the data layer with the product and user experience goals. Collaborate with UX, product, and development teams to ensure that tracking reflects the latest app design changes and meets analytics needs.
Data Validation: Ensure analytics QA with high accuracy to maintain parity across all platforms: Android, iOS, and Web. Perform data validation across GA4, Adjust, and internal databases to ensure consistency and accuracy across all analytics tools. Ensure proper alignment of event data and user interactions between platforms to guarantee reliable cross-platform analytics.
MarTech Tools and Marketing Analytics: Knowledge of MarTech tool integrations and marketing analytics to support campaign tracking and reporting. Collaborate with marketing teams to ensure accurate data flow from external platforms (e.g., advertising networks, Braze CRM, Meta, TikTok, Criteo, TradeDesk) into the analytics tools.
Collaboration & Communication: Work closely with the development, marketing, and analytics teams to ensure alignment on tracking needs. Participate in sprint planning and help define test cases for tracking events and KPIs. Provide insights and recommendations to improve tracking efficiency and accuracy. Document QA test cases, issues, and resolutions effectively.
Required Skills & Qualifications: Proven experience with Firebase Analytics and mobile app event tracking. Experience collecting Firebase Analytics logs from Android Studio logcat and Xcode from build branches and validating events and their parameters. Strong knowledge of Adjust SDK implementation and troubleshooting. Hands-on experience with Google Analytics 4 (GA4) and generating custom reports. Clear understanding of GA4 user properties, items parameters, and event parameters. Experience creating and maintaining a data layer for app analytics based on UX team designs and new app features. Strong expertise in analytics QA across multiple platforms (Android, iOS, Web) to ensure parity and consistency of data. Data validation experience across GA4, Adjust, and BigQuery to ensure cross-platform consistency and accuracy. Knowledge of MarTech tool integrations and marketing analytics to support cross-channel attribution. Familiarity with mobile analytics platforms and performance metrics. Detail-oriented with strong QA skills, including creating test cases, performing tests, and documenting results. Ability to interpret data, identify trends, and communicate findings clearly. Strong problem-solving skills and the ability to troubleshoot analytics-related issues. Solid understanding of data flow between different platforms (Firebase, Adjust, GA4, etc.). Excellent communication and collaboration skills.
Preferred Skills: Experience with Google Tag Manager (GTM) for mobile and web. Familiarity with BigQuery for advanced reporting and querying.
Educational Requirements: Bachelor's degree in Computer Science, Information Technology, Marketing, or a related field, or equivalent work experience.
Interested candidates can share their CV at 9711831492 or gaurav.a2zhrconsultants@gmail.com. Regards, Gaurav Kumar, A2Z HR Consultants
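The QA work this posting describes boils down to checking captured events against an expected event/parameter spec. A minimal sketch of such a validator, with an entirely invented spec and payload for illustration:

```python
# Hypothetical QA helper: validate captured analytics events (e.g., parsed from
# Android Studio logcat output) against an expected event/parameter spec.
# The spec and event payloads below are invented for illustration.
EXPECTED_EVENTS = {
    "add_to_cart": {"item_id", "item_name", "price", "currency"},
    "purchase": {"transaction_id", "value", "currency"},
}

def validate_event(name: str, params: dict) -> list[str]:
    """Return a list of QA findings for one captured event."""
    if name not in EXPECTED_EVENTS:
        return [f"unexpected event '{name}'"]
    missing = EXPECTED_EVENTS[name] - params.keys()
    extra = params.keys() - EXPECTED_EVENTS[name]
    findings = [f"{name}: missing param '{p}'" for p in sorted(missing)]
    findings += [f"{name}: unexpected param '{p}'" for p in sorted(extra)]
    return findings

captured = {"item_id": "SKU-1", "item_name": "Shoes", "price": 49.9}
print(validate_event("add_to_cart", captured))  # flags missing 'currency'
```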
Posted 2 weeks ago
2.0 - 5.0 years
5 - 9 Lacs
Navi Mumbai
Work from Office
Project Role: Application Developer. Project Role Description: Design, build and configure applications to meet business process and application requirements. Must have skills: Google BigQuery. Good to have skills: NA. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years full time education.
Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, participate in team meetings, and contribute to the overall success of projects by leveraging your expertise in application development.
Roles & Responsibilities: Expected to be an SME. Collaborate and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute on key decisions. Provide solutions to problems for their immediate team and across multiple teams. Facilitate knowledge-sharing sessions to enhance team capabilities. Monitor project progress and ensure timely delivery of application features.
Professional & Technical Skills: Must-have skills: Proficiency in Google BigQuery. Strong understanding of data modeling and database design principles. Experience with application development frameworks and methodologies. Familiarity with cloud computing concepts and services. Ability to troubleshoot and optimize application performance.
Additional Information: The candidate should have a minimum of 5 years of experience in Google BigQuery. This position is based in Mumbai. A 15 years full time education is required. Qualification: 15 years full time education.
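For a BigQuery-centric developer role like this, parameterized queries are the standard way to inject values safely. A hedged sketch (dataset, table, and column names are placeholders):

```python
# Sketch of a parameterized BigQuery query, the safe way to pass user input.
# Dataset/table and column names are placeholders.
from google.cloud import bigquery

client = bigquery.Client()
query = """
    SELECT customer_id, SUM(amount) AS total
    FROM `my_project.billing.transactions`
    WHERE region = @region
    GROUP BY customer_id
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[bigquery.ScalarQueryParameter("region", "STRING", "APAC")]
)
for row in client.query(query, job_config=job_config).result():
    print(row.customer_id, row.total)
```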
Posted 2 weeks ago
13.0 - 18.0 years
13 - 17 Lacs
Bengaluru
Work from Office
Skill required: Tech for Operations - Artificial Intelligence (AI). Designation: AI/ML Computational Science Manager. Qualifications: Any Graduation. Years of Experience: 13 to 18 years.
About Accenture: Accenture is a global professional services company with leading capabilities in digital, cloud and security. Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. We embrace the power of change to create value and shared success for our clients, people, shareholders, partners and communities. Visit us at www.accenture.com.
What would you do? You will be part of the Technology for Operations team that acts as a trusted advisor and partner to Accenture Operations. The Tech For Operations (TFO) team provides innovative and secure technologies to help clients build an intelligent operating model, driving exceptional results, and works closely with the sales, offering and delivery teams to identify and build innovative solutions. Major sub-deals include AHO (Application Hosting Operations), ISMT (Infrastructure Management), and Intelligent Automation. The role requires an understanding of the foundational principles of Artificial Intelligence (AI), including concepts, techniques, and tools, in order to use AI effectively.
What are we looking for? Problem-solving skills. Ability to perform under pressure. Results orientation. Strong analytical skills. Written and verbal communication.
Roles and Responsibilities: In this role you are required to identify and assess complex problems for your area of responsibility. You will create solutions in situations where analysis requires an in-depth evaluation of variable factors. The role requires adherence to the strategic direction set by senior management when establishing near-term goals. Interaction is with senior management at a client and/or within Accenture, involving matters that may require acceptance of an alternate approach. Some latitude in decision-making is involved; you will act independently to determine methods and procedures on new assignments. Decisions made at this level have a major day-to-day impact on the area of responsibility. The role manages large to medium-sized teams and/or work efforts (if in an individual contributor role) at a client or within Accenture. Please note that this role may require you to work in rotational shifts. Qualification: Any Graduation.
Posted 2 weeks ago
5.0 - 10.0 years
20 - 35 Lacs
Bengaluru
Hybrid
Location: Bengaluru (Hybrid) / Remote. Job Type: Full-time. Experience Required: 5+ years. Notice Period: Immediate to 30 days.
Role Overview: As a Collibra Expert, you will be responsible for implementing, maintaining, and optimizing the Collibra Data Governance Platform to ensure data quality, governance, and lineage across the organization. You will partner with cross-functional teams to develop data management strategies and integrate Collibra solutions with Google Cloud Platform (GCP) to create a robust, scalable, and efficient data governance framework for the retail domain.
Key Responsibilities:
- Data Governance Management: Design, implement, and manage the Collibra Data Governance Platform for data cataloging, data quality, and data lineage within the retail domain.
- Collibra Expertise: Utilize Collibra for metadata management, data quality monitoring, policy enforcement, and data stewardship across various business units.
- Data Cataloging: Lead the implementation and continuous improvement of data cataloging processes to enable a centralized, user-friendly view of the organization's data assets.
- Data Quality Management: Collaborate with business and technical teams to ensure that data is high-quality, accessible, and actionable. Define data quality rules and KPIs to monitor data accuracy, completeness, consistency, and timeliness.
- Data Lineage Implementation: Build and maintain comprehensive data lineage models to visualize the flow of data from source to consumption, ensuring compliance with data governance standards.
- GCP Integration: Architect and implement seamless integrations between Collibra and Google Cloud Platform (GCP) tools such as BigQuery, Dataflow, and Cloud Storage, ensuring data governance policies are enforced in the cloud environment.
- Collaboration & Stakeholder Management: Collaborate with Data Engineers, Analysts, Business Intelligence teams, and leadership to define and implement data governance best practices and standards.
- Training & Support: Provide ongoing training and support to business users and technical teams on data governance practices, Collibra platform usage, and GCP-based solutions.
- Compliance & Security: Ensure data governance initiatives comply with internal policies, industry standards, and regulations (e.g., GDPR, CCPA).
Key Requirements:
- Proven Expertise in Collibra: Hands-on experience implementing and managing the Collibra Data Governance Platform (cataloging, lineage, data quality).
- Google Cloud Platform (GCP) Proficiency: Strong experience with GCP tools (BigQuery, Dataflow, Pub/Sub, Cloud Storage, etc.) and integrating them with Collibra for seamless data governance.
- Data Quality and Lineage Expertise: In-depth knowledge of data quality frameworks, metadata management, and data lineage implementation.
- Retail Industry Experience: Prior experience in data governance within the retail or eCommerce domain is a plus.
- Technical Skills: Strong understanding of cloud data architecture and best practices for managing data at scale in the cloud (preferably in GCP).
- Problem-Solving and Analytical Skills: Ability to analyze complex data governance issues and find practical solutions to ensure high-quality data management across the organization.
- Excellent Communication Skills: Ability to communicate effectively with both technical and non-technical stakeholders to advocate for data governance best practices.
- Certifications: Relevant certifications in Collibra, Google Cloud, or Data Governance are highly desirable.
Education & Experience:
- Bachelor's degree (B.Tech/BE) mandatory; master's optional.
- 5+ years of experience in Data Governance, with at least 3 years of specialized experience in Collibra and GCP.
- Experience working with data teams in a retail environment is a plus.
Posted 2 weeks ago
7.0 - 12.0 years
0 - 3 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Required Skills: Python, ETL, SQL, GCP, BigQuery, Pub/Sub, Airflow. Good to Have: dbt, Data Mesh.
Job Title: Senior GCP Engineer - Data Mesh & Data Product Specialist.
We are hiring a Senior GCP Developer to join our high-performance data engineering team. This is a mission-critical role where you will design, build, and maintain scalable ETL pipelines and frameworks in a Data Mesh architecture. You will work with modern tools like Python, dbt, BigQuery (GCP), and SQL to deliver high-quality data products that power decision-making across the organization. We are looking for a highly skilled professional who thrives in demanding environments, takes ownership of their work, and delivers results with precision and reliability.
Key Responsibilities:
* Design, Build, and Maintain ETL Pipelines: Develop robust, scalable, and efficient ETL workflows to ingest, transform, and load data into distributed data products within the Data Mesh architecture.
* Data Transformation with dbt: Use dbt to build modular, reusable transformation workflows that align with the principles of data products.
* Cloud Expertise: Leverage Google Cloud Platform (GCP) services such as BigQuery, Cloud Storage, Pub/Sub, and Dataflow to implement highly scalable data solutions.
* Data Quality & Governance: Enforce strict data quality standards by implementing validation checks, anomaly detection mechanisms, and monitoring frameworks.
* Performance Optimization: Continuously optimize ETL pipelines for speed, scalability, and cost efficiency.
* Collaboration & Ownership: Work closely with data product owners, BI developers, and stakeholders to understand requirements and deliver on expectations. Take full ownership of your deliverables.
* Documentation & Standards: Maintain detailed documentation of ETL workflows, enforce coding standards, and adhere to best practices in data engineering.
* Troubleshooting & Issue Resolution: Proactively identify bottlenecks or issues in pipelines and resolve them quickly with minimal disruption.
Required Skills & Experience:
* 10+ years (Lead) or 7+ years (Developer) of hands-on experience in designing and implementing ETL workflows in large-scale environments.
* Advanced proficiency in Python for scripting, automation, and data processing.
* Expert-level knowledge of SQL for querying large datasets with performance optimization techniques.
* Deep experience working with modern transformation tools like dbt in production environments.
* Strong expertise in cloud platforms like Google Cloud Platform (GCP) with hands-on experience using BigQuery.
* Familiarity with Data Mesh principles and distributed data architectures is mandatory.
* Proven ability to handle complex projects under tight deadlines while maintaining high-quality standards.
* Exceptional problem-solving skills with a strong focus on delivering results.
What We Expect: This is a demanding role that requires:
1. A proactive mindset – you take initiative without waiting for instructions.
2. A commitment to excellence – no shortcuts or compromises on quality.
3. Accountability – you own your work end-to-end and deliver on time.
4. Attention to detail – precision matters; mistakes are not acceptable.
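The "validation checks" responsibility above is often just a short gate between load and publish. A minimal sketch, assuming an invented table and arbitrary thresholds:

```python
# Illustrative data-quality gate for a pipeline step: assert row counts and
# null rates before publishing a data product. Table and thresholds are assumed.
from google.cloud import bigquery

client = bigquery.Client()
checks_sql = """
    SELECT
      COUNT(*) AS row_count,
      SAFE_DIVIDE(COUNTIF(order_id IS NULL), COUNT(*)) AS null_rate
    FROM `my_project.data_product.orders`
"""
result = list(client.query(checks_sql).result())[0]
assert result.row_count > 0, "empty load - refusing to publish"
assert result.null_rate < 0.01, f"null rate too high: {result.null_rate:.2%}"
```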
Posted 2 weeks ago
10.0 - 14.0 years
10 - 16 Lacs
Pune
Work from Office
Role Overview: The Senior Tech Lead - GCP Data Engineering leads the design, development, and optimization of advanced data solutions. The jobholder has extensive experience with GCP services, data architecture, and team leadership, with a proven ability to deliver scalable and secure data systems.
Responsibilities: Lead the design and implementation of GCP-based data architectures and pipelines. Architect and optimize data solutions using GCP services such as BigQuery, Dataflow, Pub/Sub, and Cloud Storage. Provide technical leadership and mentorship to a team of data engineers. Collaborate with stakeholders to define project requirements and ensure alignment with business goals. Ensure best practices in data security, governance, and compliance. Troubleshoot and resolve complex technical issues in GCP data environments. Stay updated on the latest GCP technologies and industry trends.
Key Technical Skills & Responsibilities: Overall 10+ years of experience with GCP and data warehousing concepts, including coding, reviewing, testing, and debugging. Experience as an architect on GCP implementation and/or migration data projects. Must have an understanding of data lakes and data lake architectures, and best practices for storing, loading, and retrieving data from data lakes. Experience developing and maintaining pipelines on the GCP platform, with an understanding of best practices for bringing on-prem data to the cloud: file loading, compression, parallelization of loads, optimization, etc. Working knowledge and/or experience with Google Data Studio, Looker, and other visualization tools. Working knowledge of Hadoop and Python/Java would be an added advantage. Experience in designing and planning BI solutions, debugging, monitoring and troubleshooting BI solutions, creating and deploying reports, and writing relational and multidimensional database queries. Any experience in a NoSQL environment is a plus. Must be good with Python and PySpark for data pipeline building. Must have experience working with streaming data sources and Kafka. GCP Services: Cloud Storage, BigQuery, Bigtable, Cloud Spanner, Cloud SQL, Datastore/Firestore, Dataflow, Dataproc, Data Fusion, Dataprep, Pub/Sub, Data Studio, Looker, Data Catalog, Cloud Composer, Cloud Scheduler, Cloud Functions.
Eligibility Criteria: Bachelor's degree in Computer Science, Data Engineering, or a related field. Extensive experience with GCP data services and tools. GCP certification (e.g., Professional Data Engineer, Professional Cloud Architect). Experience with machine learning and AI integration in GCP environments. Strong understanding of data modeling, ETL/ELT processes, and cloud integration. Proven leadership experience in managing technical teams. Excellent problem-solving and communication skills.
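This posting pairs PySpark with Kafka streaming sources. A minimal sketch (not the employer's codebase; broker and topic names are invented, and it assumes the Spark-Kafka connector is on the classpath) of a Structured Streaming read:

```python
# Minimal sketch: a PySpark Structured Streaming job reading from Kafka.
# Requires the spark-sql-kafka connector package at submit time.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

stream = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "events")                     # placeholder topic
    .load()
)

# Kafka delivers binary key/value; cast to string before parsing further.
parsed = stream.select(col("key").cast("string"), col("value").cast("string"))

query = parsed.writeStream.format("console").outputMode("append").start()
query.awaitTermination()
```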
Posted 2 weeks ago
5.0 - 8.0 years
17 - 20 Lacs
Kolkata
Work from Office
Key Responsibilities: Architect and implement scalable data solutions using GCP (BigQuery, Dataflow, Pub/Sub, Cloud Storage, Composer, etc.) and Snowflake. Lead the end-to-end data architecture, including ingestion, transformation, storage, governance, and consumption layers. Collaborate with business stakeholders, data scientists, and engineering teams to define and deliver the enterprise data strategy. Design robust data pipelines (batch and real-time) ensuring high data quality, security, and availability. Define and enforce data governance, data cataloging, and metadata management best practices. Evaluate and select appropriate tools and technologies to optimize data architecture and cost efficiency. Mentor junior architects and data engineers, guiding them on design best practices and technology standards. Collaborate with DevOps teams to ensure smooth CI/CD pipelines and infrastructure automation for data.
Skills & Qualifications: 3+ years of experience in data architecture, data engineering, or enterprise data platform roles. 3+ years of hands-on experience in Google Cloud Platform (especially BigQuery, Dataflow, Cloud Composer, Data Catalog). 3+ years of experience designing and implementing Snowflake-based data solutions. Deep understanding of modern data architecture principles (Data Lakehouse, ELT/ETL, Data Mesh, etc.). Proficient in Python, SQL, and orchestration tools like Airflow / Cloud Composer. Experience in data modeling (3NF, Star, Snowflake schemas) and designing data marts and warehouses. Strong understanding of data privacy, compliance (GDPR, HIPAA), and security principles in cloud environments. Familiarity with tools like dbt, Apache Beam, Looker, Tableau, or Power BI is a plus. Excellent communication and stakeholder management skills. GCP or Snowflake certification preferred (e.g., GCP Professional Data Engineer, SnowPro).
Preferred Qualifications: Experience working with hybrid or multi-cloud data strategies. Exposure to ML/AI pipelines and support for data science workflows. Prior experience leading architecture reviews, PoCs, and technology roadmaps.
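Orchestration with Airflow / Cloud Composer, named in the skills above, typically means defining DAGs like the following hedged sketch (the DAG ID, SQL, and stored procedure are invented placeholders):

```python
# Hedged sketch of a Cloud Composer (Airflow) DAG orchestrating a daily
# BigQuery transformation; the SQL and IDs are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_mart_refresh",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    refresh = BigQueryInsertJobOperator(
        task_id="refresh_sales_mart",
        configuration={
            "query": {
                "query": "CALL `my_project.marts.refresh_sales`()",  # assumed proc
                "useLegacySql": False,
            }
        },
    )
```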
Posted 2 weeks ago
1.0 - 5.0 years
3 - 7 Lacs
Chandigarh
Work from Office
Key Responsibilities Assist in building and maintaining data pipelines on GCP using services like BigQuery, Dataflow, Pub/Sub, Cloud Storage, etc. Support data ingestion, transformation, and storage processes for structured and unstructured datasets. Participate in performance tuning and optimization of existing data workflows. Collaborate with data analysts, engineers, and stakeholders to ensure reliable data delivery. Document code, processes, and architecture for reproducibility and future reference. Debug issues in data pipelines and contribute to their resolution.
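Pipelines on Dataflow, as this junior role describes, are usually written with Apache Beam. A toy sketch under assumed bucket paths:

```python
# Toy Apache Beam pipeline of the kind Dataflow runs: read, transform, write.
# Paths are placeholders; pass --runner=DataflowRunner for GCP execution.
import apache_beam as beam

with beam.Pipeline() as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://my-bucket/raw/events.csv")
        | "KeepValid" >> beam.Filter(lambda line: line and not line.startswith("#"))
        | "Upper" >> beam.Map(str.upper)
        | "Write" >> beam.io.WriteToText("gs://my-bucket/clean/events")
    )
```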
Posted 2 weeks ago
10.0 - 15.0 years
11 - 15 Lacs
Jhagadia
Work from Office
Develop, implement, and maintain the organization's MIS to ensure accurate and real-time reporting of key business metrics. Oversee the preparation and distribution of daily, weekly, and monthly reports to various departments and senior management. Ensure data accuracy, integrity, and consistency across all reporting platforms. Design and maintain dashboards for business performance monitoring. Analyze data trends and provide insights to management for informed decision-making. Establish and maintain cost accounting systems and procedures for accurate tracking of material, labor, and overhead costs. Review and update cost standards, analyzing variances and taking corrective actions when necessary. Collaborate with other departments to monitor and control project costs, ensuring alignment with budget and financial goals. Perform cost analysis and prepare cost reports to monitor financial performance and support pricing decisions. Conduct regular audits to ensure compliance with costing policies and industry standards. Provide regular cost analysis reports, highlighting variances between actual and budgeted figures, and recommend corrective actions. Support financial forecasting and budgeting processes by providing relevant data and insights. Assist in month-end and year-end closing processes by ensuring accurate costing and reporting entries. Review profitability analysis reports and identify areas for cost optimization.
Posted 2 weeks ago
1.0 - 5.0 years
3 - 7 Lacs
Gurugram
Work from Office
Key Responsibilities Assist in building and maintaining data pipelines on GCP using services like BigQuery, Dataflow, Pub/Sub, Cloud Storage, etc. Support data ingestion, transformation, and storage processes for structured and unstructured datasets. Participate in performance tuning and optimization of existing data workflows. Collaborate with data analysts, engineers, and stakeholders to ensure reliable data delivery. Document code, processes, and architecture for reproducibility and future reference. Debug issues in data pipelines and contribute to their resolution.
Posted 2 weeks ago
4.0 - 6.0 years
7 - 9 Lacs
Chennai
Work from Office
What you'll be doing: We're seeking a skilled Data Engineering Analyst to join our high-performing team and propel our telecom business forward. You'll contribute to building cutting-edge data products and assets for our wireless and wireline operations, spanning areas like consumer analytics, network performance, and service assurance. In this role, you will develop deep expertise in various telecom domains. As part of the Data Architecture Strategy team, you'll collaborate closely with IT and business stakeholders to design and implement user-friendly, robust data product solutions, incorporating data classification and governance principles. Your responsibilities encompass: Collaborate with stakeholders to understand data requirements and translate them into efficient data models. Design, develop, and implement data architecture solutions on GCP and Teradata to support our telecom business. Design data ingestion for both real-time and batch processing, ensuring efficient and scalable data acquisition for creating an effective data warehouse. Maintain meticulous documentation, including data design specifications, functional test cases, data lineage, and other relevant artifacts for all data product solution assets. Implement data architecture standards, as set by the data architecture team. Proactively identify opportunities for automation and performance optimization within your scope of work. Collaborate effectively within a product-oriented organization, providing data expertise and solutions across multiple business units. Cultivate strong cross-functional relationships and establish yourself as a subject matter expert in data and analytics within the organization.
What we're looking for... You're curious about new technologies and the game-changing possibilities they create. You like to stay up-to-date with the latest trends and apply your technical expertise to solving business problems. You'll need to have: Bachelor's degree with four or more years of work experience. Four or more years of relevant work experience. Expertise in building complex SQL for data analysis to understand and design data solutions. Experience with ETL, data warehouse concepts, and the data management life cycle. Experience creating technical documentation such as source-to-target mappings, source contracts, SLAs, etc. Experience in any DBMS, preferably GCP/BigQuery. Experience creating data models using the Erwin tool. Experience in shell scripting and Python. Understanding of Git version control and basic Git commands. Understanding of data quality concepts.
Even better if you have one or more of the following: Certification as a GCP Data Engineer. Understanding of NoSQL databases like Cassandra, MongoDB, etc. Accuracy and attention to detail. Good problem-solving, analytical, and research capabilities. Good verbal and written communication. Experience presenting to leaders and influencing stakeholders.
Posted 2 weeks ago
5.0 - 7.0 years
25 - 30 Lacs
Bengaluru
Work from Office
Job Title: Senior Data Engineer / Technical Lead. Location: Bangalore. Employment Type: Full-time.
Role Summary: We are seeking a highly skilled and motivated Senior Data Engineer/Technical Lead to take ownership of the end-to-end delivery of a key project involving data lake transitions, data warehouse maintenance, and enhancement initiatives. The ideal candidate will bring strong technical leadership, excellent communication skills, and hands-on expertise with modern data engineering tools and platforms. Experience in Databricks and JIRA is highly desirable. Knowledge of supply chain and finance domains is a plus, or a willingness to quickly ramp up in these areas is expected.
Key Responsibilities:
Delivery Management: Lead and manage data lake transition initiatives under the Gold framework. Oversee delivery of enhancements and defect fixes related to the enterprise data warehouse.
Technical Leadership: Design and develop efficient, scalable data pipelines using Python, PySpark, and SQL. Ensure adherence to coding standards, performance benchmarks, and data quality goals. Conduct performance tuning and infrastructure optimization for data solutions. Provide code reviews, mentorship, and technical guidance to the engineering team.
Collaboration & Stakeholder Engagement: Collaborate with business stakeholders (particularly the Laboratory Products team) to gather, interpret, and refine requirements. Communicate technical solutions and project progress clearly to both technical and non-technical audiences.
Tooling and Technology Use: Leverage tools such as Databricks, Informatica, AWS Glue, Google DataProc, and Airflow for ETL and data integration. Use JIRA to manage project workflows, track defects, and report progress.
Documentation and Best Practices: Create and review documentation including architecture, design, testing, and deployment artifacts. Define and promote reusable templates, checklists, and best practices for data engineering tasks.
Domain Adaptation: Apply or gain knowledge in supply chain and finance domains to enhance project outcomes and align with business needs.
Skills and Qualifications:
Technical Proficiency: Strong hands-on experience in Python, PySpark, and SQL. Expertise with ETL tools such as Informatica, AWS Glue, Databricks, and Google Cloud DataProc. Deep understanding of data warehousing solutions (e.g., Snowflake, BigQuery, Delta Lake, Lakehouse architectures). Familiarity with performance tuning, cost optimization, and data modeling best practices.
Platform & Tools: Proficient in working with cloud platforms like AWS, Azure, or Google Cloud. Experience in version control and configuration management practices. Working knowledge of JIRA and Agile methodologies.
Certifications (Preferred but not required): Certifications in cloud technologies, ETL platforms, or a relevant domain (e.g., AWS Data Engineer, Databricks Data Engineer, Supply Chain certification).
Expected Outcomes: Timely and high-quality delivery of data engineering solutions. Reduction in production defects and improved pipeline performance. Increased team efficiency through reuse of components and automation. Positive stakeholder feedback and high team engagement. Consistent adherence to SLAs, security policies, and compliance guidelines.
Performance Metrics: Adherence to project timelines and engineering standards. Reduction in post-release defects and production issues. Improvement in data pipeline efficiency and resource utilization. Resolution time for pipeline failures and data issues. Completion of required certifications and training.
Preferred Background: Background or exposure to supply chain or finance domains. Willingness to work during morning US East hours. Ability to work independently and drive initiatives with minimal oversight.
Required Skills: Databricks, Data Warehousing, ETL, SQL
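A typical pipeline step in the Databricks/Delta Lake stack this posting names might look like the following sketch (all paths and column names are assumptions, not actual project code):

```python
# Sketch of an incremental Delta Lake write on Databricks; paths and columns
# are invented to illustrate the pattern, not taken from the posting.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # provided automatically on Databricks

df = spark.read.parquet("/mnt/raw/shipments")                  # assumed source
clean = df.dropDuplicates(["shipment_id"]).na.drop(subset=["shipped_at"])

(
    clean.write.format("delta")
    .mode("append")
    .partitionBy("ship_date")   # partitioning enables file pruning on reads
    .save("/mnt/curated/shipments")                            # assumed target
)
```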
Posted 3 weeks ago
5.0 - 7.0 years
7 - 9 Lacs
Pune
Work from Office
New Opportunity: Full Stack Engineer. Location: Pune (Onsite). Company: Apptware Solutions. Experience: 4+ years. We're looking for a skilled Full Stack Engineer to join our team. If you have experience in building scalable applications and working with modern technologies, this role is for you.
Role & Responsibilities: Develop product features to help customers easily transform data. Design, implement, deploy, and support client-side and server-side architectures, including web applications, CLI, and SDKs.
Minimum Requirements: 4+ years of experience as a Full Stack Developer or similar role. Hands-on experience in a distributed engineering role with direct operational responsibility (on-call experience preferred). Proficiency in at least one back-end language (Node.js, TypeScript, Python, or Go). Front-end development experience with Angular or React, HTML, CSS. Strong understanding of web applications, backend APIs, CI/CD pipelines, and testing frameworks. Familiarity with NoSQL databases (e.g., DynamoDB) and AWS services (Lambda, API Gateway, Cognito, etc.). Bachelor's degree in Computer Science, Engineering, Math, or equivalent experience. Strong written and verbal communication skills.
Preferred Skills: Experience with AWS Glue, Spark, or Athena. Strong understanding of SQL and data engineering best practices. Exposure to analytical EDWs (Snowflake, Databricks, BigQuery, Cloudera, Teradata). Experience in B2B applications, SaaS offerings, or startups is a plus. (ref:hirist.tech)
Posted 3 weeks ago
5.0 - 8.0 years
8 - 14 Lacs
Bengaluru
Remote
Job Overview : We are looking for an experienced GCP Data Engineer with deep expertise in BigQuery, DataFlow, DataProc, Pub/Sub, and GCS to build, manage, and optimize large-scale data pipelines. The ideal candidate should have a strong background in cloud data storage, real-time data streaming, and orchestration. Key Responsibilities : Data Storage & Management : - Manage Google Cloud Storage (GCS) buckets, set up permissions, and optimize storage solutions for handling large datasets. - Ensure data security, access control, and lifecycle management. Data Processing & Analytics : - Design and optimize BigQuery for data warehousing, querying large datasets, and performance tuning. - Implement ETL/ELT pipelines for structured and unstructured data. - Work with DataProc (Apache Spark, Hadoop) for batch processing of large datasets. Real-Time Data Streaming : - Use Pub/Sub for building real-time, event-driven streaming pipelines. - Implement Dataflow (Apache Beam) for real-time and batch data processing. Workflow Orchestration & Automation : - Use Cloud Composer (Apache Airflow) for scheduling and automating data workflows. - Build monitoring solutions to ensure data pipeline health and performance. Cloud Infrastructure & DevOps : - Implement Terraform for provisioning and managing cloud infrastructure. - Work with Google Kubernetes Engine (GKE) for container orchestration and managing distributed applications. Advanced SQL & Data Engineering : - Write efficient SQL queries for data transformation, aggregation, and analysis. - Optimize query performance and cost efficiency in BigQuery. Required Skills & Qualifications : - 4-8 years of experience in GCP Data Engineering - Strong expertise in BigQuery, DataFlow, DataProc, Pub/Sub, and GCS - Experience in SQL, Python, or Java for data processing and transformation - Proficiency in Airflow (Cloud Composer) for scheduling workflows - Hands-on experience with Terraform for cloud infrastructure automation - Familiarity with NoSQL databases like Bigtable for high-scale data handling - Knowledge of GKE for containerized applications and distributed processing Preferred Qualifications : - Experience with CI/CD pipelines for data deployment - Familiarity with Cloud Functions or Cloud Run for serverless execution - Understanding of data governance, security, and compliance Why Join Us ? - Work on cutting-edge GCP data projects in a cloud-first environment - Competitive salary and career growth opportunities - Collaborative and innovative work culture - Exposure to big data, real-time streaming, and advanced analytics.
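One concrete instance of the "performance tuning and cost efficiency" theme in this posting is creating partitioned, clustered BigQuery tables so queries scan less data. A hedged example (the table ID and schema are placeholders):

```python
# Hedged example of a common BigQuery cost/performance lever: a
# time-partitioned, clustered table. Table ID and schema are invented.
from google.cloud import bigquery

client = bigquery.Client()
table = bigquery.Table(
    "my_project.telemetry.events",  # placeholder table ID
    schema=[
        bigquery.SchemaField("event_ts", "TIMESTAMP"),
        bigquery.SchemaField("device_id", "STRING"),
        bigquery.SchemaField("payload", "JSON"),
    ],
)
table.time_partitioning = bigquery.TimePartitioning(field="event_ts")
table.clustering_fields = ["device_id"]
client.create_table(table)  # raises if the table already exists
```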
Posted 3 weeks ago
4.0 - 7.0 years
5 - 8 Lacs
Gurugram
Work from Office
Key Responsibilities: Gather and analyze data from a variety of sources, including SQL databases, BigQuery, Excel, Power BI, and Python. Good SQL coding skills are a must. Work with stakeholders to understand their needs and translate them into data-driven solutions. Communicate effectively with stakeholders, both verbally and in writing. Lead and manage a team of business analysts. Must be a self-starter, able to manage multiple tasks and projects simultaneously, own deliverables end to end, prioritize workload effectively, and thrive in a dynamic environment. Must be a problem solver with outstanding skills in discovering techniques and a proven ability to translate underlying business needs into actionable insights. Works well under pressure, can work within stringent timelines, and collaborates with teams to achieve results.
Desired Profile: Should have relevant experience of 4-6 years in the field of analytics. Technical capabilities (hands-on): SQL, Advanced Excel, Power BI, BigQuery, R/Python (good to have). Possess strong analytical skills and share points of view with the organization. Penchant for business, curiosity about numbers, and persistence to work with data to generate insights. Provides customized knowledge for client work; prepares accurate, well-developed client deliverables. Experience in app analytics would be preferred. Experience with e-commerce/retail business would be preferred.
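The analyst workflow this posting implies (SQL in BigQuery, then Excel/Power BI handoff) can be sketched as follows; the table and query are invented, and the conversion requires the pyarrow/db-dtypes and openpyxl packages:

```python
# Minimal sketch: run SQL in BigQuery and pull the result into pandas for an
# Excel handoff. Table name is a placeholder; requires db-dtypes and openpyxl.
from google.cloud import bigquery

client = bigquery.Client()
sql = """
    SELECT region, COUNT(DISTINCT user_id) AS mau
    FROM `my_project.app_events.sessions`   -- placeholder table
    WHERE event_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
    GROUP BY region
"""
df = client.query(sql).to_dataframe()
df.to_excel("mau_by_region.xlsx", index=False)
```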
Posted 3 weeks ago
1.0 - 5.0 years
6 - 10 Lacs
Bengaluru
Work from Office
Job Title: Data Engineer. Experience: 5–8 Years. Location: Delhi, Pune, Bangalore (Hyderabad & Chennai also acceptable). Time Zone: Aligned with UK time zone. Notice Period: Immediate joiners only.
Role Overview: We are seeking experienced Data Engineers to design, develop, and optimize large-scale data processing systems. You will play a key role in building scalable, efficient, and reliable data pipelines in a cloud-native environment, leveraging your expertise in GCP, BigQuery, Dataflow, Dataproc, and more.
Key Responsibilities: Design, build, and manage scalable and reliable data pipelines for real-time and batch processing. Implement robust data processing solutions using GCP services and open-source technologies. Create efficient data models and write high-performance analytics queries. Optimize pipelines for performance, scalability, and cost-efficiency. Collaborate with data scientists, analysts, and engineering teams to ensure smooth data integration and transformation. Maintain high data quality, enforce validation rules, and set up monitoring and alerting. Participate in code reviews, deployment activities, and production support.
Technical Skills Required: Cloud Platforms: GCP (Google Cloud Platform), mandatory. Key GCP Services: Dataproc, BigQuery, Dataflow. Programming Languages: Python, Java, PySpark. Data Engineering Concepts: Data ingestion, Change Data Capture (CDC), ETL/ELT pipeline design. Strong understanding of distributed computing, data structures, and performance tuning.
Required Qualifications & Attributes: 5–8 years of hands-on experience in data engineering roles. Proficiency in building and optimizing distributed data pipelines. Solid grasp of data governance and security best practices in cloud environments. Strong analytical and problem-solving skills. Effective verbal and written communication skills. Proven ability to work independently and in cross-functional teams.
Posted 3 weeks ago
1.0 - 5.0 years
6 - 10 Lacs
Pune
Work from Office
We're Hiring: Python Developer! We are looking for an experienced Python Developer to join our dynamic team in Pune, India. The ideal candidate will possess a strong background in software development and be proficient in writing efficient, reusable code. You will play a key role in designing and implementing scalable applications while collaborating with cross-functional teams.
Location: Pune, India. Work Mode: Hybrid. Role: Python Developer. Experience: 5+ years.
What We're Looking For: Proven experience designing, building, and operating data-oriented solutions in a high-volume, transactional, global industry. Experience with advertising technology (AdTech) highly desired. Proven experience developing simple/scalable/reliable architectures, building and operating concurrent, distributed systems, and solving difficult and novel problems. Proven experience developing data structures and algorithms, including experience working with ML/AI solutions. Proven experience and a passion for developing and operating data-oriented and/or full stack solutions using Python, JavaScript/TypeScript, Airflow/Composer, Node, Kafka, Snowflake, BigQuery, and a mix of data platforms such as Spark, Hadoop, AWS Athena, Postgres and Redis. Excellent SQL development, query optimization and data pipeline development skills required. Strong experience using public cloud platforms including AWS and GCP is required; experience with Docker and Kubernetes strongly preferred. Proven experience in modern software development and testing practices, with a willingness to share, partner, support and coach other engineers, product people, and operations. Experience in employing TDD, BDD or ATDD highly desirable. Proven experience contributing to the development of principles, practices, and tooling supporting agile, testing/QA, DevSecOps, automation, and SRE. Experience in Trunk Based Development, XP, and implementing CI/CD highly desirable. Experience in SaaS product engineering and operations highly desirable. A focus on continuous learning and improving, both technically and professionally, in your industry, for you and your teams. Demonstrated resilience, with experience working in ambiguous situations.
What You'll Do: Develop software as a member of one of our engineering teams, participating in all stages of development, delivery and operations, together with your tech lead, colleagues, Product, Data Science, and Design leaders. Develop solutions that are simple, scalable, reliable, secure, maintainable, and make a measurable impact. Develop and deliver new features, maintain our product, and drive growth to hit team KPIs. Employ modern pragmatic engineering principles, practices, and tooling, including TDD/BDD/ATDD, XP, QA Engineering, Trunk Based Development, Continuous Delivery, automation, DevSecOps, and Site Reliability Engineering. Contribute to driving ongoing improvements to our engineering principles, practices, and tooling. Provide support and mentorship to junior engineers, prioritising continuous learning and development. Develop and maintain a contemporary understanding of AdTech developments, industry standards, partner and competitor platform developments, and commercial models, from an engineering perspective. Combine these insights with technical expertise to contribute to our strategy and plans, influence product design, shape our roadmap, and help plan delivery.
Ready to take your career to the next level? Apply now and join us on this exciting journey!
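For the Python + Kafka part of the stack this posting names, an event publish is a few lines with the widely used kafka-python client. Illustration only; broker and topic names are invented:

```python
# Illustration only: producing an AdTech-style event to Kafka with kafka-python.
# Broker address and topic name are placeholders.
import json
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="broker:9092",  # placeholder broker
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

impression = {"ad_id": "cr-881", "user_id": "u-123", "event": "impression"}
producer.send("ad-events", impression)  # placeholder topic
producer.flush()  # ensure delivery before the process exits
```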
Posted 3 weeks ago
10.0 - 18.0 years
25 - 30 Lacs
Noida
Work from Office
Responsibilities:
- Collaborate with the sales team to understand customer challenges and business objectives and propose solutions, POCs, etc.
- Develop and deliver impactful technical presentations and demos showcasing the capabilities of GCP Data and AI and GenAI solutions.
- Conduct technical proof-of-concepts (POCs) to validate the feasibility and value proposition of GCP solutions.
- Collaborate with technical specialists and solution architects from the COE team to design and configure tailored cloud solutions.
- Manage and qualify sales opportunities, working closely with the sales team to progress deals through the sales funnel.
- Stay up to date on the latest GCP offerings, trends, and best practices.
Experience:
- Design and implement a comprehensive strategy for migrating and modernizing existing relational on-premise databases to a scalable and cost-effective solution on Google Cloud Platform (GCP).
- Design and architect solutions for DWH modernization, with experience building data pipelines in GCP.
- Strong experience in BI reporting tools (Looker, Power BI and Tableau).
- In-depth knowledge of Google Cloud Platform (GCP) services, particularly Cloud SQL, Postgres, AlloyDB, BigQuery, Looker, Vertex AI and Gemini (GenAI).
- Strong knowledge and experience in providing solutions to process massive datasets in real time and batch using cloud-native/open-source orchestration techniques.
- Build and maintain data pipelines using Cloud Dataflow to orchestrate real-time and batch data processing for streaming and historical data.
- Strong knowledge and experience in best practices for data governance, security, and compliance.
- Excellent communication and presentation skills, with the ability to tailor technical information to customer needs.
- Strong analytical and problem-solving skills.
- Ability to work independently and as part of a team.
Posted 3 weeks ago
4.0 - 8.0 years
6 - 10 Lacs
Gurugram
Work from Office
Key Responsibilities:
- Gather and analyze data from a variety of sources, including SQL databases, BigQuery, Excel, Power BI, and Python. Good SQL coding skills are a must.
- Work with stakeholders to understand their needs and translate them into data-driven solutions.
- Communicate effectively with stakeholders, both verbally and in writing.
- Lead and manage a team of business analysts.
- Must be a self-starter, able to manage multiple tasks and projects simultaneously, own deliverables end to end, prioritize workload effectively, and thrive in a dynamic environment.
- Must be a problem solver with outstanding skills in discovering techniques and a proven ability to translate underlying business needs into actionable insights.
- Works well under pressure, can work within stringent timelines, and collaborates with teams to achieve results.
Desired Profile:
- Should have relevant experience of 4-7 years in the field of analytics.
- Technical capabilities (hands-on): SQL, Advanced Excel, Power BI, BigQuery, R/Python (good to have).
- Possess strong analytical skills and share points of view with the organization.
- Penchant for business, curiosity about numbers, and persistence to work with data to generate insights.
- Provides customized knowledge for client work; prepares accurate, well-developed client deliverables.
- Experience in app analytics would be preferred.
- Experience with e-commerce/retail business would be preferred.
Posted 3 weeks ago
1.0 - 4.0 years
10 - 14 Lacs
Pune
Work from Office
Overview: Design, develop, and maintain data pipelines and ETL/ELT processes using PySpark/Databricks. Optimize performance for large datasets through techniques such as partitioning, indexing, and Spark optimization. Collaborate with cross-functional teams to resolve technical issues and gather requirements.
Responsibilities: Ensure data quality and integrity through data validation and cleansing processes. Analyze existing SQL queries, functions, and stored procedures for performance improvements. Develop database routines like procedures, functions, and views. Participate in data migration projects and understand technologies like Delta Lake/warehouse. Debug and solve complex problems in data pipelines and processes.
Qualifications: Bachelor's degree in Computer Science, Engineering, or a related field. Strong understanding of distributed data processing platforms like Databricks and BigQuery. Proficiency in Python, PySpark, and SQL programming languages. Experience with performance optimization for large datasets. Strong debugging and problem-solving skills. Fundamental knowledge of cloud services, preferably Azure or GCP. Excellent communication and teamwork skills.
Nice to Have: Experience in data migration projects. Understanding of technologies like Delta Lake/warehouse.
What we offer you: Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing. Flexible working arrangements, advanced technology, and collaborative workspaces. A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results. A global network of talented colleagues, who inspire, support, and share their expertise to innovate and deliver for our clients. Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro and tailored learning opportunities for ongoing skills development. Multi-directional career paths that offer professional growth and development through new challenges, internal mobility and expanded roles. We actively nurture an environment that builds a sense of inclusion, belonging and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women's Leadership Forum.
At MSCI we are passionate about what we do, and we are inspired by our purpose: to power better investment decisions. You'll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards and perform beyond expectations for yourself, our clients, and our industry. MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process. MSCI Inc. is an equal opportunity employer.
It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries.
To all recruitment agencies: MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes.
Note on recruitment scams: We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try and elicit personal information from job seekers. Read our full note on careers.msci.com
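The MSCI posting above calls out Spark optimization for large datasets. One common lever, sketched here under invented paths and join keys, is broadcasting a small dimension table to avoid a shuffle join:

```python
# Hedged sketch of one Spark optimization technique the posting mentions:
# broadcasting a small dimension table to skip a shuffle join. Paths assumed.
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.getOrCreate()

facts = spark.read.parquet("/data/facts/orders")    # large fact table
dims = spark.read.parquet("/data/dims/products")    # small lookup table

# broadcast() ships the small table to every executor, avoiding the shuffle.
joined = facts.join(broadcast(dims), on="product_id", how="left")
joined.write.mode("overwrite").parquet("/data/marts/orders_enriched")
```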
Posted 3 weeks ago
4.0 - 8.0 years
16 - 25 Lacs
Bengaluru
Hybrid
Required Skills: Successful candidates will have demonstrated the following skills and characteristics.
Must Have: Proven expertise in supply chain analytics across domains such as demand forecasting, inventory optimization, logistics, segmentation, and network design. Well-versed, with hands-on experience in optimization methods such as linear programming, mixed-integer programming, and scheduling optimization. An understanding of third-party optimization solvers like Gurobi is an added advantage. Proficiency in forecasting techniques (e.g., Holt-Winters, ARIMA, ARIMAX, SARIMA, SARIMAX, FBProphet, NBeats) and machine learning techniques (supervised and unsupervised). Strong command of statistical modeling, testing, and inference. Proficient in using GCP tools: BigQuery, Vertex AI, Dataflow, Looker. Building data pipelines and models for forecasting, optimization, and scenario planning. Strong SQL and Python programming skills; experience deploying models in a GCP environment. Knowledge of orchestration tools like Cloud Composer (Airflow).
Nice to Have: Familiarity with MLOps, containerization (Docker, Kubernetes), and orchestration tools (e.g., Cloud Composer). Strong communication and stakeholder engagement skills at the executive level.
Roles and Responsibilities: Assist analytics projects within the supply chain domain, driving design, development, and delivery of data science solutions. Develop and execute project and analysis plans under the guidance of the Project Manager. Interact with and advise consultants/clients in the US as a subject matter expert to formalize the data sources to be used, the datasets to be acquired, and the data and use-case clarifications needed to get a strong hold on the data and the business problem to be solved. Drive and conduct analysis using advanced analytics tools and coach junior team members. Implement necessary quality-control measures to ensure deliverable integrity, such as data quality, model robustness, and explainability for deployments. Validate analysis outcomes and recommendations with all stakeholders, including the client team. Build storylines and make presentations to the client team and/or PwC project leadership team. Contribute to knowledge- and firm-building activities.
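Of the forecasting techniques this posting lists, ARIMA is the most compact to illustrate. A hedged sketch using statsmodels on a synthetic demand series (the order parameters are chosen arbitrarily here, not tuned):

```python
# Hedged illustration of one forecasting technique the posting lists:
# ARIMA via statsmodels. The demand series below is synthetic.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(42)
demand = 100 + np.cumsum(rng.normal(0, 5, size=60))  # fake weekly demand

model = ARIMA(demand, order=(1, 1, 1))  # (p, d, q) chosen arbitrarily here
fitted = model.fit()
print(fitted.forecast(steps=4))  # next four periods' point forecasts
```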
Posted 3 weeks ago
BigQuery, a powerful cloud-based data warehouse provided by Google Cloud, is in high demand in the job market in India. Companies are increasingly relying on BigQuery to analyze and manage large datasets, driving the need for skilled professionals in this area.
The average salary range for BigQuery professionals in India varies based on experience level. Entry-level positions may start at around INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 15-20 lakhs per annum.
In the field of BigQuery, a typical career progression may include roles such as Junior Developer, Developer, Senior Developer, Tech Lead, and eventually moving into managerial positions such as Data Architect or Data Engineering Manager.
Alongside BigQuery, professionals in this field often benefit from having skills in SQL, data modeling, data visualization tools like Tableau or Power BI, and cloud platforms like Google Cloud Platform or AWS.
As you explore opportunities in the BigQuery job market in India, remember to continuously upskill and stay updated with the latest trends in data analytics and cloud computing. Prepare thoroughly for interviews by practicing common BigQuery concepts and showcase your hands-on experience with the platform. With dedication and perseverance, you can excel in this dynamic field and secure rewarding career opportunities. Good luck!