
962 BigQuery Jobs - Page 35

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

5 - 10 years

7 - 12 Lacs

Bengaluru

Work from Office

Naukri logo

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must-have skills: Apache Spark, Java, Apache Kafka
Good-to-have skills: Google Dataproc
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions for data generation, collection, and processing. You will create data pipelines, ensure data quality, and implement ETL processes to migrate and deploy data across systems. Your typical day will involve designing and developing data solutions, collaborating with teams to ensure data quality, and implementing ETL processes for data migration and deployment.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Design and develop data solutions for data generation, collection, and processing.
- Create data pipelines to ensure efficient data flow.
- Implement ETL processes to migrate and deploy data across systems.
- Ensure data quality and integrity throughout the data solutions.
- Collaborate with cross-functional teams to gather requirements and understand data needs.
- Optimize data solutions for performance and scalability.
- Troubleshoot and resolve data-related issues.
- Stay up to date with the latest trends and technologies in data engineering.

Professional & Technical Skills:
- Must-have skills: Proficiency in Apache Spark, Java, and Apache Kafka.
- Good-to-have skills: Experience with Apache Airflow and Google Dataproc.
- Strong understanding of data engineering principles and best practices.
- Experience with data modeling and database design.
- Hands-on experience with data integration and ETL tools.
- Knowledge of cloud platforms and services, such as AWS or Google Cloud Platform (GCP).
- Experience performing analysis with large datasets in a cloud-based environment, preferably with an understanding of GCP.
- Experience with big data technologies such as Hadoop and Hive.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.
- Strong experience with multiple database models (SQL, NoSQL, OLTP and OLAP).
- Strong experience with data streaming architecture (Kafka, Spark, Airflow).
- Strong knowledge of cloud data platforms and technologies such as GCS, BigQuery, Cloud Composer, Dataproc and other cloud-native offerings.
- Knowledge of Infrastructure as Code (IaC) and associated tools (Terraform, Ansible, etc.).
- Experience pulling data from a variety of source types, including Mainframe (EBCDIC), fixed-length and delimited files, and databases (SQL, NoSQL, time-series).
- Comfortable communicating with various stakeholders (technical and non-technical).
- GCP Data Engineer certification is a nice-to-have.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Apache Spark.
- This position is based in Mumbai.
- A 15 years full-time education is required.

Qualifications: 15 years full-time education
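The data-munging skills this listing calls out (cleaning, transformation, normalization) can be sketched in plain Python. The record fields and rules below are hypothetical illustrations, not something from the posting:

```python
def normalize_record(raw):
    """Clean and normalize one raw record: trim and lowercase keys,
    standardize city casing, and coerce the salary field to an int."""
    cleaned = {k.strip().lower(): v for k, v in raw.items()}
    cleaned["city"] = cleaned.get("city", "").strip().title()
    # Coerce salary to int, treating blank or malformed values as None
    try:
        cleaned["salary"] = int(str(cleaned.get("salary", "")).replace(",", ""))
    except ValueError:
        cleaned["salary"] = None
    return cleaned

records = [
    {" City ": " bengaluru ", "Salary": "1,200,000"},
    {"city": "PUNE", "salary": ""},
]
normalized = [normalize_record(r) for r in records]
```

A production pipeline would apply the same idea inside a Spark or Dataproc job over large datasets rather than a plain Python loop.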

Posted 3 months ago

Apply

3 - 8 years

5 - 10 Lacs

Pune

Work from Office


Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: Google BigQuery
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. You will play a crucial role in shaping the data platform components.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with Integration Architects and Data Architects to design and implement data platform components.
- Ensure seamless integration between various systems and data models.
- Develop and maintain data platform blueprints.
- Implement data governance policies and procedures.
- Conduct performance tuning and optimization of data platform components.

Professional & Technical Skills:
- Must-have skills: Proficiency in Databricks Unified Data Analytics Platform.
- Good-to-have skills: Experience with Google BigQuery.
- Strong understanding of data platform architecture and design principles.
- Hands-on experience in implementing data pipelines and ETL processes.
- Proficient in SQL and other query languages.
- Knowledge of cloud platforms such as AWS or Azure.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Pune office.
- A 15 years full-time education is required.

Qualifications: 15 years full-time education

Posted 3 months ago

Apply

5 - 10 years

7 - 12 Lacs

Hyderabad

Work from Office


Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Google BigQuery
Good-to-have skills: No Function Specialty
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will be responsible for creating efficient and scalable solutions using Google BigQuery. Your typical day will involve collaborating with the team, analyzing business requirements, designing and implementing application features, and ensuring the applications meet quality standards and performance goals.

Roles & Responsibilities:
1. Design, create, code, and support a variety of data pipelines and models on GCP cloud technology.
2. Strong hands-on exposure to GCP services such as BigQuery, Composer, etc.
3. Partner with business/data analysts, architects, and other key project stakeholders to deliver data requirements.
4. Develop data integration and ETL (Extract, Transform, Load) processes.
5. Support existing data warehouses and related pipelines.
6. Ensure data quality, security, and compliance.
7. Optimize data processing and storage efficiency; troubleshoot issues in the data space.
8. Seek to learn new skills and tools used in the data space (e.g., dbt, Monte Carlo).
9. Excellent verbal and written communication skills; excellent analytical skills with an Agile mindset.
10. Demonstrates strong attention to detail and delivery accuracy.
11. Self-motivated team player with the ability to overcome challenges and achieve desired results.
12. Works effectively in a globally distributed environment.

Professional & Technical Skills:
- Must-have skills: Proficiency in Google BigQuery.
- Strong understanding of data modeling and database design principles.
- Experience with SQL and query optimization techniques.
- Hands-on experience with ETL processes and data integration.
- Knowledge of cloud computing platforms such as Google Cloud Platform.
- Good-to-have skills: Experience with data visualization tools such as Tableau or Power BI.

Skill Proficiency Expectation:
- Expert: Data Storage, BigQuery, SQL, Composer, Data Warehousing Concepts
- Intermediate: Python
- Basic/Preferred: DB, Kafka, Pub/Sub

Additional Information:
- The candidate should have a minimum of 5 years of experience in Google BigQuery.
- This position is based at our Hyderabad office.
- A 15 years full-time education is required.

Qualifications: 15 years full-time education
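The ETL responsibilities above (extract, transform, load) can be illustrated with a minimal standard-library sketch; the column names and the filter rule are hypothetical, and a real pipeline would load into BigQuery rather than an in-memory list:

```python
import csv
import io

# Extract: parse delimited source data (an in-memory CSV for illustration)
raw = io.StringIO("order_id,amount\n1,250\n2,\n3,900\n")
rows = list(csv.DictReader(raw))

# Transform: drop rows with missing amounts, coerce types, derive a field
transformed = [
    {
        "order_id": int(r["order_id"]),
        "amount": int(r["amount"]),
        "is_large": int(r["amount"]) > 500,
    }
    for r in rows
    if r["amount"]
]

# Load: append to the target store (a stand-in for a warehouse table insert)
warehouse_table = []
warehouse_table.extend(transformed)
```

The same three stages map onto a Composer/Airflow DAG, with each stage as a task and BigQuery as the load target.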

Posted 3 months ago

Apply

5 - 10 years

7 - 12 Lacs

Nagpur

Work from Office


Project Role: Advanced Application Engineer
Project Role Description: Utilize modular architectures, next-generation integration techniques and a cloud-first, mobile-first mindset to provide vision to Application Development Teams. Work with an Agile mindset to create value across projects of multiple scopes and scale.
Must-have skills: SAP FI CO Finance
Good-to-have skills: SAP CO Product Cost Controlling
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full-time education

About The Role: Sr. SAP S4H FICO Consultant

Job Duties & Responsibilities:
- In-depth SAP solutions and process knowledge, including industry best practices.
- Leads fit/gap and other types of working sessions to understand needs driven by business process requirements.
- Translates requirements into solutions, using SAP Best Practices or Navisite solutions as a baseline.
- Leader of their respective workstream on assigned projects.
- Works in conjunction with the Navisite Service Delivery Lead to establish the overall plan for their respective work for the customer.
- SAP configuration experience, primarily in the FI/CO modules.
- Configures SAP CO systems to meet client business requirements, including connection points with SD, PP, MM and other modules, and implementation of SAP best practices.
- At least two full lifecycle implementations as an SAP CO functional consultant and a minimum of 5 support projects. S/4HANA experience is a must.
- Applies strong knowledge of business processes for designing, developing, and testing SAP functions associated with financial operations, including expertise in cost center accounting (CCA), internal order accounting (IOA), product cost controlling (CO-PC), profitability analysis (CO-PA), and profit center accounting (PCA).
- Focuses on business process re-engineering efforts and technology enablement.
- Serves as the subject matter expert on product systems, processes, network architecture and interface capabilities.
- In-depth understanding and execution skills in FI and CO sub-modules. SAP FI: General Ledger accounting, Accounts Receivable, Accounts Payable, Asset accounting.
- Experience in developing specifications for interfaces and custom reports.
- Creates functional specifications for development objects.
- Conducts unit testing on the overall solution, including technical objects.
- Supports integration testing and user acceptance testing with the customer.
- Explores new SAP applications as a subject matter expert and may be a first adopter of emerging SAP technologies.
- Supports Navisite Application Managed Services (AMS) by working and resolving tickets as assigned.
- Sustains adequate product knowledge through formal training, webinars, SAP publications, collaboration among colleagues and self-study.
- Enforces the core competencies and professional standards of Navisite in all client engagements.
- Supports internal projects as assigned.
- Collaborates with colleagues to grow product knowledge.
- Assists in the continual improvement of Navisite methods and tools.
- Adheres to Navisite professional standards.
- Willing to travel as per business needs.

Key Competencies: Customer Focus, Results Driven, Business Acumen, Trusted Advisor, Task Management, Problem Solving Skills, Communication Skills, Priority Setting, Presentation Skills, Mentorship and Collaboration, ability to work regularly scheduled shifts, after-hours coverage for critical issues as needed.

Qualifications: 15 years full-time education

Posted 3 months ago

Apply

12 - 17 years

14 - 19 Lacs

Bengaluru

Work from Office


Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: Google Cloud Data Services
Good-to-have skills: NA
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for overseeing the entire application development process and ensuring its successful implementation. This role requires strong leadership skills and the ability to collaborate with cross-functional teams to deliver high-quality solutions.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Lead the effort to design, build, and configure applications.
- Act as the primary point of contact for the application development process.
- Oversee the entire application development process.
- Ensure successful implementation of applications.

Professional & Technical Skills:
- Must-have skills: Proficiency in Google Cloud Data Services.
- Strong understanding of cloud computing concepts and architecture.
- Experience designing and implementing scalable and reliable applications on Google Cloud Platform.
- Hands-on experience with Google Cloud services such as BigQuery, Cloud Storage, and Pub/Sub.
- Solid understanding of data integration and ETL processes.
- Good-to-have skills: Experience with other cloud platforms such as AWS or Azure.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Google Cloud Data Services.
- This position is based in Mumbai.
- A 15 years full-time education is required.

Qualifications: 15 years full-time education

Posted 3 months ago

Apply

5 - 10 years

7 - 12 Lacs

Gurgaon

Work from Office


Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Google Kubernetes Engine
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: Grad

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using Google Kubernetes Engine. Your typical day will involve working with the development team, analyzing requirements, and developing scalable and reliable applications.

Roles & Responsibilities:
- Design, develop, and maintain scalable and reliable applications using Google Kubernetes Engine.
- Collaborate with cross-functional teams to analyze requirements and develop solutions that meet business process and application requirements.
- Develop and maintain technical documentation, including design documents, test plans, and user manuals.
- Ensure application quality by conducting unit testing, integration testing, and performance testing.

Professional & Technical Skills:
- Certification: GCP Associate Cloud Engineer or GCP Professional Cloud Architect.
- Must-have skills: Hands-on experience in setting up cloud foundations/landing zones, cloud infrastructure design, network architecture design and implementation, cloud security, and logging and monitoring; hands-on experience in at least two or more disciplines: GKE migration, GCVE migration, Cloud Build and Cloud Run, VM migration to GCE, CI/CD pipelines; hands-on experience writing Infrastructure as Code using Terraform; excellent verbal and written communication in English; experience in Google Kubernetes Engine.
- Good-to-have skills: Experience in Docker, Kubernetes, and other containerization technologies.
- Strong understanding of the software development life cycle (SDLC) and agile methodologies.
- Experience developing RESTful APIs and microservices architecture.
- Experience developing and maintaining technical documentation.
- Experience conducting unit testing, integration testing, and performance testing.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Google Kubernetes Engine.
- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering scalable and reliable applications.
- This position is based at our Gurugram office.

Qualifications: Grad

Posted 3 months ago

Apply

7 - 12 years

9 - 14 Lacs

Pune

Work from Office


Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: Google Cloud Data Services
Good-to-have skills: Platform Engineering
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your day will involve overseeing the application development process and ensuring seamless communication within the team and with stakeholders.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead the application design and development process.
- Ensure timely delivery of high-quality applications.
- Provide technical guidance and mentorship to team members.

Professional & Technical Skills:
- Must-have skills: Proficiency in Google Cloud Data Services and Platform Engineering.
- Strong understanding of cloud architecture and Google Cloud Platform services.
- Experience designing and implementing scalable and reliable applications on Google Cloud.
- Proficient in cloud-native application development and deployment practices.
- Hands-on experience with Google Cloud Dataflow and BigQuery.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Google Cloud Data Services.
- This position is based at our Pune office.
- A 15 years full-time education is required.

Qualifications: 15 years full-time education

Posted 3 months ago

Apply

5 - 10 years

7 - 12 Lacs

Kolkata

Work from Office


Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Google BigQuery
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: Full-time 15 years qualification

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using Google BigQuery. Your typical day will involve working with the development team, analyzing business requirements, and developing solutions to meet those requirements.

Roles & Responsibilities:
- Design, build, and configure applications to meet business process and application requirements using Google BigQuery.
- Collaborate with cross-functional teams to analyze business requirements and develop solutions to meet those requirements.
- Develop and maintain technical documentation related to application development.
- Ensure that all development activities are completed on time, within budget, and to the required quality standards.

Professional & Technical Skills:
- Proficiency in Google BigQuery.
- Experience designing, building, and configuring applications to meet business process and application requirements.
- Strong understanding of software engineering principles and best practices.
- Experience with Agile development methodologies.
- Experience with version control systems such as Git.
- Experience with cloud computing platforms such as Google Cloud Platform.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Google BigQuery.
- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful solutions.
- This position is based at our Mumbai office.

Qualifications: Full-time 15 years qualification

Posted 3 months ago

Apply

5 - 10 years

7 - 12 Lacs

Pune

Work from Office


Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Google BigQuery
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years or more of full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using Google BigQuery. Your typical day will involve collaborating with cross-functional teams, analyzing business requirements, and developing scalable solutions to meet the needs of our clients.

Roles & Responsibilities:
- Design, build, and configure applications to meet business process and application requirements using Google BigQuery.
- Collaborate with cross-functional teams to analyze business requirements and develop scalable solutions to meet the needs of our clients.
- Develop and maintain technical documentation, including design documents, test plans, and user manuals.
- Ensure the quality of deliverables by conducting thorough testing and debugging of applications.

Professional & Technical Skills:
- Must-have skills: Proficiency in Google BigQuery.
- Good-to-have skills: Experience with other cloud-based data warehousing solutions such as Amazon Redshift or Snowflake.
- Strong understanding of SQL and database design principles.
- Experience with ETL tools and processes.
- Experience with programming languages such as Python or Java.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Google BigQuery.
- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful data-driven solutions.
- This position is based at our Bengaluru office.

Qualifications: 15 years or more of full-time education

Posted 3 months ago

Apply

3 - 8 years

0 - 2 Lacs

Chennai, Bengaluru, Hyderabad

Hybrid


Position: Software Engineer / Senior Software Engineer
Location: Anywhere in India
Work Mode: Hybrid/Remote. Candidates based in any of the Brillio locations (Bangalore, Pune, Chennai, Hyderabad or Gurgaon) may be required to come to the office.

Requirements:
- 1-5+ years of experience in software design and development.
- 1+ years of experience in the data engineering field is preferred.
- 1+ years of hands-on experience with the GCP cloud data implementation suite, such as BigQuery, Pub/Sub, Dataflow/Apache Beam, Airflow/Composer, and Cloud Storage.
- Experience in Dialogflow and Java programming is a must.
- Strong experience and understanding of very large-scale data architecture, solutioning, and operationalization of data warehouses, data lakes, and analytics platforms.
- Strong hands-on experience with the following technologies:
  1. GBQ (Google BigQuery)
  2. Python
  3. Apache Airflow
  4. SQL (BigQuery preferred)
- Extensive hands-on experience working with data using SQL and Python (Cloud Functions).
- Comparable skills in AWS and other cloud big data engineering stacks are considered.
- Experience with agile development methodologies.
- Excellent verbal and written communication skills, with the ability to clearly present ideas, concepts, and solutions.
- Bachelor's degree in Computer Science, Information Technology, or a closely related discipline.

Posted 3 months ago

Apply

6 - 10 years

5 - 15 Lacs

Pune, Bengaluru, Gurgaon

Hybrid


Position: GCP Senior Data Engineer (BigQuery)
Location: Pan India
Required Experience: 5 to 9 yrs

Role & responsibilities:
- Ability to assist in translating business requirements into technical specifications.
- Hands-on experience migrating on-premises ETL data pipelines and enterprise data warehouse applications to Google Cloud Platform.
- Experience in data migration from databases such as Greenplum and Teradata to BigQuery.
- Hands-on experience with data analytics services such as Google Cloud Storage, BigQuery, Pub/Sub, Dataflow, Data Fusion, and Cloud Composer.
- Strong experience writing complex SQL queries and PL/SQL procedures.
- Experience writing scripts (Unix, Python).
- Has implemented BQ scripts and procedures, including performance tuning of queries.
- Good knowledge of RDBMS and data warehouse concepts.
- Able to implement BigQuery best practices.
- Data modeling experience in BigQuery is nice to have.
- Strong analytical and troubleshooting skills.
- Ability to mentor and guide junior developers in the team.
- GCP Professional Data Engineer certification is preferred.
- 2+ years' experience working on GCP projects.
- 7+ years of professional experience in the IT industry within the data space is preferred.
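The "complex SQL queries" and BQ scripting skills above can be illustrated with an aggregate query. Standard SQL of this shape runs unchanged in BigQuery; the sketch below uses Python's built-in sqlite3 so it is self-contained, and the table and column names are hypothetical:

```python
import sqlite3

# In-memory database standing in for a warehouse table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount INTEGER)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("south", 100), ("south", 300), ("north", 250)],
)

# Aggregate query of the kind common in warehouse reporting:
# total per region, largest first
rows = conn.execute(
    """
    SELECT region, SUM(amount) AS total
    FROM sales
    GROUP BY region
    ORDER BY total DESC
    """
).fetchall()
```

In BigQuery the same query would typically run against a partitioned, clustered table, with performance tuning focused on pruning scanned bytes rather than on indexes.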

Posted 3 months ago

Apply

3 - 8 years

5 - 10 Lacs

Pune

Work from Office


About The Role:
Job Title: DevOps/ML Engineer
Corporate Title: Assistant Vice President
Location: Pune

Role Description:
We are seeking a highly skilled and experienced DevOps Engineer to join our team, with a focus on Google Cloud and machine learning. The successful candidate will be responsible for designing, implementing, and maintaining our team's infrastructure and workflows on Google Cloud Platform, while also integrating machine learning models and algorithms into our products. This is a unique opportunity to work at the intersection of software development, infrastructure management, and machine learning, and to contribute to the growth and success of our team.

What we'll offer you:
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leave
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those 35 years and above

Your key responsibilities:
- Design, implement, and maintain our team's infrastructure and workflows on Google Cloud Platform, including GCP services such as Google Kubernetes Engine (GKE), Cloud Storage, Vertex AI, Anthos, and Monitoring.
- Design, implement, and maintain our containerization and orchestration strategy using Docker and Kubernetes.
- Collaborate with development teams to ensure seamless integration of containerized applications into our production environment.
- Collaborate with software developers to integrate machine learning models and algorithms into our products, using PyTorch, TensorFlow or other machine learning frameworks.
- Develop and maintain CI/CD pipelines for our products, using tools such as GitHub and GitHub Actions.
- Create and maintain Infrastructure as Code templates using Terraform.
- Ensure the reliability, scalability, and security of our infrastructure and products, using monitoring and logging tools such as Anthos Service Mesh (ASM) and Google Cloud's operations suite (GCO).
- Work closely with other teams, such as software development, data science, and product management, to identify and prioritize infrastructure and machine learning requirements.
- Stay up to date with the latest developments in Google Cloud Platform and machine learning, and apply this knowledge to improve our products and processes.

Your skills and experience:
- Bachelor's degree in computer science, engineering, or a related field.
- At least 3 years of experience in a DevOps or SRE role, with a focus on Google Cloud Platform.
- Strong experience with Infrastructure as Code tools such as Terraform or CloudFormation.
- Experience with containerization technologies such as Docker and container orchestration tools such as Kubernetes.
- Knowledge of machine learning frameworks such as TensorFlow or PyTorch.
- Experience with CI/CD pipelines and automated testing.
- Strong understanding of security and compliance best practices, including GCP security and compliance features.
- Excellent communication and collaboration skills, with the ability to work closely with cross-functional teams.

Preferred Qualifications:
- Master's degree in computer science, engineering, or a related field.
- Knowledge of cloud-native application development, including serverless computing and event-driven architecture.
- Experience with cloud cost optimization and resource management.
- Familiarity with agile software development methodologies and version control systems such as Git.
How we'll support you:
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs

About us and our teams:
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.

Posted 3 months ago

Apply

4 - 8 years

12 - 22 Lacs

Ahmedabad

Work from Office


Design, deliver and maintain the appropriate data solution to provide the correct data for analytical development to address key issues within the organization. Gather detailed data requirements with a cross-functional team to deliver quality results.

Required Candidate Profile:
- Strong experience with cloud services within Azure, AWS, or GCP platforms (preferably Azure).
- Strong experience with analytical tools (preferably SQL, dbt, Snowflake, BigQuery, Tableau).

Posted 3 months ago

Apply

10 - 14 years

12 - 16 Lacs

Hyderabad

Work from Office


Skill required: Marketing Operations - Customer Service Technology
Designation: Business Advisory Associate Manager
Qualifications: Any Graduation
Years of Experience: 10 to 14 years

What would you do?
As a team lead, you should have experience in defining data modeling strategy for projects, functional requirements gathering, and technical design and project management (Waterfall and Agile). You should also have experience designing, implementing and optimizing data transformation processes in GCP BigQuery, with the ability to consolidate, validate and cleanse data from multiple sources. Expertise in life-cycle implementation, upgrades, production support and enhancement, from business analysis, testing, cut-over and migration, and go-live assistance through post-implementation support. Experience with Billing Accounts and knowledge of the supply chain domain is also expected. The role requires digital marketing ads and promotion creation/design, plus familiarity with the platforms and software contact centers use to provide customer support; the quality and capabilities of that technology have a significant impact on both customer and agent experience, as well as contact center performance.

What are we looking for?
- Microsoft SQL Server, MySQL, Google Cloud SQL
- Minimum 5+ years of experience within data analytics
- SQL experience and the ability to build queries
- Customer enterprise experience is preferred
- Help to automate and manually review and group all Billing Account (BA) data
- Help create the structure around cloud accounts
- Arrange customer data to maximize the benefits
- Ability to build and prioritize queries in SQL
- Experience within customer enterprise

Skill proficiency:
- SQL/PL SQL: Mastery
- Google BigQuery: Advanced
- Migration activities: Advanced
- Data analysis: Advanced

Roles and Responsibilities:
- In this role you are required to analyze and solve moderately complex problems.
- Typically creates new solutions, leveraging and, where needed, adapting existing methods and procedures.
- Requires understanding of the strategic direction set by senior management as it relates to team goals.
- Primary upward interaction is with the direct supervisor or team leads.
- Generally interacts with peers and/or management levels at a client and/or within Accenture.
- Should require minimal guidance when determining methods and procedures on new assignments.
- Decisions often impact the team in which they reside and occasionally impact other teams.
- Would manage medium-to-small sized teams and/or work efforts (if in an individual contributor role) at a client or within Accenture.
- Please note that this role may require you to work in rotational shifts.

Qualifications: Any Graduation
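The Billing Account task mentioned above (reviewing and grouping all Billing Account data) can be sketched as a plain-Python equivalent of a SQL GROUP BY; the record fields and values are hypothetical illustrations:

```python
from collections import defaultdict

# Hypothetical billing-account records; field names are illustrative only
accounts = [
    {"billing_account": "BA-1", "charge": 120},
    {"billing_account": "BA-2", "charge": 80},
    {"billing_account": "BA-1", "charge": 40},
]

# Group charges by billing account, mirroring
# SELECT billing_account, SUM(charge) ... GROUP BY billing_account
totals = defaultdict(int)
for row in accounts:
    totals[row["billing_account"]] += row["charge"]
```

At warehouse scale the same grouping would run as a BigQuery query rather than in application code.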

Posted 3 months ago

Apply

1 - 6 years

3 - 8 Lacs

Hyderabad

Work from Office

Naukri logo

Job Area: Engineering Group, Engineering Group > Software Engineering General Summary: We are looking for a talented, motivated leader with experience in building scalable cloud services, infrastructure, and processes. As part of the IoT (Internet of Things) team, you will work on the next generation of IoT products. As a Business Intelligence Engineer (BIE) in this role, you will solve unique and complex problems at a rapid pace, utilizing the latest technologies to create solutions that are highly scalable. You will have deep expertise in gathering requirements and insights, mining large and diverse data sets, data visualization, writing complex SQL queries, building rapid prototypes using Python/R, and generating insights that enable senior leaders to make critical business decisions. Key job responsibilities: You will utilize your deep expertise in business analysis, metrics, reporting, and analytic tools/languages like SQL, Excel, and others to translate data into meaningful insights through collaboration with scientists, software engineers, data engineers, and business analysts. You will have end-to-end ownership of operational, financial, and technical aspects of the insights you are building for the business, and will play an integral role in strategic decision-making. 
Conduct deep-dive analyses of business problems and formulate conclusions and recommendations to be presented to senior leadership. Produce recommendations and insights that will help shape effective metric development and reporting for key stakeholders. Simplify and automate reporting, audits, and other data-driven activities. Partner with engineering teams to enhance data infrastructure, data availability, and broad access to customer insights. Develop and drive best practices in reporting and analysis: data integrity, test design, analysis, validation, and documentation. Learn new technology and techniques to meaningfully support product and process innovation. BASIC QUALIFICATIONS: 1+ years of experience using SQL to query data from databases/data warehouses/cloud data sources (e.g., Redshift, MySQL, PostgreSQL, MS SQL Server, BigQuery). Experience with data visualization using Tableau, Power BI, QuickSight, or similar tools. Bachelor's degree in Statistics, Economics, Math, Finance, Engineering, Computer Science, Information Systems, or a related quantitative field. Ability to operate successfully and independently in a fast-paced environment. Comfort with ambiguity and eagerness to learn new skills. Knowledge of cloud services (AWS, GCP, and/or Azure) is a must. PREFERRED QUALIFICATIONS: Experience with AWS solutions such as EC2, Athena, Glue, S3, and Redshift. Experience in data mining, ETL, etc., and using databases in a business environment with large-scale, complex datasets. Experience creating and building predictive/optimization tools that benefit the business and improve customer experience. Experience articulating business questions and using quantitative techniques to drive insights for the business. Experience dealing with technical and non-technical senior-level managers. Minimum Qualifications: Bachelor's degree in Engineering, Information Systems, Computer Science, or related field. Applicants: Qualcomm is an equal opportunity employer. 
If you are an individual with a disability and need an accommodation during the application/hiring process, rest assured that Qualcomm is committed to providing an accessible process. You may e-mail disability-accomodations@qualcomm.com or call Qualcomm's toll-free number found here. Upon request, Qualcomm will provide reasonable accommodations to support individuals with disabilities to be able to participate in the hiring process. Qualcomm is also committed to making our workplace accessible for individuals with disabilities. (Keep in mind that this email address is used to provide reasonable accommodations for individuals with disabilities. We will not respond here to requests for updates on applications or resume inquiries.) Qualcomm expects its employees to abide by all applicable policies and procedures, including but not limited to security and other requirements regarding protection of Company confidential information and other confidential and/or proprietary information, to the extent those requirements are permissible under applicable law. To all Staffing and Recruiting Agencies: Our Careers Site is only for individuals seeking a job at Qualcomm. Staffing and recruiting agencies and individuals being represented by an agency are not authorized to use this site or to submit profiles, applications or resumes, and any such submissions will be considered unsolicited. Qualcomm does not accept unsolicited resumes or applications from agencies. Please do not forward resumes to our jobs alias, Qualcomm employees or any other company location. Qualcomm is not responsible for any fees related to unsolicited resumes/applications. If you would like more information about this role, please contact Qualcomm Careers.

Posted 3 months ago

Apply

7 - 10 years

9 - 12 Lacs

Bengaluru

Work from Office

Naukri logo

As a Data Engineer, you will design, build, and maintain scalable data pipelines and systems, drawing on your expertise in DBT, GCP BigQuery, and Git, your coding skills, and your experience working in an Agile environment. Key Responsibilities: Data Pipeline Development: Design, develop, and maintain ETL/ELT pipelines to ensure seamless data flow from various sources to our data warehouse using DBT and GCP BigQuery. Data Modelling: Implement and manage data models in BigQuery to support business analytics and reporting needs. Version Control: Utilize Git for version control to manage changes in data pipelines, schemas, and related code. Collaboration: Work closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions that meet their needs. Performance Optimization: Monitor and optimize data pipeline performance, ensuring high availability and reliability. Agile Practices: Participate in Agile ceremonies such as sprint planning, daily stand-ups, and retrospectives to ensure continuous improvement and timely delivery of projects. Experience: Proven experience as a Data Engineer or in a similar role, with 7-10 years of experience and a strong background in data engineering. Technical Skills: Proficiency in DBT for data transformation and modelling. Extensive experience with Google Cloud Platform (GCP), particularly BigQuery. Strong knowledge of Git for version control. Solid coding skills in languages such as SQL, Python, or Java. Agile Environment: Experience working in an Agile development environment, with a good understanding of Agile methodologies and practices. Problem-Solving: Strong analytical and problem-solving skills. Qualifications: Certifications (preferred): Relevant certifications in GCP or DBT. Experience: Experience with additional GCP services (e.g., Dataflow, Pub/Sub, Composer) or other data tools. Education: Bachelor's degree in Computer Science, Engineering, Data Science, or a related field.

Posted 3 months ago

Apply

3 - 8 years

5 - 10 Lacs

Hyderabad

Work from Office

Naukri logo

1. Oracle Transport Management (OTM) Cloud techno-functional consultant with 8 years of expert domain knowledge covering Planning, Execution, and Settlement. S/he must have been a part of at least 1-3 end-to-end OTM implementations. 2. In-depth understanding of OTM Cloud business processes and their data flow. 3. The candidate should have been in client-facing roles and interacted with customers in requirement-gathering workshops, design, configuration, testing, and go-live. 4. Should have strong written and verbal communication skills, personal drive, flexibility, team play, problem-solving, continuous improvement, and client management. 5. Assist in identifying, assessing, and resolving complex functional issues/problems; interact with the client frequently around specific work efforts/deliverables. 6. Configuring and managing technical activities. 7. Configuration of Agents, SQL, and VPD Profiles, and executing as per requirements. 8. Identifying areas of improvement and recommending process modifications to enhance operational efficiencies. 9. Configuration of Agents, Milestones, Rates, Locations, and Itineraries. 10. Managed the integration of OTM with ERPs like SAP/Oracle.

Posted 3 months ago

Apply

5 - 9 years

7 - 11 Lacs

Bengaluru

Work from Office

Naukri logo

An experienced consulting professional who understands solutions, industry best practices, multiple business processes or technology designs within a product/technology family. Operates independently to provide quality work products to an engagement. Performs varied and complex duties and tasks that need independent judgment, to implement Oracle products and technology to meet customer needs. Applies Oracle methodology, company procedures, and leading practices. Detailed Requirements: 1. The candidate is expected to have 6 to 8 years of expert domain knowledge in HCM covering the hire-to-retire cycle. S/he must have been a part of at least 3 end-to-end HCM implementations. 2. The candidate must have expertise in the Benefits modules. 3. In-depth understanding of HCM Cloud business processes and their data flow. 4. The candidate should have previous experience with PeopleSoft/EBS. 5. The candidate should have been in client-facing roles and interacted with customers in requirement-gathering workshops, design, configuration, testing, and go-live. 6. Should be able to provide best-practices consulting and guidance on business processes and implementation. 7. Design and build/configuration of complex requirements in Oracle Cloud. 8. Understand the business requirements and interpret them into appropriate configurations on Oracle Cloud. 9. Conduct design workshops and create workbooks and documentation to support the system design. 10. Experience migrating data from varied legacy systems. 11. Work with technical streams and provide guidance on integrations, conversions, and reports. 12. Assist in the identification, assessment, and resolution of complex functional issues/problems. 13. Should have strong written and verbal communication skills, personal drive, flexibility, team play, problem-solving, influencing and negotiating skills, organizational awareness and sensitivity, engagement delivery, continuous improvement, knowledge sharing, and client management. 14. Good leadership capability with strong planning and follow-up skills, mentorship, work allocation, monitoring, and status updates to the Project Manager.

Posted 3 months ago

Apply

3 - 8 years

5 - 10 Lacs

Bengaluru

Work from Office

Naukri logo

GCP: Data Engineer. Hands-on and deep experience working with Google Data Products (e.g. BigQuery, Dataflow, Dataproc, AI Building Blocks, Looker, Cloud Data Fusion, Dataprep, etc.). Hands-on experience in SQL and Unix scripting. Experience in Python and Kafka. ELT tool experience and hands-on DBT. Google Cloud Professional Data Engineers are responsible for developing Extract, Transform, and Load (ETL) processes to move data from various sources into the Google Cloud Platform. Detailed JD: Must Have: Around 8 to 11 years of experience with strong knowledge in migrating on-premise ETLs to Google Cloud Platform (GCP). 2-3 years of strong BigQuery and GCP experience. Very strong SQL writing skills. Hands-on experience in Python programming. Hands-on experience in the design, development, and implementation of data warehousing in the ETL process. Experience in IT data analytics projects, with hands-on experience migrating on-premise ETLs to GCP using cloud-native tools such as BigQuery, Google Cloud Storage, Composer, Dataflow, and Cloud Functions. GCP certified Associate Cloud Engineer. Practical understanding of data modelling (dimensional and relational), performance tuning, and debugging. Extensive experience in data warehousing using data extraction, data transformation, and data loading, and business intelligence technologies using ELT design. Working experience in CI/CD using GitLab and Jenkins. Good to Have: DBT tool experience. Practical experience in Big Data application development involving various data processing techniques for data ingestion, data modelling, in-stream data processing, and batch analytics using various distributions of Hadoop and its ecosystem tools like HDFS, Hive, Pig, Sqoop, and Spark. Document all work implemented using Confluence and track all requests and changes using Jira. Involved in both technical and managerial activities, with experience in GCP. Responsibilities: Create and maintain optimal data pipeline architecture. Assemble large, complex data sets that meet functional and non-functional business requirements. Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc. Build and maintain the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and GCP data warehousing technologies. Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics. Work with stakeholders including the Executive, Product, Data, and Design teams to assist with data-related technical issues and support their data infrastructure needs. Keep data separated and secure across national boundaries through data centers and GCP regions. Work with data and analytics experts to strive for greater functionality in the data systems. Qualifications: Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), as well as working familiarity with a variety of databases. Experience building and optimizing data warehousing data pipelines (ELT and ETL), architectures, and data sets. Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement. Strong analytic skills related to working with unstructured datasets. Build processes supporting data transformation, data structures, metadata, dependency, and workload management. A successful history of manipulating, processing, and extracting value from large, disconnected datasets. Working knowledge of message queuing, stream processing, and highly scalable big data stores. Strong project management and organizational skills. Experience supporting and working with cross-functional teams in a dynamic environment. Technical Skillset: Experience with data warehouse tools: GCP BigQuery, GCP Big Data, Dataflow, Teradata, etc. Experience with relational SQL and NoSQL databases, including PostgreSQL and MongoDB. Experience with data pipeline and workflow management tools: Data Build Tool (DBT), Airflow, Google Cloud Composer, Google Cloud Pub/Sub, etc. Experience with GCP cloud services: primarily BigQuery, Kubernetes, Cloud Functions, Cloud Composer, Pub/Sub, etc. Experience with object-oriented/object function scripting languages: Python, Java, Terraform, etc. Experience with CI/CD pipeline and workflow management tools: GitHub Enterprise, Cloud Build, Codefresh, etc. Experience with data analytics and visualization tools: Tableau BI Tool (on-prem and SaaS), Data Analytics Workbench (DAW), Visual Data Studio, etc. GCP Data Engineer certification is mandatory.

Posted 3 months ago

Apply

3 - 8 years

5 - 10 Lacs

Bengaluru

Work from Office

Naukri logo

Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Google Cloud Data Services Good to have skills : NA Minimum 3 year(s) of experience is required Educational Qualification : 15 years full time education Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will be responsible for utilizing your expertise in Google Cloud Data Services to develop efficient and effective solutions. Your typical day will involve collaborating with cross-functional teams, analyzing business requirements, designing application architecture, and implementing robust and scalable applications. Roles & Responsibilities: - Expected to perform independently and become an SME. - Required active participation/contribution in team discussions. - Contribute to providing solutions to work-related problems. - Collaborate with cross-functional teams to analyze business requirements. - Design and develop application architecture using Google Cloud Data Services. - Implement robust and scalable applications. - Perform code reviews and ensure adherence to coding standards. - Troubleshoot and debug application issues. - Stay updated with the latest industry trends and technologies. - Provide technical guidance and support to junior team members. Professional & Technical Skills: - Must-Have Skills: Proficiency in Google Cloud Data Services. - Strong understanding of cloud computing concepts and architecture. - Experience with data storage and processing using Google Cloud Platform. - Hands-on experience with Google Cloud services such as BigQuery, Cloud Storage, and Dataflow. - Knowledge of data integration and ETL processes. - Familiarity with programming languages such as Python or Java. Additional Information: - The candidate should have a minimum of 3 years of experience in Google Cloud Data Services. - This position is based at our Bengaluru office. - 15 years of full-time education is required. Qualifications: 15 years full time education

Posted 3 months ago

Apply

5 - 10 years

7 - 12 Lacs

Kolkata

Work from Office

Naukri logo

Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Google BigQuery Good to have skills : NA Minimum 5 year(s) of experience is required Educational Qualification : 15 years or more of full time education Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using Google BigQuery. Your typical day will involve collaborating with cross-functional teams, analyzing business requirements, and developing scalable solutions to meet the needs of our clients. Roles & Responsibilities: - Design, build, and configure applications to meet business process and application requirements using Google BigQuery. - Collaborate with cross-functional teams to analyze business requirements and develop scalable solutions to meet the needs of our clients. - Develop and maintain technical documentation, including design documents, test plans, and user manuals. - Ensure the quality of deliverables by conducting thorough testing and debugging of applications. Professional & Technical Skills: - Must-Have Skills: Proficiency in Google BigQuery. - Good-to-Have Skills: Experience with other cloud-based data warehousing solutions such as Amazon Redshift or Snowflake. - Strong understanding of SQL and database design principles. - Experience with ETL tools and processes. - Experience with programming languages such as Python or Java. Additional Information: - The candidate should have a minimum of 5 years of experience in Google BigQuery. - The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful data-driven solutions. - This position is based at our Bengaluru office. Qualifications: 15 years or more of full time education

Posted 3 months ago

Apply

7 - 12 years

9 - 14 Lacs

Pune

Work from Office

Naukri logo

Job description: Must have 7+ years of experience. Good hands-on experience with Jenkins, Git, Terraform (infra provisioning), GKE, DevSecOps, and cloud concepts. Provision the infrastructure pipeline using Terraform, including vulnerability remediation. Provision Dataflow and BigQuery resources using Terraform. The candidate should be from an infrastructure background.

Posted 3 months ago

Apply

6 - 10 years

8 - 12 Lacs

Gurgaon

Work from Office

Naukri logo

About The Role : Role Purpose: Data Analyst with expertise in Data Modeling, Data Pipelines, ETL Processes, Tableau, SQL, and Snowflake. Requirements: - Strong expertise in data modeling, data warehousing, and ETL processes. - Proficient in SQL and experience with data warehousing tools (e.g., Snowflake, Redshift, BigQuery) and ETL tools (e.g., Talend, Informatica, SSIS). - Demonstrated ability to lead and manage complex projects involving cross-functional teams. - Excellent analytical, problem-solving, and organizational skills. - Strong communication and leadership abilities, with a track record of mentoring and developing team members. - Experience with data visualization tools (e.g., Tableau, Power BI) is a plus. - Preference for candidates with experience in ETL using Python, Airflow, or DBT. Additional responsibilities: Build capability to ensure operational excellence and maintain superior customer service levels for the existing account/client. Undertake product trainings to stay current with product features, changes, and updates. Enroll in product-specific and any other trainings per client requirements/recommendations. Partner with team leaders to brainstorm and identify training themes and learning issues to better serve the client. Update job knowledge by participating in self-learning opportunities and maintaining personal networks. Deliverables: No. | Performance Parameter | Measure — 1 | Process | No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback — 2 | Self-Management | Productivity, efficiency, absenteeism, training hours, number of technical trainings completed

Posted 3 months ago

Apply

4 - 8 years

6 - 10 Lacs

Bengaluru

Work from Office

Naukri logo

JR REQ: GCP Data Engineer | 4 to 8 years | Bangalore, Hyderabad | Chahat Parveen | TCS C2H | 900000

Posted 3 months ago

Apply

5 - 7 years

8 - 10 Lacs

Chennai, Pune, Delhi

Work from Office

Naukri logo

The ideal candidate should possess technical expertise in the following areas, along with soft skills such as communication, collaboration, time management, and organizational abilities. Key Skills & Experience: Soft Skills: Communication, Collaboration, Time Management, Organizational Skills, Positive Attitude. Experience: Proficiency in Data Engineering, SQL, and Cloud Technologies. Must-Have Technical Skills: Talend; SQL, SQL Server, T-SQL; SQL Agent; Snowflake / BigQuery; GCP (Google Cloud Platform); SSIS; Dataproc; Composer / Airflow; Python. Nice-to-Have Technical Skills: Dataplex; Dataflow; BigLake, Lakehouse, Bigtable; GCP Pub/Sub; BQ API, BQ Connection API.

Posted 3 months ago

Apply

Exploring BigQuery Jobs in India

BigQuery, a powerful cloud-based data warehouse provided by Google Cloud, is in high demand in the job market in India. Companies are increasingly relying on BigQuery to analyze and manage large datasets, driving the need for skilled professionals in this area.
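A big part of working with BigQuery at this scale is reasoning about cost: the on-demand model bills per byte scanned. As a hedged sketch (the per-TiB rate and the simplified per-query minimum below are assumptions for illustration; check Google's current pricing page), estimating a query's cost is simple arithmetic:

```python
# Rough estimator for BigQuery on-demand query cost.
# The rate is an assumed figure for illustration; BigQuery's actual
# minimum billing is 10 MB per table referenced (simplified here
# to a per-query minimum).
USD_PER_TIB = 6.25           # assumed on-demand rate per TiB scanned
TIB = 1024 ** 4              # bytes in one tebibyte
MIN_BILLED = 10 * 1024 ** 2  # simplified 10 MB minimum billed

def estimate_query_cost(bytes_scanned: int) -> float:
    """Return an approximate USD cost for a single on-demand query."""
    billed = max(bytes_scanned, MIN_BILLED)
    return billed / TIB * USD_PER_TIB

# A query scanning a full 1 TiB costs exactly the assumed list rate.
print(estimate_query_cost(TIB))  # 6.25
```

This kind of back-of-the-envelope estimate is why partitioning and clustering matter so much in practice: anything that reduces bytes scanned reduces the bill proportionally.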

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Delhi
  4. Hyderabad
  5. Pune

Average Salary Range

The average salary range for BigQuery professionals in India varies based on experience level. Entry-level positions may start at around INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 15-20 lakhs per annum.

Career Path

In the field of BigQuery, a typical career progression may include roles such as Junior Developer, Developer, Senior Developer, Tech Lead, and eventually moving into managerial positions such as Data Architect or Data Engineering Manager.

Related Skills

Alongside BigQuery, professionals in this field often benefit from having skills in SQL, data modeling, data visualization tools like Tableau or Power BI, and cloud platforms like Google Cloud Platform or AWS.
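To make the SQL and data-modeling side concrete, here is a small, hypothetical helper (all table and column names are invented for illustration) that emits BigQuery standard-SQL DDL for a partitioned, clustered table — a common schema-design pattern on the platform:

```python
def partitioned_table_ddl(table, columns, partition_col, cluster_cols):
    """Build a CREATE TABLE statement with partitioning on a DATE
    column and clustering, in BigQuery standard-SQL DDL syntax."""
    col_defs = ",\n  ".join(f"{name} {dtype}" for name, dtype in columns)
    return (
        f"CREATE TABLE `{table}` (\n  {col_defs}\n)\n"
        f"PARTITION BY {partition_col}\n"
        f"CLUSTER BY {', '.join(cluster_cols)}"
    )

# Hypothetical events table, partitioned by day and clustered by user.
ddl = partitioned_table_ddl(
    table="my_project.analytics.events",  # assumed project/dataset
    columns=[("event_date", "DATE"),
             ("user_id", "STRING"),
             ("amount", "NUMERIC")],
    partition_col="event_date",
    cluster_cols=["user_id"],
)
print(ddl)
```

Partitioning lets BigQuery prune whole partitions when a query filters on `event_date`, while clustering co-locates rows with similar `user_id` values inside each partition to cut bytes scanned further.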

Interview Questions

  • What is BigQuery and how does it differ from traditional databases? (basic)
  • How can you optimize query performance in BigQuery? (medium)
  • Explain the concepts of partitions and clustering in BigQuery. (medium)
  • What are some best practices for designing schemas in BigQuery? (medium)
  • How does BigQuery handle data encryption at rest and in transit? (advanced)
  • Can you explain how BigQuery pricing works? (basic)
  • What are the limitations of BigQuery in terms of data size and query complexity? (medium)
  • How can you schedule and automate tasks in BigQuery? (medium)
  • Describe your experience with BigQuery ML and its applications. (advanced)
  • How does BigQuery handle nested and repeated fields in a schema? (basic)
  • Explain the concept of slots in BigQuery and how they impact query processing. (medium)
  • What are some common use cases for BigQuery in real-world scenarios? (basic)
  • How does BigQuery handle data ingestion from various sources? (medium)
  • Describe your experience with BigQuery scripting and stored procedures. (medium)
  • What are the benefits of using BigQuery over traditional on-premises data warehouses? (basic)
  • How do you troubleshoot and optimize slow-running queries in BigQuery? (medium)
  • Can you explain the concept of streaming inserts in BigQuery? (medium)
  • How does BigQuery handle data security and access control? (advanced)
  • Describe your experience with BigQuery Data Transfer Service. (medium)
  • What are the differences between BigQuery and other cloud-based data warehousing solutions? (basic)
  • How do you handle data versioning and backups in BigQuery? (medium)
  • Explain how you would design a data pipeline using BigQuery and other GCP services. (advanced)
  • What are some common challenges you have faced while working with BigQuery and how did you overcome them? (medium)
  • How do you monitor and optimize costs in BigQuery? (medium)
  • Can you walk us through a recent project where you used BigQuery to derive valuable insights from data? (advanced)
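Several of the questions above concern nested and repeated fields. A small, self-contained sketch (pure Python, with invented field names) shows the semantics BigQuery's `UNNEST` applies to a repeated STRUCT column — one output row per array element, with the parent row's scalar columns copied onto each:

```python
def unnest(rows, repeated_field):
    """Emulate SQL's CROSS JOIN UNNEST on a repeated STRUCT field:
    emit one flat row per array element, merging the parent row's
    scalar columns with each element's sub-fields."""
    flat_rows = []
    for row in rows:
        parent = {k: v for k, v in row.items() if k != repeated_field}
        for element in row.get(repeated_field, []):
            flat_rows.append({**parent, **element})
    return flat_rows

# An order with a repeated "items" field, shaped like a row from the
# BigQuery JSON API (field names are illustrative).
orders = [{"order_id": 1,
           "items": [{"sku": "a", "qty": 2}, {"sku": "b", "qty": 1}]}]
for r in unnest(orders, "items"):
    print(r)
# {'order_id': 1, 'sku': 'a', 'qty': 2}
# {'order_id': 1, 'sku': 'b', 'qty': 1}
```

Note that, like a plain `CROSS JOIN UNNEST`, this drops parent rows whose array is empty; SQL's `LEFT JOIN UNNEST` variant would keep them with NULL sub-fields.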

Closing Remark

As you explore opportunities in the BigQuery job market in India, remember to continuously upskill and stay updated with the latest trends in data analytics and cloud computing. Prepare thoroughly for interviews by practicing common BigQuery concepts and showcase your hands-on experience with the platform. With dedication and perseverance, you can excel in this dynamic field and secure rewarding career opportunities. Good luck!
