
1359 BigQuery Jobs - Page 20

JobPe aggregates listings so they are easy to browse in one place, but you apply directly on the original job portal.

3.0 - 8.0 years

5 - 14 Lacs

Hyderabad

Work from Office

Position: Data Analyst | Interview: Walk-in | Type: Full-time | Location: Hyderabad | Exp: 3–8 yrs | Work: 5 days WFO. Focus areas: data analysis & insights, reporting & visualization, data extraction & ETL, collaboration & management. Contact: 6309124068 (Manoj).
Required candidate profile: Looking for Data Analysts with 3–8 yrs of experience in SQL, BI tools (Tableau/Power BI), and Python/AppScript, with experience in ETL, dashboarding, and A/B testing. Contact: 6309124068 (Manoj).

Posted 3 weeks ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Bengaluru

Remote

Now Hiring: MicroStrategy Developer
Author: Adastra APAM Talent Acquisition Team | Contact: HRIN@adastragrp.com | Website: Adastra | Data Analytics and IT Consultancy
Office: 101A, 1st Floor, Delta One, Giga Space IT Park, Viman Nagar, Pune, Maharashtra, 411014, India
Company Name: ADASTRAINDIA DATA SERVICES PVT | Company ID: U62013PN2024FTC232527 | First Published: 09-Jul-25

We are looking for a MicroStrategy Developer on a freelance or part-time basis, with strong hands-on experience in MicroStrategy development, particularly involving newer versions and cloud-based platforms such as Google BigQuery. This role focuses on the technical aspects of MicroStrategy BI delivery, from schema design and dashboard development to platform integration, and is well suited to professionals who prefer flexible, project-based work rather than full-time, business-facing roles.

1. Job Description
As a MicroStrategy Developer, you will be responsible for:
- Designing and developing MicroStrategy reports, dashboards, and Dossiers.
- Building and maintaining schema objects and metadata layers.
- Migrating and optimizing MicroStrategy environments across traditional and cloud platforms (e.g., BigQuery).
- Collaborating with business units to refine and iterate dashboard solutions.
- Handling technical communication and documentation related to BI implementations.
- Supporting data integration and performance tuning efforts within MicroStrategy.

2. Profile Requirements
For this position, we are looking for someone with:
- Strong expertise in MicroStrategy development, including schema and dashboard design.
- Experience with MicroStrategy integrations on cloud platforms, particularly Google BigQuery.
- Proficiency in SQL and data modeling.
- Familiarity with MicroStrategy Dossiers, cubes, metrics, and visualizations.
- Comfort managing technical tasks independently with minimal oversight.
This role suits developers with strong technical skills who want to contribute without direct business-facing responsibilities, and freelancers or part-time professionals seeking flexible, project-based work.

3. Adastra APAM Culture Manifesto
Servant Leadership: Managers are servants to employees. Managers are elected to make sure that employees have all the processes, resources, and information they need to provide services to clients efficiently. Any manager, up to the CEO, is visible and reachable for a chat regardless of title. Decisions are taken by consent in an agile manner and executed without undue delay. We accept that wrong decisions happen, and we appreciate the learning before we adjust the process for continuous improvement. Employees serve clients: they listen attentively to client needs and collaborate internally as a team to meet them. Managers and employees work together to get things done and are accountable to each other. Corporate KPIs are reviewed transparently at monthly company events with all employees.
Performance-Driven Compensation: We recognize and accept that some of us are more ambitious, more gifted, or more hard-working. We also recognize that some of us look for a stable income and less hassle at a different stage of their careers. There is a place for everyone; we embrace and need this diversity. Grades in our company are not based on the number of years of experience; they are value-driven, based on everyone's ability to deliver their work to clients independently and/or lead others. There is no annual indexation of salaries: you may be upgraded several times within the year, or not at all, based on your own pace of progress, ambitions, relevant skillset, and recognition by clients.
Work-Life Integration: We challenge the notion of work-life balance and embrace the notion of work-life integration instead. This philosophy views our lives as a single whole in which we serve ourselves, our families, and our clients in an integrated manner. We encourage 100% flexible working hours where you arrange your own day. This means you are free when you have little work, but it also means extra effort if you are behind schedule. Working for clients that may be in different time zones means we give you the flexibility to design your day in accordance with personal and project preferences and needs. We value time and minimize time spent in Adastra meetings. We are also a remote-first company. While we have collaboration offices and social events, we encourage people to work 100% remote from home whenever possible. This means saving time and money on commuting, staying home with elderly relatives and little ones, and not missing the special moments in life. It also means you can work from any of our other offices in Europe, North America, or Australia, or move to a place with a lower cost of living without impacting your income. We trust you by default until you fail our trust.
Global Diversity: Adastra is an international organization. We hire globally, and our biggest partners and clients are in Europe, North America, and Australia. We work in teams with individuals of different cultures, ethnicities, sexual preferences, political views, and religions. We have zero tolerance for anyone who does not respect others or is abusive in any way. We speak different languages to one another, but we speak English when we are together or with clients. Our company is a safe space where communication is encouraged but boundaries regarding sensitive topics are respected. We accept and converge together to serve our teams and clients and ultimately have a good time at work.
Lifelong Learning: On annual average we invest 25% of our working hours in personal development and upskilling outside project work, regardless of seniority or role. We feature hundreds of courses on our Training Repo, and we continue to actively purchase or tailor hands-on content. We certify people at our expense. We like to say we are technology agnostic: we learn the principles of data management and apply them to different use cases and technology stacks. We believe that the juniors of today are the seniors of tomorrow; we treat everyone with respect and mentor them into the roles they deserve. We encourage seniors to give back to the IT community through leadership and mentorship. On your last day with us we may give you an open-dated job offer, so that you feel welcome to return home as others did before you.
More about Adastra: visit http://adastragrp.com or contact us at HRIN@adastragrp.com.

Posted 3 weeks ago

Apply

7.0 - 12.0 years

25 - 27 Lacs

Hyderabad

Work from Office

3+ years of experience as an engineer working in a GCP environment with its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner, streaming, etc.). 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.). 1-2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.).
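As a rough illustration of the SQL skills this posting lists (CTEs, window functions, aggregates) driven from Python, here is a minimal sketch using the google-cloud-bigquery client; the project, dataset, and table names are hypothetical placeholders, not part of the posting.

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

# Hypothetical project/dataset/table names, used purely for illustration.
client = bigquery.Client(project="my-analytics-project")

query = """
WITH daily_orders AS (            -- CTE: aggregate raw events per customer per day
  SELECT customer_id,
         DATE(order_ts) AS order_date,
         SUM(amount)    AS daily_total
  FROM `my-analytics-project.sales.orders`
  GROUP BY customer_id, order_date
)
SELECT customer_id,
       order_date,
       daily_total,
       -- Window function: running total per customer ordered by date
       SUM(daily_total) OVER (
         PARTITION BY customer_id ORDER BY order_date
       ) AS running_total
FROM daily_orders
ORDER BY customer_id, order_date
"""

for row in client.query(query).result():
    print(row.customer_id, row.order_date, row.running_total)
```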

Posted 3 weeks ago

Apply

2.0 - 5.0 years

5 - 7 Lacs

Noida

Work from Office

Key Responsibilities: Develop and maintain scalable full-stack applications using Java, Spring Boot, and Angular for building rich UI screens and custom/reusable components. Design and implement cloud-based solutions leveraging Google Cloud Platform (GCP) services such as BigQuery, Google Cloud Storage, Cloud Run, and PubSub. Manage and optimize CI/CD pipelines using Tekton to ensure smooth and efficient development workflows. Deploy and manage Google Cloud services using Terraform, ensuring infrastructure as code principles. Mentor and guide junior software engineers, fostering professional development and promoting systemic change across the development team. Collaborate with cross-functional teams to design, build, and maintain efficient, reusable, and reliable code. Drive best practices and improvements in software engineering processes, including coding standards, testing, and deployment strategies. Required Skills: Java/Spring Boot (5+ years): In-depth experience in developing backend services and APIs using Java and Spring Boot. Angular (3+ years): Proven ability to build rich, dynamic user interfaces and custom/reusable components using Angular. Google Cloud Platform (2+ years): Hands-on experience with GCP services like BigQuery, Google Cloud Storage, Cloud Run, and PubSub. CI/CD Pipelines (2+ years): Experience with tools like Tekton for automating build and deployment processes. Terraform (1-2 years): Experience in deploying and managing GCP services using Terraform. J2EE (5+ years): Strong experience in Java Enterprise Edition for building large-scale applications. Experience mentoring and delivering organizational change within a software development team.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

15 - 30 Lacs

Pune, Bengaluru, Delhi / NCR

Hybrid

Mandatory Skills: Apache Beam, BigQuery, Dataflow, Dataproc, Composer, Airflow, PySpark, Python, SQL.
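For context on the kind of pipeline these skills describe, below is a minimal Apache Beam sketch in Python that reads a CSV from Cloud Storage, filters it, and writes to BigQuery; the bucket, project, and table names are hypothetical, and the same code could run on Dataflow by passing the appropriate runner options.

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Runner defaults to DirectRunner locally; pass e.g.
# --runner=DataflowRunner --project=... --region=... to run on Dataflow.
options = PipelineOptions()

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadCSV" >> beam.io.ReadFromText("gs://my-bucket/raw/events.csv", skip_header_lines=1)
        | "SplitFields" >> beam.Map(lambda line: line.split(","))
        | "ToDict" >> beam.Map(lambda f: {"user_id": f[0], "event": f[1], "amount": float(f[2])})
        | "FilterPurchases" >> beam.Filter(lambda row: row["event"] == "purchase")
        | "WriteToBQ" >> beam.io.WriteToBigQuery(
            "my-project:analytics.purchases",            # hypothetical table
            schema="user_id:STRING,event:STRING,amount:FLOAT",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        )
    )
```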

Posted 3 weeks ago

Apply

4.0 - 9.0 years

0 - 1 Lacs

Hyderabad

Hybrid

Job Title: Data Engineer (L3) - Python & GCP
Experience Level: 4 to 6 years of relevant IT experience
Job Overview: We are looking for a skilled and motivated Data Engineer with strong experience in Python programming and Google Cloud Platform (GCP) to join our data engineering team. The ideal candidate will be responsible for designing, developing, and maintaining robust and scalable ETL (Extract, Transform, Load) data pipelines. The role involves working with various GCP services, implementing data ingestion and transformation logic, and ensuring data quality and consistency across systems.
Key Responsibilities: Design, develop, test, and maintain scalable ETL data pipelines using Python. Work extensively with Google Cloud Platform (GCP) services such as: Dataflow for real-time and batch data processing; Cloud Functions for lightweight serverless compute; BigQuery for data warehousing and analytics; Cloud Composer for orchestration of data workflows (based on Apache Airflow); Google Cloud Storage (GCS) for managing data at scale; IAM for access control and security; Cloud Run for containerized applications. Perform data ingestion from various sources and apply transformation and cleansing logic to ensure high-quality data delivery. Implement and enforce data quality checks, validation rules, and monitoring. Collaborate with data scientists, analysts, and other engineering teams to understand data needs and deliver efficient data solutions. Manage version control using GitHub and participate in CI/CD pipeline deployments for data projects. Write complex SQL queries for data extraction and validation from relational databases such as SQL Server, Oracle, or PostgreSQL. Document pipeline designs, data flow diagrams, and operational support procedures.
Required Skills: 4+ years of hands-on experience in Python for backend or data engineering projects. Strong understanding and working experience with GCP cloud services (especially Dataflow, BigQuery, Cloud Functions, Cloud Composer, etc.). Solid understanding of data pipeline architecture, data integration, and transformation techniques. Experience with version control systems like GitHub and knowledge of CI/CD practices. Strong experience in SQL with at least one enterprise database (SQL Server, Oracle, PostgreSQL, etc.).
Good to Have (Optional Skills): Experience working with the Snowflake cloud data platform. Hands-on knowledge of Databricks for big data processing and analytics. Familiarity with Azure Data Factory (ADF) and other Azure data engineering tools.
Additional Details: Excellent problem-solving and analytical skills. Strong communication skills and ability to collaborate in a team environment.
Note: Only shortlisted candidates will receive interview invites after profile screening.
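To make the Cloud Composer orchestration concrete, here is a minimal Airflow DAG sketch (assuming Airflow 2.x with the Google provider installed) that loads a file from GCS into a staging table and then builds a curated table in BigQuery; all project, bucket, and table names are hypothetical.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_orders_etl",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:

    # Load the day's raw CSV files from a hypothetical landing bucket into staging.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_orders",
        bucket="my-landing-bucket",
        source_objects=["orders/{{ ds }}/*.csv"],
        destination_project_dataset_table="my-project.staging.orders",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_TRUNCATE",
    )

    # Transform staging data into a curated aggregate table.
    transform = BigQueryInsertJobOperator(
        task_id="build_curated_orders",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE `my-project.curated.orders_daily` AS
                    SELECT customer_id, DATE(order_ts) AS order_date, SUM(amount) AS total
                    FROM `my-project.staging.orders`
                    GROUP BY customer_id, order_date
                """,
                "useLegacySql": False,
            }
        },
    )

    load_raw >> transform
```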

Posted 3 weeks ago

Apply

5.0 - 9.0 years

10 - 20 Lacs

Pune, Chennai, Bengaluru

Work from Office

Key Responsibilities: Design, build, and maintain robust, scalable, and efficient ETL/ELT pipelines. Implement data ingestion processes using FiveTran and integrate various structured and unstructured data sources into GCP-based environments. Develop data models and transformation workflows using DBT and manage version-controlled pipelines. Build and manage data storage solutions using Snowflake, optimizing for cost, performance, and scalability. Orchestrate workflows and pipeline dependencies using Apache Airflow. Design and support Data Lake architecture for raw and curated data zones. Collaborate with data analysts, scientists, and product teams to ensure availability and quality of data. Monitor data pipeline performance, ensure data integrity, and handle error recovery mechanisms. Follow best practices in CI/CD, testing, data governance, and security standards.
Required Skills: 5-7 years of professional experience in data engineering roles. Hands-on experience with GCP services: BigQuery, Cloud Storage, Pub/Sub, Dataflow, Composer, etc. Proficient in writing modular SQL transformations and data modeling using DBT. Deep understanding of Snowflake warehousing: performance tuning, cost optimization, security. Experience with Airflow for pipeline orchestration and DAG management. Familiarity with designing and implementing Data Lake solutions. Proficient in Python and/or SQL.
Send profiles to payal.kumari@nam-it.com
Regards,
Payal Kumari, Senior Executive - Staffing
NAM Info Pvt Ltd, 29/2B-01, 1st Floor, K.R. Road, Banashankari 2nd Stage, Bangalore - 560070.
Email: payal.kumari@nam-it.com | Website: www.nam-it.com | USA | CANADA | INDIA

Posted 3 weeks ago

Apply

6.0 - 11.0 years

12 - 22 Lacs

Chennai

Hybrid

Greetings from Getronics! We have permanent opportunities for GCP Data Engineers for the Chennai location. Hope you are doing well! This is Jogeshwari from the Getronics Talent Acquisition team. We have multiple opportunities for GCP Data Engineers; please find the company profile and job description below. If interested, please share your updated resume, a recent professional photograph, and Aadhaar proof at the earliest to jogeshwari.k@getronics.com.
Company: Getronics (permanent role)
Client: Automobile industry
Experience Required: 6+ years in IT and a minimum of 4+ years in GCP data engineering
Location: Chennai
Skills Required: GCP Data Engineer, Hadoop, Spark/PySpark, and Google Cloud Platform services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine. 6+ years of professional experience in data engineering, data product development, and software product launches. 4+ years of cloud data engineering experience building scalable, reliable, and cost-effective production batch and streaming data pipelines using: data warehouses like Google BigQuery; workflow orchestration tools like Airflow; relational database management systems like MySQL, PostgreSQL, and SQL Server; real-time data streaming platforms like Apache Kafka and GCP Pub/Sub.
LOOKING FOR IMMEDIATE TO 30 DAYS NOTICE CANDIDATES ONLY.
Regards, Jogeshwari, Senior Specialist

Posted 3 weeks ago

Apply

1.0 - 2.0 years

3 - 6 Lacs

Dhule

Work from Office

Google Cloud Platform: GCS, Dataproc, BigQuery, Dataflow. Programming languages: Java, plus scripting languages such as Python, Shell Script, and SQL. 5+ years of experience in IT application delivery with proven experience in agile development methodologies. 1 to 2 years of experience in Google Cloud Platform (GCS, Dataproc, BigQuery, Composer, and data processing with Dataflow).

Posted 3 weeks ago

Apply

4.0 - 6.0 years

6 - 16 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

What you'll be doing: We are looking for data engineers who can work with world-class team members to help drive the telecom business to its full potential. We are building data products and assets for the telecom wireless and wireline business, which includes consumer analytics, telecom network performance, and service assurance analytics. We are working on cutting-edge technologies like digital twin to build these analytical platforms and provide data support for varied AI/ML implementations. As a data engineer you will collaborate with business product owners, coaches, industry-renowned data scientists, and system architects to develop strategic data solutions from sources that include batch, file, and data streams. As a Data Engineer with ETL/ELT expertise for our growing data platform and analytics teams, you will understand and enable the required data sets from different sources, both structured and unstructured, into our data warehouse and data lake, with real-time streaming and/or batch processing, to generate insights and perform analytics for business teams within the company. Responsibilities include: understanding the business requirements and converting them into technical design; working on data ingestion, preparation, and transformation; developing data streaming applications; debugging production failures and identifying solutions; working on ETL/ELT development; and understanding the DevOps process and contributing to DevOps pipelines.
What we're looking for: You're curious about new technologies and the game-changing possibilities they create. You like to stay up to date with the latest trends and apply your technical expertise to solving business problems.
You'll need to have: Bachelor's degree or four or more years of work experience. Experience with data warehouse concepts and the data management life cycle. Experience with the GCP cloud platform (BigQuery, Cloud Composer, Dataproc (or Hadoop + Spark), Cloud Functions). Experience in any programming language, preferably Python. Proficiency in graph data modeling, including experience with graph data models and a graph query language. Exposure to GenAI use cases. Experience troubleshooting data issues. Experience writing complex SQL and performance tuning. Experience in DevOps. Experience in GraphDB and Core Java. Experience in real-time streaming and lambda architecture.
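As an illustrative sketch of the real-time streaming side mentioned above, the snippet below consumes messages from a GCP Pub/Sub subscription using the google-cloud-pubsub client; the project and subscription names are made up for the example.

```python
from concurrent.futures import TimeoutError

from google.cloud import pubsub_v1  # pip install google-cloud-pubsub

# Hypothetical project and subscription names.
subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path("my-telecom-project", "network-events-sub")

def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    # In a real pipeline this is where events would be validated, enriched,
    # and written to a warehouse, lake, or feature store.
    print(f"Received: {message.data.decode('utf-8')}")
    message.ack()

streaming_pull = subscriber.subscribe(subscription_path, callback=callback)

with subscriber:
    try:
        streaming_pull.result(timeout=60)  # listen for one minute in this sketch
    except TimeoutError:
        streaming_pull.cancel()
        streaming_pull.result()
```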

Posted 3 weeks ago

Apply

8.0 - 10.0 years

10 - 12 Lacs

Bengaluru

Work from Office

Location: Bangalore/Hyderabad/Pune | Experience level: 8+ years
About the Role: We are looking for a technical and hands-on Lead Data Engineer to help drive the modernization of our data transformation workflows. We currently rely on legacy SQL scripts orchestrated via Airflow, and we are transitioning to a modular, scalable, CI/CD-driven DBT-based data platform. The ideal candidate has deep experience with DBT and modern data stack design, and has previously led similar migrations, improving code quality, lineage visibility, performance, and engineering best practices.
Key Responsibilities: Lead the migration of legacy SQL-based ETL logic to DBT-based transformations. Design and implement a scalable, modular DBT architecture (models, macros, packages). Audit and refactor legacy SQL for clarity, efficiency, and modularity. Improve CI/CD pipelines for DBT: automated testing, deployment, and code quality enforcement. Collaborate with data analysts, platform engineers, and business stakeholders to understand current gaps and define future data pipelines. Own the Airflow orchestration redesign where needed (e.g., DBT Cloud/API hooks or airflow-dbt integration). Define and enforce coding standards, review processes, and documentation practices. Coach junior data engineers on DBT and SQL best practices. Provide lineage and impact analysis improvements using DBT's built-in tools and metadata.
Must-Have Qualifications: 8+ years of experience in data engineering. Proven success in migrating legacy SQL to DBT, with visible results. Deep understanding of DBT best practices, including model layering, Jinja templating, testing, and packages. Proficiency in SQL performance tuning, modular SQL design, and query optimization. Experience with Airflow (Composer, MWAA), including DAG refactoring and task orchestration. Hands-on experience with modern data stacks (e.g., Snowflake, BigQuery). Familiarity with data testing and CI/CD for analytics workflows. Strong communication and leadership skills; comfortable working cross-functionally.
Nice-to-Have: Experience with DBT Cloud or DBT Core integrations with Airflow. Familiarity with data governance and lineage tools (e.g., dbt docs, Alation). Exposure to Python (for custom Airflow operators/macros or utilities). Previous experience mentoring teams through modern data stack transitions.
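One common shape for the airflow-dbt integration mentioned above is simply invoking the dbt CLI from an Airflow task; the sketch below assumes Airflow 2.x with dbt installed on the workers, and the project path and target name are hypothetical (DBT Cloud or dedicated dbt operators are alternatives).

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical path to a dbt project bundled with the Composer environment.
DBT_PROJECT_DIR = "/home/airflow/gcs/dags/dbt/analytics"

with DAG(
    dag_id="dbt_transformations",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:

    # Build the models, then run the declared data tests against them.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command=f"dbt run --project-dir {DBT_PROJECT_DIR} --target prod",
    )

    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command=f"dbt test --project-dir {DBT_PROJECT_DIR} --target prod",
    )

    dbt_run >> dbt_test
```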

Posted 3 weeks ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Bengaluru

Work from Office

Key Responsibilities: Design, develop, and maintain backend services and APIs using Python (Flask/Django/FastAPI). Develop scalable and secure microservices for data processing, analytics, and APIs. Manage and optimize data storage with SQL (PostgreSQL/MySQL) and NoSQL databases (MongoDB/Firestore/Bigtable). Design and implement CI/CD pipelines and automate cloud deployments on GCP (App Engine, Cloud Run, Cloud Functions, GKE). Collaborate with front-end developers, product owners, and other stakeholders to integrate backend services with business logic and UI. Optimize application performance and troubleshoot issues across backend systems. Implement best practices in code quality, testing (unit/integration), security, and scalability.
Qualifications: Bachelor's or master's degree in Computer Science, Data Science, or a related field. Must have 3+ years of relevant IT experience. Strong hands-on programming experience in Python. Experience with one or more Python frameworks: Flask, Django, or FastAPI. Deep understanding of RESTful API design and development. Proficient in working with relational databases (PostgreSQL, MySQL) and NoSQL databases (MongoDB, Firestore, BigQuery). Solid understanding of and experience with GCP services. Familiarity with Git and CI/CD tools (e.g., Cloud Build, Jenkins, GitHub Actions). Strong debugging, problem-solving, and performance tuning skills.
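As a small, hedged example of the kind of backend endpoint described here, the sketch below exposes a FastAPI route backed by a parameterized BigQuery query; the project and table names are hypothetical, and Flask or Django would follow the same pattern.

```python
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel
from google.cloud import bigquery

app = FastAPI(title="Customer API")   # run with: uvicorn main:app
bq = bigquery.Client()                # uses ambient GCP credentials

class Customer(BaseModel):
    customer_id: str
    lifetime_value: float

@app.get("/customers/{customer_id}", response_model=Customer)
def get_customer(customer_id: str) -> Customer:
    # Hypothetical table; the parameterized query avoids SQL injection.
    job = bq.query(
        "SELECT customer_id, lifetime_value "
        "FROM `my-project.analytics.customers` WHERE customer_id = @cid",
        job_config=bigquery.QueryJobConfig(
            query_parameters=[bigquery.ScalarQueryParameter("cid", "STRING", customer_id)]
        ),
    )
    rows = list(job.result())
    if not rows:
        raise HTTPException(status_code=404, detail="customer not found")
    return Customer(customer_id=rows[0]["customer_id"], lifetime_value=rows[0]["lifetime_value"])
```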

Posted 3 weeks ago

Apply

5.0 - 7.0 years

7 - 9 Lacs

Bengaluru

Work from Office

Position: Senior AI/ML Engineer GCP Location: Hyderabad/Bangalore/Pune Work Mode: Hybrid About the Company: Relanto is a global advisory, consulting, and technology services partner, empowering customers to accelerate innovation by harnessing the power of Data and AI, Automation, NextGen Planning, and Cloud Solutions. Overview: We're seeking an experienced individual (5-10 years) specializing in AI/ML to build and deploy cloud-based machine learning solutions. You'll work with Google Cloud Platform services to create scalable AI systems and APIs that integrate with various ML models. What You'll Do: Design and implement end-to-end ML pipelines in GCP Build and optimize AI models for production deployment using Vertex AI Develop RESTful APIs for ML model serving Implement vector search capabilities using BigQuery ML Create automated testing and deployment pipelines for ML models Set up model monitoring and performance tracking Optimize model inference and serving capabilities Must Have Skills: Strong Python programming with ML frameworks (PyTorch, TensorFlow) Experience with large language models and prompt engineering Proficiency in GCP AI services (Vertex AI, Cloud ML Engine) Vector search implementation (BigQuery ML, Matching Engine) RESTful API development with FastAPI/Flask Container orchestration with Docker and Google Kubernetes Engine (GKE) CI/CD pipeline experience for ML workflows Why Join Us: Competitive salary and benefits. Opportunities for professional growth and development. Collaborative and inclusive work culture. Flexible working arrangements for work-life balance. Exposure to innovative data analytics and visualization projects.
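To illustrate model serving on Vertex AI as described above, here is a minimal prediction call using the google-cloud-aiplatform client; the project, region, endpoint ID, and feature payload are hypothetical and depend entirely on the deployed model.

```python
from google.cloud import aiplatform  # pip install google-cloud-aiplatform

# Hypothetical project, region, and endpoint resource name.
aiplatform.init(project="my-ml-project", location="us-central1")

endpoint = aiplatform.Endpoint(
    "projects/my-ml-project/locations/us-central1/endpoints/1234567890"
)

# The instance schema depends on the deployed model; this payload is illustrative only.
prediction = endpoint.predict(instances=[{"feature_a": 0.42, "feature_b": "retail"}])
print(prediction.predictions)
```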

Posted 3 weeks ago

Apply

3.0 - 8.0 years

1 Lacs

Chennai

Hybrid

Overview: TekWissen is a global workforce management provider operating throughout India and many other countries in the world. The client below is a global company with shared ideals and a deep sense of family. From our earliest days as a pioneer of modern transportation, we have sought to make the world a better place, one that benefits lives, communities, and the planet.
Job Title: Specialty Development Practitioner | Location: Chennai | Work Type: Hybrid
Position Description: Potential candidates should have hands-on experience in applying machine learning, data mining, and text mining techniques to build analytics prototypes that work on massive datasets. Candidates should have experience in manipulating both structured and unstructured data in various formats, sizes, and storage mechanisms. Candidates should have excellent problem-solving skills with an inquisitive mind to challenge existing practices. Candidates should have exposure to multiple programming languages and analytical tools and be flexible about using the requisite tools/languages for the problem at hand.
Key Roles and Responsibilities of Position: Apply machine learning, data mining, and text mining techniques to create scalable solutions for business problems. Train, tune, validate, and monitor predictive models. Analyze and extract relevant information from large amounts of the client's historical business data, both in structured and unstructured formats. Establish scalable, efficient, automated processes for large-scale data analyses. Package and present the findings and communicate with large cross-functional teams.
Qualifications: MBA/Master's in a quantitative discipline like Mathematics/Statistics/Operations Research/Computer Science/Economics/Engineering, or B.Tech in any related engineering discipline. Excellent problem-solving skills. Strong communication and data presentation skills. Develop complex SQL queries and stored procedures for data transformation, analysis, and reporting within BigQuery and other potential data stores. Strong programming skills in Python. Familiarity with distributed computing languages and cloud technologies. Proficiency with version control systems, specifically Git. Experience with or knowledge of data visualization tools like Power BI. Strong communication and collaboration skills. Knowledge of other cloud providers (AWS, Azure).
Skills Required: BigQuery
Experience Required: Apply machine learning, data mining, and text mining techniques to create scalable solutions for business problems. Train, tune, validate, and monitor predictive models. Analyze and extract relevant information from large amounts of the client's historical business data, both in structured and unstructured formats. Establish scalable, efficient, automated processes for large-scale data analyses. Package and present the findings and communicate with large cross-functional teams.
Education Required: Bachelor's Degree | Education Preferred: Bachelor's Degree, Certification Program
TekWissen Group is an equal opportunity employer supporting workforce diversity.
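As a toy example of the text-mining work described, the sketch below builds a TF-IDF plus logistic-regression classifier with scikit-learn; the sample texts and labels are invented for illustration, and real inputs would come from the client's data stores (e.g., BigQuery).

```python
from sklearn.pipeline import Pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Invented sample documents standing in for unstructured business text.
texts = [
    "engine noise reported after service visit",
    "billing dispute on last month's invoice",
    "brake pads replaced under warranty",
    "charged twice for the same order",
]
labels = ["vehicle", "billing", "vehicle", "billing"]

model = Pipeline([
    ("tfidf", TfidfVectorizer(ngram_range=(1, 2))),  # text mining: bag-of-n-grams features
    ("clf", LogisticRegression(max_iter=1000)),      # simple, monitorable baseline classifier
])
model.fit(texts, labels)

# Likely classified as 'vehicle' given the overlapping terms in the toy corpus.
print(model.predict(["warranty claim for brake noise"]))
```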

Posted 3 weeks ago

Apply

1.0 - 2.0 years

3 - 5 Lacs

Ahmedabad

Work from Office

Google Cloud Platform: GCS, Dataproc, BigQuery, Dataflow. Programming languages: Java, plus scripting languages such as Python, Shell Script, and SQL. 5+ years of experience in IT application delivery with proven experience in agile development methodologies. 1 to 2 years of experience in Google Cloud Platform (GCS, Dataproc, BigQuery, Composer, and data processing with Dataflow).

Posted 3 weeks ago

Apply

5.0 - 10.0 years

15 - 30 Lacs

Noida, Pune, Bengaluru

Work from Office

Description: The Data & Analytics Team is seeking a Data Engineer with a hybrid skillset in data integration and application development. This role is crucial for designing, engineering, governing, and improving our entire Data Platform, which serves customers, partners, and employees through self-service access. You'll demonstrate expertise in data & metadata management, data integration, data warehousing, data quality, machine learning, and core engineering principles.
Requirements: • 5+ years of experience with system/data integration, development, or implementation of enterprise and/or cloud software. • Strong experience with Web APIs (RESTful and SOAP). • Strong experience setting up data warehousing solutions and associated pipelines, including ETL tools (preferably Informatica Cloud). • Demonstrated proficiency with Python. • Strong experience with data wrangling and query authoring in SQL and NoSQL environments for both structured and unstructured data. • Experience in a cloud-based computing environment, specifically GCP. • Expertise in documenting Business Requirement, Functional & Technical documentation. • Expertise in writing Unit & Functional Test Cases, Test Scripts & Run Books. • Expertise in incident management systems like Jira, ServiceNow, etc. • Working knowledge of Agile software development methodology. • Strong organizational and troubleshooting skills with attention to detail. • Strong analytical ability, judgment, and problem analysis techniques. • Excellent interpersonal skills with the ability to work effectively in a cross-functional team.
Job Responsibilities: • Lead system/data integration, development, or implementation efforts for enterprise and/or cloud software. • Design and implement data warehousing solutions and associated pipelines for internal and external data sources, including ETL processes. • Perform extensive data wrangling and author complex queries in both SQL and NoSQL environments for structured and unstructured data. • Develop and integrate applications, leveraging strong proficiency in Python and Web APIs (RESTful and SOAP). • Provide operational support for the data platform and applications, including incident management. • Create comprehensive Business Requirement, Functional, and Technical documentation. • Develop Unit & Functional Test Cases, Test Scripts, and Run Books to ensure solution quality. • Manage incidents effectively using systems like Jira, ServiceNow, etc. • Prepare change management packages and implementation plans for migrations across different environments. • Actively participate in Enterprise Risk Management processes. • Work within an Agile software development methodology, contributing to team success. • Collaborate effectively within cross-functional teams.
What We Offer: Exciting projects: we focus on industries like high-tech, communication, media, healthcare, retail, and telecom. Our customer list is full of fantastic global brands and leaders who love what we build for them. Collaborative environment: you can expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment, or even abroad in one of our global centers or client facilities! Work-life balance: GlobalLogic prioritizes work-life balance, which is why we offer flexible work schedules, opportunities to work from home, and paid time off and holidays.
Professional development: our dedicated Learning & Development team regularly organizes communication skills training (GL Vantage, Toastmasters), a stress management program, professional certifications, and technical and soft-skill trainings. Excellent benefits: we provide our employees with competitive salaries, family medical insurance, Group Term Life Insurance, Group Personal Accident Insurance, NPS (National Pension Scheme), periodic health awareness programs, extended maternity leave, annual performance bonuses, and referral bonuses. Fun perks: we want you to love where you work, which is why we host sports events and cultural activities, offer food at subsidised rates, and throw corporate parties. Our vibrant offices also include dedicated GL Zones, rooftop decks, and the GL Club, where you can drink coffee or tea with your colleagues over a game, and we offer discounts for popular stores and restaurants!

Posted 3 weeks ago

Apply

13.0 - 17.0 years

32 - 35 Lacs

Noida, Gurugram

Work from Office

Google Cloud Platform: GCS, Dataproc, BigQuery, Dataflow. Programming languages: Java, plus scripting languages such as Python, Shell Script, and SQL. 5+ years of experience in IT application delivery with proven experience in agile development methodologies. 1 to 2 years of experience in Google Cloud Platform (GCS, Dataproc, BigQuery, Composer, and data processing with Dataflow).

Posted 3 weeks ago

Apply

3.0 - 5.0 years

10 - 15 Lacs

Bengaluru

Work from Office

Google Cloud Platform: GCS, Dataproc, BigQuery, Dataflow. Programming languages: Java, plus scripting languages such as Python, Shell Script, and SQL. 5+ years of experience in IT application delivery with proven experience in agile development methodologies. 1 to 2 years of experience in Google Cloud Platform (GCS, Dataproc, BigQuery, Composer, and data processing with Dataflow).

Posted 3 weeks ago

Apply

5.0 - 10.0 years

10 - 15 Lacs

New Delhi, Chennai, Bengaluru

Work from Office

We are looking for an experienced Data Engineer with a strong background in data engineering, storage, and cloud technologies. The role involves designing, building, and optimizing scalable data pipelines, ETL/ELT workflows, and data models for efficient analytics and reporting. The ideal candidate must have strong SQL expertise, including complex joins, stored procedures, and certificate-auth-based queries. Experience with NoSQL databases such as Firestore, DynamoDB, or MongoDB is required, along with proficiency in data modeling and warehousing solutions like BigQuery (preferred), Redshift, or Snowflake. The candidate should have hands-on experience working with ETL/ELT pipelines using Airflow, dbt, Kafka, or Spark. Proficiency in scripting languages such as PySpark, Python, or Scala is essential. Strong hands-on experience with Google Cloud Platform (GCP) is a must. Additionally, experience with visualization tools such as Google Looker Studio, LookerML, Power BI, or Tableau is preferred. Good-to-have skills include exposure to Master Data Management (MDM) systems and an interest in Web3 data and blockchain analytics.
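For a flavor of the PySpark side of this role, here is a minimal ETL sketch that joins two datasets, aggregates, and writes partitioned Parquet; the GCS paths and column names are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F

# Hypothetical GCS paths; locally these could point at sample Parquet files.
spark = SparkSession.builder.appName("orders_etl").getOrCreate()

orders = spark.read.parquet("gs://my-lake/raw/orders/")
customers = spark.read.parquet("gs://my-lake/raw/customers/")

curated = (
    orders.join(customers, on="customer_id", how="left")   # join dimension onto facts
          .withColumn("order_date", F.to_date("order_ts"))
          .groupBy("customer_id", "segment", "order_date")
          .agg(
              F.sum("amount").alias("daily_spend"),
              F.countDistinct("order_id").alias("orders"),
          )
)

# Write a curated, date-partitioned layer for downstream analytics/reporting.
curated.write.mode("overwrite").partitionBy("order_date").parquet("gs://my-lake/curated/daily_spend/")
```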

Posted 3 weeks ago

Apply

0.0 - 1.0 years

2 - 3 Lacs

Bengaluru

Work from Office

Key Responsibilities: Develop and maintain scalable full-stack applications using Java, Spring Boot, and Angular for building rich UI screens and custom/reusable components. Design and implement cloud-based solutions leveraging Google Cloud Platform (GCP) services such as BigQuery, Google Cloud Storage, Cloud Run, and PubSub. Manage and optimize CI/CD pipelines using Tekton to ensure smooth and efficient development workflows. Deploy and manage Google Cloud services using Terraform, ensuring infrastructure as code principles. Mentor and guide junior software engineers, fostering professional development and promoting systemic change across the development team. Collaborate with cross-functional teams to design, build, and maintain efficient, reusable, and reliable code. Drive best practices and improvements in software engineering processes, including coding standards, testing, and deployment strategies. Required Skills: Java/Spring Boot (5+ years): In-depth experience in developing backend services and APIs using Java and Spring Boot. Angular (3+ years): Proven ability to build rich, dynamic user interfaces and custom/reusable components using Angular. Google Cloud Platform (2+ years): Hands-on experience with GCP services like BigQuery, Google Cloud Storage, Cloud Run, and PubSub. CI/CD Pipelines (2+ years): Experience with tools like Tekton for automating build and deployment processes. Terraform (1-2 years): Experience in deploying and managing GCP services using Terraform. J2EE (5+ years): Strong experience in Java Enterprise Edition for building large-scale applications. Experience mentoring and delivering organizational change within a software development team.

Posted 3 weeks ago

Apply

12.0 - 15.0 years

30 - 35 Lacs

Pune

Work from Office

Primary Requirement - Python Automation Architect (12+ Years Experience)
We are seeking a highly experienced Python professional with 12+ years of expertise in Python programming and automation projects. The ideal candidate will possess strong architectural skills, deep technical knowledge, and a proven track record of delivering scalable, efficient solutions.
Key Responsibilities & Expertise: Python expertise: deep understanding of Python internals, design patterns, and industry best practices for building high-performance applications. Architectural design & system planning: demonstrated ability to design and implement scalable, maintainable, and efficient Python-based systems and architectures. Advanced data handling with Pandas: extensive experience using the Pandas library for complex data manipulation, transformation, analysis, and performance optimization. Web/API development with Flask: proficient in building robust and secure RESTful APIs and web applications using Flask, with an emphasis on performance and scalability. Database & SQL proficiency: strong expertise in SQL and database design principles with a deep understanding of database architecture. Code quality, testing & CI/CD: skilled in writing clean, testable code and implementing CI/CD pipelines for automated testing and deployment. Frontend exposure: experience working with the Angular framework for building responsive web interfaces.
Nice to Have: Experience with Google BigQuery and cloud-based data warehousing. Familiarity with data integration tools such as Looker Studio. Hands-on experience with distributed systems and microservices architectures. Working knowledge of Google Cloud Platform (GCP) or Amazon Web Services (AWS). Exposure to data science and machine learning practices. Experience with Infrastructure as Code tools like Terraform. Understanding of event-driven architectures.
Skills: Python, Software Architecture, Automation Engineering, Pandas, Flask, SQL, Angular, CI/CD, Cloud Platforms (GCP/AWS), BigQuery, Terraform
Required Skills: Python, Architect, Automation Engineering, Pandas
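As a compact illustration of the Pandas-plus-Flask combination this role emphasizes, the sketch below serves a monthly revenue aggregation as JSON; the CSV source and column names are assumptions for the example (a BigQuery extract or pandas.read_sql query could feed the same endpoint).

```python
import pandas as pd
from flask import Flask, jsonify

app = Flask(__name__)

def load_sales() -> pd.DataFrame:
    # Hypothetical local CSV standing in for the real data source.
    df = pd.read_csv("sales.csv", parse_dates=["order_date"])
    df["month"] = df["order_date"].dt.to_period("M").astype(str)
    return df

@app.route("/api/monthly-revenue")
def monthly_revenue():
    df = load_sales()
    summary = (
        df.groupby("month", as_index=False)["amount"]
          .sum()
          .rename(columns={"amount": "revenue"})
    )
    return jsonify(summary.to_dict(orient="records"))

if __name__ == "__main__":
    app.run(debug=True)
```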

Posted 3 weeks ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Chennai, Coimbatore, Bengaluru

Hybrid

Experience: 5 to 12 years. Key skills: GCP, BigQuery, SQL, Python.
Job Requirements: Overall, more than 5 years of experience in data projects. Good knowledge of GCP, BigQuery, SQL, Python, and Dataflow. Has worked on implementation projects building data pipelines, transformation logic, and data models.
Responsibility: GCP Data Engineer, within Data Management Engineering. Education: Bachelor of Engineering in any discipline or equivalent.
Desired Candidate Profile: Technology and engineering expertise with 4+ years of experience implementing data solutions using GCP, BigQuery, and SQL programming. Proficient in dealing with the data access layer (RDBMS, NoSQL). Experience implementing and deploying big data applications with GCP Big Data Services. Good to have SQL skills. Able to deal with a diverse set of stakeholders. Proficient in articulation, communication, and presentation. High integrity, problem-solving skills, a learning attitude, and a team player.
Key Responsibilities: Implement data solutions using GCP; familiarity with programming in SQL and Python is required. Ensure clarity on NFRs and implement these requirements. Work with the Client Technical Manager to understand the customer's landscape and IT priorities. Lead performance engineering and capacity planning exercises for databases. 4+ years of experience implementing data pipelines for data analytics solutions, including solutions using Google Cloud Dataflow, Apache Beam, and Java programming. Proficient in dealing with the data access layer (RDBMS, NoSQL). Experience implementing and deploying big data applications with GCP Big Data Services. Good to have SQL skills. Experience with different development methodologies (RUP, Scrum, XP). Soft skills: able to deal with a diverse set of stakeholders; proficient in articulation, communication, and presentation; high integrity; problem-solving skills; a learning attitude; a team player.
Please fill in the link: https://forms.office.com/r/hGeeE6usBK
Regards, Shivalila

Posted 3 weeks ago

Apply

2.0 - 7.0 years

1 - 6 Lacs

Hyderabad, Qatar

Work from Office

Job Summary: Exciting job opportunity as a Registered Nurse in Qatar (homecare).
Key Responsibilities: Develop and assess nursing care plans. Monitor vital signs and assess holistic patient needs. Collaborate with physicians, staff nurses, and healthcare team members. Administer oral and subcutaneous medications while ensuring safety. Document nursing care, medications, and procedures using the company's Nurses Buddy application. Conduct client assessment and reassessment using approved tools. Attend refresher training courses, seminars, and training.
Timeline for Migration: Application to selection: not more than 5 days. Data flow & Prometric: 1 month. Visa processing: 1-2 months. Start working in Qatar within 3 months!
Requirements: Educational qualification: Bachelor's Degree in Nursing or GNM. Experience: minimum 2 years of working experience as a Nurse post registration. Certification: registration certification from the Nursing Council. Language: basic English proficiency required. Technical skills: bedside nursing, patient care, patient assessment and monitoring.
Benefits: High salary & perks: earn 5000 QAR/month (1,18,000 INR/month). Tax benefit: no tax deduction on salary. Career growth: advanced nursing career in Qatar with competitive salaries, cutting-edge facilities, and opportunities for specialization. Relocation support: visa process and flight sponsored; free accommodation and transportation provided. International work experience: boost your resume with international healthcare expertise. Comprehensive health insurance: medical coverage under Qatar's healthcare system. Safe and stable environment: Qatar is known for its low crime rate, political stability, and high quality of life; the country's strict laws make it one of the safest places to live. Faster visa processing: with efficient government procedures, work visas for nurses are processed quickly, reducing waiting times. Simplified licensing process: compared to other countries, Qatar offers a streamlined process for obtaining a nursing license through QCHP (Qatar Council for Healthcare Practitioners). Direct hiring opportunities: many hospitals and healthcare facilities offer direct recruitment, minimizing third-party delays and complications.
Limited slots available! Apply now to secure your place in the next batch of nurses migrating to Qatar!

Posted 3 weeks ago

Apply

3.0 - 5.0 years

6 - 9 Lacs

Chandigarh

Work from Office

We're looking for a hands-on ETL & BI Engineer to design, build, and maintain robust data pipelines on Google Cloud Platform and turn that trusted data into compelling, actionable reports in Power BI. You'll partner with data architects, analysts, and BI developers to ensure timely delivery of clean, well-modeled data into BigQuery, and translate it into high-impact dashboards and metrics.
Key Responsibilities:
1. Data Ingestion & Landing: Architect and manage landing zones in Cloud Storage for raw feeds. Handle batch and streaming input in Parquet, Avro, CSV, JSON, ORC.
2. ETL Pipeline Development: Develop and orchestrate ETL workflows with Cloud Data Fusion (including Wrangler). Perform data cleansing, imputation, type conversions, joins/unions, pivots.
3. Data Modeling & Semantic Layer: Design star- and snowflake-schema fact and dimension tables in BigQuery. Define and document the semantic layer to support Power BI datasets.
4. Load & Orchestration: Load curated datasets into BigQuery zones (raw, staging, curated). Implement orchestration via scheduled queries, Cloud Composer/Airflow, or Terraform-driven pipelines.
5. Performance, Quality & Monitoring: Tune SQL queries and ETL jobs for throughput, cost-efficiency, and reliability. Implement automated data-quality checks, logging, and alerting.
Required Skills & Experience: Bachelor's degree in Computer Science, Engineering, or a related field. 3+ years building ETL pipelines on GCP (Cloud Data Fusion, Cloud Storage, BigQuery). Solid SQL expertise, including query optimization in BigQuery. Strong grasp of dimensional modeling (star/snowflake schemas). Experience managing Cloud Storage buckets and handling diverse file formats. Familiarity with orchestration tools (Cloud Composer, Airflow, or BigQuery scheduled queries). Excellent problem-solving skills, attention to detail, and a collaborative mindset.
Preferred (Nice to Have): Experience with other GCP data services (Dataflow, Dataproc). Power BI skills: data modeling, report development, DAX calculations, and performance tuning. Python scripting for custom transformations or orchestration. Understanding of CI/CD best practices (Git, Terraform, deployment pipelines). Knowledge of data governance frameworks and metadata management.
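To make the raw-to-staging load step concrete, here is a minimal sketch using the BigQuery Python client to load Parquet files from a Cloud Storage landing zone; the bucket, dataset, and table names are hypothetical (Cloud Data Fusion or a scheduled query could perform the same step).

```python
from google.cloud import bigquery

# Hypothetical project, bucket, dataset, and table names.
client = bigquery.Client(project="my-bi-project")

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,  # full refresh of staging
)

load_job = client.load_table_from_uri(
    "gs://my-landing-zone/orders/2024-06-01/*.parquet",
    "my-bi-project.staging.orders",
    job_config=job_config,
)
load_job.result()  # block until the load completes

print(client.get_table("my-bi-project.staging.orders").num_rows, "rows loaded")
```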

Posted 3 weeks ago

Apply

2.0 - 4.0 years

22 - 27 Lacs

Bengaluru

Work from Office

Overview Annalect is currently seeking a back-end developer to join our Technology team. In this role, you will help grow our microservices and API layer which sit atop our Big Data infrastructure. We are passionate about building distributed back-end systems in a modular and reusable way. We're looking for people who have a shared passion for data and desire to build cool, maintainable and high-quality applications to use this data. In this role you will participate in shaping our technical architecture, design and development of software products, collaborate with back-end developers from other tracks, research and evaluation of new technical solutions, and helping to elevate the skills of more junior developers. Responsibilities Designing, building, testing and deploying scalable, reusable and maintainable applications that handle large amounts of data Growing our API layer: author, update, and debug API microservices; contribute to API design and architecture Perform code reviews and provide leadership and guidance to junior developers Ability to learn and teach new technologies Qualifications 5+ years of solid coding experience working in Python Demonstrated proficiency with RESTful APIs (data caching, JWT auth, API load testing, RAML), and production use of a python API framework Fluency with Linux/Unix Systems and in bash Excellent grasp of microservices, containers (Docker), container orchestration (Kubernetes), serverless computing (AWS Lambda) and distributed/scalable systems Proven history of mentoring junior developers to improve overall team effectiveness Passion for writing good documentation and creating architecture diagrams Experience processing and analyzing large data sets. Extensive history working with data and databases (SQL, MySQL, PostgreSQL, Amazon Aurora, Redis, Amazon Redshift, Google BigQuery) Rigorous approach to testing (unit testing, functional testing, integration testing) Understanding of critical API security best practices Ability to profile, identify, debug, and fix performance bottlenecks in application and database layers with modern tooling Strong proficiency in conducting PR reviews and helping to maintain a high-quality code base Knowledge of git, with understanding of branching, how to manage conflicts, and pull requests
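As a small sketch of the JWT-based API authentication mentioned in the qualifications, the snippet below issues and verifies short-lived tokens with PyJWT; the secret handling is simplified for illustration and would come from a secrets manager in practice.

```python
import datetime

import jwt  # pip install PyJWT

SECRET = "change-me"  # illustration only; load from a secrets manager in production

def issue_token(user_id: str) -> str:
    """Create a short-lived access token for a service client."""
    payload = {
        "sub": user_id,
        "exp": datetime.datetime.utcnow() + datetime.timedelta(minutes=15),
    }
    return jwt.encode(payload, SECRET, algorithm="HS256")

def verify_token(token: str) -> str:
    """Return the subject if the token is valid; raises if the signature or expiry fails."""
    claims = jwt.decode(token, SECRET, algorithms=["HS256"])
    return claims["sub"]

if __name__ == "__main__":
    token = issue_token("service-42")
    print(verify_token(token))
```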

Posted 3 weeks ago

Apply