Get alerts for new jobs matching your selected skills, preferred locations, and experience range.
5 - 10 years
15 - 30 Lacs
Pune, Bengaluru, Hyderabad
Work from Office
Preferred candidate profile
Experience: 5 to 10 years
Notice period: 30 days max
Location: Hyderabad, Pune, Bengaluru, Delhi
Key skills:
- Proficiency in programming languages: Python, Java
- Expertise in data processing frameworks: Apache Beam (Dataflow); see the sketch below
- Hands-on experience with GCP tools and technologies such as BigQuery, Dataflow, Cloud Composer, Cloud Spanner, GCS, and DBT
- Data engineering skill set using Python and SQL
- Experience in ETL (Extract, Transform, Load) processes
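To make the Beam/Dataflow requirement concrete, here is a minimal sketch of the kind of batch ETL pipeline such roles involve: read CSV lines from Cloud Storage, transform them, and load them into BigQuery. The project, bucket, and table names (my-gcp-project, gs://my-bucket, sales.orders) are hypothetical placeholders, not from the posting.

```python
# Minimal Apache Beam ETL sketch: GCS -> transform -> BigQuery.
# All project/bucket/table names below are hypothetical.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_row(line):
    # Split a CSV line into a dict matching the BigQuery schema below.
    order_id, amount = line.split(",")
    return {"order_id": order_id, "amount": float(amount)}

def run():
    options = PipelineOptions(
        runner="DataflowRunner",             # swap for "DirectRunner" to test locally
        project="my-gcp-project",            # hypothetical
        region="asia-south1",
        temp_location="gs://my-bucket/tmp",  # hypothetical
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText("gs://my-bucket/orders.csv", skip_header_lines=1)
            | "Parse" >> beam.Map(parse_row)
            | "Write" >> beam.io.WriteToBigQuery(
                "my-gcp-project:sales.orders",
                schema="order_id:STRING,amount:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )

if __name__ == "__main__":
    run()
```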
Posted 3 months ago
6 - 11 years
15 - 30 Lacs
Bengaluru
Work from Office
Role: GCP Data Engineer
Location: Bangalore
Experience: 6-12 years
Mode: Work from office
Job Description: We are seeking a talented GCP Data Engineer to join our team and help us design and implement robust data pipelines and analytics solutions on Google Cloud Platform (GCP). The ideal candidate will have strong expertise in BigQuery, Dataflow, Cloud Composer, and Dataproc, along with experience in AI/ML tools such as Google Vertex AI or Dialogflow.
Key Responsibilities:
- Design, develop, and maintain data pipelines and workflows using Dataflow, Cloud Composer, and Dataproc (see the DAG sketch below).
- Develop optimized queries and manage large-scale datasets using BigQuery.
- Collaborate with cross-functional teams to gather requirements and translate business needs into scalable data solutions.
- Implement best practices for data engineering, including version control, CI/CD pipelines, and data governance.
- Work on AI/ML use cases, leveraging Google Vertex AI or Dialogflow to create intelligent solutions.
- Perform data transformations, aggregations, and ETL processes to prepare data for analytics and reporting.
- Monitor and troubleshoot data workflows to ensure reliability, scalability, and performance.
- Document technical processes and provide guidance to junior team members.
Qualifications:
- Experience: 3-5+ years of professional experience in GCP data engineering or related fields.
- Proficiency in BigQuery, Dataflow, Cloud Composer, and Dataproc.
- Exposure to Google Vertex AI, Dialogflow, or other AI/ML platforms.
- Strong programming skills in Python and SQL, and familiarity with Terraform for GCP infrastructure.
- Experience with distributed data processing frameworks like Apache Spark is a plus.
- Knowledge of data security, governance, and best practices for cloud platforms.
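As an illustration of the Cloud Composer workflows this posting describes, below is a minimal Airflow DAG sketch that schedules a daily BigQuery rollup. The DAG id, project, dataset, and table names are hypothetical.

```python
# Minimal Cloud Composer (Airflow) DAG: schedule a daily BigQuery rollup.
# Project/dataset/table names are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_sales_rollup",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    rollup = BigQueryInsertJobOperator(
        task_id="aggregate_daily_sales",
        configuration={
            "query": {
                "query": """
                    SELECT order_date, SUM(amount) AS total_amount
                    FROM `my-gcp-project.sales.orders`  -- hypothetical table
                    GROUP BY order_date
                """,
                "useLegacySql": False,
                "destinationTable": {
                    "projectId": "my-gcp-project",
                    "datasetId": "sales",
                    "tableId": "daily_totals",
                },
                "writeDisposition": "WRITE_TRUNCATE",
            }
        },
    )
```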
Posted 3 months ago
3 - 8 years
8 - 18 Lacs
Chennai, Hyderabad
Work from Office
Title: Associate/Senior/Lead Data Engineer
Experience: 3 to 8 years
Location: Chennai/Hyderabad
Required skills: GCP, BigQuery, Python/Hadoop, Teradata/Dataproc, Airflow.
Regards, Sharmeela
Sharmeela.s@saaconsulting.co.in
Posted 3 months ago
5 - 10 years
4 - 9 Lacs
Bengaluru
Remote
Role & responsibilities
- 5+ years of experience in data engineering/analytics
- Proficiency in SQL and Python
- Experience with Snowflake and Google BigQuery is highly desirable
- Skilled in statistical analysis and exploratory data analysis (EDA); see the sketch below
- Experience building dashboards in BI tools such as Power BI and Superset
- Strong problem-solving skills and the ability to adapt to new challenges
- Excellent communication skills and the ability to proactively engage with clients
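A small sketch of the SQL-plus-Python EDA workflow this role describes, using the BigQuery client as the example warehouse. The table name is hypothetical, and application-default credentials are assumed.

```python
# Pull a sample from BigQuery into pandas and run quick summary statistics.
# Table name is a hypothetical placeholder.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials
sql = """
    SELECT customer_id, order_date, amount
    FROM `my-gcp-project.sales.orders`  -- hypothetical table
    WHERE order_date >= '2024-01-01'
"""
df = client.query(sql).to_dataframe()

print(df.describe())             # distribution of numeric columns
print(df.isna().mean())          # share of missing values per column
print(df.groupby("customer_id")["amount"].sum().nlargest(10))  # top customers
```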
Posted 3 months ago
3 - 6 years
5 - 8 Lacs
Kolkata
Work from Office
As an Associate Software Developer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.
In this role, your responsibilities may include:
Must have:
- End-to-end functional knowledge of the data pipelines/transformations the candidate has implemented, including the purpose/KPIs the transformations served.
- Expert in SQL: able to perform data analysis and investigation using SQL queries.
- Implementation knowledge of advanced SQL functions: regular expressions, aggregation, pivoting, ranking, deduplication, etc. (see the sketch below).
- BigQuery and BigQuery transformations (using stored procedures).
- Data modelling concepts: star and snowflake schemas, fact and dimension tables, joins, cardinality, etc.
- GCP services related to data pipelines, such as Workflows, Cloud Composer, Cloud Scheduler, and Cloud Storage.
- Understanding of CI/CD and related tools: Git and Terraform.
- Other GCP services such as Dataflow, Cloud Build, Pub/Sub, Cloud Functions, Cloud Run, and Cloud Workstations.
- BigQuery performance tuning; Python-based API development experience; Spark development experience.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- Develop/convert database objects (tables, views, procedures, functions, triggers, etc.) from one database platform to another (Hadoop to GCP).
- Implement specific data replication mechanisms (CDC, file data transfer, bulk data transfer, etc.).
- Expose data as APIs.
- Participate in the modernization roadmap journey.
- Analyze discovery and analysis outcomes; lead discovery and analysis workshops/playbacks.
- Identify application dependencies and source/target database incompatibilities.
- Analyze non-functional requirements (security, HA, RTO/RPO, storage, compute, network, performance benchmarks, etc.).
- Prepare effort estimates, WBS, staffing plan, RACI, RAID, etc.
- Lead the team to adopt the right tools for various migration and modernization methods.
Preferred technical and professional experience:
- You thrive on teamwork and have excellent verbal and written communication skills.
- Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions.
- Ability to communicate results to technical and non-technical audiences.
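The "advanced SQL" deduplication pattern named above can be illustrated with a short sketch. The ROW_NUMBER() approach shown here is one common way to keep only the latest row per key, executed through the BigQuery Python client; the table names are hypothetical.

```python
# Deduplicate a staging table: keep the newest record per customer_id.
# Table names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()
dedup_sql = """
    CREATE OR REPLACE TABLE `my-gcp-project.staging.customers_dedup` AS
    SELECT * EXCEPT (rn)
    FROM (
      SELECT
        *,
        ROW_NUMBER() OVER (
          PARTITION BY customer_id   -- dedup key
          ORDER BY updated_at DESC   -- keep the newest row
        ) AS rn
      FROM `my-gcp-project.staging.customers_raw`
    )
    WHERE rn = 1
"""
client.query(dedup_sql).result()  # block until the job finishes
```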
Posted 3 months ago
12 - 17 years
35 - 60 Lacs
Chennai, Bengaluru
Hybrid
At ZoomInfo, we encourage creativity, value innovation, demand teamwork, expect accountability, and cherish results. We value your take-charge, take-initiative, get-stuff-done attitude and will help you unlock your growth potential. One great choice can change everything. Thrive with us at ZoomInfo.
ZoomInfo is a rapidly growing data-driven company, and as such we understand the importance of a comprehensive and solid data solution to support decision making in our organization. Our vision is to have a consistent, democratized, and accessible single source of truth for all company data analytics and reporting. Our goal is to improve decision-making processes by having the right information available when it is needed. As a Principal Software Engineer in our Data Platform infrastructure team, you'll have a key role in building and designing the strategy of our Enterprise Data Engineering group.
What you'll do:
- Design and build a highly scalable data platform to support data pipelines for diversified and complex data flows.
- Track and identify relevant new technologies in the market and push their implementation into our pipelines through research and POC activities.
- Deliver scalable, reliable, and reusable data solutions.
- Lead, build, and continuously improve our data gathering, modeling, and reporting capabilities and self-service data platforms.
- Work closely with Data Engineers, Data Analysts, Data Scientists, Product Owners, and Domain Experts to identify data needs.
- Develop processes and tools to monitor, analyze, maintain, and improve data operation, performance, and usability.
What you bring:
- Relevant Bachelor's degree or other equivalent software engineering background.
- 12+ years of experience as an infrastructure / data platform / big data software engineer.
- Experience with AWS/GCP cloud services such as GCS/S3, Lambda/Cloud Functions, EMR/Dataproc, Glue/Dataflow, and Athena.
- IaC design and hands-on experience.
- Familiarity designing CI/CD pipelines with Jenkins, GitHub Actions, or similar tools.
- Experience in designing, building, and maintaining enterprise systems in a big data environment on public cloud.
- Strong SQL abilities and hands-on experience with SQL, performing analysis and performance optimizations.
- Hands-on experience in Python or an equivalent programming language.
- Experience administering data warehouse solutions (such as BigQuery, Redshift, or Snowflake).
- Experience with data modeling, data catalog concepts, data formats, and data pipeline/ETL design, implementation, and maintenance.
- Experience with Airflow and DBT - an advantage.
- Experience with Kubernetes using GKE or EKS - an advantage.
- Experience with development practices (Agile, TDD) - an advantage.
Posted 3 months ago
4 - 7 years
5 - 8 Lacs
Bengaluru
Work from Office
As an Associate Software Developer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.
In this role, your responsibilities may include:
Must have:
- End-to-end functional knowledge of the data pipelines/transformations the candidate has implemented, including the purpose/KPIs the transformations served.
- Expert in SQL: able to perform data analysis and investigation using SQL queries.
- Implementation knowledge of advanced SQL functions: regular expressions, aggregation, pivoting, ranking, deduplication, etc.
- BigQuery and BigQuery transformations (using stored procedures).
- Data modelling concepts: star and snowflake schemas, fact and dimension tables, joins, cardinality, etc.
- GCP services related to data pipelines, such as Workflows, Cloud Composer, Cloud Scheduler, and Cloud Storage.
- Understanding of CI/CD and related tools: Git and Terraform.
- Other GCP services such as Dataflow, Cloud Build, Pub/Sub, Cloud Functions, Cloud Run, and Cloud Workstations.
- BigQuery performance tuning; Python-based API development experience; Spark development experience.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- Develop/convert database objects (tables, views, procedures, functions, triggers, etc.) from one database platform to another (Hadoop to GCP).
- Implement specific data replication mechanisms (CDC, file data transfer, bulk data transfer, etc.).
- Expose data as APIs.
- Participate in the modernization roadmap journey.
- Analyze discovery and analysis outcomes; lead discovery and analysis workshops/playbacks.
- Identify application dependencies and source/target database incompatibilities.
- Analyze non-functional requirements (security, HA, RTO/RPO, storage, compute, network, performance benchmarks, etc.).
- Prepare effort estimates, WBS, staffing plan, RACI, RAID, etc.
- Lead the team to adopt the right tools for various migration and modernization methods.
Preferred technical and professional experience:
- You thrive on teamwork and have excellent verbal and written communication skills.
- Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions.
- Ability to communicate results to technical and non-technical audiences.
Posted 3 months ago
6 - 11 years
20 - 35 Lacs
Pune, Navi Mumbai, Mumbai (All Areas)
Hybrid
Job Description: We are seeking a skilled GCP Data Engineer to join our data team. The ideal candidate will have hands-on experience with Google Cloud Platform (GCP) services, especially BigQuery, and strong expertise in SQL and Python. You will be responsible for designing, building, and maintaining data pipelines, ensuring efficient data processing and analytics solutions.
Key Responsibilities:
- Design, develop, and optimize data pipelines using BigQuery and Dataflow.
- Write efficient SQL queries for data transformation, aggregation, and reporting.
- Develop and maintain ETL/ELT processes using Python and Apache Airflow.
- Integrate data from various sources, including Cloud Storage, Pub/Sub, and APIs.
- Implement data quality checks, validation, and performance tuning.
- Collaborate with Data Analysts, Data Scientists, and business teams to enable data-driven decision-making.
- Ensure best practices in data security, governance, and compliance.
- Monitor and troubleshoot data pipeline performance and failures.
Required Skills & Qualifications:
- 4+ years of experience in data engineering or related roles.
- Strong proficiency in BigQuery and SQL (writing complex queries, performance tuning, partitioning, clustering; see the sketch below).
- Hands-on experience with Python for data engineering (Pandas, NumPy, PySpark, etc.).
- Experience with Google Cloud Platform (GCP) services such as Cloud Storage, Pub/Sub, Dataflow, Cloud Functions, and Composer (Airflow).
- Familiarity with ETL/ELT frameworks and data modeling techniques.
- Experience with CI/CD pipelines for data workflows.
- Knowledge of data warehousing concepts and best practices.
- Strong problem-solving and debugging skills.
Education & Certifications:
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- GCP Data Engineer certification (preferred but not mandatory).
If interested, share your CV with the details below:
- Current location
- Preferred location
- Current CTC
- Expected CTC
- Notice period
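To illustrate the partitioning and clustering practices this posting calls out, here is a minimal sketch that creates a date-partitioned, clustered BigQuery table via the Python client, so queries filtering on order_date and customer_id scan less data. All names are hypothetical.

```python
# Create a date-partitioned, clustered table for cheaper, faster queries.
# Project/dataset/table names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()
ddl = """
    CREATE TABLE IF NOT EXISTS `my-gcp-project.sales.orders_partitioned`
    (
      order_id    STRING,
      customer_id STRING,
      order_date  DATE,
      amount      NUMERIC
    )
    PARTITION BY order_date    -- prunes partitions on date filters
    CLUSTER BY customer_id     -- co-locates rows for cheaper lookups
"""
client.query(ddl).result()
```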
Posted 3 months ago
5 - 10 years
20 - 35 Lacs
Pune, Gurgaon, Jaipur
Hybrid
Job Opportunity: Data Engineer
Location: Xebia (hybrid, 3 days a week at Xebia locations)
Shift: 2 PM - 11 PM
Notice Period: Immediate joiners or up to 30 days
Are you an experienced Data Engineer with a strong background in Python, AWS, and data streaming technologies? If you are passionate about building scalable, reliable, and secure data solutions, we want to hear from you!
Key Responsibilities & Skills:
- 5+ years of experience in Python (v3.6 or higher) and Python frameworks such as pytest
- Expertise in AWS cloud tools and technologies, including but not limited to AWS CDK, S3, Lambda, DynamoDB, EventBridge, Kinesis, and CloudWatch
- Experience in data engineering, real-time streaming, and event-driven architectures
- Strong understanding of the SDLC, best practices, and microservices architecture using FastAPI, GraphQL, and Pydantic (see the sketch below)
- Adherence to best practices for scalable and secure data pipeline development
- Strong attention to detail and problem-solving skills
Who You Are:
- Passionate and personable with a strong technical mindset
- Active in meetups, conferences, and webinars
- Aligned with Xebia's values and culture
- Well aware of Xebia's business, Glassdoor ratings, and industry positioning
- Fits within the desired experience and salary range
How to Apply: If you meet the above criteria and are interested, please share your updated CV along with the following details:
- Total experience
- Current CTC
- Expected CTC
- Current location
- Preferred location
- Notice period / last working day (if serving notice)
Kindly share your details only if you have not applied recently and are not currently in the interview process for any open role at Xebia. Looking forward to your response!
Best regards,
Vijay S
Assistant Manager - TAG
vijay.s@xebia.com
https://www.linkedin.com/in/vijay-selvarajan/
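As a small illustration of the FastAPI + Pydantic microservice style this posting mentions, here is a minimal sketch of a typed ingestion endpoint. The model, route, and default currency are invented for illustration, not taken from the posting.

```python
# Minimal FastAPI service with a Pydantic-validated event payload.
# Run with: uvicorn app:app --reload  (assuming this file is app.py)
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class OrderEvent(BaseModel):
    order_id: str
    amount: float
    currency: str = "INR"  # hypothetical default, validated by Pydantic

@app.post("/events/orders")
def ingest_order(event: OrderEvent) -> dict:
    # In a real service this handler would publish to a stream (e.g. Kinesis).
    return {"status": "accepted", "order_id": event.order_id}
```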
Posted 3 months ago
7 - 12 years
9 - 14 Lacs
Gurgaon
Work from Office
Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Ability to analyse data against functional business requirements and to interface directly with customers.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform.
- Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer.
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
- Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.
Preferred technical and professional experience:
- Intuitive individual with an ability to manage change and proven time management.
- Proven interpersonal skills while contributing to team effort by accomplishing related results as needed.
- Keeps technical knowledge up to date by attending educational workshops, reviewing publications, and focusing on required skills.
Posted 3 months ago
5 - 10 years
7 - 12 Lacs
Chennai
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: Google BigQuery
Good-to-have skills: NA
Minimum 5 years of experience is required.
Educational Qualification: 15 years of full-time education
Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for managing the team and ensuring successful project delivery. Your typical day will involve collaborating with multiple teams, making key decisions, and providing solutions to problems for your immediate team and across multiple teams.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead the effort to design, build, and configure applications.
- Act as the primary point of contact.
- Manage the team and ensure successful project delivery.
Professional & Technical Skills:
- Must-have: proficiency in Google BigQuery.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity.
Additional Information:
- The candidate should have a minimum of 5 years of experience in Google BigQuery.
- This position is based at our Chennai office.
- 15 years of full-time education is required.
Posted 3 months ago
6 - 8 years
8 - 18 Lacs
Gurgaon
Remote
- Experience with Google Cloud Platform (GCP) and BigQuery
- Experience with Python
- Experience with SQL
- Knowledge of and experience with Scrum development
- Experience with CI/CD pipelines
Posted 3 months ago
9 - 14 years
25 - 35 Lacs
Bengaluru
Work from Office
About Client: Hiring for one of the most prestigious multinational corporations!
Job Title: GCP Data Architect
Required skills and qualifications: Designing, managing, and implementing on Google Cloud Platform: Cloud Storage, Dataflow, Cloud SQL
Qualification: Any graduate or above
Relevant experience: 9 to 14 years
Location: Hyderabad/Bangalore/Pune
CTC range: 25 to 35 LPA
Notice period: Currently serving / 30 days / 60 days
Mode of interview: Virtual
Mode of work: In office
Sana F
Staffing Analyst - IT Recruiter
Black and White Business Solutions Pvt Ltd
Bangalore, Karnataka, INDIA
sana.f@blackwhite.in | www.blackwhite.in
+91 9902578775
Posted 3 months ago
3 - 6 years
6 - 13 Lacs
Chennai, Bengaluru, Hyderabad
Work from Office
Role & Responsibilities
- Expertise in GCP (BigQuery, Dataproc), SQL/HQL, and Linux/Unix.
- Experience in at least one programming language and unit testing principles.
- Knowledge of Hadoop-to-BigQuery migration, Apache Airflow, Kafka, and shell scripting is a plus.
- Proficiency in Python/PySpark, Scala, or Java and familiarity with GCP Looker is desirable.
- Ability to develop and optimize data pipelines in cloud-based environments.
Posted 3 months ago
3 - 6 years
6 - 16 Lacs
Chennai, Bengaluru
Work from Office
Role & responsibilities
- Working knowledge of Google Cloud Platform (GCP), specifically BigQuery and Dataproc.
- Hands-on experience with a query language (SQL/HQL).
- Experience working in Linux/Unix environments.
- Proficiency in at least one programming language (Python, PySpark, Scala, or Java).
- Solid understanding of unit testing principles.
Preferred candidate profile
- Exposure to pipeline migration from Hadoop to BigQuery (see the sketch below).
- Familiarity with Apache Airflow for workflow automation.
- Knowledge of shell scripting.
- Experience with Kafka for real-time data streaming.
- Understanding of GCP Looker for data visualization.
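One plausible shape of the Hadoop-to-BigQuery migration step mentioned above is a Dataproc PySpark job that reads a Hive table and writes it to BigQuery through the spark-bigquery connector. This sketch assumes the connector jar is available on the cluster; the Hive table, project, and staging bucket names are hypothetical.

```python
# Dataproc-style PySpark step: Hive source -> BigQuery destination.
# Assumes the spark-bigquery connector is on the cluster classpath.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("hive-to-bigquery")
    .enableHiveSupport()
    .getOrCreate()
)

# Hypothetical Hive source table.
df = spark.sql("SELECT order_id, customer_id, amount FROM legacy_db.orders")

(
    df.write.format("bigquery")
    .option("table", "my-gcp-project.sales.orders")     # hypothetical destination
    .option("temporaryGcsBucket", "my-staging-bucket")  # staging for the load
    .mode("append")
    .save()
)
```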
Posted 3 months ago
3 - 7 years
11 - 15 Lacs
Mumbai
Work from Office
A Data Platform Engineer specialises in the design, build, and maintenance of cloud-based data infrastructure and platforms for data-intensive applications and services. They develop Infrastructure as Code and manage the foundational systems and tools for efficient data storage, processing, and management. This role involves architecting robust and scalable cloud data infrastructure, including selecting and implementing suitable storage solutions, data processing frameworks, and data orchestration tools. Additionally, a Data Platform Engineer ensures the continuous evolution of the data platform to meet changing data needs and leverage technological advancements, while maintaining high levels of data security, availability, and performance. They are also tasked with creating and managing processes and tools that enhance operational efficiency, including optimising data flow and ensuring seamless data integration, all of which are essential for enabling developers to build, deploy, and operate data-centric applications efficiently.
Job Description - Grade Specific
A senior leadership role that entails the oversight of multiple teams or a substantial team of data platform engineers, the management of intricate data infrastructure projects, and the making of strategic decisions that shape technological direction within the realm of data platform engineering. Key responsibilities encompass:
- Strategic Leadership: Leading multiple data platform engineering teams, steering substantial projects, and setting the strategic course for data platform development and operations.
- Complex Project Management: Supervising the execution of intricate data infrastructure projects, ensuring alignment with client objectives and the delivery of value.
- Technical and Strategic Decision-Making: Making well-informed decisions concerning data platform architecture, tools, and processes, balancing technical considerations with broader business goals.
- Influencing Technical Direction: Using deep technical expertise in data platform engineering to influence the direction of the team and the client, driving enhancements in data platform technologies and processes.
- Innovation and Contribution to the Discipline: Serving as an innovator and influencer within the field of data platform engineering, contributing to the advancement of the discipline through thought leadership and the sharing of knowledge.
- Leadership and Mentorship: Offering mentorship and guidance to both managers and technical personnel, cultivating a culture of excellence and innovation within the domain of data platform engineering.
Posted 3 months ago
5 - 10 years
5 - 9 Lacs
Pune
Work from Office
GCP Data Engineer JD:
- Proficiency in Google Cloud Platform (GCP) and its associated tools, particularly BigQuery.
- Query, certify, and scan data using SQL and PowerShell, retrieving ownership information from sources like Active Directory and BigQuery.
- Leverage Google Cloud Platform tools to manage and process large datasets.
- Ensure data accuracy and consistency through validation and troubleshooting.
Required Skills:
- Proficiency in Google Cloud Platform (GCP), SQL, and PowerShell.
- Experience building reports and dashboards in Power BI.
- Familiarity with data sources like Active Directory.
- Strong problem-solving and communication skills.
Posted 3 months ago
4 - 8 years
7 - 15 Lacs
Telangana
Work from Office
Position Description:
- Full-stack development on Google Cloud Platform (GCP)
- Development in Cloud Run / Cloud Functions
- Cloud Pub/Sub integration (see the sketch below)
- BigQuery development and operations
- Cloud Logging/Monitoring principles and operations
- IAM, SQL, Tekton
- CI/CD on Google Cloud, Terraform development
- Dataflow
Programming languages: Python at expert level; knowledge of Java and Spring Boot is a plus.
Software development acumen:
- Excellent problem-solving skills
- Experience collaborating in a team environment and working independently
- Ability to describe and teach coding best practices and production code standards
- Ability to quickly propose GCP-based solutions aligned with the latest Google documentation, whitepapers, and community publications
- Ability to trace and resolve technical issues with minimal supervision
- Demonstrated proficiency in understanding and implementing process logic and workflows
- Motivated and keen to work in a collaborative environment, with a focus on team success over and above individual success
Skills Required: GCP Professional Cloud Architect or Data Engineer certification is a must. Python at expert level.
Skills Preferred: Java, Spring Boot.
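To illustrate the Pub/Sub integration with Cloud Functions and BigQuery listed above, here is a minimal sketch of a background function that decodes a Pub/Sub message and streams it into a table. The function name and table id are hypothetical.

```python
# Background Cloud Function: Pub/Sub message -> BigQuery streaming insert.
# Function name and table id are hypothetical placeholders.
import base64
import json

from google.cloud import bigquery

client = bigquery.Client()
TABLE_ID = "my-gcp-project.events.raw_events"  # hypothetical

def handle_pubsub(event, context):
    """Triggered by a Pub/Sub message; event["data"] is base64-encoded."""
    payload = json.loads(base64.b64decode(event["data"]).decode("utf-8"))
    errors = client.insert_rows_json(TABLE_ID, [payload])  # streaming insert
    if errors:
        # Raising surfaces the failure and triggers a retry if retries are enabled.
        raise RuntimeError(f"BigQuery insert failed: {errors}")
```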
Posted 3 months ago
8 - 13 years
25 - 27 Lacs
Bengaluru
Remote
Requirements and Responsibilities:
- Plan, build, and deliver features, and manage team members
- Work in an agile environment that follows scrum-based delivery
- Very strong data mindset, with expertise in SQL and data analysis
- Strong in data pipeline building using Python in Airflow DAGs, with expertise in Google Cloud Platform components like BigQuery, buckets, IAM, etc.
- Exposure to Cassandra, Cosmos DB, SQL Server, and cloud platforms like Azure Databricks
- Good written and verbal communication with team members, external teams, and leadership
- Supply chain domain knowledge
- Experience working in an onshore-offshore model
Required Documentation: Aadhaar card and LinkedIn ID.
Qualification:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Strong communication and interpersonal skills.
- Ability to work independently and as part of a team.
Posted 3 months ago
3 - 7 years
11 - 15 Lacs
Mumbai
Work from Office
Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world.
Job Description
A Data Platform Engineer specialises in the design, build, and maintenance of cloud-based data infrastructure and platforms for data-intensive applications and services. They develop Infrastructure as Code and manage the foundational systems and tools for efficient data storage, processing, and management. This role involves architecting robust and scalable cloud data infrastructure, including selecting and implementing suitable storage solutions, data processing frameworks, and data orchestration tools. Additionally, a Data Platform Engineer ensures the continuous evolution of the data platform to meet changing data needs and leverage technological advancements, while maintaining high levels of data security, availability, and performance. They are also tasked with creating and managing processes and tools that enhance operational efficiency, including optimising data flow and ensuring seamless data integration, all of which are essential for enabling developers to build, deploy, and operate data-centric applications efficiently.
Job Description - Grade Specific
A senior leadership role that entails the oversight of multiple teams or a substantial team of data platform engineers, the management of intricate data infrastructure projects, and the making of strategic decisions that shape technological direction within the realm of data platform engineering. Key responsibilities encompass:
- Strategic Leadership: Leading multiple data platform engineering teams, steering substantial projects, and setting the strategic course for data platform development and operations.
- Complex Project Management: Supervising the execution of intricate data infrastructure projects, ensuring alignment with client objectives and the delivery of value.
- Technical and Strategic Decision-Making: Making well-informed decisions concerning data platform architecture, tools, and processes, balancing technical considerations with broader business goals.
- Influencing Technical Direction: Using deep technical expertise in data platform engineering to influence the direction of the team and the client, driving enhancements in data platform technologies and processes.
- Innovation and Contribution to the Discipline: Serving as an innovator and influencer within the field of data platform engineering, contributing to the advancement of the discipline through thought leadership and the sharing of knowledge.
- Leadership and Mentorship: Offering mentorship and guidance to both managers and technical personnel, cultivating a culture of excellence and innovation within the domain of data platform engineering.
Posted 3 months ago
3 - 5 years
7 - 17 Lacs
Bengaluru
Work from Office
About the Role
The Consumer Model Development Centre, within the CDAI organization, is looking for a quantitative analytics specialist to join our team and help us solve challenging and interesting business problems through data exploration, advanced analytics, and visualization. In this highly valued role, you will work on ML/AI models within Centers of Excellence (COEs) with varied competencies, including model development, monitoring, deployment, and analytic quality review. This is a highly hands-on position in ML/AI and data science.
Responsibilities
- Work individually or as part of a team on data science projects, and work closely with business partners across the organization.
- Perform various complex activities related to statistical/machine learning.
- Provide analytical support for developing, evaluating, implementing, monitoring, and executing models across business verticals using emerging technologies, including but not limited to Python, Spark, and H2O.
- Work with large datasets using SQL and present conclusions to key stakeholders.
- Establish a consistent and collaborative framework with the business and act as a primary point of contact in delivering solutions.
- Build quick prototypes to check feasibility and value to the business.
- Develop and maintain a modular code base for reusability.
- Review and validate models and help improve model performance within the purview of banking regulations.
- Work closely with technology teams to deploy models to production.
- Prepare detailed project documentation, for both internal and external use, that complies with regulatory and internal audit requirements.
Required skills:
- 2+ years of quantitative analytics experience, or the equivalent demonstrated through one or a combination of work experience, training, military experience, or education.
- Master's degree or higher in statistics, mathematics, physics, engineering, computer science, economics, or another quantitative discipline.
Desired skills:
- Bachelor's or Master's degree in an engineering field such as Computer Science, Information Technology, or Electrical Engineering, or M.Sc./M.Phil. in statistics, economics, mathematics, operations research, or engineering physics.
- 3-5 years of relevant hands-on experience in data science and advanced analytics.
- Hands-on exposure to Python and SQL; working knowledge of libraries such as scikit-learn, pandas, NumPy, MLlib, matplotlib, and Keras.
- Proficiency in data mining and statistical analysis; experience in developing and implementing models.
- Statistical models: linear regression, logistic regression, time series analysis, multivariate statistical analysis.
- Machine learning models: random forest, XGBoost, GBM, SVM.
- Exposure to deep learning frameworks: ANN, RNN, CNN, LSTM.
- Excellent understanding of model metrics, including AUC, ROC, and F-statistics, with a clear understanding of how model performance is tuned (see the sketch below).
- Strong programming skills; exposure to one or more big data technologies: SQL, Aster, Teradata, Hadoop, Spark, H2O, BigQuery.
- Exposure to Google Cloud Platform.
- Critical thinking and strong problem-solving skills; ability to learn the business aspects quickly.
- Knowledge of the banking industry and products in at least one LOB, such as credit cards, mortgage, deposits, loans, or wealth management, is desirable.
- Knowledge of functional areas such as risk, marketing, operations, or supply chain in the banking industry is desirable.
- Ability to multi-task and prioritize between projects; ability to work independently and as part of a team.
- Working expertise in TensorFlow, Keras, or PyTorch would be an added advantage.
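As a compact illustration of the model-development and AUC evaluation skills listed above, here is a sketch that trains a logistic regression on synthetic data and reports ROC AUC on a held-out split. The data is generated for illustration, not drawn from any real banking dataset.

```python
# Train/evaluate loop: logistic regression with ROC AUC on held-out data.
# Features and labels are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
X = rng.normal(size=(5000, 10))  # synthetic features
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=5000) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
scores = model.predict_proba(X_test)[:, 1]  # probability of the positive class
print(f"ROC AUC: {roc_auc_score(y_test, scores):.3f}")
```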
Posted 3 months ago
5 - 10 years
10 - 20 Lacs
Pune
Hybrid
About GSPANN: Headquartered in California, U.S.A., GSPANN provides consulting and IT services to global clients. We help clients transform how they deliver business value by helping them optimize their IT capabilities, practices, and operations, with our experience in retail, high technology, and manufacturing. With five global delivery centers and 1900+ employees, we provide the intimacy of a boutique consultancy with the capabilities of a large IT services firm. As we march ahead on a tremendous growth trajectory, we seek passionate and talented professionals to join our growing family.
Job Position (Title): BEAT Engineer
Experience Required: 5+ years
Location: Hyderabad/Gurgaon/Pune
Technical skill set: Data Visualization, Python, NodeJS/ReactJS, SQL
Job Description:
- Bachelor's degree in Computer Science or Engineering, or equivalent experience.
- 5+ years of experience as a Python developer with a strong portfolio of product development projects.
- Expertise in one or more cloud platforms such as AWS, Azure, or GCP, and their compute, storage, IAM, API, and SQL services.
- Proficient knowledge of object-oriented programming combined with software development experience: Python, PySpark, and SQL.
- Experience with popular Python frameworks such as Django, Flask, or Pyramid.
- Data engineering experience throughout the lifecycle (design, engineering, testing, deployment).
- Expertise in data pipeline scheduling software such as Apache Airflow or Databricks Workflows.
- Good understanding of data modelling and data warehousing concepts.
- Expertise with databases such as Snowflake, Teradata, or Oracle.
- Strong expertise in SQL, including stored procedures and events.
- Experience with tools like Jenkins 2.0; knowledgeable with GitHub (version control) and Jira (issue tracking).
- Good to know: Apache Spark, Hadoop, Hive, Databricks.
- Good to have: working knowledge of NodeJS/Python with React.js and its core principles.
- Experience writing unit tests to validate code against functional/technical specs.
Roles & Responsibilities:
- Work closely with cross-functional teams, including designers, product managers, and other developers, to create high-quality software solutions.
- Participate in code reviews and provide constructive feedback to peers.
- Analyze requirements, identify areas for improvement, and devise innovative solutions to technical challenges.
- Troubleshoot and debug applications to resolve issues and improve performance.
- Develop and manage database schemas, write optimized SQL queries, and ensure efficient data storage and retrieval; work with database technologies like PostgreSQL, MySQL, or NoSQL databases.
- Stay updated with the latest industry trends, technologies, and best practices in Python development.
- Continuously improve and optimize the codebase for performance and scalability.
Posted 3 months ago
5 - 10 years
10 - 20 Lacs
Gurgaon
Hybrid
About GSPANN: Headquartered in California, U.S.A., GSPANN provides consulting and IT services to global clients. We help clients transform how they deliver business value by helping them optimize their IT capabilities, practices, and operations, with our experience in retail, high technology, and manufacturing. With five global delivery centers and 1900+ employees, we provide the intimacy of a boutique consultancy with the capabilities of a large IT services firm. As we march ahead on a tremendous growth trajectory, we seek passionate and talented professionals to join our growing family.
Job Position (Title): BEAT Engineer
Experience Required: 5+ years
Location: Hyderabad/Gurgaon/Pune
Technical skill set: Data Visualization, Python, NodeJS/ReactJS, SQL
Job Description:
- Bachelor's degree in Computer Science or Engineering, or equivalent experience.
- 5+ years of experience as a Python developer with a strong portfolio of product development projects.
- Expertise in one or more cloud platforms such as AWS, Azure, or GCP, and their compute, storage, IAM, API, and SQL services.
- Proficient knowledge of object-oriented programming combined with software development experience: Python, PySpark, and SQL.
- Experience with popular Python frameworks such as Django, Flask, or Pyramid.
- Data engineering experience throughout the lifecycle (design, engineering, testing, deployment).
- Expertise in data pipeline scheduling software such as Apache Airflow or Databricks Workflows.
- Good understanding of data modelling and data warehousing concepts.
- Expertise with databases such as Snowflake, Teradata, or Oracle.
- Strong expertise in SQL, including stored procedures and events.
- Experience with tools like Jenkins 2.0; knowledgeable with GitHub (version control) and Jira (issue tracking).
- Good to know: Apache Spark, Hadoop, Hive, Databricks.
- Good to have: working knowledge of NodeJS/Python with React.js and its core principles.
- Experience writing unit tests to validate code against functional/technical specs.
Roles & Responsibilities:
- Work closely with cross-functional teams, including designers, product managers, and other developers, to create high-quality software solutions.
- Participate in code reviews and provide constructive feedback to peers.
- Analyze requirements, identify areas for improvement, and devise innovative solutions to technical challenges.
- Troubleshoot and debug applications to resolve issues and improve performance.
- Develop and manage database schemas, write optimized SQL queries, and ensure efficient data storage and retrieval; work with database technologies like PostgreSQL, MySQL, or NoSQL databases.
- Stay updated with the latest industry trends, technologies, and best practices in Python development.
- Continuously improve and optimize the codebase for performance and scalability.
Posted 3 months ago
4 - 9 years
10 - 20 Lacs
Gurgaon
Hybrid
About GSPANN: Headquartered in California, U.S.A., GSPANN provides consulting and IT services to global clients. We help clients transform how they deliver business value by helping them optimize their IT capabilities, practices, and operations, with our experience in retail, high technology, and manufacturing. With five global delivery centers and 1900+ employees, we provide the intimacy of a boutique consultancy with the capabilities of a large IT services firm. As we march ahead on a tremendous growth trajectory, we seek passionate and talented professionals to join our growing family.
Title: GCP Developers / Leads
Skills: GCP, BigQuery, Python, SQL
Experience: 5+ years
Work Location: Hyderabad/Gurgaon/Pune
Responsibilities:
- Actively participate in all phases of the software development lifecycle, including requirements gathering, functional and technical design, development, testing, roll-out, and support.
- Solve complex business problems by utilizing a disciplined development methodology.
- Produce scalable, flexible, efficient, and supportable solutions using appropriate technologies.
- Analyze the source and target system data; map the transformations that meet the requirements.
- Interact with the client and onsite coordinators during different phases of a project.
Required Skills:
- Hands-on in writing Python and SQL code.
- Strong in writing ETL logic using any ETL tool, SQL script, or Python script.
- Data warehousing knowledge with data model experience.
- Exposure to any scheduling tool.
- Experience with any cloud is an added advantage.
Posted 3 months ago
6 - 11 years
15 - 25 Lacs
Pune, Bengaluru, Noida
Work from Office
Location: Chennai/Bangalore/Noida/Hyderabad/Pune
Skills: GCP BigQuery; Looker / Power BI dashboards; Python, Java, or SQL; ETL; multi-cloud (e.g., Google Cloud, AWS, Azure)
Posted 3 months ago
BigQuery, a powerful cloud-based data warehouse provided by Google Cloud, is in high demand in the job market in India. Companies are increasingly relying on BigQuery to analyze and manage large datasets, driving the need for skilled professionals in this area.
The average salary range for BigQuery professionals in India varies based on experience level. Entry-level positions may start at around INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 15-20 lakhs per annum.
In the field of BigQuery, a typical career progression may include roles such as Junior Developer, Developer, Senior Developer, Tech Lead, and eventually moving into managerial positions such as Data Architect or Data Engineering Manager.
Alongside BigQuery, professionals in this field often benefit from having skills in SQL, data modeling, data visualization tools like Tableau or Power BI, and cloud platforms like Google Cloud Platform or AWS.
As you explore opportunities in the BigQuery job market in India, remember to continuously upskill and stay updated with the latest trends in data analytics and cloud computing. Prepare thoroughly for interviews by practicing common BigQuery concepts and showcase your hands-on experience with the platform. With dedication and perseverance, you can excel in this dynamic field and secure rewarding career opportunities. Good luck!