7.0 - 12.0 years
0 Lacs
Pune, Maharashtra
On-site
As a GCP DBT Manager, your primary responsibility will be to collaborate with the team in designing, building, and maintaining data pipelines and transformations using Google Cloud Platform (GCP) and the Data Build Tool (dbt). The role involves working with tools such as BigQuery, Cloud Composer, and Python, and requires strong SQL skills and a solid grasp of data warehousing concepts. You will also play a crucial role in ensuring data quality, optimizing performance, and working closely with cross-functional teams.

Your key responsibilities will include:
- Data Pipeline Development: Designing, building, and maintaining ETL/ELT pipelines using dbt and GCP services like BigQuery and Cloud Composer.
- Data Modeling: Creating and managing data models and transformations with dbt to ensure efficient and accurate data consumption for analytics and reporting.
- Data Quality: Developing and maintaining a data quality framework, including automated testing and cross-dataset validation.
- Performance Optimization: Writing and optimizing SQL queries to enhance data processing efficiency within BigQuery.
- Collaboration: Working with data engineers, analysts, scientists, and business stakeholders to deliver effective data solutions.
- Incident Resolution: Providing support for day-to-day incident and ticket resolution related to data pipelines.
- Documentation: Creating and maintaining comprehensive documentation for data pipelines, configurations, and procedures.
- Cloud Platform Expertise: Leveraging GCP services such as BigQuery, Cloud Composer, and Cloud Functions for efficient data operations.
- Scripting: Developing and maintaining SQL/Python scripts for data ingestion, transformation, and automation tasks.

Preferred Candidate Profile:
- 7-12 years of experience in data engineering or a related field.
- Strong hands-on experience with Google Cloud Platform (GCP) services, particularly BigQuery.
- Proficiency in using dbt for data transformation, testing, and documentation.
- Advanced SQL skills for data modeling, performance optimization, and querying large datasets.
- Understanding of data warehousing concepts, dimensional modeling, and star schema design.
- Experience with ETL/ELT tools and frameworks such as Apache Beam, Cloud Dataflow, Data Fusion, or Airflow/Composer.

In this role, you will be at the forefront of data pipeline development and maintenance, ensuring data quality, performance optimization, and effective collaboration across teams to deliver impactful data solutions using GCP and dbt.
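As a rough illustration of the scripting side of this role, the sketch below uses the BigQuery Python client to rebuild a staging table, the kind of transformation a dbt model would normally own; the project, dataset, and table names are hypothetical.

```python
from google.cloud import bigquery

# Hypothetical project, dataset, and table names used purely for illustration.
client = bigquery.Client(project="example-project")

# Rebuild a staging table in BigQuery; in a dbt project the same SELECT would
# live in a model file and dbt would handle the materialization.
sql = """
CREATE OR REPLACE TABLE analytics.stg_orders AS
SELECT
  order_id,
  customer_id,
  DATE(order_ts) AS order_date,
  amount
FROM raw.orders
WHERE amount IS NOT NULL
"""

job = client.query(sql)  # submit the query job
job.result()             # block until it finishes, raising on error
print("analytics.stg_orders rebuilt")
```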
Posted 1 week ago
1.0 - 5.0 years
0 Lacs
Ahmedabad, Gujarat
On-site
As a Data Engineer at Synoptek, you will be responsible for designing, developing, and maintaining robust and scalable data pipelines on the Google Cloud Platform (GCP). You will leverage your hands-on experience with GCP services such as BigQuery, Jitterbit, Cloud Dataflow, Cloud Pub/Sub, and Cloud Storage to build efficient data processing solutions. Collaborating with cross-functional teams, you will translate their data needs into technical requirements, ensuring data quality, integrity, and security throughout the data lifecycle.

Your role will involve developing and optimizing ETL/ELT processes to extract, transform, and load data from various sources into data warehouses and data lakes. Additionally, you will build and maintain data models and schemas to support business intelligence and analytics, while troubleshooting data quality issues and performance bottlenecks.

To excel in this position, you should have a Bachelor's degree in Computer Science, Engineering, or a related field, along with 3 to 4 years of experience as a Data Engineer focusing on GCP. Proficiency in Python, SQL, and BigQuery is essential, as is hands-on experience with data ingestion, transformation, and loading tools such as Jitterbit and Apache Beam. A strong understanding of data warehousing and data lake concepts, coupled with experience in data modeling and schema design, will be beneficial.

The ideal candidate will exhibit excellent problem-solving and analytical skills, working both independently and collaboratively with internal and external teams. Familiarity with acquiring and managing data from various sources, as well as the ability to identify trends in complex datasets and propose business solutions, are key attributes for success in this role.

At Synoptek, we value employees who embody our core DNA behaviors, including clarity, integrity, innovation, accountability, and a results-focused mindset. We encourage continuous learning, adaptation, and growth in a fast-paced environment, promoting a culture of teamwork, flexibility, respect, and collaboration. If you have a passion for data engineering, a drive for excellence, and a commitment to delivering impactful results, we invite you to join our dynamic team at Synoptek. Work hard, play hard, and let's achieve superior outcomes together.
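For a flavor of the Pub/Sub and Dataflow work mentioned above, here is a minimal Apache Beam streaming sketch that reads JSON messages from a Pub/Sub subscription and appends them to a BigQuery table; the subscription and table names are hypothetical, and a production pipeline would add error handling, schemas, and windowing.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Hypothetical resource names for illustration only.
SUBSCRIPTION = "projects/example-project/subscriptions/orders-sub"
TABLE = "example-project:analytics.orders"


def run():
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(subscription=SUBSCRIPTION)
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                TABLE,
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()
```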
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
Punjab
On-site
As a GCP Data Engineer in Australia, you will be responsible for leveraging your experience in Google Cloud Platform (GCP) to handle various aspects of data engineering. Your role will involve working on data migration projects from legacy systems such as SQL and Oracle, and designing and building ETL pipelines for data lake and data warehouse solutions on GCP.

In this position, your expertise in GCP data and analytics services will be crucial. You will work with tools like Cloud Dataflow, Cloud Dataprep, Apache Beam/Cloud Composer, BigQuery, Cloud Data Fusion, Cloud Pub/Sub, Cloud Storage, and Cloud Functions. Additionally, you will use the cloud-native GCP CLI/gsutil for operations and scripting languages like Python and SQL to improve data processing efficiency.

Furthermore, your experience with data governance practices, metadata management, data masking, and encryption will be essential. You will use GCP tools such as Cloud Data Catalog and Cloud KMS to ensure data security and compliance. Overall, this role requires a strong foundation in GCP technologies and a proactive approach to data engineering challenges in a dynamic environment.
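To make the ETL pipeline responsibility concrete, below is a minimal Cloud Composer (Airflow) DAG sketch that runs a daily BigQuery load query; the DAG ID, schedule, and table names are assumptions, not part of the posting.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

# Hypothetical dataset and table names for illustration only.
DAILY_LOAD_SQL = (
    "INSERT INTO analytics.orders_daily "
    "SELECT * FROM raw.orders WHERE DATE(order_ts) = CURRENT_DATE()"
)

with DAG(
    dag_id="daily_orders_load",
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 2 * * *",  # run at 02:00 every day
    catchup=False,
) as dag:
    load_orders = BigQueryInsertJobOperator(
        task_id="load_orders",
        configuration={
            "query": {
                "query": DAILY_LOAD_SQL,
                "useLegacySql": False,
            }
        },
    )
```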
Posted 3 weeks ago
4.0 - 10.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
We are seeking a PostgreSQL Database Developer with a minimum of 4 years of experience in database management. The ideal candidate should be passionate about technology, dedicated to continuous learning, and committed to providing exceptional customer experiences through client interactions.

Qualifications:
- A degree in BE/B.Tech/MCA/MS-IT/CS/B.Sc/BCA or a related field.
- Expertise and hands-on experience in PostgreSQL, PL/SQL, Oracle, query optimization, performance tuning, and GCP Cloud.

Job Description:
- Proficient in PL/SQL and PostgreSQL programming, with the ability to write complex SQL queries and stored procedures.
- Experience in migrating database structure and data from Oracle to PostgreSQL, preferably on GCP AlloyDB or Cloud SQL.
- Familiarity with Cloud SQL/AlloyDB and tuning them for better performance.
- Working knowledge of BigQuery, Firestore, Memorystore, Spanner, and bare-metal setups for PostgreSQL.
- Expertise in tuning AlloyDB/Cloud SQL databases for optimal performance.
- Experience with GCP Database Migration Service, MongoDB, Cloud Dataflow, disaster recovery, job scheduling, logging techniques, and OLTP/OLAP.
- Desirable: GCP Database Engineer certification.

Roles & Responsibilities:
- Develop, test, and maintain data architectures.
- Migrate enterprise Oracle databases from on-premises to GCP cloud.
- Tune autovacuum in PostgreSQL.
- Performance-tune PostgreSQL stored procedures and queries.
- Convert Oracle stored procedures and queries to PostgreSQL equivalents.
- Create a hybrid data store combining a data warehouse, NoSQL GCP solutions, and PostgreSQL.
- Migrate Oracle table data to AlloyDB.
- Lead the database team.

Mandatory Skills: PostgreSQL, PL/SQL, BigQuery, GCP Cloud, tuning, and optimization.

To apply, please share your resume at sonali.mangore@impetus.com with details of your current CTC, expected CTC, notice period, and last working day (LWD).
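As an illustration of the autovacuum tuning responsibility, the sketch below uses psycopg2 to set per-table autovacuum storage parameters on a busy table; the connection details and table name are hypothetical, and the right thresholds depend on the workload.

```python
import psycopg2

# Hypothetical connection settings; real credentials would come from a secret store.
conn = psycopg2.connect(host="10.0.0.5", dbname="appdb", user="dba", password="***")

# Tighten autovacuum on a large, frequently updated table so dead tuples are
# reclaimed sooner than the instance-wide defaults would allow.
TUNING_SQL = """
ALTER TABLE public.orders SET (
    autovacuum_vacuum_scale_factor = 0.02,
    autovacuum_analyze_scale_factor = 0.01,
    autovacuum_vacuum_cost_limit = 1000
);
"""

with conn, conn.cursor() as cur:
    cur.execute(TUNING_SQL)

conn.close()
```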
Posted 3 weeks ago
5.0 - 9.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
You should have a strong understanding of the tech stack, including GCP services such as BigQuery, Cloud Dataflow, Pub/Sub, Dataproc, and Cloud Storage. Experience with data processing tools like Apache Beam (batch/stream), Apache Kafka, and Cloud Dataprep is crucial, and proficiency in programming languages like Python, Java/Scala, and SQL is required.

Your expertise should extend to orchestration tools like Apache Airflow (Cloud Composer) and Terraform, and to security aspects including IAM, Cloud Identity, and Cloud Security Command Center. Knowledge of containerization using Docker and Kubernetes (GKE) is essential, and familiarity with machine learning platforms such as Google AI Platform, TensorFlow, and AutoML is expected. Candidates with certifications such as Google Cloud Data Engineer and Cloud Architect are preferred.

You should have a proven track record of designing scalable AI/ML systems in production, focusing on high-performance and cost-effective solutions, along with strong experience with cloud platforms (Google Cloud, AWS, Azure) and cloud-native AI/ML services like Vertex AI and SageMaker. Your role will involve implementing MLOps practices, including model deployment, monitoring, retraining, and version control. Leadership skills are key to guiding teams, mentoring engineers, and collaborating effectively with cross-functional teams to achieve business objectives. A deep understanding of frameworks like TensorFlow, PyTorch, and scikit-learn for designing, training, and deploying models is necessary, as is experience with data engineering principles, scalable pipelines, and distributed systems (e.g., Apache Kafka, Spark, Kubernetes).

Nice-to-have attributes include strong leadership and mentorship capabilities to guide teams towards best practices and high-quality deliverables, excellent problem-solving skills focused on designing efficient, high-performance systems, and effective project management abilities to handle multiple initiatives and ensure timely delivery. Collaboration and teamwork are emphasized to foster a positive and productive work environment.
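As a sketch of the model deployment step in the MLOps workflow described above, the snippet below registers a trained model with Vertex AI and deploys it to an online endpoint using the google-cloud-aiplatform SDK; the project, bucket, and serving container are illustrative assumptions.

```python
from google.cloud import aiplatform

# Hypothetical project, region, artifact location, and container image.
aiplatform.init(project="example-project", location="us-central1")

# Register the trained model artifact in the Vertex AI Model Registry.
model = aiplatform.Model.upload(
    display_name="churn-classifier",
    artifact_uri="gs://example-bucket/models/churn/",
    serving_container_image_uri=(
        "us-docker.pkg.dev/vertex-ai/prediction/sklearn-cpu.1-0:latest"
    ),
)

# Deploy it to an autoscaling endpoint for online prediction.
endpoint = model.deploy(
    machine_type="n1-standard-4",
    min_replica_count=1,
    max_replica_count=2,
)
print(endpoint.resource_name)
```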
Posted 1 month ago
4.0 - 10.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
We are seeking a PostgreSQL Database Developer with a minimum of 4 years of experience in database management. We are looking for an individual who is enthusiastic about technology, committed to continuous learning, and approaches every client interaction as an opportunity to deliver exceptional customer service.

Qualifications:
- BE/B.Tech/MCA/MS-IT/CS/B.Sc/BCA or any related degree.
- Proficiency in PostgreSQL, PL/SQL, Oracle, query optimization, performance tuning, and GCP Cloud.

Key Responsibilities:
- Proficient in PL/SQL and PostgreSQL programming, with the ability to write complex SQL queries and stored procedures.
- Experience in migrating database structure and data from Oracle to PostgreSQL, preferably on GCP AlloyDB or Cloud SQL.
- Expertise in working with Cloud SQL/AlloyDB, tuning AlloyDB/PostgreSQL for enhanced performance, and utilizing BigQuery, Firestore, Memorystore, Spanner, and bare-metal setups.
- Familiarity with GCP Database Migration Service, MongoDB, Cloud Dataflow, database disaster recovery, job scheduling, logging techniques, and OLTP/OLAP.
- Desirable: GCP Database Engineer certification.

Additional Responsibilities:
- Develop, test, and maintain data architectures.
- Migrate enterprise Oracle databases from on-premises to GCP cloud, with a focus on autovacuum in PostgreSQL.
- Performance-tune PostgreSQL stored procedure code and queries.
- Convert Oracle stored procedures and queries to PostgreSQL equivalents.
- Create hybrid data stores integrating data warehouse and NoSQL GCP solutions with PostgreSQL.
- Lead the database team.

Mandatory Skills: PostgreSQL, PL/SQL, BigQuery, GCP Cloud, tuning, and optimization.

If you meet the requirements and are interested in this position, kindly share your resume, along with your current CTC, expected CTC, notice period, and last working day (LWD), at sonali.mangore@impetus.com.
Posted 1 month ago
0.0 years
0 Lacs
Hyderabad / Secunderabad, Telangana, Telangana, India
On-site
Ready to shape the future of work? At Genpact, we don't just adapt to change; we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's industry-first accelerator is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. Our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today.

Inviting applications for the role of Senior Principal Consultant - GCP AI Engineer!

We are seeking an experienced and passionate GCP AI Engineer to join our team and drive innovation through the application of artificial intelligence solutions on the Google Cloud Platform (GCP). The ideal candidate will have a strong understanding of AI principles, cloud computing, and a track record of delivering impactful AI-driven projects.

Responsibilities:
1. GCP AI Solutions Development: Design and develop AI solutions using GCP AI services, including Vertex AI, BigQuery ML, and other specialized AI tools.
2. Data Preprocessing and Engineering: Preprocess and engineer large datasets using GCP data tools and techniques to prepare them for AI model training.
3. Model Training and Deployment: Train and deploy AI models on GCP, optimizing model performance and efficiency while ensuring scalability and reliability.
4. Algorithm Selection and Tuning: Select appropriate AI algorithms and hyperparameters for specific business problems, and tune models to achieve optimal performance.
5. Cloud Infrastructure Management: Provision and manage GCP cloud infrastructure, including virtual machines, containers, and storage systems, to support AI workloads.
6. Performance Monitoring and Optimization: Continuously monitor and evaluate AI models in production, identifying and addressing performance bottlenecks and opportunities for optimization.
7. Collaboration and Communication: Collaborate with cross-functional teams, including data scientists, software engineers, and product managers, to deliver end-to-end AI solutions.
8. Documentation and Knowledge Sharing: Document AI project methodologies, findings, and best practices, and share knowledge with the broader team to foster a culture of innovation.
9. Stay Up-to-Date: Keep abreast of the latest advancements in AI and GCP AI services, attending conferences, workshops, and training programs to enhance skills and knowledge.
10. ML APIs: Develop solutions hands-on with GCP's ML APIs.

Qualifications we seek in you!

Minimum Qualifications:
- Bachelor's degree or equivalent experience in Computer Science, Artificial Intelligence, or a related field.
- Relevant experience in AI development and deployment, with a focus on GCP AI services.
- Strong programming skills in Python and proficiency with cloud-based AI frameworks and libraries.
- Solid understanding of machine learning algorithms, including supervised and unsupervised learning, and deep learning architectures.
- Experience with natural language processing, computer vision, or other specialized AI domains is a plus.
- Familiarity with GCP data management tools and services, such as BigQuery, Cloud Storage, and Cloud Dataflow.

Preferred Qualifications/Skills:
- Excellent problem-solving skills and a strong analytical mindset.
- Strong communication skills, both written and verbal, with the ability to explain technical concepts to non-technical stakeholders.
- Ability to work independently and as part of a team in a fast-paced, dynamic environment.
- Passion for AI and its potential to solve real-world problems using GCP technologies.

Why join Genpact?
- Be a transformation leader: work at the cutting edge of AI, automation, and digital innovation.
- Make an impact: drive change for global enterprises and solve business challenges that matter.
- Accelerate your career: get hands-on experience, mentorship, and continuous learning opportunities.
- Work with the best: join 140,000+ bold thinkers and problem-solvers who push boundaries every day.
- Thrive in a values-driven culture: our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress.

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
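To illustrate the hands-on work with GCP's ML APIs mentioned in the responsibilities, here is a minimal Natural Language API sentiment analysis sketch; the sample text is made up, and this is only one of several pre-trained ML APIs the role could involve.

```python
from google.cloud import language_v1

client = language_v1.LanguageServiceClient()

# Made-up text standing in for real customer feedback.
text = "The migration to the new platform went smoothly and support was excellent."

document = language_v1.Document(
    content=text,
    type_=language_v1.Document.Type.PLAIN_TEXT,
)

# Call the pre-trained sentiment model; no custom training is required.
response = client.analyze_sentiment(request={"document": document})
sentiment = response.document_sentiment
print(f"score={sentiment.score:.2f}, magnitude={sentiment.magnitude:.2f}")
```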
Posted 2 months ago