
11 Dataform Jobs

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

6.0 - 10.0 years

1 - 1 Lacs

Chennai

Hybrid

Overview: TekWissen is a global workforce management provider operating throughout India and many other countries in the world. The client below is a global company with shared ideals and a deep sense of family. From our earliest days as a pioneer of modern transportation, we have sought to make the world a better place, one that benefits lives, communities, and the planet.

Job Title: Specialty Development Practitioner
Location: Chennai
Work Type: Hybrid

Position Description: At the client's Credit Company, we are modernizing our enterprise data warehouse in Google Cloud to enhance data, analytics, and AI/ML capabilities, improve customer experience, ensure regulatory compliance, and boost operational efficiencies. As a GCP Data Engineer, you will integrate data from various sources into novel data products. You will build upon existing analytical data, including merging historical data from legacy platforms with data ingested from new platforms. You will also analyze and manipulate large datasets, activating data assets to enable enterprise platforms and analytics within GCP. You will design and implement the transformation and modernization on GCP, creating scalable data pipelines that land data from source applications, integrate it into subject areas, and build data marts and products for analytics solutions. You will also conduct deep-dive analysis of current-state Receivables and Originations data in our data warehouse, performing impact analysis related to the client's Credit North America modernization and providing implementation solutions. Moreover, you will partner closely with our AI, data science, and product teams, developing creative solutions that build the future for the client's Credit. Experience with large-scale solutions and operationalizing data warehouses, data lakes, and analytics platforms on Google Cloud Platform or other cloud environments is a must. We are looking for candidates with a broad set of analytical and technology skills across these areas who can demonstrate an ability to design the right solutions with the appropriate combination of GCP and third-party technologies for deployment on Google Cloud Platform.

Skills Required: BigQuery, Dataflow, Dataform, Data Fusion, Dataproc, Cloud Composer, Airflow, Cloud SQL, Compute Engine, Google Cloud Platform - BigQuery

Experience Required:
GCP Data Engineer Certified
Successfully designed and implemented data warehouses and ETL processes for over five years, delivering high-quality data solutions
5+ years of complex SQL development experience
2+ years of experience with programming languages such as Python, Java, or Apache Beam
Experienced cloud engineer with 3+ years of GCP expertise, specializing in managing cloud infrastructure and applications in production-scale solutions
Skills Preferred: BigQuery, Dataflow, Dataform, Data Fusion, Dataproc, Cloud Composer, Airflow, Cloud SQL, Compute Engine, Terraform, Tekton, Postgres, PySpark, Python, API, Cloud Build, App Engine, Apache Kafka, Pub/Sub, AI/ML, Kubernetes

Experience Preferred:
In-depth understanding of GCP's underlying architecture and hands-on experience with crucial GCP services, especially those related to data processing (batch/real time), leveraging Terraform, BigQuery, Dataflow, Pub/Sub, Dataform, Astronomer, Data Fusion, Dataproc, PySpark, Cloud Composer/Airflow, Cloud SQL, Compute Engine, Cloud Functions, Cloud Run, Cloud Build, and App Engine, alongside storage services including Cloud Storage, and DevOps tools such as Tekton, GitHub, Terraform, and Docker
Expert in designing, optimizing, and troubleshooting complex data pipelines
Experience developing with microservice architecture on a container orchestration framework
Experience in designing pipelines and architectures for data processing
Passion and self-motivation to develop, experiment with, and implement state-of-the-art data engineering methods and techniques
Self-directed; works independently with minimal supervision and adapts to ambiguous environments
Evidence of a proactive problem-solving mindset and willingness to take the initiative
Strong prioritization, collaboration, and coordination skills, and the ability to simplify and communicate complex ideas to cross-functional teams and all levels of management
Proven ability to juggle multiple responsibilities and competing demands while maintaining a high level of productivity
Data engineering or development experience gained in a regulated financial environment
Experience in coaching and mentoring Data Engineers
Experience with project management tools like Atlassian JIRA
Experience working on an implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment
Experience with data security, governance, and compliance best practices in the cloud
Experience with AI solutions or platforms that support AI solutions
Experience using data science concepts on production datasets to generate insights

Experience Range: 5+ years
Education Required: Bachelor's Degree

TekWissen® Group is an equal opportunity employer supporting workforce diversity.
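To make the pipeline work described above concrete, here is a minimal sketch, assuming Apache Beam on Dataflow, of landing raw CSV records from Cloud Storage into a BigQuery table. The project, bucket, and table names are hypothetical placeholders, not details from the posting.

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_line(line):
    # Split a raw CSV record into the columns the target table expects.
    account_id, amount, event_date = line.split(",")
    return {"account_id": account_id, "amount": float(amount), "event_date": event_date}

def run():
    options = PipelineOptions(
        runner="DataflowRunner",   # swap for "DirectRunner" to test locally
        project="my-project",      # hypothetical project ID
        region="us-central1",
        temp_location="gs://my-bucket/tmp",  # hypothetical bucket
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText(
                "gs://my-bucket/landing/receivables.csv", skip_header_lines=1)
            | "Parse" >> beam.Map(parse_line)
            | "Write" >> beam.io.WriteToBigQuery(
                "my-project:credit_dw.receivables_raw",  # hypothetical table
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )

if __name__ == "__main__":
    run()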

Posted 4 days ago

Apply

5.0 - 10.0 years

6 - 12 Lacs

Bengaluru, Karnataka, India

On-site

We are actively seeking a highly skilled and experienced GCP Data Engineer to join our client's team through Acme Services. This pivotal role requires very strong hands-on experience as a GCP Data Engineer, with exceptional proficiency in SQL and PySpark. The ideal candidate will also possess solid experience with key GCP data services such as BigQuery, Dataform, Dataplex, and similar technologies. We are specifically looking for immediate joiners or candidates currently serving their notice period.

Key Responsibilities
Data Pipeline Development: Design, build, and maintain robust and scalable data pipelines on the Google Cloud Platform (GCP).
SQL and PySpark Expertise: Utilize very strong expertise in SQL for data manipulation and querying, and PySpark for large-scale data processing and transformation (see the PySpark sketch after this listing).
GCP Data Services Implementation: Work extensively with GCP data services, including BigQuery for data warehousing, Dataform for data pipeline orchestration and governance, and Dataplex for data lake management and data governance.
Data Modeling & Optimization: Implement efficient data models and optimize data workflows for performance and cost-effectiveness within the GCP ecosystem.
Troubleshooting & Performance Tuning: Identify and resolve complex data-related issues, performing performance tuning on existing data processes.
Collaboration: Work closely with data scientists, analysts, and other engineering teams to understand data requirements and deliver integrated solutions.

Skills
Very strong hands-on experience as a GCP Data Engineer
Exceptional proficiency in SQL for complex data querying and manipulation
Very strong experience with PySpark for distributed data processing
Solid experience with BigQuery, Dataform, Dataplex, and other relevant GCP data services
Knowledge of data warehousing concepts and ETL/ELT processes
Strong problem-solving and analytical skills
Excellent communication and collaboration abilities

Qualifications
Bachelor's degree in Computer Science, Engineering, or a related quantitative field
Proven experience as a Data Engineer with a significant focus on GCP
Immediate joiners or candidates currently serving their notice period are highly preferred
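As a rough illustration of the SQL-plus-PySpark work this role describes, the sketch below reads a BigQuery table from PySpark (assuming a Dataproc cluster with the spark-bigquery connector), aggregates it, and writes the result back. Table and bucket names are made up for the example.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Assumes a Dataproc cluster with the spark-bigquery connector available.
spark = SparkSession.builder.appName("orders-daily-rollup").getOrCreate()

orders = (
    spark.read.format("bigquery")
    .option("table", "my-project.sales.orders")  # hypothetical source table
    .load()
)

daily = (
    orders.filter(F.col("status") == "COMPLETE")
    .groupBy("order_date", "region")
    .agg(F.sum("amount").alias("revenue"), F.count("*").alias("order_count"))
)

(
    daily.write.format("bigquery")
    .option("table", "my-project.sales.orders_daily")   # hypothetical target table
    .option("temporaryGcsBucket", "my-staging-bucket")  # the connector stages writes via GCS
    .mode("overwrite")
    .save()
)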

Posted 1 week ago

Apply

5.0 - 10.0 years

6 - 12 Lacs

Kolkata, West Bengal, India

On-site

We are actively seeking a highly skilled and experienced GCP Data Engineer to join our client's team through Acme Services. This pivotal role requires very strong hands-on experience as a GCP Data Engineer, with exceptional proficiency in SQL and PySpark. The ideal candidate will also possess solid experience with key GCP data services such as BigQuery, Dataform, Dataplex, and similar technologies. We are specifically looking for immediate joiners or candidates currently serving their notice period.

Key Responsibilities
Data Pipeline Development: Design, build, and maintain robust and scalable data pipelines on the Google Cloud Platform (GCP).
SQL and PySpark Expertise: Utilize very strong expertise in SQL for data manipulation and querying, and PySpark for large-scale data processing and transformation.
GCP Data Services Implementation: Work extensively with GCP data services, including BigQuery for data warehousing, Dataform for data pipeline orchestration and governance, and Dataplex for data lake management and data governance.
Data Modeling & Optimization: Implement efficient data models and optimize data workflows for performance and cost-effectiveness within the GCP ecosystem (see the table-design sketch after this listing).
Troubleshooting & Performance Tuning: Identify and resolve complex data-related issues, performing performance tuning on existing data processes.
Collaboration: Work closely with data scientists, analysts, and other engineering teams to understand data requirements and deliver integrated solutions.

Skills
Very strong hands-on experience as a GCP Data Engineer
Exceptional proficiency in SQL for complex data querying and manipulation
Very strong experience with PySpark for distributed data processing
Solid experience with BigQuery, Dataform, Dataplex, and other relevant GCP data services
Knowledge of data warehousing concepts and ETL/ELT processes
Strong problem-solving and analytical skills
Excellent communication and collaboration abilities

Qualifications
Bachelor's degree in Computer Science, Engineering, or a related quantitative field
Proven experience as a Data Engineer with a significant focus on GCP
Immediate joiners or candidates currently serving their notice period are highly preferred
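To make the data modeling and cost-optimization responsibility concrete, here is a minimal sketch using the google-cloud-bigquery Python client to create a date-partitioned, clustered table, a common performance and cost lever in BigQuery. The project, dataset, and schema are hypothetical, not from the posting.

from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project

schema = [
    bigquery.SchemaField("event_id", "STRING", mode="REQUIRED"),
    bigquery.SchemaField("customer_id", "STRING"),
    bigquery.SchemaField("amount", "NUMERIC"),
    bigquery.SchemaField("event_date", "DATE", mode="REQUIRED"),
]

table = bigquery.Table("my-project.analytics.events", schema=schema)
# Partition by day on event_date so date-filtered queries scan less data...
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="event_date",
)
# ...and cluster by customer_id so lookups within a partition are cheaper.
table.clustering_fields = ["customer_id"]

table = client.create_table(table, exists_ok=True)
print(f"Created {table.full_table_id}")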

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Software Engineer Practitioner at TekWissen in Chennai, you will be a crucial part of the team responsible for the development and maintenance of the Enterprise Data Platform. Your main focus will be on designing, building, and optimizing scalable data pipelines within the Google Cloud Platform (GCP) environment. Working with GCP-native technologies such as BigQuery, Dataform, Dataflow, and Pub/Sub, you will ensure data governance, security, and optimal performance. This role offers you the opportunity to utilize your full-stack expertise, collaborate with talented teams, and establish best practices for data engineering at the client.

To be successful in this role, you should possess a Bachelor's or Master's degree in Computer Science, Engineering, or a related field of study. You should have at least 5 years of experience and a strong understanding of database concepts, with experience across multiple database technologies to optimize query and data processing performance. Proficiency in SQL, Python, and Java is essential, along with experience programming engineering transformations in Python or similar languages. Additionally, you should have the ability to work effectively across different organizations, product teams, and business partners, along with knowledge of Agile (Scrum) methodology and experience in writing user stories.

Your skills should include expertise in data architecture, data warehousing, and Google Cloud Platform tools such as BigQuery, Dataflow, Dataproc, Data Fusion, and others. Experience with data warehouse concepts, ETL processes, and data service ecosystems is crucial for this role. Strong communication skills are necessary for both internal team collaboration and external stakeholder interactions. Your role will involve advocating for user experience through empathetic stakeholder relationships and ensuring effective communication within the team and with stakeholders.

As a Software Engineer Practitioner, you should have excellent communication, collaboration, and influence skills to energize the team. Your knowledge of data, software, architecture operations, data engineering, and data management standards will be valuable in this role. Hands-on experience in Python using libraries like NumPy and Pandas is required, along with extensive knowledge of GCP offerings and bundled services related to data operations. You should also have experience in re-developing and optimizing data operations, data science, and analytical workflows and products.

TekWissen Group is an equal opportunity employer that supports workforce diversity, and we encourage applicants from diverse backgrounds to apply. Join us in shaping the future of data engineering and making a positive impact on lives, communities, and the planet.
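Given the posting's emphasis on Python with NumPy/Pandas alongside BigQuery, day-to-day work might resemble the hedged sketch below: cleaning a small extract with pandas and loading it into BigQuery with the official client. The file, project, dataset, and table names are hypothetical.

import pandas as pd
from google.cloud import bigquery

# Hypothetical extract from a legacy platform.
df = pd.read_csv("legacy_receivables.csv", parse_dates=["as_of_date"])

# Light cleanup before landing the data: normalize keys, drop obvious dupes.
df["account_id"] = df["account_id"].str.strip().str.upper()
df = df.drop_duplicates(subset=["account_id", "as_of_date"])

client = bigquery.Client(project="my-project")  # hypothetical project
job_config = bigquery.LoadJobConfig(
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)
job = client.load_table_from_dataframe(
    df, "my-project.staging.legacy_receivables", job_config=job_config
)
job.result()  # wait for the load job to finish
print(f"Loaded {job.output_rows} rows")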

Posted 2 weeks ago

Apply

12.0 - 16.0 years

0 Lacs

Karnataka

On-site

As an Experienced Senior Data Engineer at Adobe, you will utilize Big Data and Google Cloud technologies to develop large-scale, on-cloud data processing pipelines and data warehouses. Your role will involve consulting with customers worldwide on their data engineering needs around Adobe's Customer Data Platform and supporting pre-sales discussions regarding complex and large-scale cloud data engineering solutions. You will design custom solutions on cloud by integrating Adobe's solutions in a scalable and performant manner. Additionally, you will deliver complex, large-scale, enterprise-grade on-cloud data engineering and integration solutions in a hands-on manner.

To be successful in this role, you should have a total of 12 to 15 years of experience, with 3 to 4 years of experience leading Data Engineer teams in developing enterprise-grade data processing pipelines on Google Cloud. You must have led at least one project of medium to high complexity involving the migration of ETL pipelines and data warehouses to the cloud. Your most recent 3 to 5 years of experience should be with premium consulting companies. Profound hands-on expertise with Google Cloud Platform services, especially BigQuery, Dataform, Dataplex, etc., is essential. Exceptional communication skills are crucial for effectively engaging with Data Engineers and with Technology and Business leadership. Furthermore, the ability to apply knowledge of GCP to other cloud environments is highly desirable. It would be advantageous to have experience consulting with customers in India and to possess multi-cloud expertise, with knowledge of AWS and GCP.

At Adobe, creativity, curiosity, and continuous learning are valued qualities that contribute to your career growth journey. To pursue a new opportunity at Adobe, be sure to update your Resume/CV and Workday profile, including your unique Adobe experiences and volunteer work. Familiarize yourself with the Internal Mobility page on Inside Adobe to understand the process and set up job alerts for roles that interest you. Prepare for interviews by following the provided tips. Upon applying for a role via Workday, the Talent Team will contact you within two weeks. If you progress to the official interview process with the hiring team, inform your manager so they can support your career growth.

At Adobe, you will experience an exceptional work environment recognized globally. You will collaborate with colleagues dedicated to mutual growth through the Check-In approach, where ongoing feedback is encouraged. If you seek to make an impact, Adobe is the ideal place for you. Explore employee career experiences on the Adobe Life blog and discover the meaningful benefits offered. For individuals with disabilities or special needs requiring accommodation to navigate the Adobe.com website or complete the application process, contact accommodations@adobe.com or call (408) 536-3015.

Posted 2 weeks ago

Apply

5.0 - 6.0 years

5 - 6 Lacs

Chennai

Hybrid

Overview: TekWissen is a global workforce management provider operating throughout India and many other countries in the world. The client below is a global company with shared ideals and a deep sense of family. From our earliest days as a pioneer of modern transportation, we have sought to make the world a better place, one that benefits lives, communities, and the planet.

Job Title: Software Engineer Practitioner
Location: Chennai
Work Type: Hybrid

Position Description: We're seeking a highly skilled and experienced Full Stack Data Engineer to play a pivotal role in the development and maintenance of our Enterprise Data Platform. In this role, you'll be responsible for designing, building, and optimizing scalable data pipelines within our Google Cloud Platform (GCP) environment. You'll work with GCP-native technologies like BigQuery, Dataform, Dataflow, and Pub/Sub, ensuring data governance, security, and optimal performance. This is a fantastic opportunity to leverage your full-stack expertise, collaborate with talented teams, and establish best practices for data engineering at the client.

Basic Qualifications:
Bachelor's or Master's degree in Computer Science, Engineering, or a related field of study
5+ years - Strong understanding of database concepts and experience with multiple database technologies, optimizing query and data processing performance
5+ years - Full-stack data engineering competency in a public cloud (Google)
Critical thinking skills to propose data solutions, test them, and make them a reality
5+ years - Highly proficient in SQL, Python, and Java; experience programming engineering transformations in Python or a similar language
5+ years - Ability to work effectively across organizations, product teams, and business partners
5+ years - Knowledge of Agile (Scrum) methodology and experience in writing user stories
Deep understanding of data service ecosystems, including data warehouses, lakes, and marts
User experience advocacy through empathetic stakeholder relationships
Effective communication both internally (with team members) and externally (with stakeholders)
Knowledge of Data Warehouse concepts and experience with Data Warehouse/ETL processes
Strong process discipline and thorough understanding of IT processes (ISP, data security)

Skills Required: Data Architecture, Data Warehousing, Dataform, Google Cloud Platform - BigQuery, Dataflow, Dataproc, Data Fusion, Terraform, Tekton, Cloud SQL, Airflow, Postgres, PySpark, Python, API

Experience Required:
Excellent communication, collaboration, and influence skills; ability to energize a team
Knowledge of data, software, and architecture operations; data engineering and data management standards, governance, and quality
Hands-on experience in Python using libraries like NumPy, Pandas, etc.
Extensive knowledge and understanding of GCP offerings and bundled services, especially those associated with data operations: Cloud Console, BigQuery, Dataflow, Dataform, Pub/Sub
Experience with recoding, re-developing, and optimizing data operations, data science, and analytical workflows and products

Experience Required: 5+ Years
Education Required: Bachelor's Degree

TekWissen Group is an equal opportunity employer supporting workforce diversity.
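Since Airflow and BigQuery both appear in the skills list above, here is a minimal, hypothetical Cloud Composer/Airflow DAG sketch (assuming Airflow 2.4+ with the Google provider package) that runs a daily BigQuery transformation. The DAG ID, schedule, SQL, and table names are placeholders, not details from the posting.

from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_receivables_rollup",   # hypothetical DAG name
    schedule="0 6 * * *",                # run once a day at 06:00
    start_date=datetime(2024, 1, 1),
    catchup=False,
    tags=["gcp", "bigquery"],
) as dag:
    rollup = BigQueryInsertJobOperator(
        task_id="build_rollup",
        configuration={
            "query": {
                # Hypothetical transformation: rebuild a small daily mart.
                "query": """
                    CREATE OR REPLACE TABLE `my-project.marts.receivables_daily` AS
                    SELECT as_of_date, SUM(balance) AS total_balance
                    FROM `my-project.staging.receivables`
                    GROUP BY as_of_date
                """,
                "useLegacySql": False,
            }
        },
    )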

Posted 2 weeks ago

Apply

10.0 - 12.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

about randstad enterprise
As the leading global talent solutions provider, Randstad Enterprise enables companies to create sustainable business value and agility by keeping people at the heart of their organizations. Part of Randstad N.V., we combine unmatched talent data and market insights with smart technologies and deep people expertise. Our integrated talent solutions - delivered by Randstad Advisory, Randstad Sourceright, and Randstad RiseSmart - help companies build skilled and agile workforces that move their businesses forward.

Randstad Enterprise supports some of the world's most renowned brands in building talent acquisition and management models that meet their business needs not only today but also in the future. We offer solutions in the Europe, Middle East and Africa (EMEA) region, the Asia Pacific (APAC) region, and the North America (NAM) region. This results in a digital way of working and requires a proactive mindset. Our solutions know no limits; we have proven experience delivering market-leading MSP, RPO, Total Talent, and Services Procurement solutions, including technology, talent marketing, talent intelligence, and workforce consulting services. We create the best talent experience, from attraction to onboarding and on to ongoing career development; we understand the human and digital touchpoints that compel talent to join and stay with a company. We know where the talent of tomorrow is, how they behave, what they are looking for, and how to build their loyalty toward a specific company employer brand. We push the boundaries of our industry to be able to see around the corner for our clients, continually investing in innovation to stay ahead in our market.

About the Job
The Director, Data Engineering will lead the development and implementation of a comprehensive data strategy that aligns with the organization's business goals and enables data-driven decision making.

Roles and Responsibilities
Build and manage a team of talented data managers and engineers with the ability to not only keep up with, but also pioneer in, this space
Collaborate with and influence leadership to directly impact company strategy and direction
Develop new techniques and data pipelines that will enable various insights for internal and external customers
Develop deep partnerships with client implementation teams, engineering, and product teams to deliver on major cross-functional measurements and testing
Communicate effectively to all levels of the organization, including executives
Demonstrated success in partnering teams with dramatically varying backgrounds, from the highly technical to the highly creative
Design a data engineering roadmap and execute the vision behind it
Hire, lead, and mentor a world-class data team
Partner with other business areas to co-author and co-drive strategies on our shared roadmap
Oversee the movement of large amounts of data into our data lake
Establish a customer-centric approach and synthesize customer needs
Own end-to-end pipelines and destinations for the transfer and storage of all data
Manage 3rd-party resources and critical data integration vendors
Promote a culture that drives autonomy, responsibility, perfection, and mastery
Maintain and optimize software and cloud expenses to meet the financial goals of the company
Provide technical leadership to the team in the design and architecture of data products, and drive change across process, practices, and technology within the organization
Work with engineering managers and functional leads to set direction and ambitious goals for the Engineering department
Ensure data quality, security, and accessibility across the organization

Skills You Will Need
10+ years of experience in data engineering
5+ years of experience leading data teams of 30+ resources, including selection of talent and planning/allocating resources across multiple geographies and functions
5+ years of experience with GCP tools and technologies, specifically Google BigQuery, Google Cloud Composer, Dataflow, Dataform, etc.
Experience creating large-scale data engineering pipelines, data-based decision-making, and quantitative analysis tools and software
Hands-on experience with version control systems (Git)
Experience with CI/CD, data architectures, pipelines, quality, and code management
Experience with complex, high-volume, multi-dimensional data based on unstructured, structured, and streaming datasets
Experience with SQL and NoSQL databases
Experience creating, testing, and supporting production software and systems
Proven track record of identifying and resolving performance bottlenecks in production systems
Experience designing and developing data lake, data warehouse, ETL, and task orchestration systems
Strong leadership, communication, time management, and interpersonal skills
Proven architectural skills in data engineering
Experience leading teams developing production-grade data pipelines on large datasets
Experience designing a large data lake and lakehouse, managing data flows that integrate information from various sources into a common pool, and implementing data pipelines based on the ETL model
Experience with common data languages (e.g., Python, Scala) and data warehouses (e.g., Redshift, BigQuery, Snowflake, Databricks)
Extensive experience with cloud tools and technologies - GCP preferred
Experience managing real-time data pipelines
Successful track record and demonstrated thought leadership, cross-functional influence, and partnership within an agile/waterfall development environment
Experience in regulated industries or with compliance frameworks (e.g., SOC 2, ISO 27001)

Nice to have:
HR services industry experience
Experience in data science, including predictive modeling
Experience leading teams across multiple geographies

Posted 2 weeks ago

Apply

5.0 - 10.0 years

15 - 30 Lacs

Bengaluru

Hybrid

Job Description: We are seeking an experienced Data Engineer for a contract/consulting engagement to design, build, and maintain scalable data infrastructure using Google Cloud Platform technologies and advanced analytics visualization. This role requires 5-8 years of hands-on experience in modern data engineering practices, with a strong focus on cloud-native solutions and business intelligence.

Key Responsibilities

Data Infrastructure & Engineering (70%)
Design tables and work with complex queries in Google BigQuery
Build and maintain data transformation workflows using Dataflow and Dataform
Design and implement robust data pipelines using Apache Airflow for workflow orchestration and scheduling
Architect scalable ETL/ELT processes handling large-scale data ingestion from multiple sources
Optimize BigQuery performance through partitioning, clustering, and cost management strategies (see the sketch after this listing)
Collaborate with DevOps teams to implement CI/CD pipelines for data infrastructure
Solid technical background with a complete understanding of data warehouse modeling, architectures, and OLAP/OLTP data sets
Experience with Java or Python will be a plus

Analytics & Visualization (30%)
Create compelling data visualizations and interactive dashboards using Tableau
Design Tableau models with Live and Extract data sources; experience with Tableau Prep is good to have
Partner with business stakeholders to translate requirements into analytical solutions
Design and implement self-service analytics capabilities for end users
Optimize Tableau workbooks for performance and user experience
Integrate Tableau with BigQuery for real-time analytics and reporting

Technical Skills

Core Data Engineering (Must Have)
5-8 years of progressive experience in data engineering roles
Expert-level proficiency in SQL with complex query optimization experience
Hands-on experience with Google BigQuery for data warehousing and analytics
Proven experience with Apache Airflow for workflow orchestration and pipeline management
Working knowledge of Dataflow and Dataform for data transformation and modeling
Experience with GCP services: Cloud Storage, Pub/Sub, Cloud Functions, Cloud Composer

Visualization & Analytics
Strong proficiency in Tableau for data modeling, data visualization, and dashboard development
Experience integrating Tableau with cloud data platforms
Understanding of data visualization best practices and UX principles
Knowledge of Tableau Server/Cloud administration and governance

Additional Technical Requirements
Experience with version control systems (Git) and collaborative development practices
Knowledge of data modeling techniques (dimensional modeling, data vault)
Understanding of data governance, security, and compliance frameworks
Experience with infrastructure as code (Terraform preferred)
Familiarity with scripting languages (Python/Java) for data processing

Preferred Qualifications
Google Cloud Professional Data Engineer certification
Tableau Desktop Certified Professional or equivalent certification
Experience with real-time data processing and streaming analytics
Knowledge of machine learning workflows and MLOps practices
Previous experience in agile development environments
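As a hedged illustration of the BigQuery cost-management point above, the sketch below uses the google-cloud-bigquery Python client to dry-run a query (estimating bytes scanned before spending anything) and then executes it with a maximum_bytes_billed guardrail. The project and table names are hypothetical.

from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project

sql = """
    SELECT order_date, SUM(amount) AS revenue
    FROM `my-project.sales.orders`
    WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)  -- prunes partitions
    GROUP BY order_date
"""

# Dry run first: BigQuery reports the bytes the query would scan without running it.
dry = client.query(sql, job_config=bigquery.QueryJobConfig(dry_run=True))
print(f"Query would scan {dry.total_bytes_processed / 1e9:.2f} GB")

# Real run with a hard cost guardrail: the job fails fast if the scan exceeds 10 GB.
job_config = bigquery.QueryJobConfig(maximum_bytes_billed=10 * 1024**3)
for row in client.query(sql, job_config=job_config).result():
    print(row.order_date, row.revenue)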

Posted 3 weeks ago

Apply

4.0 - 9.0 years

11 - 19 Lacs

Chennai

Work from Office

Role & responsibilities: Python, Dataproc, Airflow, PySpark, Cloud Storage, DBT, Dataform, NAS, Pub/Sub, Terraform, API, BigQuery, Data Fusion, GCP, Tekton

Preferred candidate profile: Data Engineer in Python - GCP

Location: Chennai only

Experience: 4+ years

Posted 1 month ago

Apply

8.0 - 10.0 years

20 - 30 Lacs

Chennai

Hybrid

Role & responsibilities
GCP Services: BigQuery, Dataflow, Dataproc, Dataplex, Data Fusion, Terraform, Tekton, Cloud SQL, Redis Memorystore, Airflow, Cloud Storage
2+ years in data transfer utilities
2+ years in Git or any other version control tool
2+ years in Confluent Kafka
1+ years of experience in API development
2+ years in an Agile framework
4+ years of strong experience in Python and PySpark development
4+ years of shell scripting to develop ad hoc jobs for data importing/exporting

Preferred candidate profile: Python, Dataflow, Dataproc, GCP Cloud Run, Dataform, Agile software development, BigQuery, Terraform, Data Fusion, Cloud SQL, GCP, Kafka, Java.

Please note that only immediate joiners will be considered for this position due to project urgency.
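Given the Confluent Kafka requirement above, a minimal consumer sketch in Python (using the confluent-kafka library) might look like the following; the broker address, topic, and group ID are hypothetical placeholders.

from confluent_kafka import Consumer, KafkaError

consumer = Consumer({
    "bootstrap.servers": "broker:9092",   # hypothetical broker
    "group.id": "orders-loader",          # hypothetical consumer group
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["orders"])            # hypothetical topic

try:
    while True:
        msg = consumer.poll(timeout=1.0)  # block up to 1s waiting for a record
        if msg is None:
            continue
        if msg.error():
            # _PARTITION_EOF just means we caught up; anything else is a real error.
            if msg.error().code() != KafkaError._PARTITION_EOF:
                raise RuntimeError(msg.error())
            continue
        # Hand the payload to a downstream import job (e.g., a GCS/BigQuery loader).
        print(msg.key(), msg.value().decode("utf-8"))
finally:
    consumer.close()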

Posted 1 month ago

Apply

0.0 years

0 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose - the relentless pursuit of a world that works better for people - we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Manager, Data Management.

The Data Analyst - Data Management is responsible for creating, refining, and optimizing data sets and metrics for BI reporting, data science, and self-serve analytics using data available on Google Cloud. Where required data is unavailable, the role collaborates with technology teams to onboard new data sources, ensuring alignment with enterprise data strategies. This role plays a crucial part in ensuring that data models adhere to best practices and comply with technology guardrails, maintaining the safe and efficient use of Google Cloud Platform (GCP). Strong expertise in data modeling, SQL, and cloud-based data analysis is essential. Prior experience in banking and People Analytics is an advantage.

Responsibilities

Data Preparation & Optimization
Develop and maintain scalable, well-structured, and optimized data sets for reporting and analytics.
Enhance existing data models and metrics to improve accuracy, efficiency, and usability for BI tools like Tableau and Qlik.
Work closely with Data Scientists to ensure data structures support advanced analytics use cases.
Proactively refine and streamline data pipelines to reduce processing time and costs.

Data Sourcing & Integration
Identify data gaps and collaborate with technology teams to onboard additional data onto GCP.
Ensure all newly ingested data is well-documented, governed, and optimized for analytical use.
Work closely with business stakeholders to understand data needs and align technical solutions accordingly.

Governance & Compliance
Ensure adherence to technology guardrails and enterprise data policies for the safe use of GCP.
Maintain high data integrity and security standards, ensuring compliance with banking and regulatory requirements.
Implement best practices for data lineage, metadata management, and access controls.

Collaboration & Stakeholder Engagement
Work closely with BI developers, Data Scientists, and Product Owners to support evolving data requirements.
Partner with Technology and Data Governance teams to ensure alignment with enterprise data strategy.
Act as a subject matter expert for data modeling, cloud analytics, and best practices in People Analytics.

Qualifications we seek in you!

Minimum Qualifications / Skills
Google Cloud Certified - Data Analyst or equivalent cloud certification (definite advantage).
Relevant years of experience in data analysis, BI reporting, or data engineering.
Prior experience in banking and/or People Analytics is a strong advantage.
Experience working with large-scale enterprise data platforms and cloud environments.

Preferred Qualifications / Skills

Technical Expertise
Expert knowledge of data modeling concepts, SQL, and data warehousing principles.
Hands-on experience with Google Cloud Platform (BigQuery, Dataform, Cloud Storage, etc.).
Strong understanding of BI tools (Tableau, Qlik) and data science workflows.
Knowledge of ETL/ELT processes and data pipeline optimization.

Business & Analytical Skills
Ability to translate business needs into structured data solutions.
Strong data storytelling skills, ensuring insights are clear and actionable.
Experience working in banking and People Analytics is highly advantageous.

Governance & Compliance
Understanding of data governance, security, and compliance requirements in banking.
Familiarity with data cataloging, lineage, and metadata management practices.
Experience ensuring technology guardrails are adhered to in cloud environments.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Posted 2 months ago

Apply