6.0 - 10.0 years
1 - 1 Lacs
Chennai
Hybrid
Overview: TekWissen is a global workforce management provider operating throughout India and many other countries in the world. The below client is a global company with shared ideals and a deep sense of family. From our earliest days as a pioneer of modern transportation, we have sought to make the world a better place, one that benefits lives, communities and the planet.
Job Title: Specialty Development Practitioner
Location: Chennai
Work Type: Hybrid
Position Description: At the client's Credit Company, we are modernizing our enterprise data warehouse in Google Cloud to enhance data, analytics, and AI/ML capabilities, improve customer experience, ensure regulatory compliance, and boost operational efficiencies. As a GCP Data Engineer, you will integrate data from various sources into novel data products. You will build upon existing analytical data, including merging historical data from legacy platforms with data ingested from new platforms. You will also analyze and manipulate large datasets, activating data assets to enable enterprise platforms and analytics within GCP. You will design and implement the transformation and modernization on GCP, creating scalable data pipelines that land data from source applications, integrate into subject areas, and build data marts and products for analytics solutions. You will also conduct deep-dive analysis of Current State Receivables and Originations data in our data warehouse, performing impact analysis related to the client's Credit North America's modernization and providing implementation solutions. Moreover, you will partner closely with our AI, data science, and product teams, developing creative solutions that build the future for the client's Credit. Experience with large-scale solutions and operationalizing data warehouses, data lakes, and analytics platforms on Google Cloud Platform or other cloud environments is a must. We are looking for candidates with a broad set of analytical and technology skills across these areas who can demonstrate an ability to design the right solutions with the appropriate combination of GCP and 3rd party technologies for deployment on Google Cloud Platform.
Skills Required: BigQuery, Dataflow, Dataform, Data Fusion, Dataproc, Cloud Composer, Airflow, Cloud SQL, Compute Engine, Google Cloud Platform - BigQuery
Experience Required: GCP Data Engineer Certified. Successfully designed and implemented data warehouses and ETL processes for over five years, delivering high-quality data solutions. 5+ years of complex SQL development experience. 2+ years of experience with programming languages such as Python, Java, or Apache Beam. Experienced cloud engineer with 3+ years of GCP expertise, specializing in managing cloud infrastructure and applications into production-scale solutions.
Skills Preferred: BigQuery, Dataflow, Dataform, Data Fusion, Dataproc, Cloud Composer, Airflow, Cloud SQL, Compute Engine, Google Cloud Platform, Terraform, Tekton, Postgres, PySpark, Python, API, Cloud Build, App Engine, Apache Kafka, Pub/Sub, AI/ML, Kubernetes
Experience Preferred: In-depth understanding of GCP's underlying architecture and hands-on experience with crucial GCP services, especially those related to data processing (batch/real time), leveraging Terraform, BigQuery, Dataflow, Pub/Sub, Dataform, Astronomer, Data Fusion, Dataproc, PySpark, Cloud Composer/Airflow, Cloud SQL, Compute Engine, Cloud Functions, Cloud Run, Cloud Build and App Engine, alongside storage including Cloud Storage, plus DevOps tools such as Tekton, GitHub, Terraform, Docker. Expert in designing, optimizing, and troubleshooting complex data pipelines. Experience developing microservices within a container orchestration framework. Experience in designing pipelines and architectures for data processing. Passion and self-motivation to develop/experiment/implement state-of-the-art data engineering methods/techniques. Self-directed, works independently with minimal supervision, and adapts to ambiguous environments. Evidence of a proactive problem-solving mindset and willingness to take the initiative. Strong prioritization, collaboration & coordination skills, and ability to simplify and communicate complex ideas with cross-functional teams and all levels of management. Proven ability to juggle multiple responsibilities and competing demands while maintaining a high level of productivity. Data engineering or development experience gained in a regulated financial environment. Experience in coaching and mentoring Data Engineers. Project management tools like Atlassian JIRA. Experience working in an implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment. Experience with data security, governance, and compliance best practices in the cloud. Experience with AI solutions or platforms that support AI solutions. Experience using data science concepts on production datasets to generate insights.
Experience Range: 5+ years
Education Required: Bachelor's Degree
TekWissen® Group is an equal opportunity employer supporting workforce diversity.
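To make the orchestration stack above concrete, here is a minimal Cloud Composer (Airflow) DAG sketch for the kind of pipeline the role describes: landing a source-application export into BigQuery and integrating it into a subject-area table. All dataset, table, and bucket names are hypothetical placeholders, not the client's actual objects.

```python
# Illustrative Cloud Composer (Airflow) DAG sketch; dataset, table, and bucket
# names are hypothetical placeholders, not the client's actual objects.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="receivables_daily_load",          # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Land raw files exported by a source application into a staging table.
    land_raw = GCSToBigQueryOperator(
        task_id="land_raw_receivables",
        bucket="example-landing-bucket",       # hypothetical bucket
        source_objects=["receivables/{{ ds }}/*.csv"],
        destination_project_dataset_table="analytics.stg_receivables",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_TRUNCATE",
    )

    # Integrate staged rows into a subject-area table used by data marts.
    integrate = BigQueryInsertJobOperator(
        task_id="integrate_receivables",
        configuration={
            "query": {
                "query": """
                    MERGE `analytics.receivables` t
                    USING `analytics.stg_receivables` s
                    ON t.receivable_id = s.receivable_id
                    WHEN MATCHED THEN UPDATE SET t.balance = s.balance
                    WHEN NOT MATCHED THEN INSERT ROW
                """,
                "useLegacySql": False,
            }
        },
    )

    land_raw >> integrate
```

In practice, a DAG like this would be deployed to the Composer environment's DAGs bucket and monitored through the Airflow UI.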
Posted 3 days ago
1.0 - 5.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a skilled Data Engineer, you will leverage your expertise to contribute to the development of data modeling, ETL processes, and reporting systems. With over 3 years of hands-on experience in areas such as ETL, BigQuery, SQL, Python, or Alteryx, you will play a crucial role in enhancing data engineering processes. Your advanced knowledge of SQL programming and database management will be key in ensuring the efficiency of data operations. In this role, you will utilize your solid experience with Business Intelligence reporting tools like Power BI, Qlik Sense, Looker, or Tableau to create insightful reports and analytics. Your understanding of data warehousing concepts and best practices will enable you to design robust data solutions. Your problem-solving skills and attention to detail will be instrumental in addressing data quality issues and proposing effective BI solutions. Collaboration and communication are essential aspects of this role, as you will work closely with stakeholders to define requirements and develop data-driven insights. Your ability to work both independently and as part of a team will be crucial in ensuring the successful delivery of projects. Additionally, your proactive approach to learning new tools and techniques will help you stay ahead in a dynamic environment. Preferred skills include experience with GCP cloud services, Python, Hive, Spark, Scala, JavaScript, and various BI/reporting tools. Your strong oral, written, and interpersonal communication skills will enable you to effectively convey insights and solutions to stakeholders. A Bachelor's degree in Computer Science, Computer Information Systems, or a related field is required for this role. Overall, as a Data Engineer, you will play a vital role in developing and maintaining data pipelines, reporting systems, and dashboards. Your expertise in SQL, BI tools, and data validation will contribute to ensuring data accuracy and integrity across all systems. Your analytical mindset and ability to perform root cause analysis will be key in identifying opportunities for improvement and driving data-driven decision-making within the organization.
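As a rough illustration of the BigQuery-plus-Python work described above, the sketch below runs an aggregate query and flags rows with missing values, the kind of data-quality signal that would feed a Power BI or Looker report. The project, dataset, and column names are invented for the example.

```python
# Minimal sketch of the BigQuery-from-Python workflow mentioned above;
# the project, dataset, and table names are illustrative only.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

# Aggregate a fact table into a small summary a BI tool (Power BI, Looker,
# Tableau) could consume, and flag obvious data-quality problems.
query = """
    SELECT order_date,
           COUNT(*) AS orders,
           COUNTIF(order_total IS NULL) AS missing_totals
    FROM `example-project.sales.orders`
    GROUP BY order_date
    ORDER BY order_date
"""

for row in client.query(query).result():
    if row.missing_totals > 0:
        print(f"{row.order_date}: {row.missing_totals} rows missing order_total")
```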
Posted 4 days ago
5.0 - 10.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
It's fun to work in a company where people truly BELIEVE in what they are doing! We're committed to bringing passion and customer focus to the business. Location - Open. Position: Data Engineer (GCP) Technology. If you are an extraordinary developer who loves to push the boundaries to solve complex business problems using creative solutions, then we wish to talk with you. As an Analytics Technology Engineer, you will work on the Technology team that helps deliver our Data Engineering offerings at large scale to our Fortune clients worldwide. The role is responsible for innovating, building and maintaining technology services. Responsibilities: Be an integral part of large scale client business development and delivery engagements Develop the software and systems needed for end-to-end execution on large projects Work across all phases of SDLC, and use Software Engineering principles to build scaled solutions Build the knowledge base required to deliver increasingly complex technology projects Qualifications & Experience: A bachelor's degree in Computer Science or related field with 5 to 10 years of technology experience Desired Technical Skills: Data Engineering and Analytics on Google Cloud Platform: Basic Cloud Computing Concepts BigQuery, Google Cloud Storage, Cloud SQL, PubSub, Dataflow, Cloud Composer, GCP Data Transfer, gcloud CLI Python, Google Cloud Python SDK, SQL Experience in working with any NoSQL/Columnar/MPP database Experience in working with any ETL tool (Informatica/DataStage/Talend/Pentaho etc.) Strong knowledge of database concepts, data modeling in RDBMS vs NoSQL, OLTP vs OLAP, MPP architecture Other Desired Skills: Excellent communication and co-ordination skills Problem understanding, articulation and solutioning Quick learner & adaptable with regard to new technologies Ability to research & solve technical issues Responsibilities: Developing Data Pipelines (Batch/Streaming) Developing complex data transformations ETL Orchestration Data Migration Develop and maintain Data Warehouses / Data Lakes Good To Have: Experience in working with Apache Spark / Kafka Machine Learning concepts Google Cloud Professional Data Engineer Certification If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us! Not the right fit? Let us know you're interested in a future opportunity by clicking Introduce Yourself in the top-right corner of the page or create an account to set up email alerts as new job postings become available that meet your interest!
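For the streaming side of the stack listed above (Pub/Sub with the Google Cloud Python SDK), a minimal publish/subscribe sketch might look like the following; the topic and subscription names are made up for illustration.

```python
# Hedged sketch of the Pub/Sub + Python SDK usage this role calls for;
# the topic and subscription names are invented for illustration.
import json
from google.cloud import pubsub_v1

project_id = "example-project"
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(project_id, "clickstream-events")

# Publish one event; in a real pipeline this would sit behind an API or batch job.
event = {"user_id": 42, "page": "/home"}
future = publisher.publish(topic_path, json.dumps(event).encode("utf-8"))
print("published message id:", future.result())

# A downstream consumer (e.g. a Dataflow job or a worker service) would pull
# from a subscription on the same topic:
subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path(project_id, "clickstream-events-sub")

def callback(message):
    print("received:", message.data.decode("utf-8"))
    message.ack()

streaming_pull = subscriber.subscribe(subscription_path, callback=callback)
# streaming_pull.result(timeout=30)  # block briefly when running interactively
```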
Posted 4 days ago
5.0 - 9.0 years
0 Lacs
Haryana
On-site
You will be working as a Technical Lead Data Engineer for a leading data and AI/ML solutions provider based in Gurgaon. In this role, you will be responsible for designing, developing, and leading complex data projects primarily on Google Cloud Platform and other modern data stacks. Your key responsibilities will include leading the design and implementation of robust data pipelines, collaborating with cross-functional teams to deliver end-to-end data solutions, owning project modules, developing technical roadmaps, and implementing data governance frameworks on GCP. You will be required to integrate GCP data services like BigQuery, Dataflow, Dataproc, Cloud Composer, Vertex AI Studio, and GenAI with platforms such as Snowflake. Additionally, you will write efficient code in Python, SQL, and ETL/orchestration tools, utilize containerized solutions for scalable deployments, and apply expertise in PySpark, Kafka, and advanced data querying for high-volume data environments. Monitoring, optimizing, and troubleshooting system performance, reducing job run-times through architecture optimization, developing data warehouses, and mentoring team members will also be part of your role. To be successful in this position, you should have a Bachelor's or Master's degree in Computer Science, Engineering, or a related field. Extensive hands-on experience with Google Cloud Platform data services, Snowflake integration, strong programming skills in Python and SQL, proficiency in PySpark, Kafka, and data querying tools, and experience with containerized solutions using Google Kubernetes Engine are essential. Strong communication skills, documentation skills, experience with large distributed datasets, and the ability to balance short-term deliverables with long-term technical sustainability are also required. Prior leadership experience in data engineering teams and exposure to cloud data platforms are desirable. This role offers you the opportunity to lead high-impact data projects for reputed clients in a fast-growing data consulting environment, work with cutting-edge technologies, and collaborate in an innovative and growth-oriented culture.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Punjab
On-site
As a GCP Data Engineer in Australia, you will be responsible for leveraging your experience in Google Cloud Platform (GCP) to handle various aspects of data engineering. Your role will involve working on data migration projects from legacy systems such as SQL and Oracle. You will also be designing and building ETL pipelines for data lake and data warehouse solutions on GCP. In this position, your expertise in GCP data and analytics services will be crucial. You will work with tools like Cloud Dataflow, Cloud Dataprep, Apache Beam/Cloud Composer, Cloud BigQuery, Cloud Fusion, Cloud Pub/Sub, Cloud Storage, and Cloud Functions. Additionally, you will utilize Cloud Native GCP CLI/gsutil for operations and scripting languages like Python and SQL to enhance data processing efficiencies. Furthermore, your experience with data governance practices, metadata management, data masking, and encryption will be essential. You will utilize GCP tools such as Cloud Data Catalog and GCP KMS tools to ensure data security and compliance. Overall, this role requires a strong foundation in GCP technologies and a proactive approach to data engineering challenges in a dynamic environment.
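As an illustration of the Dataflow/Apache Beam pipelines this role mentions, here is a minimal batch sketch that reads a legacy CSV export from Cloud Storage and loads it into BigQuery; the bucket, dataset, and schema are hypothetical.

```python
# Illustrative Apache Beam batch pipeline of the kind this role describes
# (legacy export files on GCS loaded into a BigQuery data lake/warehouse).
# Bucket, dataset, and schema are hypothetical.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_line(line):
    customer_id, name, balance = line.split(",")
    return {"customer_id": int(customer_id), "name": name, "balance": float(balance)}

options = PipelineOptions(
    runner="DataflowRunner",            # or DirectRunner for local testing
    project="example-project",
    region="australia-southeast1",
    temp_location="gs://example-temp-bucket/tmp",
)

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadLegacyExport" >> beam.io.ReadFromText("gs://example-landing/customers/*.csv", skip_header_lines=1)
        | "ParseCsv" >> beam.Map(parse_line)
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "example-project:lake.customers",
            schema="customer_id:INTEGER,name:STRING,balance:FLOAT",
            write_disposition=beam.io.BigQueryDisposition.WRITE_TRUNCATE,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        )
    )
```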
Posted 1 week ago
5.0 - 10.0 years
15 - 30 Lacs
Hyderabad
Work from Office
Greetings from TechnoGen!!! Thank you for taking the time to share your competencies and skills and for allowing us the opportunity to tell you about TechnoGen; we understand that your experience and expertise are relevant to the current openings with our clients.
About TechnoGen:
LinkedIn: https://www.linkedin.com/company/technogeninc/about/
TechnoGen, Inc. is an ISO 9001:2015, ISO 20000-1:2011, ISO 27001:2013, and CMMI Level 3 Global IT Services Company headquartered in Chantilly, Virginia. TechnoGen, Inc. (TGI) is a Minority & Women-Owned Small Business with over 20 years of experience providing end-to-end IT Services and Solutions to the Public and Private sectors. TGI provides highly skilled and certified professionals and has successfully executed more than 345 projects. TechnoGen is committed to helping our clients solve complex problems and achieve their goals, on time and under budget.
Please share the below details for further processing of your profile: Total years of experience; Relevant years of experience; CTC (Including Variable); ECTC; Notice Period; Reason for change; Current location.
Job Title: GCP Data Engineer
Required Experience: 5+ years
Work Mode: WFO - 4 days from office
Shift Time: UK shift, 12:00 PM IST to 09:00 PM IST
Location: Hyderabad
Job Summary: For this GCP Data Engineer role, we need someone with strong experience in SQL and Python. The ideal candidate should have hands-on expertise in Google Cloud Platform (GCP) services, especially BigQuery, Composer and the Airflow framework, and a solid understanding of data engineering best practices. You will work closely with our internal teams and technology partners to deliver comprehensive and scalable marketing data and analytics solutions. This role offers the unique opportunity to engage with many technology platforms in a rapidly evolving marketing technology landscape.
Key Responsibilities: Technical oversight and team management of the developers, coordination with US-based Mattel resources, and estimation of work. Strong knowledge of cloud computing platforms - Google Cloud. Expertise in MySQL & SQL/PL. Good experience in IICS. Experience in ETL; Ascend IO is an added advantage. GCP & BigQuery knowledge is a must; GCP certification is an added advantage. Good experience in Google Cloud Storage (GCS), Cloud Composer, DAGs, Airflow; REST API development experience. Good analytical and problem-solving skills and efficient communication. Experience in designing, implementing, and managing various ETL job execution flows. Utilize Git for source version control. Set up and maintain CI/CD pipelines. Troubleshoot, debug, and upgrade existing applications & ETL job chains. Comprehensive data analysis across complex data sets. Ability to collaborate effectively across technical development teams and business departments.
Qualifications: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. 5+ years of experience in data engineering or related roles. Strong understanding of Google Cloud Platform and associated tools. Proven experience in delivering consumer marketing data and analytics solutions for enterprise clients. Strong knowledge of data management, ETL processes, data warehousing, and analytics platforms. Experience with SQL and NoSQL databases. Proficiency in the Python programming language. Hands-on experience with data warehousing solutions. Knowledge of marketing analytics tools and technologies, including but not limited to Google Analytics, Blueconic, Klaviyo, etc.
Knowledge of performance marketing concepts such as targeting & segmentation, real-time optimization, A/B testing, attribution modeling, etc. Excellent communication skills with a track record of collaboration across multiple teams. Strong collaboration skills and team-oriented mindset. Strong problem-solving skills, adaptability, and the ability to thrive in a dynamic and rapidly changing environment. Experience working in Agile development environments. Best Regards, Syam.M | Sr. IT Recruiter | syambabu.m@technogenindia.com | www.technogenindia.com | Follow us on LinkedIn
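A minimal sketch, assuming a hypothetical marketing-data feed, of the GCS-to-BigQuery loading step that a Composer DAG in this role would typically automate; the bucket, dataset, and table names are placeholders.

```python
# A minimal sketch, assuming a hypothetical marketing-data feed, of the
# GCS -> BigQuery loading step this role would automate under Composer.
from google.cloud import bigquery, storage

bucket_name = "example-marketing-feeds"      # hypothetical bucket
blob_name = "klaviyo/2024-06-01/events.json"

# Stage the raw extract in Cloud Storage.
storage.Client().bucket(bucket_name).blob(blob_name).upload_from_filename("events.json")

# Load the staged file into a BigQuery table.
bq = bigquery.Client()
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    autodetect=True,
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)
load_job = bq.load_table_from_uri(
    f"gs://{bucket_name}/{blob_name}",
    "example-project.marketing.klaviyo_events",
    job_config=job_config,
)
load_job.result()  # wait for completion
print("loaded", bq.get_table("example-project.marketing.klaviyo_events").num_rows, "rows")
```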
Posted 1 week ago
1.0 - 5.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
You should have at least 3 years of hands-on experience in data modeling, ETL processes, developing reporting systems, and data engineering using tools such as ETL, BigQuery, SQL, Python, or Alteryx. Additionally, you should possess advanced knowledge in SQL programming and database management. Moreover, you must have a minimum of 3 years of solid experience working with Business Intelligence reporting tools like Power BI, Qlik Sense, Looker, or Tableau, along with a good understanding of data warehousing concepts and best practices. Excellent problem-solving and analytical skills are essential for this role, as well as being detail-oriented with strong communication and collaboration skills. The ability to work both independently and as part of a team is crucial for success in this position. Preferred skills include experience with GCP cloud services such as BigQuery, Cloud Composer, Dataflow, CloudSQL, Looker, Looker ML, Data Studio, and GCP QlikSense. Strong SQL skills and proficiency in various BI/Reporting tools to build self-serve reports, analytic dashboards, and ad-hoc packages leveraging enterprise data warehouses are also desired. Moreover, having at least 1 year of experience in Python and Hive/Spark/Scala/JavaScript is preferred. Additionally, you should have a solid understanding of consuming data models, developing SQL, addressing data quality issues, proposing BI solution architecture, articulating best practices in end-user visualizations, and development delivery experience. Furthermore, it is important to have a good grasp of BI tools, architectures, and visualization solutions, coupled with an inquisitive and proactive approach to learning new tools and techniques. Strong oral, written, and interpersonal communication skills are necessary, and you should be comfortable working in a dynamic environment where problems are not always well-defined.
Posted 1 week ago
8.0 - 13.0 years
0 Lacs
Hyderabad, Telangana
On-site
You are an experienced GCP Data Engineer with 8+ years of expertise in designing and implementing robust, scalable data architectures on Google Cloud Platform. Your role involves defining and leading the implementation of data architecture strategies using GCP services to meet business and technical requirements. As a visionary GCP Data Architect, you will be responsible for architecting and optimizing scalable data pipelines using Google Cloud Storage, BigQuery, Dataflow, Cloud Composer, Dataproc, and Pub/Sub. You will design solutions for large-scale batch processing and real-time streaming, leveraging tools like Dataproc for distributed data processing. Your responsibilities also include establishing and enforcing data governance, security frameworks, and best practices for data management. You will conduct architectural reviews and performance tuning for GCP-based data solutions, ensuring cost-efficiency and scalability. Collaborating with cross-functional teams, you will translate business needs into technical requirements and deliver innovative data solutions. The required skills for this role include strong expertise in GCP services such as Google Cloud Storage, BigQuery, Dataflow, Cloud Composer, Dataproc, and Pub/Sub. Proficiency in designing and implementing data processing frameworks for ETL/ELT, batch, and real-time workloads is essential. You should have an in-depth understanding of data modeling, data warehousing, and distributed data processing using tools like Dataproc and Spark. Hands-on experience with Python, SQL, and modern data engineering practices is required. Your knowledge of data governance, security, and compliance best practices on GCP will be crucial in this role. Strong problem-solving, leadership, and communication skills are necessary for guiding teams and engaging stakeholders effectively.
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
Karnataka
On-site
At PwC, the focus in data and analytics is on leveraging data to drive insights and make informed business decisions. Utilizing advanced analytics techniques to help clients optimize their operations and achieve strategic goals is key. In data analysis at PwC, the emphasis is on utilizing advanced analytical techniques to extract insights from large datasets and drive data-driven decision-making. Skills in data manipulation, visualization, and statistical modeling play a crucial role in supporting clients in solving complex business problems. Candidates with 4+ years of hands-on experience are sought for the position of Senior Associate in supply chain analytics. Successful candidates should possess proven expertise in supply chain analytics across domains such as demand forecasting, inventory optimization, logistics, segmentation, and network design. Additionally, hands-on experience working on optimization methods like linear programming, mixed integer programming, and scheduling optimization is required. Proficiency in forecasting techniques and machine learning techniques, along with a strong command of statistical modeling, testing, and inference, is essential. Familiarity with GCP tools like BigQuery, Vertex AI, Dataflow, and Looker is also necessary. Required skills include building data pipelines and models for forecasting, optimization, and scenario planning, strong SQL and Python programming skills, experience deploying models in a GCP environment, and knowledge of orchestration tools like Cloud Composer (Airflow). Nice-to-have skills consist of familiarity with MLOps, containerization (Docker, Kubernetes), and orchestration tools, as well as strong communication and stakeholder engagement skills at the executive level. The roles and responsibilities of the Senior Associate involve assisting with analytics projects within the supply chain domain, driving design, development, and delivery of data science solutions. They are expected to interact with and advise consultants/clients as subject matter experts, conduct analysis using advanced analytics tools, and implement quality control measures for deliverable integrity. Validating analysis outcomes, making presentations, and contributing to knowledge and firm building activities are also part of the role. The ideal candidate should hold a degree in BE / B.Tech / MCA / M.Sc / M.E / M.Tech / Master's Degree / MBA from a reputed institute.
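To give a flavour of the optimization methods mentioned above, here is a toy linear-programming sketch using SciPy for a two-warehouse, two-store transport problem; all costs, capacities, and demands are invented.

```python
# A toy linear-programming sketch (SciPy) of the kind of network/transport
# optimization this role mentions; costs and capacities are invented.
from scipy.optimize import linprog

# Ship from 2 warehouses to 2 stores at minimum cost.
# Decision variables: x = [w1->s1, w1->s2, w2->s1, w2->s2]
cost = [4, 6, 5, 3]                      # per-unit shipping cost

# Supply constraints (<= capacity per warehouse).
A_ub = [[1, 1, 0, 0],                    # warehouse 1 capacity
        [0, 0, 1, 1]]                    # warehouse 2 capacity
b_ub = [80, 70]

# Demand constraints (== units required per store).
A_eq = [[1, 0, 1, 0],                    # store 1 demand
        [0, 1, 0, 1]]                    # store 2 demand
b_eq = [60, 50]

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
print("optimal cost:", res.fun)
print("shipments:", res.x)
```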
Posted 2 weeks ago
7.0 - 12.0 years
22 - 30 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Role & responsibilities:
BigQuery for building and optimizing data warehouses.
Implement both batch and real-time (streaming) data processing solutions using Java.
Cloud Composer (Airflow) for workflow orchestration and pipeline management.
Dataproc for managing Apache Spark jobs in the cloud.
Google Cloud Storage (GCS) for data storage and management.
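The posting lists Java for the processing layer; purely for illustration, the sketch below shows the equivalent Dataproc pattern in PySpark: read raw events from Cloud Storage, aggregate, and write to BigQuery through the Spark-BigQuery connector. Bucket and table names are hypothetical.

```python
# Hedged PySpark sketch of a Dataproc batch job (the posting itself lists Java;
# this uses PySpark purely for illustration). Bucket and table names are made up.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-daily-aggregate").getOrCreate()

# Read raw order events from Cloud Storage.
orders = spark.read.json("gs://example-raw-zone/orders/2024-06-01/*.json")

# Aggregate into a daily summary.
daily = (
    orders.groupBy("order_date")
          .agg(F.count("*").alias("orders"), F.sum("order_total").alias("revenue"))
)

# Write the result to BigQuery via the Spark-BigQuery connector
# (available on Dataproc images).
(
    daily.write.format("bigquery")
         .option("table", "example-project.warehouse.daily_orders")
         .option("temporaryGcsBucket", "example-temp-bucket")
         .mode("overwrite")
         .save()
)
```

A script like this would typically be submitted with gcloud dataproc jobs submit pyspark against an existing cluster, or scheduled from Cloud Composer.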
Posted 2 weeks ago
10.0 - 14.0 years
0 Lacs
Karnataka
On-site
As a Data Engineer (ETL, Big Data, Hadoop, Spark, GCP) at Assistant Vice President level, located in Pune, India, you will be responsible for developing and delivering engineering solutions to achieve business objectives. You are expected to have a strong understanding of crucial engineering principles within the bank, and to be skilled in root cause analysis while addressing enhancements and fixes for product reliability and resiliency. Working independently on medium to large projects with strict deadlines, you will collaborate in a cross-application technical environment, demonstrating a solid hands-on development track record within an agile methodology. Furthermore, this role involves collaborating with a globally dispersed team and is integral to the development of the Compliance tech internal team in India, delivering enhancements in compliance tech capabilities to meet regulatory commitments. Your key responsibilities will include analyzing data sets, designing and coding stable and scalable data ingestion workflows, integrating them with existing workflows, and developing analytics algorithms on ingested data. You will also be working on data sourcing in Hadoop and GCP, owning unit testing, UAT deployment, end-user sign-off, and production go-live. Root cause analysis skills will be essential for identifying bugs and issues, and supporting production support and release management teams. You will operate in an agile scrum team and ensure that new code is thoroughly tested at both unit and system levels. To excel in this role, you should have over 10 years of coding experience with reputable organizations, hands-on experience in Bitbucket and CI/CD pipelines, and proficiency in Hadoop, Python, Spark, SQL, Unix, and Hive. A basic understanding of on-prem and GCP data security, as well as hands-on development experience with large ETL/big data systems (with GCP experience being a plus), are required. Familiarity with cloud services such as Cloud Build, Artifact Registry, Cloud DNS, and Cloud Load Balancing, along with Dataflow, Cloud Composer, Cloud Storage, and Dataproc, is essential. Additionally, knowledge of data quality dimensions and data visualization is beneficial. You will receive comprehensive support, including training and development opportunities, coaching from experts in your team, and a culture of continuous learning to facilitate your career progression. The company fosters a collaborative and inclusive work environment, empowering employees to excel together every day. As part of Deutsche Bank Group, we encourage applications from all individuals and promote a positive and fair workplace culture. For further details about our company and teams, please visit our website: https://www.db.com/company/company.htm.
Posted 3 weeks ago
0.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Ready to shape the future of work? At Genpact, we don't just adapt to change; we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's industry-first accelerator is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models onwards, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions - we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us on our website and social channels. Inviting applications for the role of Principal Consultant - GCP Sr Data Engineer. We are seeking a highly accomplished and strategic Google Cloud Data Engineer with deep experience in data engineering, with a significant and demonstrable focus on the Google Cloud Platform (GCP). In this leadership role, you will be instrumental in defining and driving our overall data strategy on GCP, architecting transformative data solutions, and providing expert guidance to engineering teams. You will be a thought leader in leveraging GCP's advanced data services to solve complex business challenges, optimize our data infrastructure at scale, and foster a culture of data excellence. Responsibilities: Define and champion the strategic direction for our data architecture and infrastructure on Google Cloud Platform, aligning with business objectives and future growth. Architect and oversee the development of highly scalable, resilient, and cost-effective data platforms and pipelines on GCP, leveraging services like BigQuery, Dataflow, Cloud Composer, Dataproc, and more. Provide expert-level guidance and technical leadership to senior data engineers and development teams on best practices for data modeling, ETL/ELT processes, and data warehousing within GCP. Drive the adoption of cutting-edge GCP data technologies and methodologies to enhance our data capabilities and efficiency. Lead the design and implementation of comprehensive data governance frameworks, security protocols, and compliance measures within the Google Cloud environment. Collaborate closely with executive leadership, product management, data science, and analytics teams to translate business vision into robust and scalable data solutions on GCP. Identify and mitigate critical technical risks and challenges related to our data infrastructure and architecture on GCP. Establish and enforce data quality standards, monitoring systems, and incident response processes within the GCP data landscape. Mentor and develop senior data engineers, fostering their technical expertise and leadership skills within the Google Cloud context. Evaluate and recommend new GCP services and third-party tools to optimize our data ecosystem. Represent the data engineering team in strategic technical discussions and contribute to the overall technology roadmap. Qualifications we seek in you!
Minimum Qualifications / Skills: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. Progressive and impactful experience in data engineering roles, with a significant and deep focus on the Google Cloud Platform. Expert-level knowledge of GCP's core data engineering services and best practices for building scalable and reliable solutions. Proven ability to architect and implement complex data warehousing and data lake solutions on GCP (BigQuery, Cloud Storage). Mastery of SQL and extensive experience with programming languages relevant to data engineering on GCP (e.g., Python, Scala, Java). Deep understanding of data governance principles, security best practices within GCP (IAM, Security Command Center), and compliance frameworks (e.g., GDPR, HIPAA). Exceptional problem-solving, strategic thinking, and analytical skills, with the ability to navigate complex technical and business challenges. Outstanding communication, presentation, and influencing skills, with the ability to articulate complex technical visions to both technical and non-technical audiences, including executive leadership. Proven track record of leading and mentoring high-performing data engineering teams within a cloud-first environment. Preferred Qualifications / Skills: Google Cloud Certified Professional Data Engineer. Extensive experience with infrastructure-as-code tools for GCP (e.g., Terraform, Deployment Manager). Deep expertise in data streaming technologies on GCP (e.g., Dataflow, Pub/Sub, Apache Beam). Proven experience in integrating machine learning workflows and MLOps on GCP (e.g., Vertex AI). Significant contributions to open-source data projects or active participation in the GCP data engineering community. Experience in defining and implementing data mesh or data fabric architectures on GCP. Strong understanding of enterprise architecture principles and their application within the Google Cloud ecosystem. Experience in [mention specific industry or domain relevant to your company]. Demonstrated ability to drive significant technical initiatives and influence organizational data strategy. Why join Genpact? Be a transformation leader - Work at the cutting edge of AI, automation, and digital innovation Make an impact - Drive change for global enterprises and solve business challenges that matter Accelerate your career - Get hands-on experience, mentorship, and continuous learning opportunities Work with the best - Join 140,000+ bold thinkers and problem-solvers who push boundaries every day Thrive in a values-driven culture - Our courage, curiosity, and incisiveness - built on a foundation of integrity and inclusion - allow your ideas to fuel progress Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together. Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation.
Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
Posted 3 weeks ago
18.0 - 22.0 years
25 - 30 Lacs
Pune
Work from Office
Treasury Technology are responsible for the design, build and operation of Deutsche Bank's Treasury trading, Balance-sheet Management and Liquidity Reporting ecosystem. In partnership with the Treasury business we look to deliver innovative technology solutions that will enable the business to gain a competitive edge and operational efficiency. This is a global role to lead the Engineering function for the Treasury Engineering product portfolio. The aim is to develop a best-in-class portfolio consisting of the following products: Liquidity Measurement and Management; Issuance and Securitization; Risk in Banking Book; Funds Transfer Pricing. Treasury is about managing the money and financial risks in a business. This involves making sure the business has the capital it needs to manage its day-to-day business obligations, while helping develop its long term financial strategy and policies. Economic factors such as interest rate rises, changes in regulations and volatile foreign exchange rates can have a serious impact on any business. Treasury is responsible for monitoring and assessing market conditions and putting strategies in place to mitigate any potential financial risks to the business. As a senior leader in Software Engineering, you will lead a highly inspired and inquisitive team of technologists to develop applications to the highest standards. You will be expected to solve complex business and technical challenges while managing a large group of senior business stakeholders. You will build an effective and trusted global engineering capability that can deliver consistently against the business ambitions. You are expected to take ownership of the quality of the platform, dev automation, agile processes and production resiliency. Position Specific Responsibilities and Accountabilities: Lead the Global Engineering function across our strategic locations based at Pune, Bucharest, London and New York Communicate with senior business stakeholders with regard to the vision and business goals. Provide transparency to program status, and manage risks and issues Lead a culture of innovation and experimentation, support the full software development lifecycle that incorporates the best of technology approaches and delivery methodologies Ensure on time product releases that are of high quality, enabling the core vision of next generation trade processing systems compliant with regulatory requirements Lead development of the next generation of cloud-enabled platforms, which includes modern web frameworks and complex transaction processing systems leveraging a broad set of technology stacks Experience in building fault-tolerant, low-latency, scalable solutions that perform at a global enterprise scale Implement the talent strategy for engineering aligned to the broader Treasury Technology talent strategy & operating model Develop applications with industry best practice using DevOps and automated deployment and testing frameworks Skills Matrix: Education Qualifications: Degree from an accredited college or university (or equivalent certification and/or relevant work experience). Business Analysis and SME Experience: 18+ years of experience in the following areas: Well-developed requirements analysis skills, including good communication abilities (both speaking and listening) and stakeholder management (all levels up to Managing Director).
Experience working with Front Office business teams is highly desirable Experience in IT delivery or architecture including experience as an Application Developer and people manager Strong object-oriented design skills Previous experience hiring, motivating and managing internal and vendor teams. Technical Experience Mandatory Skills: Java, ideally Spark and Scala; Oracle, Postgres and other database technologies Experience developing microservices-based architectures UI design and implementation Business Process Management tools (e.g. jBPM, IBM BPM) Experience with a range of BI technologies including Tableau Experience with DevOps best practices (DORA), CI/CD Experience in application security, scalability, performance tuning and optimization (NFRs) Experience in API design, sound knowledge of microservices, containerization (Docker), exposure to federated and NoSQL DBs. Experience in database query tuning and optimization Experience in implementing DevOps best practices including CI/CD and API testing automation. Experience working in an Agile-based team, ideally Scrum Desirable skills: Experience with Cloud Services Platforms, in particular Google Cloud, and internal cloud-based development (Cloud Run, Cloud Composer, Cloud SQL, Docker, K8s) Industry Domain Experience Hands-on knowledge of enterprise technology platforms supporting Front Office, Finance and/or Risk domains would be a significant advantage, as would experience or interest in Sustainable Finance. For example: Knowledge of the Finance/controlling domain and end-to-end workflow for banking & trading businesses. High level understanding of financial products across Investment, Corporate and Private/Retail banking, in particular Loans. Knowledge of the investment banking, sales & trading, asset management and similar industries is a strong advantage. Clear Thought & Leadership A mindset built on simplicity A clear understanding of the concept of re-use in software development, and the drive to apply it relentlessly Ability to talk in functional and data terms to clients, embedded architects and senior managers Technical Leadership skills Ability to work in a fast-paced environment with competing and alternating priorities with a constant focus on delivery. Proven ability to balance business demands and IT fulfillment in terms of standardisation, reducing risk and increasing IT flexibility. Logical & structured approach to problem-solving in both near-term (tactical) and mid-long term (strategic) horizons. Communication: Good verbal as well as written communication and presentation capabilities. Good team player, facilitator, negotiator and networker. Able to lead senior managers towards common goals and build consensus across a diverse group. Able to lead and influence a diverse team from a range of technical and non-technical backgrounds.
Posted 1 month ago
0.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a GCP Python PySpark Engineer to join our team in Hyderabad, Telangana (IN-TG), India (IN). Strong hands-on experience in designing and building data pipelines using Google Cloud Platform (GCP) services like BigQuery, Dataflow, and Cloud Composer. Proficient in Python for data processing, scripting, and automation in cloud and distributed environments. Solid working knowledge of Apache Spark / PySpark, with experience in large-scale data transformation and performance tuning. Familiar with CI/CD processes, version control (Git), and workflow orchestration tools such as Airflow or Composer. Ability to work independently in fast-paced Agile environments with strong problem-solving and communication skills. Exposure to modern data architectures and real-time/streaming data solutions is an added advantage. About NTT DATA NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at . NTT DATA endeavors to make accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at . This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click . If you'd like more information on your EEO rights under the law, please click . For Pay Transparency information, please click.
Posted 1 month ago
0.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Ready to shape the future of work? At Genpact, we don't just adapt to change; we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's industry-first accelerator is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models onwards, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions - we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us on our website and social channels. Inviting applications for the role of Consultant - GCP Sr Data Engineer. We are seeking a highly accomplished and strategic Google Cloud Data Engineer with deep experience in data engineering, with a significant and demonstrable focus on the Google Cloud Platform (GCP). In this leadership role, you will be instrumental in defining and driving our overall data strategy on GCP, architecting transformative data solutions, and providing expert guidance to engineering teams. You will be a thought leader in leveraging GCP's advanced data services to solve complex business challenges, optimize our data infrastructure at scale, and foster a culture of data excellence. Responsibilities: Define and champion the strategic direction for our data architecture and infrastructure on Google Cloud Platform, aligning with business objectives and future growth. Architect and oversee the development of highly scalable, resilient, and cost-effective data platforms and pipelines on GCP, leveraging services like BigQuery, Dataflow, Cloud Composer, Dataproc, and more. Provide expert-level guidance and technical leadership to senior data engineers and development teams on best practices for data modeling, ETL/ELT processes, and data warehousing within GCP. Drive the adoption of cutting-edge GCP data technologies and methodologies to enhance our data capabilities and efficiency. Lead the design and implementation of comprehensive data governance frameworks, security protocols, and compliance measures within the Google Cloud environment. Collaborate closely with executive leadership, product management, data science, and analytics teams to translate business vision into robust and scalable data solutions on GCP. Identify and mitigate critical technical risks and challenges related to our data infrastructure and architecture on GCP. Establish and enforce data quality standards, monitoring systems, and incident response processes within the GCP data landscape. Mentor and develop senior data engineers, fostering their technical expertise and leadership skills within the Google Cloud context. Evaluate and recommend new GCP services and third-party tools to optimize our data ecosystem. Represent the data engineering team in strategic technical discussions and contribute to the overall technology roadmap. Qualifications we seek in you!
Minimum Qualifications / Skills: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. Experience in data engineering roles, with a significant and deep focus on the Google Cloud Platform. Expert-level knowledge of GCP's core data engineering services and best practices for building scalable and reliable solutions. Proven ability to architect and implement complex data warehousing and data lake solutions on GCP (BigQuery, Cloud Storage). Mastery of SQL and extensive experience with programming languages relevant to data engineering on GCP (e.g., Python, Scala, Java). Deep understanding of data governance principles, security best practices within GCP (IAM, Security Command Center), and compliance frameworks (e.g., GDPR, HIPAA). Exceptional problem-solving, strategic thinking, and analytical skills, with the ability to navigate complex technical and business challenges. Outstanding communication, presentation, and influencing skills, with the ability to articulate complex technical visions to both technical and non-technical audiences, including executive leadership. Proven track record of leading and mentoring high-performing data engineering teams within a cloud-first environment. Preferred Qualifications / Skills: Google Cloud Certified Professional Data Engineer. Extensive experience with infrastructure-as-code tools for GCP (e.g., Terraform, Deployment Manager). Deep expertise in data streaming technologies on GCP (e.g., Dataflow, Pub/Sub, Apache Beam). Proven experience in integrating machine learning workflows and MLOps on GCP (e.g., Vertex AI). Significant contributions to open-source data projects or active participation in the GCP data engineering community. Experience in defining and implementing data mesh or data fabric architectures on GCP. Strong understanding of enterprise architecture principles and their application within the Google Cloud ecosystem. Experience in [mention specific industry or domain relevant to your company]. Demonstrated ability to drive significant technical initiatives and influence organizational data strategy. Why join Genpact? Be a transformation leader - Work at the cutting edge of AI, automation, and digital innovation Make an impact - Drive change for global enterprises and solve business challenges that matter Accelerate your career - Get hands-on experience, mentorship, and continuous learning opportunities Work with the best - Join 140,000+ bold thinkers and problem-solvers who push boundaries every day Thrive in a values-driven culture - Our courage, curiosity, and incisiveness - built on a foundation of integrity and inclusion - allow your ideas to fuel progress Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together. Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way.
Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
Posted 1 month ago
5.0 - 10.0 years
20 - 35 Lacs
Pune
Work from Office
Responsibilities Lead and mentor a team of data engineers, providing technical guidance, setting best practices, and overseeing task execution for the migration project. Design, develop, and architect scalable ETL processes to extract, transform, and load petabytes of data from on-premises SQL Server to GCP Cloud SQL PostgreSQL. Oversee the comprehensive analysis of existing SQL Server schemas, data types, stored procedures, and complex data models, defining strategies for their optimal conversion and refactoring for PostgreSQL. Establish and enforce rigorous data validation, quality, and integrity frameworks throughout the migration lifecycle, ensuring accuracy and consistency. Collaborate strategically with Database Administrators, application architects, business stakeholders, and security teams to define migration scope, requirements, and cutover plans. Lead the development and maintenance of advanced scripts (primarily Python) for automating large-scale migration tasks, complex data transformations, and reconciliation processes. Proactively identify, troubleshoot, and lead the resolution of complex data discrepancies, performance bottlenecks, and technical challenges during migration. Define and maintain comprehensive documentation standards for migration strategies, data mapping, transformation rules, and post-migration validation procedures. Ensure data governance, security, and compliance standards are meticulously applied throughout the migration process, including data encryption and access controls within GCP. Implement a schema conversion or custom schema mapping strategy for the SQL Server to PostgreSQL shift. Refactor and translate complex stored procedures and T-SQL logic to PostgreSQL-compatible constructs while preserving functional equivalence. Develop and execute comprehensive data reconciliation strategies to ensure consistency and parity between legacy and migrated datasets post-cutover. Design fallback procedures and lead post-migration verification and support to ensure business continuity. Ensure metadata cataloging and data lineage tracking using GCP-native or integrated tools. Must-Have Skills Expertise in data engineering, specifically for Google Cloud Platform (GCP). Deep understanding of relational database architecture, advanced schema design, data modeling, and performance tuning. Expert-level SQL proficiency, with extensive hands-on experience in both T-SQL (SQL Server) and PostgreSQL. Hands-on experience with data migration processes, including moving datasets from on-premises databases to cloud storage solutions. Proficiency in designing, implementing, and optimizing complex ETL/ELT pipelines for high-volume data movement, leveraging tools and custom scripting. Strong knowledge of GCP services: Cloud SQL, Dataflow, Pub/Sub, Cloud Storage, Dataproc, Cloud Composer, Cloud Functions, and BigQuery. Solid understanding of data governance, security, and compliance practices in the cloud, including the management of sensitive data during migration. Strong programming skills in Python or Java for building data pipelines and automating processes. Experience with real-time data processing using Pub/Sub, Dataflow, or similar GCP services. Experience with CI/CD practices and tools like Jenkins, GitLab, or Cloud Build for automating the data engineering pipeline. Knowledge of data modeling and best practices for structuring cloud data storage for optimal query performance and analytics in GCP.
Familiarity with observability and monitoring tools in GCP (e.g., Stackdriver, Prometheus) for real-time data pipeline visibility and alerting. Good-to-Have Skills Direct experience with GCP Database Migration Service, Storage Transfer Service, or similar cloud-native migration tools. Familiarity with data orchestration using tools like Cloud Composer (based on Apache Airflow) for managing workflows. Experience with containerization tools like Docker and Kubernetes for deploying data pipelines in a scalable manner. Exposure to DataOps tools and methodologies for managing data workflows. Experience with machine learning platforms like AI Platform in GCP to integrate with data pipelines. Familiarity with data lake architecture and the integration of BigQuery with Google Cloud Storage or Dataproc.
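As a small illustration of the post-migration reconciliation work described above, the sketch below compares row counts per table between the on-premises SQL Server source and the Cloud SQL PostgreSQL target; connection details, table names, and the simplified schema mapping are all assumptions.

```python
# A small reconciliation sketch, assuming hypothetical connection strings,
# of the post-migration parity checks described above (row counts per table).
import pyodbc
import psycopg2

TABLES = ["dbo.customers", "dbo.orders"]        # tables chosen for illustration

mssql = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=onprem-sql;DATABASE=erp;"
    "UID=reader;PWD=example"
)
pg = psycopg2.connect(host="10.0.0.5", dbname="erp", user="reader", password="example")

for table in TABLES:
    src = mssql.cursor().execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    tgt_table = table.split(".")[-1]            # schema mapping is simplified here
    with pg.cursor() as cur:
        cur.execute(f"SELECT COUNT(*) FROM {tgt_table}")
        tgt = cur.fetchone()[0]
    status = "OK" if src == tgt else "MISMATCH"
    print(f"{table}: source={src} target={tgt} -> {status}")
```

Real reconciliation would go beyond counts (checksums, sampled column comparisons), but the count check is the usual first gate before cutover sign-off.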
Posted 1 month ago
3.0 - 5.0 years
5 - 7 Lacs
Bengaluru
Hybrid
Shift : (GMT+05:30) Asia/Kolkata (IST) What do you need for this opportunity? Must have skills required: Data Engineering, Big Data Technologies, Hadoop, Spark, Hive, Presto, Airflow, Data Modeling, ETL development, Data Lake Architecture, Python, Scala, GCP BigQuery, Dataproc, Dataflow, Cloud Composer, AWS, Big Data Stack, Azure, GCP Wayfair is Looking for: About the job The Data Engineering team within the SMART org supports development of large-scale data pipelines for machine learning and analytical solutions related to unstructured and structured data. You'll have the opportunity to gain hands-on experience on all kinds of systems in the data platform ecosystem. Your work will have a direct impact on all applications that our millions of customers interact with every day: search results, homepage content, emails, auto-complete searches, browse pages and product carousels. You will also build and scale data platforms that enable us to measure the effectiveness of Wayfair's ad costs and media attribution, helping decide on day-to-day and major marketing spends. About the Role: As a Data Engineer, you will be part of the Data Engineering team with this role being inherently multi-functional, and the ideal candidate will work with Data Scientists, Analysts, and Application teams across the company, as well as all other Data Engineering squads at Wayfair. We are looking for someone with a love for data, understanding requirements clearly and the ability to iterate quickly. Successful candidates will have strong engineering skills and communication and a belief that data-driven processes lead to phenomenal products. What you'll do: Build and launch data pipelines, and data products focussed on SMART Org. Helping teams push the boundaries of insights, creating new product features using data, and powering machine learning models. Build cross-functional relationships to understand data needs, build key metrics and standardize their usage across the organization. Utilize current and leading-edge technologies in software engineering, big data, streaming, and cloud infrastructure. What You'll Need: Bachelor's/Master's degree in Computer Science or related technical subject area or equivalent combination of education and experience 3+ years of relevant work experience in the Data Engineering field with web scale data sets. Demonstrated strength in data modeling, ETL development and data lake architecture. Data warehousing experience with big data technologies (Hadoop, Spark, Hive, Presto, Airflow, etc.). Coding proficiency in at least one modern programming language (Python, Scala, etc.) Experience building/operating highly available, distributed systems of data extraction, ingestion, and processing and query performance tuning skills of large data sets. Industry experience as a Big Data Engineer working alongside cross-functional teams such as Software Engineering, Analytics, Data Science with a track record of manipulating, processing, and extracting value from large datasets. Strong business acumen. Experience leading large-scale data warehousing and analytics projects, including using GCP technologies (BigQuery, Dataproc, GCS, Cloud Composer, Dataflow) or related big data technologies in other cloud platforms like AWS, Azure, etc. Be a team player and introduce/follow the best practices in the data engineering space. Ability to effectively communicate (both written and verbally) technical information and the results of engineering design at all levels of the organization.
Good to have: Exposure to NoSQL databases and Pub/Sub architecture setup. Familiarity with BI tools like Looker, Tableau, AtScale, PowerBI, or any similar tools.
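To make the pipeline work described above more concrete, here is a minimal PySpark batch-aggregation sketch. It assumes a hypothetical GCS bucket of clickstream Parquet files, hypothetical column names, and a hypothetical output path; it is an illustrative sketch of the pattern, not any company's actual pipeline.

```python
# Minimal PySpark batch-aggregation sketch (illustrative only).
# Bucket names, column names, and paths are hypothetical assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ad_cost_effectiveness").getOrCreate()

# Read raw clickstream events landed in a (hypothetical) GCS bucket.
events = spark.read.parquet("gs://example-bucket/clickstream/date=2024-01-01/")

# Aggregate spend and attributed revenue per campaign per day.
daily = (
    events
    .groupBy("campaign_id", F.to_date("event_ts").alias("event_date"))
    .agg(
        F.sum("ad_cost").alias("total_ad_cost"),
        F.sum("attributed_revenue").alias("attributed_revenue"),
    )
    .withColumn(
        "return_on_ad_spend",
        F.col("attributed_revenue") / F.col("total_ad_cost"),
    )
)

# Write the aggregate back out for downstream marts and dashboards.
daily.write.mode("overwrite").parquet("gs://example-bucket/marts/ad_effectiveness/")
```

Reading and writing `gs://` paths assumes the GCS connector is on the cluster classpath, which managed services such as Dataproc provide by default.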
Posted 1 month ago
4.0 - 8.0 years
10 - 18 Lacs
Hyderabad
Hybrid
About the Role:
We are looking for a skilled and motivated Data Engineer with strong experience in Python programming and Google Cloud Platform (GCP) to join our data engineering team. The ideal candidate will be responsible for designing, developing, and maintaining robust and scalable ETL (Extract, Transform, Load) data pipelines. The role involves working with various GCP services, implementing data ingestion and transformation logic, and ensuring data quality and consistency across systems.

Key Responsibilities:
Design, develop, test, and maintain scalable ETL data pipelines using Python.
Work extensively on Google Cloud Platform (GCP) services such as:
- Dataflow for real-time and batch data processing
- Cloud Functions for lightweight serverless compute
- BigQuery for data warehousing and analytics
- Cloud Composer for orchestration of data workflows (based on Apache Airflow)
- Google Cloud Storage (GCS) for managing data at scale
- IAM for access control and security
- Cloud Run for containerized applications
Perform data ingestion from various sources and apply transformation and cleansing logic to ensure high-quality data delivery.
Implement and enforce data quality checks, validation rules, and monitoring.
Collaborate with data scientists, analysts, and other engineering teams to understand data needs and deliver efficient data solutions.
Manage version control using GitHub and participate in CI/CD pipeline deployments for data projects.
Write complex SQL queries for data extraction and validation from relational databases such as SQL Server, Oracle, or PostgreSQL.
Document pipeline designs, data flow diagrams, and operational support procedures.

Required Skills:
4-6 years of hands-on experience in Python for backend or data engineering projects.
Strong understanding and working experience with GCP cloud services (especially Dataflow, BigQuery, Cloud Functions, Cloud Composer, etc.).
Solid understanding of data pipeline architecture, data integration, and transformation techniques.
Experience with version control systems like GitHub and knowledge of CI/CD practices.
Strong experience in SQL with at least one enterprise database (SQL Server, Oracle, PostgreSQL, etc.).

Good to Have (Optional Skills):
Experience working with the Snowflake cloud data platform.
Hands-on knowledge of Databricks for big data processing and analytics.
Familiarity with Azure Data Factory (ADF) and other Azure data engineering tools.
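As a rough illustration of the ETL pipeline pattern this role describes, below is a minimal Apache Beam pipeline in Python that could run on Dataflow. The project, bucket, dataset, and table names are hypothetical placeholders, and the transform is deliberately simple.

```python
# Minimal Apache Beam ETL sketch (batch): GCS CSV -> parse -> BigQuery.
# Project, bucket, and table identifiers below are hypothetical placeholders.
import csv
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_line(line):
    """Turn one CSV line into a dict matching the BigQuery schema."""
    order_id, amount = next(csv.reader([line]))
    return {"order_id": order_id, "amount": float(amount)}


options = PipelineOptions(
    runner="DataflowRunner",          # or "DirectRunner" for local testing
    project="example-project",
    region="us-central1",
    temp_location="gs://example-bucket/tmp",
)

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadFromGCS" >> beam.io.ReadFromText(
            "gs://example-bucket/raw/orders.csv", skip_header_lines=1
        )
        | "ParseCSV" >> beam.Map(parse_line)
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "example-project:analytics.orders",
            schema="order_id:STRING,amount:FLOAT",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```

In practice the same pipeline code can be smoke-tested locally with the DirectRunner before being submitted to Dataflow.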
Posted 1 month ago
0.0 years
0 Lacs
Pune, Maharashtra, India
On-site
About VOIS:
VOIS (Vodafone Intelligent Solutions) is a strategic arm of Vodafone Group Plc, creating value and enhancing quality and efficiency across 28 countries, and operating from 7 locations: Albania, Egypt, Hungary, India, Romania, Spain and the UK. Over 29,000 highly skilled individuals are dedicated to being Vodafone Group's partner of choice for talent, technology, and transformation. We deliver the best services across IT, Business Intelligence Services, Customer Operations, Business Operations, HR, Finance, Supply Chain, HR Operations, and many more. Established in 2006, VOIS has evolved into a global, multi-functional organisation, a Centre of Excellence for Intelligent Solutions focused on adding value and delivering business outcomes for Vodafone.

About VOIS India:
In 2009, VOIS started operating in India and now has established global delivery centres in Pune, Bangalore and Ahmedabad. With more than 14,500 employees, VOIS India supports global markets and group functions of Vodafone, and delivers best-in-class customer experience through multi-functional services in the areas of Information Technology, Networks, Business Intelligence and Analytics, Digital Business Solutions (Robotics & AI), Commercial Operations (Consumer & Business), Intelligent Operations, Finance Operations, Supply Chain Operations, HR Operations and more.

Job Description
Role purpose:
Creating detailed data architecture documentation, including data models, data flow diagrams, and technical specifications.
Creating and maintaining data models for databases, data warehouses, and data lakes, defining relationships between data entities to optimize data retrieval and analysis.
Designing and implementing data pipelines to integrate data from multiple sources, ensuring data consistency and quality across systems.
Collaborating with business stakeholders to define the overall data strategy, aligning data needs with business requirements.
Supporting migration of new and changed software, and elaborating and performing production checks.
Effectively communicating complex data concepts to both technical and non-technical stakeholders.
GCP knowledge/experience with Cloud Composer, BigQuery, Pub/Sub, Cloud Functions.
Strong communicator, experienced in leading and negotiating decisions and effective outcomes.
Strong overarching data architecture knowledge and experience, with the ability to govern application of architecture principles within projects.

VOIS Equal Opportunity Employer Commitment India:
VOIS is proud to be an Equal Employment Opportunity Employer. We celebrate differences and we welcome and value diverse people and insights. We believe that being authentically human and inclusive powers our employees' growth and enables them to create a positive impact on themselves and society. We do not discriminate based on age, colour, gender (including pregnancy, childbirth, or related medical conditions), gender identity, gender expression, national origin, race, religion, sexual orientation, status as an individual with a disability, or other applicable legally protected characteristics. As a result of living and breathing our commitment, our employees have helped us get certified as a Great Place to Work in India for four years running. We have also been highlighted among the Top 10 Best Workplaces for Millennials, Equity, and Inclusion, Top 50 Best Workplaces for Women, Top 25 Best Workplaces in IT & IT-BPM, and 10th Overall Best Workplaces in India by the Great Place to Work Institute in 2024.
These achievements position us among a select group of trustworthy and high-performing companies which put their employees at the heart of everything they do. By joining us, you are part of our commitment. We look forward to welcoming you into our family which represents a variety of cultures, backgrounds, perspectives, and skills! Apply now, and we'll be in touch!
Posted 1 month ago
2.0 - 6.0 years
4 - 8 Lacs
Faridabad
Work from Office
Job Summary
We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities
Design and implement scalable data models using Snowflake and Erwin Data Modeler.
Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models.
Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
Document metadata, data lineage, and the business logic behind data structures and flows.
Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance.

Must-Have Skills
Snowflake architecture, schema design, and data warehouse experience.
DBT (Data Build Tool) for data transformation and pipeline development.
Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
Hands-on experience with Erwin Data Modeler (logical and physical modeling).
Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
Experience in reverse engineering legacy systems like Sailfish or DDMS using DBeaver.

Good To Have
Experience with CI/CD tools and DevOps for data environments.
Familiarity with data governance, security, and privacy practices.
Exposure to Agile methodologies and working in distributed teams.
Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills
Excellent problem-solving and analytical skills.
Strong communication and stakeholder management.
Self-driven with the ability to work independently in a remote setup.

Skills: GCP, Erwin, DBT, SQL, data modeling, DBeaver, BigQuery, query optimization, Dataflow, Cloud Storage, Snowflake, Erwin Data Modeler, data pipelines, data transformation, data modeler
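As a small, hedged illustration of the SQL-centric validation work this role involves, the sketch below uses the Snowflake Python connector to run a simple data-quality check. The account, credentials, and table names are placeholder assumptions, not details from this posting.

```python
# Minimal data-quality check against Snowflake (illustrative sketch).
# Account, credential, and table identifiers are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",
    user="example_user",
    password="example_password",   # prefer key-pair auth or a secrets manager in practice
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)

try:
    cur = conn.cursor()
    # Flag rows that violate a basic not-null / referential expectation.
    cur.execute(
        """
        SELECT COUNT(*) AS bad_rows
        FROM orders o
        LEFT JOIN customers c ON o.customer_id = c.customer_id
        WHERE o.customer_id IS NULL OR c.customer_id IS NULL
        """
    )
    bad_rows = cur.fetchone()[0]
    print(f"Orders failing the customer reference check: {bad_rows}")
finally:
    conn.close()
```

In a DBT-based setup the same expectation would more typically live as a schema test or singular test so it runs with every pipeline build.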
Posted 1 month ago
6.0 - 11.0 years
17 - 30 Lacs
Hyderabad/Secunderabad, Bangalore/Bengaluru, Delhi / NCR
Hybrid
Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose, the relentless pursuit of a world that works better for people, we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of GCP Sr Data Engineer

We are seeking a highly experienced and visionary Senior Google Cloud Data Engineer to spearhead the design, development, and optimization of our data infrastructure and pipelines on the Google Cloud Platform (GCP). With over 10 years of hands-on experience in data engineering, you will be instrumental in building scalable, reliable, and performant data solutions that power our advanced analytics, machine learning initiatives, and real-time reporting. You will provide technical leadership, mentor team members, and champion best practices for data engineering within a GCP environment.

Responsibilities
Architect, design, and implement end-to-end data pipelines on GCP using services like Dataflow, Cloud Composer (Airflow), Pub/Sub, and BigQuery.
Build and optimize data warehousing solutions leveraging BigQuery's capabilities for large-scale data analysis.
Design and implement data lakes on Google Cloud Storage, ensuring efficient data organization and accessibility.
Develop and maintain scalable ETL/ELT processes to ingest, transform, and load data from diverse sources into GCP.
Implement robust data quality checks, monitoring, and alerting mechanisms within the GCP data ecosystem.
Collaborate closely with data scientists, analysts, and business stakeholders to understand their data requirements and deliver high-impact solutions on GCP.
Lead the evaluation and adoption of new GCP data engineering services and technologies.
Implement and enforce data governance policies, security best practices, and compliance requirements within the Google Cloud environment.
Provide technical guidance and mentorship to other data engineers on the team, promoting knowledge sharing and skill development within the GCP context.
Troubleshoot and resolve complex data-related issues within the GCP infrastructure.
Contribute to the development of data engineering standards, best practices, and comprehensive documentation specific to GCP.

Qualifications we seek in you!
Minimum Qualifications / Skills
• Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
• 10+ years of progressive experience in data engineering roles, with a strong focus on cloud technologies.
• Deep and demonstrable expertise with the Google Cloud Platform (GCP) and its core data engineering services (e.g., BigQuery, Dataflow, Cloud Composer, Cloud Storage, Pub/Sub, Cloud Functions).
• Extensive experience designing, building, and managing large-scale data pipelines and ETL/ELT workflows specifically on GCP.
• Strong proficiency in SQL and at least one programming language relevant to data engineering on GCP (e.g., Python).
• Comprehensive understanding of data warehousing concepts, data modeling techniques optimized for BigQuery, and NoSQL database options on GCP (e.g., Cloud Bigtable, Firestore).
• Solid grasp of data governance principles, data security best practices within GCP (IAM, KMS), and compliance frameworks.
• Excellent problem-solving, analytical, and debugging skills within a cloud environment.
• Exceptional communication, collaboration, and presentation skills, with the ability to articulate technical concepts clearly to various audiences.

Preferred Qualifications / Skills
Google Cloud certifications relevant to data engineering (e.g., Professional Data Engineer).
Experience with infrastructure-as-code tools for GCP (e.g., Terraform, Deployment Manager).
Familiarity with data streaming technologies on GCP (e.g., Dataflow, Pub/Sub).
Experience with machine learning workflows and MLOps on GCP (e.g., Vertex AI).
Knowledge of containerization technologies (Docker, Kubernetes) and their application within GCP data pipelines (e.g., Dataflow FlexRS).
Experience with data visualization tools that integrate well with GCP (e.g., Looker).
Familiarity with data cataloging and data lineage tools on GCP (e.g., Data Catalog).
Experience in [mention specific industry or domain relevant to your company].
Proven experience in leading technical teams and mentoring junior engineers in a GCP environment.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Furthermore, please do note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
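To ground the Cloud Composer orchestration responsibilities named above, here is a minimal Airflow DAG sketch in Python of the kind that would run on Cloud Composer. The DAG id, bucket, dataset, table names, and SQL are hypothetical placeholders, and the structure is deliberately simplified.

```python
# Minimal Airflow DAG sketch for a daily ELT step on GCP (illustrative only).
# DAG id, bucket, dataset, and SQL below are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

with DAG(
    dag_id="daily_orders_elt",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Land raw files from Cloud Storage into a staging table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_orders",
        bucket="example-bucket",
        source_objects=["raw/orders/{{ ds }}/*.csv"],
        destination_project_dataset_table="example-project.staging.orders",
        source_format="CSV",
        write_disposition="WRITE_TRUNCATE",
        autodetect=True,
    )

    # Transform the staging table into a reporting mart.
    build_mart = BigQueryInsertJobOperator(
        task_id="build_orders_mart",
        configuration={
            "query": {
                "query": (
                    "SELECT order_date, SUM(amount) AS revenue "
                    "FROM `example-project.staging.orders` GROUP BY order_date"
                ),
                "destinationTable": {
                    "projectId": "example-project",
                    "datasetId": "marts",
                    "tableId": "daily_revenue",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )

    load_raw >> build_mart
```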
Posted 1 month ago
7.0 - 10.0 years
0 Lacs
Pune, Chennai, Bengaluru
Work from Office
As a GCP Data Engineer, the colleague should be able to design scalable data architectures on Google Cloud Platform using services like BigQuery and Dataflow. They write and maintain code (Python, Java), ensuring efficient data models and seamless ETL processes. Quality checks and governance are implemented to maintain accurate and reliable data. Security is a priority, enforcing measures for storage, transmission, and processing, while ensuring compliance with data protection standards. Collaboration with cross-functional teams is key for understanding diverse data requirements. Comprehensive documentation is maintained for data processes, pipelines, and architectures. Responsibilities extend to optimizing data pipelines and queries for performance, troubleshooting issues, and proactively monitoring data accuracy. Continuous learning is emphasized to stay updated on GCP features and industry best practices, ensuring a current and effective data engineering approach.

Experience
- Proficiency in programming languages: Python, PySpark
- Expertise in data processing frameworks: Apache Beam (Dataflow)
- Active experience on GCP tools and technologies like BigQuery, Dataflow, Cloud Composer, Cloud Spanner, GCS, DBT, etc.
- Data engineering skillset using Python, SQL
- Experience in ETL (Extract, Transform, Load) processes
- Knowledge of DevOps tools like Jenkins, GitHub, Terraform is desirable
- Good knowledge of Kafka (batch/streaming)
- Understanding of data models and experience in performing ETL design and build, and database replication using message-based CDC
- Familiarity with cloud storage solutions
- Strong problem-solving abilities in data engineering challenges
- Understanding of data security and scalability
- Proficiency in relevant tools like Apache Airflow

Desirables
- Knowledge of data modelling and database design
- Good understanding of cloud security
- Proven practical experience of using the Google Cloud SDK to deliver APIs and automation
- Crafting continuous integration and continuous delivery/deployment tooling pipelines (Jenkins/Spinnaker)
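As a small, hedged illustration of the Kafka streaming and message-based CDC skills listed above, here is a minimal consumer sketch using the kafka-python library. The broker addresses, topic name, and the Debezium-style message envelope are hypothetical assumptions.

```python
# Minimal Kafka consumer sketch for CDC-style change events (illustrative only).
# Brokers, topic, and the Debezium-style envelope are hypothetical placeholders.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "orders.cdc",                                  # hypothetical CDC topic
    bootstrap_servers=["broker1:9092", "broker2:9092"],
    group_id="orders-cdc-loader",
    auto_offset_reset="earliest",
    enable_auto_commit=True,
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

for message in consumer:
    change = message.value
    # A real pipeline would apply each change to a staging table or lake zone;
    # here we simply route inserts, updates, and deletes for demonstration.
    op = change.get("op")
    if op in ("c", "u"):          # create / update
        print(f"Upsert order {change['after']['order_id']}")
    elif op == "d":               # delete
        print(f"Delete order {change['before']['order_id']}")
```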
Posted 1 month ago
5.0 - 10.0 years
0 Lacs
Pune, Bengaluru
Hybrid
Job Summary
We are seeking a highly skilled Hadoop Developer / Lead Data Engineer to join our data engineering team based in Bangalore or Pune. The ideal candidate will have extensive experience with Hadoop ecosystem technologies and cloud-based big data platforms, particularly on Google Cloud Platform (GCP). This role involves designing, developing, and maintaining scalable data ingestion, processing, and transformation frameworks to support enterprise data needs.

Minimum Qualifications
Bachelor's degree in Computer Science, Computer Information Systems, or a related technical field.
5-10 years of experience in software engineering or data engineering, with a strong focus on big data technologies.
Proven experience in implementing software development life cycles (SDLC) in enterprise environments.

Technical Skills & Expertise
Big Data Technologies: Expertise in the Hadoop platform, Hive, and related ecosystem tools. Strong experience with Apache Spark (using SQL, Scala, and/or Java). Experience with real-time data streaming using Kafka.
Programming Languages & Frameworks: Proficient in PySpark and SQL for data processing and transformation. Strong coding skills in Python.
Cloud Technologies (Google Cloud Platform): Experience with BigQuery for data warehousing and analytics. Familiarity with Cloud Composer (Airflow) for workflow orchestration. Hands-on with Dataproc for managed Spark and Hadoop clusters.

Responsibilities
Design, develop, and implement scalable data ingestion and transformation pipelines using Hadoop and GCP services.
Build real-time and batch data processing solutions leveraging Spark, Kafka, and related technologies.
Ensure data quality, governance, and lineage by implementing automated validation and classification frameworks.
Collaborate with cross-functional teams to deploy and operationalize data analytics tools at enterprise scale.
Participate in production support and on-call rotations to maintain system reliability.
Follow established SDLC practices to deliver high-quality, maintainable solutions.

Preferred Qualifications
Experience leading or mentoring data engineering teams.
Familiarity with CI/CD pipelines and DevOps best practices for big data environments.
Strong communication skills with an ability to collaborate across teams.
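For the real-time Spark-plus-Kafka processing this posting describes, the following is a minimal Spark Structured Streaming sketch in PySpark. The broker, topic, schema, and output paths are hypothetical placeholders, and it assumes the spark-sql-kafka connector package is available on the cluster classpath.

```python
# Minimal Spark Structured Streaming sketch: Kafka -> parse JSON -> Parquet.
# Brokers, topic, schema, and paths are hypothetical placeholders; the
# spark-sql-kafka connector is assumed to be on the cluster's classpath.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("orders_stream").getOrCreate()

event_schema = StructType([
    StructField("order_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_ts", StringType()),
])

raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")
    .option("subscribe", "orders")
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers bytes; decode the value column and parse the JSON payload.
orders = (
    raw.select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

query = (
    orders.writeStream
    .format("parquet")
    .option("path", "gs://example-bucket/streams/orders/")
    .option("checkpointLocation", "gs://example-bucket/checkpoints/orders/")
    .outputMode("append")
    .start()
)

query.awaitTermination()
```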
Posted 1 month ago
12.0 - 15.0 years
40 - 45 Lacs
Chennai
Work from Office
Skill & Experience
Strategic planning and direction; maintaining architecture principles, guidelines, and standards.
Project & program management.
Data warehousing, big data, and data analytics & data science for solutioning.
Expert in BigQuery, Dataproc, Data Fusion, Dataflow, Bigtable, Firestore, Cloud SQL, Cloud Spanner, Google Cloud Storage, Cloud Composer, Cloud Interconnect, etc.
Strong experience in big data: data modelling, design, architecting, and solutioning.
Understands programming languages like SQL, Python, R, and Scala; good Python skills.
Experience with data visualisation tools such as Google Data Studio or Power BI.
Knowledge of A/B testing, statistics, Google Cloud Platform, Google BigQuery, Agile development, DevOps, data engineering, and ETL data processing.
Strong experience migrating production Hadoop clusters to Google Cloud.
Experience in designing & implementing solutions in the mentioned areas, with strong command of Google Cloud Platform data components: BigQuery, Bigtable, Cloud SQL, Dataproc, Dataflow, Data Fusion, etc.
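As a hedged sketch of one step in a Hadoop-to-GCP migration like the one described above, the snippet below loads Parquet files exported from a Hadoop cluster into BigQuery using the official Python client. The bucket, dataset, and table names are hypothetical placeholders.

```python
# Minimal BigQuery batch load from GCS (illustrative migration step).
# Bucket, dataset, and table identifiers are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

# Parquet files previously exported from the on-prem Hadoop cluster.
uri = "gs://example-bucket/hadoop-export/orders/*.parquet"
table_id = "example-project.warehouse.orders"

load_job = client.load_table_from_uri(uri, table_id, job_config=job_config)
load_job.result()  # wait for the load to complete

table = client.get_table(table_id)
print(f"Loaded {table.num_rows} rows into {table_id}")
```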
Posted 2 months ago
7.0 - 12.0 years
20 - 25 Lacs
Chennai, Bengaluru
Work from Office
We are looking for a Senior GCP Data Engineer / GCP Technical Lead with strong expertise in Google Cloud Platform (GCP), Apache Spark, and Python to join our growing data engineering team. The ideal candidate will have extensive experience working with GCP data services and should be capable of leading technical teams, designing robust data pipelines, and interacting directly with clients to gather requirements and ensure project delivery.

Project Duration: 1 year, extendable.

Role & responsibilities
Design, develop, and deploy scalable data pipelines and solutions using GCP services like Dataproc and BigQuery.
Lead and mentor a team of data engineers to ensure high-quality deliverables.
Collaborate with cross-functional teams and client stakeholders to define technical requirements and deliver solutions aligned with business goals.
Optimize data processing and transformation workflows for performance and cost-efficiency.
Ensure adherence to best practices in cloud data architecture, data security, and governance.

Mandatory Skills:
Google Cloud Platform (GCP), especially Dataproc and BigQuery
Apache Spark
Python programming

Preferred Skills:
Experience working with large-scale data processing frameworks.
Exposure to DevOps/CI-CD practices in a cloud environment.
Hands-on experience with other GCP tools like Cloud Composer, Pub/Sub, or Cloud Storage is a plus.

Soft Skills:
Strong communication and client interaction skills.
Ability to work independently and as part of a distributed team.
Excellent problem-solving and team management capabilities.
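Since the role centers on Dataproc and BigQuery, below is a minimal, hedged sketch of submitting a PySpark job to an existing Dataproc cluster with the google-cloud-dataproc Python client, following the pattern from Google's published samples. The project, region, cluster name, and GCS paths are hypothetical placeholders.

```python
# Minimal Dataproc PySpark job submission sketch (illustrative only).
# Project, region, cluster, and GCS paths are hypothetical placeholders.
from google.cloud import dataproc_v1

project_id = "example-project"
region = "us-central1"
cluster_name = "example-cluster"

# The job controller endpoint is regional.
job_client = dataproc_v1.JobControllerClient(
    client_options={"api_endpoint": f"{region}-dataproc.googleapis.com:443"}
)

job = {
    "placement": {"cluster_name": cluster_name},
    "pyspark_job": {
        "main_python_file_uri": "gs://example-bucket/jobs/transform_orders.py",
        "args": ["--run-date", "2024-01-01"],
    },
}

# Submit and block until the job finishes.
operation = job_client.submit_job_as_operation(
    request={"project_id": project_id, "region": region, "job": job}
)
response = operation.result()
print(f"Job finished with state: {response.status.state.name}")
```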
Posted 2 months ago