Jobs
Interviews

167 Cloud SQL Jobs

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Technology Service Specialist, AVP at our Pune location, you will be an integral part of the Technology, Data, and Innovation (TDI) Private Bank team. In this role, you will be responsible for providing 2nd-level application support for business applications used in branches, by mobile sales, or via the internet. Your expertise in Incident Management and Problem Management will be crucial in ensuring the stability of these applications. Partnerdata, the central client reference data system in Germany, is a core banking system that integrates many banking processes and applications through numerous interfaces. With the recent migration to Google Cloud (GCP), you will be involved in operating and further developing applications and functionalities on the cloud platform. Your focus will also extend to regulatory topics surrounding partner/client relationships. We are seeking individuals who can contribute to this contemporary and emerging cloud application area.

Key Responsibilities:
- Ensure optimum service level to supported business lines
- Oversee resolution of incidents and problems within the team
- Assist in managing business stakeholder relationships
- Define and manage OLAs with relevant stakeholders
- Monitor team performance, adherence to processes, and alignment with business SLAs
- Manage escalations and work with relevant functions to resolve issues quickly
- Identify areas for improvement and implement best practices in your area of expertise
- Mentor and coach Production Management Analysts within the team
- Fulfil Service Requests, communicate with the Service Desk function, and participate in major incident calls
- Document tasks, incidents, problems, changes, and knowledge bases
- Improve monitoring of applications and implement automation of tasks

Skills and Experience:
- Service Operations Specialist experience in a global operations context
- Extensive experience supporting complex application and infrastructure domains
- Ability to manage and mentor Service Operations teams
- Strong ITIL/best-practice service context knowledge
- Proficiency in interface technologies, communication protocols, and ITSM tools
- Bachelor's degree in IT or a Computer Science related discipline
- ITIL certification and experience with the ITSM tool ServiceNow preferred
- Knowledge of the banking domain and regulatory topics
- Experience with databases like BigQuery and understanding of Big Data and GCP technologies
- Proficiency in tools like GitHub, Terraform, Cloud SQL, Cloud Storage, Dataproc, Dataflow
- Architectural skills for big data solutions and interface architecture

Area-Specific Tasks/Responsibilities:
- Handle Incident/Problem Management and Service Request Fulfilment
- Analyze and resolve incidents escalated from 1st-level support
- Support the resolution of high-impact incidents and escalate when necessary
- Provide solutions for open problems and support service transition for new projects/applications

Joining our team, you will receive training, development opportunities, coaching from experts, and a culture of continuous learning to support your career progression. We value diversity and promote a positive, fair, and inclusive work environment at Deutsche Bank Group. Visit our company website for more information.
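One of the recurring duties above is improving application monitoring and automating routine tasks on GCP. As a purely illustrative sketch (not part of the posting), the snippet below shows a data-freshness check against a hypothetical BigQuery table using the Google Cloud Python client; the project, dataset, table, and column names are invented.

```python
# Minimal sketch of an automated freshness check; all resource names are hypothetical.
from datetime import datetime, timezone
from google.cloud import bigquery  # pip install google-cloud-bigquery

def check_partner_feed_freshness(project: str, max_lag_hours: int = 6) -> None:
    client = bigquery.Client(project=project)
    query = """
        SELECT MAX(load_timestamp) AS last_load
        FROM `my_project.partnerdata.interface_feed`   -- hypothetical table
    """
    last_load = next(iter(client.query(query).result())).last_load
    lag_hours = (datetime.now(timezone.utc) - last_load).total_seconds() / 3600
    if lag_hours > max_lag_hours:
        # In practice this would raise an incident or alert rather than print.
        print(f"ALERT: interface feed is {lag_hours:.1f}h stale (threshold {max_lag_hours}h)")
    else:
        print(f"OK: last load was {lag_hours:.1f}h ago")

if __name__ == "__main__":
    check_partner_feed_freshness("my-gcp-project")
```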

Posted 1 day ago

Apply

6.0 - 10.0 years

1 - 1 Lacs

Chennai

Hybrid

Overview: TekWissen is a global workforce management provider operating throughout India and many other countries in the world. The client below is a global company with shared ideals and a deep sense of family. From our earliest days as a pioneer of modern transportation, we have sought to make the world a better place, one that benefits lives, communities, and the planet.

Job Title: Specialty Development Practitioner
Location: Chennai
Work Type: Hybrid

Position Description: At the client's Credit Company, we are modernizing our enterprise data warehouse in Google Cloud to enhance data, analytics, and AI/ML capabilities, improve customer experience, ensure regulatory compliance, and boost operational efficiencies. As a GCP Data Engineer, you will integrate data from various sources into novel data products. You will build upon existing analytical data, including merging historical data from legacy platforms with data ingested from new platforms. You will also analyze and manipulate large datasets, activating data assets to enable enterprise platforms and analytics within GCP. You will design and implement the transformation and modernization on GCP, creating scalable data pipelines that land data from source applications, integrate into subject areas, and build data marts and products for analytics solutions. You will also conduct deep-dive analysis of current-state Receivables and Originations data in our data warehouse, performing impact analysis related to the client's Credit North America modernization and providing implementation solutions. Moreover, you will partner closely with our AI, data science, and product teams, developing creative solutions that build the future for the client's Credit. Experience with large-scale solutions and operationalizing data warehouses, data lakes, and analytics platforms on Google Cloud Platform or other cloud environments is a must. We are looking for candidates with a broad set of analytical and technology skills across these areas who can demonstrate an ability to design the right solutions with the appropriate combination of GCP and third-party technologies for deployment on Google Cloud Platform.

Skills Required: BigQuery, Dataflow, Dataform, Data Fusion, Dataproc, Cloud Composer, Airflow, Cloud SQL, Compute Engine, Google Cloud Platform

Experience Required:
- GCP Data Engineer Certified
- Successfully designed and implemented data warehouses and ETL processes for over five years, delivering high-quality data solutions
- 5+ years of complex SQL development experience
- 2+ years of experience with programming languages such as Python, Java, or Apache Beam
- Experienced cloud engineer with 3+ years of GCP expertise, specializing in managing cloud infrastructure and applications into production-scale solutions
Skills: BigQuery, Dataflow, Dataform, Data Fusion, Dataproc, Cloud Composer, Airflow, Cloud SQL, Compute Engine, Google Cloud Platform, Terraform, Tekton, Postgres, PySpark, Python, APIs, Cloud Build, App Engine, Apache Kafka, Pub/Sub, AI/ML, Kubernetes

Experience Preferred:
- In-depth understanding of GCP's underlying architecture and hands-on experience with crucial GCP services, especially those related to data processing (batch/real time), leveraging Terraform, BigQuery, Dataflow, Pub/Sub, Dataform, Astronomer, Data Fusion, Dataproc, PySpark, Cloud Composer/Airflow, Cloud SQL, Compute Engine, Cloud Functions, Cloud Run, Cloud Build, and App Engine, alongside storage including Cloud Storage
- DevOps tools such as Tekton, GitHub, Terraform, Docker
- Expert in designing, optimizing, and troubleshooting complex data pipelines
- Experience developing with microservice architecture from a container orchestration framework
- Experience in designing pipelines and architectures for data processing
- Passion and self-motivation to develop, experiment with, and implement state-of-the-art data engineering methods and techniques
- Self-directed, works independently with minimal supervision, and adapts to ambiguous environments
- Evidence of a proactive problem-solving mindset and willingness to take the initiative
- Strong prioritization, collaboration, and coordination skills, and the ability to simplify and communicate complex ideas with cross-functional teams and all levels of management
- Proven ability to juggle multiple responsibilities and competing demands while maintaining a high level of productivity
- Data engineering or development experience gained in a regulated financial environment
- Experience in coaching and mentoring Data Engineers
- Project management tools like Atlassian JIRA
- Experience working in an implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment
- Experience with data security, governance, and compliance best practices in the cloud
- Experience with AI solutions or platforms that support AI solutions
- Experience using data science concepts on production datasets to generate insights

Experience Range: 5+ years
Education Required: Bachelor's Degree

TekWissen® Group is an equal opportunity employer supporting workforce diversity.
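For context on the Cloud Composer/Airflow, Cloud Storage, and BigQuery combination listed above, here is a hedged, illustrative Cloud Composer DAG of the kind such a pipeline might use: it lands a daily file from GCS into a BigQuery staging table and then builds a small mart. The bucket, project, dataset, and table names are placeholders, not details from the posting.

```python
# Hypothetical Cloud Composer (Airflow) DAG; all resource names are invented.
from datetime import datetime
from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="receivables_daily_load",          # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    land_to_staging = GCSToBigQueryOperator(
        task_id="land_to_staging",
        bucket="my-landing-bucket",
        source_objects=["receivables/{{ ds }}/*.csv"],
        destination_project_dataset_table="my_project.staging.receivables",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_TRUNCATE",
    )

    build_mart = BigQueryInsertJobOperator(
        task_id="build_mart",
        configuration={
            "query": {
                "query": """
                    SELECT customer_id, SUM(amount) AS balance
                    FROM `my_project.staging.receivables`
                    GROUP BY customer_id
                """,
                "useLegacySql": False,
                "destinationTable": {
                    "projectId": "my_project",
                    "datasetId": "marts",
                    "tableId": "receivables_balance",
                },
                "writeDisposition": "WRITE_TRUNCATE",
            }
        },
    )

    land_to_staging >> build_mart
```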

Posted 3 days ago

Apply

9.0 - 14.0 years

7 - 14 Lacs

Hyderabad, Pune

Hybrid

Role & responsibilities Key Skills Required are 8 years of handson experience in cloud application architecture with a focus on creating scalable and reliable software systems 8 Years Experience using Google Cloud Platform GCP including but not restricting to services like Bigquery Cloud SQL Fire store Cloud Composer Experience on Security identity and access management Networking protocols such as TCPIP and HTTPS Network security design including segmentation encryption logging and monitoring Network topologies load balancing and segmentation Python for Rest APIs and Microservices Design and development guidance Python with GCP Cloud SQLPostgreSQL BigQuery Integration of Python API to FE applications built on React JS Unit Testing frameworks Python unit test pytest Java junit spock and groovy DevOps automation process like Jenkins Docker deployments etc Code Deployments on VMs validating an overall solution from the perspective of Infrastructure performance scalability security capacity and create effective mitigation plans Automation technologies Terraform or Google Cloud Deployment Manager Ansible Implementing solutions and processes to manage cloud costs Experience in providing solution to Web Applications Requirements and Design knowledge React JS Elastic Cache GCP IAM Managed Instance Group VMs and GKE Owning the endtoend delivery of solutions which will include developing testing and releasing Infrastructure as Code Translate business requirementsuser stories into a practical scalable solution that leverages the functionality and best practices of the HSBC Executing technical feasibility assessments solution estimations and proposal development for moving identified workloads to the GCP Designing and implementing secure scalable and innovative solutions to meet Banks requirements Ability to interact and influence across all organizational levels on technical or business solutions Certified Google Cloud Architect would be an addon Create and own scaling capacity planning configuration management and monitoring of processes and procedures Create put into practice and use cloudnative solutions Lead the adoption of new cloud technologies and establish best practices for them Experience establishing technical strategy and architecture at the enterprise level Experience leading GCP Cloud project delivery Collaborate with IT security to monitor cloud privacy Architecture DevOps data and integration teams to ensure best practices are followed throughout cloud adoption Respond to technical issues and provide guidance to technical team Skills Mandatory Skills : GCP Storage,GCP BigQuery,GCP DataProc,GCP Vertex AI,GCP Spanner,GCP Dataprep,GCP Datastream,Google Analytics Hub,GCP Dataform,GCP Dataplex/Catalog,GCP Cloud Datastore/Firestore,GCP Datafusion,GCP Pub/Sub,GCP Cloud SQL,GCP Cloud Composer,Google Looker,GCP Cloud Datastore,GCP Data Architecture,Google Cloud IAM,GCP Bigtable,GCP Looker1,GCP Data Flow,GCP Cloud Pub/Sub"

Posted 3 days ago

Apply

6.0 - 10.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

You should have 6-10 years of experience in development, specifically in Java/J2EE, with strong knowledge of core Java. Additionally, you must be proficient in Spring frameworks, particularly Spring MVC, Spring Boot, and JPA + Hibernate. It is essential to have hands-on experience with microservice technology, including development of RESTful and SOAP web services. A good understanding of Oracle DB is required. Your communication skills, especially when interacting with clients, should be excellent. Experience with build tools like Maven, deployment, and troubleshooting issues is necessary. Knowledge of CI/CD tools such as Jenkins and experience with Git or similar source control tools is expected. You should also be familiar with Agile/Scrum software development methodologies using tools like Jira, Confluence, and Bitbucket, and have performed requirement analysis. It would be beneficial to have knowledge of frontend stacks like React or Angular, as well as frontend and backend API integration. Experience with AWS, CI/CD best practices, and designing security reference architectures for AWS infrastructure applications is advantageous. You should possess good verbal and written communication skills, the ability to multitask in a fast-paced environment, and be highly organized and detail-oriented. Awareness of common information security principles and practices is required. TELUS International is committed to creating a diverse and inclusive workplace and is an equal opportunity employer. All employment decisions are based on qualifications, merits, competence, and performance without regard to any characteristic related to diversity.

Posted 3 days ago

Apply

5.0 - 10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

It&aposs fun to work in a company where people truly BELIEVE in what they are doing! We&aposre committed to bringing passion and customer focus to the business. Location - Open Position: Data Engineer (GCP) Technology If you are an extraordinary developer and who loves to push the boundaries to solve complex business problems using creative solutions, then we wish to talk with you. As an Analytics Technology Engineer, you will work on the Technology team that helps deliver our Data Engineering offerings at large scale to our Fortune clients worldwide. The role is responsible for innovating, building and maintaining technology services. Responsibilities: Be an integral part of large scale client business development and delivery engagements Develop the software and systems needed for end-to-end execution on large projects Work across all phases of SDLC, and use Software Engineering principles to build scaled solutions Build the knowledge base required to deliver increasingly complex technology projects Qualifications & Experience: A bachelors degree in Computer Science or related field with 5 to 10 years of technology experience Desired Technical Skills: Data Engineering and Analytics on Google Cloud Platform: Basic Cloud Computing Concepts Bigquery, Google Cloud Storage, Cloud SQL, PubSub, Dataflow, Cloud Composer, GCP Data Transfer, gcloud CLI Python, Google Cloud Python SDK, SQL Experience in working with Any NoSQL/Columnar / MPP Database Experience in working with Any ETL Tool (Informatica/DataStage/Talend/Pentaho etc.) Strong Knowledge of database concepts, Data Modeling in RDBMS Vs NoSQL, OLTP Vs OLAP, MPP Architecture Other Desired Skills: Excellent communication and co-ordination skills Problem understanding, articulation and solutioning Quick learner & adaptable with regards to new technologies Ability to research & solve technical issues Responsibilities: Developing Data Pipelines (Batch/Streaming) Developing Complex data transformations ETL Orchestration Data Migration Develop and Maintain Datawarehouse / Data Lakes Good To Have: Experience in working with Apache Spark / Kafka Machine Learning concepts Google Cloud Professional Data Engineer Certification If you like wild growth and working with happy, enthusiastic over-achievers, you&aposll enjoy your career with us! Not the right fit Let us know you&aposre interested in a future opportunity by clicking Introduce Yourself in the top-right corner of the page or create an account to set up email alerts as new job postings become available that meet your interest! Show more Show less

Posted 3 days ago

Apply

3.0 - 5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Overview of 66degrees
66degrees is a leading consulting and professional services company specializing in developing AI-focused, data-led solutions leveraging the latest advancements in cloud technology. With our unmatched engineering capabilities and vast industry experience, we help the world's leading brands transform their business challenges into opportunities and shape the future of work. At 66degrees, we believe in embracing the challenge and winning together. These values not only guide us in achieving our goals as a company but also for our people. We are dedicated to creating a significant impact for our employees by fostering a culture that sparks innovation and supports professional and personal growth along the way.

Overview of Role
As a Data Engineer specializing in AI/ML, you'll be instrumental in designing, building, and maintaining the data infrastructure crucial for training, deploying, and serving our advanced AI and Machine Learning models. You'll work closely with Data Scientists, ML Engineers, and Cloud Architects to ensure data is accessible, reliable, and optimized for high-performance AI/ML workloads, primarily within the Google Cloud ecosystem.

Responsibilities
- Data Pipeline Development: Design, build, and maintain robust, scalable, and efficient ETL/ELT data pipelines to ingest, transform, and load data from various sources into data lakes and data warehouses, specifically optimized for AI/ML consumption.
- AI/ML Data Infrastructure: Architect and implement the underlying data infrastructure required for machine learning model training, serving, and monitoring within GCP environments.
- Google Cloud Ecosystem: Leverage a broad range of Google Cloud Platform (GCP) data services including BigQuery, Dataflow, Dataproc, Cloud Storage, Pub/Sub, Vertex AI, Composer (Airflow), and Cloud SQL.
- Data Quality & Governance: Implement best practices for data quality, data governance, data lineage, and data security to ensure the reliability and integrity of AI/ML datasets.
- Performance Optimization: Optimize data pipelines and storage solutions for performance, cost-efficiency, and scalability, particularly for large-scale AI/ML data processing.
- Collaboration with AI/ML Teams: Work closely with Data Scientists and ML Engineers to understand their data needs, prepare datasets for model training, and assist in deploying models into production.
- Automation & MLOps Support: Contribute to the automation of data pipelines and support MLOps initiatives, ensuring seamless integration from data ingestion to model deployment and monitoring.
- Troubleshooting & Support: Troubleshoot and resolve data-related issues within the AI/ML ecosystem, ensuring data availability and pipeline health.
- Documentation: Create and maintain comprehensive documentation for data architectures, pipelines, and data models.

Qualifications
- 1-2+ years of experience in Data Engineering, with at least 2-3 years directly focused on building data pipelines for AI/ML workloads.
- Deep, hands-on experience with core GCP data services such as BigQuery, Dataflow, Dataproc, Cloud Storage, Pub/Sub, and Composer/Airflow.
- Strong proficiency in at least one relevant programming language for data engineering (Python is highly preferred).
- SQL skills for complex data manipulation, querying, and optimization.
- Solid understanding of data warehousing concepts, data modeling (dimensional, 3NF), and schema design for analytical and AI/ML purposes.
- Proven experience designing, building, and optimizing large-scale ETL/ELT processes.
- Familiarity with big data processing frameworks (e.g., Apache Spark, Hadoop) and concepts.
- Exceptional analytical and problem-solving skills, with the ability to design solutions for complex data challenges.
- Excellent verbal and written communication skills, capable of explaining complex technical concepts to both technical and non-technical stakeholders.

66degrees is an Equal Opportunity employer. All qualified applicants will receive consideration for employment without regard to actual or perceived race, color, religion, sex, gender, gender identity, national origin, age, weight, height, marital status, sexual orientation, veteran status, disability status, or other legally protected class.
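As a hedged illustration of the ETL/ELT pipelines described above (and the Dataflow service named in the qualifications), the sketch below uses Apache Beam's Python SDK to parse raw event files from Cloud Storage, filter bad records, and write a training table to BigQuery. Paths, schema, and table names are assumptions; on GCP the runner would be switched to DataflowRunner.

```python
# Illustrative Beam pipeline for preparing AI/ML training data; names are placeholders.
import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_event(line: str) -> dict:
    e = json.loads(line)
    return {"user_id": e["user_id"], "feature_x": float(e.get("x", 0)), "label": int(e["label"])}

def run() -> None:
    opts = PipelineOptions(
        runner="DirectRunner",                     # use DataflowRunner on GCP
        temp_location="gs://example-bucket/tmp",   # hypothetical bucket
    )
    with beam.Pipeline(options=opts) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText("gs://example-bucket/raw/events-*.json")
            | "Parse" >> beam.Map(parse_event)
            | "DropBadLabels" >> beam.Filter(lambda r: r["label"] in (0, 1))
            | "WriteBQ" >> beam.io.WriteToBigQuery(
                "example-project:ml_features.training_events",
                schema="user_id:STRING,feature_x:FLOAT,label:INTEGER",
                write_disposition=beam.io.BigQueryDisposition.WRITE_TRUNCATE,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )

if __name__ == "__main__":
    run()
```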

Posted 3 days ago

Apply

3.0 - 7.0 years

13 - 18 Lacs

Pune

Work from Office

About the Role
Job Title: Technical Specialist, Big Data (PySpark) Developer
Location: Pune, India

Role Description
This role is for an Engineer responsible for the design, development, and unit testing of software applications. The candidate is expected to ensure good-quality, maintainable, scalable, and high-performing software applications are delivered to users in an Agile development environment. The candidate should come from a strong technological background, with good working experience in Python and Spark technology, should be hands-on and able to work independently with minimal technical/tool guidance, and should be able to technically guide and mentor junior resources in the team. As a developer, you will bring extensive design and development skills to strengthen the group of developers within the team. The candidate will extensively make use of and apply Continuous Integration tools and practices in the context of Deutsche Bank's digitalization journey.

What we'll offer you:
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance

Your key responsibilities:
- Design and discuss your own solution for addressing user stories and tasks.
- Develop, unit-test, integrate, deploy, maintain, and improve software.
- Perform peer code review.
- Actively participate in sprint activities and ceremonies, e.g., daily stand-up/scrum meeting, sprint planning, retrospectives, etc.
- Apply continuous integration best practices in general (SCM, build automation, unit testing, dependency management).
- Collaborate with other team members to achieve the sprint objectives.
- Report progress and update Agile team management tools (JIRA/Confluence).
- Manage individual task priorities and deliverables.
- Be responsible for the quality of the solutions you provide.
- Contribute to planning and continuous improvement activities, and support the PO, ITAO, Developers, and Scrum Master.

Your skills and experience:
- Engineer with good development experience on a Big Data platform for at least 5 years.
- Hands-on experience in Spark (Hive, Impala).
- Hands-on experience in the Python programming language.
- Preferably, experience in BigQuery, Dataproc, Composer, Terraform, GKE, Cloud SQL, and Cloud Functions.
- Experience in set-up, maintenance, and ongoing development of continuous build/integration infrastructure as part of DevOps; create and maintain fully automated CI build processes and write build and deployment scripts.
- Experience with development platforms (OpenShift/Kubernetes/Docker) configuration and deployment with DevOps tools, e.g., GIT, TeamCity, Maven, SONAR.
- Good knowledge of the core SDLC processes and tools such as HP ALM, Jira, ServiceNow.
- Strong analytical skills and proficient communication skills; fluent in English (written/verbal).
- Ability to work in virtual teams and in matrixed organizations; excellent team player.
- Open-minded and willing to learn business and technology; keeps pace with technical innovation; understands the relevant business area.
- Ability to share information and transfer knowledge to team members.
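For a concrete sense of the Spark/Hive work described above, here is a small, illustrative PySpark job (not Deutsche Bank code): it reads a hypothetical Hive table, aggregates settled trades per account and day, and writes a partitioned curated table.

```python
# Illustrative PySpark batch job; database, table, and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("daily-position-aggregation")
    .enableHiveSupport()
    .getOrCreate()
)

trades = spark.table("raw_db.trades")        # hypothetical Hive table
daily = (
    trades
    .filter(F.col("status") == "SETTLED")
    .groupBy("account_id", "trade_date")
    .agg(
        F.sum("notional").alias("settled_notional"),
        F.count("*").alias("trade_count"),
    )
)

daily.write.mode("overwrite").partitionBy("trade_date").saveAsTable("curated_db.daily_positions")
```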

Posted 4 days ago

Apply

8.0 - 12.0 years

30 - 35 Lacs

Pune

Work from Office

About the Role
Job Title: Senior Engineer PD, AVP
Location: Pune, India

Role Description
Our team is part of the area Technology, Data, and Innovation (TDI) Private Bank. Within TDI, Partner data is the central client reference data system in Germany. As a core banking system, many banking processes and applications are integrated with it and communicate via more than 2,000 interfaces. From a technical perspective, we focus on mainframe but also build solutions on-premise cloud, RESTful services, and an Angular frontend. Next to the maintenance and the implementation of new CTB requirements, the content focus also lies on the regulatory and tax topics surrounding a partner/client. We are looking for a highly motivated candidate for the Cloud Data Engineer area.

What we'll offer you:
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance

Your key responsibilities:
- You are responsible for the implementation of the new project on GCP (Spark, Dataproc, Dataflow, BigQuery, Terraform, etc.) across the whole SDLC chain
- You are responsible for supporting the migration of current functionalities to Google Cloud
- You are responsible for the stability of the application landscape and support software releases
- You also support L3 topics and application governance
- You are responsible in the CTM area for coding as part of an agile team (Java, Scala, Spring Boot)

Your skills and experience:
- You have experience with databases (BigQuery, Cloud SQL, NoSQL, Hive, etc.) and development, preferably for Big Data and GCP technologies
- Strong understanding of the Data Mesh approach and integration patterns
- Understanding of Party data and integration with Product data
- Your architectural skills for big data solutions, especially interface architecture, allow a fast start
- You have experience in at least Spark, Java, Scala and Python, Maven, Artifactory, the Hadoop ecosystem, GitHub Actions, GitHub, and Terraform scripting
- You have knowledge of customer reference data, customer opening processes, and preferably regulatory topics around know-your-customer processes
- You can work very well in teams but also independently, and are constructive and target-oriented
- Your English skills are good and you can communicate both professionally and informally in small talk with the team

How we'll support you
About us and our teams: please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative, and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair, and inclusive work environment.
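A brief, hypothetical example of the Spark-on-GCP work described above: a PySpark job of the sort run on Dataproc that reads partner reference data from BigQuery, applies a simple cleanup, and writes the result back via the spark-bigquery connector. Project, dataset, and bucket names are invented, and the connector is assumed to be available on the cluster.

```python
# Illustrative Dataproc/PySpark job; all resource names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("partner-data-migration").getOrCreate()

partners = (
    spark.read.format("bigquery")
    .option("table", "example-project.partner_raw.partners")
    .load()
)

cleaned = (
    partners
    .withColumn("country_code", F.upper(F.col("country_code")))
    .dropDuplicates(["partner_id"])
)

(
    cleaned.write.format("bigquery")
    .option("table", "example-project.partner_curated.partners")
    .option("temporaryGcsBucket", "example-temp-bucket")  # connector needs a staging bucket for writes
    .mode("overwrite")
    .save()
)
```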

Posted 4 days ago

Apply

6.0 - 10.0 years

0 Lacs

Haryana

On-site

Join GlobalLogic as a valuable member of the team working on a significant software project for a world-class company that provides M2M/IoT 4G/5G modules to industries such as automotive, healthcare, and logistics. Your engagement will involve contributing to the development of end-user modules' firmware, implementing new features, maintaining compatibility with the latest telecommunication and industry standards, and analyzing and estimating customer requirements.

Requirements:
- BA/BS degree in Computer Science, Mathematics, or a related technical field, or equivalent practical experience.
- Proficiency in Cloud SQL and Cloud Bigtable.
- Experience with Dataflow, BigQuery, Dataproc, Datalab, Dataprep, Pub/Sub, and Genomics.
- Familiarity with Google Transfer Appliance, Cloud Storage Transfer Service, and BigQuery Data Transfer.
- Knowledge of data processing software (such as Hadoop, Kafka, Spark, Pig, Hive) and data processing algorithms (MapReduce, Flume).
- Previous experience working with technical customers.
- Proficiency in writing software in languages like Java or Python.
- 6-10 years of relevant consulting, industry, or technology experience.
- Strong problem-solving and troubleshooting skills.
- Excellent communication skills.

Job Responsibilities:
- Hands-on experience working with data warehouses, including technical architectures, infrastructure components, ETL/ELT, and reporting/analytic tools.
- Experience in technical consulting.
- Proficiency in architecting and developing software or internet-scale Big Data solutions in virtualized environments like Google Cloud Platform (mandatory) and AWS/Azure (good to have).
- Familiarity with big data, information retrieval, data mining, machine learning, and building high-availability applications with modern web technologies.
- Working knowledge of ITIL and/or agile methodologies.
- Google Data Engineer certification.

What We Offer:
- Culture of caring: Prioritize a culture of caring, where people come first, fostering an inclusive environment of acceptance and belonging.
- Learning and development: Commitment to continuous learning and growth, offering various programs, training curricula, and hands-on opportunities for personal and professional advancement.
- Interesting & meaningful work: Engage in impactful projects that allow for creative problem-solving and exploration of new solutions.
- Balance and flexibility: Embrace work-life balance with diverse career areas, roles, and work arrangements to support personal well-being.
- High-trust organization: Join a high-trust organization with a focus on integrity, trustworthiness, and ethical practices.

About GlobalLogic
GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner known for collaborating with forward-thinking companies to create innovative digital products and experiences. Join the team in transforming businesses and industries through intelligent products, platforms, and services, contributing to cutting-edge solutions that shape the world today.

Posted 5 days ago

Apply

3.0 - 7.0 years

0 Lacs

Thiruvananthapuram, Kerala

On-site

As a Google Cloud Engineer at our company, you will play a crucial role in designing, building, deploying, and maintaining our cloud infrastructure and applications on Google Cloud Platform (GCP). Your collaboration with development, operations, and security teams will ensure that our cloud environment is scalable, secure, highly available, and cost-optimized. If you are enthusiastic about cloud-native technologies, automation, and overcoming intricate infrastructure challenges, we welcome you to apply.

Your responsibilities will include:
- Designing, implementing, and managing robust, scalable, and secure cloud infrastructure on GCP utilizing Infrastructure as Code (IaC) tools like Terraform.
- Deploying, configuring, and managing core GCP services such as Compute Engine, Kubernetes Engine (GKE), Cloud SQL, Cloud Storage, Cloud Functions, BigQuery, Pub/Sub, and networking components.
- Developing and maintaining CI/CD pipelines for automated deployment and release management using various tools.
- Implementing and enforcing security best practices within the GCP environment, including IAM, network security, data encryption, and compliance adherence.
- Monitoring cloud infrastructure and application performance, identifying bottlenecks, and implementing optimization solutions.
- Troubleshooting and resolving complex infrastructure and application issues in production and non-production environments.
- Collaborating with development teams to ensure cloud-native deployment, scalability, and resilience of applications.
- Participating in on-call rotations for critical incident response and timely resolution of production issues.
- Creating and maintaining comprehensive documentation for cloud architecture, configurations, and operational procedures.
- Keeping up to date with new GCP services, features, and industry best practices to propose and implement improvements.
- Contributing to cost optimization efforts by identifying and implementing efficiencies in cloud resource utilization.

We require you to have:
- A Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field.
- 6+ years of experience with C#, .NET Core, .NET Framework, MVC, Web API, Entity Framework, and SQL Server.
- 3+ years of experience with cloud platforms, preferably GCP, including designing and deploying cloud-native applications.
- 3+ years of experience with source code management, CI/CD pipelines, and Infrastructure as Code.
- Strong experience with JavaScript and a modern JavaScript framework, with VueJS preferred.
- Proven leadership and mentoring skills with development teams.
- Strong understanding of microservices architecture and serverless computing.
- Experience with relational databases like SQL Server and PostgreSQL.
- Excellent problem-solving, analytical, and communication skills, along with Agile/Scrum environment experience.

What can make you stand out:
- GCP Cloud Certification.
- UI development experience with HTML, JavaScript, Angular, and Bootstrap.
- Agile environment experience with Scrum, XP.
- Relational database experience with SQL Server, PostgreSQL.
- Proficiency in Atlassian tools like JIRA, Confluence, and GitHub.
- Working knowledge of Python, exceptional problem-solving and analytical abilities, and strong teamwork skills.

Posted 6 days ago

Apply

10.0 - 14.0 years

0 Lacs

Kochi, Kerala

On-site

Joining Gadgeon offers a dynamic and rewarding career experience that fosters both personal and professional growth. Our collaborative culture encourages innovation, empowering team members to contribute their ideas and expertise to cutting-edge projects.

We are seeking a Principal Data Engineer with over 10 years of experience. In this high-impact role, you will be responsible for defining and driving the data architecture strategy, guiding technical teams, and ensuring robust, scalable, and secure database systems that support business-critical applications. You will work autonomously and directly with stakeholders across engineering, infrastructure, and product leadership, influencing decisions related to GCP Cloud SQL, GKE, and relational and non-relational databases (including Cassandra and Couchbase), while also managing complex database challenges.

Key Responsibilities:
- Define and lead the overall data architecture strategy across cloud and hybrid environments, in alignment with business and technical goals.
- Collaborate directly with leadership to drive key architectural decisions.
- Provide deep, hands-on expertise in GCP Cloud SQL and containerized database deployment using Google Kubernetes Engine (GKE).
- Architect and optimize relational database solutions using PostgreSQL, MySQL, and MS SQL Server.
- Design and implement NoSQL data solutions including Cassandra, Couchbase, and other scalable data stores.
- Lead end-to-end planning and execution of complex data engineering tasks, including schema design, performance tuning, high availability, and disaster recovery.
- Handle and coordinate P1 incident responses, conducting root cause analysis and designing long-term mitigation plans.
- Guide and mentor internal database teams, fostering technical quality and architectural consistency.
- Automate database provisioning, scaling, and maintenance using modern DevOps and infrastructure-as-code tools.
- Provide architectural documentation and recommendations for scalability, security, compliance, and cost efficiency.

Skills:
- 10+ years of experience in data engineering and architecture, including hands-on cloud and database infrastructure leadership.
- Proven expertise in Google Cloud Platform (GCP), especially Cloud SQL and GKE.
- Strong hands-on background in relational databases: PostgreSQL, MySQL, MS SQL Server.
- Deep knowledge and implementation experience with NoSQL technologies: Cassandra, Couchbase, [optionally MongoDB, Redis, etc.]
- Experience in independent or consulting roles, capable of working with minimal supervision and maximum ownership.
- Proven ability to manage and resolve high-severity incidents and communicate resolution paths to leadership.
- Strong scripting and automation skills (e.g., Python, Terraform, Bash).
- Familiarity with CI/CD, container orchestration, and observability practices for modern data infrastructure.
- Background in supporting data compliance, governance, and cost optimization in cloud environments.

Why Join Us:
- Work with a team that values deep security thinking over checkbox compliance.
- Experiment with tools, techniques, and platforms to continuously simulate, validate, and improve.
- Collaborate across teams that respect security as a partner, not a blocker.
- Build scalable security strategies in a growing, tech-driven organization.

Experience: 10+ years
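In the spirit of the automation responsibilities above (database provisioning, scaling, and maintenance), here is a small illustrative Python sketch that triggers on-demand backups for a list of Cloud SQL instances through the Cloud SQL Admin API. The project and instance names are placeholders, and credentials are assumed to come from Application Default Credentials.

```python
# Illustrative Cloud SQL maintenance automation; project and instance names are hypothetical.
from googleapiclient import discovery  # pip install google-api-python-client

PROJECT = "example-project"
INSTANCES = ["orders-pg-prod", "catalog-mysql-prod"]

def run_on_demand_backups() -> None:
    sqladmin = discovery.build("sqladmin", "v1beta4")
    for instance in INSTANCES:
        op = sqladmin.backupRuns().insert(
            project=PROJECT,
            instance=instance,
            body={"description": "pre-maintenance on-demand backup"},
        ).execute()
        print(f"Started backup for {instance}: operation {op.get('name')}")

if __name__ == "__main__":
    run_on_demand_backups()
```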

Posted 6 days ago

Apply

2.0 - 6.0 years

0 Lacs

Karnataka

On-site

As a GCP Senior Data Engineer/Architect, you will play a crucial role in our team by designing, developing, and implementing robust and scalable data solutions on the Google Cloud Platform (GCP). Collaborating closely with Architects and Business Analysts, especially for our US clients, you will translate data requirements into effective technical solutions.

Your responsibilities will include designing and implementing scalable data warehouse and data lake solutions, orchestrating complex data pipelines, leading cloud data lake implementation projects, participating in cloud migration projects, developing containerized applications, optimizing SQL queries, writing automation scripts in Python, and utilizing various GCP data services such as BigQuery, Bigtable, and Cloud SQL.

Your expertise in data warehouse and data lake design and implementation, experience in data pipeline development and tuning, hands-on involvement in cloud migration and data lake projects, proficiency in Docker and GKE, strong SQL and Python scripting skills, and familiarity with GCP services like BigQuery, Cloud SQL, Dataflow, and Composer will be essential for this role. Additionally, knowledge of data governance principles, experience with dbt, and the ability to work effectively within a team and adapt to project needs are highly valued. Strong communication skills, the willingness to work in UK shift timings, and openness to giving and receiving feedback are important traits that will contribute to your success in this role.

Posted 1 week ago

Apply

2.0 - 5.0 years

13 - 17 Lacs

Bengaluru

Work from Office

A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat.

In your role, you will be responsible for working with multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Log Explorer, etc. You must have Python and SQL work experience, be proactive and collaborative, have the ability to respond to critical situations, and be able to analyse data for functional business requirements and interface directly with the customer.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform.
- Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Log Explorer.
- Ambitious individual who can work under their own direction towards agreed targets/goals and with a creative approach to work.
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
- End-to-end functional knowledge of the data pipeline/transformation implementations the candidate has done; should understand the purpose/KPIs for which each data transformation was done.

Preferred technical and professional experience:
- Experience with AEM core technologies: OSGi services, Apache Sling, Granite Framework, Java Content Repository API, Java 8+, localization.
- Familiarity with build tools such as Jenkins and Maven; knowledge of version control tools, especially Git.
- Knowledge of patterns and good practices to design and develop quality, clean code.
- Knowledge of HTML, CSS, JavaScript, and jQuery.
- Familiarity with task management, bug tracking, and collaboration tools like JIRA and Confluence.
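As a hedged illustration of the GCS/Pub/Sub/Cloud Run/BigQuery stack listed above, the sketch below shows a minimal Cloud Run service that accepts Pub/Sub push messages and appends the decoded payload to a BigQuery table. The topic, table, and field names are assumptions for the example, not client specifics.

```python
# Illustrative Pub/Sub push handler for Cloud Run; table and fields are hypothetical.
import base64
from flask import Flask, request
from google.cloud import bigquery

app = Flask(__name__)
bq = bigquery.Client()
TABLE = "example-project.events.raw_messages"  # hypothetical destination table

@app.post("/")
def handle_push():
    envelope = request.get_json(silent=True) or {}
    message = envelope.get("message", {})
    payload = ""
    if message.get("data"):
        payload = base64.b64decode(message["data"]).decode("utf-8")
    errors = bq.insert_rows_json(
        TABLE,
        [{"message_id": message.get("messageId"), "payload": payload}],
    )
    if errors:
        return (f"BigQuery insert failed: {errors}", 500)
    return ("", 204)  # 2xx acknowledges the Pub/Sub message

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```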

Posted 1 week ago

Apply

4.0 - 8.0 years

20 - 35 Lacs

Pune, Gurugram, Bengaluru

Hybrid

Salary: 20 to 35 LPA
Experience: 3 to 7 years
Location: Gurgaon/Pune/Bengaluru
Notice: Immediate to 30 days

Job Profile: Experienced Data Engineer with a strong foundation in designing, building, and maintaining scalable data pipelines and architectures. Skilled in transforming raw data into clean, structured formats for analytics and business intelligence. Proficient in modern data tools and technologies such as SQL, T-SQL, Python, Databricks, and cloud platforms (Azure). Adept at data wrangling, modeling, ETL/ELT development, and ensuring data quality, integrity, and security. A collaborative team player with a track record of enabling data-driven decision-making across business units.

As a Data Engineer, the candidate will work on assignments for one of our Utilities clients. Collaborating with cross-functional teams and stakeholders involves gathering data requirements, aligning business goals, and translating them into scalable data solutions. The role includes working closely with data analysts, scientists, and business users to understand needs, designing robust data pipelines, and ensuring data is accessible, reliable, and well-documented. Regular communication, iterative feedback, and joint problem-solving are key to delivering high-impact, data-driven outcomes that support organizational objectives. This position requires a proven track record of transforming processes and driving customer value and cost savings, with experience in running end-to-end analytics for large-scale organizations.

Responsibilities:
- Design, build, and maintain scalable data pipelines to support analytics, reporting, and advanced modeling needs.
- Collaborate with consultants, analysts, and clients to understand data requirements and translate them into effective data solutions.
- Ensure data accuracy, quality, and integrity through validation, cleansing, and transformation processes.
- Develop and optimize data models, ETL workflows, and database architectures across cloud and on-premises environments.
- Support data-driven decision-making by delivering reliable, well-structured datasets and enabling self-service analytics.
- Provide seamless integration with cloud platforms (Azure), making it easy to build and deploy end-to-end data pipelines in the cloud.
- Use scalable Databricks clusters for handling large datasets and complex computations, optimizing performance and cost management.

Must have:
- Client engagement experience and collaboration with cross-functional teams
- Data engineering background in Databricks
- Capable of working effectively as an individual contributor or in collaborative team environments
- Effective communication and thought leadership with a proven record

Candidate Profile:
- Bachelor's/Master's degree in economics, mathematics, computer science/engineering, operations research, or related analytics areas
- 3+ years of experience, which must be in data engineering
- Hands-on experience with SQL, Python, Databricks, and cloud platforms like Azure
- Prior experience in managing and delivering end-to-end projects
- Outstanding written and verbal communication skills
- Able to work in a fast-paced, continuously evolving environment and ready to take on uphill challenges
- Able to understand cross-cultural differences and work with clients across the globe
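To make the validation/cleansing/transformation step above concrete, here is an illustrative PySpark snippet of the kind typically run in a Databricks notebook on Azure: it casts and filters a hypothetical meter-readings feed, deduplicates it, reports simple quality metrics, and writes a Delta table. Storage paths and column names are assumptions, and Delta support is assumed from the Databricks runtime.

```python
# Illustrative Databricks-style PySpark data-quality step; paths and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("meter-readings-quality").getOrCreate()

raw = (
    spark.read.option("header", True)
    .csv("abfss://landing@exampleaccount.dfs.core.windows.net/meter_readings/")
)

clean = (
    raw
    .withColumn("reading_kwh", F.col("reading_kwh").cast("double"))
    .filter(F.col("meter_id").isNotNull() & (F.col("reading_kwh") >= 0))
    .dropDuplicates(["meter_id", "reading_ts"])
)

# Simple quality metrics that could feed a monitoring dashboard.
total, kept = raw.count(), clean.count()
print(f"Kept {kept}/{total} rows ({total - kept} rejected)")

(
    clean.write.mode("overwrite").format("delta")
    .save("abfss://curated@exampleaccount.dfs.core.windows.net/meter_readings/")
)
```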

Posted 1 week ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

Chennai

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must have skills: Agile Project Management
Good to have skills: Apache Spark
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages. Your role will be pivotal in ensuring that data is accessible, reliable, and ready for analysis, contributing to informed decision-making across the organization.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge in data engineering practices.
- Continuously evaluate and improve data processes to enhance efficiency and effectiveness.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Agile Project Management.
- Good To Have Skills: Experience with Apache Spark, Google Cloud SQL, Python (Programming Language).
- Strong understanding of data pipeline architecture and design principles.
- Experience with ETL tools and data integration techniques.
- Familiarity with data quality frameworks and best practices.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Agile Project Management.
- This position is based in Chennai (mandatory).
- A 15 years full time education is required.

Posted 1 week ago

Apply

3.0 - 8.0 years

6 - 13 Lacs

Mohali

Work from Office

Job Profile: Senior SQL Cloud Database Administrator (DBA) position. Responsibilities: - Managing, optimizing, and securing our cloud-based SQL databases, ensuring high availability and performance. - Design and implement scalable and secure SQL database structures in AWS and GCP environments. - Plan and execute data migration from on-premises or legacy systems to AWS and GCP cloud platforms. - Monitor database performance, identify bottlenecks, and fine-tune queries and indexes for optimal efficiency. - Implement and manage database security protocols, including encryption, access controls, and compliance with regulations. - Develop and maintain robust backup and recovery strategies to ensure data integrity and availability. - Perform regular maintenance tasks such as patching, updates, and troubleshooting database issues. - Work closely with developers, DevOps, and data engineers to support application development and deployment. - Ensure data quality, consistency, and governance across distributed systems. - Keep up with emerging technologies, cloud services, and best practices in database management. Required Skills: - Proven experience as a SQL Database Administrator with expertise in AWS and GCP cloud platforms. - Strong knowledge of SQL database design, implementation, and optimization. - Experience with data migration to cloud environments. - Proficiency in performance monitoring and query optimization. - Knowledge of database security protocols and compliance regulations. - Familiarity with backup and disaster recovery strategies. - Excellent troubleshooting and problem-solving skills. - Strong collaboration and communication skills. - Knowledge of DevOps integration.
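As a small, hypothetical example of the performance-monitoring and index-tuning work described above, the Python snippet below runs EXPLAIN ANALYZE on a suspect PostgreSQL query and then creates a covering index. Connection details, table, and column names are placeholders.

```python
# Illustrative query-tuning check for a cloud-hosted PostgreSQL instance; names are hypothetical.
import psycopg2  # pip install psycopg2-binary

conn = psycopg2.connect(host="10.0.0.5", dbname="appdb", user="dba", password="change-me")
conn.autocommit = True

with conn.cursor() as cur:
    cur.execute("EXPLAIN ANALYZE SELECT * FROM orders WHERE customer_id = %s", (42,))
    for (line,) in cur.fetchall():
        print(line)  # inspect the plan for sequential scans / high cost

    # Add the index only if the plan shows a sequential scan on a large table.
    cur.execute("CREATE INDEX IF NOT EXISTS idx_orders_customer_id ON orders (customer_id)")

conn.close()
```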

Posted 1 week ago

Apply

7.0 - 9.0 years

8 - 10 Lacs

Hyderabad, Telangana, India

On-site

Entity: Google Cloud Architect
- Experience in GCP architecture, landing zone and foundation design, and implementation.
- Experience in GCVE.
- Experience creating infrastructure using Terraform; experience in writing modules and resources and automating resource creation using Cloud Build.
- Experience in GCP services like VPC networks, firewalls, Compute Engine, App Engine, Kubernetes Engine, Cloud SQL, Cloud Storage, Filestore, monitoring, and logging.
- Experience in establishing hybrid connectivity from on-prem to GCP (including VPN and Interconnect).
- Experience with DevOps practices and Infrastructure-as-Code (IaC) scripts.
- Experience with observability / logging and monitoring.
- Experience with designing and implementing infrastructure components such as Kubernetes (GKE).
- Design and delivery of backup and DR (BDR) solutions in GCP.
- Experience with database modernization (Cloud SQL).
- Understanding of GCP networking.

Data Platform Architect
- Proven experience as a Data Architect, with a focus on hyperscalers (AWS, GCP).
- Strong understanding of GCP services such as BigQuery, Cloud Storage, Dataproc, Dataflow, and Pub/Sub.
- Experience with data modeling, ETL processes, and data warehousing.
- Experience with data cataloging, classification, labeling, sensitive data inspection and redaction, and Data Loss Prevention (DLP) on GCP.
- Knowledge of security best practices and regulatory compliance.
- Excellent problem-solving skills and the ability to work in a fast-paced environment.
- Strong leadership and collaboration skills.

Data Center Architect
- Data Center Design: Develop and implement comprehensive data center solutions, including server, storage, network, and infrastructure components. Ensure scalability, performance, and reliability to meet business needs.
- Infrastructure Planning & Optimization: Assess existing infrastructure, identify areas of improvement, and design solutions that enhance performance, availability, and scalability while reducing operational costs.
- Technology Integration: Lead the integration of cloud services, virtualization technologies, and on-premise systems, ensuring seamless operations across multiple platforms and regions.
- Security & Compliance: Ensure that data center designs adhere to the latest security standards, policies, and regulatory compliance (e.g., ISO, PCI DSS, GDPR). Perform vulnerability assessments and recommend security enhancements.
- Disaster Recovery & High Availability: Design and implement disaster recovery (DR) and business continuity solutions to minimize downtime and data loss. Ensure systems are highly available and resilient.
- Capacity Planning: Analyze workload trends and performance data to ensure the data center can meet future demand. Provide recommendations on hardware and software upgrades to support growth.
- Collaboration & Leadership: Work closely with various teams, including IT operations, network engineers, cloud architects, and third-party vendors, to ensure the data center infrastructure aligns with business objectives.
- Documentation & Compliance: Maintain detailed documentation of the data center architecture, configurations, and processes. Ensure designs are audit-ready and compliant with industry standards.

Network Architect
- Network Design & Architecture: Design, develop, and implement scalable and secure network solutions, including LAN, WAN, VPN, and cloud networking. Evaluate the current network setup and recommend improvements to enhance performance, security, and cost-efficiency. Develop detailed network blueprints and architectural models to support business requirements and future scalability.
- Technology Integration & Innovation: Integrate new networking technologies (such as SDN, SD-WAN, cloud networking) with legacy systems while ensuring optimal performance. Lead the adoption of emerging technologies to improve network automation, monitoring, and operational efficiency. Collaborate with cloud architects to ensure seamless hybrid cloud network integration.
- Network Security: Design and enforce security protocols, including firewalls, encryption, access control, and network segmentation, to protect against cyber threats. Ensure that network designs comply with industry standards and regulatory requirements (e.g., PCI DSS, ISO 27001, GDPR).
- Capacity Planning & Scalability: Analyze network traffic patterns, usage, and performance to predict future capacity needs and plan for expansion. Implement strategies to optimize network performance during peak times and reduce latency.
- Disaster Recovery & High Availability: Design and implement disaster recovery solutions and high-availability architectures to ensure minimal downtime and uninterrupted network operations. Develop failover strategies and redundancy plans for critical systems and applications.
- Collaboration & Leadership: Collaborate with IT, security, and cloud teams to ensure network architecture aligns with organizational goals. Act as a technical lead, guiding network engineers and support teams in troubleshooting, deployment, and maintenance. Engage with stakeholders to understand business needs and translate them into technical network requirements.
- Documentation & Compliance: Create and maintain detailed documentation of the network architecture, including diagrams, configurations, and operational procedures. Ensure the network design is compliant with industry standards and organizational policies.

Posted 1 week ago

Apply

10.0 - 12.0 years

9 - 10 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

GCP Certified; designing and architecture; GCP services (Compute Engine, GKE, Cloud SQL, BigQuery, etc.); containers and orchestration; Terraform, Cloud Build, Cloud Functions, or other GCP-native tools; IAM, VPCs, firewall rules, service accounts, and Cloud Identity; Grafana or any monitoring tool.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

66degrees is a leading consulting and professional services company specializing in developing AI-focused, data-led solutions leveraging the latest advancements in cloud technology. With unmatched engineering capabilities and vast industry experience, we help the world's leading brands transform their business challenges into opportunities and shape the future of work. For the role at 66degrees, we are seeking a senior contractor to engage in a 2.5-month remote assignment with the potential to extend. Candidates with the required skills and the ability to work independently as well as within a team environment are encouraged to apply. As part of the responsibilities, you will be expected to facilitate, guide, and influence the client and teams towards an effective architectural pattern. You will serve as an interface between business leadership, technology leadership, and the delivery teams. Your role will involve performing Migration Assessments and producing Migration Plans that include Total Cost of Ownership (TCO), Migration Architecture, Migration Timelines, and Application Waves. Additionally, you will be responsible for designing a solution architecture on Google Cloud to support critical workloads. This will include Heterogeneous Oracle Migrations to Postgres or Spanner. You will need to design a migration path that accounts for the conversion of Application Dependencies, Database objects, Data, Data Pipelines, Orchestration, Users and Security. You will oversee migration activities and provide troubleshooting support, including translation of DDL and DML, executing data transfers using native Google Cloud and 3rd party tools, and setting up and configuring relative Google Cloud components. Furthermore, you will engage with customer teams as a Google Cloud expert to provide Education Workshops, Architectural Recommendations, and Technology reviews and recommendations. Qualifications: - 5+ years of experience with data engineering, cloud architecture, or working with data infrastructure. - 5+ years of Oracle database management and IT experience. - Experience with Oracle Database adjacent products like Golden Gate and Data Guard. - 3+ years of PostgreSQL experience. - Proven experience in performing performance testing and applying remediations to address performance issues. - Experience in designing data models. - Proficiency in Python programming language and SQL. - Advanced SQL skills, including the ability to write, tune, and interpret SQL queries; tool-specific experience in the database platforms listed above is ideal. - Proven experience in migrating and/or implementing cloud databases like Cloud SQL, Spanner, and Bigtable. Desired Skills: - Google Cloud Professional Architect and/or Data Engineer Certification is preferred. 66degrees is committed to protecting your privacy. Your personal information is collected, used, and shared in accordance with the California Consumer Privacy Act (CCPA).,
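The posting above mentions translating DDL and DML and executing data transfers with native Google Cloud and third-party tools. Purely as an illustrative sketch (the connection details, table, and column names below are hypothetical, and real migrations would more often use Database Migration Service, Datastream, or bulk export/import), a batched Oracle-to-Cloud SQL for PostgreSQL copy in Python might look like this:

```python
import oracledb            # python-oracledb for the source
import psycopg2            # psycopg2 for the Cloud SQL Postgres target
from psycopg2.extras import execute_values

# Hypothetical connection details; use Secret Manager or env vars in practice.
src = oracledb.connect(user="app", password="***", dsn="onprem-db:1521/ORCLPDB1")
dst = psycopg2.connect(host="10.0.0.5", dbname="appdb", user="app", password="***")

BATCH = 5000
with src.cursor() as ora, dst.cursor() as pg:
    ora.execute("SELECT customer_id, full_name, created_at FROM customers")
    while True:
        rows = ora.fetchmany(BATCH)       # pull one batch from Oracle
        if not rows:
            break
        execute_values(                   # bulk insert the batch into Postgres
            pg,
            "INSERT INTO customers (customer_id, full_name, created_at) VALUES %s",
            rows,
        )
        dst.commit()                      # commit per batch to bound transaction size

src.close()
dst.close()
```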

Posted 1 week ago

Apply

3.0 - 6.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Comprehensive understanding of BGP: configure and troubleshoot in large global environments. Comprehensive understanding of routing concepts: inter-VR routing, policy-based routing, routing protocols, route filtering, path selection, access-lists, etc. Comprehensive understanding of switching concepts: VLANs, Layer 2, MAC forwarding, VLAN trunking, VRRP, gratuitous ARP. Comprehensive understanding of firewall/security concepts: L2-L7, all versions of NAT, failover scenarios, zonal concepts, IPSec, L7 encryption concepts, URL filtering, DNS, security profiles and rules, proxying. Comprehensive understanding of load balancing concepts: cloud LB and conventional LB and their differences in functionality. Good understanding of public cloud platforms, preferably GCP: specifically networking, firewalling, IAM and how they relate to cloud-native services (PSA, Cloud SQL, GCVE, Cloud Interconnects, BMS, Filestore, NetApp, etc.). Good understanding of Infrastructure as Code (IaC) to provision resources: must be able to customize and optimize the codebase to simplify deliveries. Good understanding of Linux administration: using Linux to bridge technical gaps in Windows and understanding the tools available to troubleshoot network connectivity. Understanding of APIs, in order to expedite data collection and configuration and eliminate human error. Understanding of DevOps and how it can improve delivery and operation.

Primary Skills
- Skills required: network security, switches, routers, firewalls, and cloud.
- Products: Juniper MX, SRX, QFX; Palo Alto physical and virtual firewalls, Panorama.
- Tools: Terraform; AlgoSec or a similar tool for traffic flow governance.
- Mandatory languages: Python, HCL (HashiCorp Configuration Language).
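Since Python is listed as a mandatory language and traffic-flow governance (AlgoSec or similar) is called out, a small hedged illustration of the kind of automation this role implies is shown below: flagging GCP ingress firewall rules that are open to the whole internet using the google-cloud-compute client. The project ID is a placeholder, not part of the posting.

```python
from google.cloud import compute_v1


def open_ingress_rules(project_id: str) -> list[str]:
    """Return names of enabled ingress firewall rules that allow 0.0.0.0/0."""
    client = compute_v1.FirewallsClient()
    risky = []
    for rule in client.list(project=project_id):
        if (
            rule.direction == "INGRESS"
            and not rule.disabled
            and "0.0.0.0/0" in list(rule.source_ranges)
        ):
            risky.append(rule.name)
    return risky


if __name__ == "__main__":
    for name in open_ingress_rules("my-gcp-project"):  # placeholder project ID
        print("review:", name)
```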

Posted 1 week ago

Apply

3.0 - 5.0 years

5 - 8 Lacs

Chennai

Work from Office

" PLEASE READ THE JOB DESTCRIPTION AND APPLY" Data Engineer Job Description Position Overview Yesterday is history, tomorrow is a mystery, but today is a gift. That's why we call it the present. - Master Oogway Join CustomerLabs' dynamic data team as a Data Engineer and play a pivotal role in transforming raw marketing data into actionable insights that power our digital marketing platform. As a key member of our data infrastructure team, you will design, develop, and maintain robust data pipelines, data warehouses, and analytics platforms that serve as the backbone of our digital marketing product development. Sometimes the hardest choices require the strongest wills. - Thanos (but we promise, our data decisions are much easier! ) In this role, you will collaborate with cross-functional teams including Data Scientists, Product Managers, and Marketing Technology specialists to ensure seamless data flow from various marketing channels, ad platforms, and customer touchpoints to our analytics dashboards and reporting systems. You'll be responsible for building scalable, reliable, and efficient data solutions that can handle high-volume marketing data processing and real-time campaign analytics. What You'll Do: - Design and implement enterprise-grade data pipelines for marketing data ingestion and processing - Build and optimize data warehouses and data lakes to support digital marketing analytics - Ensure data quality, security, and compliance across all marketing data systems - Create data models and schemas that support marketing attribution, customer journey analysis, and campaign performance tracking - Develop monitoring and alerting systems to maintain data pipeline reliability for critical marketing operations - Collaborate with product teams to understand digital marketing requirements and translate them into technical solutions Why This Role Matters: I can do this all day. - Captain America (and you'll want to, because this role is that rewarding!) You'll be the backbone behind the data infrastructure that powers CustomerLabs' digital marketing platform, making marketers' lives easier and better. Your work directly translates to smarter automation, clearer insights, and more successful campaigns - helping marketers focus on what they do best while we handle the complex data heavy lifting. Sometimes you gotta run before you can walk. - Iron Man (and sometimes you gotta build the data pipeline before you can analyze the data! ) Our Philosophy: We believe in the power of data to transform lives, just like the Dragon Warrior transformed the Valley of Peace. Every line of code you write, every pipeline you build, and every insight you enable has the potential to change how marketers work and succeed. We're not just building data systems - we're building the future of digital marketing, one insight at a time. Your story may not have such a happy beginning, but that doesn't make you who you are. It is the rest of your story, who you choose to be. - Soothsayer What Makes You Special: We're looking for someone who embodies the spirit of both Captain America's unwavering dedication and Iron Man's innovative genius. You'll need the patience to build robust systems (like Cap's shield ) and the creativity to solve complex problems (like Tony's suit). Most importantly, you'll have the heart to make a real difference in marketers' lives. Inner peace... Inner peace... Inner peace... - Po (because we know data engineering can be challenging, but we've got your back! 
)

Key Responsibilities

Data Pipeline Development
- Design, build, and maintain robust, scalable data pipelines and ETL/ELT processes
- Develop data ingestion frameworks to collect data from various sources (databases, APIs, files, streaming sources)
- Implement data transformation and cleaning processes to ensure data quality and consistency
- Optimize data pipeline performance and reliability

Data Infrastructure Management
- Design and implement data warehouse architectures
- Manage and optimize database systems (SQL and NoSQL)
- Implement data lake solutions and data governance frameworks
- Ensure data security, privacy, and compliance with regulatory requirements

Data Modeling and Architecture
- Design and implement data models for analytics and reporting
- Create and maintain data dictionaries and documentation
- Develop data schemas and database structures
- Implement data versioning and lineage tracking

Data Quality, Security, and Compliance
- Ensure data quality, integrity, and consistency across all marketing data systems
- Implement and monitor data security measures to protect sensitive information
- Ensure privacy and compliance with regulatory requirements (e.g., GDPR, CCPA)
- Develop and enforce data governance policies and best practices

Collaboration and Support
- Work closely with Data Scientists, Analysts, and Business stakeholders
- Provide technical support for data-related issues and queries

Monitoring and Maintenance
- Implement monitoring and alerting systems for data pipelines
- Perform regular maintenance and optimization of data systems
- Troubleshoot and resolve data pipeline issues
- Conduct performance tuning and capacity planning

Required Qualifications

Experience
- 2+ years of experience in data engineering or related roles
- Proven experience with ETL/ELT pipeline development
- Experience with a cloud data platform (GCP)
- Experience with big data technologies

Technical Skills
- Programming Languages: Python, SQL, Golang (preferred)
- Databases: PostgreSQL, MySQL, Redis
- Big Data Tools: Apache Spark, Apache Kafka, Apache Airflow, DBT, Dataform
- Cloud Platforms: GCP (BigQuery, Dataflow, Cloud Run, Cloud SQL, Cloud Storage, Pub/Sub, App Engine, Compute Engine, etc.); a minimal Pub/Sub-to-BigQuery sketch appears after this posting
- Data Warehousing: Google BigQuery
- Data Visualization: Superset, Looker, Metabase, Tableau
- Version Control: Git, GitHub
- Containerization: Docker

Soft Skills
- Strong problem-solving and analytical thinking
- Excellent communication and collaboration skills
- Ability to work independently and in team environments
- Strong attention to detail and data quality
- Continuous learning mindset

Preferred Qualifications

Additional Experience
- Experience with real-time data processing and streaming
- Knowledge of machine learning pipelines and MLOps
- Experience with data governance and data catalog tools
- Familiarity with business intelligence tools (Tableau, Power BI, Looker, etc.)
- Experience using AI-powered tools (such as Cursor, Claude, Copilot, ChatGPT, Gemini, etc.) to accelerate coding, automate tasks, or assist in system design (we believe in running with the machine, not against it)

Interview Process
1. Initial Screening: Phone/video call with HR
2. Technical Interview: Deep dive into data engineering concepts
3. Final Interview: Discussion with senior leadership

Note: This job description is intended to provide a general overview of the position and may be modified based on organizational needs and candidate qualifications.

Our Team Culture

We are Groot.
- We work together, we grow together, we succeed together. We believe in: - Innovation First - Like Iron Man, we're always pushing the boundaries of what's possible - Team Over Individual - Like the Avengers, we're stronger together than apart - Continuous Learning - Like Po learning Kung Fu, we're always evolving and improving - Making a Difference - Like Captain America, we fight for what's right (in this case, better marketing!) Growth Journey There is no charge for awesomeness... or attractiveness. - Po Your journey with us will be like Po's transformation from noodle maker to Dragon Warrior: - Level 1 : Master the basics of our data infrastructure - Level 2: Build and optimize data pipelines - Level 3 : Lead complex data projects and mentor others - Level 4: Become a data engineering legend (with your own theme music! ) What We Promise I am Iron Man. - We promise you'll feel like a superhero every day! - Work that matters - Every pipeline you build helps real marketers succeed - Growth opportunities - Learn new technologies and advance your career - Supportive team - We've got your back, just like the Avengers - Work-life balance - Because even superheroes need rest!
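As referenced in the Technical Skills list above, here is a minimal Pub/Sub-to-BigQuery ingestion sketch for marketing events. The project, subscription, table, and field names are invented for illustration, and a production pipeline would more likely use Dataflow or a BigQuery subscription with schema management, retries, and dead-lettering.

```python
import json

from google.cloud import bigquery, pubsub_v1

# Placeholder identifiers; none of these come from the posting.
PROJECT = "my-project"
SUBSCRIPTION = f"projects/{PROJECT}/subscriptions/ad-events-sub"
TABLE = f"{PROJECT}.marketing.ad_events"

bq = bigquery.Client(project=PROJECT)


def handle(message: pubsub_v1.subscriber.message.Message) -> None:
    """Parse one ad event and stream it into BigQuery."""
    event = json.loads(message.data.decode("utf-8"))
    row = {
        "event_time": event.get("timestamp"),
        "campaign_id": event.get("campaign_id"),
        "channel": event.get("channel"),
        "cost_micros": event.get("cost_micros", 0),
    }
    errors = bq.insert_rows_json(TABLE, [row])  # streaming insert of a single row
    if errors:
        print("insert failed:", errors)
        message.nack()   # let Pub/Sub redeliver the message
    else:
        message.ack()


subscriber = pubsub_v1.SubscriberClient()
future = subscriber.subscribe(SUBSCRIPTION, callback=handle)
future.result()  # block until cancelled or an unrecoverable error occurs
```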

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

chennai, tamil nadu

On-site

Wipro Limited is a leading technology services and consulting company dedicated to developing innovative solutions that cater to the most complex digital transformation needs of clients. Our comprehensive range of consulting, design, engineering, and operational capabilities enables us to assist clients in achieving their most ambitious goals and establishing sustainable, future-ready businesses. With a global presence of over 230,000 employees and business partners spanning 65 countries, we remain committed to supporting our customers, colleagues, and communities in navigating an ever-evolving world. We are currently seeking an individual with hands-on experience in data modeling for both OLTP and OLAP systems. The ideal candidate should possess a deep understanding of Conceptual, Logical, and Physical data modeling, coupled with a robust grasp of indexing, partitioning, and data sharding, supported by practical experience. Experience in identifying and mitigating factors impacting database performance for near-real-time reporting and application interaction is essential. Proficiency in at least one data modeling tool, preferably DB Schema, is required. Additionally, functional knowledge of the mutual fund industry would be beneficial. Familiarity with GCP databases such as Alloy DB, Cloud SQL, and Big Query is preferred. The role demands willingness to work from our Chennai office, with a mandatory presence on-site at the customer site requiring five days of work per week. Cloud-PaaS-GCP-Google Cloud Platform is a mandatory skill set for this position. The successful candidate should have 5-8 years of relevant experience and should be prepared to contribute to the reimagining of Wipro as a modern digital transformation partner. We are looking for individuals who are inspired by reinvention - of themselves, their careers, and their skills. At Wipro, we encourage continuous evolution, reflecting our commitment to adapt to the changing world around us. Join us in a business driven by purpose, where you have the freedom to shape your own reinvention. Realize your ambitions at Wipro. We welcome applications from individuals with disabilities. For more information, please visit www.wipro.com.,
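The posting above stresses partitioning, sharding, and query performance on GCP databases such as BigQuery, AlloyDB, and Cloud SQL. As a hedged illustration only (the project, dataset, and mutual-fund-flavoured schema are hypothetical), creating a date-partitioned, clustered BigQuery table from Python might look like the following; partitioning plus clustering is the usual lever for pruning scans in near-real-time reporting workloads.

```python
from google.cloud import bigquery

client = bigquery.Client()  # assumes application-default credentials

# Hypothetical transactions fact table for illustration.
table_id = "my-project.funds.transactions"
schema = [
    bigquery.SchemaField("txn_id", "STRING", mode="REQUIRED"),
    bigquery.SchemaField("folio_id", "STRING"),
    bigquery.SchemaField("scheme_code", "STRING"),
    bigquery.SchemaField("amount", "NUMERIC"),
    bigquery.SchemaField("txn_date", "DATE", mode="REQUIRED"),
]

table = bigquery.Table(table_id, schema=schema)
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="txn_date",                      # partition by business date
)
table.clustering_fields = ["scheme_code", "folio_id"]  # prunes scans for common filters

client.create_table(table, exists_ok=True)
```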

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

chennai, tamil nadu

On-site

As a Software Engineer Practitioner at TekWissen in Chennai, you will be a crucial part of the team responsible for the development and maintenance of the Enterprise Data Platform. Your main focus will be on designing, building, and optimizing scalable data pipelines within the Google Cloud Platform (GCP) environment. Working with GCP Native technologies such as BigQuery, Dataform, Dataflow, and Pub/Sub, you will ensure data governance, security, and optimal performance. This role offers you the opportunity to utilize your full-stack expertise, collaborate with talented teams, and establish best practices for data engineering at the client. To be successful in this role, you should possess a Bachelor's or Master's degree in Computer Science, Engineering, or a related field of study. You should have at least 5 years of experience with a strong understanding of database concepts and multiple database technologies to optimize query and data processing performance. Proficiency in SQL, Python, and Java is essential, along with experience in programming engineering transformations in Python or similar languages. Additionally, you should have the ability to work effectively across different organizations, product teams, and business partners, along with knowledge of Agile (Scrum) methodology and experience in writing user stories. Your skills should include expertise in data architecture, data warehousing, and Google Cloud Platform tools such as BigQuery, Data Flow, Dataproc, Data Fusion, and others. Experience with Data Warehouse concepts, ETL processes, and data service ecosystems is crucial for this role. Strong communication skills are necessary for both internal team collaboration and external stakeholder interactions. Your role will involve advocating for user experience through empathetic stakeholder relationships and ensuring effective communication within the team and with stakeholders. As a Software Engineer Practitioner, you should have excellent communication, collaboration, and influence skills to energize the team. Your knowledge of data, software, architecture operations, data engineering, and data management standards will be valuable in this role. Hands-on experience in Python using libraries like NumPy and Pandas is required, along with extensive knowledge of GCP offerings and bundled services related to data operations. You should also have experience in re-developing and optimizing data operations, data science, and analytical workflows and products. TekWissen Group is an equal opportunity employer that supports workforce diversity, and we encourage applicants from diverse backgrounds to apply. Join us in shaping the future of data engineering and making a positive impact on lives, communities, and the planet.,
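Given the role's emphasis on Python (NumPy/Pandas) and BigQuery-centric pipelines, a minimal, assumption-laden sketch of a batch transform-and-load step is shown below. The dataset, table, and column names are invented, and a production job would typically run in Dataform, Dataflow, or an orchestrated workflow rather than a standalone script.

```python
import pandas as pd
from google.cloud import bigquery

client = bigquery.Client()  # assumes application-default credentials

# Hypothetical staging query; requires the db-dtypes/pyarrow extras for dataframes.
raw = client.query(
    "SELECT vin, reading_ts, sensor, value FROM `my-project.staging.telemetry`"
).to_dataframe()

# Simple cleanup and daily aggregation with pandas before publishing to the curated layer.
raw["reading_ts"] = pd.to_datetime(raw["reading_ts"], utc=True)
daily = (
    raw.dropna(subset=["value"])
       .assign(reading_date=lambda df: df["reading_ts"].dt.date)
       .groupby(["vin", "sensor", "reading_date"], as_index=False)["value"]
       .mean()
)

job = client.load_table_from_dataframe(
    daily,
    "my-project.curated.telemetry_daily",
    job_config=bigquery.LoadJobConfig(write_disposition="WRITE_TRUNCATE"),
)
job.result()  # wait for the load job to finish
```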

Posted 1 week ago

Apply

6.0 - 9.0 years

8 - 13 Lacs

Bengaluru

Work from Office

Developing cloud-based compliance state-of-the-art archive products to archive and retain real-time communications data in line with internal and external regulatory requirements. The product is developed in-house in an agile development setup based on business requirements from multiple stakeholder parties. By employing continuous development and deployment principles, the team is aiming to transition from project to product management to support the bank with robust compliance archive solutions for the next decade.

For a large global investment bank, we are looking for GCP-qualified cloud engineers to help with the FIC (fixed income and currencies) cloud migrations under Project Cirrus. Knowledge of Financial Services/FIC would be great, but the primary skills we need are in building, migrating, and deploying applications to GCP, Terraform module coding, Google infrastructure, and cloud-native services such as GCE, GKE, CloudSQL/Postgres, and logging and monitoring. Good written and spoken English is also important, as we would like these engineers to help with knowledge transfer to our existing development and support teams. We would like to place people alongside the engineers they'll be working with in the bank. You should have extensive experience with Google Cloud Platform (GCP), Kubernetes, and Docker. The role involves working closely with our development and operations teams to ensure seamless integration and deployment of applications.

Responsibilities
- Design, implement, and manage CI/CD pipelines on GCP.
- Automate infrastructure provisioning, configuration, and deployment using tools like Terraform and Ansible.
- Manage and optimize Kubernetes clusters for high availability and scalability.
- Containerize applications using Docker and manage container orchestration.
- Monitor system performance, troubleshoot issues, and ensure system reliability and security.
- Collaborate with development teams to ensure smooth and reliable operation of software and systems.
- Implement and manage logging, monitoring, and alerting solutions.
- Stay updated with the latest industry trends and best practices in DevOps and cloud technologies.

Skills

Must have
- 6 to 9 years of experience as a DevOps Engineer, with a minimum of 4 years of relevant experience in GCP.
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Strong expertise in Kubernetes and Docker.
- Experience with infrastructure as code (IaC) tools such as Terraform and Ansible.
- Proficiency in scripting languages like Python, Bash, or Go.
- Familiarity with CI/CD tools such as Jenkins, GitLab CI, or CircleCI.
- Knowledge of networking, security, and database management.
- Excellent problem-solving skills and attention to detail.

Nice to have
- Strong communication and collaboration skills.
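Because the role combines GKE, CI/CD, and Python scripting, here is a small hedged example of the kind of glue script such pipelines often include: polling a deployment until its rollout completes before promoting a release. The deployment name and polling window are placeholders, not details from the posting.

```python
import sys
import time

from kubernetes import client, config


def rollout_complete(apps: client.AppsV1Api, name: str, namespace: str = "default") -> bool:
    """True once all updated replicas of the deployment are available."""
    dep = apps.read_namespaced_deployment(name, namespace)
    spec, status = dep.spec, dep.status
    return (
        (status.observed_generation or 0) >= dep.metadata.generation
        and (status.updated_replicas or 0) == spec.replicas
        and (status.available_replicas or 0) == spec.replicas
    )


if __name__ == "__main__":
    config.load_kube_config()       # use load_incluster_config() when run inside the cluster
    apps_api = client.AppsV1Api()
    for _ in range(60):             # poll for up to roughly five minutes
        if rollout_complete(apps_api, "payments-api"):  # deployment name is a placeholder
            print("rollout complete")
            sys.exit(0)
        time.sleep(5)
    print("rollout did not complete in time")
    sys.exit(1)
```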

Posted 2 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.
