4.0 - 9.0 years
20 - 35 Lacs
pune, gurugram, bengaluru
Hybrid
Salary: 20 to 35 LPA. Experience: 5 to 8 years. Location: Gurgaon (Hybrid). Notice period: Immediate to 30 days. Roles and Responsibilities: Design, develop, test, deploy, and maintain large-scale data pipelines using GCP services such as BigQuery, Dataflow, Pub/Sub, Dataproc, and Cloud Storage. Collaborate with cross-functional teams to identify business requirements and design solutions that meet those needs. Develop complex SQL queries to extract insights from large datasets stored in Google Cloud SQL databases. Troubleshoot issues related to data processing workflows and provide timely resolutions. Desired Candidate Profile: 5-9 years of experience in Data Engineering, with expertise in GCP and BigQuery data engineering. Strong understanding of GCP platform administration, including Compute Engine (Dataproc), Kubernetes Engine (K8s), Cloud Storage, Cloud SQL, etc. Experience working on big data analytics projects involving ETL processes using tools like Airflow or similar technologies.
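For illustration, a minimal Python sketch of the kind of BigQuery extraction work this role describes. The project ID, dataset, and table are hypothetical placeholders, and it assumes the google-cloud-bigquery client library:

```python
# Minimal sketch: run an aggregate query against BigQuery from Python.
# Project, dataset, and table names below are hypothetical.
from google.cloud import bigquery

def daily_order_totals(project_id: str) -> list:
    """Run an aggregate query and return rows as plain dicts."""
    client = bigquery.Client(project=project_id)
    query = """
        SELECT order_date, SUM(amount) AS total_amount
        FROM `my_dataset.orders`          -- hypothetical table
        GROUP BY order_date
        ORDER BY order_date
    """
    return [dict(row) for row in client.query(query).result()]

if __name__ == "__main__":
    for row in daily_order_totals("my-gcp-project"):
        print(row)
```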
Posted 2 weeks ago
5.0 - 12.0 years
0 Lacs
pune, maharashtra
On-site
As a Senior GCP Developer, you will be responsible for designing, developing, and deploying scalable, secure, and efficient cloud-based applications on the Google Cloud Platform (GCP). Your primary tasks will include using languages like Java, Python, or Go to create cloud applications on GCP, developing and updating technical documentation, collaborating with various teams to address project needs, ensuring adherence to security and regulatory standards, addressing technical challenges, staying informed about the latest GCP features and services, and mentoring junior team members while offering technical support. You should possess 5-12 years of experience in developing cloud applications on GCP; a comprehensive understanding of GCP services such as Compute Engine, App Engine, Cloud Storage, Cloud SQL, and Cloud Datastore; proficiency in Java, Python, or Go; familiarity with GCP security, compliance, and regulatory protocols; experience with Agile development practices and Git version control; and exceptional problem-solving abilities and meticulous attention to detail. Your expertise in Java, Python, Go, GCP security, Agile development, Git, and cloud services like Compute Engine, App Engine, Cloud SQL, Cloud Datastore, and Cloud Storage will be instrumental in excelling in this role.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
As a full-time GCP DevOps Engineer at our company, you will be responsible for leading a team and utilizing your expertise in Google Cloud Platform (GCP). With your GCP Professional Cloud Architect or DevOps Engineer certification, along with a minimum of 5 years of experience, you will play a key role in optimizing our cloud infrastructure. Your hands-on experience with core GCP services such as Compute Engine, Cloud Storage, VPC, IAM, BigQuery, Cloud SQL, Cloud Functions, Operations Suite, and Terraform will be crucial in ensuring the efficiency and reliability of our systems. Additionally, your proficiency in CI/CD tools like Jenkins, GitLab CI/CD, and Cloud Build will help streamline our development processes. You should have a strong background in containerization technologies including Docker, Kubernetes (GKE), and Helm, and be adept at scripting in languages such as Python, Bash, or Go for automation purposes. Your familiarity with monitoring and observability tools like Prometheus, Grafana, and Cloud Monitoring will allow you to maintain the health and performance of our infrastructure. Furthermore, your proven knowledge of cloud security, compliance, and cost optimization strategies will be essential in safeguarding our systems and maximizing cost-efficiency. Experience with API Gateway / Apigee, service mesh, and microservices architecture will be an added advantage in this role. If you are a detail-oriented individual with a passion for GCP DevOps and a track record of successful team leadership, we invite you to apply for this exciting opportunity in either Hyderabad or Indore.
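As a rough sketch of the Python automation scripting mentioned above (the project ID is hypothetical, and this assumes the google-cloud-compute client library), an inventory script listing Compute Engine instances across zones might look like:

```python
# Minimal sketch: list all Compute Engine instances in a project,
# e.g. as the starting point for an automation or audit script.
# The project ID is a hypothetical placeholder.
from google.cloud import compute_v1

def list_instances(project_id: str) -> None:
    client = compute_v1.InstancesClient()
    # aggregated_list iterates over (zone, scoped instance list) pairs.
    for zone, scoped_list in client.aggregated_list(project=project_id):
        for instance in scoped_list.instances:
            print(f"{zone}\t{instance.name}\t{instance.status}")

if __name__ == "__main__":
    list_instances("my-gcp-project")
```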
Posted 2 weeks ago
8.0 - 15.0 years
0 Lacs
chennai, tamil nadu
On-site
As a Data Architecture professional with 8-15 years of experience, your primary responsibility will be to design and implement data-centric solutions on Google Cloud Platform (GCP). You will utilize various GCP tools such as BigQuery, Google Cloud Storage, Cloud SQL, Memorystore, Dataflow, Dataproc, Artifact Registry, Cloud Build, Cloud Run, Vertex AI, Pub/Sub, and GCP APIs to create efficient and scalable solutions. Your role will involve building ETL pipelines to ingest data from diverse sources into our system and developing data processing pipelines using programming languages like Java and Python for data extraction, transformation, and loading (ETL). You will be responsible for creating and maintaining data models to ensure efficient storage, retrieval, and analysis of large datasets. Additionally, you will deploy and manage both SQL and NoSQL databases like Bigtable, Firestore, or Cloud SQL based on project requirements. Your expertise will be crucial in optimizing data workflows for performance, reliability, and cost-effectiveness on the GCP infrastructure. Version control and CI/CD practices for data engineering workflows will be implemented by you to ensure reliable and efficient deployments. You will leverage GCP monitoring and logging tools to proactively identify and address performance bottlenecks and system failures. Troubleshooting and resolving issues related to data processing, storage, and retrieval will be part of your daily tasks. Addressing code quality issues throughout the development lifecycle using tools like SonarQube, Checkmarx, FOSSA, and Cycode will also be essential. Implementing security measures and data governance policies to maintain the integrity and confidentiality of data will be a critical aspect of your role. Collaboration with stakeholders to gather and define data requirements aligned with business objectives is key to success. You will be responsible for developing and maintaining documentation for data engineering processes to facilitate knowledge transfer and system maintenance. Participation in on-call rotations to address critical issues and ensure the reliability of data engineering systems will be required. Furthermore, providing mentorship and guidance to junior team members to foster a collaborative and knowledge-sharing environment will be an integral part of your role as a Data Architecture professional.
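As an illustrative sketch of the ingestion pattern this role describes, a streaming Apache Beam (Dataflow) pipeline that reads JSON events from Pub/Sub and writes them to BigQuery could look like the following. The project, topic, table, bucket, and schema are hypothetical placeholders:

```python
# Minimal sketch of a streaming Dataflow (Apache Beam) pipeline:
# Pub/Sub -> parse JSON -> BigQuery. All resource names are hypothetical.
import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def run():
    options = PipelineOptions(
        streaming=True,
        project="my-gcp-project",
        region="us-central1",
        runner="DataflowRunner",
        temp_location="gs://my-bucket/tmp",
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/my-gcp-project/topics/events")
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-gcp-project:analytics.events",
                schema="event_id:STRING,user_id:STRING,ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
        )

if __name__ == "__main__":
    run()
```

The same pipeline shape runs locally with the DirectRunner for testing, which is a common way to validate transforms before deploying to Dataflow.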
Posted 2 weeks ago
0.0 years
0 Lacs
noida, uttar pradesh, india
On-site
Job Description Required Skills: GCP Proficiency: Strong expertise in Google Cloud Platform (GCP) services and tools, including Compute Engine, Google Kubernetes Engine (GKE), Cloud Storage, Cloud SQL, Cloud Load Balancing, IAM, Google Workflows, Google Cloud Pub/Sub, App Engine, Cloud Functions, Cloud Run, API Gateway, Cloud Build, Cloud Source Repositories, Artifact Registry, Google Cloud Monitoring, Logging, and Error Reporting. Cloud-Native Applications: Experience in designing and implementing cloud-native applications, preferably on GCP. Workload Migration: Proven expertise in migrating workloads to GCP. CI/CD Tools and Practices: Experience with CI/CD tools and practices. Python and IaC: Proficiency in Python and Infrastructure as Code (IaC) tools such as Terraform. Responsibilities: Cloud Architecture and Design: Design and implement scalable, secure, and highly available cloud infrastructure solutions using Google Cloud Platform (GCP) services and tools such as Compute Engine, Kubernetes Engine, Cloud Storage, Cloud SQL, and Cloud Load Balancing. Cloud-Native Applications Design: Develop high-level architecture designs and guidelines for the development, deployment, and life-cycle management of cloud-native applications on GCP, ensuring they are optimized for security, performance, and scalability using services like App Engine, Cloud Functions, and Cloud Run. API Management: Develop and implement guidelines for securely exposing the interfaces of workloads running on GCP, along with granular access control using IAM, RBAC platforms, and API Gateway. Workload Migration: Lead the design and migration of on-premises workloads to GCP, ensuring minimal downtime and data integrity.
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
hyderabad, telangana
On-site
CloudWerx is seeking a dynamic Senior Engineer, Data to join our vibrant Data Analytics & Engineering Team in Hyderabad, India. As a Senior Cloud Data Engineer, you will play a crucial role in architecting and implementing state-of-the-art data solutions that drive business transformation. Working with a diverse client base, ranging from startups to industry leaders, you will tackle complex data challenges using the latest Google Cloud Platform (GCP) technologies. This role offers a unique blend of technical expertise and client interaction, allowing you to not only build sophisticated data systems but also consult directly with clients to shape their data strategies and witness the real-world impact of your work. If you are passionate about pushing the boundaries of cloud data engineering and want to be part of a team that is shaping the future of data-driven decision-making, this is an exciting opportunity to make a significant impact in a rapidly evolving field. At CloudWerx, we are committed to assembling a highly skilled team with both technical proficiency and business acumen. Each team member brings unique value to the business and our customers. We strive to attract the best talent in the industry and aim to set the standard for cloud consulting and business acceleration. The Senior Engineer, Data role is a full-time position based in our Hyderabad office. **Insight on Your Impact:** - Lead technical discussions with clients, translating complex technical concepts into clear strategies aligned with their business goals. - Architect and implement innovative data solutions that transform clients' businesses, enabling them to leverage their data assets fully. - Collaborate with cross-functional teams to design and optimize data pipelines processing petabytes of data, driving critical business decisions and insights. - Mentor junior engineers and contribute to the growth of our data engineering practice, fostering a culture of continuous learning and innovation. - Drive the adoption of cutting-edge GCP technologies, positioning our company and clients at the forefront of the cloud data revolution. - Identify opportunities for process improvements and automation, enhancing the efficiency and scalability of our consulting services. - Collaborate with sales and pre-sales teams to scope complex data engineering projects, ensuring technical feasibility and alignment with client needs. **Your Qualification, Your Influence:** To excel in this role, you should possess the following skills: - 4-8 years of experience in data engineering, with a focus on Google Cloud Platform technologies. - Expertise in GCP data services, especially tools like BigQuery, Cloud Composer, Cloud SQL, and Dataflow, with the ability to architect complex data solutions. - Proficiency in Python and SQL, capable of writing efficient, scalable, and maintainable code. - Experience in data modeling, database performance tuning, and cloud migration projects. - Strong communication skills to explain technical concepts to technical and non-technical stakeholders. - Ability to work directly with clients, understanding their needs and translating them into technical solutions. - Project management skills, including Agile methodologies and tools like Jira. - Leadership and mentoring abilities to nurture junior team members and promote knowledge sharing. - Commitment to staying updated with emerging technologies and best practices in cloud data engineering.
- Experience in a consulting or professional services environment, managing multiple projects and priorities. - Problem-solving skills to creatively overcome technical challenges. - Willingness to obtain relevant Google Cloud certifications if not already held. - Collaborative work approach in a remote environment, with strong time management and self-motivation. - Cultural sensitivity and adaptability to work effectively with diverse teams and clients across different time zones. **Our Diversity and Inclusion Commitment:** CloudWerx is dedicated to fostering a workplace that values and celebrates diversity, believing that an inclusive environment promotes innovation, collaboration, and mutual respect. We provide equal employment opportunities for individuals of all backgrounds and actively promote diversity across our organization's levels. We welcome diverse perspectives and identities, aiming to build a team that embraces inclusivity. Join us on our journey towards a more equitable and inclusive workplace. **Background Check Requirement:** All candidates for this position will undergo pre-employment background screening. Employment offers are contingent upon the successful completion of the background check. For more information on the background check requirements and process, please contact us directly. **Our Story:** CloudWerx is an engineering-focused cloud consulting firm rooted in Silicon Valley, at the forefront of hyper-scale and innovative technology. We help businesses in the cloud environment architect, migrate, optimize, secure, or reduce costs. With a team experienced in complex cloud environments at scale, we empower businesses to accelerate confidently.
Posted 2 weeks ago
2.0 - 4.0 years
0 Lacs
pune, maharashtra, india
On-site
Job description Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of a Senior Consultant Specialist. In this role, you will: Contribute to the database/schema designs used by development pods. Work in a hybrid environment covering public cloud and on-premises environments. Solve challenges posed by large-scale databases and demanding applications (such as low latency, high availability, growth in volumes, etc.). Guide the DevOps team in making correct database use and architecture decisions, leading database design and NFR requirements. Focus on technical debt and work towards continuous improvement while designing new solutions in the database world. Implement, maintain, and stabilize the on-premises and public cloud database infrastructure used by development pods. Work on automation of processes like database refresh using RMAN/expdp, restoration tests, and database monitoring and alerting using tools like Grafana, and create consumption dashboards for database growth. Make use of tools like Delphix to provision production-like databases in less time. Focus on managing cyber vulnerabilities on the database and Linux infrastructure by automating patching for both RDBMS and OS, applying the necessary database and Red Hat patches. Work on solving complex performance issues on databases by suggesting modifications to complex SQLs, indexes, logic changes, etc. Implement principles and practices for cost-effective infrastructure demands. Work with different teams in the bank to streamline database-related processes. Coordinate the service management aspects of the services, e.g. disaster recovery, vulnerability patching of servers, and timely back-up and recovery scenarios for servers used for databases. Build excellent relationships with Product, Product Delivery, Change and Operations teams to help create a one-team approach to planning and delivery for databases. Automate day-to-day activities pertaining to databases and the OS using tools like Ansible. Requirements To be successful in this role, you should meet the following requirements: 12+ years of experience in IT infrastructure management, managing complex database and Linux environments. Expert in Oracle RAC troubleshooting and Oracle RDBMS patching for standalone and RAC infrastructure. Expert in tools like Delphix, Ansible, Rundeck. Strong experience in Linux-based environments supporting database demands for microservices running on cloud platforms (Google Cloud and Alibaba Cloud) and Linux. Strong experience in Oracle infrastructure maintenance, DML/DDLs, performance tuning, storage management, backup/recovery, test support, production support, and monitoring/alerting on Oracle and PostgreSQL databases. Support and own production database issues. Should have worked on RDBMS like PostgreSQL and Oracle in design, build, maintenance and support.
Develop in-house knowledge in PostgreSQL set-up, admin tasks, performance tuning, storage management, backup/recovery, test support, production support and monitoring/alerting on PostgreSQL and Oracle databases. Should have knowledge of Google Cloud infrastructure support: Cloud SQL, BigQuery and Cloud Storage - standardize set-up, issue guidelines and best practices. Should have experience in automating data migrations from one RDBMS to another, on premises as well as in the cloud. Expert in distributed computing and managing large databases for OLAP and OLTP use cases. Should have implemented refresh automation using RMAN, expdp, Delphix, etc. Strong understanding of database design patterns available both on premises and in the cloud. Should have an in-depth understanding of tools like Grafana, Jenkins, Ansible, Rundeck, etc. Should have knowledge of different storage components like S3 buckets, GCS buckets, NFS, VxFS, etc. Good to have knowledge of data virtualization tools like Dremio, Denodo. Should be able to automate infrastructure tasks like patching, monitoring, and alerting. Should be a problem solver. You will be tested daily on how you approach problems and resolve them. Self-motivated and willing to learn new technologies and the business domain. Support the existing databases and platform, resolving issues and employing best practices from a production support perspective. Proven track record of working on significant projects from conception to completion. Ready to perform a hands-on developer role and take up new development effort in an individual capacity beyond providing technology leadership. Ability to make key strategic recommendations in the database area that are pragmatic and grounded in practical solutions. Proven ability to work across regions whilst maintaining a global perspective. Experience of delivering technology modernization (e.g. monolith to microservices), building service resiliency and scalable solutions in the database area. Proactive risk and issues management. You'll achieve more when you join HSBC. www.hsbc.com/careers HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website. Issued by - HSBC Software Development India
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
The role of a Technical-Specialist Big Data (PySpark) Developer based in Pune, India involves designing, developing, and unit testing software applications in an Agile development environment. As an Engineer, you are responsible for ensuring the delivery of high-quality, maintainable, scalable, and high-performing software applications. You are expected to have a strong technological background with good working experience in Python and Spark technology. The role requires hands-on experience and the ability to work independently with minimal guidance while also providing technical guidance and mentorship to junior team members. You will play a key role in enforcing design and development skills within the team and will actively apply Continuous Integration tools and practices as part of Deutsche Bank's digitalization journey. As part of the benefits package, you will enjoy a best-in-class leave policy, gender-neutral parental leaves, a childcare assistance benefit, sponsorship for industry-relevant certifications and education, an employee assistance program, comprehensive hospitalization insurance, accident and term life insurance, and complimentary health screening for individuals above 35 years. Your key responsibilities will include designing solutions for user stories, developing and unit-testing software, integrating, deploying, maintaining, and improving software, performing peer code reviews, participating in sprint activities and ceremonies, applying continuous integration best practices, collaborating with team members, reporting progress using Agile team management tools, managing task priorities and deliverables, ensuring the quality of solutions provided, and contributing to planning and continuous improvement activities. To be successful in this role, you should have at least 5 years of development experience in Big Data platforms, hands-on experience in Spark and Python programming, familiarity with BigQuery, Dataproc, Composer, Terraform, GKE, Cloud SQL, and Cloud Functions, experience in setting up and maintaining continuous build/integration infrastructure, knowledge of development platforms and SDLC processes and tools, strong analytical and communication skills, proficiency in English, the ability to work in virtual teams and matrixed organizations, a willingness to learn and keep pace with technical innovation, and the ability to share knowledge and expertise with team members. You will receive training and development opportunities, coaching and support from experts in your team, and a culture of continuous learning to aid your career progression. The company strives for a culture of empowerment, responsibility, commercial thinking, initiative, and collaboration, celebrating the successes of its people as part of the Deutsche Bank Group. Applications from all individuals are welcome, and the company promotes a positive, fair, and inclusive work environment. For further information about the company, please visit: [Deutsche Bank Company Website](https://www.db.com/company/company.htm)
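As a rough sketch of the Spark-and-Python work this role describes (bucket paths and column names are hypothetical, and it assumes a Dataproc-style environment where gs:// paths are readable via the GCS connector):

```python
# Minimal PySpark sketch: read CSV files from Cloud Storage, aggregate,
# and write the result back as Parquet. All paths and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-aggregation").getOrCreate()

orders = (spark.read.option("header", True)
          .csv("gs://my-bucket/raw/orders/*.csv"))

daily = (orders
         .withColumn("amount", F.col("amount").cast("double"))
         .groupBy("order_date")
         .agg(F.sum("amount").alias("total_amount")))

daily.write.mode("overwrite").parquet("gs://my-bucket/curated/daily_orders/")
```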
Posted 2 weeks ago
15.0 - 20.0 years
4 - 8 Lacs
chennai
Work from Office
Project Role: Data Engineer Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills: Agile Project Management. Good to have skills: Apache Spark. Minimum 7.5 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages. Your role will be pivotal in ensuring that data is accessible, reliable, and ready for analysis, contributing to informed decision-making across the organization. Roles & Responsibilities: - Expected to be an SME. - Collaborate with and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Mentor junior team members to enhance their skills and knowledge in data engineering practices. - Continuously evaluate and improve data processes to enhance efficiency and effectiveness. Professional & Technical Skills: - Must Have Skills: Proficiency in Agile Project Management. - Good To Have Skills: Experience with Apache Spark, Google Cloud SQL, Python (Programming Language). - Strong understanding of data pipeline architecture and design principles. - Experience with ETL tools and data integration techniques. - Familiarity with data quality frameworks and best practices. Additional Information: - The candidate should have a minimum of 7.5 years of experience in Agile Project Management. - This position is based in Chennai (Mandatory). - A 15 years full time education is required. Qualification: 15 years full time education
Posted 3 weeks ago
7.0 - 9.0 years
0 Lacs
bengaluru, karnataka, india
On-site
Key Responsibilities: 1. Development and Implementation: - Develop and deploy scalable, secure, and high-performance solutions using GCP services such as Compute Engine, Kubernetes Engine, BigQuery, Cloud Storage, Cloud SQL, and Cloud Pub/Sub. - Implement event-driven, distributed, and decoupled architectures. - Build and manage microservices using an API-first approach. 2. Code Quality and Best Practices: - Write clean, efficient, and reusable code following best practices. - Implement CI/CD pipelines to automate build, test, and deployment processes. - Perform code reviews to maintain high development standards and ensure adherence to best practices. 3. Collaboration and Coordination: - Collaborate with architects, product owners, and designers to understand technical specifications and project requirements. - Work closely with other developers, data engineers, and QA teams to deliver end-to-end solutions. - Troubleshoot and resolve technical issues during development and post-deployment. 4. Performance Optimization: - Analyze system performance, identify bottlenecks, and optimize for scalability and cost-effectiveness. - Ensure efficient processing in real-time and batch workflows. 5. Continuous Learning and Innovation: - Stay updated on the latest GCP services, tools, and industry trends. - Suggest and implement new tools and technologies to improve development efficiency and solution quality. --- Technical Expertise: Frameworks: Event-driven and distributed frameworks; microservices development. Front-End Technologies: Experience with React.js or Express.js. Programming Languages: Expertise in Node.js (preferred), SQL, Java, or Go (GoLang) for backend development. Workflow Orchestration: Hands-on experience with Apache Airflow/Composer. Data Processing: Real-time and batch processing systems, EDW, and BigQuery. Containerization and Automation: Proficiency in Kubernetes-based deployments and CI/CD tools like Jenkins, GitLab, or similar. --- Qualifications and Experience: Bachelor's degree in Computer Science, Information Technology, or a related field. 7+ years of experience in cloud-based solution development, with at least 3 years specializing in GCP. Proven track record of implementing solutions in Infrastructure Modernization, Data Management, Analytics, or Application Modernization. Strong experience with workflow orchestration, containerization, and API-based development. Excellent problem-solving skills and attention to detail. --- Preferred Skills: Familiarity with DevOps practices, including CI/CD pipelines and automated deployments. GCP Certifications (e.g., Professional Cloud Developer, Professional Cloud DevOps Engineer) are a plus.
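To illustrate the event-driven, API-first pattern described above (the posting prefers Node.js; this is a comparable Python sketch, and the route, payload fields, and downstream handling are hypothetical), a Cloud Run-style HTTP endpoint receiving Pub/Sub push messages might look like:

```python
# Minimal sketch of an event-driven microservice endpoint that receives
# Pub/Sub push messages over HTTP (e.g. behind Cloud Run). Uses Flask;
# the route and message fields are hypothetical.
import base64
import json
from flask import Flask, request

app = Flask(__name__)

@app.route("/pubsub/push", methods=["POST"])
def handle_push():
    envelope = request.get_json()
    if not envelope or "message" not in envelope:
        return "Bad Request: no Pub/Sub message received", 400
    data = envelope["message"].get("data", "")
    payload = base64.b64decode(data).decode("utf-8") if data else ""
    event = json.loads(payload) if payload else {}
    # Hypothetical downstream handling, e.g. enqueue for batch processing.
    print(f"Received event: {event}")
    return "", 204

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```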
Posted 3 weeks ago
2.0 - 5.0 years
5 - 9 Lacs
hyderabad
Work from Office
Project Role: Application Developer Project Role Description: Design, build and configure applications to meet business process and application requirements. Must have skills: Google Cloud SQL. Good to have skills: NA. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications are optimized for performance and usability. You will also engage in problem-solving activities, providing support and enhancements to existing applications while keeping abreast of the latest technologies and methodologies in application development. Roles & Responsibilities: - Expected to be an SME. - Collaborate with and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Mentor junior team members to enhance their skills and knowledge. - Continuously evaluate and improve application performance and user experience. Professional & Technical Skills: - Must Have Skills: Proficiency in Google Cloud SQL. - Good To Have Skills: Experience with cloud-based application development. - Strong understanding of database management and optimization techniques. - Familiarity with application programming interfaces and integration methods. - Experience in developing scalable and secure applications. Additional Information: - The candidate should have a minimum of 5 years of experience in Google Cloud SQL. - This position is based at our Hyderabad office. - A 15 years full time education is required. Qualification: 15 years full time education
Posted 3 weeks ago
5.0 - 8.0 years
10 - 14 Lacs
pune
Work from Office
Project Role: Application Lead Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have skills: Data Engineering. Good to have skills: NA. Minimum 7.5 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team in implementing effective solutions. You will also engage in strategic planning sessions to align project goals with organizational objectives, ensuring that all stakeholders are informed and involved in the development process. Your role will require a balance of technical expertise and leadership skills to drive the project forward successfully. Roles & Responsibilities: - Expected to be an SME. - Collaborate with and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Facilitate knowledge sharing sessions to enhance team capabilities. - Monitor project progress and implement necessary adjustments to meet deadlines. Professional & Technical Skills: - Must Have Skills: Proficiency in Data Engineering. - Strong understanding of data pipeline architecture and ETL processes. - Experience with cloud platforms such as AWS or Azure. - Familiarity with data warehousing solutions and big data technologies. - Ability to work with various data storage solutions, including SQL and NoSQL databases. Additional Information: - The candidate should have a minimum of 7.5 years of experience in Data Engineering. - This position is based in Pune. - A 15 years full time education is required. Qualification: 15 years full time education
Posted 3 weeks ago
7.0 - 12.0 years
5 - 15 Lacs
bengaluru
Work from Office
Role - GCP Staff Data Engineer Experience: 8 - 13 years Preferred - Data Engineering Background Location - Bangalore, Chennai, Hyderabad, Kolkata, Pune, Gurgaon Job Requirement: Has implemented and architected solutions on Google Cloud Platform using GCP components. Experience with Apache Beam/Google Dataflow/Apache Spark in creating end-to-end data pipelines. Experience in some of the following: Python, Hadoop, Spark, SQL, BigQuery, Bigtable, Cloud Storage, Datastore, Spanner, Cloud SQL, Machine Learning. Experience programming in Java, Python, etc. Expertise in at least two of these technologies: relational databases, analytical databases, NoSQL databases. Certification as a Google Professional Data Engineer / Solution Architect is a major advantage. Skills Required: 8+ years' experience in IT or professional services experience in IT delivery or large-scale IT analytics projects. Candidates must have expert knowledge of Google Cloud Platform; other cloud platforms are nice to have. Expert knowledge in SQL development. Expertise in building data integration and preparation tools using cloud technologies (like SnapLogic, Google Dataflow, Cloud Dataprep, Python, etc.). Experience with Apache Beam/Google Dataflow/Apache Spark in creating end-to-end data pipelines. Experience in some of the following: Python, Hadoop, Spark, SQL, BigQuery, Bigtable, Cloud Storage, Datastore, Spanner, Cloud SQL, Machine Learning. Experience programming in Java, Python, etc. Identify downstream implications of data loads/migration (e.g., data quality, regulatory, etc.). Implement data pipelines to automate the ingestion, transformation, and augmentation of data sources, and provide best practices for pipeline operations. Capability to work in a rapidly changing business environment and to enable simplified user access to massive data by building scalable data solutions. Advanced SQL writing and experience in data mining (SQL, ETL, data warehouse, etc.) and using databases in a business environment with complex datasets. Required Skills: GCP DE experience, BigQuery, SQL, Cloud Composer/Python, Cloud Functions, Dataproc + PySpark, Python injection, Dataflow + Pub/Sub
Posted 3 weeks ago
3.0 - 8.0 years
5 - 15 Lacs
pune, bengaluru, delhi / ncr
Work from Office
Educational Requirements MCA, MSc, MTech, Bachelor of Engineering, BCA, BSc Service Line Data & Analytics Unit Responsibilities A day in the life of an Infoscion • As part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions and facilitate deployment resulting in client delight. • You will develop a proposal by owning parts of the proposal document and by giving inputs in solution design based on areas of expertise. • You will plan the activities of configuration, configure the product as per the design, conduct conference room pilots and will assist in resolving any queries related to requirements and solution design. • You will conduct solution/product demonstrations and POC/Proof of Technology workshops and prepare effort estimates which suit the customer's budgetary requirements and are in line with the organization's financial guidelines. • Actively lead small projects and contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you! Additional Responsibilities: • Ability to develop value-creating strategies and models that enable clients to innovate, drive growth and increase their business profitability • Good knowledge of software configuration management systems • Awareness of the latest technologies and industry trends • Logical thinking and problem-solving skills along with an ability to collaborate • Understanding of the financial processes for various types of projects and the various pricing models available • Ability to assess current processes, identify improvement areas and suggest technology solutions • Knowledge of one or two industry domains • Client interfacing skills • Project and team management Technical and Professional Requirements: Technology->Cloud Platform->GCP Data Analytics->Looker, Technology->Cloud Platform->GCP Database->Google BigQuery Preferred Skills: Technology->Cloud Platform->Google Big Data, Technology->Cloud Platform->GCP Data Analytics
Posted 3 weeks ago
10.0 - 15.0 years
8 - 18 Lacs
bengaluru
Work from Office
Project description Developing state-of-the-art cloud-based compliance archive products to archive and retain real-time communications data in line with internal and external regulatory requirements. The product is developed in-house in an agile development setup based on business requirements from multiple stakeholder parties. By employing continuous development and deployment principles, the team is aiming to transition from project to product management to support the bank with robust compliance archive solutions for the next decade. For a large global investment bank, we are looking for GCP-qualified cloud engineers to help with the FIC (fixed income and currencies) cloud migrations under Project Cirrus. Knowledge of Financial Services/FIC would be great, but the primary skills we need are in building, migrating, and deploying applications to GCP, Terraform module coding, Google infrastructure, cloud-native services such as GCE, GKE, CloudSQL/Postgres, logging and monitoring, etc., and good written and spoken English, as we would like these engineers to help with knowledge transfer to our existing development and support teams. We would like to place people alongside the engineers they'll be working with in the bank. Responsibilities Develop solutions following established technical design, application development standards, and quality processes in projects. Assess the impact of changes in functional requirements on the technical design. Perform independent code reviews and execute unit tests on modules developed by yourself and other junior team members on the project. Write well-designed, efficient, and testable code. Interact with other stakeholders, not limited to end-user clients, the project manager or scrum master, business analysts, offshore development, testing, and other cross-functional teams. Skills Must have: 10+ years of Java development experience, with 3+ years of architecture design experience. BS/MS degree in Computer Science, Software Engineering, or a related subject. Google Cloud Platform experience. Comfortable with practicing TDD and pair programming. Well-versed in DevOps. Good knowledge of object-oriented design principles and hands-on experience with object-oriented programming. Good knowledge of the Java standard library. Hands-on experience with Spring and/or Spring Boot is a big plus.
Experience in agile software development. Well versed with solution architecture and principles including, but not limited to, SOLID, Hexagonal (Ports and Adapters), cloud-native, and microservices patterns. Experience in large enterprise system integrations and architecture. Strong understanding of, and hands-on experience with, designing scalable, highly available, reliable, resilient, secure, and performant systems. Should have good presentation, documentation, and communication skills. Knowledge of Linux is a plus. Knowledge of cloud platforms is a plus. Desirable to have knowledge of the TOGAF and Zachman frameworks. Good to have an understanding of application security frameworks and standards, e.g., OWASP, NIST. 4+ progressive years of experience in building and implementing model-driven, enterprise-level business solutions and applications in PRPC. Excellent time management and organization skills, as well as the ability to manage multiple competing priorities. Exceptional interpersonal skills and the ability to communicate, partner, and collaborate. Dedication to achieving outstanding customer results with a team-oriented drive and a demonstrated ability to lead by example. Exposure to a variety of technologies, including object-oriented techniques/principles, database design, and application and web servers. Aptitude to pick up new concepts and technology rapidly; ability to explain them to both business and IT stakeholders. Ability to match technology solutions to customer needs. Nice to have Banking domain knowledge.
Posted 3 weeks ago
3.0 - 8.0 years
12 - 16 Lacs
bengaluru
Work from Office
Job Summary Synechron is seeking a detail-oriented and analytical Python Developer to join our data team. In this role, you will design, develop, and optimize data pipelines, analysis tools, and workflows that support key business and analytical functions. Your expertise in data manipulation, database management, and scripting will enable the organization to enhance data accuracy, efficiency, and insights. This position offers an opportunity to work closely with data analysts and scientists to build scalable, reliable data solutions that contribute directly to business decision-making and operational excellence. Software Requirements Required Skills: Python (version 3.7 or higher) with experience in data processing and scripting Pandas library (experience in large dataset manipulation and analysis) SQL (proficiency in writing performant queries for data extraction and database management) Data management tools and databases such as MySQL, PostgreSQL, or similar relational databases Preferred Skills: Experience with cloud data services (AWS RDS, Azure SQL, GCP Cloud SQL) Knowledge of additional Python libraries such as NumPy, Matplotlib, or Jupyter Notebooks for data analysis and visualization Data pipeline orchestration tools (e.g., Apache Airflow) Version control tools like Git Overall Responsibilities Develop, test, and maintain Python scripts for ETL processes and data workflows Utilize Pandas to clean, analyze, and transform large datasets efficiently Write, optimize, and troubleshoot SQL queries for data extraction, updates, and management Collaborate with data analysts and scientists to create data-driven analytic tools and solutions Automate repetitive data workflows to increase operational efficiency and reduce errors Maintain detailed documentation of data processes, pipelines, and procedures Troubleshoot data discrepancies, pipeline failures, and database-related issues efficiently Support ongoing data quality initiatives by identifying and resolving data inconsistencies Technical Skills (By Category) Programming Languages: Required: Python (3.7+), proficiency with data manipulation and scripting Preferred: Additional scripting languages such as R or familiarity with other programming environments Databases/Data Management: Relational databases: MySQL, PostgreSQL, or similar Experience with query optimization and database schema design Cloud Technologies: Preferred: Basic experience with cloud data services (AWS, Azure, GCP) for data storage and processing Frameworks and Libraries: Pandas, NumPy, Matplotlib, Jupyter Notebooks for data analysis and visualization Airflow or similar orchestration tools (preferred) Development Tools and Methodologies: Git or similar version control tools Agile development practices and collaborative workflows Security Protocols: Understanding of data privacy, confidentiality, and secure coding practices Experience Requirements 3+ years of experience in Python development with a focus on data processing and management Proven hands-on experience in building and supporting ETL workflows and data pipelines Strong experience working with SQL and relational databases Demonstrated ability to analyze and manipulate large datasets efficiently Familiarity with cloud data services is advantageous but not mandatory Day-to-Day Activities Write and enhance Python scripts to perform ETL, data transformation, and automation tasks Design and optimize SQL queries for data extraction and updates Collaborate with data analysts, scientists, and team members during daily 
stand-ups and planning sessions Investigate and resolve data quality issues or pipeline failures promptly Document data pipelines, workflows, and processes for clarity and future maintenance Assist in developing analytical tools and dashboards for business insights Review code changes through peer reviews and ensure adherence to best practices Participate in continuous improvement initiatives related to data workflows and processing techniques Qualifications Bachelors degree in Computer Science, Data Science, Information Technology, or a related field Relevant certifications or training in Python, data engineering, or database management are a plus Proven track record of working on data pipelines, analysis, and automation projects Professional Competencies Strong analytical and problem-solving skills with attention to detail Effective communication skills, able to collaborate across teams and explain technical concepts clearly Ability to work independently and prioritize tasks effectively Continuous learner, eager to adopt new tools, techniques, and best practices in data processing Adaptability to changing project requirements and proactive in identifying process improvements Focused on delivering high-quality work with a results-oriented approach
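As a minimal sketch of the ETL pattern this role describes (connection strings, table names, and columns are hypothetical; it assumes pandas, SQLAlchemy, and a suitable database driver are installed):

```python
# Minimal ETL sketch: extract from a relational source, transform with pandas,
# and load into a target table. All connection details and names are hypothetical.
import pandas as pd
from sqlalchemy import create_engine

source = create_engine("postgresql://user:pass@source-host/sales")
target = create_engine("postgresql://user:pass@warehouse-host/analytics")

def run_etl() -> None:
    # Extract
    orders = pd.read_sql("SELECT order_id, order_date, amount FROM orders", source)
    # Transform: clean types and aggregate per day
    orders["order_date"] = pd.to_datetime(orders["order_date"])
    daily = (orders.dropna(subset=["amount"])
                   .groupby(orders["order_date"].dt.date)["amount"]
                   .sum()
                   .reset_index(name="total_amount"))
    # Load
    daily.to_sql("daily_order_totals", target, if_exists="replace", index=False)

if __name__ == "__main__":
    run_etl()
```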
Posted 3 weeks ago
5.0 - 7.0 years
0 Lacs
noida, uttar pradesh, india
On-site
Who We Are Zinnia is the leading technology platform for accelerating life and annuities growth. With innovative enterprise solutions and data insights, Zinnia simplifies the experience of buying, selling, and administering insurance products. All of which enables more people to protect their financial futures. Our success is driven by a commitment to three core values: be bold, team up, deliver value - and that we do. Zinnia has over $180 billion in assets under administration, serves 100+ carrier clients, 2500 distributors and partners, and over 2 million policyholders. Who You Are Our data team serves Zinnia through data engineering, data analysis, and data science. Our goal is to help uncover opportunities and make decisions with data. We partner with all department stakeholders across the company to develop deeper predictors of behavior, develop insights that drive business strategy, and build solutions to optimize our internal and external experiences. What You'll Do Oversee technological choices and the implementation of data pipelines and our warehousing philosophy. Execute and serve as lead and/or SME on cross-organizational and cross-divisional projects automating our data value chain processes. Promote technical best practices throughout the data organization. Design data architecture that is simple and maintainable while enabling Data Analysts, Data Scientists, and stakeholders to efficiently work with data. Mentor data team members in architecture and coding techniques. Serve as a source of knowledge for the Data Engineering team on process improvement, automation, and new technologies available to enable best-in-class timeliness and data coverage. Design data pipelines utilizing ETL tools, event-driven software, and other streaming software. Partner with both data scientists and engineers to bring our amazing concepts to reality. This requires learning to speak the language of statisticians as well as software engineers. Ensure reliability in data pipelines and enforce data governance, security, and protection of our customers' information while balancing tech debt. Demonstrate innovation, customer focus, and experimentation mindsets. Partner with product and engineering teams to design data models for downstream data maximization. Evaluate and champion new engineering tools that help us move faster and scale our team. What You'll Need A technical Bachelor's/Master's degree with 5+ years of experience across Data Engineering (data pipelining, warehousing, ETL tools, etc.). Extensive experience with data engineering techniques, Python, and using SQL. Familiarity and working knowledge of Airflow and dbt. You are comfortable with, and have expertise in, data engineering tooling such as Jira, Git, Buildkite, Terraform, Airflow, dbt, and containers, as well as the GCP suite, Terraform, Kubernetes, and Cloud Functions. You understand standard ETL patterns, modern data warehousing ideas such as data mesh or data vaulting, and data quality practices regarding test-driven design and data observability.
You enjoy being a high-level architect sometimes, and a low-level coder sometimes. You are passionate about all things data: big data, small data, moving and transforming it, its quality, its accessibility, and delivering value from it to internal and external clients. You want ownership to solve for, and lead a team to deliver, modern and efficient data pipeline components. You are passionate about a culture of learning and teaching. You love challenging yourself to constantly improve, and sharing your knowledge to empower others. You like to take risks when looking for novel solutions to complex problems. If faced with roadblocks, you continue to reach higher to make greatness happen. Technologies you will use: Python for data pipelining and automation. Airbyte for ETL purposes. Google Cloud Platform, Terraform, Kubernetes, Cloud SQL, Cloud Functions, BigQuery, Datastore, and more: we keep adopting new tools as we grow! Airflow and dbt for data pipelining. Tableau and PowerBI for data visualization and consumer-facing dashboards. WHAT'S IN IT FOR YOU At Zinnia, you collaborate with smart, creative professionals who are dedicated to delivering cutting-edge technologies, deeper data insights, and enhanced services to transform how insurance is done. Visit our website at www.zinnia.com for more information. Apply by completing the online application on the careers section of our website. We are an Equal Opportunity employer committed to a diverse workforce. We do not discriminate based on race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability.
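As an illustrative sketch of the Airflow-plus-dbt pattern mentioned above (DAG ID, schedule, dbt project path, and the load step are all hypothetical; it assumes Airflow 2.x):

```python
# Minimal sketch of a daily Airflow DAG that loads data and then runs dbt models.
# All IDs, paths, and the load logic are hypothetical placeholders.
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator

def extract_and_load(**context):
    # Hypothetical extract/load step, e.g. pulling from an API into BigQuery.
    print(f"Loading data for {context['ds']}")

with DAG(
    dag_id="daily_warehouse_build",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    load = PythonOperator(task_id="extract_and_load",
                          python_callable=extract_and_load)
    dbt_run = BashOperator(task_id="dbt_run",
                           bash_command="cd /opt/dbt/project && dbt run")
    load >> dbt_run
```

Keeping the load step and the dbt transformation as separate tasks makes each independently retryable, which is a common reason for this split.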
Posted 3 weeks ago
4.0 - 6.0 years
0 Lacs
bengaluru, karnataka, india
On-site
Software Engineer III, Fullstack (Backend Heavy) Team And Responsibilities If you are someone who is passionate about the art of programming, cares about clean and semantic code, and keeps a tab on new developments in technology, then you are the right fit for us! What you'll do: Design and develop software with high quality and take ownership. Work collaboratively with product management. Participate in a full development life cycle including planning and code reviews. Build solutions that can easily scale to the demands of Vimeo traffic bursts. Ensure the best technical design and approach with an aim for continuous improvement. Set high technical standards. Skill and knowledge you should possess: B.Tech / M.Tech in Computer Science or an equivalent degree. Minimum 4 years of backend development experience with GoLang/PHP/Java and other languages (PHP preferred). Minimum 1 year of experience in React. Strong troubleshooting, debugging, and testing skills. Very good at algorithms, data structures, time and space complexities, and problem solving in general. Very good knowledge of the object-oriented programming paradigm and design patterns. Sound knowledge of cloud technologies and concepts like CDN, caching, rate limiting, latency, throughput. Sound knowledge of database and caching technologies like MySQL, Redis, Cloud SQL, Memcache. Good knowledge of designing systems and analyzing trade-offs between different choices. Nice to have: exposure to various authorization and authentication models and technologies like RBAC, ReBAC, SSO, SCIM, etc. Nice to have: a basic understanding of infrastructure technologies like Varnish, HAProxy, and the like. Willingness to learn and experiment with new technology. About Us: Vimeo (NASDAQ: VMEO) is the world's most innovative video experience platform. We enable anyone to create high-quality video experiences to better connect and bring ideas to life. We proudly serve our community of millions of users from creative storytellers to globally distributed teams at the world's largest companies whose videos receive billions of views each month. Learn more at www.vimeo.com. Vimeo is headquartered in New York City with offices around the world. At Vimeo, we believe our impact is greatest when our workforce of passionate, dedicated people represents our diverse and global community. We're proud to be an equal opportunity employer where diversity, equity, and inclusion are championed in how we build our products, develop our leaders, and strengthen our culture.
Posted 3 weeks ago
4.0 - 8.0 years
15 - 19 Lacs
bengaluru
Work from Office
Bengaluru, India DevOps BCM Industry 25/04/2025 Project description Developing state-of-the-art cloud-based compliance archive products to archive and retain real-time communications data in line with internal and external regulatory requirements. The product is developed in-house in an agile development setup based on business requirements from multiple stakeholder parties. By employing continuous development and deployment principles, the team is aiming to transition from project to product management to support the bank with robust compliance archive solutions for the next decade. For a large global investment bank, we are looking for GCP-qualified cloud engineers to help with the FIC (fixed income and currencies) cloud migrations under Project Cirrus. Knowledge of Financial Services/FIC would be great, but the primary skills we need are in building, migrating, and deploying applications to GCP, Terraform module coding, Google infrastructure, cloud-native services such as GCE, GKE, CloudSQL/Postgres, logging and monitoring, etc., and good written and spoken English, as we would like these engineers to help with knowledge transfer to our existing development and support teams. We would like to place people alongside the engineers they'll be working with in the bank. You should have extensive experience with Google Cloud Platform (GCP), Kubernetes, and Docker. This role involves working closely with our development and operations teams to ensure seamless integration and deployment of applications. Responsibilities Design, implement, and manage CI/CD pipelines on GCP. Automate infrastructure provisioning, configuration, and deployment using tools like Terraform and Ansible. Manage and optimize Kubernetes clusters for high availability and scalability. Containerize applications using Docker and manage container orchestration. Monitor system performance, troubleshoot issues, and ensure system reliability and security. Collaborate with development teams to ensure smooth and reliable operation of software and systems. Implement and manage logging, monitoring, and alerting solutions. Stay updated with the latest industry trends and best practices in DevOps and cloud technologies. Skills Must have: 6 to 9 years of experience as a DevOps Engineer and a minimum of 4 years of relevant experience in GCP. Bachelor's degree in Computer Science, Engineering, or a related field. Strong expertise in Kubernetes and Docker. Experience with infrastructure as code (IaC) tools such as Terraform and Ansible. Proficiency in scripting languages like Python, Bash, or Go. Familiarity with CI/CD tools such as Jenkins, GitLab CI, or CircleCI. Knowledge of networking, security, and database management. Excellent problem-solving skills and attention to detail. Nice to have Strong communication and collaboration skills. Other Languages: English (C2 Proficient). Seniority: Senior
Posted 3 weeks ago
3.0 - 5.0 years
5 - 8 Lacs
chennai
Work from Office
" PLEASE READ THE JOB DESTCRIPTION AND APPLY" Data Engineer Job Description Position Overview Yesterday is history, tomorrow is a mystery, but today is a gift. That's why we call it the present. - Master Oogway Join CustomerLabs' dynamic data team as a Data Engineer and play a pivotal role in transforming raw marketing data into actionable insights that power our digital marketing platform. As a key member of our data infrastructure team, you will design, develop, and maintain robust data pipelines, data warehouses, and analytics platforms that serve as the backbone of our digital marketing product development. Sometimes the hardest choices require the strongest wills. - Thanos (but we promise, our data decisions are much easier! ) In this role, you will collaborate with cross-functional teams including Data Scientists, Product Managers, and Marketing Technology specialists to ensure seamless data flow from various marketing channels, ad platforms, and customer touchpoints to our analytics dashboards and reporting systems. You'll be responsible for building scalable, reliable, and efficient data solutions that can handle high-volume marketing data processing and real-time campaign analytics. What You'll Do: - Design and implement enterprise-grade data pipelines for marketing data ingestion and processing - Build and optimize data warehouses and data lakes to support digital marketing analytics - Ensure data quality, security, and compliance across all marketing data systems - Create data models and schemas that support marketing attribution, customer journey analysis, and campaign performance tracking - Develop monitoring and alerting systems to maintain data pipeline reliability for critical marketing operations - Collaborate with product teams to understand digital marketing requirements and translate them into technical solutions Why This Role Matters: I can do this all day. - Captain America (and you'll want to, because this role is that rewarding!) You'll be the backbone behind the data infrastructure that powers CustomerLabs' digital marketing platform, making marketers' lives easier and better. Your work directly translates to smarter automation, clearer insights, and more successful campaigns - helping marketers focus on what they do best while we handle the complex data heavy lifting. Sometimes you gotta run before you can walk. - Iron Man (and sometimes you gotta build the data pipeline before you can analyze the data! ) Our Philosophy: We believe in the power of data to transform lives, just like the Dragon Warrior transformed the Valley of Peace. Every line of code you write, every pipeline you build, and every insight you enable has the potential to change how marketers work and succeed. We're not just building data systems - we're building the future of digital marketing, one insight at a time. Your story may not have such a happy beginning, but that doesn't make you who you are. It is the rest of your story, who you choose to be. - Soothsayer What Makes You Special: We're looking for someone who embodies the spirit of both Captain America's unwavering dedication and Iron Man's innovative genius. You'll need the patience to build robust systems (like Cap's shield ) and the creativity to solve complex problems (like Tony's suit). Most importantly, you'll have the heart to make a real difference in marketers' lives. Inner peace... Inner peace... Inner peace... - Po (because we know data engineering can be challenging, but we've got your back! 
Key Responsibilities Data Pipeline Development - Design, build, and maintain robust, scalable data pipelines and ETL/ELT processes - Develop data ingestion frameworks to collect data from various sources (databases, APIs, files, streaming sources) - Implement data transformation and cleaning processes to ensure data quality and consistency - Optimize data pipeline performance and reliability Data Infrastructure Management - Design and implement data warehouse architectures - Manage and optimize database systems (SQL and NoSQL) - Implement data lake solutions and data governance frameworks - Ensure data security, privacy, and compliance with regulatory requirements Data Modeling and Architecture - Design and implement data models for analytics and reporting - Create and maintain data dictionaries and documentation - Develop data schemas and database structures - Implement data versioning and lineage tracking Data Quality, Security, and Compliance - Ensure data quality, integrity, and consistency across all marketing data systems - Implement and monitor data security measures to protect sensitive information - Ensure privacy and compliance with regulatory requirements (e.g., GDPR, CCPA) - Develop and enforce data governance policies and best practices Collaboration and Support - Work closely with Data Scientists, Analysts, and Business stakeholders - Provide technical support for data-related issues and queries Monitoring and Maintenance - Implement monitoring and alerting systems for data pipelines - Perform regular maintenance and optimization of data systems - Troubleshoot and resolve data pipeline issues - Conduct performance tuning and capacity planning Required Qualifications Experience - 2+ years of experience in data engineering or related roles - Proven experience with ETL/ELT pipeline development - Experience with a cloud data platform (GCP) - Experience with big data technologies Technical Skills - Programming Languages: Python, SQL, Golang (preferred) - Databases: PostgreSQL, MySQL, Redis - Big Data Tools: Apache Spark, Apache Kafka, Apache Airflow, DBT, Dataform - Cloud Platforms: GCP (BigQuery, Dataflow, Cloud Run, Cloud SQL, Cloud Storage, Pub/Sub, App Engine, Compute Engine, etc.) - Data Warehousing: Google BigQuery - Data Visualization: Superset, Looker, Metabase, Tableau - Version Control: Git, GitHub - Containerization: Docker Soft Skills - Strong problem-solving and analytical thinking - Excellent communication and collaboration skills - Ability to work independently and in team environments - Strong attention to detail and data quality - Continuous learning mindset Preferred Qualifications Additional Experience - Experience with real-time data processing and streaming - Knowledge of machine learning pipelines and MLOps - Experience with data governance and data catalog tools - Familiarity with business intelligence tools (Tableau, Power BI, Looker, etc.) - Experience using AI-powered tools (such as Cursor, Claude, Copilot, ChatGPT, Gemini, etc.) to accelerate coding, automate tasks, or assist in system design (we believe in running with the machine, not against it) Interview Process 1. Initial Screening: Phone/video call with HR 2. Technical Interview: Deep dive into data engineering concepts 3. Final Interview: Discussion with senior leadership Note: This job description is intended to provide a general overview of the position and may be modified based on organizational needs and candidate qualifications. Our Team Culture We are Groot.
- We work together, we grow together, we succeed together. We believe in: - Innovation First - Like Iron Man, we're always pushing the boundaries of what's possible - Team Over Individual - Like the Avengers, we're stronger together than apart - Continuous Learning - Like Po learning Kung Fu, we're always evolving and improving - Making a Difference - Like Captain America, we fight for what's right (in this case, better marketing!) Growth Journey There is no charge for awesomeness... or attractiveness. - Po Your journey with us will be like Po's transformation from noodle maker to Dragon Warrior: - Level 1: Master the basics of our data infrastructure - Level 2: Build and optimize data pipelines - Level 3: Lead complex data projects and mentor others - Level 4: Become a data engineering legend (with your own theme music!) What We Promise I am Iron Man. - We promise you'll feel like a superhero every day! - Work that matters - Every pipeline you build helps real marketers succeed - Growth opportunities - Learn new technologies and advance your career - Supportive team - We've got your back, just like the Avengers - Work-life balance - Because even superheroes need rest! Apply: https://customerlabs.freshteam.com/jobs
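To give a concrete flavour of the pipeline work described in this posting, below is a minimal, hypothetical sketch of an Airflow DAG that lands raw ad-platform exports from Cloud Storage into BigQuery and then builds an analytics-ready table. It assumes Airflow 2.4+ with the Google provider package installed; the project, bucket, dataset, and table names are illustrative placeholders, not CustomerLabs' actual stack.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

# All project/bucket/dataset/table names below are hypothetical placeholders.
with DAG(
    dag_id="marketing_events_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    # Land raw ad-platform exports from Cloud Storage into a staging table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_events",
        bucket="example-marketing-exports",
        source_objects=["events/{{ ds }}/*.json"],
        destination_project_dataset_table="example_project.staging.ad_events",
        source_format="NEWLINE_DELIMITED_JSON",
        write_disposition="WRITE_APPEND",
    )

    # Aggregate the day's staging rows into an analytics-ready campaign table.
    transform = BigQueryInsertJobOperator(
        task_id="transform_events",
        configuration={
            "query": {
                "query": """
                    SELECT campaign_id,
                           DATE(event_ts) AS event_date,
                           COUNT(*) AS events,
                           COUNTIF(event = 'conversion') AS conversions
                    FROM `example_project.staging.ad_events`
                    WHERE DATE(event_ts) = '{{ ds }}'
                    GROUP BY campaign_id, event_date
                """,
                "useLegacySql": False,
            }
        },
    )

    load_raw >> transform
```

The same load-then-transform pattern extends naturally to DBT or Dataform models in place of the SQL transform step.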
Posted 3 weeks ago
5.0 - 7.0 years
13 - 17 Lacs
bengaluru
Work from Office
A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat. In your role, you will be responsible for: Skilled in multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Logs Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Ability to analyse data against functional business requirements and to interface directly with customers. Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform. Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Logs Explorer. Ambitious individual who can work under their own direction towards agreed targets/goals and with a creative approach to work. You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies. End-to-end functional knowledge of the data pipeline/transformation implementation that the candidate has delivered, including the purpose/KPIs for which the data transformation was done. Preferred technical and professional experience Experience with AEM Core Technologies: OSGi Services, Apache Sling, Granite Framework, Java Content Repository API, Java 8+, Localization. Familiarity with build tools such as Jenkins and Maven, knowledge of version control tools, especially Git, knowledge of patterns and good practices to design and develop quality, clean code, knowledge of HTML, CSS, JavaScript, and jQuery. Familiarity with task management, bug tracking, and collaboration tools like JIRA and Confluence.
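As a hedged illustration of the BigQuery analysis work this role centres on, the sketch below uses the official google-cloud-bigquery Python client to run an aggregate query. The project, dataset, table, and column names are hypothetical placeholders, not an actual client schema.

```python
from google.cloud import bigquery

# Hypothetical project, dataset, and table names, for illustration only.
client = bigquery.Client(project="example-project")

query = """
    SELECT order_date, SUM(amount) AS daily_revenue
    FROM `example-project.sales.orders`
    WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
    GROUP BY order_date
    ORDER BY order_date
"""

# Run the query and iterate over the result rows as they stream back.
for row in client.query(query).result():
    print(row.order_date, row.daily_revenue)
```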
Posted 3 weeks ago
4.0 - 8.0 years
15 - 25 Lacs
pune, gurugram, bengaluru
Hybrid
Salary: 15 to 25 LPA Exp: 4 to 7 years Location: Gurgaon/Pune/Bengaluru Notice: Immediate to 30 days..!! Job Profile: Experienced Data Engineer with a strong foundation in designing, building, and maintaining scalable data pipelines and architectures. Skilled in transforming raw data into clean, structured formats for analytics and business intelligence. Proficient in modern data tools and technologies such as SQL, T-SQL, Python, Databricks, and cloud platforms (Azure). Adept at data wrangling, modeling, ETL/ELT development, and ensuring data quality, integrity, and security. Collaborative team player with a track record of enabling data-driven decision-making across business units. As a Data Engineer, the candidate will work on assignments for one of our Utilities clients. Collaborating with cross-functional teams and stakeholders involves gathering data requirements, aligning business goals, and translating them into scalable data solutions. The role includes working closely with data analysts, scientists, and business users to understand needs, designing robust data pipelines, and ensuring data is accessible, reliable, and well-documented. Regular communication, iterative feedback, and joint problem-solving are key to delivering high-impact, data-driven outcomes that support organizational objectives. This position requires a proven track record of transforming processes and driving customer value and cost savings, with experience in running end-to-end analytics for large-scale organizations. Design, build, and maintain scalable data pipelines to support analytics, reporting, and advanced modeling needs. Collaborate with consultants, analysts, and clients to understand data requirements and translate them into effective data solutions. Ensure data accuracy, quality, and integrity through validation, cleansing, and transformation processes. Develop and optimize data models, ETL workflows, and database architectures across cloud and on-premises environments. Support data-driven decision-making by delivering reliable, well-structured datasets and enabling self-service analytics. Provide seamless integration with cloud platforms (Azure), making it easy to build and deploy end-to-end data pipelines in the cloud. Configure scalable Databricks clusters for handling large datasets and complex computations, optimizing performance and cost management. Must have: client engagement experience and collaboration with cross-functional teams; a data engineering background in Databricks; the ability to work effectively as an individual contributor or in collaborative team environments; effective communication and thought leadership with a proven record. Candidate Profile: Bachelor's/Master's degree in economics, mathematics, computer science/engineering, operations research, or related analytics areas. 3+ years of experience must be in Data Engineering. Hands-on experience with SQL, Python, Databricks, and cloud platforms like Azure. Prior experience in managing and delivering end-to-end projects. Outstanding written and verbal communication skills. Able to work in a fast-paced, continuously evolving environment and ready to take on uphill challenges. Able to understand cross-cultural differences and work with clients across the globe.
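The Databricks pipeline work described above could look roughly like the following PySpark sketch, which deduplicates and cleans raw utility meter readings and writes a partitioned Delta table. The storage path, column names, and target table are hypothetical; on Databricks a `spark` session is normally provided for you.

```python
from pyspark.sql import SparkSession, functions as F

# Hypothetical storage path, columns, and target table; on Databricks a
# SparkSession already exists as `spark`.
spark = SparkSession.builder.appName("meter_readings_clean").getOrCreate()

raw = spark.read.json("abfss://raw@examplestorage.dfs.core.windows.net/meter_readings/")

# Deduplicate readings, drop invalid values, and derive a partition column.
cleaned = (
    raw.dropDuplicates(["meter_id", "reading_ts"])
       .filter(F.col("reading_kwh").isNotNull() & (F.col("reading_kwh") >= 0))
       .withColumn("reading_date", F.to_date("reading_ts"))
)

# Write an analytics-ready Delta table partitioned by date (the `analytics`
# database is assumed to exist).
(cleaned.write.format("delta")
        .mode("overwrite")
        .partitionBy("reading_date")
        .saveAsTable("analytics.meter_readings"))
```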
Posted 3 weeks ago
3.0 - 5.0 years
5 - 15 Lacs
pune
Hybrid
Responsibilities: Design, implement, and manage ETL pipelines on Google Cloud Platform (BigQuery, Dataflow, Pub/Sub, Composer). Write complex SQL queries and optimize them for BigQuery performance. Work with structured/unstructured data from multiple sources (databases, APIs, streaming). Build reusable data frameworks for transformation, validation, and quality checks. Collaborate with stakeholders to understand business requirements and deliver analytics-ready datasets. Implement best practices in data governance, security, and cost optimization. Requirements: Bachelor's in Computer Science, IT, or a related field. Experience in ETL/Data Engineering. Strong Python & SQL skills. Hands-on with GCP (BigQuery, Dataflow, Composer, Pub/Sub, Dataproc). Experience with orchestration tools (Airflow preferred). Knowledge of data modeling and data warehouse design. Exposure to CI/CD, Git, and DevOps practices is a plus.
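For the reusable validation and quality-check frameworks mentioned above, a minimal sketch against BigQuery might look like the following. The table and column names are placeholders, and a real framework would add logging, thresholds, and alerting on top.

```python
from google.cloud import bigquery

client = bigquery.Client()

def check_not_null(table: str, column: str) -> bool:
    """Return True if `column` contains no NULL values in `table`."""
    query = f"SELECT COUNTIF({column} IS NULL) AS nulls FROM `{table}`"
    row = next(iter(client.query(query).result()))
    return row.nulls == 0

def check_min_rows(table: str, minimum: int) -> bool:
    """Return True if `table` holds at least `minimum` rows."""
    query = f"SELECT COUNT(*) AS n FROM `{table}`"
    row = next(iter(client.query(query).result()))
    return row.n >= minimum

if __name__ == "__main__":
    # Hypothetical table and column names.
    table = "example-project.analytics.daily_sessions"
    assert check_not_null(table, "session_id"), "session_id contains NULLs"
    assert check_min_rows(table, 1), "table is unexpectedly empty"
```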
Posted 4 weeks ago
7.0 - 12.0 years
5 - 15 Lacs
bengaluru
Work from Office
Role - GCP Staff Data Engineer Experience: 8 - 13 years Preferred - Data Engineering Background Location - Bangalore, Chennai, Hyderabad, Kolkata, Pune, Gurgaon Job Requirement: Has implemented and architected solutions on Google Cloud Platform using GCP components. Experience with Apache Beam/Google Dataflow/Apache Spark in creating end-to-end data pipelines. Experience in some of the following: Python, Hadoop, Spark, SQL, BigQuery, Bigtable, Cloud Storage, Datastore, Spanner, Cloud SQL, Machine Learning. Experience programming in Java, Python, etc. Expertise in at least two of these technologies: Relational Databases, Analytical Databases, NoSQL databases. Certification as a Google Professional Data Engineer/Solution Architect is a major advantage. Skills Required: 8+ years' experience in IT or professional services, in IT delivery or large-scale IT analytics projects. Candidates must have expert knowledge of Google Cloud Platform; other cloud platforms are nice to have. Expert knowledge in SQL development. Expertise in building data integration and preparation tools using cloud technologies (such as SnapLogic, Google Dataflow, Cloud Dataprep, Python, etc.). Experience with Apache Beam/Google Dataflow/Apache Spark in creating end-to-end data pipelines. Experience in some of the following: Python, Hadoop, Spark, SQL, BigQuery, Bigtable, Cloud Storage, Datastore, Spanner, Cloud SQL, Machine Learning. Experience programming in Java, Python, etc. Identify downstream implications of data loads/migration (e.g., data quality, regulatory, etc.). Implement data pipelines to automate the ingestion, transformation, and augmentation of data sources, and provide best practices for pipeline operations. Capability to work in a rapidly changing business environment and to enable simplified user access to massive data by building scalable data solutions. Advanced SQL writing and experience in data mining (SQL, ETL, data warehouse, etc.) and using databases in a business environment with complex datasets. Required Skills - GCP DE experience, BigQuery, SQL, Cloud Composer/Python, Cloud Functions, Dataproc + PySpark, Python ingestion, Dataflow + Pub/Sub
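The Apache Beam / Dataflow experience called out above usually means pipelines of the following shape. This is a minimal, assumed sketch of a streaming pipeline that reads JSON events from Pub/Sub, drops malformed records, and appends to an existing BigQuery table; the subscription and table names are placeholders, and runner/project options would be supplied on the command line (e.g. --runner=DataflowRunner).

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Subscription and table names are placeholders; runner, project, and region
# flags are expected on the command line.
options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
            subscription="projects/example-project/subscriptions/events-sub")
        | "ParseJson" >> beam.Map(json.loads)
        | "KeepValid" >> beam.Filter(lambda e: "user_id" in e and "event_ts" in e)
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            table="example-project:analytics.events",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
        )
    )
```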
Posted 4 weeks ago
3.0 - 7.0 years
11 - 15 Lacs
bengaluru
Work from Office
Project description Developing state-of-the-art cloud-based compliance archive products to archive and retain real-time communications data in line with internal and external regulatory requirements. The product is developed in-house in an agile development setup based on business requirements from multiple stakeholder parties. By employing continuous development and deployment principles, the team is aiming to transition from project to product management to support the bank with robust compliance archive solutions for the next decade. For a large global investment bank, we are looking for GCP-qualified cloud engineers to help with the FIC (fixed income and currencies) cloud migrations under Project Cirrus. Knowledge of Financial Services/FIC would be great, but the primary skills we need are in building, migrating, and deploying applications to GCP, Terraform module coding, Google infrastructure, cloud-native services such as GCE, GKE, CloudSQL/Postgres, logging and monitoring, etc., and good written and spoken English, as we would like these engineers to help with knowledge transfer to our existing development & support teams. We would like to place people alongside the engineers they'll be working with in the bank. Responsibilities Participate in the full application development lifecycle for the development of Java applications, microservices, and reusable components to support overall project objectives. Leverage design patterns, test-driven development (TDD), and behaviour-driven development (BDD) to build software that is reliable and easy to support in production. Must be adaptable to different responsibilities, and possess strong communication skills in order to work effectively with team members and stakeholders. Design and deliver front-to-back technical solutions and integrate them into business processes. Participate in hands-on coding, code reviews, architectural decisions, and reviews. Work in an Agile Systems Development Life Cycle. Skills Must have Overall 2 to 4 years of experience as a Java Developer 2+ years of experience developing in Core Java and the Spring Framework Google Cloud Platform experience Worked with the latest features of Java 8, 11, and 17 in development Solid understanding of data structures Good hands-on coding skills Experience with Kafka or other messaging systems Knowledge of key APIs: JPA, JTA, CDI, etc. Knowledge of various design and architectural patterns Understanding of microservices architecture Containerization solutions (e.g., Docker, Kubernetes, OpenShift) Build tools (e.g., Maven, Gradle) Version control (e.g., Git) Continuous Integration systems (e.g., TeamCity, Jenkins) English Upper-Intermediate Be well versed in references, class instances, methods, objects, constructors, mutable and immutable classes, functional interfaces, array lists, linked lists, HashMap, collections, the difference between recoverable and non-recoverable exceptions, Inversion of Control, and designing a data structure that supports insert, delete, and search in constant time (a sketch follows below), etc. Nice to have Banking Domain
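The "insert, delete, and search in constant time" item is a classic exercise: keep a dense array together with a hash map from value to index, and delete by swapping the target with the last element. Below is a small sketch in Python rather than Java, for consistency with the other examples on this page; the class and variable names are illustrative.

```python
class ConstantTimeSet:
    """Insert, delete, and search in O(1) average time using a list + dict."""

    def __init__(self):
        self._items = []   # dense array of values
        self._index = {}   # value -> position of that value in _items

    def insert(self, value) -> bool:
        if value in self._index:
            return False
        self._index[value] = len(self._items)
        self._items.append(value)
        return True

    def delete(self, value) -> bool:
        pos = self._index.pop(value, None)
        if pos is None:
            return False
        last = self._items.pop()
        if pos < len(self._items):
            # The deleted value was not last: move the last value into its slot.
            self._items[pos] = last
            self._index[last] = pos
        return True

    def search(self, value) -> bool:
        return value in self._index

s = ConstantTimeSet()
s.insert(10)
s.insert(20)
s.delete(10)
print(s.search(10), s.search(20))   # False True
```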
Posted 4 weeks ago