5.0 - 10.0 years
20 - 30 Lacs
Chennai
Work from Office
• Experience in cloud-based systems (GCP, BigQuery)
• Strong SQL programming skills
• Expertise in database programming and performance-tuning techniques
• Knowledge of data warehouse architectures, ETL, and reporting/analytics tools
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
thiruvananthapuram, kerala
On-site
As a Google Cloud Engineer at our company, you will play a crucial role in designing, building, deploying, and maintaining our cloud infrastructure and applications on Google Cloud Platform (GCP). Your collaboration with development, operations, and security teams will ensure that our cloud environment is scalable, secure, highly available, and cost-optimized. If you are enthusiastic about cloud-native technologies, automation, and overcoming intricate infrastructure challenges, we welcome you to apply.

Your responsibilities will include:
- Designing, implementing, and managing robust, scalable, and secure cloud infrastructure on GCP utilizing Infrastructure as Code (IaC) tools like Terraform.
- Deploying, configuring, and managing core GCP services such as Compute Engine, Kubernetes Engine (GKE), Cloud SQL, Cloud Storage, Cloud Functions, BigQuery, Pub/Sub, and networking components.
- Developing and maintaining CI/CD pipelines for automated deployment and release management using various tools.
- Implementing and enforcing security best practices within the GCP environment, including IAM, network security, data encryption, and compliance adherence.
- Monitoring cloud infrastructure and application performance, identifying bottlenecks, and implementing optimization solutions.
- Troubleshooting and resolving complex infrastructure and application issues in production and non-production environments.
- Collaborating with development teams to ensure cloud-native deployment, scalability, and resilience of applications.
- Participating in on-call rotations for critical incident response and timely resolution of production issues.
- Creating and maintaining comprehensive documentation for cloud architecture, configurations, and operational procedures.
- Keeping up to date with new GCP services, features, and industry best practices to propose and implement improvements.
- Contributing to cost optimization efforts by identifying and implementing efficiencies in cloud resource utilization.

We require you to have:
- A Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field.
- 6+ years of experience with C#, .NET Core, .NET Framework, MVC, Web API, Entity Framework, and SQL Server.
- 3+ years of experience with cloud platforms, preferably GCP, including designing and deploying cloud-native applications.
- 3+ years of experience with source code management, CI/CD pipelines, and Infrastructure as Code.
- Strong experience with JavaScript and a modern JavaScript framework, with Vue.js preferred.
- Proven leadership and mentoring skills with development teams.
- Strong understanding of microservices architecture and serverless computing.
- Experience with relational databases like SQL Server and PostgreSQL.
- Excellent problem-solving, analytical, and communication skills, along with Agile/Scrum environment experience.

What can make you stand out:
- GCP Cloud Certification.
- UI development experience with HTML, JavaScript, Angular, and Bootstrap.
- Agile environment experience with Scrum, XP.
- Relational database experience with SQL Server, PostgreSQL.
- Proficiency in Atlassian tools like JIRA and Confluence, and with GitHub.
- Working knowledge of Python, exceptional problem-solving and analytical abilities, and strong teamwork skills.
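For a flavor of the services named above, here is a minimal, hypothetical sketch of publishing an event to Pub/Sub with the Python client library; the project and topic names are placeholders, not part of the posting.

```python
from google.cloud import pubsub_v1

# Hypothetical project and topic names, for illustration only.
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-gcp-project", "order-events")

# publish() is asynchronous and returns a future; result() blocks
# until the server acknowledges and returns the message ID.
future = publisher.publish(topic_path, data=b"order-created", order_id="12345")
print(f"Published message {future.result()}")
```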
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
noida, uttar pradesh
On-site
We are looking for a skilled Data Governance Engineer to spearhead the development and supervision of robust data governance frameworks on Google Cloud Platform (GCP). You should have a deep understanding of data management, metadata frameworks, compliance, and security within cloud environments to ensure the adoption of high-quality, secure, and compliant data practices aligned with organizational objectives.

The ideal candidate should possess:
- Over 4 years of experience in data governance, data management, or data security.
- Hands-on expertise with Google Cloud Platform (GCP) tools like BigQuery, Dataflow, Dataproc, and Google Data Catalog.
- Proficiency in metadata management, data lineage, and data quality tools such as Collibra and Informatica.
- Comprehensive knowledge of data privacy laws and compliance frameworks.
- Strong skills in SQL and Python for governance automation.
- Experience with RBAC, encryption, and data masking techniques.
- Familiarity with ETL/ELT pipelines and data warehouse architectures.

Your main responsibilities will include:
- Developing and implementing comprehensive data governance frameworks emphasizing metadata management, lineage tracking, and data quality.
- Defining, documenting, and enforcing data governance policies, access control mechanisms, and security standards utilizing GCP-native services like IAM, DLP, and KMS.
- Managing metadata repositories using tools like Collibra, Informatica, Alation, or Google Data Catalog.
- Collaborating with data engineering and analytics teams to ensure compliance with GDPR, CCPA, SOC 2, and other regulatory standards.
- Automating processes for data classification, monitoring, and reporting using Python and SQL.
- Supporting data stewardship initiatives, including the creation of data dictionaries and governance documentation.
- Optimizing ETL/ELT pipelines and data workflows to adhere to governance best practices.

At GlobalLogic, we offer:
- A culture of caring that prioritizes inclusivity, acceptance, and personal connections.
- Continuous learning and development opportunities to enhance your skills.
- Engagement in interesting and meaningful work with cutting-edge solutions.
- Balance and flexibility to help you integrate work and life effectively.
- A high-trust organization committed to integrity and ethical practices.

GlobalLogic, a Hitachi Group Company, is a leading digital engineering partner to world-renowned companies, focusing on creating innovative digital products and experiences. Join us to collaborate on transforming businesses through intelligent products, platforms, and services.
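As an illustration of the "governance automation using Python and SQL" responsibility, the sketch below audits column-level metadata through BigQuery's INFORMATION_SCHEMA views; the project and dataset names are hypothetical.

```python
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

# INFORMATION_SCHEMA.COLUMNS is a standard BigQuery metadata view;
# the project and dataset below are placeholders.
query = """
    SELECT table_name, column_name, data_type
    FROM `my-project.analytics.INFORMATION_SCHEMA.COLUMNS`
    WHERE data_type = 'STRING'
    ORDER BY table_name, column_name
"""

# Flag free-text columns as candidates for classification/masking review.
for row in client.query(query).result():
    print(row.table_name, row.column_name, row.data_type)
```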
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
kolkata, west bengal
On-site
You are a Data Engineer with 3+ years of experience, proficient in SQL and Python development. You will be responsible for designing, developing, and maintaining scalable data pipelines to support ETL processes using tools like Apache Airflow, AWS Glue, or similar. Your role involves optimizing and managing relational and NoSQL databases such as MySQL, PostgreSQL, MongoDB, or Cassandra for high performance and scalability. You will write advanced SQL queries, stored procedures, and functions to efficiently extract, transform, and analyze large datasets. Additionally, you will implement and manage data solutions on cloud platforms like AWS, Azure, or Google Cloud, utilizing services such as Redshift, BigQuery, or Snowflake. Your contributions to designing and maintaining data warehouses and data lakes will support analytics and BI requirements. Automating data processing tasks through script and application development in Python or other programming languages is also part of your responsibilities.

As a Data Engineer, you will implement data quality checks, monitoring, and governance policies to ensure data accuracy, consistency, and security. Collaboration with data scientists, analysts, and business stakeholders to understand data needs and translate them into technical solutions is essential. Identifying and resolving performance bottlenecks in data systems and optimizing data storage and retrieval are key aspects, as is maintaining comprehensive documentation for data processes, pipelines, and infrastructure. Staying up to date with the latest trends in data engineering, big data technologies, and cloud services is expected of you.

You should hold a Bachelor's or Master's degree in Computer Science, Information Technology, Data Engineering, or a related field. Proficiency in SQL, relational databases, NoSQL databases, and Python programming, along with experience with data pipeline tools and cloud platforms, is required. Knowledge of big data tools like Apache Spark, Hadoop, or Kafka is a plus. Strong analytical and problem-solving skills with a focus on performance optimization and scalability are essential, as are excellent verbal and written communication skills to convey technical concepts to non-technical stakeholders. You should be able to work collaboratively in cross-functional teams. Preferred certifications include AWS Certified Data Analytics, Google Professional Data Engineer, or similar. An eagerness to learn new technologies and adapt quickly in a fast-paced environment will be valuable in this role.
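As a rough sketch of the Airflow-based pipelines this role maintains, the DAG below wires one Python task to a daily schedule; the DAG ID and callable are hypothetical, shown only to illustrate the pattern.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_and_load():
    # Placeholder for the actual extract/transform/load logic.
    print("running ETL step")


# Hypothetical DAG name and schedule.
with DAG(
    dag_id="daily_sales_etl",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    etl_task = PythonOperator(
        task_id="extract_and_load",
        python_callable=extract_and_load,
    )
```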
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
punjab
On-site
You are an experienced and results-driven ETL & DWH Engineer/Data Analyst with over 8 years of expertise in data integration, warehousing, and analytics. The role calls for deep technical knowledge of ETL tools, strong data modeling skills, and the capability to lead intricate data engineering projects from inception to implementation.

Your key skills include:
- Utilizing ETL tools such as SSIS, Informatica, DataStage, or Talend (more than 4 years).
- Proficiency in relational databases like SQL Server and MySQL.
- Comprehensive understanding of Data Mart/EDW methodologies.
- Designing star schemas, snowflake schemas, and fact and dimension tables.
- Experience with Snowflake or BigQuery.
- Familiarity with reporting and analytics tools like Tableau and Power BI.
- Proficient scripting and programming in Python.
- Knowledge of cloud platforms like AWS or Azure.
- Leading recruitment, estimation, and project execution.
- Exposure to Sales and Marketing data domains.
- Working with cross-functional and geographically distributed teams.
- Translating complex data issues into actionable insights.
- Strong communication and client management abilities.
- An initiative-driven, collaborative approach and a problem-solving mindset.

Your roles & responsibilities will include:
- Creating high-level and low-level design documents for middleware and ETL architecture.
- Designing and reviewing data integration components while ensuring compliance with standards and best practices.
- Ensuring delivery quality and timeliness for one or more complex projects.
- Providing functional and non-functional assessments for global data implementations.
- Offering technical guidance and support to junior team members for problem-solving.
- Leading QA processes for deliverables and validating progress against project timelines.
- Managing issue escalation, status tracking, and continuous improvement initiatives.
- Supporting planning, estimation, and resourcing for data engineering efforts.
Posted 1 week ago
5.0 - 13.0 years
0 Lacs
pune, maharashtra
On-site
You are a highly skilled and experienced Cloud Architect/Engineer with deep expertise in Google Cloud Platform (GCP). Your primary responsibility is to design, build, and manage scalable and reliable cloud infrastructure on GCP. You will leverage various GCP services such as Compute Engine, Cloud Run, BigQuery, Pub/Sub, Cloud Functions, Dataflow, Dataproc, IAM, and Cloud Storage to ensure high-performance cloud solutions. Your role also includes developing and maintaining CI/CD pipelines, automating infrastructure deployment using Infrastructure as Code (IaC) principles, and implementing best practices in cloud security, monitoring, performance tuning, and logging. Collaboration with cross-functional teams to deliver cloud solutions aligned with business objectives is essential.

You should have 5+ years of hands-on experience in cloud architecture and engineering, with at least 3 years of practical experience on Google Cloud Platform (GCP), and in-depth expertise in the GCP services mentioned above. A strong understanding of networking, security, containerization (Docker, Kubernetes), and CI/CD pipelines is essential. Experience with monitoring, performance tuning, and logging in cloud environments is preferred, and familiarity with DevSecOps practices and tools such as HashiCorp Vault is a plus.

Your role as a GCP Cloud Architect/Engineer will contribute to ensuring system reliability, backup, and disaster recovery strategies. This hybrid role is based out of Pune and requires a total of 10 to 13 years of relevant experience.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
haryana
On-site
You will be working as a Technical Lead Data Engineer for a leading data and AI/ML solutions provider based in Gurgaon. In this role, you will be responsible for designing, developing, and leading complex data projects, primarily on Google Cloud Platform and other modern data stacks.

Your key responsibilities will include leading the design and implementation of robust data pipelines, collaborating with cross-functional teams to deliver end-to-end data solutions, owning project modules, developing technical roadmaps, and implementing data governance frameworks on GCP. You will be required to integrate GCP data services like BigQuery, Dataflow, Dataproc, Cloud Composer, Vertex AI Studio, and GenAI with platforms such as Snowflake. Additionally, you will write efficient code in Python, SQL, and ETL/orchestration tools, utilize containerized solutions for scalable deployments, and apply expertise in PySpark, Kafka, and advanced data querying in high-volume data environments (see the streaming sketch below). Monitoring, optimizing, and troubleshooting system performance, reducing job run times through architecture optimization, developing data warehouses, and mentoring team members will also be part of your role.

To be successful in this position, you should have a Bachelor's or Master's degree in Computer Science, Engineering, or a related field. Extensive hands-on experience with Google Cloud Platform data services and Snowflake integration, strong programming skills in Python and SQL, proficiency in PySpark, Kafka, and data querying tools, and experience with containerized solutions using Google Kubernetes Engine are essential. Strong communication and documentation skills, experience with large distributed datasets, and the ability to balance short-term deliverables with long-term technical sustainability are also required. Prior leadership experience in data engineering teams and exposure to cloud data platforms are desirable.

This role offers you the opportunity to lead high-impact data projects for reputed clients in a fast-growing data consulting environment, work with cutting-edge technologies, and collaborate in an innovative and growth-oriented culture.
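The streaming sketch referenced above: a minimal PySpark Structured Streaming job reading from Kafka, assuming the spark-sql-kafka connector is on the classpath; broker and topic names are placeholders.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

# Hypothetical broker address and topic name.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "clickstream")
    .load()
)

# Kafka keys/values arrive as bytes; cast to string before further parsing.
parsed = events.select(col("key").cast("string"), col("value").cast("string"))

query = parsed.writeStream.format("console").outputMode("append").start()
query.awaitTermination()
```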
Posted 1 week ago
4.0 - 7.0 years
9 - 13 Lacs
Pune
Work from Office
We are looking for a skilled Java + GCP Developer with experience in shell scripting, Python, Java, Spring Boot, and BigQuery. The ideal candidate should have hands-on experience in Java, Spring Boot, and Google Cloud Platform (GCP).
Posted 1 week ago
5.0 - 7.0 years
5 - 14 Lacs
Pune, Gurugram, Bengaluru
Work from Office
• Hands-on experience in object-oriented programming using Python, PySpark, APIs, SQL, BigQuery, and GCP
• Building data pipelines for huge volumes of data
• Dataflow, Dataproc, and BigQuery
• Deep understanding of ETL concepts
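A minimal sketch of the PySpark-on-BigQuery pattern implied by this list, assuming the spark-bigquery connector is available (it ships with Dataproc); the table name is hypothetical.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("bq-read").getOrCreate()

# Read a BigQuery table into a Spark DataFrame via the connector;
# "my-project.sales.orders" is a placeholder.
orders = (
    spark.read.format("bigquery")
    .option("table", "my-project.sales.orders")
    .load()
)

# Aggregate in Spark after the scan.
orders.groupBy("country").count().show()
```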
Posted 1 week ago
5.0 - 8.0 years
5 - 8 Lacs
Bengaluru
Work from Office
Skills desired:
- Strong at SQL (multi-pyramid SQL joins)
- Python skills (FastAPI or Flask framework)
- PySpark
- Commitment to work in overlapping hours
- GCP knowledge (BigQuery, Dataproc, and Dataflow)
- Amex experience preferred (not mandatory)
- Power BI preferred (not mandatory)

Keywords: Flask, PySpark, Python, SQL
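For reference, a minimal FastAPI application of the kind the posting alludes to; the endpoint is hypothetical. Run it with `uvicorn main:app`.

```python
from fastapi import FastAPI

app = FastAPI()


@app.get("/health")
def health() -> dict:
    # Lightweight liveness endpoint; real handlers would query
    # BigQuery or downstream services here.
    return {"status": "ok"}
```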
Posted 1 week ago
5.0 - 10.0 years
7 - 17 Lacs
Pune, Bengaluru, Delhi / NCR
Work from Office
Responsibilities — A day in the life of an Infoscion:
• As part of the Infosys consulting team, your primary role is to get to the heart of customer issues, diagnose problem areas, design innovative solutions, and facilitate deployment resulting in client delight.
• You will develop proposals by owning parts of the proposal document and giving inputs on solution design in your areas of expertise. You will plan configuration activities, configure the product per the design, conduct conference room pilots, and assist in resolving queries related to requirements and solution design.
• You will conduct solution/product demonstrations and POC/Proof of Technology workshops, and prepare effort estimates that suit the customer's budgetary requirements and are in line with the organization's financial guidelines.
• You will actively lead small projects and contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers.

If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional responsibilities:
• Ability to develop value-creating strategies and models that enable clients to innovate, drive growth, and increase their business profitability
• Good knowledge of software configuration management systems
• Awareness of the latest technologies and industry trends
• Logical thinking and problem-solving skills, along with an ability to collaborate
• Understanding of the financial processes for various types of projects and the various pricing models available
• Ability to assess current processes, identify improvement areas, and suggest technology solutions
• Knowledge of one or two industry domains
• Client interfacing skills
• Project and team management

Technical and professional requirements: Technology->Cloud Platform->GCP Data Analytics->Looker, Technology->Cloud Platform->GCP Database->Google BigQuery

Preferred skills: Technology->Cloud Platform->Google Big Data, Technology->Cloud Platform->GCP Data Analytics
Posted 1 week ago
10.0 - 15.0 years
11 - 15 Lacs
Pune
Work from Office
BMC is looking for a Java Tech Lead, an innovator at heart, to join a team of highly skilled software developers. Here is how, through this exciting role, YOU will contribute to BMC's and your own success:
- Design and develop platform solutions based on Java/J2EE best practices and web standards.
- Discover, design, and develop analytical methods to support novel approaches to data and information processing.
- Lead/participate in all aspects of product development, from requirements analysis to product release.
- Lead feature/product engineering teams and participate in architecture and design reviews.
- Be responsible for delivering high-quality commercial software releases on aggressive schedules.
- Apply good troubleshooting and debugging skills.
- Lead and participate on empowered virtual teams to deliver iteration deliverables, and drive the technical direction of the product.
- Design enterprise platforms using UML, process flows, sequence diagrams, and pseudo-code-level details, ensuring solution alignment.
- Develop and implement software solutions that leverage GPT, LLM, and conversational AI technologies.
- Integrate GPT and LLM models into the software architecture to enable natural language understanding and generation.

To ensure you're set up for success, you will bring the following skillset & experience:
- You have 10+ years of experience in designing and developing complex framework and platform solutions with practical use of design patterns.
- You are an expert in server-side issues such as caching, clustering, persistence, security, SSO, high scalability/availability, and failover.
- You have experience in big data engineering technologies such as streaming and stream-processing frameworks.
- You have experience with open-source Java frameworks such as OSGi, Spring, JMS, JPA, JTA, and JDBC, and with Kubernetes and the AWS, GCP, and Azure cloud platforms.
- You have experience with the PostgreSQL database and aspect-oriented architectures.
- You have experience with open-source participation and Apache projects, the patent process, and in-depth knowledge of app server architectures and SaaS-enabling platforms.
- You are familiar with REST API principles, object-oriented design, and design patterns.
- You have knowledge of fine-tuning LLMs, including BERT- and GPT-based models.

Whilst these are nice to have, our team can help you develop the following skills:
- Familiarity with data warehouse/data lake platforms: Snowflake, Databricks, BigQuery.
- Knowledge of cloud platforms: Amazon AWS, Google GCP, Oracle OCI, Microsoft Azure.
- Experience with generative AI frameworks such as LangChain and LlamaIndex.
Posted 1 week ago
4.0 - 6.0 years
14 - 18 Lacs
Gurugram
Work from Office
Experience: 4-6 years of professional experience as a backend engineer, primarily working with server-side technologies.

Required skills:
- Strong expertise in TypeScript and building scalable backend applications using Express (NestJS preferred).
- Proficient in building and managing a microservices architecture.
- Experience with an ORM, preferably Prisma.
- Hands-on experience with Apache Kafka for real-time data streaming and messaging.
- Experience with Google Cloud Platform (GCP) services, including but not limited to Cloud Functions, Cloud Run, Pub/Sub, BigQuery, and Kubernetes Engine.
- Familiarity with RESTful APIs, database systems (SQL/NoSQL), and performance optimization.
- Solid understanding of version control systems, particularly Git.

Preferred skills:
- Knowledge of containerization using Docker.
- Experience with automated testing frameworks and methodologies.
- Understanding of monitoring, logging, and observability tools and practices.

Responsibilities:
- Design, develop, and maintain backend services using NestJS within a microservices architecture.
- Implement robust messaging and event-driven architectures using Kafka.
- Deploy, manage, and optimize applications and services on Google Cloud Platform.
- Ensure high performance, scalability, reliability, and security of backend services.
- Collaborate closely with front-end developers, product managers, and DevOps teams.
- Write clean, efficient, and maintainable code, adhering to best practices and coding standards.
- Perform comprehensive testing and debugging, addressing production issues promptly.
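The posting's stack is TypeScript/NestJS, but to keep examples on this page in one language, here is the same Kafka produce pattern sketched in Python with the kafka-python package; broker address, topic, and payload are hypothetical.

```python
import json

from kafka import KafkaProducer  # kafka-python package

# Hypothetical broker address; JSON-serialize message values.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# send() is asynchronous and returns a future.
future = producer.send("user-signed-up", {"user_id": 42, "plan": "pro"})
future.get(timeout=10)  # block until the broker acknowledges
producer.flush()
```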
Posted 1 week ago
8.0 - 13.0 years
25 - 40 Lacs
Hyderabad
Hybrid
Job Title: Tech Lead GCP Data Engineer
Location: Hyderabad, India
Experience: 5+ Years
Job Type: Full-Time
Industry: IT / Software Services
Functional Area: Data Engineering / Cloud / Analytics
Role Category: Cloud Data Engineering

Position Overview
We are seeking a GCP Data Engineer with strong expertise in SQL, Python, and Google Cloud Platform (GCP) services, including BigQuery, Cloud Composer, and Airflow. The ideal candidate will play a key role in building scalable, high-performance data solutions to support marketing analytics initiatives. This role involves collaboration with cross-functional global teams and provides an opportunity to work on cutting-edge technologies in a dynamic marketing data landscape.

Key Responsibilities
- Lead technical teams and coordinate with global stakeholders.
- Manage and estimate data development tasks and delivery timelines.
- Build and optimize data pipelines using GCP, especially BigQuery, Cloud Storage, and Cloud Composer.
- Work with Airflow DAGs, REST APIs, and data orchestration workflows.
- Collaborate on the development and debugging of ETL pipelines, including IICS and Ascend IO (preferred).
- Perform complex data analysis across multiple sources to support business goals.
- Implement CI/CD pipelines and manage version control using Git.
- Troubleshoot and upgrade existing data systems and ETL chains.
- Contribute to data quality, performance optimization, and cloud-native solution design.

Required Skills & Qualifications
- Bachelor's or Master's in Computer Science, IT, or a related field.
- 5+ years of experience in Data Engineering or relevant roles.
- Strong expertise in GCP, BigQuery, Cloud Composer, and Airflow.
- Proficient in SQL, Python, and REST API development.
- Hands-on experience with IICS, MySQL, and data warehousing solutions.
- Knowledge of ETL tools like Ascend IO is a plus.
- Exposure to marketing analytics tools (e.g., Google Analytics, Blueconic, Klaviyo) is desirable.
- Familiarity with performance marketing concepts (segmentation, A/B testing, attribution modeling, etc.).
- Excellent communication and analytical skills.
- GCP certification is a strong plus.
- Experience working in Agile environments.

To apply, send your resume to: krishnanjali.m@technogenindia.com
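A hedged sketch of the Cloud Composer work described above: an Airflow DAG that runs a scheduled BigQuery rollup via the Google provider's BigQueryInsertJobOperator. Project, dataset, DAG name, and query are placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

# Hypothetical DAG name, schedule, and table references.
with DAG(
    dag_id="marketing_daily_rollup",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    rollup = BigQueryInsertJobOperator(
        task_id="daily_rollup",
        configuration={
            "query": {
                "query": (
                    "SELECT campaign, SUM(clicks) AS clicks "
                    "FROM `my-project.marketing.events` "
                    "GROUP BY campaign"
                ),
                "useLegacySql": False,
            }
        },
    )
```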
Posted 1 week ago
5.0 - 10.0 years
15 - 30 Lacs
Hyderabad
Work from Office
Greetings from TechnoGen! Thank you for taking the time to tell us about your competencies and skills, and for allowing us an opportunity to introduce TechnoGen; we understand that your experience and expertise are relevant to the current opening with our client.

About TechnoGen:
LinkedIn: https://www.linkedin.com/company/technogeninc/about/
TechnoGen, Inc. is an ISO 9001:2015, ISO 20000-1:2011, ISO 27001:2013, and CMMI Level 3 Global IT Services Company headquartered in Chantilly, Virginia. TechnoGen, Inc. (TGI) is a Minority & Women-Owned Small Business with over 20 years of experience providing end-to-end IT Services and Solutions to the Public and Private sectors. TGI provides highly skilled and certified professionals and has successfully executed more than 345 projects. TechnoGen is committed to helping our clients solve complex problems and achieve their goals, on time and under budget.

Please share the below details for further processing of your profile:
- Total years of experience
- Relevant years of experience
- CTC (including variable)
- ECTC
- Notice period
- Reason for change
- Current location

Job Title: GCP Data Engineer
Required Experience: 5+ years
Work Mode: WFO, 4 days from office
Shift Time: UK shift, 12:00 PM IST to 9:00 PM IST
Location: Hyderabad

Job Summary:
As a GCP Data Engineer, we need someone with strong experience in SQL and Python. The ideal candidate should have hands-on expertise in Google Cloud Platform (GCP) services, especially BigQuery, Composer, and the Airflow framework, and a solid understanding of data engineering best practices. You will work closely with our internal teams and technology partners to deliver comprehensive and scalable marketing data and analytics solutions. This role offers the unique opportunity to engage with many technology platforms in a rapidly evolving marketing technology landscape.

Key Responsibilities:
- Technical oversight and team management of the developers, coordination with US-based Mattel resources, and estimation of work.
- Strong knowledge of cloud computing platforms - Google Cloud.
- Expertise in MySQL and SQL/PL.
- Good experience in IICS.
- Experience in ETL; Ascend IO is an added advantage.
- GCP and BigQuery knowledge is a must; GCP certification is an added advantage.
- Good experience in Google Cloud Storage (GCS), Cloud Composer, DAGs, and Airflow.
- REST API development experience.
- Good analytical and problem-solving skills, and efficient communication.
- Experience in designing, implementing, and managing various ETL job execution flows.
- Utilize Git for source version control.
- Set up and maintain CI/CD pipelines.
- Troubleshoot, debug, and upgrade existing applications and ETL job chains.
- Comprehensive data analysis across complex data sets.
- Ability to collaborate effectively across technical development teams and business departments.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 5+ years of experience in data engineering or related roles.
- Strong understanding of Google Cloud Platform and associated tools.
- Proven experience in delivering consumer marketing data and analytics solutions for enterprise clients.
- Strong knowledge of data management, ETL processes, data warehousing, and analytics platforms.
- Experience with SQL and NoSQL databases.
- Proficiency in the Python programming language.
- Hands-on experience with data warehousing solutions.
- Knowledge of marketing analytics tools and technologies, including but not limited to Google Analytics, Blueconic, Klaviyo, etc.
- Knowledge of performance marketing concepts such as targeting & segmentation, real-time optimization, A/B testing, attribution modeling, etc.
- Excellent communication skills with a track record of collaboration across multiple teams.
- Strong collaboration skills and a team-oriented mindset.
- Strong problem-solving skills, adaptability, and the ability to thrive in a dynamic and rapidly changing environment.
- Experience working in Agile development environments.

Best regards,
Syam M | Sr. IT Recruiter
syambabu.m@technogenindia.com
www.technogenindia.com | Follow us on LinkedIn
Posted 1 week ago
4.0 - 9.0 years
11 - 17 Lacs
Bengaluru
Work from Office
Greetings from TSIT Digital!

This is with regard to an excellent opportunity with us; if you have that unique and unlimited passion for building world-class enterprise software products that turn into actionable intelligence, then we have the right opportunity for you and your career. This is an opportunity for permanent employment with TSIT Digital.

What we are looking for: Data Engineer
Experience: 4+ years (2-5 years relevant)
Location: Bangalore
Notice period: Immediate to 15 days

Job description:
Work location: Manyata Tech Park, Bengaluru, Karnataka, India
Work mode: Hybrid model
Client: Lowes

Mandatory skills:
- Scala/Python, SQL, scripting
- Knowledge of BigQuery, PySpark, Airflow, serverless cloud-native services, and Kafka streaming

If you are interested, please share your updated CV: kousalya.v@tsit.co.in
Posted 1 week ago
4.0 - 8.0 years
12 - 18 Lacs
Hyderabad
Hybrid
Egen is a fast-growing and entrepreneurial company with a data-first mindset. We bring together the best engineering talent working with the most advanced technology platforms, including Google Cloud and Salesforce, to help clients drive action and impact through data and insights. We are committed to being a place where the best people choose to work so they can apply their engineering and technology expertise to envision what is next for how data and platforms can change the world for the better. We are dedicated to learning, thrive on solving tough problems, and continually innovate to achieve fast, effective results.

Job Summary
We are seeking a talented and passionate Python Developer to join our dynamic team. In this role, you will be instrumental in designing, developing, and deploying scalable and efficient applications on the Google Cloud Platform. You will have the opportunity to work on exciting projects and contribute to the growth and innovation of our products and services. You will also provide mentorship to other engineers and engage with clients to understand their needs and deliver effective solutions.

Responsibilities:
- Design, develop, and maintain robust and scalable applications using Python.
- Build and consume RESTful APIs using FastAPI.
- Deploy and manage applications on the Google Cloud Platform (GCP).
- Collaborate effectively with cross-functional teams, including product managers, designers, and other engineers.
- Write clean, well-documented, and testable code.
- Participate in code reviews to ensure code quality and adherence to best practices.
- Troubleshoot and debug issues in development and production environments.
- Create clear and effective documents.
- Stay up to date with the latest industry trends and technologies.
- Assist the junior team members.

Required Skills and Experience
- 5+ years of relevant work experience in software development using Python.
- Solid understanding and practical experience with the FastAPI framework.
- Hands-on experience with the Google Cloud Platform (GCP) and its core services.
- Experience with CI/CD pipelines.
- Ability to write unit test cases and execute them.
- Able to discuss and propose architectural changes.
- Knowledge of security best practices.
- Strong problem-solving and analytical skills.
- Excellent communication and collaboration abilities.
- Bachelor's degree in Computer Science or a related field (or equivalent practical experience).

Optional Skills (a plus)
- Experience with any front-end framework such as Angular, React, Vue.js, etc.
- Familiarity with DevOps principles and practices.
- Experience with infrastructure-as-code tools like Terraform.
- Knowledge of containerization technologies such as Docker and Kubernetes.
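Matching the unit-testing requirement above, here is a minimal pytest-style test of a FastAPI endpoint using the framework's built-in TestClient; the endpoint itself is hypothetical.

```python
from fastapi import FastAPI
from fastapi.testclient import TestClient

app = FastAPI()


@app.get("/items/{item_id}")
def read_item(item_id: int) -> dict:
    return {"item_id": item_id}


client = TestClient(app)


def test_read_item():
    # Exercise the route in-process, without a running server.
    response = client.get("/items/7")
    assert response.status_code == 200
    assert response.json() == {"item_id": 7}
```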
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
karnataka
On-site
You are a technical and hands-on Lead Data Engineer with over 8 years of experience, responsible for driving the modernization of data transformation workflows within the organization. Your primary focus will be on migrating legacy SQL-based ETL logic to DBT-based transformations and designing a scalable, modular DBT architecture. You will also be tasked with auditing and refactoring legacy SQL code for clarity, efficiency, and modularity.

In this role, you will lead the improvement of CI/CD pipelines for DBT, including automated testing, deployment, and code quality enforcement. Collaboration with data analysts, platform engineers, and business stakeholders is essential to understand current gaps and define future data pipelines. Additionally, you will own the Airflow orchestration redesign where necessary and define coding standards, review processes, and documentation practices.

As a Lead Data Engineer, you will coach junior data engineers on DBT and SQL best practices and provide lineage and impact analysis improvements using DBT's built-in tools and metadata. Key qualifications for this role include proven experience in migrating legacy SQL to DBT, a deep understanding of DBT best practices, proficiency in SQL performance tuning and query optimization, and hands-on experience with modern data stacks such as Snowflake or BigQuery.

Strong communication and leadership skills are essential for this role, as you will be required to work cross-functionally and collaborate with various teams within the organization. Exposure to Python, data governance and lineage tools, and mentoring experience are considered nice-to-have qualifications for this position. If you are passionate about modernizing data transformation workflows and driving technical excellence within the organization, this role is the perfect fit for you.
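A small sketch of how a DBT migration workflow might be automated from Python, using dbt-core's programmatic entry point (available from dbt-core 1.5 onward); the `staging` selector names a hypothetical folder of refactored models.

```python
from dbt.cli.main import dbtRunner, dbtRunnerResult

# Invoke dbt programmatically, e.g., from a CI/CD step.
runner = dbtRunner()
result: dbtRunnerResult = runner.invoke(["run", "--select", "staging"])

if not result.success:
    raise SystemExit("dbt run failed; see logs for the failing model")
```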
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
As a Senior Engineer, VP at our Pune location in India, you will be responsible for managing and performing work across various areas of the bank's IT platform/infrastructure. Your role will involve analysis, development, and administration, with possible oversight of engineering delivery for specific departments.

Your day-to-day tasks will include planning and developing engineering solutions to achieve business goals, ensuring reliability and resiliency in solutions, and promoting maintainability and reusability. You will play a key role in architecting well-integrated solutions and reviewing engineering plans to enhance capability and reusability. You will collaborate with a cross-functional agile delivery team, bringing an innovative approach to software development using the latest technologies and practices to deliver business value efficiently. Your focus will be on fostering a collaborative environment, open code sharing, and supporting all stages of software delivery from analysis to production support.

In this role, you will enjoy benefits such as a best-in-class leave policy, gender-neutral parental leave, sponsorship for industry certifications, employee assistance programs, comprehensive insurance coverage, and health screening. You will be expected to lead engineering efforts, champion best practices, collaborate with stakeholders to achieve business outcomes, and acquire functional knowledge of the business capabilities being digitized.

Key skills required:
- GCP services: Composer, BigQuery, Dataproc, GCP cloud architecture, etc.
- Big Data/Hadoop: Hive, HQL, HDFS.
- Programming: Python, PySpark, SQL query writing.
- Scheduler: Control-M or any other scheduler.
- Experience with database engines (e.g., SQL Server, Oracle), ETL pipeline development, Tableau, Looker, and performance tuning.
- Proficiency in architecture design, technical documentation, and mapping business requirements to technology.

Desired skills:
- Understanding of workflow automation and Agile methodology.
- Terraform coding and experience in project management.
- Prior experience in the Banking/Finance domain and hybrid cloud solutions, preferably using GCP.
- Product development experience.

Join us to excel in your career with training, coaching, and continuous learning opportunities. Our culture promotes responsibility, commercial thinking, initiative, and collaboration. We value a positive, fair, and inclusive work environment where we celebrate the successes of our people. Embrace the empowering culture at Deutsche Bank Group and be part of our success together. For more information about our company and teams, please visit our website at https://www.db.com/company/company.htm.
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
karnataka
On-site
As a GCP Senior Data Engineer/Architect, you will play a crucial role in our team by designing, developing, and implementing robust and scalable data solutions on the Google Cloud Platform (GCP). Collaborating closely with architects and business analysts, especially for our US clients, you will translate data requirements into effective technical solutions.

Your responsibilities will include designing and implementing scalable data warehouse and data lake solutions, orchestrating complex data pipelines, leading cloud data lake implementation projects, participating in cloud migration projects, developing containerized applications, optimizing SQL queries, writing automation scripts in Python, and utilizing various GCP data services such as BigQuery, Bigtable, and Cloud SQL.

Your expertise in data warehouse and data lake design and implementation, experience in data pipeline development and tuning, hands-on involvement in cloud migration and data lake projects, proficiency in Docker and GKE, strong SQL and Python scripting skills, and familiarity with GCP services like BigQuery, Cloud SQL, Dataflow, and Composer will be essential for this role. Additionally, knowledge of data governance principles, experience with dbt, and the ability to work effectively within a team and adapt to project needs are highly valued. Strong communication skills, willingness to work UK shift timings, and openness to giving and receiving feedback are important traits that will contribute to your success in this role.
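As an illustration of the pipeline orchestration mentioned above, here is a minimal Apache Beam pipeline in Python; swapping in the DataflowRunner option runs the same code as a managed Dataflow job. Bucket paths are placeholders.

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# DirectRunner executes locally; use runner="DataflowRunner" (plus
# project/region/temp_location options) to run on GCP.
options = PipelineOptions(runner="DirectRunner")

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "Read" >> beam.io.ReadFromText("gs://my-bucket/raw/events.csv")
        | "KeepErrors" >> beam.Filter(lambda line: "ERROR" in line)
        | "Write" >> beam.io.WriteToText("gs://my-bucket/curated/errors")
    )
```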
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
As a Senior Software Engineer specializing in Full Stack / Data Engineering, you will be based in Pune, with a minimum of 5 years of experience. Your primary responsibilities will revolve around designing and developing secure, high-performance systems specifically tailored for financial services. This will involve leveraging your strong programming skills in Python, Java, and JavaScript to build scalable systems within the BFSI / Credit Union domain.

You will be expected to utilize your proficiency in Apache Spark and experience with cloud platforms such as GCP or AWS to optimize data pipelines and develop full-stack applications. Your understanding of API development, microservices, and secure transactions will be crucial in ensuring scalability, resilience, and compliance with BFSI security standards.

In addition to your core responsibilities, you will collaborate closely with architects, QA, DevOps, and business teams to deliver production-grade code. Your involvement in agile ceremonies will play a pivotal role in the overall success of the projects. Furthermore, you will also be tasked with mentoring junior engineers to contribute to technical excellence within the team.

While your primary focus will be on the must-have skills mentioned above, additional experience with cloud data warehouses like Snowflake or BigQuery, frontend technologies such as Vue.js, React, or Angular, and testing frameworks like JUnit, Cucumber, and Selenium would be advantageous in this role. Your ability to adapt to new technologies and frameworks will be key to staying current and delivering innovative solutions.
Posted 1 week ago
5.0 - 13.0 years
0 Lacs
karnataka
On-site
Dexcom Corporation is a pioneer and global leader in continuous glucose monitoring (CGM), with a vision to forever change how diabetes is managed and to provide actionable insights for better health outcomes. With a history of 25 years in the industry, Dexcom is broadening its focus beyond diabetes to empower individuals to take control of their health. The company is dedicated to developing solutions for serious health conditions and aims to become a leading consumer health technology company.

The software quality team at Dexcom is collaborative and innovative, focusing on ensuring the reliability and performance of CGM systems. The team's mission is to build quality into every stage of the development lifecycle through smart automation, rigorous testing, and a passion for improving lives. They are seeking individuals who are eager to grow their skills while contributing to life-changing technology.

As a member of the team, your responsibilities will include participating in building quality into products by writing automated tests, contributing to software requirements and design specifications, designing, developing, executing, and maintaining automated and manual test scripts, creating verification and validation test plans, traceability matrices, and test reports, as well as recording and tracking issues using the bug tracking system. You will also analyze test failures, collaborate with development teams to investigate root causes, and contribute to the continuous improvement of the release process.

To be successful in this role, you should have 13 years of hands-on experience in software development or software test development using Python or other object-oriented programming languages. Experience with SQL and NoSQL databases, automated test development for API testing, automated testing frameworks like Robot Framework, API testing, microservices, distributed systems in cloud environments, automated UI testing, cloud platforms like Google Cloud or AWS, containerization tools such as Docker and Kubernetes, and familiarity with FDA design control processes in the medical device industry are desired qualifications. Additionally, knowledge of GCP tools like Airflow, Dataflow, and BigQuery, distributed event streaming platforms like Kafka, performance testing, CI/CD experience, and Agile development and test development experience are valued. Effective collaboration across functions, self-starting abilities, and clear communication skills are also essential for success in this role.

Please note that Dexcom does not accept unsolicited resumes or applications from agencies. Staffing and recruiting agencies must be authorized to submit profiles, applications, or resumes on specific requisitions. Dexcom is not responsible for any fees related to unsolicited resumes or applications.
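A minimal sketch of the automated API testing this role involves, using pytest and requests; the base URL, endpoint, and response fields are hypothetical, not Dexcom's actual API.

```python
import pytest
import requests

BASE_URL = "https://api.example.com"  # hypothetical service under test


@pytest.mark.parametrize("patient_id", [1, 2])
def test_readings_endpoint(patient_id):
    # Verify the API returns a well-formed payload for each test patient.
    response = requests.get(f"{BASE_URL}/v1/readings/{patient_id}", timeout=5)
    assert response.status_code == 200
    body = response.json()
    assert "readings" in body
```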
Posted 1 week ago
8.0 - 13.0 years
0 Lacs
hyderabad, telangana
On-site
You are an experienced GCP Data Engineer with 8+ years of expertise in designing and implementing robust, scalable data architectures on Google Cloud Platform. Your role involves defining and leading the implementation of data architecture strategies using GCP services to meet business and technical requirements.

As a visionary GCP Data Architect, you will be responsible for architecting and optimizing scalable data pipelines using Google Cloud Storage, BigQuery, Dataflow, Cloud Composer, Dataproc, and Pub/Sub. You will design solutions for large-scale batch processing and real-time streaming, leveraging tools like Dataproc for distributed data processing. Your responsibilities also include establishing and enforcing data governance, security frameworks, and best practices for data management. You will conduct architectural reviews and performance tuning for GCP-based data solutions, ensuring cost-efficiency and scalability. Collaborating with cross-functional teams, you will translate business needs into technical requirements and deliver innovative data solutions.

The required skills for this role include strong expertise in GCP services such as Google Cloud Storage, BigQuery, Dataflow, Cloud Composer, Dataproc, and Pub/Sub. Proficiency in designing and implementing data processing frameworks for ETL/ELT, batch, and real-time workloads is essential. You should have an in-depth understanding of data modeling, data warehousing, and distributed data processing using tools like Dataproc and Spark. Hands-on experience with Python, SQL, and modern data engineering practices is required. Your knowledge of data governance, security, and compliance best practices on GCP will be crucial in this role. Strong problem-solving, leadership, and communication skills are necessary for guiding teams and engaging stakeholders effectively.
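To make the Pub/Sub streaming responsibility concrete, here is a minimal Python subscriber using a streaming pull; the project and subscription names are placeholders.

```python
from concurrent.futures import TimeoutError

from google.cloud import pubsub_v1

# Hypothetical project and subscription names.
subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path("my-project", "events-sub")


def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    print(f"received: {message.data!r}")
    message.ack()  # acknowledge so Pub/Sub does not redeliver


# subscribe() returns a future that keeps the streaming pull open.
streaming_pull = subscriber.subscribe(subscription_path, callback=callback)
try:
    streaming_pull.result(timeout=30)  # listen for 30 seconds
except TimeoutError:
    streaming_pull.cancel()
```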
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
noida, uttar pradesh
On-site
As a GA4 Analyst at Wildnet Technologies, you will play a crucial role in collaborating with Account Managers to understand client goals, challenges, and Key Performance Indicators (KPIs). Your responsibilities will involve analyzing campaign performance across various digital channels using tools like GA4, Search Console, BigQuery, Looker Studio, and CRM/reporting platforms. You will be tasked with creating clear and actionable monthly performance reports and dashboards for both clients and internal teams.

Identifying opportunities to enhance conversion rates across websites, landing pages, email campaigns, and paid media funnels will be a key part of your role. Leveraging data trends, audience behavior, and attribution modeling, you will contribute to shaping marketing strategy and resource allocation. Attending client meetings virtually to present performance insights, explain outcomes, and guide campaign direction will also be part of your routine. You will collaborate closely with delivery teams specializing in SEO, PPC, Content, Social, etc., to drive performance improvements based on data analysis. Monitoring industry benchmarks and sector-specific trends to enhance campaign relevance and impact will be essential. Additionally, you will work with the head of IT and head of AI to influence internal processes for data-driven campaign planning and continuous improvement.

To excel in this role, you should have proven experience in an analytics and reporting role. Proficiency in GA4, Google Search Console, BigQuery, Looker Studio, and CRM/marketing automation platforms is required, along with a solid understanding of digital channels and their interactions (PPC, SEO, social, email, CRO, etc.) and a commercial mindset focused on ROI, customer journeys, and growth-driving marketing metrics. You should be able to translate complex data into easily understandable insights for Account Managers and clients using visualization tools. Confidence in presenting insights to clients and working effectively as part of a cross-functional team is essential. Familiarity with A/B testing, conversion rate optimization principles, and customer funnel strategy is advantageous, and being organized, proactive, and adept at managing multiple accounts and priorities is key. Strong communication and relationship-building skills are a requirement for this role.

Joining Wildnet Technologies offers you the opportunity to be part of an established industry leader with over 15 years of expertise in digital marketing and IT services. As a Great Place to Work Certified company, we are committed to fostering a flexible, positive, and people-first work culture. You will benefit from a fast-paced environment with continuous training, career advancement, and leadership development opportunities. We prioritize the health and wellness of our employees and offer comprehensive insurance and wellness support for you and your family. Our focus on work-life balance is reflected in flexible working hours, a 5-day work week, and a generous leave policy to support your personal well-being. You will also have exposure to diverse projects with leading global brands across various industries.
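For a concrete taste of GA4-plus-BigQuery analysis, the sketch below counts users per event over GA4's date-sharded BigQuery export tables; the project and dataset names are placeholders.

```python
from google.cloud import bigquery

client = bigquery.Client()

# GA4's BigQuery export lands in date-sharded events_* tables;
# "analytics_123456" is a hypothetical export dataset.
query = """
    SELECT event_name, COUNT(DISTINCT user_pseudo_id) AS users
    FROM `my-project.analytics_123456.events_*`
    WHERE _TABLE_SUFFIX BETWEEN '20240101' AND '20240131'
    GROUP BY event_name
    ORDER BY users DESC
"""

for row in client.query(query).result():
    print(row.event_name, row.users)
```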
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
karnataka
On-site
As a Senior Generative AI Engineer, your primary role will involve conducting original research on generative AI models, focusing on model architecture, training methodologies, fine-tuning techniques, and evaluation strategies. It is essential to maintain a strong publication record in esteemed conferences and journals, demonstrating your valuable contributions to the fields of Natural Language Processing (NLP), Deep Learning (DL), and Machine Learning (ML).

In addition, you will be responsible for designing and experimenting with multimodal generative models that incorporate various data types such as text, images, and other modalities to enhance AI capabilities. Your expertise will be crucial in developing autonomous AI systems that exhibit agentic behavior, enabling them to make independent decisions and adapt to dynamic environments.

Leading the design, development, and implementation of generative AI models and systems will be a key aspect of your role. This involves selecting suitable models, training them on extensive datasets, fine-tuning hyperparameters, and optimizing overall performance. It is imperative to have a deep understanding of the problem domain to ensure effective model development and implementation.

Furthermore, you will be tasked with optimizing generative AI algorithms to enhance their efficiency, scalability, and computational performance. Techniques such as parallelization, distributed computing, and hardware acceleration will be utilized to maximize the capabilities of modern computing architectures. Managing large datasets through data preprocessing and feature engineering to extract critical information for generative AI models will also be a crucial aspect of your responsibilities.

Your role will also involve evaluating the performance of generative AI models using relevant metrics and validation techniques. By conducting experiments, analyzing results, and iteratively refining models, you will work towards achieving desired performance benchmarks. Providing technical leadership and mentorship to junior team members, guiding their development in generative AI, will also be part of your responsibilities.

Documenting research findings, model architectures, methodologies, and experimental results thoroughly is essential. You will prepare technical reports, presentations, and whitepapers to effectively communicate insights and findings to stakeholders. Additionally, staying updated on the latest advancements in generative AI by reading research papers, attending conferences, and engaging with relevant communities is crucial to foster a culture of learning and innovation within the team.

Mandatory technical skills for this role include strong programming abilities in Python and familiarity with frameworks like PyTorch or TensorFlow. In-depth knowledge of deep learning concepts such as CNNs, RNNs, LSTMs, Transformer-based LLMs (BERT, GPT, etc.), and NLP algorithms is required, as is experience with frameworks like LangGraph, CrewAI, or AutoGen for developing, deploying, and evaluating AI agents.

Preferred technical skills include expertise in cloud computing, particularly with the Google/AWS/Azure cloud platforms, and an understanding of the data analytics services offered by these platforms. Hands-on experience with ML platforms like GCP Vertex AI, Azure AI Foundry, or AWS SageMaker is desirable. Strong communication skills, the ability to work independently with minimal supervision, and a proactive approach to escalating when necessary are also key attributes for this role.

If you have a Master's or PhD degree in Computer Science and 6 to 8 years of experience with a strong record of publications in top-tier conferences and journals, this role could be a great fit for you. Preference will be given to research scholars from esteemed institutions like IITs, NITs, and IIITs.
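As a minimal illustration of the PyTorch/Transformers stack named above, the snippet below loads a small open model (gpt2 as a stand-in for the larger LLMs the role works with) and generates a continuation.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# gpt2 is a small, freely available causal LM used here for illustration.
tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

inputs = tokenizer("Generative models can", return_tensors="pt")
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=20)

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```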
Posted 1 week ago