7.0 - 12.0 years
40 - 45 Lacs
Pune
Work from Office
Job Title: Data Platform Engineer - Tech Lead
Location: Pune, India

Role Description
DB Technology is a global team of tech specialists spread across multiple trading hubs and tech centers. We have a strong focus on promoting technical excellence; our engineers work at the forefront of financial services innovation using cutting-edge technologies. The DB Pune location plays a prominent role in our global network of tech centers and is well recognized for its engineering culture and strong drive to innovate. We are committed to building a diverse workforce and to creating excellent opportunities for talented engineers and technologists. Our tech teams and business units use agile ways of working to create the best solutions for the financial markets.

CB Data Services and Data Platform
We are seeking an experienced software engineer with strong leadership skills to join our dynamic tech team. In this role, you will lead a group of engineers working on cutting-edge technologies in Hadoop, Big Data, GCP, Terraform, BigQuery, Dataproc, and data management. You will be responsible for overseeing the development of robust data pipelines, ensuring data quality, and implementing efficient data management solutions. Your leadership will be critical in driving innovation, ensuring high standards in data infrastructure, and mentoring team members. Your responsibilities will include working closely with data engineers, analysts, cross-functional teams, and other stakeholders to ensure that our data platform meets the needs of our organization and supports our data-driven initiatives. Join us in building and scaling our tech solutions, including a hybrid data platform, to unlock new insights and drive business growth. If you are passionate about data engineering, we want to hear from you!

Deutsche Bank's Corporate Bank division is a leading provider of cash management, trade finance and securities finance. We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel. You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on using the latest technologies and practices as part of a relentless focus on business value. You will be someone who sees engineering as a team activity, with a predisposition to open code, open discussion and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support.

What we'll offer you
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance

Your key responsibilities
Technical Leadership:
- Lead a cross-functional team of engineers in the design, development, and implementation of on-prem and cloud-based data solutions.
- Provide hands-on technical guidance and mentorship to team members, fostering a culture of continuous learning and improvement.
- Collaborate with product management and stakeholders to define technical requirements and establish delivery priorities.
Architectural and Design Capabilities:
- Architect and implement scalable, efficient, and reliable data management solutions to support complex data workflows and analytics.
- Evaluate and recommend tools, technologies, and best practices to enhance the data platform.
- Drive the adoption of microservices, containerization, and serverless architectures within the team.
Quality Assurance:
- Establish and enforce best practices in coding, testing, and deployment to maintain high-quality code standards.
- Oversee code reviews and provide constructive feedback to promote code quality and team growth.

Your skills and experience
Technical Skills:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 7+ years of experience in software engineering, with a focus on Big Data and GCP technologies such as Hadoop, PySpark, Terraform, BigQuery, Dataproc and data management.
- Proven experience in leading software engineering teams, with a focus on mentorship, guidance, and team growth.
- Strong expertise in designing and implementing data pipelines, including ETL processes and real-time data processing.
- Hands-on experience with Hadoop ecosystem tools such as HDFS, MapReduce, Hive, Pig, and Spark.
- Hands-on experience with cloud platforms, particularly Google Cloud Platform (GCP) and its data management services (e.g., Terraform, BigQuery, Cloud Dataflow, Cloud Dataproc, Cloud Storage).
- Solid understanding of data quality management and best practices for ensuring data integrity.
- Familiarity with containerization and orchestration tools such as Docker and Kubernetes is a plus.
- Excellent problem-solving skills and the ability to troubleshoot complex systems.
- Strong communication skills and the ability to collaborate with both technical and non-technical stakeholders.
Leadership Abilities:
- Proven experience in leading technical teams, with a track record of delivering complex projects on time and within scope.
- Ability to inspire and motivate team members, promoting a collaborative and innovative work environment.
- Strong problem-solving skills and the ability to make data-driven decisions under pressure.
- Excellent communication and collaboration skills.
- Proactive mindset, attention to detail, and a constant desire to improve and innovate.

How we'll support you

About us and our teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
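For candidates gauging the hands-on bar, here is a minimal PySpark batch-ETL sketch in the spirit of the pipeline work described above. The bucket, dataset, and table names are hypothetical placeholders, not taken from the posting, and writing to BigQuery assumes the spark-bigquery-connector is available (it is preinstalled on Dataproc):

```python
from pyspark.sql import SparkSession, functions as F

# Minimal batch ETL sketch: read raw events, clean, aggregate, write out.
# All paths, table names, and column names are illustrative placeholders.
spark = SparkSession.builder.appName("daily-events-etl").getOrCreate()

raw = spark.read.json("gs://example-raw-bucket/events/2024-01-01/")

cleaned = (
    raw.dropDuplicates(["event_id"])           # basic data-quality step
       .filter(F.col("event_ts").isNotNull())  # drop malformed records
)

daily_counts = (
    cleaned.groupBy(F.to_date("event_ts").alias("event_date"), "event_type")
           .agg(F.count("*").alias("events"))
)

# On Dataproc, the BigQuery connector lets Spark write results to a table.
daily_counts.write.format("bigquery") \
    .option("table", "example_dataset.daily_event_counts") \
    .option("temporaryGcsBucket", "example-temp-bucket") \
    .mode("overwrite") \
    .save()
```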
Posted 3 weeks ago
7.0 - 12.0 years
16 - 20 Lacs
Pune
Work from Office
Job Title: Data Engineer (ETL, Big Data, Hadoop, Spark, GCP), AS
Location: Pune, India

Role Description
The engineer is responsible for developing and delivering elements of engineering solutions to accomplish business goals. Awareness is expected of the important engineering principles of the bank. Root cause analysis skills develop through addressing enhancements and fixes to products; reliability and resiliency are built into solutions through early testing, peer reviews and automating the delivery life cycle. The successful candidate should be able to work independently on medium to large sized projects with strict deadlines, and should be able to work in a cross-application, mixed technical environment while demonstrating a solid hands-on development track record within an agile methodology. The role demands working alongside a geographically dispersed team. The position is required as part of the buildout of the Compliance tech internal development team in India. The overall team will primarily deliver improvements in compliance tech capabilities that are major components of the regular regulatory portfolio, addressing various regulatory commitments and mandated monitors.

What we'll offer you
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance

Your key responsibilities
- Analyzing data sets and designing and coding stable and scalable data ingestion workflows, also integrating them into existing workflows
- Working with team members and stakeholders to clarify requirements and provide the appropriate ETL solution
- Hands-on development for various data sourcing in Hadoop as well as GCP
- Ensuring new code is tested both at unit level and system level; designing, developing and peer reviewing new code and functionality
- Operating as a member of an agile scrum team
- Applying root cause analysis skills to identify bugs and issues behind failures
- Supporting production support and release management teams in their tasks

Your skills and experience
- More than 7 years of coding experience in reputed organizations
- Hands-on experience with Bitbucket and CI/CD pipelines
- Proficient in Hadoop, Python, Spark, SQL, Unix and Hive
- Basic understanding of on-prem and GCP data security
- Hands-on development experience on large ETL/big data systems (GCP being a big plus)
- Hands-on experience with Cloud Build, Artifact Registry, Cloud DNS, Cloud Load Balancing, etc.
- Hands-on experience with Dataflow, Cloud Composer, Cloud Storage, Dataproc, etc.
- Basic understanding of data quality dimensions like consistency, completeness, accuracy, lineage, etc.
- Hands-on business and systems knowledge gained in a regulatory delivery environment
- Banking experience with regulatory and cross-product knowledge
- Passionate about test-driven development

How we'll support you
Posted 3 weeks ago
15.0 - 20.0 years
32 - 40 Lacs
Pune
Work from Office
Job Title: Senior Engineer, VP
Location: Pune, India

Role Description
The engineer is responsible for managing or performing work across multiple areas of the bank's overall IT platform/infrastructure, including analysis, development, and administration. It may also involve taking functional oversight of engineering delivery for specific departments. Work includes:
- Planning and developing entire engineering solutions to accomplish business goals
- Building reliability and resiliency into solutions with appropriate testing and reviewing throughout the delivery lifecycle
- Ensuring maintainability and reusability of engineering solutions
- Ensuring solutions are well architected and can be integrated successfully into the end-to-end business process flow
- Reviewing engineering plans and quality to drive re-use and improve engineering capability
- Participating in industry forums to drive adoption of innovative technologies, tools and solutions in the Bank

Deutsche Bank's Corporate Bank division is a leading provider of cash management, trade finance and securities finance. We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel. You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on using the latest technologies and practices as part of a relentless focus on business value. You will be someone who sees engineering as a team activity, with a predisposition to open code, open discussion and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support.

What we'll offer you
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance

Your key responsibilities
- Act as a hands-on engineering lead involved in analysis, design, design/code reviews, coding and release activities
- Champion engineering best practices and guide/mentor the team to achieve high performance
- Work closely with business stakeholders, the tribe lead, the product owner and the lead architect to successfully deliver the business outcomes
- Acquire functional knowledge of the business capability being digitized/re-engineered
- Demonstrate ownership, inspire others, think innovatively, keep a growth mindset and collaborate for success

Your skills and experience
- Minimum 15 years of IT industry experience in full stack development
- Expert in Java, Spring Boot, NodeJS, ReactJS
- Strong experience in Big Data processing: Apache Spark, Hadoop, BigQuery, Dataproc, Dataflow, etc.
- Strong experience in Kubernetes and the OpenShift container platform
- Experience in data streaming, i.e. Kafka, Pub/Sub, etc.
- Experience of working on public cloud, GCP preferred, AWS or Azure
- Knowledge of various distributed/multi-tiered architecture styles: microservices, data mesh, integration patterns, etc.
- Experience with modern software product delivery practices, processes and tooling, and BizDevOps skills such as CI/CD pipelines using Jenkins, Git Actions, etc.
- Experience in leading teams and mentoring developers

Key skills: Java, Spring Boot, NodeJS, SQL/PLSQL, ReactJS

Advantageous:
- Prior experience in the banking/finance domain
- Experience with hybrid cloud solutions, preferably using GCP
- Experience with product development

How we'll support you

About us and our teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
Posted 3 weeks ago
0.0 - 3.0 years
6 - 8 Lacs
Noida
Work from Office
- 3+ years of experience as an engineer working in a GCP environment with its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner streaming, etc.)
- 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.)
- 1-2+ years of strong experience with SQL (CTEs, window functions, aggregate functions, etc.)
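As a rough illustration of the SQL skills this posting names (CTEs, window functions, aggregates), here is a small sketch using the BigQuery Python client; the project, dataset, table and column names are hypothetical:

```python
from google.cloud import bigquery

# Illustrative query combining a CTE, an aggregate, and a window function.
# All names below are placeholders, not from the posting.
client = bigquery.Client()

query = """
WITH daily AS (                              -- CTE: per-customer daily totals
  SELECT customer_id,
         DATE(order_ts) AS order_date,
         SUM(amount)    AS day_total         -- aggregate function
  FROM `example_project.sales.orders`
  GROUP BY customer_id, order_date
)
SELECT customer_id,
       order_date,
       day_total,
       SUM(day_total) OVER (                 -- window function: running total
         PARTITION BY customer_id
         ORDER BY order_date
       ) AS running_total
FROM daily
ORDER BY customer_id, order_date
"""

for row in client.query(query).result():
    print(row.customer_id, row.order_date, row.running_total)
```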
Posted 3 weeks ago
0.0 - 1.0 years
8 - 10 Lacs
Hyderabad
Work from Office
- Google Cloud Platform: GCS, Dataproc, BigQuery, Dataflow
- Programming languages: Java; scripting languages like Python, Shell Script, SQL
- 5+ years of experience in IT application delivery with proven experience in agile development methodologies
- 1 to 2 years of experience in Google Cloud Platform (GCS, Dataproc, BigQuery, Composer, data processing like Dataflow)
Posted 3 weeks ago
4.0 - 8.0 years
22 - 25 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
- 3+ years of experience as an engineer working in a GCP environment with its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner streaming, etc.)
- 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.)
- 1-2+ years of strong experience with SQL (CTEs, window functions, aggregate functions, etc.)
Posted 3 weeks ago
7.0 - 12.0 years
25 - 27 Lacs
Hyderabad
Work from Office
- 3+ years of experience as an engineer working in a GCP environment with its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner streaming, etc.)
- 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.)
- 1-2+ years of strong experience with SQL (CTEs, window functions, aggregate functions, etc.)
Posted 3 weeks ago
5.0 - 10.0 years
15 - 30 Lacs
Pune, Bengaluru, Delhi / NCR
Hybrid
Mandatory Skills: Apache Beam, BigQuery, Dataflow, Dataproc, Composer, Airflow, PySpark, Python, SQL.
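By way of illustration, a minimal Apache Beam pipeline sketch touching several of the mandatory skills (Beam, Python, Dataflow-ready). The file paths are placeholders; running it on Dataflow would additionally require the usual runner, project, region and temp_location options:

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Minimal Beam word-count sketch: read a text file, count words, write results.
# Paths are placeholders; pass --runner=DataflowRunner (plus project/region
# options) to run the same pipeline on GCP Dataflow instead of locally.
options = PipelineOptions()

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://example-bucket/input.txt")
        | "Split" >> beam.FlatMap(lambda line: line.split())
        | "PairWithOne" >> beam.Map(lambda word: (word, 1))
        | "Count" >> beam.CombinePerKey(sum)
        | "Format" >> beam.MapTuple(lambda word, n: f"{word}: {n}")
        | "Write" >> beam.io.WriteToText("gs://example-bucket/output")
    )
```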
Posted 3 weeks ago
6.0 - 11.0 years
12 - 22 Lacs
Chennai
Hybrid
Greetings from Getronics! We have permanent opportunities for GCP Data Engineers for the Chennai location. Hope you are doing well! This is Jogeshwari from the Getronics Talent Acquisition team. We have multiple opportunities for GCP Data Engineers. Please find below the company profile and job description. If interested, please share your updated resume, recent professional photograph and Aadhaar proof at the earliest to jogeshwari.k@getronics.com.

Company: Getronics (permanent role)
Client: Automobile industry
Experience required: 6+ years in IT and a minimum of 4+ years in GCP data engineering
Location: Chennai

Skills required:
- GCP Data Engineer, Hadoop, Spark/PySpark, Google Cloud Platform (GCP) services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine
- 6+ years of professional experience: data engineering, data product development and software product launches
- 4+ years of cloud data engineering experience building scalable, reliable, and cost-effective production batch and streaming data pipelines using: data warehouses like Google BigQuery; workflow orchestration tools like Airflow; relational database management systems like MySQL, PostgreSQL, and SQL Server; real-time data streaming platforms like Apache Kafka and GCP Pub/Sub

LOOKING FOR IMMEDIATE TO 30 DAYS NOTICE CANDIDATES ONLY.

Regards,
Jogeshwari
Senior Specialist
Posted 3 weeks ago
1.0 - 2.0 years
3 - 6 Lacs
Dhule
Work from Office
- Google Cloud Platform: GCS, Dataproc, BigQuery, Dataflow
- Programming languages: Java; scripting languages like Python, Shell Script, SQL
- 5+ years of experience in IT application delivery with proven experience in agile development methodologies
- 1 to 2 years of experience in Google Cloud Platform (GCS, Dataproc, BigQuery, Composer, data processing like Dataflow)
Posted 3 weeks ago
4.0 - 6.0 years
6 - 16 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
What you'll be doing
We are looking for data engineers who can work with world-class team members to help drive our telecom business to its full potential. We are building data products/assets for the telecom wireless and wireline business, which includes consumer analytics, telecom network performance and service assurance analytics. We are working on cutting-edge technologies like digital twin to build these analytical platforms and provide data support for varied AI/ML implementations. As a data engineer, you will collaborate with business product owners, coaches, industry-renowned data scientists and system architects to develop strategic data solutions from sources that include batch, file and data streams.

As a data engineer with ETL/ELT expertise for our growing data platform and analytics teams, you will understand and enable the required data sets from different sources, both structured and unstructured data, into our data warehouse and data lake, with real-time streaming and/or batch processing, to generate insights and perform analytics for business teams within the company. A minimal streaming sketch follows this list.
- Understanding the business requirements and converting them into technical design
- Working on data ingestion, preparation and transformation
- Developing data streaming applications
- Debugging production failures and identifying the solution
- Working on ETL/ELT development
- Understanding the DevOps process and contributing to DevOps pipelines

What we're looking for...
You're curious about new technologies and the game-changing possibilities they create. You like to stay up to date with the latest trends and apply your technical expertise to solving business problems.

You'll need to have:
- Bachelor's degree or four or more years of work experience
- Experience with data warehouse concepts and the data management life cycle
- Experience with the GCP cloud platform: BigQuery, Cloud Composer, Dataproc (or Hadoop+Spark), Cloud Functions
- Experience in any programming language, preferably Python
- Proficiency in graph data modeling, including experience with graph data models and a graph query language
- Exposure to working on GenAI use cases
- Experience in troubleshooting data issues
- Experience in writing complex SQL and performance tuning
- Experience in DevOps
- Experience in GraphDB and Core Java
- Experience in real-time streaming and lambda architecture
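As a flavor of the streaming-ingestion work described above, here is a minimal Pub/Sub subscriber sketch using the google-cloud-pubsub client; the project and subscription IDs are hypothetical placeholders:

```python
from concurrent.futures import TimeoutError

from google.cloud import pubsub_v1

# Minimal streaming-ingestion sketch: pull messages from a Pub/Sub subscription.
# The project and subscription IDs are placeholders, not from the posting.
subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path("example-project", "events-sub")

def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    # In a real pipeline this is where records would be validated,
    # transformed, and handed to a sink such as BigQuery.
    print(f"Received: {message.data!r}")
    message.ack()

streaming_pull_future = subscriber.subscribe(subscription_path, callback=callback)
print("Listening for messages...")

with subscriber:
    try:
        streaming_pull_future.result(timeout=30)  # short demo window
    except TimeoutError:
        streaming_pull_future.cancel()
        streaming_pull_future.result()  # wait for shutdown to complete
```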
Posted 3 weeks ago
1.0 - 2.0 years
3 - 5 Lacs
Ahmedabad
Work from Office
- Google Cloud Platform: GCS, Dataproc, BigQuery, Dataflow
- Programming languages: Java; scripting languages like Python, Shell Script, SQL
- 5+ years of experience in IT application delivery with proven experience in agile development methodologies
- 1 to 2 years of experience in Google Cloud Platform (GCS, Dataproc, BigQuery, Composer, data processing like Dataflow)
Posted 3 weeks ago
13.0 - 17.0 years
32 - 35 Lacs
Noida, Gurugram
Work from Office
- Google Cloud Platform: GCS, Dataproc, BigQuery, Dataflow
- Programming languages: Java; scripting languages like Python, Shell Script, SQL
- 5+ years of experience in IT application delivery with proven experience in agile development methodologies
- 1 to 2 years of experience in Google Cloud Platform (GCS, Dataproc, BigQuery, Composer, data processing like Dataflow)
Posted 3 weeks ago
3.0 - 5.0 years
10 - 15 Lacs
Bengaluru
Work from Office
- Google Cloud Platform: GCS, Dataproc, BigQuery, Dataflow
- Programming languages: Java; scripting languages like Python, Shell Script, SQL
- 5+ years of experience in IT application delivery with proven experience in agile development methodologies
- 1 to 2 years of experience in Google Cloud Platform (GCS, Dataproc, BigQuery, Composer, data processing like Dataflow)
Posted 3 weeks ago
12.0 - 18.0 years
50 - 90 Lacs
Bengaluru
Hybrid
What you'll be doing:
As a Sr. Manager for the Data Engineering team, you will be managing data platforms and implementing new technologies and tools to further enhance and enable data science/analytics, with a focus on driving scalable data management and governance practices.
- Leading the team of data engineers and solutions architects to deliver solutions to business teams
- Driving the vision with the leadership team for data platform enrichment covering areas like data warehousing/data lake/BI across the portfolio, and defining and executing a plan to achieve that vision
- Building a high-quality data engineering team and continuing to scale it up
- Ensuring the team adheres to standard methodologies on data engineering practices
- Building cross-functional relationships with data scientists, data analysts and business teams to understand data needs and deliver data-for-insight solutions
- Driving the design, building, and launching of new data models and data pipelines
- Driving data quality across all data pipelines and related business areas

Where you'll be working
In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager.

What we're looking for:
You are curious and passionate about data and highly scalable data platforms. People count on you for your expertise in data management in all phases of the software development cycle. You create environments where teams thrive and feel valued, respected and supported. You enjoy the challenge of managing resources and competing priorities in a dynamic, complex and deadline-oriented environment. Building effective working relationships with other managers across the organization comes naturally to you.

You'll need to have:
- Bachelor's degree or four or more years of work experience
- Six or more years of relevant work experience
- Two or more years of experience in leading a team and tracking end-to-end deliverables
- Experience in end-to-end delivery of data platform solutions and working on large-scale data transformation
- Experience working with Google Cloud Platform, BigQuery and Dataproc
- Experience working with Big Data technologies and utilities: Hadoop/Spark/Scala/Kafka/NiFi
- Experience with relational SQL and NoSQL databases
- Experience in working with globally distributed teams
- Good communication and presentation skills
- Knowledge of data governance and data quality
- Experience in building/mentoring a team
- Ability to meet tight deadlines, multi-task, and prioritize workload
Posted 3 weeks ago
15.0 - 20.0 years
4 - 8 Lacs
Mumbai
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Google BigQuery
Good to have skills: Microsoft SQL Server, Google Cloud Data Services
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary:
As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages. Your role will be pivotal in enhancing the overall data infrastructure and ensuring that data is accessible and reliable for decision-making purposes.

Project Role: Analytics and Modeler
Project Role Description: Analyze and model client, market and key performance data. Use analytical tools and techniques to develop business insights and improve decision-making.

Key Responsibilities:
- Dataproc, Pub/Sub, Dataflow, Kafka streaming, Looker, SQL (No FLEX)
- Proven track record of delivering data integration and data warehousing solutions
- Strong SQL and hands-on experience (No FLEX); experience with data integration and migration projects
- Proficient in the BigQuery SQL language (No FLEX)
- Understanding of cloud-native services: bucket storage, GBQ, Cloud Functions, Pub/Sub, Composer, and Kubernetes
- Experience in cloud solutions, mainly data platform services; GCP certifications
- Experience in shell scripting, Python (No FLEX), Oracle, SQL

Technical Experience:
- Expert in Python (No FLEX); strong hands-on skills and strong knowledge of SQL (No FLEX) and Python programming using Pandas and NumPy; deep understanding of various data structures (dictionary, array, list, tree, etc.); experience in pytest and code coverage skills are preferred
- Strong hands-on experience building solutions using cloud-native services: bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, and Kubernetes, etc. (No FLEX)
- Proficiency with tools to automate AZDO CI/CD pipelines, like Control-M, GitHub, JIRA, Confluence
- Open mindset and the ability to quickly adapt to new technologies
- Performance tuning of BigQuery SQL scripts
- GCP certification preferred
- Experience working in an agile environment

Professional Attributes:
- Good communication skills
- Ability to collaborate with different teams and suggest solutions
- Ability to work independently with little supervision or as part of a team
- Good analytical problem-solving skills
- Good team handling skills

Educational Qualification: 15 years of full-time education
Posted 3 weeks ago
5.0 - 10.0 years
20 - 35 Lacs
Pune
Work from Office
Description: Hiring Data Engineer with AWS or GCP Cloud

Role Summary:
The Data Engineer will be responsible for designing, implementing, and maintaining the data infrastructure and pipelines necessary for AI/ML model training and deployment. They will work closely with data scientists and engineers to ensure data is clean, accessible, and efficiently processed.

Required Experience:
- 6-8 years of experience in data engineering, ideally in financial services
- Strong proficiency in SQL, Python, and big data technologies (e.g., Hadoop, Spark)
- Experience with cloud platforms (e.g., AWS, Azure, GCP) and data warehousing solutions
- Familiarity with ETL processes and tools
- Knowledge of data governance, security, and compliance best practices

Job Responsibilities:
- Build and maintain scalable data pipelines for data collection, processing, and analysis
- Ensure data quality and consistency for training and testing AI models
- Collaborate with data scientists and AI engineers to provide the required data for model development
- Optimize data storage and retrieval to support AI-driven applications
- Implement data governance practices to ensure compliance and security

What We Offer:
- Exciting Projects: We focus on industries like High-Tech, communication, media, healthcare, retail and telecom. Our customer list is full of fantastic global brands and leaders who love what we build for them.
- Collaborative Environment: You can expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment, or even abroad in one of our global centers or client facilities!
- Work-Life Balance: GlobalLogic prioritizes work-life balance, which is why we offer flexible work schedules, opportunities to work from home, and paid time off and holidays.
- Professional Development: Our dedicated Learning & Development team regularly organizes communication skills training (GL Vantage, Toastmasters), stress management programs, professional certifications, and technical and soft skill trainings.
- Excellent Benefits: We provide our employees with competitive salaries, family medical insurance, Group Term Life Insurance, Group Personal Accident Insurance, NPS (National Pension Scheme), periodic health awareness programs, extended maternity leave, annual performance bonuses, and referral bonuses.
- Fun Perks: We want you to love where you work, which is why we host sports events and cultural activities, offer food at subsidized rates, and throw corporate parties. Our vibrant offices also include dedicated GL Zones, rooftop decks and a GL Club where you can have coffee or tea with your colleagues over a game of table tennis, and we offer discounts for popular stores and restaurants!
Posted 3 weeks ago
2.0 - 6.0 years
7 - 11 Lacs
Hyderabad
Work from Office
As an Application Developer, you will lead IBM into the future by translating system requirements into the design and development of customized systems in an agile environment. The success of IBM is in your hands as you transform vital business needs into code and drive innovation. Your work will power IBM and its clients globally, collaborating and integrating code into enterprise systems. You will have access to the latest education, tools and technology, and a limitless career path with the world's technology leader. Come to IBM and make a global impact!

IBM's Cloud Services are focused on supporting clients on their cloud journey across any platform to achieve their business goals. This encompasses Cloud Advisory, Architecture, Cloud Native Development, Application Portfolio Migration, Modernization, and Rationalization, as well as Cloud Operations. Cloud Services supports all public/private/hybrid cloud deployments: IBM Bluemix/IBM Cloud/Red Hat/AWS/Azure/Google and client private environments. Cloud Services has the best cloud developer, architect, complex SI, SysOps and delivery talent, delivered through our GEO CIC Factory model.

As a member of our Cloud Practice, you will be responsible for defining and implementing application cloud migration, modernisation and rationalisation solutions for clients across all sectors. You will support mobilisation and help to lead the quality of our programmes and services, liaise with clients and provide consulting services, including:
- Creating cloud migration strategies: defining the delivery architecture, creating the migration plans, designing the orchestration plans and more
- Assisting in creating and executing migration run books
- Evaluating source (physical, virtual and cloud) and target workloads

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- GCP using Pub/Sub, BigQuery, Dataflow, Cloud Workflows/Cloud Scheduler, Cloud Run, Dataproc, Cloud Functions
- Cloud data engineers with GCP PDE certification and working experience with GCP
- Building end-to-end data pipelines in GCP using Pub/Sub, BigQuery, Dataflow, Cloud Workflows/Cloud Scheduler, Cloud Run, Dataproc, Cloud Functions
- Experience in logging and monitoring of GCP services, and experience in Terraform and infrastructure automation
- Expertise in the Python coding language
- Develops, supports and maintains data engineering solutions on the Google Cloud ecosystem

Preferred technical and professional experience:
- Stay updated with the latest trends and advancements in cloud technologies, frameworks, and tools
- Conduct code reviews and provide constructive feedback to maintain code quality and ensure adherence to best practices
- Troubleshoot and debug issues and deploy applications to the cloud platform
Posted 4 weeks ago
2.0 - 6.0 years
7 - 11 Lacs
Bengaluru
Work from Office
As a Software Developer you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys.

Your primary responsibilities include:
- Comprehensive feature development and issue resolution: working on end-to-end feature development and solving challenges faced in the implementation
- Stakeholder collaboration and issue resolution: collaborating with key stakeholders, internal and external, to understand the problems and issues with the product and features, and solving those issues as per defined SLAs
- Continuous learning and technology integration: being eager to learn new technologies and implementing them in feature development

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- GCP using Pub/Sub, BigQuery, Dataflow, Cloud Workflows/Cloud Scheduler, Cloud Run, Dataproc, Cloud Functions
- Cloud data engineers with GCP PDE certification and working experience with GCP
- Building end-to-end data pipelines in GCP using Pub/Sub, BigQuery, Dataflow, Cloud Workflows/Cloud Scheduler, Cloud Run, Dataproc, Cloud Functions
- Experience in logging and monitoring of GCP services, and experience in Terraform and infrastructure automation
- Expertise in the Python coding language
- Develops, supports and maintains data engineering solutions on the Google Cloud ecosystem

Preferred technical and professional experience:
- Stay updated with the latest trends and advancements in cloud technologies, frameworks, and tools
- Conduct code reviews and provide constructive feedback to maintain code quality and ensure adherence to best practices
- Troubleshoot and debug issues and deploy applications to the cloud platform
Posted 4 weeks ago
4.0 - 9.0 years
20 - 35 Lacs
Pune, Gurugram, Bengaluru
Hybrid
Salary: 20 to 35 LPA
Experience: 5 to 8 years
Location: Gurgaon (Hybrid)
Notice: Immediate to 30 days

Roles and Responsibilities
- Design, develop, test, deploy, and maintain large-scale data pipelines using GCP services such as BigQuery, Dataflow, Pub/Sub, Dataproc, and Cloud Storage
- Collaborate with cross-functional teams to identify business requirements and design solutions that meet those needs
- Develop complex SQL queries to extract insights from large datasets stored in Google Cloud SQL databases
- Troubleshoot issues related to data processing workflows and provide timely resolutions

Desired Candidate Profile
- 5-9 years of experience in data engineering, with expertise in GCP and BigQuery data engineering
- Strong understanding of GCP platform administration, including Compute Engine (Dataproc), Kubernetes Engine (K8s), Cloud Storage, Cloud SQL, etc.
- Experience working on big data analytics projects involving ETL processes using tools like Airflow or similar technologies
Posted 1 month ago
3.0 - 8.0 years
14 - 24 Lacs
Chennai
Hybrid
Greetings! We have permanent opportunities for GCP Data Engineers in the Chennai location.

Experience required: 3 years and above
Location: Chennai (Elcot - Sholinganallur)
Work mode: Hybrid
Skills required: GCP Data Engineer, advanced SQL, ETL data pipelines, BigQuery, Dataflow, Bigtable, Data Fusion, Cloud Spanner, Python, Java, JavaScript

If interested, kindly share the below details along with your updated CV to Narmadha.baskar@getronics.com.

Regards,
Narmadha
Getronics Recruitment team
Posted 1 month ago
12.0 - 19.0 years
30 - 40 Lacs
Pune, Chennai, Bengaluru
Work from Office
- Strong understanding of data warehousing and data modeling
- Proficient understanding of distributed computing principles: Hadoop v2, MapReduce, HDFS
- Strong data engineering skills on GCP cloud platforms: Airflow, Cloud Composer, Data Fusion, Dataflow, Dataproc, BigQuery
- Experience with building stream-processing systems using solutions such as Storm or Spark Streaming
- Good knowledge of Big Data querying tools, such as Pig, Hive, and Impala
- Experience with Spark and SQL
- Knowledge of various ETL techniques and frameworks, such as Flume or Apache NiFi
- Experience with various messaging systems, such as Kafka
- Good understanding of Lambda Architecture, along with its advantages and drawbacks
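To ground the stream-processing requirement, here is a minimal Spark Structured Streaming sketch reading from Kafka; the broker address and topic name are placeholders, and the job assumes the spark-sql-kafka connector is on the classpath:

```python
from pyspark.sql import SparkSession, functions as F

# Minimal stream-processing sketch with Spark Structured Streaming and Kafka.
# Broker and topic names are placeholders, not from the posting; running this
# requires the spark-sql-kafka connector package.
spark = SparkSession.builder.appName("kafka-stream-demo").getOrCreate()

events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")
         .option("subscribe", "events")
         .load()
)

# Kafka keys/values arrive as bytes; cast to strings and count messages per key.
counts = (
    events.select(F.col("key").cast("string"), F.col("value").cast("string"))
          .groupBy("key")
          .count()
)

query = (
    counts.writeStream.outputMode("complete")
          .format("console")  # print running counts to the console for the demo
          .start()
)
query.awaitTermination()
```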
Posted 1 month ago
3.0 - 6.0 years
5 - 8 Lacs
Bengaluru
Work from Office
Why this job matters
We are searching for a proficient AI/ML engineer who can help us to extract value from our data. The resource will be responsible for end-to-end processes, including data collection, cleaning and pre-processing, training of the models, and deployment in all production and non-production environments.

What you'll be doing
- Understanding business objectives and developing models that help to achieve them, along with metrics to track their progress
- Analysing the ML algorithms that could be used to solve a given problem and ranking them by their success probability
- Verifying data quality, and/or ensuring it via data cleaning
- Supervising the data acquisition process if more data is needed
- Defining validation strategies
- Defining the pre-processing or feature engineering to be done on given data
- Defining data augmentation pipelines
- Training models and tuning their hyperparameters; analysing the errors of the model and designing strategies to overcome them
- Performing statistical analysis and fine-tuning using test results
- Training and retraining systems when necessary
- Strong knowledge of the model deployment pipeline (MLOps) and of AWS/GCP deployment

Skills Required
- Proven experience (4 or more years) as a Machine Learning Engineer / Artificial Intelligence Engineer or similar role
- Solving business problems using machine learning algorithms, deep learning/neural network algorithms, sequential model development, and time series data modelling
- Experience with computer vision techniques, convolutional neural networks (CNN), generative AI, and large language models (LLMs)
- Experience with deploying models using MLOps pipelines
- Proficiency in handling both structured and unstructured data, including SQL, BigQuery, and Dataproc
- Hands-on experience with API development using frameworks like Flask, Django, and FastAPI
- Automating business and functional operations using AIOps
- Experience with cloud platforms such as GCP and AWS, and tools like Qlik (added advantage)
- Understanding of data structures, data modelling and software architecture
- Expertise in visualizing and manipulating big datasets
- Deep knowledge of math, probability, statistics and algorithms
- Proficiency with Python and basic libraries for machine learning such as scikit-learn and pandas; knowledge of R or Java is a plus
- Proficiency in TensorFlow or Keras and OpenCV is a plus
- Excellent communication skills; team player
- Outstanding analytical and problem-solving skills
- Familiarity with the Linux environment; low to medium familiarity with JIRA, Git, Nexus, Jenkins, etc. is a plus
- Minimum educational qualification: BE/B.Tech or similar degree in a relevant field

The skills you'll need
Troubleshooting, Agile Development, Database Design/Development, Debugging, Programming/Scripting, Microservices/Service Oriented Architecture, Version Control, IT Security, Cloud Computing, Continuous Integration/Continuous Deployment, Automation & Orchestration, Software Testing, Application Development, Algorithm Design, Software Development Lifecycle, Decision Making, Growth Mindset, Inclusive Leadership
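As a small, self-contained illustration of the train/tune/evaluate loop this role describes, here is a scikit-learn sketch on a bundled dataset; the model choice and hyperparameter grid are arbitrary illustrations, not the team's actual stack:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import GridSearchCV, train_test_split

# Minimal train/tune/evaluate sketch: split data, tune hyperparameters with
# cross-validation, then measure performance on the held-out test set.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Small illustrative grid; a real search would cover more hyperparameters.
search = GridSearchCV(
    RandomForestClassifier(random_state=42),
    param_grid={"n_estimators": [100, 300], "max_depth": [None, 10]},
    cv=5,
)
search.fit(X_train, y_train)

print("Best params:", search.best_params_)
print("Test accuracy:", accuracy_score(y_test, search.predict(X_test)))
```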
Posted 1 month ago
5.0 - 10.0 years
18 - 25 Lacs
Sholinganallur
Hybrid
Skills Required: BigQuery, Bigtable, Dataflow, Pub/Sub, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, App Engine, Airflow, Cloud Storage, Cloud Spanner
Skills Preferred: ETL

Experience Required:
- 5+ years of experience in data engineering, with a focus on data warehousing and ETL development (including data modelling, ETL processes, and data warehousing principles)
- 5+ years of SQL development experience
- 3+ years of cloud experience (GCP preferred) with solutions designed and implemented at production scale
- Strong understanding and experience of key GCP services, especially those related to data processing (batch/real time) leveraging Terraform, BigQuery, Dataflow, Data Fusion, Dataproc, Cloud Build, Airflow, and Pub/Sub, alongside storage including Cloud Storage, Bigtable and Cloud Spanner
- Experience developing with microservice architecture from a container orchestration framework
- Designing pipelines and architectures for data processing
- Excellent problem-solving skills, with the ability to design and optimize complex data pipelines
- Strong communication and collaboration skills, capable of working effectively with both technical and non-technical stakeholders as part of a large, global and diverse team
- Strong evidence of self-motivation to continuously develop your own engineering skills and those of the team
- Proven record of working autonomously in areas of high ambiguity, without day-to-day supervisory support
- Evidence of a proactive mindset to problem solving and willingness to take the initiative
- Strong prioritization, coordination, organizational and communication skills, and a proven ability to balance workload and competing demands to meet deadlines

Thanks & Regards,
Varalakshmi V
9019163564
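For a sense of the Composer/Airflow orchestration named above, here is a minimal two-task Airflow DAG sketch (Airflow 2-style imports); the DAG id and bash commands are placeholders standing in for real extract/load jobs:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Minimal Cloud Composer / Airflow DAG sketch: a daily two-step pipeline.
# The DAG id and commands are placeholders, not from the posting.
with DAG(
    dag_id="daily_ingest_demo",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = BashOperator(
        task_id="extract",
        bash_command="echo 'pull data from source'",
    )
    load = BashOperator(
        task_id="load",
        bash_command="echo 'load data into BigQuery'",
    )

    extract >> load  # run extract before load
```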
Posted 1 month ago
3.0 - 5.0 years
5 - 5 Lacs
Kochi, Thiruvananthapuram
Work from Office
Role Proficiency: Independently develops error-free code with high-quality validation of applications; guides other developers and assists Lead 1 - Software Engineering.

Outcomes:
- Understand and provide input to the application/feature/component designs, developing the same in accordance with user stories/requirements
- Code, debug, test, document and communicate product/component/features at development stages
- Select appropriate technical options for development, such as reusing, improving or reconfiguring existing components
- Optimise efficiency, cost and quality by identifying opportunities for automation/process improvements and agile delivery models
- Mentor Developer 1 - Software Engineering and Developer 2 - Software Engineering to effectively perform in their roles
- Identify problem patterns and improve the technical design of the application/system
- Proactively identify issues/defects/flaws in module/requirement implementation
- Assist Lead 1 - Software Engineering on technical design; review activities and begin demonstrating Lead 1 capabilities in making technical decisions

Measures of Outcomes:
- Adherence to engineering process and standards (coding standards)
- Adherence to schedule/timelines
- Adherence to SLAs where applicable
- Number of defects post delivery
- Number of non-compliance issues
- Reduction of reoccurrence of known defects
- Quick turnaround of production bugs
- Meeting the defined productivity standards for the project
- Number of reusable components created
- Completion of applicable technical/domain certifications
- Completion of all mandatory training requirements

Outputs Expected:
- Code: develop code independently for the above
- Configure: implement and monitor the configuration process
- Test: create and review unit test cases, scenarios and execution
- Domain relevance: develop features and components with a good understanding of the business problem being addressed for the client
- Manage project: manage module-level activities
- Manage defects: perform defect RCA and mitigation
- Estimate: estimate time, effort and resource dependence for one's own work and others' work, including modules
- Document: create documentation for one's own work and perform peer review of documentation of others' work
- Manage knowledge: consume and contribute to project-related documents, SharePoint libraries and client universities
- Status reporting: report status of tasks assigned and comply with project-related reporting standards/processes
- Release: execute the release process
- Design: LLD for multiple components
- Mentoring: mentor juniors on the team; set FAST goals and provide feedback on the FAST goals of mentees

Skill Examples:
- Explain and communicate the design/development to the customer
- Perform and evaluate test results against product specifications
- Develop user interfaces, business software components and embedded software components
- Manage and guarantee high levels of cohesion and quality
- Use data models
- Estimate effort and resources required for developing/debugging features/components
- Perform and evaluate tests in the customer or target environment
- Be a team player with good written and verbal communication abilities
- Proactively ask for help and offer help

Knowledge Examples:
- Appropriate software programs/modules
- Technical designing
- Programming languages
- DBMS
- Operating systems and software platforms
- Integrated development environments (IDE)
- Agile methods
- Knowledge of the customer domain and sub-domain where the problem is solved

Additional Comments:
UST is looking for Java senior developers to build end-to-end business solutions and to work with one of the leading financial services organizations in the UK. The ideal candidate must possess a strong background in frontend and backend development technologies, along with excellent written and verbal communication skills and the ability to collaborate effectively with domain experts and technical experts in the team.

Responsibilities: As a Java developer, you will
- Maintain active relationships with the Product Owner to understand business requirements, lead requirement gathering meetings and review designs with the product owner
- Own backlog items and coordinate with other team members to develop the features planned for each sprint
- Perform technical design reviews and code reviews
- Mentor, lead and guide the team on technical skills
- Be responsible for prototyping, developing, and troubleshooting software in the user interface or service layers
- Perform peer reviews on source code to ensure reuse, scalability and the use of best practices
- Participate in collaborative technical discussions that focus on software user experience, design, architecture, and development
- Perform demonstrations for client stakeholders on project features and sub-features, utilizing the latest frontend and backend development technologies

Requirements:
- 5+ years of experience in Java/JEE development
- Skills in developing applications using multi-tier architecture
- 2+ years of experience in GCP service development is preferred
- Skills in developing applications in GCP are preferred
- Should be an expert in Cloud Composer, Dataflow, Dataproc, Cloud Pub/Sub, and DAG creation
- Python scripting knowledge is preferred
- Apache Beam knowledge is mandatory
- Java/JEE, Spring, Spring Boot, REST/SOAP web services, Hibernate, SQL, Tomcat, application servers (WebSphere), SONAR, Agile, AJAX, Jenkins
- Skills in UML, application designing/architecture, and design patterns
- Skills in unit testing applications using JUnit or similar technologies
- Capability to support QA teams with test plans, root cause analysis and defect fixing
- Strong experience in responsive design and cross-browser web applications
- Strong knowledge of web service models and of creating and working with APIs
- Experience with cloud services, specifically on Google Cloud
- Strong exposure to Agile and Scaled Agile based development models
- Familiarity with interfaces such as REST web services, Swagger profiles and JSON payloads
- Familiarity with tools/utilities such as Bitbucket, Jira and Confluence

Required Skills: Java, Spring, Spring Boot, Microservices
Posted 1 month ago