4 - 9 years
14 - 19 Lacs
Pune
Work from Office
About The Role: We are looking for a passionate and self-motivated Technology Leader to join our team in the Accounting domain. As part of a diverse, multi-disciplinary global team, you will collaborate with other disciplines to shape technology strategy, drive engineering excellence and deliver business outcomes.
What we'll offer you: As part of our flexible scheme, here are just some of the benefits that you'll enjoy: * Best in class leave policy * Gender neutral parental leaves * 100% reimbursement under childcare assistance benefit (gender neutral) * Sponsorship for industry-relevant certifications and education * Employee Assistance Program for you and your family members * Comprehensive hospitalization insurance for you and your dependents * Accident and term life insurance * Complimentary health screening for 35 yrs. and above
This role is responsible for the design and implementation of high-quality technology solutions. The candidate should have demonstrated technical expertise and excellent problem-solving skills. The candidate is expected to: be a hands-on engineering lead involved in analysis, design, design/code reviews, coding and release activities; champion engineering best practices and guide/mentor the team to achieve high performance; work closely with business stakeholders, the Tribe Lead, Product Owner and Lead Architect to successfully deliver business outcomes; acquire functional knowledge of the business capability being digitized/re-engineered; demonstrate ownership, inspire others, think innovatively, maintain a growth mindset and collaborate for success; focus on upskilling people, team building and career development; and keep up to date with industry trends and developments.
Your Skills & Experience: Minimum 15 years of IT industry experience in full-stack development. Expert in Java, Spring Boot, NodeJS, SQL/PLSQL, ReactJS. Strong experience in big data processing: Apache Spark, Hadoop, BigQuery, Dataproc, Dataflow etc. Strong experience with Kubernetes and the OpenShift container platform. Experience with databases (Oracle, PostgreSQL, MongoDB, Redis/Hazelcast); should understand data modeling, normalization, and performance optimization. Experience with message queues (RabbitMQ/IBM MQ, JMS) and data streaming, e.g. Kafka, Pub/Sub. Experience working on public cloud (GCP preferred; AWS or Azure). Knowledge of various distributed/multi-tiered architecture styles: microservices, data mesh, integration patterns etc. Experience with modern software product delivery practices, processes and tooling, and BizDevOps skills such as CI/CD pipelines using Jenkins, GitHub Actions etc. Experience designing solutions based on DDD and implementing Clean/Hexagonal Architecture for efficient systems that can handle large-scale operation. Experience leading teams and mentoring developers. Focus on quality: experience with TDD, BDD, stress and contract tests. Proficient in working with APIs (Application Programming Interfaces) and data formats like JSON, XML, YAML, Parquet etc.
Advantageous: * Prior experience in the Banking/Finance domain * Experience with hybrid cloud solutions, preferably using GCP * Experience with product development
How we'll support you: * Training and development to help you excel in your career * Coaching and support from experts in your team * A culture of continuous learning to aid progression * A range of flexible benefits that you can tailor to suit your needs
About us and our teams: Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
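For illustration of the messaging and streaming stack this listing names (Kafka, Pub/Sub), here is a minimal sketch of publishing an event to a Google Cloud Pub/Sub topic in Python; the project, topic, and attribute names are hypothetical, not taken from the posting.

```python
from google.cloud import pubsub_v1  # pip install google-cloud-pubsub

publisher = pubsub_v1.PublisherClient()
# Hypothetical project and topic names, for illustration only.
topic_path = publisher.topic_path("my-project", "booking-events")

# Publish a JSON payload; keyword arguments become message attributes.
future = publisher.publish(
    topic_path,
    b'{"trade_id": 42, "status": "BOOKED"}',
    source="booking-service",
)
print(f"Published message {future.result()}")  # blocks until acknowledged
```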
Posted 4 months ago
11 - 21 years
25 - 40 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Warm greetings from SP Staffing Services Private Limited! We have an urgent opening with our CMMI Level 5 client for the below position. Please send your updated profile if you are interested.
Relevant Experience: 3 - 20 Yrs
Location: Pan India
Job Description:
Skills: GCP, BigQuery, Cloud Composer, Cloud Data Fusion, Python, SQL
- 5-20 years of overall experience, mainly in the data engineering space
- 2+ years of hands-on experience in GCP cloud data implementation
- Experience working in client-facing roles in a technical capacity as an Architect; must have implementation experience on a GCP-based cloud data project/program as a solution architect
- Proficiency in using the Google Cloud Architecture Framework in a data context
- Expert knowledge and experience of the core GCP data stack, including BigQuery, Dataproc, Dataflow, Cloud Composer etc.
- Exposure to the overall Google tech stack of Looker/Vertex AI/Dataplex etc.
- Expert-level knowledge of Spark; extensive hands-on experience working with data using SQL and Python
- Strong experience and understanding of very large-scale data architecture, solutioning, and operationalization of data warehouses, data lakes, and analytics platforms (both cloud and on-premise)
- Excellent communication skills with the ability to clearly present ideas, concepts, and solutions
If interested, please forward your updated resume to sankarspstaffings@gmail.com / Sankar@spstaffing.in or you can reach me @ 8939853050.
With Regards,
Sankar G
Sr. Executive - IT Recruitment
Posted 4 months ago
5 - 10 years
9 - 13 Lacs
Chennai
Work from Office
Overview: GCP services (Pub/Sub, BigQuery, Airflow, Dataproc, Cloud Composer, GCS) + Teradata
Responsibilities: GCP services (Pub/Sub, BigQuery, Airflow, Dataproc, Cloud Composer, GCS) + Teradata
Requirements: GCP services (Pub/Sub, BigQuery, Airflow, Dataproc, Cloud Composer, GCS) + Teradata
Posted 4 months ago
2 - 6 years
7 - 11 Lacs
Hyderabad
Work from Office
As a Software Developer you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys. Your primary responsibilities include:
- Comprehensive Feature Development and Issue Resolution: Working on end-to-end feature development and solving challenges faced in the implementation.
- Stakeholder Collaboration and Issue Resolution: Collaborate with key stakeholders, internal and external, to understand the problems and issues with the product and features, and solve them as per the defined SLAs.
- Continuous Learning and Technology Integration: Being eager to learn new technologies and implementing the same in feature development.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- GCP data engineering using Pub/Sub, BigQuery, Dataflow, Cloud Workflows/Cloud Scheduler, Cloud Run, Dataproc and Cloud Functions; cloud data engineers with GCP PDE certification and working experience with GCP
- Building end-to-end data pipelines in GCP using Pub/Sub, BigQuery, Dataflow, Cloud Workflows/Cloud Scheduler, Cloud Run, Dataproc and Cloud Functions (a minimal pipeline sketch follows below)
- Experience in logging and monitoring of GCP services, and experience in Terraform and infrastructure automation
- Expertise in the Python coding language
- Develops, supports, and maintains data engineering solutions on the Google Cloud ecosystem
Preferred technical and professional experience:
- Stay updated with the latest trends and advancements in cloud technologies, frameworks, and tools
- Conduct code reviews and provide constructive feedback to maintain code quality and ensure adherence to best practices
- Troubleshoot and debug issues, and deploy applications to the cloud platform
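The pipeline-building expertise above can be made concrete with a minimal sketch: an Apache Beam pipeline (the programming model behind Dataflow) that reads from Pub/Sub and appends to BigQuery. All project, topic, table, and schema names are hypothetical.

```python
import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def run():
    # Runner, project, and region come from command-line flags, e.g.
    # --runner=DataflowRunner --project=my-project --region=us-central1
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/my-project/topics/events")  # hypothetical topic
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",  # hypothetical table
                schema="event_id:STRING,event_ts:TIMESTAMP,payload:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
        )

if __name__ == "__main__":
    run()
```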
Posted 4 months ago
3 - 6 years
9 - 13 Lacs
Bengaluru
Work from Office
Locations: Tower 02, Manyata Embassy Business Park, Racenahali & Nagawara Villages, Outer Ring Rd, Bangalore 540065. Time type: Full time. Posted on: Posted 5 Days Ago. Job requisition ID: R0000388711
About us: As a Fortune 50 company with more than 400,000 team members worldwide, Target is an iconic brand and one of America's leading retailers. Joining Target means promoting a culture of mutual care and respect and striving to make the most meaningful and positive impact. Becoming a Target team member means joining a community that values different voices and lifts each other up. Here, we believe your unique perspective is important, and you'll build relationships by being authentic and respectful.
Overview about TII: At Target, we have a timeless purpose and a proven strategy. And that hasn't happened by accident. Some of the best minds from different backgrounds come together at Target to redefine retail in an inclusive learning environment that values people and delivers world-class outcomes. That winning formula is especially apparent in Bengaluru, where Target in India operates as a fully integrated part of Target's global team and has more than 4,000 team members supporting the company's global strategy and operations.
Team Overview: Every time a guest enters a Target store or browses Target.com or the app, they experience the impact of Target's investments in technology and innovation. We're the technologists behind one of the most loved retail brands, delivering joy to millions of our guests, team members, and communities. Join our global in-house technology team of more than 5,000 engineers, data scientists, architects and product managers striving to make Target the most convenient, safe and joyful place to shop. We use agile practices and leverage open-source software to adapt and build best-in-class technology for our team members and guests, and we do so with a focus on diversity and inclusion, experimentation and continuous learning. At Target, we are gearing up for exponential growth and continuously expanding our guest experience. To support this expansion, Data Engineering is building robust warehouses and enhancing existing datasets to meet business needs across the enterprise. We are looking for talented individuals who are passionate about innovative technology and data warehousing and are eager to contribute to data engineering.
Position Overview:
- Assess client needs and convert business requirements into a business intelligence (BI) solutions roadmap relating to complex issues involving long-term or multi-work streams
- Analyze technical issues and questions, identifying data needs and delivery mechanisms
- Implement data structures using best practices in data modeling, ETL/ELT processes, Spark, Scala, SQL, database, and OLAP technologies
- Manage the overall development cycle, driving best practices and ensuring development of high-quality code for common assets and framework components
- Provide technical guidance to, and heavily contribute to, a team of high-caliber Data Engineers by developing test-driven solutions and BI applications that can be deployed quickly and in an automated fashion
- Manage and execute against agile plans and set deadlines based on client, business, and technical requirements
- Drive resolution of technology roadblocks including code, infrastructure, build, deployment, and operations
- Ensure all code adheres to development and security standards
About you:
- 4-year degree or equivalent experience
- 5+ years of software development experience, preferably in data engineering/Hadoop development (Hive, Spark etc.)
- Hands-on experience in object-oriented or functional programming such as Scala / Java / Python
- Knowledge of or experience with a variety of database technologies (Postgres, Cassandra, SQL Server)
- Knowledge of designing data integration using API and streaming technologies (Kafka) as well as ETL and other data integration patterns (a streaming sketch follows below)
- Experience with cloud platforms like Google Cloud, AWS, or Azure; hands-on experience with BigQuery will be an added advantage
- Good understanding of distributed storage (HDFS, Google Cloud Storage, Amazon S3) and processing (Spark, Google Dataproc, Amazon EMR or Databricks)
- Experience with a CI/CD toolchain (Drone, Jenkins, Vela, Kubernetes) a plus
- Familiarity with data warehousing concepts and technologies; maintains technical knowledge within areas of expertise
- Constant learner and team player who enjoys solving tech challenges with a global team
- Hands-on experience in building complex data pipelines and flow optimizations
- Able to understand the data, draw insights, make recommendations, and identify any data quality issues upfront
- Experience with test-driven development and software test automation
- Follows best coding practices and engineering guidelines as prescribed
- Strong written and verbal communication skills, with the ability to present complex technical information in a clear and concise manner to a variety of audiences
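As referenced in the list above, here is a minimal sketch of the Kafka-plus-Spark streaming integration the posting names, using Spark Structured Streaming in Python; the broker, topic, and field names are hypothetical, and the job assumes it is submitted with the spark-sql-kafka package on the classpath.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType

# Submit with: spark-submit --packages org.apache.spark:spark-sql-kafka-0-10_2.12:<spark-version> ...
spark = SparkSession.builder.appName("orders-stream").getOrCreate()

schema = StructType([
    StructField("order_id", StringType()),
    StructField("status", StringType()),
])

# Read a Kafka topic as an unbounded stream; broker and topic are hypothetical.
orders = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker-1:9092")
    .option("subscribe", "orders")
    .load()
    .select(from_json(col("value").cast("string"), schema).alias("o"))
    .select("o.*")
)

# Write parsed records to the console for inspection; a real job would
# target a sink such as GCS or BigQuery.
query = orders.writeStream.format("console").outputMode("append").start()
query.awaitTermination()
```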
Posted 4 months ago
5 - 10 years
9 - 19 Lacs
Chennai, Bengaluru, Mumbai (All Areas)
Hybrid
Google BigQuery. Location: Pan India.
Project Role Description: Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work.
Key Responsibilities: Analyze and model client market and key performance data. Use analytical tools and techniques to develop business insights and improve decision-making. 1: Dataproc, Pub/Sub, Dataflow, Kafka Streaming, Looker, SQL (No FLEX). 2: Proven track record of delivering data integration and data warehousing solutions. 3: Strong SQL and hands-on proficiency in the BigQuery SQL language; experience in Shell Scripting and Python (No FLEX). 4: Experience with data integration and migration projects; Oracle SQL.
Technical Experience: Google BigQuery. 1: Expert in Python (No FLEX); strong hands-on knowledge of SQL (No FLEX); Python programming using Pandas and NumPy; deep understanding of data structures (dictionary, array, list, tree etc.); experience with pytest and code coverage. 2: Experience building solutions using cloud-native services: bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer and Kubernetes (No FLEX). 3: Proficiency with tools to automate AZDO CI/CD pipelines such as Control-M, GitHub, JIRA, Confluence.
Professional Attributes:
Posted 4 months ago
8 - 13 years
12 - 20 Lacs
Bengaluru
Hybrid
Project Role: Cloud Platform Engineer
Project Role Description: Designs, builds, tests, and deploys cloud application solutions that integrate cloud and non-cloud infrastructure. Can deploy infrastructure and platform environments, and creates a proof of architecture to test architecture viability, security and performance.
Must have skills: Google Cloud Platform Architecture
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: Any bachelor's degree
Summary: As a Cloud Platform Engineer, you will be responsible for designing, building, testing, and deploying cloud application solutions that integrate cloud and non-cloud infrastructure. You will deploy infrastructure and platform environments and create proofs of architecture to test architecture viability, security, and performance.
Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Lead the implementation of cloud solutions
- Optimize cloud infrastructure for performance and cost-efficiency
- Troubleshoot and resolve technical issues
Professional & Technical Skills:
- Must-have skills: Proficiency in Google Cloud Platform Architecture
- Strong understanding of cloud architecture principles
- Experience with DevOps practices
- Experience with Google Cloud SQL
- Hands-on experience in cloud deployment and management
- Knowledge of security best practices in cloud environments
Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Google Cloud Platform Architecture
- This position is based at our Bengaluru office
- A bachelor's degree is required
Posted 4 months ago
5.0 - 7.0 years
13 - 17 Lacs
Hyderabad
Work from Office
Skilled in multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer etc. Must have Python and SQL work experience; proactive, collaborative and able to respond to critical situations. Ability to analyse data for functional business requirements and interface directly with customers.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise: 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform. Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer. You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies. Ambitious individual who can work under their own direction towards agreed targets/goals and with a creative approach to work.
Preferred technical and professional experience: Proactive individual with an ability to manage change and proven time management. Proven interpersonal skills while contributing to team effort by accomplishing related results as needed. Up-to-date technical knowledge gained by attending educational workshops and reviewing publications.
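As one concrete illustration of the BigQuery analyst work described above, here is a minimal sketch using the google-cloud-bigquery Python client; the project, dataset, and table names are hypothetical.

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

client = bigquery.Client(project="my-project")  # hypothetical project

# Aggregate rows from a hypothetical events table.
sql = """
    SELECT status, COUNT(*) AS n
    FROM `my-project.ops.events`
    GROUP BY status
    ORDER BY n DESC
"""
for row in client.query(sql).result():  # runs the job and waits for rows
    print(f"{row.status}: {row.n}")
```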
Posted Date not available
3.0 - 5.0 years
5 - 15 Lacs
Hyderabad
Work from Office
About the Role: We are looking for a skilled GCP Data Engineer to design, build, and optimize scalable data pipelines and platforms on Google Cloud. The ideal candidate will have hands-on experience with BigQuery, Dataflow, Composer, and Cloud Storage, along with strong SQL and programming skills.
Key Responsibilities:
- Design, build, and maintain ETL/ELT pipelines on GCP
- Develop scalable data models using BigQuery and optimize query performance
- Orchestrate workflows using Cloud Composer (Airflow), as sketched below
- Work with both structured and unstructured data from diverse sources
- Implement data quality checks, monitoring, and governance frameworks
- Collaborate with Data Scientists, Analysts, and Business teams to deliver reliable datasets
- Ensure data security, compliance, and cost optimization on GCP
- Debug, monitor, and improve existing pipelines for reliability and efficiency
Required Skills & Experience:
- Strong experience in GCP services: BigQuery, Dataflow, Pub/Sub, Cloud Storage, Cloud Composer
- Expertise in SQL (BigQuery SQL / Presto SQL) and performance tuning
- Hands-on experience in Python/Java/Scala for data processing
- Experience with workflow orchestration tools (Airflow, Composer, or similar)
- Familiarity with CI/CD pipelines, GitHub, and deployments
- Knowledge of data warehouse design, dimensional modeling, and best practices
- Strong problem-solving and analytical skills
Nice-to-Have Skills:
- Experience with other cloud platforms (AWS/Azure) is a plus
- Exposure to Machine Learning pipelines on GCP (Vertex AI)
- Knowledge of Terraform / Infrastructure as Code
- Understanding of real-time streaming solutions (Kafka, Pub/Sub)
Education: Bachelor's/Master's degree in Computer Science, Engineering, or a related field.
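As referenced in the responsibilities above, here is a minimal sketch of a Cloud Composer (Airflow) DAG that runs one BigQuery transformation; the DAG id, schedule, SQL, and table names are hypothetical.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_sales_load",   # hypothetical pipeline name
    schedule="@daily",           # Airflow 2.4+ scheduling argument
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    # Run a SQL transformation in BigQuery as one task of the pipeline.
    aggregate = BigQueryInsertJobOperator(
        task_id="aggregate_sales",
        configuration={
            "query": {
                "query": (
                    "CREATE OR REPLACE TABLE `my-project.mart.daily_sales` AS "
                    "SELECT sale_date, SUM(amount) AS total "
                    "FROM `my-project.raw.sales` GROUP BY sale_date"
                ),
                "useLegacySql": False,
            }
        },
    )
```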
Posted Date not available
12.0 - 15.0 years
0 - 16 Lacs
Noida
Work from Office
Roles and Responsibilities:
- Design, develop, and maintain large-scale data pipelines using Airflow on Google Cloud Platform (GCP) for clients (a minimal DAG sketch follows below)
- Collaborate with cross-functional teams to identify business requirements and design solutions that meet client needs
- Develop complex workflows on Dataproc clusters to process massive datasets stored in BigQuery
- Ensure high availability, scalability, and security of GCP infrastructure by monitoring performance metrics and implementing improvements
Job Requirements:
- 12-15 years of experience in the IT services and consulting industry, with expertise in cloud orchestration tools like Airflow
- Strong understanding of Google Cloud Platform (GCP) services including Dataflow, Dataproc, BigQuery, Cloud Storage etc.
- Experience working with big data technologies such as Hadoop ecosystem components like Spark Core, Hive, PySpark etc.
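As a sketch of the Airflow-plus-Dataproc workflow described above, the following DAG submits a PySpark job to an existing Dataproc cluster; the project, cluster, bucket, and region are hypothetical.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.dataproc import (
    DataprocSubmitJobOperator,
)

# Dataproc job spec; all names below are hypothetical placeholders.
PYSPARK_JOB = {
    "reference": {"project_id": "my-project"},
    "placement": {"cluster_name": "etl-cluster"},
    "pyspark_job": {"main_python_file_uri": "gs://my-bucket/jobs/transform.py"},
}

with DAG(
    dag_id="dataproc_transform",
    schedule="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    submit = DataprocSubmitJobOperator(
        task_id="run_pyspark_transform",
        job=PYSPARK_JOB,
        region="us-central1",
        project_id="my-project",
    )
```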
Posted Date not available
2.0 - 6.0 years
7 - 11 Lacs
Bengaluru
Work from Office
As a Software Developer you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys. Your primary responsibilities include:
- Comprehensive Feature Development and Issue Resolution: Working on end-to-end feature development and solving challenges faced in the implementation.
- Stakeholder Collaboration and Issue Resolution: Collaborate with key stakeholders, internal and external, to understand the problems and issues with the product and features, and solve them as per the defined SLAs.
- Continuous Learning and Technology Integration: Being eager to learn new technologies and implementing the same in feature development.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- GCP data engineering using Pub/Sub, BigQuery, Dataflow, Cloud Workflows/Cloud Scheduler, Cloud Run, Dataproc and Cloud Functions; cloud data engineers with GCP PDE certification and working experience with GCP
- Building end-to-end data pipelines in GCP using Pub/Sub, BigQuery, Dataflow, Cloud Workflows/Cloud Scheduler, Cloud Run, Dataproc and Cloud Functions (a minimal Cloud Function sketch follows below)
- Experience in logging and monitoring of GCP services, and experience in Terraform and infrastructure automation
- Expertise in the Python coding language
- Develops, supports, and maintains data engineering solutions on the Google Cloud ecosystem
Preferred technical and professional experience:
- Stay updated with the latest trends and advancements in cloud technologies, frameworks, and tools
- Conduct code reviews and provide constructive feedback to maintain code quality and ensure adherence to best practices
- Troubleshoot and debug issues, and deploy applications to the cloud platform
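To make the serverless pieces of this stack concrete, here is a minimal sketch of a Pub/Sub-triggered Cloud Function written with the Functions Framework for Python; the handler name is hypothetical.

```python
import base64

import functions_framework  # pip install functions-framework

@functions_framework.cloud_event
def handle_event(cloud_event):
    """Triggered by a Pub/Sub message; decodes and logs the payload."""
    # Pub/Sub delivers the message body base64-encoded inside the event data.
    payload = base64.b64decode(
        cloud_event.data["message"]["data"]).decode("utf-8")
    print(f"Received: {payload}")
```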
Posted Date not available
6.0 - 8.0 years
40 - 45 Lacs
Pune
Work from Office
Job Title: Data Platform Engineer - Tech Lead
Location: Pune, India
Role Description: DB Technology is a global team of tech specialists, spread across multiple trading hubs and tech centers. We have a strong focus on promoting technical excellence; our engineers work at the forefront of financial services innovation using cutting-edge technologies. The DB Pune location plays a prominent role in our global network of tech centers and is well recognized for its engineering culture and strong drive to innovate. We are committed to building a diverse workforce and to creating excellent opportunities for talented engineers and technologists. Our tech teams and business units use agile ways of working to create the best solutions for the financial markets.
CB Data Services and Data Platform: We are seeking an experienced Software Engineer with strong leadership skills to join our dynamic tech team. In this role, you will lead a group of engineers working on cutting-edge technologies in Hadoop, Big Data, GCP, Terraform, BigQuery, Dataproc and data management. You will be responsible for overseeing the development of robust data pipelines, ensuring data quality, and implementing efficient data management solutions. Your leadership will be critical in driving innovation, ensuring high standards in data infrastructure, and mentoring team members. Your responsibilities will include working closely with data engineers, analysts, cross-functional teams, and other stakeholders to ensure that our data platform meets the needs of our organization and supports our data-driven initiatives. Join us in building and scaling our tech solutions, including a hybrid data platform, to unlock new insights and drive business growth. If you are passionate about data engineering, we want to hear from you!
Deutsche Bank's Corporate Bank division is a leading provider of cash management, trade finance and securities finance. We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel. You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on using the latest technologies and practices, as part of a relentless focus on business value. You will be someone who sees engineering as a team activity, with a predisposition to open code, open discussion and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support.
What we'll offer you: As part of our flexible scheme, here are just some of the benefits that you'll enjoy: Best in class leave policy; Gender neutral parental leaves; 100% reimbursement under childcare assistance benefit (gender neutral); Sponsorship for industry-relevant certifications and education; Employee Assistance Program for you and your family members; Comprehensive hospitalization insurance for you and your dependents; Accident and term life insurance; Complimentary health screening for 35 yrs. and above.
Your key responsibilities:
Technical Leadership: Lead a cross-functional team of engineers in the design, development, and implementation of on-prem and cloud-based data solutions. Provide hands-on technical guidance and mentorship to team members, fostering a culture of continuous learning and improvement. Collaborate with product management and stakeholders to define technical requirements and establish delivery priorities.
Architectural and Design Capabilities: Architect and implement scalable, efficient, and reliable data management solutions to support complex data workflows and analytics. Evaluate and recommend tools, technologies, and best practices to enhance the data platform. Drive the adoption of microservices, containerization, and serverless architectures within the team.
Quality Assurance: Establish and enforce best practices in coding, testing, and deployment to maintain high-quality code standards. Oversee code reviews and provide constructive feedback to promote code quality and team growth.
Your skills and experience:
Technical Skills: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. 7+ years of experience in software engineering, with a focus on Big Data and GCP technologies such as Hadoop, PySpark, Terraform, BigQuery, Dataproc and data management. Proven experience in leading software engineering teams, with a focus on mentorship, guidance, and team growth. Strong expertise in designing and implementing data pipelines, including ETL processes and real-time data processing. Hands-on experience with Hadoop ecosystem tools such as HDFS, MapReduce, Hive, Pig, and Spark. Hands-on experience with a cloud platform, particularly Google Cloud Platform (GCP), and its data management services (e.g., Terraform, BigQuery, Cloud Dataflow, Cloud Dataproc, Cloud Storage). Solid understanding of data quality management and best practices for ensuring data integrity. Familiarity with containerization and orchestration tools such as Docker and Kubernetes is a plus. Excellent problem-solving skills and the ability to troubleshoot complex systems. Strong communication skills and the ability to collaborate with both technical and non-technical stakeholders.
Leadership Abilities: Proven experience in leading technical teams, with a track record of delivering complex projects on time and within scope. Ability to inspire and motivate team members, promoting a collaborative and innovative work environment. Strong problem-solving skills and the ability to make data-driven decisions under pressure. Excellent communication and collaboration skills. Proactive mindset, attention to detail, and a constant desire to improve and innovate.
How we'll support you: Training and development to help you excel in your career; coaching and support from experts in your team; a culture of continuous learning to aid progression; a range of flexible benefits that you can tailor to suit your needs.
About us and our teams: Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
Posted Date not available
4.0 - 9.0 years
16 - 20 Lacs
Pune
Work from Office
About The Role:
Job Title: Data Engineer (ETL, Big Data, Hadoop, Spark, GCP)
Location: Pune, India
Corporate Title: Assistant Vice President
Role Description: A senior engineer is responsible for developing and delivering elements of engineering solutions to accomplish business goals. Awareness is expected of the important engineering principles of the bank. Root cause analysis skills develop through addressing enhancements and fixes to products; build reliability and resiliency into solutions through early testing, peer reviews and automating the delivery life cycle. The successful candidate should be able to work independently on medium to large sized projects with strict deadlines, should be able to work in a cross-application mixed technical environment, and must demonstrate a solid hands-on development track record while working in an agile methodology. The role demands working alongside a geographically dispersed team. The position is required as part of the buildout of the Compliance tech internal development team in India. The overall team will primarily deliver improvements in compliance tech capabilities that are major components of the regular regulatory portfolio, addressing various regulatory commitments, including commitments to mandated monitors.
What we'll offer you: As part of our flexible scheme, here are just some of the benefits that you'll enjoy: Best in class leave policy; Gender neutral parental leaves; 100% reimbursement under childcare assistance benefit (gender neutral); Sponsorship for industry-relevant certifications and education; Employee Assistance Program for you and your family members; Comprehensive hospitalization insurance for you and your dependents; Accident and term life insurance; Complimentary health screening for 35 yrs. and above.
Your key responsibilities:
- Analyzing data sets and designing and coding stable and scalable data ingestion workflows, also integrating them into existing workflows
- Working with team members and stakeholders to clarify requirements and provide the appropriate ETL solution
- Work as a senior developer developing analytics algorithms on top of ingested data
- Work as a senior developer for various data sourcing in Hadoop and GCP
- Own unit testing, UAT deployment, end-user sign-off and prod go-live
- Ensure new code is tested both at unit level and system level; design, develop and peer review new code and functionality
- Operate as a member of an agile scrum team
- Root cause analysis skills to identify bugs and issues for failures
- Support prod support and release management teams in their tasks
Your skills and experience:
- More than 10 years of coding experience in reputed organizations
- Hands-on experience with Bitbucket and CI/CD pipelines
- Proficient in Hadoop, Python, Spark, SQL, Unix and Hive
- Basic understanding of on-prem and GCP data security
- Hands-on development experience on large ETL/big data systems; GCP being a big plus
- Hands-on experience with Cloud Build, Artifact Registry, Cloud DNS, Cloud Load Balancing etc.
- Hands-on experience with Dataflow, Cloud Composer, Cloud Storage, Dataproc etc. (a small Cloud Storage sketch follows below)
- Basic understanding of data quality dimensions like consistency, completeness, accuracy, lineage etc.
- Hands-on business and systems knowledge gained in a regulatory delivery environment
- Banking experience with regulatory and cross-product knowledge
- Passionate about test-driven development
- Prior experience with release management tasks and responsibilities
- Data visualization experience in Tableau is good to have
How we'll support you: Training and development to help you excel in your career; coaching and support from experts in your team; a culture of continuous learning to aid progression; a range of flexible benefits that you can tailor to suit your needs.
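As referenced in the skills list above, here is a minimal Cloud Storage sketch using the google-cloud-storage Python client; the project, bucket, and object paths are hypothetical.

```python
from google.cloud import storage  # pip install google-cloud-storage

client = storage.Client(project="my-project")   # hypothetical project
bucket = client.bucket("my-ingest-bucket")      # hypothetical bucket

# Stage a local extract into the raw zone of a data lake.
blob = bucket.blob("raw/trades/2024-01-01/trades.csv")
blob.upload_from_filename("trades.csv")
print(f"Uploaded to gs://{bucket.name}/{blob.name}")
```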
Posted Date not available
5.0 - 10.0 years
7 - 12 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Google Dataproc, Apache Spark
Good to have skills: Apache Airflow
Minimum 5 year(s) of experience is required
Educational Qualification: Minimum 15 years of full-time education
Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using Google Dataproc. Your typical day will involve working with Apache Spark and collaborating with cross-functional teams to deliver impactful data-driven solutions.
Roles & Responsibilities:
- Design, build, and configure applications to meet business process and application requirements using Google Dataproc
- Collaborate with cross-functional teams to deliver impactful data-driven solutions
- Utilize Apache Spark for data processing and analysis
- Develop and maintain technical documentation for applications
Professional & Technical Skills:
- Strong experience in Apache Spark and Java for Spark
- Strong experience with multiple database models (SQL, NoSQL, OLTP and OLAP)
- Strong experience with data streaming architecture (Kafka, Spark, Airflow)
- Strong knowledge of cloud data platforms and technologies such as GCS, BigQuery, Cloud Composer, Dataproc and other cloud-native offerings
- Knowledge of Infrastructure as Code (IaC) and associated tools (Terraform, Ansible etc.)
- Experience pulling data from a variety of data source types including mainframe (EBCDIC), fixed-length and delimited files, and databases (SQL, NoSQL, time-series)
- Experience performing analysis with large datasets in a cloud-based environment, preferably with an understanding of Google Cloud Platform (GCP)
- Comfortable communicating with various stakeholders (technical and non-technical)
- GCP Data Engineer certification is nice to have
Additional Information:
- The candidate should have a minimum of 5 years of experience in Google Dataproc and Apache Spark
- The ideal candidate will possess a strong educational background in software engineering or a related field
- This position is based at our Mumbai office
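As a sketch of the Dataproc-plus-Spark work this role centers on, the following PySpark job reads and writes BigQuery tables via the spark-bigquery connector; the project, dataset, and bucket names are hypothetical.

```python
from pyspark.sql import SparkSession

# On Dataproc the spark-bigquery connector is available by default;
# elsewhere, submit with --jars pointing at the connector.
spark = SparkSession.builder.appName("orders-aggregation").getOrCreate()

orders = (
    spark.read.format("bigquery")
    .option("table", "my-project.sales.orders")  # hypothetical table
    .load()
)

by_region = orders.groupBy("region").sum("amount")

(
    by_region.write.format("bigquery")
    .option("table", "my-project.sales.orders_by_region")
    .option("temporaryGcsBucket", "my-temp-bucket")  # staging bucket for writes
    .mode("overwrite")
    .save()
)
```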
Posted Date not available
15.0 - 20.0 years
4 - 8 Lacs
Mumbai
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Google BigQuery
Good to have skills: Microsoft SQL Server, Google Cloud Data Services
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages. Your role will be pivotal in enhancing the overall data infrastructure and ensuring that data is accessible and reliable for decision-making purposes.
Project Role: Analytics and Modeler
Project Role Description: Analyze and model client, market and key performance data. Use analytical tools and techniques to develop business insights and improve decision-making.
Key Responsibilities: Dataproc, Pub/Sub, Dataflow, Kafka Streaming, Looker, SQL (No FLEX). 1: Proven track record of delivering data integration and data warehousing solutions. 2: Strong SQL and hands-on experience (No FLEX); experience with data integration and migration projects. 3: Proficient in the BigQuery SQL language (No FLEX). 4: Understanding of cloud-native services: bucket storage, GBQ, Cloud Functions, Pub/Sub, Composer and Kubernetes; experience in cloud solutions, mainly data platform services; GCP certifications. 5: Experience in Shell Scripting, Python (No FLEX), Oracle, SQL.
Technical Experience: 1: Expert in Python (No FLEX); strong hands-on knowledge of SQL (No FLEX); Python programming using Pandas and NumPy; deep understanding of data structures (dictionary, array, list, tree etc.); experience with pytest and code coverage preferred (a small Pandas/pytest sketch follows below). 2: Strong hands-on experience building solutions using cloud-native services: bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, Kubernetes etc. (No FLEX). 3: Proficiency with tools to automate AZDO CI/CD pipelines such as Control-M, GitHub, JIRA, Confluence. 4: Open mindset and the ability to quickly adapt to new technologies. 5: Performance tuning of BigQuery SQL scripts. 6: GCP certified preferred. 7: Working in an agile environment.
Professional Attributes: 1: Good communication skills. 2: Ability to collaborate with different teams and suggest solutions. 3: Ability to work independently with little supervision or as part of a team. 4: Good analytical and problem-solving skills. 5: Good team-handling skills.
Educational Qualification: 15 years of full-time education
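As referenced above, a tiny sketch of the Pandas-plus-pytest skill the posting asks for: a toy DataFrame transform and a pytest test for it; the column names are hypothetical.

```python
import pandas as pd

def add_line_total(df: pd.DataFrame) -> pd.DataFrame:
    """Toy transform: derive a line total from quantity and unit price."""
    out = df.copy()
    out["total"] = out["qty"] * out["unit_price"]
    return out

def test_add_line_total():
    # Run with `pytest`; asserts the derived column is computed row-wise.
    df = pd.DataFrame({"qty": [2, 3], "unit_price": [10.0, 5.0]})
    assert add_line_total(df)["total"].tolist() == [20.0, 15.0]
```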
Posted Date not available
15.0 - 20.0 years
4 - 8 Lacs
Mumbai
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Google BigQuery
Good to have skills: Microsoft SQL Server, Google Cloud Data Services
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages. Your role will be pivotal in enhancing the overall data infrastructure and ensuring that data is accessible and reliable for decision-making purposes.
Project Role: Analytics and Modeler
Project Role Description: Analyze and model client, market and key performance data. Use analytical tools and techniques to develop business insights and improve decision-making.
Key Responsibilities: Dataproc, Pub/Sub, Dataflow, Kafka Streaming, Looker, SQL (No FLEX). 1: Proven track record of delivering data integration and data warehousing solutions. 2: Strong SQL and hands-on experience (No FLEX); experience with data integration and migration projects. 3: Proficient in the BigQuery SQL language (No FLEX). 4: Understanding of cloud-native services: bucket storage, GBQ, Cloud Functions, Pub/Sub, Composer and Kubernetes; experience in cloud solutions, mainly data platform services; GCP certifications. 5: Experience in Shell Scripting, Python (No FLEX), Oracle, SQL.
Technical Experience: 1: Expert in Python (No FLEX); strong hands-on knowledge of SQL (No FLEX); Python programming using Pandas and NumPy; deep understanding of data structures (dictionary, array, list, tree etc.); experience with pytest and code coverage preferred. 2: Strong hands-on experience building solutions using cloud-native services: bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, Kubernetes etc. (No FLEX). 3: Proficiency with tools to automate AZDO CI/CD pipelines such as Control-M, GitHub, JIRA, Confluence. 4: Open mindset and the ability to quickly adapt to new technologies. 5: Performance tuning of BigQuery SQL scripts. 6: GCP certified preferred. 7: Working in an agile environment.
Professional Attributes: 1: Good communication skills. 2: Ability to collaborate with different teams and suggest solutions. 3: Ability to work independently with little supervision or as part of a team. 4: Good analytical and problem-solving skills. 5: Good team-handling skills.
Educational Qualification: 15 years of full-time education
Posted Date not available
15.0 - 20.0 years
4 - 8 Lacs
Mumbai
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Google BigQuery
Good to have skills: Microsoft SQL Server, Google Cloud Data Services
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages. Your role will be pivotal in enhancing the overall data infrastructure and ensuring that data is accessible and reliable for decision-making purposes.
Project Role: Analytics and Modeler
Project Role Description: Analyze and model client, market and key performance data. Use analytical tools and techniques to develop business insights and improve decision-making.
Key Responsibilities: Dataproc, Pub/Sub, Dataflow, Kafka Streaming, Looker, SQL (No FLEX). 1: Proven track record of delivering data integration and data warehousing solutions. 2: Strong SQL and hands-on experience (No FLEX); experience with data integration and migration projects. 3: Proficient in the BigQuery SQL language (No FLEX). 4: Understanding of cloud-native services: bucket storage, GBQ, Cloud Functions, Pub/Sub, Composer and Kubernetes; experience in cloud solutions, mainly data platform services; GCP certifications. 5: Experience in Shell Scripting, Python (No FLEX), Oracle, SQL.
Technical Experience: 1: Expert in Python (No FLEX); strong hands-on knowledge of SQL (No FLEX); Python programming using Pandas and NumPy; deep understanding of data structures (dictionary, array, list, tree etc.); experience with pytest and code coverage preferred. 2: Strong hands-on experience building solutions using cloud-native services: bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, Kubernetes etc. (No FLEX). 3: Proficiency with tools to automate AZDO CI/CD pipelines such as Control-M, GitHub, JIRA, Confluence. 4: Open mindset and the ability to quickly adapt to new technologies. 5: Performance tuning of BigQuery SQL scripts. 6: GCP certified preferred. 7: Working in an agile environment.
Professional Attributes: 1: Good communication skills. 2: Ability to collaborate with different teams and suggest solutions. 3: Ability to work independently with little supervision or as part of a team. 4: Good analytical and problem-solving skills. 5: Good team-handling skills.
Educational Qualification: 15 years of full-time education
Posted Date not available
15.0 - 20.0 years
4 - 8 Lacs
Mumbai
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Google BigQuery
Good to have skills: Microsoft SQL Server, Google Cloud Data Services
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages. Your role will be pivotal in enhancing the overall data infrastructure and ensuring that data is accessible and reliable for decision-making purposes.
Project Role: Analytics and Modeler
Project Role Description: Analyze and model client, market and key performance data. Use analytical tools and techniques to develop business insights and improve decision-making.
Key Responsibilities: Dataproc, Pub/Sub, Dataflow, Kafka Streaming, Looker, SQL (No FLEX). 1: Proven track record of delivering data integration and data warehousing solutions. 2: Strong SQL and hands-on experience (No FLEX); experience with data integration and migration projects. 3: Proficient in the BigQuery SQL language (No FLEX). 4: Understanding of cloud-native services: bucket storage, GBQ, Cloud Functions, Pub/Sub, Composer and Kubernetes; experience in cloud solutions, mainly data platform services; GCP certifications. 5: Experience in Shell Scripting, Python (No FLEX), Oracle, SQL.
Technical Experience: 1: Expert in Python (No FLEX); strong hands-on knowledge of SQL (No FLEX); Python programming using Pandas and NumPy; deep understanding of data structures (dictionary, array, list, tree etc.); experience with pytest and code coverage preferred. 2: Strong hands-on experience building solutions using cloud-native services: bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, Kubernetes etc. (No FLEX). 3: Proficiency with tools to automate AZDO CI/CD pipelines such as Control-M, GitHub, JIRA, Confluence. 4: Open mindset and the ability to quickly adapt to new technologies. 5: Performance tuning of BigQuery SQL scripts. 6: GCP certified preferred. 7: Working in an agile environment.
Professional Attributes: 1: Good communication skills. 2: Ability to collaborate with different teams and suggest solutions. 3: Ability to work independently with little supervision or as part of a team. 4: Good analytical and problem-solving skills. 5: Good team-handling skills.
Educational Qualification: 15 years of full-time education
Posted Date not available
15.0 - 20.0 years
4 - 8 Lacs
Mumbai
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Google BigQuery
Good to have skills: Microsoft SQL Server, Google Cloud Data Services
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages. Your role will be pivotal in enhancing the overall data infrastructure and ensuring that data is accessible and reliable for decision-making purposes.
Project Role: Analytics and Modeler
Project Role Description: Analyze and model client, market and key performance data. Use analytical tools and techniques to develop business insights and improve decision-making.
Key Responsibilities: Dataproc, Pub/Sub, Dataflow, Kafka Streaming, Looker, SQL (No FLEX). 1: Proven track record of delivering data integration and data warehousing solutions. 2: Strong SQL and hands-on experience (No FLEX); experience with data integration and migration projects. 3: Proficient in the BigQuery SQL language (No FLEX). 4: Understanding of cloud-native services: bucket storage, GBQ, Cloud Functions, Pub/Sub, Composer and Kubernetes; experience in cloud solutions, mainly data platform services; GCP certifications. 5: Experience in Shell Scripting, Python (No FLEX), Oracle, SQL.
Technical Experience: 1: Expert in Python (No FLEX); strong hands-on knowledge of SQL (No FLEX); Python programming using Pandas and NumPy; deep understanding of data structures (dictionary, array, list, tree etc.); experience with pytest and code coverage preferred. 2: Strong hands-on experience building solutions using cloud-native services: bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, Kubernetes etc. (No FLEX). 3: Proficiency with tools to automate AZDO CI/CD pipelines such as Control-M, GitHub, JIRA, Confluence. 4: Open mindset and the ability to quickly adapt to new technologies. 5: Performance tuning of BigQuery SQL scripts. 6: GCP certified preferred. 7: Working in an agile environment.
Professional Attributes: 1: Good communication skills. 2: Ability to collaborate with different teams and suggest solutions. 3: Ability to work independently with little supervision or as part of a team. 4: Good analytical and problem-solving skills. 5: Good team-handling skills.
Educational Qualification: 15 years of full-time education
Posted Date not available
15.0 - 20.0 years
4 - 8 Lacs
Mumbai
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Google BigQuery
Good to have skills: Microsoft SQL Server, Google Cloud Data Services
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages. Your role will be pivotal in enhancing the overall data infrastructure and ensuring that data is accessible and reliable for decision-making purposes.
Project Role: Analytics and Modeler
Project Role Description: Analyze and model client, market and key performance data. Use analytical tools and techniques to develop business insights and improve decision-making.
Key Responsibilities: Dataproc, Pub/Sub, Dataflow, Kafka Streaming, Looker, SQL (No FLEX). 1: Proven track record of delivering data integration and data warehousing solutions. 2: Strong SQL and hands-on experience (No FLEX); experience with data integration and migration projects. 3: Proficient in the BigQuery SQL language (No FLEX). 4: Understanding of cloud-native services: bucket storage, GBQ, Cloud Functions, Pub/Sub, Composer and Kubernetes; experience in cloud solutions, mainly data platform services; GCP certifications. 5: Experience in Shell Scripting, Python (No FLEX), Oracle, SQL.
Technical Experience: 1: Expert in Python (No FLEX); strong hands-on knowledge of SQL (No FLEX); Python programming using Pandas and NumPy; deep understanding of data structures (dictionary, array, list, tree etc.); experience with pytest and code coverage preferred. 2: Strong hands-on experience building solutions using cloud-native services: bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, Kubernetes etc. (No FLEX). 3: Proficiency with tools to automate AZDO CI/CD pipelines such as Control-M, GitHub, JIRA, Confluence. 4: Open mindset and the ability to quickly adapt to new technologies. 5: Performance tuning of BigQuery SQL scripts. 6: GCP certified preferred. 7: Working in an agile environment.
Professional Attributes: 1: Good communication skills. 2: Ability to collaborate with different teams and suggest solutions. 3: Ability to work independently with little supervision or as part of a team. 4: Good analytical and problem-solving skills. 5: Good team-handling skills.
Educational Qualification: 15 years of full-time education
Posted Date not available
15.0 - 20.0 years
4 - 8 Lacs
Mumbai
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Google BigQuery
Good to have skills: Microsoft SQL Server, Google Cloud Data Services
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages. Your role will be pivotal in enhancing the overall data infrastructure and ensuring that data is accessible and reliable for decision-making purposes.
Project Role: Analytics and Modeler
Project Role Description: Analyze and model client, market and key performance data. Use analytical tools and techniques to develop business insights and improve decision-making.
Key Responsibilities: Dataproc, Pub/Sub, Dataflow, Kafka Streaming, Looker, SQL (No FLEX). 1: Proven track record of delivering data integration and data warehousing solutions. 2: Strong SQL and hands-on experience (No FLEX); experience with data integration and migration projects. 3: Proficient in the BigQuery SQL language (No FLEX). 4: Understanding of cloud-native services: bucket storage, GBQ, Cloud Functions, Pub/Sub, Composer and Kubernetes; experience in cloud solutions, mainly data platform services; GCP certifications. 5: Experience in Shell Scripting, Python (No FLEX), Oracle, SQL.
Technical Experience: 1: Expert in Python (No FLEX); strong hands-on knowledge of SQL (No FLEX); Python programming using Pandas and NumPy; deep understanding of data structures (dictionary, array, list, tree etc.); experience with pytest and code coverage preferred. 2: Strong hands-on experience building solutions using cloud-native services: bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, Kubernetes etc. (No FLEX). 3: Proficiency with tools to automate AZDO CI/CD pipelines such as Control-M, GitHub, JIRA, Confluence. 4: Open mindset and the ability to quickly adapt to new technologies. 5: Performance tuning of BigQuery SQL scripts. 6: GCP certified preferred. 7: Working in an agile environment.
Professional Attributes: 1: Good communication skills. 2: Ability to collaborate with different teams and suggest solutions. 3: Ability to work independently with little supervision or as part of a team. 4: Good analytical and problem-solving skills. 5: Good team-handling skills.
Educational Qualification: 15 years of full-time education
Posted Date not available
7.0 - 10.0 years
20 - 35 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
- Strong command of SQL and Python
- Practical knowledge of BigQuery, Cloud Composer (Airflow), Dataproc, and GCS, among other GCP services
- Understanding of data modeling concepts and ETL/ELT operations
- Knowledge of CI/CD tools (GitLab, Jenkins) for automated deployments
- Fundamental knowledge of Apache Spark and other distributed data processing technologies
- A strong aptitude for problem-solving and the capacity to collaborate with others
- Knowledge of version control using Git
Posted Date not available
5.0 - 7.0 years
4 - 7 Lacs
Gurugram
Work from Office
Job Description: We are seeking a skilled Data Engineer with expertise in Google Cloud Platform (GCP) to join our team. The ideal candidate will have experience with various GCP services and a strong background in data engineering.
Key Responsibilities:
- Utilize GCP BigQuery, GCP Cloud Storage, Dataflow, Dataproc, and Data Mesh for data processing and analysis
- Develop and optimize data products, enterprise data warehouses (EDW), and data models
- Design and implement scalable data architectures and data warehouses
- Collaborate with cross-functional teams to understand business requirements and translate them into technical specifications
Required Skills:
- Proficiency in GCP services including BigQuery, Cloud Storage, Dataflow, Dataproc, and Data Mesh
- Strong experience in data architecture, data modeling, and data warehousing
- Ability to develop and manage data products and EDWs
Posted Date not available
3.0 - 8.0 years
8 - 18 Lacs
Chennai
Work from Office
Role & responsibilities:
- 5+ years of relevant work experience and increasing responsibility
- Develop, enhance and maintain data integration workflows to process data from different sources
- Strong development experience building data pipelines on Google Cloud Platform services; GCP experience: BigQuery, Cloud Storage, Cloud Dataproc, Cloud Dataflow
- Strong SQL knowledge on BigQuery / any SQL databases etc.
- Experience writing scripts in PySpark / Python
- Ability to work independently in a semi-structured environment where applications often have frequently changing or vaguely defined requirements
- Knowledge of Astronomer Airflow
- Well versed in Agile solution development methodologies
- Self-starter with the ability to leverage various resources available to quickly learn and try out new concepts and technologies
- Organizational, analytical and problem-solving skills
- Strong interpersonal, communication and presentation skills
Posted Date not available
3.0 - 8.0 years
6 - 10 Lacs
Bengaluru
Work from Office
Project Role: Cloud Services Engineer
Project Role Description: Act as liaison between the client and Accenture operations teams for support and escalations. Communicate service delivery health to all stakeholders and explain any performance issues or risks. Ensure Cloud orchestration and automation capability is operating based on target SLAs with minimal downtime. Hold performance meetings to share performance and consumption data and trends.
Must have skills: SUSE Linux Administration
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Cloud Services Engineer, you will act as a liaison between the client and Accenture operations teams for support and escalations. You will communicate service delivery health to all stakeholders, explain any performance issues or risks, and ensure Cloud orchestration and automation capability is operating based on target SLAs with minimal downtime, holding performance meetings to share performance and consumption data and trends.
Roles & Responsibilities:
- Expected to perform independently and become an SME
- Active participation/contribution in team discussions is required
- Contribute to providing solutions to work-related problems
- Proactively identify and address potential issues in Cloud services
- Collaborate with cross-functional teams to optimize Cloud orchestration processes
- Develop and implement strategies to enhance Cloud automation capabilities
- Analyze performance data to identify trends and areas for improvement
- Provide technical guidance and support to junior team members
Professional & Technical Skills:
- Must-have skills: Proficiency in SUSE Linux Administration
- Strong understanding of Cloud orchestration and automation
- Experience in managing and troubleshooting Cloud services
- Knowledge of scripting languages for automation tasks
- Hands-on experience with monitoring and alerting tools
- Good to have: Experience with DevOps practices
Additional Information:
- The candidate should have a minimum of 3 years of experience in SUSE Linux Administration
- This position is based at our Bengaluru office
- 15 years of full-time education is required
Posted Date not available