5.0 - 10.0 years
35 - 40 Lacs
Pune
Work from Office
Job Title: Data Engineer (ETL, Big Data, Hadoop, Spark, GCP), AVP
Location: Pune, India

Role Description
The engineer is responsible for developing and delivering elements of engineering solutions to accomplish business goals. Awareness of the bank's important engineering principles is expected. Root cause analysis skills are developed through addressing enhancements and fixes to products, building reliability and resiliency into solutions through early testing, peer reviews and automation of the delivery life cycle. The successful candidate should be able to work independently on medium to large sized projects with strict deadlines, in a cross-application, mixed technical environment, and must demonstrate a solid hands-on development track record while working in an agile methodology. The role demands working alongside a geographically dispersed team. The position is part of the buildout of the Compliance tech internal development team in India. The overall team will primarily deliver improvements in compliance tech capabilities that are major components of the regular regulatory portfolio, addressing various regulatory commitments and mandated monitors.

What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best in class leave policy
- Gender neutral parental leaves
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those 35 yrs. and above

Your key responsibilities
- Analyze data sets; design and code stable and scalable data ingestion workflows, integrating them into existing workflows (a minimal sketch follows this listing).
- Work with team members and stakeholders to clarify requirements and provide the appropriate ETL solution.
- Work as a senior developer developing analytics algorithms on top of ingested data.
- Work as a senior developer for various data sourcing in Hadoop and GCP.
- Ensure new code is tested at both unit and system level; design, develop and peer review new code and functionality.
- Operate as a member of an agile scrum team.
- Apply root cause analysis skills to identify bugs and issues behind failures.
- Support production support and release management teams in their tasks.

Your skills and experience
- More than 6 years of coding experience in reputed organizations
- Hands-on experience with Bitbucket and CI/CD pipelines
- Proficient in Hadoop, Python, Spark, SQL, Unix and Hive
- Basic understanding of on-prem and GCP data security
- Hands-on development experience on large ETL/big data systems; GCP is a big plus
- Hands-on experience with Cloud Build, Artifact Registry, Cloud DNS, Cloud Load Balancing, etc.
- Hands-on experience with Dataflow, Cloud Composer, Cloud Storage, Dataproc, etc.
- Basic understanding of data quality dimensions like consistency, completeness, accuracy, lineage, etc.
- Hands-on business and systems knowledge gained in a regulatory delivery environment
- Banking experience with regulatory and cross-product knowledge
- Passionate about test driven development
- Prior experience with release management tasks and responsibilities
- Data visualization experience in Tableau is good to have

How we'll support you
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs
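The ingestion-workflow responsibilities above lend themselves to a short illustration. Below is a minimal PySpark sketch of a batch ingestion step, assuming a Spark cluster with Hive support; the source path, column names, and target table are invented for the example, not taken from the posting.

# Minimal batch-ingestion sketch (assumptions: Hive-enabled Spark, illustrative names).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (SparkSession.builder
         .appName("ingestion-sketch")
         .enableHiveSupport()
         .getOrCreate())

# Read raw delimited files landed in HDFS (path is hypothetical).
raw = spark.read.option("header", True).csv("hdfs:///landing/trades/2024-06-01/")

# Basic standardization before handing off to analytics consumers.
clean = (raw.dropDuplicates(["trade_id"])
            .withColumn("trade_date", F.to_date("trade_date", "yyyy-MM-dd")))

# Append into a partitioned Hive table for downstream use.
(clean.write.mode("append")
      .format("hive")
      .partitionBy("trade_date")
      .saveAsTable("compliance.trades_curated"))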
Posted 1 week ago
5.0 - 10.0 years
7 - 12 Lacs
Pune
Work from Office
Project Role: Cloud Services Engineer
Project Role Description: Act as liaison between the client and Accenture operations teams for support and escalations. Communicate service delivery health to all stakeholders and explain any performance issues or risks. Ensure Cloud orchestration and automation capability is operating based on target SLAs with minimal downtime. Hold performance meetings to share performance and consumption data and trends.
Must have skills: Managed File Transfer
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Cloud Services Engineer, you will act as a liaison between the client and Accenture operations teams for support and escalations. You will communicate service delivery health to all stakeholders and explain any performance issues or risks, ensure the Cloud orchestration and automation capability is operating based on target SLAs with minimal downtime, and hold performance meetings to share performance and consumption data and trends.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Ensure effective communication between client and operations teams.
- Analyze service delivery health and address performance issues.
- Conduct performance meetings to share data and trends.

Professional & Technical Skills:
- Must have skills: Proficiency in Managed File Transfer.
- Strong understanding of cloud orchestration and automation.
- Experience in SLA management and performance analysis.
- Knowledge of IT service delivery and escalation processes.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Managed File Transfer.
- This position is based at our Pune office.
- A 15 years full time education is required.

Qualifications: 15 years full time education
Posted 1 week ago
5.0 - 10.0 years
7 - 12 Lacs
Coimbatore
Work from Office
Project Role: Infrastructure Architect
Project Role Description: Lead the definition, design and documentation of technical environments. Deploy solution architectures, conduct analysis of alternative architectures, create architectural standards, define processes to ensure conformance with standards, institute solution-testing criteria, define a solution's cost of ownership, and promote a clear and consistent business vision through technical architectures.
Must have skills: Microsoft 365 Security & Compliance
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Cloud Services Engineer in the Security Delivery job family group, you will be responsible for ensuring the Cloud orchestration and automation capability is operating based on target SLAs with minimal downtime. Your typical day will involve acting as a liaison between the client and Accenture operations teams for support and escalations, communicating service delivery health to all stakeholders, and holding performance meetings to share performance and consumption data and trends.

Roles & Responsibilities:
- Act as a liaison between the client and Accenture operations teams for support and escalations.
- Communicate service delivery health to all stakeholders and explain any performance issues or risks.
- Ensure Cloud orchestration and automation capability is operating based on target SLAs with minimal downtime.
- Hold performance meetings to share performance and consumption data and trends.

Professional & Technical Skills:
- Must have skills: Experience in Microsoft 365 Security & Compliance: Defender for O365, Defender for Identity, Defender for Endpoints, Defender for Cloud Apps, Defender for Cloud, Microsoft Purview, DLP, eDiscovery, Microsoft Priva, Microsoft Sentinel.
- Good to have skills: Experience in Cloud orchestration and automation.
- Strong understanding of Cloud technologies and security principles.
- Experience in managing and monitoring Cloud infrastructure.
- Experience in incident management and problem resolution.

Additional Information:
- The candidate should have a minimum of 6 years of experience in Microsoft 365 Security & Compliance.

Qualifications: 15 years full time education
Posted 1 week ago
8.0 - 11.0 years
35 - 37 Lacs
Kolkata, Ahmedabad, Bengaluru
Work from Office
Dear Candidate,

We are hiring a Cloud Operations Engineer to manage and optimize cloud-based environments. Ideal for engineers passionate about automation, monitoring, and cloud-native technologies.

Key Responsibilities:
- Maintain cloud infrastructure (AWS, Azure, GCP)
- Automate deployments and system monitoring (see the sketch below)
- Ensure availability, performance, and cost optimization
- Troubleshoot incidents and resolve system issues

Required Skills & Qualifications:
- Hands-on experience with cloud platforms and DevOps tools
- Proficiency in scripting (Python, Bash) and IaC (Terraform, CloudFormation)
- Familiarity with logging/monitoring tools (CloudWatch, Datadog, etc.)
- Bonus: Experience with Kubernetes or serverless architectures

Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa
Delivery Manager
Integra Technologies
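Since the role centers on automated monitoring, here is a hedged sketch of the kind of script it implies: pulling an EC2 CPU metric from CloudWatch with boto3. The region, instance ID, and output handling are placeholders, and AWS credentials are assumed to be configured in the environment.

from datetime import datetime, timedelta, timezone
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")  # region is illustrative
now = datetime.now(timezone.utc)

resp = cloudwatch.get_metric_statistics(
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],  # placeholder ID
    StartTime=now - timedelta(hours=1),
    EndTime=now,
    Period=300,                # 5-minute buckets
    Statistics=["Average"],
)

# Print the hourly CPU profile; a real script might page an on-call instead.
for point in sorted(resp["Datapoints"], key=lambda p: p["Timestamp"]):
    print(point["Timestamp"], round(point["Average"], 2))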
Posted 2 weeks ago
15.0 - 20.0 years
6 - 10 Lacs
Bengaluru
Work from Office
Project Role: Cloud Services Engineer
Project Role Description: Act as liaison between the client and Accenture operations teams for support and escalations. Communicate service delivery health to all stakeholders and explain any performance issues or risks. Ensure Cloud orchestration and automation capability is operating based on target SLAs with minimal downtime. Hold performance meetings to share performance and consumption data and trends.
Must have skills: Cloud Based Service Management Process Design
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Cloud Services Engineer, you will act as a vital link between clients and Accenture's operations teams, facilitating support and managing escalations. Your typical day will involve communicating the health of service delivery to stakeholders, addressing performance issues, and ensuring that cloud orchestration and automation capabilities are functioning optimally. You will hold performance meetings to discuss data and trends, ensuring that services meet the expected service level agreements with minimal downtime, thereby contributing to the overall efficiency and effectiveness of cloud services.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate regular communication between clients and internal teams to ensure alignment on service delivery.
- Analyze performance metrics and prepare reports to inform stakeholders of service health and areas for improvement.

Professional & Technical Skills:
- Must have skills: Proficiency in Cloud Based Service Management Process Design.
- Strong understanding of cloud service models and deployment strategies.
- Experience with cloud orchestration tools and automation frameworks.
- Ability to analyze and interpret service performance data.
- Familiarity with incident management and escalation processes.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Cloud Based Service Management Process Design.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Qualifications: 15 years full time education
Posted 2 weeks ago
3.0 - 8.0 years
6 - 10 Lacs
Bengaluru
Work from Office
Project Role: Cloud Services Engineer
Project Role Description: Act as liaison between the client and Accenture operations teams for support and escalations. Communicate service delivery health to all stakeholders and explain any performance issues or risks. Ensure Cloud orchestration and automation capability is operating based on target SLAs with minimal downtime. Hold performance meetings to share performance and consumption data and trends.
Must have skills: SUSE Linux Administration
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Cloud Services Engineer, you will act as a liaison between the client and Accenture operations teams for support and escalations. You will communicate service delivery health to all stakeholders, explain any performance issues or risks, and ensure the Cloud orchestration and automation capability is operating based on target SLAs with minimal downtime, holding performance meetings to share performance and consumption data and trends.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work related problems.
- Proactively identify and address potential issues in Cloud services.
- Collaborate with cross-functional teams to optimize Cloud orchestration processes.
- Develop and implement strategies to enhance Cloud automation capabilities.
- Analyze performance data to identify trends and areas for improvement.
- Provide technical guidance and support to junior team members.

Professional & Technical Skills:
- Must have skills: Proficiency in SUSE Linux Administration.
- Strong understanding of Cloud orchestration and automation.
- Experience in managing and troubleshooting Cloud services.
- Knowledge of scripting languages for automation tasks (illustrated below).
- Hands-on experience with monitoring and alerting tools.
- Good to have skills: Experience with DevOps practices.

Additional Information:
- The candidate should have a minimum of 3 years of experience in SUSE Linux Administration.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Qualifications: 15 years full time education
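As a small illustration of the "scripting languages for automation tasks" skill this posting names, here is a sketch of a Python health check for systemd services on a SUSE host. The service list is invented; a real check would be driven by the estate's actual units.

import subprocess

SERVICES = ["sshd", "chronyd", "nginx"]  # example units only

def is_active(service: str) -> bool:
    # `systemctl is-active --quiet` exits 0 only when the unit is active.
    result = subprocess.run(["systemctl", "is-active", "--quiet", service], check=False)
    return result.returncode == 0

for svc in SERVICES:
    print(f"{svc}: {'OK' if is_active(svc) else 'DOWN'}")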
Posted 2 weeks ago
4.0 - 9.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Data Transformation:
- Utilize Data Build Tool (dbt) to transform raw data into curated data models according to business requirements.
- Implement data transformations and aggregations to support analytical and reporting needs.

Orchestration and Automation:
- Design and implement automated workflows using Google Cloud Composer to orchestrate data pipelines and ensure timely data delivery (a minimal DAG sketch follows this listing).
- Monitor and troubleshoot data pipelines, identifying and resolving issues proactively.
- Develop and maintain documentation for data pipelines and workflows.

GCP Expertise:
- Leverage GCP services, including BigQuery, Cloud Storage, and Pub/Sub, to build a robust and scalable data platform.
- Optimize BigQuery performance and cost through efficient query design and data partitioning.
- Implement data security and access controls in accordance with banking industry standards.

Collaboration and Communication:
- Collaborate with the Solution Architect and Data Modeler to understand data requirements and translate them into technical solutions.
- Communicate effectively with team members and stakeholders, providing regular updates on project progress.
- Participate in code reviews and contribute to the development of best practices.

Data Pipeline Development:
- Design, develop, and maintain scalable and efficient data pipelines using Google Cloud Dataflow to ingest data from various sources, including relational databases (RDBMS), data streams, and files.
- Implement data quality checks and validation processes to ensure data accuracy and consistency.
- Optimize data pipelines for performance and cost-effectiveness.

Banking Domain Knowledge (Preferred):
- Understanding of banking data domains, such as customer data, transactions, and financial products.
- Familiarity with regulatory requirements and data governance standards in the banking industry.

Required Experience:
- Bachelor's degree in computer science, engineering, or a related field.
- ETL knowledge.
- 4-9 years of experience in data engineering, with a focus on building data pipelines and data transformations.
- Strong proficiency in SQL and experience working with relational databases.
- Hands-on experience with Google Cloud Platform (GCP) services, including Dataflow, BigQuery, Cloud Composer, and Cloud Storage.
- Experience with data transformation tools, preferably Data Build Tool (dbt).
- Proficiency in Python or other scripting languages is a plus.
- Experience with data orchestration and automation.
- Strong problem-solving and analytical skills.
- Excellent communication and collaboration skills.
- Experience with data streams like Pub/Sub or similar.
- Experience working with file formats such as CSV, JSON and Parquet.

Primary Skills: GCP, Dataflow, BigQuery, Cloud Composer, Cloud Storage, data pipelines, SQL, dbt, DWH concepts.
Secondary Skills: Python, banking domain knowledge, Pub/Sub, cloud certifications (e.g. Data Engineer), Git or any other version control system.
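To make the orchestration duties concrete, here is a minimal Cloud Composer (Airflow) sketch of the pattern the posting describes: a daily DAG that runs an ingestion step and then dbt transformations. The DAG ID, schedule, paths, and project are assumptions, not details from the posting.

from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_curation",            # illustrative name
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 2 * * *",      # daily at 02:00
    catchup=False,
) as dag:
    ingest = BashOperator(
        task_id="run_dataflow_ingest",
        bash_command=(
            "python /home/airflow/gcs/dags/pipelines/ingest.py "
            "--project my-project --runner DataflowRunner"   # placeholder project
        ),
    )
    transform = BashOperator(
        task_id="run_dbt_models",
        bash_command="dbt run --project-dir /home/airflow/gcs/data/dbt",
    )
    ingest >> transform   # transformations wait for ingestion to finish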
Posted 2 weeks ago
2.0 - 7.0 years
4 - 9 Lacs
Coimbatore
Work from Office
Project Role: AI / ML Engineer
Project Role Description: Develops applications and systems that utilize AI tools and Cloud AI services, with a proper cloud or on-prem application pipeline of production-ready quality. Must be able to apply GenAI models as part of the solution. Could also include, but is not limited to, deep learning, neural networks, chatbots, and image processing.
Must have skills: Google Cloud Machine Learning Services
Good to have skills: GCP Dataflow, Google Pub/Sub, Google Dataproc
Minimum 2 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: We are seeking a skilled GCP Data Engineer to join our dynamic team. The ideal candidate will design, build, and maintain scalable data pipelines and solutions on Google Cloud Platform (GCP). This role requires expertise in cloud-based data engineering and hands-on experience with GCP tools and services, ensuring efficient data integration, transformation, and storage for various business use cases.

Roles & Responsibilities:
- Design, develop, and deploy data pipelines using GCP services such as Dataflow, BigQuery, Pub/Sub, and Cloud Storage (a sketch follows this listing).
- Optimize and monitor data workflows for performance, scalability, and reliability.
- Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and implement solutions.
- Implement data security and governance measures, ensuring compliance with industry standards.
- Automate data workflows and processes for operational efficiency.
- Troubleshoot and resolve technical issues related to data pipelines and platforms.
- Document technical designs, processes, and best practices to ensure maintainability and knowledge sharing.

Professional & Technical Skills:
a) Must have:
- Proficiency in GCP tools such as BigQuery, Dataflow, Pub/Sub, Cloud Composer, and Cloud Storage.
- Expertise in SQL and experience with data modeling and query optimization.
- Solid programming skills in Python for data processing and ETL development.
- Experience with CI/CD pipelines and version control systems (e.g., Git).
- Knowledge of data warehousing concepts, ELT/ETL processes, and real-time streaming.
- Strong understanding of data security, encryption, and IAM policies on GCP.
b) Good to have:
- Experience with Dialogflow or CCAI tools.
- Knowledge of machine learning pipelines and integration with AI/ML services on GCP.
- Certifications such as Google Professional Data Engineer or Google Cloud Architect.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Google Cloud Machine Learning Services and 3-5 years of overall experience.
- The ideal candidate will possess a strong educational background in computer science, mathematics, or a related field, along with a proven track record of delivering impactful data-driven solutions.

Qualifications: 15 years full time education
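The pipeline work described above can be sketched with Apache Beam's Python SDK: a streaming job that reads from Pub/Sub, parses JSON, and writes to BigQuery. The topic, table, and schema are invented for the example.

import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(streaming=True, project="my-project", region="us-central1")

with beam.Pipeline(options=options) as p:
    (p
     | "ReadEvents" >> beam.io.ReadFromPubSub(topic="projects/my-project/topics/events")
     | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
     | "WriteToBQ" >> beam.io.WriteToBigQuery(
           "my-project:analytics.events",                      # placeholder table
           schema="event_id:STRING,ts:TIMESTAMP,payload:STRING",
           write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))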
Posted 2 weeks ago
3.0 - 8.0 years
16 - 20 Lacs
Mumbai
Work from Office
What will you do at Fynd?
- Run the production environment by monitoring availability and taking a holistic view of system health.
- Improve reliability, quality, and time-to-market of our suite of software solutions.
- Be the first person to report an incident.
- Debug production issues across services and levels of the stack.
- Envision the overall solution for defined functional and non-functional requirements, and define the technologies, patterns and frameworks to realise it.
- Build automated tools in Python / Java / GoLang / Ruby etc.
- Help Platform and Engineering teams gain visibility into our infrastructure.
- Lead design of software components and systems to ensure availability, scalability, latency, and efficiency of our services.
- Participate actively in detecting, remediating and reporting on production incidents, ensuring SLAs are met and driving Problem Management for permanent remediation.
- Participate in an on-call rotation to ensure coverage for planned/unplanned events.
- Perform other tasks like load tests and generating system health reports.
- Periodically check all dashboards for readiness.
- Engage with other engineering organizations to implement processes, identify improvements, and drive consistent results.
- Work with your SRE and Engineering counterparts to drive game days, training and other response-readiness efforts.
- Participate in the 24x7 support coverage as needed, troubleshooting and problem-solving complex issues with thorough root cause analysis on customer and SRE production environments.
- Collaborate with Service Engineering organizations to build and automate tooling, implement best practices to observe and manage the services in production, and consistently achieve our market leading SLA.
- Improve the scalability and reliability of our systems in production.
- Evaluate, design and implement new system architectures.

Some specific requirements:
- B.Tech. in Engineering, Computer Science, a technical degree, or equivalent work experience.
- At least 3 years of managing production infrastructure.
- Leading / managing a team is a huge plus.
- Experience with cloud platforms like AWS and GCP.
- Experience developing and operating large scale distributed systems with Kubernetes, Docker and serverless (Lambdas).
- Experience in running real-time, low latency, highly available applications (Kafka, gRPC, RTP).
- Comfortable with Python, Go, or any relevant programming language.
- Experience with monitoring and alerting using technologies like New Relic / Zabbix / Prometheus / Grafana / CloudWatch / Kafka / PagerDuty etc. (see the instrumentation sketch below).
- Experience with one or more orchestration or deployment tools, e.g. CloudFormation / Terraform / Ansible / Packer / Chef.
- Experience with configuration management systems such as Ansible / Chef / Puppet.
- Knowledge of load testing methodologies and tools like Gatling and Apache JMeter.
- Can work your way around a Unix shell.
- Experience running hybrid clouds and on-prem infrastructures on Red Hat Enterprise Linux / CentOS.
- A focus on delivering high-quality code through strong testing practices.
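Monitoring and dashboards come up repeatedly above; a minimal sketch of service instrumentation with the Prometheus Python client shows the idea. Metric names, the port, and the simulated workload are all arbitrary.

import random
import time
from prometheus_client import Counter, Histogram, start_http_server

REQUESTS = Counter("app_requests_total", "Total requests handled")
LATENCY = Histogram("app_request_latency_seconds", "Request latency in seconds")

def handle_request():
    with LATENCY.time():                       # records duration into the histogram
        time.sleep(random.uniform(0.01, 0.1))  # simulated work
    REQUESTS.inc()

if __name__ == "__main__":
    start_http_server(8000)   # exposes /metrics for Prometheus to scrape
    while True:
        handle_request()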
Posted 3 weeks ago
5 - 10 years
20 - 35 Lacs
Bengaluru
Hybrid
GCP Data Engineer
- 5+ years of experience
- GCP (all services needed for Big Data pipelines, like BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, App Engine), Spark, Scala, Hadoop
- Python, PySpark, orchestration (Airflow), SQL
- CI/CD (experience with deployment pipelines)
- Architecture and design of cloud-based Big Data pipelines and exposure to any ETL tools

Nice to have:
- GCP certifications
Posted 1 month ago
11 - 20 years
45 - 75 Lacs
Gurugram
Work from Office
Role & responsibilities
- Proven success architecting and scaling complex software solutions; familiarity with interface design.
- Experience and ability to drive a project/module independently from an execution standpoint.
- Prior experience with scalable architecture and distributed processing.
- Strong programming expertise in Python, SQL, Scala.
- Hands-on experience with any major big data solution like Spark, Kafka, Hive.
- Strong data management skills with ETL, DWH, data quality and data governance.
- Hands-on experience with microservices architecture, Docker, and Kubernetes as orchestration.
- Experience with cloud-based data stores like Redshift and BigQuery.
- Experience in cloud solution architecture.
- Experience architecting Spark jobs on Kubernetes and optimizing Spark jobs.
- Experience with MLOps architecture/tools/orchestrators like Kubeflow and MLflow (a short sketch follows this listing).
- Experience with logging, metrics and distributed tracing systems (e.g. Prometheus/Grafana/Kibana).
- Experience in CI/CD using Octopus/TeamCity/Jenkins.

Interested candidates can share their updated resume at surinder.kaur@mounttalent.com
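For the MLOps tooling named above, a short MLflow tracking sketch illustrates the workflow; the tracking URI, experiment name, and metric values are placeholders.

import mlflow

mlflow.set_tracking_uri("http://mlflow.internal:5000")  # assumed tracking server
mlflow.set_experiment("churn-model")                    # illustrative experiment

with mlflow.start_run():
    mlflow.log_param("max_depth", 8)
    mlflow.log_param("n_estimators", 200)
    # ... model training would happen here ...
    mlflow.log_metric("auc", 0.91)      # invented result for the example
    mlflow.log_metric("logloss", 0.34)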
Posted 1 month ago
2 - 6 years
8 - 12 Lacs
Pune
Work from Office
About The Role:
Job Title: Senior Engineer - Data Engineer (ETL, Big Data, Hadoop, Spark, GCP), AVP
Location: Pune, India

Role Description
The senior engineer is responsible for developing and delivering elements of engineering solutions to accomplish business goals. Awareness of the bank's important engineering principles is expected. Root cause analysis skills are developed through addressing enhancements and fixes to products, building reliability and resiliency into solutions through early testing, peer reviews and automation of the delivery life cycle. The successful candidate should be able to work independently on medium to large sized projects with strict deadlines, in a cross-application, mixed technical environment, and must demonstrate a solid hands-on development track record while working in an agile methodology. The role demands working alongside a geographically dispersed team. The position is part of the buildout of the Compliance tech internal development team in India. The overall team will primarily deliver improvements in compliance tech capabilities that are major components of the regular regulatory portfolio, addressing various regulatory commitments and mandated monitors.

What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best in class leave policy
- Gender neutral parental leaves
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those 35 yrs. and above

Your key responsibilities
- Analyze data sets; design and code stable and scalable data ingestion workflows, integrating them into existing workflows.
- Work with team members and stakeholders to clarify requirements and provide the appropriate ETL solution.
- Work as a senior developer developing analytics algorithms on top of ingested data.
- Work as a senior developer for various data sourcing in Hadoop and GCP.
- Own unit testing, UAT deployment, end user sign-off and prod go-live.
- Ensure new code is tested at both unit and system level; design, develop and peer review new code and functionality.
- Operate as a member of an agile scrum team.
- Apply root cause analysis skills to identify bugs and issues behind failures.
- Support production support and release management teams in their tasks.

Your skills and experience
- More than 10 years of coding experience in reputed organizations
- Hands-on experience with Bitbucket and CI/CD pipelines
- Proficient in Hadoop, Python, Spark, SQL, Unix and Hive
- Basic understanding of on-prem and GCP data security
- Hands-on development experience on large ETL/big data systems; GCP is a big plus
- Hands-on experience with Cloud Build, Artifact Registry, Cloud DNS, Cloud Load Balancing, etc.
- Hands-on experience with Dataflow, Cloud Composer, Cloud Storage, Dataproc, etc.
- Basic understanding of data quality dimensions like consistency, completeness, accuracy, lineage, etc.
- Hands-on business and systems knowledge gained in a regulatory delivery environment
- Desired: banking experience with regulatory and cross-product knowledge
- Passionate about test driven development
- Prior experience with release management tasks and responsibilities
- Data visualization experience is good to have

How we'll support you
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs
Posted 1 month ago
5 - 10 years
9 - 13 Lacs
Chennai
Work from Office
Overview
GCP, all services (Pub/Sub, BigQuery, Airflow, Dataproc, Cloud Composer, GCS) + Teradata

Responsibilities
GCP, all services (Pub/Sub, BigQuery, Airflow, Dataproc, Cloud Composer, GCS) + Teradata

Requirements
GCP, all services (Pub/Sub, BigQuery, Airflow, Dataproc, Cloud Composer, GCS) + Teradata
Posted 1 month ago
5 - 10 years
9 - 19 Lacs
Chennai, Bengaluru, Mumbai (All Areas)
Hybrid
Google BigQuery
Location: Pan India
Project Role Description: Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work.

Key Responsibilities:
- Analyze and model client market and key performance data; use analytical tools and techniques to develop business insights and improve decision-making.
1: Dataproc, Pub/Sub, Dataflow, Kafka Streaming, Looker, SQL (No FLEX)
2: Proven track record of delivering data integration and data warehousing solutions
3: Strong SQL and hands-on proficiency in the BigQuery SQL language (see the example below); experience in shell scripting and Python (No FLEX)
4: Experience with data integration and migration projects; Oracle SQL

Technical Experience: Google BigQuery
1: Expert in Python (No FLEX); strong hands-on knowledge of SQL (No FLEX); Python programming using Pandas and NumPy; deep understanding of various data structures (dictionary, array, list, tree, etc.); experience in pytest and code coverage skills
2: Experience building solutions using cloud native services: bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer and Kubernetes (No FLEX)
3: Proficient with tools to automate AZDO CI/CD pipelines, like Control-M, GitHub, JIRA, Confluence, and CI/CD pipelines
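Given the emphasis on hands-on BigQuery SQL and Python, here is a hedged sketch using the google-cloud-bigquery client; the project, dataset, and table are invented.

from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project

query = """
    SELECT region, SUM(amount) AS total_sales
    FROM `my-project.sales.transactions`
    WHERE sale_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
    GROUP BY region
    ORDER BY total_sales DESC
"""

# Run the query and stream the aggregated rows back.
for row in client.query(query).result():
    print(f"{row.region}: {row.total_sales}")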
Posted 1 month ago
8 - 13 years
7 - 12 Lacs
Pune
Work from Office
About The Role:
Job Title: Business Functional Analyst
Corporate Title: Associate
Location: Pune, India

Role Description
Business Functional Analysis is responsible for business solution design in complex project environments (e.g. transformational programmes). Work includes:
- Identifying the full range of business requirements and translating requirements into specific functional specifications for solution development and implementation
- Analysing business requirements and the associated impacts of the changes
- Designing and assisting businesses in developing optimal target state business processes
- Creating and executing against roadmaps that focus on solution development and implementation
- Answering questions of methodological approach with varying levels of complexity
- Aligning with other key stakeholder groups (such as Project Management & Software Engineering) to support the link between the business divisions and the solution providers for all aspects of identifying, implementing and maintaining solutions

What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best in class leave policy
- Gender neutral parental leaves
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those 35 yrs. and above

Your key responsibilities
- Write clear and well-structured business requirements/documents.
- Convert roadmap features into smaller user stories.
- Analyse process issues and bottlenecks and make improvements.
- Communicate and validate requirements with relevant stakeholders.
- Perform data discovery, analysis, and modelling.
- Assist with project management for selected projects.
- Understand and translate business needs into data models supporting long-term solutions.
- Understand existing SQL/Python code and convert it to business requirements.
- Write advanced SQL and Python scripts.

Your skills and experience
- A minimum of 8+ years of experience in business analysis or a related field.
- Exceptional analytical and conceptual thinking skills.
- Proficient in SQL.
- Proficient in Python for data engineering.
- Experience in automating ETL testing using Python and SQL (see the sketch below).
- Exposure to GCP services for cloud storage, data lake, database and data warehouse workloads, such as BigQuery, GCS, Dataflow, Cloud Composer, gsutil, shell scripting, etc.
- Previous experience in Procurement and Real Estate would be a plus.
- Competency in JIRA, Confluence, draw.io and Microsoft applications including Word, Excel, PowerPoint and Outlook.
- Previous banking domain experience is a plus.
- Good problem-solving skills.

How we'll support you
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs
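The ETL-testing requirement above lends itself to a small sketch: pytest-style reconciliation checks run against BigQuery. The project, datasets, and table names are hypothetical.

from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # assumed project

def scalar(sql: str):
    # Return the first column of the first row of a query result.
    return next(iter(client.query(sql).result()))[0]

def test_row_counts_match():
    src = scalar("SELECT COUNT(*) FROM `my-project.staging.invoices`")
    tgt = scalar("SELECT COUNT(*) FROM `my-project.curated.invoices`")
    assert src == tgt, f"staging has {src} rows but curated has {tgt}"

def test_no_null_business_keys():
    nulls = scalar(
        "SELECT COUNT(*) FROM `my-project.curated.invoices` WHERE invoice_id IS NULL")
    assert nulls == 0, f"{nulls} curated rows are missing invoice_id"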
Posted 2 months ago
3 - 8 years
37 - 45 Lacs
Pune
Work from Office
About The Role:
Job Title: GCP DevOps/Platform Engineer
Corporate Title: AVP
Location: Pune, India

Role Description
We are seeking a highly motivated and senior DevOps Engineer to join our team. The successful candidate will have at least 8-13 years of experience in the field and be proficient in Google Cloud Platform (GCP), GitHub Actions, Infrastructure as Code (IaC), Site Reliability Engineering (SRE), CI/CD using Helm charts, and platform engineering. The person performing the role may lead delivery by other members of the team and control their work where applicable.

What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best in class leave policy
- Gender neutral parental leaves
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those 35 yrs. and above

Your key responsibilities
- Develop and maintain infrastructure code using IaC tools such as Terraform and Ansible.
- Design, implement, and optimize cloud-based applications and services on GCP.
- Collaborate with cross-functional teams to ensure successful delivery of projects, including frontend development, backend development, and quality assurance.
- Troubleshoot and resolve issues related to application performance, reliability, and security.
- Optimize the deployment process using automation tools such as GitHub Actions.
- Provide technical guidance and mentorship to junior team members.
- Stay up-to-date with industry trends and best practices in DevOps engineering.
- Design, deploy, manage and document CI/CD pipelines.
- Perform routine application maintenance, an ongoing responsibility accomplished via strategy-building techniques.
- Identify issues and optimization potential, and implement solutions.

Your skills and experience
- Understanding of industry standard processes for build, deploy, release and support (CI/CD, incident/problem/change management etc.)
- Experience in building dashboards for billing, utilization and monitoring infrastructure.
- Experience in optimizing infrastructure cost and reducing footprint.
- Strong understanding of and working experience in managing GKE and GKE cluster services.
- Experience in GKE node management, auto scaling, secrets management, config management, virtual services, gateways, Anthos Service Mesh.
- Strong knowledge of Linux, Apache web server, Java application servers, load balancers.
- Experience with any cloud-based infrastructure (GCP/AWS/Azure) and highly available, fault tolerant applications.
- Our tech stack: GCP (GKE, Cloud Composer, BigQuery, GCS etc.), but any other public cloud experience is relevant; Kubernetes, Terraform, Confluent Kafka, GitHub Actions, Helm.
- Good understanding of infrastructure and platform components: shell scripting, Python, Linux, application layer protocols (TLS/SSL, HTTP(S), DNS, etc.)
- Experience in supporting/building continuous delivery pipelines.
- Experience with deployment strategies (such as blue-green, canary, A/B).
- Good understanding of various design and architectural patterns.
- Good understanding of microservices and API management.
- Experience with monitoring/reporting tools such as Splunk, Grafana/Prometheus/Google Cloud Operations etc.
- Experience in agile practices.

Collaboration skills
Proactive can-do attitude; a creative approach to solving technical problems; able to work efficiently with colleagues in multiple locations; willing to collaborate across domains for efficiency in technology sharing and reuse; excellent communication skills in English.

How we'll support you
- Training and development to help you excel in your career.
- Coaching and support from experts in your team.
- A culture of continuous learning to aid progression.
- A range of flexible benefits that you can tailor to suit your needs.

About us and our teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
Posted 2 months ago
2 - 6 years
4 - 8 Lacs
Bengaluru
Work from Office
As an Application Developer, you will lead IBM into the future by translating system requirements into the design and development of customized systems in an agile environment. The success of IBM is in your hands as you transform vital business needs into code and drive innovation. Your work will power IBM and its clients globally, collaborating and integrating code into enterprise systems. You will have access to the latest education, tools and technology, and a limitless career path with the world's technology leader. Come to IBM and make a global impact!

IBM's Cloud Services are focused on supporting clients on their cloud journey across any platform to achieve their business goals. This encompasses Cloud Advisory, Architecture, Cloud Native Development, Application Portfolio Migration, Modernization, and Rationalization, as well as Cloud Operations.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
- End-to-end functional knowledge of the data pipeline/transformation implementations the candidate has done; should understand the purpose/KPIs for which the data transformation was done.
- Expert in SQL; can do data analysis and investigation using SQL queries.
- Implementation knowledge of advanced SQL functions like regular expressions, aggregation, pivoting, ranking, deduplication etc. (illustrated below).
- BigQuery and BigQuery transformation (using stored procedures).
- Data modelling concepts: Star and Snowflake schemas, fact and dimension tables, joins, cardinality etc.
- GCP services related to data pipelines like Workflows, Cloud Composer, Cloud Scheduler, Cloud Storage etc.

Preferred technical and professional experience
- Stay updated with the latest trends and advancements in cloud technologies, frameworks, and tools.
- Conduct code reviews and provide constructive feedback to maintain code quality and ensure adherence to best practices.
- Troubleshoot and debug issues, and deploy applications to the cloud platform.
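As an illustration of the deduplication and ranking functions listed above, here is the common ROW_NUMBER pattern expressed as a BigQuery query run from Python; the table and key names are made up.

from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project

dedup_sql = """
    SELECT * EXCEPT(rn)
    FROM (
      SELECT *,
             ROW_NUMBER() OVER (
               PARTITION BY customer_id       -- one row per business key
               ORDER BY updated_at DESC) AS rn
      FROM `my-project.raw.customers`
    )
    WHERE rn = 1   -- keep only the latest record per customer
"""

rows = client.query(dedup_sql).result()
print(f"{rows.total_rows} deduplicated rows")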
Posted 2 months ago
5 - 10 years
7 - 12 Lacs
Bengaluru
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Apache Spark, Java, Google Dataproc
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions for data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across systems. You will be responsible for the design and development of data solutions, collaborating with multiple teams, and providing solutions to problems for your immediate team and across multiple teams.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Design, develop, and maintain data solutions for data generation, collection, and processing.
- Create data pipelines to migrate and deploy data across systems.
- Ensure data quality by implementing ETL processes.
- Collaborate with multiple teams to provide solutions to data-related problems.

Professional & Technical Skills:
- Must have skills: Proficiency in Apache Spark, Java, Google Dataproc.
- Good to have skills: Experience with Apache Airflow.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity.
- Strong experience with multiple database models (SQL, NoSQL, OLTP and OLAP).
- Strong experience with data streaming architecture (Kafka, Spark, Airflow; see the sketch after this listing).
- Strong knowledge of cloud data platforms and technologies such as GCS, BigQuery, Cloud Composer, Dataproc and other cloud-native offerings.
- Knowledge of Infrastructure as Code (IaC) and associated tools (Terraform, Ansible etc.)
- Experience pulling data from a variety of data source types including mainframe (EBCDIC), fixed length and delimited files, and databases (SQL, NoSQL, time-series).
- Experience performing analysis with large datasets in a cloud-based environment, preferably with an understanding of Google Cloud Platform (GCP).
- Comfortable communicating with various stakeholders (technical and non-technical).
- GCP Data Engineer Certification is nice to have.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Apache Spark.
- This position is based in Bengaluru.
- A 15 years full time education is required.

Qualifications: 15 years full time education
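The streaming-architecture skills above (Kafka with Spark) can be sketched in a few lines of Structured Streaming; the broker address and topic are placeholders, and the console sink stands in for a durable one.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("kafka-stream-sketch").getOrCreate()

events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
          .option("subscribe", "orders")                     # placeholder topic
          .load())

# Kafka values arrive as bytes; count messages per one-minute window.
counts = (events
          .select(F.col("value").cast("string").alias("order"), F.col("timestamp"))
          .groupBy(F.window("timestamp", "1 minute"))
          .count())

query = (counts.writeStream
         .outputMode("complete")
         .format("console")   # a real job would write to a durable sink
         .start())
query.awaitTermination()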
Posted 2 months ago
2 - 7 years
4 - 9 Lacs
Coimbatore
Work from Office
Project Role: AI / ML Engineer
Project Role Description: Develops applications and systems that utilize AI tools and Cloud AI services, with a proper cloud or on-prem application pipeline of production-ready quality. Must be able to apply GenAI models as part of the solution. Could also include, but is not limited to, deep learning, neural networks, chatbots, and image processing.
Must have skills: Google Cloud Machine Learning Services
Good to have skills: GCP Dataflow, Google Pub/Sub, Google Dataproc
Minimum 2 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: We are seeking a skilled GCP Data Engineer to join our dynamic team. The ideal candidate will design, build, and maintain scalable data pipelines and solutions on Google Cloud Platform (GCP). This role requires expertise in cloud-based data engineering and hands-on experience with GCP tools and services, ensuring efficient data integration, transformation, and storage for various business use cases.

Roles & Responsibilities:
- Design, develop, and deploy data pipelines using GCP services such as Dataflow, BigQuery, Pub/Sub, and Cloud Storage.
- Optimize and monitor data workflows for performance, scalability, and reliability.
- Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and implement solutions.
- Implement data security and governance measures, ensuring compliance with industry standards.
- Automate data workflows and processes for operational efficiency.
- Troubleshoot and resolve technical issues related to data pipelines and platforms.
- Document technical designs, processes, and best practices to ensure maintainability and knowledge sharing.

Professional & Technical Skills:
a) Must have:
- Proficiency in GCP tools such as BigQuery, Dataflow, Pub/Sub, Cloud Composer, and Cloud Storage.
- Expertise in SQL and experience with data modeling and query optimization.
- Solid programming skills in Python for data processing and ETL development.
- Experience with CI/CD pipelines and version control systems (e.g., Git).
- Knowledge of data warehousing concepts, ELT/ETL processes, and real-time streaming.
- Strong understanding of data security, encryption, and IAM policies on GCP.
b) Good to have:
- Experience with Dialogflow or CCAI tools.
- Knowledge of machine learning pipelines and integration with AI/ML services on GCP.
- Certifications such as Google Professional Data Engineer or Google Cloud Architect.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Google Cloud Machine Learning Services and 3-5 years of overall experience.
- The ideal candidate will possess a strong educational background in computer science, mathematics, or a related field, along with a proven track record of delivering impactful data-driven solutions.

Qualifications: 15 years full time education
Posted 2 months ago
7 - 10 years
40 - 45 Lacs
Chennai, Ahmedabad, Kolkata
Work from Office
Dear Candidate,

We are seeking an experienced Cloud Solutions Architect to design and implement scalable, secure, and high-performance cloud solutions. You will work closely with engineering and business teams to develop cloud architectures, migration strategies, and infrastructure automation while ensuring cost optimization and security compliance.

Key Responsibilities:
- Design and implement cloud-native architectures using AWS, Azure, or Google Cloud.
- Develop and execute cloud migration strategies for on-premise to cloud transitions.
- Ensure the scalability, security, and resilience of cloud-based applications.
- Implement Infrastructure as Code (IaC) using Terraform, Bicep, or CloudFormation.
- Optimize cloud costs while ensuring performance and reliability.
- Lead cloud security initiatives, including identity management, encryption, and compliance.
- Collaborate with development teams to integrate CI/CD pipelines and DevOps best practices.
- Define and document best practices, architectural patterns, and governance frameworks.
- Implement high-availability, disaster recovery, and backup strategies.
- Stay updated with the latest cloud technologies and industry trends.

Required Skills & Qualifications:
- Extensive experience with AWS, Azure, or Google Cloud services.
- Proficiency in cloud architecture design, networking, and security best practices.
- Strong knowledge of containers and orchestration tools (Docker, Kubernetes, ECS, AKS, GKE).
- Experience with Infrastructure as Code (IaC) using Terraform, Bicep, or CloudFormation.
- Expertise in cloud security and compliance (IAM, encryption, SOC 2, ISO 27001).
- Proficiency in scripting and automation (Python, PowerShell, Bash).
- Strong understanding of DevOps methodologies and CI/CD pipelines.
- Hands-on experience with serverless computing (AWS Lambda, Azure Functions, Google Cloud Functions; a minimal handler sketch follows this listing).
- Knowledge of database management in cloud environments (RDS, Cosmos DB, BigQuery).
- Strong problem-solving, analytical thinking, and communication skills.

Soft Skills:
- Strong problem-solving and analytical skills.
- Excellent communication skills to work with cross-functional teams.
- Ability to work independently and as part of a team.
- Detail-oriented with a focus on delivering high-quality solutions.

Note: If you are interested, please share your updated resume and suggest the best number & time to connect with you. If your resume is shortlisted, one of the HRs from my team will contact you as soon as possible.

Srinivasa Reddy Kandi
Delivery Manager
Integra Technologies
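For the serverless experience the posting asks about, a minimal AWS Lambda handler in Python shows the shape of the work; the event parsing follows the standard S3 trigger format, and everything else is illustrative.

import json

def lambda_handler(event, context):
    # Log each S3 object that triggered this invocation.
    records = event.get("Records", [])
    for record in records:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        print(f"Processing s3://{bucket}/{key}")
    return {"statusCode": 200, "body": json.dumps({"processed": len(records)})}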
Posted 2 months ago
5 - 10 years
7 - 12 Lacs
Coimbatore
Work from Office
Project Role: Infrastructure Architect
Project Role Description: Lead the definition, design and documentation of technical environments. Deploy solution architectures, conduct analysis of alternative architectures, create architectural standards, define processes to ensure conformance with standards, institute solution-testing criteria, define a solution's cost of ownership, and promote a clear and consistent business vision through technical architectures.
Must have skills: Microsoft 365 Security & Compliance
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Cloud Services Engineer in the Security Delivery job family group, you will be responsible for ensuring the Cloud orchestration and automation capability is operating based on target SLAs with minimal downtime. Your typical day will involve acting as a liaison between the client and Accenture operations teams for support and escalations, communicating service delivery health to all stakeholders, and holding performance meetings to share performance and consumption data and trends.

Roles & Responsibilities:
- Act as a liaison between the client and Accenture operations teams for support and escalations.
- Communicate service delivery health to all stakeholders and explain any performance issues or risks.
- Ensure Cloud orchestration and automation capability is operating based on target SLAs with minimal downtime.
- Hold performance meetings to share performance and consumption data and trends.

Professional & Technical Skills:
- Must have skills: Experience in Microsoft 365 Security & Compliance: Defender for O365, Defender for Identity, Defender for Endpoints, Defender for Cloud Apps, Defender for Cloud, Microsoft Purview, DLP, eDiscovery, Microsoft Priva, Microsoft Sentinel.
- Good to have skills: Experience in Cloud orchestration and automation.
- Strong understanding of Cloud technologies and security principles.
- Experience in managing and monitoring Cloud infrastructure.
- Experience in incident management and problem resolution.

Additional Information:
- The candidate should have a minimum of 6 years of experience in Microsoft 365 Security & Compliance.

Qualifications: 15 years full time education
Posted 2 months ago
12 - 20 years
30 - 45 Lacs
Hyderabad
Hybrid
Job Description:
We are seeking a highly experienced Data Architect with 15-20 years of experience to lead the design and implementation of data solutions at scale. The ideal candidate will have deep expertise in cloud technologies, particularly GCP, along with a broad skill set in SQL, BigQuery, Cloud Storage, Cloud Functions, Pub/Sub, DLP, Dataproc, Cloud Composer, Python, ETL, and big data technologies like MapR/Hadoop, Hive, Spark, and Scala.

Key Responsibilities:
- Lead the design and implementation of complex data architectures across cloud platforms, ensuring scalability, performance, and cost-efficiency.
- Architect data solutions using Google Cloud Platform (GCP) services such as BigQuery, Cloud Storage, Cloud Functions, Pub/Sub, Dataproc, Cloud Composer, and DLP.
- Design and optimize ETL (Ab Initio) processes and data pipelines using Python and related technologies, ensuring seamless data integration across multiple systems.
- Work with big data technologies including Hadoop (MapR), Hive, Spark, and Scala to build and manage large-scale, distributed data systems.
- Oversee the end-to-end data flow from ingestion to processing, transformation, and storage, ensuring high availability and disaster recovery.
- Lead and mentor a team of engineers, guiding them in adopting best practices in data architecture, security, and governance.
- Define and enforce data governance, security, and compliance standards to ensure data privacy and integrity.
- Collaborate with cross-functional teams to understand business requirements and translate them into data architecture and technical solutions.
- Design and implement data lake, data warehouse, and analytics solutions to support business intelligence and advanced analytics.
- Lead the integration of cloud-native tools and services for real-time and batch processing, using Pub/Sub, Dataproc, and Cloud Composer (sketched below).
- Conduct performance tuning and optimization for SQL, BigQuery, and big data technologies to ensure efficient query execution and resource usage.
- Provide strategic direction on new data technologies, trends, and best practices to ensure the organization remains competitive and innovative.

Required Skills:
- 15-20 years of experience in data architecture, data engineering, or related roles, with a focus on cloud solutions.
- Extensive experience with Google Cloud Platform (GCP) services, particularly BigQuery, Cloud Storage, Cloud Functions, Pub/Sub, Dataproc, Cloud Composer, and DLP.
- Strong experience in ETL (Ab Initio).
- Proficient in SQL and experienced with cloud-native data storage and processing technologies (BigQuery, Hive, Hadoop, Spark).
- Expertise in Python for ETL pipeline development and data manipulation.
- Solid understanding of big data technologies such as MapR, Hadoop, Hive, Spark, and Scala.
- Experience designing and implementing scalable, high-performance data architectures and data lakes/warehouses.
- Deep understanding of data governance, security, privacy (DLP), and compliance standards.
- Proven experience leading teams and delivering large-scale data solutions in cloud environments.
- Excellent problem-solving, communication, and leadership skills.
- Ability to work with senior business and technical leaders to align data solutions with organizational goals.

Preferred Skills:
- Experience with other cloud platforms (AWS, Azure).
- Knowledge of machine learning and AI data pipelines.
- Familiarity with containerized environments and orchestration tools (e.g., Kubernetes).
- Experience with advanced analytics or data science initiatives.
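The real-time ingestion edge of an architecture like this one can be sketched with the Pub/Sub publisher client; the project, topic, and event payload are invented for the example.

import json
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "raw-events")  # placeholder IDs

event = {"customer_id": "C123", "amount": 250.0, "currency": "INR"}  # sample payload
future = publisher.publish(
    topic_path,
    json.dumps(event).encode("utf-8"),
    source="core-banking",          # attributes ride along as message metadata
)
print("Published message", future.result())  # blocks until the server acknowledges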
Posted 2 months ago
3 - 8 years
5 - 10 Lacs
Bengaluru
Work from Office
Project Role: Cloud Services Engineer
Project Role Description: Act as liaison between the client and Accenture operations teams for support and escalations. Communicate service delivery health to all stakeholders and explain any performance issues or risks. Ensure Cloud orchestration and automation capability is operating based on target SLAs with minimal downtime. Hold performance meetings to share performance and consumption data and trends.
Must have skills: SUSE Linux Administration
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Cloud Services Engineer, you will act as a liaison between the client and Accenture operations teams for support and escalations. You will communicate service delivery health to all stakeholders, explain any performance issues or risks, and ensure the Cloud orchestration and automation capability is operating based on target SLAs with minimal downtime, holding performance meetings to share performance and consumption data and trends.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work related problems.
- Ensure effective communication between client and Accenture operations teams.
- Monitor and maintain the Cloud orchestration and automation capability.
- Analyze performance data and trends to identify areas for improvement.
- Collaborate with stakeholders to address service delivery issues.
- Implement strategies to optimize service delivery efficiency.

Professional & Technical Skills:
- Must have skills: Proficiency in SUSE Linux Administration.
- Strong understanding of cloud orchestration and automation technologies.
- Experience in analyzing performance data and trends.
- Knowledge of SLAs and service delivery optimization techniques.

Additional Information:
- The candidate should have a minimum of 3 years of experience in SUSE Linux Administration.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Qualifications: 15 years full time education
Posted 2 months ago
5 - 10 years
7 - 12 Lacs
Pune
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Google Cloud Data Services, Python (Programming Language), GCP Dataflow, Apache Airflow
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Data Engineer, you will design, develop, and maintain data solutions for data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across systems. You will be responsible for designing and implementing data solutions that meet the needs of the organization and contribute to its overall success.
Roles & Responsibilities:
Expected to be an SME; collaborate with and manage the team to perform.
Responsible for team decisions.
Engage with multiple teams and contribute to key decisions.
Provide solutions to problems for their immediate team and across multiple teams.
Design and develop data pipelines to extract, transform, and load data.
Ensure data quality and integrity throughout the data processing lifecycle.
Implement ETL processes to migrate and deploy data across systems.
Collaborate with cross-functional teams to understand data requirements and design appropriate solutions.
Professional & Technical Skills:
Must Have Skills: Strong proficiency in Python (Programming Language), Apache Airflow, and Google Cloud Data Services.
Must Have Skills: 5+ years' Python programming experience, covering complex data structures as well as data pipeline development.
Must Have Skills: 5+ years' experience with Python libraries Airflow, Pandas, PySpark, Redis, SQL (or similar libraries).
Must Have Skills: 3+ years' strong SQL programming experience with mid/advanced functions such as aggregate functions (SUM, AVG), conditional functions (CASE WHEN, NULLIF), mathematical functions (ROUND, ABS), ranking functions (RANK), and window functions.
Nice to Have Skills: 3+ years' experience with Dataflow, Dataproc, Cloud Composer, BigQuery, Cloud Storage, GKE, etc.
Nice to Have Skills: As an alternative to Google Cloud, 3+ years of Python data pipeline development on any other cloud platform can be considered.
Strong experience with one of the leading public clouds.
Strong experience in designing and building scalable data pipelines covering extraction, transformation, and loading.
Strong experience in Data Engineering with an emphasis on Data Warehousing and Data Analytics.
Mandatory Experience: years of experience with Python, with working knowledge of Notebooks.
Mandatory: years working on cloud data projects.
Additional Information:
The candidate should have a minimum of 5 years of experience in Google Cloud Data Services.
This position is based at our Pune office.
A 15 years full time education is required.
Qualifications: 15 years full time education
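To illustrate the Airflow pipeline work this posting centers on, a minimal sketch of a daily ETL DAG, assuming Airflow 2.4+; the DAG ID, task name, and ETL body are hypothetical placeholders, not from the posting.

```python
# Minimal sketch of a daily ETL DAG (assumes Airflow 2.4+ for the
# `schedule` keyword). DAG ID, task ID, and ETL logic are illustrative.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_transform_load():
    # Placeholder for the real extract/transform/load steps.
    print("running ETL step")

with DAG(
    dag_id="daily_etl_pipeline",  # assumption: illustrative name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    etl_task = PythonOperator(
        task_id="extract_transform_load",
        python_callable=extract_transform_load,
    )
```

In practice, the callable would be split into separate extract, transform, and load tasks chained with dependency operators so failures can be retried per stage.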
Posted 3 months ago
5 - 10 years
7 - 12 Lacs
Pune
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Python (Programming Language), Data Engineering, Apache Airflow, SQL
Good to have skills: GCP Dataflow
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Data Engineer, you will design, develop, and maintain data solutions for data generation, collection, and processing. You will create data pipelines, ensure data quality, and implement ETL processes to migrate and deploy data across systems. Your typical day will involve designing and developing data solutions, collaborating with teams, and ensuring data integrity and quality.
Roles & Responsibilities:
Expected to be an SME; collaborate with and manage the team to perform.
Responsible for team decisions.
Engage with multiple teams and contribute to key decisions.
Provide solutions to problems for their immediate team and across multiple teams.
Design and develop data solutions for data generation, collection, and processing.
Create and maintain data pipelines to ensure efficient data flow.
Implement ETL processes to migrate and deploy data across systems.
Ensure data quality and integrity throughout the data lifecycle.
Professional & Technical Skills:
Must Have Skills: Strong proficiency in Python (Programming Language), Apache Airflow, and SQL.
Must Have Skills: 5+ years' Python programming experience, covering complex data structures as well as data pipeline development.
Must Have Skills: 5+ years' experience with Python libraries Airflow, Pandas, PySpark, Redis, SQL (or similar libraries).
Must Have Skills: 3+ years' strong SQL programming experience with mid/advanced functions such as aggregate functions (SUM, AVG), conditional functions (CASE WHEN, NULLIF), mathematical functions (ROUND, ABS), ranking functions (RANK), and window functions.
Nice to Have Skills: 3+ years' experience with Dataflow, Dataproc, Cloud Composer, BigQuery, Cloud Storage, GKE, etc.
Nice to Have Skills: As an alternative to Google Cloud, 3+ years of Python data pipeline development on any other cloud platform can be considered.
Strong experience with one of the leading public clouds.
Strong experience in designing and building scalable data pipelines covering extraction, transformation, and loading.
Strong experience in Data Engineering with an emphasis on Data Warehousing and Data Analytics.
Mandatory Experience: years of experience with Python, with working knowledge of Notebooks.
Mandatory: years working on cloud data projects.
Strong understanding of statistical analysis and machine learning algorithms.
Additional Information:
The candidate should have a minimum of 5 years of experience in Python (Programming Language).
This position is based at our Pune office.
A 15 years full time education is required.
Qualifications: 15 years full time education
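To illustrate the mid/advanced SQL this posting asks for, a minimal sketch that runs a ranking and windowed-aggregate query through the BigQuery Python client; the project, dataset, table, and column names are hypothetical assumptions.

```python
# Minimal sketch: a RANK() and windowed SUM() query run via the
# BigQuery Python client. Table and column names are assumptions.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

QUERY = """
SELECT
  customer_id,
  order_total,
  RANK() OVER (PARTITION BY customer_id ORDER BY order_total DESC) AS order_rank,
  SUM(order_total) OVER (PARTITION BY customer_id) AS customer_total
FROM `my-project.sales.orders`  -- assumption: illustrative table
WHERE order_total IS NOT NULL
"""

for row in client.query(QUERY).result():
    print(row.customer_id, row.order_rank, row.customer_total)
```

The window functions compute per-customer rankings and totals without collapsing rows, which is the usual reason postings call them out alongside plain aggregates.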
Posted 3 months ago