
1629 Cloud Platforms Jobs - Page 41

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

10.0 - 15.0 years

14 - 19 Lacs

Bengaluru

Work from Office

Our solution architects are expected to drive patterns versus products. All existing solution architects should therefore be able to expand into hybrid cloud patterns and continue to add value.

Common skill requirements for Solution Architects:
- 10 to 15 years of industry experience, preferably including at least 5 years of solution architecture (understanding customer problems and devising solutions)
- Established strength in engaging with clients and communicating effectively as a subject matter expert
- Prior experience of working with a system integrator and building solution offerings is a plus
- Good knowledge of trending technologies, e.g. AI, Security, Cloud (IBM Cloud preferred)
- Hands-on with technology: able to create demo assets and showcase them to customers, conduct PoCs, and run enablement sessions
- Interface with Ecosystem Lab teams and Technology teams to support System Integrators in procuring tech resources to build technical assets
- Ability to interlock, understand and engage with senior executives of key BUs within IBM (e.g. IBM Consulting) to create value propositions, content and assets that drive Mainframe skills, capabilities and expertise for client engagements

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- 10+ years of experience working with integration technologies
- At least 3 to 5 years of working with IBM z/OS systems, managing Z Security or Z Performance Management
- Good knowledge of OpenShift, cloud and container technologies; knowledge of Cloud Paks is a plus
- Working knowledge of IBM Cloud (administration skills and user/resource management) would be a great plus, or a learning requirement for this role
A Solution Architect for Z would primarily perform one or more of the following:
- Co-creation of solutions with GSI partners
- Development of customizable solution offerings covering various SI personas: GSI-consumable collateral, frameworks, accelerators etc. that are subsequently customized by SIs for their use
- Creation of industry solution accelerators
- Education/enablement of GSIs on new solutions or offerings
- Support for GSIs during internal or client PoCs
- Support for joint technical engagements with clients

Preferred technical and professional experience:
- Hands-on skills in software installation/maintenance on System Z, system administration and performance-tuning tasks would be a plus
- Working knowledge of cloud services that are a key part of REST API-driven integration architecture, and administrative skills on cloud platforms, preferred

Posted 1 month ago

Apply

5.0 - 10.0 years

8 - 13 Lacs

Hyderabad

Work from Office

Design, develop, and maintain high-performance, scalable Java applications using Java, Spring Boot and React/Angular. Build REST APIs and SDKs.
- Excellent in Java, OOP concepts and Java Collections
- Excellent in Spring Boot/Spring/Hibernate
- Strong proficiency in Java and related frameworks (e.g., Spring, Hibernate)
- Experience with REST API and microservices implementation
- Experience with cloud platforms (e.g., AWS, Azure, Google Cloud); experience in AWS, Docker and Kubernetes
- Knowledge of microservices architecture
- Familiarity with CI/CD pipelines and DevOps practices
- Excellent communication skills
- Ability to work effectively in a fast-paced, collaborative environment

Posted 1 month ago

Apply

2.0 - 5.0 years

14 - 17 Lacs

Navi Mumbai

Work from Office

As a Data Engineer at IBM, you'll play a vital role in application development and design, providing regular support and guidance to project teams on complex coding, issue resolution and execution. Your primary responsibilities include:
- Lead the design and construction of new solutions using the latest technologies, always looking to add business value and meet user requirements
- Strive for continuous improvement by testing the built solution and working within an agile framework
- Discover and implement the latest technology trends to maximize and build creative solutions

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Experience with Apache Spark (PySpark): in-depth knowledge of Spark's architecture, core APIs, and PySpark for distributed data processing
- Big data technologies: familiarity with Hadoop, HDFS, Kafka, and other big data tools
- Data engineering skills: strong understanding of ETL pipelines, data modelling, and data warehousing concepts
- Strong proficiency in Python: expertise in Python programming with a focus on data processing and manipulation
- Data processing frameworks: knowledge of data processing libraries such as Pandas and NumPy
- SQL proficiency: experience writing optimized SQL queries for large-scale data analysis and transformation
- Cloud platforms: experience working with cloud platforms like AWS, Azure, or GCP, including cloud storage systems

Preferred technical and professional experience:
- Define, drive, and implement an architecture strategy and standards for end-to-end monitoring
- Partner with the rest of the technology teams, including application development, enterprise architecture, testing services and network engineering
- Good to have: detection and prevention tools for Company products and Platform and customer-facing

Posted 1 month ago

Apply

6.0 - 11.0 years

8 - 13 Lacs

Hyderabad

Work from Office

Design, implement, and manage scalable, secure, and highly available infrastructure on GCP. Automate infrastructure provisioning using tools like Terraform or Deployment Manager. Build and manage CI/CD pipelines using Jenkins, GitLab CI, or similar tools. Manage containerized applications using Kubernetes (GKE) and Docker. Monitor system performance and troubleshoot infrastructure issues using tools like Stackdriver, Prometheus, or Grafana. Implement security best practices across cloud infrastructure and deployments. Collaborate with development and operations teams to streamline release processes. Ensure high availability, disaster recovery, and backup strategies are in place. Participate in performance tuning and cost optimization of GCP resources.

Requirements:
- Strong hands-on experience with Google Cloud Platform (GCP) services; Harness is an optional skill
- Proficiency in Infrastructure as Code tools like Terraform or Google Deployment Manager
- Experience with Kubernetes (especially GKE) and Docker
- Knowledge of CI/CD tools such as Jenkins, GitHub Actions, GitLab CI, or CircleCI
- Familiarity with scripting languages (e.g., Bash, Python)
- Experience with logging and monitoring tools (e.g., Stackdriver, Prometheus, ELK, Grafana)
- Understanding of networking, security, and IAM in a cloud environment
- Strong problem-solving and communication skills
- Experience in Agile environments and DevOps culture
- GCP Associate or Professional Cloud DevOps Engineer certification
- Experience with Helm, ArgoCD, or other GitOps tools
- Familiarity with other cloud platforms (AWS, Azure) is a plus
- Knowledge of application performance tuning and cost management on GCP

Posted 1 month ago

Apply

7.0 - 12.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Design, develop, and maintain data pipelines and ETL processes using Databricks. Manage and optimize data solutions on cloud platforms such as Azure and AWS. Implement big data processing workflows using PySpark. Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver effective solutions. Ensure data quality and integrity through rigorous testing and validation. Optimize and tune big data solutions for performance and scalability. Stay updated with the latest industry trends and technologies in big data and cloud computing.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field
- Proven experience as a Big Data Engineer or similar role
- Strong proficiency in Databricks and cloud platforms (Azure/AWS)
- Expertise in PySpark and big data processing
- Experience with data modeling, ETL processes, and data warehousing
- Familiarity with cloud services and infrastructure
- Excellent problem-solving skills and attention to detail
- Strong communication and teamwork abilities

Preferred Qualifications:
- Experience with other big data technologies and frameworks
- Knowledge of machine learning frameworks and libraries

Posted 1 month ago

Apply

7.0 - 12.0 years

3 - 6 Lacs

Hyderabad

Work from Office

Role: IICS Developer
Work Mode: Hybrid
Work timings: 2pm to 11pm
Location: Chennai & Hyderabad
Primary Skills: IICS

Job Summary: We are looking for a highly experienced Senior Lead Data Engineer with strong expertise in Informatica IICS, Snowflake, Unix/Linux shell scripting, CI/CD tools, Agile, and cloud platforms. The ideal candidate will lead complex data engineering initiatives, optimize data architecture, and drive automation while ensuring high standards of data quality and governance within an agile environment.

Required Qualifications:
- Minimum 5+ years of experience in data warehousing and data warehouse concepts; extensive experience in Informatica IICS and Snowflake
- Experience in designing, developing, and maintaining data integration solutions using IICS
- Experience in designing, implementing, and optimizing data storage and processing solutions using Snowflake
- Design and execute complex SQL queries for data extraction, transformation, and analysis
- Strong proficiency in Unix/Linux shell scripting and SQL
- Extensive expertise in CI/CD tools and ESP scheduling
- Experience working in agile environments, with a focus on iterative improvements and collaboration
- Knowledge of SAP Data Services is an added advantage
- Expertise in cloud platforms (AWS, Azure)
- Proven track record in data warehousing, data integration, and data governance
- Excellent data analysis and data profiling skills
- Collaborate with stakeholders to define data requirements and develop effective data strategies
- Strong leadership and communication skills, with the ability to drive strategic data initiatives

Posted 1 month ago

Apply

6.0 - 11.0 years

12 - 16 Lacs

Bengaluru

Work from Office

Develop, test and support future-ready data solutions for customers across industry verticals. Develop, test, and support end-to-end batch and near real-time data flows/pipelines. Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance, information management and associated technologies. Communicate risks and ensure understanding of these risks.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Graduate with a minimum of 6+ years of related experience
- Experience in modelling and business system design
- Good hands-on experience with DataStage and cloud-based ETL services
- Expertise in writing T-SQL code
- Well-versed in data warehouse schemas and OLAP techniques

Preferred technical and professional experience:
- Ability to manage and make decisions about competing priorities and resources; ability to delegate where appropriate
- Must be a strong team player/leader
- Ability to lead data transformation projects with multiple junior data engineers
- Strong oral, written and interpersonal skills for interacting across all levels of the organization
- Ability to communicate complex business problems and technical solutions

Posted 1 month ago

Apply

12.0 - 17.0 years

12 - 16 Lacs

Pune

Work from Office

As a member of the development group, you will become part of a team that develops and maintains one of Coupa's software products, developed using Ruby and React and built as a multi-tenant SaaS solution on cloud platforms such as AWS, Windows Azure & GCP. We expect that you are a strong leader with extensive technical experience. You have a well-founded analytical approach to finding good solutions, a strong sense of responsibility, and excellent skills in communication and planning. You are proactive in your approach and a strong team player.

What you will do:
- Implement a cloud-native analytics platform with high performance and scalability
- Build an API-first infrastructure for data in and data out
- Build data ingestion capabilities for Coupa data, as well as external spend data
- Leverage data classification AI algorithms to cleanse and harmonize data
- Own data modelling, microservice orchestration, and monitoring & alerting
- Build solid expertise in the entire Coupa application suite and leverage this knowledge to better design application and data frameworks
- Adhere to Coupa's iterative development processes to deliver concrete value each release while driving the longer-term technical vision
- Engage with cross-organizational teams such as Product Management, Integrations, Services, Support, and Operations to ensure the success of overall software development, implementation, and deployment

What you will bring to Coupa:
- Bachelor's degree in computer science, information systems, computer engineering, systems analysis or a related discipline, or equivalent work experience
- 4 to 8 years of experience building enterprise SaaS web applications using one or more modern frameworks/technologies: Java/.Net/C etc.
- Exposure to Python and familiarity with AI/ML-based data cleansing, deduplication and entity resolution techniques
- Familiarity with an MVC framework such as Django or Rails
- Full-stack web development experience, with hands-on experience building responsive UIs, single-page applications and reusable components, and a keen eye for UI design and usability
- Understanding of microservices and event-driven architecture
- Strong knowledge of APIs and integration with the backend
- Experience with relational SQL and NoSQL databases such as MySQL / PostgreSQL / AWS Aurora / Cassandra
- Proven expertise in performance optimization and monitoring tools
- Strong knowledge of cloud platforms (e.g., AWS, Azure, or GCP)
- Experience with CI/CD tooling and software delivery and bundling mechanisms
- Nice to have: expertise in Python and familiarity with AI/ML-based data cleansing, deduplication and entity resolution techniques
- Nice to have: experience with Kafka or other pub-sub mechanisms
- Nice to have: experience with Redis or other caching mechanisms

Posted 1 month ago

Apply

8.0 - 13.0 years

10 - 15 Lacs

Hyderabad

Work from Office

Design, develop, and maintain scalable and high-performance web applications using .NET technologies. Implement responsive and user-friendly front-end interfaces using React or Angular. Collaborate with cross-functional teams to define, design, and ship new features. Ensure the performance, quality, and responsiveness of applications. Integrate with cloud services (Azure/AWS) to enhance application functionality and scalability. Troubleshoot and resolve complex technical issues. Mentor junior developers and contribute to team growth and knowledge sharing. Participate in code reviews and ensure adherence to best practices and coding standards.

Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field
- 5+ years of experience in full-stack development with .NET technologies
- Proficiency in front-end development using React or Angular
- Strong experience with cloud platforms (Azure/AWS)
- Solid understanding of RESTful APIs and web services
- Experience with database design and management (SQL Server, NoSQL)
- Familiarity with DevOps practices and CI/CD pipelines
- Excellent problem-solving skills and attention to detail
- Strong communication and collaboration skills

Posted 1 month ago

Apply

5.0 - 10.0 years

9 - 13 Lacs

Gurugram

Work from Office

Develop, test and support future-ready data solutions for customers across industry verticals. Develop, test, and support end-to-end batch and near real-time data flows/pipelines. Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance, information management and associated technologies. Communicate risks and ensure understanding of these risks.

- Graduate with a minimum of 5+ years of related experience
- Experience in modelling and business system design
- Good hands-on experience with DataStage and cloud-based ETL services
- Expertise in writing T-SQL code
- Well-versed in data warehouse schemas and OLAP techniques

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Ability to manage and make decisions about competing priorities and resources; ability to delegate where appropriate
- Must be a strong team player/leader

Preferred technical and professional experience:
- Ability to lead data transformation projects with multiple junior data engineers
- Strong oral, written and interpersonal skills for interacting across all levels of the organization
- Ability to communicate complex business problems and technical solutions

Posted 1 month ago

Apply

5.0 - 10.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Develop, test and support future-ready data solutions for customers across industry verticals. Develop, test, and support end-to-end batch and near real-time data flows/pipelines. Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance, information management and associated technologies. Communicate risks and ensure understanding of these risks.

- Graduate with a minimum of 5+ years of related experience
- Experience in modelling and business system design
- Good hands-on experience with DataStage and cloud-based ETL services
- Expertise in writing T-SQL code
- Well-versed in data warehouse schemas and OLAP techniques

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Ability to manage and make decisions about competing priorities and resources; ability to delegate where appropriate
- Must be a strong team player/leader

Preferred technical and professional experience:
- Ability to lead data transformation projects with multiple junior data engineers
- Strong oral, written and interpersonal skills for interacting across all levels of the organization
- Ability to communicate complex business problems and technical solutions

Posted 1 month ago

Apply

5.0 - 10.0 years

12 - 16 Lacs

Bengaluru

Work from Office

Develop, test and support future-ready data solutions for customers across industry verticals. Develop, test, and support end-to-end batch and near real-time data flows/pipelines. Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance, information management and associated technologies. Communicate risks and ensure understanding of these risks.

- Graduate with a minimum of 5+ years of related experience
- Experience in modelling and business system design
- Good hands-on experience with DataStage and cloud-based ETL services
- Expertise in writing T-SQL code
- Well-versed in data warehouse schemas and OLAP techniques

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Ability to manage and make decisions about competing priorities and resources; ability to delegate where appropriate
- Must be a strong team player/leader

Preferred technical and professional experience:
- Ability to lead data transformation projects with multiple junior data engineers
- Strong oral, written and interpersonal skills for interacting across all levels of the organization
- Ability to communicate complex business problems and technical solutions

Posted 1 month ago

Apply

3.0 - 7.0 years

14 - 18 Lacs

Bengaluru

Work from Office

Scale an existing RAG code base for a production-grade AI application.
- Proficiency in prompt engineering, LLMs, and Retrieval-Augmented Generation
- Programming languages like Python or Java
- Experience with vector databases
- Experience using LLMs in software applications, including prompting, calling, and processing outputs
- Experience with AI frameworks such as LangChain
- Troubleshooting skills and creativity in finding new ways to leverage LLMs
- Experience with Azure

Proof of Concept (PoC) Development: Develop PoCs to validate and showcase the feasibility and effectiveness of the proposed AI solutions. Collaborate with development teams to implement and iterate on PoCs, ensuring alignment with customer requirements and expectations. Help showcase the ability of a Gen AI code assistant to refactor/rewrite and document code from one language to another, particularly COBOL to Java, through rapid prototypes/PoCs.

Documentation and Knowledge Sharing: Document solution architectures, design decisions, implementation details, and lessons learned. Create technical documentation, white papers, and best-practice guides. Contribute to internal knowledge-sharing initiatives and mentor new team members.

Industry Trends and Innovation: Stay up to date with the latest trends and advancements in AI, foundation models, and large language models. Evaluate emerging technologies, tools, and frameworks to assess their potential impact on solution design and implementation. Experience in Python and PySpark will be an added advantage.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Strong programming skills, with proficiency in Python and experience with AI frameworks such as TensorFlow, PyTorch, Keras or Hugging Face
- Understanding of libraries such as scikit-learn, Pandas, Matplotlib, etc.
- Familiarity with cloud platforms (e.g. Kubernetes, AWS, Azure, GCP) and related services is a plus
- Experience and working knowledge in COBOL & Java would be preferred
- Experience in code generation, code matching & code translation
- Prepare effort estimates, WBS, staffing plans, RACI, RAID etc.
- Excellent interpersonal and communication skills; engage with stakeholders for analysis and implementation
- Commitment to continuous learning and staying updated with advancements in the field of AI
- Demonstrate a growth mindset to understand clients' business processes and challenges

Preferred technical and professional experience:
- Education: Bachelor's, Master's, or Ph.D. degree in Computer Science, Artificial Intelligence, Data Science or a related field
- Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions
- Ability to communicate results to technical and non-technical audiences

Posted 1 month ago

Apply

4.0 - 9.0 years

17 - 22 Lacs

Bengaluru

Work from Office

Scale an existing RAG code base for a production-grade AI application.
- Proficiency in prompt engineering, LLMs, and Retrieval-Augmented Generation
- Programming languages like Python or Java
- Experience with vector databases
- Experience using LLMs in software applications, including prompting, calling, and processing outputs
- Experience with AI frameworks such as LangChain
- Troubleshooting skills and creativity in finding new ways to leverage LLMs
- Experience with Azure

Proof of Concept (PoC) Development: Develop PoCs to validate and showcase the feasibility and effectiveness of the proposed AI solutions. Collaborate with development teams to implement and iterate on PoCs, ensuring alignment with customer requirements and expectations. Help showcase the ability of a Gen AI code assistant to refactor/rewrite and document code from one language to another, particularly COBOL to Java, through rapid prototypes/PoCs.

Documentation and Knowledge Sharing: Document solution architectures, design decisions, implementation details, and lessons learned. Create technical documentation, white papers, and best-practice guides. Contribute to internal knowledge-sharing initiatives and mentor new team members.

Industry Trends and Innovation: Stay up to date with the latest trends and advancements in AI, foundation models, and large language models. Evaluate emerging technologies, tools, and frameworks to assess their potential impact on solution design and implementation. Experience in Python and PySpark will be an added advantage.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Strong programming skills, with proficiency in Python and experience with AI frameworks such as TensorFlow, PyTorch, Keras or Hugging Face
- Understanding of libraries such as scikit-learn, Pandas, Matplotlib, etc.
- Familiarity with cloud platforms (e.g. Kubernetes, AWS, Azure, GCP) and related services is a plus
- Experience and working knowledge in COBOL & Java would be preferred
- Experience in code generation, code matching & code translation
- Prepare effort estimates, WBS, staffing plans, RACI, RAID etc.
- Excellent interpersonal and communication skills; engage with stakeholders for analysis and implementation
- Commitment to continuous learning and staying updated with advancements in the field of AI
- Demonstrate a growth mindset to understand clients' business processes and challenges

Preferred technical and professional experience:
- Education: Bachelor's, Master's, or Ph.D. degree in Computer Science, Artificial Intelligence, Data Science or a related field
- Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions
- Ability to communicate results to technical and non-technical audiences

Posted 1 month ago

Apply

8.0 - 13.0 years

13 - 18 Lacs

Noida

Work from Office

Engineering at Innovaccer: With every line of code, we accelerate our customers' success, turning complex challenges into innovative solutions. Collaboratively, we transform each data point we gather into valuable insights for our customers. Join us and be part of a team that's turning dreams of better healthcare into reality, one line of code at a time. Together, we're shaping the future and making a meaningful impact on the world.

About the Role: We are seeking a highly skilled Staff Engineer to lead the architecture, development, and scaling of our Marketplace platform, including portals and core services such as Identity & Access Management (IAM), Audit, and Tenant Management. This is a hands-on technical leadership role where you will drive engineering excellence, mentor teams, and ensure our platforms are secure, compliant, and built for scale.

A Day in the Life:
- Design and implement scalable, high-performance backend systems for all the platform capabilities
- Lead the development and integration of IAM, audit logging, and compliance frameworks, ensuring secure access, traceability, and regulatory adherence
- Champion best practices for reliability, availability, and performance across all marketplace and core service components
- Mentor engineers, conduct code/design reviews, and establish engineering standards and best practices
- Work closely with product, security, compliance, and platform teams to translate business and regulatory requirements into technical solutions
- Evaluate and integrate new technologies, tools, and processes to enhance platform efficiency, developer experience, and compliance posture
- Take end-to-end responsibility for the full software development lifecycle, from requirements and design through deployment, monitoring, and operational health

What You Need:
- 8+ years of experience in backend or infrastructure engineering, with a focus on distributed systems, cloud platforms, and security
- Proven expertise in building and scaling marketplace platforms and developer/admin/API portals
- Deep hands-on experience with IAM, audit logging, and compliance tooling
- Strong programming skills in languages such as Python or Go
- Experience with cloud infrastructure (AWS, Azure), containerization (Docker, Kubernetes), and service mesh architectures
- Understanding of security protocols (OAuth, SAML, TLS), authentication/authorization, and regulatory compliance
- Demonstrated ability to lead technical projects and mentor engineering teams; excellent problem-solving, communication, and collaboration skills
- Proficiency in observability tools such as Prometheus, Grafana, and OpenTelemetry
- Prior experience with marketplaces & portals
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field

Posted 1 month ago

Apply

6.0 - 9.0 years

4 - 8 Lacs

Pune

Work from Office

- Strong Linux and Unix (Solaris 9, 10, 11) domain knowledge and troubleshooting skills; scripting (shell script, Python etc.); debug and build tools
- Experience with Red Hat Linux OS (5, 6, 7, 8); LVM and RAID knowledge; hands-on with Red Hat cluster
- Experience with Solaris OS; SVM, VxVM and RAID knowledge; hands-on with VCS cluster
- Experience in user administration for both Linux and Solaris
- Experience in file system administration for both Linux and Solaris
- Experience in print administration for both Linux and Solaris
- Experience troubleshooting and fixing hardware issues for both Linux and Solaris
- Experience in performance tuning of OS, applications and databases
- Relevant experience on a high-volume enterprise help desk or similar support role
- Excellent hands-on experience with different Linux distributions
- Exposure to virtualization and cloud platforms
- Hands-on experience with IBM and HP hardware, HP Blade servers, SUN hardware, and LDOMs, Zones and ZFS
- Experience managing various storage systems and connectivity to storage over different protocols, i.e. iSCSI, NFS, FC, FCoE, SCSI etc.
- Solid knowledge of and experience with Red Hat-based Linux distributions
- Experience with different flavours of Linux: RHEL, Debian and Ubuntu
- Experience with server patching/migration/upgradation via BigFix, Red Hat Satellite etc.

Posted 1 month ago

Apply

5.0 - 8.0 years

12 - 16 Lacs

Bengaluru

Work from Office

This is a hands-on full stack Java role where you'll design and develop real-time data logging, monitoring & analytics platforms integrated with cutting-edge cloud and microservices architecture. Location: Bangalore (Work from Office – Kumbalgodu)

Posted 1 month ago

Apply

8.0 - 12.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Design, build, and maintain our containerization and orchestration solutions using Docker and Kubernetes. Automate deployment, monitoring, and management of applications using Ansible and Python. Collaborate with development teams to ensure seamless integration and deployment. Implement and manage CI/CD pipelines to streamline software delivery. Monitor system performance and troubleshoot issues to ensure high availability and reliability. Ensure security best practices for containerized environments. Provide support and guidance for development and operations teams.

Required Skills and Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field
- Proven experience as a DevOps Engineer or in a similar role
- Extensive experience with Docker and Kubernetes
- Strong proficiency in Python and Ansible
- Solid understanding of CI/CD principles and tools
- Familiarity with cloud platforms such as AWS, Azure, or Google Cloud
- Excellent problem-solving and troubleshooting skills
- Strong communication and teamwork skills

Preferred Qualifications:
- Experience with infrastructure-as-code tools like Terraform
- Knowledge of monitoring and logging tools (e.g., Prometheus, Grafana, ELK stack)
- Familiarity with Agile development methodologies
- Experience with containerization technologies like Docker and Kubernetes

Posted 1 month ago

Apply

6.0 - 11.0 years

16 - 19 Lacs

Hyderabad

Work from Office

We are seeking a highly skilled Cloud Architect to lead the design and implementation of cloud-based solutions that drive innovation, scalability, and cost-efficiency for our organization. The ideal candidate will possess deep knowledge and hands-on experience with major cloud platforms (such as AWS, Microsoft Azure, or Google Cloud) and will play a pivotal role in defining and executing cloud strategies, infrastructure design, and deployment methodologies.

Posted 1 month ago

Apply

6.0 - 9.0 years

8 - 11 Lacs

Telangana

Work from Office

Job Summary: We are looking for a skilled DevOps Engineer with strong experience in Python, Ansible, Docker, and Kubernetes. The ideal candidate will have a proven track record of automating and optimizing processes, deploying and managing containerized applications, and ensuring system reliability and scalability. Key Responsibilities: Design, build, and maintain our containerization and orchestration solutions using Docker and Kubernetes. Automate deployment, monitoring, and management of applications using Ansible and Python. Collaborate with development teams to ensure seamless integration and deployment. Implement and manage CI/CD pipelines to streamline software delivery. Monitor system performance and troubleshoot issues to ensure high availability and reliability. Ensure security best practices for containerized environments. Provide support and guidance for development and operations teams. Required Skills and Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field. Proven experience as a DevOps Engineer or in a similar role. Extensive experience with Docker and Kubernetes. Strong proficiency in Python and Ansible. Solid understanding of CI/CD principles and tools. Familiarity with cloud platforms such as AWS, Azure, or Google Cloud. Excellent problem-solving and troubleshooting skills. Strong communication and teamwork skills. Preferred Qualifications: Experience with infrastructure-as-code tools like Terraform. Knowledge of monitoring and logging tools (e.g., Prometheus, Grafana, ELK stack). Familiarity with Agile development methodologies.

Posted 1 month ago

Apply

6.0 - 8.0 years

6 - 8 Lacs

Chennai, Tamil Nadu, India

On-site

Job Summary: Alike Thoughts Info Systems is looking for a highly skilled Java Full Stack Developer to join our dynamic development team. You'll be instrumental in designing and developing scalable web applications, leveraging your expertise in Java (backend) with Spring Boot and modern frontend frameworks like Angular or React.js. This role demands proficiency across the entire software development stack, from backend services to responsive user interfaces. Key Responsibilities: Design and develop robust, scalable web applications using Java, Spring Boot, and either Angular or React.js for the frontend. Work across the full stack, encompassing the development of both backend services and intuitive, responsive front-end interfaces. Collaborate closely with product managers, UX designers, and QA engineers to deliver high-quality, end-to-end solutions. Write clean, maintainable, and efficient code that adheres to best practices and coding standards. Actively participate in code reviews, providing constructive feedback, and contribute to mentoring junior developers to foster team growth. Nice to Have: Exposure to cloud platforms (AWS, Azure, or GCP). Knowledge of containerization tools (Docker, Kubernetes).

Posted 1 month ago

Apply

3.0 - 5.0 years

3 - 5 Lacs

Bengaluru, Karnataka, India

On-site

SQL fundamentals: SELECT, filtering (WHERE), GROUP BY, joins, aggregates, QUALIFY, and roll-up/drill-down; writing SQL to extract reports by joining fact and dimension tables. Advanced SQL: query optimisation, stored procedures, user-defined functions, and window functions for complex data manipulation (ROW_NUMBER, RANK, DENSE_RANK, LEAD, LAG). Snowflake: architecture and features, cloud computing and infrastructure (cloud platforms like AWS, Azure, and GCP, where Snowflake can be deployed), data modeling, database security and access control, metadata management, Time Travel, zero-copy cloning, and data pipeline automation with Snowpipe.
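The window functions this posting lists can be illustrated with a short, self-contained query. The `sales` table, its columns, and the sample data below are illustrative assumptions, not part of any employer schema; the example runs against SQLite for portability, though the window-function syntax matches Snowflake's.

```python
import sqlite3

# Illustrative fact table (assumption): sales amounts by region and month.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, month TEXT, amount INTEGER);
    INSERT INTO sales VALUES
      ('East', '2024-01', 100), ('East', '2024-02', 150), ('East', '2024-03', 150),
      ('West', '2024-01', 200), ('West', '2024-02', 120);
""")

# RANK leaves gaps after ties, DENSE_RANK does not, ROW_NUMBER is always
# unique; LAG fetches the previous month's amount within each region.
rows = conn.execute("""
    SELECT region, month, amount,
           ROW_NUMBER() OVER (PARTITION BY region ORDER BY amount DESC) AS rn,
           RANK()       OVER (PARTITION BY region ORDER BY amount DESC) AS rnk,
           DENSE_RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS drnk,
           LAG(amount)  OVER (PARTITION BY region ORDER BY month)       AS prev_amount
    FROM sales
    ORDER BY region, month
""").fetchall()

for row in rows:
    print(row)
```

In Snowflake, QUALIFY would let you filter on these window results directly (e.g. `QUALIFY rn = 1` to keep each region's top sale); SQLite lacks QUALIFY, so the same filter would need a subquery.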

Posted 1 month ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Mumbai

Work from Office

Provide support and management for Informatica Intelligent Cloud Services (IICS), ensuring smooth data integration and processing. You will troubleshoot issues, manage data flows, and optimize cloud services. Proficiency in IICS, data integration, and cloud platforms is required for this role.

Posted 1 month ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Chennai

Work from Office

Manage and deploy containerized applications using Kubernetes. You will optimize Kubernetes clusters, ensure high availability, and troubleshoot container-related issues. Expertise in Kubernetes, containerization, and cloud platforms is required.

Posted 1 month ago

Apply

5.0 - 8.0 years

10 - 20 Lacs

Mumbai, Hyderabad, Bengaluru

Hybrid

Greetings from Teamware Solutions a division of Quantum Leap Consulting Pvt. Ltd We are hiring a Senior Data Engineer Work Mode: Hybrid Location: Bengaluru, Hyderabad, Mumbai, Kolkata, Gurgaon, Noida Experience: 5 - 8 Years Notice Period: Immediate to 15 days Job Summary: We are seeking a highly motivated and experienced Senior Data Engineer to join our team. This role requires a deep curiosity about our business and a passion for technology and innovation. You will be responsible for designing and developing robust, scalable data engineering solutions that drive our business intelligence and data-driven decision-making processes. If you thrive in a dynamic environment and have a strong desire to deliver top-notch data solutions, we want to hear from you. Key Responsibilities: Collaborate with agile teams to design and develop cutting-edge data engineering solutions. Build and maintain distributed, low-latency, and reliable data pipelines ensuring high availability and timely delivery of data. Design and implement optimized data engineering solutions for Big Data workloads to handle increasing data volumes and complexities. Develop high-performance real-time data ingestion solutions for streaming workloads. Adhere to best practices and established design patterns across all data engineering initiatives. Ensure code quality through elegant design, efficient coding, and performance optimization. Focus on data quality and consistency by implementing monitoring processes and systems. Produce detailed design and test documentation, including Data Flow Diagrams, Technical Design Specs, and Source to Target Mapping documents. Perform data analysis to troubleshoot and resolve data-related issues. Automate data engineering pipelines and data validation processes to eliminate manual interventions. Implement data security and privacy measures, including access controls, key management, and encryption techniques. 
Stay updated on technology trends, experiment with new tools, and educate team members. Collaborate with analytics and business teams to improve data models and enhance data accessibility. Communicate effectively with both technical and non-technical stakeholders. Qualifications: Education: Bachelor's degree in Computer Science, Computer Engineering, or a related field. Experience: Minimum of 5 years in architecting, designing, and building data engineering solutions and data platforms. Proven experience in building Lakehouses or Data Warehouses on platforms like Databricks or Snowflake. Expertise in designing and building highly optimized batch/streaming data pipelines using Databricks. Proficiency with data acquisition and transformation tools such as Fivetran and DBT. Strong experience in building efficient data engineering pipelines using Python and PySpark. Experience with distributed data processing frameworks such as Apache Hadoop, Apache Spark, or Flink. Familiarity with real-time data stream processing using tools like Apache Kafka, Kinesis, or Spark Structured Streaming. Experience with various AWS services, including S3, EC2, EMR, Lambda, RDS, DynamoDB, Redshift, and Glue Catalog. Expertise in advanced SQL programming and performance tuning. Key Skills: Strong problem-solving abilities and perseverance in the face of ambiguity. Excellent emotional intelligence and interpersonal skills. Ability to build and maintain productive relationships with internal and external stakeholders. A self-starter mentality with a focus on growth and quick learning. Passion for operational products and creating outstanding employee experiences. If you are interested in this position, please send your resume to netra.s@twsol.com.
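One responsibility this posting names, automating data validation to eliminate manual intervention, can be sketched in plain Python. The column names, rules, and sample rows below are illustrative assumptions rather than any specific pipeline.

```python
# Minimal sketch of an automated data-quality check of the kind described
# above; table/column names and rules are illustrative assumptions.
def validate_rows(rows, required=("order_id", "amount")):
    """Split rows into (valid, errors) after null and range checks."""
    valid, errors = [], []
    for i, row in enumerate(rows):
        missing = [c for c in required if row.get(c) is None]
        if missing:
            errors.append((i, f"missing {missing}"))
            continue
        if not isinstance(row["amount"], (int, float)) or row["amount"] < 0:
            errors.append((i, "amount must be a non-negative number"))
            continue
        valid.append(row)
    return valid, errors

rows = [
    {"order_id": 1, "amount": 10.0},
    {"order_id": 2, "amount": None},  # fails the null check
    {"order_id": 3, "amount": -5},    # fails the range check
]
valid, errors = validate_rows(rows)
print(len(valid), len(errors))  # 1 2
```

In practice such checks would run as a scheduled step in the pipeline (e.g. a Databricks job or Airflow task), routing the error list to monitoring rather than printing it.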

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies