4.0 - 8.0 years
11 - 16 Lacs
Hyderabad
Work from Office
Job Summary: We are looking for a highly skilled AWS Data Architect to design and implement scalable, secure, and high-performing data architecture solutions on AWS. The ideal candidate will have hands-on experience in building data lakes, data warehouses, and data pipelines, along with a solid understanding of data governance and cloud security best practices.
Roles and Responsibilities:
- Design and implement data architecture solutions on AWS using services such as S3, Redshift, Glue, Lake Formation, Athena, and Lambda.
- Develop scalable ETL/ELT workflows and data pipelines using AWS Glue, Apache Spark, or AWS Data Pipeline (a minimal Glue job sketch follows this listing).
- Define and implement data governance, security, and compliance strategies, including IAM policies, encryption, and data cataloging.
- Create and manage data lakes and data warehouses that are scalable, cost-effective, and secure.
- Collaborate with data engineers, analysts, and business stakeholders to develop robust data models and reporting solutions.
- Evaluate and recommend tools, technologies, and best practices to optimize data architecture and ensure high-quality solutions.
- Ensure data quality, performance tuning, and optimization for large-scale data storage and processing.
Required Skills and Qualifications:
- Proven experience in AWS data services such as S3, Redshift, Glue, etc.
- Strong knowledge of data modeling, data warehousing, and big data architecture.
- Hands-on experience with ETL/ELT tools and data pipeline frameworks.
- Good understanding of data security and compliance in cloud environments.
- Excellent problem-solving skills and ability to work collaboratively with cross-functional teams.
- Strong verbal and written communication skills.
Preferred Skills:
- AWS Certified Data Analytics – Specialty or AWS Solutions Architect certification.
- Experience in performance tuning and optimizing large datasets.
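For illustration only: a minimal AWS Glue job skeleton in Python (PySpark) sketching the kind of ETL workflow this listing describes. It assumes the Glue Spark runtime (which provides the awsglue library); the bucket, prefix, and column names are hypothetical placeholders, not details from the listing.

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

# Standard Glue job bootstrap: resolve the job name passed in by the Glue runtime.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
sc = SparkContext()
glue_context = GlueContext(sc)
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read raw CSV files from the data lake (hypothetical bucket/prefix).
raw = spark.read.option("header", "true").csv("s3://example-data-lake/raw/orders/")

# Basic curation: drop duplicates and rows missing a required column.
curated = raw.dropDuplicates().filter("order_total IS NOT NULL")

# Write the curated layer back to S3 as Parquet for downstream Redshift/Athena use.
curated.write.mode("overwrite").parquet("s3://example-data-lake/curated/orders/")

job.commit()
```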
Posted 1 week ago
15.0 - 20.0 years
5 - 9 Lacs
Chennai
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Amazon Web Services (AWS)
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: We are looking for an experienced AWS Cloud Specialist to help design, build, and maintain cloud infrastructure that supports robust and secure connectivity between AWS and Salesforce platforms. The ideal candidate will have strong practical knowledge of core AWS services, including EC2 (VMs), S3, IAM, CloudWatch, and VPC, as well as experience with advanced networking solutions like Direct Connect and NLB. This role is critical to ensuring smooth, secure integration and performance between cloud applications and Salesforce.
Roles & Responsibilities:
- Design and manage foundational AWS infrastructure components, including: EC2 instances (VMs); S3 buckets; IAM roles, users, and policies; CloudWatch and CloudTrail for monitoring and auditing.
- Architect and implement secure AWS networking using VPC, subnets, route tables, NAT Gateways, and Network Load Balancers (NLBs).
- Establish and maintain Salesforce-to-AWS connectivity using Direct Connect, VPNs, and Salesforce Private Connect (a monitoring sketch follows this listing).
- Support Salesforce API integrations and ensure network performance for connected services.
- Automate infrastructure provisioning and management using Terraform or CloudFormation.
- Monitor system health, diagnose issues, and troubleshoot performance bottlenecks across AWS and Salesforce touchpoints.
- Maintain clear documentation for architecture, configurations, procedures, and change management.
Professional & Technical Skills:
- Proficiency with core AWS services, including: EC2 (virtual machines), S3 (storage), IAM (identity & access management), CloudWatch, CloudTrail.
- Hands-on experience with: VPC design, subnets, security groups, route tables; Direct Connect, VPN, and hybrid network setup; Network Load Balancer (NLB) and target group configuration.
- Understanding of Salesforce integration methods such as Private Connect and REST APIs
- Familiarity with cloud networking and security best practices
- Experience with Infrastructure-as-Code tools like Terraform or CloudFormation
- Strong problem-solving, collaboration, and documentation skills
- AWS Certifications (e.g., Solutions Architect Associate, Advanced Networking Specialty)
- Exposure to API Gateway, Lambda, and serverless integration
- Knowledge of Salesforce authentication protocols (OAuth2, SAML)
- Scripting experience (e.g., Python, Bash) for automation tasks
- Understanding of compliance standards (e.g., GDPR, HIPAA) and cloud cost optimization strategies
Additional Information:
- The candidate should have minimum 5 years of experience in Amazon Web Services (AWS).
- This position is based at our Chennai office.
- A 15 years full time education is required.
Qualification: 15 years full time education
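For illustration only: a hedged boto3 sketch showing how the NLB target health mentioned above could be checked during troubleshooting. The target group ARN is a hypothetical placeholder, not from the listing.

```python
import boto3

elbv2 = boto3.client("elbv2")

# Hypothetical target group ARN for the NLB fronting the Salesforce connectivity endpoint.
TARGET_GROUP_ARN = (
    "arn:aws:elasticloadbalancing:ap-south-1:123456789012:"
    "targetgroup/sfdc-nlb-tg/0123456789abcdef"
)

response = elbv2.describe_target_health(TargetGroupArn=TARGET_GROUP_ARN)
for description in response["TargetHealthDescriptions"]:
    target_id = description["Target"]["Id"]
    state = description["TargetHealth"]["State"]  # e.g. healthy, unhealthy, draining
    print(f"{target_id}: {state}")
```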
Posted 1 week ago
3.0 - 8.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: The Data Engineering Sr. Advisor demonstrates expertise in data engineering technologies with a focus on engineering, innovation, strategic influence and a product mindset. This individual will act as a key contributor on the team to design, build, test and deliver large-scale software applications, systems, platforms, services or technologies in the data engineering space. This individual will have the opportunity to work directly with partner IT and business teams, owning and driving major deliverables across all aspects of software delivery. The candidate will play a key role in automating processes on Databricks and AWS, collaborating with business and technology partners to gather requirements, develop and implement solutions. The individual must have strong analytical and technical skills coupled with the ability to positively influence the delivery of data engineering products. The applicant will be working in a team that demands an innovation, cloud-first, self-service-first, and automation-first mindset coupled with technical excellence, and will work with internal and external stakeholders and customers to build solutions as part of Enterprise Data Engineering, so very strong technical and communication skills are required.
Delivery: Intermediate delivery skills, including the ability to deliver work at a steady, predictable pace to achieve commitments, decompose work assignments into small batch releases, and contribute to tradeoff and negotiation discussions.
Domain Expertise: Demonstrated track record of domain expertise, including the ability to understand technical concepts necessary to do the job effectively, demonstrate willingness, cooperation, and concern for business issues, and possess in-depth knowledge of the immediate systems worked on.
Problem Solving: Proven problem-solving skills, including debugging skills that allow you to determine the source of issues in unfamiliar code or systems, the ability to recognize and solve repetitive problems rather than working around them, to recognize mistakes and use them as learning opportunities, and to break down large problems into smaller, more manageable ones.
Roles & Responsibilities:
- Deliver business needs end to end, from requirements through development into production.
- Through a hands-on engineering approach in the Databricks environment, deliver data engineering toolchains, platform capabilities and reusable patterns.
- Follow software engineering best practices with an automation-first approach and a continuous learning and improvement mindset.
- Ensure adherence to enterprise architecture direction and architectural standards.
- Collaborate in a high-performing team environment, with an ability to influence and be influenced by others.
Experience Required:
- More than 12 years of experience in software engineering, building data engineering pipelines, middleware and API development and automation
- More than 3 years of experience in Databricks within an AWS environment
- Data Engineering experience
Experience Desired:
- Expertise in Agile software development principles and patterns
- Expertise in building streaming, batch and event-driven architectures and data pipelines (a minimal streaming sketch follows this listing)
Primary Skills:
- Cloud-based security principles and protocols like OAuth2, JWT, data encryption, hashing data, secret management, etc.
- Expertise in Big Data technologies such as Spark, Hadoop, Databricks, Snowflake, EMR, Glue
- Good understanding of Kafka, Kafka Streams, Spark Structured Streaming, configuration-driven data transformation and curation
- Expertise in building cloud-native microservices, containers, Kubernetes and platform-as-a-service technologies such as OpenShift, CloudFoundry
- Experience in multi-cloud software-as-a-service products such as Databricks, Snowflake
- Experience in Infrastructure-as-Code (IaC) tools such as Terraform and AWS CloudFormation
- Experience in messaging systems such as Apache ActiveMQ, WebSphere MQ, Apache Artemis, Kafka, AWS SNS
- Experience in API and microservices stacks such as Spring Boot, Quarkus
- Expertise in Cloud technologies such as AWS Glue, Lambda, S3, Elastic Search, API Gateway, CloudFront
- Experience with one or more of the following programming and scripting languages: Python, Scala, JVM-based languages, or JavaScript, and ability to pick up new languages
- Experience in building CI/CD pipelines using Jenkins, GitHub Actions
- Strong expertise with source code management and its best practices
- Proficient in self-testing of applications, unit testing and use of mock frameworks, test-driven development (TDD)
- Knowledge of the Behavior-Driven Development (BDD) approach
Additional Skills:
- Ability to perform detailed analysis of business problems and technical environments
- Strong oral and written communication skills
- Ability to think strategically, implement iteratively and estimate the financial impact of design/architecture alternatives
- Continuous focus on ongoing learning and development
Qualification: 15 years full time education
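For illustration only: a minimal PySpark Structured Streaming sketch of the Kafka-to-Delta pattern this listing references. It assumes a Databricks (or Spark-with-Kafka-connector) runtime; the broker address, topic, schema, and paths are hypothetical placeholders.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType

spark = SparkSession.builder.appName("kafka-stream-example").getOrCreate()

# Hypothetical event schema for messages arriving on the Kafka topic.
schema = StructType([
    StructField("event_id", StringType()),
    StructField("status", StringType()),
])

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "events")                      # placeholder topic
    .load()
)

# Kafka delivers bytes; cast the value to string and parse the JSON payload.
parsed = raw.select(from_json(col("value").cast("string"), schema).alias("e")).select("e.*")

query = (
    parsed.writeStream.format("delta")                  # Delta sink on Databricks
    .option("checkpointLocation", "/tmp/checkpoints/events")
    .outputMode("append")
    .start("/tmp/tables/events")
)
query.awaitTermination()
```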
Posted 1 week ago
5.0 - 10.0 years
13 - 17 Lacs
Pune
Work from Office
Project Role: Security Architect
Project Role Description: Define the security architecture, ensuring that it meets the business requirements and performance goals.
Must have skills: Security Platform Engineering
Good to have skills: Java Enterprise Edition, Amazon Web Services (AWS), Infrastructure As Code (IaC)
Minimum 5 year(s) of experience is required
Educational Qualification: BE or BTech Degree in Computer Science
Summary: As a Security Architect, you will be solving deep technical problems and building creative solutions in a dynamic environment, working with knowledgeable and passionate SDEs. You are experienced in building for the cloud: designing for five 9s, globally distributed all-active deployments, horizontal scalability, fault tolerance, and more. You are motivated by learning, evaluating, and deploying new technologies. Our services are deployed in an Amazon Web Services environment, so you will be working hands-on with many AWS components. You thrive in a true agile, highly paced, production-facing environment. You have a low tolerance for mediocrity. You love to write code and build extraordinary things. We are looking for coders, people who love to code, just like we do. You should be energetic, confident, and ready to contribute in many areas of the software development lifecycle. You may be involved in all aspects, from research, design, and specs to coding and bug fixing. Our team focus is on writing dependable code and getting high-quality products and services to market as quickly as possible.
Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Develop and implement security policies and procedures.
- Conduct security assessments and audits.
- Stay updated on the latest security trends and technologies.
Professional & Technical Skills:
- Experience with CI/CD pipelines (e.g., Jenkins), building and/or configuring pipelines for build and deployment of software.
- Coding knowledge and experience with SQL
- Experience writing scripts in Python, PowerShell or Bash
- Experience in building highly available (HA) production-grade solutions in AWS (a minimal event-driven sketch follows this listing)
- CICV, CICD, Automation (Terraform a plus)
- Experience designing/implementing high-performance web services using SOA/REST/Microservices
- Experience in the design/build/maintenance/refactoring of large-scale, low-latency, high-performance systems
- Ability to quickly learn and develop expertise in existing highly complex applications and architectures
- Extensive knowledge of high-volume distributed application development in cloud environments
- Strong troubleshooting and debugging skills, particularly in both production and non-production environments
- Experience using Agile methodologies, TDD, code review, and clear, concise documentation
- Strong analytic, problem-solving, and troubleshooting skills
- Uncommon ability and motivation to tackle problems and learn fast
- Ability to perform at a high level within a technical team
- Ability to work independently with minimal supervision
- Excellent communication and relationship skills
- Distributed teamwork
Additional Information:
- The candidate should have a minimum of 5 years of experience in Security Platform Engineering.
- Minimum 3 years of experience building AWS cloud-native services using EC2, S3, ECS, SQS, API Gateway, Lambda, etc.
- Minimum 5 years of coding knowledge and experience with Java and/or C++ and object-oriented methodologies.
- Minimum 1 year of experience with CI/CD pipelines (e.g., Jenkins), building and/or configuring pipelines for build and deployment of software.
- This position is based at our Pune office.
- A BE or BTech Degree in Computer Science or a related technical field, or equivalent practical knowledge, is required.
Qualification: BE or BTech Degree in Computer Science
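For illustration only: a hedged Python sketch of the SQS + Lambda pattern this listing mentions among its AWS cloud-native services. The message fields are hypothetical, not the employer's actual payloads.

```python
import json


def lambda_handler(event, context):
    """Process messages delivered by an SQS event source mapping."""
    processed = 0
    for record in event.get("Records", []):
        # Each SQS record carries the message payload in the "body" field.
        message = json.loads(record["body"])
        # Hypothetical field; real payloads depend on the producing service.
        print(f"handling request {message.get('request_id')}")
        processed += 1
    # Returning normally signals to Lambda that the whole batch succeeded.
    return {"statusCode": 200, "processed": processed}
```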
Posted 1 week ago
5.0 - 10.0 years
5 - 9 Lacs
Coimbatore
Work from Office
Project Role: Application Designer
Project Role Description: Assist in defining requirements and designing applications to meet business process and application requirements.
Must have skills: AWS Architecture
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: should be a graduate and AWS certified
Summary: As an Application Designer, you will assist in defining requirements and designing applications to meet business process and application requirements. You will play a crucial role in ensuring that the applications are designed to meet the needs of the organization and its stakeholders. Your typical day will involve collaborating with various teams, analyzing requirements, and designing innovative solutions to address business challenges.
Roles & Responsibilities:
- Expected to be an SME, collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Collaborate with stakeholders to gather requirements and understand business processes.
- Design and develop applications that meet the business process and application requirements.
- Ensure the applications are scalable, secure, and efficient.
- Conduct code reviews and provide guidance to the development team.
- Stay updated with the latest industry trends and technologies.
- Assist in troubleshooting and resolving application issues.
- Document application designs, processes, and procedures.
- Train and mentor junior team members to enhance their skills and knowledge.
Professional & Technical Skills:
- Must To Have Skills: Proficiency in AWS Architecture.
- Good To Have Skills: Experience with cloud platforms such as Azure or Google Cloud.
- Strong understanding of cloud computing concepts and architecture.
- Experience in designing and implementing scalable and secure cloud solutions.
- Knowledge of AWS services such as EC2, S3, Lambda, RDS, and DynamoDB.
- Familiarity with infrastructure-as-code tools like CloudFormation or Terraform (a minimal deployment sketch follows this listing).
- Experience in designing and implementing CI/CD pipelines.
- Excellent problem-solving and analytical skills.
Additional Information:
- The candidate should have a minimum of 5 years of experience in AWS Architecture.
- This position is based at our Coimbatore office.
- A graduate degree is required and AWS certification is preferred.
Qualification: should be a graduate and AWS certified
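For illustration only: a hedged sketch of driving a CloudFormation deployment from Python with boto3, one possible way to exercise the infrastructure-as-code skills listed above. The stack name and template path are hypothetical placeholders.

```python
import boto3

cfn = boto3.client("cloudformation")

STACK_NAME = "example-app-stack"           # hypothetical stack name
with open("template.yaml") as handle:       # hypothetical local template file
    template_body = handle.read()

cfn.create_stack(
    StackName=STACK_NAME,
    TemplateBody=template_body,
    Capabilities=["CAPABILITY_NAMED_IAM"],  # required when the template creates IAM resources
)

# Block until CloudFormation reports CREATE_COMPLETE (or raise if creation fails).
cfn.get_waiter("stack_create_complete").wait(StackName=STACK_NAME)
print(f"{STACK_NAME} created")
```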
Posted 1 week ago
15.0 - 20.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: ASP.NET MVC
Good to have skills: Amazon Web Services (AWS)
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team in implementing effective solutions. You will also engage in strategic planning sessions to align project goals with organizational objectives, ensuring that all stakeholders are informed and involved in the development process. Your role will be pivotal in driving innovation and efficiency within the application development lifecycle, fostering a collaborative environment that encourages creativity and problem-solving.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and implement necessary adjustments to meet deadlines.
Professional & Technical Skills:
- Must To Have Skills: Proficiency in ASP.NET MVC.
- Good To Have Skills: Experience with Amazon Web Services (AWS).
- Strong understanding of web application architecture and design patterns.
- Experience with front-end technologies such as HTML, CSS, and JavaScript.
- Familiarity with database management systems and SQL.
- Ability to troubleshoot and optimize application performance.
Must:
- AWS: Lambda, DynamoDB, CloudWatch, ...
- GitHub
Must/Nice:
- NodeJS
Nice:
- React
- Azure DevOps
- NewRelic knowledge
- Fintech knowledge
Additional Information:
- The candidate should have minimum 5 years of experience in ASP.NET MVC.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.
Qualification: 15 years full time education
Posted 1 week ago
15.0 - 20.0 years
10 - 14 Lacs
Hyderabad
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Spring Boot, Amazon Web Services (AWS), Oracle Procedural Language Extensions to SQL (PLSQL)
Good to have skills: NA
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your day will involve overseeing the application development process and ensuring seamless communication within the team and with stakeholders.
Roles & Responsibilities:
- Design, develop, and maintain high-quality applications using Spring Boot.
- Collaborate with cross-functional teams to identify and prioritize application requirements.
- Develop and maintain microservices and lightweight architecture.
- Integrate MongoDB with Spring Boot applications for efficient data storage and retrieval.
Professional & Technical Skills:
- Must To Have Skills: Proficiency in Spring Boot, Amazon Web Services (AWS), Oracle Procedural Language Extensions to SQL (PLSQL).
- Resource should be good at coding; please conduct a coding test.
- Strong understanding of Spring Boot and its various components.
- Experience with RESTful web services and API development.
- Experience with database design and development.
- Experience with version control systems such as Git.
- Experience with agile development methodologies such as Scrum or Kanban.
- Knowledge of database technologies such as MySQL, PostgreSQL, or MongoDB.
- Good to have: AWS AppSync and Lambda experience.
- Ready to work in shifts: 12 PM to 10 PM.
Additional Information:
- The candidate should have a minimum of 12 years of experience in Spring Boot.
- This position is based at our Hyderabad office.
- A 15 years full-time education is required.
Qualification: 15 years full time education
Posted 1 week ago
7.0 - 10.0 years
27 - 42 Lacs
Chennai
Work from Office
Data Engineer Skills and Qualifications:
- SQL - Mandatory
- Strong knowledge of AWS services (e.g., S3, Glue, Redshift, Lambda) - Mandatory
- Experience working with DBT - Nice to have
- Proficiency in PySpark or Python for big data processing - Mandatory
- Experience with orchestration tools like Apache Airflow and AWS CodePipeline - Mandatory (a minimal Airflow DAG sketch follows this listing)
- Familiarity with CI/CD tools and DevOps practices
- Expertise in data modeling, ETL processes, and data warehousing
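For illustration only: a minimal Apache Airflow DAG (Airflow 2.4+) sketching the extract-then-load orchestration pattern the listing asks about. The DAG name, tasks, and schedule are hypothetical.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull raw data (e.g. from S3 or an API) and stage it.
    print("extracting raw data")


def load():
    # Placeholder: load the staged data into the warehouse (e.g. Redshift).
    print("loading into the warehouse")


with DAG(
    dag_id="daily_sales_pipeline",       # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task            # load runs only after extract succeeds
```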
Posted 1 week ago
12.0 - 15.0 years
20 - 25 Lacs
Pune, Bengaluru, Hinjewadi
Work from Office
Job requisition ID: JR1027350
Job Summary: Synechron is seeking an experienced and strategic Delivery Lead to oversee complex technology projects utilizing .NET, C#, and AWS. This role is instrumental in managing end-to-end project delivery, guiding cross-functional teams, and ensuring alignment with business objectives. The Delivery Lead will drive improvements in delivery efficiency, quality, and stakeholder satisfaction, contributing significantly to the organization's technological growth and operational excellence.
Software - Required Skills:
- Development and delivery experience with .NET (preferably version 4.7 or later)
- C# programming proficiency
- Hands-on experience with Amazon Web Services (AWS) (EC2, S3, Lambda, etc.)
- Project management tools: Jira, SharePoint, MS Excel, PowerPoint, Power BI
- Agile and Waterfall project management methodologies
Preferred Skills:
- Experience with DevOps/CI-CD pipelines
- Knowledge of the Azure cloud platform
- Familiarity with software release management
Overall Responsibilities:
- End-to-End Project Delivery: Manage multiple projects from initiation to closure, ensuring delivery is on time, within scope, and within budget.
- Team Leadership: Lead and mentor diverse project teams, fostering collaboration and high performance.
- Stakeholder Management: Act as the primary point of contact for clients and internal stakeholders, translating business needs into technical solutions.
- Governance & Compliance: Ensure adherence to organizational policies, standards, and industry best practices.
- Technical Oversight: Provide guidance on architecture, technology choices, and solution design aligned with best practices.
- Process Optimization: Continuously identify opportunities to improve delivery processes, increase efficiency, and reduce risk.
- Financial Oversight: Monitor project budgets, optimize resource utilization, and report on financial performance.
- Risk & Issue Management: Identify, assess, and mitigate risks impacting project delivery.
- Performance Measurement: Establish metrics and KPIs to measure project success and customer satisfaction.
Technical Skills (By Category):
- Programming Languages - Essential: C#, .NET Framework/Core; Preferred: Java, Python (for integration or automation)
- Databases/Data Management: SQL Server, AWS RDS
- Cloud Technologies: AWS cloud services (EC2, S3, Lambda, CloudWatch)
- Frameworks & Libraries: .NET Core / .NET Framework; RESTful APIs, microservices architecture
- Development Tools & Methodologies: Agile, Scrum, Kanban; DevOps tools (Jenkins, Azure DevOps, Git)
- Security Protocols: AWS security best practices; data privacy and compliance standards
Experience:
- 12 to 15 years of professional experience in managing IT projects and delivery teams.
- Demonstrable experience leading large-scale software development and implementation projects.
- Strong background in .NET/C# development, AWS cloud solutions, and cross-functional team management.
- Experience managing global or distributed teams.
- Proven stakeholder management experience with senior management and clients.
- Prior exposure to Agile and Waterfall project methodologies.
- Alternative Experience: Candidates with extensive experience in software delivery, cloud migration, or enterprise application implementation may be considered.
Day-to-Day Activities:
- Conduct project planning, resource allocation, and status reporting.
- Hold regular stand-ups, progress reviews, and stakeholder meetings.
- Review development progress, remove blockers, and ensure adherence to quality standards.
- Collaborate with technical teams on architecture design and problem resolution.
- Manage change requests, scope adjustments, and project adjustments.
- Track project KPIs, update dashboards, and communicate progress to leadership.
- Oversee risk registers and implement mitigation strategies.
- Facilitate retrospectives and process improvement initiatives.
Qualifications:
- Bachelor's degree in Computer Science, Engineering, or related field; Master's preferred.
- Project Management certifications such as PMP, PMI-ACP, or ScrumMaster are advantageous.
- Training or certification in AWS or cloud architecture is preferred.
- Commitment to continuous learning and professional development.
Professional Competencies:
- Strong analytical and problem-solving skills
- Effective leadership and team management capabilities
- Excellent stakeholder communication and negotiation skills
- Ability to adapt to evolving project requirements and technologies
- Strategic thinking and organizational agility
- Data-driven decision-making
- Prioritization and time management skills
- Change management and process improvement orientation
Posted 1 week ago
8.0 - 12.0 years
30 - 40 Lacs
Pune
Work from Office
Assessment & Analysis:
- Review CAST software intelligence reports to identify technical debt, architectural flaws, and cloud readiness.
- Conduct manual assessments of applications to validate findings and prioritize migration efforts.
- Identify refactoring needs (e.g., monolithic to microservices, serverless adoption).
- Evaluate legacy systems (e.g., .NET Framework, Java EE) for compatibility with AWS services.
Solution Design:
- Develop migration strategies (rehost, replatform, refactor, retire) for each application.
- Architect AWS-native solutions using services like EC2, Lambda, RDS, S3, and EKS.
- Design modernization plans for legacy systems (e.g., .NET Framework to .NET Core, Java EE to Spring Boot).
- Ensure compliance with the AWS Well-Architected Framework (security, reliability, performance, cost optimization).
Collaboration & Leadership:
- Work with cross-functional teams (developers, DevOps, security) to validate designs.
- Partner with clients to align technical solutions with business objectives.
- Mentor junior architects and engineers on AWS best practices.
Roles and Responsibilities:
Job Title: Senior Solution Architect - Cloud Migration & Modernization (AWS)
Location: [Insert Location]
Department: Digital Services
Reports To: Cloud SL
Posted 2 weeks ago
5.0 - 9.0 years
8 - 13 Lacs
Noida
Work from Office
About the Role: Grade Level (for internal use): 11
The Team: Dividend Forecasting (DF) provides discrete forecasts for over 28,000 stocks worldwide, supported by regional insights from our team of 40 dividend analysts and leveraging our Advanced Analytics Dividend Forecasting Model. Our Dividend Forecasting service provides timely data, insights and commentary to help financial institutions price derivatives, enhance investment decisions and manage risks. The team, located in India, the US, Singapore, and the UK, is currently focused on working on strategic projects and BAU in a cloud-native architecture to enhance scalability and efficiency.
The Impact:
- Scalability and Performance: Your role will be key in designing the platform and projects for cloud-native architecture, supporting product growth and optimizing performance to handle increased data loads.
- Innovation and Efficiency: You will implement advanced cloud technologies and best practices, streamlining processes to enhance product features and operational efficiency and accelerate service delivery.
What's in it for you: Joining this role presents a unique opportunity for professional development and skill enhancement in a dynamic and innovative environment.
- Cloud Development Skills: Gain expertise in cloud-native technologies and frameworks, enhancing your proficiency in modern application development.
- Cross-Functional Collaboration: Work closely with diverse teams across regions, improving your collaboration and communication skills in a global environment.
- Agile Methodologies: Experience working in an Agile development environment, allowing you to adapt quickly to changes and improve project management skills.
- Data Management and Analytics: Develop skills in managing and analyzing large datasets, which is crucial for optimizing performance and driving data-driven decisions.
Responsibilities:
- Design, develop, and maintain cloud-based applications using a Java-based stack.
- Migrate legacy components to modern cloud architecture, ensuring scalability, reliability, and security.
- Implement and manage AWS cloud technologies, focusing on commonly used services such as EC2, S3, Lambda, and RDS.
- Take ownership of projects, from concept to delivery, ensuring adherence to project timelines and objectives.
- Work as part of an agile team to identify and deliver solutions to prioritized requirements.
- Must demonstrate strong expertise in system design, architectural patterns, and building efficient, scalable systems.
What We're Looking For:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 5 to 9 years of experience in software development with hands-on experience in Java, Spring and Angular.
- Hands-on experience and knowledge of AWS (S3, Lambda, Step Functions, SNS, SQS, RDS, ECS, others).
- Hands-on experience and knowledge of RDBMS (MS SQL Server, PostgreSQL, Oracle, others).
- Excellent problem-solving skills and the ability to work independently or as part of a team.
- Exceptional communication skills, with the ability to articulate technical concepts to non-technical stakeholders.
- Good to have: knowledge of the Python language.
Basic Qualifications: Bachelor's degree in Computer Science, Engineering, or a related field.
Preferred Qualifications: B.Tech in Computer Science, IT, Engineering, or a related field; MCA in Computer Science.
Posted 2 weeks ago
5.0 - 10.0 years
2 - 6 Lacs
Bengaluru
Work from Office
Req ID: 324162
We are currently seeking a Python Engineer with AWS and Java to join our team in Bangalore, Karnataka (IN-KA), India (IN).
Job Duties: Morgan Stanley is seeking a highly skilled Senior Python Developer with over 5 years of experience to join our team in developing a state-of-the-art electronic communications surveillance system. This system will monitor all voice communications, chats, and email messages of employees across the firm, ensuring compliance and security. The ideal candidate will have a proven track record in writing high-performance, low-latency code capable of processing millions of messages daily, with expertise in Python, a solid understanding of data structures and design patterns, and familiarity with Java.
Responsibilities:
- Design, develop, and implement a robust surveillance system from the ground up to monitor electronic communications in real time.
- Write high-performance, low-latency Python code to handle large-scale message processing (millions of messages per day).
- Collaborate with cross-functional teams to define system architecture and ensure scalability, reliability, and maintainability.
- Optimize data processing pipelines using Apache Kafka for real-time message streaming (a minimal consumer sketch follows this listing).
- Leverage Amazon AWS for cloud-based infrastructure, ensuring secure and efficient deployment.
- Design and maintain database schemas in PostgreSQL for efficient data storage and retrieval.
- Integrate Collibra for data governance and metadata management.
- Utilize Airflow for workflow orchestration and scheduling.
- Implement CI/CD pipelines using Jenkins and manage containerized applications with Docker.
- Use Artifactory for artifact management and dependency tracking.
- Apply advanced knowledge of data structures and design patterns to create clean, modular, and reusable code.
- Contribute to code reviews, testing, and documentation to maintain high-quality standards.
Minimum Skills Required:
- Experience: 5+ years of professional software development experience, with a focus on Python.
- Technical Skills:
  o Expertise in writing high-performance, low-latency Python code for large-scale systems.
  o Strong understanding of data structures, algorithms, and design patterns.
  o Familiarity with Java for cross-language integration and support.
  o Hands-on experience with Apache Kafka for real-time data streaming.
  o Proficiency in Amazon AWS services (e.g., EC2, S3, Lambda, RDS).
  o Experience with PostgreSQL for relational database management.
  o Knowledge of Collibra for data governance (preferred).
  o Familiarity with Apache Airflow for workflow orchestration.
  o Experience with Jenkins CI for continuous integration and deployment.
  o Proficiency in Docker for containerization and Artifactory for artifact management.
- Soft Skills:
  o Strong problem-solving skills and attention to detail.
  o Ability to work independently and collaboratively in a fast-paced environment.
  o Excellent communication skills to articulate technical concepts to non-technical stakeholders.
- Education: Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).
Preferred Qualifications:
- Experience in financial services or compliance systems.
- Familiarity with surveillance or monitoring systems for voice, chat, or email communications.
- Knowledge of regulatory requirements in the financial industry.
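For illustration only: a minimal consumer for the Kafka-based message-processing pattern described above, written with the kafka-python client as an assumed library choice. The topic, broker, group name, and payload fields are hypothetical.

```python
import json

from kafka import KafkaConsumer  # assumes the kafka-python client is installed

consumer = KafkaConsumer(
    "comm-events",                               # hypothetical topic of communication events
    bootstrap_servers=["broker:9092"],           # placeholder broker address
    group_id="surveillance-ingest",              # hypothetical consumer group
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
    enable_auto_commit=True,
)

for message in consumer:
    event = message.value
    # Placeholder processing step; a real system would enrich, score, and persist the event.
    print(f"partition={message.partition} offset={message.offset} type={event.get('type')}")
```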
Posted 2 weeks ago
5.0 - 10.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Req ID: 306668
We are currently seeking a Cloud Solution Delivery Sr Advisor to join our team in Bangalore, Karnataka (IN-KA), India (IN).
Position Overview: We are seeking a highly skilled and experienced Lead Data Engineer to join our dynamic team. The ideal candidate will have a strong background in implementing data solutions using AWS infrastructure and a variety of core and supplementary technologies, and in leading teams and directing engineering workloads. This role requires a deep understanding of data engineering and cloud services, and the ability to implement high-quality solutions.
Key Responsibilities: Lead and direct a small team of engineers engaged in:
- Engineering end-to-end data solutions using AWS services, including Lambda, S3, Snowflake, DBT, and Apache Airflow
- Cataloguing data
- Collaborating with cross-functional teams to understand business requirements and translate them into technical solutions
- Providing best-in-class documentation for downstream teams to develop, test and run data products built using our tools
- Testing our tooling, and providing a framework for downstream teams to test their utilisation of our products
- Helping to deliver CI, CD and IaC for both our own tooling, and as templates for downstream teams
- Using DBT projects to define re-usable pipelines
Required Skills and Qualifications:
- Bachelor's degree in Computer Science, Engineering, or related field
- 5+ years of experience in data engineering
- 2+ years of experience in leading a team of data engineers
- Experience in AWS cloud services
- Expertise with Python and SQL
- Experience using Git / GitHub for source control management
- Experience with Snowflake
- Strong understanding of lakehouse architectures and best practices
- Strong problem-solving skills and ability to think critically
- Excellent communication skills to convey complex technical concepts to both technical and non-technical stakeholders
- Strong use of version control and proven ability to govern a team in the best practice use of version control
- Strong understanding of Agile and proven ability to govern a team in the best practice use of Agile methodologies
Preferred Skills and Qualifications:
- An understanding of Lakehouses
- An understanding of Apache Iceberg tables
- An understanding of data cataloguing
- Knowledge of Apache Airflow for data orchestration
- An understanding of DBT
- SnowPro Core certification
Posted 2 weeks ago
7.0 - 12.0 years
16 - 20 Lacs
Pune
Work from Office
Req ID: 301930
We are currently seeking a Digital Solution Architect Lead Advisor to join our team in Pune, Maharashtra (IN-MH), India (IN).
Position Overview: We are seeking a highly skilled and experienced Data Solution Architect to join our dynamic team. The ideal candidate will have a strong background in designing and implementing data solutions using AWS infrastructure and a variety of core and supplementary technologies. This role requires a deep understanding of data architecture, cloud services, and the ability to drive innovative solutions to meet business needs.
Key Responsibilities:
- Architect end-to-end data solutions using AWS services, including Lambda, SNS, S3, and EKS
- Design and implement data streaming pipelines using Kafka/Confluent Kafka (a minimal producer sketch follows this listing)
- Develop data processing applications using Python
- Ensure data security and compliance throughout the architecture
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions
- Optimize data flows for performance, cost-efficiency, and scalability
- Implement data governance and quality control measures
- Provide technical leadership and mentorship to development teams
- Stay current with emerging technologies and industry trends
Required Skills and Qualifications:
- Bachelor's degree in Computer Science, Engineering, or related field
- 7+ years of experience in data architecture and engineering
- Strong expertise in AWS cloud services, particularly Lambda, SNS, S3, and EKS
- Proficiency in Kafka/Confluent Kafka and Python
- Experience with Snyk for security scanning and vulnerability management
- Solid understanding of data streaming architectures and best practices
- Strong problem-solving skills and ability to think critically
- Excellent communication skills to convey complex technical concepts to both technical and non-technical stakeholders
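For illustration only: a minimal Confluent Kafka producer in Python (confluent-kafka client) of the kind a streaming pipeline described above might start from. The broker, topic, and payload fields are hypothetical.

```python
import json

from confluent_kafka import Producer  # assumes the confluent-kafka client is installed

producer = Producer({"bootstrap.servers": "broker:9092"})  # placeholder broker


def on_delivery(err, msg):
    # Called once per message after the broker acknowledges (or rejects) it.
    if err is not None:
        print(f"delivery failed: {err}")
    else:
        print(f"delivered to {msg.topic()} [{msg.partition()}] @ {msg.offset()}")


event = {"order_id": "42", "status": "CREATED"}  # hypothetical event payload
producer.produce(
    topic="orders",                      # placeholder topic
    key=event["order_id"],
    value=json.dumps(event),
    callback=on_delivery,
)
producer.flush()                          # block until outstanding messages are delivered
```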
Posted 2 weeks ago
7.0 - 12.0 years
13 - 18 Lacs
Bengaluru
Work from Office
We are currently seeking a Lead Data Architect to join our team in Bangalore, Karnataka (IN-KA), India (IN).
Position Overview: We are seeking a highly skilled and experienced Data Architect to join our dynamic team. The ideal candidate will have a strong background in designing and implementing data solutions using AWS infrastructure and a variety of core and supplementary technologies. This role requires a deep understanding of data architecture, cloud services, and the ability to drive innovative solutions to meet business needs.
Key Responsibilities:
- Architect end-to-end data solutions using AWS services (including Lambda, SNS, S3, and EKS), Kafka and Confluent, all within a larger and overarching programme ecosystem
- Architect data processing applications using Python, Kafka, Confluent Cloud and AWS
- Ensure data security and compliance throughout the architecture
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions
- Optimize data flows for performance, cost-efficiency, and scalability
- Implement data governance and quality control measures
- Ensure delivery of CI, CD and IaC for NTT tooling, and as templates for downstream teams
- Provide technical leadership and mentorship to development teams and lead engineers
- Stay current with emerging technologies and industry trends
Required Skills and Qualifications:
- Bachelor's degree in Computer Science, Engineering, or related field
- 7+ years of experience in data architecture and engineering
- Strong expertise in AWS cloud services, particularly Lambda, SNS, S3, and EKS
- Strong experience with Confluent
- Strong experience in Kafka
- Solid understanding of data streaming architectures and best practices
- Strong problem-solving skills and ability to think critically
- Excellent communication skills to convey complex technical concepts to both technical and non-technical stakeholders
- Knowledge of Apache Airflow for data orchestration
Preferred Qualifications:
- An understanding of cloud networking patterns and practices
- Experience with working on a library or other long-term product
- Knowledge of the Flink ecosystem
- Experience with Terraform
- Deep experience with CI/CD pipelines
- Strong understanding of the JVM language family
- Understanding of GDPR and the correct handling of PII
- Expertise with technical interface design
- Use of Docker
Responsibilities:
- Design and implement scalable data architectures using AWS services, Confluent and Kafka
- Develop data ingestion, processing, and storage solutions using Python, AWS Lambda, Confluent and Kafka
- Ensure data security and implement best practices using tools like Snyk
- Optimize data pipelines for performance and cost-efficiency
- Collaborate with data scientists and analysts to enable efficient data access and analysis
- Implement data governance policies and procedures
- Provide technical guidance and mentorship to junior team members
- Evaluate and recommend new technologies to improve data architecture
Posted 2 weeks ago
5.0 - 10.0 years
6 - 11 Lacs
Bengaluru
Work from Office
Req ID: 306669
We are currently seeking a Lead Data Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).
Position Overview: We are seeking a highly skilled and experienced Lead Data/Product Engineer to join our dynamic team. The ideal candidate will have a strong background in streaming services and AWS cloud technology, leading teams and directing engineering workloads. This is an opportunity to work on the core systems supporting multiple secondary teams, so a history in software engineering and interface design would be an advantage.
Key Responsibilities: Lead and direct a small team of engineers engaged in:
- Engineering reusable assets for the later build of data products
- Building foundational integrations with Kafka, Confluent Cloud and AWS
- Integrating with a large number of upstream and downstream technologies
- Providing best-in-class documentation for downstream teams to develop, test and run data products built using our tools
- Testing our tooling, and providing a framework for downstream teams to test their utilisation of our products
- Helping to deliver CI, CD and IaC for both our own tooling, and as templates for downstream teams
Required Skills and Qualifications:
- Bachelor's degree in Computer Science, Engineering, or related field
- 5+ years of experience in data engineering
- 3+ years of experience with real-time (or near real-time) streaming systems
- 2+ years of experience leading a team of data engineers
- A willingness to independently learn a high number of new technologies and to lead a team in learning new technologies
- Experience in AWS cloud services, particularly Lambda, SNS, S3, EKS and API Gateway
- Strong experience with Python
- Strong experience in Kafka
- Excellent understanding of data streaming architectures and best practices
- Strong problem-solving skills and ability to think critically
- Excellent communication skills to convey complex technical concepts both directly and through documentation
- Strong use of version control and proven ability to govern a team in the best practice use of version control
- Strong understanding of Agile and proven ability to govern a team in the best practice use of Agile methodologies
Preferred Skills and Qualifications:
- An understanding of cloud networking patterns and practices
- Experience with working on a library or other long-term product
- Knowledge of the Flink ecosystem
- Experience with Terraform
- Experience with CI pipelines
- Ability to code in a JVM language
- Understanding of GDPR and the correct handling of PII
- Knowledge of technical interface design
- Basic use of Docker
Posted 2 weeks ago
8.0 - 12.0 years
10 - 14 Lacs
Gurugram
Work from Office
About the Role: AWS Cloud Engineer
Required Skills and Qualifications:
- 4-7 years of hands-on experience with AWS services, including EC2, S3, Lambda, ECS, EKS, RDS/DynamoDB, and API Gateway.
- Strong working knowledge of Python and JavaScript.
- Strong experience with Terraform for infrastructure as code.
- Expertise in defining and managing IAM roles, policies, and configurations (a minimal IAM sketch follows this listing).
- Experience with networking, security, and monitoring within AWS environments.
- Experience with containerization technologies such as Docker and orchestration tools like Kubernetes (EKS).
- Strong analytical, troubleshooting, and problem-solving skills.
- Experience with AI/ML technologies and services like Textract is preferred.
- AWS Certifications (AWS Developer, Machine Learning - Specialty) are a plus.
Deliver:
No. | Performance Parameter | Measure
1 | Process | No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback
2 | Self-Management | Productivity, efficiency, absenteeism, training hours, number of technical trainings completed
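For illustration only: a hedged boto3 sketch of creating an IAM role for Lambda and attaching a managed policy, the kind of IAM work listed above (in practice this team would likely express it in Terraform). The role name is a hypothetical placeholder.

```python
import json

import boto3

iam = boto3.client("iam")

# Trust policy allowing the Lambda service to assume this role.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "lambda.amazonaws.com"},
        "Action": "sts:AssumeRole",
    }],
}

role = iam.create_role(
    RoleName="example-lambda-role",                      # hypothetical role name
    AssumeRolePolicyDocument=json.dumps(trust_policy),
    Description="Execution role for an example Lambda function",
)

# Attach the AWS-managed basic execution policy (CloudWatch Logs permissions).
iam.attach_role_policy(
    RoleName="example-lambda-role",
    PolicyArn="arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole",
)
print(role["Role"]["Arn"])
```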
Posted 2 weeks ago
3.0 - 7.0 years
5 - 9 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Locations: Pune/Bangalore/Hyderabad/Ahmedabad/Indore
Job Responsibilities:
1. Design and deploy scalable, highly available, secure, and fault-tolerant systems on AWS for the development and test lifecycle of our cloud security product.
2. Focus on building Dockerized application components by integrating with AWS EKS.
3. Modify existing applications in AWS to improve performance.
4. Passion for solving challenging issues.
5. Promote cooperation and commitment within a team to achieve common goals.
6. Examine data to grasp issues, draw conclusions, and solve problems.
Must Have Skills:
1. Demonstrated competency with the following AWS services: EKS, AppStream, Cognito, CloudWatch, Fargate Cluster, EC2, EBS, S3, Glacier, RDS, VPC, Route53, ELB, IAM, CloudFront, CloudFormation, SQS, SES, Lambda, API Gateway.
2. Knowledge of containerization hosting technologies like Docker and Kubernetes is highly desirable.
3. Experienced with ECS and EKS managed node clusters, and Fargate.
4. Proficient knowledge of scripting (Linux, Unix shell scripts, Python, Ruby, etc.).
5. Hands-on experience with configuration management and deployment tools (CloudFormation, Terraform, etc.).
6. Mastery of CI/CD tools (Jenkins, etc.).
7. Building CI/CD pipelines and competency in Git; good working exposure with Jenkins and GitLab (GitHub, GitLab, Bitbucket).
8. Experience with DevOps services of cloud vendors (AWS/Azure/GCP, etc.) is necessary.
9. Must be from a development background.
10. Exposure to application and infrastructure monitoring tools (Grafana, Prometheus, Nagios, etc.) is an outstanding skill to have!
11. Excellent soft skills for IT professionals.
12. Familiar with AWS IAM policies and basic guidelines.
13. Sufficient understanding of AWS network components.
Good to Have:
1. Experience in integrating SCM, code quality, code coverage, and testing tools for CI/CD pipelines.
2. Developer background; worked with several code analysis tools and integrations like SonarQube, Fortify, etc.
3. Good understanding of static code analysis.
Posted 2 weeks ago
6.0 - 10.0 years
13 - 17 Lacs
Mumbai, Pune
Work from Office
- Design containerized and cloud-native microservices architecture
- Plan and deploy modern application platforms and cloud-native platforms
- Good understanding of the AGILE process and methodology
- Plan and implement solutions and best practices for process automation, security, alerting and monitoring, and availability
- Should have a good understanding of Infrastructure-as-Code deployments
- Plan and design CI/CD pipelines across multiple environments
- Support and work alongside a cross-functional engineering team on the latest technologies
- Iterate on best practices to increase the quality and velocity of deployments
- Sustain and improve the process of knowledge sharing throughout the engineering team
- Keep updated on modern technologies and trends, and advocate the benefits
- Should possess good team management skills
- Ability to drive goals/milestones, while valuing and maintaining a strong attention to detail
- Excellent judgement, analytical and problem-solving skills
- Excellent communication skills
- Experience maintaining and deploying highly available, fault-tolerant systems at scale
- Practical experience with containerization and clustering (Kubernetes/OpenShift/Rancher/Tanzu/GKE/AKS/EKS, etc.)
- Version control system experience (e.g. Git, SVN)
- Experience implementing CI/CD (e.g. Jenkins, TravisCI)
- Experience with configuration management tools (e.g. Ansible, Chef)
- Experience with infrastructure-as-code (e.g. Terraform, CloudFormation)
- Expertise with AWS (e.g. IAM, EC2, VPC, ELB, ALB, Autoscaling, Lambda)
- Container registry solutions (Harbor, JFrog, Quay, etc.)
- Operational experience (e.g. HA/backups)
- NoSQL experience (e.g. Cassandra, MongoDB, Redis)
- Good understanding of Kubernetes networking and security best practices
- Monitoring tools like Datadog, or open-source tools like Prometheus, Nagios
- Load balancer knowledge (AVI Networks, NGINX)
Location: Pune / Mumbai [Work from Office]
Posted 2 weeks ago
3.0 - 6.0 years
2 - 6 Lacs
Chennai
Work from Office
Skills: AWS Lambda, Glue, Kafka/Kinesis, RDBMS (Oracle, MySQL, Redshift, PostgreSQL, Snowflake), Gateway, CloudFormation / Terraform, Step Functions, CloudWatch, Python, PySpark
Job role & responsibilities: Looking for a Software Engineer/Senior Software Engineer with hands-on experience in ETL projects and extensive knowledge of building data processing systems with Python, PySpark and cloud technologies (AWS). Experience in development in AWS Cloud (S3, Redshift, Aurora, Glue, Lambda, Hive, Kinesis, Spark, Hadoop/EMR). A minimal Kinesis-triggered Lambda sketch follows this listing.
Required Skills: Amazon Kinesis, Amazon Aurora, Data Warehouse, SQL, AWS Lambda, Spark, AWS QuickSight, advanced Python skills, data engineering, ETL and ELT skills, experience with cloud platforms (AWS, GCP or Azure)
Mandatory skills: Data warehouse, ETL, SQL, Python, AWS Lambda, Glue, AWS Redshift.
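For illustration only: a minimal Python Lambda handler for Kinesis-triggered processing, one of the patterns this listing covers. The payload fields are hypothetical.

```python
import base64
import json


def lambda_handler(event, context):
    """Decode and process records delivered by a Kinesis event source mapping."""
    for record in event["Records"]:
        # Kinesis record data arrives base64-encoded.
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
        # Hypothetical field; the real payload shape depends on the producer.
        print(f"event_id={record['eventID']} order={payload.get('order_id')}")
    return {"records": len(event["Records"])}
```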
Posted 2 weeks ago
4.0 - 9.0 years
3 - 7 Lacs
Coimbatore
Work from Office
Skills: AWS - Compute, Networking, Security, EC2, S3, IAM, VPC, Lambda, RDS, ECS, EKS, CloudWatch, Load Balancers, Auto Scaling, CloudFront, Route53, Security Groups, DynamoDB, CloudTrail, REST APIs, FastAPI, Node.js (Mandatory); Azure (overview) (Optional); GCP (overview) (Optional)
Programming/IaC Skills: Python (Mandatory), Chef (Mandatory), Ansible (Mandatory), Terraform (Mandatory), Go (Optional), Java (Optional)
The candidate should have more than 4 years of cloud development experience and should currently be working in cloud development. (A minimal FastAPI-on-AWS sketch follows this listing.)
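For illustration only: a minimal FastAPI service exposing a REST endpoint backed by boto3, combining two of the mandatory skills above. The endpoint path and file name are hypothetical.

```python
import boto3
from fastapi import FastAPI

app = FastAPI()
s3 = boto3.client("s3")  # uses the ambient AWS credentials/region


@app.get("/buckets")
def list_buckets() -> list[str]:
    """Return the names of the S3 buckets visible to the caller's credentials."""
    response = s3.list_buckets()
    return [bucket["Name"] for bucket in response.get("Buckets", [])]

# Run locally with: uvicorn app:app --reload   (assumes this file is named app.py)
```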
Posted 2 weeks ago
6.0 - 10.0 years
15 - 25 Lacs
Bengaluru
Work from Office
Who We Are
At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward – always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities.
The Role
Are you ready to dive headfirst into the captivating world of data engineering at Kyndryl? As a Data Engineer, you'll be the visionary behind our data platforms, crafting them into powerful tools for decision-makers. Your role? Ensuring a treasure trove of pristine, harmonized data is at everyone's fingertips. This is an AWS Data/API Gateway Pipeline Engineer role, responsible for designing, building, and maintaining real-time, serverless data pipelines and API services. It requires extensive hands-on experience with Java, Python, Redis, DynamoDB Streams, and PostgreSQL, along with working knowledge of AWS Lambda and AWS Glue for data processing and orchestration. The position involves collaboration with architects, backend developers, and DevOps engineers to deliver scalable, event-driven data solutions and secure API services across cloud-native systems.
Key Responsibilities
API & Backend Engineering:
- Build and deploy RESTful APIs using AWS API Gateway, Lambda, and Java and Python.
- Integrate backend APIs with Redis for low-latency caching and pub/sub messaging.
- Use PostgreSQL for structured data storage and transactional processing.
- Secure APIs using IAM, OAuth2, and JWT, and implement throttling and versioning strategies.
Data Pipeline & Streaming:
- Design and develop event-driven data pipelines using DynamoDB Streams to trigger downstream processing (a minimal sketch follows this listing).
- Use AWS Glue to orchestrate ETL jobs for batch and semi-structured data workflows.
- Build and maintain Lambda functions to process real-time events and orchestrate data flows.
- Ensure data consistency and resilience across services, queues, and databases.
Cloud Infrastructure & DevOps:
- Deploy and manage cloud infrastructure using CloudFormation, Terraform, or AWS CDK.
- Monitor system health and service metrics using CloudWatch, SNS and structured logging.
- Contribute to CI/CD pipeline development for testing and deploying Lambda/API services.
So, if you're a technical enthusiast with a passion for data, we invite you to join us in the exhilarating world of data engineering at Kyndryl. Let's transform data into a compelling story of innovation and growth.
Your Future at Kyndryl
Every position at Kyndryl offers a way forward to grow your career. We have opportunities that you won't find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms. Whether you want to broaden your knowledge base or narrow your scope and specialize in a specific sector, you can find your opportunity here.
Who You Are
You're good at what you do and possess the required experience to prove it. However, equally as important – you have a growth mindset, keen to drive your own personal and professional development. You are customer-focused – someone who prioritizes customer success in their work. And finally, you're open and borderless – naturally inclusive in how you work with others.
Required Skills and Experience
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Over 6 years of experience in developing backend or data pipeline services using Java and Python.
- Strong hands-on experience with: AWS API Gateway, Lambda, DynamoDB Streams; Redis (caching, messaging); PostgreSQL (schema design, tuning, SQL); AWS Glue for ETL jobs and data transformation.
- Solid understanding of REST API design principles, serverless computing, and real-time architecture.
Preferred Skills and Experience
- Familiarity with Kafka, Kinesis, or other message streaming systems
- Swagger/OpenAPI for API documentation
- Docker and Kubernetes (EKS)
- Git, CI/CD tools (e.g., GitHub Actions)
- Experience with asynchronous event processing, retries, and dead-letter queues (DLQs)
- Exposure to data lake architectures (S3, Glue Data Catalog, Athena)
Being You
Diversity is a whole lot more than what we look like or where we come from; it's how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we're not doing it single-handedly: our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you – and everyone next to you – the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That's the Kyndryl Way.
What You Can Expect
With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees and support you and your family through the moments that matter – wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you; we want you to succeed so that together, we will all succeed.
Get Referred!
If you know someone that works at Kyndryl, when asked 'How Did You Hear About Us' during the application process, select 'Employee Referral' and enter your contact's Kyndryl email address.
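For illustration only: a hedged Python sketch of the DynamoDB Streams pattern described above — a Lambda handler that caches new or modified items in Redis. The attribute names, Redis endpoint, and TTL are hypothetical, and the redis-py client is assumed to be packaged with the function.

```python
import json
import os

import redis  # assumes the redis-py client is bundled in the deployment package/layer

cache = redis.Redis(
    host=os.environ.get("REDIS_HOST", "localhost"),  # hypothetical ElastiCache endpoint
    port=6379,
    decode_responses=True,
)


def lambda_handler(event, context):
    """Mirror DynamoDB inserts/updates into Redis for low-latency reads."""
    cached = 0
    for record in event.get("Records", []):
        if record["eventName"] not in ("INSERT", "MODIFY"):
            continue
        new_image = record["dynamodb"]["NewImage"]
        key = new_image["pk"]["S"]                                # hypothetical partition key attribute
        cache.set(f"item:{key}", json.dumps(new_image), ex=300)   # 5-minute TTL
        cached += 1
    return {"cached": cached}
```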
Posted 2 weeks ago
4.0 - 9.0 years
35 - 50 Lacs
Bengaluru
Work from Office
Skill: Amazon Connect Developer / Lead
Location: PAN India
Job Description:
1. Minimum 3-9 years of experience
2. Strong experience in contact center development
3. Experience in creating Amazon Connect contact flows, Lex chatbots, and Lambda functions
4. Java / Node.js architect with knowledge of the AWS environment; design and develop APIs (REST and SOAP services)
5. Knowledge of AWS Lambda services and familiarity with the AWS environment and ecosystem
6. Knowledge of Spring, Maven, Hibernate
7. Knowledge of database technologies such as MySQL, SQL Server, DB2, or RDS
8. Application development experience in any of Java, C#, Node.js, Python, PHP
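As a hedged illustration of how Amazon Connect flows typically hand off to Lambda, the sketch below shows a Python handler invoked from a contact flow that looks up a caller by phone number. The DynamoDB table and attribute names are hypothetical; Connect passes contact data under Details.ContactData and expects a flat map of string key/value pairs in return.

```python
# Minimal sketch (illustrative only): a Lambda function invoked from an
# Amazon Connect contact flow's "Invoke AWS Lambda function" block.
# The CustomerProfiles table and its attributes are hypothetical.
import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("CustomerProfiles")  # hypothetical table name

def handler(event, context):
    contact = event.get("Details", {}).get("ContactData", {})
    phone = contact.get("CustomerEndpoint", {}).get("Address", "")

    response = table.get_item(Key={"phone_number": phone})
    profile = response.get("Item") or {}

    # Amazon Connect expects a flat map of string key/value pairs, which the
    # contact flow can then reference as external attributes.
    return {
        "customerFound": "true" if profile else "false",
        "customerName": str(profile.get("name", "")),
        "preferredLanguage": str(profile.get("language", "en-US")),
    }
```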
Posted 2 weeks ago
3.0 - 8.0 years
9 - 13 Lacs
Hyderabad
Work from Office
Project Role: Software Development Lead
Project Role Description: Develop and configure software systems either end-to-end or for a specific stage of the product lifecycle. Apply knowledge of technologies, applications, methodologies, processes and tools to support a client, project or entity.
Must have skills: Business Analysis
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: Bachelor of Engineering in Electronics or any related stream
Summary: Working closely with stakeholders across departments, the Business Analyst gathers and documents requirements, conducts data analysis, and supports project implementation to ensure alignment with business objectives.
Roles & Responsibilities:
1. Collaborate with stakeholders to gather, document, and validate business and technical requirements related to AWS cloud-based systems.
2. Analyze current infrastructure, applications, and workflows to identify opportunities for migration, optimization, and cost-efficiency on AWS.
3. Assist in creating business cases for cloud adoption or enhancements, including ROI and TCO analysis.
4. Support cloud transformation initiatives by developing detailed functional specifications and user stories.
5. Liaise with cloud architects, DevOps engineers, and developers to ensure solutions are aligned with requirements and business goals.
6. Conduct gap analyses, risk assessments, and impact evaluations for proposed AWS solutions.
7. Prepare reports, dashboards, and presentations to communicate findings and recommendations to stakeholders.
8. Ensure compliance with AWS best practices and relevant security, governance, and regulatory requirements.
Professional & Technical Skills:
1. Proven experience (3+ years) as a Business Analyst, preferably in cloud computing environments.
2. Solid understanding of AWS services (EC2, S3, RDS, Lambda, IAM, etc.) and cloud architecture.
3. Familiarity with Agile and DevOps methodologies.
4. Strong analytical, problem-solving, and documentation skills.
5. Excellent communication and stakeholder management abilities.
6. AWS certification (e.g., AWS Certified Cloud Practitioner or Solutions Architect Associate) is a plus.
7. Well-developed analytical skills; rigorous but pragmatic, and able to justify decisions with solid rationale.
Additional Information:
- The candidate should have a minimum of 3 years of experience as a Business Analyst.
- This position is based at our Hyderabad office.
- A Bachelor of Engineering in Electronics or any related stream is required.
Qualification: Bachelor of Engineering in Electronics or any related stream
Posted 2 weeks ago
12.0 - 15.0 years
5 - 9 Lacs
Navi Mumbai
Work from Office
Project Role: Application Designer
Project Role Description: Assist in defining requirements and designing applications to meet business process and application requirements.
Must have skills: Drupal
Good to have skills: NA
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As an Application Designer, you will assist in defining requirements and designing applications to meet business process and application requirements. A typical day involves collaborating with various stakeholders to gather insights, analyzing user needs, and translating them into functional specifications. You will engage in discussions with team members to brainstorm innovative solutions and ensure that the applications align with business objectives. Your role will also include reviewing design documents and providing feedback to enhance application performance and user experience, all while maintaining a focus on quality and efficiency.
You must have knowledge of Adobe Analytics; PHP, Laravel, Drupal; HTML, CSS; JavaScript, Stencil.js, Vue.js; React; Python; Auth0, Terraform; Azure, Azure ChatGPT; GenAI basics; AWS SAM (Lambda), AWS EC2, AWS S3, AWS RDS, AWS DynamoDB, AWS SNS, AWS SQS, AWS SES; Cloudflare, Cloudflare Workers; REST APIs; GitHub; web servers; and SQL.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure alignment with business goals.
Professional & Technical Skills:
- Must-have skills: Proficiency in Drupal.
- Strong understanding of web development principles and best practices.
- Experience with content management systems and their implementation.
- Familiarity with front-end technologies such as HTML, CSS, and JavaScript.
- Ability to troubleshoot and resolve application issues efficiently.
Additional Information:
- The candidate should have a minimum of 12 years of experience in Drupal.
- This position is based at our Mumbai office.
- A 15 years full time education is required.
Qualification: 15 years full time education
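Because the stack above includes AWS SAM (Lambda) and REST APIs, a minimal, hedged sketch of an API Gateway proxy-integration handler, the kind of function a SAM template typically wires to a REST endpoint, is shown below. The route and payload fields are illustrative assumptions, not part of the posting.

```python
# Minimal sketch (illustrative only): a Lambda handler behind an API Gateway
# REST endpoint using Lambda proxy integration, as commonly deployed via an
# AWS SAM template. The hypothetical route simply echoes a query parameter.
import json
from datetime import datetime, timezone

def handler(event, context):
    # With proxy integration, API Gateway passes the HTTP method, path, and
    # query string parameters directly in the event.
    method = event.get("httpMethod", "GET")
    params = event.get("queryStringParameters") or {}

    if method != "GET":
        return {
            "statusCode": 405,
            "headers": {"Content-Type": "application/json"},
            "body": json.dumps({"error": "method not allowed"}),
        }

    payload = {
        "service": "status",  # hypothetical service name
        "echo": params.get("q", ""),
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    # Proxy integration requires statusCode, headers, and a string body.
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(payload),
    }
```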
Posted 2 weeks ago