Home
Jobs

834 AWS Cloud Jobs - Page 33

Filter Interviews
Min: 0 years
Max: 25 years
Min: ₹0
Max: ₹10000000
Set up a Job Alert
Filter
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

2 - 6 years

5 - 15 Lacs

Noida, Gurugram, Delhi / NCR

Work from Office

Naukri logo

Note: Preferring candidates from product firms only.

Hi candidates, we have an opportunity for a Full Stack Developer at a leading product-based company. Interested candidates can mail their CVs to Abhishek.saxena@mounttalent.com.

Job Responsibilities - what we expect from you:
- Work with the product team on requirements analysis and make technical trade-off decisions at the application level (e.g., component design).
- Define and develop solutions to technical problems within an Agile framework.
- Use your expertise to input into reengineering and design.
- Act as a subject matter expert for focus areas across the technology space.
- Follow and improve coding standards with the team; ensure high code quality.
- Actively look for the latest tools and technologies, and learn and help implement new industry standards related to products, people, and processes.
- Identify inefficiencies in the process and help the team improve them to maximize the team's performance.
- Proactively identify dependencies, risks, and bottlenecks in projects, and work with the Engineering Manager/Team Lead to resolve them.

Required Skills:
- Strong understanding of object-oriented programming concepts, data structures, and algorithms.
- Experience working with either React or Angular (React preferred).
- Strong proficiency in C# and .NET Core, with a deep understanding of developing RESTful APIs.
- Experience with unit testing and ensuring code quality.
- Familiarity with any relational database (Postgres preferred).
- Knowledge of and experience with Docker.

Preferred Skills:
- Experience with Kubernetes (K8s) and cloud platforms such as GCP or Azure.
- Knowledge of microservices and event-driven architecture, particularly Pub/Sub.

Posted 1 month ago

Apply

1 - 4 years

11 - 15 Lacs

Hyderabad

Work from Office

Job Area: Engineering Group > Software Applications Engineering

General Summary: The Tools team at Qualcomm is looking for an experienced software engineer to help us design and build quality software.

Job Overview: Develops, creates, and modifies general computer applications software or specialized utility programs. Analyzes user needs and develops software solutions. Designs or customizes software for client use with the aim of optimizing operational efficiency. Modifies existing software to correct errors, adapt it to new hardware, or improve its performance. Stores, retrieves, and manipulates data for analysis of system capabilities and requirements.

Minimum Qualifications:
- Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field.
- 3+ years of software engineering or related work experience.
- 3+ years of experience in the C# programming language (mandatory).
- 3+ years of experience in web application development using Angular, ASP.NET, and Web APIs (mandatory).
- 3+ years of experience in application development in .NET Core and .NET Framework.
- Knowledge of Python and AWS/Azure cloud.
- Knowledge of SQL Server databases, operating systems, and algorithm design.

Preferred Qualifications:
- Experience developing one or more products through all phases of the software development lifecycle.
- Exposure to web development using C#.NET, Python, ASP.NET, Angular, Web APIs, JavaScript, HTML5, CSS, and REST web services.
- Experience working on AWS cloud platforms.
- Exposure to Business Intelligence and Machine Learning is an added advantage.
- Exposure to CMS applications, reStructuredText, Markdown, and DITA is an added advantage.
- Experience working in an Agile/Scrum development process.

Education Requirements: Required: Bachelor's in Computer Engineering and/or Computer Science. Preferred: Master's in Computer Engineering and/or Computer Science.

Posted 1 month ago

Apply

5 - 10 years

30 - 32 Lacs

Bengaluru

Work from Office

About The Role: Data Engineering Manager (5-9 years) / Software Development Manager (9+ years), Kotak Mahindra Bank, Bengaluru, Karnataka, India (on-site).

What we offer: Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team: DEX (Kotak's Data Exchange) is the central data org for Kotak Bank, managing the bank's entire data experience. It comprises the Data Platform, Data Engineering, and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which gives technology fellows great opportunities to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills this team should encompass are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to grow to a 100+ member team, primarily based out of Bangalore and comprising ~10 sub-teams independently driving their charters.

As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member of Kotak's digital transformation journey; learn and leverage technology to build complex data platform solutions (real-time, micro-batch, batch, and analytics) in a programmatic way; and build forward-looking systems that can be operated by machines using AI technologies.

The data platform org is divided into three key verticals:
- Data Platform: responsible for building the data platform, including optimized storage for the entire bank; a centralized data lake; a managed compute and orchestration framework, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions such as EMR optimizers; automations; and observability capabilities for Kotak's data platform. The team will also be the center of Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.
- Data Engineering: this team will own data pipelines for thousands of datasets, source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will build data models in a config-based, programmatic way and think big to build one of the most leveraged data models for financial orgs. It will also enable centralized reporting for Kotak Bank that cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers, and all analytics use cases.
- Data Governance: the central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship, and the data quality platform.

If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple systems, then this is the team for you. Your day-to-day role will include:
- Drive business decisions with technical input and lead the team.
- Design, implement, and support a data infrastructure from scratch.
- Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
- Extract, transform, and load data from various sources using SQL and AWS big data technologies.
- Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
- Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis.
- Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Build data platforms, data pipelines, or data management and governance tools.

BASIC QUALIFICATIONS for Data Engineering Manager / Software Development Manager:
- 10+ years of engineering experience, most of it in the data domain
- 5+ years of engineering team management experience
- 10+ years of planning, designing, developing, and delivering consumer software
- Experience partnering with product or program management teams
- 5+ years of experience managing data engineers, business intelligence engineers, and/or data scientists
- Experience designing or architecting (design patterns, reliability, and scaling) new and existing systems
- Experience managing multiple concurrent programs, projects, and development teams in an Agile environment
- Strong understanding of Data Platform, Data Engineering, and Data Governance
- Experience designing and developing large-scale, high-traffic applications

PREFERRED QUALIFICATIONS:
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
- Prior experience in the Indian banking segment and/or fintech is desired
- Experience with non-relational databases and data stores
- Building and operating highly available, distributed data processing systems for large datasets
- Professional software engineering best practices for the full software development life cycle
- Designing, developing, and implementing different types of data warehousing layers
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
- Building scalable data infrastructure and understanding distributed systems concepts
- SQL, ETL, and data modelling
- Ensuring the accuracy and availability of data to customers
- Proficiency in at least one scripting or programming language for handling large-volume data processing
- Strong presentation and communication skills

For managers: customer centricity and obsession for the customer; the ability to manage stakeholders (product owners, business stakeholders, cross-functional teams) and coach agile ways of working; the ability to structure and organize teams and streamline communication; prior experience executing large-scale data engineering projects.
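The day-to-day duties above centre on extract-transform-load work. As an illustrative sketch only (the posting names Spark, Glue, and Redshift; this minimal example uses Python's built-in sqlite3 as a stand-in for both source and target, and all table and column names are hypothetical):

```python
import sqlite3

def extract(conn):
    """Pull raw rows from the (stand-in) source system."""
    return conn.execute("SELECT id, amount, branch FROM raw_txns").fetchall()

def transform(rows):
    """Normalize: drop negative amounts, upper-case branch codes."""
    return [(i, amt, br.upper()) for (i, amt, br) in rows if amt >= 0]

def load(conn, rows):
    """Write cleaned rows to the target table."""
    conn.executemany("INSERT INTO clean_txns VALUES (?, ?, ?)", rows)

# Demo with an in-memory database standing in for real source/target stores.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_txns (id INTEGER, amount REAL, branch TEXT)")
conn.execute("CREATE TABLE clean_txns (id INTEGER, amount REAL, branch TEXT)")
conn.executemany("INSERT INTO raw_txns VALUES (?, ?, ?)",
                 [(1, 100.0, "blr"), (2, -5.0, "mum"), (3, 42.5, "del")])

load(conn, transform(extract(conn)))
result = conn.execute(
    "SELECT id, amount, branch FROM clean_txns ORDER BY id").fetchall()
print(result)  # the negative-amount row has been filtered out
```

In a production pipeline the same three stages would typically be Spark jobs orchestrated by Airflow/MWAA, reading from S3 and writing to Redshift, but the shape of the work is the same.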

Posted 1 month ago

Apply

5 - 10 years

30 - 35 Lacs

Bengaluru

Work from Office

About The Role

What we offer: Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team: DEX (Kotak's Data Exchange) is the central data org for Kotak Bank, managing the bank's entire data experience. It comprises the Data Platform, Data Engineering, and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which gives technology fellows great opportunities to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills this team should encompass are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to grow to a 100+ member team, primarily based out of Bangalore and comprising ~10 sub-teams independently driving their charters.

As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member of Kotak's digital transformation journey; learn and leverage technology to build complex data platform solutions (real-time, micro-batch, batch, and analytics) in a programmatic way; and build forward-looking systems that can be operated by machines using AI technologies.

The data platform org is divided into three key verticals:
- Data Platform: responsible for building the data platform, including optimized storage for the entire bank; a centralized data lake; a managed compute and orchestration framework, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions such as EMR optimizers; automations; and observability capabilities for Kotak's data platform. The team will also be the center of Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.
- Data Engineering: this team will own data pipelines for thousands of datasets, source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will build data models in a config-based, programmatic way and think big to build one of the most leveraged data models for financial orgs. It will also enable centralized reporting for Kotak Bank that cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers, and all analytics use cases.
- Data Governance: the central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship, and the data quality platform.

If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple systems, then this is the team for you. Your day-to-day role will include:
- Drive business decisions with technical input and lead the team.
- Design, implement, and support a data infrastructure from scratch.
- Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
- Extract, transform, and load data from various sources using SQL and AWS big data technologies.
- Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
- Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis.
- Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Build data platforms, data pipelines, or data management and governance tools.

BASIC QUALIFICATIONS for Data Engineer / SDE in Data:
- Bachelor's degree in Computer Science, Engineering, or a related field
- 3-5 years of experience in data engineering
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
- Experience with data pipeline tools such as Airflow and Spark
- Experience with data modeling and data quality best practices
- Excellent problem-solving and analytical skills
- Strong communication and teamwork skills
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
- Strong advanced SQL skills

BASIC QUALIFICATIONS for Data Engineering Manager / Software Development Manager:
- 10+ years of engineering experience, most of it in the data domain
- 5+ years of engineering team management experience
- 10+ years of planning, designing, developing, and delivering consumer software
- Experience partnering with product or program management teams
- 5+ years of experience managing data engineers, business intelligence engineers, and/or data scientists
- Experience designing or architecting (design patterns, reliability, and scaling) new and existing systems
- Experience managing multiple concurrent programs, projects, and development teams in an Agile environment
- Strong understanding of Data Platform, Data Engineering, and Data Governance
- Experience designing and developing large-scale, high-traffic applications

PREFERRED QUALIFICATIONS:
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
- Prior experience in the Indian banking segment and/or fintech is desired
- Experience with non-relational databases and data stores
- Building and operating highly available, distributed data processing systems for large datasets
- Professional software engineering best practices for the full software development life cycle
- Designing, developing, and implementing different types of data warehousing layers
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
- Building scalable data infrastructure and understanding distributed systems concepts
- SQL, ETL, and data modelling
- Ensuring the accuracy and availability of data to customers
- Proficiency in at least one scripting or programming language for handling large-volume data processing
- Strong presentation and communication skills

Posted 1 month ago

Apply

8 - 13 years

30 - 32 Lacs

Bengaluru

Work from Office

About The Role: Data Engineer - 2 (experience 2-5 years)

What we offer: Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team: DEX (Kotak's Data Exchange) is the central data org for Kotak Bank, managing the bank's entire data experience. It comprises the Data Platform, Data Engineering, and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which gives technology fellows great opportunities to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills this team should encompass are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to grow to a 100+ member team, primarily based out of Bangalore and comprising ~10 sub-teams independently driving their charters.

As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member of Kotak's digital transformation journey; learn and leverage technology to build complex data platform solutions (real-time, micro-batch, batch, and analytics) in a programmatic way; and build forward-looking systems that can be operated by machines using AI technologies.

The data platform org is divided into three key verticals:
- Data Platform: responsible for building the data platform, including optimized storage for the entire bank; a centralized data lake; a managed compute and orchestration framework, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions such as EMR optimizers; automations; and observability capabilities for Kotak's data platform. The team will also be the center of Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.
- Data Engineering: this team will own data pipelines for thousands of datasets, source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will build data models in a config-based, programmatic way and think big to build one of the most leveraged data models for financial orgs. It will also enable centralized reporting for Kotak Bank that cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers, and all analytics use cases.
- Data Governance: the central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship, and the data quality platform.

If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple systems, then this is the team for you. Your day-to-day role will include:
- Drive business decisions with technical input and lead the team.
- Design, implement, and support a data infrastructure from scratch.
- Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
- Extract, transform, and load data from various sources using SQL and AWS big data technologies.
- Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
- Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis.
- Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Build data platforms, data pipelines, or data management and governance tools.

BASIC QUALIFICATIONS for Data Engineer / SDE in Data:
- Bachelor's degree in Computer Science, Engineering, or a related field
- Experience in data engineering
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
- Experience with data pipeline tools such as Airflow and Spark
- Experience with data modeling and data quality best practices
- Excellent problem-solving and analytical skills
- Strong communication and teamwork skills
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
- Strong advanced SQL skills

PREFERRED QUALIFICATIONS:
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
- Prior experience in the Indian banking segment and/or fintech is desired
- Experience with non-relational databases and data stores
- Building and operating highly available, distributed data processing systems for large datasets
- Professional software engineering best practices for the full software development life cycle
- Designing, developing, and implementing different types of data warehousing layers
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
- Building scalable data infrastructure and understanding distributed systems concepts
- SQL, ETL, and data modelling
- Ensuring the accuracy and availability of data to customers
- Proficiency in at least one scripting or programming language for handling large-volume data processing
- Strong presentation and communication skills
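The Data Governance vertical named above owns a data-quality platform. As a hedged sketch of what rule-based data-quality checks look like (pure Python, with hypothetical record and field names; a real platform would run such rules at scale over warehouse tables):

```python
def check_not_null(rows, field):
    """Return rows where a required field is missing or empty."""
    return [r for r in rows if r.get(field) in (None, "")]

def check_unique(rows, field):
    """Return values of `field` that appear more than once."""
    seen, dupes = set(), set()
    for r in rows:
        v = r.get(field)
        if v in seen:
            dupes.add(v)
        seen.add(v)
    return sorted(dupes)

# Hypothetical customer records from an upstream source system.
records = [
    {"cust_id": "C1", "pan": "AAAPA1234A"},
    {"cust_id": "C2", "pan": None},
    {"cust_id": "C1", "pan": "BBBPB5678B"},
]

null_violations = check_not_null(records, "pan")
dupe_ids = check_unique(records, "cust_id")
print(len(null_violations), dupe_ids)
```

Each check returns the offending rows or values rather than a pass/fail flag, which makes it easy to surface violations to data stewards.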

Posted 1 month ago

Apply

2 - 6 years

3 - 7 Lacs

Hyderabad

Work from Office

At F5, we strive to bring a better digital world to life. Our teams empower organizations across the globe to create, secure, and run applications that enhance how we experience our evolving digital world. We are passionate about cybersecurity, from protecting consumers from fraud to enabling companies to focus on innovation. Everything we do centers around people. That means we obsess over how to make the lives of our customers, and their customers, better. And it means we prioritize a diverse F5 community where each individual can thrive.

Join a team using leading-edge security technology and processes to protect the F5 enterprise and product environment. The Security Engineer will execute strategic processes and implement technical solutions to enable our information security program and address day-to-day security challenges amid the industry's evolving technology landscape.

Primary Responsibilities:
- Build and implement new security controls, processes, and tools.
- Identify organizational risks to confidentiality, integrity, and availability, and determine appropriate mitigations.
- Leverage native Azure, GCP, and AWS cloud services to automate and improve existing security and control activities.
- Develop or implement open-source/third-party tools to assist in detection, prevention, and analysis of security threats.
- Perform technical security assessments against product and enterprise cloud-hosted, virtual, and on-premise systems, including static and dynamic analysis and threat modeling.
- Review and test changes to services, applications, and networks for potential security impacts.
- Collaborate with Architecture, Site Reliability Engineering, and Operations teams to develop and implement technical solutions and security standards.
- Stay abreast of security best practices and secure design principles.
- Review changes to, and ongoing operations of, enterprise environments and supporting systems for security and compliance impacts.
- Assist in incident detection and response efforts.
- Implement zero-trust patterns with cloud-agnostic tools to support enterprise business units.
- Implement, design, develop, administer, and manage enterprise security tooling.

Knowledge, Skills and Abilities:
- Experience working with high-availability enterprise production environments
- Familiarity with scripting languages (e.g., Go, Python, Ruby, Rust) and building scripts for process improvements
- Experience automating security testing and reporting outputs
- Technical knowledge and hands-on experience with security and networking: basic networking protocols, cloud security, network security design, intrusion prevention/detection, and firewall architecture
- Experience assessing and implementing technical security controls
- Willingness to innovate and learn new technologies
- Excellent interpersonal and relationship skills with a collaborative mindset
- Knowledge of or familiarity with the technology stack (BIG-IP, Azure, AWS, GCP, CentOS, HashiCorp Vault, Palo Alto, Qualys)
- Experience with network and application vulnerability and penetration testing tools
- Baseline competency in administration of Microsoft Azure, Amazon Web Services (AWS), Google Cloud Platform (GCP), or equivalent public cloud infrastructure
- Exposure to DevOps tooling, CI/CD pipelines, container orchestration, and infrastructure-as-code approaches (e.g., Puppet, Chef, Ansible, Terraform, Jenkins, CircleCI, Artifactory, Git)
- Strong written and verbal communication skills
- Strong self-directed work habits, exhibiting initiative, drive, creativity, maturity, self-assurance, and professionalism
- An agile, tactful, and proactive attitude; able to manage prioritization and know when to escalate

Qualifications:
- B.S. or M.S. in Computer Science, Engineering, or a related field, or equivalent experience
- 3+ years of relevant security and networking experience

This description is intended to be a general representation of the responsibilities and requirements of the job. It may not be all-inclusive, and responsibilities and requirements are subject to change. Please note that F5 only contacts candidates through an F5 email address (ending with @f5.com) or auto email notification from Workday (ending with f5.com or @myworkday.com).

Equal Employment Opportunity: It is the policy of F5 to provide equal employment opportunities to all employees and employment applicants without regard to unlawful considerations of race, religion, color, national origin, sex, sexual orientation, gender identity or expression, age, sensory, physical, or mental disability, marital status, veteran or military status, genetic information, or any other classification protected by applicable local, state, or federal laws. This policy applies to all aspects of employment, including, but not limited to, hiring, job assignment, compensation, promotion, benefits, training, discipline, and termination. F5 offers a variety of reasonable accommodations for candidates. Requesting an accommodation is completely voluntary. F5 will assess the need for accommodations in the application process separately from those that may be needed to perform the job. Request by contacting accommodations@f5.com.
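The responsibilities above include automating security and control activities across cloud environments. As a minimal, dependency-free sketch of that kind of automation (the real work would use provider SDKs such as boto3; the rule format here is a hypothetical simplification of a cloud security-group export):

```python
import ipaddress

SENSITIVE_PORTS = {22, 3389, 5432}  # SSH, RDP, Postgres

def is_world_open(cidr):
    """True if the CIDR covers the entire IPv4 address space."""
    return ipaddress.ip_network(cidr) == ipaddress.ip_network("0.0.0.0/0")

def audit(rules):
    """Return ingress rules that expose a sensitive port to the internet."""
    return [r for r in rules
            if r["port"] in SENSITIVE_PORTS and is_world_open(r["cidr"])]

# Hypothetical ingress rules, shaped loosely like a security-group export.
rules = [
    {"port": 443, "cidr": "0.0.0.0/0"},   # public HTTPS: acceptable
    {"port": 22,  "cidr": "10.0.0.0/8"},  # SSH from internal range: acceptable
    {"port": 22,  "cidr": "0.0.0.0/0"},   # SSH open to the world: flag it
]
findings = audit(rules)
print(findings)
```

Checks like this are typically scheduled (e.g., as a Lambda function or CI job) so that drift from the security baseline is reported continuously rather than found in periodic manual reviews.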

Posted 1 month ago

Apply

12 - 15 years

35 - 45 Lacs

Pune, Bengaluru, Mumbai (All Areas)

Hybrid

Strong frontend development experience with ReactJS and JavaScript or TypeScript. Proficiency in HTML5, CSS3, and responsive design best practices. Hands-on experience with AWS cloud services, specifically designing systems with SNS, SQS, EC2, Lambda, and S3.

Required candidate profile: Expert-level experience in backend development using .NET Core, C#, and EF Core. Strong expertise in PostgreSQL and efficient database design. Proficient in building and maintaining RESTful APIs at scale.

Posted 1 month ago

Apply

2 - 5 years

6 - 10 Lacs

Bengaluru

Work from Office

12+ years of overall IT experience; 5+ years of cloud implementation experience (AWS - S3, Terraform, Docker, Kubernetes). Expert in troubleshooting cloud implementation projects and in cloud-native technologies. Good working knowledge of Terraform and Quarkus.

Must-have skills: AWS cloud knowledge (AWS S3, load balancers, VPC/VPC peering/private-public subnets, EKS, SQS, Lambda, Docker/container services, Terraform or other IaC technologies for standard deployment), Quarkus, PostgreSQL, Flyway, Kubernetes, OpenID flow, OpenSearch/Elasticsearch, OpenAPI/Swagger, Java. Optional: Kafka, Python.

Posted 1 month ago

Apply

2 - 6 years

8 - 12 Lacs

Bengaluru

Work from Office

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Digital Engineering Sr. Staff Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Title: Lead Data Architect (Streaming)

Required Skills and Qualifications
- Overall 10+ years of IT experience, of which 7+ years in data architecture and engineering
- Strong expertise in AWS cloud services, particularly Lambda, SNS, S3, and EKS
- Strong experience with Confluent and Kafka
- Solid understanding of data streaming architectures and best practices
- Strong problem-solving skills and ability to think critically
- Excellent communication skills to convey complex technical concepts to both technical and non-technical stakeholders
- Knowledge of Apache Airflow for data orchestration
- Bachelor's degree in Computer Science, Engineering, or related field

Preferred Qualifications
- An understanding of cloud networking patterns and practices
- Experience working on a library or other long-term product
- Knowledge of the Flink ecosystem
- Experience with Terraform
- Deep experience with CI/CD pipelines
- Strong understanding of the JVM language family
- Understanding of GDPR and the correct handling of PII
- Expertise in technical interface design
- Use of Docker

Key Responsibilities
- Architect end-to-end data solutions using AWS services (Lambda, SNS, S3, EKS), Kafka and Confluent, all within a larger, overarching programme ecosystem
- Architect data processing applications using Python, Kafka, Confluent Cloud and AWS
- Develop data ingestion, processing, and storage solutions using Python, AWS Lambda, Confluent and Kafka
- Ensure data security and compliance throughout the architecture
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions
- Optimize data flows for performance, cost-efficiency, and scalability
- Implement data governance and quality control measures
- Ensure delivery of CI, CD and IaC for NTT tooling, and as templates for downstream teams
- Provide technical leadership and mentorship to development teams and lead engineers
- Stay current with emerging technologies and industry trends
- Collaborate with data scientists and analysts to enable efficient data access and analysis
- Evaluate and recommend new technologies to improve data architecture

Position Overview: We are seeking a highly skilled and experienced Data Architect to join our dynamic team. The ideal candidate will have a strong background in designing and implementing data solutions using AWS infrastructure and a variety of core and supplementary technologies. This role requires a deep understanding of data architecture, cloud services, and the ability to drive innovative solutions to meet business needs.

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com. NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users.
If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here. Job Segment Developer, Computer Science, Consulting, Technology
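The streaming responsibilities above (SNS-triggered Lambdas feeding Kafka/Confluent topics) can be sketched in Python. This is a minimal illustration, not NTT DATA's actual design: the event shape follows the standard SNS-to-Lambda payload, while `produce` stands in for a Confluent producer client, and the topic and key names are hypothetical.

```python
import json

def handler(event, produce, context=None):
    """Lambda-style entry point: unwrap SNS-delivered records and forward
    each to a Kafka topic via the injected `produce(topic, key, value)`
    callable (a confluent-kafka Producer in a real deployment)."""
    forwarded = 0
    for record in event.get("Records", []):
        # SNS wraps the original payload as a JSON string under Sns.Message.
        message = json.loads(record["Sns"]["Message"])
        # Key by a business identifier so related events share a partition.
        key = str(message.get("order_id", "unknown"))
        produce("orders.raw", key, json.dumps(message))
        forwarded += 1
    return {"forwarded": forwarded}
```

Injecting the producer keeps the handler unit-testable without a broker; the production path would swap in a real `Producer.produce` plus delivery callbacks and a final `flush()`.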

Posted 1 month ago

Apply

2 - 6 years

5 - 9 Lacs

Hyderabad

Work from Office


AWS Data Engineer: ***************** As an AWS Data Engineer, you will contribute to our client and will have the below responsibilities:
- Work with the technical development team and team lead to understand desired application capabilities.
- Develop applications following established development lifecycles and continuous integration/deployment practices.
- Work to integrate open-source components into data-analytic solutions.
- Willingness to continuously learn and share learnings with others.

Required:
- 5+ years of directly applicable experience with key focus: Glue and Python; AWS; data pipeline creation.
- Develop code using Python, such as: developing data pipelines from various external data sources to internal data stores; using Glue to extract data from the designated database; developing Python APIs as needed.
- Minimum 3 years of hands-on experience in Amazon Web Services including EC2, VPC, S3, EBS, ELB, CloudFront, IAM, RDS, CloudWatch.
- Able to interpret business requirements; analyzing, designing and developing applications on AWS Cloud and ETL technologies.
- Able to design and architect serverless applications using AWS Lambda, EMR, and DynamoDB.
- Ability to leverage AWS data migration tools and technologies including Storage Gateway, Database Migration and Import/Export services.
- Understands relational database design, stored procedures, triggers, user-defined functions, SQL jobs.
- Familiar with CI/CD tools, e.g., Jenkins and UCD, for automated application deployments.
- Understanding of OLAP, OLTP, Star Schema, Snowflake Schema, Logical/Physical/Dimensional Data Modeling.
- Ability to extract data from multiple operational sources and load into staging, data warehouses, data marts, etc. using SCD (Type 1/Type 2/Type 3/Hybrid) loads.
- Familiar with Software Development Life Cycle (SDLC) stages in Waterfall and Agile environments.

Nice to have:
- Familiar with the use of source control management tools for branching, merging, labeling/tagging and integration, such as Git and SVN.
- Experience working with UNIX/Linux environments.
- Hands-on experience with IDEs such as Jupyter Notebook.

Education & Certification: University degree or diploma and applicable years of experience. Job Segment: Developer, Open Source, Data Warehouse, Cloud, Database, Technology
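The SCD loads mentioned above can be illustrated with a Type 2 example: instead of overwriting a changed attribute, the old row is closed out and a new version is appended. This is a minimal in-memory sketch, not any particular warehouse's implementation; the `id`/`city` column names and date handling are invented for illustration.

```python
from datetime import date

def scd2_apply(dimension, incoming, today):
    """Apply a Slowly Changing Dimension Type 2 load: close out changed
    rows and append new versions. `dimension` rows are dicts with a
    business key `id`, one tracked attribute `city`, and validity dates."""
    current = {r["id"]: r for r in dimension if r["end_date"] is None}
    for row in incoming:
        existing = current.get(row["id"])
        if existing is None:
            # Brand-new business key: open a first version.
            dimension.append({"id": row["id"], "city": row["city"],
                              "start_date": today, "end_date": None})
        elif existing["city"] != row["city"]:
            # Attribute changed: expire the old version, open a new one.
            existing["end_date"] = today
            dimension.append({"id": row["id"], "city": row["city"],
                              "start_date": today, "end_date": None})
        # Unchanged rows are left alone -- Type 2 keeps full history.
    return dimension
```

A Type 1 load would instead overwrite `city` in place (no history); the Hybrid variants mix the two per column.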

Posted 1 month ago

Apply

2 - 6 years

8 - 12 Lacs

Bengaluru

Work from Office


NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Digital Engineering Sr. Staff Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Title: Lead Data Architect (Warehousing)

Required Skills and Qualifications
- Overall 10+ years of IT experience, of which 7+ years in data architecture and engineering
- Strong expertise in AWS cloud services, particularly Lambda, SNS, S3, and EKS
- Proficiency in Python
- Solid understanding of data warehousing architectures and best practices
- Strong Snowflake skills
- Strong data warehouse skills
- Strong problem-solving skills and ability to think critically
- Excellent communication skills to convey complex technical concepts to both technical and non-technical stakeholders
- Experience of data cataloguing
- Knowledge of Apache Airflow for data orchestration
- Experience modelling, transforming and testing data in DBT
- Bachelor's degree in Computer Science, Engineering, or related field

Preferred Qualifications
- Familiarity with Atlan for data catalog and metadata management
- Experience integrating with IBM MQ
- Familiarity with SonarQube for code quality analysis
- AWS certifications (e.g., AWS Certified Solutions Architect)
- Experience with data modeling and database design
- Knowledge of data privacy regulations and compliance requirements
- An understanding of Lakehouses
- An understanding of Apache Iceberg tables
- SnowPro Core certification

Key Responsibilities
- Architect end-to-end data solutions using AWS services (Lambda, SNS, S3, EKS) as well as Snowflake, DBT and Apache Airflow, all within a larger, overarching programme ecosystem
- Develop data ingestion, processing, and storage solutions using Python, AWS Lambda and Snowflake
- Architect data processing applications using Python
- Ensure data security and compliance throughout the architecture
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions
- Optimize data flows for performance, cost-efficiency, and scalability
- Implement data governance and quality control measures
- Ensure delivery of CI, CD and IaC for NTT tooling, and as templates for downstream teams
- Provide technical leadership and mentorship to development teams and lead engineers
- Stay current with emerging technologies and industry trends
- Ensure data security and implement best practices using tools like Snyk
- Collaborate with data scientists and analysts to enable efficient data access and analysis
- Evaluate and recommend new technologies to improve data architecture

Position Overview: We are seeking a highly skilled and experienced Data Architect to join our dynamic team. The ideal candidate will have a strong background in designing and implementing data solutions using AWS infrastructure and a variety of core and supplementary technologies. This role requires a deep understanding of data architecture, cloud services, and the ability to drive innovative solutions to meet business needs.

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world.
NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com. NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here. Job Segment: Developer, Solution Architect, Data Warehouse, Computer Science, Database, Technology
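A small sketch of the kind of ingestion step this warehousing role architects: staging S3 files into Snowflake. Rather than opening a live connection, this composes the `COPY INTO` statement a loader would execute (via the snowflake-connector-python library in practice); the table, stage, and file-format names are hypothetical, and real code would validate identifiers rather than interpolate them.

```python
def build_copy_into(table, stage, prefix, file_format="parquet_fmt"):
    """Compose a Snowflake COPY INTO statement for loading staged files
    into a target table. Identifiers are assumed pre-validated; a real
    loader would also bind warehouse/role context on the connection."""
    return (
        f"COPY INTO {table} "
        f"FROM @{stage}/{prefix} "
        f"FILE_FORMAT = (FORMAT_NAME = '{file_format}') "
        f"ON_ERROR = 'ABORT_STATEMENT'"
    )

# Hypothetical load of one day's order files from an external stage.
sql = build_copy_into("analytics.orders", "raw_stage", "orders/2024-06-01/")
```

In a DBT/Airflow setup this statement would typically live in an ingestion task upstream of the DBT models, which then transform the raw table.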

Posted 1 month ago

Apply

2 - 6 years

8 - 12 Lacs

Bengaluru

Work from Office


Req ID: 306668. NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Digital Engineering Sr. Staff Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Job Title: Lead Data Engineer (Warehouse)

Required Skills and Qualifications
- 7+ years of experience in data engineering, of which at least 3+ years leading/managing a data engineering team of 5+
- Experience in AWS cloud services
- Expertise with Python and SQL
- Experience using Git/GitHub for source control management
- Experience with Snowflake
- Strong understanding of lakehouse architectures and best practices
- Strong problem-solving skills and ability to think critically
- Excellent communication skills to convey complex technical concepts to both technical and non-technical stakeholders
- Strong use of version control and proven ability to govern a team in the best-practice use of version control
- Strong understanding of Agile and proven ability to govern a team in the best-practice use of Agile methodologies

Preferred Skills and Qualifications
- An understanding of Lakehouses
- An understanding of Apache Iceberg tables
- An understanding of data cataloguing
- Knowledge of Apache Airflow for data orchestration
- An understanding of DBT
- SnowPro Core certification
- Bachelor's degree in Computer Science, Engineering, or related field

Key Responsibilities
Lead and direct a small team of engineers engaged in:
- Engineering end-to-end data solutions using AWS services, including Lambda, S3, Snowflake, DBT and Apache Airflow
- Cataloguing data
- Collaborating with cross-functional teams to understand business requirements and translate them into technical solutions
- Providing best-in-class documentation for downstream teams to develop, test and run data products built using our tools
- Testing our tooling, and providing a framework for downstream teams to test their utilisation of our products
- Helping to deliver CI, CD and IaC for both our own tooling, and as templates for downstream teams
- Using DBT projects to define re-usable pipelines

Position Overview: We are seeking a highly skilled and experienced Lead Data Engineer to join our dynamic team. The ideal candidate will have a strong background in implementing data solutions using AWS infrastructure and a variety of core and supplementary technologies, and in leading teams and directing engineering workloads. This role requires a deep understanding of data engineering, cloud services, and the ability to implement high-quality solutions.

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com. NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer.
Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here. Job Segment Computer Science, Database, SQL, Consulting, Technology
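The orchestration side of this role (Airflow DAGs, DBT model graphs) reduces to running tasks in dependency order. The sketch below uses the standard library's `graphlib` to compute one valid execution order for a hypothetical warehouse pipeline; the task names are invented, and Airflow/DBT of course add scheduling, retries, and state on top of this core idea.

```python
from graphlib import TopologicalSorter

def pipeline_order(dependencies):
    """Return one valid execution order for a pipeline expressed as
    {task: set_of_upstream_tasks} -- the DAG model Airflow and DBT use
    to schedule tasks/models. Raises CycleError on circular deps."""
    return list(TopologicalSorter(dependencies).static_order())

# Hypothetical pipeline: land files, copy to a stage, then build marts.
deps = {
    "copy_to_stage": {"land_s3_files"},
    "build_dim_customer": {"copy_to_stage"},
    "build_fct_orders": {"copy_to_stage", "build_dim_customer"},
}
order = pipeline_order(deps)
```

Tasks with no ordering constraint between them could also be run in parallel; `TopologicalSorter`'s incremental `prepare()`/`get_ready()` API supports exactly that.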

Posted 1 month ago

Apply

4 - 9 years

16 - 20 Lacs

Bengaluru

Work from Office


Req ID: 301930. NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Digital Solution Architect Lead Advisor to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Title: Data Solution Architect

Position Overview: We are seeking a highly skilled and experienced Data Solution Architect to join our dynamic team. The ideal candidate will have a strong background in designing and implementing data solutions using AWS infrastructure and a variety of core and supplementary technologies. This role requires a deep understanding of data architecture, cloud services, and the ability to drive innovative solutions to meet business needs.

Required Skills and Qualifications
- Bachelor's degree in Computer Science, Engineering, or related field
- 7+ years of experience in data architecture and engineering
- Strong expertise in AWS cloud services, particularly Lambda, SNS, S3, and EKS
- Proficiency in Kafka/Confluent Kafka and Python
- Experience with Snyk for security scanning and vulnerability management
- Solid understanding of data streaming architectures and best practices
- Strong problem-solving skills and ability to think critically
- Excellent communication skills to convey complex technical concepts to both technical and non-technical stakeholders

Preferred Qualifications
- Experience with Kafka Connect and Confluent Schema Registry
- Familiarity with Atlan for data catalog and metadata management
- Knowledge of Apache Flink for stream processing
- Experience integrating with IBM MQ
- Familiarity with SonarQube for code quality analysis
- AWS certifications (e.g., AWS Certified Solutions Architect)
- Experience with data modeling and database design
- Knowledge of data privacy regulations and compliance requirements

Key Responsibilities
- Design and implement scalable data architectures using AWS services and Kafka
- Architect end-to-end data solutions using AWS services, including Lambda, SNS, S3, and EKS
- Design and implement data streaming pipelines using Kafka/Confluent Kafka
- Develop data ingestion, processing, and storage solutions using Python and AWS Lambda
- Develop data processing applications using Python
- Ensure data security and compliance throughout the architecture, and implement best practices using tools like Snyk
- Optimize data pipelines and flows for performance, cost-efficiency, and scalability
- Collaborate with data scientists and analysts to enable efficient data access and analysis
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions
- Implement data governance policies, procedures, and quality control measures
- Provide technical leadership and mentorship to development teams and junior team members
- Evaluate and recommend new technologies to improve data architecture
- Stay current with emerging technologies and industry trends

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world.
NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com. NTT DATA is an equal opportunity employer and considers all applicants without regard to race, color, religion, citizenship, national origin, ancestry, age, sex, sexual orientation, gender identity, genetic information, physical or mental disability, veteran or marital status, or any other characteristic protected by law. We are committed to creating a diverse and inclusive environment for all employees. If you need assistance or an accommodation due to a disability, please inform your recruiter so that we may connect you with the appropriate team. Job Segment: Solution Architect, Consulting, Database, Computer Science, Technology
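The stream-processing side of this role (Kafka Streams/Flink-style pipelines) often comes down to windowed aggregation. Below is a deliberately simplified sketch of tumbling-window counts over an in-memory iterable; real streaming engines handle unbounded input, late events, and watermarks, none of which are modeled here, and the event data is invented.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds=60):
    """Aggregate a stream of (epoch_seconds, key) events into fixed,
    non-overlapping (tumbling) windows -- the core pattern behind
    windowed counts in Kafka Streams and Flink."""
    counts = defaultdict(int)
    for ts, key in events:
        # Each event falls into exactly one window, keyed by its start.
        window_start = ts - (ts % window_seconds)
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(0, "click"), (30, "click"), (61, "click"), (75, "view")]
windows = tumbling_window_counts(events)
# {(0, 'click'): 2, (60, 'click'): 1, (60, 'view'): 1}
```

Sliding and session windows replace the single `window_start` assignment with multiple or gap-based window membership, but the aggregation shape stays the same.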

Posted 1 month ago

Apply

2 - 5 years

6 - 10 Lacs

Mumbai

Work from Office


NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Systems Integration Specialist to join our team in Mumbai, Maharashtra (IN-MH), India (IN).

Cloud Security Lead
a. Experience: 6-8 years in cloud/security
b. Skills: Actively and aggressively fix security observations on AWS cloud - Security Hub, GuardDuty, Detective, Inspector, etc. Experience working on Check Point and Prisma Cloud will be an advantage. Should work against definitive targets to improve the security posture.

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com. NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer.
Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here. Job Segment Consulting, Technology
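"Fixing security observations" against definitive targets usually starts with triaging Security Hub findings by severity. The sketch below filters and ranks findings shaped after a simplified slice of the AWS Security Finding Format (the `Severity.Label` and `Workflow.Status` fields are real ASFF fields); in practice the input would come from the Security Hub `GetFindings` API via boto3, and the sample findings here are made up.

```python
SEVERITY_RANK = {"CRITICAL": 4, "HIGH": 3, "MEDIUM": 2, "LOW": 1, "INFORMATIONAL": 0}

def triage(findings, min_severity="HIGH"):
    """Return open findings at or above `min_severity`, worst first.
    Input dicts follow a simplified AWS Security Finding Format shape."""
    floor = SEVERITY_RANK[min_severity]
    open_findings = [
        f for f in findings
        if f.get("Workflow", {}).get("Status") != "RESOLVED"
        and SEVERITY_RANK[f["Severity"]["Label"]] >= floor
    ]
    return sorted(open_findings,
                  key=lambda f: SEVERITY_RANK[f["Severity"]["Label"]],
                  reverse=True)

# Invented sample findings for illustration only.
findings = [
    {"Title": "S3 bucket public", "Severity": {"Label": "CRITICAL"},
     "Workflow": {"Status": "NEW"}},
    {"Title": "Old AMI", "Severity": {"Label": "MEDIUM"},
     "Workflow": {"Status": "NEW"}},
    {"Title": "Root MFA missing", "Severity": {"Label": "HIGH"},
     "Workflow": {"Status": "RESOLVED"}},
]
worklist = triage(findings)
```

A posture-improvement target can then be expressed directly against this worklist, e.g. "zero open CRITICAL findings older than 7 days".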

Posted 1 month ago

Apply

3 - 6 years

2 - 6 Lacs

Hyderabad

Work from Office


ABOUT AMGEN Amgen harnesses the best of biology and technology to fight the world’s toughest diseases, and make people’s lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting-edge of innovation, using technology and human genetic data to push beyond what’s known today. ABOUT THE ROLE Role Description We are seeking an experienced MDM Manager with 10–14 years of experience to lead strategic development and operations of our Master Data Management (MDM) platforms, with hands-on experience in Informatica or Reltio. This role will involve managing a team of data engineers, architects, and quality experts to deliver high-performance, scalable, and governed MDM solutions that align with enterprise data strategy. To succeed in this role, the candidate must have strong MDM experience along with Data Governance, DQ, and Data Cataloging implementation knowledge; hence candidates must have a minimum of 6-8 years of core MDM technical experience (along with total experience in the range of 10-14 years). Roles & Responsibilities Lead the implementation and optimization of MDM solutions using Informatica or Reltio platforms. Define and drive enterprise-wide MDM architecture, including IDQ, data stewardship, and metadata workflows. Match/merge and survivorship strategy and implementation experience. Design and delivery of MDM processes and data integrations using Unix, Python, and SQL. Collaborate with the backend data engineering team and the frontend custom UI team for strong integrations and a seamless, enhanced user experience respectively. Manage cloud-based infrastructure using AWS and Databricks to ensure scalability and performance. Coordinate with business and IT stakeholders to align MDM capabilities with organizational goals.
Establish data quality metrics and monitor compliance using automated profiling and validation tools. Promote data governance and contribute to enterprise data modeling and approval workflows (DCRs). Ensure data integrity, lineage, and traceability across MDM pipelines and solutions. Provide mentorship and technical leadership to junior team members and ensure project delivery timelines. Lead custom UI design for a better user experience in data stewardship. Basic Qualifications and Experience Master’s degree with 8-10 years of experience in Business, Engineering, IT or related field OR Bachelor’s degree with 10-14 years of experience in Business, Engineering, IT or related field OR Diploma with 14-16 years of experience in Business, Engineering, IT or related field Functional Skills: Must-Have Skills: Deep knowledge of MDM tools (Informatica, Reltio) and data quality frameworks (IDQ), from configuring data assets to building end-to-end data pipelines and integrations for data mastering and orchestration of ETL pipelines. Very good understanding of reference data, hierarchy, and its integration with MDM. Hands-on experience with custom workflows (AVOS, Eclipse, etc.). Strong experience with external data enrichment services like D&B, AddressDoctor, etc. Strong experience with match/merge and survivorship rules strategy and implementations. Strong experience with group fields, cross-reference data, and UUIDs. Strong understanding of AWS cloud services and Databricks architecture. Proficiency in Python, SQL, and Unix for data processing and orchestration. Experience with data modeling, governance, and DCR lifecycle management. Proven leadership and project management in large-scale MDM implementations. Able to implement end-to-end integrations, including API-based, batch, and flat-file-based integrations. Must have worked on at least 3 end-to-end implementations of MDM. Good-to-Have Skills: Experience with Tableau or Power BI for reporting MDM insights.
Exposure to Agile practices and tools (JIRA, Confluence). Prior experience in Pharma/Life Sciences. Understanding of compliance and regulatory considerations in master data. Professional Certifications Any MDM certification (e.g., Informatica, Reltio, etc.) Any Data Analysis certification (SQL) Any cloud certification (AWS or Azure) Soft Skills: Strong analytical abilities to assess and improve master data processes and solutions. Excellent verbal and written communication skills, with the ability to convey complex data concepts clearly to technical and non-technical stakeholders. Effective problem-solving skills to address data-related issues and implement scalable solutions. Ability to work effectively with global, virtual teams. EQUAL OPPORTUNITY STATEMENT Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
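The match/merge survivorship responsibility above can be illustrated with a source-priority rule: after records are matched, the golden record keeps each field's value from the most trusted source, breaking ties by recency. This is a toy sketch only; in Informatica or Reltio, survivorship lives in platform configuration rather than application code, and the source rankings, field names, and sample data here are invented.

```python
SOURCE_PRIORITY = {"CRM": 3, "ERP": 2, "WEB_FORM": 1}  # hypothetical trust ranking

def survive(matched_records):
    """Build a golden record from matched source records: for each field,
    keep the non-null value from the highest-priority source, breaking
    ties by the most recent `updated` timestamp (ISO date string)."""
    golden = {}
    fields = {f for r in matched_records for f in r
              if f not in ("source", "updated")}
    for field in fields:
        candidates = [r for r in matched_records if r.get(field) is not None]
        best = max(candidates,
                   key=lambda r: (SOURCE_PRIORITY[r["source"]], r["updated"]))
        golden[field] = best[field]
    return golden

# Two matched records for the same party; CRM outranks WEB_FORM.
records = [
    {"source": "WEB_FORM", "updated": "2024-06-01", "email": "a@x.com", "phone": None},
    {"source": "CRM", "updated": "2024-01-15", "email": "a@corp.com", "phone": "555-0100"},
]
golden = survive(records)
```

Other common strategies (most-recent-wins, longest-value, field-level overrides) just swap the `key` function per field.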

Posted 1 month ago

Apply

6 - 10 years

8 - 12 Lacs

Hyderabad

Work from Office


Join Amgen's Mission to Serve Patients If you feel like you’re part of something bigger, it’s because you are. At Amgen, our shared mission—to serve patients—drives all that we do. It is key to our becoming one of the world’s leading biotechnology companies. We are global collaborators who achieve together—researching, manufacturing, and delivering ever-better products that reach over 10 million patients worldwide. It’s time for a career you can be proud of. Digital Product Manager/Content Curator Live What you will do Let’s do this. Let’s change the world. In this vital role, we are seeking a detail-oriented and research-savvy Content Curator to support our enterprise Search Program within the pharmaceutical sector. This role is critical to improving how scientists, researchers, clinicians, and business teams discover relevant, accurate, and well-structured information across vast internal and external data sources. You will curate, classify, and optimize content to ensure it is accessible, contextual, and aligned with regulatory standards. Curate scientific, clinical, regulatory, and commercial content for use within internal search platforms. Source and aggregate relevant content across various platforms. Ensure high-value content is properly tagged, described, and categorized using standard metadata and taxonomies. Identify and fill content gaps based on user needs and search behavior. Organize and schedule content publication to maintain consistency. Analyze content performance and make data-driven decisions to optimize engagement. Provide feedback and input on synonym lists, controlled vocabularies, and NLP enrichment tools. Apply and help maintain consistent metadata standards, ontologies, and classification schemes (e.g., MeSH, SNOMED, MedDRA). Work with taxonomy and knowledge management teams to evolve tagging strategies and improve content discoverability.
Capture and highlight the best content from a wide range of topics. Stay up-to-date on best practices and make recommendations for content strategy. Edit and optimize content for search engine optimization. Perform quality assurance checks on all content before publication. Identify and track metrics to measure the success of content curation efforts. Review and curate content from a wide variety of categories with a focus. Understanding of fundamental data structures and algorithms. Understanding how to optimize content for search engines is important for visibility. Experience in identifying, organizing, and sharing content. Ability to clearly and concisely communicate complex information. Ability to analyze data and track the performance of content. Ability to quickly adapt to changing information landscapes and find new resources. A deep understanding of Google Cloud Platform services and technologies is crucial and will be an added advantage. Check and update digital assets regularly and, if needed, modify their accessibility and security settings. Investigate, secure, and properly document permission clearance to publish data, graphics, videos, and other media. Develop and manage a system for storing and organizing digital material. Convert collected assets to a different digital format and discard material that is no longer relevant or needed. Investigate new trends and tools connected with the generation and curation of digital material. Basic Qualifications: Degree in Data Management, Mass Communication, or Computer Science & Engineering preferred, with 9-12 years of software development experience. 5+ years of experience in (digital) content curation or a related position. Excellent organizational and time-management skills. Ability to analyze data and derive insights for content optimization. Familiarity with metadata standards, taxonomy tools, and content management systems. Ability to interpret scientific or clinical content and structure it for digital platforms.
Exceptional written and verbal communication skills. Experience in Content Management Systems (CMS), SEO, Google Analytics, GXP Search Engine/Solr Search, enterprise search platforms, and Databricks. Strong problem-solving and analytical skills; ability to learn quickly; excellent communication and interpersonal skills. Preferred Qualifications: Experience with enterprise search platforms (e.g., Lucene, Elasticsearch, Coveo, Sinequa). Experience with GCP Cloud/AWS Cloud/Azure Cloud. Experience with GXP Search Engine/Solr Search. Experience in PostgreSQL/MongoDB databases, vector databases for large language models, Databricks or RDS, DynamoDB, S3. Experience in Agile software development methodologies. Good to Have Skills: Willingness to work on AI applications. Experience with popular large language models. Experience with the LangChain or LlamaIndex frameworks for language models. Experience with prompt engineering and model fine-tuning. Knowledge of NLP techniques for text analysis and sentiment analysis. Soft Skills: Excellent analytical and troubleshooting skills. Strong verbal and written communication skills. Ability to work effectively with global teams. High degree of initiative and self-motivation. Team-oriented, with a focus on achieving team goals. Strong presentation and public speaking skills. Thrive What you can expect from us As we work to develop treatments that take care of others, we also work to care for our teammates’ professional and personal growth and well-being. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now for a career that defies imagination In our quest to serve patients above all else, Amgen is the first to imagine, and the last to doubt. Join us.
careers.amgen.com Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Posted 1 month ago

Apply

14 - 20 years

40 - 45 Lacs

Bengaluru

Work from Office


About the Company
Ness is a full lifecycle digital engineering firm offering digital advisory through scaled engineering services. With 20+ years of specialization in product and platform engineering, Ness is a global leader in digital transformation. We design, build, and integrate digital platforms and enterprise software that help organizations engage with customers, differentiate their brands, and drive profitable growth. Our experience designers, software engineers, data experts, and business consultants partner with clients to develop roadmaps that identify ongoing opportunities to increase the value of their digital solutions and enterprise systems. The exciting work happens through 11 innovation hubs with 5000+ Nessians located across the globe. Please visit our website www.ness.com and learn about our wonderful work.
We are inviting applications for Engineering Manager. In this role you would work towards developing the organization's strategy for using technological resources, ensuring technologies are used efficiently, profitably, and securely, and evaluating and implementing new solutions.
Roles & Responsibilities:
• Should have 14+ years of experience working in product development organizations, with proven experience developing enterprise-scale products in a highly Agile/Scrum environment.
• Should be able to manage releases for products having multiple live versions and multiple releases through the year.
• Ability to manage delivery with 20+ engineers, including architecture, design, code reviews, and people's career management, with good exposure to engineering processes, product delivery, playbooks, frameworks, etc.
• Specific responsibilities include driving the team on innovation and implementation, creating and reviewing architectural designs, mentoring the team, and honing its engineering skills. 
• Strong knowledge of a Java/Spring-based technical stack, databases (SQL Server, Oracle), modern JS frameworks like React, AWS cloud, and design and architectural patterns and frameworks.
• Good understanding of application security, performance & quality, and DevOps processes.
• Very good knowledge of software development tools, patterns, and processes (Agile principles, Scrum, SAFe).
• Collaborate with architects, product management, and engineering teams to create solutions that increase the platform's value.
• Create technical specifications, prototypes, and presentations to communicate your ideas.
• Well-versed in emerging industry technologies and trends, with the ability to communicate that knowledge to the team and influence product direction.
• Own progress of the product through the development life cycle, identifying risks and opportunities, and ensuring visibility to senior leadership.
• Partner with product management to define and refine our product road map, user experience, priorities, and schedule.
• Excellent critical thinking, analytical, problem-solving, and solutioning skills with a customer-first mindset.
Good to have:
• Highly motivated, with the ability to convert vague and ill-defined problems into well-defined problems, take initiative, and encourage consensus building in the team.
• Strong written and verbal communication and articulation skills.
• Demonstrable project management, stakeholder management, and organizational skills.
• Proven ability to lead in a matrix environment.
• Strong interpersonal and talent management skills, including the ability to identify and develop product management talent.

Posted 1 month ago

Apply

8 - 13 years

25 - 40 Lacs

Pune

Work from Office


Role & responsibilities
Design and develop infrastructure solutions to support business functions, processes, and applications.
Responsible for designing and building AWS cloud infrastructure solutions and automation frameworks.
Participate in the creation of new infrastructure and modification of existing infrastructure environments in the cloud.
Communicate with stakeholders and development teams to assist in coordinating the successful delivery of tools and software.
Continuously improve the existing infrastructure by identifying gaps and ticket trends.
Develop, evaluate, and make recommendations for alternative infrastructure solutions.
Stay current with new technology options and cloud infrastructure solutions.
Build Infrastructure as Code services, i.e., CloudFormation and/or Terraform.
Set up and drive a standard enterprise process for cloud infrastructure setup and deployment.
Help other development and engineering teams resolve application-to-platform integration issues for Platform as a Service (PaaS) and Infrastructure as a Service (IaaS) services.
Work closely with stakeholders and product teams to gather technical and non-technical requirements and translate them into well-architected solutions.
Be up to date with the latest developments in cloud services and bring that knowledge and best practices to the team.
Preferred candidate profile
Hands-on experience in designing, implementing, and managing cloud-based solutions, ensuring scalability, reliability, security, compliance, and performance across various AWS services.
Collaborate with development, operations, and architecture teams to integrate with AWS services.
Optimize cost management by setting up budgets, analysing cost usage with tools like Cost Explorer, and providing recommendations for cost-saving measures. 
Develop migration strategies and lead execution of migration activities.
Implement security controls and align the cloud environment with relevant industry regulations (e.g., GDPR, HIPAA).
Conduct security assessments and set up control towers.
Develop recovery strategies: design disaster recovery (DR) and business continuity (BC) plans.
Develop and maintain cloud governance frameworks, policies, and procedures.
Develop team proficiency in infrastructure automation tools like Terraform, AWS CloudFormation, and Azure Resource Manager (ARM) templates.
Should be proficient in scripting languages like PowerShell, infrastructure automation frameworks, and CI/CD pipelines to automate configuration management, provisioning, and deployment processes.
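For candidates new to Infrastructure as Code, the CloudFormation work described above can be pictured with a minimal sketch: a Python snippet that emits a CloudFormation template (JSON) for a single versioned S3 bucket. The bucket and logical resource names here are invented for illustration, not part of this role's actual stack.

```python
import json

def s3_bucket_template(bucket_name: str) -> str:
    """Build a minimal CloudFormation template (JSON) for a versioned S3 bucket."""
    template = {
        "AWSTemplateFormatVersion": "2010-09-09",
        "Resources": {
            # "DataBucket" is a hypothetical logical ID for this example.
            "DataBucket": {
                "Type": "AWS::S3::Bucket",
                "Properties": {
                    "BucketName": bucket_name,
                    "VersioningConfiguration": {"Status": "Enabled"},
                },
            }
        },
    }
    return json.dumps(template, indent=2)

# The emitted JSON could be deployed with the AWS CLI or checked into version control.
print(s3_bucket_template("example-infra-bucket"))
```

In practice a team would author such templates directly (or in Terraform HCL) rather than generate them from Python; the point is only the template's shape.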

Posted 1 month ago

Apply

8 - 13 years

30 - 35 Lacs

Pune

Hybrid


We are seeking a highly skilled and experienced Cloud Solution Architect to lead the design, implementation, and management of cloud-based solutions across Azure and AWS. The ideal candidate will have deep expertise in cloud architecture, infrastructure, security, and automation while demonstrating strong capabilities in proposal writing, RFPs, architecture diagrams, and technical presentations.
Key Responsibilities:
Cloud Architecture & Design: Design and implement scalable, secure, and cost-effective cloud architectures across Azure and AWS.
Pre-Sales & Solutioning: Engage with clients to understand business requirements and develop cloud solutions, including responding to RFPs and crafting technical proposals.
Technical Documentation & Diagrams: Create high-level and low-level architecture diagrams, design documents, and best practices for cloud adoption.
Presentations & Stakeholder Communication: Deliver technical presentations, PPTs, and cloud strategy discussions to clients and internal teams.
Migration & Modernization: Assess and execute cloud migration strategies, including lift-and-shift, re-platforming, and re-architecting applications.
Security & Compliance: Ensure cloud solutions comply with industry standards such as SOC 2, ISO 27001, NIST, and CIS benchmarks.
Automation & Optimization: Utilize Infrastructure-as-Code (IaC) tools like Terraform, ARM Templates, or CloudFormation for automation.
Collaboration & Leadership: Work closely with DevOps, engineering, and security teams to drive cloud adoption and best practices.
Required Skills & Experience:
15+ years of experience in IT, with 10+ years in cloud architecture (Azure, AWS).
Expertise in Azure services (VMs, AKS, AAD, Networking, Security, Storage, etc.) and AWS services (EC2, RDS, Lambda, VPC, IAM, etc.).
Strong experience with architecture frameworks like TOGAF and the Well-Architected Framework (Azure & AWS). 
Hands-on experience in Infrastructure as Code (Terraform, ARM, CloudFormation) and automation using PowerShell, Python, or Bash. Knowledge of cloud security, identity & access management, and compliance frameworks. Experience working on RFPs, proposals, and pre-sales activities. Strong communication and presentation skills with the ability to create and deliver PPTs, whitepapers, and technical documentation. Experience with hybrid cloud solutions, multi-cloud strategies, and cloud governance. Understanding of networking, VPNs, firewalls, load balancing, and DNS in a cloud environment. Certifications such as Azure Solutions Architect Expert, AWS Certified Solutions Architect.

Posted 1 month ago

Apply

5 - 8 years

4 - 8 Lacs

Bengaluru

Remote


We are seeking a skilled and motivated AWS Cloud Engineer to manage and optimize our cloud infrastructure. You will be responsible for designing, implementing, and maintaining scalable, secure, and cost-effective AWS environments that support our fintech products and services.
Key Responsibilities:
Design, deploy, and maintain cloud infrastructure on AWS.
Automate provisioning, configuration, and scaling using Infrastructure as Code (IaC) tools such as Terraform or CloudFormation.
Monitor system performance, troubleshoot issues, and optimize cloud resources for performance and cost.
Implement security best practices including IAM roles, security groups, and encryption.
Collaborate with development, QA, and DevOps teams to support CI/CD pipelines.
Ensure high availability, backup, and disaster recovery plans are in place and tested.
Maintain compliance with security, governance, and regulatory standards.
Key Skills:
Deep knowledge of Amazon Web Services (AWS): EC2, S3, RDS, Lambda, VPC, CloudWatch, etc.
Experience with Infrastructure as Code: Terraform or AWS CloudFormation.
Strong scripting skills (Bash, Python, etc.).
Knowledge of CI/CD tools: Jenkins, GitHub Actions, GitLab CI.
Experience with monitoring/logging tools: CloudWatch, ELK Stack, Prometheus, Grafana.
Understanding of cloud security best practices and networking concepts.
Familiarity with containerization (Docker) and orchestration (Kubernetes, optional).
Experience with Linux-based server environments.
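As a rough illustration of the monitoring responsibility above, the sketch below mimics CloudWatch-style alarm logic in plain Python: an alarm fires only when the last N datapoints all breach the threshold. The metric values and threshold are invented; real alarms are configured in CloudWatch itself, not coded by hand.

```python
def breaches_threshold(datapoints: list[float], threshold: float, periods: int) -> bool:
    """Return True when the last `periods` datapoints all exceed `threshold`,
    mirroring a CloudWatch alarm with N consecutive evaluation periods."""
    recent = datapoints[-periods:]
    # Require a full window so a short metric history cannot trigger the alarm.
    return len(recent) == periods and all(v > threshold for v in recent)

# Hypothetical CPU-utilization samples (percent), newest last.
cpu = [40.0, 55.0, 82.0, 91.0, 88.0]
print(breaches_threshold(cpu, 80.0, 3))  # True: last three samples exceed 80%
```

Requiring consecutive breaches, rather than a single spike, is the standard way alarms avoid flapping on transient load.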

Posted 1 month ago

Apply

8 - 13 years

40 - 50 Lacs

Bengaluru

Work from Office


Role: Infrastructure Engineer
Location: Bangalore
Duration: Permanent
Exp: 8+ years
About the role
We are seeking an experienced Infrastructure Engineer to join our team, a leader in blockchain technology and solutions. The ideal candidate will have a strong background in infrastructure management and a deep understanding of blockchain ecosystems. You will be responsible for designing, implementing, and maintaining the foundational infrastructure that supports our blockchain platforms, ensuring high availability, scalability, and security. Your expertise in AWS cloud technologies and database management, particularly with RDS, PostgreSQL, and Aurora, will be essential to our success.
Responsibilities:
Design & Deployment: Develop, deploy, and manage the infrastructure for blockchain nodes, databases, and network systems.
Automation & Optimization: Automate infrastructure provisioning and maintenance tasks to enhance efficiency and reduce downtime. Optimize performance, reliability, and scalability across our blockchain systems.
Monitoring & Troubleshooting: Set up monitoring and alerting systems to proactively manage infrastructure health. Quickly identify, troubleshoot, and resolve issues in production environments.
Security Management: Implement robust security protocols, firewalls, and encryption to protect infrastructure and data from breaches and vulnerabilities. Should be well-versed in VPC (Virtual Private Cloud).
Collaboration: Work closely with development, DevOps, and security teams to ensure seamless integration and support of blockchain applications. Support cross-functional teams in achieving network reliability and efficient resource management.
Documentation: Maintain comprehensive documentation of infrastructure configurations, processes, and recovery plans.
Continuous Improvement: Research and implement new tools and practices to improve infrastructure resiliency, performance, and cost-efficiency. 
Stay updated with blockchain infrastructure trends and industry best practices.
Incident Management: Incident dashboard management. Integrate dashboards using different power tools.
Requirements:
Educational Background: Bachelor's degree in Computer Science, Information Technology, or a related field.
Experience: Minimum of 7 years of experience in AWS infrastructure engineering, using Terraform, Terragrunt, and Atlantis, with incident management and resolution using automation (infrastructure as code) and AWS cloud infrastructure provisioning. Should be well-versed in VPC (Virtual Private Cloud).
Technical Skills:
Terraform and automation
AWS CloudWatch
Hands-on experience with monitoring tools (e.g., Prometheus, Grafana)
DevOps with CI/CD pipelines
Incident management, resolution, and reporting
Proficiency in cloud platforms (e.g., AWS, GCP, Azure) and container orchestration (e.g., Docker, Kubernetes)
Strong knowledge of Linux/Unix system administration
Understanding of networking protocols, VPNs, and firewalls
Participate in on-call rotations to provide 24/7 support for critical systems.
Security Knowledge: Strong understanding of security best practices, especially within blockchain environments.
Soft Skills: Excellent problem-solving abilities, attention to detail, strong communication skills, and a proactive, team-oriented mindset.
Experience working with consensus protocols and node architecture.

Posted 1 month ago

Apply

12 - 15 years

10 - 15 Lacs

Bengaluru

Work from Office


ROLE AND RESPONSIBILITIES
Must be a Subject Matter Expert in one of the following technologies:
1. AWS IaaS
Experience in key AWS services: EC2, Simple Storage Service (S3), Virtual Private Cloud (VPC), Auto Scaling, Security Groups, public/private subnets, Route 53, CloudFront, Snapshots, Direct Connect, NACL, Elastic Block Store (EBS), Elastic Load Balancer (ELB), Internet Gateway (IG), Transit Gateway (TG), NAT, SNS, SQS, etc.
Experience in security services: AWS Identity and Access Management (IAM), CloudWatch, AWS Secrets Manager, Web Application Firewall (WAF), GuardDuty, AWS Config, CloudTrail, Amazon Inspector, AWS Shield, AWS Security Hub, Trusted Advisor, KMS, etc.
Implement AWS Landing Zone using AWS Control Tower, and manage multiple AWS accounts, users, and resources using Landing Zone.
2. Azure IaaS
Azure AD, Resource Groups, Virtual Machines, Containers, Virtual Networks, Storage, VNet Gateway, Site-to-Site VPN, Availability Sets, Recovery Services Vaults, Load Balancers, NSG, Azure Security Center, Azure PowerShell, ExpressRoute, Azure Monitor, Azure Advisor, Azure DNS, App Services, Policy, Blueprint, Automation Account, Log Analytics Workspace, etc.
Key skills required:
Containerization like Docker and Kubernetes
Infrastructure as Code tooling like Terraform
Windows PowerShell, Unix or Linux bash scripting
At least one Continuous Integration and Continuous Deployment (CI/CD) tool like GitHub or Cloud Build
Serverless technologies like AWS Lambda or Azure Functions
QUALIFICATIONS AND CERTIFICATION REQUIREMENTS
Work experience and educational background that a candidate should have when applying for the position:
Years of Exp.: 12-15 years of experience
AWS Certified Solutions Architect
AWS Certified Professional
Azure Certified Solutions Architect
Google Certified Professional (GCP)
Looking for Immediate Joiners
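The IAM experience listed above centers on least-privilege policy documents. Below is a minimal, hypothetical sketch that builds such a policy as a plain Python dict; the bucket ARN is invented for illustration, and a real policy would be attached to a role or user via the console, CLI, or IaC, not generated like this.

```python
import json

def least_privilege_policy(bucket_arn: str) -> dict:
    """Sketch of a least-privilege IAM policy: read-only access to one S3 bucket."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                # Only the two actions a read-only consumer needs.
                "Action": ["s3:GetObject", "s3:ListBucket"],
                # Both the bucket itself and the objects inside it.
                "Resource": [bucket_arn, f"{bucket_arn}/*"],
            }
        ],
    }

# Hypothetical bucket name, purely for demonstration.
print(json.dumps(least_privilege_policy("arn:aws:s3:::audit-logs"), indent=2))
```

Scoping `Resource` to a single bucket ARN, rather than `*`, is the essence of the least-privilege practice interviewers for such roles tend to probe.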

Posted 1 month ago

Apply

2 - 5 years

4 - 8 Lacs

Pune

Work from Office


About The Role
The candidate must possess knowledge relevant to the functional area, act as a subject matter expert in providing advice in the area of expertise, and focus on continuous improvement for maximum efficiency. It is vital to focus on a high standard of delivery excellence, provide top-notch service quality, and develop successful long-term business partnerships with internal/external customers by identifying and fulfilling customer needs. He/she should be able to break down complex problems into logical and manageable parts in a systematic way, generate and compare multiple options, and set priorities to resolve problems. The ideal candidate must be proactive and go beyond expectations to achieve job results and create new opportunities. He/she must positively influence the team, motivate high performance, promote a friendly climate, give constructive feedback, provide development opportunities, and manage career aspirations of direct reports. Communication skills are key here, to explain organizational objectives, assignments, and the big picture to the team, and to articulate team vision and clear objectives.
Process Manager Roles and responsibilities:
Designing and implementing scalable, reliable, and maintainable data architectures on AWS.
Developing data pipelines to extract, transform, and load (ETL) data from various sources into AWS environments.
Creating and optimizing data models and schemas for performance and scalability using AWS services like Redshift, Glue, Athena, etc.
Integrating AWS data solutions with existing systems and third-party services.
Monitoring and optimizing the performance of AWS data solutions, ensuring efficient query execution and data retrieval.
Implementing data security and encryption best practices in AWS environments.
Documenting data engineering processes, maintaining data pipeline infrastructure, and providing support as needed. 
Working closely with cross-functional teams including data scientists, analysts, and stakeholders to understand data requirements and deliver solutions.
Technical and Functional Skills:
Typically, a bachelor's degree in Computer Science, Engineering, or a related field is required, along with 5+ years of experience in data engineering and AWS cloud environments.
Strong experience with AWS data services such as S3, EC2, Redshift, Glue, Athena, EMR, etc.
Proficiency in programming languages commonly used in data engineering such as Python, SQL, Scala, or Java.
Experience in designing, implementing, and optimizing data warehouse solutions on Snowflake/Amazon Redshift.
Familiarity with ETL tools and frameworks (e.g., Apache Airflow, AWS Glue) for building and managing data pipelines.
Knowledge of database management systems (e.g., PostgreSQL, MySQL, Amazon Redshift) and data lake concepts.
Understanding of big data technologies such as Hadoop, Spark, Kafka, etc., and their integration with AWS.
Proficiency in version control tools like Git for managing code and infrastructure as code (e.g., CloudFormation, Terraform).
Ability to analyze complex technical problems and propose effective solutions.
Strong verbal and written communication skills for documenting processes and collaborating with team members and stakeholders.
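The extract-transform-load (ETL) pipelines this role describes can be sketched, at their simplest, as three pure-Python stages. This toy example uses invented data and no AWS services; it only illustrates the shape of the pattern that tools like Glue and Airflow implement at scale.

```python
import csv
import io

def extract(raw_csv: str) -> list:
    """Extract: parse raw CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list) -> list:
    """Transform: drop rows with a missing amount and cast amount to float."""
    out = []
    for r in rows:
        if r.get("amount"):
            out.append({"id": r["id"], "amount": float(r["amount"])})
    return out

def load(rows: list, warehouse: list) -> int:
    """Load: append cleaned rows to a stand-in 'warehouse'; return rows loaded."""
    warehouse.extend(rows)
    return len(rows)

# Hypothetical source data; the second row fails validation and is dropped.
raw = "id,amount\n1,10.5\n2,\n3,7.0\n"
warehouse = []
loaded = load(transform(extract(raw)), warehouse)
print(loaded)  # 2
```

In a real AWS pipeline the extract stage would read from S3, the transform would run on Glue or Spark, and the load target would be Redshift or Snowflake.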

Posted 1 month ago

Apply

1 - 4 years

2 - 6 Lacs

Pune

Work from Office


About The Role
The candidate must possess knowledge relevant to the functional area, act as a subject matter expert in providing advice in the area of expertise, and focus on continuous improvement for maximum efficiency. It is vital to focus on a high standard of delivery excellence, provide top-notch service quality, and develop successful long-term business partnerships with internal/external customers by identifying and fulfilling customer needs. He/she should be able to break down complex problems into logical and manageable parts in a systematic way, generate and compare multiple options, and set priorities to resolve problems. The ideal candidate must be proactive and go beyond expectations to achieve job results and create new opportunities. He/she must positively influence the team, motivate high performance, promote a friendly climate, give constructive feedback, provide development opportunities, and manage career aspirations of direct reports. Communication skills are key here, to explain organizational objectives, assignments, and the big picture to the team, and to articulate team vision and clear objectives.
Process Manager Roles and responsibilities:
Designing and implementing scalable, reliable, and maintainable data architectures on AWS.
Developing data pipelines to extract, transform, and load (ETL) data from various sources into AWS environments.
Creating and optimizing data models and schemas for performance and scalability using AWS services like Redshift, Glue, Athena, etc.
Integrating AWS data solutions with existing systems and third-party services.
Monitoring and optimizing the performance of AWS data solutions, ensuring efficient query execution and data retrieval.
Implementing data security and encryption best practices in AWS environments.
Documenting data engineering processes, maintaining data pipeline infrastructure, and providing support as needed. 
Working closely with cross-functional teams including data scientists, analysts, and stakeholders to understand data requirements and deliver solutions.
Technical and Functional Skills:
Typically, a bachelor's degree in Computer Science, Engineering, or a related field is required, along with 5+ years of experience in data engineering and AWS cloud environments.
Strong experience with AWS data services such as S3, EC2, Redshift, Glue, Athena, EMR, etc.
Proficiency in programming languages commonly used in data engineering such as Python, SQL, Scala, or Java.
Experience in designing, implementing, and optimizing data warehouse solutions on Snowflake/Amazon Redshift.
Familiarity with ETL tools and frameworks (e.g., Apache Airflow, AWS Glue) for building and managing data pipelines.
Knowledge of database management systems (e.g., PostgreSQL, MySQL, Amazon Redshift) and data lake concepts.
Understanding of big data technologies such as Hadoop, Spark, Kafka, etc., and their integration with AWS.
Proficiency in version control tools like Git for managing code and infrastructure as code (e.g., CloudFormation, Terraform).
Ability to analyze complex technical problems and propose effective solutions.
Strong verbal and written communication skills for documenting processes and collaborating with team members and stakeholders.

Posted 1 month ago

Apply

2 - 5 years

4 - 8 Lacs

Pune

Work from Office


About The Role
Process Manager - AWS Data Engineer
Mumbai/Pune | Full-time (FT) | Technology Services
Shift Timings - EMEA (1pm-9pm) | Management Level - PM | Travel Requirements - NA
The ideal candidate must possess in-depth functional knowledge of the process area and apply it to operational scenarios to provide effective solutions. The role enables them to identify discrepancies and propose optimal solutions using a logical, systematic, and sequential methodology. It is vital to be open-minded towards inputs and views from team members and to effectively lead, control, and motivate groups towards company objectives. Additionally, the candidate must be self-directed, proactive, and seize every opportunity to meet internal and external customer needs and achieve customer satisfaction by effectively auditing processes, implementing best practices and process improvements, and utilizing the frameworks and tools available. Goals and thoughts must be clearly and concisely articulated and conveyed, verbally and in writing, to clients, colleagues, subordinates, and supervisors.
Process Manager Roles and responsibilities:
Understand clients' requirements and provide effective and efficient solutions in AWS using Snowflake.
Assemble large, complex sets of data that meet non-functional and functional business requirements.
Architect and design using Snowflake/Redshift to create data pipelines and consolidate data on the data lake and data warehouse. 
Demonstrated strength and experience in data modeling, ETL development, and data warehousing concepts.
Understanding of data pipelines and modern ways of automating data pipelines in the cloud.
Test and clearly document implementations, so others can easily understand the requirements, implementation, and test conditions.
Perform data quality testing and assurance as a part of designing, building, and implementing scalable data solutions in SQL.
Technical and Functional Skills:
AWS Services: Strong experience with AWS data services such as S3, EC2, Redshift, Glue, Athena, EMR, etc.
Programming Languages: Proficiency in programming languages commonly used in data engineering such as Python, SQL, Scala, or Java.
Data Warehousing: Experience in designing, implementing, and optimizing data warehouse solutions on Snowflake/Amazon Redshift.
ETL Tools: Familiarity with ETL tools and frameworks (e.g., Apache Airflow, AWS Glue) for building and managing data pipelines.
Database Management: Knowledge of database management systems (e.g., PostgreSQL, MySQL, Amazon Redshift) and data lake concepts.
Big Data Technologies: Understanding of big data technologies such as Hadoop, Spark, Kafka, etc., and their integration with AWS.
Version Control: Proficiency in version control tools like Git for managing code and infrastructure as code (e.g., CloudFormation, Terraform).
Problem-solving Skills: Ability to analyze complex technical problems and propose effective solutions.
Communication Skills: Strong verbal and written communication skills for documenting processes and collaborating with team members and stakeholders.
Education and Experience: Typically, a bachelor's degree in Computer Science, Engineering, or a related field is required, along with 5+ years of experience in data engineering and AWS cloud environments.
About eClerx
eClerx is a global leader in productized services, bringing together people, technology, and domain expertise to amplify business results. 
Our mission is to set the benchmark for client service and success in our industry. Our vision is to be the innovation partner of choice for technology, data analytics, and process management services. Since our inception in 2000, we've partnered with top companies across various industries, including financial services, telecommunications, retail, and high-tech. Our innovative solutions and domain expertise help businesses optimize operations, improve efficiency, and drive growth. With over 18,000 employees worldwide, eClerx is dedicated to delivering excellence through smart automation and data-driven insights. At eClerx, we believe in nurturing talent and providing hands-on experience.
About eClerx Technology
eClerx's Technology Group collaboratively delivers Analytics, RPA, AI, and Machine Learning digital technologies that enable our consultants to help businesses thrive in a connected world. Our consultants and specialists partner with our global clients and colleagues to build and implement digital solutions through a broad spectrum of activities. To know more about us, visit https://eclerx.com
eClerx is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability or protected veteran status, or any other legally protected basis, per applicable law.
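A simple way to picture the data quality testing this role mentions is a row-count gate run after a load step: the pipeline fails fast if a table arrives emptier than expected. The sketch below uses Python's built-in sqlite3 purely as a stand-in for Redshift or Snowflake; the table and column names are invented.

```python
import sqlite3

def row_count_check(conn: sqlite3.Connection, table: str, min_rows: int) -> bool:
    """Data-quality gate: True when `table` holds at least `min_rows` rows."""
    (count,) = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()
    return count >= min_rows

# Stand-in warehouse with a hypothetical orders table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 9.5), (2, 12.0)])

print(row_count_check(conn, "orders", 1))  # True
```

Production frameworks express the same idea declaratively (e.g., an expectation like "table must have > N rows") rather than as ad-hoc queries, but the gate itself is this simple.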

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies