Get alerts for new jobs matching your selected skills, preferred locations, and experience range.
5 - 8 years
15 - 20 Lacs
Bengaluru
Work from Office
1. 5+ years of experience with Amazon RDS Aurora (PostgreSQL/AWS). 2. Experience with other AWS database services (e.g., DynamoDB, Redshift). 3. Experience with database performance tuning and optimization. 4. Proficiency in SQL and experience with database scripting languages. Required Candidate profile: 1. Familiarity with database security best practices. 2. Experience with database backup, recovery, and disaster recovery planning. 3. Understanding of data warehousing and ETL processes. 4. AWS Certified.
Posted 3 months ago
5 - 10 years
15 - 25 Lacs
Bengaluru
Hybrid
Hiring for one of our Big4 clients. Job Title: Senior AWS Data Engineer. Location: Bangalore. Job Type: Full-Time.
Overview: We are looking for an experienced Senior AWS Data Engineer (P1) with 5+ years of hands-on experience building scalable, reliable data pipelines and data engineering solutions using AWS services. The ideal candidate will have expertise in AWS Glue, Lambda, Redshift, Step Functions, CloudWatch, and advanced Python/PySpark skills. You will be responsible for designing, implementing, and optimizing ETL/ELT pipelines, managing data integration workflows, and ensuring high-performance data solutions on AWS. This role requires deep technical expertise in cloud-based data platforms, excellent problem-solving skills, and the ability to lead complex data projects while working collaboratively with business stakeholders.
Key Responsibilities:
- Lead ETL/ELT Pipeline Development: Architect, design, and implement advanced ETL/ELT pipelines using AWS Glue, Lambda, PySpark, and SQL, ensuring efficient data integration and transformation.
- Data Warehousing & Optimization: Optimize Amazon Redshift data warehouse performance, design schema structures, and develop efficient data models to handle large-scale data workloads.
- Orchestration & Workflow Automation: Use AWS Step Functions, Airflow, and CloudWatch to orchestrate data workflows, automate tasks, and ensure smooth pipeline operations.
- Cloud Services Integration: Leverage a broad set of AWS services, including API Gateway, S3, SQS, SNS, SES, DMS, CloudFormation, CDK, and IAM, to integrate various data sources, manage permissions, and automate data processes.
- Technical Leadership: Provide guidance and mentorship to junior engineers, help them grow their technical skills, and ensure best practices for coding, testing, and deployment.
- Solution Design & Development: Work closely with business analysts and product owners to translate functional requirements into high-performance, scalable technical solutions on AWS.
- Data Quality & Monitoring: Use CloudWatch to monitor data pipelines, ensure optimal performance, and troubleshoot issues in production environments.
- Security & Compliance: Implement best practices for data security and access control using IAM, and ensure compliance with data governance and regulatory requirements.
- Documentation & Process Standardization: Create comprehensive technical documentation, including system designs, data models, and pipeline configurations. Standardize best practices across the team.
Primary Skills Required:
- AWS Services Expertise: Extensive experience with AWS services, including Glue, Lambda, Step Functions, Redshift, S3, API Gateway, SQS, SNS, SES, DMS, CloudFormation, CDK, IAM, and VPC.
- Programming: Strong proficiency in Python, PySpark, and SQL, with hands-on experience developing data transformation scripts and automation for large-scale data processing.
- ETL/ELT Pipelines: Proven experience designing and implementing ETL/ELT pipelines using AWS Glue, Lambda, and other AWS services to process and transform data efficiently.
- Data Warehousing: Expertise in Amazon Redshift for data warehousing, including schema design, query optimization, performance tuning, and data integration.
- Orchestration & Workflow Management: Advanced experience with AWS Step Functions, Airflow, and CloudWatch to manage and orchestrate data workflows, monitor pipeline health, and ensure process efficiency.
- Cloud Infrastructure & Automation: Strong experience with CloudFormation and CDK for infrastructure automation and managing resources in AWS.
- Security & Permissions Management: Deep knowledge of IAM for managing security and access control, ensuring secure data operations.
- Troubleshooting & Debugging: Expertise in monitoring data pipelines using CloudWatch, identifying bottlenecks, and resolving issues in data processes.
Additional Skills (Nice to Have):
- Experience with Database Migration Service (DMS) for database replication and migration.
- Familiarity with Airflow or other orchestration frameworks for data workflows.
- Strong understanding of data governance and compliance standards in cloud environments.
- Knowledge of Agile development methodologies and proficiency with Git for version control.
- AWS certifications in relevant areas (e.g., AWS Certified Solutions Architect, AWS Certified Data Analytics).
Education and Experience:
- Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, or a related field.
- 5+ years of experience as a Data Engineer or Cloud Engineer with a focus on AWS services, data engineering, and building ETL/ELT pipelines.
Note: Immediate joiners, candidates currently serving their notice period, or those who can join within 30 days.
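The Glue/Lambda/PySpark pipeline work the posting above describes boils down to transform steps like the following minimal sketch, written here in plain Python for portability. A real Glue job would express the same logic with PySpark DataFrames; the field names (`order_id`, `amount`) are hypothetical, not taken from the posting.

```python
# Minimal sketch of one ETL transform step: drop malformed rows, cast
# types, and add a derived column. Illustrative only; a production Glue
# job would do this with PySpark DataFrame operations at scale.

def transform(records):
    """Filter out malformed rows, cast types, and add a derived flag."""
    out = []
    for r in records:
        if r.get("order_id") is None or r.get("amount") is None:
            continue  # quality check: skip rows missing required fields
        amount = float(r["amount"])
        out.append({
            "order_id": str(r["order_id"]),
            "amount": amount,
            "is_large": amount >= 1000.0,  # derived column
        })
    return out

if __name__ == "__main__":
    raw = [
        {"order_id": 1, "amount": "250.5"},
        {"order_id": None, "amount": "10"},  # malformed: dropped
        {"order_id": 2, "amount": 1500},
    ]
    print(transform(raw))
```

The same shape (validate, cast, derive) recurs in most of the ETL roles listed on this page.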
Posted 3 months ago
5 - 10 years
0 - 1 Lacs
Noida
Remote
Remote/Noida. Strong expertise in Power BI development and visualization, AWS Redshift, and a deep understanding of Order Management Systems (OMS), inventory, or warehouse data handling. The candidate will be responsible for design and development. Required Candidate profile: 5+ years of experience in Power BI development, including advanced DAX calculations and data modeling. Strong experience with AWS Redshift, including query optimization and data warehousing concepts.
Posted 3 months ago
10 - 15 years
20 - 25 Lacs
Bengaluru
Work from Office
PySpark, with experience architecting high-throughput data lakes. Understanding of CDC and related tools such as Debezium and DMS. Workflows: Airflow, Step Functions, or Glue Workflows. Glue ecosystem: Glue Jobs, Data Catalog, Bookmarks, Crawlers.
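For readers unfamiliar with CDC (change data capture), the core idea the Debezium/DMS tooling implements can be sketched in a few lines: replay a stream of change events against a table to reconstruct its current state. The event shape below loosely follows Debezium's `c`/`u`/`d` operation codes but is heavily simplified; real change events carry schemas, before/after images, and source metadata.

```python
# Toy sketch of CDC replay: apply create/update/delete events (in the
# style of Debezium's "op" codes, greatly simplified) to a table state
# held as {primary_key: row}.

def apply_changes(state, events):
    """Mutate and return `state` after applying a list of change events."""
    for e in events:
        if e["op"] in ("c", "u"):       # create or update: upsert the row
            state[e["key"]] = e["row"]
        elif e["op"] == "d":            # delete: remove the row if present
            state.pop(e["key"], None)
    return state

if __name__ == "__main__":
    events = [
        {"op": "c", "key": 1, "row": {"name": "a"}},
        {"op": "u", "key": 1, "row": {"name": "b"}},
        {"op": "d", "key": 1, "row": None},
    ]
    print(apply_changes({}, events))
```

A downstream data-lake consumer runs essentially this loop, just partitioned and at scale.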
Posted 3 months ago
6 - 9 years
10 - 15 Lacs
Chennai, Bengaluru
Work from Office
We are seeking Data Engineers for our Enterprise Integration Team. As a Data Engineer at Paramount Global, you will help us unlock the full potential of our real-time and relational data and provide our businesses with insights that allow us to make better and faster decisions. The ideal candidate must have expertise in developing Python-based data integrations and APIs.
Responsibilities:
- Building APIs and data pipelines using Python.
- Orchestrating data workflows.
- Working in a time-sensitive project environment where managing competing priorities is a must-have skill.
- Participating in meetings with engineers and business users.
Basic qualifications:
- Minimum 6+ years of hands-on experience as a data engineer.
- Strong ability to develop solutions with Python and SQL.
- In-depth knowledge of enterprise integration and automation patterns (ESB, ETL, ELT, API broker, consolidating and rationalizing microservices).
Additional qualifications:
- Database programming.
- Scripting in a Linux environment.
- Solid grasp of cloud data warehousing platforms such as Snowflake, Redshift, or BigQuery.
- Hands-on experience with Amazon Web Services.
- Ability to function collaboratively as part of a fast-paced, customer-oriented team, perform effectively as an independent producer under broad management direction, and a demonstrated willingness to support the team at all levels to get the job done.
- Superior communication skills in working with technical and non-technical users, and the ability to cultivate and maintain collaborative relationships at all levels of an organization.
- Bachelor's degree in Computer Science, Mathematics, Engineering, Data Science, or Statistics preferred.
Posted 3 months ago
6 - 8 years
14 - 19 Lacs
Bangalore Rural
Hybrid
Hi, we are looking for a Data Engineer. Skills: ETL, PySpark, Python, Redshift/Snowflake. Location: Bangalore. Experience: 6 to 8 years. Notice period: immediate to 15 days. Interested candidates, please send your resume to sreeram.sekhar@thakralone.in
Posted 3 months ago
6 - 8 years
14 - 19 Lacs
Bengaluru
Hybrid
Hi, we are looking for a Data Engineer. Skills: ETL, PySpark, Python, Redshift/Snowflake. Location: Bangalore. Experience: 6 to 8 years. Notice period: immediate to 15 days. Interested candidates, please send your resume to sreeram.sekhar@thakralone.in
Posted 3 months ago
7 - 12 years
0 - 0 Lacs
Pune, Bengaluru, Hyderabad
Hybrid
Hiring for a top MNC (long-term contract). The candidate must be able to create a variety of infrastructure as code (IaC) on AWS that will be leveraged by data engineering teams, using tools like CloudFormation (CFM) to create IaC and to deploy and operate the platform based on requirements from the DEs. 2 FTEs; 5+ years of relevant experience (senior PE).
- Ability to lead a small team (two-pizza squad)
- Ability to create scalable and stable serverless architecture/design
- Expert in Python (object-oriented) development
- Expert in writing Python unit tests
- Extensive use of the following AWS services: S3, Lambda, Glue, SQS, IAM, DynamoDB, CloudWatch, EventBridge, Step Functions, EMR (incl. serverless), Redshift (incl. serverless), API Gateway and/or AppSync; optionally, AWS Lake Formation, DMS, DataSync, AppFlow
- Fluent with REST APIs
- Experience with CFM and CDK
- Knowledge of data engineering principles using the services above is important
- Previous experience with Azure AD and OAuth2
- Previous experience in BDD (behavior-driven development) testing or integration testing
- Optionally, previous experience with Node.js/React development
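Since the role above stresses object-oriented Python and unit tests for serverless code, here is a hedged sketch of the usual pattern: business logic factored out of a Lambda-style handler so it can be unit-tested without deploying to AWS. The event shape (a `records` key) is hypothetical; only the `handler(event, context)` signature follows the standard Lambda convention.

```python
# Sketch of a unit-testable Lambda-style handler in plain Python.
# No AWS SDK calls, so it runs anywhere; the point is the pattern:
# pure logic in its own function, with the handler as a thin wrapper.

def count_valid(records):
    """Business logic, isolated so a unit test can call it directly."""
    return sum(1 for r in records if r.get("id") is not None)

def handler(event, context=None):
    """Lambda-style entry point: unpack the event, delegate, wrap result."""
    records = event.get("records", [])
    return {"statusCode": 200, "valid": count_valid(records)}

if __name__ == "__main__":
    print(handler({"records": [{"id": 1}, {"id": None}, {"id": 3}]}))
```

Keeping the handler thin is what makes the "expert in writing Python unit tests" requirement practical: tests target `count_valid` without any mocking.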
Posted 3 months ago
7 - 12 years
25 - 35 Lacs
Bengaluru
Work from Office
Job Summary:
The Community You Will Join: At Cloudwick, you'll be part of a dedicated team of professionals passionate about fostering a dynamic and inclusive workplace culture, with a specific focus on transforming healthcare through data.
A Typical Day:
- Design, develop, and maintain ETL processes and data pipelines with Scala/PySpark, ensuring seamless integration of healthcare data formats such as HL7 and FHIR.
- Collaborate with data scientists, analysts, and healthcare stakeholders to understand data requirements and model Electronic Health Records (EHR) and Electronic Case Reports (ECR) for high-quality, compliant data solutions.
- Optimize and tune data pipelines for performance and scalability, ensuring rapid access to critical healthcare information.
- Ensure data quality and integrity through robust testing and validation processes, adhering to healthcare regulations.
- Implement data governance and security best practices to protect sensitive patient information.
- Monitor and troubleshoot data pipelines to ensure continuous data flow, promptly addressing any issues to maintain operational efficiency.
- Stay up to date with the latest trends and technologies in data engineering, particularly in the healthcare sector.
What You Bring to the Table:
- B.E./B.Tech, preferably in Computer Science or Engineering.
- 5+ years of experience handling data and designing ETL pipelines, with a mandatory 4+ years of experience writing PySpark code, specifically in healthcare applications.
- Proven experience working on AWS using services like S3, Glue, Redshift, Lambda, IAM, and DynamoDB, focusing on healthcare data management.
- Demonstrated experience capturing data requirements from product owners and business users within the healthcare domain, particularly around EHR and ECR systems.
- Good to have: exposure to data modeling, data analytics, and design in both batch processing and real-time streaming, especially for healthcare applications.
- Solid understanding of data mapping, data processing patterns, distributed computing, and building applications for real-time and batch analytics in a healthcare context, including HL7 and FHIR standards.
- Strong programming skills in design and implementation using Python and PySpark.
- Good exposure to database architecture with Redshift, specifically for healthcare datasets.
- Experience with multiple file formats such as Avro, Parquet, ORC, and JSON, particularly in the context of healthcare data analytics.
- Developing, constructing, testing, and maintaining architectures for data lakes, data pipelines, data warehouses, and large-scale data processing systems on AWS, focusing on healthcare applications.
- Extensive experience using Spark, Scala, PySpark, Python, and SQL to handle complex healthcare data transformations.
- Hands-on experience using AWS services like S3, Glue, Lambda, Redshift, and IAM for healthcare solutions.
- Experience with client interactions in the healthcare domain, ensuring the delivery of data solutions that meet regulatory standards.
- Proficiency in writing complex SQL queries for healthcare data analysis.
- Preferred: experience in the healthcare domain, with knowledge of EHR and ECR, and familiarity with HL7 and FHIR standards.
- AWS certifications relevant to data engineering and healthcare solutions are a plus.
Location: Bangalore (complete work from office)
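As a toy illustration of the FHIR-adjacent validation work described above, the sketch below checks a FHIR-style resource for a couple of required fields. Real FHIR validation covers far more (profiles, cardinalities, terminology bindings); the fields checked here are a tiny, illustrative subset, not the full specification.

```python
# Simplified data-quality check on a FHIR-style resource, held as a
# plain dict. Illustrative only: real FHIR Patient resources have many
# more elements, and production validation uses a proper FHIR library.

REQUIRED = {"resourceType", "id"}  # tiny subset of real requirements

def validate_patient(resource):
    """Return a list of validation errors for a minimal Patient resource."""
    errors = [f"missing field: {f}" for f in sorted(REQUIRED - resource.keys())]
    if resource.get("resourceType") not in (None, "Patient"):
        errors.append("resourceType must be 'Patient'")
    return errors

if __name__ == "__main__":
    print(validate_patient({"resourceType": "Patient", "id": "p1"}))
    print(validate_patient({"id": "p1"}))
```

In a pipeline, rows failing such checks would typically be routed to a quarantine table rather than dropped silently, given healthcare audit requirements.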
Posted 3 months ago
4 - 9 years
6 - 16 Lacs
Bengaluru
Work from Office
Job Description
Responsibilities:
- Maintain ETL pipelines using SQL, Spark, AWS Glue, and Redshift.
- Optimize existing pipelines for performance and reliability.
- Troubleshoot and resolve UAT/PROD pipeline issues, ensuring minimal downtime.
- Implement data quality checks and monitoring to ensure data accuracy.
- Collaborate with other teams and stakeholders to understand data requirements.
Required Skills and Experience:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Extensive experience (e.g., 6+ years) in designing, developing, and maintaining ETL pipelines.
- Strong proficiency in SQL and experience with relational databases (e.g., Redshift, PostgreSQL).
- Hands-on experience with Apache Spark and distributed computing frameworks.
- Solid understanding of AWS Glue and other AWS data services.
- Experience with data warehousing concepts and best practices.
- Excellent problem-solving and troubleshooting skills.
- Strong communication and collaboration skills.
- Experience with version control systems (e.g., Git).
- Experience with workflow orchestration tools (e.g., Airflow).
Preferred Skills and Experience:
- Experience with other cloud platforms (e.g., Azure, GCP).
- Knowledge of data modeling and data architecture principles.
- Experience with data visualization tools (e.g., Tableau, Power BI).
- Familiarity with Agile development methodologies.
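The data-quality checks mentioned in the responsibilities above often start as simple threshold rules before graduating to a framework. A minimal sketch, where the column name and threshold are illustrative choices, not values from the posting:

```python
# Minimal data-quality threshold check: fail a pipeline run when the
# null rate in a required column exceeds a tolerance. Illustrative;
# production pipelines would log metrics and alert via monitoring.

def null_fraction(rows, column):
    """Fraction of rows where `column` is missing or None."""
    if not rows:
        return 0.0
    return sum(1 for r in rows if r.get(column) is None) / len(rows)

def passes_check(rows, column, max_fraction=0.1):
    """True if the null rate is within tolerance."""
    return null_fraction(rows, column) <= max_fraction

if __name__ == "__main__":
    batch = [{"user_id": 1}, {"user_id": None}, {"user_id": 3}]
    print(null_fraction(batch, "user_id"))
    print(passes_check(batch, "user_id", max_fraction=0.5))
```

Wiring such a check before the load step is what turns "minimal downtime" from a goal into a gate: bad batches are rejected instead of corrupting the warehouse.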
Posted 3 months ago
8 - 13 years
30 - 45 Lacs
Hyderabad
Work from Office
As a Data Engineer in the Data Infrastructure and Strategy group, you will play a key role in transforming the way Operations Finance teams access and analyse data. You will work to advance the 3-year Data Infrastructure Modernisation strategy and play a key role in adopting and expanding a unified data access platform and scalable governance and observability frameworks that follow modern data architecture and cloud-first designs. Your responsibilities will include supporting and migrating data analytics use cases of a targeted customer group and implementing new features within the central platform: component design, implementation using NAWS services that follow best engineering practices, user acceptance testing, launch, adoption, and post-launch support. You will work on system design and integrate new components into the established architecture. You will engage in cross-team collaboration by building reusable design patterns and components and adopting designs created by others. You will contribute to "buy vs. build" decisions by evaluating the latest product and feature releases for NAWS and internal products, performing gap analysis, and defining the feasibility of their adoption and the list of blockers. The ideal candidate possesses a track record of creating efficient AWS-based data solutions; data models for both relational databases and the Glue/Athena/EMR stack; and developing solution documentation, project plans, user guides, and other project documentation. We are looking for individual contributors inspired to become data systems architects. A track record of production-level deliverables leveraging GenAI is a big plus.
Key job responsibilities:
* Elevate and optimize existing solutions while driving strategic migration. Conduct thorough impact assessments to identify opportunities for transformative re-architecture or migration to central platforms.
Your insights will shape the technology roadmap, ensuring we make progress towards deprecation goals while providing the best customer service;
* Design, review, and implement data solutions that support WW Operations Finance standardisation and automation initiatives using AWS technologies and internally built tools, including Spark/EMR, Redshift, Athena, DynamoDB, Lambda, S3, Glue, Lake Formation, etc.;
* Support data solutions adoption by both finance and technical teams; identify and remove adoption blockers;
* Ensure speed of delivery and high quality: iteratively improve the development process and adopt mechanisms for optimisation of development and support;
* Contribute to engineering excellence by reviewing designs and code created by others;
* Contribute to delivery execution, planning, operational excellence, retrospectives, problem identification, and solution proposals;
* Collaborate with finance analysts, engineers, product and program managers, and external teams to influence and optimize the value of delivery in the data platform;
* Create technical and customer-facing documentation on the products within the platform.
A day in the life: You work with the Engineering, Product, BI, and Operations teams to elevate existing data platforms and implement best-in-class data solutions for the Operations Finance organization. You solve unstructured customer pain points with technical solutions and focus on users' productivity when working with the data. You participate in discussions with stakeholders to provide updates on project progress, gather feedback, and align on priorities. Utilizing AWS CDK and various AWS services, you design, execute, and deploy solutions. Your broader focus is on system architecture rather than individual pipelines. You regularly review your designs with a Principal Engineer and incorporate the insights gathered.
Conscious of your impact on customers and infrastructure, you establish efficient development and change management processes to guarantee the speed, quality, and scalability of delivered solutions.
About the team: Operations Finance Standardization and Automation improves customer experience and business outcomes across Amazon Operations Finance through innovative technical solutions, standardization and automation of processes, and the use of modern data analytics technologies.
Basic qualifications:
- MS or BS in Computer Science, Electrical Engineering, or similar fields;
- Strong AWS engineering background: 3+ years of demonstrated track record designing and operating data solutions in Native AWS. The right person will be highly technical and analytical, with the ability to drive technical execution towards organization goals;
- Exceptional triaging and bug-fixing skills; ability to assess risks and implement fixes without customer impact;
- Strong data modeling experience: 3+ years of data modeling practice is required. Expertise in designing both analytical and operational data models is a must. The candidate needs to demonstrate working knowledge of trade-offs in data model designs and platform-specific considerations, with concentration in Redshift, MySQL, EMR/Spark, and Athena;
- Excellent knowledge of modern data architecture concepts (data lakes, data lakehouses) as well as governance practices;
- Strong documentation skills and a proven ability to adapt a document to its audience. The ability to communicate information at levels ranging from executive summaries and strategy addendums to detailed design specifications is critical to success;
- Excellent communication skills, both written and oral; the ability to communicate technical complexity to a wide range of stakeholders.
Preferred qualifications:
- Data governance frameworks experience;
- Compliance frameworks experience, SOX preferred;
- Familiarity or production-level experience with AI-based AWS offerings (Bedrock) is a plus.
Posted 3 months ago
5 - 8 years
8 - 18 Lacs
Pune
Hybrid
Roles and Responsibilities:
1. 5+ years of work experience in Data Engineering (Python, SQL, Artificial Intelligence engineering).
2. Reporting tools: Power BI or Tableau.
3. Design and develop ETL/ELT pipelines using Python.
4. Work with cloud services (AWS or Azure): Redshift, S3, DMS, Lambda, CI/CD, DevOps.
5. Working experience with databases (e.g., MS SQL, Oracle, MySQL, Redshift, NoSQL).
6. Working knowledge of Airflow (monitoring and scheduling).
7. Write good-quality code that is reusable, testable, and efficient.
8. Work closely with Product Managers and industry subject matter experts to design data applications, models, and transformations.
9. Work in a fast-paced, test-driven environment.
10. Participate in code reviews.
11. Attend daily stand-ups and weekly and monthly team meetings.
12. Work in Agile to ensure project deliverables are on time and of high quality.
Skills / Desired Candidate Profile:
1. 5+ years of experience in Python programming and Artificial Intelligence engineering, with strong SQL expertise (stored procedures, complex queries, views, etc.).
2. Experience performing data extraction, cleaning, analysis, and presentation for medium to large datasets.
3. Good knowledge of databases.
4. Experience scripting data transformations via Python/Bash scripts.
5. Experience in designing, building, and maintaining data processing systems.
6. 5+ years of experience solving analytical problems.
7. Strong debugging and troubleshooting skills.
8. Experience with JIRA, GitHub, Bitbucket, etc.
9. Bachelor's degree in Engineering/Technology/MCA.
10. Ability to meet deadlines and work in a fast-paced environment.
11. Strong communication and interpersonal skills; able to collaborate well with both offshore and onsite teams.
Perks and Benefits: As per current salary and equivalent to market trend.
Preferred: Immediate joiner.
Posted 3 months ago
4 - 7 years
12 - 18 Lacs
Bengaluru, Mumbai (All Areas)
Hybrid
We're hiring a Data Engineer with 5-8 years of experience in AWS (S3, Redshift, Glue, EMR, Lambda), Apache Spark, and SQL. Build scalable ETL/ELT pipelines & enable business insights through data engineering. JD: https://tinyurl.com/dataengineerblr Perks and benefits Annual bonus Life insurance Performance bonus
Posted 3 months ago
8 - 10 years
15 - 20 Lacs
Bengaluru
Work from Office
1. 5+ years of experience with Amazon RDS Aurora (PostgreSQL/AWS). 2. Experience with other AWS database services (e.g., DynamoDB, Redshift). 3. Experience with database performance tuning and optimization. 4. Proficiency in SQL and experience with database scripting languages. Required Candidate profile: 1. Familiarity with database security best practices. 2. Experience with database backup, recovery, and disaster recovery planning. 3. Understanding of data warehousing and ETL processes. 4. AWS Certified.
Posted 3 months ago
6 - 10 years
20 - 25 Lacs
Bengaluru
Work from Office
AWS developer (Redshift, PostgreSQL, Aurora MySQL, SQS, MFTS). Must have experience working with web servers (e.g., Apache). Must have experience with Airflow on AWS and the Redshift database.
Posted 3 months ago
6 - 8 years
2 - 5 Lacs
Chennai, Pune
Work from Office
Job Title: Data Engineer
Job Summary: We are looking for a skilled Data Engineer to join our team and contribute to building scalable, high-performance data pipelines and infrastructure. The ideal candidate will have extensive experience with SQL, PySpark, AWS Glue, and Amazon Redshift. You will be responsible for designing, developing, and optimizing data workflows, enabling actionable insights from large datasets.
Key Responsibilities:
- Develop, test, and maintain robust data pipelines using PySpark and AWS Glue.
- Design and optimize data models for Amazon Redshift to ensure high-performance analytics.
- Write complex SQL queries for data extraction, transformation, and analysis.
- Implement ETL workflows to process large datasets across multiple systems.
- Collaborate with data scientists, analysts, and stakeholders to understand data requirements and deliver solutions.
- Monitor and maintain data pipeline performance, reliability, and scalability.
- Ensure data quality and integrity through validation and error-handling mechanisms.
- Utilize Infrastructure as Code (IaC) for deploying and managing data infrastructure.
- Document data workflows, processes, and systems for operational and troubleshooting purposes.
Required Skills & Qualifications:
- Proficiency in SQL for complex querying and performance optimization.
- Hands-on experience with PySpark for big data processing and transformations.
- Expertise in AWS Glue for ETL development and data catalog management.
- Strong knowledge of Amazon Redshift, including schema design, performance tuning, and workload management.
- Familiarity with data lake architecture and related AWS services (e.g., S3, Athena, Lambda).
- Understanding of distributed computing principles and big data frameworks.
- Experience with version control systems like Git.
- Strong problem-solving and analytical skills.
Preferred Skills:
- Familiarity with data pipeline orchestration tools (e.g., Apache Airflow, Step Functions).
- Experience with real-time data processing and streaming frameworks.
- Knowledge of AWS infrastructure and best practices for cost optimization.
- Exposure to machine learning workflows and tools.
- Familiarity with Agile methodologies and CI/CD pipelines.
If interested, please contact 8971971804 (Mariam).
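One pattern behind the Redshift loading work such roles describe is the staging-table upsert: since Redshift has historically handled updates as a DELETE-then-INSERT from a staging table inside one transaction, the logical rule is last-write-wins on the key. The same rule is sketched below in plain Python (field names are hypothetical), purely to show the merge semantics rather than the SQL.

```python
# Sketch of staging-table upsert semantics: rows in `staging` replace
# rows in `target` that share the same key, and new keys are appended.
# In Redshift this would be a transactional DELETE + INSERT; here the
# rule is shown on plain dicts for clarity.

def upsert(target, staging, key="id"):
    """Return target rows with staging rows applied, matched on `key`."""
    merged = {r[key]: r for r in target}
    merged.update({r[key]: r for r in staging})  # staging wins on conflict
    return sorted(merged.values(), key=lambda r: r[key])

if __name__ == "__main__":
    target = [{"id": 1, "v": "old"}]
    staging = [{"id": 1, "v": "new"}, {"id": 2, "v": "x"}]
    print(upsert(target, staging))
```

Getting this merge rule right (and idempotent, so reruns are safe) is most of the "reliability" work in the responsibilities list.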
Posted 3 months ago
2 - 7 years
8 - 18 Lacs
Bengaluru
Work from Office
DESCRIPTION
AWS Sales, Marketing, and Global Services (SMGS) is responsible for driving revenue, adoption, and growth from the largest and fastest-growing small- and mid-market accounts to enterprise-level customers, including the public sector. Amazon Web Services is the global market leader and technology forerunner in the Cloud business. As a member of the AWS Support team in Amazon Web Services, you will be at the forefront of this transformational technology, assisting a global list of companies and developers that are taking advantage of a growing set of services and features to run their mission-critical applications. As a Cloud Support Engineer, you will act as the Cloud Ambassador across all the cloud products, arming our customers with the required tools and tactics to get the most out of their Product and Support investment.
Would you like to use the latest cloud computing technologies? Do you have an interest in helping customers understand application architectures and integration approaches? Are you familiar with best practices for applications, servers, and networks? Do you want to be part of a customer-facing technology team in India helping to ensure the success of Amazon Web Services (AWS) as a leading technology organization? If you fit the description, you might be the person we are looking for! We are a team passionate about cloud computing, and we believe that world-class support is critical to customer success.
Key job responsibilities:
- Diagnose and resolve issues related to Kafka performance, connectivity, and configuration.
- Monitor Kafka clusters and perform regular health checks to ensure optimal performance.
- Collaborate with development teams to identify root causes of problems and implement effective solutions.
- Provide timely and effective support to customers via email, chat, and phone.
- Create and maintain documentation for troubleshooting procedures and best practices.
- Assist in the deployment and configuration of Kafka environments, including brokers, producers, and consumers.
- Conduct training sessions and provide knowledge transfer to team members.
- You will be continuously learning groundbreaking technologies and developing new technical skills and other professional competencies.
- You will act as an interviewer in hiring processes and coach/mentor new team members.
A day in the life:
- First and foremost, this is a customer support role in the Cloud.
- On a typical day, a Support Engineer will be primarily responsible for solving customers' cases through a variety of contact channels, including telephone, email, and web/live chat. You will apply advanced troubleshooting techniques to provide tailored solutions for our customers and drive customer interactions by thoughtfully working with customers to dive deep into the root cause of an issue.
- Apart from working on a broad spectrum of technical issues, an AWS Support Engineer may also coach/mentor new hires, develop and present training, partner with development teams on complex issues or contact-deflection initiatives, participate in hiring, write tools/scripts to help the team, or work with leadership on process improvement and strategic initiatives to ensure better CX and compliance with global AWS standards, practices, and policies.
- Career development: We promote advancement opportunities across the organization to help you meet your career goals.
- Training: We have training programs to help you develop the skills required to be successful in your role.
- We hire smart people who are keen to build a career with AWS, so we are more interested in the areas that you do know instead of those you haven't been exposed to yet.
- Support engineers interested in travel have presented training or participated in focused summits across our sites or at specific AWS events.
AWS Support is a 24/7/365 operation, and shift work will be required in the afternoon, i.e.,
1 PM to 10 PM IST.
About the team:
Diverse Experiences: AWS values diverse experiences. Even if you do not meet all of the preferred qualifications and skills listed in the job description, we encourage candidates to apply. If your career is just starting, hasn't followed a traditional path, or includes alternative experiences, don't let it stop you from applying.
Why AWS? Amazon Web Services (AWS) is the world's most comprehensive and broadly adopted cloud platform. We pioneered cloud computing and never stopped innovating; that's why customers from the most successful startups to Global 500 companies trust our robust suite of products and services to power their businesses.
Inclusive Team Culture: Here at AWS, it's in our nature to learn and be curious. Our employee-led affinity groups foster a culture of inclusion that empowers us to be proud of our differences. Ongoing events and learning experiences, including our Conversations on Race and Ethnicity (CORE) and AmazeCon (gender diversity) conferences, inspire us to never stop embracing our uniqueness.
Mentorship & Career Growth: We're continuously raising our performance bar as we strive to become Earth's Best Employer. That's why you'll find endless knowledge-sharing, mentorship, and other career-advancing resources here to help you develop into a better-rounded professional.
Work/Life Balance: We value work-life harmony. Achieving success at work should never come at the expense of sacrifices at home, which is why we strive for flexibility as part of our working culture. When we feel supported in the workplace and at home, there's nothing we can't achieve in the cloud.
BASIC QUALIFICATIONS
- Bachelor's degree OR equivalent experience in a technical position; requires a minimum of 2+ years of experience in a relevant technical position.
- Exposure to database fundamentals and general troubleshooting (tuning and optimization, deadlocks, keys, normalization) in any relational database engine (MySQL, PostgreSQL, Oracle, SQL Server); OR exposure to search services fundamentals and troubleshooting (indices, JVM memory analysis, and CPU utilization) for key open-source products like Elasticsearch and Solr; OR exposure to streaming services like Kafka/Kinesis.
- Experience with business analytics applications, support, and troubleshooting concepts; experience with system administration and troubleshooting on Linux (Ubuntu, CentOS, RedHat) and/or Microsoft Windows Server and associated technologies (Active Directory); experience with networking and troubleshooting (TCP/IP, DNS, OSI model, routing, switching, firewalls, LAN/WAN, traceroute, iperf, dig, URL, or related).
PREFERRED QUALIFICATIONS
- Experience in a customer support environment and experience analyzing, troubleshooting, and providing solutions to technical issues.
- Knowledge of data warehousing and ETL processes; understanding of cloud computing concepts; experience scripting or developing in at least one of the following languages: Python, R, Ruby, Go, Java, .NET (C#), JavaScript.
- Expertise in any one data warehouse technology (e.g., Redshift, Teradata, Exadata, or Snowflake); OR expertise in search services products like Elasticsearch/Solr; expertise in streaming services like Kafka/Kinesis.
Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information.
If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
Posted 3 months ago
2 - 7 years
8 - 18 Lacs
Bengaluru
Work from Office
DESCRIPTION
AWS Sales, Marketing, and Global Services (SMGS) is responsible for driving revenue, adoption, and growth from the largest and fastest-growing small- and mid-market accounts to enterprise-level customers, including the public sector. Amazon Web Services is the global market leader and technology forerunner in the cloud business. As a member of the AWS Support team, you will be at the forefront of this transformational technology, assisting a global list of companies and developers that are taking advantage of a growing set of services and features to run their mission-critical applications. As a Cloud Support Engineer, you will act as the ‘Cloud Ambassador’ across all the cloud products, arming our customers with the tools and tactics required to get the most out of their product and support investment.

Would you like to use the latest cloud computing technologies? Do you have an interest in helping customers understand application architectures and integration approaches? Are you familiar with best practices for applications, servers, and networks? Do you want to be part of a customer-facing technology team in India helping to ensure the success of Amazon Web Services (AWS) as a leading technology organization? If you fit the description, you might be the person we are looking for! We are a team passionate about cloud computing, and we believe that world-class support is critical to customer success.

Key job responsibilities
- Diagnose and resolve issues related to Kafka performance, connectivity, and configuration.
- Monitor Kafka clusters and perform regular health checks to ensure optimal performance.
- Collaborate with development teams to identify root causes of problems and implement effective solutions.
- Provide timely and effective support to customers via email, chat, and phone.
- Create and maintain documentation for troubleshooting procedures and best practices.
- Assist in the deployment and configuration of Kafka environments, including brokers, producers, and consumers.
- Conduct training sessions and provide knowledge transfer to team members.

You will be continuously learning groundbreaking technologies and developing new technical skills and other professional competencies. You will act as an interviewer in hiring processes and coach/mentor new team members.

A day in the life
- First and foremost, this is a customer support role – in the cloud.
- On a typical day, a Support Engineer will be primarily responsible for solving customers’ cases through a variety of contact channels, which include telephone, email, and web/live chat. You will apply advanced troubleshooting techniques to provide tailored solutions for our customers and drive customer interactions by thoughtfully working with customers to dive deep into the root cause of an issue.
- Apart from working on a broad spectrum of technical issues, an AWS Support Engineer may also coach/mentor new hires, develop and present training, partner with development teams on complex issues or contact-deflection initiatives, participate in hiring, write tools/scripts to help the team, or work with leadership on process improvement and strategic initiatives to ensure better CX and compliance with global AWS standards, practices, and policies.
- Career development: We promote advancement opportunities across the organization to help you meet your career goals.
- Training: We have training programs to help you develop the skills required to be successful in your role.
- We hire smart people who are keen to build a career with AWS, so we are more interested in the areas you do know than in those you haven’t been exposed to yet.
- Support engineers interested in travel have presented training or participated in focused summits across our sites or at specific AWS events.
BASIC QUALIFICATIONS
- Bachelor’s degree OR equivalent experience in a technical position; requires a minimum of 2+ years’ experience in a relevant technical position
- Exposure to database fundamentals and general troubleshooting (tuning and optimization, deadlocks, keys, normalization) in any relational database engine (MySQL, PostgreSQL, Oracle, SQL Server); OR exposure to search services fundamentals and troubleshooting (indices, JVM memory analysis, and CPU utilization) for key open-source products like Elasticsearch and Solr; OR exposure to streaming services like Kafka/Kinesis
- Experience with business analytics application support and troubleshooting concepts; experience with system administration and troubleshooting on Linux (Ubuntu, CentOS, RedHat) and/or Microsoft Windows Server and associated technologies (Active Directory); experience with networking and troubleshooting (TCP/IP, DNS, OSI model, routing, switching, firewalls, LAN/WAN, traceroute, iperf, dig, cURL, or related)

PREFERRED QUALIFICATIONS
- Experience in a customer support environment, and experience in analyzing, troubleshooting, and providing solutions to technical issues
- Knowledge of data warehousing and ETL processes; understanding of cloud computing concepts; experience in scripting or developing in at least one of the following languages: Python, R, Ruby, Go, Java, .NET (C#), JavaScript
- Expertise in any one data warehouse technology (e.g., Redshift, Teradata, Exadata, or Snowflake); OR expertise in search services products like Elasticsearch/Solr; OR expertise in streaming services like Kafka/Kinesis
Posted 3 months ago
3 - 6 years
8 - 18 Lacs
Hyderabad
Work from Office
DESCRIPTION
Amazon has built a reputation for excellence, and Amazon Web Services India Private Limited (AWS India), the local reseller of AWS, is carrying on that tradition while leading the world in cloud technologies. AWS Sales, Marketing, and Global Services (SMGS) is responsible for driving revenue, adoption, and growth from the largest and fastest-growing small- and mid-market accounts to enterprise-level customers, including the public sector. The AWS Global Support team interacts with leading companies and believes that world-class support is critical to customer success. AWS Support also partners with a global list of customers that are building mission-critical applications on top of AWS services.

AWS provides developers and small to large businesses access to horizontally scalable, state-of-the-art cloud infrastructure like S3, EC2, AMI, CloudFront, and SimpleDB, which powers Amazon.com. Developers can build any type of business on the AWS platform and scale their applications with growing business needs. We want you to help share and shape our mission to be Earth’s most customer-centric company. Our evolution from Web site to e-commerce partner to development platform is driven by the spirit of invention that is part of our DNA. We do this every day by inventing elegant and simple solutions to complex technical and business problems. We’re making history, and the good news is that we’ve only just begun.

This role requires the flexibility to work 5 days a week (occasionally on weekends) on a rotational basis. AWS Support is a 24x7x365 operation, and work timings for this role are in India night time, i.e. 7 PM to 4 AM IST or 11 PM to 8 AM IST. You are expected to work night shift hours based on business requirements.

Key job responsibilities
First and foremost, this is a customer support role in the cloud.
On a typical day, a Support Engineer will be primarily responsible for solving customers’ cases through a variety of contact channels, which include telephone, email, and web/live chat. You will apply advanced troubleshooting techniques to provide tailored solutions for our customers and drive customer interactions by thoughtfully working with customers to dive deep into the root cause of an issue. Apart from working on a broad spectrum of technical issues, an AWS Support Engineer may also coach/mentor new hires, develop and present training, partner with development teams on complex issues or contact-deflection initiatives, participate in hiring, write tools/scripts to help the team, or work with leadership on process improvement and strategic initiatives to ensure better CX and compliance with global AWS standards, practices, and policies.

Career development: We promote advancement opportunities across the organization to help you meet your career goals.
Training: We have training programs to help you develop the skills required to be successful in your role.
Support engineers interested in travel have presented training or participated in focused summits across our sites or at specific AWS events.

A day in the life
Every day will bring new and exciting challenges on the job while you:
- Learn and use new technologies.
- Apply advanced troubleshooting techniques to provide unique solutions to our customers’ individual needs.
- Interact with leading technologists around the world and resolve customer issues.
- Partner with Amazon teams in India to help reproduce and resolve customer issues.
- Leverage your extensive customer support experience to provide feedback to internal Amazon teams in India on how to improve our services.
- Drive customer communication during critical events.
- Drive projects that improve support-related processes and our customers’ technical support experience.
- Write tutorials, how-to videos, and other technical articles for the developer community.
- Work on critical, highly complex customer problems that may span multiple AWS services.
BASIC QUALIFICATIONS
- Bachelor’s degree OR equivalent experience in a technical position; requires a minimum of 2+ years’ experience in a relevant technical position
- Exposure to database fundamentals and general troubleshooting (tuning and optimization, deadlocks, keys, normalization) in any relational database engine (MySQL, PostgreSQL, Oracle, SQL Server); OR exposure to search services fundamentals and troubleshooting (indices, JVM memory analysis, and CPU utilization) for key open-source products like Elasticsearch and Solr; OR exposure to streaming services like Kafka/Kinesis
- Experience with business analytics application support and troubleshooting concepts; experience with system administration and troubleshooting on Linux (Ubuntu, CentOS, RedHat) and/or Microsoft Windows Server and associated technologies (Active Directory); experience with networking and troubleshooting (TCP/IP, DNS, OSI model, routing, switching, firewalls, LAN/WAN, traceroute, iperf, dig, cURL, or related)

PREFERRED QUALIFICATIONS
- Experience in a customer support environment, and experience in analyzing, troubleshooting, and providing solutions to technical issues
- Knowledge of data warehousing and ETL processes; understanding of cloud computing concepts; experience in scripting or developing in at least one of the following languages: Python, R, Ruby, Go, Java, .NET (C#), JavaScript
- Expertise in any one data warehouse technology (e.g., Redshift, Teradata, Exadata, or Snowflake); OR expertise in search services products like Elasticsearch/Solr; OR expertise in streaming services like Kafka/Kinesis
- Readiness to work India night shifts
Posted 3 months ago