5.0 - 8.0 years
7 - 10 Lacs
Bengaluru
Remote
Skillset: PostgreSQL, Amazon Redshift, MongoDB, Apache Cassandra, AWS, ETL, Shell Scripting, Automation, Microsoft Azure

We are looking for futuristic, motivated go-getters with the following skills for an exciting role.

Job Description:
- Monitor and maintain the performance, reliability, and availability of multiple database systems.
- Optimize complex SQL queries, stored procedures, and ETL scripts for better performance and scalability.
- Troubleshoot and resolve issues related to database performance, integrity, backups, and replication.
- Design, implement, and manage scalable data pipelines across structured and unstructured sources.
- Develop automation scripts for routine maintenance tasks using Python, Bash, or similar tools.
- Perform regular database health checks, set up alerting mechanisms, and respond to incidents proactively.
- Analyze performance bottlenecks and resolve slow-query issues and deadlocks.
- Work in DevOps/Agile environments, integrating with CI/CD pipelines for database operations.
- Collaborate with engineering, analytics, and infrastructure teams to integrate database solutions with applications and BI tools.
- Research and implement emerging technologies and best practices in database administration.
- Participate in capacity planning, security audits, and software upgrades for data infrastructure.
- Maintain comprehensive documentation related to database schemas, metadata, standards, and procedures.
- Ensure compliance with data privacy regulations and implement robust disaster recovery and backup strategies.

Desired skills:
- Database Systems: Hands-on experience with SQL-based databases (PostgreSQL, MySQL), Amazon Redshift, MongoDB, and Apache Cassandra.
- Scripting & Automation: Proficiency in scripting using Python, Shell, or similar to automate database operations.
- Cloud Platforms: Working knowledge of AWS (RDS, Redshift, EC2, S3, IAM, Lambda) and Azure SQL/Azure Cosmos DB.
- Big Data & Distributed Systems: Familiarity with Apache Spark for distributed data processing.
- Performance Tuning: Deep experience in performance analysis, indexing strategies, and query optimization.
- Security & Compliance: Experience with database encryption, auditing, access control, and GDPR/PII policies.
- Familiarity with Linux and Windows server administration is a plus.

Education & Experience: BE, B.Tech, MCA, or M.Tech from Tier 2/3 colleges, and Science Graduates; 5-8 years of work experience.
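Slow-query monitoring of the kind described above is often automated with a short script. The sketch below works on rows shaped like PostgreSQL's pg_stat_statements view; the field names, threshold, and sample data are illustrative, and in practice the rows would come from a live database connection (e.g. via psycopg2):

```python
# Sketch: flag slow queries from pg_stat_statements-style rows.
# Thresholds and sample data are illustrative; real rows would be
# fetched from PostgreSQL over a live connection.

def flag_slow_queries(rows, mean_ms_threshold=500.0):
    """Return (query, mean_ms) pairs whose mean execution time exceeds
    the threshold, sorted worst-first."""
    slow = [(r["query"], r["mean_exec_time"]) for r in rows
            if r["mean_exec_time"] > mean_ms_threshold]
    return sorted(slow, key=lambda p: p[1], reverse=True)

stats = [
    {"query": "SELECT * FROM orders WHERE ...", "mean_exec_time": 1800.0},
    {"query": "SELECT id FROM users WHERE ...", "mean_exec_time": 12.0},
    {"query": "UPDATE stock SET ...",           "mean_exec_time": 650.0},
]
worst = flag_slow_queries(stats)
print(worst)  # the 1800 ms and 650 ms queries, worst first
```

A cron job running a script like this and pushing results to an alerting channel is one simple way to cover the "health checks and alerting mechanisms" responsibility.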
Posted 3 weeks ago
3.0 - 7.0 years
0 Lacs
Karnataka
On-site
Job Title: Azure Presales Engineer

About the Role: As a Cloud Presales Engineer specializing in Azure, you will play a critical role in our sales process by working closely with sales and technical teams to provide expert guidance and solutions for our clients. Leveraging your in-depth knowledge of Azure services, you will understand customer needs, design tailored cloud solutions, and drive the adoption of our cloud offerings. This position requires strong technical acumen, excellent communication skills, and a passion for cloud technologies.

Key Responsibilities

Solution Design and Architecture: Understand customer requirements and design effective cloud solutions using Azure services. Create architecture diagrams and detailed proposals tailored to customer needs. Collaborate with sales teams to define the scope of technical solutions and present them to customers.

Technical Expertise and Consultation: Act as a subject matter expert on AWS and Azure services, including EC2, S3, Lambda, RDS, VPC, IAM, CloudFormation, Azure Virtual Machines, Blob Storage, Functions, SQL Database, Virtual Network, Azure Active Directory, and ARM Templates. Provide technical support during the sales process, including product demonstrations, proofs of concept (POCs), and answering customer queries. Advise customers on best practices for cloud adoption, migration, and optimization.

Customer Engagement: Build and maintain strong relationships with customers, understanding their business challenges and technical needs. Conduct customer workshops, webinars, and training sessions to educate customers on Azure solutions and services. Gather customer feedback and insights to help shape product and service offerings.

Sales Support: Partner with sales teams to develop sales strategies and drive cloud adoption. Prepare and deliver compelling presentations, demonstrations, and product pitches to customers. Assist in the preparation of RFPs, RFQs, and other customer documentation.
Continuous Learning and Development: Stay up to date with the latest AWS and Azure services, technologies, and industry trends. Achieve and maintain relevant AWS and Azure certifications to demonstrate expertise. Share knowledge and best practices with internal teams to enhance overall capabilities.

Required Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field. Experience in a presales or technical consulting role, with a focus on cloud solutions. In-depth knowledge of AWS and Azure services, with hands-on experience in designing and implementing cloud-based architectures. Azure certifications (e.g., Microsoft Certified: Azure Solutions Architect Expert) are highly preferred. Strong understanding of cloud computing concepts, including IaaS, PaaS, SaaS, and hybrid cloud models. Excellent presentation, communication, and interpersonal skills. Ability to work independently and collaboratively in a fast-paced, dynamic environment.

Preferred Qualifications: Experience with other cloud platforms (e.g., Google Cloud) is a plus. Familiarity with DevOps practices, CI/CD pipelines, and infrastructure as code (IaC) using Terraform, CloudFormation, and ARM Templates. Experience with cloud security, compliance, and governance best practices. Background in software development, scripting, or system administration.

Join us to be part of an innovative team, shaping cloud solutions and driving digital transformation for our clients! (ref:hirist.tech)
Posted 3 weeks ago
3.0 - 7.0 years
0 Lacs
Kochi, Kerala
On-site
You will be responsible for planning, implementing, and growing the AWS cloud infrastructure. Your role will involve building, releasing, and managing the configuration of all production systems. It will be essential to manage a continuous integration and deployment methodology for server-based technologies. Collaboration with architecture and engineering teams to design and implement scalable software services will also be part of your responsibilities. Ensuring system security through the utilization of best-in-class cloud security solutions will be crucial. Staying up to date with new technology options and vendor products is important, and you will be expected to evaluate which ones would be suitable for the company. Implementing continuous integration/continuous delivery (CI/CD) pipelines when needed will also fall under your purview. You will have the opportunity to recommend process and architecture improvements, troubleshoot the system, and resolve problems across all platform and application domains. Overseeing pre-production acceptance testing to maintain the high quality of the company's services and products will be part of your duties. Experience with Terraform, Ansible, Git, and CloudFormation will be beneficial for this role. Additionally, a solid background in Linux/Unix and Windows server system administration is required. Configuring AWS CloudWatch for monitoring, creating and modifying scripts, and hands-on experience with MySQL are also essential skills. You should have experience in designing and building web environments on AWS, including working with services like EC2, ELB, RDS, and S3. This is a full-time position with benefits such as Provident Fund and a yearly bonus. The work schedule is during the day shift, and the preferred experience level for AWS is 3 years. The work location is in person.
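As one concrete example of the CloudWatch configuration work mentioned above, an alarm definition can be built as a plain parameter dict; the instance id and threshold here are illustrative, and in a real environment the dict would be passed to boto3's `put_metric_alarm`:

```python
# Sketch: build parameters for a CloudWatch CPU-utilization alarm.
# In practice you would call:
#   boto3.client("cloudwatch").put_metric_alarm(**params)
# The instance id and threshold below are illustrative.

def cpu_alarm_params(instance_id, threshold_pct=80.0, periods=2):
    return {
        "AlarmName": f"high-cpu-{instance_id}",
        "Namespace": "AWS/EC2",
        "MetricName": "CPUUtilization",
        "Statistic": "Average",
        "Period": 300,                 # 5-minute datapoints
        "EvaluationPeriods": periods,  # require a sustained breach
        "Threshold": threshold_pct,
        "ComparisonOperator": "GreaterThanThreshold",
        "Dimensions": [{"Name": "InstanceId", "Value": instance_id}],
    }

params = cpu_alarm_params("i-0123456789abcdef0")
print(params["AlarmName"])
```

Keeping alarm definitions in code like this (rather than clicking them together in the console) also makes them easy to version-control and apply across environments.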
Posted 3 weeks ago
4.0 - 8.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
This is a full-time on-site role for a PHP Laravel Developer based in Chennai. In this position, you will play a key role in developing and maintaining web applications utilizing the Laravel framework. Your responsibilities will include coding, debugging, testing, and deploying new features. Additionally, you will collaborate with cross-functional teams to create efficient and scalable solutions. To excel in this role, you must possess a strong proficiency in PHP and have hands-on experience with the Laravel framework. Familiarity with frontend technologies like HTML, CSS, and JavaScript is essential. Moreover, knowledge of database management systems, particularly MySQL, is required. Understanding RESTful APIs, integrating third-party services, and using version control systems like Git are also important aspects of this position. Candidates should have practical experience in schema design, query optimization, REST API, and AWS services such as EC2, S3, RDS, Lambda, and Redis. Proficiency in designing scalable and secure web applications, expertise in automated testing frameworks, and a solid grasp of web security practices are crucial for success in this role. The ideal candidate will be able to prioritize tasks effectively and work both independently and collaboratively as part of a team. Strong problem-solving and troubleshooting skills are essential, as is clear communication and the ability to work with others. A Bachelor's degree in computer science or a related field, or equivalent experience, is required. 
Requirements:
- Strong proficiency in PHP with Laravel framework
- Experience in HTML, CSS, and JavaScript
- Knowledge of MySQL and RESTful APIs
- Familiarity with Git and version control systems
- Hands-on experience with schema design, query optimization, and REST API
- Profound knowledge of AWS services
- Demonstrated experience in designing scalable and secure web applications
- Expertise in automated testing frameworks
- Strong understanding of web security practices
- Ability to prioritize tasks and work independently or as part of a team
- Excellent problem-solving and troubleshooting skills
- Good communication and collaboration skills
- Bachelor's degree or equivalent experience in computer science or related field

Experience: 4+ Years
Location: Chennai/Madurai
Interested candidates can share CV at anushya.a@extendotech.com / 6374472538
Job Type: Full-time
Benefits: Health insurance, Provident Fund
Location Type: In-person
Schedule: Morning shift
Work Location: In person
Posted 3 weeks ago
10.0 - 17.0 years
0 Lacs
Hyderabad, Telangana
On-site
We have an exciting opportunity for an ETL Data Architect position with an AI/ML-driven SaaS Solution Product Company in Hyderabad. As an ETL Data Architect, you will play a crucial role in designing and implementing a robust Data Access Layer to provide consistent data access over the underlying heterogeneous storage layer. You will also be responsible for developing and enforcing data governance policies to ensure data security, quality, and compliance across all systems.

In this role, you will lead the architecture and design of data solutions that leverage the latest tech stack and AWS cloud services. Collaboration with product managers, tech leads, and cross-functional teams will be essential to align data strategy with business objectives. Additionally, you will oversee data performance optimization, scalability, and reliability of data systems while guiding and mentoring team members on data architecture, design, and problem-solving.

The ideal candidate should have at least 10 years of experience in data-related roles, with a minimum of 5 years in a senior leadership position overseeing data architecture and infrastructure. A deep background in designing and implementing enterprise-level data infrastructure, preferably in a SaaS environment, is required. Extensive knowledge of data architecture principles, data governance frameworks, security protocols, and performance optimization techniques is essential.

Hands-on experience with AWS services such as RDS, Redshift, S3, Glue, and DocumentDB, as well as other services like MongoDB, Snowflake, etc., is highly desirable. Familiarity with big data technologies (e.g., Hadoop, Spark) and modern data warehousing solutions is a plus. Proficiency in at least one programming language (e.g., Node.js, Java, Golang, Python) is a must. Excellent communication skills are crucial in this role, with the ability to translate complex technical concepts to non-technical stakeholders.
Proven leadership experience, including team management and cross-functional collaboration, is also required. A Bachelor's degree in Computer Science, Information Systems, or a related field is necessary, with a Master's degree being preferred. Preferred qualifications include experience with Generative AI and Large Language Models (LLMs) and their applications in data solutions, as well as familiarity with financial back-office operations and the FinTech domain. Stay updated on emerging trends in data technology, particularly in AI/ML applications for finance.

Industry: IT Services and IT Consulting
Posted 3 weeks ago
7.0 - 12.0 years
10 - 15 Lacs
Bengaluru
Hybrid
Hiring an AWS Data Engineer for a 6-month hybrid contractual role based in Bellandur, Bengaluru. The ideal candidate will have 7+ years of experience in data engineering, with strong expertise in AWS services (S3, EC2, RDS, Lambda, EKS), PostgreSQL, Redis, Apache Iceberg, and Graph/Vector Databases. Proficiency in Python or Golang is essential. Responsibilities include designing and optimizing data pipelines on AWS, managing structured and in-memory data, implementing advanced analytics with vector/graph databases, and collaborating with cross-functional teams. Prior experience with CI/CD and containerization (Docker/Kubernetes) is a plus.
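One common building block for the in-memory data management mentioned above is the cache-aside pattern. As a rough sketch (a plain dict stands in for a real Redis client, and all names are illustrative; with redis-py the get/set calls would be roughly equivalent):

```python
# Cache-aside sketch: check the cache first, fall back to the source of
# truth on a miss, then populate the cache for subsequent reads.
# A dict stands in for a Redis client here.

def get_with_cache(key, cache, load_from_db, misses):
    value = cache.get(key)
    if value is None:           # cache miss: hit the database
        misses.append(key)
        value = load_from_db(key)
        cache[key] = value      # populate for the next reader
    return value

db = {"user:1": "Alice"}        # stand-in for PostgreSQL
cache, misses = {}, []
first = get_with_cache("user:1", cache, db.__getitem__, misses)
second = get_with_cache("user:1", cache, db.__getitem__, misses)
print(first, second, misses)    # only the first read misses
```

In a real deployment a TTL on the cached entry (Redis `SETEX`) keeps stale data bounded; that detail is omitted here.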
Posted 3 weeks ago
3.0 - 8.0 years
2 - 7 Lacs
Nagpur
Work from Office
Role & responsibilities
- Provide advanced L2 support for server, virtualization, and desktop infrastructure.
- Design, create, and optimize Group Policies and global IT policies across multi-domain environments.
- Hands-on knowledge and experience working with Windows OS (client OS and server OS).
- Hands-on knowledge and experience with Active Directory, Azure administration, and O365 administration.
- Manage and support Windows Servers, including installation, configuration, and maintenance.
- Work extensively with virtualization platforms such as Hyper-V and VMware.
- Configure and manage VDI solutions, especially using VMware Horizon.
- Set up and maintain Remote Desktop Services, including complex multi-user environments.
- Perform image refresh, deployment, and configuration of Thin Clients.
- Manage backup and disaster recovery solutions using Veeam.
- Collaborate with internal and external teams to support IT infrastructure for multiple clients.
- Maintain documentation for configurations, procedures, and changes.

Required Skills:
- Proven experience in L2 IT infrastructure support
- Hands-on expertise in MSP environments and multi-client infrastructure management
- Strong understanding of Group Policy, Active Directory, DNS, and DHCP
- Proficient in VMware, Hyper-V, Horizon View, and overall virtualization technologies
- Experience with Veeam Backup & Replication
- In-depth knowledge of Remote Desktop Services and Thin Client configuration & image deployment
- Excellent communication, problem-solving, and documentation skills
Posted 3 weeks ago
6.0 - 11.0 years
12 - 22 Lacs
Bengaluru
Work from Office
Our engineering team is looking for a Data Engineer who is very proficient in Python, has a very good understanding of AWS cloud computing and ETL pipelines, and demonstrates proficiency with SQL and relational database concepts. In this role you will be a mid-to-senior-level individual contributor guiding our migration efforts by serving as a senior data engineer, working closely with the data architects to evaluate best-fit solutions and processes for our team. You will work with the rest of the team as we move away from legacy tech and introduce new tools and ETL pipeline solutions. You will collaborate with subject matter experts, data architects, informaticists, and data scientists to evolve our current cloud-based ETL to the next generation.

Responsibilities
- Independently prototypes/develops data solutions of high complexity to meet the needs of the organization and business customers.
- Designs proof-of-concept solutions utilizing an advanced understanding of multiple coding languages to meet technical and business requirements, with an ability to perform iterative solution testing to ensure specifications are met.
- Designs and develops data solutions that enable effective self-service data consumption, and can describe their value to the customer.
- Collaborates with stakeholders in defining metrics that are impactful to the business. Prioritizes efforts based on customer value.
- Has an in-depth understanding of Agile techniques. Can set expectations for deliverables of high complexity. Can assist in the creation of roadmaps for data solutions. Can turn vague ideas or problems into data product solutions. Influences strategic thinking across the team and the broader organization.
- Maintains proof-of-concepts and prototype data solutions, and manages any assessment of their viability and scalability, with own team or in partnership with IT.
- Working with IT, assists in building robust systems focusing on long-term and ongoing maintenance and support.
- Ensures data solutions include deliverables required to achieve high-quality data.
- Displays a strong understanding of complex multi-tier, multi-platform systems, and applies principles of metadata, lineage, business definitions, compliance, and data security to project work.
- Has an in-depth understanding of Business Intelligence tools, including visualization and user-experience techniques. Can set expectations for deliverables of high complexity.
- Works with IT to help scale prototypes. Demonstrates a comprehensive understanding of new technologies as needed to progress initiatives.

Requirements
- Expertise in Python programming, with demonstrated real-world experience building out data tools in a Python environment.
- Expertise in AWS services, with demonstrated real-world experience building out data tools in a Python environment.
- Bachelor's degree in Computer Science, Computer Engineering, or a related discipline preferred. Master's in the same or related disciplines strongly preferred.
- 3+ years of experience coding for data management, data warehousing, or other data environments, including, but not limited to, Python and Spark. Experience with SAS is preferred.
- 3+ years of experience as a developer working in an AWS cloud computing environment.
- 3+ years of experience using Git or Bitbucket.
- Experience with Redshift, RDS, and DynamoDB is preferred.
- Strong written and oral communication skills are required.
- Experience in the healthcare industry with healthcare data analytics products.
- Experience with healthcare vocabulary and data standards (OMOP, FHIR) is a plus.
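A toy version of the ETL-pipeline work this role centres on, using the stdlib sqlite3 module as a stand-in for a relational warehouse such as Redshift or RDS (table, columns, and sample data are illustrative):

```python
# Minimal extract-transform-load sketch. sqlite3 stands in for a real
# relational warehouse; the raw rows stand in for an extracted source file.
import sqlite3

raw = [("2024-01-01", "a@x.com ", "42.50"),
       ("2024-01-02", " B@X.COM", "17.25")]

def transform(rows):
    # Normalise e-mail case/whitespace and parse amounts to float.
    return [(day, email.strip().lower(), float(amount))
            for day, email, amount in rows]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (day TEXT, email TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", transform(raw))
total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
print(total)  # 59.75
```

The same extract/transform/load shape scales up directly: swap the in-memory rows for an S3 read, the transform for Spark, and sqlite3 for a Redshift or RDS connection.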
Posted 3 weeks ago
5.0 - 8.0 years
7 - 14 Lacs
Bengaluru
Work from Office
Key Skills: Core Java, Spring/Spring Boot, AWS Services (EC2, S3, Lambda, RDS, API Gateway, CloudWatch), RESTful APIs, Microservices Architecture, Docker, Git/Version Control, CI/CD Tools, SQL/RDBMS, Agile Methodologies, Unit Testing (JUnit, Mockito), Maven/Gradle, API Documentation.

Roles and Responsibilities: Design, develop, and maintain scalable Java-based applications. Implement and manage services on AWS cloud infrastructure. Collaborate with cross-functional teams to gather requirements and deliver solutions. Develop RESTful APIs and integrate them with front-end components. Ensure high performance, reliability, and scalability of the application. Monitor, troubleshoot, and optimize application performance. Write unit and integration tests to ensure code quality. Participate in code reviews and follow best practices in software development.

Experience Requirement: Strong proficiency in Java development. 5-8 years of experience with AWS services such as EC2, S3, Lambda, RDS, and API Gateway. Familiarity with microservices architecture and containerization tools like Docker. Experience with version control systems such as Git. Good understanding of CI/CD pipelines and DevOps practices. Strong analytical and problem-solving skills. Minimum years of relevant experience.

Education: B.E., B.Tech, B.Sc.
Posted 3 weeks ago
8.0 - 12.0 years
22 - 35 Lacs
Bengaluru
Hybrid
Role & responsibilities: As a Senior Data Engineer and database specialist, you will be designing, creating, and managing the cloud databases and data pipelines that underpin our decoupled cloud architecture and API-first approach. You have proven expertise in database design, data ingestion, transformation, data writing, scheduling, and query management within a cloud environment. You will have proven experience and expertise in working with AWS Cloud Infrastructure Engineers, Software/API Developers, and Architects to design, develop, deploy, and operate data services and solutions that underpin a cloud ecosystem. You will take ownership and accountability of functional and non-functional design and work within a team of engineers to create innovative solutions that unlock value and modernise technology designs. You will role-model a continuous-improvement mindset in the team, and in your project interactions, by taking technical ownership of key assets, including roadmaps and the technical direction of data services running on our AWS environments.

See yourself in our team: The Business Banking Technology Domain works in an Agile methodology with our business banking business to plan, prioritise, and deliver on high-value technology objectives with key results that meet our regulatory obligations and protect the community. You will work within the VRM Crew, which is working on initiatives such as a Gen AI-based cash flow coach, to provide relevant data to our regulators. To achieve our objectives, you will use your deep understanding of data modelling and data quality and your extensive experience with SQL to access relational databases such as Oracle and Postgres to identify, transform, and validate data required for complex business reporting requirements. You will use your experience in designing and building reliable and efficient data pipelines, preferably using modern cloud services on AWS such as S3, Lambda, Redshift, Glue, etc., to process large volumes of data efficiently.
Experience with data-centric frameworks such as Spark, with programming knowledge in Scala or Python, is highly advantageous, as is experience working on Linux with shell and automation frameworks to manage code and infrastructure in a well-structured and reliable manner. Experience with Pega workflow software as a source or target for data integration is also highly regarded.

We're interested in hearing from people who:
• Can design and implement databases for data integration in the enterprise
• Can performance-tune applications from a database code and design perspective
• Can automate data ingestion and transformation processes using scheduling tools, and monitor and troubleshoot data pipelines to ensure reliability and performance
• Have experience working through performance and scaling via horizontal scaling designs versus database tuning
• Can design application logical database requirements and implement physical solutions
• Can collaborate with business and technical teams in order to design and build critical databases and data pipelines
• Can advise business owners on strategic database direction and application solution design

Tech skills: We use a broad range of tools, languages, and frameworks. We don't expect you to know them all, but having significant experience and exposure with some of these (or equivalents) will set you up for success in this team.
• AWS data products such as AWS Glue and AWS EMR
• Oracle and AWS Aurora RDS, such as PostgreSQL
• AWS S3 ingestion, transformation, and writing to databases
• Proficiency in programming languages like Python, Scala, or Java for developing data ingestion and transformation scripts
• Strong knowledge of SQL for writing, optimizing, and debugging queries
• Familiarity with database design, indexing, and normalization principles
• Understanding of data formats (JSON, CSV, XML) and techniques for converting between them
• Ability to handle data validation, cleaning, and transformation
• Proficiency in automation tools and scripting (e.g., bash scripting, cron jobs) for scheduling and monitoring data processes
• Experience with version control systems (e.g., Git) for managing code and collaboration

Working with us: Whether you're passionate about customer service, driven by data, or called by creativity, a career with CommBank is for you. Our people bring their diverse backgrounds and unique perspectives to build a respectful, inclusive, and flexible workplace with flexible work locations. One where we're driven by our values, and supported to share ideas, initiatives, and energy. One where making a positive impact for customers, communities, and each other is part of our every day. Here, you'll thrive. You'll be supported when faced with challenges and empowered to tackle new opportunities. We're hiring engineers from across all of Australia and have opened technology hubs in Melbourne and Perth. We really love working here, and we think you will too. We support our people with the flexibility to balance where work is done, with at least half their time each month spent connecting in the office. We also have many other flexible working options available, including changing start and finish times, part-time arrangements, and job sharing, to name a few. Talk to us about how these arrangements might work in the role you're interested in. If this sounds like the role for you, we would love to hear from you. Apply today!

If you are interested in this job, please share your details with an updated CV at Krishankant@thinkpeople.in
Total Exp.- Rel Exp.- Current Company- CTC- ECTC- Notice Period- DOB- Edu.-
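The CSV-to-JSON conversion listed in the tech skills can be sketched with the standard library alone; in a real pipeline the CSV text would typically arrive from S3 (e.g. via boto3) before being written on to a database, and the sample data here is illustrative:

```python
# Sketch: convert CSV text to a JSON array of row objects using only
# the standard library (csv + json).
import csv, io, json

csv_text = "id,name\n1,Alice\n2,Bob\n"

def csv_to_json(text):
    rows = list(csv.DictReader(io.StringIO(text)))  # one dict per row
    return json.dumps(rows)

out = csv_to_json(csv_text)
print(out)
```

Note that DictReader keeps every field as a string; numeric parsing and validation would be a separate transformation step.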
Posted 3 weeks ago
3.0 - 7.0 years
8 - 12 Lacs
Mumbai, Navi Mumbai, Mumbai (All Areas)
Work from Office
Cloud DevOps Engineer (AWS) | 3 to 5 Years Experience | Location - Mumbai (WFO)

Core Technical Skills:
1. AWS Services: AWS EKS/ECS, RDS, NACLs, Route Tables, security services configuration
2. CI/CD & Infrastructure as Code: Jenkins, Terraform, DevSecOps, end-to-end DevOps
3. Containerization: Docker and Kubernetes
4. Familiarity with tools: ELK, Redis, WAF, Firewall, VPN, CloudFront/CDN
5. Linux OS

Bonus Skills:
1. AWS Certification (e.g., AWS Certified DevOps Engineer / AWS Certified Solutions Architect)
2. Knowledge of cost optimization techniques
3. Linux Administration: Strong in managing Linux and automation via scripts
Posted 3 weeks ago
6.0 - 10.0 years
22 - 25 Lacs
Bengaluru
Work from Office
Our engineering team is looking for a Data Engineer who is very proficient in Python, has a very good understanding of AWS cloud computing and ETL pipelines, and demonstrates proficiency with SQL and relational database concepts. In this role you will be a mid-to-senior-level individual contributor guiding our migration efforts by serving as a senior data engineer, working closely with the data architects to evaluate best-fit solutions and processes for our team. You will work with the rest of the team as we move away from legacy tech and introduce new tools and ETL pipeline solutions. You will collaborate with subject matter experts, data architects, informaticists, and data scientists to evolve our current cloud-based ETL to the next generation.

Responsibilities
- Independently prototypes/develops data solutions of high complexity to meet the needs of the organization and business customers.
- Designs proof-of-concept solutions utilizing an advanced understanding of multiple coding languages to meet technical and business requirements, with an ability to perform iterative solution testing to ensure specifications are met.
- Designs and develops data solutions that enable effective self-service data consumption, and can describe their value to the customer.
- Collaborates with stakeholders in defining metrics that are impactful to the business. Prioritizes efforts based on customer value.
- Has an in-depth understanding of Agile techniques. Can set expectations for deliverables of high complexity. Can assist in the creation of roadmaps for data solutions. Can turn vague ideas or problems into data product solutions. Influences strategic thinking across the team and the broader organization.
- Maintains proof-of-concepts and prototype data solutions, and manages any assessment of their viability and scalability, with own team or in partnership with IT.
- Working with IT, assists in building robust systems focusing on long-term and ongoing maintenance and support.
- Ensures data solutions include deliverables required to achieve high-quality data.
- Displays a strong understanding of complex multi-tier, multi-platform systems, and applies principles of metadata, lineage, business definitions, compliance, and data security to project work.
- Has an in-depth understanding of Business Intelligence tools, including visualization and user-experience techniques. Can set expectations for deliverables of high complexity.
- Works with IT to help scale prototypes. Demonstrates a comprehensive understanding of new technologies as needed to progress initiatives.

Requirements
- Expertise in Python programming, with demonstrated real-world experience building out data tools in a Python environment.
- Expertise in AWS services, with demonstrated real-world experience building out data tools in a Python environment.
- Bachelor's degree in Computer Science, Computer Engineering, or a related discipline preferred. Master's in the same or related disciplines strongly preferred.
- 3+ years of experience coding for data management, data warehousing, or other data environments, including, but not limited to, Python and Spark. Experience with SAS is preferred.
- 3+ years of experience as a developer working in an AWS cloud computing environment.
- 3+ years of experience using Git or Bitbucket.
- Experience with Redshift, RDS, and DynamoDB is preferred.
- Strong written and oral communication skills are required.
- Experience in the healthcare industry with healthcare data analytics products.
- Experience with healthcare vocabulary and data standards (OMOP, FHIR) is a plus.
Posted 3 weeks ago
6.0 - 9.0 years
20 - 25 Lacs
Bengaluru
Work from Office
Our engineering team is looking for a Data Engineer who is very proficient in Python, has a very good understanding of AWS cloud computing and ETL pipelines, and demonstrates proficiency with SQL and relational database concepts. In this role you will be a mid-to-senior-level individual contributor guiding our migration efforts by serving as a senior data engineer, working closely with the data architects to evaluate best-fit solutions and processes for our team. You will work with the rest of the team as we move away from legacy tech and introduce new tools and ETL pipeline solutions. You will collaborate with subject matter experts, data architects, informaticists, and data scientists to evolve our current cloud-based ETL to the next generation.

Responsibilities
- Independently prototypes/develops data solutions of high complexity to meet the needs of the organization and business customers.
- Designs proof-of-concept solutions utilizing an advanced understanding of multiple coding languages to meet technical and business requirements, with an ability to perform iterative solution testing to ensure specifications are met.
- Designs and develops data solutions that enable effective self-service data consumption, and can describe their value to the customer.
- Collaborates with stakeholders in defining metrics that are impactful to the business. Prioritizes efforts based on customer value.
- Has an in-depth understanding of Agile techniques. Can set expectations for deliverables of high complexity. Can assist in the creation of roadmaps for data solutions. Can turn vague ideas or problems into data product solutions. Influences strategic thinking across the team and the broader organization.
- Maintains proof-of-concepts and prototype data solutions, and manages any assessment of their viability and scalability, with own team or in partnership with IT.
- Working with IT, assists in building robust systems focusing on long-term and ongoing maintenance and support.
Ensures data solutions include the deliverables required to achieve high-quality data. Displays a strong understanding of complex multi-tier, multi-platform systems, and applies principles of metadata, lineage, business definitions, compliance, and data security to project work. Has an in-depth understanding of Business Intelligence tools, including visualization and user experience techniques. Can set expectations for deliverables of high complexity. Works with IT to help scale prototypes. Demonstrates a comprehensive understanding of new technologies as needed to progress initiatives. Requirements: Expertise in Python programming, with demonstrated real-world experience building out data tools in a Python environment. Expertise in AWS services, with demonstrated real-world experience building out data tools in an AWS environment. Bachelor's degree in Computer Science, Computer Engineering, or a related discipline preferred; a Master's in the same or a related discipline strongly preferred. 3+ years of experience coding for data management, data warehousing, or other data environments, including, but not limited to, Python and Spark; experience with SAS is preferred. 3+ years of experience as a developer working in an AWS cloud computing environment. 3+ years of experience using Git or Bitbucket. Experience with Redshift, RDS, and DynamoDB is preferred. Strong written and oral communication skills are required. Experience in the healthcare industry with healthcare data analytics products. Experience with healthcare vocabulary and data standards (OMOP, FHIR) is a plus.
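The posting above asks for hands-on ETL coding in Python with relational database concepts. As a minimal sketch of the extract-transform-load pattern it describes (using stdlib csv and sqlite3 stand-ins for the real sources and warehouse, with hypothetical column names):

```python
import csv
import io
import sqlite3

# Extract: read raw rows from a CSV source (an in-memory sample here;
# in practice this might stream from S3 or a landing zone).
RAW = "patient_id,visits\n101,3\n102,not_available\n103,5\n"

def extract(text):
    return list(csv.DictReader(io.StringIO(text)))

# Transform: coerce types and drop rows that fail validation.
def transform(rows):
    clean = []
    for row in rows:
        try:
            clean.append((int(row["patient_id"]), int(row["visits"])))
        except ValueError:
            continue  # skip malformed records rather than failing the load
    return clean

# Load: write validated rows into a relational store.
def load(rows, conn):
    conn.execute("CREATE TABLE IF NOT EXISTS visits (patient_id INTEGER, visits INTEGER)")
    conn.executemany("INSERT INTO visits VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW)), conn)
total = conn.execute("SELECT SUM(visits) FROM visits").fetchone()[0]
print(total)  # 8 (the malformed row is dropped)
```

Production pipelines would swap the in-memory pieces for PySpark or Glue jobs, but the validate-then-load shape is the same.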
Posted 3 weeks ago
6.0 - 11.0 years
10 - 20 Lacs
Bengaluru
Work from Office
We are looking for an energetic, self-motivated, and exceptional Data Engineer to work on extraordinary enterprise products based on AI and Big Data engineering, leveraging the AWS/Databricks tech stack. You will work with a star team of Architects, Data Scientists/AI Specialists, Data Engineers, and Integration specialists. Skills and Qualifications: 5+ years of experience in the DWH/ETL domain; Databricks/AWS tech stack. 2+ years of experience building data pipelines with Databricks/PySpark/SQL. Experience in writing and interpreting SQL queries, designing data models and data standards. Experience with SQL Server databases, Oracle, and/or cloud databases. Experience in data warehousing and data marts, Star and Snowflake models. Experience in loading data into databases from databases and files. Experience in analyzing and drawing design conclusions from data profiling results. Understanding of business processes and the relationships between systems and applications. Must be comfortable conversing with end-users. Job Description: Our engineering team is looking for a Data Engineer who is highly proficient in Python, has a strong understanding of AWS cloud computing and ETL pipelines, and demonstrates proficiency with SQL and relational database concepts. In this role you will be a mid- to senior-level individual contributor guiding our migration efforts, serving as a senior data engineer and working closely with the data architects to evaluate best-fit solutions and processes for our team. You will work with the rest of the team as we move away from legacy tech and introduce new tools and ETL pipeline solutions. You will collaborate with subject matter experts, data architects, informaticists, and data scientists to evolve our current cloud-based ETL to the next generation. Responsibilities: Independently prototypes/develops data solutions of high complexity to meet the needs of the organization and business customers.
Designs proof-of-concept solutions utilizing an advanced understanding of multiple coding languages to meet technical and business requirements, with an ability to perform iterative solution testing to ensure specifications are met. Designs and develops data solutions that enable effective self-service data consumption, and can describe their value to the customer. Collaborates with stakeholders in defining metrics that are impactful to the business. Prioritizes efforts based on customer value. Has an in-depth understanding of Agile techniques. Can set expectations for deliverables of high complexity. Can assist in the creation of roadmaps for data solutions. Can turn vague ideas or problems into data product solutions. Influences strategic thinking across the team and the broader organization. Maintains proof-of-concepts and prototype data solutions, and manages any assessment of their viability and scalability, with own team or in partnership with IT. Working with IT, assists in building robust systems focusing on long-term and ongoing maintenance and support. Ensures data solutions include the deliverables required to achieve high-quality data. Displays a strong understanding of complex multi-tier, multi-platform systems, and applies principles of metadata, lineage, business definitions, compliance, and data security to project work. Has an in-depth understanding of Business Intelligence tools, including visualization and user experience techniques. Can set expectations for deliverables of high complexity. Works with IT to help scale prototypes. Demonstrates a comprehensive understanding of new technologies as needed to progress initiatives. Requirements: Expertise in Python programming, with demonstrated real-world experience building out data tools in a Python environment. Expertise in AWS services, with demonstrated real-world experience building out data tools in an AWS environment.
Bachelor's degree in Computer Science, Computer Engineering, or a related discipline preferred; a Master's in the same or a related discipline strongly preferred. 3+ years of experience coding for data management, data warehousing, or other data environments, including, but not limited to, Python and Spark; experience with SAS is preferred. 3+ years of experience as a developer working in an AWS cloud computing environment. 3+ years of experience using Git or Bitbucket. Experience with Redshift, RDS, and DynamoDB is preferred. Strong written and oral communication skills are required. Experience in the healthcare industry with healthcare data analytics products. Experience with healthcare vocabulary and data standards (OMOP, FHIR) is a plus.
Posted 4 weeks ago
6.0 - 11.0 years
5 - 8 Lacs
Bengaluru
Work from Office
Experience: 6-12 years. Notice period: immediate to 30 days. Requirements: Full Stack Development: Build and maintain web applications using React.js and Node.js. Develop back-end services and APIs using Node.js and, selectively, Python FastAPI. Create RESTful and GraphQL APIs; integrate with internal and external systems. Optimize frontend performance and backend scalability. Cloud & DevOps (AWS): Deploy and manage services on AWS (EC2, S3, Lambda, RDS, API Gateway, etc.). Set up CI/CD pipelines for automated build, test, and deployment. Monitor cloud environments for reliability, cost-efficiency, and performance. Implement security best practices (IAM policies, encryption, WAF, etc.). Skills: Tech Stack: Frontend: React.js, Redux, HTML5/CSS3, Next.js (optional). Backend: Node.js (Express.js), Python (FastAPI). Database: MongoDB, PostgreSQL/MySQL (optional). Cloud: AWS (EC2, S3, Lambda, API Gateway, IAM, RDS, etc.). DevOps: Docker, CI/CD, GitHub/GitLab Actions. Others: REST APIs, GraphQL, JWT/OAuth, WebSockets, and Microservices.
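The skills list above includes JWT/OAuth for securing APIs. The core of a JWT is just an HMAC signature over a base64url-encoded header and payload; a minimal stdlib sketch of the HS256 signing and verification idea (the secret and claim names here are illustrative, and real services should use a vetted JWT library):

```python
import base64
import hashlib
import hmac
import json

SECRET = b"demo-secret"  # hypothetical key; real deployments load this from a secrets manager

def b64url(data: bytes) -> str:
    # JWTs use unpadded base64url encoding
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign(payload: dict) -> str:
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = b64url(json.dumps(payload).encode())
    signing_input = f"{header}.{body}".encode()
    sig = b64url(hmac.new(SECRET, signing_input, hashlib.sha256).digest())
    return f"{header}.{body}.{sig}"

def verify(token: str) -> bool:
    header, body, sig = token.split(".")
    signing_input = f"{header}.{body}".encode()
    expected = b64url(hmac.new(SECRET, signing_input, hashlib.sha256).digest())
    # constant-time comparison avoids timing side channels
    return hmac.compare_digest(sig, expected)

token = sign({"sub": "user-42"})
print(verify(token))  # True
parts = token.split(".")
tampered = parts[0] + "." + b64url(json.dumps({"sub": "user-43"}).encode()) + "." + parts[2]
print(verify(tampered))  # False: payload changed, signature no longer matches
```

The same verify step is what an API gateway or Express/FastAPI middleware runs on every request before trusting the claims.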
Posted 4 weeks ago
4.0 - 8.0 years
6 - 10 Lacs
Noida
Work from Office
Job Summary: We are seeking an experienced and results-driven Senior Java Developer to join our team in Noida. The ideal candidate should have strong hands-on experience in Java, the Spring Framework, JPA, Hibernate, Kubernetes, AWS, and microservices architecture. You will play a critical role in the design, development, and deployment of scalable and high-performance applications. Key Responsibilities: Design, develop, and implement robust and scalable Java-based applications. Develop microservices using Spring Boot. Hands-on experience with Docker-based containerization and Kubernetes for application deployment. Work with JPA and Hibernate for effective data persistence and database operations. Deploy and manage services on AWS cloud infrastructure. Collaborate with architects, DevOps engineers, QA, and other developers to deliver enterprise-grade solutions. Optimize application performance and ensure responsiveness to front-end requests. Ensure code quality and maintainability through code reviews and best practices. Participate in the full software development life cycle: requirement analysis, design, development, testing, and deployment. Required Skills & Qualifications: 4 to 8 years of strong Java development experience. Hands-on experience with Spring Boot, Spring Core, and other Spring modules. Strong knowledge of the JPA and Hibernate ORM frameworks. Experience with Kubernetes for container orchestration and microservices management. Working knowledge of AWS services (EC2, S3, RDS, ECS, etc.). Strong understanding of RESTful APIs and microservices architecture. Familiarity with CI/CD tools and version control systems (e.g., Git, Jenkins). Solid problem-solving skills and a strong sense of ownership. Bachelor's or Master's degree in Computer Science, Engineering, or a related discipline. Preferred Skills: Experience with Docker and containerization. Exposure to monitoring tools like Prometheus, Grafana, etc. Knowledge of Agile/Scrum methodologies.
Posted 1 month ago
5.0 - 8.0 years
8 - 18 Lacs
Gurugram
Remote
Job Title: Part-Time DevOps Engineer (AWS) Contract Location: Remote Engagement Type: Part-Time Contract (Hourly/Monthly Block) Experience Required: 5+ Years in DevOps and AWS Overview: We are seeking an experienced DevOps Engineer with a strong background in AWS cloud infrastructure for a part-time, remote contract role. This is not a full-time opportunity; we are looking to engage a professional on an hourly or monthly block-of-time basis to support ongoing infrastructure, CI/CD, automation, and cloud optimization needs. Key Responsibilities: Manage and optimize AWS infrastructure (EC2, S3, IAM, VPC, Lambda, RDS, etc.) Build and maintain CI/CD pipelines using tools such as GitHub Actions, Jenkins, or similar Implement Infrastructure as Code (IaC) using Terraform or CloudFormation Monitor and troubleshoot system performance, scalability, and security issues Automate repetitive tasks and deploy updates with zero downtime Collaborate with development and product teams to align DevOps practices Provide on-demand support and availability within agreed working hours Required Skills & Qualifications: 5+ years of hands-on experience in DevOps roles Strong expertise in AWS cloud services and cost optimization strategies Proficiency with CI/CD, Docker, Kubernetes, Git, and scripting (Bash/Python) Experience with monitoring/logging tools (CloudWatch, ELK, Prometheus, etc.) Solid understanding of security, networking, and cloud architecture Excellent problem-solving and communication skills Prior experience working in remote/contract-based roles preferred
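The responsibilities above include automating repetitive tasks and deploying updates with zero downtime. A staple of that kind of automation is retrying flaky calls to cloud APIs with exponential backoff; a minimal sketch (the `flaky_deploy` task is a stand-in for a real deployment call):

```python
import time

def run_with_retries(task, max_attempts=4, base_delay=0.01):
    """Retry a flaky operation with exponential backoff, a common
    pattern when scripting against rate-limited cloud APIs."""
    for attempt in range(1, max_attempts + 1):
        try:
            return task()
        except RuntimeError:
            if attempt == max_attempts:
                raise  # out of attempts: surface the failure
            time.sleep(base_delay * 2 ** (attempt - 1))

# Simulated task that fails twice before succeeding.
calls = {"n": 0}
def flaky_deploy():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient API error")
    return "deployed"

print(run_with_retries(flaky_deploy))  # deployed (after two transient failures)
```

Real automation would catch the specific throttling exceptions of the SDK in use (e.g. boto3's ClientError) rather than a bare RuntimeError, and add jitter to the delay.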
Posted 1 month ago
4.0 - 9.0 years
5 - 12 Lacs
Silchar, Goalpara, Dimapur
Work from Office
Role & responsibilities Area of Responsibility Deliver volume & revenue sales targets for all products by executing the distribution strategy at the channel-partner (RDS) level Monitor quality of distribution through the RDS sales team Strengthen relationships with key retail customers Competition tracking & reporting of schemes & programs Ensure availability of stock at RDS and retail while adhering to the norms Execute promotional activities for channel partners to drive sales and build market credibility Distribution expansion and extraction: Achieve retail (MBO) expansion targets through an increase in the number of outlets in existing and new geographies Requirements & Expectations RDS Sales Executive Management (RDS SE) Target setting for RDS SE RDS SE beat plan adherence Systems/formats at RDS SE Manage in-store promoters Impart product knowledge to sellers Drive distribution KPI delivery RDS Management RDS infra / SE availability monitoring Monitor stock holding & market credit Day-to-day performance review & discussions Problem solving Systems/formats at RDS point Compliance with company policies Critical Success Factors Continuous Learning & Empowering Talent Building Team Commitment Leads Decision Making & Delivering Results Builds Strategic Relationships & Organizational Agility Analytical Thinking Core Competencies Products, Services & Technology Knowledge Consumer Negotiation Working with Partners Solving Problems Sales Planning & Forecasting Formal qualifications University degree in Business, Marketing, or Engineering/ICT (or similar/equivalent). A higher degree such as an MBA is considered a merit. Three to five years of experience in distribution planning and channel implementation. Understanding of general retail management best practices and customer relationship management. Hardworking, persistent, and dependable. Positive and enthusiastic. Financial Accountability for revenue targets for the distribution channel for all products.
Non-financial Monitoring of distributors' sales force and retailers. Resolution of channel-specific issues within timelines. Key performance indicators – Your Background Achievement of key targets in the distribution network (sales, revenue) in the territory. Achievement of retail outlet (MBO) expansion targets. Performance management of channel partners and sales force. Delivery of distribution metrics. Interested candidates, kindly share your updated resume with amrita.singh@manpower.co.in
Posted 1 month ago
6.0 - 9.0 years
14 - 24 Lacs
Hyderabad, Bengaluru
Hybrid
Hiring for .NET Full Stack with Cloud experience, 6-9 years. Level: Assistant Manager, but an individual contributor role. Skills and locations: .NET Core with Angular and Azure services (not Azure DevOps) for Bangalore/Hyderabad; .NET Core with Angular and AWS for Hyderabad.
Posted 1 month ago
6.0 - 11.0 years
11 - 12 Lacs
Hyderabad
Work from Office
We are seeking a highly skilled DevOps Engineer to join our dynamic development team. In this role, you will be responsible for designing, developing, and maintaining both frontend and backend components of our applications using DevOps practices and associated technologies. You will collaborate with cross-functional teams to deliver robust, scalable, and high-performing software solutions that meet our business needs. The ideal candidate will have a strong background in DevOps, experience with modern frontend frameworks, and a passion for full-stack development. Requirements: Bachelor's degree in Computer Science, Engineering, or a related field. 6 to 11+ years of experience in full-stack development, with a strong focus on DevOps. DevOps with AWS Data Engineer - Roles & Responsibilities: Use AWS services like EC2, VPC, S3, IAM, RDS, and Route 53. Automate infrastructure using Infrastructure as Code (IaC) tools like Terraform or AWS CloudFormation. Build and maintain CI/CD pipelines using tools such as AWS CodePipeline, Jenkins, and GitLab CI/CD. Automate build, test, and deployment processes for Java applications. Use Ansible, Chef, or AWS Systems Manager for managing configurations across environments. Containerize Java apps using Docker. Deploy and manage containers using Amazon ECS, EKS (Kubernetes), or Fargate. Monitoring & logging using Amazon CloudWatch, Prometheus + Grafana, the ELK Stack (Elasticsearch, Logstash, Kibana), and AWS X-Ray for distributed tracing. Manage access with IAM roles/policies. Use AWS Secrets Manager / Parameter Store for managing credentials. Enforce security best practices, encryption, and audits. Automate backups for databases and services using AWS Backup, RDS snapshots, and S3 lifecycle rules. Implement Disaster Recovery (DR) strategies. Work closely with development teams to integrate DevOps practices. Document pipelines, architecture, and troubleshooting runbooks. Monitor and optimize AWS resource usage.
Use AWS Cost Explorer, Budgets, and Savings Plans. Must-Have Skills: Experience working on Linux-based infrastructure. Excellent understanding of Ruby, Python, Perl, and Java. Configuring and managing databases such as MySQL and MongoDB. Excellent troubleshooting skills. Selecting and deploying appropriate CI/CD tools. Working knowledge of various tools, open-source technologies, and cloud services. Awareness of critical concepts in DevOps and Agile principles. Managing stakeholders and external interfaces. Setting up tools and required infrastructure. Defining and setting development, testing, release, update, and support processes for DevOps operation. The technical skills to review, verify, and validate the software code developed in the project. Interview Mode: F2F for candidates residing in Hyderabad; Zoom for other states. Location: 43/A, MLA Colony, Road No. 12, Banjara Hills, 500034. Time: 2-4 PM.
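Among the responsibilities above is automating backups with S3 lifecycle rules and RDS snapshots. The retention logic those services apply can be sketched in a few lines; a stdlib toy model of "delete snapshots older than the daily retention window" (the seven-day window and dates are illustrative):

```python
from datetime import date, timedelta

def backups_to_delete(snapshot_dates, keep_daily=7, today=None):
    """Return snapshots older than the daily retention window,
    mirroring the effect of an S3 lifecycle rule or RDS snapshot policy."""
    today = today or date.today()
    cutoff = today - timedelta(days=keep_daily)
    return sorted(d for d in snapshot_dates if d < cutoff)

# Ten daily snapshots ending today; a 7-day policy keeps the newest eight
# (today through the cutoff day) and expires the rest.
today = date(2024, 5, 10)
snaps = [today - timedelta(days=n) for n in range(10)]
stale = backups_to_delete(snaps, keep_daily=7, today=today)
print(stale)  # the two snapshots from 8 and 9 days ago
```

In production this selection is done declaratively (lifecycle JSON or an AWS Backup plan) rather than in a script, but writing it out makes the retention boundary explicit and testable.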
Posted 1 month ago
4.0 - 6.0 years
6 - 8 Lacs
Bengaluru, Bellandur
Hybrid
Hiring an AWS Data Engineer for a 6-month hybrid contractual role based in Bellandur, Bengaluru. The ideal candidate will have 4-6 years of experience in data engineering, with strong expertise in AWS services (S3, EC2, RDS, Lambda, EKS), PostgreSQL, Redis, Apache Iceberg, and Graph/Vector Databases. Proficiency in Python or Golang is essential. Responsibilities include designing and optimizing data pipelines on AWS, managing structured and in-memory data, implementing advanced analytics with vector/graph databases, and collaborating with cross-functional teams. Prior experience with CI/CD and containerization (Docker/Kubernetes) is a plus.
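The role above calls for advanced analytics with vector databases. The operation those databases accelerate is nearest-neighbour search over embeddings; a brute-force stdlib sketch of cosine-similarity lookup (the corpus vectors are made-up toy embeddings):

```python
import math

def cosine(a, b):
    # cosine similarity: dot product normalized by vector magnitudes
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def nearest(query, corpus):
    """Brute-force nearest-neighbour lookup. Vector databases index this
    operation (e.g. with HNSW graphs) so it scales past toy sizes."""
    return max(corpus, key=lambda item: cosine(query, item[1]))

corpus = [
    ("doc-a", [1.0, 0.0, 0.0]),
    ("doc-b", [0.7, 0.7, 0.0]),
    ("doc-c", [0.0, 0.0, 1.0]),
]
print(nearest([0.9, 0.1, 0.0], corpus)[0])  # doc-a
```

The linear scan here is O(n) per query; the approximate indexes in vector stores trade a little recall for sub-linear lookups, which is the whole reason to reach for one.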
Posted 1 month ago
2.0 - 7.0 years
4 - 6 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Develop scalable microservices using Java Spring Boot. Design and implement REST APIs and integrate with frontend and external services. Deploy and manage services using AWS services like EC2, S3, Lambda, RDS, and ECS. Required Candidate profile Use CI/CD pipelines for automated builds and deployments (e.g., Jenkins, GitHub Actions). Collaborate with frontend, QA, DevOps, and business teams. Write unit and integration tests to ensure code quality. Perks and Benefits
Posted 1 month ago
4.0 - 9.0 years
15 - 25 Lacs
Hyderabad
Work from Office
Python, with experience performing ETL and data engineering (PySpark, NumPy, Pandas, AWS Glue, and Airflow). SQL: extensive hands-on work experience, exclusively with Oracle; SQL profilers / query analyzers. AWS cloud services (S3, RDS, Redshift). ETL with Python.
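The skill list above names SQL profilers and query analyzers. The basic workflow they support — inspect a query plan, add an index, confirm the plan improves — can be shown with stdlib sqlite3 (table and index names here are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                 [(i % 50, i * 1.5) for i in range(500)])

def plan(sql):
    # EXPLAIN QUERY PLAN rows carry the human-readable detail in column 3
    return [row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql)]

query = "SELECT SUM(total) FROM orders WHERE customer_id = 7"
print(plan(query))  # full table scan before indexing

conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
print(plan(query))  # the plan now searches via idx_orders_customer
```

Oracle's equivalent is EXPLAIN PLAN plus the SQL Tuning Advisor, but the analyze-index-reanalyze loop is the same skill the posting is asking for.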
Posted 1 month ago
4.0 - 9.0 years
6 - 11 Lacs
Silchar, Goalpara, Dimapur
Work from Office
Role & responsibilities Area of Responsibility Deliver volume & revenue sales targets for all products by executing the distribution strategy at the channel-partner (RDS) level Monitor quality of distribution through the RDS sales team Strengthen relationships with key retail customers Competition tracking & reporting of schemes & programs Ensure availability of stock at RDS and retail while adhering to the norms Execute promotional activities for channel partners to drive sales and build market credibility Distribution expansion and extraction: Achieve retail (MBO) expansion targets through an increase in the number of outlets in existing and new geographies Requirements & Expectations RDS Sales Executive Management (RDS SE) Target setting for RDS SE RDS SE beat plan adherence Systems/formats at RDS SE Manage in-store promoters Impart product knowledge to sellers Drive distribution KPI delivery RDS Management RDS infra / SE availability monitoring Monitor stock holding & market credit Day-to-day performance review & discussions Problem solving Systems/formats at RDS point Compliance with company policies Critical Success Factors Continuous Learning & Empowering Talent Building Team Commitment Leads Decision Making & Delivering Results Builds Strategic Relationships & Organizational Agility Analytical Thinking Core Competencies Products, Services & Technology Knowledge Consumer Negotiation Working with Partners Solving Problems Sales Planning & Forecasting Formal qualifications University degree in Business, Marketing, or Engineering/ICT (or similar/equivalent). A higher degree such as an MBA is considered a merit. Three to five years of experience in distribution planning and channel implementation. Understanding of general retail management best practices and customer relationship management. Hardworking, persistent, and dependable. Positive and enthusiastic. Financial Accountability for revenue targets for the distribution channel for all products.
Non-financial Monitoring of distributors' sales force and retailers. Resolution of channel-specific issues within timelines. Key performance indicators – Your Background Achievement of key targets in the distribution network (sales, revenue) in the territory. Achievement of retail outlet (MBO) expansion targets. Performance management of channel partners and sales force. Delivery of distribution metrics. Interested candidates, kindly share your updated resume with amrita.singh@manpower.co.in
Posted 1 month ago
7.0 - 12.0 years
11 - 12 Lacs
Hyderabad
Work from Office
We are seeking a highly skilled DevOps Engineer to join our dynamic development team. In this role, you will be responsible for designing, developing, and maintaining both frontend and backend components of our applications using DevOps practices and associated technologies. You will collaborate with cross-functional teams to deliver robust, scalable, and high-performing software solutions that meet our business needs. The ideal candidate will have a strong background in DevOps, experience with modern frontend frameworks, and a passion for full-stack development. Requirements: Bachelor's degree in Computer Science, Engineering, or a related field. 7 to 12+ years of experience in full-stack development, with a strong focus on DevOps. DevOps with AWS Data Engineer - Roles & Responsibilities: Use AWS services like EC2, VPC, S3, IAM, RDS, and Route 53. Automate infrastructure using Infrastructure as Code (IaC) tools like Terraform or AWS CloudFormation. Build and maintain CI/CD pipelines using tools such as AWS CodePipeline, Jenkins, and GitLab CI/CD. Automate build, test, and deployment processes for Java applications. Use Ansible, Chef, or AWS Systems Manager for managing configurations across environments. Containerize Java apps using Docker. Deploy and manage containers using Amazon ECS, EKS (Kubernetes), or Fargate. Monitoring & logging using Amazon CloudWatch, Prometheus + Grafana, the ELK Stack (Elasticsearch, Logstash, Kibana), and AWS X-Ray for distributed tracing. Manage access with IAM roles/policies. Use AWS Secrets Manager / Parameter Store for managing credentials. Enforce security best practices, encryption, and audits. Automate backups for databases and services using AWS Backup, RDS snapshots, and S3 lifecycle rules. Implement Disaster Recovery (DR) strategies. Work closely with development teams to integrate DevOps practices. Document pipelines, architecture, and troubleshooting runbooks. Monitor and optimize AWS resource usage.
Use AWS Cost Explorer, Budgets, and Savings Plans. Must-Have Skills: Experience working on Linux-based infrastructure. Excellent understanding of Ruby, Python, Perl, and Java. Configuring and managing databases such as MySQL and MongoDB. Excellent troubleshooting skills. Selecting and deploying appropriate CI/CD tools. Working knowledge of various tools, open-source technologies, and cloud services. Awareness of critical concepts in DevOps and Agile principles. Managing stakeholders and external interfaces. Setting up tools and required infrastructure. Defining and setting development, testing, release, update, and support processes for DevOps operation. The technical skills to review, verify, and validate the software code developed in the project. Interview Mode: F2F for candidates residing in Hyderabad; Zoom for other states. Location: 43/A, MLA Colony, Road No. 12, Banjara Hills, 500034. Time: 2-4 PM.
Posted 1 month ago