5.0 - 9.0 years
6 - 10 Lacs
Hyderabad
Work from Office
About the Role
As Creative Operations Lead, you'll own the end-to-end production operations of our video content lifecycle. From managing on-ground logistics to organizing post-production workflows, your role is to keep the creative wheels turning without a single file falling through the cracks. You'll lead a team of Production Assistants, BOAs, and QC Specialists, manage our storage systems (including NAS & S3 workflows), and be the backbone between campus teams, editors, and leadership.

What You'll Own
- Drive and maintain video production operations across 9+ campuses and a central editing hub.
- Lead a cross-functional ops team including Production Assistants, Back Office Assistants (BOAs), and QC Specialists.
- Oversee our data management infrastructure, including NAS configuration, folder hygiene, backups, and file retention protocols.
- Implement and optimize workflow tools (Trello, Notion, Frame.io, Google Drive) for daily tasks and file movement.
- Own the production calendar, tracking every shoot, transfer, edit, delivery, and review milestone.
- Coordinate across shoot teams, post-production, and leadership to ensure frictionless handoffs and zero delays.
- Standardize and enforce file naming, version control, and backup policies.
- Identify gaps, anticipate roadblocks, and build repeatable systems that scale with content volume.

What You Bring
- 5-9 years of experience in video production operations, creative project management, or digital content workflows.
- Strong understanding of file systems, storage planning, and backup solutions (NAS, S3, etc.).
- Hands-on experience managing video production logistics, crew schedules, and asset delivery pipelines.
- Experience working with creative teams (editors, directors, motion designers, scriptwriters) and knowing how to organize without slowing them down.
- Proficiency in tools like Trello, Notion, Frame.io, Drive, Airtable, or Monday.com.
- Excellent communicator, systems thinker, and timeline enforcer.

Bonus Skills
- Prior work in media, production houses, or multi-location content ops
- Familiarity with file formats, compression, codec workflows, and delivery specs
- Exposure to editorial tech stacks and file-heavy post environments

Why This Role Matters
- You'll be the engine room that keeps one of India's largest in-house video systems operational.
- You'll empower editors, creators, and directors to focus on storytelling while you own the systems.
- You'll bring stability, scale, and sanity to a creative operation producing 250+ videos a month.
- You won't just manage files; you'll manage creative flow at scale.
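Illustrative aside: for the file-naming enforcement this role describes, a minimal audit script might look like the sketch below. The naming pattern, extensions, and NAS mount path are hypothetical assumptions, not this team's actual convention.

    import re
    from pathlib import Path

    # Hypothetical convention: CAMPUS_project_YYYYMMDD_vNN.ext
    # (an assumption for illustration, not the team's real policy).
    NAME_PATTERN = re.compile(r"^[A-Z]{3}_[A-Za-z0-9-]+_\d{8}_v\d{2}\.(mp4|mov|wav)$")

    def audit_folder(root: str) -> list[str]:
        """Return files under root that violate the naming convention."""
        return [str(p) for p in Path(root).rglob("*")
                if p.is_file() and not NAME_PATTERN.match(p.name)]

    if __name__ == "__main__":
        for bad in audit_folder("/mnt/nas/footage"):  # hypothetical mount
            print("non-conforming file:", bad)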
Posted 2 weeks ago
8.0 - 13.0 years
50 - 55 Lacs
Bengaluru
Work from Office
As a Lead Data Engineer, you will lead, design, implement, and maintain data processing pipelines and workflows using Databricks on the Azure platform. Your expertise in PySpark, SQL, Databricks, test-driven development, and Docker will be essential to the success of our data engineering initiatives.

Roles and responsibilities:
- Collaborate with cross-functional teams to understand data requirements and design scalable and efficient data processing solutions.
- Develop and maintain data pipelines using PySpark and SQL on the Databricks platform.
- Optimise and tune data processing jobs for performance and reliability.
- Implement automated testing and monitoring processes to ensure data quality and reliability.
- Work closely with data scientists, data analysts, and other stakeholders to understand their data needs and provide effective solutions.
- Troubleshoot and resolve data-related issues, including performance bottlenecks and data quality problems.
- Stay up to date with industry trends and best practices in data engineering and Databricks.

Key Requirements:
- 8+ years of experience as a Data Engineer, with a focus on Databricks and cloud-based data platforms, including a minimum of 4 years of experience in writing unit/end-to-end tests for data pipelines and ETL processes on Databricks.
- Hands-on experience in PySpark programming for data manipulation, transformation, and analysis.
- Strong experience in SQL and writing complex queries for data retrieval and manipulation.
- Experience in Docker for containerising and deploying data engineering applications is good to have.
- Strong knowledge of the Databricks platform and its components, including Databricks notebooks, clusters, and jobs.
- Experience in designing and implementing data models to support analytical and reporting needs will be an added advantage.
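As a neutral illustration of the kind of PySpark pipeline this role involves, here is a minimal read-transform-write sketch. The landing path, column names, and table name are hypothetical; on Databricks the SparkSession is supplied by the runtime and Delta is the default table format.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders_pipeline").getOrCreate()

    # Hypothetical landing path and schema; adjust to the real source.
    raw = spark.read.json("/mnt/landing/orders/")

    cleaned = (
        raw.dropDuplicates(["order_id"])
           .withColumn("order_ts", F.to_timestamp("order_ts"))
           .filter(F.col("amount") > 0)
    )

    # Write as a managed table (Delta by default on Databricks).
    cleaned.write.mode("overwrite").saveAsTable("silver.orders")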
Posted 2 weeks ago
5.0 - 9.0 years
6 - 8 Lacs
Hyderabad
Work from Office
About the Role As Creative Operations Lead , you'll own the end-to-end production operations of our video content lifecycle. From managing on-ground logistics to organizing post-production workflows, your role is to keep the creative wheels turning, without a single file falling through the cracks. Youll lead a team of Production Assistants, BOAs, and QC Specialists , manage our storage systems (including NAS & S3 workflows) , and be the backbone between campus teams, editors, and leadership . What Youll Own -Drive and maintain video production operations across 9+ campuses and a central editing hub. -Lead a cross-functional ops team including Production Assistants, Back Office Assistants (BOAs), and QC Specialists . -Oversee our data management infrastructure , including NAS configuration, folder hygiene, backups, and file retention protocols. I-mplement and optimize workflow tools (Trello, Notion, Frame.io, Google Drive) for daily tasks and file movement. -Own the production calendar , tracking every shoot, transfer, edit, delivery, and review milestone. -Coordinate across shoot teams, post-production, and leadership to ensure frictionless handoffs and zero delays. -Standardize and enforce file naming, version control, and backup policies . -Identify gaps, anticipate roadblocks, and build repeatable systems that scale with content volume. What You Bring -59 years of experience in video production operations, creative project management , or digital content workflows. -Strong understanding of file systems, storage planning, and backup solutions (NAS, S3, etc.) . -Hands-on experience managing video production logistics , crew schedules, and asset delivery -pipelines. -Experience working with creative teams, editors, directors, motion, scriptwriters knowing how to organize without slowing them down . -Proficiency in tools like Trello, Notion, Frame.io, Drive, Airtable, or Monday.com. -Excellent communicator, systems-thinker, and timeline enforcer. Bonus Skills -Prior work in media, production houses, or multi-location content ops -Familiarity with file formats, compression, codec workflows, and delivery specs -Exposure to editorial tech stacks and file-heavy post environments Why This Role Matters -Youll be the engine room that keeps one of Indias largest in-house video systems operational. -Youll empower editors, creators, and directors to focus on storytelling while you own the systems . -You’ll bring stability, scale, and sanity to a creative operation producing 250+ videos a month . -You won’t just manage files, you’ll manage creative flow at scale .
Posted 3 weeks ago
6.0 - 8.0 years
18 - 20 Lacs
Bengaluru
Hybrid
Hi all,

We are hiring for the role of C&S ETL Engineer.

Experience: 6 - 8 Years
Location: Bangalore
Notice Period: Immediate - 15 Days
Mandatory Skills: AWS Glue

Job Description:
- Minimum of 6 years of experience building, optimizing, and maintaining scalable data pipelines as an ETL Engineer.
- Hands-on experience in coding techniques with a proven record.
- Hands-on experience in end-to-end data workflows, including pulling data from third-party and in-house tools via APIs, transforming and loading it into data warehouses, and improving performance across the ETL lifecycle.
- Hands-on experience with scripting (Python, shell scripting), relational databases (PostgreSQL, Redshift), REST APIs (OAuth, JWT, Basic Auth), job schedulers (cron), version control (Git), and the AWS environment.
- Hands-on experience integrating data from various data sources.
- Understanding of Agile processes and principles.
- Good communication and presentation skills.
- Good documentation skills.

Preferred:
- Ability to understand business problems and customer needs and provide data solutions.
- Hands-on experience working with Qualys and its APIs.
- Understanding of business intelligence tools such as PowerBI.
- Knowledge of data security and privacy.

Responsibilities:
- Design, develop, implement, and maintain robust and scalable ETL pipelines using Python and SQL as well as AWS Glue and AWS Lambda for data ingestion, transformation, and loading into various data targets (e.g., PostgreSQL, Amazon S3, Redshift, Aurora), along with structured data management.

If you are interested, drop your resume at mojesh.p@acesoftlabs.com or call 9701971793.
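For illustration, a stripped-down extract-and-load step of the kind described (REST API in, PostgreSQL up-sert out) might look like this sketch; the endpoint, token handling, and table schema are hypothetical assumptions.

    import requests
    import psycopg2

    API_URL = "https://api.example.com/v1/findings"  # hypothetical endpoint
    TOKEN = "..."  # e.g. obtained via an OAuth client-credentials grant

    def extract() -> list[dict]:
        resp = requests.get(API_URL, headers={"Authorization": f"Bearer {TOKEN}"},
                            timeout=30)
        resp.raise_for_status()
        return resp.json()["results"]

    def load(rows: list[dict]) -> None:
        # Idempotent up-sert keyed on id, so scheduled reruns are safe.
        with psycopg2.connect("dbname=etl user=etl") as conn, conn.cursor() as cur:
            cur.executemany(
                "INSERT INTO findings (id, severity) VALUES (%s, %s) "
                "ON CONFLICT (id) DO UPDATE SET severity = EXCLUDED.severity",
                [(r["id"], r["severity"]) for r in rows],
            )

    if __name__ == "__main__":
        load(extract())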
Posted 3 weeks ago
3.0 - 7.0 years
15 - 25 Lacs
Pune, Gurugram, Bengaluru
Hybrid
Salary: 15-20 LPA
Exp: 3 to 5 years
Location: Gurgaon/Pune
Notice: Immediate to 15 days

Job Title: AWS DevOps Engineer

Job Description: We are seeking a highly skilled AWS DevOps Engineer with extensive experience in Chef and CloudWatch. The ideal candidate will have a strong background in cloud infrastructure, automation, and monitoring.

Key Responsibilities:
- Design, implement, and manage scalable and reliable cloud infrastructure on AWS.
- Automate provisioning, configuration management, and deployment using Chef.
- Monitor system performance and reliability using AWS CloudWatch and other monitoring tools.
- Develop and maintain CI/CD pipelines to ensure smooth and efficient software releases.
- Collaborate with development, QA, and operations teams to ensure high availability and reliability of applications.
- Troubleshoot and resolve infrastructure and application issues in a timely manner.
- Implement security best practices and ensure compliance with industry standards.
- Optimize infrastructure for cost and performance.
- Maintain documentation related to infrastructure, processes, and procedures.
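As an illustration of the CloudWatch side of this role, the sketch below creates a CPU alarm with boto3; the instance ID, thresholds, and SNS topic ARN are hypothetical.

    import boto3

    cloudwatch = boto3.client("cloudwatch")

    # Alarm when average CPU stays above 80% for two 5-minute periods.
    cloudwatch.put_metric_alarm(
        AlarmName="high-cpu-web-1",  # hypothetical name
        Namespace="AWS/EC2",
        MetricName="CPUUtilization",
        Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
        Statistic="Average",
        Period=300,
        EvaluationPeriods=2,
        Threshold=80.0,
        ComparisonOperator="GreaterThanThreshold",
        AlarmActions=["arn:aws:sns:us-east-1:123456789012:ops-alerts"],
    )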
Posted 4 weeks ago
1.0 years
2 - 3 Lacs
IN
Remote
About the job:
Selected intern's day-to-day responsibilities include:
1. Collaborate with the development team to design and implement user-friendly interfaces for healthcare applications
2. Develop and maintain backend systems using Node.js and MongoDB to ensure seamless data management and processing
3. Write clean, efficient code in Python and JavaScript to enhance the functionality of our software products
4. Conduct thorough testing and debugging to identify and fix any issues, ensuring the quality and reliability of our applications
5. Stay updated on the latest industry trends and technologies to continuously improve your skills and contribute fresh ideas to the team
6. Participate in code reviews and provide feedback to your peers to foster a collaborative and productive work environment
7. Communicate effectively with team members and stakeholders to gather requirements, provide updates on progress, and address any concerns that may arise during the development process

Who can apply: Only those candidates can apply who:
- have a minimum of 1 year of experience
- can work from 8:30 pm - 9:30 am Indian Standard Time (as the company is based outside of India and their local work timings are 8:00 am - 9:00 pm Pacific Standard Time)
- are Computer Science Engineering students

Salary: ₹ 2,00,000 - 3,00,000 /year
Experience: 1 year(s)
Deadline: 2025-08-10 23:59:59
Other perks: 5 days a week
Skills required: PHP, MySQL, HTML, CSS, JavaScript, Python, Linux, MongoDB, AngularJS, Node.js, React, Firebase, Bubble.io, Amazon S3 and Unit Testing

Other Requirements: As a Junior Full Stack Developer at Vantech Med International, you will have the exciting opportunity to work on cutting-edge projects in the healthcare technology industry. Your role will involve utilizing your expertise in Python, Node.js, CSS, JavaScript, MongoDB, and HTML to contribute to the development of innovative software solutions that will revolutionize the way medical professionals interact with technology. If you are a motivated and enthusiastic developer with a passion for creating impactful solutions, we invite you to join our dynamic team at Vantech Med International and help us shape the future of healthcare technology. Apply now to be part of this exciting journey!

About Company: Vantech Medical is a MedTech company focused on developing innovative and sustainable medical equipment and therapy dolls. Leveraging an extensive background in Python-based software solutions, Vantech Medical aims to integrate advanced technology with healthcare to provide practical and effective solutions. The company's mission is to create products that improve patient care and outcomes while maintaining a strong emphasis on sustainability and environmental impact. By combining expertise in both technology and medicine, Vantech Medical is poised to make significant contributions to the healthcare industry.
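The stack above is Node.js/MongoDB-centric, but since Python is also listed, here is a minimal, language-neutral sketch of the kind of MongoDB up-sert such a backend performs; the connection string, database, and fields are hypothetical.

    from pymongo import MongoClient

    client = MongoClient("mongodb://localhost:27017")  # hypothetical URI
    db = client["healthcare"]

    def upsert_patient(patient_id: str, fields: dict) -> None:
        """Insert or update a patient record keyed by patient_id."""
        db.patients.update_one({"_id": patient_id}, {"$set": fields}, upsert=True)

    upsert_patient("p-1001", {"name": "Jane Doe", "ward": "B2"})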
Posted 1 month ago
5.0 - 10.0 years
0 - 1 Lacs
Hyderabad, Pune, Ahmedabad
Hybrid
Contractual (Project-Based)
Notice Period: Immediate - 15 Days
Fill out this form: https://forms.office.com/Pages/ResponsePage.aspx?id=hLjynUM4c0C8vhY4bzh6ZJ5WkWrYFoFOu2ZF3Vr0DXVUQlpCTURUVlJNS0c1VUlPNEI3UVlZUFZMMC4u
Resume: shweta.soni@panthsoftech.com
Posted 1 month ago
2.0 - 5.0 years
6 - 10 Lacs
Kochi
Work from Office
Job description:
Seeking a skilled & proactive Data Engineer with 2-4 years of experience to support our enterprise data warehousing and analytics initiatives. The candidate will be responsible for building scalable data pipelines, transforming data for analytics, and enabling data integration across cloud and on-premise systems.

Key Responsibilities:
- Build and manage data lakes and data warehouses using services like Amazon S3, Redshift, and Athena
- Design and build secure, scalable, and efficient ETL/ELT pipelines on AWS using services like Glue, Lambda, and Step Functions
- Work on SAP Datasphere to build and maintain Spaces, Data Builders, Views, and Consumption Layers
- Develop and maintain scalable data models and optimize queries for performance
- Monitor and optimize data workflows to ensure reliability, performance, and cost-efficiency
- Collaborate with Data Analysts and BI teams to provide clean, validated, and well-documented datasets
- Monitor, troubleshoot, and enhance data workflows and pipelines
- Ensure data quality, integrity, and governance policies are met

Required Skills:
- Strong SQL skills and experience with relational databases like MySQL or SQL Server
- Proficiency in Python or Scala for data transformation and scripting
- Familiarity with cloud platforms like AWS (S3, Redshift, Glue) and Azure

Good-to-Have Skills:
- AWS Certification: AWS Certified Data Analytics
- Exposure to modern data stack tools like Snowflake
- Experience in cloud-based projects and working in an Agile environment
- Understanding of data governance, security best practices, and compliance standards
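For illustration, querying a data lake with Athena from Python might look like the sketch below; the database, table, and results bucket are hypothetical assumptions.

    import boto3

    athena = boto3.client("athena")

    # Kick off a query; results land in the given S3 location.
    resp = athena.start_query_execution(
        QueryString="SELECT region, SUM(amount) FROM sales GROUP BY region",
        QueryExecutionContext={"Database": "analytics"},  # hypothetical
        ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},
    )
    print("query id:", resp["QueryExecutionId"])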
Posted 1 month ago
5.0 - 8.0 years
10 - 15 Lacs
Bengaluru
Remote
We are looking for a .NET developer with 5+ years of experience.
Posted 1 month ago
5.0 - 8.0 years
9 - 19 Lacs
Kolkata, Hyderabad, Bengaluru
Hybrid
Greetings from Teamware Solutions, a division of Quantum Leap Consulting Pvt. Ltd.

We are hiring a Chatbot and Telephony Test Engineer.
Work Mode: Hybrid
Locations: Hyderabad, Bengaluru, Mumbai, Kolkata
Experience: 5 - 8 Years
Notice Period: Immediate to 15 days
Skills: Chatbot, Telephony, Amazon Connect IVR, Amazon S3, Lambda, DynamoDB

Job Description:
- Test case preparation, test case execution, defect management, and reporting using QMetry and JIRA
- Chatbot, telephony, Amazon Connect IVR, Amazon S3, Lambda, DynamoDB
- Nice-to-have skills: Integrations (MuleSoft), API testing, Postman

Please let me know if you are interested in this position and send your resume to netra.s@twsol.com
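Since the role covers API testing alongside QMetry/JIRA reporting, here is a minimal pytest-style sketch of the kind of REST check involved; the endpoint and payload are hypothetical.

    import requests

    BASE = "https://api.example.com"  # hypothetical service under test

    def test_create_ivr_session_returns_200():
        resp = requests.post(f"{BASE}/ivr/session",
                             json={"caller": "+15555550100"}, timeout=10)
        assert resp.status_code == 200
        assert "session_id" in resp.json()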
Posted 1 month ago
12.0 - 15.0 years
16 - 18 Lacs
Bengaluru
Hybrid
iSource Services is hiring for one of their clients for the position of AWS Engineer.

We are looking for candidates with 12-15 years of AWS experience (not Azure or GCP) and hands-on expertise in design and implementation. You will design and develop data solutions, and design and implement efficient data processing pipelines using AWS services like AWS Glue, AWS Lambda, Amazon S3, and Amazon Redshift. Candidates should possess exceptional communication skills to engage effectively with US clients. The ideal candidate must be hands-on with significant practical experience. Availability to work overlapping US hours is essential. The contract duration is 6 months.
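As a sketch of the Glue-based pipeline work described, a minimal PySpark Glue job might look like this; the catalog database, table, and S3 target are hypothetical.

    import sys
    from awsglue.utils import getResolvedOptions
    from awsglue.context import GlueContext
    from pyspark.context import SparkContext

    args = getResolvedOptions(sys.argv, ["JOB_NAME"])
    glue_context = GlueContext(SparkContext.getOrCreate())

    # Read from the Glue Data Catalog (hypothetical database/table).
    dyf = glue_context.create_dynamic_frame.from_catalog(
        database="raw", table_name="events"
    )

    # Drop a noisy field and write curated Parquet to S3.
    glue_context.write_dynamic_frame.from_options(
        frame=dyf.drop_fields(["debug_payload"]),
        connection_type="s3",
        connection_options={"path": "s3://curated-bucket/events/"},
        format="parquet",
    )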
Posted 1 month ago
2.0 - 5.0 years
6 - 10 Lacs
Kochi
Work from Office
Job description:
Seeking a skilled & proactive Data Engineer with 2-4 years of experience to support our enterprise data warehousing and analytics initiatives. The candidate will be responsible for building scalable data pipelines, transforming data for analytics, and enabling data integration across cloud and on-premise systems.

Key Responsibilities:
- Build and manage data lakes and data warehouses using services like Amazon S3, Redshift, and Athena
- Design and build secure, scalable, and efficient ETL/ELT pipelines on AWS using services like Glue, Lambda, and Step Functions
- Work on SAP Datasphere to build and maintain Spaces, Data Builders, Views, and Consumption Layers
- Support data integration between AWS, Datasphere, and various source systems (SAP S/4HANA, non-SAP apps, flat files, etc.)
- Develop and maintain scalable data models and optimize queries for performance
- Monitor and optimize data workflows to ensure reliability, performance, and cost-efficiency
- Collaborate with Data Analysts and BI teams to provide clean, validated, and well-documented datasets
- Monitor, troubleshoot, and enhance data workflows and pipelines
- Ensure data quality, integrity, and governance policies are met

Required Skills:
- Strong SQL skills and experience with relational databases like MySQL or SQL Server
- Proficiency in Python or Scala for data transformation and scripting
- Familiarity with cloud platforms like AWS (S3, Redshift, Glue), Datasphere, and Azure

Good-to-Have Skills:
- AWS Certification: AWS Certified Data Analytics
- Exposure to modern data stack tools like Snowflake
- Experience in cloud-based projects and working in an Agile environment
- Understanding of data governance, security best practices, and compliance standards
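Complementing the Athena sketch in the similar listing above, here is the load step often used in such stacks: a Redshift COPY from S3 issued from Python. The cluster DSN, table, and IAM role ARN are hypothetical.

    import psycopg2

    COPY_SQL = """
        COPY analytics.orders
        FROM 's3://landing-bucket/orders/2025/06/'
        IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy'
        FORMAT AS PARQUET;
    """

    with psycopg2.connect(host="my-cluster.example.redshift.amazonaws.com",
                          port=5439, dbname="analytics",
                          user="loader", password="...") as conn:
        with conn.cursor() as cur:
            cur.execute(COPY_SQL)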
Posted 1 month ago
4.0 - 8.0 years
10 - 20 Lacs
Nagpur, Pune
Work from Office
Roles & Responsibilities:
- Develop and maintain backend services using Node.js
- Implement event-driven architectures with Apache Kafka or AWS Kinesis for real-time data processing
- Deploy and manage containerized applications using Docker
- Design and manage MongoDB databases for efficient data storage and retrieval
- Work with AWS services (e.g., Load Balancer, EC2, S3, Lambda, API Gateway) for scalable cloud solutions
- Integrate MQTT protocols for IoT and messaging-based applications
- Configure and maintain Linux-based production environments
- Read and analyse existing Java code for integration and troubleshooting purposes
- Implement secure authentication using OpenID Connect (OIDC)
- Collaborate with development teams to improve system reliability and performance

Must-Have Technical Skills:
- Strong proficiency in Node.js with experience in frameworks like Express.js or NestJS
- Hands-on experience with Apache Kafka and event-driven systems
- Experience with AWS services, including compute, storage, and networking solutions
- Docker & container orchestration experience for scalable deployments
- Experience with MongoDB, including schema design, indexing, and performance optimization
- Basic understanding of Java, with the ability to read and analyse code

Good to Have:
- Experience with Kubernetes for managing containers
- Analysis of Quarkus microservices to ensure best practices and efficiency
- Understanding of Terraform or CloudFormation for infrastructure setup
- Knowledge of serverless computing (AWS Lambda, Azure Functions)
- AWS Certification is a plus

Key Competencies:
- Excellent problem-solving skills and attention to detail
- Strong communication and teamwork skills
- Ability to work collaboratively in cross-functional teams
- Ability to write clean, well-documented, and efficient code
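The role is Node.js-centric, but as a language-neutral sketch of the Kafka consume-process-commit loop it describes, here is a Python version (using the kafka-python client); the topic, broker, and message fields are hypothetical.

    import json
    from kafka import KafkaConsumer

    consumer = KafkaConsumer(
        "device-telemetry",                    # hypothetical topic
        bootstrap_servers="kafka:9092",
        group_id="telemetry-processors",
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
        enable_auto_commit=False,              # commit only after processing
    )

    for msg in consumer:
        reading = msg.value
        # ... persist to MongoDB, publish alerts, etc. ...
        print(f"{reading['device_id']}: {reading['temp_c']} C")
        consumer.commit()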
Posted 2 months ago
1.0 years
4 - 4 Lacs
Bangalore, Karnataka, IN
On-site
About the job:
Key responsibilities:
1. Build big, robust, scalable, and maintainable applications
2. Debug, fix bugs, identify performance issues, and improve app performance
3. Set up and manage AWS EC2 instances, S3 buckets, and related services
4. Manage server configurations and deployments
5. Continuously discover, evaluate, and implement new technologies to maximize development efficiency
6. Handle complex technical issues related to web app development and discuss solutions with the team
7. Manage and monitor application deployment using DevOps tools
8. Configure and maintain AWS EC2 instances
9. Set up and manage Docker containers
10. Implement automated deployment pipelines
11. Monitor application performance and server health
12. Manage database backups and recovery procedures
13. Handle server security and SSL certificate management

Requirements:
1. Proficient understanding of Python, with knowledge of the Django web framework
2. Knowledge of version control systems like Git
3. Understanding of the threading limitations of Python and multi-process architecture
4. Understanding of fundamental design principles behind a scalable application
5. Experience with AWS services (EC2, S3, CloudWatch) and basic Linux commands
6. Knowledge of Docker and container orchestration
7. Basic understanding of CI/CD pipelines and DevOps practices
8. Familiarity with deployment strategies and server configuration

Who can apply: Only those candidates can apply who:
- have a minimum of 1 year of experience
- are Computer Science Engineering students

Salary: ₹ 4,20,000 - 4,50,000 /year
Experience: 1 year(s)
Deadline: 2025-07-05 23:59:59
Other perks: Informal dress code, 5 days a week, free snacks & beverages, health insurance
Skills required: Python, SQL, Django, Docker, GitHub, Amazon EC2, Amazon S3 and Amazon CloudWatch

About Company: Trade Brains is a financial website helping readers learn the art of stock investing, trading, portfolio management, financial planning, money management, and more. At FinGrad (an initiative by Trade Brains), we offer the best online courses, webinars, and resources from various top experts who have real skin in the financial game. FinGrad has been built with the aim of delivering end-to-end financial education at our best standard to our novice investors & traders.
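As a small illustration of the S3 side of these responsibilities, the sketch below issues a time-limited presigned download URL with boto3; the bucket and key are hypothetical.

    import boto3

    s3 = boto3.client("s3")

    # Presigned URL: lets a client download a private object for one hour.
    url = s3.generate_presigned_url(
        "get_object",
        Params={"Bucket": "app-user-uploads",           # hypothetical bucket
                "Key": "reports/2025/summary.pdf"},
        ExpiresIn=3600,
    )
    print(url)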
Posted 2 months ago
0.0 years
3 - 4 Lacs
IN
Remote
About the job:
Key responsibilities:
1. Design and develop scalable backend systems and APIs for our AI-powered SaaS platform using Python and Node.js
2. Build and maintain cloud infrastructure on AWS, including configuration and management of S3, DynamoDB, SNS, EC2, and CloudWatch services
3. Implement and optimize data processing pipelines for machine learning model deployment and integration
4. Collaborate with data scientists to integrate AI models into production systems and ensure efficient model serving
5. Deploy and monitor applications using DevOps practices and LLMOps for large language model implementations
6. Create robust API endpoints that connect our frontend applications with AI functionalities
7. Design and implement efficient database schemas and queries optimized for AI applications
8. Develop and maintain secure authentication and authorization systems for our platform
9. Write clean, maintainable, and well-tested code following best practices
10. Troubleshoot and resolve complex technical issues in production environments

Additional candidate preferences:
1. Computer Science or related Engineering degree preferred
2. Experience with containerization technologies like Docker
3. Familiarity with AI model serving platforms and ML workflows

Who can apply: Only those candidates can apply who:
- are Computer Science Engineering students

Salary: ₹ 3,10,000 - 4,60,000 /year
Experience: 0 year(s)
Deadline: 2025-06-16 23:59:59
Other perks: 5 days a week
Skills required: Python, Node.js, Artificial intelligence, DevOps, Amazon EC2, Amazon S3, Amazon CloudWatch, Amazon SNS, Amazon DynamoDB and LLMOps

About Company: Smartify is a marketplace for automation companies and also India's leading home automation store. We are trying to reduce the knowledge-execution gap and encourage early adopters in the IoT space to launch their products and get to the mainstream market.
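For illustration, the DynamoDB piece of such a backend might look like the sketch below; the table name, keys, and attributes are hypothetical.

    import boto3

    dynamodb = boto3.resource("dynamodb")
    table = dynamodb.Table("inference_jobs")  # hypothetical table

    # Record a queued inference job, then read its status back.
    table.put_item(Item={"job_id": "job-123",
                         "model": "sentiment-v2",
                         "status": "queued"})

    resp = table.get_item(Key={"job_id": "job-123"})
    print(resp["Item"]["status"])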
Posted 2 months ago
6 - 8 years
10 - 12 Lacs
Hyderabad, Gurugram, Bengaluru
Work from Office
We are hiring a Senior AWS Engineer with deep, hands-on experience across critical AWS services. This is not a deployment-only role; we are looking for someone who has built real systems, owned the backend logic, and debugged live issues under pressure.

You should have worked extensively with each of the following:
- AWS Lambda: must have created functions from scratch, handled orchestration, and optimized for performance and cost.
- Amazon DynamoDB: practical experience with schema design, query optimization, and throughput tuning.
- Amazon SQS & SNS: integration for asynchronous processing, error handling, dead-letter queues, and fan-out patterns.
- Amazon S3: secure and optimized usage, including versioning, lifecycle rules, and event triggers.
- Amazon ECS & EC2: containerized workloads; launching, configuring, and managing clusters; auto-scaling; IAM role assignment.

Key Responsibilities:
- Build and maintain Lambda functions with clean, production-ready code
- Integrate and manage AWS services (SQS, SNS, DynamoDB, S3, ECS/EC2)
- Handle scripting and automation (Python, Bash, or Node.js preferred)
- Collaborate with other developers using Agile (Scrum) workflows
- Debug and resolve production issues using CloudWatch and related tools
- Focus on performance, security, and stability of AWS-powered applications

What We're Not Looking For:
- Profiles with only deployment or environment setup experience
- Passive usage of services without ownership or debugging exposure
- General cloud familiarity without specific implementation depth

Must-Have Skills:
- AWS Lambda
- DynamoDB
- SQS, SNS
- S3
- ECS, EC2
- Infrastructure scripting (Python / Bash / Node.js)
- Clear understanding of IAM, roles, policies, and cross-service permissions
- Hands-on debugging and log analysis using CloudWatch

Nice to Have:
- AWS certifications
- Familiarity with CI/CD pipelines
- Experience with Step Functions, EventBridge, or advanced serverless patterns

Why Join Us:
This isn't just another cloud role: this is about owning your stack. You'll work with a team that values signal over noise, trusts your decisions, and expects you to lead delivery, not follow instructions. If you're someone who doesn't wait for instructions but reads the room, senses the stack, and moves, we'd love to speak with you.
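As a sketch of the SQS-plus-DLQ pattern this posting stresses, here is a minimal SQS-triggered Lambda handler using partial batch responses (which requires ReportBatchItemFailures to be enabled on the event source mapping); the business logic is a hypothetical placeholder.

    import json

    def process(payload: dict) -> None:
        """Hypothetical business logic."""
        print("processing", payload)

    def handler(event, context):
        # Report only the failed records; successes are deleted from the
        # queue, failures are retried and eventually routed to the DLQ.
        failures = []
        for record in event["Records"]:
            try:
                process(json.loads(record["body"]))
            except Exception:
                failures.append({"itemIdentifier": record["messageId"]})
        return {"batchItemFailures": failures}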
Posted 3 months ago
7916 Jobs | Paris,France