8.0 - 13.0 years
15 - 30 Lacs
Gurugram
Work from Office
Hiring for .NET Core Developer Lead. Salary up to 35 LPA. Graduate with a minimum of 7 years' experience, including at least 3 years as a .NET Core developer with AWS service tools and microservices. Shift: 1-10 PM, 5 days work from office. Immediate joiners. Call / WhatsApp Anisha # 9972310991
Posted 3 days ago
5.0 - 9.0 years
18 - 24 Lacs
Hyderabad
Work from Office
Responsibilities: * Design, develop & maintain scalable software solutions using Python, Django & AWS. * Collaborate with cross-functional teams on project requirements & deliverables.
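For illustration only (not part of the posting): a minimal sketch of the Django-plus-AWS combination this role describes, assuming a hypothetical upload view and bucket name.

```python
# Hypothetical example: a Django view that stores an uploaded file in S3 via boto3.
# The bucket name and URL route are placeholders, not taken from the job posting.
import boto3
from django.http import JsonResponse
from django.views.decorators.http import require_POST

s3 = boto3.client("s3")
UPLOAD_BUCKET = "example-uploads-bucket"  # assumed bucket name

@require_POST
def upload_document(request):
    uploaded = request.FILES["document"]                 # file field sent by the client
    s3.upload_fileobj(uploaded, UPLOAD_BUCKET, uploaded.name)
    return JsonResponse({"stored_key": uploaded.name})
```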
Posted 4 days ago
4.0 - 6.0 years
6 - 10 Lacs
Pune
Work from Office
Role & responsibilities: Manage AEM infrastructure and CI/CD pipelines using Jenkins, Git, and Adobe Cloud Manager for efficient code deployments and environment management. Automate provisioning, scaling, and monitoring of AEM instances using tools like Ansible, Terraform, or cloud-native services (AWS/GCP/Azure). Required skill areas: 1. AWS Lambda, ECS Fargate, EC2, Load Balancers, Security Groups, RDS, DynamoDB. 2. Scripting – Node.js, Python, shell scripting. 3. Pipelines – GitHub Actions and Azure Pipelines. 4. Monitoring – New Relic alerts, dashboards and synthetic monitoring; ELK stack.
Posted 4 days ago
5.0 - 10.0 years
20 - 25 Lacs
Bengaluru
Hybrid
Company Description
Epsilon is an all-encompassing global marketing innovator, supporting 15 of the top 20 global brands. We provide unrivaled data intelligence and customer insights, world-class technology including loyalty, email and CRM platforms, and data-driven creative, activation and execution. Epsilon's digital media arm, Conversant, is a leader in personalized digital advertising and insights through its proprietary technology and trove of consumer marketing data, delivering digital marketing with unprecedented scale, accuracy and reach through personalized media programs and through CJ Affiliate by Conversant, one of the world's largest affiliate marketing networks. Together, we bring personalized marketing to consumers across offline and online channels, at moments of interest, that help drive business growth for brands. Recognized by Ad Age as the #1 World's Largest CRM/Direct Marketing Agency Network, #1 Largest U.S. Agency from All Disciplines, #1 Largest U.S. CRM/Direct Marketing Agency Network and #1 Largest U.S. Mobile Marketing Agency, Epsilon employs over 8,000 associates in 70 offices worldwide. Epsilon is part of Alliance Data, a Fortune 500 company and one of Fortune's 100 Best Companies to Work For. For more information, visit www.epsilon.com and follow us on Twitter @EpsilonMktg.
Job Description
About BU: The Product team forms the crux of our powerful platforms and connects millions of customers to the product magic. This team of innovative thinkers develops and builds products that help Epsilon be a market differentiator. They map the future and set new standards for our products, empowered with industry best practices, ML and AI capabilities. The team passionately delivers intelligent end-to-end solutions and plays a key role in Epsilon's success story.
Why we are looking for you: We are looking for a Senior Software Engineer to work on a groundbreaking multichannel SaaS digital marketing platform that uniquely identifies customer patterns, interacts with customers effectively across channels, and achieves a positive return on marketing investment (ROMI). The platform consolidates and integrates the features and functionality typically found in stand-alone services and channel-specific messaging platforms to give marketers a tightly integrated, easily orchestrated, insights-driven, cross-channel marketing capability. The primary role of the Senior Software Engineer is to envision and build internet-scale services on the cloud using Java and distributed technologies, with roughly a 60-40 split between backend development in Java and frontend development in Angular.
What you will enjoy in this role: Tech stack: Our integrated suite of modular products is designed to help deliver personalized experiences and drive meaningful outcomes. Our tech stack caters to a fusion of data and technology, with SaaS offerings developed cloud-first. Here, a solid understanding of software security practices, including user authentication and authorization, and being data-savvy are key. You should also be able to leverage best practices in design patterns and design algorithms for software development with a focus on high quality and agility. A good understanding of Agile methodologies such as Scrum is also required.
What you will do: Be responsible for development and maintenance of applications built with Java and distributed technologies. Collaborate with developers, product managers, business analysts and business users in conceptualizing, estimating and developing new software applications and enhancements. Assist in the development and documentation of software objectives, deliverables, and specifications in collaboration with internal users and departments. Collaborate with the QA team to define test cases and metrics and resolve questions about test results. Assist in the design and implementation process for new products; research and create POCs for possible solutions. Develop components based on business and/or application requirements. Create unit tests in accordance with team policies and procedures. Advise and mentor team members in specialized technical areas, and fulfill administrative duties as defined by the support process. Create value-adds that contribute to cost optimization, scalability, reliability and secure solutions.
Qualifications: Bachelor's degree or equivalent in computer science. 6+ years of experience in Java/Angular/SQL/AWS/microservices. Preferred knowledge/experience in the following technologies: 2+ years of UI technologies like Angular 2 or above; 1+ year of experience in cloud computing such as AWS, Azure, GCP, PCF or OCI; experience with tools such as Eclipse, Maven, Gradle, DB tools, and Bitbucket/JIRA/Confluence. Can develop SOA services with good knowledge of REST APIs and microservice architectures. Solid knowledge of web architectural and design patterns. Understands software security practices including user authentication and authorization, data validation, and common DoS and SQL injection techniques. Familiar with profiling, code coverage, logging, common IDEs and other development tools. Familiar with Agile methodologies (Scrum). Strong communication skills (verbal and written). Ability to work within tight deadlines and effectively prioritize and execute tasks in a high-pressure environment. Demonstrated verbal and written communication skills, and ability to interface with Business, Analytics and IT organizations. Ability to work effectively in a short-cycle, team-oriented environment, managing multiple priorities and tasks. Ability to identify non-obvious solutions to complex problems.
Posted 6 days ago
4.0 - 6.0 years
6 - 11 Lacs
Kolkata, Pune, Chennai
Work from Office
Job Title: Data Scientist
Location State: Maharashtra, Tamil Nadu, West Bengal
Location City: Pune, Chennai, Kolkata
Experience Required: 4 to 6 years
CTC Range: 6 to 11 LPA
Shift: Day shift
Work Mode: Onsite
Position Type: C2H
Openings: 4
Company Name: VARITE INDIA PRIVATE LIMITED
About The Client: The client is an Indian multinational technology company specializing in information technology services and consulting. Headquartered in Mumbai, it is part of the Tata Group and operates in 150 locations across 46 countries.
About The Job:
Job Title: Developer
Work Location: Kolkata (WB) / Chennai (TN) / Pune (MH)
Skills Required: Digital: Python; Digital: AWS cloud computing
Experience Range in Required Skills: 4-6 years
Job Description: Language: Python. Cloud platform: AWS (Lambda, EC2, S3). DevOps tools: GitLab. Data/ML: NumPy, Pandas. 3+ years of hands-on development experience.
Essential Job Functions and Qualifications: Same skill set as above - Python, AWS (Lambda, EC2, S3), GitLab, NumPy and Pandas, with 3+ years of hands-on development experience.
How to Apply: Interested candidates are invited to submit their resume using the apply online button on this job post.
About VARITE: VARITE is a global staffing and IT consulting company providing technical consulting and team augmentation services to Fortune 500 companies in the USA, UK, Canada and India. VARITE is currently a primary and direct vendor to leading corporations in the verticals of Networking, Cloud Infrastructure, Hardware and Software, Digital Marketing and Media Solutions, Clinical Diagnostics, Utilities, Gaming and Entertainment, and Financial Services.
Equal Opportunity Employer: VARITE is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. We do not discriminate on the basis of race, color, religion, sex, sexual orientation, gender identity or expression, national origin, age, marital status, veteran status, or disability status.
Unlock Rewards: Refer Candidates and Earn. If you're not available or interested in this opportunity, please pass this along to anyone in your network who might be a good fit and interested in our open positions. VARITE offers a Candidate Referral program, where you'll receive a one-time referral bonus based on the following scale if the referred candidate completes a three-month assignment with VARITE.
Experience Required - Referral Bonus
0 - 2 Yrs. - INR 5,000
2 - 6 Yrs. - INR 7,500
6+ Yrs. - INR 10,000
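A minimal sketch (not from the posting) of the Python/AWS/pandas combination this role lists, assuming the Lambda event carries the bucket and key of a CSV file.

```python
# Hypothetical AWS Lambda handler: read a CSV from S3 into pandas and return basic stats.
# The event shape ({"bucket": ..., "key": ...}) is an assumption for illustration.
import io
import boto3
import pandas as pd

s3 = boto3.client("s3")

def lambda_handler(event, context):
    obj = s3.get_object(Bucket=event["bucket"], Key=event["key"])
    df = pd.read_csv(io.BytesIO(obj["Body"].read()))
    return {"rows": int(len(df)), "columns": list(df.columns)}
```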
Posted 1 week ago
3.0 - 7.0 years
5 - 10 Lacs
Kolkata
Work from Office
AWS certification required, with experience in IAM, VPC, ELB, ALB, Auto Scaling, and Lambda. Should know EC2, EKS, ECS, ECR, Route 53, SES, ElastiCache, RDS, and Redshift. Strong in serverless development architecture. Build and maintain highly available production systems.
Posted 1 week ago
2.0 - 6.0 years
8 - 18 Lacs
Gurugram
Remote
Role Characteristics: The Analytics team provides analytical support to multiple stakeholders (Product, Engineering, Business Development, Ad Operations) by developing scalable analytical solutions, identifying problems, defining KPIs and monitoring them to measure the impact/success of product improvements and changes, and streamlining processes. This is an exciting and challenging role that will enable you to work with large data sets, expose you to cutting-edge analytical techniques, let you work with the latest AWS analytics infrastructure (Redshift, S3, Athena), and give you experience in the usage of location data to drive businesses. Working in a dynamic start-up environment will give you significant opportunities for growth within the organization. A successful applicant will be passionate about technology and developing a deep understanding of human behavior in the real world. They will also have excellent communication skills, be able to synthesize and present complex information, and be a fast learner.
You Will: Perform root cause analysis with minimal guidance to figure out reasons for sudden changes or abnormalities in metrics. Understand the objective/business context of various tasks and seek clarity by collaborating with different stakeholders (such as Product and Engineering). Derive insights and put them together to build a story that solves a given problem. Suggest process improvements in terms of script optimization and automating repetitive tasks. Create and automate reports and dashboards through Python to track metrics based on given requirements.
Technical Skills (Must have): B.Tech degree in Computer Science, Statistics, Mathematics, Economics or related fields. 4-6 years of experience working with data and conducting statistical and/or numerical analysis. Ability to write SQL code. Scripting/automation using Python. Hands-on experience with a data visualisation tool like Looker/Tableau/QuickSight. Basic to advanced understanding of statistics.
Other Skills (Must have): Willing and able to quickly learn about new businesses, database technologies and analysis techniques. Strong oral and written communication. Ability to understand patterns/trends and draw insights from them.
Preferred Qualifications (Nice to have): Experience working with large datasets. Experience with AWS analytics infrastructure (Redshift, S3, Athena, Boto3). Hands-on experience with AWS services like Lambda, Step Functions, Glue, and EMR, plus exposure to PySpark.
What we offer: At GroundTruth, we want our employees to be comfortable with their benefits so they can focus on doing the work they love. Parental leave (maternity and paternity). Flexible time off (earned leave, sick leave, birthday leave, bereavement leave and company holidays). In-office daily catered lunch. Fully stocked snacks/beverages. Health cover for any hospitalization, covering both the nuclear family and parents. Tele-med for free doctor consultation, plus discounts on health checkups and medicines. Wellness/gym reimbursement. Pet expense reimbursement. Childcare expenses and reimbursements. Employee assistance program. Employee referral program. Education reimbursement program. Skill development program. Cell phone reimbursement (mobile subsidy program). Internet reimbursement. Birthday treat reimbursement. Employee Provident Fund Scheme offering different tax saving options such as VPF and employee and employer contribution up to 12% of basic. Creche reimbursement. Co-working space reimbursement. NPS employer match. Meal card for tax benefit. Special benefits on salary account.
We are an equal opportunity employer and value diversity, inclusion and equity at our company. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.
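As a rough illustration of the report automation mentioned in the posting above, here is a minimal boto3 sketch that starts an Athena query and waits for it to finish; the database name and results location are assumptions, not details from the listing.

```python
# Hypothetical sketch: run an Athena query with boto3 and poll for completion.
import time
import boto3

athena = boto3.client("athena")

def run_athena_query(sql,
                     database="analytics_db",                    # assumed database name
                     output="s3://example-athena-results/"):     # assumed results location
    qid = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": database},
        ResultConfiguration={"OutputLocation": output},
    )["QueryExecutionId"]
    while True:
        state = athena.get_query_execution(
            QueryExecutionId=qid)["QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            return qid, state
        time.sleep(2)
```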
Posted 1 week ago
8.0 - 12.0 years
30 - 40 Lacs
Bengaluru
Work from Office
Work Location: Bangalore (in-office)
We're seeking hands-on Lead Software Engineers who can build production-ready AI solutions on AWS and guide distributed teams while continuously scouting and prototyping emerging AI technologies. You'll join our Product Engineering group, owning end-to-end delivery of next-generation Amorphic AI solutions while mentoring engineers across global teams.
What is Expected:
Technical Leadership: Lead the integration of cutting-edge AI/LLMs into Amorphic AI solutions, ensuring seamless interoperability and optimal performance. Design and architect complex software systems with a focus on scalability, maintainability, and performance. Architect production-grade RAG pipelines and multi-agent orchestration on AWS (Lambda, ECS/Fargate, Bedrock, SageMaker, DynamoDB, S3, EventBridge, Step Functions). Drive the design and implementation of scalable AI pipelines.
Development & Innovation: Design, develop, test, and maintain scalable backend applications using Python and AWS services. Stay current with AI advancements through hands-on experimentation with emerging frameworks (LangChain, Hugging Face Transformers, CrewAI) via prototypes and side projects. Optimize AI solution performance, focusing on cost-effectiveness, latency, and resource utilization. Develop strategies for monitoring, maintaining, and improving deployed AI models in production.
Team Leadership: Lead 5-10 engineers through design reviews, pair programming, and PR feedback. Conduct code reviews and design discussions to ensure adherence to best practices. Collaborate with cross-functional teams globally to identify requirements and implement solutions. Create and maintain comprehensive documentation for architecture, design decisions, and coding practices.
Preferred Candidate Profile: BE/B.Tech in Computer Science or a related field. 8+ years of experience in software development. Solid understanding of large language models (LLMs), including experience with prompt engineering, fine-tuning, or integrating LLM APIs (e.g., from OpenAI, Anthropic, or AWS Bedrock). Hands-on experience building AI solutions using the latest tools and frameworks (e.g., LangChain, CrewAI), demonstrated through side projects, open-source contributions, or personal prototypes. Proven leadership experience in managing and mentoring high-performing teams of software and application developers. Exceptional proficiency in the Python programming language. Solid understanding of the AWS ecosystem, including Lambda functions, S3 buckets, EMR clusters, DynamoDB tables, etc. Proven experience in a leadership role, leading software development teams in the delivery of complex projects. Deep understanding of software architecture and design principles, with a focus on building scalable and maintainable systems. Experience with distributed systems, microservices architecture, and cloud-based solutions. Strong knowledge of software development best practices, including code reviews, testing, and CI/CD pipelines. Experience working with AWS services and developing cloud-native applications using REST APIs is a must-have. Experience working in an agile delivery environment, especially product engineering teams.
How We'll Take Care Of You: We believe in supporting our team members both professionally and personally. Here's a look at the comprehensive benefits and perks we offer:
Financial Well-being & Security: Competitive compensation - enjoy competitive salaries and bonuses that reward your hard work and dedication. Robust insurance coverage - benefit from health, life, and disability insurance to ensure you and your family are protected. Provident fund eligibility - secure your future with eligibility for the provident fund.
Work-Life Balance & Flexibility: Flexible working hours to help you manage your personal and professional commitments. Generous paid time off - unlimited PTO, with a mandatory minimum of 1 week per year to ensure you recharge. Comprehensive leave policies - paid vacation days, sick leave, and holidays, plus supportive parental leave (maternity, paternity, and adoption) and bereavement leave when you need it most.
Professional Growth & Development: Learning and development - elevate your skills with access to extensive certification and training programs. Cutting-edge technologies - you'll work at the forefront of innovation, constantly igniting your passion for continuous learning and growth.
Culture & Community: Recognition and rewards - your contributions won't go unnoticed with our recognition and reward programs. Engaging activities - connect with your colleagues through company-sponsored events, outings, team-building activities, and retreats.
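For context only: a minimal, hedged sketch of calling an AWS Bedrock text model from Python, the kind of building block the RAG-pipeline work above would sit on. The model ID and request body format are assumptions; each Bedrock model defines its own schema.

```python
# Hypothetical sketch: invoke a Bedrock text model with boto3.
# Model ID and request body are placeholders; adapt them to the chosen model's schema.
import json
import boto3

bedrock = boto3.client("bedrock-runtime")

def ask_model(prompt, model_id="example.text-model-v1"):   # placeholder model ID
    response = bedrock.invoke_model(
        modelId=model_id,
        contentType="application/json",
        accept="application/json",
        body=json.dumps({"prompt": prompt, "max_tokens": 256}),
    )
    return json.loads(response["body"].read())
```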
Posted 1 week ago
6.0 - 11.0 years
9 - 19 Lacs
Noida
Work from Office
We are looking for a skilled Machine Learning Engineer with strong expertise in Natural Language Processing (NLP) and AWS cloud services to design, develop, and deploy scalable ML models and pipelines. You will play a key role in building innovative NLP solutions for classification, forecasting, and recommendation systems, leveraging cutting-edge technologies to drive data-driven decision-making in the US healthcare domain.
Key Responsibilities: Design and deploy scalable machine learning models focused on NLP tasks, classification, forecasting, and recommender systems. Build robust, end-to-end ML pipelines encompassing data ingestion, feature engineering, model training, validation, and production deployment. Apply advanced NLP techniques including sentiment analysis, named entity recognition (NER), embeddings, and document parsing to extract actionable insights from healthcare data. Utilize AWS services such as SageMaker, Lambda, Comprehend, and Bedrock for model training, deployment, monitoring, and optimization. Collaborate effectively with cross-functional teams including data scientists, software engineers, and product managers to integrate ML solutions into existing products and workflows. Implement MLOps best practices for model versioning, automated evaluation, CI/CD pipelines, and continuous improvement of deployed models. Leverage Python and ML/NLP libraries including scikit-learn, PyTorch, Hugging Face Transformers, and spaCy for daily development tasks. Research and explore advanced NLP/ML techniques such as Retrieval-Augmented Generation (RAG) pipelines, foundation model fine-tuning, and vector search methods for next-generation solutions.
Required Qualifications: Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field. 6+ years of professional experience in machine learning, with a strong focus on NLP and AWS cloud services. Hands-on experience in designing and deploying production-grade ML models and pipelines. Strong programming skills in Python and familiarity with ML/NLP frameworks like PyTorch, Hugging Face, spaCy, and scikit-learn. Proven experience with the AWS ML ecosystem: SageMaker, Lambda, Comprehend, Bedrock, and related services. Solid understanding of MLOps principles including version control, model monitoring, and automated deployment. Experience working in the US healthcare domain is a plus. Excellent problem-solving skills and ability to work collaboratively in an agile environment.
Preferred Skills: Familiarity with advanced NLP techniques such as RAG pipelines and foundation model tuning. Knowledge of vector databases and semantic search technologies. Experience with containerization (Docker, Kubernetes) and cloud infrastructure automation. Strong communication skills with the ability to translate complex technical concepts to non-technical stakeholders.
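A small illustrative sketch (not part of the posting) of the named entity recognition work this role describes, using spaCy; it assumes the en_core_web_sm model has already been downloaded.

```python
# Hypothetical NER sketch with spaCy; run `python -m spacy download en_core_web_sm` first.
import spacy

nlp = spacy.load("en_core_web_sm")

def extract_entities(text):
    doc = nlp(text)
    # Return (entity text, entity label) pairs, e.g. ("Monday", "DATE")
    return [(ent.text, ent.label_) for ent in doc.ents]

print(extract_entities("The patient was admitted to Mercy Hospital on Monday."))
```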
Posted 1 week ago
5.0 - 9.0 years
5 - 20 Lacs
Bengaluru
Work from Office
Roles and Responsibilities : Design, develop, test, deploy and maintain scalable and secure AWS-based solutions for clients. Collaborate with cross-functional teams to identify business requirements and translate them into technical designs. Troubleshoot complex issues related to AWS services such as EC2, S3, Lambda functions etc. Ensure compliance with security best practices and industry standards in the design of cloud architectures. Job Requirements : 5-9 years of experience in AWS Cloud Engineering or similar role. Strong understanding of Python programming language and its application in AWS services like Lambda functions. Experience working with EventBridge or other messaging queues (e.g. Kinesis) is an added advantage.
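An illustrative sketch only, assuming an EventBridge-triggered Lambda along the lines the posting mentions: the handler archives the event detail to a hypothetical S3 bucket.

```python
# Hypothetical Lambda handler for an EventBridge event; the bucket name is a placeholder.
import json
import boto3

s3 = boto3.client("s3")
ARCHIVE_BUCKET = "example-event-archive"   # assumed bucket name

def lambda_handler(event, context):
    detail = event.get("detail", {})                       # EventBridge payload
    key = f"events/{event.get('id', 'unknown')}.json"      # EventBridge events carry an "id"
    s3.put_object(Bucket=ARCHIVE_BUCKET, Key=key,
                  Body=json.dumps(detail).encode("utf-8"))
    return {"archived": key}
```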
Posted 1 week ago
2.0 - 3.0 years
3 - 4 Lacs
Mumbai
Work from Office
We are looking for a Python Developer with expertise in Python, SQL (PostgreSQL), Pandas, NumPy, Django, and AWS services (Lambda, S3, RDS, EC2). Be part of developing and optimizing scalable solutions, enhancing performance and driving innovation. Benefits: Provident fund.
Posted 2 weeks ago
3.0 - 7.0 years
15 - 25 Lacs
Pune, Gurugram, Bengaluru
Hybrid
Salary: 15-20 LPA. Experience: 3 to 5 years. Location: Gurgaon/Pune. Notice: Immediate to 15 days.
Job Title: AWS DevOps Engineer
Job Description: We are seeking a highly skilled AWS DevOps Engineer with extensive experience in Chef and CloudWatch. The ideal candidate will have a strong background in cloud infrastructure, automation, and monitoring.
Key Responsibilities: Design, implement, and manage scalable and reliable cloud infrastructure on AWS. Automate provisioning, configuration management, and deployment using Chef. Monitor system performance and reliability using AWS CloudWatch and other monitoring tools. Develop and maintain CI/CD pipelines to ensure smooth and efficient software releases. Collaborate with development, QA, and operations teams to ensure high availability and reliability of applications. Troubleshoot and resolve infrastructure and application issues in a timely manner. Implement security best practices and ensure compliance with industry standards. Optimize infrastructure for cost and performance. Maintain documentation related to infrastructure, processes, and procedures.
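For illustration (not from the posting): a minimal boto3 sketch publishing a custom CloudWatch metric, the kind of monitoring hook the role above works with; the namespace, metric name, and dimension are assumptions.

```python
# Hypothetical sketch: publish a custom metric to CloudWatch with boto3.
import boto3

cloudwatch = boto3.client("cloudwatch")

def report_deploy_duration(seconds, environment="staging"):   # assumed dimension value
    cloudwatch.put_metric_data(
        Namespace="Example/Deployments",                       # assumed namespace
        MetricData=[{
            "MetricName": "DeployDurationSeconds",             # assumed metric name
            "Dimensions": [{"Name": "Environment", "Value": environment}],
            "Value": float(seconds),
            "Unit": "Seconds",
        }],
    )
```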
Posted 2 weeks ago
5.0 - 10.0 years
10 - 20 Lacs
Bengaluru
Work from Office
Hiring for a FAANG company. Note: This position is open only for women professionals returning to the workforce after a career break (9+ months career gap, e.g., last working day prior to Nov 2024). We encourage you to apply only if you fit these criteria.
Position Overview: This is a Level 5 Data Engineer role within a leading e-commerce organization's Selling Partner Services division in India. The position focuses on building and scaling API authorization and customization systems that serve thousands of global selling partners. This is a senior-level position requiring significant technical expertise and leadership capabilities.
Team Context & Mission: Organization - Selling Partner Services division. Focus - API authorization and customization systems for global selling partners. Mission - create flexible, reliable, and extensible API solutions to help businesses thrive on the platform. Culture - startup excitement with enterprise-level resources and scale. Impact - direct influence on thousands of global selling partners.
Key Responsibilities:
Technical Leadership - Lead design and implementation of complex data pipelines and ETL processes. Architect scalable, high-performance data systems using cloud technologies and big data platforms. Evaluate and recommend new technologies and tools for data infrastructure enhancement. Troubleshoot and resolve complex data-related issues in production environments.
Collaboration & Stakeholder Management - Work closely with data scientists, analysts, and business stakeholders. Understand data requirements and implement appropriate solutions. Contribute to the development of data governance policies and procedures.
Performance & Quality Optimization - Optimize data storage and retrieval systems for performance and cost-effectiveness. Implement data quality checks and monitoring systems. Ensure data integrity and reliability across all systems.
Mentorship & Leadership - Mentor junior engineers on the team. Provide technical leadership on data engineering best practices and methodologies. Drive adoption of industry standards and innovative approaches.
Required Qualifications (Must-Have): 5+ years of data engineering experience (senior-level expertise expected). 5+ years of SQL experience (advanced SQL skills for complex data manipulation). Data modeling, warehousing, and ETL pipeline building as core competencies. Distributed systems knowledge (understanding of data storage and computing in distributed environments). Advanced proficiency in designing and implementing data solutions. Strong understanding of data architecture principles. Experience with production-level data systems. Knowledge of data governance and quality assurance practices.
Preferred Qualifications:
Cloud Technology Stack - Data warehousing: Redshift, Snowflake, BigQuery. Object storage: S3, Azure Blob, Google Cloud Storage. ETL services: AWS Glue, Azure Data Factory, Google Dataflow. Big data processing: EMR, Databricks, Apache Spark. Real-time streaming: Kinesis, Kafka, Apache Storm. Data delivery: Firehose, Apache NiFi. Serverless computing: Lambda, Azure Functions, Google Cloud Functions. Identity management: IAM, Active Directory, role-based access control.
Non-Relational Database Experience - Object storage: S3, blob storage systems. Document stores: MongoDB, CouchDB. Key-value stores: Redis, DynamoDB. Graph databases: Neo4j, ArangoDB. Column-family: Cassandra, HBase.
Key Success Factors: Scalability focus - building systems that can handle massive enterprise scale. Performance optimization - continuous improvement of system efficiency. Quality assurance - maintaining high data quality and reliability standards. Innovation - staying current with emerging technologies and best practices. Collaboration - effective partnership with stakeholders across the organization.
This role represents a significant opportunity for a senior data engineer to make a substantial impact on a global e-commerce seller ecosystem while working with cutting-edge technologies and leading a team of talented professionals.
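As a rough, assumption-laden sketch of the ETL work described above: a small PySpark job that reads partitioned data and writes an aggregate back out. Paths and column names are placeholders, not details from the listing.

```python
# Hypothetical PySpark ETL sketch; input/output paths and column names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-daily-rollup").getOrCreate()

orders = spark.read.parquet("s3://example-bucket/orders/")        # assumed source path
daily = (orders
         .groupBy("order_date", "marketplace")                    # assumed columns
         .agg(F.count("*").alias("order_count"),
              F.sum("order_total").alias("revenue")))
daily.write.mode("overwrite").parquet("s3://example-bucket/rollups/daily/")  # assumed sink
```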
Posted 2 weeks ago
6.0 - 8.0 years
10 - 18 Lacs
Pune
Hybrid
Roles and Responsibilities
Develop security requirements and standards in collaboration with technical teams to safeguard the corporate computing enterprise as well as customer-facing cloud infrastructure, applications, and data. Support day-to-day execution of security processes in areas related to perimeter and endpoint security, cloud security posture management, vulnerability management, security observability, and security operations. Leverage coding skills and experience working with Infrastructure as Code (IaC) pipelines to develop, manage, and continuously audit cloud security solutions and safeguards on the AWS platform. Develop tools to integrate security systems and automate workflows for Global Information Security teams across enterprise/corporate IT and customer-facing product computing environments. Champion or strongly contribute to security initiatives and projects.
Skills
4+ years of experience in security engineering, security operations, or systems engineering with cyber security engineering responsibilities. Bachelor's degree in Computer Science or a related technology degree. Proficiency with common security infrastructure and data protection tools and technologies: SIEM, DLP, vulnerability scanners, EDR/NGAV, UTM, SASE/CASB, and/or WAF. Experience with the AWS platform and services such as Config, Security Hub, Lambda, CloudWatch, CloudTrail, S3, WAF, GuardDuty, Shield, and others. Experience coding in Python, Java, and/or JavaScript/TypeScript. Experience deploying and managing infrastructure in AWS using Infrastructure as Code (IaC) tools such as CloudFormation, Terraform, the AWS Cloud Development Kit (CDK), or Chef. Team player with the ability to partner effectively with others and adapt to a hyper-growth pace while balancing multiple priorities. Excellent problem-solving and analytical skills. Outstanding verbal and written communication skills.
Posted 3 weeks ago
6.0 - 8.0 years
5 - 15 Lacs
Bengaluru
Remote
We are looking for a talented and experienced .NET Developer to join our innovative team. The ideal candidate will have a strong foundation in software development, particularly in building robust applications and services using the .NET framework. You will play a key role in designing, developing, and maintaining software solutions that align with our business objectives.
Required Qualifications:
- 6-8 years of experience in a Software Development role, specifically with .NET technologies.
- Bachelor's degree in Computer Science or a related field.
- At least 1 year of experience with AWS, Lambda, and .NET Core.
- Strong knowledge of Postgres databases and SQL queries.
- 3-5 years of experience with: C#, ASP.NET MVC, .NET Core; MS-SQL tables, views, and stored procedures; JavaScript, jQuery/jQuery Mobile, Angular; CSS, HTML, and responsive CSS.
- Working knowledge of: cloud services, particularly Amazon Web Services; front-end debugging tools and procedures; web services (RESTful, WCF, ASP.NET Web API, SOAP); Git; PostgreSQL; NodeJS or Python.
Key Responsibilities:
- Design, develop, and implement software applications using C#, ASP.NET MVC, and .NET Core.
- Collaborate with cross-functional teams to gather requirements, define, design, and deliver new features.
- Develop and maintain MS-SQL databases, including creating and optimizing tables, views, and stored procedures.
- Build responsive web applications utilizing HTML, CSS, JavaScript, jQuery, and Angular.
- Leverage AWS services, including Lambda, to create and deploy cloud-based applications.
- Implement and maintain web services using RESTful APIs, WCF, ASP.NET Web API, and SOAP protocols.
- Utilize version control systems such as Git for effective source code management.
- Troubleshoot and debug applications to ensure high performance and user satisfaction.
- Stay current with industry trends and emerging technologies to continuously improve development practices.
Preferred Qualifications:
- Bachelor's degree in Computer Science or a related field (required).
- Strong analytical and problem-solving skills with attention to detail.
- Excellent communication and collaboration skills.
- Familiarity with Agile development methodologies.
Posted 3 weeks ago
10.0 - 20.0 years
15 - 30 Lacs
Bengaluru
Hybrid
Job Description
Job Title: Senior Back-end Developer - Immediate Joining
Job Summary
- Strong proficiency with JavaScript, with practical knowledge of ES versions (ES6 to ES2020).
- Experience with Node.js, strong debugging skills, and familiarity with the frameworks available for it.
- Knowledge of SQL and NoSQL databases. Creating database schemas that represent and support business processes.
- Integration of multiple data sources and databases into one system.
- Knowledge of basic AWS services and exposure to the serverless stack (Lambda, API Gateway, DynamoDB, etc.), caching and messaging (SQS, SNS, etc.).
- Understanding the nature of asynchronous programming and its quirks and workarounds.
- Good understanding of server-side templating languages.
- Basic understanding of front-end technologies, such as HTML5 and CSS3.
- Understanding of unit testing, code quality and security concepts, with practical experience in the relevant tools.
- User authentication and authorization between multiple systems, servers, and environments.
- Understanding of microservices.
- Understanding of fundamental design principles behind a scalable application.
- Understanding the differences between multiple delivery platforms, such as mobile vs. desktop, and optimizing output to match the specific platform.
- Proficient understanding of code versioning tools such as Git and AWS CodeCommit/CodeBuild/CodePipeline.
Eligibility
• Minimum 10 years of experience in advanced JavaScript (with ES6).
• Minimum 8 years of experience in Node.js with good debugging and problem-solving skills.
• Experience in API development; expert in REST APIs.
• Reasonable knowledge of SQL databases (e.g. Postgres, MySQL, Oracle, etc.); 5+ years of experience.
• Some experience in AWS; 3+ years of experience.
• Conceptually strong in web technologies/tools (HTTP, GraphQL, Chrome debugger, Postman, authentication, etc.).
Key skills: Node.js, Express.js, AWS Lambda, Postgres, MySQL
Posted 4 weeks ago
6.0 - 11.0 years
6 - 12 Lacs
Gurugram
Work from Office
Responsibilities: * Design, develop & maintain backend APIs using Rust microservices architecture with AWS services like S3, Lambda & Glue.
Posted 4 weeks ago
3.0 - 6.0 years
15 - 20 Lacs
Hyderabad
Hybrid
Hello,
Urgent job openings for the Data Engineer role @ GlobalData (Hyderabad). The job description is given below; please go through it to understand the requirement. If the requirement matches your profile and you are interested in applying, please share your updated resume at m.salim@globaldata.com.
Mention subject line: Applying for Data Engineer @ GlobalData (Hyd)
Share the following details in the mail: Full Name, Mobile #, Qualification, Company Name, Designation, Total Work Experience (Years), Years of experience working on Snowflake/Google BigQuery, Current CTC, Expected CTC, Notice Period, Current Location/willing to relocate to Hyderabad?
Office Address: 3rd Floor, Jyoti Pinnacle Building, Opp. Prestige IVY League Apartments, Kondapur Road, Hyderabad, Telangana - 500081.
Job Description:
We are looking for a skilled and experienced Data Delivery Specification (DDS) Engineer to join our data team. The DDS Engineer will be responsible for designing, developing, and maintaining robust data pipelines and delivery mechanisms, ensuring timely and accurate data delivery to various stakeholders. This role requires strong expertise in cloud data platforms such as AWS, Snowflake, and Google BigQuery, along with a deep understanding of data warehousing concepts.
Key Responsibilities: Design, develop, and optimize data pipelines for efficient data ingestion, transformation, and delivery from various sources to target systems. Implement and manage data delivery solutions using cloud platforms like AWS (S3, Glue, Lambda, Redshift), Snowflake, and Google BigQuery. Collaborate with data architects, data scientists, and business analysts to understand data requirements and translate them into technical specifications. Develop and maintain DDS documents outlining data sources, transformations, quality checks, and delivery schedules. Ensure data quality, integrity, and security throughout the data lifecycle. Monitor data pipelines, troubleshoot issues, and implement solutions to ensure continuous data flow. Optimize data storage and query performance on cloud data warehouses. Implement automation for data delivery processes and monitoring. Stay current with new data technologies and best practices in data engineering and cloud platforms.
Required Skills & Qualifications: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related quantitative field. 4+ years of experience in data engineering, with a focus on data delivery and warehousing. Proven experience with cloud data platforms, specifically: AWS (S3, Glue, Lambda, Redshift, or other relevant data services); Snowflake (strong experience with data warehousing, SQL, and performance optimization); Google BigQuery (experience with data warehousing, SQL, and data manipulation). Proficient in SQL for complex data querying, manipulation, and optimization. Experience with scripting languages (e.g., Python) for data pipeline automation. Solid understanding of data warehousing concepts, ETL/ELT processes, and data modeling. Experience with version control systems (e.g., Git). Excellent problem-solving skills and attention to detail. Strong communication and collaboration skills, with the ability to work effectively with cross-functional teams.
Thanks & Regards,
Salim (Human Resources)
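A minimal sketch, not from the posting, of the kind of data-quality check a delivery pipeline might run against Snowflake using the snowflake-connector-python package; the connection parameters and table name are placeholders.

```python
# Hypothetical Snowflake data-quality check; credentials and table name are placeholders.
import os
import snowflake.connector

def row_count(table="ANALYTICS.PUBLIC.DELIVERIES"):     # assumed fully qualified table
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
    )
    try:
        cur = conn.cursor()
        cur.execute(f"SELECT COUNT(*) FROM {table}")
        return cur.fetchone()[0]
    finally:
        conn.close()
```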
Posted 1 month ago
7.0 - 12.0 years
30 - 40 Lacs
Hyderabad
Work from Office
Support enhancements to the MDM platform. Develop pipelines using Snowflake, Python, SQL and Airflow. Track system performance. Troubleshoot issues. Resolve production issues.
Required Candidate Profile: 5+ years of hands-on, expert-level experience with Snowflake, Python, and orchestration tools like Airflow. Good understanding of the investment domain. Experience with dbt, cloud experience (AWS, Azure), and DevOps.
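For illustration only: a minimal Airflow DAG sketch with two Python tasks standing in for the pipeline steps mentioned above. It assumes Airflow 2.x-style imports and the newer `schedule` argument; the task bodies, DAG name, and schedule are placeholders.

```python
# Hypothetical Airflow DAG: extract then load, as placeholders for real pipeline steps.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("extract step: pull source data")                  # placeholder logic

def load():
    print("load step: write curated data to the warehouse")  # placeholder logic

with DAG(
    dag_id="mdm_daily_pipeline",        # assumed DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                  # assumed schedule
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task
```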
Posted 1 month ago
4.0 - 9.0 years
8 - 16 Lacs
Kolkata
Remote
Enhance/modify applications, configure existing systems and provide user support.
DotNET Full Stack or [Angular 18+ developer + DotNET backend]
SQL Server
Angular version 18+ (nice to have)
Angular version 15+ (mandatory)
Posted 1 month ago
6.0 - 10.0 years
30 - 40 Lacs
Hyderabad
Hybrid
About the Role
We are looking for a Full Stack Developer with strong expertise in Java Spring Boot, microservices architecture, PDF processing, and AWS DevOps. You will play a key role in building reliable and performant applications that power our AI-driven construction platform. Key responsibilities are below, and relevant experience in those areas is a requirement.
Key Responsibilities
Backend Development: Design and implement scalable microservices using Java Spring Boot, optimized for performance and maintainability.
PDF Document Processing: Build and integrate modules for extracting, processing, and managing PDF documents such as construction plans, contracts, and specifications.
Front-End Integration: Collaborate with frontend engineers to ensure seamless communication with backend services via REST APIs or GraphQL.
Cloud Architecture & Deployment: Deploy and manage services on AWS using DevOps best practices including containerization (Docker), orchestration (Kubernetes/ECS), and CI/CD pipelines (GitHub Actions, CodePipeline).
Database & Data Flow: Design data models using PostgreSQL and MongoDB; manage data pipelines and integrations across services.
Security & Scalability: Implement access controls, encryption standards, and secure API endpoints to support enterprise-level deployments.
Cross-Team Collaboration: Work with AI/ML engineers, product managers, and domain experts to develop backend services that support AI features like document understanding and risk analysis.
Required Skills & Qualifications
Technical Skills: Strong programming skills in Java, with hands-on experience in Spring Boot and microservices architecture. Experience processing and managing data from PDFs using tools like Apache PDFBox, iText, or similar libraries. Proficient in designing and consuming RESTful APIs or GraphQL APIs. Experience with AWS services like EC2, S3, Lambda, API Gateway, CloudWatch, and RDS. Hands-on experience with Docker, CI/CD pipelines, and infrastructure automation (e.g., Terraform, CloudFormation). Familiarity with PostgreSQL, MongoDB, and distributed caching mechanisms (e.g., Redis). Understanding of authentication and security principles (OAuth2, JWT, etc.). Exposure to AI/ML model consumption via APIs (e.g., OpenAI, SageMaker).
Soft Skills: Ability to work independently and take full ownership of backend services. Excellent problem-solving skills and attention to detail. Strong communication and collaboration in agile cross-functional teams. Passion for delivering high-quality, reliable, and scalable solutions.
Beyond Technical Skills - What We're Looking For
At Wyre AI, we're building more than just software. We're building a team that thrives in a fast-paced, high-ownership environment. Here's what we value deeply beyond strong technical capabilities:
Startup Readiness & Ownership: Bias for action - you're someone who ships fast, tests quickly, and iterates with purpose. Comfort with ambiguity - ability to make decisions with limited information and adapt as things evolve. Ownership mindset - you treat the product as your own, not just a list of tickets to complete. Resourcefulness - you know when to hack something together to keep moving, and when it's time to build it right.
Product Thinking: User-centric approach - you care about the "why" behind what you're building and understand the user's perspective. Collaborative in shaping product - you're comfortable challenging and refining product specs instead of just executing them. Strategic trade-off awareness - you can navigate choices (speed vs scalability, UX vs tech debt, MVP vs V1) with clarity.
Collaboration & Communication: Cross-functional comfort - you work well with product, design, and founders. Clear communicator - you can explain technical concepts in simple terms when needed. Feedback culture fit - you give and receive feedback without ego.
Growth Potential: Fast learner - startups change, and so do stacks; willingness to learn is gold. Long-term mindset - lots of opportunity to scale. Mentorship readiness - if you can bring others up as the team scales, that's a win.
Startup Cultural Fit: Mission-driven - you care deeply about what you're building. Flexible work style - especially if remote, please be flexible. No big-company baggage - no expectations of layered teams or polished specs. We move fast and build together.
Posted 1 month ago
5.0 - 10.0 years
10 - 20 Lacs
Bengaluru
Work from Office
Hiring for a FAANG company. Note: This position is part of a program designed to support women professionals returning to the workforce after a career break (9+ months career gap).
About the Role
Join a high-impact global business team that is building cutting-edge B2B technology solutions. As part of a structured returnship program, this role is ideal for experienced professionals re-entering the workforce after a career break. You'll work on mission-critical data infrastructure in one of the world's largest cloud-based environments, helping transform enterprise procurement through intelligent architecture and scalable analytics. This role merges consumer-grade experience with enterprise-grade features to serve businesses worldwide. You'll collaborate across engineering, sales, marketing, and product teams to deliver scalable solutions that drive measurable value.
Key Responsibilities: Design, build, and manage scalable data infrastructure using modern cloud technologies. Develop and maintain robust ETL pipelines and data warehouse solutions. Partner with stakeholders to define data needs and translate them into actionable solutions. Curate and manage large-scale datasets from multiple platforms and systems. Ensure high standards for data quality, lineage, security, and governance. Enable data access for internal and external users through secure infrastructure. Drive insights and decision-making by supporting sales, marketing, and outreach teams with real-time and historical data. Work in a high-energy, fast-paced environment that values curiosity, autonomy, and impact.
Who You Are: 5+ years of experience in data engineering or related technical roles. Proficient in SQL and familiar with relational database management. Skilled in building and optimizing ETL pipelines. Strong understanding of data modeling and warehousing. Comfortable working with large-scale data systems and distributed computing. Able to work independently, collaborate with cross-functional teams, and communicate clearly. Passionate about solving complex problems through data.
Preferred Qualifications: Hands-on experience with cloud technologies including Redshift, S3, AWS Glue, EMR, Lambda, Kinesis, and Firehose. Familiarity with non-relational databases (e.g., object storage, document stores, key-value stores, column-family DBs). Understanding of cloud access control systems such as IAM roles and permissions.
Returnship Benefits: Dedicated onboarding and mentorship support. Flexible work arrangements. Opportunity to work on meaningful, global-scale projects while rebuilding your career momentum. Supportive team culture that encourages continuous learning and professional development.
Top 10 Must-Have Skills: SQL; ETL development; data modeling; cloud data warehousing (e.g., Redshift or equivalent); experience with AWS or similar cloud platforms; working with large-scale datasets; data governance and security awareness; business communication and stakeholder collaboration; automation with Python/Scala (for ETL pipelines); familiarity with non-relational databases.
Posted 1 month ago
7.0 - 12.0 years
12 - 22 Lacs
Hyderabad
Work from Office
Job Title: Sr. Managed Services Engineer AWS (L3) Company: SHI | LOCUZ Location: Hyderabad Experience: 8+ Years Level: L3 Managed Services Shift: 24/7 Support (Rotational Shifts) Notice Period: Immediate Joiners or Max 15 to 20 Days About the Role: We are looking for a seasoned Sr. Managed Services Engineer AWS (L3) to join our expert team supporting SHI Complete and Expert-level AWS services. The ideal candidate will have strong hands-on experience with core AWS services, managed services delivery, and a passion for proactive monitoring and automation in cloud environments. Key Responsibilities: Perform in-depth reviews and understanding of customer AWS environments Evaluate business requirements and develop tailored service delivery plans Configure, monitor, and maintain AWS infrastructure for performance and availability Handle L3-level escalations and troubleshoot complex customer incidents/tickets Conduct proactive system checks, health monitoring, and performance tuning Implement data backup and recovery best practices Maintain security compliance and ensure adherence to SLAs and KPIs Prepare AWS-level change roadmaps for continuous improvement Lead incident response and root cause analysis for critical issues Collaborate with L1, L2, and vendor support teams Mentor junior engineers and ensure knowledge transfer Required Skills & Experience: 8+ years of IT experience, with strong exposure to Managed Services environments Deep hands-on experience with a wide range of AWS services, including but not limited to: CloudWatch, EC2, EBS, S3, RDS, EKS, Lambda, CloudFormation, CloudTrail, VPC, Route53, Transit Gateway, IAM, Security Hub, GuardDuty, AWS Backup, WAF & Shield, ACM, FSx, EFS, Elastic Beanstalk, API Gateway, AWS Workspaces, Control Tower Excellent understanding of AWS operational excellence and Well-Architected Framework Experience with 24x7 production environments and ITIL-based service delivery Strong troubleshooting and analytical skills Excellent communication and documentation skills Nice to Have: AWS Certifications (e.g., Solutions Architect Associate/Professional, SysOps Admin, DevOps Engineer) Familiarity with Infrastructure as Code (IaC) tools like Terraform or CloudFormation Experience with monitoring/alerting via EventBridge, SNS, SQS, or 3rd party tools Why Join Us? Work with leading-edge AWS technologies Be part of a high-performance managed services team Great learning opportunities and certifications Stable and growth-oriented career path in cloud infrastructure Apply now and be part of our mission to deliver expert AWS support 24x7 for enterprise customers!
Posted 1 month ago
0.0 - 1.0 years
2 - 3 Lacs
Bengaluru
Work from Office
Responsibilities: * Collaborate with cross-functional teams on project delivery. * Develop backend solutions using Python, FastAPI & AWS. * Optimize performance through Redis DB & Nginx.
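Purely illustrative (not from the posting): a minimal FastAPI endpoint with a Redis read-through cache of the kind this role describes, assuming a local Redis instance and the redis-py client.

```python
# Hypothetical FastAPI + Redis read-through cache; Redis location and key scheme are assumptions.
import redis
from fastapi import FastAPI

app = FastAPI()
cache = redis.Redis(host="localhost", port=6379, decode_responses=True)  # assumed local Redis

def expensive_lookup(item_id: int) -> str:
    return f"value-for-{item_id}"        # placeholder for a slow DB or API call

@app.get("/items/{item_id}")
def read_item(item_id: int):
    key = f"item:{item_id}"
    value = cache.get(key)
    if value is None:
        value = expensive_lookup(item_id)
        cache.set(key, value, ex=300)    # cache for 5 minutes
    return {"item_id": item_id, "value": value}
```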
Posted 1 month ago
5.0 - 9.0 years
20 - 25 Lacs
Bengaluru
Hybrid
Job Description
We are seeking a skilled Python Developer to join our team. The ideal candidate will have experience in designing, developing, and deploying scalable applications. You will work on backend services, data processing, APIs, and automation, ensuring performance and reliability.
Responsibilities: Design, develop, test, and deploy high-quality Python applications. Write clean, maintainable, and efficient code following best practices. Develop RESTful APIs and integrate with third-party services. Work with databases (SQL and NoSQL) to design efficient data storage solutions. Implement security, authentication, and authorization mechanisms. Optimize application performance and scalability. Collaborate with cross-functional teams, including frontend developers and DevOps. Debug, troubleshoot, and resolve software issues. Automate repetitive tasks using scripts and tools.
Requirements: Experience: 5 to 12 years of hands-on experience in Python development. Frameworks: proficiency in Django, Flask, or FastAPI. Databases: strong knowledge of MySQL (or any RDBMS) and MongoDB. APIs: experience developing RESTful APIs and working with API documentation tools like Swagger/Postman. Cloud & DevOps: familiarity with AWS, Docker, Kubernetes, and CI/CD pipelines. Version control: proficiency in Git and GitHub/GitLab. Testing: experience with unit testing frameworks like PyTest or unittest. Messaging queues: knowledge of RabbitMQ, Kafka, or Celery is a plus. Security & best practices: understanding of authentication (OAuth, JWT) and secure coding practices.
Posted 1 month ago