3.0 - 7.0 years
0 Lacs
Karnataka
On-site
About Us
6thstreet.com is one of the largest omnichannel fashion & lifestyle destinations in the GCC, home to 1200+ international brands. The fashion-savvy destination offers collections from over 150 international fashion brands such as Dune London, ALDO, Naturalizer, Nine West, Charles & Keith, New Balance, Crocs, Birkenstock, Skechers, Levi's, Aeropostale, Garage, Nike, Adidas Originals, Rituals, and many more. The online fashion platform also provides free delivery, free returns, cash on delivery, and the option of click and collect.

Job Description
We are looking for a seasoned Data Engineer to design and manage data solutions. Expertise in SQL, Python, and AWS is essential. The role includes client communication, recommending modern data tools, and ensuring smooth data integration and visualization. Strong problem-solving and collaboration skills are crucial.

Responsibilities
- Understand and analyze client business requirements to support data solutions.
- Recommend suitable modern data stack tools based on client needs.
- Develop and maintain data pipelines, ETL processes, and data warehousing.
- Create and optimize data models for client reporting and analytics.
- Ensure seamless data integration and visualization with cross-functional teams.
- Communicate with clients on project updates and issue resolution.
- Stay updated on industry best practices and emerging technologies.

Skills Required
- 3-5 years in data engineering/analytics with a proven track record.
- Proficiency in SQL and Python for data manipulation and analysis; knowledge of PySpark is a plus.
- Experience with data warehouse platforms such as Redshift and Google BigQuery.
- Experience with AWS services such as S3, Glue, and Athena.
- Proficiency in Airflow.
- Familiarity with event tracking platforms like GA or Amplitude is a plus.
- Strong problem-solving skills and adaptability.
- Excellent communication skills and proactive client engagement.
- Ability to get things done, unblock yourself, and collaborate effectively with team members and clients.

Benefits
- Full-time role.
- Competitive salary + bonus.
- Company employee discounts across all brands.
- Medical & health insurance.
- Collaborative work environment.
- Good-vibes work culture.
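For a concrete sense of the pipeline work this posting describes, here is a minimal sketch of a recent-Airflow DAG that submits a daily Athena rollup via boto3; the bucket, database, table, and query are hypothetical placeholders, not part of the posting.

```python
from datetime import datetime

import boto3
from airflow import DAG
from airflow.operators.python import PythonOperator


def run_athena_rollup(**_):
    """Submit a rollup query to Athena (all names are placeholders)."""
    athena = boto3.client("athena", region_name="us-east-1")
    athena.start_query_execution(
        QueryString="SELECT brand, SUM(revenue) FROM daily_sales GROUP BY brand",
        QueryExecutionContext={"Database": "analytics"},
        ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
    )


with DAG(
    dag_id="daily_sales_rollup",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="athena_rollup", python_callable=run_athena_rollup)
```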
Posted 2 months ago
2.0 - 6.0 years
0 Lacs
Karnataka
On-site
You should possess a Bachelor's/Master's degree in Computer Science/Computer Engineering or a related field, along with at least 2-6 years of experience in server-side development using languages like GoLang, Node.js, or Python. Proficiency in AWS services such as Lambda, DynamoDB, Step Functions, and S3 is essential, as is hands-on experience in deploying and managing serverless service environments. Experience with Docker, containerization, and Kubernetes is also required for this role. In addition, knowledge of database technologies like MongoDB and DynamoDB, along with experience in CI/CD pipelines and automation, would be beneficial. Experience in video transcoding/streaming on the cloud would be considered a plus. Lastly, strong problem-solving skills are a must-have for this position.
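To illustrate the serverless pattern this posting names, here is a minimal sketch of a Python Lambda handler persisting an API Gateway proxy event to DynamoDB via boto3; the table name, environment variable, and event shape are hypothetical.

```python
import json
import os
import uuid

import boto3

# Table name injected via the Lambda environment (name is a placeholder).
TABLE = boto3.resource("dynamodb").Table(os.environ.get("TABLE_NAME", "orders"))


def handler(event, context):
    """API Gateway proxy event -> DynamoDB item; event shape is illustrative."""
    body = json.loads(event.get("body") or "{}")
    item = {"pk": str(uuid.uuid4()), "payload": body}
    TABLE.put_item(Item=item)
    return {"statusCode": 201, "body": json.dumps({"id": item["pk"]})}
```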
Posted 2 months ago
3.0 - 7.0 years
0 Lacs
Maharashtra
On-site
We have an exciting and rewarding opportunity for you to take your software engineering career to the next level. As a Software Engineer III at JPMorgan Chase within the Consumer & Community Banking team, you serve as a seasoned member of an agile team, designing and delivering trusted, market-leading technology products in a secure, stable, and scalable way. You are responsible for carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.

You will execute creative software solutions, design, development, and technical troubleshooting, with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems. Your role includes creating secure and high-quality production code and maintaining algorithms that run synchronously with appropriate systems. You will produce architecture and design artifacts for complex applications while being accountable for ensuring design constraints are met by software code development. Additionally, you will gather, analyze, synthesize, and develop visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems. You will also build microservices that run on the bank's internal cloud and the public cloud platform (AWS) and collaborate with teams in multiple regions and time zones. Participation in scrum team stand-ups, code reviews, and other ceremonies, contributing to task completion and blocker resolution within your team, is expected.

Required qualifications, capabilities, and skills include formal training or certification in software engineering concepts and 3+ years of applied experience in Java, AWS, and Terraform. You should have experience with technologies like Java 11/17, Spring/Spring Boot, Kafka, and relational/non-relational databases such as Oracle, Cassandra, DynamoDB, and Postgres. A minimum of 3 years of hands-on experience on the public cloud platform (AWS) building secure microservices is required. Hands-on experience with AWS services like EKS, Fargate, SQS/SNS/EventBridge, Lambda, S3, EBS, DynamoDB/Aurora Postgres, and Terraform scripts is essential. Additionally, experience with DevOps concepts for automated build and deployment is crucial. Preferred qualifications, capabilities, and skills include familiarity with modern front-end technologies and exposure to cloud technologies.
Posted 2 months ago
5.0 - 8.0 years
14 - 22 Lacs
Bengaluru, Mumbai (All Areas)
Work from Office
Hiring for a top IT company.
Designation: Python Developer
Skills: Python + PySpark
Location: Bengaluru/Mumbai
Experience: 5-8 yrs
Best CTC.
Contact: 9783460933 / 9549198246 / 9982845569 / 7665831761 / 6377522517 / 7240017049
Team Converse
Posted 2 months ago
3.0 - 5.0 years
12 - 24 Lacs
Bengaluru
Work from Office
Experience writing test cases and building test frameworks; experience with AWS services (Lambda, S3, EC2, CloudWatch) and Datadog. Experience with software/systems testing concepts. Experience in automotive security, IT security, and Linux security concepts. Testing of embedded systems and IoT devices.
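As an illustration of the kind of AWS-facing automated test this posting calls for, here is a minimal sketch using pytest and botocore's Stubber, so no real bucket is touched; the function under test, bucket, and key are hypothetical.

```python
import boto3
from botocore.stub import Stubber


def upload_report(client, bucket: str, key: str, data: bytes) -> None:
    """Unit under test (hypothetical): push a report artifact to S3."""
    client.put_object(Bucket=bucket, Key=key, Body=data)


def test_upload_report_sends_expected_request():
    client = boto3.client("s3", region_name="us-east-1")
    stubber = Stubber(client)
    # Assert the exact request parameters without hitting AWS.
    stubber.add_response(
        "put_object",
        {},
        {"Bucket": "reports", "Key": "r1.json", "Body": b"{}"},
    )
    with stubber:
        upload_report(client, "reports", "r1.json", b"{}")
    stubber.assert_no_pending_responses()
```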
Posted 2 months ago
8.0 - 12.0 years
0 Lacs
Pune, Maharashtra
On-site
As an online travel booking platform, Agoda is committed to connecting travelers with a vast network of accommodations, flights, and more. With cutting-edge technology and a global presence, Agoda strives to enhance the travel experience for customers worldwide. As part of Booking Holdings and headquartered in Asia, Agoda boasts a diverse team of over 7,100 employees from 95+ nationalities across 27 markets. The work environment at Agoda is characterized by diversity, creativity, and collaboration, fostering innovation through a culture of experimentation and ownership.

The core purpose of Agoda is to bridge the world through travel, believing that travel enriches lives, facilitates learning, and brings people and cultures closer together. By enabling individuals to explore and experience the world, Agoda aims to promote empathy, understanding, and happiness.

As a member of the Observability Platform team at Agoda, you will be involved in building and maintaining the company's time series database and log aggregation system. This critical infrastructure processes a massive volume of data daily, supporting various monitoring tools and dashboards. The team faces challenges in scaling data collection efficiently while minimizing costs.

In this role, you will have the opportunity to:
- Develop fault-tolerant, scalable solutions in multi-tenant environments
- Tackle complex problems in distributed and highly concurrent settings
- Enhance observability tools for all developers at Agoda

To succeed in this role, you will need:
- A minimum of 8 years of experience writing performant code using JVM languages (Java/Scala/Kotlin) or Rust/C++
- Hands-on experience with observability products like Prometheus, InfluxDB, VictoriaMetrics, Elasticsearch, and Grafana Loki
- Proficiency in working with messaging queues such as Kafka
- A deep understanding of concurrency and multithreading, with an emphasis on code simplicity and performance
- Strong communication and collaboration skills

It would be great if you also have:
- Expertise in database internals, indexes, and data formats (Avro, Protobuf)
- Familiarity with observability data types like logs and metrics, and proficiency in using profilers, debuggers, and tracers in a Linux environment
- Previous experience in building large-scale time series data stores and monitoring solutions
- Knowledge of open-source components like S3 (Ceph), Elasticsearch, and Grafana
- The ability to work at a low level when required

Agoda is an Equal Opportunity Employer and maintains a policy of considering all applications for future positions. For more information about our privacy policy, please refer to our website. Please note that Agoda does not accept third-party resumes and is not responsible for any fees associated with unsolicited resumes.
Posted 2 months ago
7.0 - 11.0 years
0 Lacs
Haryana
On-site
As a Senior Manager specializing in Data Analytics & AI, you will be a pivotal member of the EY Data, Analytics & AI Ireland team. Your role as a Databricks Platform Architect will involve enabling clients to extract significant value from their information assets through innovative data analytics solutions. You will have the opportunity to work across various industries, collaborating with diverse teams and leading the design and implementation of data architecture strategies aligned with client goals.

Your key responsibilities will include leading teams with varying skill sets in utilizing different Data and Analytics technologies, adapting your leadership style to meet client needs, creating a positive learning culture, engaging with clients to understand their data requirements, and developing data artefacts based on industry best practices. Additionally, you will assess existing data architectures, develop data migration strategies, and ensure data integrity and minimal disruption during migration activities.

To qualify for this role, you must possess a strong academic background in computer science or related fields, along with at least 7 years of experience as a Data Architect or in a similar role in a consulting environment. Hands-on experience with cloud services, data modeling techniques, data management concepts, Python, Spark, Docker, Kubernetes, and cloud security controls is essential.

Ideally, you will have the ability to communicate technical concepts effectively to non-technical stakeholders, lead the design and optimization of the Databricks platform, work closely with the data engineering team, maintain a comprehensive understanding of the data pipeline, and stay updated on new and emerging technologies in the field.

EY offers a competitive remuneration package, flexible working options, career development opportunities, and a comprehensive Total Rewards package. Additionally, you will benefit from support, coaching, opportunities for skill development, and a diverse and inclusive culture that values individual contributions.

If you are passionate about leveraging data to solve complex problems, drive business outcomes, and contribute to a better working world, consider joining EY as a Databricks Platform Architect. Apply now to be part of a dynamic team dedicated to innovation and excellence.
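Since Python and Spark are listed as essential, here is a minimal PySpark sketch of the sort of transformation a Databricks platform architect reviews; the lake paths and column names are hypothetical. On Databricks a `spark` session is provided; building one here keeps the sketch runnable elsewhere.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_rollup").getOrCreate()

# Hypothetical source path and schema.
orders = spark.read.parquet("s3://example-lake/orders/")

daily = (
    orders
    .withColumn("order_date", F.to_date("created_at"))
    .groupBy("order_date", "country")
    .agg(
        F.sum("amount").alias("revenue"),
        F.countDistinct("customer_id").alias("buyers"),
    )
)

# Partitioned write for downstream analytics (path is a placeholder).
daily.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-lake/daily_revenue/"
)
```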
Posted 2 months ago
3.0 - 7.0 years
0 Lacs
Maharashtra
On-site
Founded in 2006, Rite Software is a global IT consulting company headquartered in Houston, Texas. Rite Software delivers strategic IT solutions for clients facing complex challenges involving cloud applications, cloud infrastructure, analytics, and digital transformation.

We are looking for an AWS Developer with 3-7 years of experience. The bare-minimum requirements are proficiency in Python/Java scripting and API development; experience with Django and Flask is good to have. Keywords for this role include serverless architecture, Lambda, EC2, S3, CloudFormation, and CloudWatch.
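Flask is named as a good-to-have; below is a minimal sketch of the API development the role centers on. The route, payload, and in-memory store are hypothetical illustration only.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# In-memory store standing in for a real backend (illustrative only).
ITEMS: dict[int, dict] = {}


@app.post("/items")
def create_item():
    item = request.get_json(force=True)
    item_id = len(ITEMS) + 1
    ITEMS[item_id] = item
    return jsonify({"id": item_id, **item}), 201


@app.get("/items/<int:item_id>")
def get_item(item_id: int):
    if item_id not in ITEMS:
        return jsonify({"error": "not found"}), 404
    return jsonify(ITEMS[item_id])


if __name__ == "__main__":
    app.run(debug=True)
```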
Posted 2 months ago
5.0 - 9.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
The client is a global technology consulting and digital solutions company with a vast network of entrepreneurial professionals spread across more than 30 countries. They cater to over 700 clients, leveraging their domain and technology expertise to drive competitive differentiation, enhance customer experiences, and improve business outcomes.

As a part of the agile team, you will be responsible for developing applications, leading design sprints, and ensuring timely deliveries. Your role will involve designing and implementing low-latency, high-availability, high-performance applications. You will also be required to ensure code modularity using a microservices architecture in both frontend and backend development, following best practices in backend API development.

Throughout the software development lifecycle, you will write code that is maintainable, clear, and concise. Your technical leadership will be crucial in mentoring team members to help them achieve their goals. Additionally, you will manage application deployment with a focus on security, scalability, and reliability. Your responsibilities will also include managing and evolving automated testing setups for backend and frontend applications to facilitate faster bug reporting and fixing.

A solid understanding of RESTful API design, database design and management, and experience with version control systems will be essential for this role. Strong problem-solving and communication skills are required, along with proficiency in object-oriented programming, C#/VB.Net, and writing reusable libraries. Familiarity with design and architectural patterns such as Singleton and Factory, RDBMS such as SQL Server, Postgres, and MySQL, and writing clean, readable, and maintainable code will be beneficial. Experience in implementing automated testing platforms and unit tests, and in identifying opportunities to optimize code and improve performance, will be a valuable asset. An understanding of software engineering coding best practices is essential for this position.

Nice-to-have skills include proficiency in AWS services like EC2, S3, RDS, EKS, Lambda, CloudWatch, CloudFront, and VPC; experience with Git and DevOps tools such as Jenkins, UCD, Kubernetes, ArgoCD, and Splunk; and skills in .NET/ReactJS.
Posted 2 months ago
7.0 - 12.0 years
0 Lacs
Maharashtra
On-site
As a Lead Data Engineer, you will be responsible for leveraging your 7 to 12+ years of hands-on experience in SQL database design, data architecture, ETL, data warehousing, data marts, data lakes, big data, cloud (AWS), and data governance domains. Your expertise in a modern programming language such as Scala, Python, or Java, with a preference for Spark/PySpark, will be crucial in this role.

The role requires experience with configuration management and version control tools like Git, along with familiarity working within a CI/CD framework. Experience in building frameworks will be considered a significant advantage. A minimum of 8 years of recent hands-on SQL programming experience in a big data environment is necessary, with a preference for Hadoop/Hive experience. Proficiency in PostgreSQL, RDBMS, NoSQL, and columnar databases will be beneficial.

Your hands-on experience with AWS cloud data engineering components, including API Gateway, Glue, IoT Core, EKS, ECS, S3, RDS, Redshift, and EMR, will play a vital role in developing and maintaining ETL applications and data pipelines using big data technologies. Experience with Apache Kafka, Spark, and Airflow is a must-have for this position.

If you are excited about this opportunity and possess the required skills and experience, please share your CV with us at omkar@hrworksindia.com. We look forward to potentially welcoming you to our team.

Regards,
Omkar
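Since Kafka and Spark are called out as must-haves, here is a minimal PySpark structured-streaming sketch that tails a Kafka topic into an S3 data lake; the broker address, topic, and paths are hypothetical, and the spark-sql-kafka package must be on the Spark classpath for this to run.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("events_ingest").getOrCreate()

# Broker and topic names are placeholders.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")
    .option("subscribe", "clickstream")
    .option("startingOffsets", "latest")
    .load()
)

# Land raw events as Parquet; checkpointing gives exactly-once file output.
query = (
    events.selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)", "timestamp")
    .writeStream.format("parquet")
    .option("path", "s3://example-lake/raw/clickstream/")
    .option("checkpointLocation", "s3://example-lake/checkpoints/clickstream/")
    .start()
)
query.awaitTermination()
```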
Posted 2 months ago
10.0 - 20.0 years
25 - 40 Lacs
Hyderabad
Hybrid
Role & responsibilities: We are seeking dynamic individuals to join our team as individual contributors, collaborating closely with stakeholders to drive impactful results.

Working hours: 5:30 pm to 1:30 am (hybrid model)

Must-have skills:
1. 15 years of experience in design and delivery of distributed systems capable of handling petabytes of data in a distributed environment.
2. 10 years of experience in the development of data lakes with data ingestion from disparate data sources, including relational databases, flat files, APIs, and streaming data.
3. Experience in the design and development of data platforms and data ingestion from disparate data sources into the cloud.
4. Expertise in core AWS services including AWS IAM, VPC, EC2, EKS/ECS, S3, RDS, DMS, Lambda, CloudWatch, CloudFormation, and CloudTrail.
5. Proficiency in programming languages like Python and PySpark to ensure efficient data processing, preferably Python.
6. Ability to architect and implement robust ETL pipelines using AWS Glue, Lambda, and Step Functions, defining data extraction methods, transformation logic, and data loading procedures across different data sources.
7. Experience in the development of event-driven distributed systems in the cloud using serverless architecture.
8. Ability to work with the infrastructure team on AWS service provisioning for databases, services, network design, IAM roles, and AWS clusters.
9. 2-3 years of experience working in a DocumentDB or MongoDB environment.

Nice-to-have skills:
1. 10 years of experience in the development of data audit, compliance, and retention standards for data governance, and automation of the governance processes.
2. Experience in data modelling with NoSQL databases like DocumentDB.
3. Experience in using column-oriented data file formats like Apache Parquet, and Apache Iceberg as the table format for analytical datasets.
4. Expertise in development of Retrieval-Augmented Generation (RAG) and agentic workflows for providing context to LLMs based on proprietary enterprise data.
5. Ability to develop re-ranking strategies using results from index and vector stores for LLMs to improve the quality of the output.
6. Knowledge of AWS AI services like AWS Entity Resolution and AWS Comprehend.
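Point 6 above mentions ETL pipelines built on AWS Glue; the following is a minimal Glue job skeleton using the awsglue library (available inside the Glue runtime), with a hypothetical catalog database, table, and output path.

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read from the Glue Data Catalog (database/table names are placeholders).
source = glue_context.create_dynamic_frame.from_catalog(
    database="raw", table_name="customers"
)

# Land as Parquet for downstream analytics (path is a placeholder).
glue_context.write_dynamic_frame.from_options(
    frame=source,
    connection_type="s3",
    connection_options={"path": "s3://example-lake/curated/customers/"},
    format="parquet",
)
job.commit()
```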
Posted 2 months ago
4.0 - 9.0 years
0 - 2 Lacs
Nagpur, Pune, Bengaluru
Work from Office
Hi, we have one urgent open position for a Redshift Database Admin. Please find the details for the Redshift Admin role below:

Experience range: 7 to 10 yrs
Location: All Infocepts / Remote (only in exceptional cases)

Must-haves:
- Overall experience of 7 to 10 yrs
- Redshift administration (key role; 5+ years of experience with Redshift)
- Data migration/sync, specifically data UNLOAD and COPY
- Deployments
- Tuning/optimization

Key Result Areas and Activities:
- Design and Development: Design, implement, and manage Redshift clusters for high availability, performance, and security.
- Performance Optimization: Monitor and optimize database performance, including query tuning and resource management.
- Backup and Recovery: Develop and maintain database backup and recovery strategies.
- Security Enforcement: Implement and enforce database security policies and procedures.
- Cost-Performance Balance: Ensure an optimal balance between cost and performance.
- Collaboration with Development Teams: Work with development teams to design and optimize database schemas and queries. Perform database migrations, upgrades, and patching.
- Issue Resolution: Troubleshoot and resolve database-related issues, providing support to development and operations teams. Automate routine database tasks using scripting languages and tools.

Must-have skills:
- Strong understanding of database design, performance tuning, and optimization techniques
- Proficiency in SQL and experience with database scripting languages (e.g., Python, Shell)
- Experience with database backup and recovery, security, and high-availability solutions
- Familiarity with AWS services and tools, including S3, EC2, IAM, and CloudWatch
- Operating system: any flavor of Linux, Windows
- Core Redshift administration skills: cluster management, performance optimization, workload management (WLM), vacuuming/analyzing tables for optimal performance, IAM policies, role-based access control, backup and recovery, automated backups, and restoration strategies
- SQL query optimization: distribution keys, sort keys, and compression encoding
- Knowledge of COPY and UNLOAD commands, S3 integration, and best practices for bulk data loading
- Scripting and automation for routine DBA tasks
- Expertise in debugging slow queries and troubleshooting system tables

Thanks & Regards,
Tharani S, Recruitment Lead
Sight Spectrum Technology Solutions Pvt. Ltd.
9500066211 | www.sightspectrum.com
tharani@sightspectrum.com | Chennai
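The COPY/UNLOAD requirement above is the core of Redshift bulk loading. Here is a minimal sketch driving both from Python with the redshift_connector driver; the cluster endpoint, credentials, IAM role ARN, bucket, and table are all hypothetical placeholders.

```python
import redshift_connector

# Connection details are placeholders.
conn = redshift_connector.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
    database="analytics",
    user="admin",
    password="REDACTED",
)
conn.autocommit = True
cur = conn.cursor()

# Bulk-load Parquet files from S3 into the sales table (COPY).
cur.execute("""
    COPY sales FROM 's3://example-bucket/incoming/sales/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy'
    FORMAT AS PARQUET;
""")

# Export an aggregated result set back to S3 (UNLOAD).
cur.execute("""
    UNLOAD ('SELECT region, SUM(amount) FROM sales GROUP BY region')
    TO 's3://example-bucket/exports/sales_by_region_'
    IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy'
    PARQUET PARALLEL ON;
""")
```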
Posted 2 months ago
8.0 - 10.0 years
12 - 18 Lacs
Zirakpur
Work from Office
- AWS services (Lambda, Glue, S3, DynamoDB, EventBridge, AppSync, OpenSearch)
- Terraform
- Python
- React/Vite
- Unit testing (Jest, Pytest)
- Software development lifecycle
Posted 2 months ago
3.0 - 7.0 years
0 Lacs
Karnataka
On-site
You should have proven experience as a Linux Systems Administrator with a focus on HPC environments. Your understanding of Linux operating systems such as CentOS, Ubuntu, and Red Hat should be strong, and you should have intermediate knowledge of the SLURM resource scheduler. Hands-on experience with AWS services related to HPC, such as EC2, S3, FSx for Lustre, AWS Batch, and AWS ParallelCluster, is required. Familiarity with parallel file systems like Lustre and GPFS and with network storage solutions is essential. Knowledge of GPU computing and working with GPU-enabled HPC systems on AWS is a plus. Experience with configuration management tools such as Ansible, Puppet, and Chef is desired. Moreover, experience with cloud-based HPC solutions and hybrid HPC environments will be beneficial for this role.
Posted 2 months ago
4.0 - 8.0 years
0 - 0 Lacs
Maharashtra
On-site
We are currently looking to hire a Java Full Stack & Java Cloud Developer/Lead with the following skill sets:
- Java Full Stack: Java and Angular 6+ / React, requiring 4-7 years of relevant experience, with a maximum offered CTC of 15 Lacs.
- Java Cloud: Java and either AWS (Lambda, S3, EC2) or Azure (Docker, Kubernetes), requiring 4-7 years of relevant experience, with a maximum offered CTC of 16.5 Lacs.

The job locations for this position are Chennai, Pune, Coimbatore, and Ahmedabad. The ideal candidate should be an immediate joiner or have a notice period of up to 30 days. This position is for a Tier 1 client.

If you are interested in this opportunity, please provide the following details for consideration:
- Full Name
- Primary Contact
- Email ID
- Reason for job change
- Are you currently working? If not, please provide the reason.
- Notice Period
- Last Working Day (LWD) if serving notice
- Do you have any other offer in hand? If yes, please provide the reason for considering other opportunities.
- Any challenges with the current offer
- Current CTC
- Expected CTC
- Offered CTC
- Current Location
- Preferred Location

We look forward to receiving your details for consideration.
Posted 2 months ago
7.0 - 11.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
Job Title/Role: Tech Lead [Java & Python]
Location: Noida/Delhi NCR
Experience: 7-10 yrs

Roles & Responsibilities
- Understand the client's business use cases and technical requirements, and convert them into technical solutions that elegantly meet the requirements.
- Identify different solutions and narrow down the best option that meets the business requirements.
- Develop solution designs considering various aspects like extensibility, scalability, security, design patterns, user experience, and NFRs, and ensure that all relevant best practices are followed.
- Develop and design the overall solution for defined functional and non-functional requirements, and define the technologies, patterns, and frameworks to materialize it.
- Understand and relate technology integration scenarios and apply these learnings in projects.
- Demonstrate excellent communication and teamwork skills.
- Resolve issues raised during code review through exhaustive, systematic analysis of the root cause, and justify the decisions taken.
- Be confident and self-driven with a lot of initiative, and have the zeal and energy to quickly ramp up on upcoming technologies.
- Create and contribute to an environment geared toward innovation, high productivity, high quality, and customer service.
- Communicate with end clients, business users, and other technical teams, and provide estimates.

Qualifications
- B.Tech or MCA in computer science.
- More than 7 years of experience as a Java full-stack technologist (software development, testing, and production support), plus experience with the Python programming language and the FastAPI framework.
- Design/development experience in the Java technical stack: Java/J2EE, design patterns, Spring Framework (Core/Boot/MVC), Hibernate, JavaScript, CSS, HTML, multithreading, data structures, Kafka, and SQL.
- Experience with data analytics tools and libraries such as pandas, NumPy, and scikit-learn.
- Familiarity with content management tools, and experience integrating both relational databases (e.g., Oracle, PostgreSQL, MySQL) and non-relational databases (e.g., DynamoDB, MongoDB, Cassandra).
- An in-depth understanding of public/private/hybrid cloud solutions and experience in securely integrating the public cloud into traditional hosting/delivery models, with a specific focus on AWS (S3, Lambda, API Gateway, EC2, Cloudflare).
- Working knowledge of Docker, Kubernetes, UNIX-based operating systems, and microservices.
- A clear understanding of continuous integration, build, release, and code quality (GitHub/Jenkins).
- Experience managing teams and time-bound projects. Working in the F&B industry or aerospace could be an added advantage.
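FastAPI is named explicitly in the qualifications; a minimal sketch of a typed FastAPI service follows. The Order model, routes, and in-memory store are hypothetical illustration, not part of the posting; assuming the file is app.py, it would run with `uvicorn app:app`.

```python
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()


class Order(BaseModel):
    sku: str
    quantity: int


ORDERS: dict[int, Order] = {}  # stand-in for a real database


@app.post("/orders", status_code=201)
def create_order(order: Order) -> dict:
    order_id = len(ORDERS) + 1
    ORDERS[order_id] = order
    return {"id": order_id}


@app.get("/orders/{order_id}")
def read_order(order_id: int) -> Order:
    if order_id not in ORDERS:
        raise HTTPException(status_code=404, detail="order not found")
    return ORDERS[order_id]
```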
Posted 2 months ago
6.0 - 10.0 years
0 Lacs
Hyderabad, Telangana
On-site
Role Description
Salesforce has immediate opportunities for software developers who want their lines of code to have a significant and measurable positive impact for users, the company's bottom line, and the industry. You will be working with a group of world-class engineers to build the breakthrough features our customers will love, adopt, and use while keeping our trusted CRM platform stable and scalable. The software engineer role at Salesforce encompasses architecture, design, implementation, and testing to ensure we build products right and release them with high quality.

We pride ourselves on writing high-quality, maintainable code that strengthens the stability of the product and makes our lives easier. We embrace the hybrid model and celebrate the individual strengths of each team member while cultivating everyone on the team to grow into the best version of themselves. We believe that autonomous teams with the freedom to make decisions will empower the individuals, the product, the company, and the customers they serve to thrive.

Your Impact
As a Senior Backend Software Engineer, your responsibilities will include:
- Building new and exciting components in an ever-growing and evolving market technology to provide scale and efficiency.
- Developing high-quality, production-ready code that millions of users of our cloud platform can use.
- Designing, implementing, and tuning robust APIs and API framework-related features that perform and scale in a multi-tenant environment.
- Working in a hybrid engineering model and contributing to all phases of the SDLC, including design, implementation, code reviews, automation, and testing of features.
- Building efficient components/algorithms in a microservice multi-tenant SaaS cloud environment.
- Reviewing code, mentoring junior engineers, and providing technical guidance to the team (depending on seniority level).

Required Skills:
- Mastery of multiple programming languages and platforms.
- 6+ years of backend software development experience, including designing and developing distributed systems at scale.
- Deep knowledge of object-oriented programming and other languages: Java, Python, Scala, C#, Go, Node.js, and C++.
- Strong PostgreSQL/SQL skills and experience with relational and non-relational databases, including writing queries.
- A deep understanding of software development best practices and demonstrated leadership skills.
- Degree or equivalent relevant experience required. Experience will be evaluated based on the core competencies for the role (e.g., extracurricular leadership roles, military experience, volunteer roles, work experience, etc.).

Preferred Skills:
- Experience developing SaaS products over public cloud infrastructure (AWS/Azure/GCP).
- Experience with big data/ML and S3.
- Hands-on experience with streaming technologies like Kafka.
- Experience with Elasticsearch.
- Experience with Terraform, Kubernetes, and Docker.
- Experience working in a high-paced and rapidly growing multinational organization.

Benefits & Perks
Comprehensive benefits package including well-being reimbursement, generous parental leave, adoption assistance, fertility benefits, and more. World-class enablement and on-demand training with Trailhead.com, exposure to executive thought leaders, regular 1:1 coaching with leadership, and volunteer opportunities with participation in our 1:1:1 model for giving back to the community. For more details, visit https://www.salesforcebenefits.com/
Posted 2 months ago
5.0 - 9.0 years
0 Lacs
Pune, Maharashtra
On-site
You should have a good knowledge of the JavaScript language and expertise in the Cypress automation framework through BDD. Being well-versed with Git repositories and having expertise in RESTful API automation and UI automation are essential, along with experience in performance testing using Gatling. Knowledge of AWS services like S3, SES, EC2, IAM, and Lambda is a plus. Hands-on experience with DevOps tools such as Docker, Kubernetes, GitLab/Jenkins, and CI/CD implementation is necessary, as is basic networking knowledge. You should have good experience in testing and test management, along with experience working in the Unified Communications domain.

Your key responsibilities will include:
- Participating in Pre-PI and PI planning and all SAFe Agile ceremonies.
- Creating an automation test framework from scratch and working on automated scripts, back-end scripting, and the user interface.
- Implementing new ideas to enhance the automation framework and conducting performance tests using Gatling.
- Taking part in peer/junior code reviews, supporting juniors in creating reusable keywords/utils, and training new joiners and juniors through knowledge sharing.
- Monitoring and managing changes in regression tests, understanding and implementing the CI/CD pipeline flow, and pushing code daily into the code repository.
- Understanding the product by conducting exploratory testing manually, and participating in estimation, Agile Scrum ceremonies, and backlog grooming.
- Collating and monitoring the defect management process, uploading test cases in JIRA, providing daily status updates of stories in JIRA, and generating reports.
- Providing input to the Test Manager.
Posted 2 months ago
5.0 - 9.0 years
0 Lacs
Maharashtra
On-site
Data Scientist (5+ years of experience)

We are seeking a highly motivated Data Scientist with over 5 years of hands-on experience in data mining, statistical analysis, and developing high-quality machine learning models. The ideal candidate will have a passion for solving real-world problems using data-driven approaches and possess strong technical expertise across various data science domains.

Key Responsibilities:
- Apply advanced data mining techniques and statistical analysis to extract actionable insights.
- Design, develop, and deploy robust machine learning models to address complex business challenges.
- Conduct A/B and multivariate experiments to evaluate model performance and optimize outcomes.
- Monitor, analyze, and enhance the performance of machine learning models post-deployment.
- Collaborate cross-functionally to build customer cohorts for CRM campaigns and conduct market basket analysis.
- Stay updated with state-of-the-art techniques in NLP, particularly within the e-commerce domain.

Required Skills & Qualifications:
- Programming & tools: proficiency in Python, PySpark, and SQL for data manipulation and analysis.
- Machine learning & AI: strong experience with ML libraries (e.g., Scikit-learn, TensorFlow, PyTorch) and expertise in NLP, computer vision, recommender systems, and optimization techniques.
- Cloud & big data: hands-on experience with AWS services, including Glue, EKS, S3, SageMaker, and Redshift.
- Model deployment: experience deploying pre-trained models from platforms like Hugging Face and AWS Bedrock.
- DevOps & MLOps: understanding of Git, Docker, and CI/CD pipelines, and deploying models with frameworks such as FastAPI.
- Advanced NLP: experience in building, retraining, and optimizing NLP models for diverse use cases.

Preferred Qualifications:
- Strong research mindset with a keen interest in exploring new data science methodologies.
- Background in e-commerce analytics is a plus.

If you're passionate about leveraging data to drive impactful business decisions and thrive in a dynamic environment, we'd love to hear from you!
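To ground the model-development loop described above, here is a minimal scikit-learn sketch on a synthetic dataset so it runs standalone; the model choice and metric are illustrative, not prescribed by the posting.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for real customer data.
X, y = make_classification(n_samples=5_000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = GradientBoostingClassifier(random_state=42)
model.fit(X_train, y_train)

# Holdout AUC is the kind of metric tracked in post-deployment monitoring.
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"holdout AUC: {auc:.3f}")
```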
Posted 2 months ago
7.0 - 12.0 years
17 - 27 Lacs
Hyderabad
Work from Office
Job Title: Data Quality Engineer

Mandatory skills: data engineering, Python, AWS, SQL, Glue, Lambda, S3, SNS, SQS, ML

Job Summary: We are seeking a highly skilled Data Engineer (SDET) to join our team, responsible for ensuring the quality and reliability of complex data workflows, data migrations, and analytics solutions across both cloud and on-premises environments. The ideal candidate will have extensive experience in SQL, Python, AWS, and ETL testing, along with a strong background in data quality assurance, data science platforms, DevOps pipelines, and automation frameworks. This role involves close collaboration with business analysts, developers, and data architects to support end-to-end testing, data validation, and continuous integration for data products. Expertise in tools like Redshift, EMR, Athena, and Jenkins and in various ETL platforms is essential, as is experience with NoSQL databases, big data technologies, and cloud-native testing strategies.

Role and Responsibilities:
- Work with business stakeholders, business systems analysts, and developers to ensure quality delivery of software.
- Interact with key business functions to confirm data quality policies and governed attributes.
- Follow quality management best practices and processes to bring consistency and completeness to integration service testing.
- Design and manage the testing AWS environments of data workflows during development and deployment of data products.
- Assist the team in test estimation and test planning.
- Design and develop reports and dashboards.
- Analyze and evaluate data sources, data volume, and business rules.
- Apply proficiency with SQL and familiarity with Python, Scala, Athena, EMR, Redshift, and AWS; work with NoSQL and unstructured data.
- Use programming tools ranging from MapReduce to HiveQL, and data science platforms like SageMaker, Machine Learning Studio, or H2O.
- Own the data flow and test strategy for cloud/on-prem ETL testing.
- Interpret and analyze data from various source systems to support data integration and data reporting needs.
- Test database applications to validate source-to-destination data movement and transformation.
- Work with team leads to prioritize business and information needs.
- Develop complex SQL scripts (primarily advanced SQL) for cloud and on-prem ETL.
- Develop and summarize data quality analysis and dashboards.
- Apply knowledge of data modeling and data warehousing concepts with emphasis on cloud/on-prem ETL.
- Execute testing of data analytics and data integration on time and within budget.
- Troubleshoot and determine the best resolution for data issues and anomalies.
- Perform functional testing, regression testing, system testing, integration testing, and end-to-end testing.
- Maintain a deep understanding of data architecture and data modeling best practices and guidelines for different data and analytics platforms.

Required Skills and Qualifications:
- Extensive experience in data migration is a must (Teradata to Redshift preferred), covering both data migration and data transformation testing.
- Extensive testing experience with SQL/Unix/Linux scripting is a must.
- Extensive experience testing cloud/on-prem ETL tools (e.g., Ab Initio, Informatica, SSIS, DataStage, Alteryx, Glue).
- Extensive experience with DBMSs like Oracle, Teradata, SQL Server, DB2, Redshift, Postgres, and Sybase.
- Extensive experience using Python scripting and AWS and cloud technologies, including Athena, EMR, and Redshift.
- Experience in large-scale application development testing of cloud/on-prem data warehouses, data lakes, and data science platforms.
- Experience with multi-year, large-scale projects.
- Expert technical skills with hands-on testing experience using SQL queries.
- API/Rest Assured automation, building reusable frameworks, and good technical expertise/acumen.
- Java/JavaScript: core Java, integration, and API implementation.
- Functional/UI testing with Selenium, BDD/Cucumber, SpecFlow, data validation with Kafka and big data, plus automation experience using Cypress.
- AWS/cloud: Jenkins/GitLab/EC2 machines, S3, building Jenkins CI/CD pipelines, and Sauce Labs.

Preferred Skills:
- REST APIs and microservices using JSON, SoapUI.
- Extensive experience in the DevOps/DataOps space, including working with DevOps and build pipelines.
- Strong experience with AWS data services including Redshift, Glue, Kinesis, Kafka (MSK), EMR/Spark, SageMaker, etc.
- Experience with technologies like Kubeflow, EKS, and Docker.
- Extensive experience with NoSQL and unstructured data stores like MongoDB, Cassandra, Redis, and ZooKeeper.
- Extensive experience in MapReduce using tools like Hadoop, Hive, Pig, Kafka, S4, and MapR.
- Experience using Jenkins and GitLab.
- Experience using both Waterfall and Agile methodologies.
- Experience in testing storage tools like S3 and HDFS.
- Experience with one or more industry-standard defect or test case management tools.
- Great communication skills (regularly interacts with cross-functional team members).
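The source-to-destination validation above is the core SDET task here. A minimal sketch follows that compares row counts and a numeric checksum across two DB-API cursors (e.g., a Teradata source and a Redshift target); the table and column names are hypothetical.

```python
def fetch_scalar(cursor, sql: str):
    """Run a single-value query on any DB-API cursor."""
    cursor.execute(sql)
    return cursor.fetchone()[0]


def validate_migration(src_cur, dst_cur, table: str, amount_col: str) -> list[str]:
    """Compare row counts and a numeric checksum between source and target.

    src_cur/dst_cur are DB-API cursors from the source and target databases;
    table and amount_col are placeholders for real names.
    """
    failures = []

    src_rows = fetch_scalar(src_cur, f"SELECT COUNT(*) FROM {table}")
    dst_rows = fetch_scalar(dst_cur, f"SELECT COUNT(*) FROM {table}")
    if src_rows != dst_rows:
        failures.append(f"row count mismatch: {src_rows} vs {dst_rows}")

    src_sum = fetch_scalar(src_cur, f"SELECT COALESCE(SUM({amount_col}), 0) FROM {table}")
    dst_sum = fetch_scalar(dst_cur, f"SELECT COALESCE(SUM({amount_col}), 0) FROM {table}")
    if src_sum != dst_sum:
        failures.append(f"checksum mismatch on {amount_col}: {src_sum} vs {dst_sum}")

    return failures
```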
Posted 2 months ago
8.0 - 11.0 years
30 - 35 Lacs
Hyderabad
Work from Office
NP: Immediate to 15 days.

Required Skills & Qualifications:
- Strong experience in backend development using Java (Java 8 or later).
- Hands-on experience with front-end technologies such as Vue.js or Angular (with a strong preference for working with Vue.js).
- Solid understanding of PostgreSQL and the ability to write optimized SQL queries and stored procedures.
- AWS cloud experience with knowledge of services like EC2, RDS, S3, Lambda, API Gateway, etc.
- Experience building and consuming RESTful APIs.
- Proficiency with version control systems (e.g., Git).
- Familiarity with Agile/Scrum methodologies.
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration skills.
Posted 2 months ago
5.0 - 10.0 years
20 - 35 Lacs
Kochi, Bengaluru
Work from Office
Job Summary: We are seeking a highly skilled and motivated Machine Learning Engineer with a strong foundation in programming and machine learning, hands-on experience with AWS machine learning services (especially SageMaker), and a solid understanding of data engineering and MLOps practices. You will be responsible for designing, developing, deploying, and maintaining scalable ML solutions in a cloud-native environment.

Key Responsibilities:
- Design and implement machine learning models and pipelines using AWS SageMaker and related services.
- Develop and maintain robust data pipelines for training and inference workflows.
- Collaborate with data scientists, engineers, and product teams to translate business requirements into ML solutions.
- Implement MLOps best practices, including CI/CD for ML, model versioning, monitoring, and retraining strategies.
- Optimize model performance and ensure scalability and reliability in production environments.
- Monitor deployed models for drift, performance degradation, and anomalies.
- Document processes, architectures, and workflows for reproducibility and compliance.

Required Skills & Qualifications:
- Strong programming skills in Python and familiarity with ML libraries (e.g., scikit-learn, TensorFlow, PyTorch).
- Solid understanding of machine learning algorithms, model evaluation, and tuning.
- Hands-on experience with AWS ML services, especially SageMaker, S3, Lambda, Step Functions, and CloudWatch.
- Experience with data engineering tools (e.g., Apache Airflow, Spark, Glue) and workflow orchestration.
- Proficiency in MLOps tools and practices (e.g., MLflow, Kubeflow, CI/CD pipelines, Docker, Kubernetes).
- Familiarity with monitoring tools and logging frameworks for ML systems.
- Excellent problem-solving and communication skills.

Preferred Qualifications:
- AWS certification (e.g., AWS Certified Machine Learning Specialty).
- Experience with real-time inference and streaming data.
- Knowledge of data governance, security, and compliance in ML systems.
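For orientation on the SageMaker work this posting centers on, here is a minimal sketch of launching a training job with the SageMaker Python SDK; the IAM role ARN, S3 paths, instance types, framework version, and the train.py entry script are hypothetical placeholders.

```python
import sagemaker
from sagemaker.sklearn.estimator import SKLearn

session = sagemaker.Session()

# Role ARN, bucket, and entry script are placeholders.
estimator = SKLearn(
    entry_point="train.py",  # your training script
    role="arn:aws:iam::123456789012:role/sagemaker-exec",
    instance_type="ml.m5.xlarge",
    instance_count=1,
    framework_version="1.2-1",
    sagemaker_session=session,
)

# Kick off a managed training job against data staged in S3.
estimator.fit({"train": "s3://example-bucket/training-data/"})

# Deploy the trained model behind a real-time endpoint.
predictor = estimator.deploy(initial_instance_count=1, instance_type="ml.m5.large")
```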
Posted 2 months ago
6.0 - 7.0 years
27 - 35 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Role & responsibilities:
- Provide technical leadership and mentorship to data engineering teams.
- Architect, design, and deploy scalable, secure, and high-performance data pipelines.
- Collaborate with stakeholders, clients, and cross-functional teams to deliver end-to-end data solutions.
- Drive technical strategy and implementation plans in alignment with business needs.
- Oversee project execution using tools like JIRA, ensuring timely delivery and adherence to best practices.
- Implement and maintain CI/CD pipelines and automation tools to streamline development workflows.
- Promote best practices in data engineering and AWS implementations across the team.

Preferred candidate profile:
- Strong hands-on expertise in Python, PySpark, and Spark architecture, including performance tuning and optimization.
- Advanced proficiency in SQL and experience writing optimized stored procedures.
- In-depth knowledge of the AWS data engineering stack, including AWS Glue, Lambda, API Gateway, EMR, S3, Redshift, and Athena.
- Experience with Infrastructure as Code (IaC) using CloudFormation and Terraform.
- Familiarity with Unix/Linux scripting and system administration is a plus.
- Proven ability to design and deploy robust, production-grade data solutions.
Posted 2 months ago
3.0 - 5.0 years
10 - 15 Lacs
Mumbai, Aurangabad
Work from Office
Joining: Immediate joiners preferred.

Job Summary: We are seeking a skilled and motivated Data Developer with 3 to 5 years of hands-on experience in designing, developing, and maintaining scalable data solutions. The ideal candidate will work closely with data architects, data analysts, and application developers to build efficient data pipelines, transform data, and support data integration across various platforms.

Key Responsibilities:
- Design, develop, and maintain ETL/ELT pipelines to ingest, transform, and load data from various sources (structured and unstructured).
- Develop and optimize SQL queries, stored procedures, views, and functions for data analysis and reporting.
- Work with data warehousing technologies (e.g., Snowflake, Redshift, BigQuery, Azure Synapse) to support business intelligence solutions.
- Collaborate with data engineers and analysts to implement robust data models and schemas for analytics.
- Ensure data quality, consistency, and accuracy through data validation, testing, and monitoring.
- Implement data security, compliance, and governance protocols in alignment with organizational policies.
- Maintain documentation related to data sources, data flows, and business rules.
- Participate in code reviews, sprint planning, and agile development practices.

Technical Skills Required:
- Languages & tools: SQL (advanced proficiency required), Python or Scala for data processing, shell scripting (Bash, PowerShell).
- ETL tools / data integration: Apache NiFi, Talend, Informatica, Azure Data Factory, SSIS, or equivalent.
- Data warehousing & databases: Snowflake, Amazon Redshift, Google BigQuery, Azure Synapse; SQL Server, PostgreSQL, Oracle, or MySQL.
- Cloud platforms (at least one): AWS (Glue, S3, Redshift, Lambda), Azure (ADF, Blob Storage, Synapse), GCP (Dataflow, BigQuery, Cloud Storage).
- Big data & streaming (nice to have): Apache Spark, Databricks, Kafka, Hadoop ecosystem.
- Version control & DevOps: Git, Bitbucket, CI/CD pipelines (Jenkins, GitHub Actions).

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 3-5 years of professional experience as a Data Developer or Data Engineer.
- Strong problem-solving skills and the ability to work both independently and in a team environment.
- Experience working in Agile/Scrum teams is a plus.
- Excellent communication and documentation skills.

Preferred Certifications (Optional):
- Microsoft Certified: Azure Data Engineer Associate
- AWS Certified Data Analytics Specialty
- Google Cloud Professional Data Engineer
Posted 2 months ago
0.0 - 3.0 years
3 - 8 Lacs
Chennai
Hybrid
Key Responsibilities

AWS Infrastructure Management
- Design, deploy, and manage AWS infrastructure using services such as EC2, ECS, EKS, Lambda, RDS, S3, VPC, and CloudFront.
- Implement and maintain Infrastructure as Code using AWS CloudFormation, AWS CDK, or Terraform.
- Optimize AWS resource utilization and costs through rightsizing, reserved instances, and automated scaling.
- Manage multi-account AWS environments using AWS Organizations and Control Tower.
- Implement disaster recovery and backup strategies using AWS services.

CI/CD Pipeline Development
- Build and maintain CI/CD pipelines using AWS CodePipeline, CodeBuild, CodeDeploy, and CodeCommit.
- Integrate with third-party tools like Jenkins, GitLab CI, or GitHub Actions when needed.
- Implement automated testing and security scanning within deployment pipelines.
- Manage deployment strategies, including blue-green deployments, using AWS services.
- Automate application deployments to ECS, EKS, Lambda, and EC2 environments.

Container and Serverless Management
- Deploy and manage containerized applications using Amazon ECS and Amazon EKS.
- Implement serverless architectures using AWS Lambda, API Gateway, and Step Functions.
- Manage container registries using Amazon ECR.
- Optimize container and serverless application performance and costs.
- Implement service mesh architectures using AWS App Mesh when applicable.

Monitoring and Observability
- Implement comprehensive monitoring using Amazon CloudWatch, AWS X-Ray, and AWS Systems Manager.
- Set up alerting and dashboards for proactive incident management.
- Configure log aggregation and analysis using CloudWatch Logs and AWS OpenSearch.
- Implement distributed tracing for microservices architectures.
- Create and maintain operational runbooks and documentation.

Security and Compliance
- Implement AWS security best practices using IAM, Security Groups, NACLs, and AWS Config.
- Manage secrets and credentials using AWS Secrets Manager and Systems Manager Parameter Store.
- Implement compliance frameworks and automated security scanning.
- Configure AWS GuardDuty, AWS Inspector, and AWS Security Hub for threat detection.
- Manage SSL/TLS certificates using AWS Certificate Manager.

Automation and Scripting
- Develop automation scripts using Python, Bash, and the AWS CLI/SDK.
- Create AWS Lambda functions for operational automation.
- Implement event-driven automation using CloudWatch Events and EventBridge.
- Automate backup, patching, and maintenance tasks using AWS Systems Manager.
- Build custom tools and utilities to improve operational efficiency.

Required Qualifications

AWS Expertise
- Strong experience with core AWS services: EC2, S3, RDS, VPC, IAM, CloudFormation.
- Experience with container services (ECS, EKS) and serverless technologies (Lambda, API Gateway).
- Proficiency with AWS networking concepts and security best practices.
- Experience with AWS monitoring and logging services (CloudWatch, X-Ray).

Technical Skills
- Expertise in Infrastructure as Code using CloudFormation, CDK, or Terraform.
- Strong scripting skills in Python, Bash, or PowerShell.
- Experience with CI/CD tools, preferably AWS-native services and Bitbucket Pipelines.
- Knowledge of containerization with Docker and orchestration with Kubernetes.
- Understanding of microservices architecture and distributed systems.
- Experience with configuration management and automation tools.

DevOps Practices
- Strong understanding of CI/CD best practices and GitOps workflows.
- Experience with automated testing and deployment strategies.
- Knowledge of monitoring, alerting, and incident response procedures.
- Understanding of security scanning and compliance automation.

AWS Services Experience
- Compute & containers: Amazon EC2, ECS, EKS, Fargate, Lambda, Batch.
- Storage & database: Amazon S3, EBS, EFS, RDS, DynamoDB, ElastiCache, Redshift.
- Networking & security: VPC, Route 53, CloudFront, ALB/NLB, IAM, Secrets Manager, Certificate Manager.
- Developer tools: CodePipeline, CodeBuild, CodeDeploy, CodeCommit, CodeArtifact.
- Monitoring & management: CloudWatch, X-Ray, Systems Manager, Config, CloudTrail, AWS OpenSearch.
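Infrastructure as Code with the AWS CDK heads the responsibilities above; here is a minimal CDK v2 sketch in Python defining a versioned artifact bucket and a housekeeping Lambda with a least-privilege grant. The stack name, construct IDs, and the handler/ asset directory are hypothetical; deployment would be the usual `cdk deploy` after installing aws-cdk-lib.

```python
import aws_cdk as cdk
from aws_cdk import aws_lambda as _lambda
from aws_cdk import aws_s3 as s3
from constructs import Construct


class OpsStack(cdk.Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Versioned bucket for build artifacts; CDK generates the physical name.
        artifacts = s3.Bucket(self, "Artifacts", versioned=True)

        # Operational-automation Lambda; "handler/" is a placeholder asset dir
        # containing app.py with a `handler` function.
        fn = _lambda.Function(
            self,
            "Housekeeping",
            runtime=_lambda.Runtime.PYTHON_3_11,
            handler="app.handler",
            code=_lambda.Code.from_asset("handler"),
        )

        # Least-privilege grant instead of a broad IAM policy.
        artifacts.grant_read(fn)


app = cdk.App()
OpsStack(app, "OpsStack")
app.synth()
```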
Posted 2 months ago