5.0 - 10.0 years
12 - 22 Lacs
kolkata, bengaluru, mumbai (all areas)
Hybrid
Job Description

Role & Responsibilities:
- Design and develop highly scalable backend applications using Node.js, JavaScript, and TypeScript.
- Build and maintain RESTful APIs and microservices.
- Implement complex MongoDB aggregation queries (MongoDB Atlas experience is a plus); a small aggregation sketch follows this listing.
- Work with AWS services (Lambda, SQS, S3, CloudWatch) for serverless and event-driven applications.
- Ensure application security, performance, and scalability.
- Collaborate with front-end developers, architects, and DevOps teams.
- Debug production issues, monitor logs, and optimize workflows.

Preferred Candidate Profile:
- 5-10 years of backend development experience.
- Strong expertise in Node.js, REST APIs, and MongoDB aggregation.
- Hands-on AWS cloud exposure (Lambda, SQS, S3, CloudWatch).
- Experience with microservices-based architecture.
- Knowledge of GraphQL is an advantage.
- Strong problem-solving skills and the ability to work in a fast-paced environment.
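To make the aggregation requirement above concrete, here is a minimal sketch of the kind of pipeline the role describes. The posting is Node.js-centric, but aggregation pipeline syntax is essentially identical across drivers; it is shown here in Python with pymongo, and the connection string, database, collection, and field names are all hypothetical.

```python
# Minimal MongoDB aggregation sketch: top customers by completed-order spend.
# "shop", "orders", and all field names are invented for illustration.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # an Atlas URI in production
db = client["shop"]

pipeline = [
    {"$match": {"status": "completed"}},          # keep only completed orders
    {"$group": {
        "_id": "$customerId",
        "totalSpend": {"$sum": "$amount"},
        "orders": {"$sum": 1},
    }},
    {"$sort": {"totalSpend": -1}},                # highest spenders first
    {"$limit": 10},
]

for doc in db.orders.aggregate(pipeline):
    print(doc["_id"], doc["totalSpend"], doc["orders"])
```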
Posted 2 days ago
8.0 - 12.0 years
0 Lacs
bangalore, karnataka
On-site
Role Overview:
You will be responsible for architecting and delivering highly scalable, distributed, cloud-based enterprise data solutions. Your role will involve designing scalable data architectures with Snowflake, integrating cloud technologies such as AWS, Azure, and GCP, and ETL/ELT tools like DBT. Additionally, you will guide teams in proper data modeling, transformation, security, and performance optimization.

Key Responsibilities:
- Architect and deliver highly scalable, distributed, cloud-based enterprise data solutions
- Design scalable data architectures with Snowflake and integrate cloud technologies like AWS, Azure, and GCP, and ETL/ELT tools such as DBT
- Guide teams in proper data modeling (star and snowflake schemas), transformation, security, and performance optimization
- Load data from disparate data sets and translate complex functional and technical requirements into detailed design
- Deploy Snowflake features such as data sharing, events, and lakehouse patterns
- Implement data security and data access controls and design
- Understand relational and NoSQL data stores, methods, and approaches (star and snowflake, dimensional modeling)
- Utilize AWS, Azure, or GCP data storage and management technologies such as S3, Blob/ADLS, and Google Cloud Storage
- Implement Lambda and Kappa architectures
- Utilize Big Data frameworks and related technologies, with mandatory experience in Hadoop and Spark
- Utilize AWS compute services like EMR, Glue, and SageMaker, as well as storage services like S3, Redshift, and DynamoDB
- Work with AWS streaming services like Kinesis, SQS, and MSK
- Troubleshoot and perform performance tuning in the Spark framework - Spark Core, SQL, and Spark Streaming
- Use workflow tools like Airflow, NiFi, or Luigi
- Apply application DevOps tools (Git, CI/CD frameworks) - experience in Jenkins or GitLab, with rich experience in source code management tools like CodePipeline, CodeBuild, and CodeCommit
- Work with AWS CloudWatch, AWS CloudTrail, AWS Account Config, and AWS Config Rules

Qualifications Required:
- 8-12 years of relevant experience
- Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, ETL data pipelines, and Big Data modeling techniques using Python/Java (a bulk-load sketch follows below)
- Strong expertise in end-to-end implementation of cloud data engineering solutions such as an Enterprise Data Lake or Data Hub on AWS
- Proficiency in AWS, Databricks, and Snowflake data warehousing, including SQL and Snowpipe
- Experience in data security, data access controls, and design
- Strong AWS hands-on expertise with a programming background, preferably Python/Scala
- Good knowledge of Big Data frameworks and related technologies, with mandatory experience in Hadoop and Spark
- Good experience with AWS compute services like EMR, Glue, and SageMaker, and storage services like S3, Redshift, and DynamoDB
- Experience with AWS streaming services like Kinesis, SQS, and MSK
- Troubleshooting and performance tuning experience in the Spark framework - Spark Core, SQL, and Spark Streaming
- Experience with one of the workflow tools Airflow, NiFi, or Luigi
- Good knowledge of application DevOps tools (Git, CI/CD frameworks) - experience in Jenkins or GitLab, with rich experience in source code management tools like CodePipeline, CodeBuild, and CodeCommit
- Experience with AWS CloudWatch, AWS CloudTrail, AWS Account Config, and AWS Config Rules

Kindly share your profile at dhamma.b.bhawsagar@pwc.com if you are interested in this opportunity.
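As one concrete illustration of the bulk-loading duties listed above, here is a hedged sketch of an S3-to-Snowflake load using the official snowflake-connector-python package. The account, credentials, external stage, and table names are placeholders, and the stage is assumed to already exist.

```python
# Hedged sketch: bulk-load staged Parquet files into a Snowflake table.
# All identifiers and credentials below are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",     # placeholder Snowflake account locator
    user="etl_user",
    password="***",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)
cur = conn.cursor()
try:
    # COPY INTO pulls files from an existing external S3 stage.
    cur.execute("""
        COPY INTO RAW.SALES
        FROM @S3_RAW_STAGE/sales/
        FILE_FORMAT = (TYPE = PARQUET)
        MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
    """)
finally:
    cur.close()
    conn.close()
```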
Posted 3 days ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
Saarthee is a Global Strategy, Analytics, Technology, and AI consulting company dedicated to helping organizations succeed by providing comprehensive data and analytics solutions. Our holistic and tool-agnostic approach sets us apart in the marketplace, ensuring that we meet our customers at every stage of their data journey. With a diverse and global team united by a commitment to our customers' success, we strive to guide organizations towards insights-driven achievements.

As a Java Developer at Saarthee, you will play a crucial role in designing, implementing, and integrating scalable backend solutions using Java 11+, Spring Boot, and related technologies. Your responsibilities will include developing and maintaining RESTful APIs that seamlessly integrate with AWS services like DynamoDB, S3, SQS, and Lambda (an illustrative queue-consumer sketch follows below). Collaborating with various stakeholders, you will ensure the successful delivery of projects, support existing services, and drive technical improvements in both new and established systems.

Key Responsibilities:
- Design, implement, and integrate backend solutions using Java 11+, Spring Boot, and related technologies.
- Develop and maintain RESTful APIs, integrating with AWS services like DynamoDB, S3, SQS, and Lambda.
- Collaborate with cross-functional teams to deliver projects successfully.
- Support and enhance existing live services and products.
- Conduct peer code reviews, mentor junior team members, and ensure adherence to best practices.
- Drive functional and technical improvements in services, incorporating caching technologies and CI/CD tools when necessary.
- Debug and resolve complex issues in high-volume systems.
- Uphold high standards of code quality, scalability, and maintainability.
- Actively participate in team discussions and contribute to achieving project goals.

Required Skills:

Mandatory:
- Proficiency in Java 11+, Spring Boot, and REST API development.
- Strong expertise in AWS services like DynamoDB, S3, SQS, and Lambda.
- Excellent debugging and problem-solving skills.
- Ability to work independently and deliver high-quality solutions.
- Hands-on experience with high-volume systems (preferred).

Good to Have:
- Experience with Jenkins and CI/CD pipelines.
- Knowledge of caching technologies such as Redis and Memcached.

Saarthee offers a stable and financially sound environment with competitive remuneration, including additional compensation tied to project execution and business development. Join our team for professional growth opportunities, a results-driven culture, and a commitment to mentoring and learning.
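The role above is Java/Spring Boot, but the SQS consume-process-delete pattern it implies is language-neutral. As a rough illustration only, here it is sketched with Python and boto3; the queue URL and region are invented.

```python
# Hedged sketch of the standard SQS worker loop: long-poll, process, delete.
import boto3

sqs = boto3.client("sqs", region_name="ap-south-1")
QUEUE_URL = "https://sqs.ap-south-1.amazonaws.com/123456789012/orders"  # hypothetical

# Long-poll for up to 10 messages; WaitTimeSeconds avoids busy-polling.
resp = sqs.receive_message(
    QueueUrl=QUEUE_URL, MaxNumberOfMessages=10, WaitTimeSeconds=20
)
for msg in resp.get("Messages", []):
    print("processing", msg["Body"])
    # Delete only after successful processing so failures are redelivered.
    sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])
```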
Posted 6 days ago
5.0 - 10.0 years
0 Lacs
hyderabad, telangana
On-site
You are a Senior .NET Developer with a strong background in building scalable APIs and cloud-native solutions using AWS. Your primary focus will be on logistics/transportation systems, where you will be responsible for providing leadership, mentoring, and hands-on development using modern .NET technologies.

Key Technical Skills
- Languages/Frameworks: You should be proficient in C#, .NET Core, and ASP.NET Core Web API.
- Data Access: Experience with Entity Framework Core, LINQ, Dapper (Micro-ORM), and PostgreSQL is required.
- Cloud: Hands-on experience with AWS services such as Lambda, S3, SQS/SNS, X-Ray, and Secrets Manager is essential.
- Architecture: Familiarity with Clean Architecture, SOLID principles, and microservices is expected.
- Testing/Dev Tools: Knowledge of xUnit, Swagger, Git, FluentValidation, and Serilog is a plus.
- Security: Understanding of CORS, API versioning, and secure coding practices is necessary.

Responsibilities
- Design and develop high-performance APIs and microservices to meet project requirements.
- Build and deploy cloud-native applications using various AWS services according to best practices.
- Lead code reviews, ensuring clean code and architecture principles are followed throughout the development process.
- Mentor junior developers, guiding them in adopting best practices and improving their technical skills.
- Collaborate with the team on requirement analysis, project planning, and estimation to deliver high-quality solutions.
- Maintain documentation, ensure adequate test coverage, and focus on system reliability to meet business needs.

Preferred Experience & Domain
- You should have 5-10 years of hands-on experience in .NET development.
- A minimum of 2 years of experience working with AWS cloud infrastructure is required.
- Experience in logistics or transportation domains is preferred for this role.
- Any prior experience in technical leadership or mentoring will be beneficial.

Soft Skills
- Strong problem-solving and debugging skills are essential for this position.
- Excellent communication and collaboration abilities are necessary to work effectively with the team.
- Experience in project planning, estimation, and successful project execution is valuable.
- A passion for knowledge sharing and team growth is highly encouraged.

Education & Certifications
- A Bachelor's degree in computer science or a related field is required.
- Certifications in AWS or Microsoft .NET will be advantageous for this role.
Posted 6 days ago
6.0 - 15.0 years
0 Lacs
karnataka
On-site
The role involves designing and developing scalable BI and Data Warehouse (DWH) solutions, leveraging tools like Power BI, Tableau, and Azure Databricks. Responsibilities include overseeing ETL processes using SSIS, creating efficient data models, and writing complex SQL queries for data transformation. You will design interactive dashboards and reports, working closely with stakeholders to translate requirements into actionable insights.

The role requires expertise in performance optimization, data quality, and governance. It includes mentoring junior developers, leveraging Python for data analysis (Pandas, NumPy, PySpark), and scripting ETL workflows with tools like Airflow (a small pandas example follows below). Experience with cloud platforms (AWS S3, Azure SDK) and managing databases such as Snowflake, Postgres, Redshift, and MongoDB is essential.

Qualifications include 6-15+ years of BI architecture and development experience, a strong background in ETL (SSIS), advanced SQL skills, and familiarity with the CRISP-DM model. You should also possess skills in web scraping, REST API interaction, and data serialization (JSON, CSV, Parquet). Strong programming foundations with Python and experience in version control for code quality and collaboration are required for managing end-to-end BI projects.
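As a small, illustrative example of the Python-for-data-analysis work this role mentions, here is a pandas aggregation of the kind a BI dashboard might consume. The CSV path and column names are assumptions, not anything from the posting.

```python
# Illustrative pandas transform: monthly revenue per region from a sales file.
# "sales.csv" and its columns are hypothetical.
import pandas as pd

df = pd.read_csv("sales.csv", parse_dates=["order_date"])

monthly = (
    df.assign(month=df["order_date"].dt.to_period("M"))
      .groupby(["region", "month"], as_index=False)["revenue"].sum()
)
monthly.to_csv("monthly_revenue.csv", index=False)  # feed for a dashboard
```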
Posted 6 days ago
3.0 - 8.0 years
3 - 8 Lacs
mumbai, pune, bengaluru
Work from Office
Role Overview (looking for an immediate joiner, or one who can join within 30 days)
We are looking for an experienced AWS Support Engineer to join our production support and infrastructure team in the Banking domain. The role involves supporting, maintaining, and optimizing AWS-based applications and infrastructure, ensuring reliability, security, and scalability of business-critical systems. The ideal candidate will have strong hands-on AWS expertise along with problem-solving and troubleshooting skills in a 24/7 support environment.

Key Responsibilities
- Collaborate with cross-functional teams to support, monitor, and troubleshoot AWS cloud-based applications.
- Manage and maintain AWS infrastructure including EC2, S3, Lambda, RDS, CloudFormation, IAM, and VPC.
- Ensure scalability, reliability, performance tuning, and security compliance of cloud applications.
- Provide L2/L3 production support for AWS-based applications in the BFSI domain.
- Troubleshoot incidents, perform RCA (Root Cause Analysis), and provide permanent fixes.
- Automate repetitive tasks using scripting (Shell/Python/Golang) and IaC tools (CloudFormation/Terraform); a small monitoring sketch follows below.
- Monitor cloud costs and optimize infrastructure for performance and efficiency.
- Stay updated with the latest AWS tools, features, and security best practices.
- Mentor junior support engineers and provide technical guidance when needed.

Basic Qualifications
- Bachelor's degree in Computer Science, Engineering, or equivalent experience.
- 3-8 years of hands-on experience with AWS cloud services.
- Proficiency in AWS services including EC2, S3, Lambda, RDS, CloudFormation, VPC, and IAM.
- Hands-on development or scripting experience in Java, C#, Python, or Golang.
- Strong understanding of cloud architecture principles, networking, and security.
- Experience with Infrastructure as Code (IaC) using CloudFormation or Terraform.
- Familiarity with monitoring/logging tools (CloudWatch, Splunk, Dynatrace, AppDynamics).
- Exposure to incident management tools (ServiceNow, JIRA, Remedy, etc.).
- Strong communication skills for working with technical and non-technical stakeholders.
- Prior BFSI/Banking domain experience is highly preferred.
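As a minimal sketch of the scripting and monitoring duties above, the snippet below pulls average EC2 CPU utilization for the last hour with boto3 and CloudWatch. The instance ID and region are placeholders.

```python
# Hedged monitoring sketch: hourly EC2 CPU averages via CloudWatch.
from datetime import datetime, timedelta, timezone
import boto3

cw = boto3.client("cloudwatch", region_name="ap-south-1")
end = datetime.now(timezone.utc)

resp = cw.get_metric_statistics(
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],  # placeholder
    StartTime=end - timedelta(hours=1),
    EndTime=end,
    Period=300,                  # 5-minute buckets
    Statistics=["Average"],
)
for point in sorted(resp["Datapoints"], key=lambda p: p["Timestamp"]):
    print(point["Timestamp"], round(point["Average"], 2))
```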
Posted 6 days ago
7.0 - 12.0 years
7 - 17 Lacs
hyderabad
Work from Office
About this role:
Wells Fargo is seeking a Principal Engineer.

In this role, you will:
- Act as an advisor to leadership to develop or influence applications, network, information security, database, operating systems, or web technologies for highly complex business and technical needs across multiple groups
- Lead the strategy and resolution of highly complex and unique challenges requiring in-depth evaluation across multiple areas or the enterprise, delivering solutions that are long-term, large-scale and require vision, creativity, innovation, and advanced analytical and inductive thinking
- Translate advanced technology experience, an in-depth knowledge of the organization's tactical and strategic business objectives, the enterprise technological environment, the organization structure, and strategic technological opportunities and requirements into technical engineering solutions
- Provide vision, direction and expertise to leadership on implementing innovative and significant business solutions
- Maintain knowledge of industry best practices and new technologies and recommend innovations that enhance operations or provide a competitive advantage to the organization
- Strategically engage with all levels of professionals and managers across the enterprise and serve as an expert advisor to leadership

Required Qualifications:
- 7+ years of Engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education

Desired Qualifications:
- Overall 16+ years' experience; lead the data technology transformation and implementation initiative for Corporate & Investment Banking.
- Strong experience with real-time, low-latency data pipelines using Apache Flink.
- Strong experience with GCP cloud transformation and implementation; well versed with BigQuery (an illustrative query sketch follows below).
- Experience in cloud migration from HDFS to GCP.
- Experience working with real-time, low-latency data pipelines.
- Experience with Iceberg and object stores (AWS S3/MinIO) will be an added advantage.
- Well versed with data warehousing/lakehouse methodologies.
- Strong communication and interpersonal skills.

Data Environment Transformation/Simplification
- Lead the application transformation and rationalization initiatives
- Analyze performance trends and recommend process improvements; assess changes for risk to production systems and assure quality, security and compliance requirements
- Evaluate, incubate, and adopt modern technologies and engineering practices, and recommend innovations that provide competitive advantage to the organization
- Develop strategies to improve developer productivity and reduce technology debt
- Develop a strategy to improve data quality and maintenance of data lineage

Architecture Oversight
- Develop a consistent architecture strategy and deliver safe, secure and consistent architecture solutions
- Reduce technology risk by working closely with architects and designing solutions that align to the architecture roadmap, enterprise principles, policies and standards
- Partner with the enterprise cloud platform, CI/CD pipelines, platform teams, architects, engineering managers and the developer community

Job Expectations:
- Act as liaison between business and technical organizations by planning, conducting, and directing the analysis of highly complex business problems to be solved with automated systems
- Provide technical assistance in identifying, evaluating, and developing systems and procedures that are cost effective and meet business requirements
- Act as an internal consultant within technology and business groups by using quality tools and process definition/improvement to re-engineer technical processes for greater efficiencies

We are open to both locations - Hyderabad or Bangalore - and you will be required to work in the office as per the organization's In-Office Adherence / Return to Office (RTO) policy.
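To illustrate the BigQuery side of this role, here is a hedged sketch using the official google-cloud-bigquery client; the dataset, table, and columns are invented, and credentials are assumed to come from the environment.

```python
# Hedged BigQuery sketch: recent daily notional totals from a trades table.
# "cib_analytics.trades" and its columns are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()  # picks up credentials from the environment

sql = """
    SELECT trade_date, SUM(notional) AS total_notional
    FROM `cib_analytics.trades`
    GROUP BY trade_date
    ORDER BY trade_date DESC
    LIMIT 7
"""
for row in client.query(sql).result():
    print(row.trade_date, row.total_notional)
```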
Posted 6 days ago
5.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
As a passionate and skilled Frontend Developer with over 5 years of experience, you will have the opportunity to contribute to various projects and products across different technology stacks. Your expertise in React, Next.js, AngularJS, JavaScript, and jQuery will be essential for this role, along with a strong understanding of PDF generation scripts and basic exposure to AWS S3 and Amplify for hosting.

Your key responsibilities will include developing, maintaining, and enhancing frontend applications using React, Next.js, AngularJS, jQuery, and JavaScript. You will collaborate closely with backend teams to integrate APIs and deliver seamless user experiences. Additionally, you will work on dynamic PDF generation scripts, implement responsive UI designs, ensure cross-browser compatibility, and optimize applications for speed, scalability, and security. Code reviews, maintaining high coding standards, and troubleshooting production issues efficiently will also be part of your role. Basic knowledge of deploying and managing builds using AWS S3 and Amplify is required.

To excel in this role, you should possess strong proficiency in React.js, Next.js, JavaScript (ES6+), AngularJS, and jQuery, plus experience with PDF generation scripts such as jsPDF, Puppeteer, PDFKit, or similar. Understanding of RESTful APIs and data integration, and familiarity with Git or similar version control systems, are also necessary. Knowledge of performance optimization and security best practices, and a basic understanding of AWS S3 and AWS Amplify for application hosting, are key qualifications.

While not mandatory, experience with TypeScript, exposure to CI/CD pipelines, familiarity with Content Security Policy (CSP) implementations, and knowledge of SEO optimization in Next.js would be considered advantageous for this role.
Posted 1 week ago
7.0 - 11.0 years
0 Lacs
hyderabad, telangana
On-site
Our client, one of the United States' largest insurers, is looking for a highly organized and detail-oriented Testing Lead to join their team in developing a next-generation digital platform for the life insurance industry. The platform integrates various SaaS solutions using modern cloud-native architectures and APIs, and your role as a Testing Lead will be crucial in ensuring its quality, performance, and reliability. You will be part of a cross-functional, agile team of engineers, architects, and product leaders who are dedicated to revolutionizing the life insurance experience.

Your responsibilities will include leading the end-to-end testing strategy for a complex, API- and data-driven, cloud-native platform, designing and implementing testing frameworks, collaborating with different teams to ensure test coverage, and driving test automation initiatives and CI/CD pipeline integration.

Key Responsibilities:
- Lead the end-to-end testing strategy for a complex, cloud-native platform
- Design and implement robust testing frameworks
- Collaborate with product managers, developers, and DevOps
- Analyze system and business requirements for testability
- Work closely with cross-functional teams across multiple geographies
- Drive test automation initiatives and CI/CD pipeline integration (a small contract-check sketch follows below)
- Mentor junior QA engineers
- Champion quality best practices and continuous improvement
- Participate in Agile ceremonies such as PI planning, sprint reviews, and retrospectives
- Report test plan execution status through a defect tracking system
- Conduct audits to ensure adherence to standards
- Demonstrate working code to Product and Business Owners

Requirements:
- 7+ years of software testing experience in cloud/applications environments
- BS in Computer Science, Technology, Engineering or similar
- Full English fluency
- Strong understanding of cloud-native architectures, microservices, and RESTful APIs
- SAFe Practitioner or SAFe Agilist certification within 180 days of hire
- Certification in test frameworks highly desired
- Experience in the insurance or financial services domain is a plus

Competencies:
- Excellent problem-solving skills
- Ability to troubleshoot complex integration issues
- Proficiency in using data and automation to improve application quality
- Strong documentation skills
- Excellent communication and organizational skills

Software / Tools:
- Hands-on experience with test automation tools (e.g., Selenium, Postman, JMeter)
- Familiarity with CI/CD tools (e.g., Jenkins, GitHub Actions)
- Familiarity with other cloud services (e.g., AWS S3, Lambda, MuleSoft, CloudHub)

Benefits:
- Competitive compensation and benefits package
- Career development and training opportunities
- Flexible work arrangements
- Dynamic and inclusive work culture
- Private health insurance
- Pension plan
- Paid time off
- Training & development

About Capgemini:
Capgemini is a global leader in partnering with companies to transform and manage their business through technology. With over 340,000 team members in more than 50 countries, Capgemini is trusted by its clients to address the entire breadth of their business needs.
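As one small example of the API-level test automation this role calls for, here is a hypothetical REST contract check of the sort a CI pipeline could run on every build; the endpoint, fields, and status values are placeholders.

```python
# Hedged sketch of an automated API contract check using requests.
# The base URL, path, and expected schema are invented for illustration.
import requests

BASE_URL = "https://api.example.com"  # hypothetical platform endpoint

def test_policy_lookup_returns_ok():
    resp = requests.get(f"{BASE_URL}/v1/policies/12345", timeout=10)
    assert resp.status_code == 200
    body = resp.json()
    # Basic contract assertions: required field present, status in allowed set.
    assert "policyId" in body
    assert body["status"] in {"ACTIVE", "LAPSED", "PENDING"}

if __name__ == "__main__":
    test_policy_lookup_returns_ok()
    print("policy lookup contract check passed")
```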
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
maharashtra
On-site
As a Data Engineer, you will be responsible for building scalable data pipelines using PySpark. Your role will involve implementing complex business logic using Spark SQL, DataFrame, and RDD APIs. You should have strong programming skills in Python, with a solid understanding of data structures, algorithms, and software engineering principles.

Your expertise in designing, developing, and maintaining batch and streaming data pipelines will be crucial. You should be familiar with ETL/ELT processes and best practices for data transformation, data quality, and performance optimization. Knowledge of the modern data engineering ecosystem, including distributed data processing, storage systems, and workflow orchestration tools like Apache Airflow, dbt, and Delta Lake, is desirable.

Experience with cloud data platforms, preferably AWS, is preferred. You should have hands-on experience with AWS services such as S3 for the data lake, Glue/EMR for Spark workloads, Lambda and Step Functions for orchestration, and Redshift or other cloud data warehouses. As an expert in Spark APIs, you should be able to choose and apply the right APIs (DataFrame, Dataset, RDD) for efficient implementation of business logic at scale (a minimal batch-transform sketch follows below).

This role offers a 12+ month contract with a likely long-term opportunity, following a hybrid work mode with an immediate to 15 days notice period. If you have a passion for data engineering and the skills mentioned above, we would like to hear from you.
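A minimal PySpark batch-transform sketch matching the duties above; the S3 paths, columns, and filter values are illustrative only.

```python
# Hedged sketch: daily revenue and distinct customers from completed orders.
# Bucket names and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-orders").getOrCreate()

orders = spark.read.parquet("s3://raw-bucket/orders/")

daily = (
    orders.filter(F.col("status") == "COMPLETED")
          .groupBy(F.to_date("created_at").alias("order_date"))
          .agg(F.sum("amount").alias("revenue"),
               F.countDistinct("customer_id").alias("customers"))
)

# Overwrite the curated layer for idempotent daily reruns.
daily.write.mode("overwrite").parquet("s3://curated-bucket/daily_orders/")
```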
Posted 1 week ago
0.0 - 4.0 years
0 Lacs
noida, uttar pradesh
On-site
As a Software Developer intern at URE Legal Advocates, you will be responsible for designing, developing, and deploying a cross-platform employee monitoring tool that includes screenshot capture, activity tracking, and timesheet monitoring features for Windows and macOS. You must have a strong foundation in Computer Science/Software Engineering, with expertise in algorithms, system design, distributed computing, operating systems, and secure data handling. Knowledge of DPDP Act, 2023 compliance and global privacy frameworks such as GDPR and HIPAA is mandatory.

Your key responsibilities will include:
- Building cross-platform desktop applications for Windows and macOS.
- Implementing automatic screenshot capture, timesheet monitoring, idle detection, and productivity logging.
- Designing modular, scalable, and extensible code architecture.
- Creating Windows installers (MSI/EXE) and Mac installers (DMG).
- Ensuring data protection compliance, including consent-based data collection, encryption, and user controls for data access, correction, and erasure.
- Integrating with cloud APIs for secure storage and backup.
- Conducting testing, maintenance, and optimization for performance and memory consumption.
- Delivering fully functional software, a real-time monitoring dashboard, compliance architecture, documentation, and regular software updates.

Your core technical skills should include proficiency in programming languages such as C, C++, C#, Python, Java, Swift, and Objective-C, along with cross-platform development frameworks, installer creation tools, system-level programming, database management, cloud services, security, compliance, DevOps, and deployment. You should also demonstrate mastery in areas like data structures & algorithms, operating systems, database systems, computer networks, software engineering principles, distributed systems, compiler design, cybersecurity, and privacy laws.

Preferred qualifications include a degree in computer science/software engineering from a reputable university, previous experience with monitoring software, and publications/projects related to secure systems or distributed computing. Join us at URE Legal Advocates and be part of a dynamic team working on cutting-edge software solutions for employee monitoring and timesheet management.
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
maharashtra
On-site
The primary purpose of your role is to design and construct high-performance trading systems and data infrastructure for Nuvama's capital markets operations. Your responsibilities will include building trading execution systems, market data pipelines, and backtesting frameworks, and collaborating with traders to develop custom solutions. It is crucial to ensure ultra-low-latency execution, high data accuracy, and system uptime to meet performance metrics and deliver efficient trading solutions.

As a qualified candidate, you are required to hold a Bachelor's/Master's degree in Computer Science, Engineering, Mathematics, Physics, or Quantitative Finance. Additionally, you should possess 2-5 years of hands-on experience in quantitative finance or financial technology, with recent exposure to equity markets and trading systems. Technical certifications in AWS or Databricks, or financial industry certifications, are preferred qualifications for this role.

Your technical competencies should include expertise in programming languages like PySpark, Scala, Rust, C++, and Java, along with proficiency in Python ecosystem tools for quantitative analysis. You must also have experience in data engineering and system design principles, and in developing trading systems from scratch. Knowledge of financial markets, trading mechanics, and algorithmic trading strategy development is essential for this position.

In terms of behavioral competencies, you should demonstrate technical leadership, innovation, collaboration with stakeholders, and a focus on project execution and delivery. Your ability to understand market dynamics and regulatory requirements, and to continuously adapt to market evolution, will be critical for success in this role. Moreover, staying current with technology advancements in quantitative finance, data engineering, and trading technology is essential for continuous learning and improvement.

Overall, your role will involve designing scalable trading systems, implementing real-time data infrastructure, and collaborating with traders to optimize trading execution and risk management platforms. Your technical expertise, market knowledge, and behavioral competencies will be key to achieving high performance and operational efficiency in Nuvama's capital markets operations.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
As an AWS Senior Data Engineer at EY, you will play a crucial role in designing, developing, and maintaining scalable data pipelines for data ingestion and processing using Python, Spark, and various AWS services. With a minimum of 5+ years of experience, you will be responsible for ensuring data quality, optimizing data storage solutions, and collaborating with cross-functional teams to deliver solutions that meet business needs.

Your key responsibilities will include working with on-prem Oracle databases, batch files, and Confluent Kafka for data sourcing; implementing ETL processes using AWS Glue and EMR for batch and streaming data; and developing data storage solutions using Medallion Architecture in S3, Redshift, and Oracle. You will also monitor and optimize data workflows using Airflow (a minimal DAG sketch follows below), ensure data quality and integrity throughout the data lifecycle, and implement CI/CD practices for data pipeline deployment using Terraform.

Technical skills required for this role include proficiency in Python, strong experience with Apache Spark, familiarity with AWS S3, experience with Kafka for real-time data streaming, knowledge of Redshift for data warehousing solutions, and proficiency in Oracle databases. Additionally, experience with AWS Glue, Apache Airflow, and EMR, along with strong AWS data engineering skills, is mandatory.

Good additional skills that would be beneficial for this role include familiarity with Terraform for infrastructure as code; experience with messaging services such as SNS and SQS; knowledge of monitoring and logging tools like CloudWatch, Datadog, and Splunk; and experience with AWS DataSync, DMS, Athena, and Lake Formation. Effective communication skills, both verbal and written, are mandatory for successful collaboration with team members and stakeholders.

By joining EY, you will be part of a team that is dedicated to building a better working world by leveraging data, AI, and advanced technology to help clients shape the future with confidence and address the most pressing issues of today and tomorrow.
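As an illustrative sketch of the Airflow orchestration mentioned above, the DAG below triggers a single AWS Glue job daily using the Amazon provider's GlueJobOperator (Airflow 2.4+ syntax assumed); the DAG ID and Glue job name are hypothetical.

```python
# Hedged Airflow sketch: one daily Glue job, no backfill of missed runs.
from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.operators.glue import GlueJobOperator

with DAG(
    dag_id="oracle_to_s3_batch",     # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    run_glue_etl = GlueJobOperator(
        task_id="run_glue_etl",
        job_name="bronze_to_silver", # hypothetical Glue job
        region_name="ap-south-1",
    )
```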
Posted 1 week ago
7.0 - 12.0 years
11 - 21 Lacs
chennai
Work from Office
- Analyze business and data requirements within the Life Sciences domain, including clinical, regulatory, and commercial data use cases.
- Design and execute comprehensive ETL test plans and test cases for data pipelines involving AWS S3, Redshift, Snowflake, and Databricks.
- Validate complex data transformations, aggregations, and data quality rules across data layers.
- Perform data reconciliation between source systems (clinical data feeds, trial management systems, CSVs on S3, etc.) and downstream targets (a small reconciliation sketch follows below).
- Write and run advanced SQL queries for large-scale data validation in Redshift and Snowflake.
- Validate Power BI and Tableau dashboards for data accuracy, visualization consistency, and business rule implementation.
- Collaborate with data engineers, clinical data managers, and business analysts to ensure alignment of testing with domain expectations.
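As a small sketch of the source-to-target reconciliation described above, the snippet below compares row counts per study with pandas; in practice the two frames would come from the source feed and the warehouse, and all names here are invented.

```python
# Hedged reconciliation sketch: flag studies where source and target row
# counts disagree. The frames below stand in for real extracts.
import pandas as pd

source = pd.DataFrame({"study_id": [101, 102, 103], "rows": [5000, 7200, 810]})
target = pd.DataFrame({"study_id": [101, 102, 103], "rows": [5000, 7150, 810]})

recon = source.merge(target, on="study_id", suffixes=("_src", "_tgt"))
recon["delta"] = recon["rows_src"] - recon["rows_tgt"]

mismatches = recon[recon["delta"] != 0]
print(mismatches if not mismatches.empty else "all studies reconciled")
```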
Posted 1 week ago
5.0 - 10.0 years
0 Lacs
chennai, tamil nadu
On-site
The ideal candidate for this position should have 5-10 years of experience in ETL development with Java and Spark. They should possess strong expertise in Redshift, AWS S3, and SQL, along with experience in developing microservices. Proficiency in lambda expressions and PySpark, plus hands-on experience with AWS services, is essential. The candidate must be able to lead delivery for distributed teams and should be well versed in data modeling, design, and SQL development. Knowledge of the systems development life cycle, project management approaches, and testing techniques is also required.

Purview is a prominent Digital Cloud & Data Engineering company headquartered in Edinburgh, United Kingdom, with a presence in 14 countries, including India, Poland, Germany, USA, UAE, Singapore, and Australia. The company offers services to captive clients and top IT tier-1 organizations, providing fully managed solutions and co-managed capacity models. Purview has a strong market presence in the UK, Europe, and APAC regions.

For further details, you can contact Purview at the following locations:

India Office: 3rd Floor, Sonthalia Mind Space, Near Westin Hotel, Gafoor Nagar, Hitech City, Hyderabad. Phone: +91 40 48549120 / +91 8790177967

UK Office: Gyleview House, 3 Redheughs Rigg, South Gyle, Edinburgh, EH12 9DQ. Phone: +44 7590230910

Email: careers@purviewservices.com
Posted 1 week ago
3.0 - 8.0 years
0 Lacs
hyderabad, telangana
On-site
Greetings from LTIMindtree!

We are currently seeking an AWS Data Engineer to join our team. The ideal candidate should have a minimum of 3 years of experience in IT, with at least 3 years dedicated to working as an AWS Data Engineer. This position is based in Pune or Hyderabad.

The key technologies and tools that you will be working with include AWS S3, Glue, Lambda, Python, and PySpark.

If you have the relevant experience and skills, we encourage you to share your updated profile with us at madhuvanthi.s@ltimindtree.com. Looking forward to potentially welcoming you to our team!
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
vellore, tamil nadu
On-site
As a Data Engineer, you will be responsible for designing, developing, and optimizing data pipelines and ETL workflows using AWS Glue, AWS Lambda, and Apache Spark. Your role will involve implementing big data processing solutions utilizing AWS EMR and AWS Redshift. You will also be tasked with developing and maintaining data lakes and data warehouses on AWS, including S3, Redshift, and RDS.

Ensuring data quality, integrity, and governance will be a key aspect of your responsibilities, achieved by leveraging the AWS Glue Data Catalog and AWS Lake Formation. It will be essential for you to optimize data storage and processing for both performance and cost efficiency. Working with structured, semi-structured, and unstructured data across various storage formats such as Parquet, Avro, and JSON will be part of your daily tasks.

Automation and orchestration of data workflows using AWS Step Functions and Apache Airflow will also fall within your scope of work. You will be expected to implement best practices for CI/CD pipelines in data engineering with AWS CodePipeline and AWS CodeBuild. Monitoring, troubleshooting, and optimizing data pipeline performance and scalability will be critical to ensuring smooth operations. Collaborating with cross-functional teams, including data scientists, analysts, and software engineers, will be necessary to drive successful outcomes.

Your role will require a minimum of 6 years of experience in data engineering and big data processing. Proficiency in AWS cloud services like AWS Glue, AWS Lambda, AWS Redshift, AWS EMR, and S3 is paramount. Strong skills in Python for data engineering tasks, hands-on experience with Apache Spark and SQL, and knowledge of data modeling, schema design, and performance tuning are essential. Understanding of AWS Lake Formation and lakehouse principles, experience with version control using Git, and familiarity with CI/CD pipelines are also required. Knowledge of data security, compliance, and governance best practices is crucial. Experience with real-time streaming technologies such as Kafka and Kinesis will be an added advantage. Strong problem-solving, analytical, and communication skills are key attributes for success in this role.
Posted 1 week ago
5.0 - 8.0 years
17 - 18 Lacs
bengaluru
Hybrid
Hi all,

We are hiring for the role of Senior Java Developer - Middleware & AWS Cloud.

Experience: 5+ Years
Location: Bangalore
Notice Period: Immediate - 15 Days

Job Summary:
We are looking for a Java Developer with strong expertise in middleware technologies, secure file transfer, and cloud integration. The ideal candidate will lead the development of robust, scalable microservices using Java and Spring Boot, while ensuring secure and efficient data exchange across systems using encryption and AWS services.

Key Responsibilities:
• Manage the complete software development process from conception to deployment
• Lead the development team and provide guidance on building end-to-end systems optimized for speed and scalability
• Participate in the entire product development lifecycle, including the design, development, testing, deployment and maintenance of new and existing features
• Work across the full stack, building highly scalable distributed solutions that enable positive user experiences and measurable business growth

Mandatory Technical Skills:
• Java & Spring Boot - advanced proficiency in building scalable, secure microservices
• Middleware technologies - integration patterns and message routing
• AES & PGP encryption for secure data exchange (a small AES sketch follows below)
• Linux, SFTP, AWS S3 - file transfer and storage concepts
• AWS IAM - access control and identity management
• Shell scripting - automation and system-level scripting
• RESTful API - design, development, and integration
• AWS Certification - Associate or Professional level

If you are interested, drop your resume at mojesh.p@acesoftlabs.com or call 9701971793.
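The posting is Java-centric, but as a language-neutral illustration of the AES-based data protection it lists, here is a minimal sketch using Python's cryptography package (Fernet, which wraps AES with an HMAC). In practice the key would come from a secrets manager, and the payload here is made up.

```python
# Hedged sketch: symmetric encryption of a payload before transfer.
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # in production, pulled from a secrets manager
f = Fernet(key)

plaintext = b"account settlement batch 2024-01-01"  # hypothetical payload
token = f.encrypt(plaintext)     # encrypt before SFTP/S3 upload
assert f.decrypt(token) == plaintext
```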
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
indore, madhya pradesh
On-site
As a DevOps Engineer, you will be responsible for managing server infrastructures and applications in production environments, focusing on delivering secure, scalable, and high-performance solutions. The ideal candidate should have a strong background in PHP, Laravel, and MSSQL, with a minimum of 6 years of experience in this field.

Your key responsibilities will include managing PHP/Node servers and application deployments, demonstrating proficiency with XAMPP/WAMP/LAMPP servers, and working with MySQL and MSSQL databases, including backup and restore processes. You will also be involved in implementing and managing CI/CD pipelines using Git and other tools, deploying Laravel applications to production environments, and configuring IP tunnelling, firewalls, and WLAN. Additionally, server management using platforms such as cPanel, Apanel, and WHM will be part of your daily tasks.

In addition to the essentials, having the following skills will be considered good to have: creating virtual servers in the cloud and securely managing connections; building and setting up CI/CD pipelines for Laravel applications; managing and optimizing Nginx, FastCGI, FPM, and workers; running queues and workers in production environments; managing domains and HTTPS with Nginx; and backup and restore processes with AWS S3.

If you possess advanced skills, it would be desirable to have experience in dockerizing Laravel projects and deploying containers to production; using docker-compose and managing Docker Swarm clusters; proficiency in Kubernetes for running Laravel APIs, SPAs, and workers in a cluster; monitoring and error tracking with tools like Grafana and Fluent Bit; serverless deployments with AWS Lambda and API Gateway; and experience with DigitalOcean Kubernetes clusters and other PaaS solutions.
Posted 1 week ago
5.0 - 7.0 years
12 - 20 Lacs
pune
Work from Office
About the Role
We are looking for an experienced Data Engineer to lead the migration of our data platform from Amazon Redshift to Snowflake. This role involves re-engineering existing data logic, building efficient pipelines, and ensuring seamless performance optimization in Snowflake.

Key Responsibilities
- Analyze and extract existing data logic, queries, and transformations from Redshift.
- Rewrite and optimize SQL queries and data transformations in Snowflake.
- Design and implement ETL/data pipelines to migrate and sync data (S3 to Snowflake using Snowpipe, bulk copy, etc.); one UNLOAD step is sketched below.
- Ensure high performance through Snowflake-specific optimizations (clustering, caching, warehouse scaling).
- Collaborate with cross-functional teams to validate data accuracy and business requirements.
- Monitor, troubleshoot, and improve ongoing data workflows.

Required Skills & Experience
- 5-8 years of experience in data engineering.
- Strong SQL expertise in both Redshift and Snowflake.
- Proven experience in data migration projects, specifically Redshift to Snowflake.
- Hands-on experience with ETL/data pipeline development (using Python, Airflow, Glue, dbt, or similar tools).
- Solid understanding of the AWS ecosystem, particularly S3-to-Snowflake ingestion.
- Experience in performance tuning and optimization within Snowflake.
- Strong problem-solving skills and ability to work independently.

Nice to Have
- Experience with dbt, Airflow, AWS Glue, or other orchestration tools.
- Knowledge of modern data architecture and best practices.

Work Mode: Initially remote for 12 months, then onsite in Pune.
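One hedged sketch of a single step on the migration path above: UNLOAD a Redshift table to S3 as Parquet via psycopg2, after which Snowflake can COPY or Snowpipe it in. The cluster DSN, IAM role, and bucket are placeholders.

```python
# Hedged sketch of the Redshift export side of a Redshift-to-Snowflake move.
# All connection details and ARNs below are invented.
import psycopg2

conn = psycopg2.connect(
    host="redshift-cluster.example.com",  # hypothetical cluster endpoint
    port=5439, dbname="analytics", user="etl", password="***",
)
with conn, conn.cursor() as cur:
    # UNLOAD writes the table to S3 as Parquet, ready for Snowflake ingestion.
    cur.execute("""
        UNLOAD ('SELECT * FROM public.orders')
        TO 's3://migration-bucket/orders/'
        IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-unload'
        FORMAT AS PARQUET
    """)
conn.close()
```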
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
kolkata, west bengal
On-site
StockEdge is seeking enthusiastic Senior Software Engineers to join our experienced software development team. As a member of our dynamic team, you will collaborate closely with the Development Manager throughout all phases of the Software Development Life Cycle (SDLC). Your main objective will be to address real-world challenges by utilizing your technical expertise to positively impact millions of users.

The ideal candidate should possess a solid understanding of programming languages, effective problem-solving abilities, strong analytical skills, and a keen aptitude for rapid learning. A team-oriented approach is essential, as you will be responsible for efficiently and punctually resolving problems.

Key Responsibilities:
- Collaborating with the development team across all software development activities.
- Contributing to the company's vision and mission.
- Daily code writing and reviewing tasks.
- Continuous learning of new technologies to enhance team capabilities and product advancement.
- Resolving critical and impactful issues to ensure seamless delivery.
- Monitoring the technical performance of internal systems.
- Addressing and resolving user issues promptly.

Skills Required:
- Non-negotiable skills: ASP.NET Core, MVC, LINQ, SQL
- Must-have skills: .NET Core, NoSQL, AWS EC2 & S3
- Should-have skills: experience in microservice architecture, team handling, and code review
- Nice-to-have skills: AWS ECS, Kubernetes, and Docker

Join our team at StockEdge and be a part of our innovative approach to equity markets and mutual fund research.
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
noida, uttar pradesh
On-site
The Machine Learning Engineer position, based in GGN, requires a professional with 6-9 years of experience. The ideal candidate should possess expertise in Spark, SQL, Python/Scala, AWS EMR, AWS S3, ML lifecycle management, and Machine Learning Operations (MLOps). Experience with Airflow or any other orchestrator is considered good to have. Experience with Kafka, Spark Streaming, Datadog, and Kubernetes is also a valued asset for this role. If you meet these qualifications and are passionate about machine learning, this position could be an excellent fit for you.
Posted 1 week ago
2.0 - 5.0 years
3 - 7 Lacs
itanagar
Remote
Job Title: Databricks Developer (Contract)
Contract Duration: 4 months, extendable based on performance
Job Location: Remote
Job Timings: India evening shift (till 11:30 PM IST)
Experience Required: 4+ years

Job Description
We are seeking a skilled Databricks Developer to join our team on a 4-month contract basis. The ideal candidate will have strong expertise in modern data engineering technologies and the ability to work in a fast-paced, remote environment.

Key Responsibilities
- Develop, optimize, and manage large-scale data pipelines using Databricks, PySpark, DBT, and AWS S3/Glue/Redshift.
- Work in multi-cloud environments including AWS, Azure, and GCP.
- Implement workflow orchestration using Airflow or similar frameworks.
- Design, implement, and manage data warehouse solutions, schema evolution, and data versioning.
- Collaborate with cross-functional teams to deliver high-quality data solutions.

Required Skills & Experience
- 4+ years of hands-on experience in Databricks, Python, Spark (PySpark), DBT, and AWS data services.
- Strong experience with SQL and large-scale datasets.
- Hands-on exposure to multi-tenant environments (AWS/Azure/GCP).
- Knowledge of data modeling, data warehouse design, and best practices.
- Good understanding of workflow orchestration tools like Airflow.
Posted 1 week ago
6.0 - 8.0 years
12 - 18 Lacs
bengaluru
Hybrid
AWS Lambda, AWS EC2, AWS S3, RESTful APIs, Java
Posted 1 week ago
3.0 - 8.0 years
10 - 20 Lacs
hyderabad, chennai
Hybrid
Roles & Responsibilities:
• We are looking for a strong Senior Data Engineer who will be primarily responsible for designing, building and maintaining ETL/ELT pipelines.
• Integration of data from multiple sources or vendors to provide holistic insights from data.
• You are expected to build and manage data lake and data warehouse solutions, design data models, create ETL processes, implement data quality mechanisms, etc.
• Perform EDA (exploratory data analysis) required to troubleshoot data-related issues and assist in the resolution of data issues.
• Should have experience in client interaction, both oral and written.
• Experience in mentoring juniors and providing required guidance to the team.

Required Technical Skills
• Extensive experience in languages such as Python, PySpark, and SQL (basics and advanced).
• Strong experience in data warehousing, ETL, data modelling, building ETL pipelines, and data architecture.
• Must be proficient in Redshift, Azure Data Factory, Snowflake, etc.
• Hands-on experience with cloud services like Azure, AWS S3, Glue, Lambda, CloudWatch, Athena, etc.
• Knowledge of Dataiku and Big Data technologies is good to have; basic knowledge of BI tools like Power BI, Tableau, etc. will be a plus.
• Sound knowledge of data management, data operations, data quality and data governance.
• Knowledge of SFDC and Waterfall/Agile methodology.
• Strong knowledge of the pharma domain / life sciences commercial data operations.

Qualifications
• Bachelor's or Master's degree in Engineering/MCA or equivalent.
• 4-6 years of relevant industry experience as a Data Engineer.
• Experience working on pharma syndicated data such as IQVIA, Veeva, Symphony; claims, CRM, sales, open data, etc.
• High motivation, good work ethic, maturity, self-organization and personal initiative.
• Ability to work collaboratively and provide support to the team.
• Excellent written and verbal communication skills.
• Strong analytical and problem-solving skills.

Location
• Preferably Hyderabad/Chennai, India
Posted 1 week ago