4.0 - 8.0 years
0 Lacs
Haryana
On-site
As a Frontend Engineer at our company, you will play a crucial role in developing high-quality frontend applications that deliver exceptional user experiences. You should excel in a dynamic work environment, write efficient code, and leverage AI/automation tools to enhance productivity. In addition to optimizing frontend performance, you will build real-time user interfaces and provide guidance to junior team members.

Your responsibilities will include translating product requirements into scalable, responsive frontend applications, building high-performance user interfaces with React.js, and implementing real-time messaging through technologies like WebSockets. As a leader on the team, you will conduct code reviews, mentor junior developers, and contribute to architectural decisions and technical implementations. You will also own frontend component delivery end to end, from development through deployment, including monitoring and error tracking.

The ideal candidate has a minimum of 3-6 years of experience in frontend development, preferably in SaaS or high-scale applications, with strong expertise in TypeScript, the React.js ecosystem, and state management tools such as Redux and Zustand. Proficiency in real-time communication technologies, PWA development, responsive design, and modern frontend tooling is essential, as is experience with deployment pipelines, AWS CloudFront/S3 hosting, and frontend security practices. An AI-first mindset, familiarity with Git workflows, and the ability to thrive in a fast-paced startup environment are additional qualifications we value.

If you are self-driven, proactive, and passionate about delivering high-quality frontend solutions, we encourage you to apply for this exciting opportunity.
Posted 3 weeks ago
3.0 - 7.0 years
0 Lacs
Pune, Maharashtra
On-site
Agivant is looking for a skilled and dedicated Senior Data Engineer to join its expanding data team. As a Senior Data Engineer at Agivant, you will play a vital role in building and scaling our data infrastructure to enable data-informed decision-making across the organization. Your primary focus will be designing, developing, and managing efficient, dependable data pipelines for both ELT (Extract, Load, Transform) and ETL (Extract, Transform, Load) processes.

Your duties will involve collaborating with stakeholders to understand data requirements and translating them into effective data models and pipelines, and building and tuning data pipelines using technologies such as Elastic Search, AWS S3, Snowflake, and NFS. You will also establish and manage data warehouse schemas and ETL/ELT processes to serve business intelligence and analytical needs, introduce data quality checks and monitoring to ensure data integrity and surface potential issues, and work closely with data scientists and analysts to guarantee data accessibility and usability. Staying informed about industry best practices, CI/CD/DevSecFinOps, Scrum, and emerging data engineering technologies is also essential, and your contribution to advancing and refining our data warehouse architecture will be invaluable.

### Responsibilities:
- Design, develop, and maintain robust and scalable data pipelines for ELT and ETL processes, ensuring data accuracy, completeness, and timeliness.
- Collaborate with stakeholders to understand data requirements and translate them into efficient data models and pipelines.
- Build and optimize data pipelines using technologies like Elastic Search, AWS S3, Snowflake, and NFS.
- Develop and maintain data warehouse schemas and ETL/ELT processes to support business intelligence and analytics needs.
- Implement data quality checks and monitoring to ensure data integrity and identify potential issues.
- Work closely with data scientists and analysts to ensure data accessibility and usability for analytical purposes.
- Stay current with industry best practices, CI/CD/DevSecFinOps, Scrum, and emerging data engineering technologies.
- Contribute to the development and enhancement of our data warehouse architecture.

### Requirements:

#### Mandatory:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 5+ years of experience as a Data Engineer focusing on ELT/ETL processes.
- 3+ years of experience with Snowflake data warehousing technologies.
- 3+ years of experience creating and maintaining Airflow ETL pipelines.
- 3+ years of professional-level experience with Python for data manipulation and automation.
- Working experience with Elastic Search and its application in data pipelines.
- Proficiency in SQL and experience with data modeling techniques.
- Strong understanding of cloud-based data storage solutions like AWS S3.
- Experience working with NFS and other file storage systems.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration skills.
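To make the ELT workflow described above concrete, here is a minimal sketch of how such a pipeline might be orchestrated in Airflow, assuming the Snowflake provider package is installed; the DAG ID, connection ID, stage, and table names are illustrative placeholders rather than anything specified in the posting:

```python
# Minimal Airflow DAG sketch: land raw S3 files into Snowflake, then
# transform in-warehouse (ELT). All names here are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

with DAG(
    dag_id="elt_s3_to_snowflake",        # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Load step: COPY INTO pulls staged S3 files into a raw table.
    load_raw = SnowflakeOperator(
        task_id="load_raw",
        snowflake_conn_id="snowflake_default",
        sql="""
            COPY INTO raw.events
            FROM @raw.s3_stage/events/
            FILE_FORMAT = (TYPE = 'JSON');
        """,
    )

    # Transform step: build an analytics-ready model inside the warehouse.
    transform = SnowflakeOperator(
        task_id="transform_events",
        snowflake_conn_id="snowflake_default",
        sql="""
            CREATE OR REPLACE TABLE analytics.daily_events AS
            SELECT event_date, COUNT(*) AS event_count
            FROM raw.events
            GROUP BY event_date;
        """,
    )

    load_raw >> transform
```

Doing the transform inside Snowflake, rather than in the orchestrator, is what distinguishes ELT from classic ETL here.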
Posted 3 weeks ago
3.0 - 7.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Senior Data Engineer (Snowflake) at our new-age, AI-first digital and cloud engineering services company, you will have the opportunity to play a crucial role in building and scaling our data infrastructure. Your primary responsibility will be to design, develop, and maintain efficient, reliable data pipelines for both ELT (Extract, Load, Transform) and ETL (Extract, Transform, Load) processes. By collaborating with stakeholders, you will translate data requirements into efficient data models and pipelines to facilitate data-driven decision-making across the organization.

Your key responsibilities will include designing, developing, and maintaining robust and scalable data pipelines for ELT and ETL processes, ensuring data accuracy, completeness, and timeliness. Additionally, you will work closely with stakeholders to understand data requirements and build optimized data pipelines using technologies such as Elastic Search, AWS S3, Snowflake, and NFS. You will also implement data warehouse schemas and ETL/ELT processes to support business intelligence and analytics needs, ensuring data integrity through data quality checks and monitoring.

To be successful in this role, you must possess a Bachelor's degree in Computer Science, Engineering, or a related field, along with 5+ years of experience as a Data Engineer focusing on ELT/ETL processes. You should have at least 3 years of experience with Snowflake data warehousing technologies and with creating and maintaining Airflow ETL pipelines. Proficiency in Python for data manipulation and automation, as well as experience with Elastic Search, SQL, and cloud-based data storage solutions like AWS S3, is essential. Staying current with industry best practices, CI/CD/DevSecFinOps, Scrum, and emerging data engineering technologies will also be crucial. Your strong problem-solving and analytical skills, coupled with excellent communication and collaboration abilities, will enable you to work effectively with data scientists, analysts, and other team members to ensure data accessibility and usability for various analytical purposes.

If you are a passionate and talented Data Engineer with a keen interest in Snowflake, data pipelines, and data-driven decision-making, we invite you to join our growing data team and contribute to the development and enhancement of our data warehouse architecture.
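As an illustration of the data quality checks this role calls for, below is a hedged sketch using the Snowflake Python connector; the credentials, table names, and specific checks are assumptions for the example, not details from the posting:

```python
# Sketch of a simple data-quality gate run after a load completes.
# Account, credentials, and table names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",     # placeholder
    user="etl_user",          # placeholder
    password="...",           # use a secrets manager in practice
    warehouse="ETL_WH",
    database="ANALYTICS",
)

checks = {
    "row_count": "SELECT COUNT(*) FROM raw.orders",
    "null_keys": "SELECT COUNT(*) FROM raw.orders WHERE order_id IS NULL",
}

try:
    cur = conn.cursor()
    results = {name: cur.execute(sql).fetchone()[0] for name, sql in checks.items()}
    # Fail fast so downstream transforms never see bad data.
    if results["row_count"] == 0:
        raise ValueError("load produced zero rows")
    if results["null_keys"] > 0:
        raise ValueError(f"{results['null_keys']} rows with NULL order_id")
finally:
    conn.close()
```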
Posted 3 weeks ago
5.0 - 8.0 years
20 - 25 Lacs
Pune
Work from Office
- 5+ years of experience with Java (Core Java, J2EE, Spring Boot, RESTful services), Python, web services (REST, SOAP), XML, JavaScript, microservices, SOA, etc.
- Knowledge of technologies like ELK, Docker, Kubernetes, Azure Cloud, AWS S3, etc.
- Knowledge of NoSQL databases like MongoDB, HBase, Cassandra, etc.
- Knowledge of version control systems (e.g., Git) and CI/CD pipelines.
- Working experience with financial applications/finance processes is a plus.
- Extensive experience working in a multi-cultural environment, delivering results with virtual teams.
- A well-diversified background with a successful track record of leadership, experience, and performance.
- Strong problem-solving skills with the ability to work independently, multi-task, and take ownership of various analyses or reviews.
Posted 3 weeks ago
6.0 - 11.0 years
17 - 30 Lacs
Kolkata, Hyderabad/Secunderabad, Bangalore/Bengaluru
Hybrid
Inviting applications for the role of Lead Consultant - Snowflake Data Engineer (Snowflake + Python + Cloud)! In this role, the Snowflake Data Engineer is responsible for providing technical direction and leading a group of one or more developers toward a goal.

Job Description:
- Experience in the IT industry.
- Working experience building productionized data ingestion and processing pipelines in Snowflake.
- Strong understanding of Snowflake architecture.
- Fully versed in data warehousing concepts.
- Expertise in and excellent understanding of Snowflake features and the integration of Snowflake with other data processing systems.
- Able to create data pipelines for ETL/ELT.
- Excellent presentation and communication skills, both written and verbal.
- Ability to problem-solve and architect in an environment with unclear requirements.
- Able to create high-level and low-level design documents based on requirements.
- Hands-on experience configuring, troubleshooting, testing, and managing data platforms, on premises or in the cloud.
- Awareness of data visualisation tools and methodologies.
- Works independently on business problems and generates meaningful insights.
- Good to have: experience/knowledge of Snowpark, Streamlit, or GenAI (not mandatory).
- Experience implementing Snowflake best practices.
- Snowflake SnowPro Core Certification will be an added advantage.

Roles and Responsibilities:
- Requirement gathering, creating design documents, providing solutions to customers, working with offshore teams, etc.
- Writing SQL queries against Snowflake and developing scripts to Extract, Load, and Transform data.
- Hands-on experience with Snowflake utilities such as SnowSQL, bulk copy, Snowpipe, Tasks, Streams, Time Travel, cloning, the optimizer, Metadata Manager, data sharing, stored procedures and UDFs, Snowsight, and Streamlit.
- Experience with the Snowflake cloud data warehouse and AWS S3 buckets or Azure Blob Storage containers for integrating data from multiple source systems.
- Some experience with AWS services (S3, Glue, Lambda) or Azure services (Blob Storage, ADLS Gen2, ADF).
- Good experience in Python/PySpark integration with Snowflake and cloud (AWS/Azure), with the ability to leverage cloud services for data processing and storage.
- Proficiency in the Python programming language, including data types, variables, functions, loops, conditionals, and other Python-specific concepts.
- Knowledge of ETL (Extract, Transform, Load) processes and tools, and the ability to design and develop efficient ETL jobs using Python or PySpark.
- Some experience with Snowflake RBAC and data security.
- Good experience implementing CDC or SCD Type 2.
- Good experience implementing Snowflake best practices.
- In-depth understanding of data warehouses, ETL concepts, and data modelling.
- Experience in requirement gathering, analysis, design, development, and deployment.
- Experience building data ingestion pipelines.
- Optimize and tune data pipelines for performance and scalability.
- Able to communicate with clients and lead a team.
- Proficiency with Airflow or other workflow management tools for scheduling and managing ETL jobs.
- Good to have: experience with deployment using CI/CD tools and repositories such as Azure Repos, GitHub, etc.

Qualifications we seek in you!

Minimum qualifications: B.E./Master's in Computer Science, Information Technology, or Computer Engineering, or any equivalent degree with good IT experience relevant to the Snowflake Data Engineer role.
Skill Matrix: Snowflake, Python/PySpark, AWS/Azure, ETL concepts, and Data Warehousing concepts
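For context on the Streams, Tasks, and CDC items above, here is a hedged sketch of the standard stream-plus-task pattern, issued from Python through the Snowflake connector; the object names, schedule, and merge keys are illustrative assumptions:

```python
# Stream + task CDC sketch: a stream captures row-level changes on the
# source table, and a scheduled task merges them into the target.
# All object names and credentials are placeholders.
import snowflake.connector

ddl = [
    # Stream records inserts/updates/deletes on the source table (CDC).
    "CREATE OR REPLACE STREAM raw.orders_stream ON TABLE raw.orders",
    # Task periodically merges the captured changes into the target table.
    """
    CREATE OR REPLACE TASK raw.merge_orders
      WAREHOUSE = ETL_WH
      SCHEDULE = '5 MINUTE'
    AS
      MERGE INTO analytics.orders AS t
      USING raw.orders_stream AS s
        ON t.order_id = s.order_id
      WHEN MATCHED THEN UPDATE SET t.status = s.status
      WHEN NOT MATCHED THEN INSERT (order_id, status)
        VALUES (s.order_id, s.status)
    """,
    "ALTER TASK raw.merge_orders RESUME",  # tasks are created suspended
]

conn = snowflake.connector.connect(account="...", user="...", password="...")
try:
    cur = conn.cursor()
    for stmt in ddl:
        cur.execute(stmt)
finally:
    conn.close()
```

When the task's MERGE commits, the stream's offset advances, so each change is consumed exactly once.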
Posted 3 weeks ago
6.0 - 10.0 years
15 - 20 Lacs
Bengaluru
Work from Office
Role & responsibilities:
- Design, develop, and deploy scalable applications using Java and AWS services.
- Build and maintain serverless architectures using AWS Lambda.
- Manage and optimize storage solutions with AWS S3.
- Develop and integrate RESTful APIs and Java-based APIs.
- Collaborate with cross-functional teams to deliver robust, secure, and high-performance solutions.
- Ensure adherence to best practices in coding, testing, and deployment.

Preferred candidate profile:
- Strong programming skills in Java.
- Hands-on experience with AWS services, including Lambda and S3.
- Proficiency in RESTful API development.
- Good understanding of cloud architecture and best practices.
- Strong problem-solving and communication skills.
- 6+ years of full-time experience.
- PF (Provident Fund) account.
- Face-to-face interview.
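For a flavor of the Lambda-plus-S3 serverless pattern this role centers on: the posting targets Java, but a hedged Python sketch keeps the illustration short; the invocation payload shape and all names are assumptions:

```python
# Minimal AWS Lambda handler sketch: fetch an object from S3 when invoked
# with a bucket/key payload. Names and payload shape are placeholders.
import json

import boto3

s3 = boto3.client("s3")  # created once per container, reused across invocations

def lambda_handler(event, context):
    bucket = event["bucket"]   # assumed payload field
    key = event["key"]         # assumed payload field
    obj = s3.get_object(Bucket=bucket, Key=key)
    body = obj["Body"].read().decode("utf-8")
    return {"statusCode": 200, "body": json.dumps({"size": len(body)})}
```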
Posted 3 weeks ago
7.0 - 10.0 years
30 - 32 Lacs
Hyderabad, Pune, Coimbatore
Work from Office
Job Overview: We are seeking a highly skilled Senior Developer with expertise in Node.js, Python, and Azure, and proven experience in cloud migration from AWS to Azure. The ideal candidate will play a key role in migrating applications, refactoring code, and configuring applications for seamless deployment on Azure platforms, including AKS, Azure Functions, and other PaaS or serverless services.

Key Responsibilities:
- Lead the migration of Node.js and Python applications from AWS to Azure.
- Analyze existing AWS architecture, source code, and service dependencies.
- Identify and implement necessary code remediations and configuration changes.
- Refactor applications to integrate with Azure services, including Blob Storage, Azure Functions, and AKS.
- Perform unit testing, provide application testing support, and troubleshoot issues in the Azure environment.
- Develop and maintain deployment pipelines, including scripts and CI/CD integration for containerized and serverless applications.
- Ensure seamless integration with Azure App Services, APIM, and microservices architecture using Kubernetes and Helm charts.
- Work with the AWS and Azure SDKs for cloud-native development and migration.

Required Skills & Experience:
- 8+ years of experience in application development using Node.js and Python.
- Strong hands-on experience developing and deploying applications on both AWS and Azure.
- Demonstrated expertise in AWS-to-Azure cloud migration.
- In-depth knowledge of Azure services such as AKS, Azure Functions, Blob Storage, and App Services.
- Familiarity with Azure PaaS and serverless architecture for application hosting.
- Solid experience with containerized applications, Kubernetes, Helm charts, and microservices.
- Proficient in writing and maintaining deployment scripts and CI/CD pipelines.
- Hands-on experience in unit testing, debugging, and application support in the Azure environment.

Technical Stack:
- Node.js REST APIs
- Python (serverless: AWS Lambda to Azure Functions)
- Confluent Kafka with the AWS S3 Sink Connector
- Azure Blob Storage
- AWS Lambda to Azure Functions migration
- S3 to Azure Blob Storage migration
- AWS-to-Azure SDK conversion (mandatory)

Location: Hyderabad / Bangalore / Coimbatore / Pune
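To illustrate the S3-to-Blob migration step listed in the stack, here is a hedged Python sketch using boto3 and azure-storage-blob; the bucket, container, and connection-string values are placeholders, and a production migration would stream objects and parallelize rather than read each one into memory:

```python
# Sketch: copy every object from an S3 bucket into an Azure Blob container,
# preserving keys so application paths stay stable. Names are placeholders.
import boto3
from azure.storage.blob import BlobServiceClient

s3 = boto3.client("s3")
blob_service = BlobServiceClient.from_connection_string("<azure-conn-string>")
container = blob_service.get_container_client("migrated-data")  # placeholder

paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket="source-bucket"):          # placeholder
    for item in page.get("Contents", []):
        key = item["Key"]
        data = s3.get_object(Bucket="source-bucket", Key=key)["Body"].read()
        # Upload under the same key; overwrite makes the copy re-runnable.
        container.upload_blob(name=key, data=data, overwrite=True)
```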
Posted 3 weeks ago
3.0 - 8.0 years
10 - 20 Lacs
Hyderabad, Chennai
Hybrid
Roles & Responsibilities:
• We are looking for a strong Senior Data Engineer who will be primarily responsible for designing, building, and maintaining ETL/ELT pipelines.
• Integrate data from multiple sources or vendors to provide holistic insights from data.
• Build and manage Data Lake and Data Warehouse solutions, design data models, create ETL processes, and implement data quality mechanisms.
• Perform EDA (exploratory data analysis) to troubleshoot data-related issues and assist in their resolution.
• Experience in client interaction, both oral and written, is expected.
• Experience mentoring juniors and providing the required guidance to the team.

Required Technical Skills:
• Extensive experience in languages such as Python, PySpark, and SQL (basic and advanced).
• Strong experience in data warehousing, ETL, data modelling, building ETL pipelines, and data architecture.
• Must be proficient in Redshift, Azure Data Factory, Snowflake, etc.
• Hands-on experience with cloud services such as Azure and AWS (S3, Glue, Lambda, CloudWatch, Athena, etc.).
• Knowledge of Dataiku and Big Data technologies, plus basic knowledge of BI tools like Power BI or Tableau, will be a plus.
• Sound knowledge of data management, data operations, data quality, and data governance.
• Knowledge of SFDC and Waterfall/Agile methodology.
• Strong knowledge of the pharma domain / life sciences commercial data operations.

Qualifications:
• Bachelor's or Master's in Engineering/MCA or an equivalent degree.
• 4-6 years of relevant industry experience as a Data Engineer.
• Experience working with pharma syndicated data such as IQVIA, Veeva, and Symphony; claims, CRM, sales, open data, etc.
• High motivation, good work ethic, maturity, self-organization, and personal initiative.
• Ability to work collaboratively and provide support to the team.
• Excellent written and verbal communication skills.
• Strong analytical and problem-solving skills.

Location:
• Preferably Hyderabad/Chennai, India
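As a concrete example of the Python/PySpark pipeline work described above, here is a minimal PySpark job; the S3 paths, column names, and transform logic are illustrative assumptions, not details from the posting:

```python
# Illustrative PySpark ETL job: read raw CSVs from S3, apply simple
# cleaning, and write partitioned Parquet. Paths are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("sales_etl").getOrCreate()

raw = spark.read.option("header", True).csv("s3://raw-bucket/sales/")  # placeholder

cleaned = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_date"))
       .filter(F.col("amount").cast("double") > 0)   # drop invalid rows
)

# Partitioning by date keeps downstream scans cheap for BI queries.
cleaned.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://curated-bucket/sales/"                      # placeholder
)
```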
Posted 3 weeks ago
6.0 - 8.0 years
10 - 12 Lacs
Noida
Work from Office
Full-stack developer with 6-8 years of experience in designing and developing robust, scalable, and maintainable applications, applying object-oriented design principles.

- Strong experience in Spring frameworks such as Spring Boot, Spring Batch, and Spring Data, plus Hibernate and JPA.
- Strong experience in microservices architecture and implementation.
- Strong knowledge of HTML, CSS, JavaScript, and Angular.
- Experience with SOAP web services, REST web services, and the Java Messaging Service (JMS) API.
- Familiarity with designing, developing, and deploying web applications using Amazon Web Services (AWS).
- Good experience with AWS services: S3, Lambda, SQS, SNS, DynamoDB, IAM, and API Gateway.
- Hands-on experience in SQL and PL/SQL, with the ability to write complex queries.
- Hands-on experience with REST APIs.
- Experience with version control systems (e.g., Git).
- Knowledge of web standards and accessibility guidelines.
- Knowledge of CI/CD pipelines and experience with tools such as JIRA, Splunk, and SONAR.
- Strong analytical and problem-solving abilities.
- Good experience in JUnit testing and mocking techniques.
- Experience with SDLC processes (Waterfall/Agile), Docker, Git, and SonarQube.
- Excellent communication and interpersonal skills; ability to work independently and as part of a team.

Mandatory Competencies:
- Programming Language - Java - Core Java (Java 8+)
- Programming Language - Java Full Stack - HTML/CSS
- Programming Language - Java - Spring Framework
- Programming Language - Java - Hibernate
- Programming Language - Java Full Stack - JavaScript
- Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate
- DevOps/Configuration Mgmt - Git
- Fundamental Technical Skills - Spring Framework/Hibernate/JUnit etc.
- DevOps/Configuration Mgmt - Docker
- Beh - Communication and collaboration
- Cloud - AWS - AWS S3, S3 Glacier, AWS EBS
- Database - Oracle - PL/SQL Packages
- Development Tools and Management - CI/CD
- Programming Language - Java Full Stack - Angular Material
- Programming Language - Java Full Stack - Spring Framework
- Middleware - Java Middleware - Spring Boot
- Middleware - API Middleware - Microservices
- Middleware - API Middleware - Web Services (REST, SOAP)
- Middleware - API Middleware - API (SOAP, REST)
- Agile - SCRUM
- Database - SQL Server - SQL Packages
Posted 4 weeks ago
8.0 - 12.0 years
0 Lacs
Karnataka
On-site
At PwC, the focus in data and analytics engineering is on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. You play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will concentrate on designing and building data infrastructure and systems to enable efficient data processing and analysis. Your responsibilities include developing and implementing data pipelines, data integration, and data transformation solutions.

As an AWS Architect / Manager at PwC - AC, you will interact with the Offshore Manager/Onsite Business Analyst to understand requirements and will be responsible for end-to-end implementation of cloud data engineering solutions such as an Enterprise Data Lake and Data Hub in AWS. Strong experience in AWS cloud technology is required, along with planning and organization skills. You will work as a cloud architect/lead on an agile team, provide automated cloud solutions, and monitor systems routinely to ensure all business goals are met per the business requirements.

**Position Requirements:**

**Must Have:**
- Experience architecting and delivering highly scalable, distributed, cloud-based enterprise data solutions
- Strong expertise in end-to-end implementation of cloud data engineering solutions such as an Enterprise Data Lake and Data Hub in AWS
- Hands-on experience with Snowflake utilities, SnowSQL, SnowPipe, ETL data pipelines, and Big Data modeling techniques using Python/Java
- Design scalable data architectures with Snowflake, integrating cloud technologies (AWS, Azure, GCP) and ETL/ELT tools such as dbt
- Guide teams in proper data modeling (star and snowflake schemas), transformation, security, and performance optimization
- Experience loading from disparate data sets and translating complex functional and technical requirements into detailed designs
- Deploying Snowflake features such as data sharing, events, and lake-house patterns
- Experience with data security, data access controls, and design
- Understanding of relational as well as NoSQL data stores, methods, and approaches (star and snowflake, dimensional modeling)
- Good knowledge of AWS, Azure, or GCP data storage and management technologies such as S3, Blob/ADLS, and Google Cloud Storage
- Proficient in Lambda and Kappa architectures
- Strong AWS hands-on expertise with a programming background, preferably Python/Scala
- Knowledge of Big Data frameworks and related technologies, with experience in Hadoop and Spark
- Strong experience in AWS compute services such as EMR, Glue, and SageMaker, and storage services such as S3, Redshift, and DynamoDB
- Experience with AWS streaming services such as Kinesis, SQS, and MSK
- Troubleshooting and performance-tuning experience across the Spark framework: Spark Core, Spark SQL, and Spark Streaming
- Experience with flow tools such as Airflow, NiFi, or Luigi
- Knowledge of application DevOps tools (Git, CI/CD frameworks); experience with Jenkins or GitLab and rich experience in source code management tools such as CodePipeline, CodeBuild, and CodeCommit
- Experience with AWS CloudWatch, AWS CloudTrail, AWS Account Config, and AWS Config Rules
- Understanding of cloud data migration processes, methods, and the project lifecycle
- Business/domain knowledge in Financial Services, Healthcare, Consumer Markets, Industrial Products, Telecommunications, Media and Technology, or Deal Advisory, along with technical expertise
- Experience leading technical teams, guiding and mentoring team members
- Analytical and problem-solving skills
- Communication and presentation skills
- Understanding of data modeling and data architecture

**Desired Knowledge/Skills:**
- Experience building stream-processing systems using solutions such as Storm or Spark Streaming
- Experience with Big Data ML toolkits such as Mahout, SparkML, or H2O
- Knowledge of Python
- AWS Architecture certification desirable
- Worked in offshore/onsite engagements
- Experience with AWS services such as Step Functions and Lambda
- Project management skills with consulting experience in complex program delivery

**Professional and Educational Background:** BE/B.Tech/MCA/M.Sc/M.E/M.Tech/MBA

**Minimum Years of Experience Required:** Candidates with 8-12 years of hands-on experience
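To ground the Glue items in the requirements above, here is a hedged skeleton of an AWS Glue PySpark job; the catalog database, table, and output path are illustrative, and note that the awsglue modules are available only inside the Glue runtime:

```python
# AWS Glue PySpark job skeleton: read from the Glue Data Catalog, apply a
# simple transform, and write curated Parquet to S3. Names are placeholders.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read from the catalog, then convert to a DataFrame for transforms.
dyf = glue_context.create_dynamic_frame.from_catalog(
    database="datalake", table_name="raw_events"   # placeholders
)
df = dyf.toDF().dropDuplicates(["event_id"])

# Write curated output back to S3 as Parquet.
df.write.mode("overwrite").parquet("s3://curated-bucket/events/")

job.commit()
```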
Posted 4 weeks ago
3.0 - 7.0 years
0 Lacs
Haryana
On-site
MongoDB's mission is to empower innovators to create, transform, and disrupt industries by unleashing the power of software and data. We enable organizations of all sizes to easily build, scale, and run modern applications by helping them modernize legacy workloads, embrace innovation, and unleash AI. Our industry-leading developer data platform, MongoDB Atlas, is the only globally distributed, multi-cloud database and is available in more than 115 regions across AWS, Google Cloud, and Microsoft Azure. Atlas allows customers to build and run applications anywhere: on premises or across cloud providers. With offices worldwide and over 175,000 new developers signing up to use MongoDB every month, it's no wonder that leading organizations, like Samsung and Toyota, trust MongoDB to build next-generation, AI-powered applications.

As a Senior Analytics Engineer at MongoDB, you will play a critical role in leveraging data to drive informed decision-making and simplify end-user engagement across our most critical data sets. You will be responsible for designing, developing, and maintaining robust analytics solutions, ensuring data integrity, and enabling data-driven insights across all of MongoDB. This role requires an analytical thinker with strong technical expertise to contribute to the growth and success of the entire business. This role can be based out of Gurugram.

Responsibilities
- Design, implement, and maintain highly performant data post-processing pipelines
- Create shared data assets that will act as the company's source of truth for critical business metrics
- Partner with analytics stakeholders to curate analysis-ready datasets and augment the generation of actionable insights
- Partner with data engineering to expose governed datasets to the rest of the organization
- Make impactful contributions to our analytics infrastructure, systems, and tools
- Create and manage documentation, and conduct knowledge-sharing sessions to proliferate tribal knowledge and best practices
- Maintain consistent planning and tracking of work in JIRA tickets

Skills & Attributes
- Bachelor's degree (or equivalent) in mathematics, computer science, information technology, engineering, or a related discipline
- 3-5 years of relevant experience
- Strong proficiency in SQL and experience working with relational databases
- Solid understanding of data modeling and ETL processes
- Proficiency in Python for automation, data manipulation, and analysis
- Experience managing ETL and data pipeline orchestration with dbt and Airflow
- Comfortable with command-line functions
- Familiarity with Hive, Trino (Presto), Spark SQL, and Google BigQuery
- Experience with cloud data storage like AWS S3 and GCS
- Experience managing codebases with Git
- Consistently employs CI/CD best practices
- Experience translating project requirements into a set of technical sub-tasks that build towards a final deliverable
- Experience combining data from disparate data sources to identify insights that were previously unknown
- Previous project work requiring expertise in business metrics and datasets
- Strong communication skills to document technical processes clearly and lead knowledge-sharing efforts across teams
- The ability to effectively collaborate cross-functionally to drive actionable and measurable results
- Committed to continuous improvement, with a passion for building processes/tools to make everyone more efficient
- A passion for AI as an enhancing tool to improve workflows, increase productivity, and generate smarter outcomes
- A desire to constantly learn and improve themselves

At MongoDB, we're committed to developing a supportive and enriching culture for everyone to drive personal growth and business impact. From employee affinity groups to fertility assistance and a generous parental leave policy, we value our employees' wellbeing and want to support them along every step of their professional and personal journeys. MongoDB is committed to providing any necessary accommodations for individuals with disabilities within our application and interview process. To request an accommodation due to a disability, please inform your recruiter. MongoDB is an equal opportunities employer.
Posted 4 weeks ago
7.0 - 13.0 years
0 Lacs
Pune, Maharashtra
On-site
HCL Technologies is a next-generation global technology company dedicated to assisting enterprises in reimagining their businesses for the digital age. With a strong foundation of four decades of innovation, a renowned management philosophy, a culture of invention and risk-taking, and a steadfast commitment to customer relationships, HCL offers a wide range of technology products and services. The company takes pride in its diversity, social responsibility, sustainability, and education initiatives. Through its extensive network of R&D facilities, co-innovation labs, and a workforce of over 197,000 Ideapreneurs spanning 52 countries, HCL provides comprehensive services across industry verticals to leading enterprises worldwide, including 250 of the Fortune 500 and 650 of the Global 2000.

The driving force behind HCL's work is its diverse, creative, and passionate workforce that consistently raises the bar for excellence. HCL is dedicated to bringing out the best in its employees, helping them discover their potential and evolve into the best version of themselves. HCLTech is currently seeking a Java Full Stack Developer for a prominent product-based client. Join us in reshaping the future.

**Qualification Required:** BE/B.Tech/M.Tech/MCA educational background
- **Work Location:** Pune
- **Experience:** 7 to 13 years
- **Notice Period:** 30 days

**Job Description:**
- **Mandate Skills:** Java 8, Angular, Spring Boot, Microservices
- 5 to 10 years of experience in Java 8, Angular, Spring Boot, Spring Cloud, and microservices.
- Proficiency in J2EE technologies, JDBC, ORM, Hibernate, JAXB, XML, XSD, SOAP services, and REST services.
- Familiarity with OAuth2 principles, JWT, API security, Redis cache, AWS S3, Logback, Spring Cloud Config Server, GraphQL, etc.
- Strong understanding of OOP principles, Java design patterns, multithreading, serialization, etc.
- Experience with JUnit testing and the Mockito framework.
- Proficiency in SQL and NoSQL databases such as Oracle, Cassandra, Couchbase, or similar.
- Good grasp of microservices architectural patterns.
- Strong technical skills in architecture patterns, solution design, and integration development.
- Knowledge of tools such as Jenkins, Git, CI/CD, and any cloud (AWS preferred).
- Experience with agile development methodologies, DevOps tools, Jira, Postman, and SOAP UI.

**How You'll Grow:** At HCLTech, we provide continuous opportunities for you to discover your potential and grow alongside us. We aim for your happiness and satisfaction in your role, encouraging you to explore the work that brings out your brilliance. We offer transparent communication with senior-level employees, learning and career development programs at all levels, and chances to experiment in different roles or industries. We believe in empowering you to steer your career, with limitless opportunities to find the role that aligns best with your strengths.

**Why Us:**
- One of the fastest-growing large tech companies globally, with offices in 60+ countries and 222,000 employees.
- Highly diverse company with 165 nationalities represented.
- Opportunity to collaborate with colleagues worldwide.
- Virtual-first work environment promoting work-life integration and flexibility.
- Investment in your growth through learning and career development opportunities at all levels.
- Comprehensive benefits for all employees.
- Certified great place to work and a top employer in 17 countries, fostering a positive work environment that values employee recognition.
Posted 4 weeks ago
2.0 - 6.0 years
0 Lacs
Karnataka
On-site
As an AWS Developer at PwC's Advisory Acceleration Center, you will collaborate with the Offshore Manager and Onsite Business Analyst to understand requirements and take charge of implementing cloud data engineering solutions on AWS, such as an Enterprise Data Lake and Data Hub. With a focus on architecting and delivering scalable cloud-based enterprise data solutions, you will bring your expertise in end-to-end implementation of cloud data engineering solutions using tools like Snowflake utilities, SnowSQL, SnowPipe, ETL data pipelines, and Big Data modeling techniques using Python/Java. Your responsibilities will include loading disparate data sets, translating complex requirements into detailed designs, and deploying Snowflake features like data sharing, events, and lake-house patterns.

You are expected to possess a deep understanding of relational and NoSQL data stores, including star and snowflake dimensional modeling, and demonstrate strong hands-on expertise in AWS services such as EMR, Glue, SageMaker, S3, Redshift, and DynamoDB, and AWS streaming services like Kinesis, SQS, and MSK. Troubleshooting and performance-tuning experience in the Spark framework, familiarity with flow tools like Airflow, NiFi, or Luigi, and proficiency in application DevOps tools like Git, CI/CD frameworks, Jenkins, and GitLab are essential for this role. Desired skills include experience building stream-processing systems using solutions like Storm or Spark Streaming, knowledge of Big Data ML toolkits such as Mahout, SparkML, or H2O, proficiency in Python, and exposure to offshore/onsite engagements and AWS services like Step Functions and Lambda.

Candidates with 2-4 years of hands-on experience in cloud data engineering solutions, a professional background in BE/B.Tech/MCA/M.Sc/M.E/M.Tech/MBA, and a passion for problem-solving and effective communication are encouraged to apply and be part of PwC's dynamic and inclusive work culture, where learning, growth, and excellence are at the core of our values. Join us at PwC, where you can make a difference today and shape the future tomorrow!
Posted 1 month ago
8.0 - 12.0 years
0 Lacs
Ahmedabad, Gujarat
On-site
We are seeking an experienced ASP.NET MVC Developer with over 8 years of experience to join our development team. The position involves researching, designing, and implementing new web technologies for our healthcare products. The ideal candidate should possess hands-on experience building scalable applications, strong problem-solving skills, and excellent communication skills. Qualifications include a BTech/BE in Computer Science or a related technical field, or proof of exceptional skills in related fields with practical software engineering experience.

Key requirements for the role:
- Minimum 8 years of professional experience in ASP.NET MVC and ASP.NET Core MVC.
- Proficiency in .NET Framework 4.6+, .NET Core 8+, C#, and object-oriented design principles.
- Experience with Dapper/Entity Framework Core and LINQ, plus in-depth knowledge of SQL Server.
- Hands-on experience deploying .NET applications on Microsoft Azure or AWS.
- Familiarity with cloud storage and databases such as Azure Blob Storage, Azure SQL Database, and NoSQL databases (Elasticsearch, Cosmos DB, or DynamoDB).
- Experience with Azure App Services, Azure Functions, AWS Lambda functions, and AWS S3.
- Proficiency in frontend technologies such as HTML5, CSS3, JavaScript, and jQuery.
- Experience with source control management using Git and Azure DevOps.
- Understanding of the software development lifecycle and Agile methodologies.
- Ability to research and quickly learn new technologies.

Responsibilities include:
- Design, develop, and maintain ASP.NET MVC applications using .NET Framework 4.6+ and .NET Core 8+ for high performance and scalability.
- Develop RESTful Web APIs and web services to support application functionality.
- Create and optimize SQL Server database schemas, stored procedures, functions, and queries.
- Proactively manage your work queue and expectations, and meet deadlines.
- Understand application security practices.
- Optimize application performance and ensure high-quality code through best practices and unit testing.
- Write clean, maintainable code with proper documentation.
- Participate in code reviews, architectural discussions, and sprint planning.
- Stay updated with the latest .NET technologies and industry trends.

Advantages:
- Good knowledge of FHIR and HL7.
- Understanding of microservice architecture.
- Experience with Angular 15+, TypeScript, and React.
- Experience with unit and integration testing (NUnit or xUnit).
- Knowledge of DevOps practices and CI/CD pipelines.

Apply now and be part of our dynamic development team!
Posted 1 month ago
2.0 - 6.0 years
0 Lacs
Karnataka
On-site
As an AWS Developer at PwC's Acceleration Center in Bangalore, you will be responsible for the end-to-end implementation of cloud data engineering solutions such as an Enterprise Data Lake and Data Hub in AWS. You will collaborate with the Offshore Manager/Onsite Business Analyst to understand requirements and architect scalable, distributed, cloud-based enterprise data solutions. Your role will involve hands-on experience with Snowflake utilities, SnowSQL, SnowPipe, ETL data pipelines, and Big Data modeling techniques using Python/Java.

You must have a deep understanding of relational and NoSQL data stores, methods, and approaches such as star and snowflake dimensional modeling. Strong expertise in AWS services such as EMR, Glue, SageMaker, S3, Redshift, and DynamoDB, plus streaming services such as Kinesis, SQS, and MSK, is essential. Troubleshooting and performance-tuning experience in the Spark framework, along with knowledge of flow tools such as Airflow, NiFi, or Luigi, is required. Experience with application DevOps tools such as Git, CI/CD frameworks, Jenkins, or GitLab is preferred. Familiarity with AWS CloudWatch, CloudTrail, Account Config, Config Rules, and cloud data migration processes is expected. Good analytical, problem-solving, communication, and presentation skills are essential for this role.

Desired skills include building stream-processing systems using Storm or Spark Streaming, experience with Big Data ML toolkits such as Mahout, SparkML, or H2O, and knowledge of Python. Exposure to offshore/onsite engagements and AWS services such as Step Functions and Lambda would be a plus.

Candidates with 2-4 years of hands-on experience in cloud data engineering solutions and a background in BE/B.Tech/MCA/M.Sc/M.E/M.Tech/MBA are encouraged to apply. Travel to client locations may be required based on project needs. This position falls under the Advisory line of service and the Technology Consulting horizontal, with the designation of Associate, based in Bangalore, India.

If you are passionate about working in a high-performance culture that values diversity, inclusion, and professional development, PwC could be the ideal place for you to grow and excel in your career. Apply now to be part of a global team dedicated to solving important problems and making a positive impact on the world.
Posted 1 month ago
5.0 - 10.0 years
0 Lacs
Karnataka
On-site
As a software developer, you will work in a constantly evolving environment driven by technological advances and the strategic direction of the organization. Your primary responsibilities will include creating, maintaining, auditing, and enhancing systems to meet specific needs, often based on recommendations from systems analysts or architects. You will test both hardware and software systems to identify and resolve faults, write diagnostic programs, and design and develop code for operating systems and software to ensure optimal efficiency. Where necessary, you will also provide recommendations for future developments.

Joining us offers numerous benefits, including the opportunity to work on challenging projects and solve complex technical problems. You can expect rapid career growth and the chance to assume leadership roles. Our mentorship program allows you to learn from experienced mentors and industry experts, while our global opportunities enable you to collaborate with clients from around the world and gain international experience. We offer competitive compensation packages and benefits to our employees. If you are passionate about technology and interested in working on innovative projects with a skilled team, pursuing a career as an Infosys Power Programmer could be an excellent choice for you.

To be considered for this role, you must possess the following mandatory skills:
- Proficiency in AWS Glue, AWS Redshift/Spectrum, S3, API Gateway, Athena, and Step and Lambda functions.
- Experience with Extract, Transform, Load (ETL) and Extract, Load, Transform (ELT) data integration patterns.
- Expertise in designing and constructing data pipelines.
- Development experience in one or more object-oriented programming languages, preferably Python.

In terms of job specifications, we are looking for candidates who meet the following criteria:
- At least 5 years of hands-on experience developing, testing, deploying, and debugging Spark jobs using Scala on the Hadoop platform.
- Profound knowledge of Spark Core, working with RDDs, and Spark SQL.
- Familiarity with Spark optimization techniques and best practices.
- Strong understanding of Scala functional programming concepts such as Try, Option, Future, and collections.
- Proficiency in Scala object-oriented programming, covering classes, traits, objects (singleton and companion), and case classes.
- Sound knowledge of Scala language features, including the type system and implicits/givens.
- Hands-on experience in the Hadoop environment (HDFS/Hive), AWS S3, and EMR.
- Proficiency in Python programming.
- Working experience with workflow orchestration tools such as Airflow and Oozie.
- Experience with API calls in Scala.
- Familiarity with and exposure to file formats such as Apache Avro, Parquet, and JSON.
- Desirable: knowledge of Protocol Buffers and geospatial data analytics.
- Ability to write test cases using frameworks such as ScalaTest.
- Good understanding of build tools such as Gradle and sbt.
- Experience using Git, resolving conflicts, and working with branches.
- Preferred: experience with workflow systems such as Airflow.
- Strong programming skills with a focus on data structures and algorithms.
- Excellent analytical and communication skills.

Candidates applying for this position should have:
- 7-10 years of industry experience.
- A BE/B.Tech in Computer Science or an equivalent qualification.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Karnataka
On-site
As a Business Intelligence Analyst on our team, you will collaborate with product managers, engineers, and business stakeholders to establish key performance indicators (KPIs) and success metrics for Creator Success. Your role involves creating detailed dashboards and self-service analytics tools using platforms like QuickSight, Tableau, or similar business intelligence (BI) tools, and conducting in-depth analysis of customer behavior, content performance, and livestream engagement patterns.

Developing and maintaining robust ETL/ELT pipelines to handle large volumes of streaming and batch data from the Creator Success platform is a key responsibility. Additionally, you will design and optimize data warehouses, data lakes, and real-time analytics systems using AWS services such as Redshift, S3, Kinesis, EMR, and Glue. Ensuring data accuracy and reliability is crucial, and you will implement data quality frameworks and monitoring systems.

Your qualifications should include a Bachelor's degree in Computer Science, Engineering, Mathematics, Statistics, or a related quantitative field. With at least 3 years of experience in business intelligence or analytics roles, you should be proficient in SQL, Python, and/or Scala. Expertise in AWS cloud services such as Redshift, S3, EMR, Glue, Lambda, and Kinesis is required, along with a strong background in building and optimizing ETL pipelines, data warehousing solutions, and big data technologies like Spark and Hadoop. Familiarity with distributed computing frameworks, business intelligence tools (QuickSight, Tableau, Looker), and data visualization best practices is essential.

Your proficiency in SQL and Python is highly valued, along with skills in AWS Lambda, QuickSight, Power BI, AWS S3, AWS Kinesis, ETL, Scala, AWS EMR, Hadoop, Spark, AWS Glue, and data warehousing. If you are passionate about leveraging data to drive business decisions and have a strong analytical mindset, we welcome you to join our team and make a significant impact in the field of business intelligence.
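As one concrete shape for the self-service analytics described above, here is a hedged sketch of querying Athena from Python with boto3; the database, table, query, and results bucket are assumptions for the example:

```python
# Sketch: run an Athena query behind a dashboard and fetch the results.
# Database, table, and output location are placeholders.
import time

import boto3

athena = boto3.client("athena")

qid = athena.start_query_execution(
    QueryString="""
        SELECT creator_id, COUNT(*) AS sessions
        FROM livestream_events            -- placeholder table
        GROUP BY creator_id
        ORDER BY sessions DESC
        LIMIT 10
    """,
    QueryExecutionContext={"Database": "analytics"},                   # placeholder
    ResultConfiguration={"OutputLocation": "s3://query-results-bucket/"},
)["QueryExecutionId"]

# Poll until the query finishes (simplistic; use backoff in production).
while True:
    state = athena.get_query_execution(QueryExecutionId=qid)[
        "QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

rows = athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]
```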
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
Hyderabad, Telangana
On-site
The Infor Data Services team at Infor is looking for a Software Engineer to develop customer-focused data solutions. As a Software Engineer, you will primarily focus on building backend systems for Infor Datamesh, a platform that provides scalable and robust data solutions for enterprise customers. Your responsibilities will include designing, developing, and maintaining backend services and microservices using Python, Docker, and Kafka. You will also integrate these services with various databases and external storage solutions to ensure high performance and scalability, in alignment with Infor's data management strategy.

Your essential duties will involve developing and maintaining data-centric solutions with Python, building scalable microservices from scratch following best coding practices, and upgrading backend code to ensure optimal performance and security. Additionally, you will integrate code with databases such as MongoDB and MySQL, as well as external data storage solutions like AWS S3. Developing REST APIs using frameworks like Django or Flask, implementing multithreading and asynchronous programming, and collaborating with cross-functional teams for seamless integration with Infor CloudSuite products will be part of your daily tasks.

To qualify for this role, you should have 4-5 years of experience working with Python, at least 2 years of experience with API frameworks like Django or Flask, and familiarity with AWS S3, MongoDB, MySQL, and SQL Server. Strong experience building REST APIs and working with multithreading, asynchronous programming, Docker, Kafka, and Git in Linux environments is essential, as are effective collaboration skills with cross-functional teams, strong communication, problem-solving abilities, and a Bachelor's degree in Computer Science or a related field. Preferred qualifications include experience with microservice architecture, Delta Lake, and PySpark; familiarity with the Infor CloudSuite SDK; experience with data solutions at scale within enterprise systems; and knowledge of tools like Jira, Confluence, and Agile development methodologies.

Infor is a global leader in business cloud software products, offering industry-specific market solutions. With a focus on user experience, data science, and seamless integration, Infor helps over 60,000 organizations worldwide achieve digital transformation. To learn more about Infor, visit www.infor.com. Infor's values are founded on Principle Based Management (PBM) and eight Guiding Principles: integrity, stewardship & compliance, transformation, principled entrepreneurship, knowledge, humility, respect, and self-actualization. Infor is committed to fostering a diverse environment that reflects the markets, customers, partners, and communities it serves, promoting innovation, improvement, and transformation while creating long-term value and fulfillment for its employees.
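To illustrate the kind of Python microservice endpoint this role involves, here is a minimal hedged Flask-plus-PyMongo sketch; the URI, database/collection names, and string document IDs are assumptions, not Infor Datamesh specifics:

```python
# Minimal Flask microservice sketch backed by MongoDB. Connection URI and
# collection names are placeholders; document IDs are assumed to be strings.
from flask import Flask, jsonify
from pymongo import MongoClient

app = Flask(__name__)
client = MongoClient("mongodb://localhost:27017")   # placeholder URI
records = client["datamesh"]["records"]             # placeholder db/collection

@app.route("/records/<record_id>", methods=["GET"])
def get_record(record_id: str):
    # Exclude Mongo's internal _id field from the JSON response.
    doc = records.find_one({"_id": record_id}, {"_id": 0})
    if doc is None:
        return jsonify({"error": "not found"}), 404
    return jsonify(doc)

if __name__ == "__main__":
    app.run(port=8080)
```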
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
Telangana
On-site
As a Web Development Lead at our organization, you will be a key member of our dynamic IT team, with a primary focus on developing cutting-edge healthcare applications. This role provides a unique opportunity to contribute to the future of healthcare through the creation of data-driven technology solutions in a collaborative and innovative environment.

Your responsibilities will include leading a team of front-end developers, both onshore and offshore; defining and documenting system specifications in alignment with architecture and business requirements; and planning system deployments while adhering to security standards. You will develop and maintain web applications using React.js and Material-UI, collaborate with UI/UX designers to implement responsive and user-friendly designs, and work with back-end developers to integrate RESTful APIs and AWS microservices. In addition, you will troubleshoot and resolve system performance issues; document and maintain updates, modifications, and system configurations; interface with key technology vendors and internal stakeholders; and evaluate standard platform capabilities versus custom development requirements. You will also write and maintain unit and integration tests, optimize performance, ensure high code quality, and ensure that implementations align with global regulatory standards such as HIPAA, SOC 2, and ISO 27001.

Preferred qualifications include experience in healthcare or medical device environments; knowledge of healthcare standards such as HL7v2, FHIR, and DICOM; HL7 and/or FHIR certification; experience with Software as a Medical Device (SaMD); familiarity with EHR systems like Epic and Cerner; an understanding of security and privacy-by-design principles; a background in digital health product development; experience working on global applications; and a strong understanding of the SDLC, including Agile and Waterfall methodologies. Proficiency in project estimation and development team leadership is also highly desirable.

The ideal candidate must possess technical skills in frontend technologies such as React.js, Redux/Toolkit, GraphQL, React/Next.js, Node.js, Angular, Tailwind CSS, React Testing Library, and Jest. Cloud expertise in AWS services like S3, Lambda, API Gateway, CloudFront, and CloudWatch is essential, along with experience in DevOps tools like GitLab CI, GitHub, and Jenkins, plus unit/integration testing and performance tuning. Familiarity with monitoring tools like the ELK Stack and Dynatrace, as well as compliance and documentation standards such as HIPAA, ISO 27001/27002, PVCS, JSON, XML, and technical documentation, is required for this role.

Operating within the Care Platform team, you will play a crucial role in enhancing our healthcare product portfolio with innovative services aimed at improving both clinical and patient experiences. This is a full-time, in-person position based in Minneapolis, MN 55407; the ability to commute to, or relocate to, Minneapolis, MN 55407 before starting work is required.
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
Karnataka
On-site
As a Senior AWS Data Engineer - Cloud Data Platform at Teamware Solutions, a division of Quantum Leap Consulting Pvt. Ltd, located in Bangalore, you will be responsible for end-to-end implementation of cloud data engineering solutions such as an Enterprise Data Lake and Data Hub in AWS. Working onsite in an office environment five days a week, you will collaborate with the Offshore Manager and Onsite Business Analyst to understand requirements and deliver scalable, distributed, cloud-based enterprise data solutions.

You should have a strong background in AWS cloud technology, with 4-8 years of hands-on experience. Proficiency in architecting and delivering highly scalable solutions is a must, along with expertise in cloud data engineering solutions, Lambda or Kappa architectures, data management concepts, and data modelling. You should be proficient in AWS services such as EMR, Glue, S3, Redshift, and DynamoDB, and have experience with Big Data frameworks like Hadoop and Spark. Additionally, you must have hands-on experience with AWS compute and storage services, AWS streaming services, troubleshooting and performance tuning in the Spark framework, and knowledge of application DevOps tools like Git and CI/CD frameworks. Familiarity with AWS CloudWatch, CloudTrail, Account Config, Config Rules, security, key management, and data migration processes, plus strong analytical skills, is required. Good communication and presentation skills are essential for this role.

Desired skills include experience building stream-processing systems, Big Data ML toolkits, Python, offshore/onsite engagements, flow tools like Airflow, NiFi, or Luigi, and AWS services like Step Functions and Lambda. A professional background in BE/B.Tech/MCA/M.Sc/M.E/M.Tech/MBA is preferred, and AWS Certified Data Engineer certification is recommended.

If you are interested in this position and meet the qualifications mentioned above, please send your resume to netra.s@twsol.com.
Posted 1 month ago
8.0 - 12.0 years
0 Lacs
pune, maharashtra
On-site
You are a skilled and motivated Backend Engineer with 8-10 years of experience, specializing in TypeScript, Node.js, and AWS. You will be responsible for designing and building scalable backend APIs and microservices, following best practices in API development and cloud-native architecture.

Your key responsibilities include designing, developing, and maintaining RESTful APIs using TypeScript and Node.js to ensure scalability, performance, and maintainability, and implementing API best practices such as versioning, documentation, authentication, rate limiting, and error handling. You will architect and build backend services on AWS using services like Lambda, API Gateway, ECS/Fargate, DynamoDB, and S3, and develop serverless applications with AWS Lambda and related tools, following infrastructure-as-code practices where possible. You will also build containerized applications using Docker and deploy them with AWS container services like ECS or EKS.

Ensuring code quality is crucial, so you will write unit, integration, and end-to-end tests while leveraging TypeScript's type-safety features for robust backend implementations. Collaboration with cross-functional teams, including frontend developers, DevOps, and product managers, is essential to deliver high-quality software solutions. You will monitor and troubleshoot production systems using observability tools such as CloudWatch, X-Ray, and logging solutions to ensure uptime and reliability. Staying up to date with the latest developments in backend technologies, AWS services, and the Node.js ecosystem, and driving continuous improvement within the team, is also expected.

If you find this role exciting, please share your updated resume with pavan.k@s3staff.com.
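As a sketch of the serverless pattern the role describes, here is an API Gateway-backed Lambda handler in TypeScript with a versioned route and structured error handling. The route, the `loadOrder` helper, and the error codes are assumptions made for illustration:

```typescript
import type { APIGatewayProxyEvent, APIGatewayProxyResult } from "aws-lambda";

// Illustrative handler for GET /v1/orders/{id} behind API Gateway.
export async function handler(
  event: APIGatewayProxyEvent
): Promise<APIGatewayProxyResult> {
  const id = event.pathParameters?.id;
  if (!id) {
    // 400 with a machine-readable error body, per common REST conventions.
    return respond(400, { error: "missing_order_id" });
  }

  try {
    const order = await loadOrder(id); // stand-in for a DynamoDB/S3 lookup
    return order ? respond(200, order) : respond(404, { error: "not_found" });
  } catch (err) {
    console.error("order lookup failed", err); // surfaces in CloudWatch Logs
    return respond(500, { error: "internal_error" });
  }
}

function respond(statusCode: number, body: unknown): APIGatewayProxyResult {
  return {
    statusCode,
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(body),
  };
}

// Placeholder data access; a real service would query DynamoDB here.
async function loadOrder(id: string): Promise<{ id: string } | null> {
  return { id };
}
```

Keeping the version in the path (`/v1/...`) and the error bodies machine-readable is one common way to satisfy the versioning and error-handling practices the posting calls for.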
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
As a Senior QA Test Analyst at our organization in Bangalore, India, you will play a crucial role in our agile development team. Working collaboratively with talented and enthusiastic colleagues, you will need a keen eye for detail, strong analytical and diagnostic skills, and a determination to excel in a dynamic work environment. Your primary tools will include Cucumber, Node.js, TypeScript, Playwright, Jest, Git, and AWS S3.

In this role, you will apply feature-centric test design techniques and incorporate them effectively into test plans and strategies. You should have a deep technical understanding of integrated systems to identify and address issues in the application under test (AUT). Proficiency in BDD (Behavior Driven Development) concepts is essential for deriving precise features and scenarios from PRDs, epics, and user stories. Your coding skills should be excellent, enabling you to write optimized, industry-standard TypeScript following best practices, and you need a good grasp of test frameworks and separation of concerns (SoC). You should be well versed in GitHub, the code review process, pull requests, and branching strategies, and have experience with REST API testing using standard libraries. As a senior engineer, you will also mentor junior team members and collaborate effectively within a global team; a sketch of the automation stack follows this list.

Key skills required for this role include:
- 4-6 years of test automation experience in UI and API automation using JavaScript or TypeScript
- Proficiency in BDD, Node.js, and UI testing
- Exposure to at least one test framework such as Jest, Mocha, Protractor, Jasmine, or Cucumber
- Familiarity with at least one automation framework like Playwright, Puppeteer, WDIO, or Cypress

You will need a Bachelor's degree in Computer Science, MIS, or a related field, along with 3-5 years of software testing experience for SaaS products. This is a full-time permanent position with a hybrid work model. The role is based at The Leela Office on Airport Road, Kodihalli, Bangalore; you will work from the office on Tuesdays, Wednesdays, and Thursdays, with the flexibility to work from home on Mondays and Fridays.
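To ground the automation stack, a minimal Playwright test in TypeScript, structured BDD-style around a scenario derived from a hypothetical user story. The URL, labels, and endpoint are placeholders, not a real application under test:

```typescript
import { test, expect } from "@playwright/test";

// Scenario, BDD-style: given a registered user, when they log in with
// valid credentials, then they land on their dashboard.
test.describe("Login", () => {
  test("valid credentials land on the dashboard", async ({ page }) => {
    await page.goto("https://app.example.com/login"); // placeholder AUT URL

    await page.getByLabel("Email").fill("qa.user@example.com");
    await page.getByLabel("Password").fill("not-a-real-password");
    await page.getByRole("button", { name: "Sign in" }).click();

    await expect(
      page.getByRole("heading", { name: "Dashboard" })
    ).toBeVisible();
  });

  test("the profile API returns the signed-in user", async ({ request }) => {
    // REST API check using Playwright's built-in request fixture;
    // a real suite would attach auth state before calling this.
    const res = await request.get("https://app.example.com/api/v1/me");
    expect(res.status()).toBe(200);
  });
});
```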
Posted 1 month ago
8.0 - 12.0 years
0 Lacs
ahmedabad, gujarat
On-site
We are seeking an experienced ASP.NET MVC Developer with over 8 years of experience to join our development team. Your role will involve researching, designing, and implementing cutting-edge web technologies tailored to our healthcare products. The ideal candidate has hands-on experience building scalable applications, exceptional problem-solving skills, and outstanding communication abilities. You should hold a BTech/BE in Computer Science or a related technical field, or demonstrate extraordinary skill in relevant areas coupled with practical software engineering experience.

Key requirements include a minimum of 8 years of professional experience with ASP.NET MVC and ASP.NET Core MVC; a strong command of .NET Framework 4.6+, .NET 8+, C#, and object-oriented design principles; and proficiency with Dapper/Entity Framework Core, LINQ, and SQL Server. You should have hands-on experience deploying .NET applications on Microsoft Azure or AWS, with familiarity across Azure App Services, Azure Functions, AWS Lambda, and AWS S3. An understanding of cloud storage and databases such as Azure Blob Storage, Azure SQL Database, and NoSQL stores like Elasticsearch, Cosmos DB, or DynamoDB is crucial. Experience with Azure API Management or AWS API Gateway, as well as containerization using Docker, is highly advantageous. Proficiency in frontend technologies like HTML5, CSS3, JavaScript, and jQuery, and experience with source control tools like Git and Azure DevOps, are also required, along with strong English communication skills, familiarity with software development lifecycles and Agile methodologies, and the ability to quickly research and learn new technologies.

Your responsibilities will include developing and maintaining ASP.NET MVC applications on .NET Framework 4.6+ and .NET 8+ with a focus on performance and scalability; creating RESTful Web APIs and web services; optimizing SQL Server database schemas, stored procedures, functions, and queries; and applying sound application security practices. You will also optimize application performance, write clean and maintainable code with proper documentation, participate in code reviews, architectural discussions, and sprint planning, and stay abreast of the latest .NET technologies and industry trends.

Advantageous extras include knowledge of FHIR and HL7, an understanding of microservice architecture, experience with Angular 15+, TypeScript, and React, unit and integration testing (NUnit or xUnit), and familiarity with DevOps practices and CI/CD pipelines. If you meet these qualifications and are passionate about leveraging your skills to drive innovation in healthcare technology, we encourage you to apply.
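The server side of this role lives in C#, but the frontend requirement (JavaScript consuming the Web APIs the role builds) can be sketched as a small typed client. The `/api/appointments` endpoint, its response shape, and the local port are assumptions for illustration only:

```typescript
// Minimal typed client for a hypothetical ASP.NET Core Web API endpoint.
// Runs on Node 18+ (global fetch) or in the browser.
interface AppointmentDto {
  id: number;
  patientName: string;
  scheduledAt: string; // ISO 8601, as serialized by the API
}

async function getAppointments(baseUrl: string): Promise<AppointmentDto[]> {
  const res = await fetch(`${baseUrl}/api/appointments`, {
    headers: { Accept: "application/json" },
  });
  if (!res.ok) {
    throw new Error(`GET /api/appointments failed with ${res.status}`);
  }
  return (await res.json()) as AppointmentDto[];
}

// Example usage against a local development server (placeholder URL).
getAppointments("https://localhost:5001")
  .then((a) => console.log(`fetched ${a.length} appointments`))
  .catch(console.error);
```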
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
We are seeking a skilled and experienced Data Engineer with a minimum of 5 years of expertise in data engineering and data migration projects. The ideal candidate will have strong proficiency in SQL, Python, data modeling, data warehousing, and ETL pipeline development. Hands-on experience with big data tools such as Hadoop and Spark is essential, as is familiarity with AWS services including Redshift, S3, Glue, EMR, and Lambda. This position offers a fantastic opportunity to contribute to large-scale data solutions that drive data-informed decision-making and operational efficiency.

As a Data Engineer, your responsibilities will include designing, building, and maintaining scalable data pipelines and ETL processes; developing and optimizing data models and data warehouse architectures; and implementing and managing big data technologies and cloud-based data solutions. You will perform data migration, transformation, and integration from multiple sources, collaborate with cross-functional teams to understand data requirements, and ensure data quality, consistency, and security across all pipelines and storage systems, while optimizing performance and managing cost-efficient AWS cloud resources.

Basic qualifications include a Master's degree in Computer Science, Engineering, Analytics, Mathematics, Statistics, IT, or a related field, along with a minimum of 5 years of hands-on experience in data engineering and data migration projects. Proficiency in SQL and Python for data processing and analysis is required, as is a strong background in data modeling, data warehousing, and building data pipelines. The ideal candidate will have practical experience with big data technologies like Hadoop and Spark, expertise with AWS services such as Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, and IAM, and an understanding of ETL development best practices and principles.

Preferred qualifications include knowledge of data security and privacy best practices, experience with DevOps and CI/CD practices for data workflows, familiarity with data lake architectures and real-time data streaming, strong problem-solving abilities, attention to detail, excellent verbal and written communication skills, and the ability to work both independently and collaboratively. Desirable skills include experience with orchestration tools like Airflow or Step Functions, exposure to BI/visualization tools like QuickSight, Tableau, or Power BI, and an understanding of data governance and compliance standards.
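The posting's pipelines would normally be written in Python or Spark, but the shape of a single ETL step, extract from S3, transform, load back, is language-agnostic. A minimal sketch in TypeScript with AWS SDK v3; the bucket names, key layout, and record schema are all invented:

```typescript
import { S3Client, GetObjectCommand, PutObjectCommand } from "@aws-sdk/client-s3";

const s3 = new S3Client({ region: "ap-south-1" });

// One ETL step: extract raw JSON lines, transform (filter out bad records
// and reshape fields), and load to a curated prefix. Placeholder buckets.
async function transformPartition(key: string): Promise<void> {
  const raw = await s3.send(
    new GetObjectCommand({ Bucket: "raw-zone", Key: key })
  );
  // transformToString() is the SDK v3 stream helper available on Node 18+.
  const lines = (await raw.Body!.transformToString()).split("\n").filter(Boolean);

  const cleaned = lines
    .map((l) => JSON.parse(l) as { userId?: string; amount?: number })
    .filter((r) => r.userId && typeof r.amount === "number") // data quality gate
    .map((r) => ({ user_id: r.userId, amount_inr: r.amount }));

  await s3.send(
    new PutObjectCommand({
      Bucket: "curated-zone",
      Key: key.replace(/\.json$/, ".cleaned.json"),
      Body: cleaned.map((r) => JSON.stringify(r)).join("\n"),
      ContentType: "application/json",
    })
  );
}

transformPartition("events/2024/01/01/part-0000.json").catch(console.error);
```

In a production pipeline this step would be one task in an Airflow DAG or a Step Functions state machine, with the quality gate reporting rejected record counts to monitoring.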
Posted 1 month ago