8.0 - 12.0 years
0 Lacs
maharashtra
On-site
You are an experienced and highly skilled Senior AWS Data Engineer with more than 8 years of experience, ready to join our dynamic team. Your deep understanding of data engineering principles, extensive experience with AWS services, and proven track record of designing and implementing scalable data solutions make you the ideal candidate for this role.

Key Responsibilities:
- Design and implement robust, scalable, and efficient data pipelines and architectures on AWS.
- Develop data models and schemas to support business intelligence and analytics requirements.
- Utilize AWS services such as S3, Redshift, EMR, Glue, Lambda, and Kinesis to build and optimize data solutions.
- Implement data security and compliance measures using AWS IAM, KMS, and other security services.
- Design and develop ETL processes to ingest, transform, and load data from various sources into data warehouses and lakes.
- Ensure data quality and integrity through validation, cleansing, and transformation processes.
- Optimize data storage and retrieval performance through indexing, partitioning, and other techniques.
- Monitor and troubleshoot data pipelines to maintain high availability and reliability.
- Collaborate with cross-functional teams and provide technical leadership and mentorship to junior data engineers.
- Identify opportunities to automate and streamline data processes, and participate in on-call rotations for critical systems and services.

Required Qualifications, Capabilities, and Skills:
- Experience in software development and data engineering, with hands-on experience in Python and PySpark.
- Proven experience with cloud platforms such as AWS, Azure, or Google Cloud.
- Good understanding of data modeling, data architecture, ETL processes, and data warehousing concepts.
- Experience with cloud-native ETL platforms such as Snowflake and/or Databricks.
- Proven experience with big data technologies and services such as AWS EMR, Redshift, Lambda, and S3.
- Efficient cloud DevOps practices and CI/CD tools such as Jenkins or GitLab for data engineering platforms.
- Good knowledge of SQL and NoSQL databases, including performance tuning and optimization.
- Experience with declarative infrastructure provisioning tools such as Terraform, Ansible, or CloudFormation.
- Strong analytical skills to troubleshoot issues and optimize data processes, working both independently and collaboratively.

Preferred Qualifications, Capabilities, and Skills:
- Knowledge of the machine learning model lifecycle, language models, and cloud-native MLOps pipelines and frameworks.
- Familiarity with data visualization tools and data integration patterns.
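As an illustrative sketch of the pipeline work this role describes, the following minimal PySpark job ingests raw CSV from S3, applies basic cleansing, and writes partitioned Parquet back to S3. The bucket paths and column names are hypothetical, and a real Glue or EMR job would add its own configuration.

```python
# Minimal PySpark ETL sketch: ingest raw CSV from S3, clean, and write
# partitioned Parquet. Bucket paths and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

raw = (spark.read
       .option("header", True)
       .csv("s3a://example-raw-bucket/orders/"))          # hypothetical path

clean = (raw
         .dropDuplicates(["order_id"])                     # basic data-quality step
         .filter(F.col("amount").isNotNull())
         .withColumn("amount", F.col("amount").cast("double"))
         .withColumn("order_date", F.to_date("order_date")))

(clean.write
 .mode("overwrite")
 .partitionBy("order_date")                                # partitioning for retrieval performance
 .parquet("s3a://example-curated-bucket/orders/"))         # hypothetical path
```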
Posted 3 weeks ago
6.0 - 10.0 years
0 Lacs
kolkata, west bengal
On-site
The Python Software Engineer role involves designing, coding, and testing analytics applications using Python, including Core Python, object-oriented programming, and functional programming. The ideal candidate should be proficient in working with data sets from databases such as MSSQL and Oracle, data lakes such as S3, and data lakehouses such as Dremio. Proficiency in Python object-relational mapping (ORM) is mandatory, and strong experience with Python data manipulation libraries such as Pandas and NumPy is essential.

Responsibilities:
- Design, develop, test, and maintain Python analytics applications, writing clean and optimal code for data manipulation operations over data lakes and data lakehouses.
- Debug, identify, and resolve bugs and issues in the code.
- Collaborate with other software engineers, data scientists, and stakeholders to deliver high-quality software that is scalable, readable, and maintainable.
- Follow Test-Driven Development (TDD) and deliver stories within Agile Scrum sprints.
- Create and maintain documentation for the software and development processes.

Requirements:
- A strong understanding of Python and a minimum of 6 years of exclusive experience in Python development.
- Proficiency with the Python language and its libraries (Core, OOP, and FP), with particular emphasis on Pandas and NumPy.
- Experience with SQL and NoSQL databases.
- Strong analytical and problem-solving skills, and the ability to work effectively in a team and communicate technical information clearly.
- Familiarity with debugging tools and techniques, databases, data lakes (S3), and continuous integration and continuous deployment pipelines on Azure/AWS is advantageous.

Shift Timing: 2 PM to 11 PM IST
Job Types: Full-time, Contractual / Temporary
Contract Length: 12 months
Application Questions:
- Are you comfortable with a contract-to-hire (C2H) arrangement?
- What is your notice period?
Work Location: In person
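A minimal sketch of the ORM-plus-Pandas pattern this listing asks for, using an in-memory SQLite database as a stand-in for MSSQL/Oracle; the table and column names are hypothetical.

```python
# ORM model queried into a Pandas DataFrame; SQLite stands in for MSSQL/Oracle.
import pandas as pd
from sqlalchemy import create_engine, Column, Integer, String, Float
from sqlalchemy.orm import declarative_base, Session

Base = declarative_base()

class Trade(Base):                      # hypothetical analytics entity
    __tablename__ = "trades"
    id = Column(Integer, primary_key=True)
    symbol = Column(String)
    price = Column(Float)

engine = create_engine("sqlite:///:memory:")
Base.metadata.create_all(engine)

with Session(engine) as session:
    session.add_all([Trade(symbol="AAA", price=10.5), Trade(symbol="BBB", price=7.2)])
    session.commit()

df = pd.read_sql("SELECT symbol, price FROM trades", engine)
print(df.describe())                    # NumPy-backed summary statistics
```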
Posted 4 weeks ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
The ideal candidate is a self-motivated multi-tasker and a demonstrated team player. You will be a lead developer responsible for the development of new software products and enhancements to existing products. You should excel in working with large-scale applications and frameworks and have outstanding communication and leadership skills.

Responsibilities:
- Own and drive key initiatives supporting the developer workflow from PR to release.
- Innovate and deliver technical solutions.
- Collaborate with partner engineering teams for inputs, support, and guidance.
- Monitor, review, and coach developer performance; conduct regular performance appraisals and provide disciplinary actions where needed.
- Document technical solutions and articulate them to both business and technical audiences.
- Present, communicate, and advocate engineering perspectives.
- Participate in Agile ceremonies and show ownership of work throughout the sprint process.

Qualifications:
- Bachelor's degree in Computer Science or a related field.
- 5+ years of relevant work experience in Node.js and 3+ years in GraphQL.
- Expertise in object-oriented design, database design, and XML Schema.
- Experience with Agile or Scrum software development methodologies.
- Ability to multi-task, organize, and prioritize work.

Technical Requirements:
- Tech stack: TypeScript, Node.js, GraphQL, PostgreSQL, AWS services.
- Development practices: TDD, pair programming, code reviews, continuous integration and delivery (CI/CD).
- Deployment: automated using Terraform/GitHub Actions on AWS.
- Technologies used: API portals (API Gateway, CloudFront, WAF), serverless technologies (Lambda), storage and database systems (Aurora, S3), messaging systems (SNS, SQS), Kafka.
- Agile practices: dailies, story detailing, planning, retrospectives.
Posted 4 weeks ago
10.0 - 14.0 years
0 Lacs
noida, uttar pradesh
On-site
You are an Individual Contributor at Adobe, one of the world's most innovative software companies, transforming digital experiences for billions of users globally. Adobe empowers individuals, businesses, and organizations to unleash their creativity, collaboration, and productivity through its cutting-edge products. As part of the 30,000+ strong global team at Adobe, you have the opportunity to shape the future by creating high-quality, performant web solutions and features. Your role involves driving solutioning, guiding the team technically, and collaborating with product management to ensure technical feasibility and enhance user experience and performance.

To succeed in this role, you must possess a strong technical background, analytical skills, and hands-on experience in Java, JavaScript, and web applications. Your ability to adapt to new technologies, work effectively in diverse teams, and lead engineering projects is crucial. With over 10 years of software engineering experience and proficiency in technologies such as Web Components, TypeScript, MVC frameworks, and AWS services, you are well equipped to define APIs, integrate them into web applications, and drive innovation.

At Adobe, a culture of collaboration, shared accomplishments, and continuous learning prevails. You are encouraged to stay updated on industry trends, make data-driven decisions, and foster a fun and impactful work environment. By leveraging your technical expertise, problem-solving skills, and proactive approach, you can contribute to Adobe's mission of revolutionizing digital experiences and creating personalized digital solutions that change the world.

Join Adobe, where every employee is empowered to make a difference, and where you can unleash your potential, grow your career, and be part of a global community dedicated to driving innovation and positive change. For a rewarding career experience and the opportunity to work in a supportive and inclusive environment, Adobe is the ideal place to thrive and make a meaningful impact.
Posted 4 weeks ago
3.0 - 7.0 years
0 Lacs
kolkata, west bengal
On-site
You should have a minimum of 3 years of experience in data engineering, with a specific emphasis on AWS technologies. Your expertise should include proficiency in Python, SQL, PySpark, and data modeling, along with hands-on experience with AWS services such as Glue, Redshift, Lambda, and S3, as well as Airflow. Prior experience with data sourced from energy systems, smart grids, or industrial IoT platforms is preferred. Strong problem-solving skills and keen attention to detail are essential for this role, and excellent communication and collaboration abilities will be highly beneficial in working effectively with the team.
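A minimal sketch of the kind of Airflow orchestration this role involves, wiring an extract step to a downstream transform. The DAG name, task bodies, and energy-data context are hypothetical placeholders; a real pipeline would call Glue jobs or Spark transforms rather than printing.

```python
# Hypothetical daily DAG: land smart-meter files, then transform them.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_readings(**context):
    # placeholder: pull meter readings from an IoT source into S3
    print("extracting readings for", context["ds"])

def transform_readings(**context):
    # placeholder: trigger a Glue job / Spark transform on the landed files
    print("transforming readings for", context["ds"])

with DAG(
    dag_id="meter_readings_daily",           # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                       # Airflow 2.4+ parameter
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_readings)
    transform = PythonOperator(task_id="transform", python_callable=transform_readings)
    extract >> transform
```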
Posted 4 weeks ago
8.0 - 12.0 years
0 Lacs
navi mumbai, maharashtra
On-site
As a candidate for this position, you should hold a Bachelor's degree in Computer Engineering, Computer Science, Information Systems, or a related field, and have a minimum of 8 years of experience in a DevOps-related role for senior-level candidates. Strong English communication skills, both written and verbal, are essential for effective collaboration within the team.

- Proficiency in Linux system engineering is a must, along with hands-on experience setting up CI/CD pipelines, preferably using GitHub Actions.
- Expertise with Infrastructure as Code (IaC) tools, with a particular focus on Terraform; familiarity with configuration management tools such as Ansible or Salt is advantageous.
- Experience with logging and monitoring tools such as Grafana, Promtail, and Loki.
- Proficiency in Kubernetes, Helm, and GitOps tools such as ArgoCD is highly preferred; Kubernetes administration experience is a strong advantage, coupled with a solid understanding of microservices architecture and service mesh concepts.
- Experience building and managing Windows-based infrastructure, and familiarity with artifact management tools such as JFrog Artifactory.
- Strong knowledge of AWS services, including but not limited to EC2, ECS, RDS, S3, CloudFront, WAF, API Gateway, Lambda, ElastiCache, Elasticsearch, SQS, SNS, and EKS.
- Proficiency in at least one programming language, preferably Python.
- Experience with systems such as Kafka, Keycloak, Airflow, NiFi, Pentaho, Redis, and PostgreSQL is beneficial.
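Given the Python expectation in this role, here is a small boto3 sketch of the kind of operational check a DevOps engineer might script against the AWS services listed above. The region and queue name are hypothetical.

```python
# Quick AWS health snapshot: running EC2 instance count and SQS queue depth.
import boto3

ec2 = boto3.client("ec2", region_name="ap-south-1")   # hypothetical region
sqs = boto3.client("sqs", region_name="ap-south-1")

running = ec2.describe_instances(
    Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
)
count = sum(len(r["Instances"]) for r in running["Reservations"])
print(f"running instances: {count}")

queue_url = sqs.get_queue_url(QueueName="example-jobs")["QueueUrl"]  # hypothetical queue
attrs = sqs.get_queue_attributes(
    QueueUrl=queue_url, AttributeNames=["ApproximateNumberOfMessages"]
)
print("queue depth:", attrs["Attributes"]["ApproximateNumberOfMessages"])
```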
Posted 4 weeks ago
10.0 - 16.0 years
0 Lacs
chennai, tamil nadu
On-site
You are a senior technical leader with over 12 years of experience, tasked with driving complex engineering initiatives, establishing technical direction, and guiding teams toward delivering impactful solutions. You will work in a hybrid mode and must possess strong skills in Golang, AWS, DynamoDB, and Kinesis Data Streams.

Responsibilities:
- Implement a highly reliable and secure IAM platform service that complies with industry-standard protocols.
- Build and optimize RESTful APIs for managing user roles, permissions, and access policies, ensuring adherence to standards such as OAuth 2.0 and OpenID Connect.
- Integrate SDKs and APIs from third-party identity management solutions to enable authentication flows.
- Write highly performant concurrent code to handle millions of authentication and authorization requests daily with minimal latency.
- Apply API-first design principles and software patterns to construct modular, reusable, and well-documented services.
- Leverage serverless architecture, such as AWS Lambda, for cost-effective and scalable services that minimize infrastructure overhead.
- Uphold best practices in software engineering, CI/CD pipelines, testing, and monitoring to maintain high-quality releases.
- Collaborate with product management, architecture, and other cross-functional teams to ensure seamless delivery of features.

Required Skills:
- Expertise in programming languages such as Go, Java, and .NET.
- Experience leveraging Auth0 for authentication and identity federation, integrating with third-party providers for seamless single sign-on experiences.
- Experience building and maintaining RESTful API interfaces that provide fine-grained access control and role-based permissions for platform consumers.
- Familiarity with event-based patterns and AWS serverless technology (Lambda, DynamoDB, S3, and CloudWatch).
- Ability to design and execute end-to-end test cases, including unit, integration, and load testing, to ensure the reliability and scalability of the IAM service.
- Problem-solving and analytical skills focused on delivering scalable, maintainable, and high-performing solutions.
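As a rough illustration of the per-request token validation and role check such an IAM service performs, here is a sketch using PyJWT. It is shown in Python for consistency with the other examples on this page, although the role itself calls for Go; the secret, issuer, and claim layout are hypothetical, and a production service would verify against an identity provider's JWKS public keys rather than a shared secret.

```python
# Sketch of OAuth 2.0-style bearer-token validation plus a role check.
# Secret, issuer, and claims are hypothetical; real services use JWKS keys.
import jwt  # PyJWT

SECRET = "example-shared-secret"

def authorize(token: str, required_role: str) -> bool:
    try:
        claims = jwt.decode(
            token,
            SECRET,
            algorithms=["HS256"],
            issuer="https://idp.example.com",      # hypothetical issuer
            options={"require": ["exp", "iss"]},   # reject tokens missing these claims
        )
    except jwt.InvalidTokenError:
        return False
    return required_role in claims.get("roles", [])

token = jwt.encode(
    {"iss": "https://idp.example.com", "exp": 2_000_000_000, "roles": ["admin"]},
    SECRET, algorithm="HS256",
)
print(authorize(token, "admin"))   # True
```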
Posted 4 weeks ago
3.0 - 7.0 years
0 Lacs
maharashtra
On-site
As a Business Analyst specializing in Business Intelligence at Bloom Energy, you will be part of a team driving a global mission to revolutionize the energy industry. Bloom Energy is dedicated to making clean, reliable energy accessible worldwide through its innovative Energy Server technology, which provides efficient and sustainable electric power for applications such as microgrids.

Reporting to the Business Intelligence Senior Manager in Mumbai, India, you will play a crucial role in enhancing the visibility and accuracy of financial data through the development of automated tools and dashboards for different P&L line items. Collaborating closely with the leadership team, you will contribute to improving forecasting tools, monitoring actuals versus forecasts, and supporting ad hoc data analysis requests from the operations team. Your expertise in cost analysis will be instrumental in providing valuable insights to enhance profitability.

Key Responsibilities:
- Develop automated tools and dashboards to enhance visibility and accuracy of P&L line items
- Collaborate with the leadership team to improve forecasting tools and ensure accurate P&L forecasts
- Work closely with the finance team to monitor actuals versus forecasts during the quarter
- Provide support for ad hoc data analysis and scenario planning requests from the operations team
- Conduct in-depth analysis of costs and offer insights to the leadership team to drive profitability
- Collaborate with the IT team to develop production-ready tools for automating the Services P&L

Requirements:
- Strong analytical and problem-solving skills
- Proficiency in Python, Excel, and PowerPoint
- Experience in financial planning and forecasting is a plus
- Proficiency in dashboarding tools such as Tableau
- Familiarity with databases/data lakes such as PostgreSQL, Cassandra, AWS RDS, Redshift, and S3
- Experience with version control software such as Git

Education:
- Bachelor's degree in Business Management, Data Analytics, Computer Science, Industrial Engineering, or a related field

Join Bloom Energy in its commitment to a 100% renewable future and be part of an organization that offers resilient electricity solutions in the face of power disruptions. With a focus on clean energy technologies, Bloom Energy is leading the transition to renewable fuels such as hydrogen and biogas, providing sustainable power solutions to industries including manufacturing, data centers, healthcare, and retail. Learn more at www.bloomenergy.com.
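A small sketch of the actuals-versus-forecast tracking this role describes, done in pandas. The line items and numbers are entirely hypothetical.

```python
# Variance of actuals vs. forecast by P&L line item; data is hypothetical.
import pandas as pd

pnl = pd.DataFrame({
    "line_item": ["revenue", "cogs", "logistics"],
    "forecast":  [120.0, 70.0, 12.0],   # hypothetical figures, in $M
    "actual":    [112.0, 74.0, 11.5],
})

pnl["variance"] = pnl["actual"] - pnl["forecast"]
pnl["variance_pct"] = 100 * pnl["variance"] / pnl["forecast"]
print(pnl.sort_values("variance_pct"))   # worst-performing line items first
```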
Posted 4 weeks ago
3.0 - 7.0 years
0 Lacs
chennai, tamil nadu
On-site
As a Snowflake Data Engineer with 3-5 years of experience, you will be responsible for designing, developing, and optimizing cloud-based data warehousing solutions. This is an exciting opportunity to work on a flagship data initiative for a premier Big 4 consulting client, offering ample scope for technical innovation, learning, and career growth.

Key Responsibilities:
- Design and develop high-performance data pipelines in Snowflake for data ingestion, transformation, and storage, with a focus on external tables, semi-structured data handling, and transformation logic.
- Optimize Snowflake workloads to ensure efficient query execution and cost-effective utilization of compute and storage resources; tune performance across large-scale datasets and implement workload management strategies.
- Develop robust ETL processes using SQL, Python, and orchestration tools such as DBT, Apache Airflow, Matillion, or Talend, focusing on automation, data transformation, and pipeline reliability.
- Integrate with AWS Glue, utilizing capabilities such as crawlers, jobs, and external tables for seamless integration with Snowflake, ensuring consistent and automated data ingestion and cataloging.
- Enforce data governance, role-based access control, and compliance protocols within Snowflake to ensure secure handling of sensitive data and privacy adherence.
- Handle diverse data formats, both structured and semi-structured (JSON, Parquet, Avro, XML, etc.), to enable flexibility in data consumption across reporting and analytics.
- Design dimensional models optimized for Snowflake architecture, including fact and dimension tables, to enable efficient querying and integration with BI tools.
- Collaborate with business stakeholders, data analysts, and BI developers to translate business requirements into scalable data solutions.
- Monitor end-to-end data workflows, ensure system reliability, and proactively troubleshoot failures and performance bottlenecks.

Key Skills & Qualifications:
- Hands-on experience with Snowflake development and architecture
- Proficiency in SQL, Python, and cloud-native ETL/ELT tools
- Experience with AWS Glue, S3, and Snowflake integration
- Strong knowledge of data modeling, performance tuning, and cost optimization
- Familiarity with handling semi-structured data
- Good understanding of data governance, access control, and security best practices
- Excellent problem-solving and communication skills

Nice to Have:
- Experience working with Big 4 consulting clients or large enterprise environments
- Exposure to DevOps practices, CI/CD pipelines, and data quality frameworks

If you are looking to leverage your expertise in Snowflake and cloud-based data warehousing to drive technical innovation and deliver scalable solutions, this role offers an exciting opportunity to grow your career and make a significant impact.
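A minimal sketch of the programmatic Snowflake loading this pipeline work involves, using the snowflake-connector-python package. All connection parameters, the stage, and the table name are hypothetical placeholders.

```python
# Load staged Parquet files into a Snowflake table via COPY INTO.
# All identifiers and credentials are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",
    user="example_user",
    password="example_password",
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="RAW",
)

try:
    cur = conn.cursor()
    cur.execute("""
        COPY INTO raw.orders
        FROM @raw.orders_stage
        FILE_FORMAT = (TYPE = PARQUET)
        MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
    """)
    print(cur.fetchall())   # per-file load results
finally:
    conn.close()
```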
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
karnataka
On-site
A career in our Advisory Acceleration Centre is the natural extension of PwC's leading-class global delivery capabilities. We provide premium, cost-effective, high-quality services that support process,
Posted 1 month ago
0.0 - 3.0 years
0 Lacs
noida, uttar pradesh
On-site
We are seeking a Data Engineer Intern or Trainee with the following key skills:
- Proficiency in SQL database tuning and performance optimization (see the sketch after this listing)
- Experience with Airflow implementation using Python or Scala
- Strong knowledge of Python and PySpark
- Familiarity with AWS Redshift, Snowflake, or Databricks for data warehousing
- Ability to work with ETL services in AWS such as EMR, Glue, S3, and Redshift, or similar services in GCP or Azure

This opportunity is open to both freshers and individuals with up to 1 year of experience. Comprehensive on-the-job training will be provided for freshers. Candidates with a B.Tech background and no prior IT experience are also encouraged to apply.

Job Types: Full-time, Permanent, Fresher
Benefits:
- Paid sick time
- Performance bonus
Schedule:
- Monday to Friday
Experience:
- Total work: 1 year (preferred)
Work Location: In person
Expected Start Date: 04/08/2025
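An illustrative sketch of the index-driven query tuning mentioned above, using SQLite's EXPLAIN QUERY PLAN to show a full table scan turning into an index search. The table and query are hypothetical; the same reasoning applies to larger engines such as Redshift or Snowflake.

```python
# Demonstrates how an index changes the plan from a full scan to an index search.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, user_id INT, ts TEXT)")

query = "SELECT * FROM events WHERE user_id = 42"
print(conn.execute("EXPLAIN QUERY PLAN " + query).fetchall())  # SCAN events

conn.execute("CREATE INDEX idx_events_user ON events(user_id)")
print(conn.execute("EXPLAIN QUERY PLAN " + query).fetchall())  # SEARCH ... USING INDEX
```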
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
maharashtra
On-site
You will be responsible for developing consumer-facing web/app products using Node.js and React.js. Your primary focus will be on databases such as MongoDB (expert level) and PostgreSQL, along with queue systems such as Kafka and job schedulers such as Bull. You should have expertise in infrastructure technologies such as Docker and Kubernetes (K8s), experience working with large datasets, and expertise in logging, tracing, and application monitoring.

You must have hands-on experience in JavaScript and Node.js, with knowledge of frameworks such as Express.js, Koa.js, or Socket.io. Proficiency in async programming using callbacks, Promises, and async/await is essential. You should also be familiar with frontend technologies including HTML, CSS, and AJAX, along with databases such as MongoDB, Redis, and MySQL. A good understanding of data structures, algorithms, and operating systems is required. Experience with AWS services such as EC2, ELB, Auto Scaling, CloudFront, and S3 is preferred. While experience with the frontend stack and Vue.js would be beneficial, you will have the opportunity to learn new tools with guidance and resources.

The role is full-time and offers permanent employment. At least 4 years of experience in HTML, MongoDB, MySQL, AWS, and Express.js is required. The work location is in person.
Posted 1 month ago
6.0 - 10.0 years
0 Lacs
haryana
On-site
The role of .NET Backend Developer (AWS) requires a professional with 6-8 years of experience to join our technology services client's team in Hyderabad or Gurugram on a contract basis. There is a strong possibility of conversion to full-time employment after the initial contract period, and the notice period for this position ranges from immediate to 15 days.

Your primary skills should include proficiency in C#, experience designing, developing, and integrating RESTful APIs using ASP.NET Web API, and a basic understanding of AWS and GenAI technologies.

Secondary requirements include expertise in cloud services such as AWS Lambda for serverless .NET functions, API Gateway for secure API exposure, S3 for object storage, DynamoDB for NoSQL databases, RDS for relational databases such as SQL Server or PostgreSQL, SQS for message queuing, and SNS for notifications and pub/sub. Familiarity with GenAI technologies, including basic prompt engineering, consuming GenAI APIs such as OpenAI and Gemini, and understanding LLM use cases in enterprise apps, is desirable.

Knowledge of DevOps practices and tools such as GitHub for version control and GitHub Actions for CI/CD, optional but valuable experience with Docker, and proficiency in logging and monitoring tools such as CloudWatch are highly beneficial for this role.

If you meet the above requirements and are interested in this opportunity, kindly share your updated resume with sai.a@s3staff.com.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
chennai, tamil nadu
On-site
As a Data Engineer - PySpark at Barclays, you will play a pivotal role in advancing our infrastructure and deployment pipelines, fostering innovation and operational excellence. You will leverage state-of-the-art technology to develop and manage robust, scalable, and secure infrastructure, ensuring the seamless delivery of our digital solutions.

To excel in this role, you should possess expertise in PySpark and Python, along with a strong foundation in SQL. You must be proficient in writing and debugging code, and be a quick learner with exceptional analytical and problem-solving skills. Effective written and verbal communication skills are essential.

Additional highly valued skills include knowledge of AWS cloud services such as S3, Glue, Athena, Lake Formation, and CloudFormation; familiarity with SCM tools such as Git; previous experience in the banking or financial services sector; and exposure to technologies such as Databricks, Snowflake, Starburst, and Iceberg.

You may be assessed on critical skills relevant to success in the role, such as risk and controls management, change and transformation facilitation, business acumen, strategic thinking, and digital and technology proficiency. The position is based in Chennai.

Your primary objective is to build and maintain systems for collecting, storing, processing, and analyzing data, including data pipelines, data warehouses, and data lakes, ensuring the accuracy, accessibility, and security of all data.

Key Accountabilities:
- Develop and maintain data architecture pipelines that facilitate the transfer and processing of durable, complete, and consistent data.
- Design and implement data warehouses and data lakes that can manage appropriate data volumes and velocity while adhering to required security measures.
- Create processing and analysis algorithms tailored to the complexity and volume of the intended data.
- Collaborate with data scientists to build and deploy machine learning models.

Analyst Expectations:
- Execute assigned tasks promptly and to a high standard, driving continuous improvement.
- Demonstrate in-depth technical knowledge and experience in the designated area of expertise.
- Lead and supervise a team, providing guidance and professional support, allocating work, and coordinating resources.
- Take responsibility for team operations and activities, ensuring adherence to policies and procedures, risk mitigation, and regulatory compliance.
- Actively engage with stakeholders, influence decision-making, and take ownership of risk management and control enhancement.
- Foster cross-functional collaboration, maintain organizational awareness, and contribute to achieving organizational objectives by resolving problems and guiding team members.

In alignment with the Barclays values of Respect, Integrity, Service, Excellence, and Stewardship, you are expected to uphold a moral compass and exhibit the Barclays Mindset of Empower, Challenge, and Drive in your conduct.
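A brief sketch of the data-quality validation work that pipelines like these typically include, in PySpark. The dataset and rules are hypothetical; a production job would read from a lake or warehouse rather than an inline DataFrame.

```python
# PySpark data-quality pass: count null keys and negative amounts.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()

df = spark.createDataFrame(
    [("t1", 100.0), ("t2", None), (None, 50.0)],   # hypothetical transactions
    ["txn_id", "amount"],
)

checks = df.select(
    F.sum(F.col("txn_id").isNull().cast("int")).alias("null_txn_ids"),
    F.sum((F.col("amount") < 0).cast("int")).alias("negative_amounts"),
    F.count("*").alias("row_count"),
)
checks.show()   # one-row summary a pipeline could alert on
```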
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
You are a Senior Data Platform Engineer responsible for leading the design, development, and optimization of the data platform infrastructure. Your primary focus will be on driving scalability, reliability, and performance across data systems to enable data-driven decision-making at scale. You will work closely with data engineers, analysts, and product teams to enhance the overall data platform.

Responsibilities:
- Architect and implement scalable, secure, and high-performance data platforms on the AWS cloud using Databricks.
- Build and manage data pipelines and ETL processes using modern data engineering tools such as AWS RDS, REST APIs, and S3-based ingestion.
- Monitor and maintain production data pipelines and work on enhancements.
- Optimize data systems for improved performance, reliability, and cost efficiency.
- Implement data governance, quality, and observability best practices in line with Freshworks standards.
- Collaborate with cross-functional teams to support diverse data needs.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Good exposure to data structures and algorithms, coupled with proven backend development experience using Scala, Spark, or Python.
- Strong understanding of REST API development, web services, and microservices architecture.
- Experience with Kubernetes and containerized deployment is a plus.
- Proficiency with relational databases such as MySQL and PostgreSQL.
- Solid understanding of and hands-on experience with AWS cloud services.
- Knowledge of code versioning tools such as Git and Jenkins.
- Excellent problem-solving skills, critical thinking, and keen attention to detail.
Posted 1 month ago
5.0 - 10.0 years
0 Lacs
karnataka
On-site
As a software developer, you will work in a constantly evolving environment driven by technological advances and the strategic direction of the organization. Your primary responsibilities will include creating, maintaining, auditing, and enhancing systems to meet specific needs, often based on recommendations from systems analysts or architects. You will test both hardware and software systems to identify and resolve faults, write diagnostic programs, and design and develop code for operating systems and software to ensure optimal efficiency, providing recommendations for future developments where necessary.

Joining us offers numerous benefits, including the opportunity to work on challenging projects and solve complex technical problems, rapid career growth, and the chance to assume leadership roles. Our mentorship program allows you to learn from experienced mentors and industry experts, while global opportunities enable you to collaborate with clients from around the world and gain international experience. We offer competitive compensation packages and benefits. If you are passionate about technology and interested in working on innovative projects with a skilled team, a career as an Infosys Power Programmer could be an excellent choice for you.

Mandatory Skills:
- Proficiency in AWS Glue, AWS Redshift/Spectrum, S3, API Gateway, Athena, Step Functions, and Lambda functions.
- Experience with Extract, Transform, Load (ETL) and Extract, Load, Transform (ELT) data integration patterns.
- Expertise in designing and constructing data pipelines.
- Development experience in one or more object-oriented programming languages, preferably Python.

Job Specifications:
- At least 5 years of hands-on experience developing, testing, deploying, and debugging Spark jobs using Scala on the Hadoop platform.
- Profound knowledge of Spark Core, including work with RDDs and Spark SQL.
- Familiarity with Spark optimization techniques and best practices.
- Strong understanding of Scala functional programming concepts such as Try, Option, Future, and Collections.
- Proficiency in Scala object-oriented programming, covering classes, traits, objects (singleton and companion), and case classes.
- Sound knowledge of Scala language features, including the type system and implicits/givens.
- Hands-on experience in the Hadoop environment (HDFS/Hive), AWS S3, and EMR.
- Proficiency in Python programming.
- Working experience with workflow orchestration tools such as Airflow (preferred) and Oozie.
- Experience making API calls in Scala.
- Exposure to file formats such as Apache Avro, Parquet, and JSON.
- Desirable: knowledge of Protocol Buffers and geospatial data analytics.
- Ability to write test cases using frameworks such as ScalaTest.
- Good understanding of build tools such as Gradle and SBT.
- Experience using Git, resolving conflicts, and working with branches.
- Strong programming skills with a focus on data structures and algorithms.
- Excellent analytical and communication skills.

Candidates applying for this position should have 7-10 years of industry experience and a BE/B.Tech in Computer Science or an equivalent qualification.
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
You have 5+ years of experience designing and building data pipelines using Apache Spark, Databricks, or equivalent big data frameworks, and hands-on expertise with streaming and messaging systems such as Apache Kafka, Confluent Cloud, RabbitMQ, or Azure Event Hub, including creating producers, consumers, and topics and integrating them into downstream processing.

You have a deep understanding of relational databases and Change Data Capture (CDC): proficiency in SQL Server, Oracle, or other RDBMSs, and experience capturing change events using tools such as Debezium or native CDC tooling and transforming them for downstream consumption.

Your proficiency extends to programming languages such as Python, Scala, or Java, along with solid knowledge of SQL for data manipulation and transformation. You also have cloud platform expertise, including experience with Azure or AWS services for data storage, compute, and orchestration (e.g., ADLS, S3, Azure Data Factory, AWS Glue, Airflow, Databricks, DLT).

Furthermore, you have knowledge of data modeling and warehousing, including familiarity with data lakehouse architectures, Delta Lake, partitioning strategies, and performance optimization, and you are well versed in version control and DevOps practices, with experience in Git and CI/CD pipelines and the ability to automate deployment and manage infrastructure as code.
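A minimal sketch of consuming CDC-style change events from a Kafka topic, as described above, using the confluent-kafka client. The broker address, topic, and consumer group are hypothetical placeholders; with Debezium, the message value carries before/after row images.

```python
# Consume change events from a Kafka topic using confluent-kafka.
# Broker address, topic, and consumer group are hypothetical placeholders.
import json
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "cdc-downstream",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["dbserver1.inventory.orders"])  # Debezium-style topic name

try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None:
            continue
        if msg.error():
            print("consumer error:", msg.error())
            continue
        event = json.loads(msg.value())
        # For Debezium events, event["payload"]["after"] holds the new row image.
        print("change event:", event)
finally:
    consumer.close()
```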
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
kolkata, west bengal
On-site
As a Java PHP Developer with 5-7 years of experience, your primary responsibility will be to design, develop, and maintain backend systems and RESTful APIs using Java (Spring Boot) and PHP (Laravel/Core). You will also architect and implement microservices and cloud-native solutions on AWS. Managing databases including MySQL, MongoDB, and Redis, and integrating messaging services such as SQS and SES for asynchronous workflows, will be crucial aspects of the role. Collaboration with front-end teams for API integration and support is expected, along with optimizing application performance, security, and scalability. You will use Git for version control, manage tasks through JIRA, and maintain documentation in Confluence.

To excel in this role, you must possess proficiency in PHP (Laravel/Core) and Java (Spring Boot, Hibernate), along with experience in Node.js and REST API development. Hands-on experience with MySQL, MongoDB, Redis, and Solr is essential, as is practical knowledge of AWS services such as EC2, SQS, SES, RDS, S3, and API Gateway. Understanding microservices architecture and API-first development principles is critical. Familiarity with HTML, JavaScript, and modern web integration practices is advantageous, and strong debugging, problem-solving, and analytical skills are a must.

Good-to-have skills include exposure to React.js, Tailwind CSS, or full-stack JavaScript development; familiarity with Docker, CI/CD pipelines, and container orchestration; experience with GraphQL, RabbitMQ, or Elasticsearch; knowledge of ETL and BI tools such as Apache Superset or Metabase; monitoring experience with Splunk, New Relic, or CloudWatch; and Agile (Scrum/Kanban) experience.

If you are interested in this opportunity, please submit your resume and a cover letter detailing your relevant experience, with "Java PHP Specialist Application" in the subject line. Apply now to be considered for this role.
Posted 1 month ago
14.0 - 20.0 years
0 Lacs
maharashtra
On-site
As a Principal Architect - Data & Cloud at Quantiphi, you will bring 14-20 years of experience in technical, solutioning, and analytical roles to lead the architecture, design, and implementation of end-to-end data pipelines and data integration solutions for structured and unstructured data sources and targets. With a focus on cloud platforms such as GCP, AWS, and Azure, you will be responsible for building and managing Data Lake, Data Warehouse, Data Integration, Data Migration, and Business Intelligence/Artificial Intelligence solutions.

Your role will involve understanding business requirements and translating them into functional and non-functional areas, defining boundaries in terms of availability, scalability, performance, security, and resilience. You will leverage your expertise in various data integration and ETL technologies on the cloud, including Spark, PySpark/Scala, Dataflow, and DataProc, and will also have the opportunity to work with traditional ETL tools such as Informatica, DataStage, OWB, and Talend.

Your deep knowledge of cloud and on-premise databases such as Cloud SQL, Cloud Spanner, Big Table, RDS, and Aurora will be instrumental in architecting scalable data warehouse solutions on cloud platforms such as BigQuery or Redshift. Moreover, your exposure to NoSQL databases and experience with data integration, storage, and data pipeline tool sets will be crucial in designing optimized data analytics solutions.

As a thought leader in the architecture, design, and development of cloud data analytics solutions, you will collaborate with internal and external stakeholders to present solutions, support sales teams in building proposals, and lead discovery workshops with potential customers globally. Your role will also involve mentoring young talent, contributing to building assets and accelerators, and ensuring the successful delivery of projects on the parameters of schedule, quality, and customer satisfaction.

The position offers the experience of working in a high-growth startup in the AI, Decision Science, and Big Data domain, along with the opportunity to be part of a diverse and proactive team that constantly raises the bar in translating data into tangible business value for clients. Flexible remote working options are available to foster productivity and work-life balance.

If you are passionate about innovation, excellence, and growth, and enjoy working with a dynamic team of tech enthusiasts, Quantiphi is the place to shape your career in Data & Cloud architecture. Join us on our journey of digital transformation and be part of creating impactful solutions that drive business success.
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
hyderabad, telangana
On-site
Join a dynamic leader in the cloud data engineering sector, specializing in advanced data solutions and real-time analytics for enterprise clients. This on-site role in India offers the chance to work on cutting-edge AWS infrastructure where innovation is at the forefront of business transformation. The ideal candidate is a professional with 4+ years of proven experience in AWS data engineering, Python, and PySpark, who will play a crucial role in designing, optimizing, and maintaining scalable data pipelines that drive business intelligence and operational efficiency.

Responsibilities:
- Design, develop, and maintain robust AWS-based data pipelines using Python and PySpark.
- Implement efficient ETL processes, ensuring data integrity and optimal performance across AWS services such as S3, Glue, EMR, and Redshift.
- Collaborate with cross-functional teams to integrate data engineering solutions within broader business-critical applications.
- Troubleshoot and optimize existing data workflows to ensure the high availability, scalability, and security of cloud solutions.
- Exercise best practices in coding, version control, and documentation to maintain a high standard of engineering excellence.

Required Skills and Qualifications:
- 4+ years of hands-on experience in AWS data engineering with expertise in Python and PySpark.
- Proficiency in developing and maintaining ETL processes using AWS services such as S3, Glue, EMR, and Redshift.
- Strong problem-solving skills and a deep understanding of data modeling, data warehousing concepts, and performance optimization.

Preferred Qualifications:
- Experience with AWS Lambda, Airflow, or similar cloud orchestration tools.
- Familiarity with containerization, CI/CD pipelines, and infrastructure as code (CloudFormation, Terraform).
- AWS certifications or equivalent cloud credentials.

You will work in a collaborative, fast-paced environment that rewards innovation and continuous improvement, with opportunities for professional growth and skill development through ongoing projects and training, competitive compensation, and the ability to work on transformative cloud technology solutions.
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
haryana
On-site
You will work as a Data Platform Engineer based in Gurgaon, with a minimum of 4 years of experience. In this role, you will be responsible for managing AWS environments, ensuring high performance, security, and availability. Your expertise in AWS SysOps, AWS DMS for database migrations, data processing pipelines, and Infrastructure as Code (Terraform) will be essential. You will collaborate with data engineering, analytics, and DevOps teams to deliver scalable solutions for enterprise-level data platforms.

Your responsibilities will include designing, configuring, and maintaining AWS DMS, developing data workflows, implementing infrastructure as code using Terraform, monitoring system health, and ensuring compliance with security and disaster recovery best practices.

To qualify, you should have at least 4 years of experience in cloud infrastructure and data platform operations. Proficiency in AWS SysOps, AWS DMS, ETL/data processing pipelines, Terraform, and other AWS services such as EC2, S3, RDS, IAM, CloudWatch, and Lambda is required, along with strong troubleshooting, analytical, and communication skills. Experience with containerization, CI/CD pipelines, DevOps practices, and big data tools will be considered advantageous. A bachelor's degree in Computer Science, Information Technology, or a related field is preferred.

This is a full-time on-site position at the Gurgaon office, where your expertise in AWS, data platforms, and infrastructure automation will play a vital role in delivering robust and scalable data solutions.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
As a Data Engineer (AWS) at our company, you will bring 3-5 years of experience, with a minimum of 3 years focused specifically on AWS cloud services. Your main responsibilities will revolve around providing technical support for data engineering systems, including troubleshooting issues related to data pipelines, AWS services, Snowflake, Hadoop, and Spark.

Your day-to-day tasks will include investigating and resolving issues in data pipelines, optimizing Spark jobs, managing incidents related to data processing, and collaborating with cross-functional teams to address critical issues promptly. A solid understanding of big data architectures such as Hadoop, Spark, Kafka, and Hive is required.

To excel in this role, you must have hands-on experience with Hadoop and with Spark and Python on AWS, knowledge of Terraform templates for infrastructure provisioning, and at least 3 years of experience in BI/DW development with data model architecture and design. Familiarity with CI/CD implementation, scheduling tools and techniques on Hadoop/EMR, and best practices in cloud-based data engineering and support will be highly beneficial.

Technical essentials include proven experience providing technical support for data engineering systems; a strong understanding of AWS services such as S3, Glue, Redshift, EMR, Lambda, Athena, and Step Functions; hands-on experience supporting Snowflake, Hadoop, Spark, and Python in a production environment; and excellent problem-solving, analytical, and communication skills for working effectively with cross-functional teams.

Preferred qualifications include the AWS Certified Solutions Architect - Associate certification. As a self-motivated team player with strong analytical skills and effective communication abilities, you will thrive in our dynamic and passionate work environment. If you are looking to work with a team of enthusiastic professionals and enjoy continuous growth, this position is perfect for you.
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
noida, uttar pradesh
On-site
We are seeking a customer-obsessed, analytical Senior Staff Engineer to lead the development and expansion of our Tax Compliance product suite. As a key member of the team, you will be instrumental in creating cutting-edge digital solutions that streamline and automate tax filing, reconciliation, and compliance processes for businesses of all sizes. Join our rapidly growing company and immerse yourself in a dynamic, competitive market where you can play a pivotal role in helping businesses fulfill their statutory obligations efficiently, accurately, and confidently.

Key Responsibilities:
- Lead a high-performing engineering team or serve as a hands-on technical lead.
- Spearhead the design and implementation of scalable backend services using Python, drawing on expertise in Django, FastAPI, and task orchestration systems.
- Own and enhance our CI/CD pipelines in Jenkins to ensure swift, secure, and dependable deployments.
- Architect and oversee infrastructure using AWS and Terraform with a DevOps-centric approach.
- Collaborate closely with product managers, designers, and compliance specialists to deliver features that make tax compliance seamless for our users.
- Apply proficiency in containerization tools such as Docker and orchestration with Kubernetes.
- Bring background knowledge in security, observability, or compliance automation.

Requirements:
- Over 5 years of software engineering experience, with a minimum of 2 years in a leadership or principal-level position.
- Deep proficiency in Python/Node.js, encompassing API development, performance enhancement, and testing.
- Experience with event-driven architecture and Kafka/RabbitMQ-like technologies.
- Strong familiarity with AWS services such as ECS, Lambda, S3, RDS, and CloudWatch.
- Solid grasp of Terraform for managing infrastructure as code.
- Proficiency in Jenkins or similar CI/CD tools.
- Ability to balance technical leadership with hands-on coding and creative problem-solving.
- Excellent communication skills and a collaborative approach.

This opportunity has been shared by Parvinder Singh from Masters India.
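A minimal FastAPI sketch of the backend-service shape this role describes, validating a request with Pydantic and returning a result. The endpoint, payload model, and field names are hypothetical.

```python
# Minimal FastAPI service: validate input with Pydantic, return a result.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class FilingRequest(BaseModel):        # hypothetical compliance payload
    gstin: str
    period: str                        # e.g. "2024-07"

@app.post("/filings")
def create_filing(req: FilingRequest) -> dict:
    # placeholder: enqueue the filing for async processing (e.g. via Kafka)
    return {"status": "queued", "gstin": req.gstin, "period": req.period}

# Run with: uvicorn app:app --reload   (assuming this file is app.py)
```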
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
haryana
On-site
As a Frontend Engineer at CodeChavo, you will be a key player in building world-class responsive web applications, bringing a passion for user experience and real-time performance to a fast-paced startup environment.

Your responsibilities will include developing responsive, high-performance frontend applications using React.js and TypeScript, and implementing real-time features and messaging experiences using WebSockets/Socket.io. Collaborating closely with design and product teams, you will translate product requirements into scalable UI components. Leading by example, you will guide frontend architecture, conduct code reviews, mentor junior developers, and foster a strong engineering culture. You will take ownership of end-to-end delivery, handling development, deployment, monitoring, and error tracking, using rapid prototyping to deliver fast without compromising code quality. Your contribution to project planning and estimation will be crucial as you collaborate cross-functionally to meet release timelines, applying a strong sense of design and UI aesthetics to ensure polished, accessible, and user-friendly interfaces.

To be successful in this role, you should have at least 5 years of frontend development experience, preferably in SaaS or high-scale applications. Deep expertise in React.js, state management tools such as Redux or Zustand, and real-time communication technologies is essential, along with proficiency in modern frontend tooling, frontend security practices, Git workflows, and a solid data structures and algorithms foundation.

If you are ready to make a real impact in the digital transformation space and contribute to building quality tech teams, we would love to have you on board at CodeChavo.
Posted 1 month ago