
153 Step Functions Jobs - Page 3

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

6.0 - 10.0 years

0 Lacs

Hyderabad, Telangana

On-site

At Lilly, the global healthcare leader headquartered in Indianapolis, Indiana, we are dedicated to uniting caring with discovery to make life better for people around the world. Our 35,000 employees are committed to discovering and delivering life-changing medicines, improving disease management, and contributing to our communities through philanthropy and volunteerism. At Lilly, we prioritize putting people first and giving our best effort to our work.

The LCCI Tech Team is currently seeking a motivated leader for the Enterprise Data Organization. This individual will play a crucial role in optimizing data extraction, enhancing data infrastructure, and aligning operations with organizational objectives. The ideal candidate will be responsible for driving operational excellence, ensuring compliance, and fostering collaboration across teams and stakeholders. As a senior leader, this individual will serve as a strategic business partner in advancing data and analytics services to provide business-driven insights and maintain a competitive edge.

The Enterprise Data organization has developed an integrated data and analytics platform that enables Lilly team members to efficiently ingest, transform, consume, and analyze data sets across various environments. Contributors can easily prepare, analyze, and publish new data sets for the benefit of others within the organization.

Key responsibilities for this role include designing, building, testing, and deploying high-performance and scalable data pipelines and consumption solutions in AWS. This involves ensuring optimal storage, retrieval, and processing of data, as well as maintaining data integrity, security, and privacy. The successful candidate will also be tasked with developing reusable components to accelerate enterprise data delivery, conducting comprehensive system testing, and actively participating in SAFe Agile framework ceremonies.

Basic qualifications for this position include a Bachelor's degree in computer science, information technology, management information systems, or related fields, along with 6 to 10 years of development experience in tools such as SQL, Python, and AWS services. Additionally, candidates should have exposure to Agile development, code deployment using GitHub and CI/CD pipelines, and experience in data design, modeling, and management.

In addition to technical expertise, a strong commitment to good software design principles, an understanding of cloud and hybrid data architecture concepts, excellent written and oral communication skills, and a willingness to learn are essential attributes for prospective candidates.

If you are passionate about leveraging technology to drive innovation in the healthcare industry and are committed to making a meaningful impact, we invite you to join us at Lilly. Together, we can continue doing extraordinary things and making a difference in the lives of people worldwide.

Lilly is an equal opportunity employer dedicated to creating an inclusive and diverse workforce. We are committed to providing individuals with disabilities equal opportunities for employment. If you require accommodation during the application process, please complete the accommodation request form on our website for further assistance.
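
For illustration only (not part of the posting): a minimal sketch of the kind of AWS pipeline step such a role typically builds, using boto3 to stage a newly landed raw file into a curated S3 prefix. The bucket layout and key prefixes are hypothetical.

```python
import json
import boto3

s3 = boto3.client("s3")

def handler(event, context):
    """S3-triggered Lambda: copy a newly landed raw object into a curated prefix.

    Assumes the function is wired to s3:ObjectCreated events; all names are
    illustrative, not taken from the job posting.
    """
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]          # e.g. raw/orders/2024-06-01.json
        curated_key = key.replace("raw/", "curated/", 1)
        s3.copy_object(
            Bucket=bucket,
            CopySource={"Bucket": bucket, "Key": key},
            Key=curated_key,
        )
    return {"statusCode": 200, "body": json.dumps("ok")}
```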

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

As a hands-on and strategic Solution/Technical Architect, you will be responsible for leading the design and implementation of AWS-based integration solutions for large-scale enterprise environments. Your main focus will be on building scalable, secure, and high-performing architectures to connect ERP systems (JD Edwards), eCommerce platforms, PIM/DAM ecosystems, and external partners using AWS-native services, real-time APIs, and EDI workflows. You will collaborate closely with offshore and onshore teams to ensure the delivery of modern integration solutions with architectural rigor and excellence.

Your key responsibilities will include designing and implementing integration architectures on AWS, including data pipelines, APIs, and event-driven workflows. You will take the lead in real-time and batch integrations across ERP (JDE), eCommerce, PIM/DAM, and external partner systems, using AWS-native services such as API Gateway, Lambda, EventBridge, Step Functions, SQS/SNS, and S3 to build modular, cloud-native solutions. Additionally, you will define interface specifications, architecture diagrams, deployment models, and technical documentation to ensure that technical solutions meet performance, scalability, security, and compliance standards. You will also support EDI-style integrations using custom services or third-party tools hosted on AWS, aligning with established architecture patterns.

The ideal candidate for this role should have at least 10 years of experience in solution architecture and enterprise integration, with a minimum of 3 years of hands-on experience specifically on AWS. Deep expertise with AWS-native integration services like Lambda, Step Functions, API Gateway, EventBridge, CloudWatch, S3, and IAM is required. Proven experience in integrating with JD Edwards (JDE) and commerce platforms, along with a solid understanding of event-driven architecture, serverless patterns, and API lifecycle management, will be essential. Strong knowledge of message transformation (XML, JSON, XSLT), asynchronous messaging, and security enforcement is also necessary, alongside excellent communication skills and the ability to document and present technical solutions clearly.

In return, we offer you a lead architecture role on mission-critical AWS-based integration programs, exposure to complex enterprise modernization projects across retail and digital commerce, a clear career path towards Enterprise Architect or Platform Practice Lead roles, and an engineering-first culture that prioritizes quality, innovation, and platform excellence.
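
For illustration only (not part of the posting): a minimal sketch of an event-driven integration step of the kind described above, in which a Lambda handler publishes an order event to EventBridge for downstream consumers. The bus name, source, and payload fields are hypothetical.

```python
import json
import boto3

events = boto3.client("events")

def handler(event, context):
    """Publish an 'OrderCreated' integration event to a custom EventBridge bus.

    All names (bus, source, detail-type) are illustrative assumptions,
    not taken from the job posting.
    """
    order = json.loads(event.get("body", "{}"))
    events.put_events(
        Entries=[{
            "EventBusName": "integration-bus",
            "Source": "ecommerce.orders",
            "DetailType": "OrderCreated",
            "Detail": json.dumps({"orderId": order.get("id"), "status": "NEW"}),
        }]
    )
    return {"statusCode": 202, "body": json.dumps({"accepted": True})}
```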

Posted 1 month ago

Apply

6.0 - 10.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

You should have at least 6 years of experience in Java, JDBC, and AWS services such as Lambda, ECS, API Gateway, RDS, SQS, SNS, DynamoDB, MQ, and Step Functions. It is important to be familiar with Terraform, SQL, PL/SQL, Jenkins, GitLab, and other standard development tools. Immediate joiners are preferred for this role.

Posted 1 month ago

Apply

7.0 - 8.0 years

5 - 7 Lacs

Hyderabad, Telangana, India

On-site

Key Responsibilities:
- Collaborate with stakeholders to identify, scope, and execute data-driven business solutions.
- Extract, clean, and analyze large datasets from internal databases.
- Develop custom data models, machine learning algorithms, and predictive models to address key business challenges.
- Operationalize models into structured software programs or business processes.
- Monitor model performance and ensure continuous improvement and data integrity.
- Assess and improve the accuracy of data sources and data collection methods.
- Implement recommendations derived from data analysis into business operations.
- Communicate results and insights effectively with both technical and non-technical stakeholders.
- Provide post-implementation support to track KPIs and ensure outcomes are achieved.

Key Performance Indicators (KPIs):
- Operational and Financial Impact: Measured improvements in time or cost savings following model or tool implementation.
- Stakeholder Satisfaction: Feedback scores and qualitative input from end users regarding usability and value of delivered solutions.

Qualifications and Education Requirements:
- Bachelor's degree (BSc/BTech) in Applied Sciences, Mathematics, Statistics, Computer Science, or a related field.
- Completion of Year 2 Statistics coursework is required.
- Internship experience (min. 2 months) OR recognized certifications in data science, analytics, or related fields.

Preferred Technical Skills:
- Programming: Proficiency in Python or R (data cleaning, modeling, statistical analysis).
- Data Handling: Strong skills in SQL, Power BI (DAX), and VBA.
- Machine Learning: Familiarity with techniques such as clustering, decision trees, neural networks, etc.
- Statistical Concepts: Regression, distributions, statistical tests, and real-world application.
- Data Architecture: Experience working with structured data models and data pipelines.

Soft Skills and Competencies:
- Excellent analytical, problem-solving, and critical thinking skills.
- Strong written and verbal communication skills.
- Ability to collaborate across teams and present data insights to stakeholders.
- High attention to detail and ability to manage multiple projects simultaneously.
- Negotiation and persuasion abilities when influencing data-driven decisions.
- Positive attitude with a never-give-up mindset.
- Adaptability and flexibility in dynamic environments.
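
For illustration only (not part of the posting): a minimal predictive-modeling sketch in Python with scikit-learn, of the kind implied by the responsibilities above. The CSV path, column names, and target label are hypothetical.

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report

# Hypothetical dataset: customer features plus a binary churn label.
df = pd.read_csv("customers.csv")
X = df[["tenure_months", "monthly_spend", "support_tickets"]]
y = df["churned"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

# Scale features, then fit a simple baseline classifier.
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
model.fit(X_train, y_train)

print(classification_report(y_test, model.predict(X_test)))
```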

Posted 1 month ago

Apply

10.0 - 12.0 years

5 - 7 Lacs

Chennai, Tamil Nadu, India

On-site

Key Responsibilities:

Design & Implementation
- Lead the design and development of cloud-native applications with a focus on serverless architectures and event-driven systems.
- Architect solutions using core AWS services, ensuring scalability, resilience, and cost-efficiency.

AWS Services Expertise
- Hands-on development with services such as AWS Lambda, API Gateway, S3, DynamoDB, Step Functions, SQS, AppSync, Amazon Pinpoint, Cognito, EventBridge, KMS, CloudWatch Logs, and X-Ray.

Infrastructure as Code (IaC)
- Implement and manage IaC using AWS CDK.
- Automate deployments via AWS CodePipeline or similar CI/CD tooling.
- Ensure infrastructure is consistent, version-controlled, and maintainable.

Serverless & Event-Driven Architecture
- Champion serverless patterns and decoupled design using event-driven workflows.
- Integrate systems via SQS, SNS, EventBridge, Step Functions, and API Gateway.

Monitoring & Observability
- Design and implement observability strategies using CloudWatch Logs, AWS X-Ray, and custom metrics.
- Proactively identify and resolve performance or availability issues.

Security & Compliance
- Enforce security best practices, including fine-grained IAM role and policy management, PII/PHI data tagging and encryption (HIPAA compliance), and secure configurations via Cognito, KMS, and isolation patterns.

Cost Optimization
- Optimize cloud spend via S3 lifecycle policies, serverless pricing models, and comparisons of tools like Amazon Pinpoint vs. SES.

Scalability & Resilience
- Design fault-tolerant systems with auto-scaling, dead-letter queues (DLQs), retry and backoff strategies, and circuit breakers with fallback mechanisms.

CI/CD & DevOps
- Collaborate on CI/CD pipeline design for streamlined, automated delivery.
- Ensure robust deployment strategies and rollback plans are in place.

Documentation & Workflow Design
- Maintain high-quality documentation for architecture diagrams, workflow processes, and operational procedures.

Cross-Functional Collaboration
- Partner with developers, QA, product managers, and business stakeholders to align on goals and deliverables.

AWS Best Practices
- Ensure all architectural decisions and deployments follow the AWS Well-Architected Framework and security/compliance standards.

Required Skills & Experience:
- 5+ years of experience in AWS cloud engineering or solution architecture roles.
- Expertise in AWS Lambda, S3, DynamoDB, Step Functions, API Gateway, SQS, AppSync, EventBridge, Cognito, Pinpoint, and KMS.
- Advanced IaC skills with AWS CDK (TypeScript or Python preferred).
- Strong understanding of CI/CD processes and tools like CodePipeline.
- In-depth knowledge of event-driven design, microservices, and serverless frameworks.
- Experience with security and compliance implementation, especially HIPAA-related practices.
- Proven record in monitoring/observability and performance tuning.
- Excellent written documentation and cross-functional collaboration skills.

Preferred Qualifications:
- AWS Certification (e.g., Solutions Architect, DevOps Engineer).
- Experience with API-first development and tools like Postman and Swagger/OpenAPI.
- Familiarity with container technologies (e.g., Docker, ECS, Fargate) is a plus.
- Knowledge of cost monitoring tools like AWS Cost Explorer or CloudHealth.
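
For illustration only (not part of the posting): a minimal AWS CDK (Python) sketch of the resilience pattern named above, wiring a Lambda consumer to an SQS queue with a dead-letter queue and a bounded receive count. Stack and resource names are hypothetical, and the handler asset path is assumed.

```python
from aws_cdk import Stack, Duration, aws_lambda as _lambda, aws_sqs as sqs
from aws_cdk.aws_lambda_event_sources import SqsEventSource
from constructs import Construct

class OrderProcessingStack(Stack):
    """Illustrative stack: SQS queue -> Lambda, with failed messages routed to a DLQ."""

    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        dlq = sqs.Queue(self, "OrdersDlq", retention_period=Duration.days(14))

        queue = sqs.Queue(
            self, "OrdersQueue",
            visibility_timeout=Duration.seconds(60),
            dead_letter_queue=sqs.DeadLetterQueue(max_receive_count=3, queue=dlq),
        )

        handler = _lambda.Function(
            self, "OrdersHandler",
            runtime=_lambda.Runtime.PYTHON_3_12,
            handler="app.handler",
            code=_lambda.Code.from_asset("lambda"),  # assumed local asset directory
            timeout=Duration.seconds(30),
        )

        handler.add_event_source(SqsEventSource(queue, batch_size=10))
```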

Posted 1 month ago

Apply

8.0 - 13.0 years

8 - 10 Lacs

Bengaluru, Karnataka, India

On-site

Job description

The Legal & Research Technology team in Bangalore provides systems development and support for the content pathways and content processing needs of Westlaw. The group oversees and executes a wide range of project types, ranging from cost-saving infrastructure to revenue-driving product development initiatives. We are looking for a highly motivated, innovative, and detail-oriented individual who will make an impact by contributing to the team's development needs right away. The key area of focus for this position is serving as a software engineer on a multi-year project to deliver new and re-engineered systems using AWS and its capabilities, with excellent proficiency in Python, Groovy, JavaScript, and Angular 6+.

In this opportunity as a Software Engineer, you will:
- Develop high-quality code and scripts across the areas below
- Work with the Python programming language and XSLT transformation
- Use AWS services such as Lambda, Step Functions, CloudWatch, CloudFormation, S3, DynamoDB, PostgreSQL, Glue, etc.
- Work hands-on with custom template creation and LocalStack deployments
- Use GitHub Copilot functionality on the job for quicker turnaround
- Apply working knowledge of Groovy, JavaScript, and/or Angular 6+ (good to have)
- Work with XML content
- Write Lambdas for AWS Step Functions
- Adhere to best practices for development in Python, Groovy, JavaScript, and Angular
- Write functional unit test cases for requirements in Python, Groovy, JavaScript, and Angular
- Actively participate in code reviews of your own and your peers' work
- Work with different AWS capabilities
- Understand integration points of upstream and downstream processes
- Learn new frameworks needed for implementation
- Maintain and update the Agile/Scrum dashboard for accurate tracking of your tasks
- Proactively pick up tasks and work toward completing them on aggressive timelines
- Understand the existing functionality of the systems and suggest improvements

About you: you're a fit for the role of Software Engineer if you have:
- Very strong OO design patterns and concepts
- 3 to 6 years of experience in the relevant technologies (must have)
- Python programming language and XSLT transformation
- AWS services such as Lambda, Step Functions, CloudWatch, CloudFormation, S3, DynamoDB, PostgreSQL, Glue, etc.
- Hands-on experience with custom template creation and LocalStack deployments
- Familiarity with GitHub Copilot functionality for quicker turnaround
- A good understanding of cloud concepts
- A strong understanding of Agile and Scrum methodologies
- Strong written and verbal communication skills
- The ability to work under pressure
- Attention to detail
- Working knowledge of some of the AWS capabilities
- Knowledge of Agile/Scrum tracking tools
- Keenness to pick up newer technologies
- A team-player attitude and the ability to interact with internal/external teams
- Adaptability and flexibility

What's in it for you:

Join us to inform the way forward with the latest AI solutions and address real-world challenges in legal, tax, compliance, and news. Backed by our commitment to continuous learning and market-leading benefits, you'll be prepared to grow, lead, and thrive in an AI-enabled future. This includes:

Industry-Leading Benefits: We offer comprehensive benefit plans including flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing.

Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, and a hybrid model, empowering employees to achieve a better work-life balance.

Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions. Our skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future.

Culture: Globally recognized and award-winning reputation for inclusion, innovation, and customer focus. Our eleven business resource groups nurture our culture of belonging across the diverse backgrounds and experiences represented across our global footprint.

Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected.

Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives.
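
For illustration only (not part of the posting): a minimal sketch of a Python Lambda of the kind mentioned above, intended as a Step Functions task that applies an XSLT transformation to an XML payload with lxml. The stylesheet path and input/output keys are hypothetical.

```python
from lxml import etree

# Hypothetical stylesheet packaged alongside the function code.
XSLT_DOC = etree.parse("transform.xslt")
TRANSFORM = etree.XSLT(XSLT_DOC)

def handler(event, context):
    """Step Functions task: transform an XML document passed in the state input.

    Expects {'xml': '<doc>...</doc>'} and returns the transformed markup so the
    next state can consume it. Keys are illustrative assumptions.
    """
    source = etree.fromstring(event["xml"].encode("utf-8"))
    result = TRANSFORM(source)
    return {"transformedXml": str(result)}
```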

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

You are a skilled Full Stack Developer, primarily focusing on backend development with Node.js and having a working knowledge of React.js. Your main responsibility will be to develop a custom enterprise platform that interfaces with SDLC tools like JIRA, Jenkins, GitLab, and others. This platform aims to streamline license and access management, automate administrative tasks, and provide robust dashboards and governance features.

To excel in this role, you should have at least 4-6 years of professional development experience. Your expertise in Node.js should cover async patterns and performance tuning. Additionally, hands-on experience with AWS Lambda and serverless architecture is essential. You must also be adept at building integrations with tools like JIRA, Jenkins, GitLab, Bitbucket, etc. Knowledge of React.js for UI development and integration is required, along with a solid understanding of RESTful APIs, Webhooks, and API security.

It would be beneficial if you have familiarity with Git and collaborative development workflows, exposure to CI/CD practices, and infrastructure as code. Experience with AWS services such as DynamoDB, S3, EventBridge, and Step Functions is a plus. Knowledge of enterprise SSO, OAuth2, or SAML is desirable. Prior experience in automating tool admin tasks and DevOps workflows will be advantageous, as well as an understanding of modern monitoring/logging tools like CloudWatch or ELK.

Working in this role, you will have the opportunity to work on a transformative platform with direct enterprise impact. You will have the freedom to innovate and contribute to the automation of key IT and DevOps functions. This role will also expose you to modern architectures including serverless, microservices, and event-driven systems. You can expect a collaborative, outcome-oriented work culture that fosters growth and success.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As an AWS Developer at Barclays, you will play a key role in shaping the digital landscape and driving innovation to deliver exceptional customer experiences. Your primary responsibility will be to leverage cutting-edge technology to enhance our digital offerings and ensure the successful delivery of the technology stack. Your strong analytical and problem-solving skills will be crucial in understanding business requirements and delivering high-quality solutions. Collaborating with a team of engineers, business analysts, and stakeholders, you will tackle complex technical challenges that require detailed analysis.

To excel in this role, you should possess expertise in AWS cloud development, core services, Infrastructure as Code (CloudFormation/Terraform), and cloud security. Proficiency in CI/CD pipelines and programming languages such as Python, Java, or Node.js is essential. Additionally, experience with Docker, Kubernetes, and AWS service integrations (API Gateway, Step Functions, SQS) will be highly valued. Holding relevant AWS certifications and demonstrating a mindset focused on continuous improvement, collaboration, and delivering customer-centric solutions will further enhance your candidacy.

In this position based in Chennai, your primary purpose will be to build and maintain systems that collect, store, process, and analyze data, including data pipelines, data warehouses, and data lakes. Your responsibilities will include designing and implementing data architectures, pipelines, data warehouses, and data lakes, developing processing and analysis algorithms, and collaborating with data scientists to deploy machine learning models.

As an AWS Developer, you will be expected to demonstrate in-depth technical knowledge, drive continuous improvement, and lead a team by guiding and supporting professional development. Your role may involve leadership responsibilities, where you are required to exhibit leadership behaviors such as listening, inspiring, aligning across the enterprise, and developing others. Alternatively, as an individual contributor, you will focus on developing technical expertise and acting as an advisor as needed. Your impact will extend to related teams within your area, as you partner with other functions and business areas to drive results. You will be accountable for operational processing, risk management, and compliance with relevant rules and regulations. By maintaining a deep understanding of how your sub-function integrates with the broader organization, you will contribute to achieving the organization's objectives.

Demonstrating the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, as well as embodying the Barclays Mindset to Empower, Challenge, and Drive, will be essential for all colleagues. Your commitment to these values and mindset will serve as a moral compass, guiding your actions and interactions within the organization.
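
For illustration only (not part of the posting): a minimal sketch of one of the AWS service integrations the role names, starting a Step Functions execution from Python with boto3. The state machine ARN, execution name, and input payload are hypothetical.

```python
import json
import uuid
import boto3

sfn = boto3.client("stepfunctions")

def start_pipeline(order_id: str) -> str:
    """Kick off a (hypothetical) data-pipeline state machine and return its execution ARN."""
    response = sfn.start_execution(
        stateMachineArn="arn:aws:states:eu-west-2:123456789012:stateMachine:data-pipeline",
        name=f"order-{order_id}-{uuid.uuid4().hex[:8]}",  # execution names must be unique
        input=json.dumps({"orderId": order_id}),
    )
    return response["executionArn"]

if __name__ == "__main__":
    print(start_pipeline("42"))
```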

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Data Engineer with 6-9 years of experience, your role will be based in Pune. You should have a minimum of 5 years of experience as a Data Engineer, along with hands-on expertise in Star/Snowflake schema design, data modeling, data pipelining, and MLOps. Your proficiency in data warehouse technologies like Snowflake, AWS Redshift, and AWS data pipelines (Lambda, AWS Glue, Step Functions, etc.) will be crucial. Strong skills in SQL and at least one major programming language (Python/Java) are required. Additionally, you should be experienced with data analysis tools such as Looker or Tableau, and have familiarity with Pandas, NumPy, scikit-learn, and Jupyter notebooks. Knowledge of Git, GitHub, and JIRA is preferred. Your ability to identify and resolve data quality issues, provide end-to-end data platform support, and work effectively as an individual contributor is essential.

In this role, you will need to possess strong analytical and problem-solving skills, with meticulous attention to detail. A positive mindset, a can-do attitude, and a focus on simplifying tasks and building reusable components will be highly valued. You should be able to assess the suitability of new technologies for solving business problems and establish strong relationships with various stakeholders.

Your responsibilities will involve designing, developing, and maintaining an accurate, secure, available, and fast data platform. You will engineer efficient, adaptable, and scalable data pipelines, integrate various data sources, create standardized datasets, and ensure product changes align well with the data platform. Collaborating with cross-functional teams, understanding their challenges, and providing data-driven solutions will be key aspects of your role.

Overall, your technical skills, including expertise in data engineering, schema design, data modeling, and data warehousing, will play a vital role in driving the success of the data platform and meeting the goals of the organization.

Posted 1 month ago

Apply

7.0 - 11.0 years

0 Lacs

Hyderabad, Telangana

On-site

About Us: Fission Labs is a leading software development company headquartered in Sunnyvale, with offices in Dallas and Hyderabad. We specialize in crafting flexible, agile, and scalable solutions that drive businesses forward. Our comprehensive services include product development, cloud engineering, big data analytics, QA, DevOps consulting, and AI/ML solutions, empowering clients to achieve sustainable digital transformation aligned with their business goals.

Fission Labs Website: https://www.fissionlabs.com/
Work Location: Hyderabad
Notice Period: Immediate to 30 Days

Role Overview: Omada is dedicated to developing next-gen intelligent systems that seamlessly integrate real-time APIs, cloud-native infrastructure, and external AI capabilities. We are seeking a talented Python Engineer with expertise in FastAPI and AWS, and practical experience in utilizing GenAI APIs and data pipelines.

Key Responsibilities: Backend & API Development
- Design, develop, and maintain robust REST APIs using FastAPI and Python.
- Construct scalable microservices that interface with AWS services such as Lambda, EC2, EKS, API Gateway, DynamoDB, and S3.
- Implement workflow automation and event-driven pipelines employing tools like Step Functions, SQS, and SNS.
- Create real-time and streaming APIs using WebSockets or Kinesis as needed.
- Integrate with external GenAI APIs including OpenAI (ChatGPT APIs), Google Gemini APIs, and other third-party AI/ML APIs or services.
- Design and execute web crawlers or integrate with crawling frameworks/tools to extract and process structured/unstructured data.

Required Skills:
- 7-9 years of backend development experience with strong proficiency in Python.
- Demonstrated production-level experience utilizing FastAPI.
- Extensive expertise in AWS services, particularly Lambda, EC2, EKS, API Gateway, Step Functions, DynamoDB, S3, and SNS/SQS.
- Hands-on experience in calling and managing responses from ChatGPT APIs (OpenAI) and Google Gemini APIs.
- Familiarity with writing or integrating web crawlers (e.g., BeautifulSoup, Playwright, Scrapy).
- Proficiency in Git and GitHub, encompassing branching strategies, pull requests, and code reviews.
- Ability to work independently in a dynamic startup environment.
- Prior experience working on chat agents.

Preferred Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Familiarity with NoSQL and relational databases (DynamoDB, PostgreSQL, etc.).
- Experience in CI/CD workflows, Docker, and Kubernetes.
- Bonus: Exposure to distributed data processing frameworks like Apache Beam or Spark.
- Bonus: Previous experience integrating with external data and media APIs.

Why Join Omada:
- Contribute to building API-first systems integrated with cutting-edge AI and cloud technologies.
- Shape scalable, real-time backend architecture in a greenfield product.
- Collaborate with a modern Python + AWS + GenAI stack.
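
For illustration only (not part of the posting): a minimal FastAPI endpoint of the kind described above that forwards a question to the OpenAI Chat Completions API. The route, request model, and model name are hypothetical, and the API key is assumed to be supplied via the OPENAI_API_KEY environment variable.

```python
from fastapi import FastAPI
from pydantic import BaseModel
from openai import OpenAI

app = FastAPI()
client = OpenAI()  # reads OPENAI_API_KEY from the environment

class AskRequest(BaseModel):
    question: str

@app.post("/ask")
def ask(req: AskRequest) -> dict:
    """Proxy a user question to a GenAI chat model and return the answer text."""
    completion = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[{"role": "user", "content": req.question}],
    )
    return {"answer": completion.choices[0].message.content}

# Run locally (assumption): uvicorn app:app --reload
```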

Posted 1 month ago

Apply

3.0 - 8.0 years

2 - 4 Lacs

Remote, India

Remote

Technologies / Skills:
- Strong hands-on experience with AWS data engineering services (ETL, orchestration, and streaming tools).
- Proficiency in SQL, Python (Pandas, NumPy), and PySpark.
- Experience in ETL/ELT pipeline development, data modeling, and working with large-scale data systems.
- Familiarity with CI/CD workflows, Git, and version control practices.

Responsibilities:
- Collaborate with business stakeholders and technical teams to gather requirements and translate them into scalable data solutions.
- Design, build, and maintain robust, secure, high-performance data pipelines.
- Ensure adherence to data quality, governance, and compliance standards.
- Work with architects and engineering leads to follow best practices and optimize for performance.
- Communicate insights and technical decisions effectively with cross-functional stakeholders.
- Participate in Agile ceremonies and deliver iterative improvements in data infrastructure.

Required Qualifications:
- 3+ years of hands-on experience with AWS data engineering services, including AWS Glue, Athena, EMR, Redshift, Lambda, Kinesis, S3, Step Functions, and CloudWatch.
- Expertise in building data pipelines using PySpark/Spark SQL.
- Strong SQL and Python skills with experience in large-scale data transformation and ingestion.
- Knowledge of Spark internals (joins, partitioning, memory management) and performance tuning via the Spark UI.
- Experience with Git, CI/CD pipelines, and working in Agile environments.
- Understanding of partitioning, schema design, and query performance optimization.

Desired Qualifications:
- Experience with workflow orchestration tools like Apache Airflow or AWS Step Functions.
- Exposure to streaming platforms such as Kafka, Kinesis, etc.
- Working knowledge of AWS SDKs (e.g., Boto3) for automation and integration.
- Familiarity with modern ETL/ELT patterns and data modeling techniques.
- Basic proficiency in shell scripting for automation and support tasks.
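
For illustration only (not part of the posting): a minimal PySpark batch job of the kind these qualifications describe, reading raw JSON from S3, de-duplicating, and writing date-partitioned Parquet back out. Bucket paths and column names are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Hypothetical raw zone: one JSON object per order event.
raw = spark.read.json("s3://example-data-lake/raw/orders/")

curated = (
    raw.dropDuplicates(["order_id"])                     # keep one row per order
       .withColumn("order_date", F.to_date("order_ts"))  # derive a partition column
       .filter(F.col("amount") > 0)                      # drop obviously bad records
)

(
    curated.write
           .mode("overwrite")
           .partitionBy("order_date")
           .parquet("s3://example-data-lake/curated/orders/")
)

spark.stop()
```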

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

You are a skilled Full Stack Developer with 4-6 years of hands-on experience, proficient in React.js for front-end development and Node.js for back-end development. Your strong backend experience includes RESTful API development and familiarity with AWS Lambda, API Gateway, DynamoDB, and S3, among other AWS services. You have prior experience integrating and automating workflows for SDLC tools such as JIRA, Jenkins, GitLab, Bitbucket, GitHub, and SonarQube. Your understanding of OAuth2, SSO, and API key-based authentication is solid. Additionally, you are familiar with CI/CD pipelines, microservices, and event-driven architectures. Your knowledge of Git and modern development practices is strong, and you possess good problem-solving skills, enabling you to work independently. Experience with Infrastructure-as-Code tools like Terraform or CloudFormation is a plus.

It would be beneficial if you have experience with AWS EventBridge, Step Functions, or other serverless orchestration tools, as well as knowledge of enterprise-grade authentication methods such as LDAP, SAML, or Okta. Familiarity with monitoring/logging tools like CloudWatch, ELK, or DataDog will also be advantageous in this role.

Posted 1 month ago

Apply

4.0 - 8.0 years

0 Lacs

Karnataka

On-site

As an Engineering IT Developer at Garrett Motion, you will play a crucial role in the Engineering IT 4.0 team as a key developer, contributing to the Engineering 4.0 transformation project in collaboration with Engineering teams and business stakeholders. Your primary objective will be to engage in software development activities, drive continuous improvement, and provide support for new development in product design validation from a software developer's perspective.

Your responsibilities will include designing, deploying, and optimizing cloud-based applications and workflows related to Engineering Process Integration and Design Optimization. You will directly contribute to the enhancement of Ansys and similar engineering tools for running simulation projects in a fully automated mode. Additionally, you will actively participate in agile project management activities, providing regular updates to managers and stakeholders on project progress and risk mitigation.

Your role will involve contributing to every sprint as a software developer, actively challenging and supporting the team. You will also engage in continuous improvement activities, implementing DevEx principles, coding best practices, and utilizing Kaizen methodologies. Collaboration with team members to enhance skill sets and provide support for skill development will be essential.

In this position, you will be expected to approach problem-solving with an innovative mindset, leveraging new technologies and staying abreast of emerging trends. Your educational background should include a Bachelor's or Master's degree in Computer Science or a related field, with at least 4 years of relevant experience or an advanced degree with comparable experience.

Key skills and knowledge required for this role include fluency in English, hands-on experience in object-oriented programming and Python scripting, as well as expertise in cloud-hosted applications/development. Familiarity with Ansys product family tools, integration technologies/architecture, and cloud-based APIs will be advantageous. Strong problem-solving skills, experience with Agile/Scrum methodologies, and an understanding of Predictive/Generative AI and Machine Learning basics are also essential.

Joining the Garrett Motion team at this exciting time will allow you to be part of a global innovator and technology leader in the automotive industry. With a focus on emission-reducing and zero-emission solutions, Garrett is committed to advancing sustainable motion through innovative turbocharging and electric boosting technologies. By combining mechanical and electric expertise, Garrett is at the forefront of redefining zero-emission automotive technologies.

The Garrett Information Technology (IT) team is dedicated to understanding business needs, market challenges, and emerging technologies to deliver innovative services that enhance business flexibility and competitiveness both now and in the future.

Posted 1 month ago

Apply

5.0 - 7.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Role Overview
We are looking for a highly skilled Data Engineer with strong expertise in building data pipelines, managing cloud-based data platforms, and deploying scalable data architectures on AWS. The ideal candidate should have hands-on experience with AWS services and must hold a valid AWS certification.

Key Responsibilities
- Design, build, and maintain robust, scalable, and efficient data pipelines and ETL/ELT processes on AWS.
- Work closely with data scientists, analysts, and business teams to understand data requirements and deliver solutions.
- Integrate data from multiple internal and third-party sources into unified data platforms.
- Optimize data lake and data warehouse performance (e.g., S3, Redshift, Glue, Athena).
- Ensure data quality, governance, and lineage using appropriate tools and frameworks.
- Implement CI/CD practices for data pipelines and workflows.
- Monitor and troubleshoot production data pipelines to ensure reliability and accuracy.
- Ensure compliance with data privacy and information security policies.

Must-Have Qualifications
- 5 years of experience in data engineering or a related role.
- Strong programming skills in Python, PySpark, or Scala.
- Proficient in SQL and working with structured, semi-structured, and unstructured data.
- Solid experience with AWS services such as S3, Glue, Lambda, Redshift, Athena, Kinesis, Step Functions, and CloudFormation.
- Hands-on experience with workflow orchestration tools like Apache Airflow or AWS-native alternatives.
- Must have an active AWS certification (e.g., AWS Certified Data Analytics, AWS Certified Solutions Architect).
- Experience with infrastructure as code (IaC) and DevOps practices is a plus.

Preferred Skills
- Experience with Delta Lake, Apache Hudi, or similar formats.
- Exposure to data cataloging and metadata management tools.
- Familiarity with data security frameworks and GDPR/data privacy considerations.
- Experience in client-facing roles and agile delivery environments.
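
For illustration only (not part of the posting): a minimal Apache Airflow DAG of the kind the orchestration requirement implies, chaining an extract task and a load task on a daily schedule. The DAG id, task names, and callable bodies are hypothetical placeholders.

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract() -> None:
    # Placeholder: pull yesterday's records from a source system into S3.
    print("extracting...")

def load() -> None:
    # Placeholder: load the staged files into the warehouse (e.g. Redshift).
    print("loading...")

with DAG(
    dag_id="daily_sales_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task
```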

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Full Stack Developer, you will be responsible for utilizing your expertise in React.js, Node.js, and AWS Lambda to develop a custom enterprise platform that interacts with various SDLC tools. This platform aims to enhance tool administration, automate access provisioning and deprovisioning, manage licenses efficiently, and provide centralized dashboards for governance and monitoring purposes.

With a minimum of 4-6 years of hands-on experience in full stack development, you should possess a strong command of React.js for building component-based front-end architecture. Your backend skills in Node.js and proficiency in RESTful API development will be crucial for the success of this project. Additionally, your solid experience with AWS services such as Lambda, API Gateway, DynamoDB, and S3 will be highly valued.

Your role will also involve integrating and automating workflows for SDLC tools like JIRA, Jenkins, GitLab, Bitbucket, GitHub, and SonarQube. A good understanding of OAuth2, SSO, and API key-based authentication is essential. Familiarity with CI/CD pipelines, microservices, and event-driven architectures will further enhance your contributions to the project.

It is expected that you bring in-depth knowledge of Git and modern development practices to the table. Strong problem-solving skills and the ability to work independently will be beneficial in this role. While not mandatory, experience with Infrastructure-as-Code tools like Terraform or CloudFormation would be advantageous. Familiarity with AWS EventBridge, Step Functions, or other serverless orchestration tools is considered a plus. Knowledge of enterprise-grade authentication methods such as LDAP, SAML, or Okta, as well as familiarity with monitoring/logging tools like CloudWatch, ELK, or DataDog, are also desirable skills.

Join us in this exciting opportunity to work on a cutting-edge enterprise platform and contribute to streamlining processes and enhancing efficiency within the organization.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Kochi, Kerala

On-site

The Max Maintenance team is currently in search of an experienced Principal Software Architect to take charge of leading the modernization and cloud transformation of a legacy .NET web application with a SQL Server backend. This role necessitates a profound understanding of AWS cloud services, including API Gateway, AWS Lambda, Step Functions, DynamoDB, and Neptune, in order to re-architect the system into a scalable, serverless, event-driven platform. The ideal candidate will possess a robust architectural vision, hands-on technical proficiency, and a dedication to mentoring and guiding development teams through digital transformation initiatives. Are you someone who thrives in a fast-paced and dynamic team environment? If so, we invite you to join our diverse and motivated team.

Key Responsibilities:
- Lead the comprehensive cloud transformation strategy for a legacy .NET/SQL Server web application.
- Develop and deploy scalable, secure, and serverless AWS-native architectures with services like API Gateway, AWS Lambda, Step Functions, DynamoDB, and Neptune.
- Establish and execute data migration plans, transitioning relational data models into NoSQL (DynamoDB) and graph-based (Neptune) storage paradigms.
- Set standards for infrastructure-as-code, CI/CD pipelines, and monitoring utilizing AWS CloudFormation, CDK, or Terraform.
- Offer hands-on technical guidance to development teams, ensuring high code quality and compliance with cloud-native principles.
- Assist teams in adopting cloud technologies, service decomposition, and event-driven design patterns.
- Mentor engineers in AWS technologies, microservices architecture, and best practices in DevOps and modern software engineering.
- Develop and evaluate code for critical services, APIs, and data access layers using appropriate languages (e.g., Python, Node.js).
- Create and implement APIs for both internal and external consumers, ensuring secure and dependable integrations.
- Conduct architecture reviews and threat modeling, and enforce strict testing practices, including automated unit, integration, and load testing.
- Collaborate closely with stakeholders, project managers, and cross-functional teams to define technical requirements and delivery milestones.
- Translate business objectives into technical roadmaps and prioritize technical debt reduction and performance enhancements.
- Engage stakeholders to manage expectations and provide clear communication on technical progress and risks.
- Stay informed about AWS ecosystem updates, architectural trends, and emerging technologies.
- Assess and prototype new tools, services, or architectural approaches that can expedite delivery and decrease operational complexity.
- Advocate for a DevOps culture emphasizing continuous delivery, observability, and security-first development.

Requirements:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Minimum of 8 years of software development experience, with at least 3 years focused on architecting cloud-native solutions on AWS.
- Proficiency in AWS services like API Gateway, Lambda, Step Functions, DynamoDB, Neptune, IAM, and CloudWatch.
- Experience in legacy application modernization and cloud migration.
- Strong familiarity with the .NET stack and the ability to map legacy components to cloud-native equivalents.
- Extensive knowledge of distributed systems, serverless design, data modeling (both relational and NoSQL/graph), and security best practices.
- Demonstrated leadership and mentoring skills within agile software teams.
- Exceptional problem-solving, analytical, and decision-making capabilities.

The oil and gas industry's top professionals leverage over 150 years of combined experience every day to assist customers in achieving enduring success.

We Power the Industry that Powers the World: Our family of companies has delivered technical expertise, cutting-edge equipment, and operational assistance across every region and aspect of drilling and production, ensuring current and future success.

Global Family: We operate as a unified global family, comprising thousands of individuals working together to make a lasting impact on ourselves, our customers, and the communities we serve.

Purposeful Innovation: Through intentional business innovation, product development, and service delivery, we are committed to enhancing the industry that powers the world.

Service Above All: Our commitment to anticipating and meeting customer needs drives us to deliver superior products and services promptly and within budget.
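
For illustration only (not part of the posting): a minimal sketch of the relational-to-DynamoDB shift described above, modeling work orders under an equipment partition key with boto3. Table, key, and attribute names are hypothetical.

```python
import boto3
from boto3.dynamodb.conditions import Key

# Hypothetical single-table design: PK = equipment id, SK = work-order id.
table = boto3.resource("dynamodb").Table("MaintenanceRecords")

def add_work_order(equipment_id: str, work_order_id: str, status: str) -> None:
    table.put_item(Item={
        "PK": f"EQUIP#{equipment_id}",
        "SK": f"WO#{work_order_id}",
        "status": status,
    })

def list_work_orders(equipment_id: str) -> list:
    """Fetch all work orders for one piece of equipment in a single query."""
    response = table.query(
        KeyConditionExpression=Key("PK").eq(f"EQUIP#{equipment_id}")
        & Key("SK").begins_with("WO#")
    )
    return response["Items"]

if __name__ == "__main__":
    add_work_order("123", "2024-0001", "OPEN")
    print(list_work_orders("123"))
```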

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

You are a skilled Full Stack Developer with a strong backend focus on Node.js and working knowledge of React.js. You will be responsible for developing a custom enterprise platform that interfaces with SDLC tools like JIRA, Jenkins, GitLab, and others. The platform aims to streamline license and access management, automate administrative tasks, and provide robust dashboards and governance features.

With 4-6 years of professional development experience, you possess strong expertise in Node.js, including async patterns and performance tuning. You have hands-on experience with AWS Lambda and serverless architecture. Additionally, you have experience building integrations with tools like JIRA, Jenkins, GitLab, Bitbucket, etc. Your working knowledge of React.js for UI development and integration is essential, along with a solid understanding of RESTful APIs, Webhooks, and API security. Familiarity with Git and collaborative development workflows is also required, as well as exposure to CI/CD practices and infrastructure as code.

It would be beneficial if you have experience with AWS services such as DynamoDB, S3, EventBridge, and Step Functions. Familiarity with enterprise SSO, OAuth2, or SAML is a plus, along with prior experience automating tool admin tasks and DevOps workflows. An understanding of modern monitoring/logging tools like CloudWatch or ELK would also be advantageous.

In this role, you will have the opportunity to work on a transformative platform with direct enterprise impact. You will have the freedom to innovate and contribute to the automation of key IT and DevOps functions. Additionally, you will gain exposure to modern architectures, including serverless, microservices, and event-driven systems. The work culture is collaborative and outcome-oriented, providing a conducive environment for growth and learning.

Posted 1 month ago

Apply

7.0 - 10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

We're looking for a Cloud Architect / Lead to design, build, and manage scalable AWS infrastructure that powers our analytics and data product initiatives. This role focuses on automating infrastructure provisioning, application/API hosting, and enabling data and GenAI workloads through a modern, secure cloud environment.

Key Responsibilities
- Design and provision AWS infrastructure using Terraform or AWS CloudFormation to support evolving data product needs.
- Develop and manage CI/CD pipelines using Jenkins, AWS CodePipeline, CodeBuild, or GitHub Actions.
- Deploy and host internal tools, APIs, and applications using ECS, EKS, Lambda, API Gateway, and ELB.
- Provision and support analytics and data platforms using S3, Glue, Redshift, Athena, Lake Formation, and orchestration tools like Step Functions or Apache Airflow (MWAA).
- Implement cloud security, networking, and compliance using IAM, VPC, KMS, CloudWatch, CloudTrail, and AWS Config.
- Collaborate with data engineers, ML engineers, and analytics teams to align infrastructure with application and data product requirements.
- Support GenAI infrastructure, including Amazon Bedrock, SageMaker, or integrations with APIs like OpenAI.

Requirements
- 7-10 years of experience in cloud engineering, DevOps, or cloud architecture roles.
- Strong hands-on expertise with the AWS ecosystem and tools listed above.
- Proficiency in scripting (e.g., Python, Bash) and infrastructure automation.
- Experience deploying containerized workloads using Docker, ECS, EKS, or Fargate.
- Familiarity with data engineering and GenAI workflows is a plus.
- AWS certifications (e.g., Solutions Architect, DevOps Engineer) are preferred.

Posted 1 month ago

Apply

8.0 - 12.0 years

0 Lacs

Hyderabad, Telangana

On-site

You will be part of a team responsible for developing a next-generation Data Analytics Engine that converts raw market and historical data into actionable insights for the electronics supply chain industry. This platform processes high-volume data from suppliers, parts, and trends to provide real-time insights and ML-driven applications.

We are seeking an experienced Lead or Staff Data Engineer to assist in shaping and expanding our core data infrastructure. The ideal candidate should have a strong background in designing and implementing scalable ETL pipelines and real-time data systems in AWS and open-source environments such as Airflow, Spark, and Kafka. This role involves taking technical ownership, providing leadership, improving our architecture, enforcing best practices, and mentoring junior engineers.

Your responsibilities will include designing, implementing, and optimizing scalable ETL pipelines using AWS-native tools, migrating existing pipelines to open-source orchestration tools, leading data lake and data warehouse architecture design, managing CI/CD workflows, implementing data validation and quality checks, contributing to Infrastructure as Code, and offering technical mentorship and guidance on architectural decisions.

To qualify for this role, you should have at least 8 years of experience as a Data Engineer or in a similar role with production ownership, expertise in AWS tools, deep knowledge of the open-source data stack, strong Python programming skills, expert-level SQL proficiency, experience with CI/CD tools, familiarity with Infrastructure as Code, and the ability to mentor engineers and drive architectural decisions.

Preferred qualifications include a background in ML/AI pipelines, experience with serverless technologies and containerized deployments, and familiarity with data observability tools and alerting systems. A Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field is preferred.

In return, you will have the opportunity to work on impactful supply chain intelligence problems, receive mentorship from experienced engineers and AI product leads, work in a flexible and startup-friendly environment, and enjoy competitive compensation with opportunities for career growth.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

You should have 3-5 years of experience in writing and debugging intermediate to advanced Python code, with a good understanding of concepts related to OOP, APIs, and SQL databases. Additionally, you should possess a strong grasp of the fundamentals of Generative AI, large language model (LLM) pipelines like RAG, and OpenAI GPT models, along with experience in NLP and LangChain. It is essential to be familiar with the AWS environment and services like S3, Lambda, Step Functions, CloudWatch, etc. You should also have excellent analytical and problem-solving skills and be capable of working independently as well as collaboratively in a team-oriented environment. An analytical mind and business acumen are also important qualities for this role. You should demonstrate the ability to engage with client stakeholders at multiple levels and provide consultative solutions across different domains. It would be beneficial to have familiarity with Python libraries and frameworks such as Pandas, scikit-learn, PyTorch, and TensorFlow, and with BERT, GPT, or similar models, along with experience in deep learning and machine learning.
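
For illustration only (not part of the posting): a minimal sketch of the retrieval step in a RAG pipeline like the one mentioned above, embedding a question and a small document set with the OpenAI embeddings API and ranking documents by cosine similarity. The model name and documents are hypothetical, and the API key is assumed to come from the OPENAI_API_KEY environment variable.

```python
import numpy as np
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

DOCS = [
    "Step Functions orchestrates multi-step serverless workflows.",
    "S3 lifecycle policies move cold objects to cheaper storage classes.",
    "CloudWatch alarms notify on metric thresholds.",
]

def embed(texts: list) -> np.ndarray:
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([item.embedding for item in resp.data])

def top_match(question: str) -> str:
    """Return the document most similar to the question (the 'R' in RAG)."""
    doc_vecs = embed(DOCS)
    q_vec = embed([question])[0]
    sims = doc_vecs @ q_vec / (np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(q_vec))
    return DOCS[int(np.argmax(sims))]

if __name__ == "__main__":
    print(top_match("How do I chain Lambda functions together?"))
```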

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

This role requires you to be adept at troubleshooting, debugging, and working within a cloud environment. You should be familiar with Agile and other development methodologies. Your responsibilities will include creating Lambda functions with all the necessary security measures in place using AWS Lambda. You must demonstrate proficiency in Java and Node.js by developing services and conducting unit and integration testing. It is essential to have a strong understanding of security best practices, such as using IAM roles, KMS, and pseudonymization.

You should be able to define services on SwaggerHub and implement serverless approaches using AWS Lambda, including the Serverless Application Model (AWS SAM). Hands-on experience with RDS, Kafka, ELB, Secrets Manager, S3, API Gateway, CloudWatch, and EventBridge services is required. You should also be knowledgeable in writing unit test cases using the Mocha framework and have experience with encryption and decryption of PII data and files in transit and at rest. Familiarity with the CDK (Cloud Development Kit) and creating SQS/SNS, DynamoDB, and API Gateway resources using the CDK is preferred.

You will be working on a serverless stack involving Lambda, API Gateway, and Step Functions, coding in Java / Node.js. Advanced networking concepts like Transit Gateway, VPC endpoints, and multi-account connectivity are also part of the role. Strong troubleshooting and debugging skills are essential, along with excellent problem-solving abilities and attention to detail. Effective communication skills and the ability to work in a team-oriented, collaborative environment are crucial for success in this role.

Virtusa is a company that values teamwork, quality of life, and professional and personal development. By joining Virtusa, you become part of a global team that focuses on your growth and provides exciting projects, opportunities, and exposure to state-of-the-art technologies throughout your career. Collaboration and fostering excellence are at the core of Virtusa's values, offering a dynamic environment for great minds to thrive and innovate.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

As an AWS Senior Data Engineer (SDE) at Infosys in India, you will be responsible for working on various technologies and tools related to cloud data engineering. Your role will involve expertise in SQL, PySpark, API endpoint ingestion, Glue, S3, Redshift, Step Functions, Lambda, CloudWatch, AppFlow, CloudFormation, and administrative tasks related to cloud services. Additionally, you will be expected to have knowledge of SDLF & OF frameworks and S3 ingestion patterns, and exposure to Git, JFrog, ADO, SNOW, Visual Studio, DBeaver, and SF Inspector.

Your primary focus will be on leveraging these technologies to design, develop, and maintain data pipelines, ensuring efficient data processing and storage on the cloud platform. The ideal candidate for this position should have a strong background in cloud data engineering, familiarity with AWS services, and a proactive attitude towards learning and implementing new technologies. Excellent communication skills and the ability to work effectively within a team are essential for success in this role.

Posted 1 month ago

Apply

8.0 - 12.0 years

0 Lacs

Karnataka

On-site

Sykatiya Technology Pvt Ltd is a leading semiconductor industry innovator committed to leveraging cutting-edge technology to solve complex problems. We are currently looking for a highly skilled and motivated Data Scientist to join our dynamic team and contribute to our mission of driving innovation through data-driven insights.

As the Lead Data Scientist and Machine Learning Engineer at Sykatiya Technology Pvt Ltd, you will play a crucial role in analyzing large datasets to uncover patterns, develop predictive models, and implement AI/ML solutions. Your responsibilities will include working on projects involving neural networks, deep learning, data mining, and natural language processing (NLP) to drive business value and enhance our products and services.

Key Responsibilities:
- Lead the design and implementation of machine learning models and algorithms to address complex business problems.
- Utilize deep learning techniques to enhance neural network models and improve prediction accuracy.
- Conduct data mining and analysis to extract actionable insights from both structured and unstructured data.
- Apply natural language processing (NLP) techniques for advanced text analytics.
- Develop and maintain end-to-end data pipelines, ensuring data integrity and reliability.
- Collaborate with cross-functional teams to understand business requirements and deliver data-driven solutions.
- Mentor and guide junior data scientists and engineers in best practices and advanced techniques.
- Stay updated with the latest advancements in AI/ML, neural networks, deep learning, data mining, and NLP.

Technical Skills:
- Proficiency in Python and its libraries such as NumPy, pandas, scikit-learn, TensorFlow, Keras, and PyTorch.
- Strong understanding of machine learning algorithms and techniques.
- Extensive experience with neural networks and deep learning frameworks.
- Hands-on experience with data mining and analysis techniques.
- Proficiency in natural language processing (NLP) tools and libraries like NLTK, spaCy, and transformers.
- Proficiency in big data technologies including Sqoop, Hadoop, HDFS, Hive, and PySpark.
- Experience with cloud platforms, such as AWS services like S3, Step Functions, EventBridge, Athena, RDS, Lambda, and Glue.
- Strong knowledge of database management systems like SQL, Teradata, MySQL, PostgreSQL, and Snowflake.
- Familiarity with other tools like ExactTarget, Marketo, SAP BO, Agile, and JIRA.
- Strong analytical skills to analyze large datasets and derive actionable insights.
- Excellent problem-solving skills with the ability to think critically and creatively.
- Effective communication skills and teamwork abilities to collaborate with various stakeholders.

Experience:
- At least 8 to 12 years of experience in a similar role.

Posted 1 month ago

Apply

10.0 - 14.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Role - Cloud Architect, Analytics & Data Products

We're looking for a Cloud Architect / Lead to design, build, and manage scalable AWS infrastructure that powers our analytics and data product initiatives. This role focuses on automating infrastructure provisioning, application/API hosting, and enabling data and GenAI workloads through a modern, secure cloud environment.

Key Responsibilities
- Design and provision AWS infrastructure using Terraform or AWS CloudFormation to support evolving data product needs.
- Develop and manage CI/CD pipelines using Jenkins, AWS CodePipeline, CodeBuild, or GitHub Actions.
- Deploy and host internal tools, APIs, and applications using ECS, EKS, Lambda, API Gateway, and ELB.
- Provision and support analytics and data platforms using S3, Glue, Redshift, Athena, Lake Formation, and orchestration tools like Step Functions or Apache Airflow (MWAA).
- Implement cloud security, networking, and compliance using IAM, VPC, KMS, CloudWatch, CloudTrail, and AWS Config.
- Collaborate with data engineers, ML engineers, and analytics teams to align infrastructure with application and data product requirements.
- Support GenAI infrastructure, including Amazon Bedrock, SageMaker, or integrations with APIs like OpenAI.

Requirements
- 10-14 years of experience in cloud engineering, DevOps, or cloud architecture roles.
- Strong hands-on expertise with the AWS ecosystem and tools listed above.
- Proficiency in scripting (e.g., Python, Bash) and infrastructure automation.
- Experience deploying containerized workloads using Docker, ECS, EKS, or Fargate.
- Familiarity with data engineering and GenAI workflows is a plus.
- AWS certifications (e.g., Solutions Architect, DevOps Engineer) are preferred.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

As an AWS Senior Data Engineer at our organization, you will be responsible for working with various technologies and tools to support data engineering activities. Your primary tasks will include utilizing SQL for data querying and manipulation, developing data processing pipelines using PySpark, and integrating data from API endpoints. Additionally, you will be expected to work with AWS services such as Glue for ETL processes, S3 for data storage, Redshift for data warehousing, Step Functions for workflow automation, Lambda for serverless computing, CloudWatch for monitoring, and AppFlow for data integration.

You should have experience with CloudFormation and administrative roles, as well as knowledge of SDLF & OF frameworks for data lifecycle management. Understanding S3 ingestion patterns and version control using Git is essential for this role. Exposure to tools like JFrog, ADO, SNOW, Visual Studio, DBeaver, and SF Inspector will be beneficial in supporting your data engineering tasks effectively. Your role will involve collaborating with cross-functional teams to ensure the successful implementation of data solutions within the AWS environment.

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies