
83 Step Functions Jobs - Page 2

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 9.0 years

0 Lacs

Ahmedabad, Gujarat

On-site

As an AWS Data Engineer at Sufalam Technologies, located in Ahmedabad, India, you will be responsible for designing and implementing data engineering solutions on AWS. Your role will involve developing data models, managing ETL processes, and ensuring the efficient operation of data warehousing solutions. Collaboration with Finance, Data Science, and Product teams is crucial to understand reconciliation needs and ensure timely data delivery. Your expertise will contribute to data analytics activities supporting business decision-making and strategic goals.

Key responsibilities include designing and implementing scalable and secure ETL/ELT pipelines for processing financial data, collaborating closely with various teams to understand reconciliation needs and ensure timely data delivery, implementing monitoring and alerting for pipeline health and data quality, maintaining detailed documentation on data flows, models, and reconciliation logic, and ensuring compliance with financial data handling and audit standards.

To excel in this role, you should have 5-6 years of experience in data engineering with a strong focus on AWS data services. Hands-on experience with AWS Glue, Lambda, S3, Redshift, Athena, Step Functions, Lake Formation, and IAM is essential for secure data governance. A solid understanding of data reconciliation processes in the finance domain, strong SQL skills, experience with data warehousing and data lakes, and proficiency in Python or PySpark for data transformation are required. Knowledge of financial accounting principles or experience working with financial datasets (AR, AP, General Ledger, etc.) would be beneficial.
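To make the Glue/PySpark part of this stack concrete, here is a minimal, illustrative Glue job sketch; it is not taken from the posting, and the catalog database, table, and S3 bucket names are placeholders.

```python
# Minimal AWS Glue PySpark job sketch: read raw financial records from the
# Glue Data Catalog, apply a simple transformation, and write partitioned
# Parquet back to S3. All names below are placeholders.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from pyspark.sql import functions as F

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
sc = SparkContext()
glue_context = GlueContext(sc)
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read the source table registered in the Glue Data Catalog
source = glue_context.create_dynamic_frame.from_catalog(
    database="finance_raw",          # placeholder database
    table_name="ar_transactions",    # placeholder table
)

# Basic cleansing/transformation with Spark SQL functions
df = source.toDF()
df = (
    df.dropDuplicates(["transaction_id"])
      .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
      .withColumn("load_date", F.current_date())
)

# Write curated output as partitioned Parquet to S3 (placeholder bucket)
(
    df.write.mode("overwrite")
      .partitionBy("load_date")
      .parquet("s3://example-curated-bucket/finance/ar_transactions/")
)

job.commit()
```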

Posted 2 weeks ago

Apply

2.0 - 4.0 years

4 - 6 Lacs

Gurugram

Work from Office

Company Overview: Incedo is a US-based consulting, data science and technology services firm with over 3,000 people helping clients from our six offices across the US, Mexico and India. We help our clients achieve competitive advantage through end-to-end digital transformation. Our uniqueness lies in bringing together strong engineering, data science, and design capabilities coupled with deep domain understanding. We combine services and products to maximize business impact for our clients in the telecom, banking, wealth management, product engineering, and life science & healthcare industries. Working at Incedo will provide you an opportunity to work with industry-leading client organizations, deep technology and domain experts, and global teams. Incedo University, our learning platform, provides ample learning opportunities, starting with a structured onboarding program and carrying through the various stages of your career. A variety of fun activities is also an integral part of our friendly work environment. Our flexible career paths allow you to grow into a program manager, a technical architect, or a domain expert based on your skills and interests. Our mission is to enable our clients to maximize business impact from technology by harnessing the transformational impact of emerging technologies and bridging the gap between business and technology.

Role Description: Write and maintain build/deploy scripts. Work with the Sr. Systems Administrator to deploy and implement new cloud infrastructure and designs. Manage existing AWS deployments and infrastructure. Build scalable, secure, and cost-optimized AWS architecture. Ensure best practices are followed and implemented. Assist in deployment and operation of security tools and monitoring. Automate tasks where appropriate to enhance response times to issues and tickets. Collaborate with cross-functional teams: work closely with development, operations, and security teams to ensure a cohesive approach to infrastructure and application security, and participate in regular security reviews and planning sessions. Incident response and recovery: participate in incident response planning and execution, including post-mortem analysis and implementation of preventive measures. Continuous improvement: regularly review and update security practices and procedures to adapt to the evolving threat landscape. Analyze and remediate vulnerabilities and advise developers of vulnerabilities requiring updates to code. Create and maintain documentation and diagrams for application/security and network configurations. Ensure systems are monitored using tools such as Datadog and that issues are logged and reported to the required parties.

Technical Skills: In-depth experience with system administration, provisioning and managing cloud infrastructure, and security monitoring. Experience with infrastructure/security monitoring and operation of a product or service. Experience with containerization and orchestration such as Docker and Kubernetes/EKS. Hands-on experience creating system architectures and leading architecture discussions at a team or multi-team level. Understanding of how to model system infrastructure in the cloud with Amazon Web Services (AWS), AWS CloudFormation, or Terraform. Strong knowledge of cloud infrastructure (AWS preferred) services such as Lambda, Cognito, SQS, KMS, S3, Step Functions, Glue/Spark, CloudWatch, Secrets Manager, Simple Email Service, and CloudFront. Familiarity with coding, scripting, and testing tools (preferred). Strong interpersonal, coordination, and multi-tasking skills. Ability to function both independently and collaboratively as part of a team to achieve desired results. Aptitude to pick up new concepts and technology rapidly; ability to explain them to both business and tech stakeholders. Ability to adapt and succeed in a fast-paced, dynamic startup environment. Experience with Nessus and other related infosec tooling.

Nice-to-have skills: Strong interpersonal, coordination, and multi-tasking skills. Ability to work independently and follow through to achieve desired results. Quick learner, with the ability to work calmly under pressure and with tight deadlines. Ability to adapt and succeed in a fast-paced, dynamic startup environment.

Qualifications: BA/BS degree in Computer Science, Computer Engineering, or a related field; MS degree in Computer Science or Computer Engineering preferred.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

As the Lead Data Engineer at Mastercard, you will be responsible for designing and building scalable, cloud-native data platforms using PySpark, Python, and modern data engineering practices. Your role will involve mentoring and guiding other engineers, fostering a culture of curiosity and continuous improvement, and creating robust ETL/ELT pipelines to serve business-critical use cases. You will lead by example by writing high-quality, testable code, participating in architecture and design discussions, and decomposing complex problems into scalable components aligned with platform and product goals. Championing best practices in data engineering, you will drive collaboration across teams, support data governance and quality efforts, and continuously learn and apply new technologies to improve team productivity and platform reliability.

To succeed in this role, you should have at least 5 years of hands-on experience in data engineering with strong PySpark and Python skills. You should also possess solid experience in designing and implementing data models, pipelines, and batch/stream processing systems. Additionally, you should be comfortable working with cloud platforms such as AWS, Azure, or GCP and have a strong foundation in data modeling, database design, and performance optimization. A bachelor's degree in computer science, engineering, or a related field is required, along with experience in Agile/Scrum development environments. Experience with CI/CD practices, version control, and automated testing is essential, as well as the ability to mentor and uplift junior engineers. Familiarity with cloud-related services like S3, Glue, Data Factory, and Databricks is highly desirable. Furthermore, exposure to data governance tools and practices, orchestration tools, containerization, and infrastructure automation will be advantageous. A master's degree, relevant certifications, or contributions to open source/data engineering communities will be considered a bonus. Exposure to machine learning data pipelines or MLOps is also a plus.

If you are a curious, adaptable, and driven individual who enjoys problem-solving and continuous improvement, and if you have a passion for building clean data pipelines and cloud-native designs, then this role is perfect for you. Join us at Mastercard and be part of a team that is dedicated to unlocking the potential of data assets and shaping the future of data engineering.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Who We Are: We are a digitally native company that helps organizations reinvent themselves and unleash their potential. We are the place where innovation, design and engineering meet scale. Globant is a 20-year-old, NYSE-listed public organization with more than 33,000 employees worldwide, working out of 35 countries. www.globant.com

Job location: Pune/Hyderabad/Bangalore. Work mode: Hybrid. Experience: 5 to 10 years. Must-have skills: 1) AWS (EC2, EMR & EKS) 2) Redshift 3) Lambda functions 4) Glue 5) Python 6) PySpark 7) SQL 8) CloudWatch 9) NoSQL database (DynamoDB/MongoDB or any).

We are seeking a highly skilled and motivated Data Engineer to join our dynamic team. The ideal candidate will have a strong background in designing, developing, and managing data pipelines, working with cloud technologies, and optimizing data workflows. You will play a key role in supporting our data-driven initiatives and ensuring the seamless integration and analysis of large datasets. Design scalable data models: develop and maintain conceptual, logical, and physical data models for structured and semi-structured data in AWS environments. Optimize data pipelines: work closely with data engineers to align data models with AWS-native data pipeline design and ETL best practices. AWS cloud data services: design and implement data solutions leveraging AWS Redshift, Athena, Glue, S3, Lake Formation, and AWS-native ETL workflows. Design, develop, and maintain scalable data pipelines and ETL processes using AWS services (Glue, Lambda, Redshift). Write efficient, reusable, and maintainable Python and PySpark scripts for data processing and transformation. Optimize SQL queries for performance and scalability; expertise in writing complex SQL queries and optimizing them for performance is expected. Monitor, troubleshoot, and improve data pipelines for reliability and performance. Focusing on ETL automation using Python and PySpark, you will design, build, and maintain efficient data pipelines, ensuring data quality and integrity for various applications.
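As a rough illustration of how the Lambda and Glue pieces listed above often fit together, here is a hedged sketch of a Lambda handler that launches a Glue ETL job when a file lands in S3; the job name and argument keys are hypothetical, not details from the posting.

```python
# Illustrative AWS Lambda handler (Python + boto3) that kicks off a Glue ETL
# job and reports the run id, a common bridge between event triggers and
# data pipelines like those described above.
import json
import boto3

glue = boto3.client("glue")

def lambda_handler(event, context):
    # Assumes an S3 put-event shape; adjust for other trigger types.
    record = event["Records"][0]["s3"]
    response = glue.start_job_run(
        JobName="example-curation-job",          # placeholder job name
        Arguments={
            "--source_bucket": record["bucket"]["name"],
            "--source_key": record["object"]["key"],
        },
    )
    return {
        "statusCode": 200,
        "body": json.dumps({"JobRunId": response["JobRunId"]}),
    }
```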

Posted 2 weeks ago

Apply

5.0 - 6.0 years

10 - 20 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

(Night shift 6:00 pm to 3:00 am)

KEY RESPONSIBILITIES: Development with Python, TypeScript (TS), GitHub, and AWS services (Step Functions and Lambda). Development with Terraform and Grafana tools. Experienced development of AWS-based API and ETL solutions. Design and implement scalable APIs leveraging AWS services such as API Gateway, Lambda, and RDS. Leverage Cursor AI with Playwright for automated application testing. Ensure API and ETL job reliability through unit and integration testing, workflow orchestration, and event-driven automation. Monitor, troubleshoot, and enhance performance using AWS-native tools such as CloudWatch, SQS, and EventBridge. Collaborate with cross-functional teams to ensure alignment with business and technical objectives. Ensure that all uptime and system performance metrics are defined and supported.

REQUIRED SKILLS & EXPERIENCE: Expert Python and TypeScript (TS) hands-on development. Expert AWS services (Step Functions and Lambda) hands-on solutions and development. Hands-on experience with AWS services: API Gateway, Lambda, Observability, CloudWatch, SQS, EventBridge, S3, RDS, AWS Glue, Glue Crawler, Athena. Hands-on design and development of relational databases (Oracle, Postgres, SQL). Knowledge of the Cursor AI code editor and Playwright automated testing. Knowledge of workflow orchestration, including job scheduling and API integration. Strong experience in TypeScript (TS) and Python for API development using Serverless Framework v3. Hands-on development of Docker-based APIs. Expertise in CRUD transactions with relational databases. Experience implementing CI/CD pipelines for API deployments. Strong understanding of unit and integration testing, including mocking strategies for API development.

PREFERRED QUALIFICATIONS: Experience with AWS observability maturity models and best practices. Experience with serverless architecture best practices. Experience with event-driven architecture using AWS services. Experience with CI/CD pipeline development. Strong debugging and performance optimization skills. Ability to write clean, maintainable, and well-documented code. Knowledge of security best practices in AWS, API authentication, and access control. Python development experience for AWS Glue ETL workflows. Strong communication and collaboration skills.

OTHER REQUIREMENTS: 5+ years of senior leadership and hands-on development of Python, TypeScript (TS), and API services. 3+ years of AWS services hands-on development (Step Functions and Lambda). 1+ year of senior leadership and hands-on development of Grafana, Terraform, and GitHub. Comprehensive understanding of the complete software development lifecycle. Hands-on experience working in an agile software development environment (preferably Scrum).

WORK ENVIRONMENT: Typical office environment, Mon-Fri during the hours of 8 A.M. to 5 P.M. EST. Location: Remote, Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune
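For readers unfamiliar with Step Functions, the orchestration described above usually boils down to a state machine definition like the minimal sketch below; the ARNs, names, and retry settings are placeholders, and in a stack like this the definition would more likely be managed through Terraform or the Serverless Framework than an ad-hoc script.

```python
# Sketch of a small Step Functions workflow defined from Python: two Lambda
# tasks (extract, then load) with a retry on the extract step. All ARNs and
# names are placeholders.
import json
import boto3

definition = {
    "Comment": "Minimal ETL orchestration example",
    "StartAt": "Extract",
    "States": {
        "Extract": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:extract",
            "Retry": [
                {"ErrorEquals": ["States.TaskFailed"], "MaxAttempts": 2, "IntervalSeconds": 10}
            ],
            "Next": "Load",
        },
        "Load": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:load",
            "End": True,
        },
    },
}

sfn = boto3.client("stepfunctions")
sfn.create_state_machine(
    name="example-etl-workflow",
    definition=json.dumps(definition),
    roleArn="arn:aws:iam::123456789012:role/example-sfn-role",
)
```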

Posted 2 weeks ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Bangalore Rural, Bengaluru

Hybrid

Responsibilities: 3+ years of hands-on experience leading and performing development in one or more programming languages such as Python, PySpark, etc. 4+ years of hands-on experience in development and deployment of cloud-native solutions leveraging AWS services: Compute (EC2, Lambda), Storage (S3), Database (RDS, Aurora, Postgres, DynamoDB), Orchestration (Apache Airflow, Step Functions, SNS), ETL/Analytics (Glue, EMR, Athena, Redshift), Infra (CloudFormation, CodePipeline), Data Migration (AWS DataSync, AWS DMS), API Gateway, IAM, etc. Expertise in handling large data sets and data models in terms of design, data model creation, and development of data pipelines for data ingestion, migration, and transformation. Strong on SQL Server and stored procedures. Knowledge of APIs, SSO, and streaming technology is nice to have. Mandatory Skill Sets: AWS, PySpark, Spark, Glue, Lambda. Years of Experience Required: 5+ years. Education Qualification: B.Tech / M.Tech / MBA / MCA

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Lead Data Engineer at Mastercard, you will be a key player in the Mastercard Services Technology team, responsible for driving the mission to unlock the potential of data assets by innovating, managing big data assets, ensuring accessibility of data, and enforcing standards and principles in the Big Data space. Your role will involve designing and building scalable, cloud-native data platforms using PySpark, Python, and modern data engineering practices. You will mentor and guide other engineers, foster a culture of curiosity and continuous improvement, and create robust ETL/ELT pipelines that integrate with various systems. Your responsibilities will include decomposing complex problems into scalable components aligned with platform goals, championing best practices in data engineering, collaborating across teams, supporting data governance and quality efforts, and optimizing cloud infrastructure components related to data engineering workflows. You will actively participate in architectural discussions, iteration planning, and feature sizing meetings while adhering to Agile processes.

To excel in this role, you should have at least 5 years of hands-on experience in data engineering with strong PySpark and Python skills. You must possess solid experience in designing and implementing data models, pipelines, and batch/stream processing systems. Additionally, a strong foundation in data modeling, database design, and performance optimization is required. Experience working with cloud platforms like AWS, Azure, or GCP and knowledge of modern data architectures and data lifecycle management are essential. Furthermore, familiarity with CI/CD practices, version control, and automated testing is crucial. You should demonstrate the ability to mentor junior engineers effectively, possess excellent communication and collaboration skills, and hold a Bachelor's degree in computer science, engineering, or a related field. Comfort with Agile/Scrum development environments, curiosity, adaptability, problem-solving skills, and a drive for continuous improvement are key traits for success in this role.

Experience with integrating heterogeneous systems, building resilient data pipelines across cloud environments, orchestration tools, data governance practices, containerization, infrastructure automation, and exposure to machine learning data pipelines or MLOps will be advantageous. Holding a Master's degree, relevant certifications, or contributions to open-source/data engineering communities will be a bonus.

Posted 2 weeks ago

Apply

8.0 - 12.0 years

10 - 15 Lacs

Pune

Work from Office

Key Responsibilities: Design and develop scalable applications using Python and AWS services. Debug and resolve production issues across complex distributed systems. Architect solutions aligned with business strategies and industry standards. Lead and mentor a team of India-based developers; guide career development. Ensure technical deliverables meet the highest standards of quality and performance. Research and integrate emerging technologies and processes into the development strategy. Document solutions in compliance with SDLC standards using defined templates. Assemble large, complex datasets based on functional and non-functional requirements. Handle operational issues and recommend improvements in the technology stack. Facilitate end-to-end platform integration across enterprise-level applications.

Required Skills (technical skills, cloud & architecture, tools & processes): Python; AWS (EC2, EKS, Glue, Lambda, S3, EMR, RDS, API Gateway); Terraform and CI/CD pipelines; data engineering; Step Functions, CloudFront, EventBridge, ARRFlow, Airflow (MWAA), QuickSight; debugging and troubleshooting; system integration; SDLC and documentation templates.

Qualifications: 10+ years of software development experience, preferably in financial/trading applications. 5+ years of people management and mentoring experience. Proven track record in technical leadership and architecture planning. Expertise in developing applications using Python and the AWS stack. Strong grasp of Terraform and automated CI/CD processes. Exceptional multitasking and prioritization capabilities.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Indore, Madhya Pradesh

On-site

As a Senior Data Scientist with 5+ years of experience, you will be responsible for designing and implementing models, mining data for insights, and interpreting complex data structures to drive business decision-making. Your expertise in machine learning, including areas such as NLP, machine vision, and time series, will be essential in this role. You will be expected to have strong skills in model tuning, model validation, and supervised and unsupervised learning, along with hands-on experience in model development, data preparation, training, and inference-ready deployment of models. Your proficiency in descriptive and inferential statistics, hypothesis testing, and data analysis will help in developing code for reproducible analysis of data.

Experience with AWS services like SageMaker, Lambda, Glue, Step Functions, and EC2 is necessary, along with knowledge of Databricks, the Anaconda distribution, and similar data science code development and deployment IDEs. Your familiarity with ML algorithms related to time series, natural language processing, optimization, object detection, topic modeling, clustering, and regression analysis will be highly valued. You should have expertise in Hive/Impala, Spark, Python, Pandas, Keras, scikit-learn, StatsModels, TensorFlow, and PyTorch. End-to-end model deployment and production experience of at least 1 year is required, along with a good understanding of model deployment on the Azure ML platform, Anaconda Enterprise, or AWS SageMaker. Basic knowledge of deep learning algorithms such as MaskedCNN and YOLO, and familiarity with visualization and analytics/reporting tools like Power BI, Tableau, and Alteryx, will be considered advantageous for this role.
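The model tuning and validation skills called out above follow a standard pattern; the toy scikit-learn sketch below shows that loop on synthetic data and is purely illustrative.

```python
# Toy illustration of model tuning and validation: hold-out split,
# cross-validated hyperparameter search, and evaluation on unseen data.
# Synthetic data stands in for real features.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import GridSearchCV, train_test_split

X, y = make_classification(n_samples=2000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

# Cross-validated hyperparameter tuning
search = GridSearchCV(
    RandomForestClassifier(random_state=42),
    param_grid={"n_estimators": [100, 300], "max_depth": [None, 10]},
    cv=5,
    scoring="f1",
)
search.fit(X_train, y_train)

# Validate the tuned model on the hold-out set
print("Best params:", search.best_params_)
print(classification_report(y_test, search.predict(X_test)))
```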

Posted 2 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

Karnataka

On-site

You should possess a Bachelor's/Master's degree in Computer Science/Computer Engineering or a related field. Along with this, you must have at least 2-6 years of experience in server-side development using languages like Golang, Node.js, or Python. Furthermore, it is essential to have proficiency in AWS services such as Lambda, DynamoDB, Step Functions, S3, etc., and hands-on experience in deploying and managing serverless service environments. Experience with Docker, containerization, and Kubernetes is also required for this role. In addition, knowledge of database technologies like MongoDB and DynamoDB, along with experience in CI/CD pipelines and automation, would be beneficial. Experience in video transcoding/streaming in the cloud would be considered a plus. Lastly, strong problem-solving skills are a must-have for this position.
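As an informal illustration of the serverless stack in this listing, here is a minimal Python Lambda handler that validates an API Gateway request body and writes it to DynamoDB; the table name and item schema are assumptions, not details from the posting.

```python
# Minimal serverless handler sketch (Python + boto3): validate a JSON body
# and persist it to a DynamoDB table. Table and field names are placeholders.
import json
import uuid
import boto3

table = boto3.resource("dynamodb").Table("example-items")  # placeholder table

def lambda_handler(event, context):
    try:
        body = json.loads(event.get("body") or "{}")
        item = {
            "pk": str(uuid.uuid4()),
            "name": body["name"],              # required field (assumed schema)
            "payload": body.get("payload", {}),
        }
    except (KeyError, json.JSONDecodeError):
        return {"statusCode": 400, "body": json.dumps({"error": "invalid request"})}

    table.put_item(Item=item)
    return {"statusCode": 201, "body": json.dumps({"id": item["pk"]})}
```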

Posted 2 weeks ago

Apply

6.0 - 10.0 years

10 - 20 Lacs

Pune

Hybrid

Must Have: Strong programming skills with over 6 years of experience. Proficient Java engineer with solid knowledge of Java, multithreading, and object-oriented programming (OOP) concepts. Tech-savvy and eager to learn new skills; actively tracks industry trends. Hands-on experience with AWS services including EC2, ECS, Lambda, API Gateway, SQS, DynamoDB, SNS, S3, Redis, and CloudWatch. Good understanding of event-driven architecture with practical experience using Kafka or similar tools. Strong grasp of web application fundamentals and experience in developing web services (REST or SOAP) within microservices architecture and domain-driven design. Familiar with core Spring Framework concepts such as IoC, DI, Spring Boot, Spring Security, and other Spring modules. Experience with API gateways like Apigee or similar platforms. Proficient in using tools such as Git, Jenkins, SonarQube, SignalFx, and others. Experience with centralized logging tools like Splunk or similar. Familiar with microservices and distributed system architectures.

Posted 2 weeks ago

Apply

10.0 - 14.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Senior Cloud Data Integration Consultant, you will be responsible for leading a complex data integration project that involves API frameworks, a data lakehouse architecture, and middleware solutions. The project focuses on technologies such as AWS, Snowflake, Oracle ERP, and Salesforce, with a high transaction volume POS system. Your role will involve building reusable and scalable API frameworks, optimizing middleware, and ensuring security and compliance in a multi-cloud environment.

Your expertise in API development and integration will be crucial for this project. You should have deep experience in managing APIs across multiple systems, building reusable components, and ensuring bidirectional data flow for real-time data synchronization. Additionally, your skills in middleware solutions and custom API adapters will be essential for integrating various systems seamlessly. In terms of cloud infrastructure and data processing, your strong experience with AWS services like S3, Lambda, Fargate, and Glue will be required for data processing, storage, and integration. You should also have hands-on experience in optimizing Snowflake for querying and reporting, as well as knowledge of Terraform for automating the provisioning and management of AWS resources.

Security and compliance are critical aspects of the project, and your deep understanding of cloud security protocols, API security, and compliance enforcement will be invaluable. You should be able to set up audit logs, ensure traceability, and enforce compliance across cloud services. Handling high-volume transaction systems and real-time data processing requirements will be part of your responsibilities. You should be familiar with optimizing AWS Lambda and Fargate for efficient data processing and be skilled in operational monitoring and error handling mechanisms. Collaboration and support are essential for the success of the project. You will need to provide post-go-live support, collaborate with internal teams and external stakeholders, and ensure seamless integration between systems.

To qualify for this role, you should have at least 10 years of experience in enterprise API integration, cloud architecture, and data management. Deep expertise in AWS services, Snowflake, Oracle ERP, and Salesforce integrations is required, along with a proven track record of delivering scalable API frameworks and handling complex middleware systems. Strong problem-solving skills, familiarity with containerization technologies, and experience in retail or e-commerce industries are also desirable. Your key responsibilities will include leading the design and implementation of reusable API frameworks, optimizing data flow through middleware systems, building robust security frameworks, and collaborating with the in-house team for seamless integration between systems. Ongoing support, monitoring, and optimization post-go-live will also be part of your role.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

20 - 35 Lacs

Kochi, Bengaluru

Work from Office

Job Summary: We are seeking a highly skilled and motivated Machine Learning Engineer with a strong foundation in programming and machine learning, hands-on experience with AWS Machine Learning services (especially SageMaker), and a solid understanding of Data Engineering and MLOps practices. You will be responsible for designing, developing, deploying, and maintaining scalable ML solutions in a cloud-native environment.

Key Responsibilities:
• Design and implement machine learning models and pipelines using AWS SageMaker and related services.
• Develop and maintain robust data pipelines for training and inference workflows.
• Collaborate with data scientists, engineers, and product teams to translate business requirements into ML solutions.
• Implement MLOps best practices including CI/CD for ML, model versioning, monitoring, and retraining strategies (a minimal experiment-tracking sketch follows this listing).
• Optimize model performance and ensure scalability and reliability in production environments.
• Monitor deployed models for drift, performance degradation, and anomalies.
• Document processes, architectures, and workflows for reproducibility and compliance.

Required Skills & Qualifications:
• Strong programming skills in Python and familiarity with ML libraries (e.g., scikit-learn, TensorFlow, PyTorch).
• Solid understanding of machine learning algorithms, model evaluation, and tuning.
• Hands-on experience with AWS ML services, especially SageMaker, S3, Lambda, Step Functions, and CloudWatch.
• Experience with data engineering tools (e.g., Apache Airflow, Spark, Glue) and workflow orchestration.
• Proficiency in MLOps tools and practices (e.g., MLflow, Kubeflow, CI/CD pipelines, Docker, Kubernetes).
• Familiarity with monitoring tools and logging frameworks for ML systems.
• Excellent problem-solving and communication skills.

Preferred Qualifications:
• AWS certification (e.g., AWS Certified Machine Learning - Specialty).
• Experience with real-time inference and streaming data.
• Knowledge of data governance, security, and compliance in ML systems.
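MLOps practices such as experiment tracking and model versioning are commonly handled with MLflow, one of the tools named above; the sketch below is a generic, illustrative example rather than this team's actual workflow, and the experiment name, parameters, and metric are placeholders.

```python
# Illustrative MLOps snippet: tracking a training run with MLflow.
# In a real pipeline this would wrap the SageMaker or Spark training step.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=500, n_features=10, noise=0.1, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

mlflow.set_experiment("example-forecast-model")  # placeholder experiment name

with mlflow.start_run():
    alpha = 0.5
    model = Ridge(alpha=alpha).fit(X_train, y_train)

    # Log hyperparameters, evaluation metrics, and the model artifact
    mlflow.log_param("alpha", alpha)
    mlflow.log_metric("r2", r2_score(y_test, model.predict(X_test)))
    mlflow.sklearn.log_model(model, "model")
```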

Posted 3 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Hyderabad, Telangana

On-site

Tezo is a new generation Digital & AI solutions provider, with a history of creating remarkable outcomes for our customers. We bring exceptional experiences using cutting-edge analytics, data proficiency, technology, and digital excellence.

Job Overview: The AWS Architect with Data Engineering Skills will be responsible for designing, implementing, and managing scalable, robust, and secure cloud infrastructure and data solutions on AWS. This role requires a deep understanding of AWS services, data engineering best practices, and the ability to translate business requirements into effective technical solutions.

Key Responsibilities: Architecture design: design and architect scalable, reliable, and secure AWS cloud infrastructure; develop and maintain architecture diagrams, documentation, and standards. Data engineering: design and implement ETL pipelines using AWS services such as Glue, Lambda, and Step Functions; build and manage data lakes and data warehouses using AWS services like S3, Redshift, and Athena; ensure data quality, data governance, and data security across all data platforms. AWS services management: utilize a wide range of AWS services (EC2, S3, RDS, Lambda, DynamoDB, etc.) to support various workloads and applications; implement and manage CI/CD pipelines using AWS CodePipeline, CodeBuild, and CodeDeploy; monitor and optimize the performance, cost, and security of AWS resources. Collaboration and communication: work closely with cross-functional teams including software developers, data scientists, and business stakeholders; provide technical guidance and mentorship to team members on best practices in AWS and data engineering. Security and compliance: ensure that all cloud solutions follow security best practices and comply with industry standards and regulations; implement and manage IAM policies, roles, and access controls. Innovation and improvement: stay up to date with the latest AWS services, features, and best practices; continuously evaluate and improve existing systems, processes, and architectures.
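To ground the Athena/S3 data-lake piece of this role, here is a hedged boto3 sketch of submitting a query and waiting for it to finish; the database name, result output location, and SQL are placeholders.

```python
# Rough sketch of querying an S3-backed data lake with Amazon Athena.
import time
import boto3

athena = boto3.client("athena")

def run_athena_query(sql: str) -> str:
    """Submit a query, block until it finishes, and return the execution id."""
    qid = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": "example_lake"},                 # placeholder
        ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
    )["QueryExecutionId"]

    while True:
        state = athena.get_query_execution(QueryExecutionId=qid)["QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(2)

    if state != "SUCCEEDED":
        raise RuntimeError(f"Athena query ended in state {state}")
    return qid

qid = run_athena_query("SELECT order_date, SUM(amount) FROM orders GROUP BY order_date")
rows = athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]
```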

Posted 3 weeks ago

Apply

0.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Ready to shape the future of work? At Genpact, we don't just adapt to change, we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's industry-first accelerator is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models onwards, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Senior Principal Consultant - Data Engineer. In this role, we are looking for candidates who have relevant years of experience in designing and developing machine learning and deep learning systems, who have professional software development experience, and who are hands-on in running machine learning tests and experiments and implementing appropriate ML algorithms.

Responsibilities: Drive the vision for a modern data and analytics platform to deliver well-architected and engineered data and analytics products leveraging the cloud tech stack and third-party products. Close the gap between ML research and production to create ground-breaking new products and features and solve problems for our customers. Design, develop, test, and deploy data pipelines, machine learning infrastructure, and client-facing products and services. Build and implement machine learning models and prototype solutions for proof-of-concept. Scale existing ML models into production on a variety of cloud platforms. Analyze and resolve architectural problems, working closely with engineering, data science, and operations teams. Design and develop data pipelines: create efficient data pipelines to collect, process, and store large volumes of data from various sources. Implement data solutions: develop and implement scalable data solutions using technologies like Hadoop, Spark, and SQL databases. Ensure data quality: monitor and improve data quality by implementing validation processes and error handling. Collaborate with teams: work closely with data scientists, analysts, and business stakeholders to understand data requirements and deliver solutions. Optimize performance: continuously optimize data systems for performance, scalability, and cost-effectiveness. Experience in a GenAI project.

Qualifications we seek in you!
Minimum Qualifications / Skills: Bachelor's degree in computer science engineering, information technology, or BSc in Computer Science, Mathematics, or a similar field; a Master's degree is a plus. Integration - APIs, microservices, and ETL/ELT patterns. DevOps (good to have) - Ansible, Jenkins, ELK. Containerization - Docker, Kubernetes, etc. Orchestration - Airflow, Step Functions, Ctrl-M, etc. Languages and scripting - Python, Scala, Java, etc. Cloud services - AWS, GCP, Azure, and cloud native. Analytics and ML tooling - SageMaker, ML Studio. Execution paradigm - low latency/streaming, batch.

Preferred Qualifications / Skills: Data platforms - Big Data (Hadoop, Spark, Hive, Kafka, etc.) and Data Warehouse (Teradata, Redshift, BigQuery, Snowflake, etc.). Visualization tools - Power BI, Tableau.

Why join Genpact? Be a transformation leader - work at the cutting edge of AI, automation, and digital innovation. Make an impact - drive change for global enterprises and solve business challenges that matter. Accelerate your career - get hands-on experience, mentorship, and continuous learning opportunities. Work with the best - join 140,000+ bold thinkers and problem-solvers who push boundaries every day. Thrive in a values-driven culture - our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress. Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Posted 3 weeks ago

Apply

0.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Ready to shape the future of work? At Genpact, we don't just adapt to change, we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's industry-first accelerator is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models onwards, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Principal Consultant, AI Engineer! In this role, we are looking for candidates who have relevant years of experience in designing and developing machine learning and deep learning systems, who have professional software development experience, and who are hands-on in running machine learning tests and experiments and implementing appropriate AI algorithms with GenAI.

Responsibilities: Drive the vision for a modern data and analytics platform to deliver well-architected and engineered data and analytics products leveraging the cloud tech stack and third-party products. Close the gap between AI research and production to create ground-breaking new products and features and solve problems for our customers with GenAI. Design, develop, test, and deploy data pipelines, machine learning infrastructure, and client-facing products and services. Build and implement machine learning models and prototype solutions for proof-of-concept. Scale existing AI models into production on a variety of cloud platforms with GenAI. Analyze and resolve architectural problems, working closely with engineering, data science, and operations teams.

Qualifications we seek in you!
Minimum Qualifications / Skills: Bachelor's degree in computer science engineering, information technology, or BSc in Computer Science, Mathematics, or a similar field; a Master's degree is a plus. Integration - APIs, microservices, and ETL/ELT patterns. DevOps (good to have) - Ansible, Jenkins, ELK. Containerization - Docker, Kubernetes, etc. Orchestration - Airflow, Step Functions, Ctrl-M, etc. Languages and scripting - Python, Scala, Java, etc. Cloud services - AWS, GCP, Azure, and cloud native. Analytics and AI tooling - SageMaker, GenAI. Execution paradigm - low latency/streaming, batch. Ensure GenAI outputs are contextually relevant; familiarity with Generative AI technologies; design and implement GenAI solutions; collaborate with service line teams to design, implement, and manage GenAI solutions.

Preferred Qualifications / Skills: Data platforms - Big Data (Hadoop, Spark, Hive, Kafka, etc.) and Data Warehouse (Teradata, Redshift, BigQuery, Snowflake, etc.). AI and GenAI tools. Certifications in AI/ML or GenAI. Familiarity with generative models, prompt engineering, and fine-tuning techniques to develop innovative AI solutions. Designing, developing, and implementing solutions tailored to meet client needs. Understanding business requirements and translating them into technical solutions using GenAI.

Why join Genpact? Be a transformation leader - work at the cutting edge of AI, automation, and digital innovation. Make an impact - drive change for global enterprises and solve business challenges that matter. Accelerate your career - get hands-on experience, mentorship, and continuous learning opportunities. Work with the best - join 140,000+ bold thinkers and problem-solvers who push boundaries every day. Thrive in a values-driven culture - our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress. Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Posted 3 weeks ago

Apply

4.0 - 6.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Job Title: AWS Developer

About the Company/Team: Oracle FSGIU's Finergy division is a specialized team dedicated to the Banking, Financial Services, and Insurance (BFSI) industry, offering deep domain expertise and innovative solutions. With a focus on accelerated implementation, Finergy helps financial institutions rapidly deploy multi-channel platforms, ensuring an exceptional customer experience. Our team provides end-to-end banking solutions, leveraging integrated analytics and dashboards for improved efficiency. Finergy's consulting services offer strategic guidance, aligning technology investments with business objectives.

Job Summary: We are on the lookout for a skilled AWS Developer with 4-6 years of experience to design and build cutting-edge applications on the Amazon Web Services (AWS) platform. The ideal candidate will have hands-on expertise in developing serverless and containerized applications, integrating various AWS services, and ensuring the performance, security, and scalability of cloud-native solutions.

Key Responsibilities: Design and develop scalable applications using AWS Lambda, API Gateway, and other AWS services, focusing on serverless architecture. Build and manage RESTful APIs, integrating with Amazon DynamoDB, RDS, and S3 for data storage and management. Implement Infrastructure as Code (IaC) using CloudFormation or Terraform to provision and manage AWS resources. Set up and maintain CI/CD pipelines using AWS CodePipeline, CodeBuild, and CodeDeploy for efficient software delivery. Automate workflows and background processes using Step Functions, SQS, and SNS for enhanced application functionality. Utilize CloudWatch, X-Ray, and CloudTrail for logging, monitoring, and troubleshooting, ensuring application health. Implement security measures using IAM roles, KMS, and Secrets Manager to protect sensitive data. Collaborate closely with DevOps, testers, and product owners in an Agile environment to deliver high-quality solutions.

Qualifications & Skills - Mandatory: 4-6 years of software development experience, including at least 2 years in AWS development. Proficiency in Node.js, Python, or Java for backend development. In-depth knowledge of AWS services: Lambda, API Gateway, S3, DynamoDB, RDS, IAM, SNS/SQS. Hands-on experience with CI/CD pipelines and version control systems like Git, GitHub, or Bitbucket. Understanding of containerization (Docker) and familiarity with Amazon ECS or EKS. Scripting skills using Bash, Python, or the AWS CLI for automation. Awareness of cloud security best practices, cost optimization techniques, and performance tuning.

Good-to-Have: AWS certification: AWS Certified Developer - Associate or AWS Certified Solutions Architect - Associate. Experience with microservices, serverless computing, and event-driven architecture. Exposure to multi-cloud or hybrid cloud environments. Strong communication and collaboration skills, with a problem-solving mindset.

Self-Assessment Questions: Describe a serverless application you developed on AWS. What services did you use, and how did you ensure scalability and security? Explain your experience with CI/CD pipelines on AWS. How have you utilized CodePipeline, CodeBuild, and CodeDeploy to automate the deployment process? Share your approach to monitoring and troubleshooting AWS-based applications. What tools do you use, and how do you identify and resolve issues? Discuss a scenario where you implemented security measures using AWS IAM and other security services.
Career Level - IC2
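One of the security practices listed above, retrieving credentials from AWS Secrets Manager instead of hard-coding them, looks roughly like the sketch below; the secret name and key layout are assumptions.

```python
# Hedged example: pull a database credential from AWS Secrets Manager at
# runtime. The secret id and its JSON layout are placeholders.
import json
import boto3

def get_db_credentials(secret_id: str = "example/app/db") -> dict:
    client = boto3.client("secretsmanager")
    response = client.get_secret_value(SecretId=secret_id)
    # Secrets created from key/value pairs come back as a JSON string
    return json.loads(response["SecretString"])

creds = get_db_credentials()
# creds might look like {"username": "...", "password": "...", "host": "..."}
```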

Posted 3 weeks ago

Apply

4.0 - 8.0 years

20 - 25 Lacs

Bengaluru

Hybrid

Job Title: AWS Engineer Experience: 4 - 8 Years Location: Bengaluru (Hybrid 2- 3 Days Onsite per Week) Employment Type: Full-Time Notice Period: Only Immediate to 15 Days Joiners Preferred Job Description: We are looking for an experienced AWS Engineer to join our dynamic data engineering team. The ideal candidate will have hands-on experience building and maintaining robust, scalable data pipelines and cloud-based architectures on AWS. Key Responsibilities: Design, develop, and maintain scalable data pipelines using AWS services such as Glue, Lambda, S3, Redshift, and EMR Collaborate with data scientists and ML engineers to operationalize machine learning models using AWS SageMaker Implement efficient data transformation and feature engineering workflows Optimize ETL/ELT processes and enforce best practices for data quality and governance Work with structured and unstructured data using Amazon Athena, DynamoDB, RDS, and similar services Build and manage CI/CD pipelines for data and ML workflows using AWS CodePipeline, CodeBuild, and Step Functions Monitor data infrastructure for performance, reliability, and cost-effectiveness Ensure data security and compliance with organizational and regulatory standards Required Skills: Strong experience with AWS data and ML services Solid knowledge of ETL/ELT frameworks and data modeling Proficiency in Python, SQL, and scripting for data engineering Experience with CI/CD and DevOps practices on AWS Good understanding of data governance and compliance standards Excellent collaboration and problem-solving skills
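Operationalizing a SageMaker model, as mentioned above, typically ends with pipeline code invoking a deployed endpoint; the sketch below is illustrative only, and the endpoint name and CSV payload format depend entirely on how the model was actually deployed.

```python
# Illustrative call to an already-deployed SageMaker endpoint from a data
# pipeline. Endpoint name and payload format are placeholders.
import boto3

runtime = boto3.client("sagemaker-runtime")

def score(features: list[float], endpoint: str = "example-churn-endpoint") -> str:
    payload = ",".join(str(v) for v in features)
    response = runtime.invoke_endpoint(
        EndpointName=endpoint,
        ContentType="text/csv",
        Body=payload,
    )
    return response["Body"].read().decode("utf-8")

print(score([0.42, 3.0, 17.5]))
```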

Posted 3 weeks ago

Apply

8.0 - 13.0 years

14 - 24 Lacs

Hyderabad

Work from Office

Roles and Responsibilities: Lead the backend development for our AI-based product, driving architectural decisions and hands-on implementation. Design and develop scalable, secure, and maintainable APIs using AWS Lambda and API Gateway. Build and maintain CI/CD pipelines using AWS-native tools (CodePipeline, CodeBuild) and GitHub. Collaborate with frontend developers (React/MUI) to ensure seamless integration between frontend and backend systems. Work closely with AWS and infrastructure teams to implement best practices in performance, security, and cost optimization. Review code, provide technical guidance to junior developers, and drive high engineering standards. Participate in sprint planning, estimations, and cross-functional discussions.

An Ideal Candidate would have: Strong programming skills in Python, with experience building production-grade applications. Proven experience with AWS Lambda, API Gateway, and other serverless components. Deep understanding of RESTful API design and development. Hands-on experience in setting up CI/CD pipelines using AWS services and GitHub. Familiarity with event-driven architectures, cloud deployments, and security best practices. Experience in working with Agile/Scrum methodologies. Strong communication and leadership skills to coordinate across cross-functional teams.

Good-to-Have: Exposure to AI/ML pipelines, vector databases, or model-serving workflows. Experience with AWS Step Functions, DynamoDB, S3, CloudWatch, and CloudFormation. Knowledge of observability tools (e.g., X-Ray, Prometheus, Grafana). Familiarity with frontend architecture and integration patterns.

Experience: 8+ years with at least 2 years in a lead capacity. Location: Hyderabad, India. Role: Full Time. Salary: Competitive

Posted 1 month ago

Apply

6.0 - 11.0 years

27 - 35 Lacs

Chennai, Bengaluru, Mumbai (All Areas)

Hybrid

Job Title: Senior Python Backend Developer - AWS Serverless & Event-Driven Architecture

Job Description: We are seeking an experienced Python Backend Developer with expertise in asynchronous programming and AWS serverless architecture to design and develop scalable, event-driven microservices.

Key Responsibilities: Develop APIs using FastAPI, Flask, or Django (async views). Design and implement event-driven microservices using AWS Lambda, API Gateway, DynamoDB (GSI/LSI), EventBridge, Step Functions, SNS, and SQS. Apply API standards with Pydantic, OAuth2/JWT, and rate limiting. Build resilient, idempotent services with observability using AWS X-Ray, CloudWatch, DLQs, and retries. Optimize DynamoDB schemas, TTLs, and streams.

Requirements: 4+ years of backend development experience with Python. Strong expertise in the AWS serverless stack.
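A minimal sketch of the FastAPI plus Pydantic style this posting describes, with persistence stubbed out so the example stays self-contained; the route and model fields are illustrative, not part of the job description.

```python
# Minimal FastAPI sketch: async endpoints with Pydantic (v2) validation.
# The in-memory dict stands in for DynamoDB or another data store.
from uuid import uuid4

from fastapi import FastAPI, HTTPException
from pydantic import BaseModel, Field

app = FastAPI()

class OrderIn(BaseModel):
    customer_id: str = Field(..., min_length=1)
    amount: float = Field(..., gt=0)

_FAKE_DB: dict[str, dict] = {}  # stand-in for a real table

@app.post("/orders", status_code=201)
async def create_order(order: OrderIn) -> dict:
    order_id = str(uuid4())
    _FAKE_DB[order_id] = order.model_dump()
    return {"order_id": order_id, **_FAKE_DB[order_id]}

@app.get("/orders/{order_id}")
async def get_order(order_id: str) -> dict:
    if order_id not in _FAKE_DB:
        raise HTTPException(status_code=404, detail="order not found")
    return _FAKE_DB[order_id]
```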

Posted 1 month ago

Apply

0.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose, the relentless pursuit of a world that works better for people, we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Senior Principal Consultant, AIML Engineer! In this role, we are looking for candidates who have relevant years of experience in designing and developing machine learning and deep learning systems, who have professional software development experience, and who are hands-on in running machine learning tests and experiments and implementing appropriate ML algorithms.

Responsibilities: Drive the vision for a modern data and analytics platform to deliver well-architected and engineered data and analytics products leveraging the cloud tech stack and third-party products. Close the gap between ML research and production to create ground-breaking new products and features and solve problems for our customers. Design, develop, test, and deploy data pipelines, machine learning infrastructure, and client-facing products and services. Build and implement machine learning models and prototype solutions for proof-of-concept. Scale existing ML models into production on a variety of cloud platforms. Analyze and resolve architectural problems, working closely with engineering, data science, and operations teams.

Qualifications we seek in you!
Minimum Qualifications / Skills: Bachelor's degree in computer science engineering, information technology, or BSc in Computer Science, Mathematics, or a similar field; a Master's degree is a plus. Integration - APIs, microservices, and ETL/ELT patterns. DevOps (good to have) - Ansible, Jenkins, ELK. Containerization - Docker, Kubernetes, etc. Orchestration - Airflow, Step Functions, Ctrl-M, etc. Languages and scripting - Python, Scala, Java, etc. Cloud services - AWS, GCP, Azure, and cloud native. Analytics and ML tooling - SageMaker, ML Studio. Execution paradigm - low latency/streaming, batch.

Preferred Qualifications / Skills: Data platforms - Big Data (Hadoop, Spark, Hive, Kafka, etc.) and Data Warehouse (Teradata, Redshift, BigQuery, Snowflake, etc.). Visualization tools - Power BI, Tableau.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Posted 1 month ago

Apply

5.0 - 10.0 years

20 - 30 Lacs

Pune, Chennai, Bengaluru

Work from Office

Mandatory key skills: Athena, Step Functions, Spark/PySpark, ETL fundamentals, SQL (basic + advanced), Glue, Python, Lambda, data warehousing, EBS/EFS, AWS EC2, Lake Formation, Aurora, S3, modern data platform fundamentals, PL/SQL, CloudFront.

We are looking for an experienced AWS Data Engineer to design, build, and manage robust, scalable, and high-performance data pipelines and data platforms on AWS. The ideal candidate will have a strong foundation in ETL fundamentals, data modeling, and modern data architecture, with hands-on expertise across a broad spectrum of AWS services including Athena, Glue, Step Functions, Lambda, S3, and Lake Formation.

Key Responsibilities: Design and implement scalable ETL/ELT pipelines using AWS Glue, Spark (PySpark), and Step Functions. Work with structured and semi-structured data using Athena, S3, and Lake Formation to enable efficient querying and access control. Develop and deploy serverless data processing solutions using AWS Lambda and integrate them into pipeline orchestration. Perform advanced SQL and PL/SQL development for data transformation, analysis, and performance tuning. Build data lakes and data warehouses using S3, Aurora, and Athena. Implement data governance, security, and access control strategies using AWS tools including Lake Formation, CloudFront, EBS/EFS, and IAM. Develop and maintain metadata, lineage, and data cataloging capabilities. Participate in data modeling exercises for both OLTP and OLAP environments. Work closely with data scientists, analysts, and business stakeholders to understand data requirements and deliver actionable insights. Monitor, debug, and optimize data pipelines for reliability and performance.

Required Skills & Experience: Strong experience with AWS data services: Glue, Athena, Step Functions, Lambda, Lake Formation, S3, EC2, Aurora, EBS/EFS, CloudFront. Proficient in PySpark, Python, SQL (basic and advanced), and PL/SQL. Solid understanding of ETL/ELT processes and data warehousing concepts. Familiarity with modern data platform fundamentals and distributed data processing. Experience in data modeling (conceptual, logical, physical) for analytical and operational use cases. Experience with orchestration and workflow management tools within AWS. Strong debugging and performance tuning skills across the data stack.

Posted 1 month ago

Apply

5.0 - 10.0 years

22 - 37 Lacs

Pune, Gurugram, Bengaluru

Hybrid

Experience: 5-8 years (Lead - 23 LPA), 8-10 years (Senior Lead - 35 LPA), 10+ years (Architect - 42 LPA) max
Location: Bangalore as 1st preference; we can also go for Hyderabad, Chennai, Pune, Gurgaon
Notice: Immediate to max 15 days joiner
Mode of Work: Hybrid

Job Description: Athena, Step Functions, Spark - PySpark, ETL Fundamentals, SQL (Basic + Advanced), Glue, Python, Lambda, Data Warehousing, EBS/EFS, AWS EC2, Lake Formation, Aurora, S3, Modern Data Platform Fundamentals, PL/SQL, CloudFront

We are looking for an experienced AWS Data Engineer to design, build, and manage robust, scalable, and high-performance data pipelines and data platforms on AWS. The ideal candidate will have a strong foundation in ETL fundamentals, data modeling, and modern data architecture, with hands-on expertise across a broad spectrum of AWS services including Athena, Glue, Step Functions, Lambda, S3, and Lake Formation.

Key Responsibilities:
Design and implement scalable ETL/ELT pipelines using AWS Glue, Spark (PySpark), and Step Functions.
Work with structured and semi-structured data using Athena, S3, and Lake Formation to enable efficient querying and access control.
Develop and deploy serverless data processing solutions using AWS Lambda and integrate them into pipeline orchestration.
Perform advanced SQL and PL/SQL development for data transformation, analysis, and performance tuning.
Build data lakes and data warehouses using S3, Aurora, and Athena.
Implement data governance, security, and access control strategies using AWS tools including Lake Formation, CloudFront, EBS/EFS, and IAM.
Develop and maintain metadata, lineage, and data cataloging capabilities.
Participate in data modeling exercises for both OLTP and OLAP environments.
Work closely with data scientists, analysts, and business stakeholders to understand data requirements and deliver actionable insights.
Monitor, debug, and optimize data pipelines for reliability and performance.

Required Skills & Experience:
Strong experience with AWS data services: Glue, Athena, Step Functions, Lambda, Lake Formation, S3, EC2, Aurora, EBS/EFS, CloudFront.
Proficient in PySpark, Python, SQL (basic and advanced), and PL/SQL.
Solid understanding of ETL/ELT processes and data warehousing concepts.
Familiarity with modern data platform fundamentals and distributed data processing.
Experience in data modeling (conceptual, logical, physical) for analytical and operational use cases.
Experience with orchestration and workflow management tools within AWS.
Strong debugging and performance tuning skills across the data stack.
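To give a concrete flavour of the serverless Lambda-plus-Athena work both of these AWS Data Engineer listings describe, here is a minimal boto3 sketch of a Lambda handler that runs an Athena query over curated S3 data and polls for completion. The database, table, and results-bucket names are illustrative assumptions, not details from the posting.

```python
# Minimal sketch: a Lambda handler that runs an Athena query and waits for it
# to finish. Database, table, and result-bucket names are illustrative
# assumptions, not values from the posting.
import time

import boto3

athena = boto3.client("athena")

QUERY = """
    SELECT ingest_date, COUNT(*) AS txn_count
    FROM transactions
    GROUP BY ingest_date
    ORDER BY ingest_date
"""


def lambda_handler(event, context):
    execution = athena.start_query_execution(
        QueryString=QUERY,
        QueryExecutionContext={"Database": "curated_db"},
        ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
    )
    query_id = execution["QueryExecutionId"]

    # Poll until the query reaches a terminal state (fine for short queries).
    while True:
        state = athena.get_query_execution(QueryExecutionId=query_id)[
            "QueryExecution"
        ]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(2)

    return {"queryExecutionId": query_id, "state": state}
```

For long-running queries, orchestrating the wait from Step Functions rather than polling inside Lambda is generally preferable; the loop is kept inline here only for brevity.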

Posted 1 month ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Bengaluru

Work from Office

The Core AI BI & Data Platforms Team has been established to create, operate, and run the Enterprise AI, BI, and Data platforms that reduce time to market for reporting, analytics, and data science teams to run experiments, train models, and generate insights, as well as to evolve and run the CoCounsel application and its shared capability, the CoCounsel AI Assistant. The Enterprise Data Platform aims to provide self-service capabilities for fast and secure ingestion and consumption of data across TR.

At Thomson Reuters, we are recruiting a team of motivated Cloud professionals to transform how we build, manage, and leverage our data assets. The Data Platform team in Bangalore is seeking an experienced Software Engineer with a passion for engineering cloud-based data platform systems. Join our dynamic team as a Software Engineer and take a pivotal role in shaping the future of our Enterprise Data Platform. You will develop and implement data processing applications and frameworks on cloud-based infrastructure, ensuring the efficiency, scalability, and reliability of our systems.

About the Role
In this opportunity as the Software Engineer, you will:
Develop data processing applications and frameworks on cloud-based infrastructure in partnership with Data Analysts and Architects, with guidance from the Lead Software Engineer.
Innovate with new approaches to meet data management requirements.
Make recommendations about platform adoption, including technology integrations, application servers, libraries, AWS frameworks, documentation, and usability by stakeholders.
Contribute to improving the customer experience.
Participate in code reviews to maintain a high-quality codebase.
Collaborate with cross-functional teams to define, design, and ship new features.
Work closely with product owners, designers, and other developers to understand requirements and deliver solutions.
Effectively communicate and liaise across the data platform & management teams.
Stay updated on emerging trends and technologies in cloud computing.

About You
You're a fit for the role of Software Engineer if you meet all or most of these criteria:
Bachelor's degree in Computer Science, Engineering, or a related field.
3+ years of relevant experience in the implementation of data lakes and data management technologies for large-scale organizations.
Experience in building and maintaining data pipelines with excellent run-time characteristics such as low latency, fault tolerance, and high availability.
Proficient in the Python programming language.
Experience in AWS services and management, including serverless, container, queueing, and monitoring services such as Lambda, ECS, API Gateway, RDS, DynamoDB, Glue, S3, IAM, Step Functions, CloudWatch, SQS, SNS.
Good knowledge of consuming and building APIs.
Business intelligence tools like PowerBI.
Fluency in querying languages such as SQL.
Solid understanding of software development practices such as version control via Git, CI/CD, and release management.
Agile development cadence.
Good critical thinking, communication, documentation, troubleshooting, and collaborative skills.
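For a concrete picture of the serverless and queueing stack this listing names (Lambda, SQS, DynamoDB), below is a minimal sketch of an SQS-triggered Lambda handler that persists incoming messages to a DynamoDB table. The table name, message shape, and key attribute are illustrative assumptions, not details from the posting.

```python
# Minimal sketch of an SQS-triggered Lambda handler writing records to
# DynamoDB. Table name, message shape, and key attribute are illustrative
# assumptions, not values from the posting.
import json

import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("example-ingestion-events")


def lambda_handler(event, context):
    # An SQS-triggered Lambda receives a batch of messages under "Records".
    records = event.get("Records", [])
    for record in records:
        payload = json.loads(record["body"])

        # Upsert keyed on the event id carried in the message body.
        table.put_item(
            Item={
                "event_id": payload["event_id"],
                "source": payload.get("source", "unknown"),
                "raw_payload": record["body"],
            }
        )

    return {"processed": len(records)}
```

A real deployment would add partial-batch failure handling and a dead-letter queue; they are omitted here to keep the sketch short.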

Posted 1 month ago

Apply
