7.0 years
40 Lacs
Nashik (Maharashtra), Kanpur (Uttar Pradesh), Kolkata (West Bengal), Bhubaneswar (Odisha), and Cuttack (Odisha), India
Remote
Experience: 7.00+ years
Salary: INR 4,000,000.00 per year (based on experience)
Expected Notice Period: 15 days
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-time permanent position (payroll and compliance managed by MatchMove)
(Note: This is a requirement for one of Uplers' clients, MatchMove.)

What do you need for this opportunity?
Must-have skills: GenAI, AWS data stack, Kinesis, open table formats, PySpark, stream processing, Kafka, MySQL, Python

MatchMove is looking for a Technical Lead - Data Platform.

As Technical Lead - Data Platform, you will architect, implement, and scale our end-to-end data platform built on AWS S3, Glue, Lake Formation, and DMS. You will lead a small team of engineers while working cross-functionally with stakeholders from fraud, finance, product, and engineering to enable reliable, timely, and secure data access across the business. You will champion best practices in data design, governance, and observability, while leveraging GenAI tools to improve engineering productivity and accelerate time to insight.

You will contribute to:
- Owning the design and scalability of the data lake architecture for both streaming and batch workloads, leveraging AWS-native services.
- Leading the development of ingestion, transformation, and storage pipelines using AWS Glue, DMS, Kinesis/Kafka, and PySpark.
- Structuring and evolving data into open table formats (Apache Iceberg, Delta Lake) to support real-time and time-travel queries for downstream services (a PySpark sketch of this pattern follows this posting).
- Driving data productization, enabling API-first and self-service access to curated datasets for fraud detection, reconciliation, and reporting use cases.
- Defining and tracking SLAs and SLOs for critical data pipelines, ensuring high availability and data accuracy in a regulated fintech environment.
- Collaborating with InfoSec, SRE, and Data Governance teams to enforce data security, lineage tracking, access control, and compliance (GDPR, MAS TRM).
- Using generative AI tools to enhance developer productivity, including auto-generating test harnesses, schema documentation, transformation scaffolds, and performance insights.
- Mentoring data engineers, setting technical direction, and ensuring delivery of high-quality, observable data pipelines.

Responsibilities:
- Architect scalable, cost-optimized pipelines across real-time and batch paradigms, using tools such as AWS Glue, Step Functions, Airflow, or EMR.
- Manage ingestion from transactional sources using AWS DMS, with a focus on schema-drift handling and low-latency replication.
- Design efficient partitioning, compression, and metadata strategies for Iceberg or Hudi tables stored in S3 and cataloged with Glue and Lake Formation.
- Build data marts, audit views, and analytics layers that support both machine-driven processes (e.g., fraud engines) and human-readable interfaces (e.g., dashboards).
- Ensure robust data observability with metrics, alerting, and lineage tracking via OpenLineage or Great Expectations.
- Lead quarterly reviews of data cost, performance, schema evolution, and architecture design with stakeholders and senior leadership.
- Enforce version control, CI/CD, and infrastructure-as-code practices using GitOps and tools like Terraform.

Requirements:
- At least 7 years of experience in data engineering.
- Deep hands-on experience with the AWS data stack: Glue (Jobs and Crawlers), S3, Athena, Lake Formation, DMS, and Redshift Spectrum.
- Expertise in designing data pipelines for real-time, streaming, and batch systems, including schema design, format optimization, and SLAs.
- Strong programming skills in Python (PySpark) and advanced SQL for analytical processing and transformation.
- Proven experience managing data architectures using open table formats (Iceberg, Delta Lake, Hudi) at scale.
- Understanding of stream processing with Kinesis/Kafka and orchestration via Airflow or Step Functions.
- Experience implementing data access controls, encryption policies, and compliance workflows in regulated environments.
- Ability to integrate GenAI tools into data engineering processes to drive measurable productivity and quality gains, with strong engineering hygiene.
- Demonstrated ability to lead teams, drive architectural decisions, and collaborate with cross-functional stakeholders.

Brownie points:
- Experience working in a PCI DSS or other central-bank-regulated environment with audit logging and data retention requirements.
- Experience in the payments or banking domain, with use cases around reconciliation, chargeback analysis, or fraud detection.
- Familiarity with data contracts, data mesh patterns, and data-as-a-product principles.
- Experience using GenAI to automate data documentation, generate data tests, or support reconciliation use cases.
- Exposure to performance tuning and cost optimization strategies in AWS Glue, Athena, and S3.
- Experience building data platforms for ML/AI teams or integrating with model feature stores.

Engagement model:
- Direct placement with the client
- This is a remote role
- Shift timings: 10 AM to 7 PM

How to apply for this opportunity?
Step 1: Click on Apply, then register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of being shortlisted and meeting the client for the interview.

About Uplers:
Our goal is to make hiring reliable, simple, and fast. Our role is to help our talents find and apply for relevant opportunities on the portal and progress in their careers, and we will support any grievances or challenges they may face during the engagement. (Note: there are many more opportunities on the portal; depending on the assessments you clear, you can apply for those as well.)

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
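For a concrete picture of the pipeline work this posting describes, here is a minimal PySpark sketch of one such step: merging change records landed by AWS DMS into a Glue-cataloged Apache Iceberg table on S3. It is a sketch under stated assumptions, not MatchMove's actual code; the bucket, catalog, table, and column names are placeholders, and the job assumes the Iceberg Spark runtime and AWS bundle jars are on the classpath.

```python
# Hypothetical CDC-to-Iceberg upsert step; all names below are placeholders.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("cdc-to-iceberg")
    # Iceberg's SQL extensions are required for MERGE INTO support.
    .config("spark.sql.extensions",
            "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
    # Register an Iceberg catalog ("lake") backed by the AWS Glue Data Catalog.
    .config("spark.sql.catalog.lake", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.lake.catalog-impl",
            "org.apache.iceberg.aws.glue.GlueCatalog")
    .config("spark.sql.catalog.lake.io-impl",
            "org.apache.iceberg.aws.s3.S3FileIO")
    .config("spark.sql.catalog.lake.warehouse", "s3://example-data-lake/warehouse/")
    .getOrCreate()
)

# A batch of change records replicated by AWS DMS (path is hypothetical).
spark.read.parquet(
    "s3://example-data-lake/raw/dms/payments/transactions/"
).createOrReplaceTempView("cdc_batch")

# Keep only the newest record per key so the MERGE below is deterministic
# even when one batch carries several changes to the same row.
spark.sql("""
    SELECT * FROM (
        SELECT *, ROW_NUMBER() OVER (PARTITION BY transaction_id
                                     ORDER BY commit_ts DESC) AS rn
        FROM cdc_batch
    ) ranked WHERE rn = 1
""").drop("rn").createOrReplaceTempView("cdc_latest")

# MERGE keeps re-runs idempotent, and every commit is an Iceberg snapshot,
# which is what makes time-travel reads of this table possible.
spark.sql("""
    MERGE INTO lake.payments.transactions AS t
    USING cdc_latest AS s
    ON t.transaction_id = s.transaction_id
    WHEN MATCHED THEN UPDATE SET *
    WHEN NOT MATCHED THEN INSERT *
""")
```

The partitioning and compaction strategy for the target table (for example, by event date) would typically be fixed at table creation and later evolved through Iceberg's ALTER TABLE partition-evolution support.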
Posted 1 day ago
7.0 years
40 Lacs
Kanpur (Uttar Pradesh), Nashik (Maharashtra), Kolkata (West Bengal), Cuttack (Odisha), Bhubaneswar (Odisha), Guwahati (Assam), and Ranchi (Jharkhand), India
Remote
Experience: 7.00+ years
Salary: INR 4,000,000.00 per year (based on experience)
Expected Notice Period: 15 days
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-time permanent position (payroll and compliance managed by MatchMove)
(Note: This is a requirement for one of Uplers' clients, MatchMove.)

What do you need for this opportunity?
Must-have skills: AWS Q, CodeWhisperer, GenAI, CI/CD, containerization, Go, microservices, RESTful APIs, MySQL, PHP, PostgreSQL

MatchMove is looking for a Technical Lead - Backend.

As a Technical Lead (Backend), you will play a pivotal role in shaping the engineering foundation for a robust, real-time, cross-border payment platform. You will write clean, secure, and scalable Go services powering billions in financial flows, while championing engineering excellence and thoughtful platform design.

You will contribute to:
- Developing and scaling distributed payment transaction systems for cross-border and domestic remittance use cases.
- Designing resilient microservices in Go for high-volume, low-latency transaction flows with regional compliance and localization.
- Owning service-level metrics such as SLA adherence, latency (p95/p99), throughput, and availability.
- Building API-first products with strong documentation, mocks, and observability from day one.
- Enabling faster, safer development by leveraging generative AI for test generation, documentation, and repetitive coding tasks, while maintaining engineering hygiene.
- Mentoring a high-performing, globally distributed engineering team and contributing to code reviews, design sessions, and cross-team collaboration.

Responsibilities:
- Lead design and development of backend services in Go with concurrency, memory safety, and observability in mind.
- Manage service uptime and reliability across multi-region deployments via dashboards, tracing, and alerting.
- Maintain strict SLAs for mission-critical payment operations and support incident response during SLA violations.
- Profile and optimize Go services using tools like pprof, benchstat, and the Go race detector.
- Drive code quality through test-driven development, code reviews, and API-first workflows (OpenAPI/Swagger).
- Collaborate cross-functionally with Product, QA, DevOps, Compliance, and Business to ensure production readiness.
- Maintain well-documented service boundaries and internal libraries for scalable engineering velocity.
- Encourage strategic use of generative AI for API mocking, test data generation, schema validation, and static analysis.
- Advocate for clean architecture, technical-debt remediation, and security best practices such as rate limiting, mTLS, and context timeouts (a Go sketch combining several of these practices follows this posting).

Requirements:
- At least 7 years of engineering experience with deep expertise in Go (Golang).
- Expert-level understanding of concurrency, goroutines, channels, synchronization primitives, and distributed coordination patterns.
- Strong grasp of profiling and debugging Go applications, memory management, and performance tuning.
- Proven experience instrumenting production systems for SLAs/SLIs with tools like Prometheus, Grafana, or OpenTelemetry.
- Solid experience with PostgreSQL/MySQL, schema design for high-consistency systems, and the transaction lifecycle in financial services.
- Experience building, documenting, and scaling RESTful APIs in an API-first platform environment.
- Comfort with cloud-native tooling, containerization, and DevOps workflows (CI/CD, blue-green deployment, rollback strategies).
- Demonstrated understanding of observability practices: structured logging, distributed tracing, and alerting workflows.

Brownie points:
- Experience in payments, card issuance, or remittance infrastructure.
- Working knowledge of PHP (for legacy systems).
- Contributions to Go open-source projects or public technical content.
- Experience with GenAI development tools like AWS Q or CodeWhisperer in a team setting.
- Track record of delivering high-quality services in regulated environments with audit, compliance, and security mandates.

Engagement model:
- Direct placement with the client
- This is a remote role
- Shift timings: 10 AM to 7 PM

How to apply for this opportunity?
Step 1: Click on Apply, then register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of being shortlisted and meeting the client for the interview.

About Uplers:
Our goal is to make hiring reliable, simple, and fast. Our role is to help our talents find and apply for relevant opportunities on the portal and progress in their careers, and we will support any grievances or challenges they may face during the engagement. (Note: there are many more opportunities on the portal; depending on the assessments you clear, you can apply for those as well.)

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
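The requirements above name several concrete Go practices: per-request context timeouts, Prometheus latency histograms for p95/p99 tracking, and pprof for profiling. Below is a minimal, hypothetical sketch wiring the three together in one small HTTP service. It is not MatchMove's code; the route name, port, and the 2-second budget are invented for illustration.

```go
// Hypothetical payment-service skeleton; names and budgets are placeholders.
package main

import (
	"context"
	"log"
	"net/http"
	_ "net/http/pprof" // registers /debug/pprof/* handlers on the default mux
	"time"

	"github.com/prometheus/client_golang/prometheus"
	"github.com/prometheus/client_golang/prometheus/promhttp"
)

// Latency histogram: p95/p99 for SLO dashboards are derived from these buckets.
var reqLatency = prometheus.NewHistogramVec(
	prometheus.HistogramOpts{
		Name:    "http_request_duration_seconds",
		Help:    "Request latency by route.",
		Buckets: prometheus.DefBuckets,
	},
	[]string{"route"},
)

func payout(w http.ResponseWriter, r *http.Request) {
	// Bound the whole request so a slow dependency cannot pin a goroutine
	// (and a payment) open past the latency budget.
	ctx, cancel := context.WithTimeout(r.Context(), 2*time.Second)
	defer cancel()

	start := time.Now()
	defer func() { // observe when the handler returns, not at defer time
		reqLatency.WithLabelValues("/payout").Observe(time.Since(start).Seconds())
	}()

	select {
	case <-time.After(50 * time.Millisecond): // stand-in for downstream work
		w.Write([]byte("ok\n"))
	case <-ctx.Done(): // deadline exceeded or client disconnected
		http.Error(w, ctx.Err().Error(), http.StatusGatewayTimeout)
	}
}

func main() {
	prometheus.MustRegister(reqLatency)
	http.HandleFunc("/payout", payout)
	http.Handle("/metrics", promhttp.Handler()) // scraped by Prometheus
	// The default mux also serves the pprof endpoints registered above.
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```

From here, Grafana panels and alerts built with histogram_quantile(0.99, ...) over the exported buckets would give the p95/p99 SLO tracking and SLA-violation alerting the posting describes.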
Posted 1 day ago
7.0 years
40 Lacs
Nashik, Maharashtra, India
Remote
Experience : 7.00 + years Salary : INR 4000000.00 / year (based on experience) Expected Notice Period : 15 Days Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full Time Permanent position(Payroll and Compliance to be managed by: MatchMove) (*Note: This is a requirement for one of Uplers' client - MatchMove) What do you need for this opportunity? Must have skills required: AWS Q, CodeWhisperer, Gen AI, CI/CD, contenarization, Go, microservices, RESTful API, MySQL, PHP, PostgreSQL MatchMove is Looking for: As a Technical Lead (Backend ), you will play a pivotal role in shaping the engineering foundation for a robust, real-time, cross-border payment platform. You’ll be writing clean, secure, and scalable Go services powering billions in financial flows, while championing engineering excellence and thoughtful platform design. You will contribute to:: Developing and scaling distributed payment transaction systems for cross-border and domestic remittance use cases. Designing resilient microservices in Go for high-volume, low-latency transaction flows with regional compliance and localization. Owning service-level metrics such as SLA adherence, latency (p95/p99), throughput, and availability. Building API-first products with strong documentation, mocks, and observability from day one. Enabling faster, safer development by leveraging Generative AI for test generation, documentation, and repetitive coding tasks — while maintaining engineering hygiene. Mentoring a high-performing, globally distributed engineering team and contributing to code reviews, design sessions, and cross-team collaboration. Responsibilities Lead design and development of backend services in Go with concurrency, memory safety, and observability in mind. Manage service uptime and reliability across multi-region deployments via dashboards, tracing, and alerting. Maintain strict SLAs for mission-critical payment operations and support incident response during SLA violations. Profile and optimize Go services using tools like pprof, benchstat, and the Go race detector. Drive code quality through test-driven development, code reviews, and API-first workflows (OpenAPI / Swagger). Collaborate cross-functionally with Product, QA, DevOps, Compliance, and Business to ensure production-readiness. Maintain well-documented service boundaries and internal libraries for scalable engineering velocity. Encourage strategic use of Generative AI for API mocking, test data generation, schema validation, and static analysis. Advocate for clean architecture, technical debt remediation, and security best practices (e.g., rate limiting, mTLS, context timeouts). Requirements Atleast 7 years of engineering experience with deep expertise in Go (Golang). Expert-level understanding of concurrency, goroutines, channels, synchronization primitives, and distributed coordination patterns Strong grasp of profiling and debugging Go applications, memory management, and performance tuning. Proven experience in instrumenting production systems for SLAs/SLIs with tools like Prometheus, Grafana, or OpenTelemetry. Solid experience with PostgreSQL / MySQL, schema design for high-consistency systems, and transaction lifecycle in financial services. Experience building, documenting, and scaling RESTful APIs in an API-first platform environment. Comfort with cloud-native tooling, containerization, and DevOps workflows (CI/CD, blue-green deployment, rollback strategies). 
Demonstrated understanding of observability practices: structured logging, distributed tracing, and alerting workflows. Brownie Points Experience in payments, card issuance, or remittance infrastructure. Working knowledge of PHP (for legacy systems). Contributions to Go open-source projects or public technical content. Experience with GenAI development tools like AWS Q , CodeWhisperer in a team setting Track record of delivering high-quality services in regulated environments with audit, compliance, and security mandates. Engagement Model:: Direct placement with client This is remote role Shift timings: :10 AM to 7 PM How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you! Show more Show less
Posted 1 day ago
7.0 years
40 Lacs
Kolkata, West Bengal, India
Remote
Experience : 7.00 + years Salary : INR 4000000.00 / year (based on experience) Expected Notice Period : 15 Days Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full Time Permanent position(Payroll and Compliance to be managed by: MatchMove) (*Note: This is a requirement for one of Uplers' client - MatchMove) What do you need for this opportunity? Must have skills required: AWS Q, CodeWhisperer, Gen AI, CI/CD, contenarization, Go, microservices, RESTful API, MySQL, PHP, PostgreSQL MatchMove is Looking for: As a Technical Lead (Backend ), you will play a pivotal role in shaping the engineering foundation for a robust, real-time, cross-border payment platform. You’ll be writing clean, secure, and scalable Go services powering billions in financial flows, while championing engineering excellence and thoughtful platform design. You will contribute to:: Developing and scaling distributed payment transaction systems for cross-border and domestic remittance use cases. Designing resilient microservices in Go for high-volume, low-latency transaction flows with regional compliance and localization. Owning service-level metrics such as SLA adherence, latency (p95/p99), throughput, and availability. Building API-first products with strong documentation, mocks, and observability from day one. Enabling faster, safer development by leveraging Generative AI for test generation, documentation, and repetitive coding tasks — while maintaining engineering hygiene. Mentoring a high-performing, globally distributed engineering team and contributing to code reviews, design sessions, and cross-team collaboration. Responsibilities Lead design and development of backend services in Go with concurrency, memory safety, and observability in mind. Manage service uptime and reliability across multi-region deployments via dashboards, tracing, and alerting. Maintain strict SLAs for mission-critical payment operations and support incident response during SLA violations. Profile and optimize Go services using tools like pprof, benchstat, and the Go race detector. Drive code quality through test-driven development, code reviews, and API-first workflows (OpenAPI / Swagger). Collaborate cross-functionally with Product, QA, DevOps, Compliance, and Business to ensure production-readiness. Maintain well-documented service boundaries and internal libraries for scalable engineering velocity. Encourage strategic use of Generative AI for API mocking, test data generation, schema validation, and static analysis. Advocate for clean architecture, technical debt remediation, and security best practices (e.g., rate limiting, mTLS, context timeouts). Requirements Atleast 7 years of engineering experience with deep expertise in Go (Golang). Expert-level understanding of concurrency, goroutines, channels, synchronization primitives, and distributed coordination patterns Strong grasp of profiling and debugging Go applications, memory management, and performance tuning. Proven experience in instrumenting production systems for SLAs/SLIs with tools like Prometheus, Grafana, or OpenTelemetry. Solid experience with PostgreSQL / MySQL, schema design for high-consistency systems, and transaction lifecycle in financial services. Experience building, documenting, and scaling RESTful APIs in an API-first platform environment. Comfort with cloud-native tooling, containerization, and DevOps workflows (CI/CD, blue-green deployment, rollback strategies). 
Demonstrated understanding of observability practices: structured logging, distributed tracing, and alerting workflows. Brownie Points Experience in payments, card issuance, or remittance infrastructure. Working knowledge of PHP (for legacy systems). Contributions to Go open-source projects or public technical content. Experience with GenAI development tools like AWS Q , CodeWhisperer in a team setting Track record of delivering high-quality services in regulated environments with audit, compliance, and security mandates. Engagement Model:: Direct placement with client This is remote role Shift timings: :10 AM to 7 PM How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you! Show more Show less
Posted 1 day ago
7.0 years
40 Lacs
Kolkata, West Bengal, India
Remote
Experience : 7.00 + years Salary : INR 4000000.00 / year (based on experience) Expected Notice Period : 15 Days Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full Time Permanent position(Payroll and Compliance to be managed by: MatchMove) (*Note: This is a requirement for one of Uplers' client - MatchMove) What do you need for this opportunity? Must have skills required: Gen AI, AWS data stack, Kinesis, open table format, Pyspark, stream processing, Kafka, MySQL, Python MatchMove is Looking for: Technical Lead - Data Platform Data, you will architect, implement, and scale our end-to-end data platform built on AWS S3, Glue, Lake Formation, and DMS. You will lead a small team of engineers while working cross-functionally with stakeholders from fraud, finance, product, and engineering to enable reliable, timely, and secure data access across the business. You will champion best practices in data design, governance, and observability, while leveraging GenAI tools to improve engineering productivity and accelerate time to insight. You will contribute to Owning the design and scalability of the data lake architecture for both streaming and batch workloads, leveraging AWS-native services. Leading the development of ingestion, transformation, and storage pipelines using AWS Glue, DMS, Kinesis/Kafka, and PySpark. Structuring and evolving data into OTF formats (Apache Iceberg, Delta Lake) to support real-time and time-travel queries for downstream services. Driving data productization, enabling API-first and self-service access to curated datasets for fraud detection, reconciliation, and reporting use cases. Defining and tracking SLAs and SLOs for critical data pipelines, ensuring high availability and data accuracy in a regulated fintech environment. Collaborating with InfoSec, SRE, and Data Governance teams to enforce data security, lineage tracking, access control, and compliance (GDPR, MAS TRM). Using Generative AI tools to enhance developer productivity — including auto-generating test harnesses, schema documentation, transformation scaffolds, and performance insights. Mentoring data engineers, setting technical direction, and ensuring delivery of high-quality, observable data pipelines. Responsibilities:: Architect scalable, cost-optimized pipelines across real-time and batch paradigms, using tools such as AWS Glue, Step Functions, Airflow, or EMR. Manage ingestion from transactional sources using AWS DMS, with a focus on schema drift handling and low-latency replication. Design efficient partitioning, compression, and metadata strategies for Iceberg or Hudi tables stored in S3, and cataloged with Glue and Lake Formation. Build data marts, audit views, and analytics layers that support both machine-driven processes (e.g. fraud engines) and human-readable interfaces (e.g. dashboards). Ensure robust data observability with metrics, alerting, and lineage tracking via OpenLineage or Great Expectations. Lead quarterly reviews of data cost, performance, schema evolution, and architecture design with stakeholders and senior leadership. Enforce version control, CI/CD, and infrastructure-as-code practices using GitOps and tools like Terraform. Requirements At-least 7 years of experience in data engineering. Deep hands-on experience with AWS data stack: Glue (Jobs & Crawlers), S3, Athena, Lake Formation, DMS, and Redshift Spectrum Expertise in designing data pipelines for real-time, streaming, and batch systems, including schema design, format optimization, and SLAs. 
Strong programming skills in Python (PySpark) and advanced SQL for analytical processing and transformation. Proven experience managing data architectures using open table formats (Iceberg, Delta Lake, Hudi) at scale Understanding of stream processing with Kinesis/Kafka and orchestration via Airflow or Step Functions. Experience implementing data access controls, encryption policies, and compliance workflows in regulated environments. Ability to integrate GenAI tools into data engineering processes to drive measurable productivity and quality gains — with strong engineering hygiene. Demonstrated ability to lead teams, drive architectural decisions, and collaborate with cross-functional stakeholders. Brownie Points:: Experience working in a PCI DSS or any other central bank regulated environment with audit logging and data retention requirements. Experience in the payments or banking domain, with use cases around reconciliation, chargeback analysis, or fraud detection. Familiarity with data contracts, data mesh patterns, and data as a product principles. Experience using GenAI to automate data documentation, generate data tests, or support reconciliation use cases. Exposure to performance tuning and cost optimization strategies in AWS Glue, Athena, and S3. Experience building data platforms for ML/AI teams or integrating with model feature stores. Engagement Model: : Direct placement with client This is remote role Shift timings ::10 AM to 7 PM How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you! Show more Show less
Posted 1 day ago
7.0 years
40 Lacs
Bhubaneswar, Odisha, India
Remote
Experience : 7.00 + years Salary : INR 4000000.00 / year (based on experience) Expected Notice Period : 15 Days Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full Time Permanent position(Payroll and Compliance to be managed by: MatchMove) (*Note: This is a requirement for one of Uplers' client - MatchMove) What do you need for this opportunity? Must have skills required: Gen AI, AWS data stack, Kinesis, open table format, Pyspark, stream processing, Kafka, MySQL, Python MatchMove is Looking for: Technical Lead - Data Platform Data, you will architect, implement, and scale our end-to-end data platform built on AWS S3, Glue, Lake Formation, and DMS. You will lead a small team of engineers while working cross-functionally with stakeholders from fraud, finance, product, and engineering to enable reliable, timely, and secure data access across the business. You will champion best practices in data design, governance, and observability, while leveraging GenAI tools to improve engineering productivity and accelerate time to insight. You will contribute to Owning the design and scalability of the data lake architecture for both streaming and batch workloads, leveraging AWS-native services. Leading the development of ingestion, transformation, and storage pipelines using AWS Glue, DMS, Kinesis/Kafka, and PySpark. Structuring and evolving data into OTF formats (Apache Iceberg, Delta Lake) to support real-time and time-travel queries for downstream services. Driving data productization, enabling API-first and self-service access to curated datasets for fraud detection, reconciliation, and reporting use cases. Defining and tracking SLAs and SLOs for critical data pipelines, ensuring high availability and data accuracy in a regulated fintech environment. Collaborating with InfoSec, SRE, and Data Governance teams to enforce data security, lineage tracking, access control, and compliance (GDPR, MAS TRM). Using Generative AI tools to enhance developer productivity — including auto-generating test harnesses, schema documentation, transformation scaffolds, and performance insights. Mentoring data engineers, setting technical direction, and ensuring delivery of high-quality, observable data pipelines. Responsibilities:: Architect scalable, cost-optimized pipelines across real-time and batch paradigms, using tools such as AWS Glue, Step Functions, Airflow, or EMR. Manage ingestion from transactional sources using AWS DMS, with a focus on schema drift handling and low-latency replication. Design efficient partitioning, compression, and metadata strategies for Iceberg or Hudi tables stored in S3, and cataloged with Glue and Lake Formation. Build data marts, audit views, and analytics layers that support both machine-driven processes (e.g. fraud engines) and human-readable interfaces (e.g. dashboards). Ensure robust data observability with metrics, alerting, and lineage tracking via OpenLineage or Great Expectations. Lead quarterly reviews of data cost, performance, schema evolution, and architecture design with stakeholders and senior leadership. Enforce version control, CI/CD, and infrastructure-as-code practices using GitOps and tools like Terraform. Requirements At-least 7 years of experience in data engineering. Deep hands-on experience with AWS data stack: Glue (Jobs & Crawlers), S3, Athena, Lake Formation, DMS, and Redshift Spectrum Expertise in designing data pipelines for real-time, streaming, and batch systems, including schema design, format optimization, and SLAs. 
Strong programming skills in Python (PySpark) and advanced SQL for analytical processing and transformation. Proven experience managing data architectures using open table formats (Iceberg, Delta Lake, Hudi) at scale Understanding of stream processing with Kinesis/Kafka and orchestration via Airflow or Step Functions. Experience implementing data access controls, encryption policies, and compliance workflows in regulated environments. Ability to integrate GenAI tools into data engineering processes to drive measurable productivity and quality gains — with strong engineering hygiene. Demonstrated ability to lead teams, drive architectural decisions, and collaborate with cross-functional stakeholders. Brownie Points:: Experience working in a PCI DSS or any other central bank regulated environment with audit logging and data retention requirements. Experience in the payments or banking domain, with use cases around reconciliation, chargeback analysis, or fraud detection. Familiarity with data contracts, data mesh patterns, and data as a product principles. Experience using GenAI to automate data documentation, generate data tests, or support reconciliation use cases. Exposure to performance tuning and cost optimization strategies in AWS Glue, Athena, and S3. Experience building data platforms for ML/AI teams or integrating with model feature stores. Engagement Model: : Direct placement with client This is remote role Shift timings ::10 AM to 7 PM How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you! Show more Show less
Posted 1 day ago
7.0 years
40 Lacs
Cuttack, Odisha, India
Remote
Experience : 7.00 + years Salary : INR 4000000.00 / year (based on experience) Expected Notice Period : 15 Days Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full Time Permanent position(Payroll and Compliance to be managed by: MatchMove) (*Note: This is a requirement for one of Uplers' client - MatchMove) What do you need for this opportunity? Must have skills required: AWS Q, CodeWhisperer, Gen AI, CI/CD, contenarization, Go, microservices, RESTful API, MySQL, PHP, PostgreSQL MatchMove is Looking for: As a Technical Lead (Backend ), you will play a pivotal role in shaping the engineering foundation for a robust, real-time, cross-border payment platform. You’ll be writing clean, secure, and scalable Go services powering billions in financial flows, while championing engineering excellence and thoughtful platform design. You will contribute to:: Developing and scaling distributed payment transaction systems for cross-border and domestic remittance use cases. Designing resilient microservices in Go for high-volume, low-latency transaction flows with regional compliance and localization. Owning service-level metrics such as SLA adherence, latency (p95/p99), throughput, and availability. Building API-first products with strong documentation, mocks, and observability from day one. Enabling faster, safer development by leveraging Generative AI for test generation, documentation, and repetitive coding tasks — while maintaining engineering hygiene. Mentoring a high-performing, globally distributed engineering team and contributing to code reviews, design sessions, and cross-team collaboration. Responsibilities Lead design and development of backend services in Go with concurrency, memory safety, and observability in mind. Manage service uptime and reliability across multi-region deployments via dashboards, tracing, and alerting. Maintain strict SLAs for mission-critical payment operations and support incident response during SLA violations. Profile and optimize Go services using tools like pprof, benchstat, and the Go race detector. Drive code quality through test-driven development, code reviews, and API-first workflows (OpenAPI / Swagger). Collaborate cross-functionally with Product, QA, DevOps, Compliance, and Business to ensure production-readiness. Maintain well-documented service boundaries and internal libraries for scalable engineering velocity. Encourage strategic use of Generative AI for API mocking, test data generation, schema validation, and static analysis. Advocate for clean architecture, technical debt remediation, and security best practices (e.g., rate limiting, mTLS, context timeouts). Requirements Atleast 7 years of engineering experience with deep expertise in Go (Golang). Expert-level understanding of concurrency, goroutines, channels, synchronization primitives, and distributed coordination patterns Strong grasp of profiling and debugging Go applications, memory management, and performance tuning. Proven experience in instrumenting production systems for SLAs/SLIs with tools like Prometheus, Grafana, or OpenTelemetry. Solid experience with PostgreSQL / MySQL, schema design for high-consistency systems, and transaction lifecycle in financial services. Experience building, documenting, and scaling RESTful APIs in an API-first platform environment. Comfort with cloud-native tooling, containerization, and DevOps workflows (CI/CD, blue-green deployment, rollback strategies). 
Demonstrated understanding of observability practices: structured logging, distributed tracing, and alerting workflows. Brownie Points Experience in payments, card issuance, or remittance infrastructure. Working knowledge of PHP (for legacy systems). Contributions to Go open-source projects or public technical content. Experience with GenAI development tools like AWS Q , CodeWhisperer in a team setting Track record of delivering high-quality services in regulated environments with audit, compliance, and security mandates. Engagement Model:: Direct placement with client This is remote role Shift timings: :10 AM to 7 PM How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you! Show more Show less
Posted 1 day ago
7.0 years
40 Lacs
Cuttack, Odisha, India
Remote
Experience : 7.00 + years Salary : INR 4000000.00 / year (based on experience) Expected Notice Period : 15 Days Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full Time Permanent position(Payroll and Compliance to be managed by: MatchMove) (*Note: This is a requirement for one of Uplers' client - MatchMove) What do you need for this opportunity? Must have skills required: Gen AI, AWS data stack, Kinesis, open table format, Pyspark, stream processing, Kafka, MySQL, Python MatchMove is Looking for: Technical Lead - Data Platform Data, you will architect, implement, and scale our end-to-end data platform built on AWS S3, Glue, Lake Formation, and DMS. You will lead a small team of engineers while working cross-functionally with stakeholders from fraud, finance, product, and engineering to enable reliable, timely, and secure data access across the business. You will champion best practices in data design, governance, and observability, while leveraging GenAI tools to improve engineering productivity and accelerate time to insight. You will contribute to Owning the design and scalability of the data lake architecture for both streaming and batch workloads, leveraging AWS-native services. Leading the development of ingestion, transformation, and storage pipelines using AWS Glue, DMS, Kinesis/Kafka, and PySpark. Structuring and evolving data into OTF formats (Apache Iceberg, Delta Lake) to support real-time and time-travel queries for downstream services. Driving data productization, enabling API-first and self-service access to curated datasets for fraud detection, reconciliation, and reporting use cases. Defining and tracking SLAs and SLOs for critical data pipelines, ensuring high availability and data accuracy in a regulated fintech environment. Collaborating with InfoSec, SRE, and Data Governance teams to enforce data security, lineage tracking, access control, and compliance (GDPR, MAS TRM). Using Generative AI tools to enhance developer productivity — including auto-generating test harnesses, schema documentation, transformation scaffolds, and performance insights. Mentoring data engineers, setting technical direction, and ensuring delivery of high-quality, observable data pipelines. Responsibilities:: Architect scalable, cost-optimized pipelines across real-time and batch paradigms, using tools such as AWS Glue, Step Functions, Airflow, or EMR. Manage ingestion from transactional sources using AWS DMS, with a focus on schema drift handling and low-latency replication. Design efficient partitioning, compression, and metadata strategies for Iceberg or Hudi tables stored in S3, and cataloged with Glue and Lake Formation. Build data marts, audit views, and analytics layers that support both machine-driven processes (e.g. fraud engines) and human-readable interfaces (e.g. dashboards). Ensure robust data observability with metrics, alerting, and lineage tracking via OpenLineage or Great Expectations. Lead quarterly reviews of data cost, performance, schema evolution, and architecture design with stakeholders and senior leadership. Enforce version control, CI/CD, and infrastructure-as-code practices using GitOps and tools like Terraform. Requirements At-least 7 years of experience in data engineering. Deep hands-on experience with AWS data stack: Glue (Jobs & Crawlers), S3, Athena, Lake Formation, DMS, and Redshift Spectrum Expertise in designing data pipelines for real-time, streaming, and batch systems, including schema design, format optimization, and SLAs. 
Strong programming skills in Python (PySpark) and advanced SQL for analytical processing and transformation. Proven experience managing data architectures using open table formats (Iceberg, Delta Lake, Hudi) at scale Understanding of stream processing with Kinesis/Kafka and orchestration via Airflow or Step Functions. Experience implementing data access controls, encryption policies, and compliance workflows in regulated environments. Ability to integrate GenAI tools into data engineering processes to drive measurable productivity and quality gains — with strong engineering hygiene. Demonstrated ability to lead teams, drive architectural decisions, and collaborate with cross-functional stakeholders. Brownie Points:: Experience working in a PCI DSS or any other central bank regulated environment with audit logging and data retention requirements. Experience in the payments or banking domain, with use cases around reconciliation, chargeback analysis, or fraud detection. Familiarity with data contracts, data mesh patterns, and data as a product principles. Experience using GenAI to automate data documentation, generate data tests, or support reconciliation use cases. Exposure to performance tuning and cost optimization strategies in AWS Glue, Athena, and S3. Experience building data platforms for ML/AI teams or integrating with model feature stores. Engagement Model: : Direct placement with client This is remote role Shift timings ::10 AM to 7 PM How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you! Show more Show less
Posted 1 day ago
7.0 years
40 Lacs
Bhubaneswar, Odisha, India
Remote
Experience : 7.00 + years Salary : INR 4000000.00 / year (based on experience) Expected Notice Period : 15 Days Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full Time Permanent position(Payroll and Compliance to be managed by: MatchMove) (*Note: This is a requirement for one of Uplers' client - MatchMove) What do you need for this opportunity? Must have skills required: AWS Q, CodeWhisperer, Gen AI, CI/CD, contenarization, Go, microservices, RESTful API, MySQL, PHP, PostgreSQL MatchMove is Looking for: As a Technical Lead (Backend ), you will play a pivotal role in shaping the engineering foundation for a robust, real-time, cross-border payment platform. You’ll be writing clean, secure, and scalable Go services powering billions in financial flows, while championing engineering excellence and thoughtful platform design. You will contribute to:: Developing and scaling distributed payment transaction systems for cross-border and domestic remittance use cases. Designing resilient microservices in Go for high-volume, low-latency transaction flows with regional compliance and localization. Owning service-level metrics such as SLA adherence, latency (p95/p99), throughput, and availability. Building API-first products with strong documentation, mocks, and observability from day one. Enabling faster, safer development by leveraging Generative AI for test generation, documentation, and repetitive coding tasks — while maintaining engineering hygiene. Mentoring a high-performing, globally distributed engineering team and contributing to code reviews, design sessions, and cross-team collaboration. Responsibilities Lead design and development of backend services in Go with concurrency, memory safety, and observability in mind. Manage service uptime and reliability across multi-region deployments via dashboards, tracing, and alerting. Maintain strict SLAs for mission-critical payment operations and support incident response during SLA violations. Profile and optimize Go services using tools like pprof, benchstat, and the Go race detector. Drive code quality through test-driven development, code reviews, and API-first workflows (OpenAPI / Swagger). Collaborate cross-functionally with Product, QA, DevOps, Compliance, and Business to ensure production-readiness. Maintain well-documented service boundaries and internal libraries for scalable engineering velocity. Encourage strategic use of Generative AI for API mocking, test data generation, schema validation, and static analysis. Advocate for clean architecture, technical debt remediation, and security best practices (e.g., rate limiting, mTLS, context timeouts). Requirements Atleast 7 years of engineering experience with deep expertise in Go (Golang). Expert-level understanding of concurrency, goroutines, channels, synchronization primitives, and distributed coordination patterns Strong grasp of profiling and debugging Go applications, memory management, and performance tuning. Proven experience in instrumenting production systems for SLAs/SLIs with tools like Prometheus, Grafana, or OpenTelemetry. Solid experience with PostgreSQL / MySQL, schema design for high-consistency systems, and transaction lifecycle in financial services. Experience building, documenting, and scaling RESTful APIs in an API-first platform environment. Comfort with cloud-native tooling, containerization, and DevOps workflows (CI/CD, blue-green deployment, rollback strategies). 
Demonstrated understanding of observability practices: structured logging, distributed tracing, and alerting workflows. Brownie Points Experience in payments, card issuance, or remittance infrastructure. Working knowledge of PHP (for legacy systems). Contributions to Go open-source projects or public technical content. Experience with GenAI development tools like AWS Q , CodeWhisperer in a team setting Track record of delivering high-quality services in regulated environments with audit, compliance, and security mandates. Engagement Model:: Direct placement with client This is remote role Shift timings: :10 AM to 7 PM How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you! Show more Show less
Posted 1 day ago
7.0 years
40 Lacs
Guwahati, Assam, India
Remote
Experience: 7.00+ years
Salary: INR 4,000,000 per year (based on experience)
Expected Notice Period: 15 days
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-time permanent position (payroll and compliance to be managed by MatchMove)
(Note: This is a requirement for one of Uplers' clients, MatchMove.)

What do you need for this opportunity?
Must-have skills: AWS Q, CodeWhisperer, Gen AI, CI/CD, containerization, Go, microservices, RESTful API, MySQL, PHP, PostgreSQL

MatchMove is looking for: As a Technical Lead (Backend), you will play a pivotal role in shaping the engineering foundation for a robust, real-time, cross-border payment platform. You will write clean, secure, and scalable Go services powering billions in financial flows, while championing engineering excellence and thoughtful platform design.

You will contribute to:
Developing and scaling distributed payment transaction systems for cross-border and domestic remittance use cases.
Designing resilient microservices in Go for high-volume, low-latency transaction flows with regional compliance and localization.
Owning service-level metrics such as SLA adherence, latency (p95/p99), throughput, and availability.
Building API-first products with strong documentation, mocks, and observability from day one.
Enabling faster, safer development by leveraging Generative AI for test generation, documentation, and repetitive coding tasks, while maintaining engineering hygiene.
Mentoring a high-performing, globally distributed engineering team and contributing to code reviews, design sessions, and cross-team collaboration.

Responsibilities:
Lead design and development of backend services in Go with concurrency, memory safety, and observability in mind.
Manage service uptime and reliability across multi-region deployments via dashboards, tracing, and alerting.
Maintain strict SLAs for mission-critical payment operations and support incident response during SLA violations.
Profile and optimize Go services using tools like pprof, benchstat, and the Go race detector.
Drive code quality through test-driven development, code reviews, and API-first workflows (OpenAPI/Swagger).
Collaborate cross-functionally with Product, QA, DevOps, Compliance, and Business to ensure production readiness.
Maintain well-documented service boundaries and internal libraries for scalable engineering velocity.
Encourage strategic use of Generative AI for API mocking, test data generation, schema validation, and static analysis.
Advocate for clean architecture, technical debt remediation, and security best practices (e.g., rate limiting, mTLS, context timeouts).

Requirements:
At least 7 years of engineering experience with deep expertise in Go (Golang).
Expert-level understanding of concurrency, goroutines, channels, synchronization primitives, and distributed coordination patterns.
Strong grasp of profiling and debugging Go applications, memory management, and performance tuning.
Proven experience in instrumenting production systems for SLAs/SLIs with tools like Prometheus, Grafana, or OpenTelemetry.
Solid experience with PostgreSQL/MySQL, schema design for high-consistency systems, and transaction lifecycles in financial services.
Experience building, documenting, and scaling RESTful APIs in an API-first platform environment.
Comfort with cloud-native tooling, containerization, and DevOps workflows (CI/CD, blue-green deployment, rollback strategies).
Demonstrated understanding of observability practices: structured logging, distributed tracing, and alerting workflows.

Brownie Points:
Experience in payments, card issuance, or remittance infrastructure.
Working knowledge of PHP (for legacy systems).
Contributions to Go open-source projects or public technical content.
Experience with GenAI development tools such as AWS Q and CodeWhisperer in a team setting.
Track record of delivering high-quality services in regulated environments with audit, compliance, and security mandates.

Engagement Model: Direct placement with the client. This is a remote role. Shift timings: 10 AM to 7 PM IST.

How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of getting shortlisted and meeting the client for the interview.

About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role is to help all our talent find and apply for relevant opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal; depending on the assessments you clear, you can apply for those as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
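The backend listing above names p95/p99 latency among the service-level metrics the lead would own. As a small, self-contained illustration of what those figures are, here is a Python sketch computing them from invented request timings; the sample data is synthetic and purely for illustration.

```python
import random
import statistics

# Invented sample: 1,000 request latencies in milliseconds.
random.seed(42)
latencies_ms = [random.lognormvariate(3.0, 0.5) for _ in range(1000)]

# statistics.quantiles with n=100 returns the 99 cut points between percentiles,
# so index 94 is the 95th percentile and index 98 is the 99th.
cuts = statistics.quantiles(latencies_ms, n=100)
p95, p99 = cuts[94], cuts[98]

print(f"p50={statistics.median(latencies_ms):.1f}ms  p95={p95:.1f}ms  p99={p99:.1f}ms")
```

In production these figures would come from a metrics backend such as Prometheus rather than an in-process list, but the percentile definition is the same.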
Posted 1 day ago
7.0 years
40 Lacs
Ranchi, Jharkhand, India
Remote
Technical Lead (Backend), Go at MatchMove (via Uplers). Remote, full-time permanent. Description, requirements, and application steps are identical to the Technical Lead (Backend) listing above.
Posted 1 day ago
7.0 years
40 Lacs
Guwahati, Assam, India
Remote
Technical Lead, Data Platform at MatchMove (via Uplers). Remote, full-time permanent. Description, requirements, and application steps are identical to the Technical Lead, Data Platform listing above.
Posted 1 day ago
7.0 years
40 Lacs
Jamshedpur, Jharkhand, India
Remote
Technical Lead (Backend), Go at MatchMove (via Uplers). Remote, full-time permanent. Description, requirements, and application steps are identical to the Technical Lead (Backend) listing above.
Posted 1 day ago
7.0 years
40 Lacs
Raipur, Chhattisgarh, India
Remote
Technical Lead (Backend), Go at MatchMove (via Uplers). Remote, full-time permanent. Description, requirements, and application steps are identical to the Technical Lead (Backend) listing above.
Posted 1 day ago
7.0 years
40 Lacs
Jamshedpur, Jharkhand, India
Remote
Technical Lead, Data Platform at MatchMove (via Uplers). Remote, full-time permanent. Description, requirements, and application steps are identical to the Technical Lead, Data Platform listing above.
Posted 1 day ago
7.0 years
40 Lacs
Raipur, Chhattisgarh, India
Remote
Technical Lead, Data Platform at MatchMove (via Uplers). Remote, full-time permanent. Description, requirements, and application steps are identical to the Technical Lead, Data Platform listing above.
Posted 1 day ago
7.0 years
40 Lacs
Ranchi, Jharkhand, India
Remote
Technical Lead, Data Platform at MatchMove (via Uplers). Remote, full-time permanent. Description, requirements, and application steps are identical to the Technical Lead, Data Platform listing above.
Posted 1 day ago
7.0 years
40 Lacs
Amritsar, Punjab, India
Remote
Technical Lead (Backend), Go at MatchMove (via Uplers). Remote, full-time permanent. Description, requirements, and application steps are identical to the Technical Lead (Backend) listing above.
Posted 1 day ago
7.0 years
40 Lacs
Amritsar, Punjab, India
Remote
Technical Lead, Data Platform at MatchMove (via Uplers). Remote, full-time permanent. Description, requirements, and application steps are identical to the Technical Lead, Data Platform listing above.
Posted 1 day ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Key Responsibilities:
Design, implement, and manage CI/CD pipelines for rapid and reliable product delivery.
Automate infrastructure provisioning using tools like Terraform, CloudFormation, or Ansible.
Monitor and maintain system performance, reliability, and scalability.
Manage cloud infrastructure (AWS, Azure, or GCP) with a focus on cost, performance, and security.
Implement and maintain logging, monitoring, and alerting solutions (e.g., Prometheus, Grafana, ELK, Datadog).
Ensure infrastructure security best practices, including secrets management, access controls, and compliance.
Collaborate with development teams to ensure DevOps best practices are followed across the lifecycle.
Troubleshoot production issues and lead root cause analysis.
Support containerization and orchestration using Docker and Kubernetes.

Requirements:
Bachelor's degree in Computer Science, Engineering, or a related field.
5+ years of experience in a DevOps, SRE, or similar role in a product-focused environment.
Proficiency with CI/CD tools such as Jenkins, GitLab CI, or CircleCI.
Strong experience with AWS, Azure, or Google Cloud.
Hands-on experience with infrastructure as code (Terraform, Ansible, etc.).
Solid understanding of containerization (Docker) and orchestration (Kubernetes).
Experience with scripting languages (Bash, Python, etc.).
Knowledge of monitoring/logging tools like ELK, Prometheus, Grafana, or Datadog.
Strong communication and collaboration skills.
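Since the listing names Prometheus and Grafana alongside Python scripting, here is a minimal sketch of exposing service metrics with the open-source prometheus_client library (installed via pip install prometheus-client). The metric names and the simulated workload are invented placeholders, not any employer's actual instrumentation.

```python
import random
import time

from prometheus_client import Gauge, Histogram, start_http_server

# Hypothetical metrics for a service this role might operate.
REQUEST_LATENCY = Histogram("app_request_latency_seconds", "Request latency in seconds")
QUEUE_DEPTH = Gauge("app_queue_depth", "Jobs waiting in the work queue")

def handle_fake_request() -> None:
    # Simulate handling a request and record how long it took.
    with REQUEST_LATENCY.time():
        time.sleep(random.uniform(0.01, 0.2))
    QUEUE_DEPTH.set(random.randint(0, 50))

if __name__ == "__main__":
    # Expose /metrics on port 8000 for Prometheus to scrape.
    start_http_server(8000)
    while True:
        handle_fake_request()
```

A Prometheus server scraping :8000/metrics would then feed dashboards and alert rules in Grafana, which is the workflow the listing's monitoring bullet describes.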
Posted 1 day ago
6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
This role is for one of Weekday's clients.
Salary range: Rs 1,200,000 - Rs 2,400,000 (i.e., INR 12-24 LPA)
Min Experience: 6 years
Location: Hyderabad
Job Type: Full-time

We are looking for a seasoned Azure DevOps Engineer to lead the design, implementation, and management of DevOps practices within the Microsoft Azure ecosystem. The ideal candidate will bring deep expertise in automation, CI/CD pipelines, infrastructure as code (IaC), cloud-native tools, and security best practices. This position will collaborate closely with cross-functional teams to drive efficient, secure, and scalable DevOps workflows.

Key Responsibilities:

DevOps & CI/CD Implementation
Build and maintain scalable CI/CD pipelines using Azure DevOps, GitHub Actions, or Jenkins.
Automate software build, testing, and deployment processes to improve release cycles.
Integrate automated testing, security scanning, and code quality checks into the pipeline.

Infrastructure as Code (IaC) & Cloud Automation
Develop and maintain IaC templates using Terraform, Bicep, or ARM templates.
Automate infrastructure provisioning, scaling, and monitoring across Azure environments.
Ensure cloud cost optimization and resource efficiency.

Monitoring, Logging & Security
Configure monitoring tools like Azure Monitor, App Insights, and Log Analytics.
Apply Azure security best practices in CI/CD workflows and cloud architecture.
Implement RBAC and Key Vault usage, and ensure policy and compliance adherence.

Collaboration & Continuous Improvement
Work with development, QA, and IT teams to enhance DevOps processes and workflows.
Identify and resolve bottlenecks in deployment and infrastructure automation.
Stay informed about industry trends and the latest features in Azure DevOps and IaC tooling.

Required Skills & Experience:
5-7 years of hands-on experience in Azure DevOps and cloud automation.
Strong knowledge of:
Azure DevOps Services (Pipelines, Repos, Boards, Artifacts, Test Plans)
CI/CD tools: YAML pipelines, GitHub Actions, Jenkins
Version control: Git (Azure Repos, GitHub, Bitbucket)
IaC: Terraform, Bicep, ARM templates
Containerization & orchestration: Docker, Kubernetes (AKS)
Monitoring: Azure Monitor, App Insights, Prometheus, Grafana
Security: Azure Security Center, RBAC, Key Vault, compliance policy management
Familiarity with configuration management tools like Ansible, Puppet, or Chef (optional).
Strong analytical and troubleshooting skills.
Excellent communication skills and the ability to work in Agile/Scrum environments.

Preferred Certifications:
Microsoft Certified: Azure DevOps Engineer Expert (AZ-400)
Microsoft Certified: Azure Administrator Associate (AZ-104)
Certified Kubernetes Administrator (CKA), optional

Skills: Azure | DevOps | CI/CD | GitHub Actions | Terraform | Infrastructure as Code | Kubernetes | Docker | Monitoring | Cloud Security
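As a sketch of the pipeline automation this role involves: Azure DevOps exposes a REST "Runs" endpoint that can queue a YAML pipeline from a script. The organization, project, pipeline id, and token below are placeholders, and the api-version string should be verified against current Microsoft documentation; treat this as an illustrative sketch, not a verified integration.

```python
import base64

import requests

# All values below are placeholders: fill in your own org, project, pipeline, PAT.
ORG, PROJECT, PIPELINE_ID = "my-org", "my-project", 42
PAT = "<personal-access-token>"

# Azure DevOps REST auth: HTTP basic with an empty username and the PAT as password.
auth_header = base64.b64encode(f":{PAT}".encode()).decode()

# "Run pipeline" endpoint; confirm the api-version against current docs.
url = (
    f"https://dev.azure.com/{ORG}/{PROJECT}/_apis/pipelines/{PIPELINE_ID}/runs"
    "?api-version=7.1-preview.1"
)

# Queue a run of the pipeline against a chosen branch.
resp = requests.post(
    url,
    headers={"Authorization": f"Basic {auth_header}"},
    json={"resources": {"repositories": {"self": {"refName": "refs/heads/main"}}}},
    timeout=30,
)
resp.raise_for_status()
print("Queued pipeline run id:", resp.json().get("id"))
```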
Posted 1 day ago
0 years
0 Lacs
India
On-site
OpenShift and Kubernetes Cluster Management: Installation, configuration, and maintenance of OpenShift clusters, including the control plane, worker nodes, and networking.
Containerized Application Deployment: Deploying and managing containerized applications within OpenShift, including image building, registry management, and application lifecycle management.
CI/CD Pipeline Implementation: Setting up and managing continuous integration and continuous delivery (CI/CD) pipelines for automating application deployments.
Infrastructure as Code (IaC): Using tools like Ansible, Terraform, or similar to automate infrastructure provisioning and management.
Troubleshooting and Performance Optimization: Diagnosing and resolving issues related to OpenShift performance, stability, and security.
Security: Implementing and enforcing security best practices for OpenShift clusters and containerized applications.
Automation: Developing and maintaining automation scripts to streamline OpenShift operations (see the sketch after this list).
Monitoring and Logging: Setting up and maintaining monitoring and logging systems to track OpenShift cluster and application health.

Skills and Experience:
OpenShift and Kubernetes: Extensive hands-on experience with OpenShift, including installation, configuration, troubleshooting, and optimization.
Containerization: Understanding of containerization technologies (e.g., Docker), container registries, and container orchestration.
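Because OpenShift serves the standard Kubernetes API, automation scripts of the kind the "Automation" item above describes can often use the official Kubernetes Python client directly. A minimal sketch, assuming a valid kubeconfig and a hypothetical namespace:

```python
from kubernetes import client, config

# Assumes a kubeconfig with access to the cluster; OpenShift exposes the
# standard Kubernetes API, so the official client works against it as well.
config.load_kube_config()
v1 = client.CoreV1Api()

# Flag pods that are not Running/Succeeded in a hypothetical namespace.
for pod in v1.list_namespaced_pod(namespace="payments").items:
    phase = pod.status.phase
    if phase not in ("Running", "Succeeded"):
        print(f"{pod.metadata.name}: {phase}")
```

OpenShift-specific resources (routes, projects, builds) would instead go through the OpenShift CLI or its own API groups, but cluster-health checks like this one stay portable.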
Posted 1 day ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
We are looking for a Service Writer to join our team and act as our liaison with customers to address their vehicle repair needs. A Service Writer's responsibilities include documenting the repairs needed and scheduling appropriate technicians for each job in our computer system. Ultimately, you will ensure that the needs of our customers are met, coordinate transactions, and estimate both time and costs so that everything runs smoothly for our customers.

Responsibilities:
Develop cost estimates, logging the parts and time needed for repairs.
Schedule the most appropriate Service Technician for each job.
Convey all necessary information regarding costs, parts, work, and technicians to customers.
Call customers to arrange appointments.
Meet with customers to discuss their requirements and relay those requirements to the Service Technicians.
Contact customers about additional work to relay the details and extra costs.
Enter the details of repair jobs on the company's network and prepare repair instructions.

This job is provided by Shine.com
Posted 1 day ago
0 years
0 Lacs
Greater Kolkata Area
On-site
Service Writer, Greater Kolkata Area. Description and responsibilities are identical to the Service Writer listing above. This job is provided by Shine.com
Posted 1 day ago