7.0 years
40 Lacs
Visakhapatnam, Andhra Pradesh, India
Remote
Experience: 7+ years
Salary: INR 4,000,000 / year (based on experience)
Expected Notice Period: 15 days
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-time permanent position (payroll and compliance to be managed by MatchMove)
(Note: This is a requirement for one of Uplers' clients, MatchMove.)

What do you need for this opportunity?

Must-have skills: AWS Q, CodeWhisperer, Gen AI, CI/CD, containerization, Go, microservices, RESTful APIs, MySQL, PHP, PostgreSQL

MatchMove is looking for:

As a Technical Lead (Backend), you will play a pivotal role in shaping the engineering foundation for a robust, real-time, cross-border payment platform. You will write clean, secure, and scalable Go services powering billions in financial flows, while championing engineering excellence and thoughtful platform design.

You will contribute to:
- Developing and scaling distributed payment transaction systems for cross-border and domestic remittance use cases.
- Designing resilient microservices in Go for high-volume, low-latency transaction flows with regional compliance and localization.
- Owning service-level metrics such as SLA adherence, latency (p95/p99), throughput, and availability.
- Building API-first products with strong documentation, mocks, and observability from day one.
- Enabling faster, safer development by leveraging Generative AI for test generation, documentation, and repetitive coding tasks, while maintaining engineering hygiene.
- Mentoring a high-performing, globally distributed engineering team and contributing to code reviews, design sessions, and cross-team collaboration.

Responsibilities:
- Lead design and development of backend services in Go with concurrency, memory safety, and observability in mind.
- Manage service uptime and reliability across multi-region deployments via dashboards, tracing, and alerting.
- Maintain strict SLAs for mission-critical payment operations and support incident response during SLA violations.
- Profile and optimize Go services using tools like pprof, benchstat, and the Go race detector.
- Drive code quality through test-driven development, code reviews, and API-first workflows (OpenAPI/Swagger).
- Collaborate cross-functionally with Product, QA, DevOps, Compliance, and Business to ensure production readiness.
- Maintain well-documented service boundaries and internal libraries for scalable engineering velocity.
- Encourage strategic use of Generative AI for API mocking, test data generation, schema validation, and static analysis.
- Advocate for clean architecture, technical debt remediation, and security best practices (e.g., rate limiting, mTLS, context timeouts).

Requirements:
- At least 7 years of engineering experience with deep expertise in Go (Golang).
- Expert-level understanding of concurrency, goroutines, channels, synchronization primitives, and distributed coordination patterns.
- Strong grasp of profiling and debugging Go applications, memory management, and performance tuning.
- Proven experience instrumenting production systems for SLAs/SLIs with tools like Prometheus, Grafana, or OpenTelemetry.
- Solid experience with PostgreSQL/MySQL, schema design for high-consistency systems, and the transaction lifecycle in financial services.
- Experience building, documenting, and scaling RESTful APIs in an API-first platform environment.
- Comfort with cloud-native tooling, containerization, and DevOps workflows (CI/CD, blue-green deployment, rollback strategies).
- Demonstrated understanding of observability practices: structured logging, distributed tracing, and alerting workflows.

Brownie Points:
- Experience in payments, card issuance, or remittance infrastructure.
- Working knowledge of PHP (for legacy systems).
- Contributions to Go open-source projects or public technical content.
- Experience with GenAI development tools like AWS Q and CodeWhisperer in a team setting.
- Track record of delivering high-quality services in regulated environments with audit, compliance, and security mandates.

Engagement Model: Direct placement with the client
This is a remote role.
Shift timings: 10 AM to 7 PM IST

How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of getting shortlisted and meeting the client for the interview!

About Uplers:
Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal; depending on the assessments you clear, you can apply for them as well.)

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 day ago
7.0 years
40 Lacs
Visakhapatnam, Andhra Pradesh, India
Remote
Experience: 7+ years
Salary: INR 4,000,000 / year (based on experience)
Expected Notice Period: 15 days
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-time permanent position (payroll and compliance to be managed by MatchMove)
(Note: This is a requirement for one of Uplers' clients, MatchMove.)

What do you need for this opportunity?

Must-have skills: Gen AI, AWS data stack, Kinesis, open table formats, PySpark, stream processing, Kafka, MySQL, Python

MatchMove is looking for:

As a Technical Lead - Data Platform, you will architect, implement, and scale our end-to-end data platform built on AWS S3, Glue, Lake Formation, and DMS. You will lead a small team of engineers while working cross-functionally with stakeholders from fraud, finance, product, and engineering to enable reliable, timely, and secure data access across the business. You will champion best practices in data design, governance, and observability, while leveraging GenAI tools to improve engineering productivity and accelerate time to insight.

You will contribute to:
- Owning the design and scalability of the data lake architecture for both streaming and batch workloads, leveraging AWS-native services.
- Leading the development of ingestion, transformation, and storage pipelines using AWS Glue, DMS, Kinesis/Kafka, and PySpark.
- Structuring and evolving data into open table formats (Apache Iceberg, Delta Lake) to support real-time and time-travel queries for downstream services.
- Driving data productization, enabling API-first and self-service access to curated datasets for fraud detection, reconciliation, and reporting use cases.
- Defining and tracking SLAs and SLOs for critical data pipelines, ensuring high availability and data accuracy in a regulated fintech environment.
- Collaborating with InfoSec, SRE, and Data Governance teams to enforce data security, lineage tracking, access control, and compliance (GDPR, MAS TRM).
- Using Generative AI tools to enhance developer productivity, including auto-generating test harnesses, schema documentation, transformation scaffolds, and performance insights.
- Mentoring data engineers, setting technical direction, and ensuring delivery of high-quality, observable data pipelines.

Responsibilities:
- Architect scalable, cost-optimized pipelines across real-time and batch paradigms, using tools such as AWS Glue, Step Functions, Airflow, or EMR.
- Manage ingestion from transactional sources using AWS DMS, with a focus on schema drift handling and low-latency replication.
- Design efficient partitioning, compression, and metadata strategies for Iceberg or Hudi tables stored in S3 and cataloged with Glue and Lake Formation.
- Build data marts, audit views, and analytics layers that support both machine-driven processes (e.g., fraud engines) and human-readable interfaces (e.g., dashboards).
- Ensure robust data observability with metrics, alerting, and lineage tracking via OpenLineage or Great Expectations.
- Lead quarterly reviews of data cost, performance, schema evolution, and architecture design with stakeholders and senior leadership.
- Enforce version control, CI/CD, and infrastructure-as-code practices using GitOps and tools like Terraform.

Requirements:
- At least 7 years of experience in data engineering.
- Deep hands-on experience with the AWS data stack: Glue (Jobs & Crawlers), S3, Athena, Lake Formation, DMS, and Redshift Spectrum.
- Expertise in designing data pipelines for real-time, streaming, and batch systems, including schema design, format optimization, and SLAs.
- Strong programming skills in Python (PySpark) and advanced SQL for analytical processing and transformation.
- Proven experience managing data architectures using open table formats (Iceberg, Delta Lake, Hudi) at scale.
- Understanding of stream processing with Kinesis/Kafka and orchestration via Airflow or Step Functions.
- Experience implementing data access controls, encryption policies, and compliance workflows in regulated environments.
- Ability to integrate GenAI tools into data engineering processes to drive measurable productivity and quality gains, with strong engineering hygiene.
- Demonstrated ability to lead teams, drive architectural decisions, and collaborate with cross-functional stakeholders.

Brownie Points:
- Experience working in a PCI DSS or other central-bank-regulated environment with audit logging and data retention requirements.
- Experience in the payments or banking domain, with use cases around reconciliation, chargeback analysis, or fraud detection.
- Familiarity with data contracts, data mesh patterns, and data-as-a-product principles.
- Experience using GenAI to automate data documentation, generate data tests, or support reconciliation use cases.
- Exposure to performance tuning and cost optimization strategies in AWS Glue, Athena, and S3.
- Experience building data platforms for ML/AI teams or integrating with model feature stores.

Engagement Model: Direct placement with the client
This is a remote role.
Shift timings: 10 AM to 7 PM IST

How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of getting shortlisted and meeting the client for the interview!

About Uplers:
Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal; depending on the assessments you clear, you can apply for them as well.)

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 day ago
7.0 years
40 Lacs
Chandigarh, India
Remote
Experience : 7.00 + years Salary : INR 4000000.00 / year (based on experience) Expected Notice Period : 15 Days Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full Time Permanent position(Payroll and Compliance to be managed by: MatchMove) (*Note: This is a requirement for one of Uplers' client - MatchMove) What do you need for this opportunity? Must have skills required: Gen AI, AWS data stack, Kinesis, open table format, Pyspark, stream processing, Kafka, MySQL, Python MatchMove is Looking for: Technical Lead - Data Platform Data, you will architect, implement, and scale our end-to-end data platform built on AWS S3, Glue, Lake Formation, and DMS. You will lead a small team of engineers while working cross-functionally with stakeholders from fraud, finance, product, and engineering to enable reliable, timely, and secure data access across the business. You will champion best practices in data design, governance, and observability, while leveraging GenAI tools to improve engineering productivity and accelerate time to insight. You will contribute to Owning the design and scalability of the data lake architecture for both streaming and batch workloads, leveraging AWS-native services. Leading the development of ingestion, transformation, and storage pipelines using AWS Glue, DMS, Kinesis/Kafka, and PySpark. Structuring and evolving data into OTF formats (Apache Iceberg, Delta Lake) to support real-time and time-travel queries for downstream services. Driving data productization, enabling API-first and self-service access to curated datasets for fraud detection, reconciliation, and reporting use cases. Defining and tracking SLAs and SLOs for critical data pipelines, ensuring high availability and data accuracy in a regulated fintech environment. Collaborating with InfoSec, SRE, and Data Governance teams to enforce data security, lineage tracking, access control, and compliance (GDPR, MAS TRM). 
Using Generative AI tools to enhance developer productivity — including auto-generating test harnesses, schema documentation, transformation scaffolds, and performance insights. Mentoring data engineers, setting technical direction, and ensuring delivery of high-quality, observable data pipelines. Responsibilities:: Architect scalable, cost-optimized pipelines across real-time and batch paradigms, using tools such as AWS Glue, Step Functions, Airflow, or EMR. Manage ingestion from transactional sources using AWS DMS, with a focus on schema drift handling and low-latency replication. Design efficient partitioning, compression, and metadata strategies for Iceberg or Hudi tables stored in S3, and cataloged with Glue and Lake Formation. Build data marts, audit views, and analytics layers that support both machine-driven processes (e.g. fraud engines) and human-readable interfaces (e.g. dashboards). Ensure robust data observability with metrics, alerting, and lineage tracking via OpenLineage or Great Expectations. Lead quarterly reviews of data cost, performance, schema evolution, and architecture design with stakeholders and senior leadership. Enforce version control, CI/CD, and infrastructure-as-code practices using GitOps and tools like Terraform. Requirements At-least 7 years of experience in data engineering. Deep hands-on experience with AWS data stack: Glue (Jobs & Crawlers), S3, Athena, Lake Formation, DMS, and Redshift Spectrum Expertise in designing data pipelines for real-time, streaming, and batch systems, including schema design, format optimization, and SLAs. Strong programming skills in Python (PySpark) and advanced SQL for analytical processing and transformation. Proven experience managing data architectures using open table formats (Iceberg, Delta Lake, Hudi) at scale Understanding of stream processing with Kinesis/Kafka and orchestration via Airflow or Step Functions. 
Experience implementing data access controls, encryption policies, and compliance workflows in regulated environments. Ability to integrate GenAI tools into data engineering processes to drive measurable productivity and quality gains — with strong engineering hygiene. Demonstrated ability to lead teams, drive architectural decisions, and collaborate with cross-functional stakeholders. Brownie Points:: Experience working in a PCI DSS or any other central bank regulated environment with audit logging and data retention requirements. Experience in the payments or banking domain, with use cases around reconciliation, chargeback analysis, or fraud detection. Familiarity with data contracts, data mesh patterns, and data as a product principles. Experience using GenAI to automate data documentation, generate data tests, or support reconciliation use cases. Exposure to performance tuning and cost optimization strategies in AWS Glue, Athena, and S3. Experience building data platforms for ML/AI teams or integrating with model feature stores. Engagement Model: : Direct placement with client This is remote role Shift timings ::10 AM to 7 PM How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you! 
Show more Show less
Posted 1 day ago
7.0 years
40 Lacs
Thiruvananthapuram, Kerala, India
Remote
Experience : 7.00 + years Salary : INR 4000000.00 / year (based on experience) Expected Notice Period : 15 Days Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full Time Permanent position(Payroll and Compliance to be managed by: MatchMove) (*Note: This is a requirement for one of Uplers' client - MatchMove) What do you need for this opportunity? Must have skills required: Gen AI, AWS data stack, Kinesis, open table format, Pyspark, stream processing, Kafka, MySQL, Python MatchMove is Looking for: Technical Lead - Data Platform Data, you will architect, implement, and scale our end-to-end data platform built on AWS S3, Glue, Lake Formation, and DMS. You will lead a small team of engineers while working cross-functionally with stakeholders from fraud, finance, product, and engineering to enable reliable, timely, and secure data access across the business. You will champion best practices in data design, governance, and observability, while leveraging GenAI tools to improve engineering productivity and accelerate time to insight. You will contribute to Owning the design and scalability of the data lake architecture for both streaming and batch workloads, leveraging AWS-native services. Leading the development of ingestion, transformation, and storage pipelines using AWS Glue, DMS, Kinesis/Kafka, and PySpark. Structuring and evolving data into OTF formats (Apache Iceberg, Delta Lake) to support real-time and time-travel queries for downstream services. Driving data productization, enabling API-first and self-service access to curated datasets for fraud detection, reconciliation, and reporting use cases. Defining and tracking SLAs and SLOs for critical data pipelines, ensuring high availability and data accuracy in a regulated fintech environment. Collaborating with InfoSec, SRE, and Data Governance teams to enforce data security, lineage tracking, access control, and compliance (GDPR, MAS TRM). 
Using Generative AI tools to enhance developer productivity — including auto-generating test harnesses, schema documentation, transformation scaffolds, and performance insights. Mentoring data engineers, setting technical direction, and ensuring delivery of high-quality, observable data pipelines. Responsibilities:: Architect scalable, cost-optimized pipelines across real-time and batch paradigms, using tools such as AWS Glue, Step Functions, Airflow, or EMR. Manage ingestion from transactional sources using AWS DMS, with a focus on schema drift handling and low-latency replication. Design efficient partitioning, compression, and metadata strategies for Iceberg or Hudi tables stored in S3, and cataloged with Glue and Lake Formation. Build data marts, audit views, and analytics layers that support both machine-driven processes (e.g. fraud engines) and human-readable interfaces (e.g. dashboards). Ensure robust data observability with metrics, alerting, and lineage tracking via OpenLineage or Great Expectations. Lead quarterly reviews of data cost, performance, schema evolution, and architecture design with stakeholders and senior leadership. Enforce version control, CI/CD, and infrastructure-as-code practices using GitOps and tools like Terraform. Requirements At-least 7 years of experience in data engineering. Deep hands-on experience with AWS data stack: Glue (Jobs & Crawlers), S3, Athena, Lake Formation, DMS, and Redshift Spectrum Expertise in designing data pipelines for real-time, streaming, and batch systems, including schema design, format optimization, and SLAs. Strong programming skills in Python (PySpark) and advanced SQL for analytical processing and transformation. Proven experience managing data architectures using open table formats (Iceberg, Delta Lake, Hudi) at scale Understanding of stream processing with Kinesis/Kafka and orchestration via Airflow or Step Functions. 
Experience implementing data access controls, encryption policies, and compliance workflows in regulated environments. Ability to integrate GenAI tools into data engineering processes to drive measurable productivity and quality gains — with strong engineering hygiene. Demonstrated ability to lead teams, drive architectural decisions, and collaborate with cross-functional stakeholders. Brownie Points:: Experience working in a PCI DSS or any other central bank regulated environment with audit logging and data retention requirements. Experience in the payments or banking domain, with use cases around reconciliation, chargeback analysis, or fraud detection. Familiarity with data contracts, data mesh patterns, and data as a product principles. Experience using GenAI to automate data documentation, generate data tests, or support reconciliation use cases. Exposure to performance tuning and cost optimization strategies in AWS Glue, Athena, and S3. Experience building data platforms for ML/AI teams or integrating with model feature stores. Engagement Model: : Direct placement with client This is remote role Shift timings ::10 AM to 7 PM How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you! 
Show more Show less
Posted 1 day ago
7.0 years
40 Lacs
Dehradun, Uttarakhand, India
Remote
Experience: 7.00+ years
Salary: INR 4,000,000.00 / year (based on experience)
Expected Notice Period: 15 Days
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-time permanent position (payroll and compliance to be managed by MatchMove)
(Note: This is a requirement for one of Uplers' clients - MatchMove)

What do you need for this opportunity?

Must-have skills: Gen AI, AWS data stack, Kinesis, open table formats, PySpark, stream processing, Kafka, MySQL, Python

MatchMove is looking for:

As the Technical Lead - Data Platform, you will architect, implement, and scale our end-to-end data platform built on AWS S3, Glue, Lake Formation, and DMS. You will lead a small team of engineers while working cross-functionally with stakeholders from fraud, finance, product, and engineering to enable reliable, timely, and secure data access across the business. You will champion best practices in data design, governance, and observability, while leveraging GenAI tools to improve engineering productivity and accelerate time to insight.

You will contribute to:

Owning the design and scalability of the data lake architecture for both streaming and batch workloads, leveraging AWS-native services.
Leading the development of ingestion, transformation, and storage pipelines using AWS Glue, DMS, Kinesis/Kafka, and PySpark.
Structuring and evolving data into open table formats (Apache Iceberg, Delta Lake) to support real-time and time-travel queries for downstream services.
Driving data productization, enabling API-first and self-service access to curated datasets for fraud detection, reconciliation, and reporting use cases.
Defining and tracking SLAs and SLOs for critical data pipelines, ensuring high availability and data accuracy in a regulated fintech environment.
Collaborating with InfoSec, SRE, and Data Governance teams to enforce data security, lineage tracking, access control, and compliance (GDPR, MAS TRM).
Using Generative AI tools to enhance developer productivity, including auto-generating test harnesses, schema documentation, transformation scaffolds, and performance insights.
Mentoring data engineers, setting technical direction, and ensuring delivery of high-quality, observable data pipelines.

Responsibilities:

Architect scalable, cost-optimized pipelines across real-time and batch paradigms, using tools such as AWS Glue, Step Functions, Airflow, or EMR.
Manage ingestion from transactional sources using AWS DMS, with a focus on schema drift handling and low-latency replication.
Design efficient partitioning, compression, and metadata strategies for Iceberg or Hudi tables stored in S3 and cataloged with Glue and Lake Formation.
Build data marts, audit views, and analytics layers that support both machine-driven processes (e.g., fraud engines) and human-readable interfaces (e.g., dashboards).
Ensure robust data observability with metrics, alerting, and lineage tracking via OpenLineage or Great Expectations.
Lead quarterly reviews of data cost, performance, schema evolution, and architecture design with stakeholders and senior leadership.
Enforce version control, CI/CD, and infrastructure-as-code practices using GitOps and tools like Terraform.

Requirements:

At least 7 years of experience in data engineering.
Deep hands-on experience with the AWS data stack: Glue (Jobs & Crawlers), S3, Athena, Lake Formation, DMS, and Redshift Spectrum.
Expertise in designing data pipelines for real-time, streaming, and batch systems, including schema design, format optimization, and SLAs.
Strong programming skills in Python (PySpark) and advanced SQL for analytical processing and transformation.
Proven experience managing data architectures using open table formats (Iceberg, Delta Lake, Hudi) at scale.
Understanding of stream processing with Kinesis/Kafka and orchestration via Airflow or Step Functions.
Experience implementing data access controls, encryption policies, and compliance workflows in regulated environments.
Ability to integrate GenAI tools into data engineering processes to drive measurable productivity and quality gains, with strong engineering hygiene.
Demonstrated ability to lead teams, drive architectural decisions, and collaborate with cross-functional stakeholders.

Brownie Points:

Experience working in a PCI DSS or other central-bank-regulated environment with audit logging and data retention requirements.
Experience in the payments or banking domain, with use cases around reconciliation, chargeback analysis, or fraud detection.
Familiarity with data contracts, data mesh patterns, and data-as-a-product principles.
Experience using GenAI to automate data documentation, generate data tests, or support reconciliation use cases.
Exposure to performance tuning and cost optimization strategies in AWS Glue, Athena, and S3.
Experience building data platforms for ML/AI teams or integrating with model feature stores.

Engagement Model: Direct placement with client
This is a remote role.
Shift timings: 10 AM to 7 PM

How to apply for this opportunity?

Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of getting shortlisted and meeting the client for the interview!

About Uplers:

Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well.)

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
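To make the "defining and tracking SLAs and SLOs for critical data pipelines" responsibility above concrete, here is a minimal, hedged sketch in plain Python. All pipeline names and SLA values are hypothetical illustrations, not part of MatchMove's actual stack; a real deployment would load SLA definitions from config and wire breaches into alerting (e.g., Grafana or CloudWatch) rather than returning a boolean.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical freshness SLAs per pipeline, in minutes of allowed staleness.
PIPELINE_SLAS = {
    "dms_transactions_replication": 15,     # near-real-time CDC feed
    "daily_reconciliation_batch": 24 * 60,  # once-a-day batch job
}

def breaches_sla(pipeline, last_success, now=None):
    """Return True if the pipeline's last successful run is staler than its SLA."""
    now = now or datetime.now(timezone.utc)
    allowed = timedelta(minutes=PIPELINE_SLAS[pipeline])
    # A breach means the elapsed time since the last success exceeds the budget.
    return (now - last_success) > allowed
```

The same check, run on a schedule against pipeline run metadata (e.g., from the orchestrator's API), is the basis for the availability and freshness SLOs the posting describes.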
Posted 1 day ago
7.0 years
40 Lacs
Dehradun, Uttarakhand, India
Remote
Experience: 7.00+ years
Salary: INR 4,000,000.00 / year (based on experience)
Expected Notice Period: 15 Days
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-time permanent position (payroll and compliance to be managed by MatchMove)
(Note: This is a requirement for one of Uplers' clients - MatchMove)

What do you need for this opportunity?

Must-have skills: AWS Q, CodeWhisperer, Gen AI, CI/CD, containerization, Go, microservices, RESTful API, MySQL, PHP, PostgreSQL

MatchMove is looking for:

As a Technical Lead (Backend), you will play a pivotal role in shaping the engineering foundation for a robust, real-time, cross-border payment platform. You'll be writing clean, secure, and scalable Go services powering billions in financial flows, while championing engineering excellence and thoughtful platform design.

You will contribute to:

Developing and scaling distributed payment transaction systems for cross-border and domestic remittance use cases.
Designing resilient microservices in Go for high-volume, low-latency transaction flows with regional compliance and localization.
Owning service-level metrics such as SLA adherence, latency (p95/p99), throughput, and availability.
Building API-first products with strong documentation, mocks, and observability from day one.
Enabling faster, safer development by leveraging Generative AI for test generation, documentation, and repetitive coding tasks, while maintaining engineering hygiene.
Mentoring a high-performing, globally distributed engineering team and contributing to code reviews, design sessions, and cross-team collaboration.

Responsibilities:

Lead design and development of backend services in Go with concurrency, memory safety, and observability in mind.
Manage service uptime and reliability across multi-region deployments via dashboards, tracing, and alerting.
Maintain strict SLAs for mission-critical payment operations and support incident response during SLA violations.
Profile and optimize Go services using tools like pprof, benchstat, and the Go race detector.
Drive code quality through test-driven development, code reviews, and API-first workflows (OpenAPI/Swagger).
Collaborate cross-functionally with Product, QA, DevOps, Compliance, and Business to ensure production readiness.
Maintain well-documented service boundaries and internal libraries for scalable engineering velocity.
Encourage strategic use of Generative AI for API mocking, test data generation, schema validation, and static analysis.
Advocate for clean architecture, technical debt remediation, and security best practices (e.g., rate limiting, mTLS, context timeouts).

Requirements:

At least 7 years of engineering experience with deep expertise in Go (Golang).
Expert-level understanding of concurrency, goroutines, channels, synchronization primitives, and distributed coordination patterns.
Strong grasp of profiling and debugging Go applications, memory management, and performance tuning.
Proven experience in instrumenting production systems for SLAs/SLIs with tools like Prometheus, Grafana, or OpenTelemetry.
Solid experience with PostgreSQL/MySQL, schema design for high-consistency systems, and the transaction lifecycle in financial services.
Experience building, documenting, and scaling RESTful APIs in an API-first platform environment.
Comfort with cloud-native tooling, containerization, and DevOps workflows (CI/CD, blue-green deployment, rollback strategies).
Demonstrated understanding of observability practices: structured logging, distributed tracing, and alerting workflows.

Brownie Points:

Experience in payments, card issuance, or remittance infrastructure.
Working knowledge of PHP (for legacy systems).
Contributions to Go open-source projects or public technical content.
Experience with GenAI development tools like AWS Q and CodeWhisperer in a team setting.
Track record of delivering high-quality services in regulated environments with audit, compliance, and security mandates.

Engagement Model: Direct placement with client
This is a remote role.
Shift timings: 10 AM to 7 PM

How to apply for this opportunity?

Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of getting shortlisted and meeting the client for the interview!

About Uplers:

Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well.)

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
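Since the backend role stresses owning tail latency (p95/p99), here is a small illustrative sketch, in plain Python, of how such percentiles are computed from a window of request timings. This is only a nearest-rank definition for intuition; production services would typically use Prometheus histograms or OpenTelemetry metrics rather than sorting raw samples.

```python
import math

def percentile(samples, pct):
    """Nearest-rank percentile: the value at or below which pct% of samples fall."""
    if not samples:
        raise ValueError("no samples")
    ordered = sorted(samples)
    # Nearest-rank method: ceil(pct% of N) gives a 1-based rank into the sorted list.
    rank = max(1, math.ceil(pct * len(ordered) / 100))
    return ordered[rank - 1]

# With 100 evenly spread latency samples, p95 picks the 95th smallest value,
# which is why a handful of slow requests dominate the tail metrics an SLA tracks.
latencies_ms = list(range(1, 101))
p95 = percentile(latencies_ms, 95)
p99 = percentile(latencies_ms, 99)
```

The gap between p50 and p99 on real traffic is what the posting's "profile and optimize with pprof" responsibility is meant to close.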
Posted 1 day ago
7.0 years
40 Lacs
Thiruvananthapuram, Kerala, India
Remote
Posted 1 day ago
7.0 years
40 Lacs
Vijayawada, Andhra Pradesh, India
Remote
Posted 1 day ago
7.0 years
40 Lacs
Mysore, Karnataka, India
Remote
Posted 1 day ago
7.0 years
40 Lacs
Mysore, Karnataka, India
Remote
Posted 1 day ago
7.0 years
40 Lacs
Patna, Bihar, India
Remote
Experience: 7.00+ years
Salary: INR 4,000,000 per year (based on experience)
Expected Notice Period: 15 days
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-time permanent position (payroll and compliance to be managed by MatchMove)
(Note: This is a requirement for one of Uplers' clients, MatchMove.)

What do you need for this opportunity?
Must-have skills: Gen AI, AWS data stack, Kinesis, open table formats, PySpark, stream processing, Kafka, MySQL, Python

MatchMove is looking for:
As a Technical Lead - Data Platform, you will architect, implement, and scale our end-to-end data platform built on AWS S3, Glue, Lake Formation, and DMS. You will lead a small team of engineers while working cross-functionally with stakeholders from fraud, finance, product, and engineering to enable reliable, timely, and secure data access across the business. You will champion best practices in data design, governance, and observability, while leveraging GenAI tools to improve engineering productivity and accelerate time to insight.

You will contribute to:
- Owning the design and scalability of the data lake architecture for both streaming and batch workloads, leveraging AWS-native services.
- Leading the development of ingestion, transformation, and storage pipelines using AWS Glue, DMS, Kinesis/Kafka, and PySpark.
- Structuring and evolving data into open table formats (Apache Iceberg, Delta Lake) to support real-time and time-travel queries for downstream services.
- Driving data productization, enabling API-first and self-service access to curated datasets for fraud detection, reconciliation, and reporting use cases.
- Defining and tracking SLAs and SLOs for critical data pipelines, ensuring high availability and data accuracy in a regulated fintech environment.
- Collaborating with InfoSec, SRE, and Data Governance teams to enforce data security, lineage tracking, access control, and compliance (GDPR, MAS TRM).
- Using Generative AI tools to enhance developer productivity, including auto-generating test harnesses, schema documentation, transformation scaffolds, and performance insights.
- Mentoring data engineers, setting technical direction, and ensuring delivery of high-quality, observable data pipelines.

Responsibilities:
- Architect scalable, cost-optimized pipelines across real-time and batch paradigms, using tools such as AWS Glue, Step Functions, Airflow, or EMR.
- Manage ingestion from transactional sources using AWS DMS, with a focus on schema drift handling and low-latency replication.
- Design efficient partitioning, compression, and metadata strategies for Iceberg or Hudi tables stored in S3 and cataloged with Glue and Lake Formation.
- Build data marts, audit views, and analytics layers that support both machine-driven processes (e.g., fraud engines) and human-readable interfaces (e.g., dashboards).
- Ensure robust data observability with metrics, alerting, and lineage tracking via OpenLineage or Great Expectations.
- Lead quarterly reviews of data cost, performance, schema evolution, and architecture design with stakeholders and senior leadership.
- Enforce version control, CI/CD, and infrastructure-as-code practices using GitOps and tools like Terraform.

Requirements:
- At least 7 years of experience in data engineering.
- Deep hands-on experience with the AWS data stack: Glue (Jobs & Crawlers), S3, Athena, Lake Formation, DMS, and Redshift Spectrum.
- Expertise in designing data pipelines for real-time, streaming, and batch systems, including schema design, format optimization, and SLAs.
- Strong programming skills in Python (PySpark) and advanced SQL for analytical processing and transformation.
- Proven experience managing data architectures using open table formats (Iceberg, Delta Lake, Hudi) at scale.
- Understanding of stream processing with Kinesis/Kafka and orchestration via Airflow or Step Functions.
- Experience implementing data access controls, encryption policies, and compliance workflows in regulated environments.
- Ability to integrate GenAI tools into data engineering processes to drive measurable productivity and quality gains, with strong engineering hygiene.
- Demonstrated ability to lead teams, drive architectural decisions, and collaborate with cross-functional stakeholders.

Brownie points:
- Experience working in a PCI DSS or other central-bank-regulated environment with audit logging and data retention requirements.
- Experience in the payments or banking domain, with use cases around reconciliation, chargeback analysis, or fraud detection.
- Familiarity with data contracts, data mesh patterns, and data-as-a-product principles.
- Experience using GenAI to automate data documentation, generate data tests, or support reconciliation use cases.
- Exposure to performance tuning and cost optimization strategies in AWS Glue, Athena, and S3.
- Experience building data platforms for ML/AI teams or integrating with model feature stores.

Engagement Model:
- Direct placement with the client
- This is a remote role
- Shift timings: 10 AM to 7 PM

How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of getting shortlisted and meeting the client for the interview!

About Uplers:
Our goal is to make hiring reliable, simple, and fast. Our role is to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal; depending on the assessments you clear, you can apply for them as well.)

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 day ago
7.0 years
40 Lacs
Vijayawada, Andhra Pradesh, India
Remote
Experience: 7.00+ years
Salary: INR 4,000,000 per year (based on experience)
Expected Notice Period: 15 days
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-time permanent position (payroll and compliance to be managed by MatchMove)
(Note: This is a requirement for one of Uplers' clients, MatchMove.)

What do you need for this opportunity?
Must-have skills: AWS Q, CodeWhisperer, Gen AI, CI/CD, containerization, Go, microservices, RESTful APIs, MySQL, PHP, PostgreSQL

MatchMove is looking for:
As a Technical Lead (Backend), you will play a pivotal role in shaping the engineering foundation of a robust, real-time, cross-border payment platform. You will write clean, secure, and scalable Go services powering billions in financial flows while championing engineering excellence and thoughtful platform design.

You will contribute to:
- Developing and scaling distributed payment transaction systems for cross-border and domestic remittance use cases.
- Designing resilient microservices in Go for high-volume, low-latency transaction flows with regional compliance and localization.
- Owning service-level metrics such as SLA adherence, latency (p95/p99), throughput, and availability.
- Building API-first products with strong documentation, mocks, and observability from day one.
- Enabling faster, safer development by leveraging Generative AI for test generation, documentation, and repetitive coding tasks, while maintaining engineering hygiene.
- Mentoring a high-performing, globally distributed engineering team and contributing to code reviews, design sessions, and cross-team collaboration.

Responsibilities:
- Lead design and development of backend services in Go with concurrency, memory safety, and observability in mind.
- Manage service uptime and reliability across multi-region deployments via dashboards, tracing, and alerting.
- Maintain strict SLAs for mission-critical payment operations and support incident response during SLA violations.
- Profile and optimize Go services using tools like pprof, benchstat, and the Go race detector.
- Drive code quality through test-driven development, code reviews, and API-first workflows (OpenAPI/Swagger).
- Collaborate cross-functionally with Product, QA, DevOps, Compliance, and Business to ensure production readiness.
- Maintain well-documented service boundaries and internal libraries for scalable engineering velocity.
- Encourage strategic use of Generative AI for API mocking, test data generation, schema validation, and static analysis.
- Advocate for clean architecture, technical-debt remediation, and security best practices (e.g., rate limiting, mTLS, context timeouts).

Requirements:
- At least 7 years of engineering experience with deep expertise in Go (Golang).
- Expert-level understanding of concurrency, goroutines, channels, synchronization primitives, and distributed coordination patterns.
- Strong grasp of profiling and debugging Go applications, memory management, and performance tuning.
- Proven experience instrumenting production systems for SLAs/SLIs with tools like Prometheus, Grafana, or OpenTelemetry.
- Solid experience with PostgreSQL/MySQL, schema design for high-consistency systems, and the transaction lifecycle in financial services.
- Experience building, documenting, and scaling RESTful APIs in an API-first platform environment.
- Comfort with cloud-native tooling, containerization, and DevOps workflows (CI/CD, blue-green deployment, rollback strategies).
- Demonstrated understanding of observability practices: structured logging, distributed tracing, and alerting workflows.

Brownie points:
- Experience in payments, card issuance, or remittance infrastructure.
- Working knowledge of PHP (for legacy systems).
- Contributions to Go open-source projects or public technical content.
- Experience with GenAI development tools like AWS Q or CodeWhisperer in a team setting.
- Track record of delivering high-quality services in regulated environments with audit, compliance, and security mandates.

Engagement Model:
- Direct placement with the client
- This is a remote role
- Shift timings: 10 AM to 7 PM

How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of getting shortlisted and meeting the client for the interview!

About Uplers:
Our goal is to make hiring reliable, simple, and fast. Our role is to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal; depending on the assessments you clear, you can apply for them as well.)

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 day ago
7.0 years
40 Lacs
Patna, Bihar, India
Remote
Experience: 7.00+ years
Salary: INR 4,000,000 per year (based on experience)
Expected Notice Period: 15 days
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-time permanent position (payroll and compliance to be managed by MatchMove)
(Note: This is a requirement for one of Uplers' clients, MatchMove.)

What do you need for this opportunity?
Must-have skills: AWS Q, CodeWhisperer, Gen AI, CI/CD, containerization, Go, microservices, RESTful APIs, MySQL, PHP, PostgreSQL

MatchMove is looking for:
As a Technical Lead (Backend), you will play a pivotal role in shaping the engineering foundation of a robust, real-time, cross-border payment platform. You will write clean, secure, and scalable Go services powering billions in financial flows while championing engineering excellence and thoughtful platform design.

You will contribute to:
- Developing and scaling distributed payment transaction systems for cross-border and domestic remittance use cases.
- Designing resilient microservices in Go for high-volume, low-latency transaction flows with regional compliance and localization.
- Owning service-level metrics such as SLA adherence, latency (p95/p99), throughput, and availability.
- Building API-first products with strong documentation, mocks, and observability from day one.
- Enabling faster, safer development by leveraging Generative AI for test generation, documentation, and repetitive coding tasks, while maintaining engineering hygiene.
- Mentoring a high-performing, globally distributed engineering team and contributing to code reviews, design sessions, and cross-team collaboration.

Responsibilities:
- Lead design and development of backend services in Go with concurrency, memory safety, and observability in mind.
- Manage service uptime and reliability across multi-region deployments via dashboards, tracing, and alerting.
- Maintain strict SLAs for mission-critical payment operations and support incident response during SLA violations.
- Profile and optimize Go services using tools like pprof, benchstat, and the Go race detector.
- Drive code quality through test-driven development, code reviews, and API-first workflows (OpenAPI/Swagger).
- Collaborate cross-functionally with Product, QA, DevOps, Compliance, and Business to ensure production readiness.
- Maintain well-documented service boundaries and internal libraries for scalable engineering velocity.
- Encourage strategic use of Generative AI for API mocking, test data generation, schema validation, and static analysis.
- Advocate for clean architecture, technical-debt remediation, and security best practices (e.g., rate limiting, mTLS, context timeouts).

Requirements:
- At least 7 years of engineering experience with deep expertise in Go (Golang).
- Expert-level understanding of concurrency, goroutines, channels, synchronization primitives, and distributed coordination patterns.
- Strong grasp of profiling and debugging Go applications, memory management, and performance tuning.
- Proven experience instrumenting production systems for SLAs/SLIs with tools like Prometheus, Grafana, or OpenTelemetry.
- Solid experience with PostgreSQL/MySQL, schema design for high-consistency systems, and the transaction lifecycle in financial services.
- Experience building, documenting, and scaling RESTful APIs in an API-first platform environment.
- Comfort with cloud-native tooling, containerization, and DevOps workflows (CI/CD, blue-green deployment, rollback strategies).
- Demonstrated understanding of observability practices: structured logging, distributed tracing, and alerting workflows.

Brownie points:
- Experience in payments, card issuance, or remittance infrastructure.
- Working knowledge of PHP (for legacy systems).
- Contributions to Go open-source projects or public technical content.
- Experience with GenAI development tools like AWS Q or CodeWhisperer in a team setting.
- Track record of delivering high-quality services in regulated environments with audit, compliance, and security mandates.

Engagement Model:
- Direct placement with the client
- This is a remote role
- Shift timings: 10 AM to 7 PM

How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of getting shortlisted and meeting the client for the interview!

About Uplers:
Our goal is to make hiring reliable, simple, and fast. Our role is to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal; depending on the assessments you clear, you can apply for them as well.)

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
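The requirements above emphasize goroutines and channels. A minimal sketch of the classic bounded worker-pool pattern that kind of experience implies (the function name and the "process" step are illustrative placeholders, not any real payment logic):

```go
package main

import (
	"fmt"
	"sync"
)

// processBatch fans a batch of transaction IDs out to a fixed number of
// worker goroutines over channels, returning the set of processed IDs.
func processBatch(ids []string, workers int) map[string]bool {
	jobs := make(chan string)
	results := make(chan string)

	var wg sync.WaitGroup
	for w := 0; w < workers; w++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for id := range jobs {
				results <- id // stand-in for validate/settle/persist work
			}
		}()
	}

	// Close results only after every worker has drained the jobs channel.
	go func() {
		wg.Wait()
		close(results)
	}()

	// Feed the batch, then close jobs so the workers can exit.
	go func() {
		for _, id := range ids {
			jobs <- id
		}
		close(jobs)
	}()

	done := make(map[string]bool)
	for id := range results {
		done[id] = true
	}
	return done
}

func main() {
	done := processBatch([]string{"t1", "t2", "t3", "t4"}, 2)
	fmt.Println(len(done)) // 4
}
```

The pool size bounds concurrency regardless of batch size, which is the usual way to keep a high-volume flow from spawning one goroutine per transaction.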
Posted 1 day ago
7.0 years
40 Lacs
Pune/Pimpri-Chinchwad Area
Remote
Experience: 7.00+ years
Salary: INR 4,000,000 per year (based on experience)
Expected Notice Period: 15 days
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-time permanent position (payroll and compliance to be managed by MatchMove)
(Note: This is a requirement for one of Uplers' clients, MatchMove.)

What do you need for this opportunity?
Must-have skills: Gen AI, AWS data stack, Kinesis, open table formats, PySpark, stream processing, Kafka, MySQL, Python

MatchMove is looking for:
As a Technical Lead - Data Platform, you will architect, implement, and scale our end-to-end data platform built on AWS S3, Glue, Lake Formation, and DMS. You will lead a small team of engineers while working cross-functionally with stakeholders from fraud, finance, product, and engineering to enable reliable, timely, and secure data access across the business. You will champion best practices in data design, governance, and observability, while leveraging GenAI tools to improve engineering productivity and accelerate time to insight.

You will contribute to:
- Owning the design and scalability of the data lake architecture for both streaming and batch workloads, leveraging AWS-native services.
- Leading the development of ingestion, transformation, and storage pipelines using AWS Glue, DMS, Kinesis/Kafka, and PySpark.
- Structuring and evolving data into open table formats (Apache Iceberg, Delta Lake) to support real-time and time-travel queries for downstream services.
- Driving data productization, enabling API-first and self-service access to curated datasets for fraud detection, reconciliation, and reporting use cases.
- Defining and tracking SLAs and SLOs for critical data pipelines, ensuring high availability and data accuracy in a regulated fintech environment.
- Collaborating with InfoSec, SRE, and Data Governance teams to enforce data security, lineage tracking, access control, and compliance (GDPR, MAS TRM).
- Using Generative AI tools to enhance developer productivity, including auto-generating test harnesses, schema documentation, transformation scaffolds, and performance insights.
- Mentoring data engineers, setting technical direction, and ensuring delivery of high-quality, observable data pipelines.

Responsibilities:
- Architect scalable, cost-optimized pipelines across real-time and batch paradigms, using tools such as AWS Glue, Step Functions, Airflow, or EMR.
- Manage ingestion from transactional sources using AWS DMS, with a focus on schema drift handling and low-latency replication.
- Design efficient partitioning, compression, and metadata strategies for Iceberg or Hudi tables stored in S3 and cataloged with Glue and Lake Formation.
- Build data marts, audit views, and analytics layers that support both machine-driven processes (e.g., fraud engines) and human-readable interfaces (e.g., dashboards).
- Ensure robust data observability with metrics, alerting, and lineage tracking via OpenLineage or Great Expectations.
- Lead quarterly reviews of data cost, performance, schema evolution, and architecture design with stakeholders and senior leadership.
- Enforce version control, CI/CD, and infrastructure-as-code practices using GitOps and tools like Terraform.

Requirements:
- At least 7 years of experience in data engineering.
- Deep hands-on experience with the AWS data stack: Glue (Jobs & Crawlers), S3, Athena, Lake Formation, DMS, and Redshift Spectrum.
- Expertise in designing data pipelines for real-time, streaming, and batch systems, including schema design, format optimization, and SLAs.
- Strong programming skills in Python (PySpark) and advanced SQL for analytical processing and transformation.
- Proven experience managing data architectures using open table formats (Iceberg, Delta Lake, Hudi) at scale.
- Understanding of stream processing with Kinesis/Kafka and orchestration via Airflow or Step Functions.
- Experience implementing data access controls, encryption policies, and compliance workflows in regulated environments.
- Ability to integrate GenAI tools into data engineering processes to drive measurable productivity and quality gains, with strong engineering hygiene.
- Demonstrated ability to lead teams, drive architectural decisions, and collaborate with cross-functional stakeholders.

Brownie points:
- Experience working in a PCI DSS or other central-bank-regulated environment with audit logging and data retention requirements.
- Experience in the payments or banking domain, with use cases around reconciliation, chargeback analysis, or fraud detection.
- Familiarity with data contracts, data mesh patterns, and data-as-a-product principles.
- Experience using GenAI to automate data documentation, generate data tests, or support reconciliation use cases.
- Exposure to performance tuning and cost optimization strategies in AWS Glue, Athena, and S3.
- Experience building data platforms for ML/AI teams or integrating with model feature stores.

Engagement Model:
- Direct placement with the client
- This is a remote role
- Shift timings: 10 AM to 7 PM

How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of getting shortlisted and meeting the client for the interview!

About Uplers:
Our goal is to make hiring reliable, simple, and fast. Our role is to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal; depending on the assessments you clear, you can apply for them as well.)

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 day ago
7.0 years
40 Lacs
Pune/Pimpri-Chinchwad Area
Remote
Experience: 7.00+ years
Salary: INR 4,000,000 per year (based on experience)
Expected Notice Period: 15 days
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-time permanent position (payroll and compliance to be managed by MatchMove)
(Note: This is a requirement for one of Uplers' clients, MatchMove.)

What do you need for this opportunity?
Must-have skills: AWS Q, CodeWhisperer, Gen AI, CI/CD, containerization, Go, microservices, RESTful APIs, MySQL, PHP, PostgreSQL

MatchMove is looking for:
As a Technical Lead (Backend), you will play a pivotal role in shaping the engineering foundation of a robust, real-time, cross-border payment platform. You will write clean, secure, and scalable Go services powering billions in financial flows while championing engineering excellence and thoughtful platform design.

You will contribute to:
- Developing and scaling distributed payment transaction systems for cross-border and domestic remittance use cases.
- Designing resilient microservices in Go for high-volume, low-latency transaction flows with regional compliance and localization.
- Owning service-level metrics such as SLA adherence, latency (p95/p99), throughput, and availability.
- Building API-first products with strong documentation, mocks, and observability from day one.
- Enabling faster, safer development by leveraging Generative AI for test generation, documentation, and repetitive coding tasks, while maintaining engineering hygiene.
- Mentoring a high-performing, globally distributed engineering team and contributing to code reviews, design sessions, and cross-team collaboration.

Responsibilities:
- Lead design and development of backend services in Go with concurrency, memory safety, and observability in mind.
- Manage service uptime and reliability across multi-region deployments via dashboards, tracing, and alerting.
- Maintain strict SLAs for mission-critical payment operations and support incident response during SLA violations.
- Profile and optimize Go services using tools like pprof, benchstat, and the Go race detector.
- Drive code quality through test-driven development, code reviews, and API-first workflows (OpenAPI/Swagger).
- Collaborate cross-functionally with Product, QA, DevOps, Compliance, and Business to ensure production readiness.
- Maintain well-documented service boundaries and internal libraries for scalable engineering velocity.
- Encourage strategic use of Generative AI for API mocking, test data generation, schema validation, and static analysis.
- Advocate for clean architecture, technical-debt remediation, and security best practices (e.g., rate limiting, mTLS, context timeouts).

Requirements:
- At least 7 years of engineering experience with deep expertise in Go (Golang).
- Expert-level understanding of concurrency, goroutines, channels, synchronization primitives, and distributed coordination patterns.
- Strong grasp of profiling and debugging Go applications, memory management, and performance tuning.
- Proven experience instrumenting production systems for SLAs/SLIs with tools like Prometheus, Grafana, or OpenTelemetry.
- Solid experience with PostgreSQL/MySQL, schema design for high-consistency systems, and the transaction lifecycle in financial services.
- Experience building, documenting, and scaling RESTful APIs in an API-first platform environment.
- Comfort with cloud-native tooling, containerization, and DevOps workflows (CI/CD, blue-green deployment, rollback strategies).
- Demonstrated understanding of observability practices: structured logging, distributed tracing, and alerting workflows.

Brownie points:
- Experience in payments, card issuance, or remittance infrastructure.
- Working knowledge of PHP (for legacy systems).
- Contributions to Go open-source projects or public technical content.
- Experience with GenAI development tools like AWS Q or CodeWhisperer in a team setting.
- Track record of delivering high-quality services in regulated environments with audit, compliance, and security mandates.

Engagement Model:
- Direct placement with the client
- This is a remote role
- Shift timings: 10 AM to 7 PM

How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of getting shortlisted and meeting the client for the interview!

About Uplers:
Our goal is to make hiring reliable, simple, and fast. Our role is to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal; depending on the assessments you clear, you can apply for them as well.)

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 day ago
1.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Company: Manish Jewellers Pvt. Ltd.
Location: Mumbai
Industry: Gold Jewellery Manufacturing & Wholesale
Employment Type: Full-Time
Experience Required: Fresher to 1 year (jewellery industry experience preferred but not mandatory)

Role Overview
We are seeking a smart, dependable, and detail-oriented Inventory Assistant to support our daily inventory operations. This role involves maintaining accurate product data, managing jewellery tags, assisting with packaging and storage, and contributing to creative content generation using AI tools. The ideal candidate is tech-friendly, organized, and eager to learn. Accuracy, discipline, and adaptability are crucial in this high-value industry.

Key Responsibilities

Product Data & Inventory Management:
- Enter product details (weight, karat, type) into the IMS (Inventory Management System)
- Generate barcode tags and attach them accurately to products
- Upload and manage product images and preview links
- Support stock audits and physical verification
- Ensure proper bin placement and systematic storage of jewellery
- Track product movement between departments with proper logging

Packaging & Dispatch Support:
- Assist in safe, clean, and organized packaging for clients or exhibitions
- Cross-check tagging and quantity before sealing parcels
- Maintain accurate records of packed and dispatched items

Workflow & Communication:
- Update Trello or internal dashboards with inventory status
- Immediately flag discrepancies to the QC or Production Coordinator
- Organize jewellery trays and maintain a neat, disciplined stockroom layout
- Coordinate with the team via Slack, WhatsApp, or email as tasks require

AI-Powered Creative Support:
- During non-peak hours, use AI tools such as Gemini, ChatGPT, Midjourney, or similar platforms to generate product creatives, visual mockups, and written content
- Experiment with tools like Sora or Veo 3 to help create product videos, short reels, or jewellery showcase clips
- Assist in organizing product imagery into moodboards or promotional assets for internal and marketing use
- Maintain a clean archive of all AI-generated outputs for team reference and feedback

First 3 Months Expectations
- Learn the internal IMS and tagging system thoroughly
- Issue 100% error-free tags and entries
- Keep trays and bins organized with up-to-date labels
- Demonstrate punctuality and careful handling of all jewellery items
- Begin contributing to basic creative tasks using AI tools under guidance
- Assist seamlessly in daily stockroom and dispatch operations

Tools You'll Use
- IMS (internal inventory software; training provided)
- Excel / Google Sheets
- Barcode printer and tag machine
- Slack / WhatsApp / Email for internal coordination
- AI platforms: Gemini, ChatGPT, Midjourney, Sora, Veo 3 (training/guidance provided as needed)

Required Skills & Qualifications
- Minimum 12th pass or graduate in any stream
- Fresher to 1 year of work experience (jewellery or warehouse experience is a bonus)
- Basic computer literacy: typing, Excel, printing, internet usage
- Strong attention to detail and accuracy in repetitive tasks
- Ability to handle physical inventory with care and discipline
- Curiosity and willingness to experiment with new tools and technologies

Work Environment & Expectations
- Formal dress code, as you'll work around high-value products
- Personal phones are not allowed during work hours
- Daily update of stock movement logs required
- Expected to assist other departments (Dispatch/QC) when needed
- Cleanliness, organization, and discipline are strictly maintained
- Creative tasks using AI are expected during downtime

Ideal Candidate Traits
- Eager to learn and grow in a structured, tech-integrated role
- Disciplined and consistent in repetitive yet sensitive tasks
- Trustworthy with high-value items and confidential information
- Comfortable using digital tools for both operational and creative work
- Respectful of internal structure and escalation protocols

What We Offer
- Structured hands-on experience in inventory and product management
- Exposure to emerging AI tools in a real business context
- Training in jewellery-specific product handling, tagging, and creative tools
- Supportive work culture with potential for role expansion
- Competitive salary with performance-based appraisals
Posted 1 day ago
7.0 years
40 Lacs
Ghaziabad, Uttar Pradesh, India
Remote
Experience: 7.00+ years
Salary: INR 4,000,000 per year (based on experience)
Expected Notice Period: 15 days
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-time permanent position (payroll and compliance to be managed by MatchMove)
(Note: This is a requirement for one of Uplers' clients, MatchMove.)

What do you need for this opportunity?
Must-have skills: AWS Q, CodeWhisperer, Gen AI, CI/CD, containerization, Go, microservices, RESTful APIs, MySQL, PHP, PostgreSQL

MatchMove is looking for:
As a Technical Lead (Backend), you will play a pivotal role in shaping the engineering foundation of a robust, real-time, cross-border payment platform. You will write clean, secure, and scalable Go services powering billions in financial flows while championing engineering excellence and thoughtful platform design.

You will contribute to:
- Developing and scaling distributed payment transaction systems for cross-border and domestic remittance use cases.
- Designing resilient microservices in Go for high-volume, low-latency transaction flows with regional compliance and localization.
- Owning service-level metrics such as SLA adherence, latency (p95/p99), throughput, and availability.
- Building API-first products with strong documentation, mocks, and observability from day one.
- Enabling faster, safer development by leveraging Generative AI for test generation, documentation, and repetitive coding tasks, while maintaining engineering hygiene.
- Mentoring a high-performing, globally distributed engineering team and contributing to code reviews, design sessions, and cross-team collaboration.

Responsibilities:
- Lead design and development of backend services in Go with concurrency, memory safety, and observability in mind.
- Manage service uptime and reliability across multi-region deployments via dashboards, tracing, and alerting.
- Maintain strict SLAs for mission-critical payment operations and support incident response during SLA violations.
- Profile and optimize Go services using tools like pprof, benchstat, and the Go race detector.
- Drive code quality through test-driven development, code reviews, and API-first workflows (OpenAPI/Swagger).
- Collaborate cross-functionally with Product, QA, DevOps, Compliance, and Business to ensure production readiness.
- Maintain well-documented service boundaries and internal libraries for scalable engineering velocity.
- Encourage strategic use of Generative AI for API mocking, test data generation, schema validation, and static analysis.
- Advocate for clean architecture, technical-debt remediation, and security best practices (e.g., rate limiting, mTLS, context timeouts).

Requirements:
- At least 7 years of engineering experience with deep expertise in Go (Golang).
- Expert-level understanding of concurrency, goroutines, channels, synchronization primitives, and distributed coordination patterns.
- Strong grasp of profiling and debugging Go applications, memory management, and performance tuning.
- Proven experience instrumenting production systems for SLAs/SLIs with tools like Prometheus, Grafana, or OpenTelemetry.
- Solid experience with PostgreSQL/MySQL, schema design for high-consistency systems, and the transaction lifecycle in financial services.
- Experience building, documenting, and scaling RESTful APIs in an API-first platform environment.
- Comfort with cloud-native tooling, containerization, and DevOps workflows (CI/CD, blue-green deployment, rollback strategies).
- Demonstrated understanding of observability practices: structured logging, distributed tracing, and alerting workflows.

Brownie points:
- Experience in payments, card issuance, or remittance infrastructure.
- Working knowledge of PHP (for legacy systems).
- Contributions to Go open-source projects or public technical content.
- Experience with GenAI development tools like AWS Q or CodeWhisperer in a team setting.
- Track record of delivering high-quality services in regulated environments with audit, compliance, and security mandates.

Engagement Model:
- Direct placement with the client
- This is a remote role
- Shift timings: 10 AM to 7 PM

How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of getting shortlisted and meeting the client for the interview!

About Uplers:
Our goal is to make hiring reliable, simple, and fast. Our role is to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal; depending on the assessments you clear, you can apply for them as well.)

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 day ago
7.0 years
40 Lacs
Noida, Uttar Pradesh, India
Remote
Experience: 7.00+ years. Salary: INR 4000000.00 / year (based on experience). Expected Notice Period: 15 days. Shift: (GMT+05:30) Asia/Kolkata (IST). Opportunity Type: Remote. Placement Type: Full-time permanent position (payroll and compliance to be managed by MatchMove). (*Note: This is a requirement for one of Uplers' clients - MatchMove.)

What do you need for this opportunity?
Must-have skills: Gen AI, AWS data stack, Kinesis, open table formats, PySpark, stream processing, Kafka, MySQL, Python

MatchMove is looking for:
As a Technical Lead - Data Platform, you will architect, implement, and scale our end-to-end data platform built on AWS S3, Glue, Lake Formation, and DMS. You will lead a small team of engineers while working cross-functionally with stakeholders from fraud, finance, product, and engineering to enable reliable, timely, and secure data access across the business. You will champion best practices in data design, governance, and observability, while leveraging GenAI tools to improve engineering productivity and accelerate time to insight.

You will contribute to:
Owning the design and scalability of the data lake architecture for both streaming and batch workloads, leveraging AWS-native services.
Leading the development of ingestion, transformation, and storage pipelines using AWS Glue, DMS, Kinesis/Kafka, and PySpark.
Structuring and evolving data into open table formats (Apache Iceberg, Delta Lake) to support real-time and time-travel queries for downstream services.
Driving data productization, enabling API-first and self-service access to curated datasets for fraud detection, reconciliation, and reporting use cases.
Defining and tracking SLAs and SLOs for critical data pipelines, ensuring high availability and data accuracy in a regulated fintech environment.
Collaborating with InfoSec, SRE, and Data Governance teams to enforce data security, lineage tracking, access control, and compliance (GDPR, MAS TRM).
Using Generative AI tools to enhance developer productivity, including auto-generating test harnesses, schema documentation, transformation scaffolds, and performance insights.
Mentoring data engineers, setting technical direction, and ensuring delivery of high-quality, observable data pipelines.

Responsibilities:
Architect scalable, cost-optimized pipelines across real-time and batch paradigms, using tools such as AWS Glue, Step Functions, Airflow, or EMR.
Manage ingestion from transactional sources using AWS DMS, with a focus on schema-drift handling and low-latency replication.
Design efficient partitioning, compression, and metadata strategies for Iceberg or Hudi tables stored in S3 and cataloged with Glue and Lake Formation.
Build data marts, audit views, and analytics layers that support both machine-driven processes (e.g., fraud engines) and human-readable interfaces (e.g., dashboards).
Ensure robust data observability with metrics, alerting, and lineage tracking via OpenLineage or Great Expectations.
Lead quarterly reviews of data cost, performance, schema evolution, and architecture design with stakeholders and senior leadership.
Enforce version control, CI/CD, and infrastructure-as-code practices using GitOps and tools like Terraform.

Requirements:
At least 7 years of experience in data engineering.
Deep hands-on experience with the AWS data stack: Glue (Jobs & Crawlers), S3, Athena, Lake Formation, DMS, and Redshift Spectrum.
Expertise in designing data pipelines for real-time, streaming, and batch systems, including schema design, format optimization, and SLAs.
Strong programming skills in Python (PySpark) and advanced SQL for analytical processing and transformation.
Proven experience managing data architectures using open table formats (Iceberg, Delta Lake, Hudi) at scale.
Understanding of stream processing with Kinesis/Kafka and orchestration via Airflow or Step Functions.
Experience implementing data access controls, encryption policies, and compliance workflows in regulated environments.
Ability to integrate GenAI tools into data engineering processes to drive measurable productivity and quality gains, with strong engineering hygiene.
Demonstrated ability to lead teams, drive architectural decisions, and collaborate with cross-functional stakeholders.

Brownie Points:
Experience working in a PCI DSS or other central-bank-regulated environment with audit logging and data retention requirements.
Experience in the payments or banking domain, with use cases around reconciliation, chargeback analysis, or fraud detection.
Familiarity with data contracts, data mesh patterns, and data-as-a-product principles.
Experience using GenAI to automate data documentation, generate data tests, or support reconciliation use cases.
Exposure to performance tuning and cost optimization strategies in AWS Glue, Athena, and S3.
Experience building data platforms for ML/AI teams or integrating with model feature stores.

Engagement Model: Direct placement with the client. This is a remote role. Shift timings: 10 AM to 7 PM.

How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of getting shortlisted and meet the client for the interview!

About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal; depending on the assessments you clear, you can apply for them as well.)

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 day ago
7.0 years
40 Lacs
Agra, Uttar Pradesh, India
Remote
Experience : 7.00 + years Salary : INR 4000000.00 / year (based on experience) Expected Notice Period : 15 Days Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full Time Permanent position(Payroll and Compliance to be managed by: MatchMove) (*Note: This is a requirement for one of Uplers' client - MatchMove) What do you need for this opportunity? Must have skills required: AWS Q, CodeWhisperer, Gen AI, CI/CD, contenarization, Go, microservices, RESTful API, MySQL, PHP, PostgreSQL MatchMove is Looking for: As a Technical Lead (Backend ), you will play a pivotal role in shaping the engineering foundation for a robust, real-time, cross-border payment platform. You’ll be writing clean, secure, and scalable Go services powering billions in financial flows, while championing engineering excellence and thoughtful platform design. You will contribute to:: Developing and scaling distributed payment transaction systems for cross-border and domestic remittance use cases. Designing resilient microservices in Go for high-volume, low-latency transaction flows with regional compliance and localization. Owning service-level metrics such as SLA adherence, latency (p95/p99), throughput, and availability. Building API-first products with strong documentation, mocks, and observability from day one. Enabling faster, safer development by leveraging Generative AI for test generation, documentation, and repetitive coding tasks — while maintaining engineering hygiene. Mentoring a high-performing, globally distributed engineering team and contributing to code reviews, design sessions, and cross-team collaboration. Responsibilities Lead design and development of backend services in Go with concurrency, memory safety, and observability in mind. Manage service uptime and reliability across multi-region deployments via dashboards, tracing, and alerting. 
Maintain strict SLAs for mission-critical payment operations and support incident response during SLA violations. Profile and optimize Go services using tools like pprof, benchstat, and the Go race detector. Drive code quality through test-driven development, code reviews, and API-first workflows (OpenAPI / Swagger). Collaborate cross-functionally with Product, QA, DevOps, Compliance, and Business to ensure production-readiness. Maintain well-documented service boundaries and internal libraries for scalable engineering velocity. Encourage strategic use of Generative AI for API mocking, test data generation, schema validation, and static analysis. Advocate for clean architecture, technical debt remediation, and security best practices (e.g., rate limiting, mTLS, context timeouts). Requirements Atleast 7 years of engineering experience with deep expertise in Go (Golang). Expert-level understanding of concurrency, goroutines, channels, synchronization primitives, and distributed coordination patterns Strong grasp of profiling and debugging Go applications, memory management, and performance tuning. Proven experience in instrumenting production systems for SLAs/SLIs with tools like Prometheus, Grafana, or OpenTelemetry. Solid experience with PostgreSQL / MySQL, schema design for high-consistency systems, and transaction lifecycle in financial services. Experience building, documenting, and scaling RESTful APIs in an API-first platform environment. Comfort with cloud-native tooling, containerization, and DevOps workflows (CI/CD, blue-green deployment, rollback strategies). Demonstrated understanding of observability practices: structured logging, distributed tracing, and alerting workflows. Brownie Points Experience in payments, card issuance, or remittance infrastructure. Working knowledge of PHP (for legacy systems). Contributions to Go open-source projects or public technical content. 
Experience with GenAI development tools like AWS Q , CodeWhisperer in a team setting Track record of delivering high-quality services in regulated environments with audit, compliance, and security mandates. Engagement Model:: Direct placement with client This is remote role Shift timings: :10 AM to 7 PM How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you! Show more Show less
Posted 1 day ago
7.0 years
40 Lacs
Ghaziabad, Uttar Pradesh, India
Remote
Experience : 7.00 + years Salary : INR 4000000.00 / year (based on experience) Expected Notice Period : 15 Days Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full Time Permanent position(Payroll and Compliance to be managed by: MatchMove) (*Note: This is a requirement for one of Uplers' client - MatchMove) What do you need for this opportunity? Must have skills required: Gen AI, AWS data stack, Kinesis, open table format, Pyspark, stream processing, Kafka, MySQL, Python MatchMove is Looking for: Technical Lead - Data Platform Data, you will architect, implement, and scale our end-to-end data platform built on AWS S3, Glue, Lake Formation, and DMS. You will lead a small team of engineers while working cross-functionally with stakeholders from fraud, finance, product, and engineering to enable reliable, timely, and secure data access across the business. You will champion best practices in data design, governance, and observability, while leveraging GenAI tools to improve engineering productivity and accelerate time to insight. You will contribute to Owning the design and scalability of the data lake architecture for both streaming and batch workloads, leveraging AWS-native services. Leading the development of ingestion, transformation, and storage pipelines using AWS Glue, DMS, Kinesis/Kafka, and PySpark. Structuring and evolving data into OTF formats (Apache Iceberg, Delta Lake) to support real-time and time-travel queries for downstream services. Driving data productization, enabling API-first and self-service access to curated datasets for fraud detection, reconciliation, and reporting use cases. Defining and tracking SLAs and SLOs for critical data pipelines, ensuring high availability and data accuracy in a regulated fintech environment. Collaborating with InfoSec, SRE, and Data Governance teams to enforce data security, lineage tracking, access control, and compliance (GDPR, MAS TRM). 
Using Generative AI tools to enhance developer productivity — including auto-generating test harnesses, schema documentation, transformation scaffolds, and performance insights. Mentoring data engineers, setting technical direction, and ensuring delivery of high-quality, observable data pipelines. Responsibilities:: Architect scalable, cost-optimized pipelines across real-time and batch paradigms, using tools such as AWS Glue, Step Functions, Airflow, or EMR. Manage ingestion from transactional sources using AWS DMS, with a focus on schema drift handling and low-latency replication. Design efficient partitioning, compression, and metadata strategies for Iceberg or Hudi tables stored in S3, and cataloged with Glue and Lake Formation. Build data marts, audit views, and analytics layers that support both machine-driven processes (e.g. fraud engines) and human-readable interfaces (e.g. dashboards). Ensure robust data observability with metrics, alerting, and lineage tracking via OpenLineage or Great Expectations. Lead quarterly reviews of data cost, performance, schema evolution, and architecture design with stakeholders and senior leadership. Enforce version control, CI/CD, and infrastructure-as-code practices using GitOps and tools like Terraform. Requirements At-least 7 years of experience in data engineering. Deep hands-on experience with AWS data stack: Glue (Jobs & Crawlers), S3, Athena, Lake Formation, DMS, and Redshift Spectrum Expertise in designing data pipelines for real-time, streaming, and batch systems, including schema design, format optimization, and SLAs. Strong programming skills in Python (PySpark) and advanced SQL for analytical processing and transformation. Proven experience managing data architectures using open table formats (Iceberg, Delta Lake, Hudi) at scale Understanding of stream processing with Kinesis/Kafka and orchestration via Airflow or Step Functions. 
Experience implementing data access controls, encryption policies, and compliance workflows in regulated environments. Ability to integrate GenAI tools into data engineering processes to drive measurable productivity and quality gains — with strong engineering hygiene. Demonstrated ability to lead teams, drive architectural decisions, and collaborate with cross-functional stakeholders. Brownie Points:: Experience working in a PCI DSS or any other central bank regulated environment with audit logging and data retention requirements. Experience in the payments or banking domain, with use cases around reconciliation, chargeback analysis, or fraud detection. Familiarity with data contracts, data mesh patterns, and data as a product principles. Experience using GenAI to automate data documentation, generate data tests, or support reconciliation use cases. Exposure to performance tuning and cost optimization strategies in AWS Glue, Athena, and S3. Experience building data platforms for ML/AI teams or integrating with model feature stores. Engagement Model: : Direct placement with client This is remote role Shift timings ::10 AM to 7 PM How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you! 
Show more Show less
Posted 1 day ago
7.0 years
40 Lacs
Agra, Uttar Pradesh, India
Remote
Experience : 7.00 + years Salary : INR 4000000.00 / year (based on experience) Expected Notice Period : 15 Days Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full Time Permanent position(Payroll and Compliance to be managed by: MatchMove) (*Note: This is a requirement for one of Uplers' client - MatchMove) What do you need for this opportunity? Must have skills required: Gen AI, AWS data stack, Kinesis, open table format, Pyspark, stream processing, Kafka, MySQL, Python MatchMove is Looking for: Technical Lead - Data Platform Data, you will architect, implement, and scale our end-to-end data platform built on AWS S3, Glue, Lake Formation, and DMS. You will lead a small team of engineers while working cross-functionally with stakeholders from fraud, finance, product, and engineering to enable reliable, timely, and secure data access across the business. You will champion best practices in data design, governance, and observability, while leveraging GenAI tools to improve engineering productivity and accelerate time to insight. You will contribute to Owning the design and scalability of the data lake architecture for both streaming and batch workloads, leveraging AWS-native services. Leading the development of ingestion, transformation, and storage pipelines using AWS Glue, DMS, Kinesis/Kafka, and PySpark. Structuring and evolving data into OTF formats (Apache Iceberg, Delta Lake) to support real-time and time-travel queries for downstream services. Driving data productization, enabling API-first and self-service access to curated datasets for fraud detection, reconciliation, and reporting use cases. Defining and tracking SLAs and SLOs for critical data pipelines, ensuring high availability and data accuracy in a regulated fintech environment. Collaborating with InfoSec, SRE, and Data Governance teams to enforce data security, lineage tracking, access control, and compliance (GDPR, MAS TRM). 
Using Generative AI tools to enhance developer productivity — including auto-generating test harnesses, schema documentation, transformation scaffolds, and performance insights. Mentoring data engineers, setting technical direction, and ensuring delivery of high-quality, observable data pipelines. Responsibilities:: Architect scalable, cost-optimized pipelines across real-time and batch paradigms, using tools such as AWS Glue, Step Functions, Airflow, or EMR. Manage ingestion from transactional sources using AWS DMS, with a focus on schema drift handling and low-latency replication. Design efficient partitioning, compression, and metadata strategies for Iceberg or Hudi tables stored in S3, and cataloged with Glue and Lake Formation. Build data marts, audit views, and analytics layers that support both machine-driven processes (e.g. fraud engines) and human-readable interfaces (e.g. dashboards). Ensure robust data observability with metrics, alerting, and lineage tracking via OpenLineage or Great Expectations. Lead quarterly reviews of data cost, performance, schema evolution, and architecture design with stakeholders and senior leadership. Enforce version control, CI/CD, and infrastructure-as-code practices using GitOps and tools like Terraform. Requirements At-least 7 years of experience in data engineering. Deep hands-on experience with AWS data stack: Glue (Jobs & Crawlers), S3, Athena, Lake Formation, DMS, and Redshift Spectrum Expertise in designing data pipelines for real-time, streaming, and batch systems, including schema design, format optimization, and SLAs. Strong programming skills in Python (PySpark) and advanced SQL for analytical processing and transformation. Proven experience managing data architectures using open table formats (Iceberg, Delta Lake, Hudi) at scale Understanding of stream processing with Kinesis/Kafka and orchestration via Airflow or Step Functions. 
Experience implementing data access controls, encryption policies, and compliance workflows in regulated environments. Ability to integrate GenAI tools into data engineering processes to drive measurable productivity and quality gains — with strong engineering hygiene. Demonstrated ability to lead teams, drive architectural decisions, and collaborate with cross-functional stakeholders. Brownie Points:: Experience working in a PCI DSS or any other central bank regulated environment with audit logging and data retention requirements. Experience in the payments or banking domain, with use cases around reconciliation, chargeback analysis, or fraud detection. Familiarity with data contracts, data mesh patterns, and data as a product principles. Experience using GenAI to automate data documentation, generate data tests, or support reconciliation use cases. Exposure to performance tuning and cost optimization strategies in AWS Glue, Athena, and S3. Experience building data platforms for ML/AI teams or integrating with model feature stores. Engagement Model: : Direct placement with client This is remote role Shift timings ::10 AM to 7 PM How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you! 
Show more Show less
Posted 1 day ago
7.0 years
40 Lacs
Noida, Uttar Pradesh, India
Remote
Experience : 7.00 + years Salary : INR 4000000.00 / year (based on experience) Expected Notice Period : 15 Days Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full Time Permanent position(Payroll and Compliance to be managed by: MatchMove) (*Note: This is a requirement for one of Uplers' client - MatchMove) What do you need for this opportunity? Must have skills required: AWS Q, CodeWhisperer, Gen AI, CI/CD, contenarization, Go, microservices, RESTful API, MySQL, PHP, PostgreSQL MatchMove is Looking for: As a Technical Lead (Backend ), you will play a pivotal role in shaping the engineering foundation for a robust, real-time, cross-border payment platform. You’ll be writing clean, secure, and scalable Go services powering billions in financial flows, while championing engineering excellence and thoughtful platform design. You will contribute to:: Developing and scaling distributed payment transaction systems for cross-border and domestic remittance use cases. Designing resilient microservices in Go for high-volume, low-latency transaction flows with regional compliance and localization. Owning service-level metrics such as SLA adherence, latency (p95/p99), throughput, and availability. Building API-first products with strong documentation, mocks, and observability from day one. Enabling faster, safer development by leveraging Generative AI for test generation, documentation, and repetitive coding tasks — while maintaining engineering hygiene. Mentoring a high-performing, globally distributed engineering team and contributing to code reviews, design sessions, and cross-team collaboration. Responsibilities Lead design and development of backend services in Go with concurrency, memory safety, and observability in mind. Manage service uptime and reliability across multi-region deployments via dashboards, tracing, and alerting. 
Maintain strict SLAs for mission-critical payment operations and support incident response during SLA violations. Profile and optimize Go services using tools like pprof, benchstat, and the Go race detector. Drive code quality through test-driven development, code reviews, and API-first workflows (OpenAPI / Swagger). Collaborate cross-functionally with Product, QA, DevOps, Compliance, and Business to ensure production-readiness. Maintain well-documented service boundaries and internal libraries for scalable engineering velocity. Encourage strategic use of Generative AI for API mocking, test data generation, schema validation, and static analysis. Advocate for clean architecture, technical debt remediation, and security best practices (e.g., rate limiting, mTLS, context timeouts). Requirements Atleast 7 years of engineering experience with deep expertise in Go (Golang). Expert-level understanding of concurrency, goroutines, channels, synchronization primitives, and distributed coordination patterns Strong grasp of profiling and debugging Go applications, memory management, and performance tuning. Proven experience in instrumenting production systems for SLAs/SLIs with tools like Prometheus, Grafana, or OpenTelemetry. Solid experience with PostgreSQL / MySQL, schema design for high-consistency systems, and transaction lifecycle in financial services. Experience building, documenting, and scaling RESTful APIs in an API-first platform environment. Comfort with cloud-native tooling, containerization, and DevOps workflows (CI/CD, blue-green deployment, rollback strategies). Demonstrated understanding of observability practices: structured logging, distributed tracing, and alerting workflows. Brownie Points Experience in payments, card issuance, or remittance infrastructure. Working knowledge of PHP (for legacy systems). Contributions to Go open-source projects or public technical content. 
Experience with GenAI development tools like AWS Q , CodeWhisperer in a team setting Track record of delivering high-quality services in regulated environments with audit, compliance, and security mandates. Engagement Model:: Direct placement with client This is remote role Shift timings: :10 AM to 7 PM How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you! Show more Show less
7.0 years
40 Lacs
Noida, Uttar Pradesh, India
Remote
Experience: 7.00+ years
Salary: INR 4000000.00 / year (based on experience)
Expected Notice Period: 15 Days
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full Time Permanent position (payroll and compliance to be managed by MatchMove)
(*Note: This is a requirement for one of Uplers' clients - MatchMove)

What do you need for this opportunity?
Must have skills required: Gen AI, AWS data stack, Kinesis, open table format, PySpark, stream processing, Kafka, MySQL, Python

MatchMove is looking for: Technical Lead - Data Platform
As a Technical Lead (Data Platform), you will architect, implement, and scale our end-to-end data platform built on AWS S3, Glue, Lake Formation, and DMS. You will lead a small team of engineers while working cross-functionally with stakeholders from fraud, finance, product, and engineering to enable reliable, timely, and secure data access across the business. You will champion best practices in data design, governance, and observability, while leveraging GenAI tools to improve engineering productivity and accelerate time to insight.

You will contribute to:
Owning the design and scalability of the data lake architecture for both streaming and batch workloads, leveraging AWS-native services.
Leading the development of ingestion, transformation, and storage pipelines using AWS Glue, DMS, Kinesis/Kafka, and PySpark.
Structuring and evolving data into open table formats (Apache Iceberg, Delta Lake) to support real-time and time-travel queries for downstream services.
Driving data productization, enabling API-first and self-service access to curated datasets for fraud detection, reconciliation, and reporting use cases.
Defining and tracking SLAs and SLOs for critical data pipelines, ensuring high availability and data accuracy in a regulated fintech environment.
Collaborating with InfoSec, SRE, and Data Governance teams to enforce data security, lineage tracking, access control, and compliance (GDPR, MAS TRM).
Using Generative AI tools to enhance developer productivity, including auto-generating test harnesses, schema documentation, transformation scaffolds, and performance insights.
Mentoring data engineers, setting technical direction, and ensuring delivery of high-quality, observable data pipelines.

Responsibilities:
Architect scalable, cost-optimized pipelines across real-time and batch paradigms, using tools such as AWS Glue, Step Functions, Airflow, or EMR.
Manage ingestion from transactional sources using AWS DMS, with a focus on schema drift handling and low-latency replication.
Design efficient partitioning, compression, and metadata strategies for Iceberg or Hudi tables stored in S3 and cataloged with Glue and Lake Formation.
Build data marts, audit views, and analytics layers that support both machine-driven processes (e.g., fraud engines) and human-readable interfaces (e.g., dashboards).
Ensure robust data observability with metrics, alerting, and lineage tracking via OpenLineage or Great Expectations.
Lead quarterly reviews of data cost, performance, schema evolution, and architecture design with stakeholders and senior leadership.
Enforce version control, CI/CD, and infrastructure-as-code practices using GitOps and tools like Terraform.

Requirements
At least 7 years of experience in data engineering.
Deep hands-on experience with the AWS data stack: Glue (Jobs & Crawlers), S3, Athena, Lake Formation, DMS, and Redshift Spectrum.
Expertise in designing data pipelines for real-time, streaming, and batch systems, including schema design, format optimization, and SLAs.
Strong programming skills in Python (PySpark) and advanced SQL for analytical processing and transformation.
Proven experience managing data architectures using open table formats (Iceberg, Delta Lake, Hudi) at scale.
Understanding of stream processing with Kinesis/Kafka and orchestration via Airflow or Step Functions.
Experience implementing data access controls, encryption policies, and compliance workflows in regulated environments.
Ability to integrate GenAI tools into data engineering processes to drive measurable productivity and quality gains, with strong engineering hygiene.
Demonstrated ability to lead teams, drive architectural decisions, and collaborate with cross-functional stakeholders.

Brownie Points:
Experience working in a PCI DSS or other central-bank-regulated environment with audit logging and data retention requirements.
Experience in the payments or banking domain, with use cases around reconciliation, chargeback analysis, or fraud detection.
Familiarity with data contracts, data mesh patterns, and data-as-a-product principles.
Experience using GenAI to automate data documentation, generate data tests, or support reconciliation use cases.
Exposure to performance tuning and cost optimization strategies in AWS Glue, Athena, and S3.
Experience building data platforms for ML/AI teams or integrating with model feature stores.

Engagement Model: Direct placement with client
This is a remote role.
Shift timings: 10 AM to 7 PM

The application steps and the note about Uplers are the same as for the Technical Lead (Backend) role above.
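This role emphasizes efficient partitioning strategies for Iceberg or Hudi tables stored in S3. A minimal stdlib-only Python sketch of one common approach, date-based partition prefixes, is below; the bucket name, table name, and `partition_path` helper are illustrative assumptions, not MatchMove's actual layout.

```python
from datetime import datetime, timezone

def partition_path(table: str, event_ts: float) -> str:
    """Map an epoch timestamp (UTC) to a year/month/day partition
    prefix, the kind of layout engines like Athena or Spark can
    prune when a query filters on the partition columns.
    """
    dt = datetime.fromtimestamp(event_ts, tz=timezone.utc)
    return f"s3://data-lake/{table}/year={dt:%Y}/month={dt:%m}/day={dt:%d}/"

# An event from 2024-03-05T00:00:00Z lands under that day's prefix:
print(partition_path("payments", 1709596800.0))
# s3://data-lake/payments/year=2024/month=03/day=05/
```

In practice, table formats like Iceberg track partition values in metadata rather than relying on path layout alone, but the principle of pruning by a low-cardinality time key is the same.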