12.0 years
0 Lacs
Hyderabad, Telangana, India
Remote
Overview PepsiCo operates in an environment undergoing immense and rapid change. Big-data and digital technologies are driving business transformation that is unlocking new capabilities and business innovations in areas like eCommerce, mobile experiences and IoT. The key to winning in these areas is being able to leverage enterprise data foundations built on PepsiCo’s global business scale to enable business insights, advanced analytics, and new product development. PepsiCo’s Data Management and Operations team is responsible for developing quality data collection processes, maintaining the integrity of our data foundations, and enabling business leaders and data scientists across the company to have rapid access to the data they need for decision-making and innovation. What PepsiCo Data Management and Operations does: Maintain a predictable, transparent, global operating rhythm that ensures always-on access to high-quality data for stakeholders across the company. Responsible for day-to-day data collection, transportation, maintenance/curation, and access to the PepsiCo corporate data asset. Work cross-functionally across the enterprise to centralize data and standardize it for use by business, data science or other stakeholders. Increase awareness about available data and democratize access to it across the company. As a data engineering lead, you will be the key technical expert overseeing PepsiCo's data product build & operations and will drive a strong vision for how data engineering can proactively create a positive impact on the business. You'll be empowered to create & lead a strong team of data engineers who build data pipelines into various source systems, rest data on the PepsiCo Data Lake, and enable exploration and access for analytics, visualization, machine learning, and product development efforts across the company. As a member of the data engineering team, you will help lead the development of very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. You will work closely with process owners, product owners and business users. You'll be working in a hybrid environment with in-house, on-premises data sources as well as cloud and remote systems. Ideally, the candidate is flexible to work an alternative schedule: either a traditional work week from Monday to Friday, or Tuesday to Saturday, or Sunday to Thursday, depending upon the coverage requirements of the job. The candidate can work with their immediate supervisor to change the work schedule on a rotational basis depending on the product and project requirements. Responsibilities Manage a team of data engineers and data analysts by delegating project responsibilities and managing their flow of work, as well as empowering them to realize their full potential. Design, structure and store data in unified data models and link them together to make the data reusable for downstream products. Manage and scale data pipelines from internal and external data sources to support new product launches and drive data quality across data products. Create reusable accelerators and solutions to migrate data from legacy data warehouse platforms such as Teradata to Azure Databricks and Azure SQL.
Enable and accelerate standards-based development, prioritizing reuse of code, and adopt test-driven development, unit testing and test automation with end-to-end observability of data. Build and own the automation and monitoring frameworks that capture metrics and operational KPIs for data pipeline quality, performance and cost. Collaborate with internal clients (product teams, sector leads, data science teams) and external partners (SI partners/data providers) to drive solutioning and clarify solution requirements. Evolve the architectural capabilities and maturity of the data platform by engaging with enterprise architects to build and support the right domain architecture for each application, following well-architected design standards. Define and manage SLAs for data products and processes running in production. Create documentation for learnings and knowledge transfer to internal associates. Qualifications 12+ years of overall technology experience, including at least 5+ years of hands-on software development, data engineering, and systems architecture. 8+ years of experience with Data Lakehouse, Data Warehousing, and Data Analytics tools. 6+ years of experience in SQL optimization and performance tuning on MS SQL Server, Azure SQL or any other popular RDBMS. 6+ years of experience in Python/PySpark/Scala programming on big data platforms like Databricks. 4+ years of cloud data engineering experience in Azure or AWS. Fluent with Azure cloud services. Azure Data Engineering certification is a plus. Experience with integration of multi-cloud services with on-premises technologies. Experience with data modelling, data warehousing, and building high-volume ETL/ELT pipelines. Experience with data profiling and data quality tools like Great Expectations. Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets. Experience with at least one business intelligence tool such as Power BI or Tableau. Experience with running and scaling applications on cloud infrastructure and containerized services like Kubernetes. Experience with version control systems like ADO and GitHub, and CI/CD tools for DevOps automation and deployments. Experience with Azure Data Factory, Azure Databricks and Azure Machine Learning tools. Experience with statistical/ML techniques is a plus. Experience with building solutions in the retail or supply chain space is a plus. Understanding of metadata management, data lineage, and data glossaries is a plus. BA/BS in Computer Science, Math, Physics, or other technical fields. Candidates must be flexible to work an alternative work schedule: either a traditional work week from Monday to Friday, or Tuesday to Saturday, or Sunday to Thursday, depending upon the product and project coverage requirements of the job. Candidates are expected to be in the office at the assigned location at least 3 days a week, and the days at work need to be coordinated with their immediate supervisor. Skills, Abilities, Knowledge: Excellent communication skills, both verbal and written, along with the ability to influence and demonstrate confidence in communications with senior-level management. Proven track record of leading and mentoring data teams. Strong change manager; comfortable with change, especially that which arises through company growth. Ability to understand and translate business requirements into data and technical requirements.
High degree of organization and ability to manage multiple, competing projects and priorities simultaneously. Positive and flexible attitude to enable adjusting to different needs in an ever-changing environment. Strong leadership, organizational and interpersonal skills; comfortable managing trade-offs. Foster a team culture of accountability, communication, and self-management. Proactively drives impact and engagement while bringing others along. Consistently attain/exceed individual and team goals. Ability to lead others without direct authority in a matrixed environment. Comfortable working in a hybrid environment with teams consisting of contractors as well as FTEs spread across multiple PepsiCo locations. Domain knowledge in the CPG industry with a Supply Chain/GTM background is preferred.
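The responsibilities above include building reusable accelerators to migrate legacy warehouses such as Teradata to Azure Databricks. As a minimal, hypothetical sketch of one such migration step (the host, table, and credential values are invented, and the Teradata JDBC driver is assumed to be installed on the cluster):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("teradata_to_databricks").getOrCreate()

# Pull a legacy table over JDBC; in practice, credentials would come
# from a Databricks secret scope rather than being inlined.
src = (
    spark.read.format("jdbc")
    .option("url", "jdbc:teradata://td-prod.example.com/DATABASE=sales")  # hypothetical host
    .option("dbtable", "sales.daily_orders")                              # hypothetical table
    .option("user", "etl_user")
    .option("password", "REDACTED")
    .load()
)

# Land the data as a Delta table so downstream products can reuse it.
(
    src.write.format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .saveAsTable("bronze.daily_orders")
)
```

A reusable accelerator would wrap this pattern in a parameterized job driven by a table-inventory config rather than hard-coded names.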
Posted 1 week ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
Remote
Overview PepsiCo operates in an environment undergoing immense and rapid change. Big-data and digital technologies are driving business transformation that is unlocking new capabilities and business innovations in areas like eCommerce, mobile experiences and IoT. The key to winning in these areas is being able to leverage enterprise data foundations built on PepsiCo’s global business scale to enable business insights, advanced analytics, and new product development. PepsiCo’s Data Management and Operations team is responsible for developing quality data collection processes, maintaining the integrity of our data foundations, and enabling business leaders and data scientists across the company to have rapid access to the data they need for decision-making and innovation. What PepsiCo Data Management and Operations does: Maintain a predictable, transparent, global operating rhythm that ensures always-on access to high-quality data for stakeholders across the company. Responsible for day-to-day data collection, transportation, maintenance/curation, and access to the PepsiCo corporate data asset. Work cross-functionally across the enterprise to centralize data and standardize it for use by business, data science or other stakeholders. Increase awareness about available data and democratize access to it across the company. As a data engineering lead, you will be the key technical expert overseeing PepsiCo's data product build & operations and will drive a strong vision for how data engineering can proactively create a positive impact on the business. You'll be empowered to create & lead a strong team of data engineers who build data pipelines into various source systems, rest data on the PepsiCo Data Lake, and enable exploration and access for analytics, visualization, machine learning, and product development efforts across the company. As a member of the data engineering team, you will help lead the development of very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. You will work closely with process owners, product owners and business users. You'll be working in a hybrid environment with in-house, on-premises data sources as well as cloud and remote systems. Ideally, the candidate is flexible to work an alternative schedule: either a traditional work week from Monday to Friday, or Tuesday to Saturday, or Sunday to Thursday, depending upon the coverage requirements of the job. The candidate can work with their immediate supervisor to change the work schedule on a rotational basis depending on the product and project requirements. Responsibilities Provide leadership and management to a team of data engineers, managing processes and their flow of work, vetting their designs, and mentoring them to realize their full potential. Act as a subject matter expert across different digital projects. Oversee work with internal clients and external partners to structure and store data into unified taxonomies and link them together with standard identifiers. Manage and scale data pipelines from internal and external data sources to support new product launches and drive data quality across data products. Build and own the automation and monitoring frameworks that capture metrics and operational KPIs for data pipeline quality and performance.
Responsible for implementing best practices around systems integration, security, performance, and data management. Empower the business by creating value through the increased adoption of data, data science and the business intelligence landscape. Collaborate with internal clients (data science and product teams) to drive solutioning and POC discussions. Evolve the architectural capabilities and maturity of the data platform by engaging with enterprise architects and strategic internal and external partners. Develop and optimize procedures to “productionalize” data science models. Define and manage SLAs for data products and processes running in production. Support large-scale experimentation done by data scientists. Prototype new approaches and build solutions at scale. Research state-of-the-art methodologies. Create documentation for learnings and knowledge transfer. Create and audit reusable packages or libraries. Qualifications 8+ years of overall technology experience, including at least 4+ years of hands-on software development, data engineering, and systems architecture. 4+ years of experience with Data Lake Infrastructure, Data Warehousing, and Data Analytics tools. 4+ years of experience in SQL optimization and performance tuning, and development experience in programming languages like Python, PySpark, and Scala. 2+ years of cloud data engineering experience in Azure. Fluent with Azure cloud services. Azure Certification is a plus. Experience in Azure Log Analytics. Experience with integration of multi-cloud services with on-premises technologies. Experience with data modelling, data warehousing, and building high-volume ETL/ELT pipelines. Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations. Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets. Experience with at least one MPP database technology such as Redshift, Synapse or Snowflake. Experience with running and scaling applications on cloud infrastructure and containerized services like Kubernetes. Experience with version control systems like GitHub and deployment & CI tools. Experience with Azure Data Factory, Azure Databricks and Azure Machine Learning tools. Experience with statistical/ML techniques is a plus. Experience with building solutions in the retail or supply chain space is a plus. Understanding of metadata management, data lineage, and data glossaries is a plus. Working knowledge of agile development, including DevOps and DataOps concepts. Familiarity with business intelligence tools (such as Power BI). BA/BS in Computer Science, Math, Physics, or other technical fields. Candidates must be flexible to work an alternative work schedule: either a traditional work week from Monday to Friday, or Tuesday to Saturday, or Sunday to Thursday, depending upon the product and project coverage requirements of the job. Candidates are expected to be in the office at the assigned location at least 3 days a week, and the days at work need to be coordinated with their immediate supervisor. Skills, Abilities, Knowledge: Excellent communication skills, both verbal and written, along with the ability to influence and demonstrate confidence in communications with senior-level management. Proven track record of leading and mentoring data teams. Strong change manager; comfortable with change, especially that which arises through company growth.
Ability to understand and translate business requirements into data and technical requirements. High degree of organization and ability to manage multiple, competing projects and priorities simultaneously. Positive and flexible attitude to enable adjusting to different needs in an ever-changing environment. Strong leadership, organizational and interpersonal skills; comfortable managing trade-offs. Foster a team culture of accountability, communication, and self-management. Proactively drives impact and engagement while bringing others along. Consistently attain/exceed individual and team goals. Ability to lead others without direct authority in a matrixed environment.
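Both PepsiCo listings call out data quality tools like Great Expectations. A minimal sketch of a declarative check using the classic pandas-based API (newer Great Expectations releases use a different, context-based entry point; the column names here are invented):

```python
import pandas as pd
import great_expectations as ge

# Toy batch standing in for a real pipeline output.
df = pd.DataFrame({"customer_id": [1, 2, None], "revenue": [10.0, -5.0, 30.0]})
gdf = ge.from_pandas(df)

# Each expectation returns a validation result with a success flag.
checks = [
    gdf.expect_column_values_to_not_be_null("customer_id"),
    gdf.expect_column_values_to_be_between("revenue", min_value=0),
]
print(all(c.success for c in checks))  # False: a null id and a negative revenue
```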
Posted 1 week ago
4.0 years
0 Lacs
India
Remote
About The Company Teikametrics' AI-powered Marketplace Optimization platform helps sellers and brand owners maximize their potential on the world's most valuable marketplaces. Founded in 2015, Teikametrics uses proprietary AI technology to maximize profitability in a simple SaaS interface. Teikametrics optimizes more than $8 billion in GMV across thousands of sellers around the world, with brands including Munchkin, mDesign, Clarks, Nutribullet, Conair, Nutrafol, and Solo Stove trusting Teikametrics to unlock the full potential of their selling and advertising on Amazon, Walmart and other marketplaces. Teikametrics continues to grow exponentially, with teams spanning 3+ countries. We are financially strong, continuously meeting or exceeding revenue targets, and we invest heavily in strengthening the foundation of our organization. About The Role Teikametrics is looking for a Senior Software Engineer - Data Engineering with strong computer science fundamentals and a background in data engineering, API integration, or large-scale data processing. This role involves designing, developing, and scaling robust data pipelines to process massive amounts of structured and unstructured data. The candidate will work closely with data scientists, analysts, and product engineers to deliver high-performance, scalable solutions. The architecture and stack evolve continuously as we scale to cater to an ever-growing customer base. Our technology stack includes Databricks, Spark (Scala), Kafka, AWS S3, and other distributed computing tools. How You'll Spend Your Time Design and implement highly scalable, fault-tolerant data pipelines for real-time and batch processing. Develop and optimize end-to-end Databricks Spark pipelines for ingesting, processing, and transforming large volumes of structured and unstructured data. Build and manage ETL (Extract, Transform, Load) processes to integrate data from diverse sources into our data ecosystem. Implement data validation, governance, and quality assurance mechanisms to ensure accuracy, completeness, and reliability. Collaborate with data scientists, ML engineers, and analysts to integrate AI/ML models into production environments, ensuring efficient data pipelines for training, deployment, and monitoring. Work with real-time data streaming solutions such as Kafka, Kinesis, or Flink to process and analyze event-driven data. Improve and optimize the performance, scalability, and efficiency of data workflows and storage solutions. Document technical designs, workflows, and best practices to facilitate knowledge sharing and maintain system documentation. Who You Are 4+ years of experience as a professional software/data engineer, with a strong background in building large-scale distributed data processing systems. Experience with AI, machine learning, or data science concepts, including working on ML feature engineering, model training pipelines, or AI-driven data analytics. Hands-on experience with Apache Spark (Scala or Python) and Databricks. Experience with real-time data streaming technologies such as Kafka, Flink, Kinesis, or Dataflow. Proficiency in Java, Scala, or Python for building scalable data engineering solutions. Deep understanding of cloud-based architectures (AWS, GCP, or Azure) and experience with S3, Lambda, EMR, Glue, or Redshift. Experience in writing well-designed, testable, and scalable AI/ML data pipelines that can be efficiently reused and maintained, with effective unit and integration testing.
Strong understanding of data warehousing principles and best practices for optimizing large-scale ETL workflows. Experience with ML frameworks such as TensorFlow, PyTorch, or Scikit-learn. Ability to optimize ML feature engineering and model training pipelines for scalability and efficiency. Knowledge of SQL and NoSQL databases for structured and unstructured data storage. Passion for collaborative development, continuous learning, and mentoring junior engineers. What Can Help You Stand Out Exposure to MLOps or Feature Stores for managing machine learning model data. Experience with data governance, compliance, and security best practices. Experience working in a fast-paced startup environment. WE'VE GOT YOU COVERED Every Teikametrics employee is eligible for company equity Remote Work – flexibility to work from home or from our offices + remote working allowance Broadband reimbursement Group Medical Insurance – Coverage of INR 7,50,000 per annum for a family Crèche benefit Training and development allowance Press Reference About Teika: Teikametrics’ Marketplace Optimization Platform, Flywheel 2.0, Adds AI-Powered Automation to Maximize Advertising Performance Across Marketplaces The job description is representative of typical duties and responsibilities for the position and is not all-inclusive. Other duties and responsibilities may be assigned in accordance with business needs. We are proud to be an equal opportunity employer. A background check will be conducted after a conditional offer of employment is extended.
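The Teikametrics role centers on Spark pipelines and real-time streaming from Kafka. A sketch of what a Structured Streaming ingest might look like (broker, topic, schema, and paths are all invented, and the spark-sql-kafka package is assumed to be on the classpath):

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import DoubleType, StringType, StructField, StructType

spark = SparkSession.builder.appName("ad_events_stream").getOrCreate()

# Shape of the JSON payload on the topic (hypothetical).
schema = StructType([
    StructField("asin", StringType()),
    StructField("ad_spend", DoubleType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker-1:9092")  # hypothetical broker
    .option("subscribe", "ad-events")                    # hypothetical topic
    .load()
    .select(from_json(col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

# Append to a Delta table; the checkpoint makes the stream fault-tolerant.
query = (
    events.writeStream.format("delta")
    .option("checkpointLocation", "/chk/ad_events")
    .outputMode("append")
    .start("/lake/ad_events")
)
```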
Posted 1 week ago
3.0 years
35 Lacs
India
Remote
Experience: 3.00+ years Salary: INR 3500000.00 / year (based on experience) Expected Notice Period: 15 Days Shift: (GMT+05:30) Asia/Kolkata (IST) Opportunity Type: Remote Placement Type: Full Time Permanent position (Payroll and Compliance to be managed by: NA) (*Note: This is a requirement for one of Uplers' clients - Nomupay) What do you need for this opportunity? Must-have skills required: Apache Hudi, Flink, Iceberg, Apache Airflow, Spark, AWS, Azure, GCP, Kafka, SQL Nomupay is looking for: 📈 Opportunity in a company with a solid track record of performance 🤝 Opportunity to work with diverse, global teams 🚀 Rapid career advancement with opportunities to learn 💰 Competitive salary and performance bonus Design, build, and optimize scalable ETL pipelines using Apache Airflow or similar frameworks to process and transform large datasets efficiently. Utilize Spark (PySpark), Kafka, Flink, or similar tools to enable distributed data processing and real-time streaming solutions. Deploy, manage, and optimize data infrastructure on cloud platforms such as AWS, GCP, or Azure, ensuring security, scalability, and cost-effectiveness. Design and implement robust data models, ensuring data consistency, integrity, and performance across warehouses and lakes. Enhance query performance through indexing, partitioning, and tuning techniques for large-scale datasets. Manage cloud-based storage solutions (Amazon S3, Google Cloud Storage, Azure Blob Storage) and ensure data governance, security, and compliance. Work closely with data scientists, analysts, and software engineers to support data-driven decision-making, while maintaining thorough documentation of data processes. Strong proficiency in Python and SQL, with additional experience in languages such as Java or Scala. Hands-on experience with frameworks like Spark (PySpark), Kafka, Apache Hudi, Iceberg, Apache Flink, or similar tools for distributed data processing and real-time streaming. Familiarity with cloud platforms like AWS, Google Cloud Platform (GCP), or Microsoft Azure for building and managing data infrastructure. Strong understanding of data warehousing concepts and data modeling principles. Experience with ETL tools such as Apache Airflow or comparable data transformation frameworks. Proficiency in working with data lakes and cloud-based storage solutions like Amazon S3, Google Cloud Storage, or Azure Blob Storage. Expertise in Git for version control and collaborative coding. Expertise in performance tuning for large-scale data processing, including partitioning, indexing, and query optimization. NomuPay is a newly established company that, through its subsidiaries, will provide state-of-the-art unified payment solutions to help its clients accelerate growth in large, high-growth countries in Asia, Turkey, and the Middle East region. NomuPay is funded by Finch Capital, a leading European and Southeast Asian financial technology investor. NomuPay acquired Wirecard Turkey on Apr 21, 2021 for an undisclosed amount. Founders Peter Burridge, CEO Investor, board member, and strategic executive, Peter has more than 30 years of management and leadership experience at rapid-growth technology companies. His unique hands-on approach to business development and corporate governance has made him a trusted advisor and authority in the enterprise software industry and the financial technology sector.
As President of Hyperwallet, Peter guided the organization through a successful recapitalization, followed by global expansion and the ultimate sale of the business to PayPal. Peter is a recognizable figure in the San Francisco fintech community and the global payments industry. Peter has previously served in leadership roles at Oracle, Siebel, and Travelex Global Business Payments, and as an investor and advisor in the technology sector. Outside the office, Peter’s passions include racing cars, golf and rugby union. How to apply for this opportunity? Step 1: Click on Apply and register or log in on our portal. Step 2: Complete the screening form and upload your updated resume. Step 3: Increase your chances of getting shortlisted and meeting the client for the interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this one on the portal. Depending on the assessments you clear, you can apply for them as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
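The Nomupay requirement leads with ETL pipelines built on Apache Airflow. A minimal DAG sketch, assuming Airflow 2.4+ (older versions use schedule_interval instead of schedule; the task bodies and DAG id are placeholders):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull raw payment events")     # placeholder extract step


def transform():
    print("clean and enrich the batch")  # placeholder transform step


def load():
    print("write to the warehouse")      # placeholder load step


with DAG(
    dag_id="daily_payments_etl",         # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load  # linear dependency chain
```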
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Kanayannur, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. EY GDS – Data and Analytics (D&A) – Senior – Snowflake As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions like Banking, Insurance, Manufacturing, Healthcare, Retail and Auto, Supply Chain, and Finance. The opportunity We’re looking for candidates with strong technology and data understanding in the big data engineering space, with proven delivery capability. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team. Your Key Responsibilities Lead and architect the migration of the data analytics environment from Teradata to Snowflake with a focus on performance and reliability Develop and deploy big data pipelines in a cloud environment using Snowflake cloud DW ETL design, development and migration of existing on-prem ETL routines to a cloud service Interact with senior leaders, understand their business goals, and contribute to the delivery of the workstreams Design and optimize model codes for faster execution Skills And Attributes For Success Hands-on developer in the field of data warehousing and ETL Hands-on development experience in Snowflake Experience in Snowflake modelling - roles, schemas, databases Experience in integrating with third-party tools, ETL, and DBT tools Experience in Snowflake advanced concepts like setting up resource monitors and performance tuning would be preferable Applying object-oriented and functional programming styles to real-world Big Data Engineering problems using Java/Scala/Python Develop data pipelines to perform batch and real-time/stream analytics on structured and unstructured data Data processing patterns, distributed computing and building applications for real-time and batch analytics Collaborate with Product Management, Engineering, and Marketing to continuously improve Snowflake’s products and marketing To qualify for the role, you must: Be a computer science graduate or equivalent with 3 - 7 years of industry experience. Have working experience in an Agile-based delivery methodology (preferable). Have a flexible and proactive/self-motivated working style with strong personal ownership of problem resolution. Be an excellent communicator (written and verbal, formal and informal).
Be a technical expert on all aspects of Snowflake Deploy Snowflake following best practices, including ensuring knowledge transfer so that customers are properly enabled and are able to extend the capabilities of Snowflake on their own Work hands-on with customers to demonstrate and communicate implementation best practices on Snowflake technology Maintain deep understanding of competitive and complementary technologies and vendors and how to position Snowflake in relation to them Work with System Integrator consultants at a deep technical level to successfully position and deploy Snowflake in customer environments Provide guidance on how to resolve customer-specific technical challenges Ideally, you’ll also have Client management skills What We Look For Minimum 5 years of experience as an Architect on Analytics solutions and around 2 years of experience with Snowflake. People with technical experience and enthusiasm to learn new things in this fast-moving environment What Working At EY Offers At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: Support, coaching and feedback from some of the most engaging colleagues around Opportunities to develop new skills and progress your career The freedom and flexibility to handle your role in a way that’s right for you EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
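The EY listing names Snowflake advanced concepts such as setting up resource monitors. A sketch of how that can be done from Python with the snowflake-connector-python package (account, warehouse, and quota values are invented; the session needs ACCOUNTADMIN-level privileges):

```python
import snowflake.connector

# Placeholder credentials; real code would load these from a vault.
conn = snowflake.connector.connect(
    account="myorg-myaccount",
    user="platform_admin",
    password="REDACTED",
    role="ACCOUNTADMIN",
)

cur = conn.cursor()
# Cap monthly credit spend: notify at 80%, suspend the warehouse at 100%.
cur.execute("""
    CREATE OR REPLACE RESOURCE MONITOR analytics_rm
      WITH CREDIT_QUOTA = 100
      TRIGGERS ON 80 PERCENT DO NOTIFY
               ON 100 PERCENT DO SUSPEND
""")
cur.execute("ALTER WAREHOUSE analytics_wh SET RESOURCE_MONITOR = analytics_rm")
conn.close()
```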
Posted 1 week ago
3.0 years
35 Lacs
Kochi, Kerala, India
Remote
Experience: 3.00+ years Salary: INR 3500000.00 / year (based on experience) Expected Notice Period: 15 Days Shift: (GMT+05:30) Asia/Kolkata (IST) Opportunity Type: Remote Placement Type: Full Time Permanent position (Payroll and Compliance to be managed by: NA) (*Note: This is a requirement for one of Uplers' clients - Nomupay) What do you need for this opportunity? Must-have skills required: Apache Hudi, Flink, Iceberg, Apache Airflow, Spark, AWS, Azure, GCP, Kafka, SQL Nomupay is looking for: 📈 Opportunity in a company with a solid track record of performance 🤝 Opportunity to work with diverse, global teams 🚀 Rapid career advancement with opportunities to learn 💰 Competitive salary and performance bonus Design, build, and optimize scalable ETL pipelines using Apache Airflow or similar frameworks to process and transform large datasets efficiently. Utilize Spark (PySpark), Kafka, Flink, or similar tools to enable distributed data processing and real-time streaming solutions. Deploy, manage, and optimize data infrastructure on cloud platforms such as AWS, GCP, or Azure, ensuring security, scalability, and cost-effectiveness. Design and implement robust data models, ensuring data consistency, integrity, and performance across warehouses and lakes. Enhance query performance through indexing, partitioning, and tuning techniques for large-scale datasets. Manage cloud-based storage solutions (Amazon S3, Google Cloud Storage, Azure Blob Storage) and ensure data governance, security, and compliance. Work closely with data scientists, analysts, and software engineers to support data-driven decision-making, while maintaining thorough documentation of data processes. Strong proficiency in Python and SQL, with additional experience in languages such as Java or Scala. Hands-on experience with frameworks like Spark (PySpark), Kafka, Apache Hudi, Iceberg, Apache Flink, or similar tools for distributed data processing and real-time streaming. Familiarity with cloud platforms like AWS, Google Cloud Platform (GCP), or Microsoft Azure for building and managing data infrastructure. Strong understanding of data warehousing concepts and data modeling principles. Experience with ETL tools such as Apache Airflow or comparable data transformation frameworks. Proficiency in working with data lakes and cloud-based storage solutions like Amazon S3, Google Cloud Storage, or Azure Blob Storage. Expertise in Git for version control and collaborative coding. Expertise in performance tuning for large-scale data processing, including partitioning, indexing, and query optimization. NomuPay is a newly established company that, through its subsidiaries, will provide state-of-the-art unified payment solutions to help its clients accelerate growth in large, high-growth countries in Asia, Turkey, and the Middle East region. NomuPay is funded by Finch Capital, a leading European and Southeast Asian financial technology investor. NomuPay acquired Wirecard Turkey on Apr 21, 2021 for an undisclosed amount. Founders Peter Burridge, CEO Investor, board member, and strategic executive, Peter has more than 30 years of management and leadership experience at rapid-growth technology companies. His unique hands-on approach to business development and corporate governance has made him a trusted advisor and authority in the enterprise software industry and the financial technology sector.
As President of Hyperwallet, Peter guided the organization through a successful recapitalization, followed by global expansion and the ultimate sale of the business to PayPal. Peter is a recognizable figure in the San Francisco fintech community and the global payments industry. Peter has previously served in leadership roles at Oracle, Siebel, and Travelex Global Business Payments, and as an investor and advisor in the technology sector. Outside the office, Peter’s passions include racing cars, golf and rugby union. How to apply for this opportunity? Step 1: Click on Apply and register or log in on our portal. Step 2: Complete the screening form and upload your updated resume. Step 3: Increase your chances of getting shortlisted and meeting the client for the interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this one on the portal. Depending on the assessments you clear, you can apply for them as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 week ago
4.0 years
0 Lacs
Kochi, Kerala, India
On-site
Introduction In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology. Your Role And Responsibilities As a Data Engineer, you will develop, maintain, evaluate and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the AWS Cloud Data Platform. Responsibilities Experienced in building data pipelines to ingest, process, and transform data from files, streams and databases. Process the data with Spark, Python, PySpark, Scala, and Hive, HBase or other NoSQL databases on Cloud Data Platforms (AWS) or HDFS. Experienced in developing efficient software code for multiple use cases, leveraging the Spark framework with Python or Scala and Big Data technologies, for various use cases built on the platform. Experience in developing streaming pipelines. Experience working with Hadoop / AWS ecosystem components to implement scalable solutions to meet ever-increasing data volumes, using big data/cloud technologies such as Apache Spark, Kafka, and cloud computing. Preferred Education Master's Degree Required Technical And Professional Expertise Minimum 4+ years of experience in Big Data technologies with extensive data engineering experience in Spark with Python or Scala; Minimum 3 years of experience on Cloud Data Platforms on AWS; Experience in AWS EMR / AWS Glue / Databricks, AWS Redshift, DynamoDB. Good to excellent SQL skills. Exposure to streaming solutions and message brokers like Kafka. Preferred Technical And Professional Experience Certification in AWS and Databricks, or Cloudera Certified Spark Developer.
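The IBM role describes Spark pipelines that ingest, process, and transform data on AWS. A batch-shaped sketch of that pattern (bucket names, columns, and paths are invented):

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("s3_orders_batch").getOrCreate()

# Ingest raw CSV drops from S3 (hypothetical bucket and layout).
raw = spark.read.option("header", True).csv("s3://raw-bucket/orders/2024/")

cleaned = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .filter(F.col("amount").cast("double") > 0)
)

# Partitioned Parquet keeps downstream Hive/Athena-style scans cheap.
(
    cleaned.write.mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://curated-bucket/orders/")
)
```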
Posted 1 week ago
4.0 - 9.0 years
17 - 27 Lacs
Chennai, Bengaluru
Work from Office
Role & responsibilities
• Experience with big data technologies (Hadoop, Spark, Hive)
• Proven experience as a development data engineer or in a similar role, with an ETL background
• Experience with data integration / ETL best practices and data quality principles
• Play a crucial role in ensuring the quality and reliability of the data by designing, implementing, and executing comprehensive testing
• Build a comprehensive code base and business rules for testing and validating the data by working through the user stories
• Knowledge of continuous integration and continuous deployment (CI/CD) pipelines
• Familiarity with Agile/Scrum development methodologies
• Excellent analytical and problem-solving skills
• Strong communication and collaboration skills
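This role centers on testing and validating ETL output against business rules. A sketch of rule-based checks in PySpark (the dataset path, columns, and rules are invented stand-ins for rules derived from user stories):

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl_validation").getOrCreate()
df = spark.read.parquet("/data/curated/customers")  # hypothetical ETL output

# Each rule counts violating rows; zero means the rule passes.
rules = {
    "null_customer_id": df.filter(F.col("customer_id").isNull()).count(),
    "negative_balance": df.filter(F.col("balance") < 0).count(),
    "duplicate_keys": df.count() - df.dropDuplicates(["customer_id"]).count(),
}

failures = {name: n for name, n in rules.items() if n > 0}
assert not failures, f"Data quality checks failed: {failures}"
```

In a CI/CD pipeline, checks like these would run as a gate after each ETL stage, failing the build when a rule is violated.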
Posted 1 week ago
3.0 years
35 Lacs
Visakhapatnam, Andhra Pradesh, India
Remote
Experience: 3.00+ years Salary: INR 3500000.00 / year (based on experience) Expected Notice Period: 15 Days Shift: (GMT+05:30) Asia/Kolkata (IST) Opportunity Type: Remote Placement Type: Full Time Permanent position (Payroll and Compliance to be managed by: NA) (*Note: This is a requirement for one of Uplers' clients - Nomupay) What do you need for this opportunity? Must-have skills required: Apache Hudi, Flink, Iceberg, Apache Airflow, Spark, AWS, Azure, GCP, Kafka, SQL Nomupay is looking for: 📈 Opportunity in a company with a solid track record of performance 🤝 Opportunity to work with diverse, global teams 🚀 Rapid career advancement with opportunities to learn 💰 Competitive salary and performance bonus Design, build, and optimize scalable ETL pipelines using Apache Airflow or similar frameworks to process and transform large datasets efficiently. Utilize Spark (PySpark), Kafka, Flink, or similar tools to enable distributed data processing and real-time streaming solutions. Deploy, manage, and optimize data infrastructure on cloud platforms such as AWS, GCP, or Azure, ensuring security, scalability, and cost-effectiveness. Design and implement robust data models, ensuring data consistency, integrity, and performance across warehouses and lakes. Enhance query performance through indexing, partitioning, and tuning techniques for large-scale datasets. Manage cloud-based storage solutions (Amazon S3, Google Cloud Storage, Azure Blob Storage) and ensure data governance, security, and compliance. Work closely with data scientists, analysts, and software engineers to support data-driven decision-making, while maintaining thorough documentation of data processes. Strong proficiency in Python and SQL, with additional experience in languages such as Java or Scala. Hands-on experience with frameworks like Spark (PySpark), Kafka, Apache Hudi, Iceberg, Apache Flink, or similar tools for distributed data processing and real-time streaming. Familiarity with cloud platforms like AWS, Google Cloud Platform (GCP), or Microsoft Azure for building and managing data infrastructure. Strong understanding of data warehousing concepts and data modeling principles. Experience with ETL tools such as Apache Airflow or comparable data transformation frameworks. Proficiency in working with data lakes and cloud-based storage solutions like Amazon S3, Google Cloud Storage, or Azure Blob Storage. Expertise in Git for version control and collaborative coding. Expertise in performance tuning for large-scale data processing, including partitioning, indexing, and query optimization. NomuPay is a newly established company that, through its subsidiaries, will provide state-of-the-art unified payment solutions to help its clients accelerate growth in large, high-growth countries in Asia, Turkey, and the Middle East region. NomuPay is funded by Finch Capital, a leading European and Southeast Asian financial technology investor. NomuPay acquired Wirecard Turkey on Apr 21, 2021 for an undisclosed amount. Founders Peter Burridge, CEO Investor, board member, and strategic executive, Peter has more than 30 years of management and leadership experience at rapid-growth technology companies. His unique hands-on approach to business development and corporate governance has made him a trusted advisor and authority in the enterprise software industry and the financial technology sector.
As President of Hyperwallet, Peter guided the organization through a successful recapitalization, followed by global expansion and the ultimate sale of the business to PayPal. Peter is a recognizable figure in the San Francisco fintech community and the global payments industry. Peter has previously served in leadership roles at Oracle, Siebel, and Travelex Global Business Payments, and as an investor and advisor in the technology sector. Outside the office, Peter’s passions include racing cars, golf and rugby union. How to apply for this opportunity? Step 1: Click on Apply and register or log in on our portal. Step 2: Complete the screening form and upload your updated resume. Step 3: Increase your chances of getting shortlisted and meeting the client for the interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this one on the portal. Depending on the assessments you clear, you can apply for them as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 week ago
5.0 years
0 Lacs
Andhra Pradesh, India
On-site
Responsibilities Design and Develop Scalable Data Pipelines: Build and maintain robust data pipelines using Python to process, transform, and integrate large-scale data from diverse sources. Orchestration and Automation: Implement and manage workflows using orchestration tools such as Apache Airflow to ensure reliable and efficient data operations. Data Warehouse Management: Work extensively with Snowflake to design and optimize data models, schemas, and queries for analytics and reporting. Queueing Systems: Leverage message queues like Kafka, SQS, or similar tools to enable real-time or batch data processing in distributed environments. Collaboration: Partner with Data Science, Product, and Engineering teams to understand data requirements and deliver solutions that align with business objectives. Performance Optimization: Optimize the performance of data pipelines and queries to handle large volumes of data efficiently. Data Governance and Security: Ensure compliance with data governance and security standards to maintain data integrity and privacy. Documentation: Create and maintain clear, detailed documentation for data solutions, pipelines, and workflows. Qualifications Required Skills: 5+ years of experience in data engineering roles with a focus on building scalable data solutions. Proficiency in Python for ETL, data manipulation, and scripting. Hands-on experience with Snowflake or equivalent cloud-based data warehouses. Strong knowledge of orchestration tools such as Apache Airflow or similar. Expertise in implementing and managing messaging queues like Kafka, AWS SQS, or similar. Demonstrated ability to build and optimize data pipelines at scale, processing terabytes of data. Experience in data modeling, data warehousing, and database design. Proficiency in working with cloud platforms like AWS, Azure, or GCP. Strong understanding of CI/CD pipelines for data engineering workflows. Experience working in an Agile development environment, collaborating with cross-functional teams. Preferred Skills Familiarity with other programming languages like Scala or Java for data engineering tasks. Knowledge of containerization and orchestration technologies (Docker, Kubernetes). Experience with stream processing frameworks like Apache Flink. Experience with Apache Iceberg for data lake optimization and management. Exposure to machine learning workflows and integration with data pipelines. Soft Skills Strong problem-solving skills with a passion for solving complex data challenges. Excellent communication and collaboration skills to work with cross-functional teams. Ability to thrive in a fast-paced, innovative environment.
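This listing pairs orchestration and Snowflake with queueing systems like Kafka or SQS. A sketch of the SQS side of that pattern with boto3 (the queue URL and region are placeholders; each processed event would be handed to the transform/load step feeding Snowflake):

```python
import json

import boto3

sqs = boto3.client("sqs", region_name="us-east-1")
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/events"  # placeholder

# Long-poll for a batch, process each message, delete only on success.
resp = sqs.receive_message(
    QueueUrl=QUEUE_URL, MaxNumberOfMessages=10, WaitTimeSeconds=20
)
for msg in resp.get("Messages", []):
    event = json.loads(msg["Body"])
    print("processing", event)  # hand off to the transform/load step here
    sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])
```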
Posted 1 week ago
10.0 years
0 Lacs
Udaipur, Rajasthan, India
On-site
Data Architect Kadel Labs is a leading IT services company that has been delivering top-quality technology solutions since 2017, focused on enhancing business operations and productivity through tailored, scalable, and future-ready solutions. With deep domain expertise and a commitment to innovation, we help businesses stay ahead of technological trends. As a CMMI Level 3 and ISO 27001:2022 certified company, we ensure best-in-class process maturity and information security, enabling organizations to achieve their digital transformation goals with confidence and efficiency. Role: Data Architect Experience: 10+ years Location: Udaipur, Jaipur, Bangalore Domain: Telecom Job Description: We are seeking an experienced Telecom Data Architect to join our team. In this role, you will be responsible for designing comprehensive data architecture and technical solutions specifically for telecommunications industry challenges, leveraging TMforum frameworks and modern data platforms. You will work closely with customers and technology partners to deliver data solutions that address complex telecommunications business requirements, including customer experience management, network optimization, revenue assurance, and digital transformation initiatives. Key Responsibilities:
• Design and articulate enterprise-scale telecom data architectures incorporating TMforum standards and frameworks, including SID (Shared Information/Data Model), TAM (Telecom Application Map), and eTOM (enhanced Telecom Operations Map)
• Develop comprehensive data models aligned with TMforum guidelines for telecommunications domains such as Customer, Product, Service, Resource, and Partner management
• Create data architectures that support telecom-specific use cases including customer journey analytics, network performance optimization, fraud detection, and revenue assurance
• Design solutions leveraging Microsoft Azure and Databricks for telecom data processing and analytics
• Conduct technical discovery sessions with telecom clients to understand their OSS/BSS architecture, network analytics needs, customer experience requirements, and digital transformation objectives
• Design and deliver proof of concepts (POCs) and technical demonstrations showcasing modern data platforms solving real-world telecommunications challenges
• Create comprehensive architectural diagrams and implementation roadmaps for telecom data ecosystems spanning cloud, on-premises, and hybrid environments
• Evaluate and recommend appropriate big data technologies, cloud platforms, and processing frameworks based on telecom-specific requirements and regulatory compliance needs
• Design data governance frameworks compliant with telecom industry standards and regulatory requirements (GDPR, data localization, etc.)
• Stay current with the latest advancements in data technologies, including cloud services, data processing frameworks, and AI/ML capabilities
• Contribute to the development of best practices, reference architectures, and reusable solution components for accelerating proposal development
Required Skills:
• 10+ years of experience in data architecture, data engineering, or solution architecture roles, with at least 5 years in the telecommunications industry
• Deep knowledge of TMforum frameworks including SID (Shared Information/Data Model), eTOM, TAM, and their practical implementation in telecom data architectures
• Demonstrated ability to estimate project efforts, resource requirements, and implementation timelines for complex telecom data initiatives
• Hands-on experience building data models and platforms aligned with TMforum standards and telecommunications business processes
• Strong understanding of telecom OSS/BSS systems, network management, customer experience management, and revenue management domains
• Hands-on experience with data platforms including Databricks and Microsoft Azure in telecommunications contexts
• Experience with modern data processing frameworks such as Apache Kafka, Spark and Airflow for real-time telecom data streaming
• Proficiency in the Azure cloud platform and its data services, with an understanding of telecom-specific deployment requirements
• Knowledge of system monitoring and observability tools for telecommunications data infrastructure
• Experience implementing automated testing frameworks for telecom data platforms and pipelines
• Familiarity with telecom data integration patterns, ETL/ELT processes, and data governance practices specific to telecommunications
• Experience designing and implementing data lakes, data warehouses, and machine learning pipelines for telecom use cases
• Proficiency in programming languages commonly used in data processing (Python, Scala, SQL) with telecom domain applications
• Understanding of telecommunications regulatory requirements and data privacy compliance (GDPR, local data protection laws)
• Excellent communication and presentation skills, with the ability to explain complex technical concepts to telecom stakeholders
• Strong problem-solving skills and the ability to think creatively to address telecommunications industry challenges
• TMforum certifications or telecommunications industry certifications are good to have
• Relevant data platform certifications such as Databricks or Azure Data Engineer are a plus
• Willingness to travel as required
Educational Qualifications:
• Bachelor's degree in Computer Science, Information Technology, or a related field.
Visit us: https://kadellabs.com/ https://in.linkedin.com/company/kadel-labs https://www.glassdoor.co.in/Overview/Working-at-Kadel-Labs-EI_IE4991279.11,21.htm
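The architect role asks for data models aligned with the TMforum SID domains (Customer, Product, Service, Resource, Partner). As a deliberately simplified, SID-inspired illustration in Python (the real SID model is far richer and is defined by TMforum, not by this sketch):

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class Customer:
    customer_id: str
    name: str
    segment: str          # e.g. "consumer", "enterprise"


@dataclass
class Service:
    service_id: str
    status: str           # e.g. "active", "suspended"


@dataclass
class Product:
    """A purchased offer linking a Customer to the Services that realize it."""
    product_id: str
    offer_name: str
    customer: Customer
    services: List[Service] = field(default_factory=list)
```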
Posted 1 week ago
3.0 years
35 Lacs
Greater Bhopal Area
Remote
Experience: 3.00+ years Salary: INR 3500000.00 / year (based on experience) Expected Notice Period: 15 Days Shift: (GMT+05:30) Asia/Kolkata (IST) Opportunity Type: Remote Placement Type: Full Time Permanent position (Payroll and Compliance to be managed by: NA) (*Note: This is a requirement for one of Uplers' clients - Nomupay) What do you need for this opportunity? Must-have skills required: Apache Hudi, Flink, Iceberg, Apache Airflow, Spark, AWS, Azure, GCP, Kafka, SQL Nomupay is looking for: 📈 Opportunity in a company with a solid track record of performance 🤝 Opportunity to work with diverse, global teams 🚀 Rapid career advancement with opportunities to learn 💰 Competitive salary and performance bonus Design, build, and optimize scalable ETL pipelines using Apache Airflow or similar frameworks to process and transform large datasets efficiently. Utilize Spark (PySpark), Kafka, Flink, or similar tools to enable distributed data processing and real-time streaming solutions. Deploy, manage, and optimize data infrastructure on cloud platforms such as AWS, GCP, or Azure, ensuring security, scalability, and cost-effectiveness. Design and implement robust data models, ensuring data consistency, integrity, and performance across warehouses and lakes. Enhance query performance through indexing, partitioning, and tuning techniques for large-scale datasets. Manage cloud-based storage solutions (Amazon S3, Google Cloud Storage, Azure Blob Storage) and ensure data governance, security, and compliance. Work closely with data scientists, analysts, and software engineers to support data-driven decision-making, while maintaining thorough documentation of data processes. Strong proficiency in Python and SQL, with additional experience in languages such as Java or Scala. Hands-on experience with frameworks like Spark (PySpark), Kafka, Apache Hudi, Iceberg, Apache Flink, or similar tools for distributed data processing and real-time streaming. Familiarity with cloud platforms like AWS, Google Cloud Platform (GCP), or Microsoft Azure for building and managing data infrastructure. Strong understanding of data warehousing concepts and data modeling principles. Experience with ETL tools such as Apache Airflow or comparable data transformation frameworks. Proficiency in working with data lakes and cloud-based storage solutions like Amazon S3, Google Cloud Storage, or Azure Blob Storage. Expertise in Git for version control and collaborative coding. Expertise in performance tuning for large-scale data processing, including partitioning, indexing, and query optimization. NomuPay is a newly established company that, through its subsidiaries, will provide state-of-the-art unified payment solutions to help its clients accelerate growth in large, high-growth countries in Asia, Turkey, and the Middle East region. NomuPay is funded by Finch Capital, a leading European and Southeast Asian financial technology investor. NomuPay acquired Wirecard Turkey on Apr 21, 2021 for an undisclosed amount. Founders Peter Burridge, CEO Investor, board member, and strategic executive, Peter has more than 30 years of management and leadership experience at rapid-growth technology companies. His unique hands-on approach to business development and corporate governance has made him a trusted advisor and authority in the enterprise software industry and the financial technology sector.
As President of Hyperwallet, Peter guided the organization through a successful recapitalization, followed by global expansion and the ultimate sale of the business to PayPal. Peter is a recognizable figure in the San Francisco fintech community and the global payments industry. Peter has previously served in leadership roles at Oracle, Siebel, and Travelex Global Business Payments, and as an investor and advisor in the technology sector. Outside the office, Peter’s passions include racing cars, golf and rugby union. How to apply for this opportunity? Step 1: Click on Apply and register or log in on our portal. Step 2: Complete the screening form and upload your updated resume. Step 3: Increase your chances of getting shortlisted and meeting the client for the interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this one on the portal. Depending on the assessments you clear, you can apply for them as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. EY GDS – Data and Analytics (D&A) – Senior – Snowflake As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions like Banking, Insurance, Manufacturing, Healthcare, Retail and Auto, Supply Chain, and Finance. The opportunity We’re looking for candidates with strong technology and data understanding in the big data engineering space, with proven delivery capability. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team. Your Key Responsibilities Lead and architect the migration of the data analytics environment from Teradata to Snowflake with a focus on performance and reliability Develop and deploy big data pipelines in a cloud environment using Snowflake cloud DW ETL design, development and migration of existing on-prem ETL routines to a cloud service Interact with senior leaders, understand their business goals, and contribute to the delivery of the workstreams Design and optimize model codes for faster execution Skills And Attributes For Success Hands-on developer in the field of data warehousing and ETL Hands-on development experience in Snowflake Experience in Snowflake modelling - roles, schemas, databases Experience in integrating with third-party tools, ETL, and DBT tools Experience in Snowflake advanced concepts like setting up resource monitors and performance tuning would be preferable Applying object-oriented and functional programming styles to real-world Big Data Engineering problems using Java/Scala/Python Develop data pipelines to perform batch and real-time/stream analytics on structured and unstructured data Data processing patterns, distributed computing and building applications for real-time and batch analytics Collaborate with Product Management, Engineering, and Marketing to continuously improve Snowflake’s products and marketing To qualify for the role, you must: Be a computer science graduate or equivalent with 3 - 7 years of industry experience. Have working experience in an Agile-based delivery methodology (preferable). Have a flexible and proactive/self-motivated working style with strong personal ownership of problem resolution. Be an excellent communicator (written and verbal, formal and informal).
Be a technical expert on all aspects of Snowflake
Deploy Snowflake following best practices, including ensuring knowledge transfer so that customers are properly enabled and are able to extend the capabilities of Snowflake on their own (a configuration sketch follows this listing)
Work hands-on with customers to demonstrate and communicate implementation best practices on Snowflake technology
Maintain a deep understanding of competitive and complementary technologies and vendors, and how to position Snowflake in relation to them
Work with System Integrator consultants at a deep technical level to successfully position and deploy Snowflake in customer environments
Provide guidance on how to resolve customer-specific technical challenges
Ideally, you’ll also have
Client management skills
What We Look For
A minimum of 5 years of experience as an architect on analytics solutions and around 2 years of experience with Snowflake
People with technical experience and enthusiasm to learn new things in this fast-moving environment
What Working At EY Offers
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
Support, coaching and feedback from some of the most engaging colleagues around
Opportunities to develop new skills and progress your career
The freedom and flexibility to handle your role in a way that’s right for you
EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
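To illustrate the "resource monitors and performance tuning" concepts this listing names, here is a minimal sketch using the Snowflake Python connector. The account identifier, credentials, and the etl_monitor/etl_wh object names are all hypothetical, and resource monitors are generally created under the ACCOUNTADMIN role; treat this as a sketch of the idea, not a prescribed setup.

```python
# Hypothetical sketch: cap credit spend with a resource monitor and attach it
# to a right-sized, auto-suspending warehouse. All names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="myorg-myaccount",  # placeholder account identifier
    user="PLATFORM_ADMIN",      # placeholder user
    password="***",
    role="ACCOUNTADMIN",        # resource monitors are account-level objects
)

statements = [
    # Notify at 90% of the monthly quota, suspend at 100%.
    """CREATE RESOURCE MONITOR IF NOT EXISTS etl_monitor
         WITH CREDIT_QUOTA = 100
         TRIGGERS ON 90 PERCENT DO NOTIFY
                  ON 100 PERCENT DO SUSPEND""",
    # Auto-suspend after 60 idle seconds so compute is not billed while idle.
    """CREATE WAREHOUSE IF NOT EXISTS etl_wh
         WAREHOUSE_SIZE = 'MEDIUM'
         AUTO_SUSPEND = 60
         AUTO_RESUME = TRUE
         RESOURCE_MONITOR = etl_monitor""",
]

cur = conn.cursor()
try:
    for stmt in statements:
        cur.execute(stmt)
finally:
    cur.close()
    conn.close()
```

Keeping AUTO_SUSPEND low and quotas enforced by a monitor is the usual first step in the cost-and-performance tuning this role describes.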
Posted 1 week ago
3.0 years
35 Lacs
Indore, Madhya Pradesh, India
Remote
Experience: 3.00+ years
Salary: INR 3,500,000.00 / year (based on experience)
Expected Notice Period: 15 days
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-time permanent position (Payroll and Compliance to be managed by: NA)
(*Note: This is a requirement for one of Uplers' clients - NomuPay)
What do you need for this opportunity?
Must-have skills: Apache Hudi, Flink, Iceberg, Apache Airflow, Spark, AWS, Azure, GCP, Kafka, SQL
NomuPay is looking for:
📈 Opportunity in a company with a solid track record of performance
🤝 Opportunity to work with diverse, global teams
🚀 Rapid career advancement with opportunities to learn
💰 Competitive salary and performance bonus
Responsibilities
Design, build, and optimize scalable ETL pipelines using Apache Airflow or similar frameworks to process and transform large datasets efficiently (a minimal Airflow sketch follows this listing).
Utilize Spark (PySpark), Kafka, Flink, or similar tools to enable distributed data processing and real-time streaming solutions.
Deploy, manage, and optimize data infrastructure on cloud platforms such as AWS, GCP, or Azure, ensuring security, scalability, and cost-effectiveness.
Design and implement robust data models, ensuring data consistency, integrity, and performance across warehouses and lakes.
Enhance query performance through indexing, partitioning, and tuning techniques for large-scale datasets.
Manage cloud-based storage solutions (Amazon S3, Google Cloud Storage, Azure Blob Storage) and ensure data governance, security, and compliance.
Work closely with data scientists, analysts, and software engineers to support data-driven decision-making, while maintaining thorough documentation of data processes.
Requirements
Strong proficiency in Python and SQL, with additional experience in languages such as Java or Scala.
Hands-on experience with frameworks like Spark (PySpark), Kafka, Apache Hudi, Iceberg, or Apache Flink for distributed data processing and real-time streaming.
Familiarity with cloud platforms like AWS, Google Cloud Platform (GCP), or Microsoft Azure for building and managing data infrastructure.
Strong understanding of data warehousing concepts and data modeling principles.
Experience with ETL tools such as Apache Airflow or comparable data transformation frameworks.
Proficiency in working with data lakes and cloud-based storage solutions like Amazon S3, Google Cloud Storage, or Azure Blob Storage.
Expertise in Git for version control and collaborative coding.
Expertise in performance tuning for large-scale data processing, including partitioning, indexing, and query optimization.
About NomuPay
NomuPay is a newly established company that, through its subsidiaries, provides state-of-the-art unified payment solutions to help its clients accelerate growth in large, high-growth markets across Asia, Turkey, and the Middle East. NomuPay is funded by Finch Capital, a leading European and Southeast Asian financial technology investor. NomuPay acquired Wirecard Turkey on April 21, 2021 for an undisclosed amount.
Founders
Peter Burridge, CEO. An investor, board member, and strategic executive, Peter has more than 30 years of management and leadership experience at rapid-growth technology companies. His unique hands-on approach to business development and corporate governance has made him a trusted advisor and authority in the enterprise software industry and the financial technology sector.
As President of Hyperwallet, Peter guided the organization through a successful recapitalization, followed by global expansion and the ultimate sale of the business to PayPal. A recognized figure in the San Francisco fintech community and the global payments industry, Peter has previously held leadership roles at Oracle, Siebel, and Travelex Global Business Payments, and has worked as an investor and advisor in the technology sector. Outside the office, his passions include racing cars, golf, and rugby union.
How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of being shortlisted and meeting the client for an interview!
About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role is to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support you through any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal apart from this one; depending on the assessments you clear, you can apply for them as well.)
So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
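As a concrete illustration of the Airflow responsibility in the listing above, here is a minimal sketch of a daily extract-transform-load DAG in Airflow 2.x style (the schedule argument replaces schedule_interval in recent releases). The DAG id, task bodies, and schedule are invented for the example; a real pipeline would replace the print statements with actual source and sink logic.

```python
# Minimal sketch of a daily ETL DAG. All names and logic are illustrative.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Pull one logical day's raw records from the source system (placeholder).
    print("extracting raw events for", context["ds"])


def transform(**context):
    # Clean and conform the raw records into the target model (placeholder).
    print("transforming events for", context["ds"])


def load(**context):
    # Write the conformed records to the lake/warehouse (placeholder).
    print("loading events for", context["ds"])


with DAG(
    dag_id="events_daily_etl",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",  # one run per logical day
    catchup=False,
) as dag:
    # Classic extract -> transform -> load dependency chain.
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3
```

Expressing the dependencies with >> keeps the graph explicit, which is what makes pipelines like this observable and restartable one task at a time.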
Posted 1 week ago
3.0 years
35 Lacs
Chandigarh, India
Remote
Experience: 3.00+ years
Salary: INR 3,500,000.00 / year (based on experience)
Expected Notice Period: 15 days
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-time permanent position (Payroll and Compliance to be managed by: NA)
(*Note: This is a requirement for one of Uplers' clients - NomuPay)
What do you need for this opportunity?
Must-have skills: Apache Hudi, Flink, Iceberg, Apache Airflow, Spark, AWS, Azure, GCP, Kafka, SQL
NomuPay is looking for:
📈 Opportunity in a company with a solid track record of performance
🤝 Opportunity to work with diverse, global teams
🚀 Rapid career advancement with opportunities to learn
💰 Competitive salary and performance bonus
Responsibilities
Design, build, and optimize scalable ETL pipelines using Apache Airflow or similar frameworks to process and transform large datasets efficiently.
Utilize Spark (PySpark), Kafka, Flink, or similar tools to enable distributed data processing and real-time streaming solutions.
Deploy, manage, and optimize data infrastructure on cloud platforms such as AWS, GCP, or Azure, ensuring security, scalability, and cost-effectiveness.
Design and implement robust data models, ensuring data consistency, integrity, and performance across warehouses and lakes.
Enhance query performance through indexing, partitioning, and tuning techniques for large-scale datasets.
Manage cloud-based storage solutions (Amazon S3, Google Cloud Storage, Azure Blob Storage) and ensure data governance, security, and compliance.
Work closely with data scientists, analysts, and software engineers to support data-driven decision-making, while maintaining thorough documentation of data processes.
Requirements
Strong proficiency in Python and SQL, with additional experience in languages such as Java or Scala.
Hands-on experience with frameworks like Spark (PySpark), Kafka, Apache Hudi, Iceberg, or Apache Flink for distributed data processing and real-time streaming.
Familiarity with cloud platforms like AWS, Google Cloud Platform (GCP), or Microsoft Azure for building and managing data infrastructure.
Strong understanding of data warehousing concepts and data modeling principles.
Experience with ETL tools such as Apache Airflow or comparable data transformation frameworks.
Proficiency in working with data lakes and cloud-based storage solutions like Amazon S3, Google Cloud Storage, or Azure Blob Storage.
Expertise in Git for version control and collaborative coding.
Expertise in performance tuning for large-scale data processing, including partitioning, indexing, and query optimization.
About NomuPay
NomuPay is a newly established company that, through its subsidiaries, provides state-of-the-art unified payment solutions to help its clients accelerate growth in large, high-growth markets across Asia, Turkey, and the Middle East. NomuPay is funded by Finch Capital, a leading European and Southeast Asian financial technology investor. NomuPay acquired Wirecard Turkey on April 21, 2021 for an undisclosed amount.
Founders
Peter Burridge, CEO. An investor, board member, and strategic executive, Peter has more than 30 years of management and leadership experience at rapid-growth technology companies. His unique hands-on approach to business development and corporate governance has made him a trusted advisor and authority in the enterprise software industry and the financial technology sector.
As President of Hyperwallet, Peter guided the organization through a successful recapitalization, followed by global expansion and the ultimate sale of the business to PayPal. A recognized figure in the San Francisco fintech community and the global payments industry, Peter has previously held leadership roles at Oracle, Siebel, and Travelex Global Business Payments, and has worked as an investor and advisor in the technology sector. Outside the office, his passions include racing cars, golf, and rugby union.
How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of being shortlisted and meeting the client for an interview!
About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role is to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support you through any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal apart from this one; depending on the assessments you clear, you can apply for them as well.)
So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 week ago
3.0 years
35 Lacs
Dehradun, Uttarakhand, India
Remote
Experience: 3.00+ years
Salary: INR 3,500,000.00 / year (based on experience)
Expected Notice Period: 15 days
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-time permanent position (Payroll and Compliance to be managed by: NA)
(*Note: This is a requirement for one of Uplers' clients - NomuPay)
What do you need for this opportunity?
Must-have skills: Apache Hudi, Flink, Iceberg, Apache Airflow, Spark, AWS, Azure, GCP, Kafka, SQL
NomuPay is looking for:
📈 Opportunity in a company with a solid track record of performance
🤝 Opportunity to work with diverse, global teams
🚀 Rapid career advancement with opportunities to learn
💰 Competitive salary and performance bonus
Responsibilities
Design, build, and optimize scalable ETL pipelines using Apache Airflow or similar frameworks to process and transform large datasets efficiently.
Utilize Spark (PySpark), Kafka, Flink, or similar tools to enable distributed data processing and real-time streaming solutions (a minimal streaming sketch follows this listing).
Deploy, manage, and optimize data infrastructure on cloud platforms such as AWS, GCP, or Azure, ensuring security, scalability, and cost-effectiveness.
Design and implement robust data models, ensuring data consistency, integrity, and performance across warehouses and lakes.
Enhance query performance through indexing, partitioning, and tuning techniques for large-scale datasets.
Manage cloud-based storage solutions (Amazon S3, Google Cloud Storage, Azure Blob Storage) and ensure data governance, security, and compliance.
Work closely with data scientists, analysts, and software engineers to support data-driven decision-making, while maintaining thorough documentation of data processes.
Requirements
Strong proficiency in Python and SQL, with additional experience in languages such as Java or Scala.
Hands-on experience with frameworks like Spark (PySpark), Kafka, Apache Hudi, Iceberg, or Apache Flink for distributed data processing and real-time streaming.
Familiarity with cloud platforms like AWS, Google Cloud Platform (GCP), or Microsoft Azure for building and managing data infrastructure.
Strong understanding of data warehousing concepts and data modeling principles.
Experience with ETL tools such as Apache Airflow or comparable data transformation frameworks.
Proficiency in working with data lakes and cloud-based storage solutions like Amazon S3, Google Cloud Storage, or Azure Blob Storage.
Expertise in Git for version control and collaborative coding.
Expertise in performance tuning for large-scale data processing, including partitioning, indexing, and query optimization.
About NomuPay
NomuPay is a newly established company that, through its subsidiaries, provides state-of-the-art unified payment solutions to help its clients accelerate growth in large, high-growth markets across Asia, Turkey, and the Middle East. NomuPay is funded by Finch Capital, a leading European and Southeast Asian financial technology investor. NomuPay acquired Wirecard Turkey on April 21, 2021 for an undisclosed amount.
Founders
Peter Burridge, CEO. An investor, board member, and strategic executive, Peter has more than 30 years of management and leadership experience at rapid-growth technology companies. His unique hands-on approach to business development and corporate governance has made him a trusted advisor and authority in the enterprise software industry and the financial technology sector.
As President of Hyperwallet, Peter guided the organization through a successful recapitalization, followed by global expansion and the ultimate sale of the business to PayPal. A recognized figure in the San Francisco fintech community and the global payments industry, Peter has previously held leadership roles at Oracle, Siebel, and Travelex Global Business Payments, and has worked as an investor and advisor in the technology sector. Outside the office, his passions include racing cars, golf, and rugby union.
How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of being shortlisted and meeting the client for an interview!
About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role is to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support you through any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal apart from this one; depending on the assessments you clear, you can apply for them as well.)
So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
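The streaming bullet in the listing above pairs Spark Structured Streaming with Kafka; the following sketch shows the shape of such a job. Broker addresses, the payments topic, and the S3 paths are placeholders, the job assumes the Spark-Kafka connector package is on the classpath, and a real pipeline would parse the payload against a schema rather than keep it as a raw string.

```python
# Minimal sketch: read a Kafka topic as a stream and land it as Parquet.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("payments-stream")  # illustrative app name
    .getOrCreate()
)

# Subscribe to a Kafka topic; the value column arrives as raw bytes.
raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")  # placeholder brokers
    .option("subscribe", "payments")                    # placeholder topic
    .option("startingOffsets", "latest")
    .load()
)

# Decode the payload and keep the timestamp Kafka attaches to each record.
events = raw.select(
    F.col("value").cast("string").alias("payload"),
    F.col("timestamp").alias("event_time"),
)

# Land micro-batches as Parquet; the checkpoint makes the query restartable.
query = (
    events.writeStream
    .format("parquet")
    .option("path", "s3a://bucket/landing/payments/")         # placeholder path
    .option("checkpointLocation", "s3a://bucket/chk/payments/")
    .trigger(processingTime="1 minute")
    .start()
)
query.awaitTermination()
```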
Posted 1 week ago
3.0 years
35 Lacs
Mysore, Karnataka, India
Remote
Experience: 3.00+ years
Salary: INR 3,500,000.00 / year (based on experience)
Expected Notice Period: 15 days
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-time permanent position (Payroll and Compliance to be managed by: NA)
(*Note: This is a requirement for one of Uplers' clients - NomuPay)
What do you need for this opportunity?
Must-have skills: Apache Hudi, Flink, Iceberg, Apache Airflow, Spark, AWS, Azure, GCP, Kafka, SQL
NomuPay is looking for:
📈 Opportunity in a company with a solid track record of performance
🤝 Opportunity to work with diverse, global teams
🚀 Rapid career advancement with opportunities to learn
💰 Competitive salary and performance bonus
Responsibilities
Design, build, and optimize scalable ETL pipelines using Apache Airflow or similar frameworks to process and transform large datasets efficiently.
Utilize Spark (PySpark), Kafka, Flink, or similar tools to enable distributed data processing and real-time streaming solutions.
Deploy, manage, and optimize data infrastructure on cloud platforms such as AWS, GCP, or Azure, ensuring security, scalability, and cost-effectiveness.
Design and implement robust data models, ensuring data consistency, integrity, and performance across warehouses and lakes.
Enhance query performance through indexing, partitioning, and tuning techniques for large-scale datasets.
Manage cloud-based storage solutions (Amazon S3, Google Cloud Storage, Azure Blob Storage) and ensure data governance, security, and compliance.
Work closely with data scientists, analysts, and software engineers to support data-driven decision-making, while maintaining thorough documentation of data processes.
Requirements
Strong proficiency in Python and SQL, with additional experience in languages such as Java or Scala.
Hands-on experience with frameworks like Spark (PySpark), Kafka, Apache Hudi, Iceberg, or Apache Flink for distributed data processing and real-time streaming.
Familiarity with cloud platforms like AWS, Google Cloud Platform (GCP), or Microsoft Azure for building and managing data infrastructure.
Strong understanding of data warehousing concepts and data modeling principles.
Experience with ETL tools such as Apache Airflow or comparable data transformation frameworks.
Proficiency in working with data lakes and cloud-based storage solutions like Amazon S3, Google Cloud Storage, or Azure Blob Storage.
Expertise in Git for version control and collaborative coding.
Expertise in performance tuning for large-scale data processing, including partitioning, indexing, and query optimization.
About NomuPay
NomuPay is a newly established company that, through its subsidiaries, provides state-of-the-art unified payment solutions to help its clients accelerate growth in large, high-growth markets across Asia, Turkey, and the Middle East. NomuPay is funded by Finch Capital, a leading European and Southeast Asian financial technology investor. NomuPay acquired Wirecard Turkey on April 21, 2021 for an undisclosed amount.
Founders
Peter Burridge, CEO. An investor, board member, and strategic executive, Peter has more than 30 years of management and leadership experience at rapid-growth technology companies. His unique hands-on approach to business development and corporate governance has made him a trusted advisor and authority in the enterprise software industry and the financial technology sector.
As President of Hyperwallet, Peter guided the organization through a successful recapitalization, followed by global expansion and the ultimate sale of the business to PayPal. A recognized figure in the San Francisco fintech community and the global payments industry, Peter has previously held leadership roles at Oracle, Siebel, and Travelex Global Business Payments, and has worked as an investor and advisor in the technology sector. Outside the office, his passions include racing cars, golf, and rugby union.
How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of being shortlisted and meeting the client for an interview!
About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role is to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support you through any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal apart from this one; depending on the assessments you clear, you can apply for them as well.)
So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 week ago
3.0 years
35 Lacs
Vijayawada, Andhra Pradesh, India
Remote
Experience: 3.00+ years
Salary: INR 3,500,000.00 / year (based on experience)
Expected Notice Period: 15 days
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-time permanent position (Payroll and Compliance to be managed by: NA)
(*Note: This is a requirement for one of Uplers' clients - NomuPay)
What do you need for this opportunity?
Must-have skills: Apache Hudi, Flink, Iceberg, Apache Airflow, Spark, AWS, Azure, GCP, Kafka, SQL
NomuPay is looking for:
📈 Opportunity in a company with a solid track record of performance
🤝 Opportunity to work with diverse, global teams
🚀 Rapid career advancement with opportunities to learn
💰 Competitive salary and performance bonus
Responsibilities
Design, build, and optimize scalable ETL pipelines using Apache Airflow or similar frameworks to process and transform large datasets efficiently.
Utilize Spark (PySpark), Kafka, Flink, or similar tools to enable distributed data processing and real-time streaming solutions.
Deploy, manage, and optimize data infrastructure on cloud platforms such as AWS, GCP, or Azure, ensuring security, scalability, and cost-effectiveness.
Design and implement robust data models, ensuring data consistency, integrity, and performance across warehouses and lakes.
Enhance query performance through indexing, partitioning, and tuning techniques for large-scale datasets (a partitioning sketch follows this listing).
Manage cloud-based storage solutions (Amazon S3, Google Cloud Storage, Azure Blob Storage) and ensure data governance, security, and compliance.
Work closely with data scientists, analysts, and software engineers to support data-driven decision-making, while maintaining thorough documentation of data processes.
Requirements
Strong proficiency in Python and SQL, with additional experience in languages such as Java or Scala.
Hands-on experience with frameworks like Spark (PySpark), Kafka, Apache Hudi, Iceberg, or Apache Flink for distributed data processing and real-time streaming.
Familiarity with cloud platforms like AWS, Google Cloud Platform (GCP), or Microsoft Azure for building and managing data infrastructure.
Strong understanding of data warehousing concepts and data modeling principles.
Experience with ETL tools such as Apache Airflow or comparable data transformation frameworks.
Proficiency in working with data lakes and cloud-based storage solutions like Amazon S3, Google Cloud Storage, or Azure Blob Storage.
Expertise in Git for version control and collaborative coding.
Expertise in performance tuning for large-scale data processing, including partitioning, indexing, and query optimization.
About NomuPay
NomuPay is a newly established company that, through its subsidiaries, provides state-of-the-art unified payment solutions to help its clients accelerate growth in large, high-growth markets across Asia, Turkey, and the Middle East. NomuPay is funded by Finch Capital, a leading European and Southeast Asian financial technology investor. NomuPay acquired Wirecard Turkey on April 21, 2021 for an undisclosed amount.
Founders
Peter Burridge, CEO. An investor, board member, and strategic executive, Peter has more than 30 years of management and leadership experience at rapid-growth technology companies. His unique hands-on approach to business development and corporate governance has made him a trusted advisor and authority in the enterprise software industry and the financial technology sector.
As President of Hyperwallet, Peter guided the organization through a successful recapitalization, followed by global expansion and the ultimate sale of the business to PayPal. A recognized figure in the San Francisco fintech community and the global payments industry, Peter has previously held leadership roles at Oracle, Siebel, and Travelex Global Business Payments, and has worked as an investor and advisor in the technology sector. Outside the office, his passions include racing cars, golf, and rugby union.
How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of being shortlisted and meeting the client for an interview!
About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role is to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support you through any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal apart from this one; depending on the assessments you clear, you can apply for them as well.)
So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
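To make the query-performance bullet in the listing above concrete, this sketch shows the standard partition-pruning pattern in PySpark: write the table partitioned on a date column so that filters on that column scan only the matching directories. The table layout, paths, and column names are invented for the example.

```python
# Minimal sketch of date-partitioning a large table for partition pruning.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("partitioning-demo").getOrCreate()

orders = spark.read.parquet("s3a://bucket/raw/orders/")  # placeholder input

# Derive a partition column with modest cardinality (one value per day).
orders = orders.withColumn("order_date", F.to_date("created_at"))

(
    orders.write
    .partitionBy("order_date")  # directory layout: order_date=YYYY-MM-DD/
    .mode("overwrite")
    .parquet("s3a://bucket/curated/orders/")
)

# A filter on the partition column now prunes to a single partition
# instead of scanning the whole table.
daily = (
    spark.read.parquet("s3a://bucket/curated/orders/")
    .where(F.col("order_date") == "2025-01-15")
)
print(daily.count())
```

Choosing a partition column with per-day (rather than per-record) cardinality is the design point: too many tiny partitions hurt as much as none at all.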
Posted 1 week ago
3.0 years
35 Lacs
Thiruvananthapuram, Kerala, India
Remote
Experience: 3.00+ years
Salary: INR 3,500,000.00 / year (based on experience)
Expected Notice Period: 15 days
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-time permanent position (Payroll and Compliance to be managed by: NA)
(*Note: This is a requirement for one of Uplers' clients - NomuPay)
What do you need for this opportunity?
Must-have skills: Apache Hudi, Flink, Iceberg, Apache Airflow, Spark, AWS, Azure, GCP, Kafka, SQL
NomuPay is looking for:
📈 Opportunity in a company with a solid track record of performance
🤝 Opportunity to work with diverse, global teams
🚀 Rapid career advancement with opportunities to learn
💰 Competitive salary and performance bonus
Responsibilities
Design, build, and optimize scalable ETL pipelines using Apache Airflow or similar frameworks to process and transform large datasets efficiently.
Utilize Spark (PySpark), Kafka, Flink, or similar tools to enable distributed data processing and real-time streaming solutions.
Deploy, manage, and optimize data infrastructure on cloud platforms such as AWS, GCP, or Azure, ensuring security, scalability, and cost-effectiveness.
Design and implement robust data models, ensuring data consistency, integrity, and performance across warehouses and lakes.
Enhance query performance through indexing, partitioning, and tuning techniques for large-scale datasets.
Manage cloud-based storage solutions (Amazon S3, Google Cloud Storage, Azure Blob Storage) and ensure data governance, security, and compliance (a storage-access sketch follows this listing).
Work closely with data scientists, analysts, and software engineers to support data-driven decision-making, while maintaining thorough documentation of data processes.
Requirements
Strong proficiency in Python and SQL, with additional experience in languages such as Java or Scala.
Hands-on experience with frameworks like Spark (PySpark), Kafka, Apache Hudi, Iceberg, or Apache Flink for distributed data processing and real-time streaming.
Familiarity with cloud platforms like AWS, Google Cloud Platform (GCP), or Microsoft Azure for building and managing data infrastructure.
Strong understanding of data warehousing concepts and data modeling principles.
Experience with ETL tools such as Apache Airflow or comparable data transformation frameworks.
Proficiency in working with data lakes and cloud-based storage solutions like Amazon S3, Google Cloud Storage, or Azure Blob Storage.
Expertise in Git for version control and collaborative coding.
Expertise in performance tuning for large-scale data processing, including partitioning, indexing, and query optimization.
About NomuPay
NomuPay is a newly established company that, through its subsidiaries, provides state-of-the-art unified payment solutions to help its clients accelerate growth in large, high-growth markets across Asia, Turkey, and the Middle East. NomuPay is funded by Finch Capital, a leading European and Southeast Asian financial technology investor. NomuPay acquired Wirecard Turkey on April 21, 2021 for an undisclosed amount.
Founders
Peter Burridge, CEO. An investor, board member, and strategic executive, Peter has more than 30 years of management and leadership experience at rapid-growth technology companies. His unique hands-on approach to business development and corporate governance has made him a trusted advisor and authority in the enterprise software industry and the financial technology sector.
As President of Hyperwallet, Peter guided the organization through a successful recapitalization, followed by global expansion and the ultimate sale of the business to PayPal. A recognized figure in the San Francisco fintech community and the global payments industry, Peter has previously held leadership roles at Oracle, Siebel, and Travelex Global Business Payments, and has worked as an investor and advisor in the technology sector. Outside the office, his passions include racing cars, golf, and rugby union.
How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of being shortlisted and meeting the client for an interview!
About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role is to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support you through any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal apart from this one; depending on the assessments you clear, you can apply for them as well.)
So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
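For the cloud-storage bullet in the listing above, here is a minimal boto3 sketch of uploading a curated file to S3 with server-side encryption and listing what landed under a prefix. The bucket and key names are placeholders, and credentials are assumed to come from the environment or an instance role.

```python
# Minimal sketch of managed S3 access with boto3. Names are placeholders.
import boto3

s3 = boto3.client("s3")

# Upload a curated extract with server-side encryption enabled.
s3.upload_file(
    Filename="daily_orders.parquet",
    Bucket="example-data-lake",               # placeholder bucket
    Key="curated/orders/2025-01-15.parquet",  # placeholder key
    ExtraArgs={"ServerSideEncryption": "AES256"},
)

# List what landed under the prefix, paginating for large result sets.
paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket="example-data-lake",
                               Prefix="curated/orders/"):
    for obj in page.get("Contents", []):
        print(obj["Key"], obj["Size"])
```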
Posted 1 week ago
3.0 years
35 Lacs
Patna, Bihar, India
Remote
Experience: 3.00+ years
Salary: INR 3,500,000.00 / year (based on experience)
Expected Notice Period: 15 days
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-time permanent position (Payroll and Compliance to be managed by: NA)
(*Note: This is a requirement for one of Uplers' clients - NomuPay)
What do you need for this opportunity?
Must-have skills: Apache Hudi, Flink, Iceberg, Apache Airflow, Spark, AWS, Azure, GCP, Kafka, SQL
NomuPay is looking for:
📈 Opportunity in a company with a solid track record of performance
🤝 Opportunity to work with diverse, global teams
🚀 Rapid career advancement with opportunities to learn
💰 Competitive salary and performance bonus
Responsibilities
Design, build, and optimize scalable ETL pipelines using Apache Airflow or similar frameworks to process and transform large datasets efficiently.
Utilize Spark (PySpark), Kafka, Flink, or similar tools to enable distributed data processing and real-time streaming solutions.
Deploy, manage, and optimize data infrastructure on cloud platforms such as AWS, GCP, or Azure, ensuring security, scalability, and cost-effectiveness.
Design and implement robust data models, ensuring data consistency, integrity, and performance across warehouses and lakes.
Enhance query performance through indexing, partitioning, and tuning techniques for large-scale datasets.
Manage cloud-based storage solutions (Amazon S3, Google Cloud Storage, Azure Blob Storage) and ensure data governance, security, and compliance.
Work closely with data scientists, analysts, and software engineers to support data-driven decision-making, while maintaining thorough documentation of data processes.
Requirements
Strong proficiency in Python and SQL, with additional experience in languages such as Java or Scala.
Hands-on experience with frameworks like Spark (PySpark), Kafka, Apache Hudi, Iceberg, or Apache Flink for distributed data processing and real-time streaming (a Hudi upsert sketch follows this listing).
Familiarity with cloud platforms like AWS, Google Cloud Platform (GCP), or Microsoft Azure for building and managing data infrastructure.
Strong understanding of data warehousing concepts and data modeling principles.
Experience with ETL tools such as Apache Airflow or comparable data transformation frameworks.
Proficiency in working with data lakes and cloud-based storage solutions like Amazon S3, Google Cloud Storage, or Azure Blob Storage.
Expertise in Git for version control and collaborative coding.
Expertise in performance tuning for large-scale data processing, including partitioning, indexing, and query optimization.
About NomuPay
NomuPay is a newly established company that, through its subsidiaries, provides state-of-the-art unified payment solutions to help its clients accelerate growth in large, high-growth markets across Asia, Turkey, and the Middle East. NomuPay is funded by Finch Capital, a leading European and Southeast Asian financial technology investor. NomuPay acquired Wirecard Turkey on April 21, 2021 for an undisclosed amount.
Founders
Peter Burridge, CEO. An investor, board member, and strategic executive, Peter has more than 30 years of management and leadership experience at rapid-growth technology companies. His unique hands-on approach to business development and corporate governance has made him a trusted advisor and authority in the enterprise software industry and the financial technology sector.
As President of Hyperwallet, Peter guided the organization through a successful recapitalization, followed by global expansion and the ultimate sale of the business to PayPal. A recognized figure in the San Francisco fintech community and the global payments industry, Peter has previously held leadership roles at Oracle, Siebel, and Travelex Global Business Payments, and has worked as an investor and advisor in the technology sector. Outside the office, his passions include racing cars, golf, and rugby union.
How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of being shortlisted and meeting the client for an interview!
About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role is to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support you through any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal apart from this one; depending on the assessments you clear, you can apply for them as well.)
So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
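Since the must-have skills in the listing above call out Apache Hudi, here is a sketch of what a PySpark upsert into a Hudi table can look like. The table name, key/precombine/partition fields, and paths are invented; the job assumes a Hudi Spark bundle compatible with the cluster's Spark version is on the classpath, and exact option keys can vary by Hudi release.

```python
# Hypothetical sketch of a Hudi upsert from PySpark. All names are placeholders.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("hudi-upsert-demo")
    # Kryo serialization is commonly recommended for Hudi workloads.
    .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
    .getOrCreate()
)

updates = spark.read.parquet("s3a://bucket/staging/orders_delta/")  # placeholder

hudi_options = {
    "hoodie.table.name": "orders",
    "hoodie.datasource.write.recordkey.field": "order_id",     # record key
    "hoodie.datasource.write.precombine.field": "updated_at",  # latest wins
    "hoodie.datasource.write.partitionpath.field": "order_date",
    "hoodie.datasource.write.operation": "upsert",
}

(
    updates.write
    .format("hudi")
    .options(**hudi_options)
    .mode("append")  # Hudi expects upserts to be issued as appends
    .save("s3a://bucket/lake/orders/")
)
```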
Posted 1 week ago
2.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Full-time | Career Site Team: Data Science & Business Intelligence
Company Description
NielsenIQ is a global measurement and data analytics company that provides the most complete and trusted view available of consumers and markets worldwide. We provide consumer packaged goods/fast-moving consumer goods manufacturers and retailers with accurate, actionable information and insights and a complete picture of the complex and changing marketplace that companies need to innovate and grow. Our approach marries proprietary NielsenIQ data with other data sources to help clients around the world understand what’s happening now, what’s happening next, and how to best act on this knowledge. We like to be in the middle of the action. That’s why you can find us at work in over 90 countries, covering more than 90% of the world’s population. For more information, visit www.niq.com.
NielsenIQ is committed to hiring and retaining a diverse workforce. We are proud to be an Equal Opportunity/Affirmative Action Employer, making decisions without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability status, age, marital status, protected veteran status or any other protected class.
Job Description
About the job
Help our clients, internal and external, understand and use RMS services better by understanding their requirements and queries, and helping address them through knowledge of data science and RMS.
Responsibilities
Building knowledge of the Nielsen suite of products and demonstrating the same
Understanding client concerns
Able to put forth ways and means of solving client concerns with supervision
Automation and development of solutions for existing processes
Taking initiative to understand concerns/problems in the RMS product and participating in product improvement initiatives
Qualifications
Professionals with degrees in Maths, Data Science, Statistics, or related fields involving statistical analysis of large data sets
2-3 years of experience in market research or a relevant field
Mindset And Approach To Work
Embraces change, innovation and iterative processes in order to continuously improve the product’s value to clients
Continuously collaborates and offers support to improve the product
Active interest in arriving at collaboration and consensus in communication plans, deliverables and deadlines
Plans and completes assignments independently within an established framework, breaking down complex tasks and making reasonable decisions; work is reviewed for overall technical soundness
Participates in data experiments and PoCs, setting measurable goals, timelines and reproducible outcomes
Applies critical thinking and takes initiative
Continuously reviews the latest industry innovations and effectively applies them to their work
Consistently challenges and analyzes data to ensure accuracy
Functional Skills
Ability to manipulate, analyze and interpret large data sources (a small sketch follows this listing)
Experienced in high-level programming languages (e.g., Python, R, SQL, Scala) as well as data visualization tools (e.g., Power BI, Spotfire, Tableau, MicroStrategy)
Able to work in a virtual environment; familiar with Git/Bitbucket processes
Prior experience with RMS or NIQ is an advantage
Can use a logical reasoning process to break down and work through increasingly challenging situations or problems to arrive at positive outcomes
Identifies and uses data from various sources to influence decisions
Interprets data effectively in relation to business objectives
Soft Skills
Ability to engage and communicate with team and extended team members
Can adapt to change and new ideas or ways of working
Exhibits emotional intelligence when partnering with internal and external stakeholders
Additional Information
Our Benefits
Flexible working environment
Volunteer time off
LinkedIn Learning
Employee Assistance Program (EAP)
About NIQ
NIQ is the world’s leading consumer intelligence company, delivering the most complete understanding of consumer buying behavior and revealing new pathways to growth. In 2023, NIQ combined with GfK, bringing together the two industry leaders with unparalleled global reach. With a holistic retail read and the most comprehensive consumer insights, delivered with advanced analytics through state-of-the-art platforms, NIQ delivers the Full View™. NIQ is an Advent International portfolio company with operations in 100+ markets, covering more than 90% of the world’s population. For more information, visit NIQ.com.
Want to keep up with our latest updates? Follow us on: LinkedIn | Instagram | Twitter | Facebook
Our commitment to Diversity, Equity, and Inclusion
NIQ is committed to reflecting the diversity of the clients, communities, and markets we measure within our own workforce. We exist to count everyone and are on a mission to systematically embed inclusion and diversity into all aspects of our workforce, measurement, and products. We enthusiastically invite candidates who share that mission to join us. We are proud to be an Equal Opportunity/Affirmative Action Employer, making decisions without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability status, age, marital status, protected veteran status or any other protected class. Our global non-discrimination policy covers these protected classes in every market in which we do business worldwide. Learn more about how we are driving diversity and inclusion in everything we do by visiting the NIQ News Center: https://nielseniq.com/global/en/news-center/diversity-inclusion
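As a small illustration of the large-data manipulation these functional skills describe, here is a pandas sketch that aggregates a large extract in bounded memory by processing it in chunks. The file name and the columns (market, category, units) are invented for the example.

```python
# Minimal sketch: chunked aggregation of a large CSV extract with pandas.
import pandas as pd

# Read the extract in chunks so memory use stays bounded.
chunks = pd.read_csv("rms_sales_extract.csv", chunksize=500_000)

# Aggregate unit sales by market and category within each chunk.
partials = [
    chunk.groupby(["market", "category"], as_index=False)["units"].sum()
    for chunk in chunks
]

# Combine the per-chunk partial sums into the final aggregate.
sales = (
    pd.concat(partials)
    .groupby(["market", "category"], as_index=False)["units"]
    .sum()
    .sort_values("units", ascending=False)
)

print(sales.head(10))  # top market/category combinations by unit sales
```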
Posted 1 week ago
3.0 years
35 Lacs
Agra, Uttar Pradesh, India
Remote
Experience: 3.00+ years
Salary: INR 3,500,000.00 / year (based on experience)
Expected Notice Period: 15 days
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-time permanent position (Payroll and Compliance to be managed by: NA)
(*Note: This is a requirement for one of Uplers' clients - NomuPay)
What do you need for this opportunity?
Must-have skills: Apache Hudi, Flink, Iceberg, Apache Airflow, Spark, AWS, Azure, GCP, Kafka, SQL
NomuPay is looking for:
📈 Opportunity in a company with a solid track record of performance
🤝 Opportunity to work with diverse, global teams
🚀 Rapid career advancement with opportunities to learn
💰 Competitive salary and performance bonus
Responsibilities
Design, build, and optimize scalable ETL pipelines using Apache Airflow or similar frameworks to process and transform large datasets efficiently.
Utilize Spark (PySpark), Kafka, Flink, or similar tools to enable distributed data processing and real-time streaming solutions.
Deploy, manage, and optimize data infrastructure on cloud platforms such as AWS, GCP, or Azure, ensuring security, scalability, and cost-effectiveness.
Design and implement robust data models, ensuring data consistency, integrity, and performance across warehouses and lakes.
Enhance query performance through indexing, partitioning, and tuning techniques for large-scale datasets.
Manage cloud-based storage solutions (Amazon S3, Google Cloud Storage, Azure Blob Storage) and ensure data governance, security, and compliance.
Work closely with data scientists, analysts, and software engineers to support data-driven decision-making, while maintaining thorough documentation of data processes.
Requirements
Strong proficiency in Python and SQL, with additional experience in languages such as Java or Scala.
Hands-on experience with frameworks like Spark (PySpark), Kafka, Apache Hudi, Iceberg, or Apache Flink for distributed data processing and real-time streaming.
Familiarity with cloud platforms like AWS, Google Cloud Platform (GCP), or Microsoft Azure for building and managing data infrastructure.
Strong understanding of data warehousing concepts and data modeling principles.
Experience with ETL tools such as Apache Airflow or comparable data transformation frameworks.
Proficiency in working with data lakes and cloud-based storage solutions like Amazon S3, Google Cloud Storage, or Azure Blob Storage.
Expertise in Git for version control and collaborative coding.
Expertise in performance tuning for large-scale data processing, including partitioning, indexing, and query optimization.
About NomuPay
NomuPay is a newly established company that, through its subsidiaries, provides state-of-the-art unified payment solutions to help its clients accelerate growth in large, high-growth markets across Asia, Turkey, and the Middle East. NomuPay is funded by Finch Capital, a leading European and Southeast Asian financial technology investor. NomuPay acquired Wirecard Turkey on April 21, 2021 for an undisclosed amount.
Founders
Peter Burridge, CEO. An investor, board member, and strategic executive, Peter has more than 30 years of management and leadership experience at rapid-growth technology companies. His unique hands-on approach to business development and corporate governance has made him a trusted advisor and authority in the enterprise software industry and the financial technology sector.
As President of Hyperwallet, Peter guided the organization through a successful recapitalization, followed by global expansion and the ultimate sale of the business to PayPal. A recognized figure in the San Francisco fintech community and the global payments industry, Peter has previously held leadership roles at Oracle, Siebel, and Travelex Global Business Payments, and has worked as an investor and advisor in the technology sector. Outside the office, his passions include racing cars, golf, and rugby union.
How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of being shortlisted and meeting the client for an interview!
About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role is to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support you through any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal apart from this one; depending on the assessments you clear, you can apply for them as well.)
So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 week ago
3.0 years
35 Lacs
Noida, Uttar Pradesh, India
Remote
Experience: 3.00+ years
Salary: INR 3,500,000.00 / year (based on experience)
Expected Notice Period: 15 days
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-time permanent position (Payroll and Compliance to be managed by: NA)
(*Note: This is a requirement for one of Uplers' clients - NomuPay)
What do you need for this opportunity?
Must-have skills: Apache Hudi, Flink, Iceberg, Apache Airflow, Spark, AWS, Azure, GCP, Kafka, SQL
NomuPay is looking for:
📈 Opportunity in a company with a solid track record of performance
🤝 Opportunity to work with diverse, global teams
🚀 Rapid career advancement with opportunities to learn
💰 Competitive salary and performance bonus
Responsibilities
Design, build, and optimize scalable ETL pipelines using Apache Airflow or similar frameworks to process and transform large datasets efficiently.
Utilize Spark (PySpark), Kafka, Flink, or similar tools to enable distributed data processing and real-time streaming solutions.
Deploy, manage, and optimize data infrastructure on cloud platforms such as AWS, GCP, or Azure, ensuring security, scalability, and cost-effectiveness.
Design and implement robust data models, ensuring data consistency, integrity, and performance across warehouses and lakes.
Enhance query performance through indexing, partitioning, and tuning techniques for large-scale datasets.
Manage cloud-based storage solutions (Amazon S3, Google Cloud Storage, Azure Blob Storage) and ensure data governance, security, and compliance.
Work closely with data scientists, analysts, and software engineers to support data-driven decision-making, while maintaining thorough documentation of data processes.
Requirements
Strong proficiency in Python and SQL, with additional experience in languages such as Java or Scala.
Hands-on experience with frameworks like Spark (PySpark), Kafka, Apache Hudi, Iceberg, or Apache Flink for distributed data processing and real-time streaming.
Familiarity with cloud platforms like AWS, Google Cloud Platform (GCP), or Microsoft Azure for building and managing data infrastructure.
Strong understanding of data warehousing concepts and data modeling principles.
Experience with ETL tools such as Apache Airflow or comparable data transformation frameworks.
Proficiency in working with data lakes and cloud-based storage solutions like Amazon S3, Google Cloud Storage, or Azure Blob Storage.
Expertise in Git for version control and collaborative coding.
Expertise in performance tuning for large-scale data processing, including partitioning, indexing, and query optimization.
About NomuPay
NomuPay is a newly established company that, through its subsidiaries, provides state-of-the-art unified payment solutions to help its clients accelerate growth in large, high-growth markets across Asia, Turkey, and the Middle East. NomuPay is funded by Finch Capital, a leading European and Southeast Asian financial technology investor. NomuPay acquired Wirecard Turkey on April 21, 2021 for an undisclosed amount.
Founders
Peter Burridge, CEO. An investor, board member, and strategic executive, Peter has more than 30 years of management and leadership experience at rapid-growth technology companies. His unique hands-on approach to business development and corporate governance has made him a trusted advisor and authority in the enterprise software industry and the financial technology sector.
As President of Hyperwallet, Peter guided the organization through a successful recapitalization, followed by global expansion and the ultimate sale of the business to PayPal. A recognized figure in the San Francisco fintech community and the global payments industry, Peter has previously held leadership roles at Oracle, Siebel, and Travelex Global Business Payments, and has worked as an investor and advisor in the technology sector. Outside the office, his passions include racing cars, golf, and rugby union.
How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of being shortlisted and meeting the client for an interview!
About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role is to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support you through any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal apart from this one; depending on the assessments you clear, you can apply for them as well.)
So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 week ago
3.0 years
35 Lacs
Ghaziabad, Uttar Pradesh, India
Remote
Experience: 3.00+ years
Salary: INR 3,500,000.00 / year (based on experience)
Expected Notice Period: 15 days
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-time permanent position (Payroll and Compliance to be managed by: NA)
(*Note: This is a requirement for one of Uplers' clients - NomuPay)
What do you need for this opportunity?
Must-have skills: Apache Hudi, Flink, Iceberg, Apache Airflow, Spark, AWS, Azure, GCP, Kafka, SQL
NomuPay is looking for:
📈 Opportunity in a company with a solid track record of performance
🤝 Opportunity to work with diverse, global teams
🚀 Rapid career advancement with opportunities to learn
💰 Competitive salary and performance bonus
Responsibilities
Design, build, and optimize scalable ETL pipelines using Apache Airflow or similar frameworks to process and transform large datasets efficiently.
Utilize Spark (PySpark), Kafka, Flink, or similar tools to enable distributed data processing and real-time streaming solutions.
Deploy, manage, and optimize data infrastructure on cloud platforms such as AWS, GCP, or Azure, ensuring security, scalability, and cost-effectiveness.
Design and implement robust data models, ensuring data consistency, integrity, and performance across warehouses and lakes.
Enhance query performance through indexing, partitioning, and tuning techniques for large-scale datasets.
Manage cloud-based storage solutions (Amazon S3, Google Cloud Storage, Azure Blob Storage) and ensure data governance, security, and compliance.
Work closely with data scientists, analysts, and software engineers to support data-driven decision-making, while maintaining thorough documentation of data processes.
Requirements
Strong proficiency in Python and SQL, with additional experience in languages such as Java or Scala.
Hands-on experience with frameworks like Spark (PySpark), Kafka, Apache Hudi, Iceberg, or Apache Flink for distributed data processing and real-time streaming.
Familiarity with cloud platforms like AWS, Google Cloud Platform (GCP), or Microsoft Azure for building and managing data infrastructure.
Strong understanding of data warehousing concepts and data modeling principles.
Experience with ETL tools such as Apache Airflow or comparable data transformation frameworks.
Proficiency in working with data lakes and cloud-based storage solutions like Amazon S3, Google Cloud Storage, or Azure Blob Storage.
Expertise in Git for version control and collaborative coding.
Expertise in performance tuning for large-scale data processing, including partitioning, indexing, and query optimization.
About NomuPay
NomuPay is a newly established company that, through its subsidiaries, provides state-of-the-art unified payment solutions to help its clients accelerate growth in large, high-growth markets across Asia, Turkey, and the Middle East. NomuPay is funded by Finch Capital, a leading European and Southeast Asian financial technology investor. NomuPay acquired Wirecard Turkey on April 21, 2021 for an undisclosed amount.
Founders
Peter Burridge, CEO. An investor, board member, and strategic executive, Peter has more than 30 years of management and leadership experience at rapid-growth technology companies. His unique hands-on approach to business development and corporate governance has made him a trusted advisor and authority in the enterprise software industry and the financial technology sector.
As President of Hyperwallet, Peter guided the organization through a successful recapitalization, followed by global expansion and the ultimate sale of the business to PayPal. A recognized figure in the San Francisco fintech community and the global payments industry, Peter has previously held leadership roles at Oracle, Siebel, and Travelex Global Business Payments, and has worked as an investor and advisor in the technology sector. Outside the office, his passions include racing cars, golf, and rugby union.
How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of being shortlisted and meeting the client for an interview!
About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role is to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support you through any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal apart from this one; depending on the assessments you clear, you can apply for them as well.)
So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 week ago